
The GPU shortage is over


You are about to reply to a thread that has been inactive for 666 days.

Please take a moment to consider if this thread is worth bumping.


The GTX 560 has 1GB VRAM. (I also had a GTX 560 Ti from new in 2012, until it melted in 2016!)

If you did want to replace your 560, the GTX 1660 would be the present-day equivalent replacement. The 1660 has 6 GB of VRAM using GDDR5 memory chips. Or, for slightly more money, the GTX 1660 Super has 6 GB of faster GDDR6 memory for its VRAM.

However, both GTX 1660s I've mentioned are now more than a generation old, as they lack the ray-tracing capabilities introduced by the newer RTX 2000 and RTX 3000 series GPUs.

And the RTX 4000 series is only a few months away...

 

Edited by SarahKB7 Koskinen

12 minutes ago, sandi Mexicola said:
16 minutes ago, Love Zhaoying said:

GeForce GTX 560/PCIe/SSE2

Just curious, how much VRAM does that thing have? I have 4 GB, and I keep running out in SL.

I dunno about VRAM and graphics cards, please forgive my ignorance.  * Edit * I wonder why they call it VRAM, if it is "not virtual"?

Google says:  "NVIDIA has paired 1,024 MB GDDR5 memory with the GeForce GTX 560, which are connected using a 256-bit memory interface. The GPU is operating at a frequency of 810 MHz, memory is running at 1000 MHz (4 Gbps effective)."
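As a sanity check on those quoted figures, the theoretical memory bandwidth follows directly from the bus width and the effective data rate. This is just a back-of-the-envelope calculation, not from the thread:

```python
# Rough check of the quoted GTX 560 memory specs:
# a 256-bit bus at 4 Gbps effective per pin.
bus_width_bits = 256
effective_rate_gbps = 4  # 1000 MHz GDDR5, quad-pumped

bandwidth_gb_s = effective_rate_gbps * bus_width_bits / 8
print(bandwidth_gb_s)  # 128.0 GB/s of theoretical memory bandwidth
```

Respectable for 2011, but a fraction of what even a mid-range card offers a decade later.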

The full "About" info for Second Life:

Second Life Release 6.6.1.572458 (64bit)
Release Notes

You are at 119.7, 87.8, 24.7 in Luna located at simhost-01345183e60655013.agni
SLURL: http://maps.secondlife.com/secondlife/Luna/120/88/25
(global coordinates 254,584.0, 256,088.0, 24.7)
Second Life Server 2022-06-16.572665
Release Notes

CPU: Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz (3410.02 MHz)
Memory: 32720 MB
OS Version: Microsoft Windows 10 64-bit (Build 19043.1766)
Graphics Card Vendor: NVIDIA Corporation
Graphics Card: GeForce GTX 560/PCIe/SSE2

Windows Graphics Driver Version: 23.21.13.9135
OpenGL Version: 4.6.0 NVIDIA 391.35

Window size: 1920x1017
Font Size Adjustment: 96pt
UI Scaling: 1
Draw distance: 128m
Bandwidth: 3000kbit/s
LOD factor: 1.125
Render quality: 5
Advanced Lighting Model: Enabled
Texture memory: 512MB
Disk cache: Max size 204.0 MB (99.9% used)

J2C Decoder Version: KDU v7.10.4
Audio Driver Version: FMOD Studio 2.02.06
Dullahan: 1.12.3.202111032221
  CEF: 91.1.21+g9dd45fe+chromium-91.0.4472.114
  Chromium: 91.0.4472.114
LibVLC Version: 3.0.16
Voice Server Version: Not Connected
Packets Lost: 0/995 (0.0%)
July 07 2022 07:36:33


10 minutes ago, SarahKB7 Koskinen said:

The GTX 560 has 1GB VRAM. (I also had a GTX 560 Ti from new in 2012, until it melted in 2016!)

The GTX 1660 or GTX 1660 Super would be the present-day equivalent replacement; they have 6 GB of VRAM using GDDR6 memory.

 

So, do you suggest I upgrade? I don't know if I see any performance issues due to the graphics card. 

"I don't know what I don't know!"


The difference might be down to our different render qualities (and possibly different viewers), and possibly something else, I dunno what...

Firestorm 6.5.3 (65658) Mar  1 2022 10:01:35 (64bit / SSE2) (Firestorm-Releasex64) with Havok support

CPU: Intel(R) Core(TM) i9-10900K CPU @ 3.70GHz (3696 MHz)
Memory: 32617 MB
Concurrency: 20
OS Version: Microsoft Windows 10 64-bit (Build 19044.1806)
Graphics Card Vendor: NVIDIA Corporation
Graphics Card: NVIDIA GeForce GTX 1050 Ti/PCIe/SSE2
Graphics Card Memory: 4096 MB

Windows Graphics Driver Version: 30.0.15.1259
OpenGL Version: 4.6.0 NVIDIA 512.59

RestrainedLove API: (disabled)
libcurl Version: libcurl/7.54.1 OpenSSL/1.1.1l zlib/1.2.11 nghttp2/1.40.0
J2C Decoder Version: KDU v8.2
Audio Driver Version: FMOD Studio 2.02.05
Dullahan: 1.12.3.202111032221
  CEF: 91.1.21+g9dd45fe+chromium-91.0.4472.114
  Chromium: 91.0.4472.114
LibVLC Version: 3.0.16
Voice Server Version: Vivox 4.10.0000.32327

Settings mode: Firestorm
Viewer Skin: Firestorm (Dark)
Window size: 1920x1177 px
Font Used: Deja Vu (96 dpi)
Font Size Adjustment: 0 pt
UI Scaling: 1
Draw distance: 128 m
Bandwidth: 1500 kbit/s
LOD factor: 2
Render quality: High-Ultra (6/7)
Advanced Lighting Model: Yes
Texture memory: Dynamic (2048 MB min / 20% Cache / 20% VRAM)
Disk cache: Max size 9984.0 MB (78.1% used)
Built with MSVC version 1916
Packets Lost: 24/6,182 (0.4%)
July 07 2022 08:10:24 SLT


30 minutes ago, Love Zhaoying said:

I dunno about VRAM and graphics cards, please forgive my ignorance.  * Edit * I wonder why they call it VRAM, if it is "not virtual"?

Google says:  "NVIDIA has paired 1,024 MB GDDR5 memory with the GeForce GTX 560, which are connected using a 256-bit memory interface. The GPU is operating at a frequency of 810 MHz, memory is running at 1000 MHz (4 Gbps effective)."


As was stated, if your system works fine for you, then don't upgrade. The largest limiting factor of your setup is the 1 GB of VRAM, IMO.


Signs of a dying GPU: slow framerates; slow object, texture, and distance rezzing; random on-screen graphical glitches (called "artifacts"); GPU fans noisily running at full speed; and the distinctive smell of a hot, dusty GPU on its last legs.

If the insides of your desktop PC case resemble a rat's nest, then clean it. Reroute cabling so that it does not obstruct air flow through the case. Replace noisy or broken fans, and perhaps replace the CPU thermal paste too. Doing all this should lower temperatures and dust levels inside the PC and help prolong its life, at least until Microsoft turns off all support for it when the inevitable next version of Windows comes around...

Edited by SarahKB7 Koskinen

3 minutes ago, sandi Mexicola said:

I use HWiNFO, and it starts yelling at me once I've used up about 90% of VRAM.  

Also, much slower framerates than I would like.  

Better framerates sounds like a good reason to upgrade. I figured that my CPU was the bottleneck.


25 minutes ago, Love Zhaoying said:

Better framerates sounds like a good reason to upgrade. I figured that my CPU was the bottleneck.

Well, that's one reason why I use HWiNFO to monitor things... so I know where in my system things are maxing out.  

Also GPU-Z can be helpful if you want to focus on just your GPU(s).

All I really want is to be able to run around on SL all day on ULTRA graphics settings... is that too much to ask?  😋
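For anyone who'd rather poll VRAM from a script than watch HWiNFO or GPU-Z, here is a minimal sketch using NVIDIA's `nvidia-smi` command-line tool. It assumes an NVIDIA GPU with `nvidia-smi` on the PATH; the 90% threshold mirrors the warning level mentioned above:

```python
# Minimal sketch: read VRAM usage via the nvidia-smi CLI.
# Assumes an NVIDIA GPU and the nvidia-smi tool on the PATH.
import subprocess

def parse_vram(csv_line: str) -> tuple[int, int]:
    """Parse 'used, total' MiB values from nvidia-smi CSV output."""
    used, total = (int(x) for x in csv_line.strip().split(", "))
    return used, total

def vram_usage() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_vram(out)

# Example (on a machine with an NVIDIA GPU):
#   used, total = vram_usage()
#   if used / total >= 0.9:
#       print("Warning: over 90% of VRAM in use")
```

Run it in a loop with a few seconds' sleep and you get a crude HWiNFO-style alarm for free.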


2 hours ago, sandi Mexicola said:

Just curious, how much VRAM does that thing have? I have 4 GB, and I keep running out in SL.

My GTX 1050 in a laptop has 4 GB VRAM. It works fine as long as I limit it to 2-3 GB; I might allow 5-10% extra in some viewers (usually in Firestorm). For me, I get better mileage in Linux, as the drivers there let me use the GPU alone without routing output through the CPU's integrated graphics, as happens in Windows. I also get better FPS on average in Linux, about 4-5 more vs Windows... not much, but it makes enough of a difference unless I'm in a crowded spot. It lets me run on mid-high settings most of the time with shadows on. The CPU on mine is an 8th Gen Intel i7-8750H, with 32 GB of DDR4 RAM.

Edited by JeromFranzic
extra detail

Just now, sandi Mexicola said:

I was considering moving from 4GB to 8GB, but now I'm not so sure.

It won't make much difference unless you're working on SL video, doing stuff in 3D modeling apps like Blender, or using other apps that require more VRAM (and possibly more system RAM as well).


Anything above a GTX 960 can run SL at maxed out graphics settings, even with ALM + shadows.

As for the GPU shortage, it might be "over", but prices are still too high here in the EU to consider buying now.

With the RTX 4000 arriving (probably this coming autumn), it is wiser to wait: you will then get big discounts on the RTX 3000s. I am myself hoping to upgrade my main PC with an RTX 3070 to replace its GTX 1070 Ti, even though the latter is plenty powerful enough for SL.

Edited by Henri Beauchamp

3 hours ago, Love Zhaoying said:

Should I finally bother to upgrade my GPU? My PC seems to run Second Life just fine.


CPU: Intel(R) Core(TM) i7-2600K CPU @ 3.40GHz (3410.02 MHz)
Memory: 32720 MB
OS Version: Microsoft Windows 10 64-bit (Build 19043.1766)
Graphics Card Vendor: NVIDIA Corporation
Graphics Card: GeForce GTX 560/PCIe/SSE2

 

I'd upgrade from that to something newer with at least 4 GB of VRAM, and I'd be looking at one of the 16xx-series cards: a 1650 Super or 1660 Super, depending on prices, which are still out of whack. You don't need to worry about the RTX cards with an i7-2600, unless you find one super cheap, of course.


14 hours ago, Coffee Pancake said:

an i7 2600 is a massive bottleneck.

Indeed... Once the GPU is out of the equation (i.e. anything equal to or better than an old GTX 660 for ALM without shadows, or a GTX 460 with ALM off), viewer performance increases almost exactly in proportion to single-core CPU performance (because the rendering engine is single-threaded, even though some drivers, such as NVIDIA's proprietary ones, may use a few threads of their own and load an extra half to full core).

With an old 2600K, I would, even before considering buying a new GPU, attempt to overclock the CPU: Sandy Bridge is super easy to overclock and can easily reach 4.4 GHz locked on all cores (disabling all but the C0 and C1 states), and even up to 4.7 GHz if you were lucky in the silicon lottery. I still have an overclocked 2500K @ 4.5 GHz as my second PC (it used to be @ 4.6 GHz, but after 7+ years running flawlessly at that frequency I had to slightly reduce the speed to avoid a daily freeze when the CPU is idle).

Here, going from 3.4 to 4.4 GHz would get you almost +30% on frame rates.
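The "+30%" figure checks out as a back-of-the-envelope estimate: if the viewer is CPU-bound on a single thread, frame rate scales roughly with core clock.

```python
# If frame rate scales linearly with single-core clock speed,
# the expected gain from a 3.4 GHz -> 4.4 GHz overclock is:
stock_ghz = 3.4
overclocked_ghz = 4.4

gain = overclocked_ghz / stock_ghz - 1
print(f"expected frame-rate gain: {gain:.1%}")  # about +29.4%
```

In practice the gain will be a little lower, since memory and cache don't speed up proportionally.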

Edited by Henri Beauchamp

2 hours ago, Henri Beauchamp said:

Indeed... Once the GPU is out of the equation, viewer performance increases almost exactly in proportion to single-core CPU performance [...] Here, going from 3.4 to 4.4 GHz would get you almost +30% on frame rates.

Agreed, the 2600K is a solid and well-documented entry point into the world of PC overclocking.

I still have one in 24/7 use at stock speeds, with the RAM maxed out, relegated to running the home modded Minecraft server (plus a NAS and a few dev Docker containers).

The problem with trying to use it for current SL tasks isn't so much the chip, which is really one of Intel's high points; it's the supporting Sandy Bridge platform. It only supports PCIe 2.0, which limits the bandwidth between the CPU and the GPU (and everything else), and SL hammers that pretty hard. Combined with memory speeds maxing out at DDR3-1333, it's very easy to end up in a situation where the entire system is bottlenecked shoving data around rather than doing things with the data.
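To put rough numbers on the PCIe point, here is a quick per-direction x16 bandwidth estimate using the headline figures from the PCIe 2.0 and 3.0 specs (5 GT/s with 8b/10b encoding vs 8 GT/s with 128b/130b):

```python
# Per-direction PCIe x16 bandwidth estimate from transfer rate and
# line encoding (payload bits / line bits).
def pcie_bandwidth_gb_s(gt_per_s: float, encoding: float, lanes: int = 16) -> float:
    return gt_per_s * encoding * lanes / 8  # GT/s * payload fraction -> GB/s

gen2 = pcie_bandwidth_gb_s(5, 8 / 10)     # PCIe 2.0: 8.0 GB/s
gen3 = pcie_bandwidth_gb_s(8, 128 / 130)  # PCIe 3.0: ~15.75 GB/s
print(gen2, gen3)
```

So a Sandy Bridge board leaves roughly half the CPU-to-GPU bandwidth of a PCIe 3.0 platform on the table, before memory speed even enters the picture.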

 


1 hour ago, Love Zhaoying said:

One of these years, "real soon" then, I'll upgrade! So many other things to sink money into.

I was perusing eBay this morning, and GTX 1060 GPUs in 3 GB and 6 GB versions can be had at decent prices currently. A good way to upgrade your rig for not a lot of dough, and the 1060 is a solid card all around.

