
Which one is better for SL? ATI (AMD) or GeForce (nVidia)


You are about to reply to a thread that has been inactive for 3325 days.

Please take a moment to consider if this thread is worth bumping.

Recommended Posts

If you really want a Forum response to this question (keeping in mind that very few of us are qualified to make a definitive reply), you could probably search on the terms AMD (or Radeon) and Nvidia. You'll find more threads than you can imagine.

In my reading, I've seen at least ten posts praising Nvidia's performance in SL over Radeon's for every post I've seen saying the opposite, so I've just gone with the flow and used Nvidia right from the start. I will offer one piece of real advice: if you do decide to go with Nvidia, do the research and get the best-performing card you can afford. Sometimes that means not buying something from the latest and greatest series.


Although I am by no means an expert, I do know a lot of people have issues with certain AMD cards as they experience weird spikes with mesh.

You may want to google a bit on both cards (I am a very happy Nvidia user, but what do I know!), and read this blog post by Inara Pey on the issues with AMD cards, and then it is all up to you :)

 


There's been a recent rash of mesh rendering problems caused by AMD/ATI driver bugs. AMD support for OpenGL (the graphics architecture under SL) isn't the best and AMD drivers have been problematic in the past. I hear far less grumbling about NVIDIA.

Here's a searchable table of benchmark data for various graphics chips/cards...

http://www.videocardbenchmark.net/gpu_list.php

If you can afford it, I'd recommend something from NVIDIA with at least a 5 in the middle digit, like 650, 660, 750 etc. As Dillon suggests, the best deals are not necessarily in the newest cards. You can sort that table by "Videocard value", where the benchmark score is divided by the price. That will guide you towards the best bang for the buck.
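The "Videocard value" column is just the benchmark score divided by the price. Here is a rough sketch of that calculation and sort - the card names, scores, and prices below are made up for illustration, so check the real numbers on videocardbenchmark.net:

```python
# Hypothetical cards with made-up scores and prices, for illustration only.
cards = [
    {"name": "GTX 650", "score": 1800, "price": 110.0},
    {"name": "GTX 660", "score": 4100, "price": 190.0},
    {"name": "GTX 750", "score": 3200, "price": 120.0},
]

# "Videocard value" = benchmark score divided by price.
for card in cards:
    card["value"] = card["score"] / card["price"]

# Sort best bang-for-buck first.
cards.sort(key=lambda c: c["value"], reverse=True)
for card in cards:
    print(f'{card["name"]}: {card["value"]:.1f} points per dollar')
```

Sorting descending by that ratio is all the site's "Videocard value" sort does; it surfaces cheaper mid-range cards that beat pricier new releases on value.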

Happy hunting!



jonhnnyroleplay wrote:

This argument never dies...Intel-Nvidia vs AMD-Radeon. I only go by facts. Here is the best deal to date (3/13/15) on a great AMD card:

In this price range you cannot get any Nvidia card even close to the performance of this one. Two of my PC service customers have just placed an order for this one along with a power supply upgrade. Intel CPUs (processors) rule, due to the fact that they are faster, use less electricity and have come down in price. On the other hand, IF you are a good overclocker you can make AMD CPUs run as fast, BUT you'll use more electricity & they usually run hotter. It is just the opposite when it comes to who's best in graphics cards. AMD-Radeon cards are still faster, run cooler and cost A LOT less. Intel-Nvidia are gaining ground on them, but have a ways to go.
As for the "myths" about AMD-Radeon driver issues, please check out some of my other posts where I go step by step through Windows AND Ubuntu-Linux AMD-Radeon driver installations. Here is a set of graphics card rules (e.g. 256-bit) to go by:

1- Need a 64 bit operating system (OS)

2- At least 4 gig of OS RAM

3- Your processor (CPU) needs to be at least a 3rd generation (LGA 1155 socket) Intel i3 or higher, or an Intel dual-core Celeron on a tight budget...Intel & AMD quad cores, six cores, etc. work better (Intel is faster)

4- LL/SL does not currently offer a 64-bit viewer, but two third parties do...Singularity & Phoenix Firestorm do, and they work far better than the "official" SL viewer does!

5- A *DECENT* name-brand power supply like the Antec Neo Eco series or Thermaltake, of at least 600 watts with a single +12V rail.

FYI I run an AMD-Radeon 256-bit card on a Linux OS...smooth as glass...in the past I ran AMD-Radeon on Windows (coughs)...still smooth as glass (the card was, lol). If you want Nvidia then go for it, but I always want the most bang for my buck! I'd be cautious of "benchmarks" since they only use one or two total PC combinations of components. If you want a more accurate rating, check out newegg.com & amazon.com (coughs) reviews...see what good and bad is posted by people who are using them. Some of those "benchmark" sites tend to lean toward the company(s) who give them the most free stuff too! Have fun~~~

Well, we got an expert here, right?

 

The 'myths' are not myths, but they happen. The issues with AMD drivers giving crap, with rigged mesh specifically, are known and true. Not with all cards, not with all machines, but they happen. Every day. Not with Nvidia.

You cannot just bluntly state that Firestorm or any other viewer is better than LL's. It is personal AND it depends on your machine and card and all that. For instance, I run a 64-bit machine with an Nvidia card perfectly on the LL Viewer and cannot use Firestorm for longer than 10 minutes before I crash or even blue screen.

Singularity I can do, but I loathe the V1 interface (pie menu, eek, blegh, but: personal!)

 

I would advise the OP to do some research, google and read and make his own decision when he has read it all.


I currently run a new machine with an AMD APU. Right away, I could not see rigged mesh at all. I found a workaround, but in some situations where a lot of rigged mesh is on display (e.g. shopping anywhere), I still get the spiked smear mesh glitch from some of the worn rigged mesh in my viewer window.

I never had any serious issues with Nvidia.


I'll second the nVidia suggestion, though I've not used Windows in well over a decade - take what I say here with a huge grain of salt please.

It's not that AMD doesn't show all the SL effects, it's just that historically, AMD has had problems with their implementation of OpenGL.

The AMD OpenGL drivers exhibit fairly frequent glitches (if you do a search on the forums you'll see what I and others talk about) and consensus is largely that the nVidia drivers are just more stable, at least for SL. They have their own problems mainly with unclean/borked driver upgrades, but OpenGL support doesn't seem to be an issue.

Also, you sometimes will see people claim SL only supports 512MB GPU RAM which is flat out false and has been refuted by LL so many times, I think they're getting sick of the claim. If you can afford a 2GB card, get it. SL _will_ use it.

On the other hand, when it comes to performance vs price, AMD wins hands down. It all depends on your budget and needs as well as your underlying hardware.

Talking about hardware: That's something to watch out for if you're upgrading old hardware. Some older mainboards only support PCIe 2.0 (like the mainboard on my main PC) and sticking a PCIe 3.0 GPU into it won't give you as much of a boost as you might expect. With my GTX 760, running custom graphics with 128m draw distance and full light/shadows I get around 30 FPS, in busy venues 20-some FPS. On the other hand I get about 10 FPS more on a secondary PC with the same CPU/GPU as my main but more modern mainboard with PCIe 3.0 support. Both run 3 large flat panels, making the load fairly comparable.
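For context on the PCIe 2.0 vs 3.0 point: the raw per-lane link bandwidth roughly doubles between the two, per the published spec transfer rates and line encodings. A quick back-of-the-envelope sketch (link bandwidth is rarely the whole story for FPS, so treat this as illustration only):

```python
# Rough per-lane PCIe bandwidth in MB/s, from the spec's transfer rate
# (GT/s) and line-encoding overhead.
def lane_bandwidth_mb_s(gt_per_s, payload_bits, total_bits):
    # GT/s * (payload/total encoding ratio) gives Gb/s; divide by 8 bits
    # per byte for GB/s, then * 1000 for MB/s.
    return gt_per_s * (payload_bits / total_bits) / 8 * 1000

pcie2 = lane_bandwidth_mb_s(5.0, 8, 10)     # PCIe 2.0 uses 8b/10b encoding
pcie3 = lane_bandwidth_mb_s(8.0, 128, 130)  # PCIe 3.0 uses 128b/130b encoding

print(f"PCIe 2.0 x16: {pcie2 * 16 / 1000:.2f} GB/s")   # ~8.00 GB/s
print(f"PCIe 3.0 x16: {pcie3 * 16 / 1000:.2f} GB/s")   # ~15.75 GB/s
```

So a PCIe 3.0 GPU in a 2.0 slot still works, just with roughly half the link bandwidth available, which fits the modest FPS difference described above.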


Nvidia definitely!

To experience SL as it should be played, you really need two Titan Z cards.

In fact, I am saving up for a third so I can model the human brain.

[When I said "SL as it should be played", I meant that having those high power cards would enable you to speedily identify the type of avatar who is going to ask you for sex, even if you are the wrong gender, because they are using Intel graphics and seeing clouds against a Fifty Shades of Grey background, and are culturally ignorant so they can't tell that HotBabe2015 Resident is likely to be a bloke playing a Barbie. The other benefit of such a setup is that you can cook bacon sandwiches on the PC case while playing, and crashing, and playing, and crashing, and . . . ]

[Also, when I said I needed a third one, I probably don't for 99% of SL residents; an EGA card would do to display the average of 42 brain cells that it takes to post - and reply to - a question in General Discussions which should obviously be in a Technical forum.]


That card doesn't have a standard VGA port - how am I supposed to attach my monitor? It's also 10 fricking inches long, uses a PCIe 2.0 interface, requires a minimum of a 550W power supply and is loud as hell! I had one in my wife's PC... returned it and got an Nvidia 650 Ti for $15 more.


AMD is an excellent choice: you pay less money for them, and in return you get more problems from buggy drivers for the money saved in comparison to Nvidia GPUs.
Mesh not showing, torn pixels, weird color issues - all these things have happened and are still happening in SL thanks to AMD's really poor drivers under all operating systems.

J.


I bow down to your "obviously" superior knowledge on all things SL-related and IT in general. Henceforth you shall be known as the SL IT God. LL should hire you post-haste and give you a management position.

Btw, HERE is the link for the card I bought my wife... My bad, it was the 750, not the 650 Ti.



Jenni Darkwatch wrote:   Also, you sometimes will see people claim SL only supports 512MB GPU RAM which is flat out false and has been refuted by LL so many times, I think they're getting sick of the claim. If you can afford a 2GB card, get it. SL _will_ use it.


I was honestly unaware of this. In the graphics preferences, the max VRAM slider tops out at 512MB, and most people will base their buying decisions on this setting. Does the viewer ignore that setting now? I'm interested in reading LL's refutation of this belief if anyone has a link.


For Second Life, I would always choose Nvidia over AMD if you are using Windows.

Nvidia drivers tend to have a LOT fewer problems on SL than AMD drivers.

I don't class myself as an expert on graphics cards & drivers by any means, but as someone who deals with a lot of broken users on a daily basis (I'm a member of the Firestorm support team) I can honestly tell you that AMD drivers have been, still are, and probably always will be a major pain in the backside for Second Life.

Here is a quick rundown of the known problems:

AMD

  • The latest Catalyst 14.12 drivers will not render any rigged mesh unless hardware skinning is disabled.

                JIRA issue: BUG-7653

                More details in these excellent blog posts from Inara Pey:

https://modemworld.wordpress.com/2014/12/09/latest-amd-catalyst14-12-drivers-continue-sl-rigged-mesh-woes/

https://modemworld.wordpress.com/2014/12/13/amd-catalyst-drivers-additional-windows-workaround/

  • Catalyst 14.9 drivers and later have broken projected shadows.

                JIRA issue: BUG-7627

  • Catalyst 14.9 drivers and later have broken selection outlines when the Advanced Lighting Model is enabled.

                JIRA issue: BUG-7947

  • The viewer will crash when editing mesh on some older AMD cards.

                 JIRA issue: VWR-28607

                 This is an old bug, and it is still not fixed. Newer AMD cards will also suffer from this crash if using older drivers. For older AMD cards there is no fix; they will crash even on the latest driver.

  • Some rigged and fitted mesh will stretch out to 0,0,0 on the region on drivers earlier than Catalyst 14.4.

                  JIRA issue: BUG-5156

  • Because of these bugs, if you are using an AMD card you will need to use the Catalyst 14.4 driver - nothing earlier and nothing later. This is always the way with AMD drivers for SL: maybe 1 in every 4 or 5 driver releases is a known "good" driver for Second Life which is safe to update to.

NVIDIA

  • As far as I am aware, there are currently no known rendering bugs which are Nvidia specific.
  • Viewer crashes in the Nvidia driver are lower than AMD and Intel driver crashes.
  • Some users of Nvidia cards have problems with the "OpenGL terror" crash: "The NVIDIA OpenGL driver lost connection with the display driver due to exceeding the Windows Time-Out limit and is unable to continue."

                 Details: https://community.secondlife.com/t5/Second-Life-Viewer/SL-and-the-NVIDIA-OpenGL-error-Terror/td-p/1813597

This seems to mostly affect overclocked cards and can be fixed by not overclocking the card. On older Nvidia cards it can also be caused by having hardware acceleration enabled for Flash while keeping a web browser open displaying Flash content when logged into SL. Disabling hardware acceleration for Flash will fix a lot of cases of this driver timeout.

Intel

  • As far as I'm aware, there are currently no known rendering bugs which are Intel specific.
  • Earlier Intel HD drivers had a severe memory leak which would cause the viewer to crash from out of memory pretty swiftly after logging in, especially after teleporting to a texture-rich region. The last few Intel driver versions do not have this problem. Using the latest available driver for Intel cards is currently recommended.
  • Viewer crashes in Intel drivers are a real problem - really, it's a disgustingly high crash rate compared to Nvidia & AMD.

For the Firestorm viewer:

  • 25% of all viewer crashes are Intel driver crashes.
  • 11% of all viewer crashes are AMD driver crashes.
  • 9% of all viewer crashes are Nvidia driver crashes.



Jenni Darkwatch wrote:

 

Also, you sometimes will see people claim SL only supports 512MB GPU RAM which is flat out false and has been refuted by LL so many times, I think they're getting sick of the claim. If you can afford a 2GB card, get it. SL _will_ use it.


Where did LL refute that claim? Link?

LL made an attempt recently to raise the limit past 512MB but that change had to be backed out due to severe side effects.

This change never made it into a released version of the viewer.

The change was made here: https://bitbucket.org/lindenlab/viewer-tiger/commits/8425f76bbb1de290c9c4956a2e0579d4a05a0112

One of the side effects this change caused was BUG-6207 - Increased texture thrashing problem on viewer-tiger

The change was backed out and the current release of the LL viewer still only allows a max of 512MB for texture memory, no matter how much memory your card has.

Bao Linden explained the reasons for the 512MB cap on SH-2547

This cap is still in place.

 

Bao Linden added a comment - 14/Oct/11 8:16 PM

As Kouta Ruska pointed out, 512MB is the cap. But we do not think it is a bug. Here are the reasons why we set this cap:

 

1, this cap is a soft cap. It tells viewer it is good to make the total amount of textures to follow this cap. But if this cap needs to be overflown to make all necessary textures sharp enough, and viewer has resource to overflow the cap, viewer will do it. Of course viewer will immediately remove unneeded textures to follow the cap if the situation changes.

 

2, vram is used to hold all stuff for GPU. Textures are just part of it. VRAM needs space for VBO, render targets, and other necessary stuff.

 

3, When a texture is created, it has a copy in the vram and a copy in the main memory. Normally when vram can not hold all textures for rendering, drivers should do texture swapping between the main memory and vram. This is why we reserve 1.5 times of texture memory cap in the main memory. So say if the cap is 512MB, 768MB is reserved in the main memory. SL is built with 32-bit system. So the max address space it has is 2GB, which includes program space and heap space. Considering the fragmentation issue, the actual max memory (or heap size) for SL is no more than 1.6~1.7 GB. Among that, 768MB is a large enough portion to be reserved for textures only.

 

4, 512 MB cap is also sufficient for performance. Texture itself is hardly the reason causing FPS drop unless GPU is busy in doing a lot of texture swapping while rendering. This is not the case when we have 512MB texture memory. If you open the texture console by "ctrl+shift+3", and watch the total amount of textures to be bound at an instant, it is very rare that that number is more than 512MB.

Instead of increasing the 512MB cap, we unfortunately might have to do the opposite changes: to shrink this cap, for certain cards. For instance, some ATI cards are not good at texture swapping. So textures eat up all VRAM quickly in some regions which causes FPS to crawl, and eventually crashes SL.

 

Hopefully this cap can be raised some day without causing bad side effects. It would help greatly with the much hated Texture Thrashing problem.
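For anyone checking Bao's arithmetic in point 3, the numbers work out like this (a quick illustrative sketch, not actual viewer code):

```python
# Back-of-the-envelope check of Bao Linden's 32-bit memory math
# (illustration only, not viewer code).
texture_cap_mb = 512
reserved_main_memory_mb = texture_cap_mb * 1.5  # the viewer reserves 1.5x the cap

# A 32-bit process has a 2 GB address space; after program space and
# fragmentation, roughly 1.6-1.7 GB of heap is actually usable.
usable_heap_mb = 1650

fraction = reserved_main_memory_mb / usable_heap_mb
print(f"Texture reservation: {reserved_main_memory_mb:.0f} MB "
      f"(~{fraction:.0%} of the ~{usable_heap_mb} MB usable heap)")
```

So the 768 MB texture reservation already consumes nearly half of the usable 32-bit heap, which is why raising the 512MB cap is not a free win.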


After some searching and reading, I found this workaround for the latest Catalyst OpenGL bug. It involves copying some older DLLs into your viewer folder, which seems to have the effect of rolling back your driver without actually having to do that. It works for me. I still see some rigged mesh periodically stretched to region 0,0,0 when a lot of avatars wearing it are in viewing range, but I can see it now.


I would like to think that the new AMD cards are worth their money, but on overclock.net one of the main complaints about AMD cards is their heat generation and power consumption.

I can say that for the value, AMD cards rock. SL, however, is a pain in the ass. Most SL users are not avid PC enthusiasts and need things to just work.

I would not suggest a video card that requires a user to go through so many hoops to make sure it works. That's just me.



Drake1 Nightfire wrote:

the SL IT God.

 

I have nothing constructive to add to this thread; I simply wanted to say this nickname has me cracking up, probably because I saw no space the first time I read it :D

I also think it's quite fitting, but I suppose that's neither here nor there.



Whirly Fizzle wrote:


Jenni Darkwatch wrote:

 

Also, you sometimes will see people claim SL only supports 512MB GPU RAM which is flat out false and has been refuted by LL so many times, I think they're getting sick of the claim. If you can afford a 2GB card, get it. SL _will_ use it.


Where did LL refute that claim? Link?

[good stuff elided]

 
Bao Linden added a comment - 14/Oct/11 8:16 PM

As Kouta Ruska pointed out, 512MB is the cap. But we do not think it is a bug. Here are the reasons why we set this cap:

 

1, this cap is a soft cap. It tells viewer it is good to make the total amount of textures to follow this cap. But if this cap needs to be overflown to make all necessary textures sharp enough, and viewer has resource to overflow the cap, viewer will do it. Of course viewer will immediately remove unneeded textures to follow the cap if the situation changes.

 

2, vram is used to hold all stuff for GPU. Textures are just part of it. VRAM needs space for VBO, render targets, and other necessary stuff. [...]

 

Hey, Whirly, I think the resolution is right there in Bao's point #2, isn't it? The 512MB texture cap is still in place, but a graphics card can make productive use of much more than 512MB for things other than SL textures -- including other uses within an SL viewer.


