
'Radeon HD7970' or 'Nvidia GTX 680'?

Ulukai Voom


hello second life users


i want to play second life in ultra high graphic detail with shadows and all the other settings turned up, and i will buy a graphics card this week. please reply fast ^_^ before x-mas

does the "Radeon HD7970" (faster than the GTX 680) work that well with second life? or must i buy a "Nvidia GTX 680"?

does somebody have that graphics card? or some benchmark HD7970 vs GTX680 @ second life?

what's with the claim that OpenGL is not supported by radeon? it is supported O_o -> http://www.toppreise.ch/prod_254832.html

or are the drivers extremely bad? :-(


is that true? where is the chart from? -> http://community.secondlife.com/t5/Second-Life-Viewer/What-video-card-do-I-need-for-SL/m-p/1546603/highlight/true#M14539

in WoW it's not so good :-( would it look the same in second life? -> http://www.tomshardware.com/reviews/radeon-hd-7970-ghz-edition-review-benchmark,3232-12.html


it's in german but just look at the chart -> http://ht4u.net/reviews/2011/amd_radeon_hd_7900_southern_island_test/index25.php



thank you very much!



See this thread http://community.secondlife.com/t5/Second-Life-Viewer/Which-video-card-is-best-for-SL-AMD-Radeon-or-nVidia-GeForce/td-p/1419993/highlight/false/page/10

Also, based on my experience with a 7970 and a GTX 680, i suggest you go NVIDIA.

It's more a driver issue on AMD's side than a performance one.

Performance-wise both cards felt almost the same to me, though you can feel that they render differently.


You asked for advice about what video card to get to run SL.  I really don't think it's possible to get any recommendations from anyone here that are not based on emotion.  I'm an nVidia user and my emotions naturally run with nVidia (but I'm also very well aware that my bias is entirely based on my personal preference, developed over the years by experience with both AMD/ATI and nVidia).  It's identical to my Intel preference over AMD.......it's all emotion.

With that out of the way, the closest I can "recommend" is to go with what you want.........both chipsets are more than capable of running SL at the highest settings with power to spare.  There is a point where more GPU is simply wasted on SL.  You can have the best GPU ever made, placed in the most powerful computer ever built, connected directly to the Linden Lab backbone, and still not have the graphics quality of most of the well known online games.  The reason is simple.

Those well known, popular games have 100% of the graphics that make up their scenes professionally made.  Those graphics are tested for rendering speed (and, I'm sure, many other rendering demands of game play).  The graphics are downloaded to and stored on the users' machines for quick retrieval when the game calls for them.  And most of the "action" that occurs on your monitor is locally housed (such as scripts).  Your computer has to deal with all of that during play, and everything except positional data is right there on the hard drive waiting to be called into action by the program.

SL has almost none of that..........it's all housed remotely on the LL servers (for most people that's thousands of miles away).  The Internet, and every server along the long path from your computer to the LL servers and back, is going to slow that access down.  Your GPU can render everything almost instantly..........and you'll still see some lag due to how long it took for everything to reach the GPU to be rendered on your screen.  If there's a tiny hiccup along that path (to the servers and back to you) you'll have a little more lag to contend with.  That lag is not due to the GPU falling down.......but it's going to be there just the same for you to see.  The content that makes up SL is millions of times more than the content that makes up the most advanced online game (way more than any home computer could ever store).

Every time your viewer calls for any content that makes up SL, the asset servers have to be searched (terabytes of data have to be searched for even the smallest of objects/items).........that takes time.  That time is seen on your end as lag.  And then there's professionally made content vs user made content.  I'm going to guess that 95% of the content making up SL is user made (but my gut says it's more like 99%+...........I'm being conservative).  User made content is amateur made content.  It's not efficient like professionally made content.  A texture that is twice as large as it needs to be, or is 32 bit color depth when it needs to be 24 bit, will take much longer for even the best of the best GPUs to render........it will show up as lag on your screen.  Most of the textures in SL are much too large and contain unnecessary alpha channels.  And then there's the content that is not even graphics related.........such as scripts.  Your connection and your CPU have to deal with everything.  The GPU will deal with the graphics, but that poorly written script has to be processed, which takes time........and before the graphics can be rendered the script associated with them has to be processed (that often shows up as lag).
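The texture point above is easy to put numbers on. A rough back-of-the-envelope sketch (the sizes and the helper function here are illustrative, not anything from an actual viewer):

```python
# Uncompressed size of a texture in VRAM: width * height * bytes per pixel.
# 24 bit = 3 bytes (RGB); 32 bit = 4 bytes (RGBA, i.e. with an alpha channel).
def texture_bytes(width, height, bits_per_pixel):
    return width * height * (bits_per_pixel // 8)

# A 512x512 RGB texture:
small_rgb = texture_bytes(512, 512, 24)    # 786,432 bytes (~0.75 MiB)

# The same image uploaded at 1024x1024 with an unneeded alpha channel:
big_rgba = texture_bytes(1024, 1024, 32)   # 4,194,304 bytes (4 MiB)

# Doubling each dimension quadruples the pixel count, and the extra
# alpha channel adds another third on top of that.
print(big_rgba / small_rgb)  # ~5.3x the memory for the same picture
```

So one carelessly uploaded texture can cost several times the memory and bandwidth of a properly sized one, multiplied across every object in the scene.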

Add up all the factors that are beyond your (or anyone's) control and you wind up at a point past which anything more is wasted on SL.  You simply do not need the best of the best GPUs for Second Life.............you need a good one, but not the best available.  You need to decide what your bias is.  What you are willing to pay (the nVidia cards are slightly more expensive).  How the drivers are installed (nVidia drivers are a little easier to install).  What the power supply demands are for each card (nVidia typically wants more power).  What the physical dimensions of the card are, for placement in your computer case (that varies with each chipset vendor's manufacturing designs).  If you truly have no bias (probably not possible, in my opinion) then maybe my post will help.......otherwise you can take it as just another person's opinion.

One more thing and I'm done.  If you use your computer for other graphics programs (such as video games, image/video editing, etc.) then check the benchmarks for those programs.........take the card that best meets your needs.  Second Life will take care of itself as long as you stick to a high end GPU (from either chipset maker).


I see more problems with AMD cards on these forums than with NVidia cards. That's about all I can say about that.

I agree with Peggy about the overkill. I have a GTX 670, which is nearly as fast as the 680, and if the circumstances in SL are bad (and they often are, because of the sheer amount of data to process) I don't get more than 20 fps on the highest settings. The load on the GPU is very low in those cases, maybe 25%.

If the circumstances are right, a well designed sim and not too many overloaded avatars around, I have seen fps rates of over 200 on ultra settings. That is just as ridiculous. In fact I have capped my fps at 60 so the GPU doesn't heat up too much.
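Capping fps works because it gives every frame a fixed time budget and lets the GPU idle for whatever is left over. A minimal sketch of that arithmetic (the function name and parameters are made up for illustration; real limiters live in the driver or viewer):

```python
# At a 60 fps cap, each frame's budget is 1000 ms / 60 ~= 16.7 ms.
def sleep_needed_ms(frame_time_ms, fps_cap=60):
    """How long the GPU can idle after a frame without dropping below the cap."""
    budget_ms = 1000.0 / fps_cap
    return max(0.0, budget_ms - frame_time_ms)

# A frame rendered in 5 ms leaves ~11.7 ms of idle time per frame,
# which is exactly why a cap keeps the card cooler:
print(round(sleep_needed_ms(5.0), 1))   # 11.7

# A 20 ms frame is already over budget, so no idle time at all:
print(sleep_needed_ms(20.0))            # 0.0
```

An uncapped 200 fps scene means the GPU is running flat out for no visible benefit; the cap converts that wasted work into idle time and lower temperatures.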

Anyway, at a certain point the GPU is no longer the bottleneck and anything faster will be useless. Make sure you have a serious CPU and a solid internet connection. Those will probably do you more good than the fanciest graphics card you can find.



thanks, the thread helped. it looks like radeon can't make professional drivers. so i will buy that chinese nvidia card:


@Peggy Paperdoll
thanks for your long, emotional post ^_^
going by my feelings i like the radeons more, but i have no choice.
radeon or second life

@Kwakkelde Kwak
thanks, i thought only about the raw power, not about the special architecture of the cyberspace and the internet connection.
but i dreamed for 10 years about such a graphics card and now i can buy it ^_^ and it will also last into the future


I agree on the part that either card is overkill for SL. SL itself may also be the cause of the problems with AMD cards, but who knows; back in the day i had a 6970 which drew funny looking stars into the SL sky.

Apart from that i was able to get into sub-10-fps territory with every high end card.

Overkill or not - SL is just badly optimized in general.

4 GB is quite a lot, though, and it's one of my reasons to switch back to AMD... SL uses around 1 GB to 1.3 GB of video memory.

But that is with only one viewer running. I don't know what kind of setup you want to run, but i saw my 680 struggling with more than 1 viewer, whereas the AMD 7970 handles 4 viewers simultaneously just fine ... :l


My nVidia 580 GTX w/ 3 GB VRAM has so far handled 4 viewers (native 64 bit linux viewers) simultaneously on ultra with 40-60 fps in each viewer, so something else on your system seems to be the problem.

Here's my complete system spec:

AMD FX 8150 octa-core processor

16 GB DDR3 RAM

nVidia 580 GTX 3 GB VRAM

OS: Debian Sid 64 bit

nVidia driver 310.19

OpenGL 4.2



Video memory usage gets capped at 512 MB in all viewers.....


Sure thing, go to Misty Mountains and say this again ...

Also, did you hear of MSI Afterburner? It is capable of showing the memory usage of your graphics card.

You should toy around with it. Oh right, you're on linux, where OpenGL is the standard (and thus maybe better supported) and you probably don't have a tool to read out the memory usage. This might help: http://www.matrix44.net/blog/?p=876
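For what it's worth, on Linux with the proprietary NVIDIA driver the `nvidia-smi` tool can report VRAM use directly. A small sketch, assuming its usual one-line-per-GPU CSV output (the wrapper and parser function names are my own):

```python
# nvidia-smi --query-gpu=memory.used --format=csv,noheader
# prints one line per GPU, e.g. "1234 MiB". A tiny parser for that:
import subprocess

def parse_memory_used(smi_output):
    """Turn nvidia-smi CSV output into a list of MiB values, one per GPU."""
    return [int(line.split()[0]) for line in smi_output.strip().splitlines()]

def gpu_memory_used_mib():
    """Query the driver; requires nvidia-smi on PATH."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader"],
        capture_output=True, text=True, check=True).stdout
    return parse_memory_used(out)

# Parsing a sample of the expected output format:
print(parse_memory_used("1234 MiB\n"))  # [1234]
```

Polling that in a loop while running one, two, three viewers would settle the memory argument with actual numbers instead of guesses.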

Anyway, the 680 is sold and i've already advised buying Nvidia. I honestly don't get your off-topicness.


Video memory usage is not capped in all viewers: 


What you are referring to is the Texture Memory Buffer... exactly: TEXTUREs. Guess what, a 3D world is not only textures; you also need geometry, shadows, lighting and probably some other stuff.

That's also inside the GPU memory.

At the time of posting it's well over 2 gig for 3 viewers. I'm quite surprised it's "only" 2 gig, as 1 viewer normally draws 1.2 gig. But that's probably a 32 bit limitation of the Firestorm viewer.


If you would be so kind, please send me proof of your 4 viewers @ 60 fps via personal message. Thanks. I might go Linux if this joke turns out to be true.


Enough of the off-topicness. As i said: Go NVIDIA. No problems there.


Chibbchichi wrote:


At the time of posting it's well over 2 gig for 3 viewers. I'm quite surprised it's "only" 2 gig, as 1 viewer normally draws 1.2 gig. But that's probably a 32 bit limitation of the Firestorm viewer.

My LL viewer doesn't use more than 700-something MB per viewer. I guess that's the 512 plus the things you mentioned. I have never seen it at 1.2 gig or even close.

Also, don't forget all inactive viewers get their fps cut to 15 or so. No idea how this affects memory usage, but I suspect it does have an effect.

The 2 GB is a 32 bit limitation for your RAM usage; is that really also the case for the video card? I can't check, since all the other memory intensive programs I run are 64 bit.
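The 2 GB figure the thread keeps circling is simple address-space arithmetic. A sketch, assuming the default Windows split where the kernel reserves half of a 32 bit process's address space (the function is illustrative):

```python
# A 32 bit process can address at most 2**32 bytes (4 GiB), and by
# default Windows reserves half of that for the kernel, leaving 2 GiB
# of user space no matter how much RAM or VRAM the machine has.
GIB = 2**30

def user_space_gib(pointer_bits=32, kernel_fraction=0.5):
    """Usable address space for one process, in GiB."""
    return (2**pointer_bits) * (1 - kernel_fraction) / GIB

print(user_space_gib())    # 2.0 -> the wall a 32 bit viewer hits
print(user_space_gib(64))  # astronomically large; no practical limit
```

So the cap applies to whatever the viewer process maps into its own address space; a 64 bit build of the same viewer would not hit it.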


I don't know if I want to get into this discussion since it's pretty much pointless.  But I believe you (Kwakkelde Kwak.....geeze, what an odd name :) ) are correct......or more correct anyway.  The last time I ran concurrent avatars I saw no decrease in FPS for either........they were almost identical, depending on the sight line each avatar was viewing.  If I had only one avatar logged in and was getting 40 FPS, when I logged in the second avatar my FPS was still 40.....on both avatars.  It's too confusing for me to run more than one avatar at a time, so I don't do it any more (I'm not trying to go for the bragging rights that some people think are so important.......if it works well and there is no apparent lag then I don't care what the FPS is).

But here's my objection to all this fantastic FPS, GPU RAM usage and even system RAM usage.  If you are running multiple avatars on a single computer then you can have only one active window.  How the heck can anyone measure anything (FPS, video RAM, system RAM, or anything else) on an inactive window (or a window minimized to the task bar)?  As soon as you click on a window to make it active, that is the window you are making measurements on....the window you just left is now inactive and nothing is being measured on it any longer.  40 FPS on all four concurrent log-ins?  Yeah, I believe that, since that's exactly what I observed years ago........but you are not measuring the FPS for all four windows at the same time (you are measuring only one).

Someone's blowing smoke.  And it's typical for users of a certain operating system.  I don't know why that is, but it's very common.  There are three major operating systems in use in today's computers.  Each has its strengths, each has its weaknesses..........the only way one might be better than another is that a particular OS better suits one's computer needs.


Peggy Paperdoll wrote:

As soon as you click on the window to make it active then that is the window that you  are making measurements on....that window you just left is now inactive and there is nothing being measured on that window any longer.

If I have two windows side by side, both SL viewers, and have the statistics open on both (Ctrl Alt 1), I see my "normal" fps on the active window and 15 or so fps on the inactive window. I can also really see the fps going down, with my eyes I mean. It's not a lack of resources; plenty of those are available for a third, and when I open that third, fps on the active window stays the same and I get 15 fps on both inactive viewers.

I think what was meant by running 4 viewers at 40-60 fps is that any active window gets those fps. I have a very hard time believing even a 680 is capable of running four viewers all at that fps simultaneously. That would have to be one empty sim.
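The throttling described above makes the total load easy to estimate. A rough sketch, taking the thread's figures (60 fps active, inactive viewers cut to ~15 fps) as assumptions:

```python
# Total frames per second the GPU renders across all open viewers,
# assuming one active window at full fps and the rest throttled.
def total_fps(viewers, active_fps=60, inactive_fps=15):
    return active_fps + (viewers - 1) * inactive_fps

# Four viewers with throttling: 60 + 3 * 15 = 105 frames per second.
print(total_fps(4))                       # 105
# Four viewers all genuinely at 60 fps would mean 240 frames per second:
print(total_fps(4, inactive_fps=60))      # 240
```

That is the gap between the two claims: 105 frames per second total is plausible for a high end card, while 240 full-detail frames per second across four independent scenes is a much taller order.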

I'm running windows btw, 7 pro SP1




This topic is now archived and is closed to further replies.
