bessi222

Which video card is best for SL, AMD Radeon or nVidia GeForce?


Do we know if a 680 runs SL yet? Someone told me they have a 7970 and are getting a blank screen.

GTX 550 Ti: $119

http://ncix.com/products/?sku=65800&vpn=N550GTX-Ti%20M2D1GD5%2FOC&manufacture=MSI%2FMicroStar&promoid=1201  

The GTX 580 is under $400 Canadian, but in the $500-600 range on the US side of the same company, NCIX. There are three brands of GTX 680 on the Canadian site for $510, but they're out of stock at the moment, and taxes add another 11%.

NCIX does have a US site, but the product selection is different.

http://ncix.com/products/?sku=56629&vpn=VCGGTX580XPB&manufacture=PNY%20Technologies%20Inc&promoid=1387

http://ncix.com/products/?sku=58688&vpn=GV-N580UD-15I&manufacture=Gigabyte&promoid=1101

http://ncix.com/products/?sku=56739&vpn=ZT-50101-10P%2FZT-50105-10P&manufacture=Zotac


Well, if SL cannot run on the new cards at all, and LL has no plans to change their code to follow where the hardware is going, then it looks like LL is letting SL wither away toward closure. LL has gone in new directions unrelated to SL. I don't think I will be investing in SL anymore, and a lot of others appear to be under the same impression. LL has gone silent again, too. So LL had better stand up and state whether or not they will support anything beyond a 580. That will be the signal as to whether LL is done with SL.

 

Let me put it another way. If you were a graphics client developer and your company was not going to advance, why would you stay while your value eroded away, as though you had been shunted into a dead-end job for constructive dismissal?



Ormand Lionheart wrote:

Do we know if a 680 runs SL yet? Someone told me they have a 7970 and are getting a blank screen. I know the 680 does not have "shaders" anymore; is that going to be an issue?
...

Where did you read that the 680 has no shaders? From what I read, it has twice as many. Some OEM 680 spec sheets do show the shader clock as N/A, but plenty of others list a shader clock frequency.

The rub is that comparing GTX 680 "shaders" to AMD's (and to traditional shaders) is comparing apples to oranges. This review is interesting: http://www.theinquirer.net/inquirer/review/2162193/nvidias-gtx680-thrashed-amds-mid-range-radeon-hd-7870-gpu-compute?WT.rss_f=Home&WT.rss_a=Nvidia%27s+GTX680+gets+thrashed+by+AMD%27s+mid-range+Radeon+HD+7870+in+GPU+compute

Hmmm, does that mean applications will have to be rewritten just to make a 680 go fast?

The main thing is I'll just wait and see what people with new 680s say about how SL runs on them. Then again, 500-series prices should be dropping soon. If the 580 dropped to $300, hey, that would be good.

 

Also this tidbit is rather important:

*GeForce GTX 680 supports PCI Express 3.0. The Intel X79/SNB-E PCI Express 2.0 platform is only currently supported up to 5GT/s (PCIE 2.0) bus speeds even though some motherboard manufacturers have enabled higher 8GT/s speeds.

So your motherboard needs PCI Express 3.0 for optimal performance. Otherwise it's like hitching a racehorse to a fully loaded Budweiser beer wagon. No 680 for me, not until I decide to replace the entire system. As a matter of fact, I think I'll go the full-system-upgrade route instead. Why piecemeal when what I have is already too old?
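For reference, the raw per-lane numbers behind that warning can be worked out from the PCIe specs (PCIe 2.0 signals at 5 GT/s with 8b/10b encoding, PCIe 3.0 at 8 GT/s with the leaner 128b/130b encoding). This little Python sketch is mine, not from the thread:

```python
# Rough per-lane bandwidth comparison for PCIe 2.0 vs 3.0.
# Transfer rates and encoding overheads are from the PCIe specs;
# the x16 totals are theoretical maxima, not measured throughput.

def lane_bandwidth_gbs(transfer_rate_gt, encoding_efficiency):
    """Usable bandwidth per lane in gigabytes/second."""
    # transfer rate (GT/s) * encoding efficiency / 8 bits per byte
    return transfer_rate_gt * encoding_efficiency / 8

pcie2 = lane_bandwidth_gbs(5.0, 8 / 10)      # 8b/10b encoding
pcie3 = lane_bandwidth_gbs(8.0, 128 / 130)   # 128b/130b encoding

print(f"PCIe 2.0 x16: {pcie2 * 16:.1f} GB/s")  # -> 8.0 GB/s
print(f"PCIe 3.0 x16: {pcie3 * 16:.1f} GB/s")  # -> 15.8 GB/s
```

So a 3.0 slot roughly doubles the usable bus bandwidth, which is where the "race horse on a 2.0 board" worry comes from; whether SL ever saturates the bus is another question.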


You are right; it's apparently triple the number of shaders the GTX 580 has, but slower, since they run at the GPU clock speed. It has 1536 shaders compared to the 336 on my GTX 460.

If I remember right, overclocking the shader clocks with MSI Afterburner on a 460 had no effect on fps, so the slowdown may be a non-issue. Have you seen the demo video? Supposedly it was originally produced with three 580s, and a single GTX 680 can do the same.
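A quick arithmetic check of those shader counts (I'm using 512 cores for the GTX 580, which is its published spec; the 1536 and 336 figures are from the posts above):

```python
# Shader (CUDA core) counts mentioned in the thread.
gtx680, gtx580, gtx460 = 1536, 512, 336

print(gtx680 / gtx580)           # -> 3.0, i.e. "triple the 580"
print(f"{gtx680 / gtx460:.2f}")  # -> 4.57x a GTX 460
```

The counts do line up with "triple the 580", though as noted, the 680's shaders run at the core clock rather than a doubled shader clock, so the raw count overstates the gap.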

 


And SLI is working for you in SL and actually increased performance? If so, how? I have been trying for a year to get SLI to improve performance. I can get it working, but the fps plummets dramatically.


I have been using Nvidia ever since I had nothing but problems with ATI several years ago, about 7 years ago I'd say, lol. I have been a solid Nvidia user since.

I recently got an AMD FX-6100 CPU, which is optimized for AMD Radeon video cards... still a tough decision, changing to an AMD video card, lol. And my motherboard runs the 990FX chipset: http://www.amd.com/us/products/desktop/chipsets/9-series-integrated/Pages/amd-990fx-chipset.aspx

My question is: ATI didn't used to be AMD. Did AMD buy ATI out, or did they just put their chip in the Radeon?

Either way, I have always been an AMD user; never had an Intel chip in my life. If AMD has made some good improvements to the Radeon, then it may be worth the change for me... It's still a hard sell, lol, but as my whole setup is optimized for it, it may be the best way to go for me. I have always been one to keep my hardware matched by manufacturer; my last motherboard and video card were Nvidia, a matching set. I think a lot of residents run into problems mixing and matching.

 

$700 USD is very pricey for an AMD Radeon HD 6990, lol. I could rebuild my whole PC for that, lol.

 


Do multi-GPU cards benefit Second Life? SLI does not appear to, in my experience (though apparently it used to). Does using just one GPU of a dual-GPU card perform better than running it in dual-GPU mode?


I only use one GPU, so I could not answer that.

But with a multi-core CPU I can run anything I want without it lagging out SL. If I'm running SL and need to log in more than one viewer to do some work, I experience no lag or slowdowns.

Multiple CPU cores have their advantages, that's for sure, and a noticeable increase in FPS.

My viewer never crashes anymore, lol.

 

There used to be an advanced debug setting in the older viewer to use multiple threads. I think that is on by default now in Viewer 3, but you can still find the setting in older viewers.

 

Before, I would have to close SL just to run Photoshop. Now I can run two viewers, Photoshop, Blender, a web server, an SL group-joining bot, and much more at the same time without any lag, lol.

 

I only use one GPU, though. I'm sure two would help, but I'm not sure how well SL utilizes SLI/CrossFire technology.

It would probably be beneficial in a scenario like mine, using SL and Photoshop and other rendering programs at the same time.



Dilbert Dilweg wrote:

I only use one GPU, so I could not answer that.

But with a multi-core CPU I can run anything I want without it lagging out SL. If I'm running SL and need to log in more than one viewer to do some work, I experience no lag or slowdowns.

Multiple CPU cores have their advantages, that's for sure, and a noticeable increase in FPS.

My viewer never crashes anymore, lol.

There used to be an advanced debug setting in the older viewer to use multiple threads. I think that is on by default now in Viewer 3, but you can still find the setting in older viewers.

Before, I would have to close SL just to run Photoshop. Now I can run two viewers, Photoshop, Blender, a web server, an SL group-joining bot, and much more at the same time without any lag, lol.

I only use one GPU, though. I'm sure two would help, but I'm not sure how well SL utilizes SLI/CrossFire technology.

It would probably be beneficial in a scenario like mine, using SL and Photoshop and other rendering programs at the same time.

RunMultipleThreads is actually off by default in viewers now, not on. In a brief test I didn't notice any difference with my 4 cores, off or on, so I don't know how useful that setting is anymore. I may do a more extensive test sometime. There were a few threads on this issue from autumn 2010 on, when LL took the option out of the Advanced menu in v2.2.0.
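For what it's worth, the general idea behind a "run multiple threads" toggle can be sketched generically. This Python illustration is mine and has nothing to do with the viewer's actual implementation (and in CPython the GIL limits CPU-bound speedup, so treat it purely as shape-of-the-idea; in a C++ client like the viewer the worker threads genuinely run in parallel):

```python
# Generic sketch: independent work items (think texture decodes) are
# handed to a pool of worker threads instead of being processed one by
# one on the render thread.  NOT the viewer's code, just the pattern.
from concurrent.futures import ThreadPoolExecutor

def decode_stub(texture_id):
    # Stand-in for decoding one texture.
    return texture_id * 2

texture_ids = list(range(8))

with ThreadPoolExecutor(max_workers=4) as pool:
    decoded = list(pool.map(decode_stub, texture_ids))

print(decoded)  # -> [0, 2, 4, 6, 8, 10, 12, 14]
```

With only one core available the pool degenerates to serial execution, which may be part of why some people see no difference from the setting.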


To see the effect of multiple cores, I think you have to enable it and then restart the viewer. I believe it helps me, but that could be down to my specific CPU. Not all CPUs use hyperthreading, so maybe that explains why some people benefit and others don't. Enabling VBO for Nvidia cards is a definite benefit, maybe up to 30-40%. Window size matters too, especially in shadow mode. For me, installing the viewer and the SL cache on a RAM disk is a benefit.

 

SLI has not been of any benefit for me. With two GTX 460s the fps plummets to 20-30 from 200+ with one card. I have tried everything suggested on the web, and it actually seems to work; it's just that the benefit is VERY POOR.

 

The GTX 670 recently came out for $400. Benchmarks indicate it may be up to par with a GTX 680 (about $500-530) and about 30% better than a GTX 580 (which is still expensive). A GTX 580 gives about 50% better performance than my GTX 460 in SL, as I compared with another avatar, especially in shadow mode, which is where faster cards pay off. Of course this is not necessary for most people, but I produce machinima, so I need maximum performance to broaden my film ideas.

Share this post


Link to post
Share on other sites


Ormand Lionheart wrote:

To see the effect of multiple cores, I think you have to enable it and then restart the viewer. I believe it helps me, but that could be down to my specific CPU. Not all CPUs use hyperthreading, so maybe that explains why some people benefit and others don't. Enabling VBO for Nvidia cards is a definite benefit, maybe up to 30-40%. Window size matters too, especially in shadow mode. For me, installing the viewer and the SL cache on a RAM disk is a benefit.

SLI has not been of any benefit for me. With two GTX 460s the fps plummets to 20-30 from 200+ with one card. I have tried everything suggested on the web, and it actually seems to work; it's just that the benefit is VERY POOR.

The GTX 670 recently came out for $400. Benchmarks indicate it may be up to par with a GTX 680 (about $500-530) and about 30% better than a GTX 580 (which is still expensive). A GTX 580 gives about 50% better performance than my GTX 460 in SL, as I compared with another avatar, especially in shadow mode, which is where faster cards pay off. Of course this is not necessary for most people, but I produce machinima, so I need maximum performance to broaden my film ideas.

I have a highly overclocked, watercooled quad-core Intel i5 750 (derived from the i7, not the normal i5), which was a very good gaming chip when released, but it does not use hyperthreading, so going by what you say, that would explain why I don't see much, if any, difference.


It's old news that Nvidia is better at OpenGL than ATI. In fact, Nvidia's new drivers have crippled the 400 and 500 series so much in OpenGL that the much older 8000 and 200 series consistently outperform them. The 400 and 500 series' OpenGL performance is more like the old 6000 series now. This is done on purpose so Nvidia can sell more professional-line cards, which have basically the same hardware but drivers that are "optimized" for OpenGL, i.e., not crippled.


There is free OpenGL benchmark software out there to download; just Google it. No, you cannot use the Quadro drivers with consumer-grade cards. Yes, there are supposedly driver hacks, but I haven't found any that work. Also, whatever you do, don't get the new Nvidia 600 series. They are finally better than AMD/ATI cards in DirectX games, but they are capped in OpenGL performance even more than the 400s and 500s. Even the 400s are better for OpenGL than the 600s, and the 400s have been known to have OpenGL performance issues. I might have to go with ATI from now on just for SL, even though ATI cards are worse than Nvidia cards at tasks like video conversion.


I did a test with my i7 930 chip. With hyperthreading enabled I get the same fps as when I disable it. However, CPU usage goes up to 50% from 20%, so if I were running another program like FRAPS I might have issues as it maxes out. I have heard it said the i5 is great for games; I guess the one shortfall would be multitasking, although these chips are so fast these days that even that's not a HUGE issue. I know I can have Firefox and two viewers running with no issue as far as the PC goes. The GPU is a different matter, as I have to reduce viewer settings considerably for two viewers to run without lag. I have a GTX 460, so an upgrade may be a huge benefit in that regard.


At Linden Realms I get 250 fps on HIGH with no AA or AF, and close to 100 fps in shadow mode at 1080p, with an EVGA GTX 460 768 MB and an i7 930. I know someone who gets about 50% more in shadow mode with a GTX 580.

Then I know a guy with a 580 getting 20-40 fps maximum, down to 6 fps with shadows enabled. He plays DX11 games no problem, but SL is terrible. We checked his GPU usage in GPU-Z and it was a lot lower than mine. Could different brands of the same GPU be optimized differently? I have used FurMark to benchmark OpenGL, but I'm not sure what counts as good or bad, and I've heard the issue can be LL's use of OpenGL compared to Nvidia's?

I was going to upgrade to at least a 570 or 580, but when the 670 came out I decided to go that route instead. In theory it's double the performance of my card, but I only play SL, so I'm waiting to see if anyone reports better performance.


Does this test result indicate a 680 is twice as fast as my 460 in OpenGL?

- EVGA GeForce GTX 680 (GPU @ 1005 MHz, mem @ 3004 MHz, max GPU temp: 74°C), TPC: 101W/312W, GPU power: 115% TDP, R301.24 beta (branch r301_07-12, Win 7 64-bit): 2699 points (44 FPS)


The 580 should be significantly faster than your 460 in both DirectX and OpenGL. Now that the 600s are out, Nvidia's drivers will only hold the new 600s back in OpenGL, so the 580 and 590 will probably be the fastest cards around right now for SL.


Does a 590 actually improve SL performance? I thought it's treated as SLI. Or is it treated differently from two 580s in SLI, since a 590 uses one slot? I know SLI with two cards is not an option, since my 460s were terrible in SL in any configuration. I know a 580 is at minimum about 50% better than my current performance in shadow mode, but they haven't come down much in price since the release of the 680. The 670 looks like the better option. Maybe drivers will catch up?


I found a Unigine test of the 680 in OpenGL, then ran the same test myself on my 460. The results suggest about a 2.5x benefit, though it may depend on how the SL viewer handles OpenGL, as I've heard they are not one and the same?

Settings: 1920×1080 fullscreen, tessellation: normal, shaders: high, AA: 4X MSAA, 16X anisotropic filtering.

- EVGA GeForce GTX 680, R301.24 (Win 7 64-bit)
- 1661 points, 65.9 FPS

http://www.geeks3d.com/20120506/intel-ivy-bridge-hd-graphics-4000-gpu-opengl-4-tessellation-tested/

 

My test results: GTX 460

Settings: 1920×1080 fullscreen, tessellation: normal, shaders: high, AA: 4X MSAA, 16X anisotropic filtering.

- EVGA GeForce GTX 460, 8.17.12.9610 768M (Win 7 64-bit)
- 647 points, 25.7 FPS
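Taking the two Unigine results above at face value, the ratios are easy to check (a quick sketch of the arithmetic, not a statement about SL performance):

```python
# Ratios from the two Unigine results posted above
# (GTX 680 on R301.24 vs. GTX 460 768M, same settings).
gtx680_points, gtx460_points = 1661, 647
gtx680_fps, gtx460_fps = 65.9, 25.7

print(f"points ratio: {gtx680_points / gtx460_points:.2f}x")  # -> 2.57x
print(f"fps ratio:    {gtx680_fps / gtx460_fps:.2f}x")        # -> 2.56x
```

Both metrics agree at roughly 2.5x, which matches the "2 1/2x benefit" estimate; whether that carries over to the viewer is still an open question.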

