
SLI Completely Broken with SL?


Danny Nolan



So I'm aware that SLI doesn't particularly work with SL, and in most cases actually hurts FPS. But out of curiosity, what would it take to get it working again? Updating the renderer? Certain forced SLI options in the NVIDIA control panel?

I remember that back when SL was a lot more basic (3-4 years ago), it did actually help marginally. It caused a lot of crashing, but it definitely bumped the FPS up a bit. What broke it?

Sure would be cool to get it working again. I'd love to have both my video cards eat away at those wonderful shadows ;)

 

Edit: Just noticed some interesting behavior

Got to playing around with SLI. I have it forced to AFR2, single display mode, with VSync on, triple buffering on, and shadows off. I'm on KirstenLee's viewer, Build7 RC2.

If I zoom out in a very busy sim, I get really low FPS (30 to be exact), which is to be expected. However, if I go into options and disable Toggle Framebuffer Objects, I get the same FPS, and the GPU usage on my first video card goes low (about 30%). Then, when I turn it back on, my FPS jumps to 60 immediately, and the GPU usage on my first video card shoots up to 80%! Which is awesome. The second video card, for some reason, seems to be stuck around 30% usage.

However, when I start to cam around, my GPU usage drops, and then so does my FPS. And then sometimes it seems to "recover" and GPU usage goes back up, or it stays stuck, and is broken until I flip Toggle Framebuffer Objects on then off.

Pretty weird.
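For context, the Toggle Framebuffer Objects option controls whether the viewer renders into an offscreen OpenGL framebuffer object rather than straight into the default framebuffer (effects like shadows typically require it). Below is a minimal sketch, assuming GLEW and an existing GL context, of roughly what gets created when the option is on; flipping the option forces this kind of teardown and recreation, which may be why toggling it kicks the GPU usage back up.

```cpp
#include <GL/glew.h>
#include <cstdio>

// Build an offscreen render target: a color texture plus a depth renderbuffer
// attached to a framebuffer object. Rendering the scene into this (instead of
// the window's default framebuffer) is roughly what the option enables.
GLuint makeOffscreenTarget(int width, int height) {
    GLuint fbo = 0, color = 0, depth = 0;

    // Color attachment: an RGBA texture the scene gets rendered into.
    glGenTextures(1, &color);
    glBindTexture(GL_TEXTURE_2D, color);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Depth attachment: a renderbuffer, since it is never sampled.
    glGenRenderbuffers(1, &depth);
    glBindRenderbuffer(GL_RENDERBUFFER, depth);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);

    // Tie both attachments to the framebuffer object.
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, color, 0);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                              GL_RENDERBUFFER, depth);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
        std::fprintf(stderr, "FBO incomplete\n");

    glBindFramebuffer(GL_FRAMEBUFFER, 0);  // back to the default framebuffer
    return fbo;
}
```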


Yeah... I've certainly voiced my opinion a few times on how I wish this client were more optimised and made better use of my hardware. I have a killer machine, I should be screaming right along... but alas, I still see 30 fps pretty regularly when I'm anywhere but my skybox.

I really wish LL would, instead of implementing new features (that a lot of people don't even want), just start working on the renderer. Hire a team of professional programmers (or make use of the ones they already have) who understand OpenGL, and/or make DirectX available in SL, with OpenGL/DirectX toggleable like a lot of games do. It'd be so amazing.

But, I understand business is business, and what I ask for is very difficult, and expensive. I guess a guy can dream. haha


The viewer is very much behind the times. LL just caters to new users and how to make it easier for them. Just look at your processor usage when the viewer is running. Got a nice i7 like I do? You will see only one core being used to its max, and some not even used at all. There goes the frame rate, and sometimes the whole thing freezes for some time.

Then you play some high-end first person shooters and your frame rate never leaves 60 with all options on max. Makes you wonder what LL is doing besides raising rates...
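To actually look at the per-core usage described above, Task Manager or Resource Monitor will show it on Windows. As a minimal sketch (Linux-only assumption, purely a diagnostic and not viewer code), the same information can be read by sampling /proc/stat twice while the viewer runs and diffing the per-core counters:

```cpp
#include <cctype>
#include <chrono>
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <thread>
#include <vector>

struct CpuTimes { unsigned long long busy = 0, total = 0; };

// Parse the per-core "cpuN ..." lines of /proc/stat into busy/total jiffies.
static std::vector<CpuTimes> sample() {
    std::vector<CpuTimes> cores;
    std::ifstream stat("/proc/stat");
    std::string line;
    while (std::getline(stat, line)) {
        if (line.rfind("cpu", 0) != 0) continue;  // not a cpu line at all
        if (line.size() < 4 || !std::isdigit((unsigned char)line[3]))
            continue;                             // skip the aggregate "cpu" line
        std::istringstream in(line);
        std::string name;
        unsigned long long user, nice, sys, idle, iowait, irq, softirq;
        in >> name >> user >> nice >> sys >> idle >> iowait >> irq >> softirq;
        CpuTimes t;
        t.busy  = user + nice + sys + irq + softirq;
        t.total = t.busy + idle + iowait;
        cores.push_back(t);
    }
    return cores;
}

int main() {
    auto before = sample();
    std::this_thread::sleep_for(std::chrono::seconds(2));  // let the viewer run
    auto after = sample();
    for (size_t i = 0; i < before.size() && i < after.size(); ++i) {
        double busy  = double(after[i].busy  - before[i].busy);
        double total = double(after[i].total - before[i].total);
        std::cout << "core " << i << ": "
                  << (total > 0 ? 100.0 * busy / total : 0.0) << "% busy\n";
    }
}
```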

 


Yeah, not only do I have an i7, I have the 980X, so with Hyper-Threading on I have 12 logical cores altogether. Haha. I think SL uses maybe 7-10% of that.

It is reasonable for this game to be very CPU heavy, since it's not like a normal first person shooter where all the objects are completely static and unchanging. In SL, every single piece of geometry is subject to change, and (just guessing here) the viewer has to apply the parameters set on each prim very, very often.

However, since it is so heavy on the CPU, it really should be threaded more, and more optimised. I've noticed that when a thread in the viewer starts to get excessive, my GPU usage drops, meaning the GPU is sitting around waiting for the client to finish what it's doing before it bothers to do anything. Rezzing textures is a good example of this. MAN, rezzing textures kills the client. Something like that should definitely be going on in its own thread, not holding up the main thread.
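For what it's worth, the usual way to keep that kind of work off the main thread is a producer/consumer queue like the sketch below. The names (DecodedImage, decodeTexture, queueEncodedTexture) are hypothetical placeholders rather than actual viewer APIs, and the GL upload itself still has to happen on the thread that owns the context, which is why only the decode is moved off it.

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct DecodedImage { int width = 0, height = 0; std::vector<unsigned char> rgba; };

// Hypothetical stand-in for the real JPEG2000 decode step (the expensive part).
DecodedImage decodeTexture(const std::vector<unsigned char>& bytes) {
    DecodedImage img;
    img.width = img.height = 256;
    img.rgba.assign(size_t(img.width) * img.height * 4, bytes.empty() ? 0 : bytes[0]);
    return img;
}

std::queue<std::vector<unsigned char>> gEncoded;  // filled by the network code
std::queue<DecodedImage> gDecoded;                // drained by the render thread
std::mutex gMutex;
std::condition_variable gWork;
bool gQuit = false;

// Network side: hand off raw texture bytes and wake the worker.
void queueEncodedTexture(std::vector<unsigned char> bytes) {
    { std::lock_guard<std::mutex> lock(gMutex); gEncoded.push(std::move(bytes)); }
    gWork.notify_one();
}

// Worker thread: block until data arrives, decode it off the main thread.
void decodeLoop() {
    for (;;) {
        std::vector<unsigned char> job;
        {
            std::unique_lock<std::mutex> lock(gMutex);
            gWork.wait(lock, [] { return gQuit || !gEncoded.empty(); });
            if (gQuit && gEncoded.empty()) return;
            job = std::move(gEncoded.front());
            gEncoded.pop();
        }
        DecodedImage img = decodeTexture(job);
        std::lock_guard<std::mutex> lock(gMutex);
        gDecoded.push(std::move(img));
    }
}

// Render thread, once per frame: only the cheap GPU upload happens here.
void uploadPendingTextures() {
    std::lock_guard<std::mutex> lock(gMutex);
    while (!gDecoded.empty()) {
        // glTexImage2D(...) with gDecoded.front().rgba would go here.
        gDecoded.pop();
    }
}

int main() {
    std::thread worker(decodeLoop);
    queueEncodedTexture(std::vector<unsigned char>(1024, 0x7f));  // pretend download
    uploadPendingTextures();  // would be called every frame by the render loop

    { std::lock_guard<std::mutex> lock(gMutex); gQuit = true; }
    gWork.notify_one();
    worker.join();
}
```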

But I've made this rant before. Haha. I GUESS I'M JUST VENTING.


[guessing here from experience]

 

The SL viewers are not compiled for more than a single processor. At best one processor is used. Having used duals, quads, and now a 2600K, I've only ever seen one processor used by SL.

As for graphics cards - having installed a 590 - again I only see one GPU used, so I'm thinking this is a basic code/compile condition of SL. So I'm not sure, but I have not seen a boost in frame rate using SLI. If there is a trick, I would like to know it!

 

 


  • 1 month later...


Alexis Sommerfeld wrote:

Does this mean that someone could compile the source for multiple processor use?

 

I am not certain, but I don't think it works that way.
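Roughly speaking, using more cores isn't something a compiler switch can add after the fact; the source code has to explicitly split its work across threads. A minimal, generic C++ sketch of the idea (a dummy workload divided over however many logical cores the machine reports, not viewer code):

```cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    // How many logical cores the machine reports (e.g. 12 on a 980X with HT on).
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());

    std::vector<double> partial(n, 0.0);
    std::vector<std::thread> workers;

    // Each thread takes its own slice of a dummy workload. With only one
    // thread, this same loop would peg a single core and leave the rest
    // idle, no matter how the program was compiled.
    for (unsigned t = 0; t < n; ++t) {
        workers.emplace_back([t, n, &partial] {
            for (long i = t; i < 50000000; i += n)
                partial[t] += 1.0 / (i + 1);
        });
    }
    for (auto& w : workers) w.join();

    std::cout << "threads used: " << n << ", result: "
              << std::accumulate(partial.begin(), partial.end(), 0.0) << "\n";
}
```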

 

 

This comes up a lot. The answer is usually either A) it's on LL to update EVERYTHING and get with the times (this from those of us who have the hardware), B) because SL is all user-generated content it will always be laggy and there's nothing we can do about it (this comes from the LL apologists and/or people with 10-year-old computers), or C) a combination of both A and B.


  • 1 month later...

Actually, the SL viewer does support multithreading; in viewers prior to 2.0 it can be enabled in the Advanced menu. When enabled, the benefits on a multicore CPU are not epic, but they are tangible. Fully optimizing the code for multicore would be a huge amount of work.

SLI is another animal entirely. It's true that some years ago, with certain combinations of NVIDIA drivers and viewer versions, you could actually benefit a little by forcing Alternate Frame Rendering with your SLI setup.

As a rule of thumb, take it for granted that SLI/multi-GPU does NOT work with SL, offering performance worse than a single card, if not instability and crashes.

LL doesn't give a damn about working on this issue, and from their point of view probably with good reason. First, SLI is still a niche toy. Second, SL is not properly a game, and extreme performance (FPS) is not considered that relevant by the majority of residents.

So I have no faith that in the future we will finally see a viewer supporting SLI.

Just my opinion of course.


  • 2 years later...

:( Yes, I really wish they would rebuild the client. I have dual GTX 680 4 GB cards in SLI on 3 monitors, an AMD FX 9590 with 32 GB of system RAM, 3 SATA 3 SSDs, and a fiber optic connection at 75 Mbps down and 35 Mbps up. What does all that equal in SL? 30 FPS. My computer is falling asleep. LL just keeps coming out with more stuff to make the client lag more, but nothing to actually speed the thing up.

I'm surprised Firestorm or some other third party hasn't decided to stop waiting for LL and torn the client apart themselves. (But Firestorm does have a 64-bit version now.) Then again, I'm thinking more and more that LL should just run the servers and leave the client's creation to third-party developers. SL has a pretty high learning curve; the people on it actually making money from the stuff they sell have to know a lot about computers just to deal with SL, so LL acting like this low-FPS situation is okay is like promising a Dodge Viper and handing you a Prius.

http://www.geeks3d.com/20101209/tips-how-to-enable-sli-and-crossfire-support-for-opengl-applications/ I was looking at OpenGL options for SLI and ran across that page from a link on the EVGA forums, so apparently SLI can be activated for OpenGL, since SL seems to not want to go to DirectX (license fees, maybe?). But here's to hopefully seeing 60 FPS in a crowded sim sometime on SL.
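If anyone else wants to test a forced SLI profile the same way, a small frame-time logger makes the before/after comparison less subjective than eyeballing the FPS display. A minimal sketch (generic C++, not tied to any viewer), where tick() would be called once per rendered frame:

```cpp
#include <chrono>
#include <cstdio>

struct FrameTimer {
    using clock = std::chrono::steady_clock;
    clock::time_point windowStart = clock::now();
    int frames = 0;

    // Call once per rendered frame; prints an average every 5 seconds.
    void tick() {
        ++frames;
        auto now = clock::now();
        double secs = std::chrono::duration<double>(now - windowStart).count();
        if (secs >= 5.0) {
            std::printf("avg %.1f fps (%.2f ms/frame)\n",
                        frames / secs, 1000.0 * secs / frames);
            frames = 0;
            windowStart = now;
        }
    }
};
```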

 

Just some passing thoughts to think on.


Most of the major improvements are accomplished by Linden Lab, not TPV devs, so I don't see what good it would do if LL gave up on viewer development. Besides, if they did, we would have like 20 viewers where everybody would see things differently, or not at all, or completely broken, because every TPV dev has their own vision of what a viewer should be.

I hope that will never happen actually!

I'm also pretty sure that some places in Second Life will drag your 680s, even in SLI, down to their knees. While it's true that there is always room for improvement on the viewer side, it's also true that Second Life's user-generated content is the major cause of poor performance.


Well, yeah, what I said was more food for thought really. Just because I said LL shouldn't make viewers doesn't mean I meant for them to step back completely; what you said is right, if one viewer sees a square and another sees a circle, there is a problem. But LL's client development team seems to be targeting single-core AMD and equivalent Intel processors with mid to low-mid range graphics cards, at least looking at the client from an external perspective, as if it were programmed on the level of an old Doom-style game or something.

I spent the last 3+ hours forcing SLI on with Force Alternate Frame Rendering 2, setting the antialiasing mode to Override, and watching my FPS. Even with the little SLI indicator light on, which somehow set my mind at ease knowing the second card was tossed in, ironically, staring at the same wall my FPS did not really go up from where it was before I made the Second Life NVIDIA profile to get SLI working. Looking at my CPU usage and GPU usage, both still low, just makes me wish I was an actual programmer and able to tear it apart to see exactly what could be optimized to make SL use all the hardware capability available and hold 60 FPS. As you said, there are things that will drop any system to 1 FPS, but when that happens I expect to see CPU usage at 100% and GPU usage at 100%; then I could see how SL would be lagging. Not 1 FPS with CPU usage at 5% and GPU usage at 10%.

But yes, I love SL and spend a lot of time on it; it just makes me sad not to be able to use all I have. Maybe in the end tossing more cores at SL won't make it run faster, but without the code for it, there will always be the "what if" and the "why doesn't it support this?" And it's not so much SLI working, but all the CPU cores as well. The GPU is just waiting for info from the CPU, which is waiting on whatever script is running; with more threads, scripts get processed faster, or the next core can take on the next script in the area if one script is taking too long, so the system isn't left waiting.
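The "GPU is just waiting for the CPU" guess is actually measurable. The sketch below (assuming GLEW and a GL 3.3 context, not viewer code) compares wall-clock frame time against the GPU time reported by a GL_TIME_ELAPSED query; when the GPU time is much smaller than the frame time, the frame is CPU-bound, which matches the low CPU and GPU usage numbers described above.

```cpp
#include <GL/glew.h>
#include <chrono>
#include <cstdio>

// Time one frame both on the CPU (wall clock) and on the GPU (timer query).
void timeOneFrame(void (*drawScene)()) {
    GLuint query = 0;
    glGenQueries(1, &query);

    auto cpuStart = std::chrono::steady_clock::now();

    glBeginQuery(GL_TIME_ELAPSED, query);
    drawScene();                       // all the GL draw calls for the frame
    glEndQuery(GL_TIME_ELAPSED);

    // Block until the GPU has finished, then read the elapsed GPU time.
    GLuint64 gpuNanos = 0;
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &gpuNanos);

    auto cpuEnd = std::chrono::steady_clock::now();
    double frameMs = std::chrono::duration<double, std::milli>(cpuEnd - cpuStart).count();
    double gpuMs   = gpuNanos / 1.0e6;

    std::printf("frame %.2f ms, GPU busy %.2f ms -> %s-bound\n",
                frameMs, gpuMs, gpuMs < frameMs * 0.5 ? "CPU" : "GPU");
    glDeleteQueries(1, &query);
}
```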

