RoudyRaccoon Hand

Graphics optimization, and why Linden Lab needs to make it a top priority.


Alright, let's get started, shall we...

I am running an AMD FX-8 (that's 8 CPU cores), my GPU is an EVGA GTX 480, I have 4 GB of RAM at 1600 MHz, and a gaming hard drive...

Now let me ask this: why the hell am I only getting 15 FPS on a gaming rig that can run ANY other game at 85 FPS, yet Second Life can't seem to keep up even if the gods themselves were rendering it? Making sense, this isn't.

---

You have what appears to be a good box, and I assume 64-bit since it's a multi-core CPU. By "gaming hard drive," I assume you mean a solid-state drive, which is also optimal.

However, what most people don't fully understand is that the graphics rendering issues you normally see are in large part the fault of the content creators in SL. High-resolution textures and large files, found in many builds and in avatar clothing, cause significant graphics lag. Each scene is rendered client-side and loaded when you enter it, which is unlike other games, where most of the game's assets were installed on your PC along with the software. The SL client is thin: it loads only what you need, when you need it, and then caches that content.
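As a rough illustration of that load-on-demand-then-cache behavior, here is a toy sketch. All names here are hypothetical; the real viewer's cache is far more involved, but the principle is the same: fetch on first use, serve repeats from cache.

```python
from collections import OrderedDict

# Toy sketch of load-on-demand caching: assets are "downloaded" only when
# a scene first needs them, then kept in a small least-recently-used cache.
class AssetCache:
    def __init__(self, max_items=3):
        self.max_items = max_items
        self._cache = OrderedDict()
        self.fetches = 0  # counts simulated network downloads

    def get(self, asset_id):
        if asset_id in self._cache:
            self._cache.move_to_end(asset_id)   # cache hit: mark recently used
            return self._cache[asset_id]
        self.fetches += 1                       # cache miss: "download" it
        data = f"data:{asset_id}"
        self._cache[asset_id] = data
        if len(self._cache) > self.max_items:
            self._cache.popitem(last=False)     # evict least recently used
        return data

cache = AssetCache()
for tex in ["wall", "roof", "wall", "door"]:
    cache.get(tex)
print(cache.fetches)  # 3: the second "wall" came from cache, not the network
```

This is also why the advice below about staying in one region helps: a stable scene keeps hitting the cache instead of the network.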

The less you move around and the more you stay in one region, the better it will run. Also, if you are running a 64-bit system, you may want to think about more RAM (yes, here is where the debate on RAM starts).

Another quick fix is to reset your modem and router.  This will often find you a new route and can improve performance in not just SL but most games.

I too play many other games, like WoW, LOTR, and Warhammer. Of them all, SL is the hardest on my PC, and I have about the best gaming box you can get today. It's really something you have to adjust to, in many cases by turning down your graphics settings.

---

Check the anti-aliasing setting in your graphics preferences. Someone set the default for the LL Viewer to 8x, which will bog down any GPU pretty badly and is massive overkill. Drop it to 2x and I'm rather certain your frame rate will at least triple.
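For a sense of why the sample count matters, here's a back-of-the-envelope sketch of the color-buffer memory alone at different MSAA levels. The numbers are illustrative only: real viewers also carry depth/stencil and resolve buffers, so actual costs are higher, but the scaling with sample count holds.

```python
# Back-of-the-envelope memory for a multisampled 1920x1080 RGBA8 color buffer.
# Illustrative only; real framebuffers carry additional attachments.
def msaa_color_bytes(width, height, samples, bytes_per_pixel=4):
    """Bytes needed for the color buffer at a given MSAA sample count."""
    return width * height * samples * bytes_per_pixel

MB = 1024 * 1024
print(f"8x: {msaa_color_bytes(1920, 1080, 8) / MB:.1f} MB")  # 63.3 MB
print(f"2x: {msaa_color_bytes(1920, 1080, 2) / MB:.1f} MB")  # 15.8 MB
```

Every one of those multisampled bytes has to be written and resolved each frame, which is why dropping from 8x to 2x frees up so much GPU bandwidth.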

---

Also, since SL has been around for years, some of the internals of the rendering system are old and cause problems with newer graphics cards and drivers. What viewer are you using? I know Phoenix now shuts off OpenGL VBOs by default because they cause problems, and any viewer that supports mesh right now has OpenGL problems that are being worked on at LL as we speak.

---

Are you asking me or the OP? I can run SL at max with two windows open at a time and still play WoW, so I'm good.

But I think our OP was a post-and-poof... lol :)

---


RoudyRaccoon Hand wrote:

LL's needs to start transitioning away from OpenGL markets and more towards DirectX for maximum resource utilization, and eventually a "cloud" grid run by the millions of users.

This sentence makes absolutely no sense.

 

Anyway, there are a lot of people with the same or similar hardware as you who do get good performance. I've noticed this seems to be a trend lately: people with similar hardware getting wildly different performance for no apparent reason. I haven't heard any good theories as to why that happens, though that might have something to do with the people getting bad performance just posting a few rants and then leaving without ever giving us any info.

---


leliel Mirihi wrote:


RoudyRaccoon Hand wrote:

LL's needs to start transitioning away from OpenGL markets and more towards DirectX for maximum resource utilization, and eventually a "cloud" grid run by the millions of users.

This sentence makes absolutely no sense.

 

Anyway, there are a lot of people with the same or similar hardware as you who do get good performance. I've noticed this seems to be a trend lately: people with similar hardware getting wildly different performance for no apparent reason. I haven't heard any good theories as to why that happens, though that might have something to do with the people getting bad performance just posting a few rants and then leaving without ever giving us any info.

Agreed on both counts. In fact, I'd take it a bit further: in the past week or so it has become public that Microsoft is killing Silverlight, and Adobe is doing the same for Flash (first on mobile platforms). Now, if you were advising the folks managing resources for the next-generation GPUs, how much would you tell them to bet on DirectX?

And I, too, have been seeing reports of wildly different framerates for higher-end graphics cards. There's really no reason the OP's machine shouldn't be getting much higher SL framerates than reported, as the vast majority with comparable hardware do, so I can't help but wonder whether those low numbers are peaks, averages, or the very worst it ever does. I'm also suspicious that some of the unaccounted-for variance might be due to network bandwidth or latency, but obviously that wouldn't explain such a crappy peak framerate.
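That distinction matters more than it sounds. With made-up frame times, here's a sketch of how the reported "FPS" depends on whether you quote the average or the worst frame:

```python
# Why a single "15 FPS" number is ambiguous: average vs. worst-case frames.
# The frame times (in milliseconds) below are made up for illustration.
frame_times = [12, 14, 13, 15, 60, 13, 14, 80, 12, 13]  # two bad stutters

avg_fps = 1000 * len(frame_times) / sum(frame_times)
worst_fps = 1000 / max(frame_times)
print(f"average: {avg_fps:.0f} FPS")        # ~41 FPS: sounds fine
print(f"worst frame: {worst_fps:.1f} FPS")  # 12.5 FPS: feels terrible
```

A viewer that averages 41 FPS but stutters to 12.5 FPS will be reported very differently depending on which number the poster happened to look at.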

---

I have a dual-core CPU (3 GHz) and an Nvidia 8800 GTS 512: four years old and not top of the line, but working well. With the latest LL V3 beta I am getting between 9 and 15 fps, with around 12 most of the time (for this test).

I use Imprudence most of the time, so I checked what I was getting with it: in my skybox at over 3000 m I get between 60 and 95 fps, and going to the same store as with V3 I was getting between 45 and 80 fps. I was wondering why I was getting a headache when using V3, but I never thought to look at the frame rate.

I tested both viewers in the same places, and also went to another store with Imprudence that I know is busy, and still got around 50 to 70 fps. I checked the anti-aliasing setting and it was at 4x; before the test I set it to 2x for V3 and relogged. I keep my graphics settings on High, which is where they are set automatically when I install the viewer.

Just some info to add to the post.

---

a) SL doesn't use multiple cores well, so having 8 of them does you little good rendering SL. I get good FPS (20 FPS) with a C2D E8500 @ 3.16 GHz, 8 GB RAM, and a GTX 560 Ti w/ 1 GB VRAM in busy clubs, with my bandwidth set at 3500 kbps on a 12 Mbps cable connection.

b) The high FPS you're used to seeing in most games is not going to happen in SL. The reason is that most games ship with all the backgrounds and textures already stored on your HDD or SSD, and they don't change. The backgrounds and textures in SL have to be rendered "on the fly" as they are streamed to you by SL; in fact, they can change while you're looking at them. That doesn't happen in most other games.

c) It's not about SL keeping up; it's about your rig, bandwidth, and viewer keeping up. Keep in mind that SL regions run at 45 FPS (check the Statistics window of your viewer).
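To make the 45 FPS figure concrete: a frame rate is just a per-frame time budget, and the server's budget and your client's budget are independent of each other. A trivial conversion, for illustration:

```python
# A frame rate is just a time budget per frame. The region simulator's
# 45 FPS and your client's rendering rate are independent budgets.
def frame_time_ms(fps):
    """Milliseconds available per frame at the given frame rate."""
    return 1000.0 / fps

print(f"region at 45 FPS: {frame_time_ms(45):.1f} ms per server frame")
print(f"client at 15 FPS: {frame_time_ms(15):.1f} ms per rendered frame")
```

So a client crawling at 15 FPS is spending roughly three server frames' worth of time on every frame it draws; the bottleneck is local, not the simulator.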

Good luck optimizing your rig for SL.

---


I have a relatively low-end machine:

CPU AMD Athlon 64 X2 Dual Core Processor 4600+ (2832.27 MHz)
Memory 4096 MB
OS Version Microsoft Windows 7 64-bit Service Pack 1 (Build 7601)
Graphics Card ATI Radeon HD 4670 Series, 1GB

I tested Viewer 3.2, Firestorm Beta, and Phoenix in a nearly empty sandbox region and a moderately populated region.

Here is what I got as far as fps is concerned (I had turned off anisotropic filtering, anti-aliasing, and VBO in all three):

viewer                 empty region    populated region
Viewer 3.2             20 fps          10-15 fps
Firestorm Beta Mesh    35 fps          20-25 fps
Phoenix 1.5.2          50 fps          30-40 fps

Viewer 3.2 has a nice new interface, but as can be seen it has the lowest performance, so low, in fact, as to make it virtually unusable for me.
For now I will be sticking with Phoenix.

---

You can probably still adjust your graphics settings to get better performance in SL. What I am running is by no means a "gaming rig": a several-years-old IBM IntelliStation quad-core system (Intel Xeon 5150 at 2.66 GHz), 3 GB of RAM that my 32-bit Windows XP OS can see, and an ancient Nvidia Quadro FX 3500 graphics card that was in the junk bin at Goodwill (PCIe x16 with 256 MB VRAM). With my draw distance set to 256 meters and all the other graphics settings maxed, I get about 18 FPS with Viewer 3.2 and 23 FPS with the Firestorm Mesh Beta. If I go with lower graphics settings, I can get 30 to 40 FPS without much trouble. The SL Viewer allows some incredibly interesting graphics possibilities, but a lot of the high-end settings are only useful for still photography. To get a good frame rate, you have to make compromises on things like draw distance, water reflections, and more.

You also have to bear in mind that "graphics optimization" is critically dependent on the visible content in the scene being rendered. Virtually all 3D games are entirely created by professional game designers. Every building, plant, and rock can be pre-optimized by the designers, caching the image data and eliminating every polygon face that isn't needed for rendering. Where a player can go can be constrained, allowing only certain paths. (For example, why have mesh data for the back side of a barn that you can't go behind?)

In Second Life, virtually everything in-world is designed by the residents: by the players themselves, not by Linden Lab. There is no approval or optimization process gating new content. If "Screamarr the Barbarian" decides to make a super sword with a unique 1024 x 1024 texture on each of the 200 tiny gemstones that encrust its hilt, there's nothing preventing him from making that lag monster and wearing it in-world. Never mind that, for most people in viewing range, those stones merge into a sandy blur. Every person who has him within their draw distance still has to download and cache every one of those 200 huge textures, and attempt to render them!

A professional game designer probably would have used one 256 x 256 texture for Screamarr's sword, and still been able to use part of that same texture for his tooled leather belt, his sword's sheath, and his matching dagger and its sheath. One texture that is 1/16 the size of any one of the 200+ textures on his sword is all that is really needed, in the eyes of a pro content designer.
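To put rough numbers on the hypothetical sword above, here is a sketch assuming uncompressed RGBA at 4 bytes per texel. Viewers use compressed texture formats in practice, so the absolute numbers shrink, but the ratio between the two approaches stays just as lopsided.

```python
# Rough VRAM cost for the hypothetical gem-encrusted sword vs. a single
# shared texture, assuming uncompressed RGBA at 4 bytes per texel.
def texture_bytes(width, height, bytes_per_texel=4):
    """Uncompressed size of one width x height RGBA texture."""
    return width * height * bytes_per_texel

sword = 200 * texture_bytes(1024, 1024)  # 200 unique 1024x1024 gem textures
atlas = texture_bytes(256, 256)          # one shared 256x256 texture
print(f"200 gem textures: {sword / 2**20:.0f} MB")  # 800 MB
print(f"one shared atlas: {atlas / 2**10:.0f} KB")  # 256 KB
```

Hundreds of megabytes for one attachment versus a quarter of a megabyte for the professionally designed equivalent: that is the lag monster in concrete terms.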

Furthermore, in SL that sword, his castle, his moat dragon, and everything else in his sim have to be renderable from every possible angle, because there's no constrained path that players have to follow when moving through the virtual world. Compare that to a game where you might only see the castle from your side of the moat, and might only be able to fight the dragon from a limited range of angles. Defeat the dragon, and the game transitions you inside the castle, to an interior-only scene that again can be optimized.

Second Life is chock full of content like Screamarr's sword, and tomorrow someone else may make something even worse to render. Every hour of every day, the content in SL changes as places get deleted and created and new items get made and sold, all of it designed without ANY constraint toward "optimized" design. For that reason alone, SL will never perform as well as a professionally built game with a limited set of professionally designed and approved content.

---


RoudyRaccoon Hand wrote:

My big issue with their whole setup has been fixed in the FPS title ARMA 2, who's physics and object calculations are a dynamic streamed event for muliplayer, yet I can run it at about 65FPS even with it's horrible optimizations. LL's needs to start transitioning away from OpenGL markets and more towards DirectX for maximum resource utilization, and eventually a "cloud" grid run by the millions of users.

 

 

Umm, no. DirectX is a pricey, Windows-only technology that MS will never port to other platforms. Going to the proprietary DX and away from the openly specified, transparently developed OpenGL would freeze out the many Mac and Linux users of SL, some of whom are the largest and most established content creators.

 

---

If DirectX is cross-platform, why do popular PC games like EVE Online, which rely on DX rather than OpenGL, have to run on the Mac via TransGaming's Cider virtual Windows environment? Why can't I just pop down to GameStop, pick up Battlefield 3 or Crysis, and run it on my Mac? Might the reason be that DirectX is (and always has been) a Windows-ONLY technology?

Yes, DX support is built into the graphics hardware, but you need software to talk to the hardware. That software (drivers) needs hooks into the OS (APIs) to make communication between the hardware and software possible. Microsoft has never allowed Apple or the open-source Linux community access to its DX APIs. Therefore it is NOT POSSIBLE to run DX games natively on Macs or Linux. Period. Never has been, and never will be unless Microsoft suddenly changes.

Umm, yeah... I think you need to do a little research...:smileywink:

ETA: Although I do agree with you that LL needs to rewrite the client to use multicore CPUs and bring their OpenGL implementation up to date.

---

You cannot compare FPS between viewers this easily. Just as an example, Firestorm has much higher default LOD settings than Phoenix does. I don't know about other details, but this example alone shows the problem.

Another issue is that you have to make sure the same number of people are in your field of view, that your scene is equally fully loaded, and so on.

---

DirectX is Windows-only, and there's nothing wrong with OpenGL anyway. I think something in your settings or setup isn't getting on well with SL.

That being said, I would love a next-generation OpenGL-based graphics and physics system to really make SL shine; it's very overdue, imo.

---

They will not do that. SL is based on open-source technology, and DirectX is proprietary Microsoft technology that MS does not supply to non-Windows platforms such as Mac and Linux. That would drive away the few creative producers they have left, and SL would just be a cartoon sex world, if it isn't already!

Oops, I should have finished reading the thread first, since two other posters said the same thing!

The other point is that yes, identical machines will give different fps, because their internet connections and pipelines differ: routing varies, there can be server bottlenecks, LL often has problems between its colocation facilities, parts of the global backbone may be down or under stress, packets may take longer routes, the number of users logged on varies from moment to moment, and your service provider may occasionally play fast and loose with your bandwidth. The fact that it works at a decent fps at all, when the data is sent halfway around the world for some people and not others, is pretty amazing.

---

OP... if you only knew your question would gather all the SL fanboys and "tech gurus" around. Did it? lol

The problem is not OpenGL. The problem is bad, very bad, coding and optimization. And that new lighting feature just literally eats FPS: from 90 down to 15 or even less. I bet some fanboy has already used the usual "but games like Crysis and BF are prebaked and SL is dynamic." My God... 3000 m up in the sky, only my ship and my stuff, everything downloaded, nothing moving, and I have the same issue. But no! It's clearly not LL's terrible code.
I love open source, but honestly, if they have to rewrite everything, close the source, and use paid technologies, I will not mind at all... for the sake of a normal FPS.

@Chelsea Malibu: when you minimize both of your SL clients and open WoW, they are "hibernated" at 0% CPU time. Otherwise I don't see you playing WoW, which uses lots of CPU but not much GPU.

And btw, about Linux: nobody cares. Linux/BSD are not gaming platforms. Deal with it. I'm using it myself and still stand behind this statement. Mac: "a jail for fools," as Richard Stallman said.

 

---


H8 wrote:

OP... if you only knew your question would gather all the SL fanboys and "tech gurus" around. Did it? lol

The problem is not OpenGL. The problem is bad, very bad, coding and optimization. And that new lighting feature just literally eats FPS: from 90 down to 15 or even less. I bet some fanboy has already used the usual "but games like Crysis and BF are prebaked and SL is dynamic." My God... 3000 m up in the sky, only my ship and my stuff, everything downloaded, nothing moving, and I have the same issue. But no! It's clearly not LL's terrible code.

I love open source, but honestly, if they have to rewrite everything, close the source, and use paid technologies, I will not mind at all... for the sake of a normal FPS.

We need to come up with a good name for the anti-fanboys who blame anything and everything on LL. The real world isn't so simple: more than one party can be at fault, and problems can be caused by multiple things interacting. But no, when it comes to SL we all turn into a bunch of children and look for someone to point our finger at. I think the biggest problem with the SL community is that we're all suffering from a monochrome worldview; everything has to be black and white, either all LL's fault or all someone else's. That's BS and we all know it.

---
