
Beta Viewer Improvements To Graphics Subsystem, can someone explain what has improved?


Ciaran Laval


I run the beta viewer quite regularly. Yesterday I had a forced update, and after installing it I received a message about my graphics preferences changing due to improvements in the graphics subsystem.


My preferences have changed by default from High to Ultra. Now I welcome changes and improvements to graphics and texture loading speeds, but I don't know why on earth my graphics settings have changed; there's no real information in the release notes. There is mention of HTTP textures, but as we can't read the linked Jira issue, I don't know what that entails.

I don't want to put too much strain on my graphics card, so I'll be happier running around with Ultra settings once I can find out what's changed and why my card is now considered capable of working at a higher graphics setting.

Anyone got any information about this?


Same here: the Linden Lab beta viewer update switched my graphics setting to Ultra ("due to improvements in the graphics subsystem", the message said).

I ran a test to see what fps I get in two viewers, standing on exactly the same spot and looking in the same direction, with exactly the same graphics settings in both viewers.

The results:

• Second Life 3.4.3 (267135) [Second Life Beta Viewer]: 95 - 108 fps
• Firestorm 4.2.2 (29837): 68 - 72 fps

The earlier Linden Lab beta viewer gave similar fps to Firestorm 4.2.2. This new Linden Lab beta gives higher fps, so something good indeed has been done in the graphics subsystem. Great!

What improvements have been made, I don't know.


For reference here is my system info:

CPU: Intel® Core i7-2600K CPU @ 3.40GHz (3411.15 MHz)
Memory: 8169 MB
OS Version: Microsoft Windows 7 64-bit Service Pack 1 (Build 7601)
Graphics Card Vendor: NVIDIA Corporation
Graphics Card: GeForce GTX 560 Ti/PCIe/SSE2
Windows Graphics Driver Version: 9.18.0013.0697
OpenGL Version: 4.2.0


With Viewer 3.4.3, Linden Lab have made extensive changes to the default graphics settings the viewer selects based on your graphics card. If you are interested in seeing what these are, the table is contained in gpu_table.txt and is easy to read.
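
If you want to poke at that table programmatically rather than just read it, a rough sketch along these lines will do. The column layout here is only my assumption (tab-separated label, regular expression, class number and supported flag), and the function names are mine, so check the gpu_table.txt that ships with your viewer before relying on it.

    import re

    def parse_gpu_table(path):
        # Parse gpu_table.txt-style lines into (label, pattern, class, supported).
        # Column layout is assumed, not confirmed: tab-separated label, regex,
        # class number and supported flag. Lines starting with "//" are skipped.
        entries = []
        with open(path) as f:
            for raw in f:
                line = raw.strip()
                if not line or line.startswith("//"):
                    continue
                fields = [p for p in line.split("\t") if p]
                if len(fields) < 4:
                    continue
                label, pattern, gpu_class, supported = fields[:4]
                entries.append((label, re.compile(pattern, re.IGNORECASE),
                                int(gpu_class), supported == "1"))
        return entries

    def classify(renderer_string, entries):
        # Return the first entry whose pattern matches the GL renderer string.
        for label, pattern, gpu_class, supported in entries:
            if pattern.search(renderer_string):
                return label, gpu_class, supported
        return None

    # Example: the renderer string as reported under Help > About Second Life.
    # print(classify("GeForce GTX 560 Ti/PCIe/SSE2", parse_gpu_table("gpu_table.txt")))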


Whilst there may be some improvements to the graphics system in Viewer 3.4.3, some of the changes in that graphics table are a bit too extreme. The machine I am most regularly using at the moment has an Nvidia 8600GT GPU, a five-year-old mid-range graphics card. With the new defaults for that card I get Lighting and Shadows turned on! As good as that graphics card was, there is no way such a setting could be used for enjoyably moving around in SL, no matter how good the rest of the computer's components are.

My major concern is that many people will just click through the 'rendering settings change' box without knowing what it really means, then find that Second Life is running slower for them and become dissatisfied without knowing why; they may not even notice the graphical improvements if they are upset. I would have liked such a change to be linked to a message at login that properly explains the changes and how to revert them, or to be offered as an option to try the new level of settings LL think is appropriate for your graphics card.


Maybe such a change could be considered before Viewer 3.4.3 hits the main release channel.


You've highlighted my concerns far better than I did in my OP. I'm not convinced my graphics card is really up to Ultra level; it's a Radeon HD 6670.

The Ultra settings were set by the viewer, not by me. A lot of people will think "Great", and as I said in my post above, it does feel better, but my concern is that it may be asking a little too much of my card. I don't want to be a party pooper, but I generally err on the side of caution.

I'm also concerned about the lack of available information on these changes, but thanks for providing some extra info about the txt file.


The bulk of the changes are here and here. It looks like LL finally adopted my suggestion of using a finer-grained class system; trying to fit 12 years' worth of graphics cards into 4 classes just wasn't working anymore.

It looks like they're also now doing some kind of automated performance analysis to rank GPUs. This should be a good thing in the long run, but it will probably have some bumps along the way.

On the down side, it looks like they removed some of the fancy regex I put in to work around the horribly mangled (or just varied) strings some drivers returned.
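
To illustrate why that sort of pattern matters, here is a small sketch. The renderer strings below are made-up examples of the kind of variation drivers produce, not entries taken from the actual table.

    import re

    # Hypothetical renderer strings of the kind different drivers report
    # for what is essentially the same GPU family.
    renderer_strings = [
        "AMD Radeon HD 6670",
        "ATI Radeon HD 6670 Series",
        "AMD Radeon(TM) HD 6670 Graphics",
    ]

    # One tolerant pattern covers all of them; an exact string match cannot.
    pattern = re.compile(r"(AMD|ATI).*Radeon.*HD *66.*", re.IGNORECASE)

    for s in renderer_strings:
        print(s, "->", bool(pattern.search(s)))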


The new "Ultra" setting isn't the equvalent of the old "Ultra" setting, which was basically "everything turned up as far as it will go." I had been running my computer with my settings on "High" with the addition of turing "Lighting and Shadows" on but setting the "Shadows" setting to "None.". The new beta moves the slider to "Ultra" for me but the individual settings are exactly the same as I had them before.

The big change is that work has been done on improving the speed of the newer rendering engine, and it's basically caught up with the speed of the older rendering engine for a lot of video cards. The newer rendering engine is technically described as "deferred rendering", which is a who-in-the-what-now term for most people, so in Preferences the LL viewer calls it "Lighting and Shadows". It ALLOWS shadows, but you can still use it with shadows turned off. I imagine a lot of people who are suddenly having their preferences changed to "Ultra" are actually having deferred rendering turned on with shadows kept off.


To clarify - there are now three new levels that your computer may be set to, in addition to the old "Low", "Medium" and "High" graphics levels.  These are:

  • Ultra - the same as "High" but with "Lighting and Shadows" turned on.  (Note: on many systems with poor graphics rendering this is a big performance hit)
  • Ultra plus "Ambient Occlusion Enabled" (providing better shadows - more details here)
  • Ultra plus "Ambient Occlusion Enabled" plus shadows set to "Sun/Moon + Projectors" - this is the maxed-out setting and, if your GPU is up to it, it really makes SL look stunning.

Hopefully this will improve many users' enjoyment of SL - but please be aware of the performance costs that may occur.
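
For anyone who likes to see those combinations laid out explicitly, here they are restated as a small sketch. The names and keys are purely illustrative - they are not the viewer's internal setting names.

    # The three new top-end presets, restated from the list above.
    new_presets = {
        "Ultra": {
            "base": "High",
            "lighting_and_shadows": True,
        },
        "Ultra + AO": {
            "base": "High",
            "lighting_and_shadows": True,
            "ambient_occlusion": True,
        },
        "Ultra + AO + projected shadows": {
            "base": "High",
            "lighting_and_shadows": True,
            "ambient_occlusion": True,
            "shadows": "Sun/Moon + Projectors",
        },
    }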

 


The beta's automatic shift to "Ultra" did change a few of my individual settings. Previously I had enabled "Lighting and Shadows" and "Sun/Moon + Projectors" but without ambient occlusion (a combination that didn't qualify for "Ultra" before); the new beta turned on ambient occlusion, and I'm pretty sure it also turned off some water reflections and, under Hardware, turned on anti-aliasing and anisotropic filtering.


In fact, I'm not sure why. I doubt I've ever tried anisotropic filtering without anti-aliasing, probably because they were both recommended to be "off" for some ancient ATI video card I had several computers ago. I know that when I turn them both on it's as if I'm viewing through a cheesecloth filter. So thanks: next time I'm in-world, I'll try anisotropic filtering without anti-aliasing.

I'll probably try turning ambient occlusion back on, too. I seem to have GPU capacity to burn these days, and can't honestly remember what artifacts I used to think I saw only with AO enabled.


The cheesecloth look is most likely from FXAA; it's the type of anti-aliasing the viewer uses when Lighting and Shadows is enabled. FXAA stands for Fast Approximate Anti-Aliasing, and as its name implies, it's not true AA. It's actually just an edge-weighted Gaussian blur with some other fancy stuff, so it has the tendency to make everything a little blurry.
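
As a rough illustration of that edge-detect-and-blur idea, here is a simplified sketch. It shows the general technique, not the actual FXAA shader, and the function name and threshold are mine.

    import numpy as np

    def edge_blur(luma, threshold=0.1):
        # 'luma' is a 2-D array of per-pixel luminance in [0, 1].
        # Where local contrast is high (a likely jagged edge), blend the pixel
        # with its neighbours; elsewhere leave it alone. Blending smooths
        # stair-stepped edges but also softens fine detail - hence the blur.
        up, down = np.roll(luma, 1, axis=0), np.roll(luma, -1, axis=0)
        left, right = np.roll(luma, 1, axis=1), np.roll(luma, -1, axis=1)
        neighbours = np.stack([up, down, left, right])
        contrast = neighbours.max(axis=0) - neighbours.min(axis=0)
        blurred = (luma + up + down + left + right) / 5.0
        return np.where(contrast > threshold, blurred, luma)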

Anisotropic filtering is a special way of doing texture filtering for triangles that are at oblique angles to the camera, to minimize distortion. Without AF on, textures at the far end of objects tend to look like mush.

