Eric Castanea

Second Life viewer 3.2.1 - anti-aliasing not working in linux



leliel Mirihi wrote:

The shining-fixes branch removes the MSAA code and replaces it with FXAA so that's not really a problem anymore.

The FXAA stuff is used in deferred rendering ("lighting and shadows") mode. It is a good addition because hardware FSAA did not work at all with deferred rendering, but it really does not work as well as the hardware version that is still used when deferred is off.
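
Roughly why that is, in GL terms. This is a sketch of a generic deferred G-buffer, not the viewer's actual code, and every name in it is made up:

    // Sketch only, not viewer code: a generic deferred G-buffer setup.
    // Every attachment is an ordinary single-sample texture, so whatever
    // FSAA the driver forces on the default framebuffer never applies to
    // the geometry pass.
    GLuint gbufferFbo;
    glGenFramebuffers(1, &gbufferFbo);
    glBindFramebuffer(GL_FRAMEBUFFER, gbufferFbo);
    // albedoTex / normalTex / depthTex: hypothetical textures, created elsewhere.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, albedoTex, 0);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                           GL_TEXTURE_2D, normalTex, 0);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D, depthTex, 0);
    // ...geometry pass renders here, lighting pass shades it...
    // The only thing that ever reaches the (possibly multisampled) back
    // buffer is a full-screen quad, which has no geometry edges left to
    // antialias, hence a shader pass like FXAA over the final image.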

Cerise wrote:


The FXAA stuff is used in deferred rendering ("lighting and shadows") mode. It is a good addition because hardware FSAA did not work at all with deferred rendering, but it really does not work as well as the hardware version that is still used when deferred is off.

The MSAA code was completely removed, I just tested now and AA doesn't work unless deferred is enabled. Should file an issue in jira for that, there's no technical reason why the viewer can't use FXAA or any other shader based AA method with the forward renderer.
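
To illustrate the point, here is a rough sketch of FXAA run over a forward-rendered frame. It is not viewer code; the FBO, texture, and program names are all hypothetical:

    // Sketch only, not viewer code: shader AA over a *forward* rendered
    // frame. Nothing in the post pass depends on deferred rendering.
    void renderFrameWithPostAA(GLuint sceneFbo, GLuint sceneTex,
                               GLuint fxaaProgram, int width, int height)
    {
        // 1. Ordinary forward pass into an offscreen color texture.
        glBindFramebuffer(GL_FRAMEBUFFER, sceneFbo);
        glViewport(0, 0, width, height);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        // drawScene();  // the unmodified forward renderer

        // 2. Post pass: FXAA reads the finished image and writes the back
        //    buffer. It does not care how the image was produced.
        glBindFramebuffer(GL_FRAMEBUFFER, 0);
        glUseProgram(fxaaProgram);
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_2D, sceneTex);
        glUniform2f(glGetUniformLocation(fxaaProgram, "rcpFrame"),
                    1.0f / width, 1.0f / height);
        glDrawArrays(GL_TRIANGLES, 0, 3);  // full-screen triangle
    }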

Cerise wrote:

The old non-deferred antialiasing still works here in davep_shining-fixes 243593, so it would be good to check all of the setup first.

I see you're right, I spoke too soon. Looks like it still has the FBO-based AA; I was thinking of the sampler2DMS one.
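
For anyone else mixing the two up: FBO-based AA resolves the multisample buffer with a blit, so the shaders never see it, while the sampler2DMS path (GL 3.2 / ARB_texture_multisample) hands the shader the raw samples. A rough sketch of both, with made-up names, not viewer code:

    // (a) "FBO based" AA: a multisampled renderbuffer resolved with a blit.
    //     msColorRb / msFbo / resolveFbo are hypothetical, created elsewhere.
    glBindRenderbuffer(GL_RENDERBUFFER, msColorRb);
    glRenderbufferStorageMultisample(GL_RENDERBUFFER, 4, GL_RGBA8,
                                     width, height);
    // ...attach to msFbo, render the scene into it...
    glBindFramebuffer(GL_READ_FRAMEBUFFER, msFbo);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, resolveFbo);
    glBlitFramebuffer(0, 0, width, height, 0, 0, width, height,
                      GL_COLOR_BUFFER_BIT, GL_NEAREST);

    // (b) "sampler2DMS" AA: a multisample texture whose individual samples
    //     the fragment shader fetches itself.
    glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, msColorTex);
    glTexImage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, 4, GL_RGBA8,
                            width, height, GL_FALSE);
    // GLSL side:  uniform sampler2DMS scene;
    //             vec4 s = texelFetch(scene, ivec2(gl_FragCoord.xy), i);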

Don't be too quick to point fingers.  I see you're using nVidia's official drivers.  Check your nVidia control panel; your settings are likely torqued to prevent antialiasing (driver settings always override program settings).  Also, check glxinfo and make sure direct rendering was successfully enabled; if your nVidia drivers did not install correctly, you could end up using the kernel module but not the GL libraries provided by nVidia, or vice versa.
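
Running glxinfo | grep direct in a terminal shows the direct rendering state; the same two checks can be made from code. A sketch assuming an already-created GLX context, not anything from the viewer:

    // Sketch: the checks glxinfo performs, done in code.
    // Error handling omitted; not viewer code.
    #include <GL/gl.h>
    #include <GL/glx.h>
    #include <cstdio>

    void reportGlSetup(Display* dpy, GLXContext ctx)
    {
        // "No" here means indirect rendering (GLX protocol through X),
        // a classic symptom of a half-installed nVidia driver.
        std::printf("direct rendering: %s\n",
                    glXIsDirect(dpy, ctx) ? "Yes" : "No");

        // "Mesa" or a software rasterizer here, instead of
        // "NVIDIA Corporation", means the wrong libGL got linked in even
        // though the nVidia kernel module may be loaded fine.
        std::printf("vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
        std::printf("renderer: %s\n", (const char*)glGetString(GL_RENDERER));
    }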

Exactly.  The biggest performance hit over time isn't from SL but from the other stuff you have running, especially if you're on a Linux distro that ships with Compiz enabled by default, or you're using Windows.

Peggy Paperdoll wrote:

Don't use it then.

 

If LL followed that train of thought we'd still be seeing what we saw in 2005: lag when concurrency reached 3,000 grid-wide, frequent grid crashes, no WindLight, no sculpties, no flexible prims, no voice, no media on a prim... no end of stuff. All those things you enjoy now produced huge outcries from users about LL going backward, destroying the performance of the grid, and making SL intolerable.

 

Fix your hardware, or don't use the features your hardware can't cope with.  It's easy.

I'd kudos this if I could.  QFT.

I have been unable to get an answer on this. LL was working to fix deprecated OpenGL calls.

 

So exactly what driver version is supposed to work with SLv3.2? Not sure I want to upgrade and disable my use of TPVs to use this beta version.

Ann Otoole wrote:

I have been unable to get an answer on this. LL was working to fix deprecated OpenGL calls.

 

So exactly what driver version is supposed to work with SLv3.2? Not sure I want to upgrade and disable my use of TPVs to use this beta version.

The 'bug' has been fixed in the shining-fixes branch. The problem turned out to be that nVidia's driver was lying about how many texture image units the chip had, saying it had more than it really did. But I'm sure the myth of deprecated OpenGL calls will stick with us for years to come.
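
In code terms, the workaround amounts to not taking the driver's answer at face value. A sketch of the idea only, not the actual viewer patch, and the cap value here is invented:

    // Sketch of the kind of fix described above, not the actual patch.
    // Query the limit, but don't take the driver's word for it.
    #include <algorithm>

    GLint reportedUnits = 0;
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &reportedUnits);

    // Hypothetical safe cap: if the driver advertises more units than the
    // hardware can actually service, using all of them is what crashed,
    // so clamp before compiling shaders that index that many samplers.
    const GLint kSafeMaxUnits = 16;  // made-up number, not LL's value
    GLint usableUnits = std::min(reportedUnits, kSafeMaxUnits);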

Cool. Disinformation is wonderful, eh? Oh well. So it was not a driver issue, and when that fix is finally imported to the main branch, all will be well with newer nVidia drivers? Is that what you are saying? It is nearly impossible to get a straight answer.

It was a driver issue. The viewer asked the driver how many TIUs it had and the driver said "I got 32 man" and the viewer said "Cool dude, I believe you 'cause like you know what's going on, so I'm gonna use all 32 now" and the driver said "Oh noes! I don't actually have 32. Arg crash!". So the fix was to make the viewer not believe everything the pot smoking hippy driver said.
