
Second Life viewer 3.2.1 - anti-aliasing not working in Linux


Eric Castanea


Recommended Posts


leliel Mirihi wrote:

The shining-fixes branch removes the MSAA code and replaces it with FXAA so that's not really a problem anymore.

The FXAA stuff is used in deferred rendering ("lighting and shadows") mode. It's a good addition, because hardware FSAA simply wouldn't work with deferred rendering, but it really doesn't look as good as the hardware version that is still used when deferred is off.
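For anyone wondering why the hardware version and deferred mode don't mix: the deferred pipeline draws into ordinary single-sample FBO textures, and the window's multisample setting never touches those, so any AA has to come from a later shader pass such as FXAA. Below is a minimal sketch of that situation, assuming a GL loader such as GLEW is already initialized; the function name and setup are placeholders for illustration, not the viewer's actual code.

#include <GL/glew.h>

// Hypothetical helper, not viewer code: create one color target of a G-buffer.
GLuint make_gbuffer_color_target(int width, int height)
{
    GLuint fbo = 0, albedo = 0;

    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    // Single-sample texture: window-level MSAA settings do not apply here.
    glGenTextures(1, &albedo);
    glBindTexture(GL_TEXTURE_2D, albedo);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, albedo, 0);

    // This only affects multisampled surfaces (e.g. the default framebuffer),
    // so it does nothing for the FBO above; AA must be done in a shader pass.
    glEnable(GL_MULTISAMPLE);

    return fbo;
}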



Cerise wrote:


leliel Mirihi wrote:

The shining-fixes branch removes the MSAA code and replaces it with FXAA so that's not really a problem anymore.

The FXAA stuff is used in deferred rendering ("lighting and shadows") mode. It's a good addition, because hardware FSAA simply wouldn't work with deferred rendering, but it really doesn't look as good as the hardware version that is still used when deferred is off.

The MSAA code was completely removed; I just tested, and AA doesn't work unless deferred is enabled. Someone should file an issue in JIRA for that, since there's no technical reason the viewer can't use FXAA or any other shader-based AA method with the forward renderer.
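A minimal sketch of that point, with placeholder names (scene_fbo, scene_color_tex, fxaa_program, draw_scene_forward and draw_fullscreen_quad are assumptions for illustration, not viewer functions): the forward pass renders into an offscreen color texture, then one full-screen pass runs the FXAA shader over it.

#include <GL/glew.h>

// Placeholders for illustration only; none of these exist in the viewer.
extern GLuint scene_fbo;         // offscreen target for the forward pass
extern GLuint scene_color_tex;   // its color attachment
extern GLuint fxaa_program;      // shader program implementing FXAA
extern void   draw_scene_forward();
extern void   draw_fullscreen_quad();

void render_frame_with_fxaa()
{
    // 1. Ordinary forward rendering, but into an offscreen color buffer.
    glBindFramebuffer(GL_FRAMEBUFFER, scene_fbo);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    draw_scene_forward();

    // 2. Post-process: FXAA samples the finished image and writes to the window.
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glUseProgram(fxaa_program);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, scene_color_tex);
    draw_fullscreen_quad();
}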


Don't be too quick to point fingers.  I see you're using nVidia's official drivers.  Check your nVidia control panel; your settings are likely torqued in a way that prevents antialiasing (driver settings always override program settings).  Also check glxinfo and make sure direct rendering is enabled; if your nVidia drivers did not install correctly, you could be using the kernel module but not the GL libraries provided by nVidia, or vice versa.
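For what it's worth, the "direct rendering" line that glxinfo prints comes down to a glXIsDirect() check. Here is a small standalone sketch (not from the thread; compile with -lGL -lX11) that runs the same test:

#include <cstdio>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main()
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) { fprintf(stderr, "cannot open display\n"); return 1; }

    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

    // Ask for a direct context; glXIsDirect reports whether we actually got one.
    GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);
    if (!ctx) { fprintf(stderr, "context creation failed\n"); return 1; }

    printf("direct rendering: %s\n", glXIsDirect(dpy, ctx) ? "Yes" : "No");

    glXDestroyContext(dpy, ctx);
    XCloseDisplay(dpy);
    return 0;
}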



Peggy Paperdoll wrote:

Don't use it then.

 

If LL followed that train of thought, we'd still be seeing what we saw in 2005: lag when concurrency reached 3,000 grid-wide, frequent grid crashes, no windlight, no sculpties, no flexible prims, no voice, no media on a prim... and a lot more besides.  All those things you enjoy now produced huge outcries from users about LL going backward, destroying the performance of the grid, making SL intolerable.

 

Fix your hardware, or don't use the features your hardware can't cope with.  It's easy.

I'd kudos this if I could.  QFT.



Ann Otoole wrote:

I have been unable to get an answer on this. LL was working to fix deprecated OpenGL calls.

 

So exactly what driver version is supposed to work with SLv3.2? Not sure I want to upgrade and disable my use of TPVs to use this beta version.

The 'bug' has been fixed in the shining-fixes branch. The problem turned out to be that nVidia's driver was lying about how many texture image units the chip had, saying it had more than it really did. But I'm sure the myth of deprecated OpenGL calls will stick with us for years to come.


It was a driver issue. The viewer asked the driver how many TIUs it had and the driver said "I got 32 man" and the viewer said "Cool dude, I believe you 'cause like you know what's going on, so I'm gonna use all 32 now" and the driver said "Oh noes! I don't actually have 32. Arg crash!". So the fix was to make the viewer not believe everything the pot smoking hippy driver said.
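In code terms, the defensive idea is roughly the following (a sketch under assumptions, not the actual shining-fixes patch): query the limit, but cap it at the number of units the shaders were actually written for instead of trusting the driver's figure.

#include <GL/glew.h>
#include <algorithm>

// Hypothetical helper: clamp the driver-reported texture image unit count
// to a value the rest of the renderer is known to handle.
int usable_texture_image_units(int known_good_limit)
{
    GLint reported = 0;
    glGetIntegerv(GL_MAX_TEXTURE_IMAGE_UNITS, &reported);

    // The driver may over-report (the bug described above), so don't
    // believe everything it says.
    return std::min(static_cast<int>(reported), known_good_limit);
}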

