
Are the Lindens ever going to fix the transparency texture overlay issue?




It can be worked around in some cases and minimized in others.

Two alpha-channel textures with the same prim position in the region will generally behave much more predictably than other arrangements, and items that come between them and your camera will often behave better too.


It can only be minimized, not removed.  The problem is an OpenGL issue, not an SL issue, except to the extent that SL has largely been built and textured by amateurs who don't know how to work around alpha sorting as they design.  There's only so much that LL or the TPV developers can do.


The fault is in OpenGL, something the Lindens cannot do anything about.  The OpenGL libraries are what LL's viewer and other programs use to tell the graphics card what to do.  And every graphics card manufacturer supplies its own OpenGL implementation for its cards.

My 2c: ATI has long had grudging support for OpenGL and has screwed it up terribly in years past.  Nvidia cards, on the other hand, handle OpenGL just fine.  I got rid of my Radeon, put a GTX 260 (now a 460) in its place, and lots of problems I had just suffered with (like the triangle mess) flat-out disappeared on the Nvidia card.  While you won't be rid of the transparency issue as long as SL uses OpenGL, Nvidia has a superior implementation of it.

 

BTW:  The usual workaround is to give only the outermost image transparency and make the texture on the layer behind it a nontransparent JPG.  Since the second picture has no transparency, the ordering glitch never appears, and you get a perfect "look through the holes at the next texture" image.


Here's a discussion on this "problem" from about a year and a half ago.

 

http://forums-archive.secondlife.com/109/28/329691/1.html

 

It's not a problem with SL or Linden Lab not being able to fix it.  It's mostly creator-caused, from not knowing what alpha sorting is all about.  It's a rendering problem with OpenGL (DirectX has the problem too, but to a lesser degree, largely because amateur creators don't use DirectX for their rendered creations).  Almost every texture you see in SL is made by residents, most of whom are not professionals (and many don't know squat about alpha sorting).  The thread I linked contains comments and information from at least one recognized expert in the field (Chosen Few).  Do a Google search for "alpha sorting OpenGL" and you'll get a ton of information on the subject and how to work around it or avoid it entirely.

 

LL can't fix a problem that originates from sources they can't control.  We residents can help by learning what causes it and how to work with what we have for a rendering engine.


It helps to understand the issue. As others have explained, OpenGL has trouble with overlaid 32-bit textures. The workarounds primarily rely on a static environment, which simply does not happen in SL.

 

The other problem is LL's lack of support for user-generated 1-bit alpha textures. A 1-bit alpha is only solid or transparent, with no in-between translucent areas. This means it won't work so well for things like glass or water, where you want to see through a translucent texture, but it's perfect for things like walls and many plants.

A benefit of 1-bit alpha is that it does not suffer the same rendering problems as 32-bit alpha textures, so you don't get the texture flickering that comes from overlaid alpha images.

SL itself already supports 1-bit alpha textures; these are the alpha images used by the Linden trees and plants, as well as the alpha rendering for the avatar itself, which is why you can hide parts of the avatar with an alpha mask without suffering the 32-bit alpha rendering issues.

 

Now, to my knowledge, LL quietly snuck in the ability for residents to upload 1-bit alpha by creating a 1-bit PNG file and uploading it; however, they made no announcement and have provided no documentation. It seems to work, though. So if you want to use an alpha texture that has no translucent areas, you would be best off trying it this way to possibly avoid alpha rendering problems.
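For anyone who wants to experiment with this, here is a minimal sketch using Python and the Pillow library (the file names and the 50% threshold are just assumptions for illustration) that flattens an ordinary 32-bit texture's alpha channel down to the two levels a 1-bit mask allows:

```python
from PIL import Image

# Load the source texture and flatten its alpha channel to two
# levels: fully opaque (255) or fully transparent (0).
img = Image.open("texture.png").convert("RGBA")
r, g, b, a = img.split()
a = a.point(lambda v: 255 if v >= 128 else 0)  # hard threshold at 50%
Image.merge("RGBA", (r, g, b, a)).save("texture_masked.png")
```

Whether the SL uploader actually preserves the mask-style transparency is exactly the undocumented part described above, so treat this as an experiment.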


 


Penny Patton wrote:

Now, to my knowledge, LL quietly snuck in the ability for residents to upload 1-bit alpha by creating a 1-bit PNG file and uploading it; however, they made no announcement and have provided no documentation. It seems to work, though. So if you want to use an alpha texture that has no translucent areas, you would be best off trying it this way to possibly avoid alpha rendering problems.

Wow, thanks for this information... can't wait to try it, as soon as I figure out how to make a 1-bit PNG... lol.

 



Crias Rowlands wrote:

Yeah, it would be great to get some guidance about how to produce that kind of PNG.
:)

"1bit" alpha is a bit of a misnomer if I understand the encoding.... it's actually, "single value"... which is what you get when you use gif transparency.... a single color is specified as transparent, and the rest of the pallet is left allone.

So, assuming this works, you would make a paletted PNG image (256 colors), specify one color as alpha, then save (exactly as you would with a .gif image), then upload...

I have not tried this to see if it works, or how it might behave with things like shadows or occlusion (usually this type of alpha occludes other objects).

ETA:

I made a slight error in this post, because I didn't realize that PNG supports a single-color transparency mode for 24-bit color (a.k.a. 16.7 million colors) images... but the idea is the same: create the image, mask it, merge the mask with the image, and save with the "single color" option, choosing to use the existing layer transparency. Someone else may want to adapt those instructions to PS; I use PSP, and an old version at that.
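In case it helps, here is one way to follow that paletted-PNG recipe in Python with the Pillow library rather than PSP or PS; a rough sketch, assuming a source image named texture.png and reserving one palette slot (index 255) for the transparent color:

```python
from PIL import Image

src = Image.open("texture.png").convert("RGBA")

# Binarize the alpha channel first: below 50% becomes transparent.
alpha = src.getchannel("A").point(lambda v: 255 if v >= 128 else 0)

# Quantize to a 255-color palette, keeping index 255 free, then
# stamp index 255 onto every transparent pixel.
pal = src.convert("RGB").convert("P", palette=Image.ADAPTIVE, colors=255)
mask = alpha.point(lambda v: 255 if v == 0 else 0)  # transparent pixels
pal.paste(255, mask=mask)

# Record index 255 as the single transparent color (the PNG tRNS
# chunk), the paletted equivalent of the GIF transparency above.
pal.save("texture_paletted.png", transparency=255)
```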



Penny Patton wrote:

It helps to understand the issue. As others have explained, OpenGL has trouble with overlaid 32-bit textures. The workarounds primarily rely on a static environment, which simply does not happen in SL.

It has nothing to do with OpenGL; it's a fundamental limitation of raster-based graphics. Alpha blending is an order-dependent operation. That means that to draw the frame properly, you have to sort every alpha pixel into the correct order before compositing the final image.

The current best-known algorithm, A-buffers, requires DirectX 11 class hardware and can potentially use several hundred megabytes of VRAM to draw the image. The average SL resident simply does not have that kind of hardware. That's why LL hasn't bothered to do anything about it yet.
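Since it may not be obvious why order matters, here is a tiny self-contained demonstration (plain Python, with made-up colors for illustration) that the standard source-over blend gives different results depending on draw order:

```python
# Source-over compositing: result = src*a + dst*(1 - a).
def over(src, a, dst):
    return tuple(s * a + d * (1 - a) for s, d in zip(src, dst))

white = (1.0, 1.0, 1.0)   # background
red   = (1.0, 0.0, 0.0)   # 50% transparent red surface
blue  = (0.0, 0.0, 1.0)   # 50% transparent blue surface

# Blue behind red (correct back-to-front order)...
print(over(red, 0.5, over(blue, 0.5, white)))   # (0.75, 0.25, 0.5)
# ...versus red behind blue (same surfaces, wrong order).
print(over(blue, 0.5, over(red, 0.5, white)))   # (0.5, 0.25, 0.75)
```

Opaque pixels don't have this problem because the z-buffer resolves them per pixel regardless of draw order; blended pixels have to be sorted, and that sort is the expensive part.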


Which really means that LL will never "fix" it.  LL chose OpenGL as the means of graphics rendering so that Mac and Linux users could join SL as well as Windows users.  DirectX is proprietary to Microsoft and can only be used on Windows-based computers.  OpenGL is an open standard, which is why LL chose that rendering path.  I don't see them ever moving away from it, as doing so would immediately lock out some 20% of the users (plus tick off a bunch of people).

 

I've heard your explanation before, but I can assure you that DirectX also has the alpha sorting problem (though it seems to be less severe than OpenGL's, which may be because OpenGL is used on platforms that allow amateur texture creators, while DirectX is used by professional texture creators on Windows-based platforms).  I saw the glitch several times in Blue Mars (a Windows-only platform using DirectX).


I think you misunderstood: A-buffers don't require DirectX 11; in fact, the original paper describing them was written in 1984. What they need is direct read/write memory access from the shaders, a feature only present on hardware that supports DirectX 11 / OpenGL 4.0.

It works by creating a linked list for every pixel that has an alpha face drawn on it, sorting each list into the correct order, then blending them into the final image. Its memory usage is directly proportional to the number of overlapping alpha faces (plus some overhead), which in SL can be in the thousands.
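To make that concrete, here is a toy CPU-side sketch in plain Python (the real thing runs in shaders with GPU-side linked lists) of the resolve step: collect the fragments that land on a pixel, sort the list back to front, then blend:

```python
# Each fragment is (depth, rgb, alpha); the list arrives unordered,
# like the per-pixel linked list an A-buffer accumulates.
def resolve_pixel(fragments, background):
    color = background
    # Sort back to front (largest depth first), then blend forward.
    for depth, rgb, a in sorted(fragments, key=lambda f: f[0], reverse=True):
        color = tuple(s * a + c * (1 - a) for s, c in zip(rgb, color))
    return color

frags = [
    (1.0, (1.0, 0.0, 0.0), 0.5),  # near red alpha face
    (2.0, (0.0, 0.0, 1.0), 0.5),  # far blue alpha face
]
print(resolve_pixel(frags, (1.0, 1.0, 1.0)))  # (0.75, 0.25, 0.5)
```

The memory cost mentioned above comes from those per-pixel lists: every overlapping alpha face adds a fragment to the list of every pixel it covers.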



leliel Mirihi wrote:
It has nothing to do with OpenGL; it's a fundamental limitation of raster-based graphics. Alpha blending is an order-dependent operation. That means that to draw the frame properly, you have to sort every alpha pixel into the correct order before compositing the final image.

The current best-known algorithm, A-buffers, requires DirectX 11 class hardware and can potentially use several hundred megabytes of VRAM to draw the image. The average SL resident simply does not have that kind of hardware. That's why LL hasn't bothered to do anything about it yet.


Thanks for this important clarification!  The problem isn't the API (OpenGL); it's the underlying algorithms.  Alpha sorting is a very difficult problem, and the best-looking solutions require a lot of horsepower.  Currently, the practical solution is to avoid the problem.

I was unaware of the "1-bit alpha" workaround, which would be very helpful if it works in SL.


Alpha sorting is a hard problem, but we're getting closer to a viable solution every year. I was just reading a paper from the GDC conference a few weeks back where someone presented an improvement to A-buffers called Adaptive Transparency that can run up to 40x faster while using half the memory.

I think we're just a few years away from having the hardware and algorithms necessary to make alpha sorting a non-issue. Whether LL will implement them any time soon is a different story, though.



Lear Cale wrote:

I was unaware of the "1-bit alpha" workaround, which would be very helpful if it works in SL.

You can already see it at work... it's what both invisiprims and alpha wearables use.

There are situations where the object using it will occlude and some where it won't, so it does have its own problems depending on the surface it belongs to.


  • 3 weeks later...

This is not a direct reply to Void - I don't see any way to post a general comment on the thread. I don't like this forum software much so far.

-

I have all but eliminated this issue in my scenes, and oddly I have done so by using MORE transparency, not less. My specific case is a house with primitive-based plants adjacent to a window. The windows have two textures: blinds open and blinds closed. Both are 32-bit, and the closed version has some alpha so that some of the outside scene bleeds in. With the "closed" texture applied, looking out from the inside, certain plants sorted first, and I could see almost the entire texture bleed through. With the "blinds open" texture applied (more alpha), the glitch was harder to reproduce, but still there. Changing the plant texture to one with maybe half as many opaque pixels as the previous one seems to have eliminated the glitch entirely.

I doubt very much that this issue will ever go away for good. Myself NOT included, relatively few of us in Second Life are professional 3D scene designers, and even fewer understand the underlying z-buffer issue. My very narrow understanding is that for a scene to be entirely free of the problem, some rather tedious work needs to be done in prioritizing how and when opaque geometry is rendered versus alpha geometry, and when the z-buffer is written to and when it's not, all at a significant cost in frame rate, GPU, and memory resources.
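For what it's worth, that "tedious work" usually amounts to the classic two-pass ordering. A rough sketch in Python using real OpenGL state calls via PyOpenGL, where draw(), distance_to(), and the object lists are hypothetical stand-ins for a scene graph:

```python
from OpenGL.GL import (GL_BLEND, GL_DEPTH_TEST, GL_FALSE, GL_TRUE,
                       GL_ONE_MINUS_SRC_ALPHA, GL_SRC_ALPHA,
                       glBlendFunc, glDepthMask, glEnable)

def render_frame(opaque_objects, alpha_objects, camera):
    glEnable(GL_DEPTH_TEST)

    # Pass 1: opaque geometry in any order; the z-buffer sorts it.
    glDepthMask(GL_TRUE)
    for obj in opaque_objects:
        obj.draw()

    # Pass 2: alpha geometry sorted far to near; depth test still on
    # (so walls occlude it) but depth writes off (so alpha faces
    # don't wrongly occlude each other).
    glEnable(GL_BLEND)
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
    glDepthMask(GL_FALSE)
    for obj in sorted(alpha_objects, key=camera.distance_to, reverse=True):
        obj.draw()
    glDepthMask(GL_TRUE)
```

The catch, and the source of the glitch, is that the sort is per object rather than per pixel, so interpenetrating or concentric alpha surfaces (plants against windows, exactly this case) can still come out in the wrong order.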

 

 


I also do not blame LL for at least this one thing.

I'm pretty sure that the initial programmer never imagined me sculpting an alpha prim rolled into a helical cone and then folded up into itself at exactly the same set of points, and thought, "How can I prevent the leaves on the back of Josh's plant from showing through the front even if he renders it inside-out?"

 

