
Vertex Normals Issue


ZedConroy

I'm having a problem wherein the vertex normals of my mesh look as if they are being squished upon import to SL. This becomes a real issue when using normal maps, as the vertex normals are no longer the same as what was baked.

On the top left of the image below is the mesh imported into SL with the bad normals, which are having a negative impact on the normal map; below it is a screen grab from the 3ds Max viewport (note the bounding box).

On the top right of the image is the same mesh, now with correct normals and a working normal map. To combat the issue I created two extra vertices to expand the bounding box to equal X/Y/Z dimensions.

 6caa31014eb10601848b5a35528b2fad.jpg

Is this a known issue, and does anyone have another way to fix it?

The workaround I came to for this test obviously increases the LI of the object, as the size perceived by SL is now much larger.

Any advice appreciated.


I didn't know about this and it seems like something a few creator friends of mine might want to know about...

Also, that's a very pretty object.

Edit: For what it's worth, I wasn't able to reproduce this with Blender. (I used a cube with a slightly-concave side.)

Whether or not the object was scaled, the normals would always show identical.

Edited by Wulfie Reanimator

9 hours ago, ZedConroy said:

Thank you Wulfie, I appreciate it.

 I couldn't find anything documenting the problem anywhere. It also didn't feel like a 'me' issue. So hopefully this info will be of interest to somebody.

The issue is that the vertex normals aren't being squished; they're pointing in the wrong direction. When you added those two vertices, Max somehow did a reordering and reset the vertex normals to the defaults. Oddly, it's not showing in the viewport shading.

When something like this happens to me in Maya, I can see it in the viewport, and it usually happens when importing an FBX file. I fix it by running "unlock normals", which I'm sure Max has an equivalent for as well.


15 hours ago, Wulfie Reanimator said:

Edit: For what it's worth, I wasn't able to reproduce this with Blender. (I used a cube with a slightly-concave side.)

Whether or not the object was scaled or not, the normals would always show identical.

I also couldn't reproduce the issue in Blender; it seems to always calculate correct averaged normals regardless of object transformations.

That being said, when I exported a Collada file from Max and reimported it, the normals remained consistent. But if I imported the mesh into SL and then saved a Collada from there, I ran into problems.

I've provided an example below. You can see that the mesh on the left, which has had no interaction with SL, has correct scale transforms and its normals are fine. The mesh on the right, which was saved from SL, has a smaller scale transform on the Y axis, so SL must be scaling the mesh so that the vertices fill the bounds of a box before it calculates the normals.

8c32ea1e41dc4b2c0152c21e27c9c511.png

You wouldn't encounter any issues if you were looking at a sphere or a cube as the bounding dimensions are already equal.

Again though, Blender does its own thing so you won't be able to see the problem. 


30 minutes ago, Kyrah Abattoir said:

Did you try what I said?

I did. I've switched to using Blender so that there won't be any discrepancies.

In the 1st frame of the image below I have the mesh in Blender. The parameters were already how you see them, but I applied the object transform anyway just to rule anything out. I exported this to a Collada file.

The 2nd frame is the mesh after I uploaded it to SL, saved a collada from SL, then imported it back into Blender. You can see the scale transform parameters have changed, and the shading on the mesh is different as the vertex normals are pointing in a different direction. This will cause problems with normal maps like I showed in my original post.

The 3rd frame is again the mesh from SL. If I apply object transforms to the mesh, it fixes any shading issues as the process re-averages the normals. At this point the mesh basically looks like the one we first exported from Blender.

The 4th frame is interesting. This is the mesh from SL, but instead of applying the transforms I have just reset the scale (Alt+S while in object mode). You can see that the mesh is stretched to fit inside the volume of a cube. If SL computes the normals at this stage, this is how you end up with the result in the 2nd frame.

1e930b9eb8415308bfbfb0dbb8b34b23.png

If a mesh fills more of the cubic volume, SL doesn't have to stretch it as much, so the issue with the vertex normals is less pronounced. It is just very evident in this case because the mesh is thin on one axis.


Number 4 is entirely expected behaviour, I believe. At least it makes sense to me in those terms. Number 3 probably also. Number 2 is interesting and worth investigating a little more.

Here is why I think much of this is expected behaviour.

All mesh imported into SL is quantised to signed 16-bit values; thus you get 65536 increments to a side in the bounding box, or to put it another way, your coordinate space inside the BB is -32768 to +32767. (Note for the observant: this also underlies why your pivot is in the centre of the BB, which, if and when we fix it, might have repercussions on effective resolution.) These increments are the same irrespective of the relative size of the BB dimensions, thus there is effectively higher resolution along your Y axis (based on the images). The scale that you applied in 4 effectively made the BB cubic.
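To make the resolution point concrete, here is a toy sketch (a hypothetical helper, not the actual viewer code): every bounding-box axis gets the same 65536 steps, so a thin axis is sampled much more finely per metre than a long one.

```python
# Hypothetical illustration of per-axis 16-bit quantisation: each axis
# of the bounding box is mapped to the same signed 16-bit range,
# regardless of how long that axis is in metres.

def quantise_axis(value, axis_min, axis_max):
    """Map a coordinate on one BB axis into the -32768..+32767 range."""
    t = (value - axis_min) / (axis_max - axis_min)  # 0..1 inside the BB
    return round(-32768 + t * 65535)

# A mesh 10 m long on X but only 0.5 m thick on Y:
step_x = 10.0 / 65535  # metres per increment along X
step_y = 0.5 / 65535   # metres per increment along Y: 20x finer
print(quantise_axis(0.0, 0.0, 10.0), quantise_axis(10.0, 0.0, 10.0))
```

The thin axis ends up with twenty times the spatial resolution of the long one, which is exactly the asymmetry being described.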

I would also advise taking care with observations based on the DAE export. DAE export is rudimentary: it takes the mesh that is being rendered and converts it to Collada. It is not, therefore, operating on the asset, but on the post-processed VBOs that get drawn.

Given the bias in the BB I can see how the normals might be compressed but at the same time others are not seeing this, suggesting to me that there may be something assumed in the original asset (an unapplied scale as per previous suggestions would also have been my first guess) that we're overlooking. 

If you are willing/able to raise me a Jira on the Firestorm Jira (jira.firestormviewer.org) and add the assets above to it, I can have a play. If you do not wish those to be publicly visible then you could email them to me directly, or, perhaps the best solution, recreate a subset of this that we can share without violating your IP. A simplified model, perhaps just a quarter slice of this, should be plenty.

@polysail your 3ds max knowledge might uncover something here? 

 


Beq is 100% correct ~ all meshes are stored internally in a 'cube-like' form ~ where the mesh is scaled to fill the volume of a unit cube.  How this affects things~  and whether that's "expected behavior", that's ..... that's another question~

After reading this entire thread a couple times ~  I'm thinking this is actually a long buried bug.  I haven't tested this personally yet, but after reading everything here and seeing the examples, SL should be capable of preserving the correct vertex normal directions while doing whatever scaling operation it needs to during the quantization process ( where it converts the DAE into an internal 'second life mesh').  This looks like pretty definitive evidence that it does, in fact, NOT do that.  Scale matrices suck.  I routinely get them wrong.  My guess is that the mesh uploader is also handling this incorrectly.  Perhaps it's just usually not that noticeable~ but it gets progressively worse the more of a deviation that your object has off of a cubic shape?  ( Though that last sentence is making me ponder ~  how does it handle flat planar objects?   Are they special cases? )  I'm not really sure.

SL does take in the original object's vertex normals and apply them to the uploaded mesh; it doesn't simply recalculate them all from nothing.  At least I've seen it do that in cases where the vertex normals are "explicitly defined".  What I mean by "explicitly defined" is the case that Optimo referred to in his post, where if you're in your 3D software attempting to edit your normals, the usual face/edge tools that set up hard/soft transitions have seemingly no effect on the object until you "unlock" them.  I'm not sure of the precise mechanics of the how and why of this situation across all three different programs discussed in this thread, but I know SL does acknowledge them as inputs.  How those inputs are handled ~ and whether they are handled correctly ~ well, that's a question worth asking.

It shouldn't matter what the source application is.  3ds Max solves vertex normals mostly behind the scenes, much like Maya does, unless it gets asked to import a file with "explicitly defined" vertex normals ~ if my memory serves me correctly... this is identical to how Maya handles it ( I'm not 100% sure on this )?

However this behavior noted here ~ regardless of the source application, explains a substantial amount of frustration I've had with the inconsistencies of how normal maps function inside SL.  I've not tried setting cubic bounding boxes for all my meshes to see if it fixes all the normal map issues that I've been struggling with, but that will definitely be something I will try from now on with objects that are not naturally cubic.

 

I wonder how this all calculates with rigged meshes?


Actually, once I hit post ~ I remembered.  I'm pretty sure just exporting a mesh into the DAE file format "explicitly defines" its vertex normals.  DAE is a simple format, and does not allow for edge smoothing.  ( Again I haven't tested, but I have a fuzzy recollection ) that if I export a mesh with 'regular' handling of vertex normals from either 3ds Max or Maya, just immediately re-importing it will require them to be "unlocked" again.  It's just a limitation of the DAE format.


Okay ~  I've done some preliminary testing.  Nothing 100% conclusive yet ~ but by all indications ( at least for meshes originating in Autodesk Softwares )~  for any meshes that aren't perfect cubes ~ during the quantization process ~ it seems to be re-calculating the normals for them using an inverse scale matrix for the surface normals ~ instead of a scale matrix~  meaning ~ the thinner and flatter your object is ~  the vertex normals of the object are going to be distorted ~ by not only the ratio of the difference from the mesh to a standard cube ~ but then that ratio AGAIN beyond that ~~ meaning if your initial object measures 0.25m x 1m x 1m ~   the surface normals are being calculated in a manner which ~ in order to get them to match the original shape ~ your object must be scaled to 4.0m x 1m x 1m ~  16 times the original mesh dimension in the axis that was "off".
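The 16x figure above can be sanity-checked numerically. Here is a hedged sketch (hypothetical helper names, not the uploader's code): for a diagonal scale, normals should be transformed by the inverse (1/s per axis); applying the scale itself instead is off by a factor of s squared per axis.

```python
# Illustrative only: compare a normal transformed with the correct
# inverse scale against one transformed (wrongly) with the scale itself.

def transform_normal(normal, scale, use_inverse=True):
    """Componentwise scale of a normal by a diagonal matrix, then renormalise."""
    s = [1.0 / c for c in scale] if use_inverse else list(scale)
    v = [n * c for n, c in zip(normal, s)]
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

scale = (0.25, 1.0, 1.0)   # the thin axis of a 0.25 x 1 x 1 mesh
n = (1.0, 1.0, 0.0)

good = transform_normal(n, scale, use_inverse=True)
bad = transform_normal(n, scale, use_inverse=False)

# The wrong result differs by s^2 = 1/16 on X: you would have to
# stretch the object 16x on that axis (0.25 -> 4.0) before the
# wrongly-scaled normal pointed the right way.
ratio = (bad[0] / bad[1]) / (good[0] / good[1])
print(ratio)  # 0.0625, i.e. 1/16
```

This reproduces the "ratio, then that ratio AGAIN" compounding described above: 0.25 off from cubic, squared, gives the factor of 16.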

If my testing is correct ~ and this is the mistake...... Holy @#*%*@


We still have testing to do, but I worked through the viewer code and confirmed Liz's theory. I have fixed a test version of FS, and @polysail and I have tested that fix and it appears to work. Bad news: if it does turn out to be right, then there's 9 years of bad mesh normals out there 🙂


As Liz stated above (and this also explains why @Wulfie Reanimator's repro failed), the problem lies in the conversion process applied at upload. As per my note, all mesh undergoes a normalisation process that force-aligns it to an origin point central to the mesh geometry's bounding box. This is done by determining the min and max extents, from which a scaling factor is derived. This is used to squish the mesh coordinates into the 0:1 domain. Unfortunately, it appears that during this transformation an incorrect scaling was applied to the mesh normals. The result is that any non-cubic mesh will have some degree of taint to its normals.

Liz has created the following Jira https://jira.secondlife.com/browse/BUG-228952

I have applied my fix to Firestorm, so it should be fixed for FS in the next release, and added the appropriate patch for others.


On 6/19/2020 at 3:16 PM, polysail said:

Actually, once I hit post ~ I remembered.  I'm pretty sure just exporting a mesh into a DAE file format "explicitly defines" it's vertex normals.  DAE is a simple format, and does not allow for edge smoothing.  ( Again I haven't tested, but I have a fuzzy recollection here ) that if I export a mesh with 'regular' handling of vertex normals from either 3ds max or Maya~ that just immediately re-importing it will require them to be "unlocked" again.  it's just a limitation of the DAE format.

It is an option of the FBX module, which Collada export also belongs to. By default the vertex normals are assumed to be explicitly defined and therefore locked upon import, but this can be turned off (generating a warning about it).

Edit to add: this is a behavior I discovered when importing sculpted models from ZBrush, which uses an arbitrary scale unit, into Maya for retopo. Scaling the model up to retopo at real scale keeps the vertex normals near zero length, generating issues during texture bake that are resolved by unlocking normals.

So I think that vertex normal calculation upon import to SL should be checked also against the scale multiplier found in the third tab. 

Edited by OptimoMaximo

2 hours ago, OptimoMaximo said:

So I think that vertex normal calculation upon import to SL should be checked also against the scale multiplier found in the third tab. 

The scale multiplier in the 3rd tab is a universal omnidirectional scale value (it applies to all axes equally ~ X, Y, Z), so it doesn't actually affect the normals data at all; also, you can't zero it out ~ so there's no concern for 0-magnitude normals.
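This point checks out numerically. A quick hedged sketch (hypothetical helper, not importer code): a uniform scale leaves a renormalised normal's direction unchanged, while a non-uniform scale does not.

```python
# Illustrative only: uniform vs non-uniform scaling of a normal,
# followed by renormalisation.

def renormalised(normal, scale):
    v = [n * s for n, s in zip(normal, scale)]
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

n = (0.6, 0.8, 0.0)

uniform = renormalised(n, (3.0, 3.0, 3.0))     # same direction as n
nonuniform = renormalised(n, (3.0, 1.0, 1.0))  # direction changes

print(uniform)     # approximately (0.6, 0.8, 0.0)
print(nonuniform)  # X component grows relative to Y
```

So an omnidirectional multiplier cancels out of the normalisation, which is why only the per-axis (bounding-box) scaling is at issue in this bug.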

Edited by polysail

4 hours ago, polysail said:

The scale multiplier in the 3rd tab is a universal omnidirectional scale value ( applies to all axis equally ~ X Y Z )  so it doesn't actually affect the normals data at all, also you can't zero it out ~ so there's no concern for 0 magnitude normals.

The scale multiplier applies to the vertices as far as we know, but I've read the code snippet from Beq and I see that vertex normals get normalised right after the scaling for the quantization. If the scale value in the third tab doesn't trigger another vertex normal normalization, they might end up near zero length (zero is possible only on vertex-colored point cloud data, afaik), which may be problematic anyway. I don't know, just saying it's worth double-checking early and not taking things for granted, especially for those who export at unit scale from Maya (centimeters) and scale up upon import.


On 6/20/2020 at 2:23 AM, Beq Janus said:

I have applied my fix to Firestorm, so it should be fixed for FS in the next release, and added the appropriate patch for others.

On a second thought and observing how Maya regenerates the normals applying an inverse scale, I'm now wondering if the scale on the vertex normals is even needed at all. The rescaling by inverse scale is intended to be used when the new scale is meant to be a permanent transformation and so the vertex normals have to be adjusted accordingly. However, after the "squishification" (as @polysail words it) for the quantization process to take place, the object gets reverted to its original dimensions and when rezzed its proportions go back to those the item had in the 3D application, basically restoring the original conditions for the vertex normals to be correct without being touched at all. Is there a reason downstream for the vertex normals scale to happen at all?


Yeah.  I've been down that entire rabbit hole, and came out the other side.  ( I think ) ....

Remember my very first going in position on this was "Scale Matrices Suck, I routinely get them wrong."  So my confidence level in all of this has been fluctuating wildly between "pretty high ~ but not certain" all the way down to "I have no idea what I'm doing".

Maya handles vertex normals with inverse object scale.  This is NOT how 3ds max handles it ( As best I can tell ) However, the only way to get 3DS max to render vertex normals is to use the old Editable Mesh asset type, instead of Editable Poly.  So, I'm really not entirely sure how the software handles this internally.  3DS max has a lot of bizarre intricacies behind the scenes, this might just be 'one of those things' it does the '3dsmax way'.  Which can be kinda "speshul" sometimes~

That being said : 

We've found code in SL now for object storage ( squishification ) and subsequent expansion  ~ for object rendering.  Both of which take the normals and multiply them * inverse scale.  As long as these two operations use the same maths, then ( in theory ) everything regarding mesh storage and recall is actually fine.  Conversely, if both calculations used object scale ( like 3ds max appears to ) it would also be "okay", however rendering scaled objects would have to be handled in a vastly different manner~  like I assume it is handled in 3ds max.  But this is not presently the case inside SL.  SL clearly has the maths to use inverse_scale for both calculations.  However, the display of vertex normals in SL, using the debug tool ( the little blue lines we look at ) clearly uses object scale, not inverse scale.  What this means... I honestly have no idea what is going on at this point.  If I turn off all atmospheric shaders, and disregard rendered debug-normals, and just analyze this with an Ambient Dark environment and a single point light SL seems to render surface normals correctly.  But I can't be 100% certain that this is the case, because again, normal maps and shaders are clearly still borked.

 

The only two things I am 100% certain of are:

1: Inside SL, the display of a normal map on a curved object scaled flat is incorrect.  WHY this is the case, is not something I understand yet.  Still digging on that one.

2: The display of vertex normals (rendering of debug-type info, aka drawing little lines out of the vertices), in both 3ds Max and Second Life, is unreliable, and should not be used as a deterministic tool to decide what is going on ~ even though I used it as such in my JIRA, I realize now that may have been in error.


10 hours ago, OptimoMaximo said:

On a second thought and observing how Maya regenerates the normals applying an inverse scale, I'm now wondering if the scale on the vertex normals is even needed at all. The rescaling by inverse scale is intended to be used when the new scale is meant to be a permanent transformation and so the vertex normals have to be adjusted accordingly. However, after the "squishification" (as @polysail words it) for the quantization process to take place, the object gets reverted to its original dimensions and when rezzed its proportions go back to those the item had in the 3D application, basically restoring the original conditions for the vertex normals to be correct without being touched at all. Is there a reason downstream for the vertex normals scale to happen at all?

Yes, don't forget that the mesh asset is effectively read-only, but the inworld objects can be stretched and squashed to any proportions. As such, the normals have to be stored relative to their intended proportions (the actual scale is probably irrelevant, but the relative proportions are required).

As @polysail alludes to, there is more to this bug than meets the eye; while the fix I applied fixes the scaling that is evident today, it is only papering over a deeper problem with the normals inside the viewer. I've bundled up everything I learned from this exploration and passed it to the Lab, via Jira and emails, and we'll take it from there. I've also moved the fix applied in FS to a debug setting. This is because I hope that, having done all this work to demonstrate the issues, we can get a proper fix, at which point mesh uploaded with my fix would look different. By making it a debug setting, people can take an informed choice as to whether they want to fix it this way in the short term or not. If the Lab decide that this won't be fixed, for any number of reasons, then we can change the setting's default later too.

Edited by Beq Janus

@Beq Janus @polysail

15 hours ago, polysail said:

: The display of vertex normals (rendering of debug type info ..aka drawing little lines out of the verticies ) , both in 3ds max and in Second Life both, is unreliable, and should not be used as a deterministic tool to decide what is going on~ even though I used it as such in my JIRA, I realize now that may have been in error

But when exported from SL, the vertex normals correspond to what happens when the normals get incorrectly scaled, and that is geometry data. Whether or not things are handled differently between softwares when manipulating vertex normals doesn't change this fact. 

15 hours ago, polysail said:

Inside SL, the display of a normal map on a curved object scaled flat is incorrect.  WHY this is the case, is not something I understand yet.  Still digging on that one.

This is something I could observe on the dress gifs you posted on the Jira. The fixed version with correct vertex normals still looks a little weird, and reminds me of normal map X axis flipping. There are 3D apps that handle objects and their nested relative spaces as left-handed, including the normals. SL, Maya, Max and Blender are all right-handed applications, but for example ZBrush's world orientation is left-handed, although I'm not going to explain here why its normal maps come out corrected on export.

Since you use 3ds Max you must have access to the Arnold renderer. In the Arnold tab within the normal map handling node, by default in Maya the X and Y channels for normal maps get flipped, and depending on the application that output them, those need to be turned off or on. I'm sure you get that in Max too. Try to flip the R channel on your normal map and I'm quite confident you will get a similar visual result as seen in your gifs. That is where a shader can override the geometry vertex normals. You could do a test on those same dresses: on the fixed version you may try to apply a normal map that has the red channel inverted and see if it looks any better or weirder. If this proves correct, the shader code might have been mistakenly written assuming left-handed orientations, OR the vertex normals get scaled in such a manner that they still point their Z correctly but flip the X axis the other way around, much like it happens with an Aim Constraint when the target object moves to the opposite axial range relative to the local object's zero position in space.


2 hours ago, OptimoMaximo said:

But when exported from SL, the vertex normals correspond to what happens when the normals get incorrectly scaled, and that is geometry data. Whether or not things are handled differently between softwares when manipulating vertex normals doesn't change this fact. 

No they are not.  This is why I explicitly stated that 3ds max is NOT a reliable tool for analyzing this.  Objects taken on a tour through SL behave 100% identically to import if they originate in an inverse-normals piece of software such as Maya.  I can take my test-shape in Maya, import the "in a Box" version of it and the non-enclosed copy of it into SL, view that the debug normals tool tells me they're totally wrong.  IGNORE THAT. Export them~ Re-import them into Maya and compare all my vertex normals, and not a single one will be deformed.  This in combination with the code exploration of SL's vertex normal code ~ turning up nothing but inverse_scale calcs leads me to believe that SL is handling normals correctly for most cases, but simply is displaying in the debug tool that it's doing it incorrectly.  Which is ... all kinds of confusing.

However  ~ We're not out of the woods yet  ~ so to speak~ ~ If I do this same experiment in 3ds max, MANY things can change this.

If I have a scale applied to my object in 3ds Max at a transform ( object ) level, 3ds Max will handle this with its bizarre normals * object scale matrix calc... and as best I can tell, export those... which will require a similar parity normals * object scale matrix inside SL to get them back into parity with the system.  ( Which is what Beq's optional patch addresses, in addition to adjusting how normal maps are rendered ~ but it's not a true fix. )  However, if you Apply XForm in 3ds Max prior to export, you will note that the moment you do this, 3ds Max recalculates all the vertex normals with normals * inverse object scale, bringing it into parity with SL and Maya.

However ~ if you started off with a 14.0, 1.0, 1.0 sized object that has ( 1.0, 1.0, 1.0 ) scale ( XForm applied in 3ds Max ) ~ and then take this ( 1.0, 1.0, 1.0 ) scale object and import it into SL ~ SL will compress it into an internal unit-cube .SLM file. That is akin to taking your mesh object in any 3D application, scaling it down to fit into a ( 1.0, 1.0, 1.0 ) sized cube and APPLYING THAT TRANSFORM, making the object effectively a 1.0-sized cube with ( 1.0, 1.0, 1.0 ) scale, then scaling it back up to object size at the transform ( object ) level.  In the case of our 14-meter-tall box ~ regardless of what software it was sourced from ~ it is now a ( 1.0, 1.0, 1.0 ) sized object with a ( 14.0, 1.0, 1.0 ) scale.  Which, if you import it into 3ds Max, puts us back at an object with unapplied XForm data, which uses normals * object scale to draw its normals in 3ds Max, and they LOOK WRONG until you apply XForm ~ returning the object to its original 14.0, 1.0, 1.0 size ~ with a unit identity transform.  That does not mean this is how it's handled in SL.  ( Despite it being how Render Debug Normals indicates it is being handled in SL... it's... there's many steps to this. )

 

On top of this ~ absolutely NONE of the above addresses the original concern that normal maps (  note: Not vertex normals themselves ~ ) in SL are displayed in a manner that is completely consistent with how the debug tool ( apparently incorrectly ) draws the vertex normals.  This bug is weird.  VERY weird.

Also it has nothing to do with the handedness of the RGB channels of normal maps ~ an arbitrary planar normal map displays incorrectly on the side of a flat cylinder in SL.  That's not a problem with the normal map; it was baked in planar space with all the correct color channels and magnitudes, but when you stick it on a cylinder squashed flat, it makes the side of the cylinder render as if it had its vertex normals ( nothing to actually do with how the normal map was created ) squashed to match the bounding box, just like the Render Debug Normals in SL seem to indicate they do ( wrongly ), and in parity with how 3ds Max handles unapplied object transforms.  This is directly contrary to all the other inverse_scale normal calcs, in both mesh packing and unpacking.  If you doubt me, try doing the same test as I did ~ ignore debug normals, turn off all atmospheric shaders, and just look at how objects reflect light.  They do so in a manner consistent with having their vertex normals handled correctly ( in an inverse_scale manner ).

Edited by polysail

  • 2 weeks later...

I wanted to pop back here and give you all an update on this little bug hunt. 

As of last weekend I have committed a change to both the viewer CPU code and the GPU shaders that addresses the underlying issue behind the problems that @ZedConroy raised.

En route to fixing this I also fixed the broken and misleading Render Normals debug control, which you can find by enabling the Developer -> Render Metadata -> Normals overlay.

Let's start with that. The debug was simply not performing the correct set of transformations to correctly rotate the normals.

It now correctly observes shape changes and the normals reported match those reported by the server via LSL.

A short clip of the new debug in action is here https://i.gyazo.com/a079875e9ee1ee929d00b15517d6a60e.mp4

Key differences:

* yellow normals and blue tangents.

* only selected objects by default (a debug setting can be used to revert to the old global view)

* in face select mode, only the normals and tangents of highlighted faces will be drawn (useful when the tangents and normals of neighbouring faces overlap)

 

OK, so on to the underlying bug that @polysail and I pursued.

Our investigations showed that there was no issue with the import of mesh. This is provable on a TPV that has a "save as collada" function, as you can import and re-export a sample mesh, then compare the before and after to find that they are identical: all normals and scale preserved.

To further illustrate this, normal maps placed on prims exhibit the self-same bug. This video illustrates it a little: you can see that at default proportions the cylinder looks fine. When flattened, it is harder to see, but the flat surface continues to behave as if it were curved. https://i.gyazo.com/044d26873ac068f479a7333bebcd9368.mp4

We finally traced the problem to two specific points. To properly explain this I'll briefly cover "the life of a mesh" (though this generalises to the life of a prim/sculpt too).

When a mesh asset is first imported from Collada it is converted from floating point to integers. The mesh is normalised such that it is centralised around the origin; thus if it is 12m high and 4m wide, it will be transformed to range from -6 to +6 in Z and -2 to +2 in X. It is then scaled into a unit cube with the extents ranging from -0.5 to +0.5 in each dimension (XYZ).

The unit cube is then quantised into integer form, -0.5 mapping to 0 through +0.5 mapping to 65535. All points are represented in this way. This is the form that the viewer sends to Second Life and which Second Life stores as the base asset. Note that the dimensions of the original are retained as proportional factors that are part of the mesh asset.
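The two steps above can be sketched as a round trip (a toy model with assumed helper names, not the viewer's actual code): centre the mesh, squish it into the -0.5..+0.5 unit cube, quantise each coordinate to 0..65535, and keep the original dimensions so the proportions can be restored later.

```python
# Illustrative only: normalise-and-quantise, then unpack one vertex.

def pack_mesh(vertices):
    """Centre on the BB midpoint, squish into the unit cube, quantise."""
    mins = [min(v[i] for v in vertices) for i in range(3)]
    maxs = [max(v[i] for v in vertices) for i in range(3)]
    centre = [(lo + hi) / 2 for lo, hi in zip(mins, maxs)]
    size = [hi - lo for lo, hi in zip(mins, maxs)]
    quantised = [
        tuple(round(((v[i] - centre[i]) / size[i] + 0.5) * 65535)
              for i in range(3))
        for v in vertices
    ]
    return quantised, size  # size carries the original proportions

def unpack_vertex(q, size):
    """Back to floating point, centred on the origin, proportions restored."""
    return tuple((qi / 65535 - 0.5) * si for qi, si in zip(q, size))

verts = [(0.0, 0.0, 0.0), (4.0, 4.0, 12.0)]  # a 4 x 4 x 12 mesh
packed, size = pack_mesh(verts)
print(packed)                          # [(0, 0, 0), (65535, 65535, 65535)]
print(unpack_vertex(packed[1], size))  # (2.0, 2.0, 6.0): half-extents restored
```

The vertex round trip is lossless up to quantisation error; the bug described next is in what happens to the normals and tangents, not the positions.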

When the unit cube is later requested by a viewer (note it is never directly requested; the underlying mesh is always subordinate to a parent object that dictates its in-world scale etc.), it is unpacked back into floating point but retains its unit form. When the parent object is drawn, the viewer first expands the unit cube, in a relative transform, into the scale of the parent object. It is at this stage that the first occurrence of the bug is found. Normals (in spite of their name) are special when it comes to transformations: they move contrary to the scaling of the vertices. To correctly transform a normal you use a special derivation of the transformation matrix known as the inverse-transpose matrix. The problem is that the tangent was also being transformed using the same matrix.
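A compact illustration of why normals need the inverse-transpose (a sketch for the diagonal-scale case, where the transpose is a no-op; not the viewer's code): scaling a sloped surface flattens it, so the normal must steepen, moving contrary to the vertices, while a tangent follows the vertices.

```python
# Illustrative only: squashing a 45-degree surface on Z. Tangents
# transform like points; normals use the inverse of the scale.

def scale_point(p, s):
    """Transform a point (or tangent) by a diagonal scale matrix."""
    return [pi * si for pi, si in zip(p, s)]

def scale_normal(n, s):
    """Transform a normal: inverse of a diagonal scale is 1/s per axis."""
    v = [ni / si for ni, si in zip(n, s)]
    length = sum(c * c for c in v) ** 0.5
    return [c / length for c in v]

s = (1.0, 1.0, 0.25)       # squash Z, as the unit-cube step might
tangent = [1.0, 0.0, 1.0]  # lies in a 45-degree surface
normal = [-1.0, 0.0, 1.0]  # perpendicular to that surface

t2 = scale_point(tangent, s)  # the bug: tangents were wrongly sent
n2 = scale_normal(normal, s)  # through scale_normal as well

# Correctly transformed, they stay perpendicular:
dot = sum(a * b for a, b in zip(t2, n2))
print(abs(dot) < 1e-12)  # True
```

Running the tangent through the normal path instead (dividing by the scale) would lift it out of the squashed surface, which is the tangent bug described above.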

At this stage our mesh has been inflated from its unit form into the local object scale. All of this happens in the viewer, on your CPU. It is then passed to the GPU for further work. The shaders on the GPU take the local mesh and rotate it through a combined transformation so that it is aligned to the camera viewpoint you are observing it from (and is ready to be drawn on the screen). This transformation (using a matrix known as the Model-View-Projection transform) also has to adjust the normals, and again it employs the inverse-transpose, and once more the tangents were also having this applied to them.

As a final treat, the legacy bump map code also exhibits the same problem and has been corrected as well. 

In closing I should note that rigged mesh is a whole other ball game. Firstly, it does not share the same kind of behaviour; however, it is also not correct. There are threads here, and Jiras that @Beev Fallen has recently raised, that cover this. The rigged mesh issue is, however (as best I can tell), a planned and known "approximation" which is pretty much accepted in realtime 3D games (or at least ones of Second Life's generation); see this Jira (https://jira.secondlife.com/browse/BUG-228823) for a little more background.

And so... here we are. The following image shows the before and after, using my earlier example of a standard cube and a flattened cylinder, which, allowing for minor deviations in UV, should otherwise behave the same under lights because the normals SHOULD point in the same directions.

LEFT is release Firestorm (and you should see the same on other viewers going back 9 years or so). RIGHT is the fixed Firestorm

b241f2791ade99f33029b7c41506268b.jpg

And there we have it. A real mish-mash of technical and non-technical stuff. I tried not to go off too far into techno-babble land, but I hope I didn't dumb it down too much for those who were following the details.

Here is a video clip of the side by side old and new https://gyazo.com/91d35910208bf657988f57b3fbc16f97

Any mistake, I blame on it being 3:15am 🙂

 

 

 

Edited by Beq Janus

I actually have no right words for the job you guys, @Beq Janus, @polysail, have done here.
The final result is mind-blowing to me. IT IS MIND-BLOW-ING.
I have no idea how this wasn't seen for ages. SL is going to change when you release this.
Thank you guys for your brilliant investigation, your time spent, and all that work.

