Jenna Huntsman

Posts posted by Jenna Huntsman

  1. 41 minutes ago, Quistess Alpha said:

    For this sort of thing, it's more about information flow than the math. Figure out what information you have, what information you need, and then find a way to convert one to the other.

    Thinking about it backwards, starting with the 'want': a spherical arc between 2 points needs a center; once you have that, there are probably several ways to turn it into an arc. Fixing the thing I wrote years ago to take a center rather than just assuming <0,0,0> is the center:

    // 2 points, a center and a list of percents along the circular arc to return. percent=0->A, percent=1.0->B
    // assumes A,center,B is an isosceles triangle, which in general it won't be.
    list sphere_interpolate(vector A,vector B, vector center, list percents)
    { // return A when percent ==0, B when percent==1
      
      A = A-center;
      B = B-center;
      
      rotation rot = llRotBetween(A,B);
      vector axis = llRot2Axis(rot);
      float angle = llRot2Angle(rot); // llRot2Angle returns a float
      
      list rets;
      integer i = llGetListLength(percents);
    while(~--i)   // note: walks the percents list from the end, so results come back in reverse order
      {   //vector ret = center + A * llAxisAngle2Rot(axis,percent*angle);
          rets+= ( center + A*llAxisAngle2Rot(axis,angle*llList2Float(percents,i)) );
      }
      
      return rets;
    }

    but if I'm understanding you right, we don't know which point is the center of our hypothetical sphere; we only have a normal to the plane and a radius (or some other measure of the curvature - after looking at some quick sketch diagrams, the most intuitive and easy-to-work-with piece of information is the inner angle of the curve). It would need some debugging, but my first try would probably go something like:

    // UNTESTED!
    // angle in radians, TWO_PI -> return midpoint of A&B; PI -> return center that makes a quarter arc; 0 -> math error (point at infinity).
    vector find_center(vector A, vector B, vector normal, float angle)
    {
       vector average = 0.5*(B+A); // point between B&A.
       vector delta = average-A; // could use (B-A)/2, but this makes a bit more sense since we'll use average for the last step.
       
       float halfChord = llVecMag(delta); // half the distance from A to B.
        
       float dist = halfChord*llTan(0.5*angle); // distance from average to center
      
       vector cross = llVecNorm(normal%delta); // should point from midpoint of B&A to center.
       return average + dist*cross;
       
    }

    Big picture idea is that whenever you have a normal vector, you almost always want to take a cross product with some vector that lies in the plane of interest, giving you 2 vectors in that plane, from which you should be able to make any other point in that plane as ( (some point in the plane) + (a weighted sum of those 2 vectors) ).
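
    To make that concrete, a minimal LSL sketch (illustrative and untested): given the plane normal and one vector D already known to lie in the plane, the cross product gives a second in-plane direction, and any other point of the plane is a known point plus a weighted sum of the two:

    vector point_in_plane(vector origin, vector normal, vector D, float a, float b)
    {
        vector u = llVecNorm(D);          // first in-plane direction
        vector v = llVecNorm(normal % D); // second in-plane direction, perpendicular to both
        return origin + a*u + b*v;        // (some point in the plane) + (weighted sum of the 2 vectors)
    }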

    Hmm, okay, seems like I made a mistake.

    The data I have are: two vectors (A and B) representing arbitrary points on the surface of a hypothetical shape, a vector for the center point of that shape, and a float representing the interpolation amount.

    If it helps, I also know the 2 rotations of those points A and B.

    Looking at the output, I wonder if the results falling outside what I'm expecting is because it would be better to interpolate along an ellipsoid rather than a sphere? How could that be done?

    I (somewhat shamefully) asked an AI model about this but as far as I could tell the math it was able to output was garbage.

  2. On 9/20/2021 at 4:14 AM, Quistess Alpha said:

    So, I know this is an old post, but I stumbled upon it, and it didn't look like it got a good answer. I might be understanding the problem wrong, but here's the problem, as I'm understanding it:

    Assume there is a sphere at point <0,0,0> and two vectors A and B on that sphere. Find equally spaced (in spherical distance) points between A and B that lie on the sphere:

    vector sphere_interpolate(vector A,vector B, float percent)
    { // return A when percent ==0, B when percent==1.
      rotation rot = llRotBetween(A,B);
      vector axis = llRot2Axis(rot);
      float angle = llRot2Angle(rot); // llRot2Angle returns a float
      
      vector ret = A * llAxisAngle2Rot(axis,percent*angle);
      // if the lengths of A and B are known to be the same, this next line is unnecessary:
      ret = llVecNorm(ret) * ( ((1-percent)*llVecMag(A)) + (percent*llVecMag(B)) ); // there are efficiency gains to be had here I'm too lazy to do.
      return ret;
    }

    and for the given example of wanting to find 2 points, let percent be 0.33 and 0.66.
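
    For instance, a usage sketch of the function above for those two points (untested; A and B here are placeholder values):

    vector A = <1.0, 0.0, 0.0>;
    vector B = <0.0, 1.0, 0.0>;
    vector p1 = sphere_interpolate(A, B, 0.33);
    vector p2 = sphere_interpolate(A, B, 0.66);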

    Just revisiting this problem after a little while!

    I've figured out that I also need to provide a plane normal, so that the interpolation path follows a defined plane between points A and B and has the appropriate curvature along the sphere.

    How could that be added?

    (many thanks for this code! I'm not a great mathematician and I can't seem to find the info I need elsewhere!)

  3. sRGB should be enough for content creation in SL.

    While SL is likely to be able to support HDR output in future, the content standards used in SL actually require textures to be in sRGB space anyway.

    It's also worth noting that DCI-P3 isn't really a color space that's used in computing - it's a standard designed for cinemas. Monitor manufacturers like to quote it because it's a smaller color space than the one actually used for HDR (BT. 2020), so it gives a higher % coverage figure.

    In computing, the 2 major color spaces are sRGB and BT. 2020 (also known as Rec. 2020).

    • Like 1
  4. 24 minutes ago, Eoul Derryth said:

    They do? In my case, my eBody Reborn, Maitreya, Belleza and Legacy bodies all seem to not have any different neck sizes available in my inventory, and they don't seem to care if I'm using my Catwa or Logo heads. They just seem to work (see pictures). This is why I assumed that there was a standard neck seam size that everyone used. I've even gone in just now and tested my Legacy, Freya and Reborn bodies against my Logo Mae and Catwa Uma heads. While the seam between head and body is, of course, plainly visible, there is no gap. I did, however, discover a sizeable gap between the default Ruth avatar and the bodies while swapping heads, so that answers the question of whether I am the one in error. Also checking my own body against the Ruth head, it appears that while I got the scale right, the position is off. Going back and comparing the bone positions with the MayaBentoFemaleAugust2016 file I downloaded from the SL wiki, there is no such discrepancy. So, in summary, while I don't agree with your answer, in checking and verifying what you said, I can only conclude that I just have to add some to my neck and taper in order to pass through the existing necks on the heads and do my best to just hide it, so thanks!

    Inventory2.jpg

    Inventory1.jpg

    Ruth_Legacy.jpg

    Logo_Legacy.jpg

    Legacy_Catwa.jpg

    Reborn_Catwa.jpg

    Logo_eBody.jpg

    I think these images demonstrate that the Catwa head in question is using a nonstandard (or at least, not SL Neck) neck, as the Logo heads and Reborn use SL Neck by default (Not sure about Legacy, although I'd guess they are using SL Neck).

    A bonus with SL Neck is that it also resolves the seam issues caused by misaligned material maps, which is even more important with the introduction of the PBR viewer, as the easy "fix" of turning ALM off (not actually a fix, but one people keep repeating) is no longer available.

  5. Go back to basics for a second -

    Redeliver your body and head. Unpack the fresh copies, and wear them.

    Is the seam still visible?

    For example, my partner wearing the latest version of Jake and his head (LeLutka Quinn)

    Screenshot from 2024-03-06 16-45-04.png

    • Like 1
  6. Just now, Zalificent Corvinus said:

    San Fran is not on the equator, and it's not the Spring Equinox.

    Still not the Equinox, go away and learn some science.

    Your second example is not only not on the Equinox, it's at timezone noon, which isn't exactly the same as LOCAL noon (sun at peak elevation), and it's not exactly on the equator either.

     

    LOCAL noon (sun at highest elevation), on the Equinox, on the Equator.

    You might find this a bit closer.

    https://www.suncalc.org/#/0.0001,109.3367,16/2024.03.20/11:50/1/3

     

    Apologies, I misread what you said.

    It is indeed true that, if all the criteria are met, the sun can be at (or extremely close to) a 90 degree elevation.

    However, the criteria are extremely specific and fall outside what most people would expect.

    For example: as you highlighted, clock midday may not line up with the exact solar midday for a given point on Earth, due to the use of timezones. I'd argue, however, that most people think of midday as the clock time rather than solar midday.

    I'd also argue that the majority of people wouldn't assume that the midday preset was deliberately made to represent a point on the equator on the day of the spring / autumn equinox. The San Francisco example shows what someone from North America might expect to see at midday.

    • Like 1
  7. 6 minutes ago, Zalificent Corvinus said:

    Except at noon, on the Spring or Autumn Equinox

    Not quite.

    For example, the sun-position data for San Francisco on the Spring Equinox 2023 at midday puts the sun well short of a 90 degree elevation.

    https://www.suncalc.org/#/37.7771,-122.4197,12/2023.03.20/12:00/1/3
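
    As a quick sanity check, using the standard solar-noon altitude formula with San Francisco's latitude \(\phi \approx 37.8^\circ\)N and solar declination \(\delta \approx 0^\circ\) at the equinox:

    \[ h_\text{noon} \approx 90^\circ - \lvert \phi - \delta \rvert = 90^\circ - \lvert 37.8^\circ - 0^\circ \rvert \approx 52^\circ \]

    so even at exact solar noon on the equinox the sun only reaches roughly 52 degrees above the horizon there, and at clock noon it's a little lower still.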

    6 minutes ago, Zalificent Corvinus said:

    for people living on the Equator.

    This is closer to being true, although, again, not quite 90 degrees.

    Today's date, for example:

    https://www.suncalc.org/#/-0.0201,109.337,14/2024.03.01/12:00/1/3

     

     

    • Haha 1
  8. 4 minutes ago, Rowan Amore said:

    I don't even care what it looks like in PBR because midday is midday = sun directly overhead.

    It is actually relevant, as the midday preset was updated in the PBR viewer, and this complaint is one of the things that was addressed.

    The new default midday no longer places the sun directly overhead (because that doesn't really happen in the real world anyway).

    • Like 3
  9. 1 hour ago, Zalificent Corvinus said:

    Which becomes meaningless if you don't have realistic values for lighting.

    This is actually coming as part of the glTF stage 2 (mesh and scene import) project.

    Lights will be able to be defined in physical units, as opposed to the arbitrary 0-1 scale that is currently used.

    • Thanks 1
  10. 28 minutes ago, Rowan Amore said:

    Midday = Noon.  When the sun is directly overhead, it casts shadows on your avatar FROM your avatar.  You'll have weird looking light and dark areas.  We recently had someone ask why their feet were not matching the rest of their body.  It was quite noticeable.  From the picture they posted, I could see it was a lighting issue - as in, she probably had it on default/midday.  She changed it to Nam's.  All was well.

    I'm sure midday is great for building.  Horrible for your 10000L plus avatar, which frankly, is what most people care about looking the best.  Not the chair you made.

     

    That issue in particular is likely down to the user's preferences, namely shadows being disabled.

    Switching EEPs is a hack around the issue, as Nam's (the EEP mentioned as the "fix") uses a high level of ambient light to eliminate areas which would otherwise be shaded - not really a PBR issue per se. (You could reproduce this on the 6.x viewer.)

  11. 1 minute ago, Zalificent Corvinus said:

    So, you make the mistake of buying "Pretentious Bloody Rubbish" enabled clothing and it looks like crap because "you are using the WRONG EEP/Probe Set".

    Then you go somewhere, your favourite club say, and the owner of the club isn't using the EEP/Probe Set recommended by the maker of your outfit, or that suggested by the makers of anyone else's outfits.

    If the clothing creator packages a specific EEP that they require you to use in order for the clothing to display correctly, that's a sign of faulty content.

    PBR enabled clothing does not and should not require the use of a specific EEP preset.

    • Like 1
    • Thanks 1
  12. 1 minute ago, Rowan Amore said:

    But venues/regions using them will affect our own avatars?  Honestly, couldn't care less how that chair looks, but if my avatar looks like kaka with any EEP setting I might use, it's a hard NO.

    Yes.

    An avatar stood within a reflection probe will have its reflections and lighting influenced by it.

    Reflection probes always work based on the EEP currently applied in the user's viewer, so any manually applied EEP will display the correct lighting within that probe for that setting, combined with any local lights (if any are present).

    • Like 2
  13. 1 hour ago, Qie Niangao said:

    But it's my understanding that PRIM_RENDER_MATERIAL returns an impenetrable asset from which there's no extracting the component material maps, and PRIM_GLTF_BASE_COLOR returns only the override parameters which will only reveal a texture if it has been overridden from whatever the Material specified internally.

    Yes, this is how this works.

    The stated reason for things working this way is that the region (where the script is executed) doesn't actually know anything about the contents of a material asset, but does know the overrides (the overrides are stored locally on the region, whereas the content of the material asset lives on the CDN, which the region avoids communicating with where possible).
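
    As a small illustration of the override side, a minimal sketch (untested; it assumes the face-indexed get form of PRIM_GLTF_BASE_COLOR described on the wiki and just dumps whatever the region returns):

    // Dump the glTF base color *override* data for face 0 of this prim.
    // An empty texture field here just means nothing has been overridden -
    // the real texture still lives inside the (opaque) material asset.
    default
    {
        touch_start(integer n)
        {
            list override = llGetLinkPrimitiveParams(LINK_THIS, [PRIM_GLTF_BASE_COLOR, 0]);
            llOwnerSay(llDumpList2String(override, " | "));
        }
    }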

    • Thanks 2
  14. 1 hour ago, Cooter Coorara said:

    Consider the reflection probe which I suspect is an after the fact work around.

    This is incorrect - reflection probes are pretty much standard practice in other engines.

    See:

    Unreal Engine 5.0 - https://docs.unrealengine.com/5.0/en-US/reflections-captures-in-unreal-engine/

    Godot 4 - https://docs.godotengine.org/en/stable/tutorials/3d/global_illumination/reflection_probes.html

     

    Anyway, it is indeed a bug that Planar alignment doesn't work for PBR materials.

  15. 3 minutes ago, Cooter Coorara said:

    You can change it to a box but that is irrelevant.  What it reflects is still a sphere.

    This is incorrect. The probe type influences the projection and sampling of the reflection map, but it doesn't change the object that's showing the reflection. So if you're viewing the reflection on a chrome ball, the box and sphere probe types will look very similar. But on a planar (i.e. flat) chrome surface (a "mirror" of sorts) you can easily see the difference in projection types.

    5 minutes ago, Cooter Coorara said:

    So leave it as such and make the sphere large enough to cover the entire build

    No - a probe in each room is how you're meant to do it, as you'll then get correct reflections and lighting on objects in each room. This becomes especially apparent in a nighttime setting, where lighting has a much more obvious effect.

    Generally for interior scenes, unless you've got a massive room, you'll want to use a single box probe for the room.

    • Like 1
    • Thanks 1
  16. 2 minutes ago, Quistess Alpha said:

    couldn't you just take the camera's rotation and left-multiply by a 90 degree turn on the z-axis?

    rotation point_y_to_camera = <0, 0, 0.70711, 0.70711>*camera_rotation; 

     

    Not quite - I did try this at first, but because the object could be anywhere on screen, it looks skewed by perspective when it's near the edge of the screen. So I implemented the function above to have the object always orient itself to look into the "lens", which resolves the skew issue.

    • Thanks 2
  17. Ended up managing to solve this by a dumber method than I thought was needed.

    I modified Dora's function to have its UP vector multiplied by the rotation of the camera, so the UP direction is kept relative to the camera rather than the world.

    orientToCameraMaintainUp(rotation camRot, vector camPos)
    { //Rotate the object to the camera, while also keeping the object's +Z axis relative to camera. WARNING: SUFFERS FROM GIMBAL LOCK
        rotation inv = <0.00000, -0.00000, -0.70711, 0.70711>; //-90deg on Z axis.
        vector target = camPos - llGetPos();
        //camRot = Vec2RotTrue(target); //Rotate X+ towards camera. Suffers gimbal lock.
        camRot = Vec2RotTrueRelative(target, camRot); //Rotate X+ towards camera. No gimbal lock.
        camRot = inv * camRot; //Rotate Y+ towards camera.
        list changes = [PRIM_ROTATION,camRot];
        llSetLinkPrimitiveParamsFast(LINK_ROOT,changes);
    }
    
    rotation Vec2RotTrueRelative( vector V, quaternion R )
    { //Rotate to look at V while maintaining RELATIVE up axis. Credit: Dora Gustafson, Jenna Huntsman
        V = llVecNorm( V );
        vector UP = < 0.0, 0.0, 1.0 > * R; //Keep UP axis relative to input rotation R
        vector LEFT = llVecNorm(UP%V);
        UP = llVecNorm(V%LEFT); // you want to keep the direction of V
        return llAxes2Rot(V, LEFT, UP);
    } //https://wiki.secondlife.com/wiki/User:Dora_Gustafson/llAxes2Rot_right_and_wrong
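
    For completeness, a usage sketch (untested) of how I'd drive it - it assumes both functions above are in the same script and that the owner grants camera tracking:

    default
    {
        state_entry()
        {
            llRequestPermissions(llGetOwner(), PERMISSION_TRACK_CAMERA);
        }
        run_time_permissions(integer perm)
        {
            if (perm & PERMISSION_TRACK_CAMERA) llSetTimerEvent(0.1); // re-aim 10x per second
        }
        timer()
        {
            orientToCameraMaintainUp(llGetCameraRot(), llGetCameraPos());
        }
    }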

     

    • Like 1
    Hey all - I've been bashing my head against getting this to work, but I fear I'm a little rotationally challenged.

    I've got an object which should always point its +Y axis at the user's camera while keeping its +Z axis aligned with the camera's UP axis, but I'm having issues.

    The code I've got so far is this:

    orientToCameraMaintainUp(rotation camRot, vector camPos)
    { //Rotate the object to the camera, while also keeping the object's +Z axis relative to camera. WARNING: SUFFERS FROM GIMBAL LOCK
        rotation inv = <0.00000, -0.00000, -0.70711, 0.70711>; //-90deg on Z axis.
        vector target = camPos - llGetPos();
        camRot = Vec2RotTrue(target); //Rotate X+ towards camera.
        camRot = inv * camRot; //Rotate Y+ towards camera.
        list changes = [PRIM_ROTATION,camRot];
        llSetLinkPrimitiveParamsFast(LINK_ROOT,changes);
    }
    
    rotation Vec2RotTrue( vector V )
    { //Rotate to look at V while maintaining up axis. Credit: Dora Gustafson
        V = llVecNorm( V );
        vector UP = < 0.0, 0.0, 1.0 >;
        vector LEFT = llVecNorm(UP%V);
        UP = llVecNorm(V%LEFT); // you want to keep the direction of V
        return llAxes2Rot(V, LEFT, UP);
    } //https://wiki.secondlife.com/wiki/User:Dora_Gustafson/llAxes2Rot_right_and_wrong

    Which works pretty well... but it suffers from gimbal lock if the camera is pointing straight up or straight down (or close to either).

    I haven't got any good ideas about how to solve the problem, and my understanding of quaternion math isn't great. Hopefully someone with more knowledge than I can point me in the right direction - all tips appreciated! :)

  19. 4 hours ago, Qie Niangao said:

    Do you happen to have some of these saved and could share? All I know about are:

    which are generally better than nothing, but too superficial to really understand what all needs to change to make EEPs "compliant" with PBR.

    There's some additional information in the scripting wiki for llGetEnvironment(), but it's assuming "facts not in evidence", for example:

    Like… and this makes sense because…? Just try searching the Knowledge Base for "HDR".

    The alternative is dipping into Content Creation User Group transcripts that assume so much history, like:

    It may as well be in Greek. So I'm really hoping there are pages out there I'm just not finding. Somehow EEP creators are making PBR versions of their previous Environments, but how did they know what to do? Folks wanting to create (or migrate) their own Environments need a clue.

    Documentation for EEP has always been an issue. I did a small writeup of some rules-of-thumb for making presets that play well with the PBR viewer here - https://wiki.secondlife.com/wiki/User:Jenna_Huntsman#PBR

    The llGetEnvironment behaviour is because the PBR viewer no longer implements gamma as older viewers did. Instead, the gamma value serves as an adjustment to the HDR camera in the scene (if reflection probe ambiance is above 0), or to the overall scene brightness (if reflection probe ambiance is 0).
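
    As a concrete sketch of reading those two values (untested; SKY_GAMMA is documented for llGetEnvironment, while SKY_REFLECTION_PROBE_AMBIANCE is my assumption for the probe-ambiance selector - check the wiki page for the exact constant name):

    float gamma    = llList2Float(llGetEnvironment(llGetPos(), [SKY_GAMMA]), 0);
    float ambiance = llList2Float(llGetEnvironment(llGetPos(), [SKY_REFLECTION_PROBE_AMBIANCE]), 0); // assumed constant name
    // Per the above: with ambiance > 0, gamma acts as an adjustment to the HDR camera;
    // with ambiance == 0, it scales the overall scene brightness.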

    The definition of a "PBR" EEP will likely vary depending on who you ask, but it mostly boils down to 2 descriptions:

    1. An EEP that the author has verified matches their artistic vision for the environment in the PBR viewer.
    2. (my definition) An EEP that has removed the classic ambient light in favour of the PBR viewer's indirect lighting system (IBL), and thus has a reflection probe ambiance of 1 or above.
    • Thanks 3
  20. On 1/30/2024 at 1:06 PM, Qie Niangao said:

    The description of fade_color, "the current color of the light emitted from the dominant light source" sounds good, but I just don't see that color in the ambient lighting. With a black SKY_AMBIENT, that "Environment Ambient" is pretty dark and nearly monochromatic, so the best I can find for approximating Environment Ambient seems to be SKY_LIGHT total_ambient.

    The wiki article on llGetEnvironment mentions that fade_color erroneously returns an unclamped value, meaning the return isn't what you'd expect. You can fix this by clamping manually, e.g.:

    vector luminenceHack(vector in)
    { //Hack to modify input colour to max brightness. Credit: Jenna Huntsman.
        list channels = [in.x, in.y, in.z];
        float maxChan = llListStatistics(LIST_STAT_MAX,channels);
        if(maxChan == 0)
        {
            return ZERO_VECTOR; //Avoid divide by zero error.
        }
        else
        {
            return in*(1/maxChan);
        }
    } //This func also handles the clamping: scaling by 1/maxChan pulls the largest channel back to exactly 1, so unclamped (greater than 1) inputs are brought back into range.
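
    For example (illustrative numbers), an unclamped fade_color-style value gets pulled back into range while keeping its hue:

    vector a = luminenceHack(<1.2, 0.6, 0.3>);  // -> <1.0, 0.5, 0.25>
    vector b = luminenceHack(<0.2, 0.1, 0.05>); // -> <1.0, 0.5, 0.25> (dim colors are brightened too)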

    It's also worth noting that fade_color is the general ambient-lit color of the environment, but it won't factor in anything coming from reflection probes.

    With a PBR environment preset, expect total_ambient to be zero (<0,0,0>) or close to zero, regardless of time of day - you can see that in these presets - https://marketplace.secondlife.com/p/PBR-EEP-Collection-Windlights-Studio-Saberhagen/25626640

     

     

  21. 14 hours ago, Qie Niangao said:

    First, a huge thank you for looking at this and giving feedback; I was so hoping you specifically would take a look. Now I'm thinking I too will end up doing some "empirical study" to better inform my next pass at this.

    It seems, if I'm reading your other notes correctly, that my laziness got me in trouble: I was trying to avoid putting labels inside swatches that would change colors depending on the environment in which the probe was being set. As a result, I put the labels outside those swatches along a horizontal line, so they erroneously appear to label the level of that line, not the swatch above or below it as intended. It was all to avoid finding a contrasting and readable label color at runtime, but now I think simple black or white will work for inside-swatch labels depending on the value of the swatch color.

    The "EEP ambiance" label text doesn't appear to work anyway, and I guess "Environment Ambient" needs to squeeze in there somehow.

    There's also a nagging problem with the very horizontality of that 0-1 range sum of Environmental Ambient and indirect "Irradiance" lighting. I need to show that there's a simple proportional amount of each, but that suggests there's a y-value that's constant over that range and there's not. I stared at that for a long time and just couldn't find a practical, less confusing alternative. (This is related to why I find the single float so confusing: it doesn't monotonically adjust any perceptible quantity except over piecemeal ranges, and even there it's a non-obvious assortment of quantities being adjusted. Well, non-obvious to me, that is.)

    I did some additional testing just now, here's what I found:

    total_ambient is a combination of all things that contribute to ambient lighting directly, so that means:

    • EEP Ambient Color
    • Cloud color (likely multiplied by cloud coverage value)

    That means that with newer PBR presets, total_ambient may read ZERO_VECTOR if the EEP ambient color is set to black ( <0,0,0> ) and cloud coverage is set to zero, even though the actual observed ambient light value is something else - hence why I say that fade_color, with an additional clamping mechanism, is a better measurement (among other reasons).

     

    Anyway - some feedback on the graph you made, just personal opinion, and I'm not sure I have any good solutions for these (as I write this I've had a few drinks and can't think straight enough, haha!):

    • The graph suffers from non-linear range issues: the 0-1, 1-4 and 4-100 ranges are each represented by the same distance on the axis
    • Not sure what the "Current EEP minimum" slider is meant to do. Could be that my inebriation is inhibiting me from understanding, but clarification would be appreciated.
    On 1/19/2024 at 9:12 PM, Qie Niangao said:

    ² If a product includes multiple reflection probes in the linkset with potentially different settings for each, there also needs to be a way of navigating among them (at least by name).

    The way I've handled this up until now is to put the name of the room that the reflection probe is in into the probe's description field, which can then be read later by a script. Generally you want all reflection probes in the same room to share the same ambiance value.
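
    A minimal sketch (untested) of the read-back side: walk the linkset, read each link's description, and collect the links whose description matches a given room name - the PRIM_REFLECTION_PROBE adjustments would then be applied to those links (see the wiki for the exact parameter list):

    // Return the link numbers whose description matches the given room name.
    list probeLinksForRoom(string room)
    {
        list matches;
        integer link = llGetNumberOfPrims();
        for (; link >= 1; --link) // assumes a linkset; links are numbered 1..N
        {
            string desc = llList2String(llGetLinkPrimitiveParams(link, [PRIM_DESC]), 0);
            if (desc == room) matches += link;
        }
        return matches;
    }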

    • Thanks 3
  22. 8 minutes ago, Qie Niangao said:

    Above I threatened to create a UI for the scripted function that can adjust a reflection probe's ambiance (with PRIM_REFLECTION_PROBE attributes), so I took a shot at a design. This concoction made me realize just how much I didn't know, which means this is sure to include errors, so it's a DRAFT strawman very much in need of review:

    Screenshot 2024-01-19 145007.png

     

    Some feedback:

    • "EEP" ambiance isn't a constant value as this changes with the sky setting, so there should be a swatch showing the ambient setting.
    • Any probe ambiance value above 1 is a multiplier on the irradiance contribution - values above 4 make for extra sky contribution (so, "extra sky" should be clarified)
      3 hours ago, Qie Niangao said:

      the total_ambient attribute of SKY_LIGHT (which I'm calling "extra sky" in hopes it's the color of the wiki's "only indirect lighting received from the sky")

       

    • My own testing of that function has told me that fade_color is usually a more accurate approximation of indirect light produced by the environment preset. (It does require being clamped into a valid range, however).
    • Thanks 3