Everything posted by Jenna Huntsman

  1. If you're having these issues, you may wish to note down which viewer you're using (you don't appear to be using the Linden viewer) and make sure you're on the latest version of it. You may also wish to use a PBR test card in the environment which is causing issues, to see whether the content itself or SL's rendering is at fault - https://marketplace.secondlife.com/p/PBR-MacBeth-Color-Checker-Chart/25493856 PBR is still actively receiving fixes, so if you can reproduce this on the latest LL viewer, please file a bug report at https://feedback.secondlife.com/
  2. The wiki article has now been updated to reflect the correct settings for the current version of Substance.
  3. Double check the bit depth of the exported images. SL only allows 8-bit color in textures, but Substance will often export at 16-bit, which the viewer will crunch down (badly) to 8-bit. See: https://wiki.secondlife.com/wiki/PBR_Materials#Adobe_Substance_3D_Painter
  4. Hmm, okay, seems like I made a mistake. The data I have are: two vectors (A and B) representing arbitrary points on the surface of a hypothetical shape, a vector for the center point of that shape, and a float representing the interpolation amount. If it helps, I also know the rotations of those two points A and B. Looking at the output, I wonder if the results falling outside what I'm expecting is because it'd be better to interpolate along an ellipsoid rather than a sphere? How could that be done? I (somewhat shamefully) asked an AI model about this, but as far as I could tell the math it was able to output was garbage.
  5. Just revisiting this problem after a little while! I've figured out that I also need to provide a plane normal, so that the interpolation path follows a defined plane between points A and B and has the appropriate curvature along the sphere. How could that be added? (Many thanks for this code! I'm not a great mathematician and I can't seem to find the info I need elsewhere!)
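     A rough sketch of one way to approach this, assuming the path should be a circular arc around the given center, constrained to the plane defined by the supplied normal, with the distance from the center linearly interpolated between the two points (the function name slerpOnPlane and the angle tolerance are just placeholder choices):

     vector slerpOnPlane(vector A, vector B, vector center, vector planeNormal, float t)
     { //Interpolate from A to B along an arc around center, constrained to the plane with normal planeNormal.
         vector n = llVecNorm(planeNormal);
         vector a = A - center;
         vector b = B - center;
         a = a - n * (a * n); //Project both offsets onto the plane. (vector * vector is the dot product)
         b = b - n * (b * n);
         float ra = llVecMag(a);
         float rb = llVecMag(b);
         float radius = ra + t * (rb - ra); //Linearly interpolate the distance from the center.
         vector na = llVecNorm(a);
         vector nb = llVecNorm(b);
         float theta = llAcos(na * nb); //Angle between the two projected directions.
         if (theta < 0.0001) return center + na * radius; //(Nearly) parallel - nothing to rotate, avoid divide by zero.
         vector dir = (na * llSin((1.0 - t) * theta) + nb * llSin(t * theta)) / llSin(theta); //Spherical linear interpolation.
         return center + dir * radius;
     }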
  6. sRGB should be enough for content creation in SL. While SL is likely to be able to support HDR output in future, the content standards used in SL require textures to be in sRGB space anyway. It's also worth noting that DCI-P3 isn't really a color space that's used in computing - it's a standard designed for cinemas. Monitor manufacturers like to quote it because it's a smaller color space than the one actually used for HDR (BT.2020), so it gives a higher % coverage figure. In computing, the 2 major color spaces are sRGB and BT.2020 (also known as Rec. 2020).
  7. I think these images demonstrate that the Catwa head in question is using a nonstandard (or at least, not SL Neck) neck, as the Logo heads and Reborn use SL Neck by default (not sure about Legacy, although I'd guess they are using SL Neck). A bonus with SL Neck is that it also resolves the seam issues caused by misaligned material maps, which is even more important with the introduction of the PBR viewer, as the easy "fix" (not a fix at all actually, but one people keep reiterating) of turning ALM off is no longer available.
  8. You may also want to compare your body with a LeLutka head, as they originated the standard that most bodies and heads use now, called the "SL Neck". Not sure if there's a standard document or anything, but if you contact the LeLutka devs they might be able to point you in the right direction.
  9. Go back to basics for a second - redeliver your body and head. Unpack the fresh copies and wear them. Is the seam still visible? For example, here's my partner wearing the latest version of Jake and his head (LeLutka Quinn).
  10. Apologies, I misread what you said. It is indeed true that if all criteria are met, it is possible for the sun to be at a 90 degree (or extremely close to it) elevation. However, the criteria are extremely specific and fall outside what most people would expect. For example: as you highlighted, 12:00 on the clock may not equate to the exact solar midday at a given point on the Earth due to the use of timezones. I'd argue, however, that most people would think of midday as 12:00 on the clock rather than solar midday. I'd also argue that the majority of people wouldn't assume that the midday preset was deliberately made to represent a point on the equator on the day of the spring / autumn equinox. The San Francisco example shows what someone from North America might expect to see at midday.
  11. Not quite. For example, the meteorological data for San Francisco on the Spring Equinox 2023 at midday places the sun distinctly not at a 90 degree elevation. https://www.suncalc.org/#/37.7771,-122.4197,12/2023.03.20/12:00/1/3 This is closer to being true, although, again, not quite 90 degrees. Today's date, for example: https://www.suncalc.org/#/-0.0201,109.337,14/2024.03.01/12:00/1/3
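      (For context on the numbers: the sun's elevation at solar noon is roughly 90° minus the absolute difference between the observer's latitude and the solar declination. At the equinox the declination is about 0°, so for San Francisco at roughly 37.8°N that works out to 90° - 37.8° ≈ 52° - the sun is only directly overhead when the latitude matches the declination, i.e. somewhere in the tropics.)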
  12. It is actually relevant, as the midday preset was updated in the PBR viewer, and this complaint is one which was addressed. The new default midday no longer places the sun directly overhead (because that doesn't really happen in the real world anyway).
  13. This is actually coming as part of the glTF stage 2 (mesh and scene import) project. Lights will be able to be defined in physical units, as opposed to the arbitrary 0-1 scale that is currently used.
  14. That issue in particular is likely down to the user's preferences, namely shadows being disabled. Switching EEPs is a hack around the issue, as Nam's (the EEP mentioned as the "fix") uses a high level of ambient light to eliminate areas which would otherwise be shaded - not really a PBR issue per se. (You could reproduce this on the 6.x viewer.)
  15. If the clothing creator packages a specific EEP that they require you to use in order for the clothing to display correctly, that's a sign of faulty content. PBR enabled clothing does not and should not require the use of a specific EEP preset.
  16. Yes. An avatar standing within a reflection probe will have its reflections and lighting influenced by it. Reflection probes always work based on the EEP currently applied in the user's viewer, so any manually applied EEP will display the correct lighting within that probe for that setting, combined with any local lights (if any are present).
  17. This intentionally doesn't work, so that's a non-issue. https://wiki.secondlife.com/wiki/PBR_Materials#Unsupported_use-cases
  18. Yes, this is how this works. The stated reason for things working this way is that the region (where the script is executed) doesn't actually know anything about the contents of a material asset, but does know the overrides (the overrides are stored locally on the region, whereas the content of the material asset is on the CDN, which the region avoids communicating with where possible).
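      As a small illustration (assuming the PRIM_GLTF_* constants for llGetLinkPrimitiveParams described on the LSL wiki; the exact return layout isn't reproduced here):

      //Reading the base color parameters of a PBR face from a script only ever reflects
      //the locally-stored override, never the values baked into the material asset itself.
      list baseColorOverride = llGetLinkPrimitiveParams(LINK_THIS, [PRIM_GLTF_BASE_COLOR, 0]);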
  19. This is incorrect - reflection probes are pretty much standard practice in other engines. See: Unreal Engine 5.0 - https://docs.unrealengine.com/5.0/en-US/reflections-captures-in-unreal-engine/ Godot 4 - https://docs.godotengine.org/en/stable/tutorials/3d/global_illumination/reflection_probes.html Anyway, it is indeed a bug that Planar alignment doesn't work for PBR materials.
  20. This is incorrect. The probe type influences the projection and sampling of the reflection map, but it doesn't change the object that's showing the reflection. So if you're viewing the reflection on a chrome ball, the box and sphere probe types will look very similar. But on a planar (i.e. flat) chrome surface (a "mirror" of sorts) you can easily see the difference in projection types. No - a probe in each room is how you're meant to be doing it, as you'll then get correct reflections and lighting on objects in each room. This becomes especially apparent when viewed in a nighttime setting, where lighting has a much more obvious effect. Generally for interior scenes, unless you've got a massive room, you'll want to use a single box probe for the room.
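      For what it's worth, probes can also be configured from a script - a sketch assuming the PRIM_REFLECTION_PROBE parameter as documented on the LSL wiki (the ambiance and clip distance values here are just placeholders):

      default
      {
          state_entry()
          {
              //Turn this prim into a box-shaped reflection probe sized to the room it encloses.
              //Parameters: enabled, ambiance, clip distance, flags.
              llSetPrimitiveParams([PRIM_REFLECTION_PROBE, TRUE, 0.0, 0.0, PRIM_REFLECTION_PROBE_BOX]);
          }
      }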
  21. Not quite - I did try this at first, but because the object could be anywhere on screen, if it's at the edge of the screen it looks skewed due to perspective, so I implemented the above function to have the object always orient itself to look into the "lens", which resolves the skew issue.
  22. Ended up managing to solve this by a dumber method than I thought was needed. I modified Dora's function to have its UP vector multiplied by the rotation of the camera, so the UP direction is kept relative to the camera rather than the world.

      orientToCameraMaintainUp(rotation camRot, vector camPos)
      { //Rotate the object to the camera, while also keeping the object's +Z axis relative to camera. WARNING: SUFFERS FROM GIMBAL LOCK
          rotation inv = <0.00000, -0.00000, -0.70711, 0.70711>; //-90deg on Z axis.
          vector target = camPos - llGetPos();
          //camRot = Vec2RotTrue(target); //Rotate X+ towards camera. Suffers gimbal lock.
          camRot = Vec2RotTrueRelative(target, camRot); //Rotate X+ towards camera. No gimbal lock.
          camRot = inv * camRot; //Rotate Y+ towards camera.
          list changes = [PRIM_ROTATION, camRot];
          llSetLinkPrimitiveParamsFast(LINK_ROOT, changes);
      }

      rotation Vec2RotTrueRelative(vector V, quaternion R)
      { //Rotate to look at V while maintaining RELATIVE up axis. Credit: Dora Gustafson, Jenna Huntsman
          V = llVecNorm(V);
          vector UP = <0.0, 0.0, 1.0> * R; //Keep UP axis relative to input rotation R
          vector LEFT = llVecNorm(UP % V);
          UP = llVecNorm(V % LEFT); //you want to keep the direction of V
          return llAxes2Rot(V, LEFT, UP);
      }
      //https://wiki.secondlife.com/wiki/User:Dora_Gustafson/llAxes2Rot_right_and_wrong
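      For completeness, a minimal usage sketch assuming the two functions above are in the same script (PERMISSION_TRACK_CAMERA is needed to read the toucher's camera; the 0.1s timer is just an example rate):

      default
      {
          touch_start(integer n)
          {
              //Ask the toucher for permission to track their camera.
              llRequestPermissions(llDetectedKey(0), PERMISSION_TRACK_CAMERA);
          }
          run_time_permissions(integer perms)
          {
              if (perms & PERMISSION_TRACK_CAMERA) llSetTimerEvent(0.1); //Re-orient roughly 10x per second.
          }
          timer()
          {
              orientToCameraMaintainUp(llGetCameraRot(), llGetCameraPos());
          }
      }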
  23. Hey all - been trying to bash my head at getting this to work but I fear I'm a little rotationally challenged. I've got an object which should always point (+Y axis) at the user's camera while keeping the object's +Z axis aligned with the camera's UP axis, but I'm having issues. The code I've got so far is this:

      orientToCameraMaintainUp(rotation camRot, vector camPos)
      { //Rotate the object to the camera, while also keeping the object's +Z axis relative to camera. WARNING: SUFFERS FROM GIMBAL LOCK
          rotation inv = <0.00000, -0.00000, -0.70711, 0.70711>; //-90deg on Z axis.
          vector target = camPos - llGetPos();
          camRot = Vec2RotTrue(target); //Rotate X+ towards camera.
          camRot = inv * camRot; //Rotate Y+ towards camera.
          list changes = [PRIM_ROTATION, camRot];
          llSetLinkPrimitiveParamsFast(LINK_ROOT, changes);
      }

      rotation Vec2RotTrue(vector V)
      { //Rotate to look at V while maintaining up axis. Credit: Dora Gustafson
          V = llVecNorm(V);
          vector UP = <0.0, 0.0, 1.0>;
          vector LEFT = llVecNorm(UP % V);
          UP = llVecNorm(V % LEFT); //you want to keep the direction of V
          return llAxes2Rot(V, LEFT, UP);
      }
      //https://wiki.secondlife.com/wiki/User:Dora_Gustafson/llAxes2Rot_right_and_wrong

      Which works pretty well... but it suffers from gimbal lock if the camera is pointing straight up or straight down (or close to either). I haven't got any good ideas about how to solve the problem, and my understanding of quaternion math isn't great. Hopefully someone with more knowledge than I can point me in the right direction - all tips appreciated!
  24. Documentation for EEP has always been an issue. I did a small writeup of some rules-of-thumb for making presets that play well with the PBR viewer here - https://wiki.secondlife.com/wiki/User:Jenna_Huntsman#PBR The llGetEnvironment stuff is because the PBR viewer no longer implements gamma as it was in older viewers. Instead, it serves as an adjustment to the HDR camera in the scene (if reflection probe ambiance is above 0), or the overall scene brightness (if reflection probe ambiance is 0). The definition of a "PBR" EEP will likely vary depending on who you ask, but it mostly boils down to 2 descriptions: (1) an EEP that the author has verified and that matches their artistic vision for the environment in the PBR viewer (my definition), or (2) an EEP that has removed the classic ambient light in favour of the PBR viewer's indirect lighting system (IBL), and thus will have a reflection probe ambiance of 1 or above.
  25. The wiki article on llGetEnvironment mentions that fade_color erroneously returns an unclamped value, meaning the return isn't as expected. You can fix this by clamping manually, e.g.:

      vector luminenceHack(vector in)
      { //Hack to modify input colour to max brightness. Credit: Jenna Huntsman.
          list channels = [in.x, in.y, in.z];
          float maxChan = llListStatistics(LIST_STAT_MAX, channels);
          if(maxChan == 0)
          {
              return ZERO_VECTOR; //Avoid divide by zero error.
          }
          else
          {
              return in*(1/maxChan);
          }
      }
      //This func also handles clamping, as it will multiply against a negative number. (thus, the result is reduced)

      It's also worth noting that fade_color is the general ambient lit color of the environment, but won't factor in anything coming from reflection probes. With a PBR environment preset, expect total_ambient to be zero (<0,0,0>) or close to zero, regardless of time of day - you can see that in these presets - https://marketplace.secondlife.com/p/PBR-EEP-Collection-Windlights-Studio-Saberhagen/25626640
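      A quick usage example of the function above (the input colour is just an illustrative value):

      vector faded = <0.2, 0.15, 0.1>;       //E.g. a dim ambient colour sample.
      vector boosted = luminenceHack(faded); //-> <1.0, 0.75, 0.5>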