
Jenna Huntsman
Resident · 672 posts

Everything posted by Jenna Huntsman

  1. That indicates an issue with the remote server: the selected port (12043) is firewalled. See https://stackoverflow.com/questions/29745705/php-fsockopen-error-connection-refused111-try-to-connect-fingerprint-device
  2. I doubt that SL would move in that direction, simply because it would likely mean breaking a lot of legacy content, which LL just does not want to do, even with the existing engine. The most likely thing to happen is modernization of the existing engine, for example the upcoming PBR implementation, and somewhere further down the line swapping over to another graphics API (most likely Vulkan). If LL went down the road of swapping over to a new engine, I'd have thought they'd move towards using Godot, as it's a very well-supported open-source game engine, as opposed to Unity and Unreal, which are both closed-source.
  3. The wiki has been updated to reflect this information, and is now much more concise about the truncation of body lengths between pipelines.
  4. Building off what KT Kingsley said, if you find that llFrand doesn't produce a random enough number, you may find the below RNG useful:

     integer iRand(integer max)
     {
         // llFrand isn't great, so replace it with this.
         ++max; // make the range inclusive of the original max
         string seed = llMD5String((string)llGenerateKey(), 0);
         // Take the first 7 hex digits (fits in a signed 32-bit integer).
         return (integer)("0x" + llGetSubString(seed, 0, 6)) % max;
     }
  5. I highly doubt there will be any kind of HUD for the NUX body. As Patch said, they aren't trying to make a full competitor for the existing class of mesh bodies. I doubt things like nail lengths will be included (that said, there would be nothing stopping creators from making a nail addon!). As for feet height, it's *possible* that the feet are Bento-rigged, meaning that the feet can change height via animations (similar to the Cinnamon & Chai body).
  6. Ah, I see. I believe the issue there might be that, because the llHTTPRequest isn't actually expecting a response (i.e. the script sends data, then the server 'replies', but not as part of the same transaction), HTTP_BODY_MAXLENGTH doesn't apply. With something like an HTTP GET it would apply, as the response is expected as part of the transaction. Pinging @Monty Linden for confirmation of this hypothesis.
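  A minimal sketch of what I mean (the URL is a placeholder, and 16384 as the cap is just an example value):

     llHTTPRequest("http://example.com/api", // hypothetical endpoint
         [HTTP_METHOD, "GET",
          // Raise the accepted response body size; this only matters
          // when a response is expected as part of the same transaction.
          HTTP_BODY_MAXLENGTH, 16384],
         "");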
  7. Oh, sorry, I read and interpreted wrong in that case. I do agree that the CameraFieldOfView value is roughly equivalent to AoV. I actually already had that as part of my "Calculating Lens Presets" section on the Wiki. I did reach out to one of the members of the Graphics team - in their words - So in the end it somewhat doesn't matter about the semantics, as the values aren't used in calculating a physically-accurate lens.
  8. I actually disagree with the view that the FoV and View Angle are mixed up. I did some digging in the backend to see how the DoF shader works in SL, and came up with the following process:
     1. Convert the given FoV value to the lens' radius of curvature (unknown (to me) formula).
     2. (Unknown, final result) - POSSIBLY - calculate dioptre from the refractive index.
     3. Divide the value calculated in step 2 by the radius of curvature.
     I believe steps 2 and 3 are used in calculating the power of an "equivalent" defocusing (concave) lens which is used by the shader, although I'm not 100% sure. (My brain isn't working well enough right now to dig much deeper; I may revisit this another time.) As a side note, I found out why the level of bokeh changes as the CameraAngle value is changed: when calculating the power of the *final* "defocusing" lens, the code actually retrieves the *actual* camera's FoV value, and this is mixed into the formula, presumably as some sort of failsafe. Disconnecting the two would only involve swapping out the name of one variable.
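  For what it's worth, the "dioptre from index, divided by radius of curvature" step reads to me like the thin plano-convex lens approximation from standard optics - this is my inference, not something confirmed from the shader source:

     P = (n - 1) / R

  where n is the refractive index, R is the radius of curvature, and P is the lens power in dioptres (when R is in metres).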
  9. You may want to check result of that getStringBytes function @KT Kingsley mentioned above, to double check that length value. Assuming the given length value is correct, it would seem that the body is being truncated elsewhere.
  10. Hmm, there doesn't seem to be anything wrong with that, although using method "POST" seems like an odd choice given that a response is expected. As part of your http_response event, try reading back the character length of the body with llStringLength to double-check that the string isn't being truncated elsewhere in the script or by an LSL quirk. You may also find reading the metadata in the http_response event useful, as it tells you where (if at all) the body was truncated.
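  Something like the below is what I have in mind (a sketch; the strided layout of the metadata list is from my reading of the wiki, so treat it as an assumption):

     http_response(key request_id, integer status, list metadata, string body)
     {
         // Double-check the length the script actually received.
         llOwnerSay("Body length: " + (string)llStringLength(body));

         // If the response was cut off, metadata should contain
         // HTTP_BODY_TRUNCATED followed by the offset of the cut.
         integer idx = llListFindList(metadata, [HTTP_BODY_TRUNCATED]);
         if (idx != -1)
             llOwnerSay("Truncated at: " + llList2String(metadata, idx + 1));
     }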
  11. You might want to post a snippet of the llHTTPRequest code. There are a lot of gotchas with LSL HTTP, so you may have tripped over one.
  12. The body is still in development, so nothing is set in stone, but I'm led to believe that the body makes use of pure alpha-masking with no alpha cuts - as noted by Beq (Firestorm dev), cuts cause a drastic rendering performance hit. I'd really hope so. Given the abject failure of the last round of Linden mesh avatars, LL need to lead by example with this NUX body.
  13. That happens when whoever made the environment preset for that land has set the moon's colour to near black - the moon will still emit light, but the texture will be very dark in the skybox.
  14. No dev kit is available right now as the body is still in development, and no ETA has been given on when the body will be launched.
  15. I don't think so. The Lab themselves said that the body isn't designed to be marketed towards "best-in-class" body users, but instead to give newer users a perfectly fine body. If anything, it's going to benefit creators, as the development kit for these new bodies (I believe) will be freely-accessible, which marks a massive step away from the current walled-garden of mesh body development; something which I feel has been holding back the market for some time.
  16. No, Belleza messed up their default by forcing the alpha mode to "None". While Blending is the default alpha mode, with BoM the alpha mode is a set-and-forget item, as bakes textures are always supplied with an alpha channel regardless of whether it is used, thus preserving the alpha mode set on the mesh. So long as the texture is assigned to a bakes channel, and the creator sets the alpha mode to Masking with a mask cutoff value (128 is usually a good default), the alpha mode shouldn't change unless the texture is reassigned away from the bakes channels (which is a gotcha when appliers are in the mix, as appliers *will* force the alpha mode back to the default of Blending).
  17. I don't have exact (or accurate) numbers for polycount, but from what I was able to gather, polycount is in the sub-100,000 range. No idea if it's on display anywhere or not.
  18. As an owner of hardware that features an Intel iGPU, I can assure you that isn't the case. Also, I did some searching and I can't seem to find any information which points to that being a "known" issue on any generation of Intel graphics (the closest I found was a patch for Intel iGPUs reporting the wrong supported OpenGL version to Windows 10 programs). The patch mentioned above covers the following Intel iGPUs:
      • 3rd-generation Graphics Media Accelerators, except the GMA 910 and 915 series
      • 4th-generation Graphics Media Accelerators
      • PowerVR-edition Graphics Media Accelerators
      • 1st- and 2nd-generation Intel HD Graphics
      If you have one of the above, then it's a driver issue, not a Second Life one.
  19. Unless you make some *really* bad purchasing decisions, that's not going to happen. Hardware manufacturers get negative press for producing (or dropping support for) products which become e-waste, and AMD and Nvidia have quite a good track record of supporting their old hardware (AMD only recently stopped driver support for cards which were approaching 10 years old!). Intel, well, has pretty much never been recommended for use outside of office work, and even there it's not brilliant. For reference, to get an Intel iGPU that supports OpenGL 4.6, your CPU would have to have been made in 2016 (HD Graphics 500) or later (assuming you're on Windows; Linux users get 4.6 with the 2015 HD Graphics). Nvidia users get OpenGL 4.6 on the GT 420 (2010), and AMD users get it on the Southern Islands GPUs (2012) - not to say you'd have a good experience on a GPU that old, but it goes to show how far behind Intel has historically been. (Also, you might want to post some info about your setup; you say you have a 64-bit CPU, so chances are someone might be able to help you troubleshoot if they have a little info!)
  20. I wouldn't be surprised if 32-bit was being killed off; after all, 32-bit now only accounts for 0.5% of the market, and it's supposedly a bit of a nightmare to maintain the viewer for. (See here for stats: https://www.pcbenchmarks.net/os-marketshare.html ) EDIT: Also, FWIW, if your CPU is old enough to be 32-bit and is using Intel integrated graphics, it's amazing it manages to render anything at all. The Intel drivers have historically been a joke, and although they have improved in recent years, they still aren't great by any measure. Also, Intel probably EOL'd it years ago, so no driver updates for you!
  21. I don't have any photos, but I did inspect the model, and the feet are fully modelled with individual toes. The hands, well, were to be expected, as they were in a default "rest" pose - Jester didn't seem to be using an AO at all.
  22. The demo that was on display was an All In One (head + body in a single mesh), but it was acknowledged that it is "preview-grade", so I'd say it's too early to know which way they're going to go. (Although I do agree, the head and body should be separate - I'd also hope it uses the SLneck!) EDIT: Please see the comment from Patch here; the body is separate from the head mesh -
  23. Yes - I don't think they mentioned if it's using a different UV map, but I'm assuming it's SLUV
  24. So, as of today, Linden Lab have unveiled their new (replacement?) for the old System avatars. The "NUX" (New User eXperience) avatar is a full-mesh, Bento-rigged avatar which will be fully slider-adjustable, and will have a male and a female variant. LL also intend for a library of clothing to be launched with the avatar, which will be available for both variants. LL have also announced that the dev kit will be available to creators who wish to make content for this new avatar. Some photos from the launch (male variant, modelled by @Jester Mole; taken in the Alchemy (project) viewer, 100mm lens). Livestream of the reveal (unveiling at 13:45).
  25. Mii - now in the Metaverse! (change my mind).