Everything posted by Frionil Fang

  1. You can't send a direct group invitation, but the group links can be used in a dialog box just as well as in chat; it won't present a join button directly, but it's easier to notice than a chat link (and less spammy, in the case of those Very Unthoughtful objects that use channel 0 to advertise their group joins). For example:

        default
        {
            touch_start(integer _)
            {
                llDialog(llDetectedKey(0),
                    "Click here to join the group:\nsecondlife:///app/group/19657888-576f-83e9-2580-7c3da7c0e4ca/inspect",
                    ["Dismiss"], -99999);
            }
        }

     The "Dismiss" button does nothing since there is no associated listener on channel -99999; it's only there because a dialog has to have at least one button.
  2. A low-quality keyboard might not register more than a couple of keystrokes at once, and on top of that limitation, some keys may share signal lines so they lock each other out even if the maximum number wasn't reached. A very common problem in the '90s when trying to play multiplayer games on the same keyboard... Use a key-combination test tool to verify; for example, https://www.mechanical-keyboard.org/key-rollover-test/ seems like it might do the job. The only solutions are to change to a non-conflicting key combo or get a better keyboard.
  3. Well... no, not based on what I see in the source. The encoding happens in the viewer and it does a basic bilinear (i.e. blurry) scaling, implemented in LL code; the JPEG2000 library isn't involved (indra/llimage/llimage.cpp (223)). JPEG2000 compression certainly causes extra artifacts if there are sharp elements in the texture (text, sharpened downscales, etc.), but "blur the image by bilinear scaling" probably is not the end-all solution to that. I'd still recommend doing the scaling yourself; then you're at least *more* aware of the actual texture you're sending to SL, even if the J2K compression will mess it up further.
  4. If you *really* want to go hard at getting the best downscaling possible, I'd suggest ImageMagick, but it's a command-line tool and not a graphics suite, so it's not as approachable. On the plus side, it can rescale a directory full of images in one command. It doesn't have Fant scaling, which is quite nice for downscaling (I absolutely second using paint.net, it's an excellent tool), since Fant seems to be a Microsoft secret-sauce algorithm, but ImageMagick has some 20 different algorithms available. The reason there are so many is that scaling algorithms are very much a "whatever looks best to you, in this specific situation" thing: some apply more sharpening, some less, some handle different patterns better, etc. If you scale your image to a resolution SL supports and use lossless compression, you're in control of the results, unlike when you let the uploader decide how to resize (read: mangle) your texture.
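     For reference, a hedged example of the kind of ImageMagick invocation I mean (the filenames, target size and filter choice are just placeholders; Lanczos is one of the many filters on offer):

        magick input.png -filter Lanczos -resize 512x512 output.png

     Writing the result to PNG keeps it lossless, and the same options also work with mogrify if you want to process a whole directory at once.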
  5. I know Avastar does them; I'm fairly sure I didn't have that in 2012. Not that there's anything wrong with .bvh from QAvimator or inworld export-capable posing tools if you settle for priority 4 or lower, but it's nice to have a free tool that can actually export raw .anim files; paying for Avastar wouldn't really make sense for my level of use. Whatever I used back in 2012 (most likely Avimator, the predecessor to QAvimator), it made a .bvh and successfully created a priority 6 asset in SL. At some point that just stopped working, due to some kind of a server-side check if I had to guess. .bvh is a non-SL-specific, generic text-based format and has to be converted to the internal binary .anim format, after all, so if the servers handling that are stricter now, no amount of priority hacking on the viewer side is going to bypass that. Anyway, not really related to the subject of the thread any longer, but I tested my old copy of QAvimator with Win7 compatibility mode and it works a treat, so it's good to know it's not out of the picture yet.
  6. I've never been a prolific animation uploader, but I found a priority 6 arm/hand pose from March 2012 and it is not overridden by priority 4 animations. I have no recollection of what tool I would've used to make it, but it has to have been a .bvh file; the only .anim-capable tool I've used is Black Dragon. For contrast, I found animations that list themselves as "priority 6" in the Firestorm animation details panel, uploaded in 2016, 2018 and 2020, that are overridden by priority 4, but not by priority 3. Curiously I also have one prio 6 from 2018 that's tagged as "fixed" and isn't overridden by prio 4, whereas its "broken" counterpart is overridden, despite obviously being the same animation file... no idea what happened there. Immediate edit: I think I found a *very* janky .bvh-to-.anim converter around that time; it was too broken to get much use out of, but I may have used that.
  7. Since forever, and even right now, at least on Firestorm, but they no longer seem to work correctly. I'm pretty sure I have some properly working priority 6 animations from ancient times, and back then you could just use a debug setting in the viewer to set the max priority to whatever you pleased. Now it seems the bone priority gets messed with on the server side; the animation is reported as "priority 6" but keeps getting overridden by priority 4 ones, so that certainly sounds like the bone priority getting clamped to 4, in which case whatever plays latest wins. Rephrase edit: "since forever" as in I'm 99% certain they worked before, and you can still upload a prio 6 .bvh, and it is identified in the viewer as prio 6, but it doesn't work like a prio 6 anymore.
  8. I've certainly made some hand poses with BD and uploaded the results. The files did need some hex editing to cull out unused bones (it saves all bones instead of just the active ones, even if it implies it shouldn't be doing that) and to fix the bone priorities (the whole point for me was to get properly working priority 6 animations for hand overrides, which no longer works properly if you upload a .bvh instead of an .anim with bone-level priorities), but it's still a better deal for me than paying a subscription for a tool that I need twice a year.
  9. The Black Dragon viewer lets you craft poses from scratch inworld and then save them as animations. Supposedly it also does animation, but I didn't try that part. The upside is that you get to see it on your avatar and don't have to touch Blender or pay for Avastar. The downside is that it takes a moment to figure out how it works and it isn't quite finished yet, iirc.
  10. Out of curiosity: the break-even point (very lazy test, looking for the last float in a list of floats) is around list length 20. Actually finding the entry without scanning the whole list makes it considerably faster, of course. LinksetDataRead is not quite constant time (it finds the entry with the key that was written first, (string)0.0, faster than it determines the nonexistence of a key), but close enough for practical purposes. A rough sketch of the kind of test I mean is below.
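     This is my own reconstruction of such a lazy timing test, not the original script; the list length, repetition count and key format are arbitrary choices:

        integer COUNT = 20;      // list length around the observed break-even point

        default
        {
            touch_start(integer total_number)
            {
                integer i;
                list data;
                for (i = 0; i < COUNT; ++i)
                {
                    // store the same float both in the list and under a linkset data key
                    data += [(float)i];
                    llLinksetDataWrite((string)((float)i), (string)((float)i));
                }

                integer reps = 1000;
                float target = (float)(COUNT - 1);   // worst case: the last entry in the list

                float t = llGetTime();
                for (i = 0; i < reps; ++i)
                {
                    llListFindList(data, [target]);
                }
                float listTime = llGetTime() - t;

                t = llGetTime();
                for (i = 0; i < reps; ++i)
                {
                    llLinksetDataRead((string)target);
                }
                float lsdTime = llGetTime() - t;

                llOwnerSay("list scan: " + (string)listTime + " s, linkset data: " + (string)lsdTime + " s");
            }
        }

     The exact crossover will of course shift with the list contents and key lengths.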
  11. Like Atomic infinity said. The API documentation gives you the formulas to correct the x/y coordinates per zoom level.
  12. The Blender PBR shader agrees with SL PBR: white text on black, with a white-to-black gradient below, mapped to the roughness channel on a blue cube. The outlining happens on aliased text as well because SL texture sampling is always linear (i.e. it blends pixels), so intermediate values always exist. In real life you could make the transition between the roughness levels arbitrarily small, but on a texture you're limited to the texture's resolution, and higher resolutions make the transition band narrower. Being able to specify nearest-neighbor sampling per texture in SL would be neat, but don't hold your breath, I guess. Now, let's dissect how roughness looks under the exact same conditions: As you can see, the intermediate values *are* supposed to look brighter than the extremes when the angle of lighting doesn't change (I'm sure some physicist can explain precisely why better than I can, but my guess: very high roughness absorbs more of the diffuse light, hence appears darker), but there certainly doesn't seem to be a bug present unless the PBR material standard itself is broken, which is quite unlikely. Edit: one last comparison, plain old diffuse-normal-specular glossiness (so the texture is inverted and mapped to the normal map alpha instead) does the same in SL:
  13. I couldn't link you to a location, but as the above posters said... being able to mix & match material channels can be a very nice way to spice things up without having access to "professional techniques". Got a boring old prim thing with just a basic diffuse texture? Slap a subtle scratchy normal map and a Perlin-noisy specular on it and it suddenly looks a lot more real. Being able to scale said normal/specular maps on demand lets you adjust and match the piecemeal materials. I'd imagine with PBR it's not quite the same, since the channels are no longer independent in their own separate textures in every case, but I sure wouldn't like forcing things toward a "professional" workflow when going outside the intended spec can also have nice results.
  14. Massive disagree. You can spruce up surfaces even without using custom-baked-and-matching material textures, and changing the scale independently makes that much more flexible. SL shouldn't become a "bake it in Blender/Substance Painter" system; it should retain its sandboxiness instead.
  15. I'm gonna have to let someone smarter take over with this investigation, sorry. Two last-ditch ideas to check: if your bodysuit is two-sided, have you looked on the inside to see if the points are getting scattered there? Also, have you tried setting the min. distance to 0? That is equivalent to the purely random mode of scattering and obviously skips the uniform-ish distance checks, but at least it lets you see if something is up with scale/distance considerations.
  16. Your distance minimum on the Poisson scatter is set to 10.2 meters, which might not be what you want? Above: max density 5, min distance 0.2 m (which is the size of the stud object). Below: min distance 2 m. Barring that, I should probably leave it for someone more experienced to figure out.
  17. In the above example the "stud" object is just a small hemisphere oriented along the Z axis, with its origin centered at the bottom: The origin points are what get scattered and the geometry follows, so if I offset the geometry upwards, each stud on the sphere they're scattered on would start to hover. You might also want to make sure you've applied scale/rotation on the stud.
  18. Something like this? I used to use a particle system to scatter stuff randomly (after getting a pleasing enough distribution, convert the particles to actual mesh), but I guess with geometry nodes these days that's not necessary... haven't gotten to play much with them yet.
  19. Come to think of it, my post above may also be why the rotation would jump: the reset clears the R variable, and since on_rez doesn't fire again, it's not set to what you expect.
  20. If you reset the script on owner change, that probably goes through after the on_rez event, so the timer event is removed.
  21. For attachments llGetRot() returns the avatar's mouselook rotation while it's active, and the avatar's global rotation otherwise. Could be something going on there? Otherwise, since we can't see all the relevant parts of the script: what is "R" here? "if (cmd=="Launch") {llSetRot(<0.0,0.0,R.z,R.s>); llMessageLinked(Launcher,0,msg,"");}" Are you absolutely sure it has been properly set to the object's current rotation, and isn't using something outdated? A small sketch of what I mean is below.
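     A minimal sketch of reading the rotation at the moment it's needed, instead of relying on a possibly stale variable (the listen channel is a placeholder, and the yaw-only rotation just mirrors the quoted snippet):

        default
        {
            state_entry()
            {
                llListen(-12345, "", NULL_KEY, "");   // placeholder command channel
            }

            listen(integer chan, string name, key id, string msg)
            {
                if (msg == "Launch")
                {
                    rotation R = llGetRot();          // read the current rotation right now
                    llSetRot(<0.0, 0.0, R.z, R.s>);   // yaw-only, as in the quoted snippet
                }
            }
        }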
  22. The only way I can reproduce something even somewhat similar: fire a bullet that is *not* volume detecting, but instead enables that in on_rez(). If it's moving slowly enough, I can then fire another bullet in a way that collides with the first: one of the bullets will report a null_key in a collision_end event, but I haven't seen a single one in collision_start. It's a bit hard to determine without spending excessive effort, but I think the bullet that is not volume detect during that brief window after spawning reports a collision with a volume detect bullet, with all-zero parameters (even the name and collision position are null). If the bullets are set to volume detect before rezzing (the property sticks with the object, after all, so there's no need to call it during firing), they can't collide with each other in any manner. As for the side note above, an attachment gun can't have its bullets collide with itself anyway: attachments don't have bounding boxes, and any collision would be with the avatar itself, so the offset away from the firing location is still valid.
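     To illustrate the "property sticks with the object" point, a minimal bullet sketch of my own (not the original script): set volume detect once before the bullet ever goes into inventory, so there is no non-volume-detect window after rezzing.

        default
        {
            state_entry()
            {
                llVolumeDetect(TRUE);   // saved with the object, so rezzed copies are volume detect immediately
            }

            collision_start(integer n)
            {
                llOwnerSay("passed through: " + llDetectedName(0));
                llDie();
            }
        }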
  23. Duplicate your mesh, flip the normals, then scale it up in some appropriate way. I find shrink/fatten works pretty okay, but pinch points might need manual work. Don't forget to assign a separate texture face to the outline object (or butcher its UVs so they just read from a black pixel on the main texture). Someone savvier might have a more convenient, pinch-free solution.
  24. https://wiki.secondlife.com/wiki/Template:LSL_Constants/PrimitiveParams
  25. Not quite true: a notecard needs to have the copy permission to be readable (and since it has to have either +copy or +trans, you can't make it both unreadable and non-transferable). This does not prevent you from putting it inside an object and having a script read it for you, though. A minimal sketch of that is below.
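     A minimal sketch of the script route (the notecard name "config" is a placeholder; the notecard goes in the same prim as the script):

        string card = "config";   // placeholder notecard name
        integer line;
        key query;

        default
        {
            touch_start(integer n)
            {
                line = 0;
                query = llGetNotecardLine(card, line);       // request the first line asynchronously
            }

            dataserver(key id, string data)
            {
                if (id != query) return;
                if (data == EOF) return;                     // no more lines
                llOwnerSay(data);                            // do whatever you need with the line
                query = llGetNotecardLine(card, ++line);     // request the next line
            }
        }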