
Jenna Huntsman

Everything posted by Jenna Huntsman

  1. I'd recommend against doing bulk data storage within the mesh itself, as (presumably) most of the time it won't be needed and it's just sitting there eating server resources. Regardless, with your current method I'd probably use JSON for this - below is a simple example of how you can use JSON to store the appropriate data for a given button. As to actually applying it, as Prof said above you'll want to use llSetLinkPrimitiveParamsFast in combination with PRIM_LINK_TARGET. I like JSON over strided lists as you don't need to maintain a constant offset, and you don't need multiple lists to store different forms of data (e.g. the materials themselves, the actions of the buttons, etc.). Experiment with how you store the data - the example below is very simple, but it shows the concept.

     string JSON = "{\"buttons\":{\"button1\":{\"link\":3,\"face\":[1,2],\"mat\":\"wood\"},\"button2\":{\"link\":3,\"face\":[3,4],\"mat\":\"glass\"},\"button3\":{\"link\":3,\"face\":[1,2],\"mat\":\"glass\"}},\"mats\":{\"wood\":[\"diffKey\",\"normKey\",\"specKey\"],\"glass\":[\"gDiffKey\",\"gNormKey\",\"gSpecKey\"]}}";

     default
     {
         state_entry()
         {
             llSay(0, "JsonExample!");
         }

         touch_start(integer total_number)
         {
             llSay(0, "Getting data...");
             string action = llJsonGetValue(JSON, ["buttons", "button1"]); // returns the JSON for container button1
             llSay(0, "button1 = " + action);
             string material = llJsonGetValue(JSON, ["mats", llJsonGetValue(action, ["mat"]), 0]); // returns the diffuse key for the material "wood"
             llSay(0, "wood diffuse = " + material);
         }
     }
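     As a follow-on, here's a minimal sketch (not part of the original example) of how that lookup could feed llSetLinkPrimitiveParamsFast with PRIM_LINK_TARGET to actually retexture a button's faces. The repeats/offsets and the specular colour/glossiness values are placeholder assumptions, just like the "diffKey"-style stand-in UUIDs in the JSON above.

     applyButton(string json, string button)
     {
         string btn = llJsonGetValue(json, ["buttons", button]);
         integer link = (integer)llJsonGetValue(btn, ["link"]);
         list faces = llJson2List(llJsonGetValue(btn, ["face"]));
         string mat = llJsonGetValue(btn, ["mat"]);

         string diff = llJsonGetValue(json, ["mats", mat, 0]); // diffuse key
         string norm = llJsonGetValue(json, ["mats", mat, 1]); // normal key
         string spec = llJsonGetValue(json, ["mats", mat, 2]); // specular key

         integer i;
         for (i = 0; i < llGetListLength(faces); ++i)
         {
             integer face = llList2Integer(faces, i);
             llSetLinkPrimitiveParamsFast(LINK_THIS, [
                 PRIM_LINK_TARGET, link,
                 PRIM_TEXTURE, face, diff, <1,1,0>, ZERO_VECTOR, 0.0,
                 PRIM_NORMAL, face, norm, <1,1,0>, ZERO_VECTOR, 0.0,
                 PRIM_SPECULAR, face, spec, <1,1,0>, ZERO_VECTOR, 0.0, <1,1,1>, 51, 0]);
         }
     }

     Drop that into the script above and call something like applyButton(JSON, "button1") from touch_start.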
  2. The above script is incomplete - you've only given us the state_entry event, which only runs when the script first starts - the touch event is the one that really matters.
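     For completeness, a minimal skeleton (just an illustration, not the poster's script) showing both events side by side:

     default
     {
         state_entry()
         {
             llSay(0, "Script started."); // runs once, when the script starts or is reset
         }

         touch_start(integer total_number)
         {
             llSay(0, "Touched by " + llDetectedName(0)); // runs every time the object is clicked
         }
     }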
  3. The parts that @Rowan Amore listed above will do just fine, aside from the amount of RAM (as others have said, you want dual-channel, i.e. 2 sticks of RAM). You 100% can get a laptop which will run SL just fine, but expect it to cost more for less performance, with very little in the way of an upgrade path.
  4. Building a PC is super easy. Heck, LTT, a respected tech content creator, made a start-to-finish video on how to build a PC for people with no prior knowledge - see here -
  5. Again, not much to work off of, but it may actually be easier (and more dynamic) to instead use a single script and implement a multiplier to govern how responsive a vehicle is, and use link messages to manipulate the multiplier. For example, using a Linden vehicle, if I wanted to manipulate the turning speed, I could do this:

     float HealthMult = 1.0; // Float value representing vehicle health. 1.0 is 100% health, 0 is 0%

     vector angular_motor = <0,0,1>;
     angular_motor *= HealthMult;
     llSetVehicleVectorParam(VEHICLE_ANGULAR_MOTOR_DIRECTION, angular_motor);
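     Here's a hedged sketch of the link-message side of that idea (the "health" command string and the convention of passing the percentage in the num field are assumptions for illustration, not anything standard):

     float HealthMult = 1.0;

     updateSteering()
     {
         // Scale the turning response by current health
         llSetVehicleVectorParam(VEHICLE_ANGULAR_MOTOR_DIRECTION, <0, 0, 1> * HealthMult);
     }

     default
     {
         link_message(integer sender, integer num, string msg, key id)
         {
             if (msg == "health")
             {
                 HealthMult = (float)num / 100.0; // e.g. num = 75 -> 75% health
                 updateSteering();
             }
         }
     }

     Another script in the linkset would then send e.g. llMessageLinked(LINK_SET, 75, "health", NULL_KEY); whenever the vehicle takes damage.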
  6. That's because you've (presumably) disabled the script which handles control of the vehicle, but then didn't disable the vehicle's physics behaviour. Depending on the type of vehicle, you actually *may* want to keep physics enabled but just disable the engine script (for example, a small boat wherein crashing into it would still cause it to move), but without more details about what you're trying to do it's hard to advise the best approach.
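     If it turns out you do want the vehicle fully switched off, here's a minimal sketch (illustrative only) of the two knobs mentioned above:

     default
     {
         state_entry()
         {
             llSetVehicleType(VEHICLE_TYPE_NONE); // stop the object behaving as a vehicle
             // Leave the next line out if you still want collisions (e.g. something ramming a small boat)
             // to push the object around.
             llSetStatus(STATUS_PHYSICS, FALSE);
         }
     }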
  7. This. 100%. I agree with the sentiment of updating the presets, as the old presets are Windlight-era stuff which doesn't play well with the new EEP system, and as a result they look dull and don't have much in the way of apparent dynamic range. All I'd want is for LL to update the same presets to make proper use of the new EEP system, so our scenes are correctly exposed again. But presets like CalWL and AnaLutica are just plain bad. You're better off disabling ALM and running some ancient, barely maintained rendering code for the performance boost than running those presets. Baked lighting is rarely needed - try using the specular materials map instead. I'll have to post some of my EEP presets publicly soon, so people can play around with them (as someone with a lot of lighting experience, I think they're really nice!).
  8. You can, but unfortunately that's not the answer to the question I was asking - that would just re-export a .bvh, when I want the resulting file in the SL-specific .anim format, which allows for some more advanced configuration. The Avastar plugin for Blender can do this, but unfortunately I'm not in a position to invest in a tool like that for an idea which is just a concept right now.
  9. Pinging @Monty Linden Use the JIRA to file a bug report in the meantime. https://jira.secondlife.com/
  10. Out of interest, the UUIDs in that URL - is one of them the object that the script is in or the handle for the HTTP request? If so, that would point to an internal issue with SL.
  11. Actually, that's the correct behaviour - the gamma setting determines how luminance values are decoded, which means higher values will result in a final image with more 'dynamic range', but one that is much easier to overexpose. Take a look at this article; about halfway down there are images which show you what gamma correction does - notice how, with the higher gamma value, the white paint of the car is overexposed. https://www.scantips.com/lights/gamma2.html
  12. It absolutely can be - you need to provide the specs of your PC (as SL sees them, through Help > About (viewer name)). It's entirely possible that it IS a software issue, but one limited to your hardware setup. It's impossible to diagnose without actually knowing how to reproduce it. Again, the fact you're unable to update the GPU driver raises questions about the hardware setup you have. Let's rule that out - but you need to tell us what you're working with.

      Except it can be. Sure, you may have lived in the same building for x many years, with the same ISP for however long, but that doesn't mean the actual route your traffic takes to a given destination is the same every time. In fact, it's normal for this to change on a daily basis, depending on outages of networks along the route, load balancing, etc. If a provider between you and the server has a bad piece of software which assigns an extremely poor route, or a nonfunctional one, then it's possible you wouldn't be able to log in at all. That's not LL's problem, nor is it yours - it's an issue with the ISP. There have been lots of times where sites like YouTube got knocked offline due to this kind of problem. (As an aside, you should also post some information about your connection from the Statistics Bar, accessed through Advanced > Performance Tools > Statistics Bar - among other things, it gives statistics on the connection between yourself and the region (and by extension, the CDN).)

      Aside from some unloaded textures in the last image, the sharpness looks fine. An unloaded texture appears blurry (due to mipmapping), but the actual edges of objects look perfectly sharp with the correct contrast. Textures not loading can be caused by many things, including server issues, network issues, and issues with your PC itself. So, if there is an issue I'm not seeing, providing an A/B comparison where you can highlight the exact differences makes it a lot easier to figure out the problem.

      What data? You need to supply the data (as mentioned above) in order to create a reproducible issue, at which point it can be fixed.
  13. Gonna throw in my 2 cents on this one:

      As others have said, this isn't a release build, it's a beta. So expect problems, and if you know what is causing one / have a hunch, file a JIRA.

      It should never take more than one clear-cache operation to repair / rebuild the cache. If anything, repeated cache problems would indicate a network or server-side issue (server side is unlikely, as it would affect most of us - as can be seen when AWS throws a fit and we can't log in or regions go down unexpectedly). Bear in mind that "network" doesn't just mean from your home to AWS, it also means the connection between your computer and the access point (if you're on WiFi, you'll likely see more issues, especially with low-grade access points, as SL hammers the connection with lots of requests of various types).

      Assuming no other issues, ALM doesn't do a whole lot to blur an image; the effect is more perceptual. When ALM is disabled, everything is hit with a fairly intense ambient light, meaning you can pick out every little detail where objects join, details on textures, etc. When ALM is enabled, a proper lighting pipeline is used, and so surfaces that are poorly lit will "blend" together to your eye. A great example of this is Ambient Occlusion - look at a tree. With Ambient Occlusion disabled, you can see exactly where the joint is between the ground and the trunk. With it enabled, the joint has a shadow cast on it which causes it to blend into the background more. It hasn't been 'blurred', it's just been lit differently. (As a side note, enabling DoF but adjusting the settings to create a very subtle blur effect helps a lot when looking at assets further from the camera, as it avoids super harsh edges around items. Here's an example: https://imgur.com/a/Jkqdg5q )

      You should probably post the specs, if nothing else, to eliminate the computer as the cause. As you go on later to mention that your GPU driver is out of date and cannot be updated, that raises red flags. Just as a note: if AVG is the one doing the updating, then that is likely what causes the crashing issue. Always update your graphics driver manually, using software downloaded from the manufacturer's website. (Also, sidenote: this includes Windows Update, which for months (up until fairly recently) was distributing a broken AMD graphics driver, even though the AMD site had issued a patch mere days later.)

      The screenshots you have on display here aren't in the same place or with the same lighting, so it's impossible to do an A/B comparison on the images to determine if the image is unexpectedly blurry. Beyond some partially loaded textures in the last image, all looks normal to me.
  14. When you say 'Origin', do you mean the scene origin or the origin of the skeleton? Because rotating the scene origin won't work (in fact, that'll more than likely screw up the animation), but rotating the origin of the skeleton (the pelvis) does work. Also, to transition from animation A to animation B without returning the avatar to a 'base' state, you need to make sure the position and rotation values of the pelvis match the end of animation A, and the start of animation B.
  15. Unfortunately then you're SOL. Unless Legacy provide a compatible tattoo layer, OR provide you with a BoM copy of your chosen skin, you'll pretty much always have seaming issues. The problem isn't with the skin, or the mesh, but the fact that you're using 2 textures which aren't matched at their edges, which is what causes the seam. Neck fades aren't all that reliable due to the issue I mentioned above, and while it's possible to manipulate the viewer settings to hide the seam, others will still see it. (No, telling them to change their settings to view your avatar correctly isn't a solution!) As others have mentioned, it may be worth putting some time into finding a full BoM skin that you like (there are so many options out there that I guarantee a couple of hours touring skin shops will turn up one you fall in love with), as it's much easier to mix and match head and body skins, all while never encountering seaming issues.
  16. Going to join the pedantry train here (sorry in advance!), but accuracy is important when it comes to lighting - 3200 kelvin, 0 Δuv - assuming the source is a tungsten-halogen filament lamp. (Translated to RGB in Rec. 709 space (what SL uses), that's <255,190,121>.) Incandescent lamps are slightly warmer, at ~2800 kelvin, 0 Δuv; ideally 2856 K (also known as CIE Illuminant A - in SL terms, that's <255,178,99>). Sunlight can be really tricky to simulate as its colour temperature isn't fixed - it can range from 2400 K ('horizon' sunlight) to ~6500 K (overcast afternoon sunlight), though in the film industry direct sunlight is often assumed to be 5600 K for simplicity. (For accuracy's sake, I'm using 0.002 Δuv, as daylight does not lie directly on the Planckian locus - in SL terms, this translates to <255,240,228>.) Anyway, nonetheless a very interesting writeup.
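      As a practical aside, those RGB values drop straight into SL's 0-1 colour vectors. A quick sketch (the intensity, radius and falloff values here are arbitrary examples):

      default
      {
          state_entry()
          {
              // 3200 K tungsten-halogen: Rec. 709 <255,190,121> normalised to SL's 0-1 range
              vector tungsten = <255.0, 190.0, 121.0> / 255.0;
              llSetPrimitiveParams([PRIM_POINT_LIGHT, TRUE, tungsten, 1.0, 10.0, 0.75]);
          }
      }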
  17. I'd avoid doing this if possible. The reason why is that neck fades use alpha blending, the source of all the dreaded alpha sorting bugs. Surfaces that make use of alpha blending are also not lit in the same way as other surfaces (as doing so would be too intensive). If Legacy have provided a tattoo layer as a neck blend, use this - make sure to double check that you're using the correct blend layer for your chosen head UV (Evo X and SLUV are not compatible, so wearing an SLUV neck blender and an Evo X skin won't work!) and disable the neck fade. Let us know how that works for you.
  18. I've seen it done in other games where the env map is fully dynamic, but rendered at a much lower resolution using a lower LOD - as a hypothetical, would it be possible to do something similar, and (bonus points) implement some sort of upscaling algorithm (FSR, DSR etc) to upscale the image into a higher resolution? Maybe some sort of hybrid approach could be better, wherein static objects use the baked env maps and only physical objects use the fully dynamic map. (It's also possible I'm being stupid and that approach wouldn't take enough load off to be worth doing, but figured I'd put it in type anyway.)
  19. That's because fullbright has no controls other than on or off. I believe (someone may correct me on this!) that fullbright is just a flag which tells the viewer to skip lighting that surface and render it with the texture's absolute colours. You may find that manipulating the PRIM_GLOW parameter gets you what you're looking for, though. (Try some low values!)
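      For example, a minimal sketch of that glow suggestion (the 0.05 intensity and the ALL_SIDES target are just example choices):

      default
      {
          state_entry()
          {
              llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_GLOW, ALL_SIDES, 0.05]);
          }
      }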
  20. Hey all, I'm tinkering around with the SL animation system, and I've got some animations which I want to try out mixed priorities on - I've heard / seen that SL's .anim format can facilitate this, but I seem to be drawing a blank as to how to convert my .bvh files to .anim files. (Or at least, any solutions that don't point to Avastar - I don't want to spend money on an idea which could go nowhere!) I've found a couple of different tools, but none will allow me to import a bvh and save out a .anim:

      Extrude Ragu's AnimHacker - I'd like to play around with this, but it can only handle .anim files, no support for .bvh.
      AnimMaker - Oooold tool for creating and manipulating .anim files. Still seems to work with Bento, which is cool. Uses a different coordinate space than AnimHacker. Again, no .bvh support though.
      Anim2BVH - Conversion goes the wrong way. No Bento support.

      I've seen it mentioned on the forum that the viewer creates a .anim from a .bvh before upload, but it doesn't seem like you can export the .anim :( Anyone got any recommendations (aside, of course, from Avastar) for any other tools which may help in this endeavour?
  21. Oops, you're 100% right! I'll update the above with the correct values. (Pretty new to animations, so still learning the ropes!)
  22. Spent a couple of hours messing around, figured I'd share my findings. Valid as of the date of writing, using Blender 3.1.2.

      So, I have an SL-compatible bvh animation (e.g. a bvh export from QAvimator), which when imported straight to SL will work just fine, but I want to apply some smoothing using Blender. Unfortunately, Blender is very good at messing animations up, and without Avastar it's a pain to sort out and use. Turns out, if you get exactly the right settings, Blender can import / export an already SL-compatible animation without screwing it up.

      Note: This assumes that the base unit of measurement in Blender is set to Meters. Ignore anything after a // ; those are just comments.

      TO IMPORT: When importing the bvh, use the following settings:
      Scale: 0.0254 //inch to meter - Thanks to @Quarrel Kukulcan for finding that, and sharing on another thread!
      Rotation: Euler (Native) //Match import - For reference, my animations use XZY format rotations, however YMMV.
      Forward: Y Forward
      Up: Z Up
      This will import the armature lying perpendicular to the camera, on its back.

      TO EXPORT:
      Scale: 39.37 //Scale back to inches
      Rotation: Euler (Native) //Matches import.
      Root Translation Only (OPTIONAL!)

      This will allow you to import a working animation, and export a working one too! All without Avastar.
  23. That would make sense - my bad!
  24. While it doesn't seem to be documented behaviour, you can use llGetPrimitiveParams to return some info by calling the following: llGetPrimitiveParams([VEHICLE_ANGULAR_FRICTION_TIMESCALE,0]); Unfortunately I don't know exactly what the values returned mean (I'm not familiar with the function you're working with), but hopefully you'll be able to make some sense of it.