
Phate Shepherd


Posts posted by Phate Shepherd

  1. Is the new proxy pool implemented as round-robin? Last night I experienced a production region that was reporting a 502 "ERROR: The requested URL could not be retrieved" proxy error on outbound HTTP, which went away after the region was restarted. Curious whether a round-robin proxy pool would have at least mitigated the issue. If the proxies are no longer tied to the region, would we have to open a ticket when/if one of the proxies in the pool started spewing 500 errors? Will there be any metadata to report which proxy in the pool was at fault?

  2. 18 minutes ago, animats said:

    One crossing, not near a corner, took seven seconds. One near a corner took more than 30 and failed. The avatar got Left Behind. This is the same behavior seen on the main grid. Interesting that it can be reproduced in an empty sandbox on AWS.

    I did several circuits around your region corner prim. I never got left behind, but in one instance I lost vehicle control. I suspect your region-crossing protection code reinstates controls when crossing into a new region to combat that.

  3. 1 hour ago, animats said:

    Didn't want to put this in Oz's pinned topic, but that's what it's relevant to.

    Outbound HTTP from Cloud Sandbox 4 working fine. Data from the server that logs region crossings for my test bikes. (Only ones that say REVIEW or DEMO do this; if you buy one, it doesn't have the logging.)

    Quick testing shows it is working for my stuff as well. And I did do a naughty in one item and used part of the returned URL from llRequestURL.

    Ohh, and inbound HTTP is working as well... not sure whether that was already working before or not.

    I also found that it appears to be active on sandboxes 1-4. They were returning URLs of the form http://simhost-07ecfc0bf885aed5b.aditi.secondlife.io, and both inbound and outbound were working.

     

    HOLY CLUCK! Maybe it is partly because the region is empty, but I did an animation load to my in-world animation tool. What normally takes maybe 2 seconds per frame loaded 6 frames in under a second! I could cycle through frames at nearly real time as well.

  4. 19 hours ago, Rolig Loon said:

    I suppose so, but then you'd have to be careful to link all the components consecutively.  You can do the job almost as easily by giving all components of each drawer the same name and then building a separate list of link numbers for each drawer in state_entry (list lDrawer1, list lDrawer2, etc.).  Then, when you click on any part of the linkset, check to see if you touched a component of one of the drawers.  If so, apply the set of movement commands to every link in the list for that drawer.  It is mildly fiddly to set up, but you can still manage the entire thing with a single script in the root prim of the linkset.  [EDIT:  The "linkset" I'm talking about is the linkset for the entire dresser, including all of the drawers, in the sense that Rachel is discussing in her post, below this one.]

    Ohh, how we could save a ton of work with 3 additional LSL commands:

    llGetDescPrimitiveParams(), llSetDescPrimitiveParamsFast() and llDetectedDesc(). Also the flag "NEXT_DESC_TARGET"

    Same format as llSetLinkPrimitiveParamsFast, except instead of a link number, you would supply a link description string to match:

    llSetDescPrimitiveParamsFast("Drawer1", [PRIM_POS_LOCAL, ...]);

    You could target one or more links in a set just by using the description you assigned to the link. Link order wouldn't matter at all. No viewer changes needed.
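    In the meantime, the proposed behavior can be approximated with existing LSL: loop over the linkset, read each link's description with PRIM_DESC, and apply the parameter list to every match. A minimal sketch (the function name mirrors the hypothetical command and is not a real LSL built-in):

    ```lsl
    // Emulation of the proposed llSetDescPrimitiveParamsFast():
    // applies one parameter list to every link whose description matches.
    SetDescPrimitiveParamsFast(string desc, list params)
    {
        integer link;
        // Note: seated avatars add extra link numbers at the end of the
        // linkset; their PRIM_DESC reads won't match a prim description.
        integer count = llGetNumberOfPrims();
        for (link = 1; link <= count; ++link)
        {
            if (llList2String(llGetLinkPrimitiveParams(link, [PRIM_DESC]), 0) == desc)
                llSetLinkPrimitiveParamsFast(link, params);
        }
    }
    ```

    Unlike a native implementation, this scans every link on each call, so for frequently moved drawers it is still worth caching the matching link numbers once in state_entry, as Rolig describes.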

    • Like 2
  5. 5 hours ago, Lucia Nightfire said:

    Then the question is, what minimum % hovertext transparency would the community accept being able to be read remotely?

    Besides using transparent hovertext to store data that you may not want the viewer to have access to, there may be many cases where you don't want other scripts to have access to visible hovertext, such as the case the OP mentioned with farming items. If the hovertext indicates "health" or some other stat that requires attention, would you want it to be possible to automate attention to that stat?

  6. 59 minutes ago, GEARspirit said:

    Actually not; what I want is to "read" the status shown as hovertext in, for instance, a D&S roleplay field-farming system

    You might ask the creator whether their system has an API to get information from its objects. However, if the information would be used to automate a task that they feel should be done manually, it is unlikely they would expose it.

  7. 1 hour ago, Kyrah Abattoir said:

    Technically as the land owner it could unsit rather than returning the object.

    Hmm... I didn't know about the landowner exception to unsit.

  8. Late reply, but I encountered a rather ingenious force-unseat in the security system of Bellisseria homes. If it can't eject you from the land, it returns the object you are sitting on... THEN kicks your butt out.

     

    Not necessarily an option for OP, but it could be a useful solution in some cases.

    • Like 1
  9. 14 minutes ago, Wulfie Reanimator said:

    Just wait until you hear about TCP communication! (If you don't get it, let's just call it UDP.)

    I date from the serial days.  ACK! NAK!

    Now I sound like the martians from "Mars Attacks!"

  10. 13 minutes ago, Wulfie Reanimator said:

    What if the first letter isn't a letter?

    For example: "12 more things..." or "[brand] product"

    I was about to jump in and post an elaborate solution when it occurred to me: the uppercase of a non-alphabetic character is the same character, so it still works fine.

    On a side note, it may act a little goofy with a single character as the src input.
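    A minimal sketch of the trick being discussed (the function name is my own); llToUpper passes non-alphabetic characters through unchanged, so no special-casing is needed:

    ```lsl
    // Capitalize the first character of a string. llToUpper leaves
    // non-alphabetic characters ("12 more things...", "[brand] product")
    // unchanged, so those inputs pass through as-is.
    string Capitalize(string src)
    {
        // Caution: for a one-character src, llGetSubString(src, 1, -1)
        // runs into LSL's wrapped-index rules rather than returning "",
        // which is the "goofy" single-character edge case noted above.
        return llToUpper(llGetSubString(src, 0, 0)) + llGetSubString(src, 1, -1);
    }
    ```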

    • Like 1
  11. 1 hour ago, Rolig Loon said:

    LOL.   Newcomers to the forums take a while to learn that we NEVER delete the text of an OP.  That just confuses everyone who comes along later.  :)

     

     

    Reading just the word "Hello" has started to give me PTSD shakes from all the support IMs that say "Hello" or "Hi" without asking a question. I hate to be rude and not reply, but I hate even more having to do the pointless handshake before valid data is transferred.

  12. 1 hour ago, Leda Sands said:

    Yes, we're just discovering how each user will experience media according to their own settings. So it probably can't be done.

    One common trick is to put an instructional graphic on the face of the prim you use for media. If they have media turned off, the graphic shows, telling them how to turn it on. If they have it on, they never see the graphic, only the media.
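    A sketch of the setup, assuming the instructional graphic is already the ordinary texture on face 0 and using a placeholder URL: the shared-media URL is layered on that face, so viewers with media disabled fall back to the texture underneath.

    ```lsl
    // Put shared media on face 0 of this prim. Viewers with media
    // enabled render the URL; viewers with media disabled see the
    // face's normal texture (the "how to enable media" graphic).
    default
    {
        state_entry()
        {
            llSetPrimMediaParams(0, [   // face 0 is an assumption
                PRIM_MEDIA_HOME_URL, "http://example.com/",
                PRIM_MEDIA_CURRENT_URL, "http://example.com/",
                PRIM_MEDIA_AUTO_PLAY, TRUE]);
        }
    }
    ```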

  13. On a more serious note, I can think of a way to "appear" to be closer to or further from a sensor than you actually are: animations can offset the visible avatar by plus or minus 5 meters from the actual avatar hit box.

    Highly unlikely that this helps your issue, unless you just wanted your avatar to appear to be in a different place than they are. The sensor will still know where your "hit box" is.

    It is a cheat for mouselook shooting games though. Using an AO with all offset animations obscures where you really are.

    • Like 1
  14. Just in case anyone searches and ends up here.... 

     

    AnyPose can only change facial expressions on legacy heads. Because of the differences in how Bento heads are rigged, a smile on a Bento head made by one creator could look significantly different from a smile on a head made by another creator, so I don't believe there can be a universal set of facial expressions that works across all heads.

  15. 1 hour ago, Kyrah Abattoir said:

    I'm not going to change your mind, after 16 years in SL I know that people will just keep doing what they've always done, until the heat death of the universe. But it doesn't make it right.

    Points well taken. I completely agree that it all falls on the creator. Given proper use of tiling, that would be a reduction in vram usage for items like homes. I had my mind focused on objects that rarely use tiling, such as objects with all faces on a single UV.

    I have mentioned it before: LL blew the best time to address this when land impact was introduced. Texture usage should have been rolled into the LI calculation. To deal with the dynamic nature of object faces (scripts, etc.), each face would be assigned a max texture size that would be set in the texture tab of the build window. It would not be changeable with scripting, so a script couldn't increase an item's land impact by changing from, say, a 512*512 to a 1024*1024. We are already accustomed to LI changing as we make adjustments with the build tools, like scaling an object. Where the downsampling would take place is up for debate (server side or client side).

     

  16. On 8/12/2020 at 3:58 PM, Kyrah Abattoir said:

    Of course they are supplementary maps, however you get a more "visual appeal" in ALM at a fraction of the vram budget, because you let the GPU do its job instead of pre-baking everything into a texture that has to be much larger (and cannot be reused) as a result.

    And even with baked materials, you typically don't need as high of a resolution.

    I would argue against saying it is a fraction of the vram budget. I highly doubt creators are using a lower-resolution diffuse when they add in spec and/or normal maps. I would bet a significant number are using the same rez for diff, spec and norm, doubling or tripling vram usage when ALM is on.
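    As a rough back-of-the-envelope check, assuming uncompressed RGBA at 4 bytes per pixel and a full mipmap chain (real numbers vary with the viewer's texture compression):

    ```latex
    % One 1024x1024 RGBA texture, plus ~1/3 extra for mipmaps:
    1024^2 \times 4\,\text{B} = 4\,\text{MiB}, \qquad
    4\,\text{MiB} \times \tfrac{4}{3} \approx 5.3\,\text{MiB}
    % Diffuse alone: ~5.3 MiB.
    % Same-resolution diffuse + normal + specular: ~16 MiB, i.e. triple.
    ```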

    I am also confused at the statement that pre-baking AO and light map/simulated spec into a texture has anything to do with its resolution. Pre-baking AO and light map doesn't change the size of the texture, nor does it necessitate a higher resolution texture. Texture resolution should be based on the typical surface area that will be presented.

    I am not saying using ALM is bad, or looks bad; I'm saying that a lot of lower-end machines take a tremendous performance hit from it, in both frame rate and vram usage.

  17. On 8/10/2020 at 9:23 AM, ChinRey said:

    We can do with far more heresy among SL content creators. It's far too much doing it by the book and not nearly enough finding the best solution for specific situations.

    I'm not sure what to say about your specific question though. The fundamental idea with shininess, prebaked or dynamic, is to make the parts of the surface with high reflectivity lighter in color than the ones with low. That's the basics anyway, you probably want a bit of colour adjustments and maybe a little bit of blurring too.

    So, open the texture in whatever graphics program you use, add two new layers, one for the UV map and one for the shininess map you are going to make. On the shininess layer add brighter colors/shades for the high reflection parts and darker for the low reflection parts, using the UV map as reference. I'm not sure which blend mode the shininess layer should use. If your program supports Overlay mode (paint.net and Photoshop do, GIMP doesn't) try that first. Other options are Multiply, Screen, Glow and Reflect. Once you are happy with the result, remove the UV map layer, merge the other two and save.

    This is probably way too basic to be useful to you but it's hard to be more specific without knowing more.

    That is sort of what I already knew. What complicates this further is trying to turn around and teach it to someone else who doesn't feel comfortable hand-drawing on UV maps. I might experiment further with the Plastic Wrap Photoshop filter to see if it can help automate some of the tasks.

    • Sad 1
  18. On 8/10/2020 at 7:29 AM, Kyrah Abattoir said:

    Why?

    It is just a general feeling I've gotten... maybe it's just me thinking the creator community frowns on those who do retextures of full perm mesh. Mind you, I'm starting to do more of my own meshes, but I don't have the time or inclination to do everything myself... and full perm mesh creators rarely do spec maps.

    I'm normally on a laptop, and rarely run with ALM on unless taking photos... so I can appreciate lower-spec users wanting something that looks "good" without having everything turned on.

    • Haha 1