
Wulfie Reanimator

Resident
  • Posts: 5,737
Everything posted by Wulfie Reanimator

  1. Do you want smooth color transitions on a surface? Don't do repeating timers or complicated color math. Create a texture with the desired color gradient, and smoothly slide it across the surface. You can have as many colors as you want, it's easy to control, and it works great with small textures! You could even create a greyscale texture so it becomes tintable while still giving you control of the color's brightness.
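A minimal sketch of the sliding-gradient idea, using llSetTextureAnim's SMOOTH mode (the texture name "gradient" and the scroll rate are my own assumptions):

```lsl
// Slide a gradient texture smoothly across face 0.
// "gradient" is a hypothetical texture in the prim's inventory.
default
{
    state_entry()
    {
        llSetTexture("gradient", 0);
        // SMOOTH makes llSetTextureAnim slide the texture instead of
        // flipping between frames; 0.05 = texture-widths per second.
        llSetTextureAnim(ANIM_ON | SMOOTH | LOOP, 0, 1, 1, 0.0, 1.0, 0.05);
    }
}
```

Because llSetTextureAnim is purely client-side, the sliding costs no script time after the initial call.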
  2. Sure. If you compare the value of llDetectedKey(0) against a specific avatar's key in the touch_start event, you can block anybody else from getting the item. http://wiki.secondlife.com/wiki/LlDetectedKey
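A small sketch of that check (the avatar key and the item name "Prize" are placeholders you'd replace with your own):

```lsl
// Only hand the item to one specific avatar.
key allowed = "a822ff2b-ff02-461d-b45d-dcd10a2de0c2"; // hypothetical key

default
{
    touch_start(integer n)
    {
        if (llDetectedKey(0) == allowed)
            llGiveInventory(allowed, "Prize"); // "Prize" is a hypothetical item name
        else
            llRegionSayTo(llDetectedKey(0), 0, "This isn't for you.");
    }
}
```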
  3. But I love nits! And at least we don't have spammy gestures here on the forum. Signatures get close, but at least they're disabled by default.
  4. Opening a profile and clicking notices causes a UI sound to play. It could be that something on your computer is adjusting/balancing/prioritizing sounds, and thinking that SL is more important than your music stream (even if you're listening to it through SL).
  5. The placement of the mesh while not attached to an avatar is not accurate, because they are rigged and so their visual placement is based on the avatar's "skeleton," which is affected by your shape. You need to adjust the piercings after you have linked and worn the ears. (Your piercings won't move with the ears either, if they are animated, because your piercings aren't rigged to the ears. They might follow one of the ears, if the attachment point itself is animated.)
  6. There are two ways to do it. 1. You combine the two linksets, then change the script so that it moves a list of specific prims. This method has some problems: if you link something new to the linkset, the order of the links will change and the moving part will break, unless you name all the moving links something specific and have your script search for which parts should move. 2. You keep the two linksets separate, but both linksets have a script, and the two scripts communicate so that the moving part can align itself correctly with the static part, even if the static part is moved later. This is the worse option because it requires two scripts and two listens. The reason it didn't work how you wanted is that the script doesn't know which links belonged to which linksets before they were combined.
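A sketch of option 1, finding links by name so link order doesn't matter (the link name "moving-part" and the 0.5 m movement are my own assumptions):

```lsl
// Collect every link named "moving-part", then move them together.
list moving_links;

default
{
    state_entry()
    {
        integer i;
        integer count = llGetNumberOfPrims();
        for (i = 2; i <= count; ++i) // link 1 is the root
        {
            if (llGetLinkName(i) == "moving-part")
                moving_links += i;
        }
    }

    touch_start(integer n)
    {
        integer i;
        for (i = 0; i < llGetListLength(moving_links); ++i)
        {
            integer link = llList2Integer(moving_links, i);
            vector local = llList2Vector(
                llGetLinkPrimitiveParams(link, [PRIM_POS_LOCAL]), 0);
            // Raise each moving link by 0.5 m relative to the root.
            llSetLinkPrimitiveParamsFast(link, [PRIM_POS_LOCAL, local + <0,0,0.5>]);
        }
    }
}
```

Rebuilding the list in state_entry (or in a changed event on CHANGED_LINK) keeps the script working after relinking.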
  7. I see you're a man of good taste in hand-writing. People look at me weird when I give them my 0.25 mm. If you swap llDetectedTouchPos and coordinate-checking (which is hard to maintain) for llDetectedLink, this is the correct way to do a multi-button HUD with one script. This way, you can have one texture for the HUD (or one per HUD page if you don't want to figure out offsets) and only need to move the buttons around to get the exact placement/size you need.
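A minimal sketch of the llDetectedLink approach (the button names are hypothetical; you'd name your button prims to match):

```lsl
// One script handles every button by the name of the touched link.
default
{
    touch_start(integer n)
    {
        integer link = llDetectedLink(0);
        string name = llGetLinkName(link);

        if (name == "button_1")
            llOwnerSay("First button pressed.");
        else if (name == "button_2")
            llOwnerSay("Second button pressed.");
        // The root prim (link 1) can act as the HUD background
        // and simply falls through without matching any button.
    }
}
```

Adding a new button is then just linking a named prim; no coordinates to maintain.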
  8. The easiest way to use rotations is to... not. More specifically, life gets easier when you don't touch quaternions (the "rotation" type, with 4 values) directly. Instead, you should define a rotation in normal XYZ degrees, like so:

       vector relative_rot = <0, 0, 45> * DEG_TO_RAD;
       // "DEG_TO_RAD" is a conversion from "degrees to radians."
       // Rotations use radians, but don't think about it.
       // Just do the conversion and forget about it.

     So, now you have relative_rot in a format that can be given to llEuler2Rot, which converts the XYZ rotation into a proper rotation.

       vector relative_rot = <0, 0, 45> * DEG_TO_RAD;
       rotation relative_r = llEuler2Rot(relative_rot);

     Now you have relative_r, which you can use to correctly apply a 45-degree rotation around the Z axis. Similarly, if you want to adjust some object's existing rotation, you can do the conversion in the opposite direction:

       rotation object_r = llGetRot();
       vector object_rot = llRot2Euler(object_r) * RAD_TO_DEG;
       object_rot.z += 45; // Add 45 degrees to the Z rotation.
       llSetRot(llEuler2Rot(object_rot * DEG_TO_RAD));

     This doesn't directly answer your question, but hopefully this can guide you towards easier rotations. If not, I can get back to you later tomorrow.
  9. I don't think that's correct, but it's close. How do you calculate start? For example, if start is llGetPos (avatar pos at <10,10,10>), you're calculating:

       raycast_start = <10,10,10> + <0.5,0.0,0.5>;
       raycast_end = <10,10,10> + <60.0,0.0,0.5>*llGetCameraRot();

     Your raycast starts 0.5 meters above and to the East of the avatar, regardless of where they're facing. The ray might hit the user itself. Apply the camera rotation to the offset in raycast_start as well, and then it's correct. But if start is llGetPos*llGetCameraRot, you'd be calculating:

       raycast_start = <10,10,10>*llGetCameraRot() + <0.5,0.0,0.5>;
       raycast_end = <10,10,10>*llGetCameraRot() + <60.0,0.0,0.5>*llGetCameraRot();

       // When avatar rotation: <0,0,90> degrees
       raycast_start = <-10,10,10> + <0.5,0.0,0.5>;
       raycast_end = <-10,10,10> + <60.0,0.0,0.5>*llGetCameraRot();

     Which would be just completely bonkers. Debugging raycasts is definitely hard since there's no way to automatically visualize them. What I do is rez prims at the starting point, facing towards the calculated direction.
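For completeness, a sketch of the corrected version, with both offsets rotated by the camera (the 60 m range and the touch trigger are my own assumptions; llGetCameraRot needs PERMISSION_TRACK_CAMERA to return anything useful):

```lsl
// Cast a ray from just above the avatar, along the camera's forward axis.
default
{
    state_entry()
    {
        llRequestPermissions(llGetOwner(), PERMISSION_TRACK_CAMERA);
    }

    touch_start(integer n)
    {
        rotation cam = llGetCameraRot();
        // Rotate BOTH offsets so the whole ray follows the camera.
        vector raycast_start = llGetPos() + <0.5, 0.0, 0.5> * cam;
        vector raycast_end   = llGetPos() + <60.0, 0.0, 0.5> * cam;
        list hits = llCastRay(raycast_start, raycast_end, [RC_MAX_HITS, 1]);
        llOwnerSay(llList2CSV(hits));
    }
}
```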
  10. You would get multiple results, but raycast doesn't cause damage on its own. Like @Fenix Eldritch said, you would have to process the results in a way that makes sense for you. If you used 3 rays in a triangle shape, for example, you could have 3 separate variables for each ray's result, and check those results in some set order to choose which ray is the "most important" for a hit. If RayA has a result, damage that avatar and ignore the rest. If RayA didn't hit, check RayB, etc. Also, regardless of how many rays you shoot (As few as possible! Rays can fail to cast completely if the sim is busy.) you should figure out what the expected maximum range for your weapon is, and then figure out the "width" of your raycasted shot, and then remember that you're shooting at a target about 0.2 - 0.4 meters wide. Your maximum spread should not be much greater than the width of your target (<0.5) at maximum range. Parallel rays are the simplest solution so you won't need to calculate an angle and you'll have the advantage of the maximum width of the shot regardless of distance. Ah, I had only tested it with an array of "visual lasers" that used raycast to determine their length. They seemed to form a spherical shape, close enough I suppose but interesting to see. This one even I didn't know about!
  11. Here's a small wrench to throw in: If the message that's received is NOT a valid vector, vReceived_vector will get ZERO_VECTOR as its value. So, if your script is always listening, or might also receive non-vector messages, you may want to verify that the received message is actually supposed to be a vector. The absolute simplest thing to start with is to check whether the message begins with < and ends with >, to at least have an idea whether it could be a valid vector. The useful snippets section on the wiki has a convenient function that does everything.
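A sketch of that minimal shape check (the listen channel is a placeholder; a full validator like the wiki's would also check the three components):

```lsl
// Reject messages that can't be vectors before casting.
// (vector)"garbage" silently becomes ZERO_VECTOR, so check the shape first.
integer looks_like_vector(string msg)
{
    msg = llStringTrim(msg, STRING_TRIM);
    return (llGetSubString(msg, 0, 0) == "<")
        && (llGetSubString(msg, -1, -1) == ">");
}

default
{
    state_entry()
    {
        llListen(5, "", NULL_KEY, ""); // channel 5 is hypothetical
    }

    listen(integer channel, string name, key id, string message)
    {
        if (looks_like_vector(message))
        {
            vector v = (vector)message;
            // Note: v can still legitimately be ZERO_VECTOR
            // if the message was literally "<0,0,0>".
            llOwnerSay("Got vector: " + (string)v);
        }
    }
}
```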
  12. Two fun facts: An avatar that is standing up is shaped like an oblong sphere for raycast. This sphere is smaller than the avatar's hitbox (from render metadata). An avatar that is "sitting on ground" is shaped like a pyramid with a flat tip. Shoot multiple rays, either parallel to each other or starting from the same point but diverging by some degrees.
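A sketch of the parallel-ray variant (the 50 m range and 0.15 m spacing are my own assumptions):

```lsl
// Three parallel rays, offset sideways from the shooter.
// Parallel rays keep the same spread at any distance.
default
{
    touch_start(integer n)
    {
        rotation rot = llGetRot();
        vector fwd  = <50.0, 0.0, 0.0> * rot;  // 50 m forward
        vector side = <0.0, 0.15, 0.0> * rot;  // 0.15 m sideways spacing
        integer i;
        for (i = -1; i <= 1; ++i)
        {
            vector start = llGetPos() + side * i;
            list hit = llCastRay(start, start + fwd, [RC_MAX_HITS, 1]);
            // llCastRay returns a strided list of [key, position]
            // entries with a status integer at the end; process each
            // ray's result in whatever priority order suits the weapon.
        }
    }
}
```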
  13. Impressive word salad, but here: https://get.catznip.com/downloads
  14. While it is technically possible to fix, it's not very practical/efficient. It takes a lot of calculations to figure out which of many surfaces is in front of the others, and what order the rest of them are in. Most games make an attempt but to get it right in every situation is veeerry slow, so most games try to avoid using blended alpha surfaces. Second Life has no quality control or standard for assets, so these kinds of technical concerns aren't thought about or worked around by almost anybody. Creators just accept it as a reality and leave it at that.
  15. I think I'm starting to understand where this weird separatism is stemming from.
  16. Yes, but this only makes sense for TCP, because TCP is very pedantic about making sure you received the previous packet before sending the next one. That article specifically says: SL uses UDP for most things, including texture streaming. UDP is connectionless and makes no delivery guarantees, which is ideal for streaming content (video, files, etc...), because the sender does not wait to find out whether the previous packets were received before sending the next ones. The viewer handles and re-requests missing data and stitches it together in the correct order. http://wiki.secondlife.com/wiki/UDP http://wiki.secondlife.com/wiki/Transfer_Manager http://wiki.secondlife.com/wiki/Image_Pipeline http://wiki.secondlife.com/wiki/Texture_Console
  17. Even if I grant you that grey textures are somehow bad for FPS on their own (I don't agree), latency does not affect how fast something is downloaded in the overall scale. You can have 10000+ ping and still download things at gigabytes per second. (Theoretically speaking. It's an extreme example to bring the point across.) Latency is a measure of the time it takes for one packet (or a round-trip) to travel from A to B (or A->B->A). When you're downloading contiguous data like a texture file, latency won't affect the download speed after the first packet has arrived, as the rest will follow just as quickly regardless of distance, because they were already on the way right behind the first packet. (Similarly, the viewer won't wait for one texture to finish downloading before starting the next -- the viewer is receiving multiple file downloads at once. No delay between each texture.)

     Latency is only important for things like communicating inputs. If it takes 1 second after pressing W before your avatar starts moving... that's not pleasant, but you wouldn't notice the ping just from looking at how fast textures are loading. The act of streaming itself doesn't cause FPS issues. It's the CPU/HDD time spent on decoding a finished download.

     Let's say that hypothetically you start downloading all the textures on a sim at the same time (ignoring viewer restrictions), but due to a disconnect or you throttling your internet speed to like 1Kb/s, you're not going to experience any lag from textures. Why? Because your computer can only render what's already on your computer, and rendering no textures is super easy. I guess this is a bad example, because you think that the streaming itself causes problems, and disconnect/1Kb/s means there's no streaming. But just go and open some debug consoles in your viewer and watch where the slowdowns happen. It's not proportional to current network activity.

     No textures ever finish downloading = No lag from decoding them into usable data = No texture lag. And then you get disconnected from SL because 1Kb/s is not enough to sustain you.
  18. I mostly agree with you but here you're equally misinformed. The quality or bandwidth or latency of your connection has no impact on FPS, because there's absolutely no networking involved in rendering on your computer. Stadia is a streaming service, where the rendering happens somewhere else and the resulting frames are sent to you over the internet, which is why FPS might be lowered by Stadia to reduce the size of the stream when they detect you can't handle it. There might be some residual side effects from a slow connection because of other systems, like texture loading, but even that shouldn't be a thing because if there's little texture data coming in, there's also nothing to decode, which means no lag from loading.
  19. I can't tell if this is satire, but since we're on the SL forums, I don't feel so confident...
  20. These are just "offsets" relative to the size of the texture itself, think of it like a percentage. A texture, by default, is at 0.0 (or 0%) offset for both X(left/right) and Y(down/up) directions. If you were to set the values to 0.5 (50%) and 0.0, you would see that the texture is shifted halfway to the side, so that the "edge" of the texture is in the center. 1.0 means 100% to the left/down. -1.0 means 100% to the right/up. You can experiment with this by selecting an object you own while in Edit Mode and going into the Texture tab. There will be settings called "Horizontal offset" and "Vertical offset."
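A one-line sketch of the same thing done from a script (the 50% shift matches the example above):

```lsl
// Shift the texture on face 0 halfway to the side.
default
{
    state_entry()
    {
        // Offsets are fractions of the texture's size: 0.5 = 50%.
        llOffsetTexture(0.5, 0.0, 0);
    }
}
```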
  21. Here you go: https://vibhub.io/ There's no hope for SL in VR though, unless you like headaches and vomit.
  22. I was thinking even more broadly, but I think the method you're showing there is more practical.