
Wulfie Reanimator

Resident
  • Posts

    5,744
  • Joined

Everything posted by Wulfie Reanimator

  1. Unfortunately no, the sim(s) I might use this on don't have Experiences. It's quite possible. llPushObject-based avatar grabbers can and have been done. I can even show you the one I have. Even if you try to move while you're being held in the air, you won't be able to move much. But this is what mine does. I'm not touching any inputs and I get thrown about to the point where I escape without trying.
  2. This has a couple problems -- I want the target to be able to escape within the first few seconds of being grabbed, and they would most likely be shot by enemy combatants while they are trapped mid-air, so the cage shouldn't block the avatar's movement or physical objects flying at it. The second part is relatively easy to build around, but the first part is the hard one. I can do the initial pulling part just fine now, but the problems happen when I'm supposed to be holding the avatar in place in the air. I don't know what kind of counter-forces I can use to cancel out the "bounciness" that gets more extreme over time.
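     One approach to the bounciness (a sketch of my own, untested in-world; the kP/kD gains are guesses you would have to tune) is to treat the hold like a damped spring: read the avatar's velocity with OBJECT_VELOCITY and subtract a velocity term from the push, so the push opposes the avatar's motion instead of compounding it:

```lsl
// Hypothetical hold-in-place sketch: the position term pulls the avatar
// toward the hold point, the velocity term damps out the bounce.
float kP = 2.0; // position gain (guess, needs tuning)
float kD = 1.5; // velocity/damping gain (guess, needs tuning)

hold(key target, vector hold_point)
{
    list d = llGetObjectDetails(target, [OBJECT_POS, OBJECT_VELOCITY]);
    vector pos = llList2Vector(d, 0);
    vector vel = llList2Vector(d, 1);
    float mass = llGetObjectMass(target);

    // Push toward the hold point, minus a term opposing current velocity.
    llPushObject(target, ((hold_point - pos) * kP - vel * kD) * mass,
                 ZERO_VECTOR, FALSE);
}
```

     With kD high enough relative to kP this behaves like a critically damped spring, which is the "no overshoot" behavior you want; too little damping and it still bounces, too much and it feels sluggish.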
  3. What kind of edge-case, out of curiosity? The one you vaguely mentioned is 1) not synchronous, 2) prone to lost messages, and 3) at least partially doable with SLPPF unless the changes are script-related. (I assume so.) The kinds of situations I can think of are gun-related and we have workarounds for the workarounds!
  4. So I'm working on a little something. It's game-inspired, for use in combat sims or my own amusement. Essentially, this: (7 second video) The goal is to wait for an avatar to walk within a certain range, then use llPushObject to pull them in and up into the air. Problem is that llPushObject seems very weak, especially the further the avatar is from the point I want them to "gravitate" towards. My current code looks something like this:

     key target;
     vector gravity_center;
     float force = 3.2;
     float grab_range = 3;
     float escape_range = 6;
     float exponent = 0.0002;

     // snipped

     sensor(integer n)
     {
         target = llDetectedKey(0);
         gravity_center = llGetPos() + <0,0,2>;
         llOwnerSay((string)["target ", target]);
         llSetColor(<1,1,0>, -1);

         while (TRUE)
         {
             vector avatar = llList2Vector(llGetObjectDetails(target, [OBJECT_POS]), 0);
             vector direction = gravity_center - avatar; // Vector TOWARD the center.
             if (llVecMag(direction) > escape_range) jump exit; // Exit condition!

             float mass = llGetObjectMass(target);
             llPushObject(target, llVecNorm(direction) * force * mass, ZERO_VECTOR, FALSE);
             llSleep(0.044);

             if (exponent < 2) // Slow curve to force-growth, chance to escape.
             {
                 exponent *= 2;
                 llOwnerSay((string)["increment ", exponent]);
             }
             if (force < 1000) // 1000 is still very weak.
             {
                 force += exponent;
                 llOwnerSay((string)["force ", force]);
             }
         }
         @exit;
         llSetColor(<1,0,0>, ALL_SIDES);
     }

     Yeah; repeating sensors, infinite loops, AND jumps! What a time to be alive. One event is faster than one every fraction of a second, we don't have break to terminate a loop, and the object doesn't need to respond to anything during the loop. Adding the avatar's mass to the equation didn't seem to do anything. So my question is, how do I account for the distance so avatars would be more strongly affected? Are there other pushing functions I'm forgetting?

     Edit: Update! The wiki page for llPushObject says "The push impact is diminished with distance by a factor of distance cubed." So... llPow(llVecMag(distance), 3) does a good job at counteracting the distance falloff. It's a great start, but when the avatar is near the target it tends to "overshoot" and bounce, and it can't hold the avatar in place in the air, so I'm still looking for help/input.
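     To illustrate where the edit's compensation could slot into the loop (my sketch, untested; it counters the falloff but still needs damping to stop the overshoot):

```lsl
// Scale the push by distance^3 to counter llPushObject's documented
// distance-cubed falloff.
vector direction = gravity_center - avatar;
float dist = llVecMag(direction);
float mass = llGetObjectMass(target);
llPushObject(target,
             llVecNorm(direction) * force * mass * llPow(dist, 3),
             ZERO_VECTOR, FALSE);
```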
  5. I put my main script in the root and the other 19 into a microscopic mesh triangle where people won't find it.
  6. Never mind that you can use the same payment info on multiple accounts even if you don't buy anything...
  7. The only unironic use of the base body I've seen are from naked guys looking to hump anything that moves in 2007-era sex sims.
  8. The walls are invisible from the outside because SL only supports one-sided mesh faces. You would need to create a second "layer" of wall and flip the normals to face outwards. You'll also need to create a custom physics shape for the skybox if you haven't already, and set the physics type to prim instead of convex hull. And as a personal side-note: I would prefer to have the transparent walls. I'm annoyed in my small skybox whenever my camera goes outside of it and my screen is covered by a wall.
  9. I gotta stand with ChinRey on this one. Even if the UV layouts did work and stayed as they were in MD, you wouldn't want to keep them in that layout. In Blender, you can select any vert/edge/face and press Ctrl-L to "select linked" and move each UV island to separate them out. Similarly, you could select each individual material to select all of the topology/UV islands. Random guess though: what about the 10% scale you've set? If you leave it at 100%, does that fix anything?
  10. Well, the vector part is relatively easy. (Just for reference: http://wiki.secondlife.com/wiki/LlCastRay) You'll want to cast a ray from your position, two meters forward:

      vector ray1 = <2,0,0> * llGetRot();

      And you would call llCastRay like...

      list ray = llCastRay(llGetPos(), llGetPos() + ray1, []);

      Creating other rays that deviate from the first one is really simple. You just apply another rotation to it. This would offset the ray by 10 degrees horizontally:

      vector ray2 = ray1 * llEuler2Rot(<0,0,10> * DEG_TO_RAD);

      Visually speaking, the rays would look something like this: (And you can use the editing tools to figure out what rotations you need/want.) Programmatically speaking, you could implement the code something like this:

      default
      {
          state_entry()
          {
              vector pos = llGetPos();
              rotation rot = llGetRot();

              vector ray1 = <2,0,0> * rot;
              vector ray2 = ray1 * llEuler2Rot(<0, 0, 0.174533>);
              vector ray3 = ray1 * llEuler2Rot(<0, 0, -0.174533>);
              llOwnerSay(llList2CSV([ray1, ray2, ray3]));

              list ray = llCastRay(pos, pos + ray1, []);
              // Check for direct hit...
              ray = llCastRay(pos, pos + ray2, []);
              // Check for glancing hit...
              ray = llCastRay(pos, pos + ray3, []);
              // Check for glancing hit...
          }
      }
  11. Is this a math question? Do you want to know how to "spread out" the rays in some shape? Or do you need help figuring out how to decide which ray to use?
  12. Since we now have the ability to change our legacy names, it's worth pointing out that you should design with this in mind. Will your product break if someone changes their legacy name? Will they miss out on something, or have to restart their progress?
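      A simple way to design for this (a minimal sketch, assuming you control the storage format): persist the avatar's key, which never changes, and resolve the name only at display time:

```lsl
// Store the key, not the name. Keys survive legacy-name changes.
key customer; // saved when the purchase/progress is recorded

report()
{
    // llGetUsername returns the avatar's *current* username,
    // so a legacy name change doesn't orphan their data.
    llOwnerSay("Progress belongs to: " + llGetUsername(customer));
}
```

      Note that llGetUsername only resolves avatars currently in the region; for avatars elsewhere you'd use llRequestUsername and handle the dataserver event.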
  13. Just to clarify for context, do you have VR? I wasn't saying that everybody gets affected by low framerates the same. I'm saying that everybody, when the FPS gets low enough for them, will suffer. For example, if you show them a still frame and tell them to move their head. I'm pretty resistant to it, I can put up with 15+ FPS although it's not comfortable, but if it hits that or below, I crumble to the floor. Also, if you're saying that you're getting car sick while playing on a flat monitor, this is literally the first time I've ever heard of that happening to anyone, and I've been playing games my whole life. I don't doubt you, but I can't imagine that being normal.

      When I was talking about VR content, I wasn't talking about SL. I'm very aware of the basic concepts of how VR works on the programming side; I've been making games as a hobby for a few years and I'm a full-time student in a "coding school." (Not game related.) But it's absolutely true that a lot of companies will look at VR and say "we can't afford to develop a game for such a small platform. Too high risk." The market is very small, and most of the content that exists is bad. (I've tried subscription-based platforms like Viveport. 95% actual shovelware, 3% dead games, 2% actually playable but mainly singleplayer.) You can read any number of articles on why serious companies aren't working on VR games/content. The ones we get are indies, asset flippers (people who buy ready-made assets and resell them as a game, poor quality), and unfinished "early access" type stuff. We need more exceptions like Beat Saber, Boneworks, Half-Life: Alyx, and to a lesser extent the Skyrim/Fallout VR conversions, but none of these are frequent.

      To me this sounds like cutting bread with one of these: It'd work, but it seems a bit overkill... I've never heard of any game using AI for its camera, lol. I can't think of any situation where you couldn't implement a "don't go through stuff" camera with simple raycasting and a sphere-shaped collider. Personally I have no idea how the camera in Firestorm is so bad at staying inside buildings, unless it's one of those "the physics shape is messed up so it's not the computer's fault" things again.
  14. As someone who gets car sick very easily, I can stay in VR for 8 hours straight and feel perfectly fine, besides the physical exhaustion. The problems you talk about happen mainly (or at least universally?) when the framerate goes down (or with bad tracking, which is more of a problem with the physical setup). The people who get sick from just being in VR seem to be a minority. The main reason why VR remains niche is the cost of entry. VR is expensive and doesn't have that much quality content. Developers/publishers aren't making content for VR because it doesn't have a large customer base. One of those things needs to change before things can really improve. It really doesn't. All it requires is a couple of line checks from the avatar to where the camera wants to be. If there is something in between, the camera gets moved closer. Like ChinRey pointed out, it even already exists. Making it better essentially only needs more parallel "lines" to check (denser = detects smaller objects that might be in the way).
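      The "line check" idea, sketched in LSL terms (a hypothetical helper for illustration only; a viewer does the real thing in its own renderer, not in LSL):

```lsl
// If anything sits between the avatar and the desired camera spot,
// move the camera to the obstruction instead.
vector camera_pos(vector avatar_pos, vector wanted_pos)
{
    list hit = llCastRay(avatar_pos, wanted_pos, []);
    if (llList2Integer(hit, -1) > 0) // status code: number of hits
        return llList2Vector(hit, 1); // first hit's position
    return wanted_pos; // nothing in the way
}
```

      Denser coverage is just more of these casts, each offset slightly from the first ray.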
  15. I imported the dae and it's perfectly fine. Yeah, I know what you mean and that's still "normal" as far as the glitch goes. The calculations are messed up and the artifacting is affected by camera position, usually. (Sometimes they stretch toward a single point, depends on the exact error that's happening.)
  16. Is there any chance you could send me the blend file? I'm super confused/curious about what's causing that. Also they're not shadows, they're literally just mesh planes stretching into infinity. It's a relatively common phenomenon in 3D rendering, for example if you log into SL with a viewer that doesn't have rigged mesh implemented.
  17. If the demo is broken, always assume the real one will be too. If you can't make a demo properly, why would anyone think you can do it right when people pay for it? Disclaimer: This isn't to say that all products will be broken. But if someone is working on a demo and makes a mistake, they're likely working on the real product in tandem and replicating the same mistake.
  18. It's no different from "regular select." You just always have that tool selected until you select something else. Hold left click on the Select tool icon. What do you mean? Like splitting the view so you can have 3D view + UV editor open? It works exactly the same as in 2.79. Drag a corner or right-click between the views: Correct me if I'm wrong but 2.79 has the same issue, the reference manual is largely unchanged in style.
  19. What I wonder is how useful would it really be in a lot of cases. I seriously doubt you could resize mesh by hand accurately enough to make it seamless. There's nothing to snap to, and even the slightest inaccuracy gets worse in altitude. What about verts around joints? You scale it down and everything starts clipping, you scale it up and you get extreme jaggies. Do you really get good results when you just scale up the rigged mesh in [insert 3D software]? Genuine question.
  20. But once you get used to the changes, things that used to take hours take minutes. 2.8 was a massive rework, so re-learning the UI is expected, but it's much better overall now.
  21. I've been watching a ton of crafts videos lately and I had Blender open while watching this: It's 602 tris total, not totally cleaned up and definitely fits in 1 LI but I just wanted to show / bump.
  22. Alternatively, you could have just one object that's being worn by each person holding the sermon, and those objects send a message to any avatars that are beyond regular chat range, using llRegionSayTo on channel 0. This way there's no shouting but everyone can hear the important parts at any range within the sim. The two-object method can have slight overlap, so people sitting in range of both objects would get two messages unless that's being handled somehow.
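      A minimal relay along those lines (my sketch; a real one would track multiple distant listeners, e.g. from a subscription list, so far_listener here is a hypothetical single target):

```lsl
key far_listener; // hypothetical: one avatar beyond normal chat range

default
{
    state_entry()
    {
        llListen(PUBLIC_CHANNEL, "", NULL_KEY, ""); // hear nearby chat
    }

    listen(integer channel, string name, key id, string message)
    {
        if (llGetAgentSize(id) != ZERO_VECTOR) // relay only avatar chat
            llRegionSayTo(far_listener, PUBLIC_CHANNEL, name + ": " + message);
    }
}
```

      llRegionSayTo on channel 0 is heard only by the target avatar, which is what avoids both the shouting and the double-message overlap.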
  23. Here you go, just for you: https://marketplace.secondlife.com/p/Mesh-Display-Panels-05-LI/19358603