llSensor (Making a melee combat system in Second Life, Problem 1)


Estelle Pienaar

Recommended Posts

I haven't been scripting in a long while, but I got interested in making a melee combat system in which a magic wand would "shoot" lightning bolts (as particles) at little monster NPCs (non-player characters). I am currently in the concept phase and would like to ask a few questions / rant about the impractical implementation of LSL...

So, in an ideal world, I could start a sensor from the prim attached to the wand that sweeps downward in a PI/4 arc, checking whether an NPC is in range when the wand is fired. But sensor cones always point along the x-axis, or, when attached to an avatar, llSensor uses the direction the avatar is facing as the forward vector. In general it seems very impractical to me to always run the sensor along the x-axis of an object. When attached like this, a forward arc will miss NPC monsters that are too far below and too close to the agent.

I will have the same problem with the NPC monsters. For them it also makes no sense to scan a PI/4 arc if they can only do it around the static x-axis; they would always have to do a full PI arc scan.

Clumsy workarounds I see:

(1) For the agent and their wand, I could make the player wear a second attachment at their feet that runs the sensor. This sensor would detect the NPC objects correctly, but the player has to wear an extra object, and if they forget it, the game won't work.

(2) Use a full PI arc sensor that scans everything around (including above and behind) the avatar, and then calculate afterwards which part of the results actually counts. So I would first check whether the detected NPC object is in front of the avatar.

Then I would check whether the NPC object is close enough to the avatar.

And finally I would check whether the NPC object is really under the wand, by deriving the z-position via llGetAgentSize and some math.

That's a lot of clunky operations, just because the sensor can't be aimed freely. Not very efficient. Even if I applied the clunky solution (1) with an extra attached prim for the sensor, I would still need solution (2) to determine the agent's position for the NPC's attacks.
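
For illustration, a rough sketch of what workaround (2) could look like in the wand script; the range, the cone angle, and the llGetAgentSize-based height factor are placeholder guesses, not tested values:

// Rough sketch of workaround (2): scan a full PI arc, then narrow the
// results down by facing, distance, and height. All numeric values and
// the llGetAgentSize-based height factor are illustrative guesses.
float HIT_RANGE = 10.0;       // how far the lightning bolt reaches
float HIT_CONE  = 1.0471976;  // PI / 3, half-angle of the cone that counts

default
{
    touch_start(integer total)
    {
        // Arc of PI = detect everything around the avatar within range.
        llSensor("", NULL_KEY, ACTIVE | PASSIVE, HIT_RANGE, PI);
    }

    sensor(integer hits)
    {
        vector here = llGetPos();                   // avatar centre (attachment)
        vector fwd  = llRot2Fwd(llGetRot());        // avatar forward vector
        vector size = llGetAgentSize(llGetOwner()); // for the height check

        integer i;
        for (i = 0; i < hits; ++i)
        {
            vector there    = llDetectedPos(i);
            vector toTarget = there - here;

            // (a) facing: dot product of forward vector and direction to target
            integer inFront = (fwd * llVecNorm(toTarget)) > llCos(HIT_CONE);
            // (b) distance (the sensor range already caps this; kept for clarity)
            integer inRange = llVecMag(toTarget) <= HIT_RANGE;
            // (c) height: the monster must sit below the avatar; tune the factor
            integer below   = there.z < (here.z - size.z * 0.25);

            if (inFront && inRange && below)
            {
                llOwnerSay("Would hit: " + llDetectedName(i));
            }
        }
    }
}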

There are better solutions, even in SL: particles, for example, are always emitted from the face that was pointing upwards when the prim was rezzed, and they travel in the direction this face is facing. Why isn't it possible to implement llSensor in this way? It would make much more sense and make the script much more efficient!

Or am I missing a better solution?

[Image: wand.png]


19 minutes ago, Estelle Pienaar said:

Particles, for example, are always emitted from the face that was pointing upwards when the prim was rezzed, and they travel in the direction this face is facing. Why isn't it possible to implement llSensor in this way?

There's llCastRay, which is quasi-similar to llSensor, but it only detects things along a straight line and returns its results directly in a list rather than through a separate event. Setting up good raycast lines also involves some math and good judgement.
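
A minimal llCastRay sketch, assuming the script sits in the wand prim and its forward axis points where the bolt goes (if the prim is attached, llGetPos and llGetRot give the avatar's position and rotation instead); the 10 m length is an arbitrary pick:

// Minimal llCastRay sketch: one straight ray from the prim along its own
// forward axis. If the prim is an attachment, llGetPos/llGetRot give the
// avatar's position and rotation instead. RAY_LENGTH is an arbitrary value.
float RAY_LENGTH = 10.0;

default
{
    touch_start(integer total)
    {
        vector start = llGetPos();
        vector end   = start + llRot2Fwd(llGetRot()) * RAY_LENGTH;

        list hits = llCastRay(start, end,
            [RC_REJECT_TYPES, RC_REJECT_LAND, RC_MAX_HITS, 1]);

        // The last list element is the hit count (or a negative error code).
        integer status = llList2Integer(hits, -1);
        if (status > 0)
        {
            key    hitKey = llList2Key(hits, 0);
            vector hitPos = llList2Vector(hits, 1);
            llOwnerSay("Ray hit " + llKey2Name(hitKey) + " at " + (string)hitPos);
        }
    }
}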

Personally, I think doing a sensor with a full circle and then doing the math to narrow it down to what you actually want isn't so bad, but I'm pretty good at math.


42 minutes ago, Estelle Pienaar said:

When attached like this,

Something you might be missing is that the SL simulator doesn't "know" where attachments are exactly on your body* because it doesn't "see" what animation/pose your avatar is doing. Doing things from the avatar's perspective saves a lot of headaches in misunderstanding where the thing "is" vs where it looks like it is.

*Unless the attachments are attached to the root/center attach point, which ignores animations.


Thank you very much for the hint that the SL simulator might not know the attachment position. I will do some tests and see if a simple sensor will do! 

I guess that most scripters are okay to pretty good with math. But geometry is just not my love affair, and if I can avoid it... 😂


5 hours ago, Estelle Pienaar said:

(1) For the agent and their wand, I could make the player wear a second attachment at their feet that runs the sensor. This sensor would detect the NPC objects correctly, but the player has to wear an extra object, and if they forget it, the game won't work.

The sensor will originate from the center of the avatar, regardless of which attachment point the llSensor call is made from. Having stuff attached to your feet won't lower the origin.

For other objects, like your NPCs, you can account for the sensor by building the object in such a way that the root/link points in the correct direction.

Also, instead of doing 45 or 180 degrees, why not 90? That way you'd scan only the correct (front) side of the object.
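
A minimal sketch of that idea on the NPC side, assuming the NPC is built so its root prim's positive x-axis is its "face"; the 10 m range and 1-second repeat rate are arbitrary picks:

// Minimal sketch: the NPC repeatedly scans a PI/2 arc in front of its root
// prim for avatars. The 10 m range and 1-second interval are arbitrary picks.
default
{
    state_entry()
    {
        llSensorRepeat("", NULL_KEY, AGENT, 10.0, PI_BY_TWO, 1.0);
    }

    sensor(integer total)
    {
        // Results are sorted by distance; index 0 is the closest avatar.
        llOwnerSay("Player in front: " + llDetectedName(0));
    }

    no_sensor()
    {
        // Nothing in front of the NPC during this scan.
    }
}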


1 minute ago, Wulfie Reanimator said:

For other objects, like your NPCs, you can account for the sensor by building the object in such a way that the root/link points in the correct direction.

Also, instead of doing 45 or 180 degrees, why not 90? That way you'd scan only the correct (front) side of the object.

Thanks a lot for the first part of the response. Indeed you are right that the sensor does not originate from the attached prim, but from the center of the avatar. The LSL wiki could be more specific on this point. But now I know that a simple sensor called from an attachment will be sufficient for the player agent. I also tested it, and a PI/4 arc is not wide enough; it needs to be PI/3. So for that part my headaches are solved!
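
A minimal sketch of such a wand-side scan, where "Monster" stands in as a placeholder for the NPC object name and the 10 m range is a guess:

// Minimal sketch: one scan from the wand attachment with a PI/3 arc, centred
// on the avatar's forward direction. "Monster" (the NPC object name) and the
// 10 m range are placeholders.
default
{
    touch_start(integer total)
    {
        llSensor("Monster", NULL_KEY, ACTIVE | PASSIVE, 10.0, PI / 3);
    }

    sensor(integer hits)
    {
        llOwnerSay((string)hits + " monster(s) in the firing arc.");
    }
}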

However, I don't understand the second part of your response. This would only work if the NPC didn't move or rotate at all, wouldn't it? But in the concept of the game, the agent and the NPC object constantly move and rotate towards each other, so the sensor originating from the NPC would also have to rotate with the NPC. But an LSL sensor originating from an object would always point along the x-axis, ignoring any rotation of the object. Or am I getting something wrong?


59 minutes ago, Estelle Pienaar said:

But an LSL sensor originating from an object would always point along the x-axis, ignoring any rotation of the object. Or am I getting something wrong?

It faces along the x-axis of the object the script is in (assuming it's not an attachment), so if you turn your object, the "sensor cone" turns with it. LSL has some rather primitive implementations at times, but it's not that primitive.

