
Looking to hire a scripter.


Tinkarbell

Hi,

I want to hire someone to make a product for Second Life.

Mesh eyes and a HUD. The eyes follow (look at) the head of the closest avatar. The HUD is used to turn this feature on/off, to set the maximum turn angle (to prevent the eyes from rotating around backwards), to set the range in meters, and finally to toggle an option that rotates the head (with limits) towards the nearest avatar.

So if I am wearing the eyes and the HUD, turn the HUD on, set the range to 10 meters, and set the head and eye angles to 30 degrees, then my avatar should look directly at the head of any avatar that walks up to me.

If there is also a way to have my avatar look at my own camera, that would be included.

How much would it cost to program this?


For something you rez on the ground this would be pretty easy, but for something you wear it is almost impossible, in short because the direction you are facing gets reported very inaccurately, and the script would need to know what direction you are facing to know where to tilt your head. It might be almost doable if your avatar were a magic floating head.

If you turn off all of your AOs, though, you will notice that your head already tracks what you are looking at, and sometimes a nearby avatar. Someone should yell at Linden Lab to expose more of their already existing animation system to creators, and improve on it as much as they are able.


1 hour ago, Quistessa said:

[...] that your head [and eyeballs!] already track what you are looking at, and sometimes a nearby avatar. Someone should yell at Linden Lab to expose more of their already existing animation system to creators, and improve on it as much as they are able.

This is an old and very underrated feature that makes avatars more alive, but it has been sidelined by the massive adoption of custom mesh heads. Eyes in mesh heads no longer track their surroundings the way the system head and eyes do, which is what makes the new mesh heads look dead and doll-like. The best the new heads can do is emulate eye movement through random motion, without context.
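
To illustrate what that random emulation boils down to, here is a minimal LSL sketch (not any vendor's actual code; the eye prim's orientation and the timings are assumptions):

// Context-free "random gaze": shift an unrigged eye prim (local X axis out of
// the pupil) by a small random amount every few seconds.
default
{
    state_entry()
    {
        llSetTimerEvent(2.0);
    }

    timer()
    {
        // Pick a small random yaw and pitch, roughly +/-10 degrees.
        float yaw   = (llFrand(20.0) - 10.0) * DEG_TO_RAD;
        float pitch = (llFrand(20.0) - 10.0) * DEG_TO_RAD;
        llSetLocalRot(llEuler2Rot(<0.0, pitch, yaw>)); // no awareness of surroundings
        llSetTimerEvent(1.0 + llFrand(3.0)); // vary the interval a little
    }
}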

The avatar's look_at coordinates can be obtained with PERMISSION_TRACK_CAMERA, llGetCameraPos and llGetCameraRot. With that data, you should be able to rotate the mesh eyes. (With limits, of course, unless you want the eyeballs to rotate inward like the girl's in The Exorcist.) But with that, you're merely imitating what the original eyes can already do without scripts. Even custom mesh eyes attached to the eyeball attachment points will rotate along and follow nearby avatars (especially when they are typing or talking), as long as the user has look_at enabled in their viewer. The catch is that this rotating-eyeball behavior makes the 'gaze' of most modern custom mesh heads look creepy.
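
A minimal sketch of that permission flow (the 0.2 s poll rate and the debug output are just placeholder assumptions):

// Sketch: ask the wearer for camera-tracking permission, then read the
// camera's position and rotation on a timer. Not a finished product.
default
{
    attach(key id)
    {
        if (id != NULL_KEY) // just attached (id is NULL_KEY on detach)
            llRequestPermissions(id, PERMISSION_TRACK_CAMERA);
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRACK_CAMERA)
            llSetTimerEvent(0.2); // poll the camera a few times per second
    }

    timer()
    {
        vector   camPos = llGetCameraPos(); // region coordinates of the wearer's camera
        rotation camRot = llGetCameraRot();
        // From here you would derive an eye rotation toward the camera's line
        // of sight, clamped to sane limits.
        llOwnerSay((string)camPos + " / " + (string)camRot);
    }
}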

Edited by Arduenn Schwartzman

4 minutes ago, Arduenn Schwartzman said:

This is an old and very underrated feature that makes avatars more alive, but it has been sidelined by the massive adoption of custom mesh heads. Eyes in mesh heads no longer track their surroundings the way the system head and eyes do, which is what makes the new mesh heads look dead and doll-like. The best they can do is emulate eye movement through random motion, without context.

Most mesh heads I've used come with mesh eyes that are rigged to the eyeball bones, which lets them work like the "real eyes" or unrigged attachments.


14 minutes ago, Wulfie Reanimator said:

Most mesh heads I've used come with mesh eyes that are rigged to the eyeball bones, which lets them work like the "real eyes" or unrigged attachments.

This appears never to have worked well. With the first Catwa heads, the eyes had a weird offset that made them look really creepy when rotated sideways. I think newer brands have improved on that a bit, but the rotation range that keeps eyes out of the creep zone is much smaller than with system heads. Most people on Flickr and in-world seem to have turned the eye rotation feature off entirely. It's very rare that people go through the trouble of manually adjusting their gaze. Note how most people in SL only stare straight ahead, like mannequins. Just browse through a bunch of portraits posted on Flickr: eyeballs turned sideways are very rare. The eyes-following-people feature in mesh heads just doesn't appear to be very popular either.

[Update] Just to illustrate: in only one picture in a hundred in this random collection of portraits on Flickr does the avatar have their eyes (barely) rotated. In 99 out of 100, the gaze is straight into infinity. Compare that to real-life models, who mostly look into the camera, but whose eyeballs are very often at an angle because their heads are tilted, which is a far more natural look.

Edited by Arduenn Schwartzman

7 minutes ago, Arduenn Schwartzman said:

This appears never to have worked well. With the first Catwa heads, the eyes had a weird offset that made them look really creepy when rotated sideways. I think newer brands have improved on that a bit, but the rotation range that keeps eyes out of the creep zone is much smaller than with system heads. Most people on Flickr and in-world seem to have turned the eye rotation feature off entirely. It's very rare that people go to the trouble of manually adjusting their gaze. Note how most people in SL only stare straight ahead, like mannequins. Just browse through a bunch of portraits posted on Flickr: eyeballs turned sideways are very rare. The eyes-following-people feature in mesh heads just doesn't appear to be very popular either.

My tolerance for "uncanniness" is probably higher than yours, because I've never felt that an avatar looks creepy because of its eyes, even when they're rotating more than humanly possible. I don't mind static eyes at all, either. But I will say that some eye textures have too much of that reddish vein detail around the eyes, which looks really ugly when the eyes turn.

Edited by Wulfie Reanimator

6 hours ago, Arduenn Schwartzman said:

The avatar's look_at coordinates can be obtained with PERMISSION_TRACK_CAMERA, llGetCameraPos and llGetCameraRot. With that data, you should be able to rotate the mesh eyes.

You should be, yes, but as I tried to say, in LSL you can't just tell an attachment where to look. Because it's an attachment, you need to know the facing of the avatar (root prim) and give a rotation relative to that, say 5 degrees to the left (actually everything is in quaternions, but that's less easy to explain in a forum post); see the sketch below. You'd also need the position of your head, because (for example) if you were crouched down you would want to look up at things above your eye level.
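
Roughly, that frame conversion looks like this (a sketch with a hypothetical helper, lookToward; it assumes the script sits in the attachment's root prim):

// Convert a world-frame target into the frame llLookAt expects for an
// attachment: divide the offset by llGetRootRotation(), i.e. rotate it by the
// avatar's inverse rotation. The same pattern appears in the script posted
// later in this thread.
lookToward(vector target)
{
    vector p = llGetPos();
    vector offset = (target - p) / llGetRootRotation(); // world frame -> avatar frame
    llLookAt(p + offset, 3.0, 1.0);
}

default
{
    touch_start(integer num)
    {
        lookToward(llDetectedPos(0)); // e.g. look toward whoever touched us
    }
}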

Knowing the rotation and position of the head is impossible for two reasons. Firstly, the rotation of your whole avatar is reported inaccurately and is slow to update: if you face north and then turn to the east, scripts will think you're facing east-northeast or so, and only about a quarter of a second later. Secondly, if you play any animation or pose that moves your head and/or body (so pretty much any interesting animation), your head will be in a slightly different position for everyone who looks at you, because animations are a client-side effect.

tl;dr trying to win an argument nobody here cares about.

Edited by Quistessa

7 hours ago, Quistessa said:

For something you rez on the ground this would be pretty easy, but for something you wear it is almost impossible, in short because the direction you are facing gets reported very inaccurately, and the script would need to know what direction you are facing to know where to tilt your head. It might be almost doable if your avatar were a magic floating head.

If you turn off all of your AOs, though, you will notice that your head already tracks what you are looking at, and sometimes a nearby avatar. Someone should yell at Linden Lab to expose more of their already existing animation system to creators, and improve on it as much as they are able.

Here is an eye that follows the closest avatar:

https://marketplace.secondlife.com/p/Super-Mario-64-Follow-Eye/15243757

I'm not sure why it would need to know the position of your head if it is attached to it. The only thing changing is the rotation.


If you can't afford L$5 for an eye, I'd wonder how much you intend to pay your scripter... If you hit me up next time I'm in-world, I can demo a look-at-avatar script for anyone interested. I made one for someone a week or so ago. It also plays random sounds. I think there's also such a script on the wiki...

// For use inside a child prim or the root of an attachment:
// makes the child or attachment look at the nearest avatar.
 
default
{
    state_entry()
    {
        llSensorRepeat("", "", AGENT, 20.0, PI, 0.2);
    }
 
    sensor(integer total_number)
    {
        vector p = llGetPos();
        llLookAt(p + (llDetectedPos(0) + <0.0, 0.0, 1.0> - p) / llGetRootRotation(), 3.0, 1.0);
    }
}

http://wiki.secondlife.com/wiki/LlLookAt


13 minutes ago, Quistessa said:

If you can't afford L$5 for an eye, I'd wonder how much you intend to pay your scripter... If you hit me up next time I'm in-world, I can demo a look-at-avatar script for anyone interested. I made one for someone a week or so ago. It also plays random sounds. I think there's also such a script on the wiki...


// For use inside a child prim or the root of an attachment:
// makes the child or attachment look at the nearest avatar.
 
default
{
    state_entry()
    {
        llSensorRepeat("", "", AGENT, 20.0, PI, 0.2);
    }
 
    sensor(integer total_number)
    {
        vector p = llGetPos();
        llLookAt(p + (llDetectedPos(0) + <0.0, 0.0, 1.0> - p) / llGetRootRotation(), 3.0, 1.0);
    }
}

http://wiki.secondlife.com/wiki/LlLookAt

The problem here isn't coming up with L$5. Trust me.

The problem is that that eye doesn't have all the features, including the HUD etc., that I outlined in the OP.

I know that it isn't impossible. I'm still looking for a scripter who wants to be hired. Anyone?


I can easily add the extra features, but if the base product is kind of crappy, extra features won't make it better, and to make it worth my while I'd have to charge exorbitant prices for a one-off custom piece, because it's not something I could market to anyone who wants it to look realistic and not janky. It's not impossible, but it will look bad.


3 minutes ago, Quistessa said:

I can easily add the extra features, but if the base product is kind of crappy, extra features won't make it better, and to make it worth my while I'd have to charge exorbitant prices for a one-off custom piece, because it's not something I could market to anyone who wants it to look realistic and not janky. It's not impossible, but it will look bad.

Why will it look bad?


6 hours ago, Arduenn Schwartzman said:

One problem is the dark shadowing baked onto the eyeballs, which becomes visible when they are rotated, even within physical constraints.

There isn't any dark baked shadowing on my mesh eyes. I think you may have bought the wrong ones.


1 minute ago, Quistessa said:

Because it won't look in exactly the right direction...

It does, though. I just tested it with this look-at script:


default
{
    on_rez(integer param)
    {
        llResetScript();
    }

    state_entry()
    {
        llSensorRepeat("", "", AGENT, 15.0, PI, 10); // scan for nearby avatars
    }

    sensor(integer total_number)
    {
        integer distance = (integer)llVecDist(llGetPos(), llDetectedPos(0));
        llLookAt(llDetectedPos(0) + <0.0, 0.0, 0.5>, 0.2, 0.3); // look at detected avatar
        if (distance <= 10) // someone is close: rescan quickly
        {
            llSensorRepeat("", "", AGENT, 15, PI, 0.1);
        }
        else // nobody within 10 m: slow back down
        {
            llSensorRepeat("", "", AGENT, 15, PI, 10);
        }
    }
}


integer gChannelRange = 17;
integer gChannelAngle = 18;
float gWakeTime = 10.0;      // slow scan interval while nobody is in range
float gSensorInterval = 0.1; // fast scan interval while tracking someone

float gRange = 10.0;
float gAngle = PI;

default
{
    on_rez(integer param)
    {
        llResetScript();
    }

    state_entry()
    {
        llSensorRepeat("", "", AGENT, 15.0, PI, gWakeTime); // scan for nearby avatars
        llListen(gChannelRange, "", llGetOwner(), "");
        llListen(gChannelAngle, "", llGetOwner(), "");
    }

    listen(integer Channel, string Name, key ID, string Text)
    {
        if (Channel == gChannelRange)
        {
            gRange = (float)Text;
            llSensorRepeat("", "", AGENT, gRange, gAngle, gSensorInterval);
        }
        else if (Channel == gChannelAngle)
        {
            gAngle = (float)Text * DEG_TO_RAD;
            llSensorRepeat("", "", AGENT, gRange, gAngle, gSensorInterval);
        }
    }

    sensor(integer total_number)
    {
        integer n;
        while (llDetectedKey(n) == llGetOwner()) // skip the wearer; could just be 'if' rather than 'while'
        {
            ++n;
        }
        if (n >= total_number) // nobody but the wearer in range
        {
            llSensorRepeat("", "", AGENT, gRange, gAngle, gWakeTime);
            llRotLookAt(ZERO_ROTATION, 0.2, 0.3); // look straight ahead
            return;
        }
        llLookAt(llDetectedPos(n) + <0.0, 0.0, 0.5>, 0.2, 0.3); // look at detected avatar
        // vector distance = llGetPos() - llDetectedPos(0);
        /* if (distance * distance > gRange * gRange)
        {
            llSensorRepeat("", "", AGENT, gRange, gAngle, gWakeTime);
        } */
    }
}

That's with chat commands on channels 17 and 18. "/17 0" should turn off the movement.

yw.

Edited by Quistessa
more slight adjustments

10 minutes ago, Quistessa said:

gChannelRange=17;
gChannelAngle=18;
float gRange=10.0;
float gAngle=PI;

default
{
    on_rez(integer param)
    {
        llResetScript();
    }
    state_entry()
    {
        llSensorRepeat("", "", AGENT, 15.0, PI, 10); // scan for nearby avatars
        llListen(gChannelRange, "", llGetOwner(), "");
        llListen(gChannelAngle, "", llGetOwner(), "");
    }
    listen(integer Channel, string Name, key ID, string Text)
    {
        if (Channel == gChannelRange)
        {
            gRange = (float)Text;
            llSensorRepeat("", "", AGENT, gRange, gAngle, 0.1);
        }
        else if (Channel == gChannelAngle)
        {
            gAngle = (float)Text * DEG_TO_RAD;
            llSensorRepeat("", "", AGENT, gRange, gAngle, 0.1);
        }
    }
    sensor(integer total_number)
    {
        integer distance = (integer)llVecDist(llGetPos(), llDetectedPos(0));
        llLookAt(llDetectedPos(0) + <0.0, 0.0, 0.5>, 0.2, 0.3); // look at detected avatar
        if (distance <= 10)
        {
            llSensorRepeat("", "", AGENT, gRange, gAngle, 0.1);
        }
        else if (distance > 10)
        {
            llSensorRepeat("", "", AGENT, gRange, gAngle, 10);
        }
    }
}

That's with chat commands on channels 17 and 18. "/17 0" should turn off the movement.

yw.

syntax error line 0, column 13.

I don't think it knows what gChannelRange is.


4 minutes ago, Quistessa said:

Oh whoops, it should be (edited the original post):


integer gChannelRange=17;

Thanks so much for doing this, you are awesome. The only problem with the code is that when it is attached, it thinks that I myself am the closest detected avatar. If there were only some way to get it to ignore self, or focus on the second-closest avatar, then we would have a winner.


Edited my first script post to (hopefully) ignore self. Also used the more efficient distance check and moved some parameters to global variables.

The distance check was unnecessary because that's already covered by the range of the sensor; I commented it out after making it more efficient.


39 minutes ago, Quistessa said:

Edited my first script post to (hopefully) ignore self. Also used the more efficient distance check and moved some parameters to global variables.

The distance check was unnecessary because that's already covered by the range of the sensor; I commented it out after making it more efficient.

We are so close... I'm making a video for you...

