Syko Kamachi

Interactions between voice and scripts

Question

14 answers to this question

Recommended Posts


I did a quick search for references to mouth movement in Second Life and found a page with information about turning on Lip Sync. The web page can be found at http://wiki.secondlife.com/wiki/LipSync. Hope that is of some help to you.

 

Message was edited by: Andrew Hellershanks

After posting the information above, I did some further digging into the LSL API. I also remembered I had seen keyboards that pop out in front of an avatar when they are typing.

To find out if you are typing, set a 1 second timer. In the timer event handler, call llGetAgentInfo with your avatar key and check whether the AGENT_TYPING bit is set. If the bit is set, call the routines to animate the mouth. If the mouth is part of your normal avatar, you could trigger an animation. If your avatar is wearing a prim head with a mouth, you can trigger some routine to animate the prim mouth piece of the head using Puppeteer, or some similar prim animation script.
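To make the typing check above concrete, here is a minimal LSL sketch, assuming the script sits in an attachment worn by the avatar whose mouth should move; animate_mouth() and rest_mouth() are hypothetical placeholders for whatever actually moves the prim mouth.

// Minimal sketch of the typing check described above.
animate_mouth()
{
    // Placeholder: start whatever animates the prim mouth here,
    // e.g. a Puppeteer-style prim animation or a texture swap.
    llOwnerSay("mouth: animate");
}

rest_mouth()
{
    // Placeholder: return the mouth to its idle pose.
    llOwnerSay("mouth: rest");
}

default
{
    state_entry()
    {
        llSetTimerEvent(1.0); // poll once per second
    }

    timer()
    {
        // Check whether the wearer is currently typing.
        if (llGetAgentInfo(llGetOwner()) & AGENT_TYPING)
            animate_mouth();
        else
            rest_mouth();
    }
}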


If you're trying to get a mouth on an avatar to move, you need to go into Advanced and choose to have the avatar's lips move when you speak. I don't think there's a special script to do that. It's already in SL as an advanced option.


It's for a prim head, like those used on furry avatars - this is why I need a script function or event. It's to go in an attachment. And like I said, there is a way to do it, because I've seen voice-activated stuff before.


Open and read the link Andrew posted. It contains detailed information, and so far you can only use some basic mouth gestures - nothing more, nothing less.


Naahhhh, my 26-odd years of professional programming experience just isn't enough to overcome only 2 years of LSL scripting.  As I'd hate to be called a wiseguy - or any sort of guy for that matter - I will say nothing and look forward to sicko's Syko's explanation of how it's done.

'course, I might be tempted to slide over to the 'scripting tips' forum or maybe ask in one of the scripters' groups or mailing lists while I'm waiting.

Or I could toddle off and write some app outside SL that interfaced with my mic and sent info to SL.

Then again, I might not.  After all, no-one's paying me for something I can't do.


Whooooa. Chill out. No need to break out the unnecessary sarcasm; I was asking an honest question. I've seen voice-activated AOs being used by people who don't know one end of a script from the other, so I know there is an entirely in-world way of detecting voice. All I wanted to know was how to do it. Also, for someone who's allegedly been programming professionally for longer than I've been alive, poking fun at my username was a pretty childish response.


There are gestures that can be triggered by voice. Using a special chat trigger, you can use them as a relay to a chat output your scripts can interpret. Look up voice-activated gestures; the triggers are along the lines of voicelevel1 or similar. They're available in the Library under Gestures.

Hope I helped.


EDIT: Following my own advice and reading the docs - there is one exception to the "can't react to voice" rule. The voice system does give info that voice is being used, so a simple on/off reaction is possible. If that is all that is wanted, yup, you CAN do it.

Original message:

There is no way a script can "react" to voice. I don't know what you've seen, but it was NOT a script reacting to voice.

You can either believe the wiki, the developers here, and me, or waste time trying to find something that doesn't exist.

http://wiki.secondlife.com/wiki/Voice/Technical

Also follow the links from there - you will see that the voice is carried separately from the other SL data and so CANNOT be "read".

Have fun reading.


YES. Thank you, that's exactly what I was looking for. I looked into gestures and found the triggers:

/voicelevel1, /voicelevel2 and /voicelevel3

 

Funny how they're so useful to scripts, but there isn't a whisper of them in the scripting sections.


Nvm, got my answer: voice-activated gestures. That's all I was looking for. The fact that gestures are the easiest way to get SL to recognise that voice is being used is probably why this wasn't mentioned in any of the scripting help.


And seriously, 'cause I was pulling your chain before, are you ok from there?

(customise voice-gestures to chat some command on a channel.  Have your script listen on that channel for the command. Activate whatever it is in the listen event in the script)
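For anyone reading this later, here is a minimal LSL sketch of that relay. It assumes the Library voice gestures have been customised so that their chat step says something like "/7373 voice_on" when you start speaking and "/7373 voice_off" when you stop; the channel number 7373 and the command words are arbitrary choices, and open_mouth()/close_mouth() are hypothetical placeholders for whatever actually moves the prim mouth.

integer CHANNEL = 7373; // must match the channel used in the gestures' chat steps

open_mouth()
{
    // Placeholder: start whatever animates the prim mouth here.
    llOwnerSay("mouth: open");
}

close_mouth()
{
    // Placeholder: return the mouth to its idle pose.
    llOwnerSay("mouth: closed");
}

default
{
    state_entry()
    {
        // Listen only to chat from the avatar wearing the attachment.
        llListen(CHANNEL, "", llGetOwner(), "");
    }

    listen(integer channel, string name, key id, string message)
    {
        if (message == "voice_on")
            open_mouth();
        else if (message == "voice_off")
            close_mouth();
    }
}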

