
Second Life Puppetry



5 hours ago, Bree Giffen said:

Any news from the meeting? I was looking for it on YouTube but didn't find anything. I would have gone, but it was scheduled in the middle of the day.

The only real news is that LL will not budge on using OpenXR by default.

This is expected behavior, though, as LL has many times tried to reinvent the wheel with major features instead of using industry standards.

So if we want to use OpenXR, we now have to deal with LL's own plugin as a bottleneck, since I'm sure there will eventually be TPV Policy enforcement for legal reasons.

My guess is that LL went with their own format so they can try to license it to other platforms in the future, which I don't see happening since OpenXR is already established and backed by dozens of major tech companies.

https://en.wikipedia.org/wiki/OpenXR

  • Like 2
  • Sad 1

Nobody made a video of the meeting?

At least three people were rigged for puppeteering. One Linden had a webcam face tracker, and you could see her facial expressions, but not body movements. One guy had a body tracker using a Kinect, but not facial expression tracking, so you could see him move his arms. One woman had tracking and a very realistic avatar, but mostly stood quietly.

Much technical discussion. A few social points:

  • Being rigged for voice and body tracking makes you much more alive in SL. Anyone with a full rig makes everyone around them look dead.
  • This raises the bar for performers of all kinds, from leaders of meetings to escorts. If you want to be looked at, you're going to have to have at least face and upper body tracking. And, of course, voice.

It's a whole new world, and it's going to be fun.

(Please don't botch the job, LL.)

 

  • Like 3
  • Haha 1

7 minutes ago, animats said:

A few social points:

  • Being rigged for voice and body tracking makes you much more alive in SL. Anyone with a full rig makes everyone around them look dead.
  • This raises the bar for performers of all kinds, from leaders of meetings to escorts. If you want to be looked at, you're going to have to have at least face and upper body tracking. And, of course, voice.

It's a whole new world, and it's going to be fun.

(Please don't botch the job, LL.)

Oooh, a world bifurcated into live-chatting augmentationists and dead-typing immersionists.

This isn't the end I was expecting.

Edited by Madelaine McMasters
  • Like 4
  • Sad 2

8 minutes ago, animats said:

I'm not worried. Second Life is the size of Los Angeles, but with about a hundredth as many people. There's more than enough room for people doing different things.

I'm not terribly worried either. Voice bifurcated the crowd long ago. I think face animation, if it uses the webcam built into computers, tablets, and phones, might be nice... if there's a quick way to switch it in and out.

Edited by Madelaine McMasters
  • Like 1

Slightly sidetracking here. After thinking about this for a few days, I had a bit of a light bulb go off in my head (an old incandescent one, so it felt warm and fuzzy). I use a heavily modified version of davedub's bvhacker that does what I need it to do to tweak my animations to perfection (I'm impossible to live with). I'm a multiple-monitor user, so I preview animations (using Firestorm, obviously) while tweaking them in bvhacker. Now, that light bulb told me: what if I dig deep into this puppetry thing and add new code to bvhacker so it streams its data to the viewer as I edit and play animations, so they preview on my avatar(s) in real time (see the sketch below)? How about that for a real use that would, in my opinion, greatly benefit creators?
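Something like this rough Python sketch is what I have in mind. Big caveat: the message layout, the local port, and the transport are my own placeholders, not the actual puppetry plugin protocol (which I haven't dug into yet), so read it as the shape of the idea, not working integration code.

import json
import socket
import time

VIEWER_ADDR = ("127.0.0.1", 23456)  # placeholder port, not a real plugin endpoint

def parse_bvh_motion(path):
    # Yield (frame_time, channel_values) for each frame of a BVH MOTION block.
    with open(path) as f:
        lines = f.read().splitlines()
    start = lines.index("MOTION")
    frame_time = float(lines[start + 2].split(":")[1])  # "Frame Time: 0.0333"
    for line in lines[start + 3:]:
        if line.strip():
            yield frame_time, [float(v) for v in line.split()]

def stream(path, joint_names):
    # Replay an edited BVH file, one joint-rotation message per frame.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for frame_time, channels in parse_bvh_motion(path):
        rots = channels[6:]  # skip the root's 6 position/rotation channels
        msg = {name: rots[i * 3:i * 3 + 3]  # 3 Euler angles per joint
               for i, name in enumerate(joint_names)}
        sock.sendto(json.dumps({"joint_state": msg}).encode(), VIEWER_ADDR)
        time.sleep(frame_time)  # pace the playback in real time

if __name__ == "__main__":
    # Joints after the root, in the order they appear in the BVH hierarchy.
    stream("edited.bvh", ["mTorso", "mChest", "mNeck"])

If the plugin actually speaks LLSD over the viewer's LEAP interface, only the send call would need swapping out.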

I think I might just go ahead and spend some time trying to make that work (LL accepted my now two-year-old JIRA, https://jira.secondlife.com/browse/BUG-229586, but hasn't done a thing with it).

 

Yes, I'm criticizing you yet again, LL; don't give me so many opportunities to do so. In the linked JIRA I asked for the animation preview window not to open smack in the middle of the screen, slammed on top of the avatar. You've done nothing with it, yet you spend (waste?) time on this puppetry thing hardly anyone will use. I think my idea could be an actual use for it, don't you agree?

Edited by CaithLynnSayes
  • Like 1

10 hours ago, animats said:

I'm not worried. Second Life is the size of Los Angeles, but with about a hundredth as many people. There's more than enough room for people doing different things.

I am worried: it just divides the already small user base into more sub-communities... Voice is an impairment for non-English *speakers* like me, and a huge turn-off for role-players who could not care less about the gender, age, or any physical characteristic of the RL person behind the avatar, but who on the contrary want their imagination to stay in control.

This said, I am not too worried either, since I already predicted (see point 2 of this post) that the "avatar expressiveness" stuff (which is only part of what can be done with the puppetry feature) will end up as one of those gadget features of SL that very few people use (anyone here using voice morphing? Or pathfinding?)...

However, I can see more opportunities and applications for the puppetry feature, as it is coded, than just real-time capture via a webcam... There are also potential and interesting "side benefits", such as reusing the IK code with/for current SL animations; see the minimal illustration below... So the work done is not totally useless, even if the initial goal (avatar expressiveness based on RL user facial expressions) becomes more and more evanescent as the initial hype fades away.
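For readers unfamiliar with IK: inverse kinematics solves joint angles from a desired end-effector position, instead of playing stored angles back verbatim the way BVH animations do. A minimal, purely illustrative two-bone solver in Python (nothing to do with the actual viewer code):

import math

def two_bone_ik(tx, ty, len1, len2):
    # Solve shoulder and elbow angles so a 2-joint arm reaches (tx, ty).
    d = min(math.hypot(tx, ty), len1 + len2)  # clamp to reachable distance
    # Law of cosines gives the elbow bend (0 = straight arm).
    cos_e = (d * d - len1 * len1 - len2 * len2) / (2 * len1 * len2)
    elbow = math.acos(max(-1.0, min(1.0, cos_e)))
    # Shoulder: aim at the target, corrected for the bent elbow.
    shoulder = math.atan2(ty, tx) - math.atan2(
        len2 * math.sin(elbow), len1 + len2 * math.cos(elbow))
    return shoulder, elbow

This is exactly the kind of building block that could be repurposed to, say, keep a hand pinned to a surface while an existing animation plays.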

Edited by Henri Beauchamp
  • Like 2

Today I saw a news article featuring VR/metaverse projects from S. Korea, India, Thailand, China, Taiwan, and Vietnam, all either directly selling merchandise via virtual influencers or doing so via play-to-earn "games". There's awareness that there's money out there (an industry worth an estimated $1 trillion, so they say), but nothing so far has really crossed into the mainstream. It all seems to be money thrown at potential and hype, and I seriously doubt any of the investors have the first clue about the technology. That said, if LL can cash in on some of this sudden surge of interest, good for them. At least SL is a product that actually exists.


4 hours ago, Henri Beauchamp said:

I am worried: it just divides the already small user base into more sub-communities... Voice is an impairment for non-English *speakers* like me, and a huge turn-off for role-players who could not care less about the gender, age, or any physical characteristic of the RL person behind the avatar, but who on the contrary want their imagination to stay in control.

This said, I am not too worried either, since I already predicted (see point 2 of this post) that the "avatar expressiveness" stuff (which is only part of what can be done with the puppetry feature) will end up as one of those gadget features of SL that very few people use (anyone here using voice morphing? Or pathfinding?)...

However, I can see more opportunities and applications for the puppetry feature, as it is coded, than just real-time capture via a webcam... There are also potential and interesting "side benefits", such as reusing the IK code with/for current SL animations... So the work done is not totally useless, even if the initial goal (avatar expressiveness based on RL user facial expressions) becomes more and more evanescent as the initial hype fades away.

It would be nice if IK could fix self-collision problems in current animations, and wonderful if it could fix collisions between avatars.

For keyboardists like me, upper-body mo-cap is a non-starter. I can see some limited use for facial expression capture. I might use it sparingly, but it would have to be something I could toggle on and off with a keystroke command. I might also want an "interposer" of some kind that, rather than animating my SL face directly from my RL facial expressions, passed them through some library-based filtering of expressions intended to mimic my real face (a toy sketch below). People won't want facial tics or responses to RL distractions animating their gorgeous and graceful SL faces. Imagine how off-putting eye tracking would be if you emote "/me gazes longingly into your eyes" while your eyes are actually darting all about the SL viewer UI.
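To make the "interposer" idea concrete, here's a toy Python sketch; every name and number in it is invented for illustration. The point is just that raw tracker output gets smoothed and snapped to a curated library before it ever touches your SL face.

# Curated expressions the avatar is allowed to show (illustrative weights).
LIBRARY = {
    "neutral":    {"smile": 0.0, "brow_up": 0.0},
    "soft_smile": {"smile": 0.6, "brow_up": 0.2},
    "surprised":  {"smile": 0.1, "brow_up": 0.9},
}

def nearest_expression(weights, library=LIBRARY):
    # Pick the library expression closest to the smoothed tracker weights.
    def dist(expr):
        return sum((weights.get(k, 0.0) - v) ** 2 for k, v in expr.items())
    return min(library, key=lambda name: dist(library[name]))

class Interposer:
    def __init__(self, alpha=0.2):
        self.alpha = alpha  # lower alpha = steadier face, slower to react
        self.state = {}

    def update(self, raw):
        # Exponential smoothing suppresses one-frame tics and eye darts.
        for k, v in raw.items():
            self.state[k] = (1 - self.alpha) * self.state.get(k, 0.0) + self.alpha * v
        return nearest_expression(self.state)

The keystroke toggle would then simply switch between feeding the tracker straight through and routing it via update().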

Edited by Madelaine McMasters
  • Like 1

16 hours ago, animats said:

Nobody made a video of the meeting?

At least three people were rigged for puppeteering. One Linden had a webcam face tracker, and you could see her facial expressions, but not body movements. One guy had a body tracker using a Kinect, but not facial expression tracking, so you could see him move his arms. One woman had tracking and a very realistic avatar, but mostly stood quietly.

Much technical discussion. A few social points:

  • Being rigged for voice and body tracking makes you much more alive in SL. Anyone with a full rig makes everyone around them look dead.
  • This raises the bar for performers of all kinds, from leaders of meetings to escorts. If you want to be looked at, you're going to have to have at least face and upper body tracking. And, of course, voice.

It's a whole new world, and it's going to be fun.

(Please don't botch the job, LL.)

 

So we'll get to see all kinds of arm movements and things?

I just can't get my mind past how many things we won't want to see... like all the scratching and other things that are gonna go on...

[gif: gorilla scratching its backside]

  • Haha 3

10 hours ago, CaithLynnSayes said:

what if I dig deep into this puppetry thing and add new code to bvhacker so it streams its data to the viewer as I edit and play animations, so they preview on my avatar(s) in real time

I was actually thinking about something similar, i.e. using Blender or Maya to pose the SL armature and have that information mirrored in real time on your avatar in SL via the puppetry plugin; a rough Blender-side sketch is below. That could potentially be useful for SL photographers wanting to fine-tune poses when taking pictures.
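On the Blender side, it might be as simple as the sketch below. The bpy handler is real Blender API, but the transport and message layout are placeholders I made up; the actual puppetry plugin's protocol would have to be matched.

import json
import socket

import bpy  # Blender's built-in Python API

SOCK = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
VIEWER_ADDR = ("127.0.0.1", 23456)  # hypothetical local listener

def send_pose(scene, depsgraph=None):
    # Ship the active armature's bone rotations after every pose edit.
    obj = bpy.context.object
    if obj is None or obj.type != 'ARMATURE':
        return
    msg = {}
    for bone in obj.pose.bones:
        q = bone.rotation_quaternion  # local rotation as (w, x, y, z)
        msg[bone.name] = [q.w, q.x, q.y, q.z]
    SOCK.sendto(json.dumps({"joint_state": msg}).encode(), VIEWER_ADDR)

# Fires after every dependency-graph update, i.e. whenever the pose changes.
bpy.app.handlers.depsgraph_update_post.append(send_pose)

Naming the Blender bones after the SL skeleton (mPelvis, mTorso, and so on) would let the receiving side map them one-to-one.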

  • Like 4

2 hours ago, Ceka Cianci said:

So we'll get to see all kinds of arm movements and things?

I just can't get my mind past how many things we won't want to see... like all the scratching and other things that are gonna go on...

When this gets going, we're going to need a second window in the viewer that shows your own avatar from the front, like Zoom calls where you have a small window to see yourself.

  • Like 1
  • Haha 2

48 minutes ago, Fluffy Sharkfin said:

I was actually thinking about something similar, i.e. using Blender or Maya to pose the SL armature and have that information mirrored in real time on your avatar in SL via the puppetry plugin. That could potentially be useful for SL photographers wanting to fine-tune poses when taking pictures.

I can see this kind of use for puppetry, for sure, but doesn't this seem rather more complicated than necessary? Learning Blender or Maya is not a trivial thing. How much easier would it be to have an in-viewer poser, as BD (Black Dragon) does?

  • Like 3

12 minutes ago, Scylla Rhiadra said:

I can see this kind of use for puppetry, for sure, but doesn't this seem rather more complicated than necessary? Learning Blender or Maya is not a trivial thing. How much easier would it be to have an in-viewer poser, as BD (Black Dragon) does?

I agree it's an undeniably cumbersome solution; having someone write a simple, lightweight app that lets you pose a figure and relays that data to the puppetry plugin would definitely be preferable.

I was just thinking in terms of the simplest solution to implement in order to test how well the theory works in practice, and since the puppetry plugin uses Python, which is supported in both Maya and Blender, that seemed like a logical first step.

I must admit I've yet to try BD, so I haven't had a chance to try the in-viewer posing tool, but I assume that would be the easiest solution of all (if LL were inclined to implement it).

  • Like 2

8 minutes ago, Fluffy Sharkfin said:

I must admit I've yet to try BD, so I haven't had a chance to try the in-viewer posing tool, but I assume that would be the easiest solution of all (if LL were inclined to implement it).

I think that you're absolutely right that there may be all sorts of interesting and worthwhile spinoffs of this, if it is implemented well, and in a way that makes it extensible.

As for the BD poser -- it is EXCELLENT. In fact, it's almost too powerful: it provides so much control over your avatar that it's not hard to "break" your body with it. I strongly recommend playing with it.

  • Like 2

5 minutes ago, Scylla Rhiadra said:

I think that you're absolutely right that there may be all sorts of interesting and worthwhile spinoffs of this, if it is implemented well, and in a way that makes it extensible.

As for the BD poser -- it is EXCELLENT. In fact, it's almost too powerful: it provides so much control over your avatar that it's not hard to "break" your body with it. I strongly recommend playing with it.

It will be interesting to see what ingenious ideas people come up with (one of my favourite things about LL giving us shiny new features is seeing all the innovative uses SL residents manage to find for them).  Perhaps one of these whizzkid AI coders will take some time off from putting artists out of a job and write an AI that can analyze a music stream and automatically generate dance animations that sync up to whatever song is playing? 😅

I keep meaning to try BD, but lately my primary concern when selecting a viewer has been performance, since my PC is sorely in need of an upgrade. As it is, I can't really spend time in-world while doing anything in 3D apps, although that's unsurprising given that the software I use generates millions of polygons; I suspect I'd need some pretty expensive hardware to run SL smoothly at the same time.

  • Like 2

4 hours ago, Fluffy Sharkfin said:

It will be interesting to see what ingenious ideas people come up with (one of my favourite things about LL giving us shiny new features is seeing all the innovative uses SL residents manage to find for them).  Perhaps one of these whizzkid AI coders will take some time off from putting artists out of a job and write an AI that can analyze a music stream and automatically generate dance animations that sync up to whatever song is playing? 😅

I keep meaning to try BD, but lately my primary concern when selecting a viewer has been performance, since my PC is sorely in need of an upgrade. As it is, I can't really spend time in-world while doing anything in 3D apps, although that's unsurprising given that the software I use generates millions of polygons; I suspect I'd need some pretty expensive hardware to run SL smoothly at the same time.

You'll love it; it has sliders for everything, even for the finger joints.

  • Like 2

19 hours ago, Ceka Cianci said:

You'll love it; it has sliders for everything, even for the finger joints.

That's got to be a lot of sliders, given that there are around 130 bones in the SL avatar skeleton, most with multiple rotational axes; at three axes apiece, that's on the order of 390 sliders.

Even assuming the poser only supports the standard bipedal avatar, rather than also supporting things like hind legs, wings, etc., and doesn't provide posing sliders for the facial bones, that's still a lot of sliders!

I hardly ever take photos myself, but I'm still curious to see it in action.


I haven't read the whole thread, so pardon me if some of my fellow quippers have already shown up with the (on-point) snark, but based on the title of the OP, I'm really surprised a coterie of suspiciously similar avatars haven't weighed in here to white-knight the poor, downtrodden puppets...

 

 

 

 

 

maybe that's just how my brain works... carry on

ETA: I think "coterie" and "similar avatars" is redundant... so shoot me.

Edited by Seicher Rae
  • Like 1

  • 1 month later...

At the last puppetry meeting, nobody seemed to quite know what to use it for. Here's a place to get some ideas.

Upcoming remote VR metaverse event: the Metaverse Creator Summit, November 16th. It's for creators of metaverse avatars and experiences, with all the usual suspects: Matthew Ball, the venture capitalist with the metaverse book; Phia, from The Virtual Reality Show; somebody from Unity. Nobody from Linden Lab, though.

There's a $39 charge, but the coupon code "PHIA" gets you free registration.

(LL needs to be more visible in the game and metaverse communities. Even though Meta and the entire crypto sector have tanked, there's good stuff going on. In the last two months, VRChat concurrent user counts passed SL's for the first time. I'm encouraged that LL is attempting to make puppetry and maybe VR work, but they're late getting into that space and are now playing catch-up.)

 

  • Haha 1
