
Animesh seat scripting


VirtualKitten


Recommended Posts

On 6/23/2021 at 6:09 AM, Quistessa said:

A naive question, but avatars can wear animesh attachments, no? Why not wear the dragon instead of sitting on it?

Simple reason: the whole thing will have to get synced animations anyway; the only change is what would trigger them: animesh driving avatar animations, or avatar states determining what animations the animesh should play?

Also, one of the key points is that the fire emitter doesn't follow the animesh animation, so there's no fire breathing from the mouth, just from wherever the emitter was originally attached.
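
For what it's worth, here is a minimal sketch of the "avatar state drives both" variant, assuming the script lives in the animesh dragon and that "dragon_idle_ridden" and "rider_sit_dragon" are hypothetical animation names present in its inventory. It only makes the two animations start together when someone sits; it does nothing to keep them frame-synced afterwards.

// Minimal sketch: when an avatar sits, start a matching pair of animations
// on the animesh object and on the rider, so they at least start together.
// "dragon_idle_ridden" and "rider_sit_dragon" are hypothetical animation names.
default
{
    state_entry()
    {
        llSitTarget(<0.0, 0.0, 1.0>, ZERO_ROTATION); // enable sitting on this object
    }

    changed(integer change)
    {
        if (change & CHANGED_LINK)
        {
            key rider = llAvatarOnSitTarget();
            if (rider != NULL_KEY)
            {
                // someone sat down: ask for permission to animate them
                llRequestPermissions(rider, PERMISSION_TRIGGER_ANIMATION);
            }
            else
            {
                // the rider stood up: stop the animesh animation
                llStopObjectAnimation("dragon_idle_ridden");
            }
        }
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRIGGER_ANIMATION)
        {
            llStartAnimation("rider_sit_dragon");         // animate the rider
            llStartObjectAnimation("dragon_idle_ridden"); // animate the animesh itself
        }
    }
}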


  • 2 weeks later...

I still don't know why Linden isn't writing code any more to address the animesh problem. I had hoped they would integrate links into its model, but it doesn't seem to want to do this for some reason. I had hoped they would fix this, but there's been no proper answer on my ticket. Please don't just post links either; some of us aren't signed up with Jira, so we can't see whether that was an answer. Hugs.

@OptimoMaximo, yes, we are all aware of the particle emitter and the problems with placing it requiring a link. Why can they not write an offset for the emitter to place it where you want in the animesh? Additionally, why can't the base seat of the animation move with the animesh? Yes, I can move it every time by script, but that places a ridiculous overhead on the viewer.

I am hopeful they will one day DX


2 hours ago, VirtualKitten said:

Why can they not write an offset for the emitter to place it where you want in the animesh?

Because there's no code structure to let the server know where the joints are at any given moment, and adding such a thing to the current implementation, with the viewer sending that data, would cause the same huge overhead you foresaw from adding the feature yourself via script, just a different kind of overhead.
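
To make that script-side overhead concrete, here is a rough sketch of what the workaround looks like, with an assumed link number and made-up offsets; nothing in it can read the real joint position, so the emitter's path is just guessed and replayed on a fast timer.

// Rough sketch of the scripted workaround: keep repositioning a linked
// "emitter" prim to approximate where the animesh's mouth should be.
// EMITTER_LINK and the offsets below are made-up values for illustration;
// a script cannot query the real joint position, so the path is a guess.
integer EMITTER_LINK = 2;   // assumed link number of the emitter prim
list gOffsets = [           // assumed local offsets roughly tracing the mouth
    <0.8, 0.0, 0.5>,
    <0.8, 0.0, 0.6>,
    <0.8, 0.0, 0.7>
];
integer gStep;

default
{
    state_entry()
    {
        llSetTimerEvent(0.1);   // ~10 updates a second, and it still lags the animation
    }

    timer()
    {
        vector offset = llList2Vector(gOffsets, gStep);
        llSetLinkPrimitiveParamsFast(EMITTER_LINK, [PRIM_POS_LOCAL, offset]);
        gStep = (gStep + 1) % llGetListLength(gOffsets);
    }
}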

2 hours ago, VirtualKitten said:

Additionally, why can't the base seat of the animation move with the animesh?

Same reason, plus there is a deformer in place if you're talking about a saddle, which means skin data passed over to a non-skinned object in real time... Do you realize how unfeasible all this is?


@OptimoMaximo, this seems ridiculous. In Blender we have a perfectly good armature that can take a link, and we can add individual bone vertex groups that move with the armature. Why is this not mirrored in Second Life?

If it's so unfeasible, how is it that it works in Blender?

You say there is no structure to let the server know where the joints are; however, Blender is quite adept at this. It seems untrue, then, that these structures do not exist in some capacity to allow functional movement of a single linked object. I understand that your skinned vertex data is moving in real time. But as these vertex groups are moving, why can a linked vertex group not move in the same way as a non-linked group? You would seem to already have the input and the forces acting on the vertices in this single link group, since you are doing this calculation on the single link's vertices anyway! You must also record a group of vertices and a rotation and position deform per group. How is it unfeasible that this vertex skin data cannot be associated with the same vertices in another link? Surely, in your Second Life framework build:

i) You need to ascertain which vertices are part of this object (you must have something to do this)?

ii) Use a deform object class to deform these vertices (which you must have)?

Quite frankly, my script moving the seat and keeping it in sync with the animesh's movement is working now through script placement, which provides an overhead in itself; however, a fireball is simply even more overhead, caused by the system not providing a workable way to act on this link with the skin data movement. The answer for the fire emitter would seem very simple: vary the particle emitter position by specifying a vector position for its new centre. Why can this not be written into the particle code as new rules for http://wiki.secondlife.com/wiki/LlParticleSystem, adding an emitter position by way of PSYS_SRC_EMITTER_POS and PSYS_SRC_EMITTER_ROT on the object the script is in? This seems a very simple ask, and it's ridiculous that it has not already been implemented when changes have already been made to add new rules.
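
For reference, the closest thing the current API offers is llLinkParticleSystem, which lets a script in the root prim start particles on a specific child link; the PSYS_SRC_EMITTER_POS / PSYS_SRC_EMITTER_ROT rules proposed above do not exist today, so the emission point is simply wherever that link prim sits. A minimal sketch, assuming link 2 is the emitter prim:

// Minimal sketch: start a simple fire-like particle effect on a specific
// linked prim. There is no particle rule today to offset the emitter itself;
// the emission point is simply wherever this link prim is located.
integer EMITTER_LINK = 2;   // assumed link number of the emitter prim

default
{
    state_entry()
    {
        llLinkParticleSystem(EMITTER_LINK, [
            PSYS_PART_FLAGS, PSYS_PART_EMISSIVE_MASK | PSYS_PART_INTERP_COLOR_MASK,
            PSYS_SRC_PATTERN, PSYS_SRC_PATTERN_EXPLODE,
            PSYS_PART_START_COLOR, <1.0, 0.5, 0.0>,   // orange "fire"
            PSYS_PART_START_SCALE, <0.2, 0.2, 0.0>,
            PSYS_PART_MAX_AGE, 1.0,
            PSYS_SRC_BURST_RATE, 0.05,
            PSYS_SRC_BURST_PART_COUNT, 5
        ]);
    }
}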

 

Hugs Denise 


14 minutes ago, VirtualKitten said:

If it's so unfeasible, how is it that it works in Blender?

You keep making comparisons between two different pieces of software, with different architectures and data structures, not to mention different purposes (Blender is not real-time; SL is real-time and split between viewer and server). Blender is not made for SL, and SL can't have Blender's features mirrored, as there is no connection whatsoever between them.

On the other hand, I'm not saying it can't be done in any absolute sense. It can be done, but the current implementation makes it unfeasible. First, as already said, all avatar animations are viewer-side, and the server has no clue what a joint is to begin with. What you're asking for, basically, is to do the following for every frame (a rough back-of-the-envelope figure follows the list):

1. Take all joint positions and rotations and translate them to world coordinates.

2. Make a list of joints with all the objects attached via skinning (skin queries are computationally heavy even in Blender or Maya) and calculate their bounding boxes to get a world-coordinate position of the geometry (not the object itself; remember that geometry and transform node are NOT the same thing).

3. Send those lists over to the server (network performance dependent)

4. Get the server to retrieve all objects mentioned in those lists, calculate each bounding box center, and move those objects to their locations.

5. Rinse and repeat for every frame at whatever frame rate you have: at 30 fps, do the above 30 times per second.

6. Do this for every single skinned mesh in the region.
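
To put a rough, entirely made-up number on it: suppose a region holds 20 animesh objects with 100 animated joints each, and each joint update carries a position (3 floats) plus a rotation (4 floats), about 28 bytes. At 30 fps that is 20 × 100 × 28 × 30 ≈ 1.7 MB of joint data per second that some viewer would have to push to the server, before the server even starts looking up objects and moving them. The exact figures don't matter; the multiplication is the point.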

Now, for animesh objects, where do you take the data in point 1 from? It's not your avatar, so what is the standard for deciding which viewer has to be used for such a task? Who's eligible to take it over, and by what rule would a user have to accept being the one designated to use their viewer (and their PC resources) to provide this service to the servers, so that others can see the animesh attachments move?

That is the current state of the animesh feature. There have been requests to add attachment points to animesh like those on user avatars, but so far those requests have kept hanging there, most likely for the reasons listed above.

Updating or upgrading the current architecture is not as simple as you make it sound. Things break when new object classes get added, and might simply not work at all.


The underlying problem is that Second Life doesn't have a real hierarchy. In just about everything else in 3D editing and animation, child objects can have child objects of their own. Second Life doesn't have that. So sitting is a special purpose hack, attachments are a special purpose hack, and skeletons are a special purpose hack. Just getting the wheels on a vehicle to both steer and rotate requires workarounds.

Philip Rosedale has said this was his biggest mistake when designing Second Life.

(Interestingly, the server-to-viewer message protocol supports a full hierarchy, although the code in the LL and Firestorm viewers does not.)


16 hours ago, VirtualKitten said:

You say there is no structure to let the server know where the joints are; however, Blender is quite adept at this.

Key point: Second Life does not have a master idea of "what pose your bones are really in", because animations are not synchronized between different residents' screens. Also, as far as the server is concerned, animations are phantom and don't actually move anything.

If I start dancing Tango37, all the server knows is that my avatar is executing Tango37. It doesn't have an authoritative, "true" idea of what frame of that animation I'm on or where any of my bones have moved to. Instead, every single resident who can see me (including me myself) generates the animation's effects individually within their own viewer. Every resident's viewer does this separately, just for them, and none of them are the "true" version. When someone new teleports in, the server tells their viewer I'm dancing Tango37 and that's all. It has no other information to give. So, that newcomer's viewer has no choice but to display my avatar starting that animation at frame 1 on their screen, while everyone else sees me as farther along.
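
You can see that limitation from the scripting side too. Here is a small sketch using llGetAnimationList, which is roughly all the sim-side information there is: the UUIDs of the animations an avatar is playing, with no call that returns the current frame or any bone transform.

// Sketch: all a script (and, effectively, the server) knows about an avatar's
// animation state is *which* animations are running, as asset UUIDs.
// There is no function to ask what frame an animation is on or where a bone is.
default
{
    touch_start(integer num)
    {
        key toucher = llDetectedKey(0);
        list anims = llGetAnimationList(toucher);   // UUIDs of currently playing animations
        llOwnerSay(llKey2Name(toucher) + " is playing "
            + (string)llGetListLength(anims) + " animation(s): "
            + llList2CSV(anims));
    }
}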

Complicating that even more is the fact that avatars can be impostered (rendered at a lower LOD and/or with sporadic animations) or jellydolled if their rendering complexity is too high. That also ruins the idea of forcibly synchronized animations on everyone's screen. Plus some viewers support making all animations run faster or slower, or overriding them with custom poses.

Second Life was originally coded this way because it was a much more feasible way of handling data loads on the slow computers and internet of 2003. The more content residents created over the years, the more disruptive it has become to rewrite everything so the server does have a master idea of my "true" animated state and constantly feeds it to everyone in the region to keep things synchronized. And that's the change that'd have to be done to give you the feature you want. It's a major restructuring and it would cause immense side effects, so while it's possible...


Let me start by saying when it comes to rigging, my brain goes even more sideways than normal. 

My attempts to alter the default SL rigged body always turn into something bizarre when I switch on animesh, so I've yet to tackle all you need to know about rigging.

That said...

I recently got an animesh figure and was told you can add rigged clothing to it like you do to your avatar. I haven't tried this yet, but is this possible?

So would it be possible to put a rigged saddle on an animesh cow, for example, where the cow has an animation script and animations in it, your avatar wears an object that animates it, both the cow and the saddle attach to the avatar, and a script syncs all the animations (like couples' dance scripts do), so you could ride the animesh cow/saddle combo? And if you wanted the cow to spit sparkles, could you not also use a rigged attachment with a particle sparkle script that is rigged to the cow's mouth rigging?

 


2 hours ago, Pixels Sideways said:

I recently got an animesh figure and was told you can add rigged clothing to it like you do to your avatar. I haven't tried this yet, but is this possible?

If you link /rigged/ objects together as animesh, they should all magically follow the same skeleton. The LI goes through the roof, though, unless they have some good optimization.

2 hours ago, Pixels Sideways said:

could you not also use a rigged attachment with a particle sparkle script that is rigged to the cow's mouth rigging?

My guess is that particles and rigged attachments in general do not play nice with each other.


  • 2 weeks later...

No, the only way I have found is to add the mesh to the animesh so it becomes one object in Blender. I don't know why the powers that be don't add a particle emitter location and rotation in coordinates; this would be so easy. @OptimoMaximo, if you can't do it, post the code and let us fix it for you?


  • 1 month later...

If you're dubious about sitting on an animesh object, you might check out the Folkdance venue. Animesh figures dance a sequence of figures, not necessarily the same but synced with each other. An avatar can sit (right-click sit) on one animesh figure to replace it and keep dancing the same sequence. Standing restores the animesh figure, in place and dancing.

Not the best animations, but the scripting shows what's possible.

The figures are dancing much of the time, but a community gathers there every Thursday 4:30-6:30 SLT.

