
Hi, is there any news on these updates?


VirtualKitten

You are about to reply to a thread that has been inactive for 800 days.

Please take a moment to consider if this thread is worth bumping.

Recommended Posts

1) (Preferred) Is there an update to have a linked object with the same armature, sharing bones that follow the armature's animation, so that saddles and other objects (like fire or smoke particle emitters) can be placed on the armature and keep their position while an animation or keyframed move plays?

 

2) Is there an update to the particle system to let an emitter be tied to an armature part so that it moves with the animation, like breath while a face is moving?

Edited by VirtualKitten

  • 3 weeks later...

Why is that, @arton Rotaru? We really need this, as the animations look silly when people sitting on animals bounce around a bit. An item linked with bones in it would, if animated by those bones, move with the avatar's root position via the vector translation map.

Likewise, it's nearly impossible to have fire come from a dragon's mouth, as when the animation moves the head there is no way to move the particles on the fly. Why can't a ball be made in Blender that shares bones with the mouth and is moved by the mouth when the animation moves those bones (or fake bones, as in LSL)? Since it's a translation vector map, this would keep the flames in the mouth.

The current models look ungainly and unrealistic in SL, unlike what Blender can achieve. No one can suggest an alternative amount of scripting that would put this right. Linden Lab added a new lighting model, which was fantastic, so why weren't these basics covered first? Surely that would benefit the whole community? Can you please set up a poll?

Hugs Denise 


3 hours ago, VirtualKitten said:

Why can't a ball be made in Blender that shares bones with the mouth and is moved by the mouth when the animation moves those bones (or fake bones, as in LSL)? Since it's a translation vector map, this would keep the flames in the mouth.

It's not because the translation math is hard. (It's easy.) It's because LL would have to rework the entire way SL communicates animations between the central server and every single person logged in. It can be done, but it would be a lot of work -- possibly too much work for the amount of benefit it would provide.

The way SL was designed, the server doesn't know what frame any animation is on. It only knows which animations are playing. For example, the server would know that an animesh dragon is playing animation "dragon_hop_2" but it doesn't track the frames. Instead, every single resident who can see that dragon has their own private idea, in their client, of what animation frame it's on, but the clients don't coordinate with each other or with the server, and none of them is "the right one". There is no single, correct, authoritative answer to "what frame of dragon_hop_2 is the dragon playing right this instant?", which means the server can't tell how it should move avatars attached to the dragon's animation bones.
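The desynchronization can be illustrated with a small sketch (the frame rate, frame count, and timings below are invented for illustration, not real SL figures): each viewer computes the current frame from its own local start time, so no two viewers need to agree, and the server never computes one at all.

```python
FPS = 30           # assumed playback rate of "dragon_hop_2"
FRAME_COUNT = 100  # assumed length of the looping animation in frames

def current_frame(local_start, now):
    """Frame this particular viewer believes the looping animation is on."""
    return int((now - local_start) * FPS) % FRAME_COUNT

# The server told both viewers to play "dragon_hop_2", but each began
# playback at a slightly different local moment (network latency, load).
now = 100.0
frame_viewer_a = current_frame(0.00, now)  # started playback right away
frame_viewer_b = current_frame(0.45, now)  # started 450 ms later

print(frame_viewer_a, frame_viewer_b)  # two different, equally "valid" frames
```

Neither viewer is wrong; there is simply no shared clock for the animation, which is exactly why the server cannot answer "what frame is it on?".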

LL can't add your feature without redesigning this. There are multiple ways to do it, but they're all hard and could all screw up lots of other things in unforeseen ways.


If the assumption of a "needed feature" starts from "because you can do it in Blender", then why not get particles to collide with objects, or give meshes displacement mapping and multiple UV sets, or allow custom skeletons, animations with infinite priorities, object animation, and so on?


  • 2 weeks later...

I don’t think the issue is related to the way SL plays animations on the client vs. the server; after all, particle-emitting glow sticks that move with an avatar's hands during an animation are a thing, and that sounds very much like what is being asked for.

I'm not sure I understand the context here, but it sounds like the animals (i.e. the dragon) must be animesh, because if it were a mesh avatar you could just attach the particle emitter to the mouth.

So maybe the real request is to allow attachments on animesh objects?
 


On 3/15/2022 at 2:51 PM, FridayAfternoon said:

I don’t think the issue is related to the way SL plays animations on the client vs. the server; after all, particle-emitting glow sticks that move with an avatar's hands during an animation are a thing, and that sounds very much like what is being asked for.

I'm not sure I understand the context here, but it sounds like the animals (i.e. the dragon) must be animesh, because if it were a mesh avatar you could just attach the particle emitter to the mouth.

So maybe the real request is to allow attachments on animesh objects?
 

 

10 hours ago, VirtualKitten said:

FridayAfternoon, yes, you are spot on: it is related to animesh. The math may be hard as a translation matrix, but it must already exist in Second Life to move other items. I can't see how this is impossible if it's already being done?

Both contexts, Friday's and Virtual's, are things being played in the viewer, while the server is completely unaware of them. That is where the problem sits. The feature you're talking about would require such visual, viewer-side updates to be sent to the server and streamed to every viewer in range.

To put this in context: when griefers used to throw particle bombs to lag everyone down, it was sufficient to cap the max particle count in the viewer and everything went back to running smoothly, with the server totally unaware of, and thus unaffected by, such lag bombs, so the griefers could be kicked and the bombs returned. This just illustrates the lack of communication between viewer and server when it comes to visual effects like particles and animations, which were designed to run within the viewer. Now imagine such a potentially heavy calculation being streamed and updated continuously between viewers and the server, for as many agents as are in range.
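A rough back-of-envelope sketch of that streaming cost (every number below is an assumption for illustration, not a measured SL figure):

```python
# Hypothetical figures: what the server would have to push if one animated
# object's bone transforms were streamed per frame to every viewer in range.
bones = 30                # bones an attachment system might need to track
bytes_per_bone = 12 + 16  # vec3 position (3 floats) + quaternion (4 floats)
fps = 30                  # per-frame updates
viewers_in_range = 40     # a moderately busy region

bytes_per_second = bones * bytes_per_bone * fps * viewers_in_range
print(bytes_per_second)   # ~1 MB/s of extra traffic for a single object
```

Even with generous compression, multiplying that by every animated object in a region shows why "the viewer does it for free" does not translate to "the server can do it for free".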


Still, allowing attachments to animesh objects would provide the desired functionality. You just add a particle emitter to the mouth of the animesh dragon and it moves (in everyone’s viewer) along with the animation.

The added benefit is that you could dress up your animesh character with clothes and so forth (assuming, of course, the mesh is the same). That would make it way easier to create NPCs for RP sims and the like.

 

 


On 3/24/2022 at 7:08 PM, FridayAfternoon said:

Still, allowing attachments to animesh objects would provide the desired functionality. You just add a particle emitter to the mouth of the animesh dragon and it moves (in everyone’s viewer) along with the animation.

The added benefit is that you could dress up your animesh character with clothes and so forth (assuming, of course, the mesh is the same). That would make it way easier to create NPCs for RP sims and the like.

 

 

Attachments are just joints, so you can add rigid clothing components to animesh objects in the form of rigged items, and these can be animated. Still, the traditional concept of attachment points that translate objects is not supported because, again, it is managed viewer-side. That is the one exception that gets streamed to other viewers and, again, the server is not aware of that stuff. It's a sort of built-in rudimentary skinning to a joint, but it's not the type of feature you're thinking it is. Animesh objects do not have an associated agent to provide the necessary structure, as of this writing.


Thank you to everyone who is contributing to this debate.

I can understand Optimo's points, and maybe griefers could misuse this to inflict more lag. But the RP benefits would surely outweigh that problem. Could a creative solution not be found?

Moreover, an avatar sitting on an animesh doesn't even sit correctly at present, fidgeting if it's on an animated part of the mesh. It needs feedback from the primary animation running on the animesh to change seating position, which cannot currently be achieved, as the location of the animated parts is not available to the animesh. A linked object in the dragon's mouth behaves much like the avatar's sit: it will not remain in the mouth when the mouth is animated. I tried rigging my linked objects to follow the object's bone structure, but it had no effect on the objects' position in the viewer or on the server.

In regard to the agent solution you mentioned, Optimo: would it solve this if every animesh was an 'agent', or if there were an 'agent' checkbox on the features page in the viewer, with the required structures to hold attachments in place, the same as an agent? Could this checkbox, or the physical character traits, then tie the animesh character to its creator or current owner, so that if it were used for griefing it could be identified?

Edited by VirtualKitten

On 3/28/2022 at 8:43 AM, VirtualKitten said:

In regard to the agent solution you mentioned, Optimo: would it solve this if every animesh was an 'agent', or if there were an 'agent' checkbox on the features page in the viewer, with the required structures to hold attachments in place, the same as an agent?

An agent is basically the structure, with the server-side code, that makes up a viewer session: it gives you an inventory, a shape editor window, an avatar, its collision box, the name tag above the avatar, the voice dot and the related spatial positioning for streaming voice, etc., including the attachment points and the collision volume bones with their connections to the shape sliders. This isn't currently possible for animesh for many reasons, the same reasons behind the lack of support for shape sliders and attachment points: these are defined in a separate file that gets applied at login, while the base joints are found in another. The agent assembles the two definitions at runtime. The second file defines the connections to the sliders, their ranges, which attributes of a joint are affected, and the hierarchy placement of such "add-on" joints. All of this is contained in your viewer, the thing that creates the agent when you log in.
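Conceptually, the assembly step could be sketched like this (the joint and attachment names below are illustrative examples, and the real definitions live in files the viewer ships with, not in code like this):

```python
# Illustrative sketch of the two-definition assembly described above.
# Base joints come from one definition; attachment points, with their
# shape-slider hooks, come from another. All names here are examples only.
base_joints = {
    "mNeck":  {"parent": "mChest"},
    "mHead":  {"parent": "mNeck"},
    "mSkull": {"parent": "mHead"},
}

addon_joints = {
    "ATTACH_MOUTH": {"parent": "mHead", "sliders": ["head_size"]},
}

def assemble_skeleton(base, addons):
    """Merge add-on joints into the base hierarchy, as an agent does
    conceptually at login. An animesh object never runs this step, which
    is why it ends up with bones but no attachment points or sliders."""
    skeleton = dict(base)
    skeleton.update(addons)
    return skeleton

skeleton = assemble_skeleton(base_joints, addon_joints)
print(sorted(skeleton))
```

The point of the sketch is only that the add-on joints exist solely inside an agent's assembled skeleton; nothing in an animesh object ever performs the merge.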

So the answer to your question would be yes, if each animesh got a scripted viewer to control it. But that's a bot, and that would defeat the whole purpose of animesh. There needs to be another type of system, but apparently that isn't on LL's current list of interests.


I think this is all becoming clearer. Why doesn't Second Life see itself as roleplay?

It may not have this as a current concern, but it wrote new scripted bots to deal with automatic messaging, group invites and shop attendance for commerce, and it wrote a new lighting model for everyone. So when it brought in Animesh, why was it heralded as the greatest thing in Second Life and then cut short? What actually happened? You alluded to griefers and the possibility of misuse, but surely the purpose of Animesh stands. On Wednesday, November 14th, 2018, Alexa Linden announced the official release of Animesh, with the promotion of the Animesh viewer to de facto release viewer:

"Welcome to Animesh! Animesh is a new Second Life feature to allow independent objects to use rigged mesh and animations, just as you can today with mesh avatars. This means that you can now have wild animals, pets, vehicles, scenery features and other objects that play animations" https://wiki.secondlife.com/wiki/Animesh_User_Guide#:~:text=Welcome to Animesh!,other objects that play animations.

Optimo, how has this statement been met, and these opportunities realised, if the animation timing cannot be linked to the movement of avatar attachments without agent characteristics giving them any realism? Surely this can't be right?

Furthermore, it can be read:

Animesh has been in development for about a year, and like Bento, has been a collaborative effort between Linden Lab and Second Life content creators.  Essentially, it allows the avatar skeleton to be applied to any suitable rigged mesh object, and then used to animate the object, much as we see today with mesh avatars. This opens up a whole range of opportunities for content creators and animators to provide things like independently moveable pets / creatures, and animated scenery features. Press release https://modemworld.me/2018/11/14/animesh-officially-released-for-second-life/

Hugs D


On 3/31/2022 at 7:29 AM, VirtualKitten said:

I think this is all becoming clearer. Why doesn't Second Life see itself as roleplay?

It may not have this as a current concern, but it wrote new scripted bots to deal with automatic messaging, group invites and shop attendance for commerce, and it wrote a new lighting model for everyone. So when it brought in Animesh, why was it heralded as the greatest thing in Second Life and then cut short? What actually happened? You alluded to griefers and the possibility of misuse, but surely the purpose of Animesh stands. On Wednesday, November 14th, 2018, Alexa Linden announced the official release of Animesh, with the promotion of the Animesh viewer to de facto release viewer:

"Welcome to Animesh! Animesh is a new Second Life feature to allow independent objects to use rigged mesh and animations, just as you can today with mesh avatars. This means that you can now have wild animals, pets, vehicles, scenery features and other objects that play animations" https://wiki.secondlife.com/wiki/Animesh_User_Guide#:~:text=Welcome to Animesh!,other objects that play animations.

Optimo, how has this statement been met, and these opportunities realised, if the animation timing cannot be linked to the movement of avatar attachments without agent characteristics giving them any realism? Surely this can't be right?

Furthermore, it can be read:

Animesh has been in development for about a year, and like Bento, has been a collaborative effort between Linden Lab and Second Life content creators.  Essentially, it allows the avatar skeleton to be applied to any suitable rigged mesh object, and then used to animate the object, much as we see today with mesh avatars. This opens up a whole range of opportunities for content creators and animators to provide things like independently moveable pets / creatures, and animated scenery features. Press release https://modemworld.me/2018/11/14/animesh-officially-released-for-second-life/

Hugs D

I won't go into detail with elaborate answers as to why this or that, but I will point out a couple of facts:

Can you sit on an avatar attachment, even though its intended use is to be a vehicle?

Can two avatars walk hand in hand with realistic arm interactions, such as connected IKs, without making animations specific to the avatars involved?

Systems have limitations. Your, or anyone's, niche need for any "realism" stands within those limitations, which CAN be changed, but usually at the cost of breaking things. Standalone game engines, as much as Blender or any other 3D app, run standalone, and backward compatibility is never guaranteed. Meaning that if you upgrade your Unreal Engine, Unity, Blender or Maya from one version to another, there is no guarantee you won't be forced to redo some, if not all, of the work on features that were involved, directly or indirectly, in the implementation of the new shinies. Is SL the type of platform that can afford such an eventuality?


15 hours ago, OptimoMaximo said:

I won't go into detail with elaborate answers as to why this or that, but I will point out a couple of facts:

Can you sit on an avatar attachment, even though its intended use is to be a vehicle?

Can two avatars walk hand in hand with realistic arm interactions, such as connected IKs, without making animations specific to the avatars involved?

My answer can just be a brief "no, they can't" to all of what you said, but there are solutions to some of it.

A couples walker will have exactly the same problem, and you end up with a multi-seat on a movable prim with a paired animation; so technically yes, you can, but not with IKs, as Second Life doesn't use them. It does, however, use the matrix multiplication and addition math you already explained to create translations of mesh points, which we call an animation. Why can't these same matrices work on an attachment? You said it would need to be an agent?
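The matrix math itself is indeed simple. A minimal Python sketch (toy numbers, rotation about a single axis only) of carrying an attachment offset along with an animated bone; the catch in SL, as discussed above, is that only each viewer, not the server, knows the bone's animated transform at any instant:

```python
import math

def rotate_z(point, degrees):
    """Rotate an (x, y, z) point about the Z axis."""
    t = math.radians(degrees)
    x, y, z = point
    return (x * math.cos(t) - y * math.sin(t),
            x * math.sin(t) + y * math.cos(t),
            z)

def bone_transform(point, degrees, translation):
    """Apply a bone's animated rotation, then its translation."""
    rx, ry, rz = rotate_z(point, degrees)
    tx, ty, tz = translation
    return (rx + tx, ry + ty, rz + tz)

# Attachment offset in the bone's local space (e.g. a flame "in the mouth").
offset = (0.5, 0.0, 0.0)

# Where the attachment should render at a frame where the jaw bone has
# rotated 90 degrees and moved to (1, 2, 0): one transform per frame.
print(bone_transform(offset, 90, (1.0, 2.0, 0.0)))
```

One such transform per bone per frame is trivial for a viewer that already knows the pose; the expensive part is getting that pose to a server that was never designed to see it.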

Yes, I can sit on a vehicle, especially a jeep, and get bounced around, but as soon as that vehicle becomes a transformer we have problems, as the seat has become part of the transformer that is moving. You can move your avatar to sit on its shoulder, but if the transformer walks like a human your avatar will not be sitting on that shoulder for long. It's the same with a horse or another animal: the body of the animal is always moving. If the animal flies, it moves even more extremely, as it is using more powerful muscles. In Second Life this translates to the avatar staying at the position of the original unanimated animal while the animation has moved the animal, car or plane, so you are no longer connected, and the avatar looks like it's sitting independently in space.

You seem to talk about the cost of implementation in monetary units and in server time, but Blender does all this for free too.

I ask you to note, from what you have kindly explained:

1) You can sit on a non-animated object or an animesh and animate the avatar realistically, as long as the part of the object you are sitting on is not moved by the animation of the object you are attached to or sitting on.

2) You cannot add bones or IK in Blender to an item such that, once imported into Second Life and attached to an animesh, the attachment is influenced by the matrix interpretation of the bones that move in the animesh. Simply put, there is no connection other than the link point, not to the translation matrices or to the point of attachment. Therefore the object sits at this link position, animating there, not influenced by the object's animation or movement matrices.

3) The Second Life physics model importer cannot build physics for multiple items from Blender; they all have to be imported separately with their own physics to keep the land impact low.

4) There is no future development planned to fix these attachment issues or improve the importer?

Hugs D 

Edited by VirtualKitten

7 hours ago, VirtualKitten said:

You seem to talk about the cost of implementation in monetary units and in server time, but Blender does all this for free too.

And you keep pushing comparisons with a type of software that was DESIGNED to do exactly all this. SL has no built-in design for these features right now.

It's not a matter of how complicated it is to calculate those points via a matrix or vector map; the problem is the underlying structure, which currently needs an agent to be able to stream these changes to another agent. That, along with the lack of a server-side code base to supply the data missing from an animation, makes what you are requesting impossible to achieve. Remember that the server only knows that you're playing an animation; it sends the same file name to others, but its contents are unknown to it, and the animation is read and played back on the avatar in the viewer. Ever noticed how two different viewers may see the same avatar playing the same animation, yet the two might not be synchronized? I really don't know how else I can explain this by now.

