
freston
Resident, 23 posts

Everything posted by freston

  1. Hi, Avastar's character already has IK set up. Normally it's hidden out of the way, but it's easy to enable: in the 3D view, press N to bring up the properties pane. With one of the limb bones selected in Pose mode, you should see a section on IK controls in the Rig Properties part. There you can show/hide the IK controllers and targets, switch on the influence, and have the IK match the current pose. All the bones also have targetless IK set up, which means you can grab any of them and drag the limb around. You can control the length of the chain in the Targetless IK section. Of course, you can always modify this IK setup or add your own on top. See http://blog.machinimatrix.org/avastar/ui-armature/ and http://blog.machinimatrix.org/avastar/avastar-animation/
  2. There are a bunch of advantages to using SL's 'anim' format: you have precise control of where you put the keyframes, per-bone priorities, priorities from 0-6, animation of the location of any bone as well as the rotation, and animation of the attachment points and volume deform bones; parameters such as ease in/out, priority, and looping are stored in the file. There's even an IK (inverse kinematics) system built in. The official viewer now supports uploading these anim files out of the box, and third-party viewers have allowed this for a while. The Avastar plugin for Blender supports exporting animations to it (this is why Avastar was created originally), but it's a commercial plugin. I'd imagine more and more SL plugins for the popular animation programs will support the format in time. I'm not sure if this anim format is the same as the Maya animation file, though.
  3. Hi Da5id, do share how you broke it. No, seriously... it would be good to know what broke so we can improve it. Send me an IM if it's too product-specific to be interesting in the forum. Avastar uses the LL-defined skeleton, including attachment points and volume deform bones, modified by whatever SL shape you use. There are a few other bones to support IK and so forth, but the modelling and animation is done against the SL bones. Yes, you can change the rig, or move bones around and animate them afterwards, but it does get tricky; see here for an example: http://www.sluniverse.com/php/vb/content-creation/73726-extreme-skinning.html. For that rig we wrote our own exporters for Collada, BVH and anim.
  4. I think Chosen Few's workflow is really nice; just a couple of thoughts:

Don't get too hung up on the armature being a skeleton. As far as the 'game' is concerned, the bones are really just a handle to manipulate the mesh. The analogy with a human skeleton only gets you so far; then you have to start compromising, like making a wrist rotation a combination of mWrist and mElbow rotations. This means you need to be thinking about how you are going to animate your character as you are weighting it.

In Blender the weights can sum to more than 1; they are renormalised virtually when you move a limb. So with Chosen's workflow, as a first cut, just select every vertex that you think should be influenced by the hand bones and assign them weight 1 to mWrist, then select every vertex that should be influenced by the forearm and give them weight 1 to mElbow, and so on. The weights at the transition region will blend into each other with the on-the-fly normalisation. I find doing this much more accurate than weight painting, as you can easily respect any symmetries in the mesh.

You're creating your own mesh, so you can control the vertex density. Just add more structure to the joint areas to give yourself more freedom to transition the weights. This shouldn't increase the overall poly count too badly.

With setting a natural default pose you have to jump through a few hoops. For animation the rest pose has to be the SL T pose, but often for mesh you want something else. In Avastar I've created a button for this that recalculates the mesh back up to the T pose. By hand you can do something like: pose the avatar in a natural pose and set it as the rest pose, do your weighting, then pose the model back up to the T pose by hand and set it as the rest pose (getting this accurate can be tricky).

Alternatively, you can tick "Use modifier while in edit mode" in the armature modifier, and also "Apply modifier to editing cage", to see the mesh posed while in edit mode and adjust things as Chosen has suggested.
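The on-the-fly normalisation mentioned above can be sketched in a few lines. This is a minimal model of the idea, not Blender's actual implementation: raw vertex-group weights are divided by their sum at deform time, so assigning weight 1 to both mWrist and mElbow at the transition gives each bone half the influence.

```python
def normalized_weights(weights):
    """Renormalize raw vertex-group weights so they sum to 1,
    modelling how overlapping bone influences blend at deform time."""
    total = sum(weights.values())
    if total == 0:
        return weights  # unweighted vertex stays unweighted
    return {bone: w / total for bone, w in weights.items()}

# A transition vertex given weight 1 to both bones, as in the workflow above
raw = {"mWrist": 1.0, "mElbow": 1.0}
print(normalized_weights(raw))  # each bone ends up with influence 0.5
```

This is why the crude "weight 1 to everything in the region" pass still produces a smooth blend wherever the regions overlap.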
  5. The SL avatar doesn't have the option to preserve volume (unless it's a hidden feature somewhere in the code), so although you can switch this on in Blender, you won't get the same in-world. All you can play with are simple skin weights, but you can have up to 4 on each vertex. It might help to blur the transition more across the joint. Another tactic, if you are creating your own mesh, is to rig the character in the natural pose it will hold most often; that way that pose will have the least distortion. Just experiment with different weightings and look at the result when you pose the limb. If you texture the model in Blender, it will give you a better sense of how it will look. On the other hand, if you were to do that particular move with your arm, I doubt your skin would look any prettier.
  6. Hi Medhue, why do you say 30fps is a limit? Internally the time of a keyframe is stored as an unsigned short, so even for a 30s-long animation that's a maximum frame rate of 65536/30 ≈ 2185fps! Of course that's a ridiculous number, but the point is that there's no inherent 30fps limit. I'd expect the graphics card to max out at 60fps or so, but there might be another throttle somewhere. Is this a limitation in the BVH import routine?

Jack - there's a bunch of different rates or limits in play:

Animation length - this is hardcoded to 30s. To get longer you have to string together separate animations with a script, or loop the animation. Something like a walk is looped so it can cycle indefinitely. The looped section can be in the middle of the animation, though, so you can have a lead-in and lead-out section that takes you to some neutral pose. That way you can string together different animations that transition well from one to the other (though lag stomps on this).

Animation 'rate' - this doesn't exist. Animations are stored by time, and it's perfectly legitimate to have an animation with a frame at 0s, one at 0.1s, and one at 29s. The viewer looks to see where it is in time and interpolates the joint rotations as necessary. What's important here is the smallest time separation two frames can have (30fps = 0.03s separation). Of course, if your animation is not changing at that time scale, it won't make any difference. If you are uploading via BVH, that does have a frame rate, and the uploader analyses the frames and removes any redundant motion, leaving a nonlinear timeline like the example.

Viewer-reported FPS - a ton of things affect this, including network lag, the graphics card, and whether SL is just feeling grumpy. You can get high values, certainly higher than 60fps, but I have no idea how that affects the smallest resolution you can see in an animation.
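The back-of-envelope arithmetic above can be made concrete. This sketch assumes keyframe times are quantized to an unsigned short spread over the animation's full duration, which is what the 65536/30 figure implies:

```python
U16_MAX = 65535  # largest value an unsigned short can hold

def time_resolution(duration_s):
    """Smallest representable gap between two keyframe times when
    times are quantized to an unsigned short across the animation."""
    return duration_s / U16_MAX

def max_effective_fps(duration_s):
    """Upper bound on distinguishable keyframe times per second."""
    return 1.0 / time_resolution(duration_s)

# For the maximum 30 s animation: thousands of distinct times per
# second, so the quantization is nowhere near a 30 fps limit
print(f"{max_effective_fps(30.0):.0f} distinct times per second")
```

Note that a shorter animation quantizes the same 65535 steps over less time, so its time resolution is even finer.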
  7. Well, if you turn on "Show Collision Skeleton" on a tiny, you might get a clue of what you will fit into (actually, I haven't tried to see if this is the physics hull of the avatar).
  8. Out of curiosity, where does the limit of 30fps come from? I've not come across this in the code. Internally, animations don't have an fps; they have specific times at which keyframes are recorded. The viewer then samples the motion and interpolates to get what the pose should be at a particular instant. This means that two viewers refreshing at different rates can stay in sync with an animation. But what is the maximum effective rate? I always understood that it heavily depended on your graphics card. Is there a hard cut-off somewhere in the code? Or is it that there just isn't much point driving the graphics at higher than about 60fps?
  9. In Firestorm, turn on Develop -> Avatar -> Animation Info (the option may be in a different place in another viewer). This will show you all the animations running and their base priority. Each bone can have a different priority, though, so the base priority may not be that accurate. The only sure-fire way I know is to create a static pose at each of the different priorities (1-6) and see at what point it gets overridden. Between two animations of the same priority, the last to run controls the motion.
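The resolution rule described above (higher priority wins; among equal priorities, the most recently started animation wins) can be modelled as a simple tie-broken maximum. The animation names here are made up for illustration:

```python
def winning_animation(anims):
    """anims: list of (name, priority, start_order) tuples competing
    for one bone. Highest priority wins; ties go to the animation
    that started last (largest start_order)."""
    return max(anims, key=lambda a: (a[1], a[2]))[0]

# "wave" and "point" share priority 4; "point" started later, so it wins
running = [("walk", 2, 0), ("wave", 4, 1), ("point", 4, 2)]
print(winning_animation(running))  # point
```

Since each bone can carry its own priority in the anim format, the real viewer effectively runs this comparison per bone, not per animation, which is why the base priority shown in Animation Info can be misleading.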
  10. Do you know if changes to avatar_skeleton.xml are actually honoured in any viewer in SL? The only testing I've done is with adding new attachment points in avatar_lad.xml and using Firestorm. As you pointed out, this has been hacked in the past to provide multiple attachment points, and I guess Firestorm inherited that part of the code. I found you had to choose the right IDs for it to work (I looked up the IDs for those duplicate attachment points), but I didn't test it particularly deeply. I can certainly imagine having a region that recommends downloading and installing a custom avatar_skeleton.xml file in order to have cool new avatars.
  11. I know this can be done with attachment points in avatar_lad.xml, though there might be some issue with available IDs. I wouldn't be surprised if you could modify the skeleton in avatar_skeleton.xml too, but every viewer would have to have the modified XML files to see it, so it's not very useful except for machinima.
  12. I don't think the idea I had in a rush of blood to the brain will pan out. I remembered that the anim format has a constraint system too. I thought maybe the system could also fix the location of a bone, which would then work like you wanted, to 'reparent' a bone. From what I can see, though, this is an IK system, so the end of a bone chain can stick to another bone or the ground and the rotations up the chain will be dynamically calculated. Still, it might be fun to play with.
  13. Hi Seven, as you've found, the attachment points have a bunch of limitations. You can animate them (rotations) like you can for all bones. You can also animate their locations, and Avastar will support this in the next release, which we are testing at the moment; the current release doesn't really support bone location animation. Unfortunately, you can't change their positions in the armature like the usual bones. From what I've seen, either nothing happens or your entire armature goes crazy when the attachment point both has weight in the mesh and is in the armature section of the Collada file. I'd call this an SL bug, except I don't think LL ever intended the attachment bones to be used in this way. It would be awesome if it worked, though - it would give you a bunch more bones that you could use for things like facial animation. At the moment you have to make do with where they are by default. The eye attachment points are parented to the normal eye bones, so when those move, they will move the attachment points too.
  14. Hi Tessa,

"Hi there, I am wondering if anyone knows of a way to animate mFootLeft, mFootRight, mToeLeft, mToeRight, and mSkull? I've done some searching and found a few references saying it's possible http://community.secondlife.com/t5/Animation-Forum/BVH-animation-for-all-26-Avatar-bones/m-p/1390891#M1004 but I've yet to find anywhere that goes into detail on how to do it."

Yes, it's possible to animate all those bones. I've also heard that it can be done via BVH, but I've not confirmed it, and in any case it's much easier to just use the 'anim' format.

"For instance, my plan is to move those bones and change their parents to the spine instead of the feet/head, and use them for a set of wings. That means disconnecting them from their usual parent bones, else the wings would move every time the feet or head did. One of the main things I want to find out is, when the wings (and modified armature for the wings) are uploaded, will the disconnected bones still move in relation to the feet/head? Or will they move as though they were attached to the spine (as I'm hoping they will)? Or perhaps all this is handled by the animation?"

You can't reparent the bones, unfortunately - at least not without patching everyone's viewers. What you want to do might be possible, though. You would have to create an animation that both rotated and moved the bones so it looked like they were attached to the spine. To animate bone movement you will definitely need to use the anim format. I'm interested in testing out the idea (see http://www.sluniverse.com/php/vb/content-creation/73726-extreme-skinning.html) and I'm in the process of modifying Avastar to try to make this work. The idea is to create a rigging skeleton that is parented like you want and that drags the SL bones along with it; the animation is then just exported from the SL bones and has the right translations and rotations for the effect. That way it would be easy to animate.

You wouldn't be able to use any other "standard" animation, though, as those would move the feet and legs, and the foot and toe bones would follow them... so you would have to redo all the AO animations etc. and compensate for... damn, just thought of a way around it. Sorry to go all mysterious on you, but I need to try it out first... if it works it will be very, very cool.
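The "drag the SL bones along" trick above amounts to computing, per frame, the local transform for a bone (still parented to the foot, say) that cancels its real parent's motion and lands it at the desired spot relative to the spine. A hedged numpy sketch with 4x4 homogeneous matrices; the function names and the toy frame values are made up for illustration:

```python
import numpy as np

def counter_transform(parent_world, spine_world, offset_local):
    """Local transform for a bone whose real parent has world
    transform `parent_world`, placing it at `offset_local` relative
    to the spine instead, as if it had been reparented."""
    desired_world = spine_world @ offset_local
    return np.linalg.inv(parent_world) @ desired_world

def translation(x, y, z):
    """Convenience: a 4x4 pure-translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

# Toy frame: spine at (0, 1, 0), foot at (0, 0, -1), wing root
# meant to sit 0.5 to the spine's side
local = counter_transform(translation(0, 0, -1),
                          translation(0, 1, 0),
                          translation(0.5, 0, 0))
print(local[:3, 3])  # foot-local offset that pins the bone to the spine
```

Baking this counter-transform into every frame of the anim export is exactly why standard animations break the illusion: they move the foot without re-running this calculation.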
  15. I'm pretty sure that limitation can be overcome with a simple patch to the upload code in the viewer. I'm bouncing off getting a mesh-enabled viewer to compile under Linux, though, so haven't been able to test it.
  16. It might be possible to fix this, though it would require some trickery: http://blenderartists.org/forum/showthread.php?246027-Vertex-Normal-editing - it would only be applicable at export time... Would there be much need for this kind of fix? See Gaia's suggestions.
  17. Hi Amphei, the reason we haven't joined the meshes is that it would break all the shape code (which may explain your boobs collapsing). If you want to do this, I'd freeze the meshes first, which removes all the shape morphs, then join them together. This will lock in the shape, which you still have to wear in-world. I think you would also have to redo the UV maps, so you wouldn't be able to use normal skin textures, though you might get away with creating a new UV map that has all the usual ones stacked one above the other and doing the same for the skin texture. Another idea: with a mesh set to smooth, it's the vertex normals, not the face normals, that are used, so matching the vertex normals for the overlapping vertices along the boundaries might be all you need to get smooth shading... I'll play with this too and see if it works.
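The vertex-normal idea can be sketched as: find vertices that sit at (nearly) the same position along the seam and give them the average of their normals, so smooth shading is continuous across the boundary. A minimal numpy sketch; the O(n²) search and the epsilon are for illustration only:

```python
import numpy as np

def weld_normals(positions, normals, eps=1e-6):
    """Average the normals of vertices that share (within eps) the
    same position, so a seam between two meshes shades smoothly."""
    normals = normals.astype(float).copy()
    n = len(positions)
    for i in range(n):
        # all vertices coincident with vertex i, including itself
        group = [j for j in range(n)
                 if np.linalg.norm(positions[i] - positions[j]) < eps]
        avg = normals[group].sum(axis=0)
        avg /= np.linalg.norm(avg)  # renormalize to unit length
        for j in group:
            normals[j] = avg
    return normals

# Two coincident seam vertices with different normals get welded
pos = np.array([[0.0, 0, 0], [0.0, 0, 0], [1.0, 0, 0]])
nrm = np.array([[1.0, 0, 0], [0.0, 1, 0], [0.0, 0, 1]])
print(weld_normals(pos, nrm))
```

A real exporter would do this with a spatial hash rather than the brute-force pair scan, but the welding step itself is the same.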
  18. Don't apply the armature modifier! You need to keep it in the modifier list for Blender to insert the skin weights in the Collada file.
  19. The Collada file can have the bone locations, and this info is used when "joint positions" is selected in the importer. Unfortunately, you need a weighted mesh for the "skin weights" option to become available; once you tick that, "joint positions" becomes available if the data is in the Collada file. However, if you just want to create a deformer, just create a mesh with a tiny single triangle weighted to mPelvis, say, sitting at the origin of that bone so it will never be seen.
  20. Hi, has anyone successfully modified the location of an attachment joint through mesh import? I can correctly weight any of the normal bones like mPelvis/mHead, and I can reposition them too. I can also correctly weight mesh with any of the single-name attachment joints like Mouth/Nose/Spine etc. I thought I used to be able to move the attachment points around by having the positions in the Collada file, but I can't seem to reproduce this now. Also, as soon as I wear a mesh that has an attachment joint both as a weight and a location, the entire armature goes haywire and it looks like all the joints collapse in on themselves.
  21. Sorry, I think I misunderstood. You see jitter in the arms when both your walking animation and the arm animation are running? If just the arm animation runs, do you still see jitter? You have to track down what is animating the jitter, e.g. is it in time with any other bone movement? When you left the arms unposed, did that include the collar bones? If you leave all the arm bones in the default T pose, then those bones should be removed from the animation when you upload it, and any other animation should then control the arms. Where it could get messy is if your walk twists the spine, as this will swing the arms.
  22. Hi Mahaila, if you left the arms with no positioning at all, then those bones are removed on BVH import. The jitter you're seeing may be background body noise.
  23. I agree with Medhue; it sounds like a problem with the initial frame. If a bone's rotation changes significantly from the first frame somewhere in the animation, then the importer will keep it; otherwise the importer ignores that bone entirely. In a multi-frame animation the first frame is a reference, and it's easier to change that. A change of about a degree in the x or y axis should do it, or you could do something dramatic like Medhue does.
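The pruning rule above (a bone is dropped unless some frame departs from the first, reference frame) can be modelled like this. The per-axis comparison and the one-degree threshold are assumptions for illustration, matching the advice above rather than the uploader's exact code:

```python
def bone_is_kept(rotations, threshold_deg=1.0):
    """Keep a bone only if some frame's rotation deviates from the
    first (reference) frame by more than the threshold on any axis.
    `rotations` is a list of (x, y, z) Euler angles in degrees."""
    ref = rotations[0]
    return any(
        abs(angle - ref_angle) > threshold_deg
        for frame in rotations[1:]
        for angle, ref_angle in zip(frame, ref)
    )

# A perfectly static bone is dropped; nudging one axis past the
# threshold in a later frame keeps it in the uploaded animation
print(bone_is_kept([(0, 0, 0), (0, 0, 0)]))    # False
print(bone_is_kept([(0, 0, 0), (1.5, 0, 0)]))  # True
```

This also shows why tweaking the first frame works: moving the reference changes the deviation of every later frame at once.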