
Quarrel Kukulcan

Everything posted by Quarrel Kukulcan

  1. What is the size of your mesh in Blender's internal units, and are you using meters? Did you check that scales are applied to both the armature object and the mesh object(s)? The fingers deforming on only one hand suggests an error somewhere along the way with mirroring or symmetry. You probably have some right finger weighting mixed into the left fingers. The left foot looks slightly affected as well. The arms moving to that weird position in the preview is exactly what you'd expect to see as a result of the rescaling in pic #2 and shouldn't happen once you fix that. (The preview always plays a humanoid idle animation on the rig, and the main part of that is rotating the shoulders down from the T-pose.)
  2. The planar option ignores your UV map and tries to figure out how to apply the texture all by itself, as if it were projecting the texture onto the object's surfaces from all 6 axis directions at once. It's unpredictable and mysterious (and also pretty bad at times, including on all the prims with curved surfaces). As for joining objects: if the UV maps all have the same name, they'll be merged when you join. If they have different names, Blender will keep them separate, and then you'll have something that won't upload right. Make sure they have the same names before you merge. Otherwise you'll have to re-separate, rename and re-merge. (Or re-unwrap everything. It's fussy work either way.)
  3. Check that your object only has one UV map. It's in the Object Data Properties: SL only allows one UV map, but Blender lets you have multiples. If you have more than one UV map and you upload, SL will keep one (I'm not sure which) and throw out the rest. That is one possible reason why your UV edits aren't uploading. If that is the case, figure out which of your UV maps is the right one and remove the rest with the "-" button to the right of the list. But your problem is a UV mapping issue, and the ideal way to fix it in Blender is to rotate the UV map vertices, not rotate the faces in the mesh. Don't worry about vertex normals. That's something completely different and can't cause this.
  4. That's not quite true. You always have to make an llRequestPermissions() call. What you can usually get away with, for seats/beds and attached objects, is assuming permission is granted so fast that you can start the animation at any place in the script instead of needing a run_time_permissions() event handler.
  5. AFAIK animations can't add onto each other except when one is easing out and another is easing in, but that's still going to be a weighted average rather than both at full strength, and it won't persist after the first one ends. Also, remember that animations are illusions, and each resident watching gets their own personal rendering of that animation on their screens with no synchronization. No real movement happens to the avatar/animesh as far as the server is concerned. That's the other reason snapback happens. The system isn't built to remember "animation #1 moved the fish 2 meters forward".
  6. That's the problem. You're moving the armature. Translations/rotations to the armature object don't go into the animation. You need to modify the mPelvis bone within the armature (or CoG in Avastar -- NOT Origin).

     Here's what I had to do: Don't put a Follow Path object constraint on the armature. Put a Follow Path bone constraint on CoG. When you bake the animation (Pose -> Animation -> Bake Action...), turn on Only Selected Bones and Visual Keying. (You may or may not want to Clear Constraints, but if you don't, disable them so they don't re-apply post-bake.) Bake the data to the Pose.

     ADDENDUM: Something you need to watch for. Animesh animations can't move mPelvis more than 5 meters from its starting spot. Make sure your swimming circle isn't too big. You will probably also get a one-time, permanent vertical displacement the first time you play an animation with a hip repositioning on an Animesh object. The system is really quirky like that.
  7. How are you moving the entire skeleton? Are you keyframing the armature itself in Object Mode or the mPelvis bone in Pose Mode? Because only the second method will include the motion in the animation. Also, if you're using Blender's own BVH export and you're using the default scale of "1" = 1 meter, you'll need to set a scale of 39.4 in the export options. SL is hardcoded to interpret all distances in BVH files as inches.
  8. If you don't need alpha panels or shape customization, it might be worth trying to get it on the Bento skeleton yourself. I predict the face will be the trickiest part -- both the "porting the model from VRChat" half and the "making it do what you want while in SL" half. But, yeah. They're very different systems.
  9. You do not ask the easy questions...

     Max 8 materials per object and 21,844 triangles per material. If you go over the triangle limit, SL creates another material for the overflow if there's room. But you can have up to 256 objects in a "linkset", which is SL's name for a group of 2 or more mesh/prim objects that's attached and removed as one. That's how creators make mesh avatars with far more than 8 materials, so they have lots of tiny panels that can be turned invisible so the body doesn't clip through poorly-fitting clothing. If you're making a dedicated custom AV that won't be wearing existing clothes, you won't need to do this.

     Max 4 weights per vertex. Also, max 110 total weight groups, which means you can't use every single face bone and finger bone plus tail/wings all in a one-piece mesh. That'll be too many bones.

     One UV map per object.

     The armature is tricky. No custom ones. You have to use the predefined one or a subset. You can reposition bones -- but this can cause conflicts with SL's body & face shape slider system. There is an option during upload to make your mesh ignore sliders on bones you've moved, so you can make this tradeoff safely if you want to.

     Here's the ugly part. There are two separate sets of "extended" bones: Bento (detailed face, fingers, tail, wings and hindbody) and Fitted Mesh (makes more body shape sliders work). LL has Bento rigs at https://wiki.secondlife.com/wiki/Project_Bento_Resources_and_Information and Fitted Mesh rigs at https://wiki.secondlife.com/wiki/Mesh/Rigging_Fitted_Mesh but there is no official source for a rig containing both. A lot of people like using both and end up using a dedicated SL-oriented plugin that gives them everything. Furthermore, Fitted Mesh bones use a custom bind pose. Meshes rigged to them don't export correctly unless your software/plugins support bind poses. If you're using plain Blender, you need to use an armature with custom Fitted Mesh bone properties (like the one above) and also check the extra "Keep Bind Info" option during export.

     There are no custom shape keys or flexi-bones. All mesh facial expressions and head/body shape adjustments are done through armature manipulation. (If the face is rigged you can make expression animations for it and trigger them with hotkeys or slash commands in chat.)

     Finally, there's Levels of Detail and the collision shape. All mesh objects in SL have 3 additional, progressively simpler versions of the mesh (which get displayed instead of the original at increasing distances to reduce rendering time) and 1 physics collision framework. SL will automatically generate any of these that you don't create yourself, and it's bad at it. The physics shell is easy: use a cube or a single triangle for anything meant to be attached to an avatar. Even though the mesh upload must contain a physics shell, it's ignored for attached objects, so make it extremely simple. The LODs take time and experience to do well, and a lot of creators just futz with the vertex count of SL's own autogenerated ones until things aren't too ugly. It's a complex and highly charged topic in creator circles.

     One last thing: file a support ticket to activate your account on the Beta Grid so you can upload and test without spending real Lindens. See https://wiki.secondlife.com/wiki/Preview_Grid for how.
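A quick back-of-the-envelope check of the limits above. The constants come straight from the post; the helper names are just for illustration:

```python
import math

# Limits quoted above: 8 materials per object, 21,844 triangles per material.
TRIS_PER_MATERIAL = 21844
MATERIALS_PER_OBJECT = 8

def materials_needed(tri_count):
    """How many materials a mesh of tri_count triangles consumes,
    counting the overflow materials SL creates automatically."""
    return math.ceil(tri_count / TRIS_PER_MATERIAL)

# Absolute triangle ceiling for a single object:
print(TRIS_PER_MATERIAL * MATERIALS_PER_OBJECT)  # 174752
# A 30,000-triangle mesh spills into a second material:
print(materials_needed(30000))  # 2
```

So a single object tops out at 174,752 triangles, and anything bigger has to be split across a linkset.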
  10. There isn't much call for pure-texture clothing for the legacy(*) avatar, but the process of texturing to an existing UV map is fundamental to mesh clothing as well, so you need to learn it anyway. Plus, the Bakes On Mesh feature made system avatar textures relevant again: the whole point of that feature is to let a mesh body "absorb" the assembled, cumulative appearance of a large number of skin, clothing, makeup and tattoo textures from the hidden system body underneath, and display it more flexibly and efficiently than mesh bodies previously could. So LL's legacy UVs are still relevant.

     Absolutely, 100% use SL's "local texture" feature to test things out so you don't have to pay upload fees until things are perfect. You can set a mesh, prim or legacy clothing item to use an image directly from your hard drive to preview how it would look if you uploaded it. Overwriting the file will update the appearance in SL automatically, too. (The view is private to you and it's temporary, so you can't get a final appearance for free.)

     (*) Not to be confused with the Legacy brand mesh body -- from the same creator as the Classic mesh body.
  11. The vector data in this normal map looks bad. Just poking around at a few pixels and turning them into X/Y/Z data, I don't get vectors of length 1.00 like I should. There are also pixels with a blue value below 128. I don't know that that can ever happen with tangent normals -- that represents a point where the surface is so sloped that it's flipped itself over. Non-normalized normal maps can absolutely produce incorrect shading. Also, see that prominent green diamond? That's the kind of slope you'd get if you generated your map from the point of view of a single eye looking down at the entire object. That is NOT the slope you'd generate if you create the map from the point of view of an eye looking directly at each individual face from that face's own "outward" direction. In other words, this isn't a tangent space normal map. (Or, possibly, it's a tangent space map to a lower-poly version of the mountain but it's being applied to the higher-poly one.)
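The pixel check described above is easy to reproduce. This is a minimal sketch (the helper names and the 0.05 tolerance are my own choices, not from any particular tool): each channel maps from 0..255 back to the -1..1 range, and a healthy tangent-space pixel should decode to a vector of length ~1.0 with Z (blue) positive.

```python
import math

def decode_normal(r, g, b):
    """Map 0..255 channel values to the -1..1 vector component range."""
    return tuple(c / 127.5 - 1.0 for c in (r, g, b))

def is_unit(vec, tol=0.05):
    """True if the decoded vector has length ~1, as a valid normal should."""
    length = math.sqrt(sum(c * c for c in vec))
    return abs(length - 1.0) <= tol

# The neutral "flat surface" pixel (128, 128, 255) decodes to roughly (0, 0, 1):
flat = decode_normal(128, 128, 255)
print(is_unit(flat))        # True
print(flat[2] > 0)          # True -- blue >= 128 means the surface faces outward
```

Sampling a few pixels of a suspect map through something like this quickly shows whether it's non-normalized or contains flipped (blue < 128) normals.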
  12. It looks like you didn't apply the subdivision modifier. You need to either check that option during export (or better yet, use the "sl+open sim static" preset, which includes that) or apply the modifier permanently to your mesh. Keep in mind that's going to quadruple your face count for each subdivision level. EDIT: Sniped by seconds!
  13. Set your current frame line to the first frame, select all keyframes (and I do mean all of them -- make sure you click off "Only Show Selected" and click on "Show Hidden"), then scale. You can scale keyframes just like vertices: press 's' and then move the mouse or type a number and press Enter. Keyframes always scale towards/away from the current frame marker. You're lucky your original animation is 100 fps. That number makes the math easy. If you want to go to 24 fps, scale by 0.24. The general formula is (new fps) / (old fps).
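The retiming math above can be sketched in a couple of lines (function names are just for illustration). Scaling every keyframe by (new fps) / (old fps), with the current-frame marker on frame 0, keeps the animation's real-time duration unchanged:

```python
def scale_factor(old_fps, new_fps):
    """Keyframe scale factor that preserves wall-clock duration."""
    return new_fps / old_fps

def retime(frame, old_fps, new_fps):
    # Keyframes scale toward the current frame marker (frame 0 here).
    return frame * scale_factor(old_fps, new_fps)

print(scale_factor(100, 24))   # 0.24, matching the example in the post
print(retime(100, 100, 24))    # frame 100 at 100 fps lands on frame 24
```

A keyframe at frame 100 of a 100 fps animation plays at the 1-second mark; after retiming it sits at frame 24 of a 24 fps animation, which is still the 1-second mark.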
  14. That is far too many polygons to use in anything doing realtime 3D. That's why your LI is so high. I don't believe Blender lets you truly combine multiple mesh vertices into a single UV vertex. It has several ways of aligning multiple UV vertices at the exact same spot and even moving them all together, but they never actually merge.
  15. "Translation" means "change bone locations". You need to leave translations on to make the tongue stick out. You can't get that kind of motion with rotations. Animations containing translations often conflict with face & body shape sliders, though. In general, for Second Life, you want to make your animations purely by rotating bones. The two cases where translations are OK are if you're shifting the entire avatar by moving mPelvis (because that's safe) or sticking the tongue out (because you have to use translations for that). So you need to find out why your animations are trying to reposition the other facial bones. If you have keyframes for X/Y/Z locations on other face bones, you'll have to take them out. If that's not the source of the problem, you may need to delete all the face bones you're not using.
  16. Multiple issues might be causing this:
      - An incorrect reference frame in your animation.
      - An incorrect bone movement distance scale in your animation.
      - Another animation playing at a higher priority.

     Problem #3 depends on what animations are built into the head and its HUD and how they're scripted. The simplest way to solve it is to increase your animation's priority.

     As for the other two: if you're exporting an animation using Avastar, make sure you check "With Bone translation". If you're exporting BVH, also make sure you check "Add Reference Frame".

     If you're exporting using Blender's own BVH exporter, it's more work. If your animation has any bones shifting position (not just rotating), you need to set the scale factor in the export panel to 39.4 (assuming you have Blender working in meters). SL requires that BVH files specify bone movement in inches. When Blender tries to say "stick the tongue out 0.05m", it can't write "meters" into the BVH file, only the number 0.05. Then SL interprets that as "stick the tongue out 0.05 in." and moves it by so little that you won't notice. 39.4 is the meters-to-inches conversion factor and fixes that.

     If you have any bones that don't start shifting or rotating at the very beginning, you have to manually add a reference frame. Second Life doesn't play the first frame of a BVH animation. Instead, SL checks which bones change position/rotation between frames 1 and 2 and only animates those. Bones that don't move until somewhere in the middle of an animation need to have their true starting keyframe on the second frame and another keyframe on the first frame where something about them is different. It doesn't matter how they're different as long as it's more than a tiny fraction. (The reference frame won't get played. SL plays a copy of the second frame in its place so things don't look glitchy and the total duration doesn't change.)
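The scale problem above is pure unit arithmetic. A sketch (39.4 is the rounded meters-to-inches factor the post uses; the function name is illustrative):

```python
# SL reads every BVH translation value as inches. A value authored in meters
# must therefore be multiplied by ~39.4 (inches per meter) at export time.
def bvh_offset(meters, scale=39.4):
    return meters * scale

# Without the scale, "move 0.05 m" is read by SL as 0.05 inches:
print(0.05 * 25.4)       # 1.27 -- barely over a millimeter of actual motion
# With the scale, the file says 1.97, which SL reads back as ~0.05 m:
print(bvh_offset(0.05))  # 1.97
```

That millimeter-scale motion is why an unscaled tongue animation looks like it isn't playing at all.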
  17. Are you using Avastar or just Blender?
  18. What, exactly, is the problem? The skirt clipping into the leg? That means the skirt is weighted less to the thigh bone than the body mesh is. That could be because you smoothed your weights. What happens if you transfer but don't smooth? A more important issue: where did you get the armature? Mesh bodies and clothing pretty much always use the Fitted Mesh bones -- the all-caps ones like L_UPPER_LEG. If you're rigging to those, they don't export correctly to Second Life unless your bones have special custom properties or you use something like Avastar.
  19. You basically never want to write your own core random number function unless the supplied one is truly terrible. (Like, it's always a multiple of 1/32nd or something.) You're better off rethinking how you're using randomness and changing your design, as that's usually where the real problem lies. "Play a truly random animation each time" will naturally produce streaks and repeats that you probably won't like the looks of even if your RNG function is mathematically perfect.
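One common way to "change the design" rather than the RNG is a shuffle bag: deal the animations out like a deck of cards, reshuffling only when the bag is empty. This guarantees no item is starved and no streak runs longer than two, while still feeling random. A minimal sketch (not from the post; LSL versions of this pattern exist, but the idea is language-agnostic):

```python
import random

class ShuffleBag:
    """Deal items in random order without repeats until the bag empties."""
    def __init__(self, items):
        self._items = list(items)
        self._bag = []

    def next(self):
        if not self._bag:
            # Refill and reshuffle once every item has been dealt.
            self._bag = self._items[:]
            random.shuffle(self._bag)
        return self._bag.pop()

bag = ShuffleBag(["wave", "dance", "sit"])
picks = [bag.next() for _ in range(6)]
# Two full bags of 3, so each animation appears exactly twice in 6 draws.
print(sorted(picks).count("wave"))  # 2
```

The only possible repeat is across a bag boundary (last card of one deck matching the first of the next), which is usually acceptable.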
  20. Possibly you need to be in Object Mode?
  21. It's not unusual for transferring weights to transfer all named groups to the new object, even ones that don't cover that region of the mesh or are all 0.0 there. So the weights could still be correct. You should go to Weight Paint mode on your ring, select all verts, and use Clean under the Weights menu with the "Deform Bones" or "All Groups" option. Then see if there is still weighting on the ring you don't want. If there's still a problem, you're probably doing something wrong when transferring weights, like accidentally selecting the wrong things or in the wrong order. It's hard to say for sure.

     If you have to fix it by hand, fortunately, it's easier to weight something 100% to one bone and 0% to everything else than it is to do typical rigging. You don't even have to use Weight Paint mode. Edit Mode is quicker: Select the object. Go to its Object Data Properties tab (the 3-point triangle icon; it may not be green in your Blender version) and find Vertex Groups. Open the dropdown menu ("v") and delete all groups. Add a new group and rename it "mHandPinky1Right". Switch to Edit Mode. Select all verts and select the mHandPinky1Right group. Set the weight in the Vertex Weights panel UI to 1.0 and click the "Assign" button.
  22. It also modulates Glow level. This is true regardless of alpha mode -- even None.
  23. Yep. That's not an option if I want to produce objects worn at the same time as Avastar-exported rigged mesh from the same dev kit.