
Psistorm Ikura


Everything posted by Psistorm Ikura

  1. Hmm, it shouldn't be related to an AO. I currently use pre-Bento AOs, and my animations only drive Bento bones, so there should be no conflicts between the two.
  2. I've recently built a mesh avatar that uses Bento animations for the hands, head, and tail. All in all I'm very satisfied, though I've noticed that sometimes the animations just stop without any discernible reason. To start the animations, I have two triggers, on_rez and attach; the former queries llGetAttached() before starting a permission request for triggering animations. Once the permissions have been granted, I scan the active animations for any leftover animations that each script can play and, if any are found, stop them so no stale animations survive, say, a script reset. After this cleanup, my animations are set playing. On animation switches - for facial expressions, for example - I first stop the old animations that were playing, then start the new ones. Other than that there is no other stop logic, and yet I sometimes find that my animations have stopped seemingly on their own. Are there any other conditions - teleport, region change, etc. - that can stop looping animations? I should also mention that I make use of single-frame looped animations as well, though they seem just as affected as my multi-frame looped ones.
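For what it's worth, a defensive pattern that matches the setup described above would be to re-assert the animation whenever the attachment teleports or crosses a region boundary, since those are common suspects for silently stopped loops. This is a minimal sketch only - the animation name is a placeholder, and re-asserting on changed() is my assumption about a workaround, not a confirmed cause:

```lsl
// Hypothetical sketch: keep a looping animation alive across
// teleports and region crossings. ANIM is a placeholder name.
string ANIM = "my_loop";

restart()
{
    if (llGetPermissions() & PERMISSION_TRIGGER_ANIMATION)
    {
        llStopAnimation(ANIM);   // harmless if it is not currently playing
        llStartAnimation(ANIM);
    }
}

default
{
    attach(key id)
    {
        if (id != NULL_KEY)
            llRequestPermissions(id, PERMISSION_TRIGGER_ANIMATION);
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRIGGER_ANIMATION)
            restart();
    }

    changed(integer change)
    {
        // re-assert the loop after a teleport or region crossing
        if (change & (CHANGED_TELEPORT | CHANGED_REGION))
            restart();
    }
}
```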
  3. Hiya! I actually determined the cause of the joint offsets, and I'm sorry to say it was my fault all along! It turns out I majorly brainfarted and didn't assign any weights to the lower/upper teeth, thus breaking the chain of weighted bones on export. This explains why the weird offsets went away once I made sure the offset between jaw and teeth was the same as in the original skeleton. It appears I just forgot to do some sanity checks. As for the FBX files, I finally figured out why they come out looking different from the DAE files as well as from what Avastar provides. The FBX files aren't built from the default shape; they have various sliders changed, most notably arm length, shoulders, neck thickness, and head size, along with a few other minor tweaks. This explains the discrepancies I've been seeing between the FBX files and other skeletons.
  4. I'd like to bring up another issue I came across, something which does need addressing in an official form. The .fbx files provided on the official forums have demonstrable flaws, most notably stemming from mFaceTeethUpper and mFaceTeethLower, which appear to have some anomalous rotation applied to them, or which do not correctly respond to having offsets applied. My tests have been inconclusive beyond the fact that these bones do not behave as expected, and as such they create a lot of problems when trying to, say, construct an anthropomorphic head. The worst part is that these errors do not show in the import preview, necessitating an upload each time to verify the correctness of the rig. Edit: I found a further bit of confusing information. The Project Bento skeleton guide found here - http://wiki.secondlife.com/wiki/Project_Bento_Skeleton_Guide - claims that the tongue and lip bones are parented to the jaw, when in actuality they are parented not to mFaceJaw but to mFaceTeethLower. This is confusing, but leads me to believe that creating a joint offset for BOTH mFaceJaw and mFaceTeethUpper/Lower may be what creates the strange jaw behavior I am currently seeing. I would very much appreciate some official insight on what is supposed to be going on here. Further edit: With some more experimenting - and having SL display the skeleton for me - it does indeed appear that SL does not take kindly to attempts to offset mFaceTeethUpper and mFaceTeethLower. I will see whether leaving these at their original locations has any positive effect. Final edit & solution: After a solid day's worth of research and experimenting, I've come to the following conclusion: do NOT, not EVER, offset mFaceTeethLower or mFaceTeethUpper right now. It is perfectly fine for these bones to receive offsets via moving their parent bones, such as mFaceJaw.
But do NOT move these bones on their own; it will inevitably lead to very strange issues. As soon as I re-aligned mFaceJaw and mFaceTeethLower to their original positions, then moved ONLY mFaceJaw to its intended location - seeing as mFaceTeethLower is unrigged in my mesh - the problems disappeared. Likewise, the upper-lip deformation disappeared once I moved mFaceTeethUpper back to its original position and its child bones to their intended offset positions. This leads me to believe there is a bug in calculating offsets from mFaceTeethUpper/Lower, which may be connected to the wiki's statement falsely claiming that the lower lip and tongue bones are parented to mFaceJaw, which they very obviously aren't.
  5. What you might be looking for isn't the mSkull bone but the HEAD collision volume, which is what most mesh hair seems to respond to in order to follow head size. In that case, I'd suggest raising that volume bone and rigging some vertices to it so the change shows up when using joint offsets.
  6. The issue was solved as of a few minutes ago. It appears the blame does lie with the official .fbx files, which, as far as I can gather, contain wrong joint rotation data. I manually transferred my 3dsmax rig to a Blender Avastar rig, and it now imports as intended. Thus a plea to LL: please fix your official source files; many people depend on these being right.
  7. I've been working on a custom mesh avatar using Bento facial bones, among others. The head mesh was rigged in 3dsmax using one of the source .fbx files from the wiki page. It features custom bone offsets, and when uploading it looks perfectly fine in the upload preview window with skin weights and joint offsets enabled. However, once worn inworld, the lip bones as well as the tongue base/tip end up in the wrong positions, giving the whole setup a rather smashed-in look. I haven't edited any bone rotations; I simply moved the dummy objects into the new, desired positions. If anyone can offer any help on this subject, it'd be much appreciated, since I'm quite at a loss as to what may be wrong. The mesh on the right is my import; the one on the left shows it being worn. As you can see, the eyelid bones, forehead, etc. all look correct, but the mouth corner, upper and lower lip bones, and tongue bones look as if their root bone had been rotated in some strange way.
  8. Thanks, I did manage eventually, after much reading and cussing. Now the last thing to figure out is why the wireframe of my avatar mesh - the one imported from Collada - appears very much destroyed, i.e. full of holes. Only in wireframe, though; shaded view appears fine. I'll probably just attempt a re-import, since I now have a decent idea of how to get things done. Edit: And I managed it. A re-import got rid of the wireframe strangeness. I probably hit some magic buttons whilst experimenting.
  9. I have a somewhat related question, since I'm trying to apply some final tweaks to my model. Most of my workflow happens in max, but I'd like to use Avastar's slider features for a bit; is there a way to get a rigged mesh from 3dsmax bound to the Avastar skeleton? I can export per-vertex weights as .env in max, possibly more, so can I transfer said mesh over, bind it to the skeleton/armature, and load the weights back in somehow? And does this process work back and forth? Edit: After much fussing with how to export, I at least managed a Collada transfer, so I now have an Avastar rig and my own rig in the same file, lined up properly. I wanted to look up retargeting in the Blender manual, but while the entry exists, it is empty. Is there anywhere I can read up on how to retarget my mesh body to the Avastar skeleton whilst retaining the bone influences? Another edit: After even more fussing, I finally understood how to keep the mesh and vertex groups, remove the previous armature, and use the bind menu to apply the Avastar armature. My mesh now appears to be bound successfully; my issue should be resolved!
  10. Ooh, wonderful. Would retargeting from your female to your male skeleton then be more straightforward? If so, I can probably just wait and work on other stages, such as setting up some hand poses in Blender and making the alpha faces for the existing meshes. If I should follow a different workflow altogether, let me know and I'll be sure to give it a try - and once again, thanks; I can't begin to say how much your work is helping already and will help in the future.
  11. Hiya, once again it's me with a rigging issue. I've been using Polysail's excellent rig to get my female/androgynous bodies rigged, which - after I realized there is a limit of four bone influences per vertex - has yielded some very good results, and I have them ready for further processing. However, the male body is giving me some trouble. For this one I am using the latest Bento male skeleton file off the wiki page (called something akin to BentoMaleSkeletonWithPosOffsets). I initially remapped my androgynous weights to a male body on Polysail's rig, then took the male body mesh and retargeted it to the male skeleton, which had the joints appear in the right positions as far as I could see. But upon importing into SL, the male mesh bulks up when worn on the male shape, and I'm somewhat clueless where those extra deforms come from. I've successfully done male rigs before, and IIRC transferring weights from a female mesh to a male rig and retargeting was a valid process. But if I made an obvious mistake there, I'd much appreciate being pointed toward it, since I've been at it for a few long hours now and can't find the cause.
  12. I've put some more research into the hand issue I was seeing. Apparently, with your skeleton, I have to rotate R_HAND and L_HAND forward by 10 degrees, and then SL will display my hands as they are in 3dsmax. Also a quick question, since I've about finished my female rig and am about to begin the male one: is your skeleton universal in some way so I can rig both easily, or should I rig on a different male skeleton for now and retarget later on if/when you release a male skeleton specifically? Because IIRC they do use different bone lengths in some cases.
  13. Definitely looking forward to the changes. I have a question regarding the hands which I didn't see mentioned; apologies if it was answered before. When I did my hand rig, I used LEFT_HAND and RIGHT_HAND as the base for the palm, and the finger mBones for the fingers themselves. I am not setting any joint offsets. The rig behaves as expected within 3ds max, but when I import into SL, while the hand scales correctly, the fingers seem to be offset backward from the hand as far as I can see. I can provide screenshots if needed. Also, I am SO glad to finally be able to import my .dae files directly rather than having to paste in missing bone information. Feels good, man.
  14. Thanks very much for the info. I'm not looking to go /too/ complex yet - probably just preparing finger animations and, once I've got my avatar head done, a facial rig for an anthro head. So knowing that I can do simple rotation-based keying helps a lot, and any scale-based anims I'll just have to make at the original size, but I can work around that. Also, I very much appreciate the work you're doing. As someone with years of experience making SL content, though virtually none in animation and little in rigging, this setup and the prospect of having something akin to Avastar for max is amazing news for me, along with the possibility of anim export in the future.
  15. Thanks for the pointers. On previous avatars I worked at 10,000% scale on all bones, which SL ignored; it just made things a bit awkward. Scaling the root token and resizing for export is a good idea; I wasn't sure it was a workable solution, but I'm glad to know it is. On a final note, once animations can be exported, will the same workflow apply - i.e., scale the rig up, animate, then scale it down for export?
  16. I've looked at some of the rig examples, most recently the max 2012 bone rig, which, while very nice, has the same problem as all the other rigs I've tried to get into max: it's /tiny/. Could someone please help me out by letting me know how I can achieve a bigger working model - so I don't constantly run into the near clip plane hiding parts of the model - while still achieving a proper export? Or is the only way to get a correct export to work at the tiny scale and export that way? Whenever I try to upscale the rig, I just end up with a stickman body mesh. Once again, thanks very much in advance.
  17. Once again I ran into some issues I can't quite sort out. Using 3dsmax, I imported the latest skeleton files as per the wiki instructions. I set display units in max to meters and system units to 1 unit = 1 cm, which imported the rigged body mesh at the correct size in max. I then created an avatar mesh and brought it into 3dsmax. I verified that all its transforms were nulled: scale at 100%, no rotation or offsets. I bound the mesh to the skeleton, then exported the rigged mesh as Collada, trying both centimeters and meters for unit conversion. The mesh itself imports fine into SL and looks the correct size when rezzed inworld. However, once worn, the mesh becomes an assortment of noodles for limbs - the limbs, body, etc. are incredibly stick-thin. The problem is, I have no idea why. I thought I followed the guides to a tee, and all the units and transforms look good: both the bone dummies and the mesh itself are at 100% scale, the mesh has no offsets I can see, and the unit setup is as described. I know I fixed this problem in the past, I believe by cranking the bone scale up to 10,000%, but I wonder if there isn't another way that actually gives me a file where scaling isn't entirely broken just to get a legit result. Once again, any help is greatly appreciated. EDIT: Apparently the problems begin much earlier than that. Trying to import the Bento skeleton, I set the system/display units according to the wiki, but no matter what I do, the model ends up absolutely tiny, and no combination of system/display units changes anything about that. So please, what scaling do I need to use to /actually/ get a properly scaled skeleton?
  18. Thanks so very much, first of all! That was pretty much all the info I needed; I appreciate it a lot. There is so much good news in all that - it's very reassuring to me that LL is doing these new features right. As for the third question, I mostly build anthro avatars, so I was wondering whether I'll be able to have a jaw-joint bone I can use to animate the muzzle's jaw. Lip bones will come in handy for expressions, to be sure, as will pos/rot animations for bones. Also, good luck on the .anim exporter. As an avid max user (and sometimes loather), this will be a huge deal for my workflow, so you get all the kudos from me for tackling this.
  19. I do have a couple of questions pertaining to Bento. I would have searched the topic for details, but it appears the forum won't let me, and thinking 110+ pages might take a bit long to parse, I figured I'd ask directly.
- About the new bones, especially the hand/finger setup and facial rigging: are they going to respond to the hand/head collision bones, or will they ignore head/hand slider settings? Basically, what do I have to keep in mind if I want to do a fitted mesh rig that lets my customers customize my avatar shapes?
- Is the importer going to change at all? That is, will it finally be able to assume/fill in missing mBones, or will I still have to - as a 3dsmax user - edit the .dae file by hand and put in any undefined base-skeleton mBones so the importer realizes there is rigging data?
- What changes for the import process? Do I just need the regular 50-ish mBones of the SL skeleton defined, even if they have zero influence, or will I also have to define all the new Bento bones for the importer to identify my rig?
- Will we be able to animate bone positions and/or scaling, or just rotation? I am new to the idea of facial rigs, but making facial expressions using JUST bone rotation strikes me as an extremely complicated task.
- Will we have a proper animated talkjaw/lower-lip setup, or do we still have to abuse mSkull or other unused bones?
Also two somewhat unrelated questions:
- Will 3dsmax ever be able to import animations to SL via other formats, or will I have to resort to Blender to get my .anim or .bvh data inworld?
- There were talks of more than 8 faces on a mesh, but no follow-up. Has this project died, is it still alive and kicking, or is it potentially even implemented already?
Thank you very much for any helpful replies. I do look forward to the possibilities opened up by this project, and I very much want to see it succeed so that we creators can make even more impressive creations.
  20. Having just updated to max 2017 and preparing to launch a major avatar project, I'm curious about this. Is there ANY way for me to create animations in max and have them import into SL? I don't much mind using converters or third-party apps; I would just very much like to use max to create the actual animation rather than Blender, since converting a custom model and rig over to Blender has given me issues in the past, and with a more complex rig/animation goal that might quickly turn into a small nightmare.
  21. This is more of a materials-based question, but I hope I can resolve the problem I'm seeing. I have a product with various textures, each of which may be used either in a non-alpha state (alpha mode "none" in the diffuse settings) or in a half cut-off state using an alpha mask. To save space in my scripts, I use the same texture in both cases: the texture has an alpha channel, but I select alpha mode "none" when I don't need the alpha channel to have an effect. This works great in all cases except for glow. For some reason, even with alpha mode "none" enabled, the glow effect is affected by the alpha channel - the portions that the alpha channel would render invisible do not receive glow. Since my customers sometimes want a glow effect on the product, this is causing me quite some problems: my system was designed around this one-texture solution, as were the scripted appliers, and this particular oddity was only discovered after the fact. My question is this: is this intended behavior, or is this an SL bug that we might see fixed down the road? If it is intentional, I will potentially have to redesign parts of my work. To sum up: when using a texture with an alpha channel but selecting alpha mode "none", glow is still limited by the alpha channel, even though - in my opinion - it shouldn't be. Bug or intended behavior?
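To illustrate the setup described above (a sketch only, not the actual applier - the texture key, face index, and glow value are placeholders), the scripted side of applying one alpha-carrying texture in alpha mode "none" together with glow might look like:

```lsl
// Hypothetical applier sketch. TEXTURE is a placeholder; a real
// applier would use the product's texture UUID.
key TEXTURE = NULL_KEY;

apply(integer face)
{
    llSetPrimitiveParams([
        PRIM_TEXTURE, face, (string)TEXTURE, <1.0, 1.0, 0.0>, <0.0, 0.0, 0.0>, 0.0,
        // alpha mode "none": the texture's alpha channel should be
        // ignored for rendering (the issue above is that glow still
        // appears to honor it)
        PRIM_ALPHA_MODE, face, PRIM_ALPHA_MODE_NONE, 0,
        PRIM_GLOW, face, 0.1
    ]);
}
```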
  22. Essentially mild obfuscation. I have a script that puts out data to the user, and I simply want to obfuscate the message as an XOR-scrambled Base64 string. I'm not worried about people trying to break it so much as just giving the user a simple block of text to copy/paste. Another use is object-to-object communication, to obfuscate texture UUIDs. I've used llXorBase64StringsCorrect() for the latter in the past, in a texture-application system, but some customers ran into errors that I had trouble reproducing. I've started to wonder whether those errors occur because bits are being discarded during the conversion to byte arrays, and whether padding would prevent that.
  23. Hiya, first of all thanks for the reply. I already did read the implementation section, though, and while I admit I may be missing the forest for the trees here, I couldn't find a conclusive answer to the simple question: can this function lose the last few bits of a message - i.e., should I pad every message I send to guard against this possibility? Or will input always match output despite certain bits being discarded?
  24. I've made use of llXorBase64StringsCorrect() in the past to apply mild encryption to some messages. Security concerns aside, when reading up on this function again today - as well as on the new llXorBase64() - I saw a caveat mentioning this: "During the conversion to a byte array the last (bitcount % 8) bits are discarded from both str1 and str2." This has me wondering. The wiki doesn't clarify much beyond this point, but to me this reads as though data can be lost for certain string lengths, which might explain hard-to-trace bugs some of my customers have been seeing. The question now is: assuming str1 is my data and str2 is my "password", should I always pad str1 to avoid losing data from it upon decryption? Again, the wiki hints at discarded data but is very unclear about what this actually means for practical use, at least to my eyes. A better explanation would be very welcome so that I can improve my implementation if necessary :)
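On the padding question, here is the reasoning behind one conservative approach (my reading of the caveat, not confirmed behavior): each Base64 character carries 6 bits, so a string whose data length is a multiple of 4 characters converts to a whole number of bytes (4 × 6 = 24 bits = 3 bytes), leaving nothing in the (bitcount % 8) remainder. Padding the plaintext to a multiple of 3 bytes before encoding guarantees that. A hypothetical sketch:

```lsl
// Hypothetical padding sketch: pad the plaintext to a multiple of
// 3 bytes so its Base64 form is a whole number of 4-character groups
// (24-bit blocks), leaving no trailing (bitcount % 8) bits to discard.
// Assumes single-byte (ASCII) characters, since llStringLength counts
// characters rather than bytes. The trailing spaces would be trimmed
// off again after decryption.
string encode_padded(string msg, string key_b64)
{
    while (llStringLength(msg) % 3 != 0)
        msg += " ";
    return llXorBase64StringsCorrect(llStringToBase64(msg), key_b64);
}
```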
  25. So I've almost finished an avatar project, but I want to include a large-size option for my customers, since bears usually tend to be big and strong. To achieve that, I figured the easiest way is to create upsized body meshes (easy, and working fine) and a deformer that scales the skeleton, to avoid certain issues involving alpha scripting and meshes with joint offsets stored in them (glorious SL - something I had to figure out the hard way during this project). This second part is hard, though. My first attempt involved scaling up everything (body mesh and skeleton) to 150% size in 3dsmax. I then made a small box and rigged every bone to it with a small influence. This box was exported into SL with the joint position flag set in the importer, but it does nothing to influence the skeleton at all. The second attempt was to use Bone Tools in max to reset the scaling on all my bones - rather, the dummy objects acting as bones - which I believed should make it actually understand that the bones are 150% of normal size. Instead, when I now import, the deformer blows my avatar up to roughly 100 times its size rather than 1.5 times. I should mention that, for some reason, the skeleton I work with had to be scaled extensively, to about 10,000%, to fit the metric setup (meters for display and centimeters as the system unit, as advised by the wiki page). SL was fine with that, yet scaling it to essentially 15,000% did nothing to influence the deformer - while resetting the scale somehow makes it explode into a skeleton 100 times as large, despite all the bone position readouts showing sensible values. At this point I'm stumped. What I want to achieve seems very straightforward and simple, yet I have no idea what else I could possibly try. I have a nice skeleton with reset scaling, everything in its proper place, and yet SL utterly hates dealing with it and decides it needs to be way too big.
So my question is this: how DO I scale up a skeleton in 3dsmax the proper way, given the information I provided?