
Medhue Simoni

Advisor
  • Posts: 4,748

Everything posted by Medhue Simoni

  1. Qie Niangao wrote: I just fear this is turning out to be another case where the Linden developers are listening to the wrong resident creators, leading to the kind of disappointment that we got with Mesh, which initially had no attention paid to worn Mesh. I take much issue with this statement. We, the creators involved, have NO SAY. I was in both the mesh development group and the Bento group. Those issues had nothing at all to do with us. In the case of Bento, this very issue was brought up, and script commands to do exactly what you suggest were asked for. We were told that it was outside the scope of Bento. Please do not blame any of us for any of Bento's downsides. If it were up to us, this would have been done. That said, it's somewhat insulting to imply we ask for the wrong things, as we have gotten LL to take Bento way farther than any of us could have guessed it would go. This is a credit to LL for listening to us to the extent that they did. LL wasn't even going to give us bone translation. We begged and made our case, and LL listened and saw the possibilities. And there were many other groundbreaking features added because of us. You know, the meetings have been open to anyone, and if you had concerns then you should have come. There is another meeting on Thursday, at 1pm SL time. If you want to blame anyone for the downfalls of mesh, then blame all the people that did not participate, not those of us who did!
  2. Well, everything I've made so far has facial animation playing all the time. If head creators are smart, they will do the same. See, a face doesn't just sit still. Your face is moving all the time. Much like tails in SL, a face will be constantly playing an animation, if not a number of them. This means that, no matter what you are doing, the face will not be lifeless, or really should not be. For instance, my wolf is constantly switching from panting, to blinking, to licking, and whatnot. What I predict is that the user will control the face, not the furniture. As I said before, though, there are individuals working on a system to have facial animations trigger with furniture, by using a naming standard. I haven't really looked too much into it, though. (A rough script sketch of the always-on idle idea follows this post.)
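     A minimal sketch of that always-on idea, for illustration only: a small LSL attachment script that cycles idle facial animations on a timer. The animation names and the interval are placeholder assumptions, not taken from any actual head or avatar product.

        // Minimal LSL sketch (illustrative only): keep a face "alive" by cycling
        // idle animations. Animation names and timing are placeholder assumptions.
        list IDLE_ANIMS = ["wolf_pant", "wolf_blink", "wolf_lick"];
        integer index;

        default
        {
            attach(key id)
            {
                if (id != NULL_KEY)
                {
                    // Animation permission is granted automatically for attachments.
                    llRequestPermissions(id, PERMISSION_TRIGGER_ANIMATION);
                }
                else
                {
                    llSetTimerEvent(0.0); // detached, stop cycling
                }
            }

            run_time_permissions(integer perm)
            {
                if (perm & PERMISSION_TRIGGER_ANIMATION)
                    llSetTimerEvent(4.0); // swap the idle animation every few seconds
            }

            timer()
            {
                llStopAnimation(llList2String(IDLE_ANIMS, index));
                index = (index + 1) % llGetListLength(IDLE_ANIMS);
                llStartAnimation(llList2String(IDLE_ANIMS, index));
            }
        }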
  3. Qie Niangao wrote: Are the old facial animation states now being exposed to scripts somehow, for use in triggering new, Bento-compatible animations? Or is there some other way to link the huge inventory of scripted facial expressions to new Bento faces? I'm thinking of the huge installed base of scripted multi-pose furniture and AOs. Second Life simply won't survive long enough to get very far on updating all that stuff with Bento-specific facial animations--just never gonna happen--so unless there's some alternate path, Bento heads will mostly remain expressionless masks like their dumb-mesh predecessors. But maybe that's always been part of the project. I haven't been following Bento because it didn't seem to have any new scripting capabilities, so I don't know whether maybe this is a solved problem. Not that I know of. Some people are working on solutions for this, but nothing is really official. I don't agree with your assessment either. SL has survived this long, and there is nothing on the horizon that is seriously going to dismantle it. Bento is a rework of the complete skeleton, allowing creators to make a literally unlimited array of nicely animated avatars. Why would anyone want to limit these avatars to old, outdated systems just for compatibility? The old system has a limited range of expressions. Why stick to that?
  4. I'm perplexed by the images. What would cause one eye to be out of place, and one ear to be stretched up? Really, I'm stumped.
  5. Henri Beauchamp wrote: Whirly Fizzle wrote: Hmm there isn't a JIRA issue for that, unless it's a case of https://jira.secondlife.com/browse/BUG-37634 That bug is supposed to be fixed though. That bug seems to affect other bones than just the face bones, which is not the case here... Probably a different bug, entirely. Oh, and resetting the skeleton doesn't fix the issue either (it's an animation issue, and no, reloading the animations for the affected avatar is not a workaround for it either). How is it animated? What bones are they using? Did they use supported ways to do this? No offense to the creator, but if they just used some unsupported way to animate it, I'm not so sure LL should be fixing it. This is the risk a creator takes by using unsupported methods. This is exactly why I never ventured into the unsupported ways. It totally sucks for people who bought it, but the creator will likely want to redo the avatar rig anyway. The avatar will end up being many times better.
  6. Psistorm Ikura wrote: I have a somewhat related question, since I'm trying to apply some final tweaks to my model. Since most of my workflow happens in max, but I'd like to use avastar's slider features for a bit, is there a way to get a rigged mesh from 3dsmax bound to the avastar skeleton? I can export per-vertex weights as .env in max, possibly more, so can I then transfer said mesh over, then bind it to the skeleton/armature and load the weights back in somehow? Also, does this process work back and forth? Edit: After much fussing with how to export, I at least managed to get a collada transfer. So I now have an avastar rig and my own rig in the same file and lined up properly. I wanted to look up retargeting in the blender manual, but while the entry exists, it is empty. Is there anywhere I can read up on how to retarget my mesh body to the avastar skeleton, whilst retaining the bone influences? Another edit: After even more fussing, I finally understood how to keep the mesh & vertex groups, remove the previous armature, and use the bind menu to apply the avastar armature. It appears my mesh has now successfully been bound. My issue should be resolved! In theory, meaning I didn't test this, you should be able to bring in any rig/collada and convert the Avastar rig to it. There are a couple of ways to do this. One would be to use the convert tool in Avastar by selecting your rig and then an Avastar rig you just created, and then clicking Convert. The second way would be to Repair the rig using Avastar. Again, I've not tried any of this, so it's all just speculation. When I get done with all the avatars I want to do initially, I will be converting a number of rigs over. So, I will be testing all this soon, with completely custom rigs.
  7. So, uhm, I wrote a blog post about Bento. My SEO keeps telling me to write more blogs. Apparently they bring traffic. So, I did one. Warning! Half the blog post is promoting my big wolves, but it is my site, you know. lol http://www.medhueanimations.com/blogs/news/second-life-bento-and-large-wolves @Cathy Foil - Can you give me links to Mayastar? I want to add them to the blog. @Polysail - Same! Can you get me links to yours for 3Ds Max?
  8. Here's a work in progress. Almost done. It just needs to be cleaned up a bit. Humans might want to be wary of requests to animate coming from large wolves. https://i.gyazo.com/115ce7637b33096428760dcd3da8741b.gif
  9. Playing with shape sliders today on the wolf. Here are some images showing the range of extremes the wolf can achieve. The small shape simply adjusts all the sliders in a way that makes those parts smaller. The large shape was done in the opposite way.
  10.   So, I took the large wolf out to the London City bday bash. Here's him riding a motorcycle. lol 
  11. Vistanimations wrote: Awesome, i will try in avastar! Matrice, of the Machinimatrix team, said today that the retargeting system with the new bones still needs a little work, and it might be best to wait until the next version.
  12. Try putting the AO on the ground and then adding the animations to it. You still have to edit the notecard, though. And don't forget to take the AO back into your inventory when you are done adding the animations. You can edit the notecard while wearing the AO. (See the example notecard lines after this post.)
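     For illustration, a line in a typical ZHAO-style AO notecard looks something like the example below. This is only a generic, assumed example with placeholder animation names; the exact section names and syntax vary by AO, so check the sample notecard that ships with yours.

        [ Standing ]my_stand_1|my_stand_2|my_stand_3
        [ Walking ]my_walk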
  13. Vistanimations wrote: you have any news? Do you know any date? thanks In Avastar, all the bones have been added to the retargeting system in the last release. I don't know about Cathy's Mayastar yet.
  14. Not to beat a dead horse, but I try to look at it from the newb's point of view. Yeah, in SL, or in creative programs in general, there are many ways to do things. This is why the default way should be as newb-friendly as possible. The seasoned veteran can easily figure out how to get things the way they want them.
  15. I do completely understand the concerns here now, but at the same time I don't quite agree with the solutions. Even with your concerns, I still think my suggestion is better. It treats everything the same, and it's easy to understand. A simple solution to your and Whirly's concerns would be to create an Outfit with only those gestures that you use most often and like the most. Then you'd simply add it to your current outfit, if appropriate. (Edited) Scratch that, I forgot that we have to have certain parts of the avatar in the Outfits. That said, you could just put the gestures that you like in a folder, and then select the whole folder and add it to any outfit you want. Teager wrote: What you may want to consider instead, if you're concerned about people using your wolf talk gestures on another avatar that they're not meant to fit, is setting up empty gestures for /voicelevel1, /voicelevel2, and /voicelevel3 which do not play an animation themselves, but instead chat on a private channel "1", "2", or "3" to your AO to indicate that it should play a speech animation in your AO. That way, if the gesture is played with any other avatar worn, it will do nothing (or, worst case, will play a talk gesture appropriate to that avatar). If you need help setting up a script like this one, you could talk to Tapple Gao, who has one premade and open source. While I'm not totally against using scripts instead of gestures that do the same thing, as I had my coder do this for chat gesture triggers, I'm not sure it would be best for voice. I guess the question is which is better, meaning which triggers faster, as a real mouth moves incredibly fast and it is difficult to get it all working fast enough to keep up. I'd love to chat with Tapple about creating some Bento-specific scripts, though. (A rough sketch of that relay idea follows this post.)
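     For reference, here is a minimal sketch of the relay idea Teager describes: an empty gesture chats on a private channel, and a listener script in the AO or head plays the matching speech animation. This is only my own illustration of the approach, not Tapple Gao's script; whether the gesture signals by channel number or by message text is a detail of the particular setup, and the channels and animation names below are placeholder assumptions.

        // Minimal LSL sketch (illustrative only) of the gesture-to-AO relay.
        // Channels and animation names are placeholders, not from any real product.
        default
        {
            state_entry()
            {
                // For an attachment, animation permission is granted automatically.
                llRequestPermissions(llGetOwner(), PERMISSION_TRIGGER_ANIMATION);
                // Assume the /voicelevel gestures chat on private channels 1, 2 and 3.
                llListen(1, "", llGetOwner(), "");
                llListen(2, "", llGetOwner(), "");
                llListen(3, "", llGetOwner(), "");
            }

            listen(integer channel, string name, key id, string message)
            {
                if (channel == 1) llStartAnimation("talk_quiet");
                else if (channel == 2) llStartAnimation("talk_normal");
                else if (channel == 3) llStartAnimation("talk_loud");
            }
        }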
  16. Just commenting on the vast difference in our inventories. Of course, I am a special use case, but I probably have hundreds of gestures. Speech gestures alone require 16 of them to cover the full range of voice levels. Now multiply that by how many avatars I've made, and then add in the normal gestures I've made or acquired.
  17. Whirly Fizzle wrote: If this change is being considered, please can it be made optional? Maybe a tick box to include currently active gestures when you save an outfit. I'm not bothered if it's enabled by default, as long as I can turn it off. For me personally, automatically saving currently active gestures in an outfit is more of a pain than a help, but I understand Medhue's use case. My main concern is that people really don't ever know what gestures they are wearing, or often forget about them, because gestures are handled differently. I could easily see someone changing avatars and all of a sudden their face explodes. Currently, the only way to see all the gestures that are activated is to search for Activated. They do not show in the Worn option.
  18. After many tries, I've managed to scale my wolf to be Twilight size, and still have everything work, without having to redo it all.
  19. It's also worth noting that I have a number of video tutorials for animation in Blender, and I will likely make more specific to Bento.
  20. Teager wrote: But if saving your outfit saves all currently activated gestures to your outfit, then it will save ALL of your gestures, whether they're specific to that outfit or not. I'm not sure I see the problem with all this. If you are wearing gestures not made for your custom avatar, then saving your outfit exposes these gestures, and the user can simply deactivate the gesture and resave the outfit. Teager wrote: If you remove an outfit and wear a folder instead of another outfit, you'll have no active gestures. If you have gestures in that folder, and you rightclick/replace current avatar, then you'll have active gestures. Teager wrote: What might be more useful is just making it more apparent that gestures can be added to an outfit, by adding gestures to the "Add More" button at the bottom of the outfit editor. I'm presently on Firestorm, not the LL viewer, but I see... Sounds good to me also, on top of saving gestures when you save your outfit.
  21. Hi Tamara, For the most part, for human avatars, Bento doesn't change the basic bones, nor will it affect old animations. Bento does add new face bones, to animate the face, and it adds fingers to the basic human skeleton. That said, if you simply want to create human movements on human avatars, then Qavimator is still usable. Even if you wear a Bento human avatar, the normal body movements will still work, and you would simply add finger animations and facial animations. They don't all need to be in one animation. Now, if you want to make wings and wing animations, or tails and tail animations, and so on, then you need to use Blender, Maya, or 3ds Max. If you want to make a fully custom, non-human avatar, again, you need a more advanced program. I hope this helps.
  22. Vir Linden wrote: Hi Medhue, For good or ill, we have a whole lot of different ways to add things to outfits currently. What specific process would you like to be able to use for adding gestures? After finishing up 2 avatars today, I realized that putting my gestures in the avatar's folder, and then having the customer rightclick/replace current avatar on the folder, does Activate the gestures, and Deactivates them when you change to another avatar in the same way. Of course, the customers still have to add the gestures to the Outfits, but at least there is a workable way to get gestures activating and deactivating as they should. All that said, IMHO, the gestures should be added to an Outfit when you save the Outfit. Forgive me if I'm wrong here, but currently, if you save your Outfit, the system takes everything except gestures and adds it all to your Outfit. Yes, there are numerous ways to add things to an Outfit, but the easiest and most used is simply saving your Outfit.
  23. Vir Linden wrote: This isn't really a Bento topic, but it came up at yesterday's Bento user group meeting. Question was whether you could include gestures in outfits. The answer is that you can. What seems to work currently is: Drag a gesture into an outfit. This will create a link to the gesture in that outfit. Wear the outfit. The linked gesture will become activated. Wear some other outfit. The linked gesture will become deactivated. Dragging into the Current Outfit Folder does not appear to work. Although I love that we, as individual creators, can add gestures to our outfits, this does nothing to help the consumer. The process of adding gestures is in no way intuitive, or even remotely similar to creating an Outfit. Now, maybe if we, as creators, could give out Outfits with gestures in them, then this would be all good, but we can't do that. IMHO, gestures need to be added to Outfits the same way that ALL other things are added to Outfits.
  24. Psistorm Ikura wrote: I've looked at some of the rig examples, most recently the max 2012 bone rig, which while very nice, has the same problem as all the other rigs I tried to get into max. It's /tiny/. Could someone please help me out by letting me know how I can achieve a bigger working model, so I don't constantly run into the near clip plane hiding parts of the model, while still achieving a proper export? Or is the only way to get a correct export to work on the tiny tiny scale and subsequently export that way? Since whenever I try to upscale the rig, I just end up with a stickman body mesh. Once again, thanks very much in advance. Polysail, I think, is making a 3DS Max-compatible version.