
Fluffy Sharkfin

Resident
  • Posts: 1,107
  • Days Won: 1
Posts posted by Fluffy Sharkfin

  1. 1 hour ago, Klytyna said:

imagine the lag on a sim set to limit itself to 50 avatars, if it had 1200 animesh NPC bots scampering about...

    Other than being composed of polygons rigged to an animated skeleton, avatars and animesh objects have no similarity whatsoever. Trying to compare an animesh object to an avatar connected to a client, which is constantly sending data to (and receiving data from) the sim, is like comparing a car being driven by a human with the automatic barrier at a car park or a railway crossing.

    • Like 1
  2. 4 minutes ago, Qie Niangao said:

    So now I'm thinking, given enough animations, the jellyfish too might have moved "randomly enough", and even if that didn't completely offset the extra rendering, it could be a much easier bit of content to create. I mean, instead of all the offline math I did to figure out movement constraints and the scripting to make it happen, with an Animesh approach it could be just some animations to create and trivially script together. It seems that could be packaged into a pretty accessible toolkit for creators.

    I think it would help if there were more scripted control over the playback of animations, similar to the level of control we have with texture animation: the ability to choose custom start and end frames; control over the playback rate (preferably with interpolation to guarantee smooth motion, rather than simply changing the number of frames displayed per second); options like loop, reverse and ping-pong; and possibly even some control over the ease-in and ease-out settings for each animation, which would make seamless blending of animations possible.

    That would allow for the creation of more "random" movement without the need to upload a large number of animations. It would also make it a lot easier to create realistic moving animesh objects, since you wouldn't need separate animations to accommodate different movement speeds: you could control the speed at which wheels/limbs are animated based on the speed at which the object is travelling, making it much easier to avoid situations where objects appear to slide across the ground.

    • Like 1
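For comparison, this is roughly the level of scripted control we already have over texture animation. A minimal sketch of the speed-matching idea described above, using the existing texture animation system (the frame grid size, timer interval and scale factor are arbitrary values chosen for illustration):

```lsl
default
{
    state_entry()
    {
        // re-check the object's speed twice a second
        llSetTimerEvent(0.5);
    }

    timer()
    {
        float speed = llVecMag(llGetVel());
        // 4x4 frame grid, looping; the rate scales with speed so the
        // "wheels" animate faster the faster the object travels
        llSetTextureAnim(ANIM_ON | LOOP, ALL_SIDES, 4, 4, 0.0, 16.0, 0.5 * speed);
    }
}
```

Skeletal animations offer no equivalent of that rate parameter, which is the gap being pointed out here.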
  3. 3 hours ago, Qie Niangao said:

    A big LI would much constrain the applications, that's for sure. Similarly with Pathfinding, the minimum 15 LI imposed on characters seemed to discourage adoption for exactly those simple applications where that limited technology might be useful. But a Pathfinding character came with some hefty overhead, so maybe that LI was kinda necessary.

    And similarly a fairly big LI may be appropriate for the whole Animesh mechanism, which seems very heavyweight, based on animation of a full avatar skeleton. That's all fine for the examples we've seen (basically decorative NPCs), but massive overkill for many applications that would benefit from a simpler way to deform some simple mesh geometry.

    I'm thinking of a little script I wrote for a friend who wanted mesh jellyfish to swim randomly in a tank. Both the jellyfish and the motion were quite simple, the trick being to convincingly coordinate the simple animation with the semi-random path and speed of locomotion. The point is, they were like 2 LI per jellyfish, and a single script could run a bunch of them. I'd have loved a way to flex a simpler mesh rather than texture-animate through faces of a more complex one, but it would be crazy irresponsible to foist on viewers a full animated avatar skeleton rig for each little jellyfish. Fun little applications like this are simply never going to be appropriate for Animesh.

    And that's fine, but there's an important grey area: On balance, will pets and breedables - now nearly "jellyfish simple" - get laggier as they adopt Animesh? And so is a pretty big LI necessary to constrain that?

    I'm guessing that the potential for lag from animesh is linked more closely to the number of polygons being transformed during animation than to the number of bones being animated?

    If that's not the case, couldn't they examine each mesh object and take into account the number of bones it's rigged to (since it's now possible to upload mesh that includes only a partial rig), and use that number as a modifier when calculating the land impact for each mesh, rather than basing the calculation on triangle count alone? That way, simpler mesh objects rigged to only 3 or 4 bones would be less heavily penalized than those rigged to the full avatar skeleton.
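The modifier suggested above is purely hypothetical (this is not how LL actually calculates land impact), but the idea could be sketched as:

```lsl
// Hypothetical illustration only: scale a mesh's triangle-based cost
// by the fraction of the skeleton it's actually rigged to.
float boneWeightedLI(float triangle_cost, integer rigged_bones, integer skeleton_bones)
{
    return triangle_cost * ((float)rigged_bones / (float)skeleton_bones);
}
// A simple jellyfish rigged to 4 bones would then cost far less
// than the same mesh rigged to the full avatar skeleton.
```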

  4. On 10/5/2017 at 6:54 PM, Violaine Villota said:

    with silk velvet the light is reflected more from the fibers that are on the side or edge of whatever it's wrapping around rather than from the front.

    Unfortunately that's something SL materials simply can't reproduce; for that you'd need a custom shader (which SL doesn't have).

    As has been pointed out, you can fake the effect, but to do so you'd need to either find/create a custom shader in your 3D rendering app of choice and bake the lighting info into the diffuse texture, or hand-paint the lighting on. Since you'd then have static highlights and shadows on your texture, the effect is going to work a lot better on, say, a piece of furniture in an indoor environment with a fixed light source (i.e. where the object isn't moving and you can position it so the lighting on the textures matches the direction of any visible light sources) than it will on a piece of clothing that will be moving around and viewed in a variety of differently lit environments.

    • Like 1
  5. 7 minutes ago, carley Greymoon said:

    oh wow, Thank you so much Fluffy:) it's working now as separate scripts. one for each section. 3 sections each with their own buttons. which is just fine with me, it serves my purpose.

    but i don't understand how all 3 could work with one script. the user can input 3 UUIDs into one text box? i know you must be right, i just don't get how the script could know which UUID belongs to what section. lets say the torso texture UUID is 111-1111-1111-111 and the lower is 222-2222-222. how could the script know texture 111-1111-111 belongs to list torso = [42, 43, 44, 45, 46, 47, 48, 49, 53, 54, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134];?

    in any case i thank you profusely:) and everyone else as well.

    In the original script that I posted earlier the textures are meant to be sent in a certain order (head, torso, lower); the script simply assumes that's the order in which the three UUIDs have been sent and applies them to the corresponding bodyparts.

    As for the new script, since you're sending them one at a time you have to specify which bodypart the texture applies to and include that information in the message (i.e. "torso,111-1111-1111-111" or "lower,222-2222-222"). As I suggested earlier, you could use an llDialog command to open a menu with three buttons (head, torso, lower); when the user presses one, the script automatically opens the text input window to let them enter the texture UUID. That way they don't have to type in the name of the bodypart, they can just select it from the menu. But having separate buttons for each section works just as well (although you can do that with a single script rather than three separate ones, if you put the script in the root object and use llDetectedLinkNumber() in the touch event). ;)

     

    Anyway, I'm glad it's working for you now! :)
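A minimal sketch of the single-script button approach mentioned above, assuming three button prims whose link numbers happen to be 2, 3 and 4 (yours will differ; the script goes in the root object, and the channels are reused from the scripts elsewhere in this thread):

```lsl
string bodypart;    // remembered until the text box reply arrives

default
{
    state_entry()
    {
        llListen(-54321,"","","");
    }

    touch_start(integer total_number)
    {
        // one script in the root handles touches on every button prim
        integer link = llDetectedLinkNumber(0);
        if (link == 2)      bodypart = "head";
        else if (link == 3) bodypart = "torso";
        else if (link == 4) bodypart = "lower";
        llTextBox(llDetectedKey(0)," \n Enter a texture UUID for the " + bodypart,-54321);
    }

    listen(integer channel, string name, key id, string msg)
    {
        // forward "bodypart,UUID" to the receiver script
        llRegionSay(-12345, bodypart + "," + msg);
    }
}
```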

Okay, since you're sending texture UUIDs one at a time rather than all three at once, try this...

    Sender Script (based on the script you currently have)

    default
    {
        state_entry()
        {
            // listen for the reply from the text box
            llListen(-54321,"","","");
        }
    
        touch_start(integer total_number)
        {
            llTextBox(llDetectedKey(0)," \n Enter a bodypart and texture UUID",-54321);
        }
    
        listen(integer channel, string name, key id, string msg)
        {
            // relay the "bodypart,UUID" message to the receiver script
            llRegionSay(-12345,msg);
        }
    }

    Receiver Script (based on the script I posted earlier)

    list head = [1,2,3];    // link numbers for each section
    list torso = [4,5,6];
    list lower = [7,8,9];
    
    // apply the given texture to every link number in the list
    setTexture(list link_numbers, key texture)
    {
        integer i;
        for (i=0; i < llGetListLength(link_numbers); i++)
            llSetLinkPrimitiveParamsFast(llList2Integer(link_numbers,i), [PRIM_TEXTURE, ALL_SIDES, texture, <1,1,0>, <0,0,0>, 0.0]);
    }
    
    default
    {
        state_entry()
        {
            llListen(-12345,"","","");
        }
    
        listen(integer channel, string name, key id, string msg)
        {
            // only accept messages from objects with the same owner
            if (llGetOwnerKey(id) == llGetOwner()){
                list data = llCSV2List(msg);
                string bodypart = llList2String(data,0);
                key texture = llList2Key(data,1);
                if (bodypart == "head")
                    setTexture(head,texture);
                else if (bodypart == "torso")
                    setTexture(torso,texture);
                else if (bodypart == "lower")
                    setTexture(lower,texture);
            }
        }
    }

     

    In order for it to work you need to specify both the bodypart you wish to apply the texture to and the texture UUID, as comma-separated values.

    For example:

    Quote

    head,89556747-24cb-43ed-920b-47caed15465f

    will texture the head with the default plywood texture.

     

    It's not the most elegant way of selecting which bodypart to texture; personally I'd go with a HUD, or at least an llDialog menu, rather than having to type the bodypart name into the text box, but at least you can test it and see it working. :)
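A minimal sketch of the llDialog approach mentioned above, reusing the channels from the scripts earlier in this post (the menu picks the section, then a text box asks for the UUID):

```lsl
string bodypart;    // remembered between the menu click and the UUID entry

default
{
    state_entry()
    {
        llListen(-54321,"","","");
    }

    touch_start(integer total_number)
    {
        // step 1: pick a section from a menu instead of typing its name
        llDialog(llDetectedKey(0),"Which section?",["head","torso","lower"],-54321);
    }

    listen(integer channel, string name, key id, string msg)
    {
        if (msg == "head" || msg == "torso" || msg == "lower")
        {
            bodypart = msg;
            // step 2: ask for the texture UUID
            llTextBox(id," \n Enter a texture UUID",-54321);
        }
        else
        {
            // step 3: forward "bodypart,UUID" to the receiver script
            llRegionSay(-12345, bodypart + "," + msg);
        }
    }
}
```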

     

  7. 1 hour ago, Nova Convair said:

    llList2Key only works if the list has stored a key; it does not convert a string to a key

    From the wiki:

    Quote

    If the type of the element at index in src is not a key it is typecast to a key. If it cannot be typecast null string is returned.

     

    carley, the reason the script isn't working as you have it now is that it was written to work with three texture UUIDs; if you're passing a single UUID to it then you simply need to change

    setTexture(lower,(key)llList2String(textures,2));

    to

    setTexture(lower,(key)llList2String(textures,0));

     

    But that would defeat the purpose of separating the message into a list of UUIDs, since there's only one.

    The idea is to handle changing all the textures of the linkset using a single script rather than using multiple scripts for each texture. :)
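A quick way to check the wiki behaviour quoted above for yourself (using the plywood UUID from earlier in the thread; llCSV2List produces string elements, so both calls below should be handling the typecast case):

```lsl
default
{
    state_entry()
    {
        list textures = llCSV2List("head,89556747-24cb-43ed-920b-47caed15465f");
        // per the wiki excerpt, llList2Key typecasts a string element,
        // the same as explicitly casting the llList2String result
        key k1 = llList2Key(textures,1);
        key k2 = (key)llList2String(textures,1);
        llOwnerSay("llList2Key: " + (string)k1 + "\n(key)llList2String: " + (string)k2);
    }
}
```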

  8. 1 minute ago, Violaine Villota said:

    So are you saying that I would need to make a single mesh object that is rigged to the SL Avatar armature in Avastar that doesn't need to be attached to each wing bone, but rather gets rigged / skinned / binded to a single point via let's say, the spine or wing root bone? And then after importing the rigged mesh object and wearing it, I then attach each prim wing to the corresponding wing bone while in the Show Bones mode?
    If this works, you are a hero. A hero I tell you. 

    Pretty much, except I'm fairly sure you'll need at least one vertex of the rigged mesh weighted to each of the mWing1 bones, since if nothing is weighted to those bones I'm not sure the joint offsets for them would even be recorded in the .dae file (but I haven't tried it myself, so I may be wrong). Also, you won't be able to attach the prim wings to a specific bone; they'll attach to the wing attachment points. (I just sent you a couple of really simple examples in-world that you can wear to see exactly what I mean about the positioning and orientation of the prim wings in relation to the wing bones.)

  9. Just now, Violaine Villota said:

    Hmm, well I've done what you suggested, using mesh attachments that are rigged to the wing bones that I planned to make invisible, and I tried to attach the flexi prims to that.

    Sorry, I think you have the wrong idea. You don't attach the non-rigged wings to the rigged mesh; you wear the rigged mesh object, which fixes the mWing1 bone positions separately. Then you attach each non-rigged wing to the corresponding wing attachment point and rotate and reposition it to align with the bones of the wing, so that the pivot point for the wing matches the position of the origin of mWing1. (To do this, go to the Develop menu at the top of the screen and choose Avatar > Show Bones so you can see the wing bones of your avatar.)

    11 hours ago, Fluffy Sharkfin said:

    align your wing objects along the wing bones so that they lay along a straight line between the attachment point and the origin for the mWing1 bones.

    [Image: wing_bones2.JPG]

    Once you have the non-rigged wings positioned so they line up with the wing bones of your avatar, then as long as the wing animation you're using only rotates mWing1, the non-rigged wings should stay aligned.

  10. 4 minutes ago, Violaine Villota said:

    Ah! Okay I see what you mean now, yes that should work if I don't care about the wings not flexing at all, but I do like the idea of them having a little bit of the appearance of flex, which was the whole reason I was trying so hard to get prim attachments to work because  then making the wings flexi would take care of that and I could be lazy about rigging anything ;P

    Yes, if you were to use flexi-prims in your non-rigged attachments it would probably give the same appearance of "flexing" (depending on the orientation and settings you use for the flexi-prims) without the need for rigging. Honestly, I think "results may vary" will be kind of an understatement: I can imagine that if you get them just right they could look pretty cool, but I foresee a lot of tweaking and fiddling to get them to that point (it's been a really long time since I played with flexi-prims much, but I do remember them being kind of glitchy and annoying to work with).

     

    9 minutes ago, Violaine Villota said:

    Not rigging them would take care of the resizing issue however, at least I think it would? This would mean I would still have to bind the mesh to the armature in Avastar for the to attach to the right place anyway, correct? And then wouldn't that break the ability to resize inworld? 

    If you're going to use non-rigged attachments then you won't need to rig any part of the wings; at most you'll need a separate rigged mesh attachment (which can basically be made 100% transparent or hidden in some other way) to set the joint offsets. There are a couple of Bento tails that use the same system to adjust the length of the tail without having to include multiple sizes: you simply attach the appropriate "resizer" attachment for the tail length you want, and that offsets the bone positions to change the length of the rigged mesh tail you're wearing. That way you can supply something which repositions the mWing1 origin to the right place without having to include any rigged mesh in your otherwise non-rigged wings (it also means you can potentially provide different versions of the rigged mesh attachment to account for different avatar sizes).

     

    16 minutes ago, Violaine Villota said:

    I'm not sure what program you are using to show the animation, is that in Blender or in SL?

    It's Maya, but the same would apply in Blender/Avastar.

     

    17 minutes ago, Violaine Villota said:

     Also I thought it would be great if I could make a sort of 'wing adapter' so that any avatar could attach any wings they wanted to it and have the animations as long as they had edit and copy abilities on the wing panels. Alas, that's just not gonna happen. So it's back to rigged mesh or in-world animated flexi prims.

    Not entirely sure what you mean by a "wing adapter", but it sounds similar to the concept I described above of having a separate rigged mesh attachment to set the offsets of mWing1?

     

    19 minutes ago, Violaine Villota said:

    It's sounding like people like the Bento animations more though...

    It does seem to be the "big thing" at the moment, and even if it doesn't turn out the way you expected, chances are you'll learn some handy skills along the way. Best of luck! :)

  11. 11 minutes ago, ChinRey said:

    No, I was actually very conscious of the difference and very careful about which word to use here.

    I do of course realize that the question of what is convincing is a bit subjective but even so, take a look at the butterflies here: http://maps.secondlife.com/secondlife/Buttermere/27/240/35. Those are Arduenn Schwartzmann's and I think everybody will notice how much more convincing, realistic and credible they look than other SL butterflies. And that's even in a high-lag area where you can't really see them from their best side. The main reason for that is that they flap their wings faster than the usual "butterfly path" butterflies. (It's a shame Arduenn doesn't seem to sell his butterflies anymore btw. They're one of the true gems of Second Life and still among the very best SL animals of any kind ever.)

    Thanks, I'll definitely go and check them out next time I'm in-world! :)

    But if we're talking realism then surely we have to consider the issue of scale. The larger the wing area, the more air resistance, and therefore the more energy required to flap the wings; so if a humanoid had a pair of wings in scale with their body, the amount of food they would have to consume in order to flap their wings at insect speed while flying would be ridiculous. They'd probably have to spend 95% of their waking life eating just so they could fly once a day. I'll agree that faster wings look more realistic on small butterflies, insects and realistically sized birds, but if you want "realism" for fairy wing speed then there are other factors that should be taken into account besides what type of insect the wing looks like it came from. :P

    As you rightly pointed out, the question of what is convincing is subjective, especially when dealing with things that don't technically exist in reality. You can choose to go the full scientific route and use only physics and real-world examples to formulate a hypothesis of how a thing would work, or you can let your imagination run wild and just go with what you think looks best (personally, I've always admired those who are capable of combining both).

    • Like 1
  12. 1 hour ago, Violaine Villota said:

    I see no reason why the multiple bones couldn't be used to add a little flexing motion to the wings and have seen other's rigged mesh fairy wings that do so.

    Since you've abandoned the idea of adding flexi-prims to rigged mesh this is a little redundant, but just to clarify what I was trying to say: rigged mesh will have no problem using multiple wing bones, but if you were to add non-rigged attachments hoping to make them seem as if they align with the rigged wings and join to the avatar's torso at the point where wings would normally pivot, then changing the rotation of any bone past mWing1 will change this...

    [Image: wing1.gif]

    into this...

    [Image: wing2.gif]

    ... (or something equally odd-looking). As you can see from the second image, the alignment of the "fake pivot" on the non-rigged attachment is offset when you rotate mWing2, and the rotation on mWing1 then makes the part that would normally appear to be attached to the avatar move around.

    As I said, it's not all that relevant since you're no longer pursuing the idea of using non-rigged attachments in conjunction with rigged mesh, but since my previous attempt to explain the problem was perhaps a little vague I figured I'd provide an example.

    • Like 1
  13. 1 hour ago, ChinRey said:

    I'm afraid I have to disagree with you there. The way I see it, the lack of randomness is one of the major flaws in the SL avatar animation system.

    I see your point, and as far as the SL avatar animation system being flawed, I'd have to agree; I'm looking forward to the improvements they've been talking about. It would be nice if they added more control over the playback of animations, similar to the options we have for texture animations (loop/reverse/ping-pong and better control over where animations start and stop), maybe even some scripted control of the ease-in/ease-out parameters so we could "blend" animations together. But hey, we can all dream, I guess.

    As for your points about speed vs. framerate, etc., I agree to an extent, but it sounds like the word you're searching for is "realistic", not "convincing"; there are plenty of examples where realism doesn't work in cinematography, games, etc. (the classic example being how they have to stop filming snow scenes when it snows in RL, because real snow doesn't look real on screen). I guess my point is that realism isn't always what people strive for in Second Life; sometimes it's preferable to go with whatever is most aesthetically pleasing. In the example of fairy wings for avatars, you could argue that since humanoids are a lot larger than insects their wing speed would be reduced (after all, who can say what speed a fairy's wings would "realistically" beat at?), but now we're bordering on the realms of artistic license and personal preference, which is a lot more subjective, I guess.

    • Like 3
  14. 3 minutes ago, Lillith Hapmouche said:

    PS: I wouldn't call the problem in the thread with the GTX 970 remotely similar, especially since desktop and laptop are quite different platforms.

    Yeah, given the benchmark comparisons, the GTX 950M isn't even in the same ballpark, which may explain the slow frame rates in certain areas, especially those full of content and/or avatars. As Lillith pointed out, there's a world of difference between your average game level and the contents of the average SL sim, and if you don't have the jellydoll feature enabled then a few "well-dressed" avatars can bring your viewer to a grinding halt, even on a more capable system.

  15. 33 minutes ago, ChinRey said:

    There is no definite answer to that. It depends on so many factors.

    As for the main question, I don't really see any advantages and several disadvantages of using Bento for stiff insect wings and other attachments that only need to be rotated around a fixed attachment point. That's even regardless of the resizing issue.

    For flexible wings I really want Bento these days. If they can move fast enough that is. Bento or no Bento, bird and bat wings that only can flap once or twice a second just look silly.

    I guess the main benefit would be simulating smooth movement without running into problems in laggy areas; using Bento animations would probably be better for producing flapping motions where the wing speed increases and decreases gradually, like this...

    [Image: butterfly.gif]

    as well as for those random flex and flutter type motions which can add a little variety and life to wing animations.

    • Like 2
  16. 9 hours ago, Violaine Villota said:

    You would think so, but whenever I've tried, the prims attach far away from the body. Trying to edit them is a nightmare as every time I try to rotate it, it loses its position completely. Unless there's a special way to do it, but I've searched and tested and it just won't work for me. At least not with regular prims, do you mean with rigged mesh?

    That's because the attachment point for the wing is located near the end of the wing (on the mWing4 bones), whereas the pivot point you need for the wing is closer to mWing1 (on the back/shoulder blades).

    [Image: wing_bones.JPG]

     

    As long as your wing animation only rotates the mWing1 bones, you should be able to align your wing objects along the wing bones so that they lie along a straight line between the attachment point and the origin of the mWing1 bones. (That should be fine for insect-type wings, since they don't have multiple joints; a simple flapping motion using the mWing1 bone is all that's required.)

    [Image: wing_bones2.JPG]

     

    The problem is that the origins of the mWing1 bones are still "floating" outside the avatar mesh, rather than aligned with the surface of the back...

    [Image: wing_bones3.JPG]

    ...so you'll probably need to use a rigged mesh attachment with joint offsets to fix that, otherwise your wings will pivot in the wrong place when the flapping animation is playing.

    You'll definitely need wing animations that only rotate the mWing1 bone, though; any movement on the other mWing bones is going to make your attachments fly around in all sorts of crazy directions!

    • Like 1
  17. 46 minutes ago, carley Greymoon said:

    wow thanks for all the help and suggestions. i'm going to try both ways just so i can know more than one way of doing it. i had no idea you could automatically find link numbers by color.

    thanks to everyone for taking the time to help:)

    Well, it's not automatic, but you can do it with a simple script like this...

    default
    {
        state_entry()
        {
            integer i;
            vector color;
            list head = [];
            list torso = [];
            list lower = [];
            // check the color of each prim and add its link number to the matching list
            for (i=1; i<=llGetNumberOfPrims(); i++)
            {
                color = llList2Vector(llGetLinkPrimitiveParams(i,[PRIM_COLOR,0]),0);
                if (color == <1,0,0>)       // red = head
                    head = head + [i];
                else if (color == <0,1,0>)  // green = torso
                    torso = torso + [i];
                else if (color == <0,0,1>)  // blue = lower
                    lower = lower + [i];
            }
            // output the three lists pre-formatted for copy and paste
            llOwnerSay("\nlist head = ["+llList2CSV(head)+"];\nlist torso = ["+llList2CSV(torso)+"];\nlist lower = ["+llList2CSV(lower)+"];");
        }
    }

    ... just tint the different sections red, green and blue, then drop the above script in, and it will output the lists of link numbers for each section, complete with formatting, so you can copy and paste them in place of the first three lines of the other script.

  18. 12 minutes ago, Klytyna said:

    So, for say a mesh body that's 1 'prim' with 3 faces each with a different texture, your single script would have 3 uses of llSetLinkTexture, each applying a different image to a different 'face' of the 'prim'.

    Unfortunately that's not going to work if they want to apply different textures to different parts of the linkset, since we can't use lists of integers to specify link numbers in llSetLink* commands (which is a real shame).

    Personally I'd suggest using a for loop with llSetLinkPrimitiveParamsFast() and a list of link numbers for each part of the object (head, torso and lower). It may take marginally longer to set the texture for all the children of a linkset, but it's still preferable to having 90+ running scripts with listens. You can pass all three textures in a single message as comma-separated values and apply each texture to a different part of the linkset.

    list head = []; // link numbers for head
    list torso = []; // link numbers for torso
    list lower = []; // link numbers for lower
    
    setTexture(list link_numbers, key texture)
    {
      integer i;
      for (i=0; i < llGetListLength(link_numbers); i++)
        llSetLinkPrimitiveParamsFast(llList2Integer(link_numbers,i), [PRIM_TEXTURE, ALL_SIDES, texture, <1,1,0>, <0,0,0>, 0.0]);
    }
    
    default
    {
      state_entry()
      {
        llListen (-xxxxxx,"","","");
      }
    
      listen(integer channel, string name, key id, string msg)
      {
        if (llGetOwnerKey(id) == llGetOwner()){
          list textures = llCSV2List(msg);
          setTexture(head,llList2Key(textures,0));
          setTexture(torso,llList2Key(textures,1));
          setTexture(lower,llList2Key(textures,2));
        }
      }
    }

     

    Of course, you'd need to fill in all the link numbers for each list at the start of the script. A simple way to do that would be to rez the object, tint each of the sections a separate color, then drop in a script which checks the color of each prim in the linkset, compiles the link numbers into three lists based on those colors, and outputs those lists in chat. Then you can copy and paste them into the start of the script above.

  19. 4 hours ago, GunshotOwner said:

    But when I try to upload it to Second Life, it can't find data. All settings are zero.

    Unfortunately I think you may need to provide a little more information than this, perhaps a screenshot of the problem in the upload window, or at least a specific error message?

    As far as joining meshes together and rigging them as a single object goes, that's fine and shouldn't be causing you any problems.

    I don't use Blender myself, so I wouldn't be able to help much with any Blender/Avastar-related issues, but there are plenty of people here who do, so I'm sure one of them will be able to assist you. :)
