
Fluffy Sharkfin

Resident
  • Content Count

    279
  • Joined

  • Last visited

Everything posted by Fluffy Sharkfin

  1. According to the list of premium member benefits found here you receive the L$ 1,000 bonus 45 days after going premium.
  2. If you're looking for tutorials I'd recommend googling for terms like "low poly game hair tutorial" and "creating textures for hair cards", etc. The number of results in the videos section alone seems to range between 500,000 and 1.5 million depending on the search terms used, and at least the first few pages are usually YouTube video tutorials. It does seem that rather than using hair "cards" (flat 2D planes), a lot of creators in SL use tubes to give the illusion of volume from a wider range of angles, but the process of creating the textures will be pretty much identical regardless of which geometry you choose.
  3. Alternatively, if you really want to start by making clothing for something other than the default avatar, you can download the dev kits for the Avatar 2.0 and Kemono mesh bodies here, no application required.
  4. @Yasojith If you're trying to achieve a smooth transition (i.e. a pulsing effect) and are only using a blank texture on the object rather than a specific image, then I would definitely recommend exploring the llSetTextureAnim approach rather than attempting to do it all via script. Texture animation is a client-side effect, so it's far more resource friendly and also won't suffer from any slowdown on laggy sims where lots of other scripts are eating up CPU time. Additionally, since the intensity of the glow is determined by the value of the pixels in the alpha channel of a texture, you may find it simpler to set up more complex transitions and effects in a bitmap editor using gradients. For example:

         default
         {
             state_entry()
             {
                 llSetLinkPrimitiveParamsFast(LINK_THIS, [
                     PRIM_ALPHA_MODE, ALL_SIDES, PRIM_ALPHA_MODE_EMISSIVE, 0,
                     PRIM_GLOW, ALL_SIDES, 0.2,
                     PRIM_TEXTURE, ALL_SIDES, "5c2f0866-56b7-8ea3-00cf-cca31a07a220", <1,1,0>, <0,0,0>, 0.0]);
                 llSetTextureAnim(ANIM_ON | LOOP, ALL_SIDES, 128, 1, 0, 128, 32);
             }
         }

     uses a 128 x 32 plain white texture with an alpha channel like this to produce a steady transition from 0% to 100% and back to 0% (the actual amount of glow is determined by the settings for the prim, i.e. PRIM_GLOW, ALL_SIDES, 0.2; however, emissive mask mode affects the Full Bright value of the prim, so it will appear slightly brighter than using glow alone).

     Changing the PRIM_TEXTURE settings to

         PRIM_TEXTURE, ALL_SIDES, "79de213f-bc72-748f-d6e1-285e58aff1a4", <1,1,0>, <0,0,0>, 0.0

     would use a texture with an alpha channel like this, which results in a faster transition and a shorter pulse with a longer pause in between.

     Similarly, changing the PRIM_TEXTURE settings to

         PRIM_TEXTURE, ALL_SIDES, "2341f9d9-b7a3-e18f-c294-d2c24106de7f", <1,1,0>, <0,0,0>, 0.0

     uses a texture with an alpha channel like this, which gives a 2 second pause followed by an instant transition to max glow and then a 2 second fade back to zero.

     And finally, changing the PRIM_TEXTURE settings to

         PRIM_TEXTURE, ALL_SIDES, "1be86627-0563-ad17-0ed5-5006176d231b", <1,1,0>, <0,0,0>, 0.0

     uses a texture with this alpha channel to produce 3 short half-second pulses at 50% glow (50% grey), followed by a 1 second pulse at 100% glow (white) with a slow fade in and a faster fade out.

     As you can see from the above examples, you can easily control the timing and intensity of the pulses simply by adjusting the gradient you use in the alpha channel of the texture, and since the script only runs once and then stops it uses no additional CPU time (in fact, since texture animation is treated as a property of the prim, you can even remove the script once it's been run and the texture will continue animating regardless; see the one-shot sketch after this list).
  5. Or how about applying a small blank texture with this for an alpha channel... then setting the Alpha mode to Emissive mask and the Glow to a value of 0.20, and then all you'd need in the script would be...

         default
         {
             state_entry()
             {
                 llSetTextureAnim(ANIM_ON | LOOP, ALL_SIDES, 2, 1, 0, 2, 0.2);
             }
         }

     And yes, always always always use brackets!
  6. Yes, Innula and Taff are right about using brackets, I tend to skip them for single commands in if statements because I'm lazy and reckless!
  7. You're welcome! And, since I can't seem to sleep right now, here's an even shorter version without the need for an if statement or any variables aside from the integer flag. You just need to cast the flag from an integer to a float and divide by 5 to give you a value of either 0.2 or 0.0, depending on whether the flag is TRUE or FALSE (1 or 0), and then use that as the glow value in the llSetLinkPrimitiveParamsFast command.

         integer glow = FALSE;

         default
         {
             state_entry()
             {
                 llSetTimerEvent(5.0);
             }
             timer()
             {
                 glow = !glow;
                 llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_GLOW, ALL_SIDES, (float)glow / 5]);
             }
         }
  8. Actually that part of the script is fine; if you put the llSetLinkPrimitiveParamsFast command inside the brackets then it will only ever execute when glowVal = 0.2, so the glow won't turn on and off. The real problem is that llSetTimerEvent doesn't return a value, so integer tFace = llSetTimerEvent(5.0); won't work. tFace needs to be defined either as a global variable at the start of the script, i.e. before default, or as a local variable in the event or function in which it's to be used, in this case the timer event (you can specify a particular face number or use the constant ALL_SIDES to have it change all the faces at once). The llSetTimerEvent command should be on a separate line. Also, you could get rid of the llGetLinkPrimitiveParams command and simply use an integer with a boolean value as a flag instead. So basically your script would look something like this:

         integer tFace = ALL_SIDES;
         integer glow = FALSE;

         default
         {
             state_entry()
             {
                 llSetTimerEvent(5.0);
             }
             timer()
             {
                 float glowVal;
                 if (!glow)
                     glowVal = 0.2;
                 else
                     glowVal = 0.0;
                 llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_GLOW, tFace, glowVal]);
                 glow = !glow;
             }
         }
  9. Mesh/Exporting a mesh from Maya
  10. Other than being comprised of polygons rigged to an animated skeleton, avatars and animesh objects have no similarity whatsoever. Trying to compare an animesh object to an avatar connected to a client which is constantly sending data to (and receiving data from) the sim is like comparing a car being driven by a human with the automatic barrier at a car park or a railway crossing.
  11. I think it would help if there were more scripted control over the playback of animations, something similar to the level of control we have with texture animation: the ability to choose custom start and end frames, set the playback rate (preferably with interpolation to guarantee smooth motion rather than simply changing the number of frames displayed per second), options like loop, reverse and ping-pong, and possibly even some control over the ease-in and ease-out settings for each animation, which would make seamless blending of animations possible. That would allow for the creation of more "random" movement without the need to upload a large number of animations. It would also make it a lot easier to create more realistic moving animesh objects, since you wouldn't need separate animations to accommodate different movement speeds; you could control the speed at which wheels/limbs are animated based on the speed at which the object is travelling, making it a lot easier to avoid situations where objects appear to slide across the ground, etc.
  12. I'm guessing that the potential for lag from animesh is linked more closely to the number of polygons being transformed during animation than to the number of bones being animated? If that's not the case, then couldn't they examine each mesh object and take into account the number of bones that a mesh is rigged to (since it's now possible to upload mesh that includes only a partial rig), and use that number as a modifier when calculating the land impact for each mesh rather than basing the calculation on triangle count alone, so that simpler mesh objects rigged to only 3 or 4 bones are less heavily penalized than those that are rigged to the full avatar skeleton?
  13. If you mean can you upload mesh rigged to custom skeletons then the answer is no. You have to use the SL avatar rig to create your animesh objects.
  14. Unfortunately that's something that SL materials simply can't reproduce; for that you'd need a custom shader (which SL doesn't have). As has been pointed out, you can fake the effect, but in order to do so you'd need to either find/create a custom shader in your 3D rendering app of choice and bake the lighting info into the diffuse texture, or hand paint the lighting on. Since you'd then have static highlights and shadows on your texture, the effect is going to work a lot better on, say, a piece of furniture in an indoor environment with a fixed light source (i.e. where the object isn't moving and you can position it so that the lighting on the textures matches the direction of light from any visible light sources) than it will on a piece of clothing that will be moving around and viewed in a variety of differently lit environments.
  15. In the original script that I posted earlier the textures are meant to be sent in a certain order (head, torso, lower); the script simply assumes that's the order in which the three UUIDs have been sent and applies them to the corresponding bodyparts. As for the new script, since you're sending them one at a time you have to specify which bodypart the texture applies to and include that information in the message (i.e. "torso,111-1111-1111-111" or "lower,222-2222-222"). As I suggested earlier, you could include an llDialog command to open a menu with three buttons (head, torso, lower), then when the user presses one the script automatically opens the text input window to allow them to input the texture UUID (see the menu sketch after this list). That way they don't have to type in the name of the bodypart, they can just select it from the menu. Having separate buttons for each section works just as well, although you can also do that using a single script rather than three separate ones if you put the script in the root object and use llDetectedLinkNumber() in the touch event (see the single-script sketch after this list). Anyway, I'm glad it's working for you now!
  16. Okay, since you're sending texture UUIDs one at a time rather than all three at once, try this...

      Sender Script (based on the script you currently have):

          default
          {
              state_entry()
              {
                  llListen(-54321, "", "", "");
              }
              touch_start(integer total_number)
              {
                  llTextBox(llDetectedKey(0), " \n Enter a bodypart and texture UUID", -54321);
              }
              listen(integer channel, string name, key id, string msg)
              {
                  llRegionSay(-12345, msg);
              }
          }

      Receiver Script (based on the script I posted earlier):

          list head = [1,2,3];
          list torso = [4,5,6];
          list lower = [7,8,9];

          setTexture(list link_numbers, key texture)
          {
              integer i;
              for (i = 0; i < llGetListLength(link_numbers); i++)
                  llSetLinkPrimitiveParamsFast(llList2Integer(link_numbers, i),
                      [PRIM_TEXTURE, ALL_SIDES, texture, <1,1,0>, <0,0,0>, 0.0]);
          }

          default
          {
              state_entry()
              {
                  llListen(-12345, "", "", "");
              }
              listen(integer channel, string name, key id, string msg)
              {
                  if (llGetOwnerKey(id) == llGetOwner())
                  {
                      list data = llCSV2List(msg);
                      string bodypart = llList2String(data, 0);
                      key texture = llList2Key(data, 1);
                      if (bodypart == "head") setTexture(head, texture);
                      else if (bodypart == "torso") setTexture(torso, texture);
                      else if (bodypart == "lower") setTexture(lower, texture);
                  }
              }
          }

      In order for it to work you need to specify both which bodypart you wish to apply the texture to and the texture UUID as a CSV. For example, entering "head" followed by a comma and the UUID of the default plywood texture will texture the head with the default plywood texture. It's not the most elegant solution to selecting which bodypart to texture; personally I'd go with a HUD or at least an llDialog menu rather than having to type the bodypart name into the text box (see the sketches after this list), but at least you can test it and see it working.
  17. carley, the reason the script isn't working as you have it now is that it was written to work with three texture UUIDs; if you're passing a single UUID to it then you simply need to change setTexture(lower,(key)llList2String(textures,2)); to setTexture(lower,(key)llList2String(textures,0)); But that would defeat the purpose of separating the message into a list of UUIDs, since there's only one. The idea is to handle changing all the textures of the linkset using a single script rather than using multiple scripts for each texture.
  18. Pretty much, except I'm fairly sure you'll need at least one vertex of the rigged mesh weighted to each of the mWing1 bones, since if nothing is weighted to those bones I'm not sure if the joint offsets for them would even be recorded in the .dae file (but I haven't tried it myself so I may be wrong). Also you won't be able to attach the prim wings to a specific bone, they'll attach to the wing attachment points (I just sent you a couple of really simple examples in-world that you can wear to see exactly what I mean about the positioning and orientation of the prim wings in relation to the wing bones).
  19. Sorry, I think you have the wrong idea; you don't attach the non-rigged wings to the rigged mesh. You wear the rigged mesh object, which fixes the mWing1 bone positions separately, then you attach each non-rigged wing to the corresponding wing attachment point and rotate and reposition it to align with the bones of the wing, so that the pivot point for the wing matches the position of the origin for mWing1 (in order to do this, go to the Develop menu at the top of the screen and choose Avatar > Show Bones so you can see the wing bones of your avatar). Once you have the non-rigged wings positioned so they line up with the wing bones of your avatar, then as long as the wing animation you're using is only rotating mWing1, the non-rigged wings should stay aligned.
  20. Yes, if you were to use flexi-prims in your non-rigged attachments it would probably give the same appearance of "flexing" (depending on the orientation and settings you use for the flexi-prims) without the need for rigging. Honestly, I think that "results may vary" will be kind of an understatement; I can imagine how, if you get them just right, they could look pretty cool, but I foresee a lot of tweaking and fiddling to get them to that point (it's been a really long time since I played with flexi-prims much, but I do remember them being kind of glitchy and annoying to work with).

      If you're going to use non-rigged attachments then you won't need to rig any part of the wings; at most you'll need a separate rigged mesh attachment (which can basically be made 100% transparent or hidden in some other way) to set the joint offsets. There are a couple of Bento tails that use the same system to adjust the length of the tail without having to include multiple sizes: you simply attach the appropriate "resizer" attachment for the tail length you want and that offsets the bone positions to change the length of the rigged mesh tail you're wearing. That way you can supply something which repositions the mWing1 origin to the right place while not having to include any rigged mesh in your otherwise non-rigged wings (it also means you can potentially provide different versions of the rigged mesh attachment to account for different avatar sizes).

      It's Maya, but the same would apply in Blender/Avastar.

      Not entirely sure what you mean by a "wing adapter", but it sounds similar to the concept I described above of having a separate rigged mesh attachment to set the offsets of mWing1?

      It does seem to be the "big thing" at the moment, and even if it doesn't turn out the way you expected, chances are you'll learn some handy relevant skills along the way. Best of luck!
  21. Thanks, I'll definitely go and check them out next time I'm in-world! But if we're talking realism then surely we have to consider the issue of scale. The larger the wing area, the more air resistance and therefore the more energy is required to flap the wings, so if a humanoid had a pair of wings that were in scale with their body then the amount of food they would have to consume in order to flap their wings at the same speed as an insect while flying would be ridiculous; they'd probably have to spend 95% of their waking life eating just so they can fly once a day. I'll agree that faster wings look more realistic on small butterflies and insects and realistically sized birds, but if you want "realism" for fairy wing speed then there are other factors that should be taken into account besides what type of insect the wing looks like it came from. As you rightly pointed out, the question of what is convincing is subjective, especially when dealing with things that don't technically exist in reality. You can choose to go the full scientific route and use only physics and real world examples to formulate a hypothesis of how a thing will work, or you can let your imagination run wild and just go with what you think looks best (personally I've always admired those who are capable of combining both).
  22. Since you've abandoned the idea of adding flexi-prims to rigged mesh this is a little redundant, but just to clarify what I was trying to say: rigged mesh will have no problem using multiple wing bones, but were you to try and add non-rigged attachments hoping to make them seem as if they align with the rigged wings and join to the avatar's torso at the point where wings would normally pivot, then changing the rotation of any bone past mWing1 will change this... into this... (or something equally odd-looking). As you can see from the second image, the alignment of the "fake pivot" on the non-rigged attachment is offset when you rotate mWing2, and the rotation on mWing1 then makes the part that would normally appear as if it were attached to the avatar move around. As I said, it's not all that relevant since you're no longer pursuing the idea of using non-rigged attachments in conjunction with rigged mesh, but since my previous attempt to explain the problem was perhaps a little vague I figured I'd provide an example.
  23. I see your point, and as far as the SL avatar animation system being flawed, I'd have to agree, and I'm looking forward to the new improvements they've been talking about. It would be nice if they added more control over the playback of animations, similar to the options we have for texture animations with loop/reverse/ping-pong, and better control over where animations start and stop, maybe even some scripted control of the ease-in/ease-out parameters so we can "blend" animations together, but hey, we can all dream I guess. As for your points about speed vs framerate, etc., I agree to an extent, but it sounds like the word you're searching for is "realistic" not "convincing"; there are plenty of examples where realism doesn't work in cinematography, games, etc. (the classic example being how they have to stop filming snow scenes when it snows in RL because real snow doesn't look real on screen). I guess my point is that realism isn't always what people strive for in Second Life; sometimes it's preferable to go with whatever is the most aesthetically pleasing. In the example of fairy wings for avatars, you could argue that as humanoids are a lot larger than insects their wing speed would be reduced (after all, who can say what speed a fairy's wings would "realistically" beat at?), but now we're bordering on the realms of artistic license and personal preference, which is a lot more subjective I guess.
  24. Yeah, given the benchmark comparisons the GTX 950M isn't even in the same ballpark. Which may explain the slow frame rates in certain areas, especially if those areas are full of content and/or avatars. As Lillith pointed out there's a world of difference between your average game level and the contents of the average SL sim, and if you don't have the jellydoll feature enabled then a few "well-dressed" avatars can bring your viewer to a grinding halt, even on a more capable system.
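A minimal one-shot sketch expanding on post 4 above, reusing the example texture UUID, glow value and frame counts from that post (all of which you'd tune to taste): it sets up the emissive-mask glow, starts the client-side texture animation, and then removes itself, since the animation persists as a property of the prim once it has been started.

    // One-shot sketch (assumptions: the 128 x 32 example texture from post 4 and a glow of 0.2).
    // The animation keeps playing after the script deletes itself, because llSetTextureAnim
    // stores the animation as a property of the prim rather than running in the script.
    default
    {
        state_entry()
        {
            llSetLinkPrimitiveParamsFast(LINK_THIS, [
                PRIM_ALPHA_MODE, ALL_SIDES, PRIM_ALPHA_MODE_EMISSIVE, 0,
                PRIM_GLOW, ALL_SIDES, 0.2,
                PRIM_TEXTURE, ALL_SIDES, "5c2f0866-56b7-8ea3-00cf-cca31a07a220", <1,1,0>, <0,0,0>, 0.0]);
            llSetTextureAnim(ANIM_ON | LOOP, ALL_SIDES, 128, 1, 0, 128, 32);
            llRemoveInventory(llGetScriptName()); // optional: the animation continues without the script
        }
    }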
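A minimal menu-driven sender sketch for the llDialog approach suggested in post 15, assuming the same -54321 dialog/textbox channel and -12345 region channel used in post 16, and assuming the receiver from post 16 is listening for a "bodypart,UUID" message; the button labels and variable names are just illustrative.

    // Menu sender sketch: pick a bodypart from a dialog, then enter the UUID in a text box.
    string gBodypart;   // bodypart chosen from the dialog menu

    default
    {
        state_entry()
        {
            llListen(-54321, "", "", "");
        }
        touch_start(integer total_number)
        {
            // let the toucher pick a bodypart first...
            llDialog(llDetectedKey(0), "\nChoose a bodypart to texture", ["head", "torso", "lower"], -54321);
        }
        listen(integer channel, string name, key id, string msg)
        {
            if (msg == "head" || msg == "torso" || msg == "lower")
            {
                // ...then ask for the texture UUID via a text box
                gBodypart = msg;
                llTextBox(id, "\nEnter the texture UUID for the " + gBodypart, -54321);
            }
            else
            {
                // anything else is assumed to be the UUID; forward it as "bodypart,UUID"
                llRegionSay(-12345, gBodypart + "," + msg);
            }
        }
    }

The receiver script from post 16 should work with this unchanged, since the message sent is the same CSV format.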
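And a single-script sketch of the llDetectedLinkNumber() idea mentioned in posts 15 and 16, assuming the script sits in the root of a linkset whose button prims are named "head", "torso" and "lower"; those prim names, and the channels, are assumptions carried over from post 16 rather than anything prescribed there.

    // Single-script button sketch: the touched child prim's name selects the bodypart.
    string gBodypart;

    default
    {
        state_entry()
        {
            llListen(-54321, "", "", "");
        }
        touch_start(integer total_number)
        {
            // which linked prim was touched determines the bodypart
            integer link = llDetectedLinkNumber(0);
            gBodypart = llGetLinkName(link);   // assumes the button prims are named after the bodyparts
            if (gBodypart == "head" || gBodypart == "torso" || gBodypart == "lower")
                llTextBox(llDetectedKey(0), "\nEnter the texture UUID for the " + gBodypart, -54321);
        }
        listen(integer channel, string name, key id, string msg)
        {
            // forward the chosen bodypart and the entered UUID as a CSV to the receiver from post 16
            llRegionSay(-12345, gBodypart + "," + msg);
        }
    }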