
JackRipper666

Resident
  • Posts

    162
  • Joined

  • Last visited

Everything posted by JackRipper666

  1. Thanks for the help guys! What I am trying to do is build a mesh in Maya. So far I have the head/neck area done, but as I was working on it I noticed that in SL most meshes or models seem to be broken down into parts, so I thought that in order to animate a mesh you need to separate it? I know it needs to be rigged no matter what to animate. I was just not sure if it can be rigged as a whole single mesh, because it seems like it would lack the ability to move, say, the head around on the neck. I saw an avatar that I could tell was created in an outside program, and the head was placed on top of the neck in a ball-joint arrangement. I'd rather keep my model all in one mesh without adding more geometry in parts. The reason I said "prim" is that I thought that's how you refer to an object/geometry in SL. It would save me time if I can just make a whole mesh inside Maya and wouldn't have to break it up into parts to move it correctly. Or would it just not be as movable then, e.g. the head might not be able to move on the neck if it's all one mesh? That's basically what I am getting at. I'm just worried I would not be able to use it unless I somehow provided points of movement where it's cut in areas that could be movable. If that makes sense lol.
  2. Hello, I was just curious about the correct way to build a full set of body parts to complete a mesh I am working on. Do avatars in SL have to be broken down? For example, the leg: a quad/hamstring prim, then a knee-to-calf prim, then an ankle-to-foot prim, like three prims per leg? Just curious how to make it more practical to animate and workable for SL. Any help would be really appreciated! =)
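For what it's worth, a single rigged mesh can bend at joints without being cut into parts: each vertex is bound to one or more skeleton bones with blend weights, so the head region simply carries high weights for the head bone and bends with it. In linear blend skinning terms (a standard formula, not specific to SL):

```latex
v' = \sum_{i} w_i \, M_i \, v, \qquad \sum_{i} w_i = 1
```

where \(v\) is the rest-pose vertex position, \(M_i\) is the current transform of bone \(i\), and \(w_i\) is the painted skin weight binding the vertex to that bone.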
  3. Ohh ok thank you! I'll have to change that now lol!
  4. Oh ok, I'll have to try this out soon. I did finally get one of the wiki examples to work as well, the one for the x and y coords of where I touch the HUD. I'll try to fix that part of the code without messing up what I already put together lol. Hmm... one problem I am having though is that only my gesture can be heard, but not the laugh that is inside the HUD. I don't know why that is, because as you can see I did set it in the code to 0 for public chat. So I'm puzzled as to why it's not playing in-world and only plays to me when I click it. If I just use the gesture, though, it works. I might just use that to keep my HUD area a little less cluttered. But I'd hate to waste so much time if I can't resolve what should be a simple issue.
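One likely explanation (an assumption here, not confirmed in the thread) is that sounds played from a HUD attachment are audible only to the wearer, regardless of the chat channel setting. A common workaround is to have the HUD signal a regular worn (non-HUD) attachment on a private channel and let that attachment play the sound in-world. A minimal sketch, with a hypothetical channel number:

```lsl
// --- Script A: goes in the HUD prim ---
integer CHANNEL = -482711; // hypothetical private channel; pick any unused negative number

default
{
    touch_start(integer num)
    {
        llSay(CHANNEL, "play_laugh"); // tell the worn attachment to play the sound
    }
}

// --- Script B: goes in a regular worn attachment (not a HUD) ---
// integer CHANNEL = -482711;
// default
// {
//     state_entry()
//     {
//         llListen(CHANNEL, "", NULL_KEY, "play_laugh");
//     }
//     listen(integer chan, string name, key id, string msg)
//     {
//         // plays the first sound in this prim's inventory, audible to everyone nearby
//         llTriggerSound(llGetInventoryName(INVENTORY_SOUND, 0), 1.0);
//     }
// }
```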
  5. Nice, I am going to take a look at it. I actually finally wrote a script. Well, I feel like I'm piecing together what I understand; I haven't fully memorized how to write this all up, which I guess to me is really being a scripter lol! But it's a start. This is my example of what I was trying to do. My only issue so far is that I have to figure out how to learn llDetectedUV. So far I found on the wiki the code that tells me where I am clicking, but I don't understand how to use the reported information in local chat. Maybe I can find out how it works at that College of Scripting in-world. Also I need to maybe add one more animation. I did add a second one, but I need to figure out how to make it trigger after the first two. I like how my first two animations go off at the same time, because in-world one is a laugh emote for the facial expression I wanted, and the second one that goes off at the same time is the belly laugh where he is very animated. So if I can just figure out how to add an animation that plays after the first two, that would work well. Otherwise it plays three animations at once and the second laugh animation just kinda interrupts the first lol. Oh well, I'm getting closer. Hopefully this code will help anyone on here looking for the same info. The only thing you'll need to do, since I have it just play what's in the prim's content tab, is put your audio file of a laugh there, and it should play whatever is inside the prim you created. I just get confused since the wiki isn't really good at explaining things where I can understand them in human language LOL! The examples are much more helpful; that way I can understand what's going on better. Code:
string gAnimName1 = "express_laugh_emote";
string gAnimName = "express_laugh"; // what animations to play

default
{
    state_entry()
    {
        llSay(0, "LOL");
    }

    touch_start(integer total_number)
    {
        // Ask the owner for permission to trigger animations; the animations
        // start in run_time_permissions below once permission is granted.
        llRequestPermissions(llGetOwner(), PERMISSION_TRIGGER_ANIMATION);
        // Play the first sound found in this prim's inventory.
        llPlaySound(llGetInventoryName(INVENTORY_SOUND, 0), 1.0);
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRIGGER_ANIMATION)
        {
            llStartAnimation(gAnimName1);
            llStartAnimation(gAnimName);
        }
    }

    on_rez(integer param)
    {
        llResetScript(); // reset the script as soon as it rezzes
    }

    attach(key id)
    {
        if (id != NULL_KEY) // make sure we're actually attached
        {
            if (!(llGetPermissions() & PERMISSION_TRIGGER_ANIMATION)) // remember to use bitwise operators!
            {
                llRequestPermissions(llGetOwner(), PERMISSION_TRIGGER_ANIMATION); // request permissions from the owner
            }
        }
    }
}
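To chain a third animation after the first two (the issue described in the post above), one approach is a one-shot timer. A sketch, assuming the combined laugh runs about four seconds (adjust the delay to taste) and reusing the animation names from the post; the third animation name here is a placeholder:

```lsl
string gAnimName2 = "your_third_animation"; // placeholder: name of the follow-up animation

default
{
    touch_start(integer total_number)
    {
        llRequestPermissions(llGetOwner(), PERMISSION_TRIGGER_ANIMATION);
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRIGGER_ANIMATION)
        {
            // Start the emote and the belly laugh together, as in the post.
            llStartAnimation("express_laugh_emote");
            llStartAnimation("express_laugh");
            llSetTimerEvent(4.0); // assumed duration of the first two; adjust as needed
        }
    }

    timer()
    {
        llSetTimerEvent(0.0); // one-shot: stop the timer so this runs only once
        llStartAnimation(gAnimName2);
    }
}
```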
  6. Thanks guys, I really appreciate the help. Yeah, that's exactly what I was looking for, Carbon! I had no idea that llDetectedTouchUV even existed. But I am very new to scripting, even though I feel I understand it somewhat. I'm not an amateur in all areas of the computer, at least not with hardware and art, lol, but with programming I've got a long way to go. I'm just gonna keep playing around with it; maybe I can find a way to write it up correctly as a script. Thanks again for pointing me in the right direction! Helps a lot! :smileyhappy:
  7. Hi, I am trying to make a simple script where I place a gesture that has an animation and sound in it into an object like a cube, then put a texture on it with a play button. How would I write a script to define the play button as the trigger that makes the gesture play? And if I have to do it separately, where it can't be a gesture but only an animation and a sound file: I uploaded the sound file, but where would I find the .bvh file of the stock belly laugh animation? lol. Sorry for all the questions, but I can't find answers to them on the Second Life wiki and the examples aren't making sense to me. =/ Any help would be much appreciated, thank you!
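One note that may save some searching: scripts cannot trigger gestures directly, so the usual approach is to put the animation and sound in the prim's inventory and play them from the script. For mapping a touch to a play button painted on the texture, llDetectedTouchUV gives the face-local UV of the click (0 to 1 on each axis). A sketch, where the button rectangle is a made-up example region (here the lower-left quarter of the face):

```lsl
default
{
    touch_start(integer num)
    {
        vector uv = llDetectedTouchUV(0);
        if (uv == TOUCH_INVALID_TEXCOORD)
            return; // touch position data not available (e.g. old viewer)

        // Hypothetical button region: adjust to match where the play
        // button sits on your texture.
        if (uv.x < 0.5 && uv.y < 0.5)
        {
            llSay(0, "Play button pressed at " + (string)uv);
            // Play the first sound in this prim's inventory.
            llPlaySound(llGetInventoryName(INVENTORY_SOUND, 0), 1.0);
        }
    }
}
```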
  8. Hi, I'm new to building full avatars and meshes, but I've been spending lots of time trying to figure out the process of going from a high poly model, textured with polypaint in ZBrush (working alongside Photoshop via GoZ), to something usable in SL. Say I have a model that is 3 million polys, and I polypaint it while also using Photoshop. After that, what would be the process of getting it into Second Life? I know I need to either do retopology in ZBrush or use the Decimation Master plugin; I also have 3D-Coat and could use that too. But I'm kind of lost: is polypainting like Ptex from Mudbox, where you can somehow save out a file and use it to bake textures onto a low poly model? And I assume you have to have a UV map already built for the low poly model; I was thinking I could use something like Unfold3D for that. The normal map also confuses me: is it needed? I read it's just about adding more detail to your model using fake lighting, such as bumps and dents. The reason I ask is that I'm trying to nail down the process before I go into rigging and animation. So if anyone can just tell me the process of going from mesh to fully modeled and textured avatar, I can worry about animation and rigging later. I know there are lots of ways this can be done, but I just want a clear method of what's required no matter what programs you use. Example:
1. Build a mesh in Maya or ZBrush.
2. Retopologize it, or decimate it in ZBrush or 3D-Coat.
3. Build a UV map with Unfold3D or ZBrush's UV Master plugin.
4. Polypaint in ZBrush/Photoshop, or use Ptex with Mudbox.
5. Save out a Ptex file somehow, or (not sure how) save out a polypaint file from ZBrush.
6. Use Maya to bake those textures onto the low poly model with its UV map ready.
7. The result would be something close to the high poly model you started out with in ZBrush. Then etc. etc. lol.
Sorry if this is annoying, but I'm really trying to get an understanding of all this overwhelming information and the vast number of programs, which add to the confusion, along with trying to be creative and build my first mesh. I have a gorilla head started so far in Maya but nothing finished. I'm also trying out ZBrush; I'm thinking I'll just take one of the models in there and modify it to my liking first, so I can get a grip on how it's all done. Plus, of everything I've found, ZBrush is as close as it gets to drawing in 2D, which makes it easier.