
Rahkis Andel

Resident · Posts: 344

Everything posted by Rahkis Andel

  1. Codewarrior Congrejo wrote: @ Rahkis, glad i could answer that along the way lol. And your answer - answers my question if it's finally fixed as well - as i assume it's not : ) After some testing, I found out that this feature is working now, if indeed it was ever broken. More details here.
  2. Okay, through a lot of testing I've discovered something that you and others may find quite interesting: there are actually two ways that textures get assigned in Blender, and both methods can be used to export a fully textured object to Second Life in a single Collada file. In the Blender Collada exporter there are two somewhat vague options that read "Include UV Textures" and "Include Material Textures". By default, when you choose one of the Second Life presets, "Include UV Textures" is the only one that gets checked. I suspect that if you unchecked that one and checked "Include Material Textures" instead, that would fix your problem and you'd see the textures in your upload! This is your easy fix. That is interesting news, because by all accounts that feature was broken! It seems clear to me that the real answer is that people just didn't understand how it worked (which is completely understandable). But what about the other way to assign textures? Well, for this you don't even need to assign a material at all. All you have to do is have the UV/Image editor open: - Assuming you've already UV unwrapped your object, if you select all the faces you want to assign to a certain texture, you will see the corresponding UVs pop up in the UV window. - If you set those to a texture in the UV window (by pressing the + next to the image name and completing the image-creation pop-up) while that set of faces is selected, that texture will automatically be set to those faces' UVs. If you don't have "Textured Solid" set, or you don't know how to set it, you can also set the viewport shading to "Textured" by hitting Alt + Z. Realize, however, that if you don't have much lighting in your scene, your object may appear dark or even black. As far as the Collada exporter is concerned, it doesn't matter whether you can actually see the results in the viewport.
- If you assign all your UVs to textures this way, instead of assigning materials to the UVs, you'll be able to use the exporter with the default settings ("Include UV Textures"), because then you will truly be exporting the textures as they were assigned to the UVs. This can be quite confusing in Blender, because it is possible to have materials assigned to your UVs and also have a texture separately assigned to the UVs. Also, if you texture paint in the viewport, Blender -only- recognizes the textures assigned to the UVs. If material textures and UV textures conflict during certain bake modes, like normal or AO baking, you can get feedback loops that will stop you dead in your tracks until you unassign the texture from the UVs (you do this by pressing the "x" next to the texture's name). There is no visual way to know whether an image is applied to the UVs unless you just start selecting faces with the UV/Image editor open, so in general be careful. Let me know if this makes sense to you or not. Edit: I forgot to mention, you have to have "Include Textures" selected in the Upload Options tab for the textures to be uploaded with your Collada file when you are actually uploading the mesh in SL.
  3. I'm not sure whether you have made a mistake in how you worded your question, but you have said that you created the clothing and sized it. If you did that successfully, what is it that you are asking for?
  4. You deserve all the kudos in the world. It is an aspiration of mine to be as helpful as you are and to have the breadth of knowledge that you do in Blender. Edit: There is something I've been wondering. Aside from the ability to unlink body parts in Second Life, what is the benefit of uploading your avatar body in multiple parts? It's quite possible that that is a good enough reason for anyone, but I was just curious whether there is some other reason I don't know about.
  5. You probably already know what an LOD is, but it stands for Level Of Detail. SL has 4 LODs. You can imagine them by picturing your item with 4 concentric rings representing the distances a viewer is standing from your object. As you cross the border defined by a ring, you will see the LOD it represents; you see the highest LOD when you are standing close to the object and the lowest LOD when you are standing far away from it. If you are outside all of the rings, SL will derender the object, I believe. Some graphics settings can affect how and when a new LOD kicks in too, making this a somewhat complicated subject, so I won't go into details that I don't understand all that well myself. The idea is to dial back the detail of your object as the viewer gets further from it, to save on GPU processing power. From a distance you wouldn't notice that the buttons weren't there, for example. From close up you want full detail. At a medium distance, you could probably remove much of the detail without it being noticeable. From the furthest distance away, some objects don't need much more than a 2D image billboard. All you technically have to do to create a new LOD for your jacket is take out parts that shouldn't be visible and use the Decimate modifier to remove some unneeded edges. It respects weights and UV seams almost like magic, so if used carefully it can give you -okay- results. The best results come from manually removing edge loops, but in your case the jacket is already triangulated, so that would be a time-consuming process. After you have made an LOD, you just export it as its own .dae. So you could consider your jacket as it is to be the highest LOD, and you could create the different LODs by hand and upload them separately. Do -not- let SL determine the LODs by itself. Its decimation algorithm tends to annihilate all surface detail as quickly as possible.
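The concentric-ring idea above can be sketched in a few lines of code. This is purely an illustrative sketch of the concept, not how the SL viewer actually picks LODs; the ring radii are made-up numbers.

```python
def pick_lod(distance, thresholds=(10.0, 40.0, 160.0, 640.0)):
    """Return which LOD to show for a viewer `distance` meters away.

    `thresholds` are hypothetical ring radii: inside the first ring
    you see LOD 0 (highest detail), then LOD 1, 2, 3; beyond the
    last ring the object is derendered (returns None).
    """
    for lod, radius in enumerate(thresholds):
        if distance <= radius:
            return lod
    return None  # outside all rings: object derendered

# Close up you get full detail; far away, the lowest LOD:
print(pick_lod(5.0))     # -> 0
print(pick_lod(100.0))   # -> 2
print(pick_lod(1000.0))  # -> None (derendered)
```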
  6. No one is having their arm twisted to help you. Even someone demanding help wouldn't be a pest, because it's easy enough to ignore them. Someone like you, who is asking nicely for help, is of course even less so. But yes, when you parent a mesh to an armature using any of the deformation settings (where vertex groups are added automatically), what you are doing is the automatic equivalent of these manual steps: - Parent your mesh to the SL armature object - Give your mesh an Armature modifier - Set the deformation armature object to your SL armature in the modifier settings - In the mesh, create vertex groups with corresponding names for every single bone in the skeleton Depending on the option you choose, automatic weights may be assigned to each vertex as well. Anyway, when we tell you to apply the armature, we mean to click the Apply button of the Armature modifier. In retrospect, that's why your mesh kept snapping back to its original position; you were simply removing the armature altogether. Keep in mind that if you make a change that you do not like and cannot undo, Blender automatically keeps old saves, so it is sometimes possible to go back. Go to your save directory and you may see files with the extensions .blend1 and .blend2. If you remove the 1 or 2, you will be able to open them, and they will be previous saves. Obviously, if you save very frequently, you may not get to a useful save, but it could potentially save you trouble. Assuming you are computer savvy, you probably also realize that you need to be able to see file extensions to do this.
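The backup-recovery trick above (renaming .blend1/.blend2 files) can be sketched in Python. The function name and example filenames are my own invention; copying rather than renaming keeps the backup intact, and passing a new name avoids clobbering your current .blend:

```python
from pathlib import Path

def recover_backup(backup_path, recovered_name=None):
    """Copy a Blender auto-backup (.blend1 or .blend2) to a .blend
    file so Blender can open it normally. Returns the new path."""
    backup = Path(backup_path)
    if backup.suffix not in (".blend1", ".blend2"):
        raise ValueError("not a Blender backup file: %s" % backup)
    # Default just drops the trailing digit; pass recovered_name
    # (e.g. "project_recovered.blend") if the original .blend still
    # exists and you don't want to overwrite it.
    if recovered_name is None:
        target = backup.with_suffix(".blend")
    else:
        target = backup.with_name(recovered_name)
    target.write_bytes(backup.read_bytes())  # copy; the backup stays put
    return target
```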
  7. Like I said in my latest PM to you, weight painting is not a process that most people understand well, and there's a reason for that. I can't really make out what the problem is from what you've written, but you probably now have conflicting weights going on. Let me know if this is accurate: First you created your own armature to move the jacket into the T-pose. That went well. Keep in mind that if you are going to do a vertex weight copy from the default avatar, you should fit your jacket to the avatar as closely as physically possible before moving on to the next step. At this point, you should have applied the Armature modifier, thereby freezing the jacket in place. You should also have made sure that -all- vertex groups were deleted from the jacket, so that it was a clean slate to skin to the SL armature. Then, with your "clean slate", you can finally do the vertex weight copy from the SL avatar. Lastly, you parent your jacket to the armature as an object (as opposed to with empty weights or any of the other armature-deform options). You will no doubt need to clean up some of the vertex weights, too. The buttons and other attached objects will probably not be weighted fully to the right bones, and the SL avatar has crappy weights to begin with, so it's not going to be perfect.
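The vertex weight copy step works by proximity: each jacket vertex inherits the weights of the nearest avatar vertex. Here is a minimal conceptual sketch of that idea; Blender's own Copy Vertex Weights / data-transfer tools do the real work, and the bone names below are just examples:

```python
def copy_weights(src_verts, src_weights, dst_verts):
    """For each destination vertex, copy the weight dict of the
    nearest source vertex. Vertices are (x, y, z) tuples; weights
    map bone-group names to influence values."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    out = []
    for v in dst_verts:
        nearest = min(range(len(src_verts)),
                      key=lambda i: dist2(v, src_verts[i]))
        out.append(dict(src_weights[nearest]))  # copy, don't share
    return out

# Two avatar vertices weighted to different bones; the jacket
# vertex near the first one inherits its weights:
src = [(0, 0, 0), (10, 0, 0)]
w = [{"mShoulderLeft": 1.0}, {"mShoulderRight": 1.0}]
print(copy_weights(src, w, [(1, 0, 0)]))  # -> [{'mShoulderLeft': 1.0}]
```

This is also why fitting the jacket to the avatar as closely as possible first matters so much: the nearer the surfaces, the more sensible the "nearest vertex" match.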
  8. Nice job on the house! Did you by any chance let Second Life determine your different LODs? If that is the case, the answer is simple: don't let Second Life determine your different LODs. You can confirm the truth of this statement by setting every level of detail to use your .dae file as is. It should look fine. You will probably want to supply a lower-res version yourself though, for optimization purposes. Not that your house looks all that high-res. Edit: For future reference, screenshots of your Second Life upload settings window are invaluable for troubleshooting mesh upload issues. No matter what your problem is, chances are strong that we'll need to see that to help you definitively. Second edit: Hey, I just noticed you said that you're familiar with Blender but not so much with Second Life. That makes two of us. Welcome aboard!
  9. Edit: I don't think what I wrote made a lick of sense so let me try that again. I suggest bringing your ZBrush sculpt into Blender, retopologizing it, UV unwrapping it and texturing it there. That way you'll have more control over your topology and different LODs. And yeah, also do what Gaia said below. Perfect timing.
  10. Sure, I'll keep an eye on my messages.
  11. I've suspected for a long time now that there should be a 'best practices' guide for mesh creation. I guess if I were ever to tackle a knowledge-base project, it would be that. I'd say the same thing as Kwakkelde, though. Don't worry so much. If you make a mistake, that's just one more thing to learn from. No one is going to assault you over it. The fact that you have such an open mind suggests that any big mistakes won't last long in your case.
  12. If the jacket is pretty simple, you might want to just forcibly move the arms into a T-pose. That might be less trouble for you. If you do this, make use of proportional editing. I really wish I had time to do a tutorial right now, but I don't. If you are super stuck, I'd gladly re-pose it for you if you passed me the .blend file.
  13. Wow, this sounds exactly like the problem I had when I was starting out in rigging a few weeks ago. If I understand correctly, you tried changing the avatar's pose to fit the arms to the jacket, correct? Unfortunately, that won't work. The only way to change the orientation of the bones and have that translate into Second Life is to actually change the bone positions in edit mode, and doing that will break any arm animations that depend on those bones. Your only choice is to move the jacket mesh itself into the T-pose. That will give you the best results. If that's what you did, I'm not sure what your problem is and I'll need to see some screenshots.
  14. I've been having a ton of trouble with shading between different parts with shared vertex locations. If that result is only possible with Avastar right now, I guess I'm sold.
  15. That's interesting. I've never heard of varying normal direction on purpose. Learn something new everyday.
  16. Well, it certainly looks like it has bizarre topology, but I doubt that has anything to do with it. It looks like it's probably a variety of issues. First of all, assuming this is a blender screenshot, select all verts and press (ctrl + n). That will recalculate all normals. Then open the UV window and take a look at the UV layout. If you could screenshot that, that would probably show the problem. Edit: After reading it again, I'm thinking it's just flipped normals. If you have no material applied, the UVs shouldn't matter.
  17. I certainly don't have a final answer, but I'll also point out that even with a full body alpha, your avatar's shadow is still there, being calculated in real time. That is rather telling evidence.
  18. That'd be nice for human avatars. Not so useful for non-human ones. A better idea would be to have an "avatar" slot that could be replaced entirely -- no hiding necessary. I don't know how that could be accomplished the way the system is now, though.
  19. Coby Foden wrote: Code and Rahkis, thank you for the comprehensive explanations. :matte-motes-smile: I'm eager reader of different ideas, opinions and modelling techniques. I try to suck the good things and excellent ideas like a sponge. So I might eventually learn something new - even though sometimes the sponge leaks and does not keep all the info delivered. Epilogue... I wonder when we will get new beautiful avatar? Beautiful, huh? This should please everybody! (and could you please stop whining?) Erm... if you say so... ...... You'll be happy to know I'll be finished soon with your new beautiful avatar.
  20. WhiteRabbit0 wrote: The deformer itself allows for the creation of Avatar 2.0, 3.0.. 5.0 whatever. Because the deformer operates based off of the approximate shape of the default avatar, you can create a new avatar with it and simply include clothing layers with appliers, very similar to the lola tango setup. Ah, I see. I honestly have no idea how it works, but that makes sense. I suppose after that, they just need to figure out how to get us a modifiable skeleton and we'll be golden. What's 5 more years?
  21. Coby Foden wrote: say Moo wrote: I'm impatiently awaiting the support of normal maps. (a wonderful world is near) There is already materials project viewer available to play with. Here's the link (in case you didn't know): http://wiki.secondlife.com/wiki/Linden_Lab_Official:Alternate_Viewers Scroll down to "Project Materials Viewer" Main grid (Agni) has the support for materials in all regions. It appears that in beta grid (Aditi) the support is not yet on in all regions. I played with that a bit and the results look really great. No way. That would be like being a child on Christmas morning who can unwrap the presents but never touch any of the toys. I think I'll just wait for it to actually be released. I don't fancy torturing myself.
  22. Codewarrior Congrejo wrote: Second way: (apparently was broken a long time) is to choose to upload the textures along with your model in the model-upload window of SL (last tab > there you can choose to 'include textures') Well, that answers my problem at least.
  23. Every time? I'd love to have a computer that could shoot fireballs into space repeatedly. Better than fireworks.
  24. Haha, well, I won't repeat myself trying to rebut that top part -- let's just say that I'd be more than happy to see a new avatar released. But I imagine LL will still be getting complaints no matter how good it is. As for the whiny bit -- what I hear is that the avatar mesh as it is is deeply bound to the system. The details are not for the likes of us, I suppose. As a programmer, I can understand what the problems could potentially be, but I'm just guessing. It sounds like a whole mess of poor documentation and lack of foresight. But hey, SL is an extraordinarily complicated project for a company to jump into, so I'm pretty impressed that it's gone this long and not totally broken down. Edit: Oh yes, Codewarrior reminded me that there is also the problem of all the pre-existing content that a new avatar replacement would break. It's pretty unfortunate.
  25. Don't feel bad. I am having the same problem, except replace 3dsMax with Blender.