Everything posted by OptimoMaximo

  1. Node muting is a shading debug technique, much like node soloing. It is used to observe the behavior of a material when parameters are disabled, i.e. to answer a question like "is the specular roughness the source of this noise?" by comparing results and determining what needs more samples. That, finally, leads to using the Light Path settings instead of the general sample count.
  2. The first thing I thought of was the node recolor feature introduced in the 2.7x series, but I was mistaken. When only the top label is red like that, it's because the node is Muted. You can toggle it from the menu shown in the pic, or by hitting M while the node is selected; the same toggle is also reachable from Python, as sketched below.
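
     A minimal Blender Python sketch of that toggle, assuming a material named "Material" and a node labeled "Glossy BSDF" (both hypothetical names):

         import bpy

         # Fetch the node from a hypothetical material's node tree.
         nodes = bpy.data.materials["Material"].node_tree.nodes
         node = nodes["Glossy BSDF"]

         # Flip the mute flag; equivalent to pressing M on the selected node.
         node.mute = not node.mute
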
  3. As I was mentioning in the quote, you'll need to output an animation to reposition the joints as a deformer from the default avatar. For this, you'll need a full Avastar character in your scene; I guess the Avatar Workbench won't do for this task. From your picture, I'd say you were importing a default avatar as a rigged mesh, I assume to take advantage of the joint positions delivered with such a mesh. However, that would only work when actually wearing the avatar mesh as an additional item; it won't somehow "override" your current avatar (except for the joint positions). An animation, on the other hand, lets you deform any avatar. I would recommend watching Medhue's YouTube video tutorials on animation to get started.
  4. The plug-in you need is Avastar. It can be purchased inworld with Linden dollars, or through their website using PayPal. Inworld location: http://maps.secondlife.com/secondlife/Jass/128/128/2 Website: https://blog.machinimatrix.org/ They have plenty of tutorials on all the basics you need to get started.
  5. Alright, so there are basically two ways to accomplish this: 1) create a new body shape using the Appearance Editor; the interface is outdated but still relevant, with info available here: http://wiki.secondlife.com/wiki/Appearance 2) create a joint position animation to reposition each single joint and get shapes impossible to achieve with point #1. For point #2, you have to use 3D software that supports SL animation. Blender is free, but the required plug-in has to be purchased (around 25 USD). Otherwise, there is Maya (commercial software) with my plug-in, or 3DS Max with Polysail's Marionette plug-in.
  6. That is non-uniform scaling, so watch out for that. Once averaged and packed, UV islands shouldn't be touched (too much) with regard to their scale, so that each surface gets its proportional share of pixels and the whole map stays resolution- and proportion-consistent.
  7. You should expand more on what you are trying to achieve. There is no such asset as an "Avatar Deformer". Would you like to make a shape? Or move the avatar joints so as to achieve a different scale and/or proportion? "Deformer" isn't a self-explanatory term in SL.
  8. Here's something else to remember for the next project, then. Usually that issue is due to non-uniform scaling: it occurs when the object's scale was manipulated but not frozen before Unwrap UV was called. The unwrapping script takes the scale into account and uses the original 1,1,1 scale to size the UV islands. You should always make sure the scale is properly applied before unwrapping (CTRL+A -> Apply Scale, or from Python as sketched below). I guess that is what happened on the bricks section too. My advice is to create a material, tile your base texture in there, check the stretches, address them where possible, and bake the resulting look from the material's texture tiling, followed by all the other maps you may need (like the AO).
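
     A minimal Blender Python sketch of that fix, run with the object selected (an assumption about your setup):

         import bpy

         # Freeze the scale so the unwrapper sees 1,1,1; same as CTRL+A -> Apply Scale.
         bpy.ops.object.transform_apply(location=False, rotation=False, scale=True)

         # Re-unwrap with the corrected scale.
         bpy.ops.object.mode_set(mode='EDIT')
         bpy.ops.mesh.select_all(action='SELECT')
         bpy.ops.uv.unwrap()
         bpy.ops.object.mode_set(mode='OBJECT')
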
  9. Well, you may want to duplicate the pieces and do an actual bevel on those corners to extract a normal map instead; that other method is just quick-and-dirty shading management, since I assume you weren't going for map extraction in the first place. However, if I recall correctly, in Blender you have to add your edge loops making sure that the "Correct UV" checkbox is ticked in the operator panel (bottom of the Tools panel, "T") in order to get a correct new UV placement, otherwise your texture would stretch. A sketch of the bevel step follows.
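
     A minimal Blender Python sketch of that bevel, assuming you are in Edit Mode with the corner edges selected (the offset and segment counts are placeholders):

         import bpy

         # Turn the sharp corners into real geometry, ready for a normal map bake.
         bpy.ops.mesh.bevel(offset=0.01, segments=2)
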
  10. You're welcome. You shouldn't feel limited by the model you consider the final SL model, though. If you still need a gentler look on the shading, you can perform your bake on a copy of your model where you add another subdivision near the edge you would like to smooth-shade. Then don't do the smooth shading on the faces; do it on the edges. The way FBX (inherited from Collada, its parent) encodes things is by vertex normals. Doing the smoothing by selecting the edges ensures they are averaged using a common tangent (along the direction of the selected edges' components), delivering the bevel effect on the shading. Basically, you're baking on a higher-polygon model made from, and made to work on, your game asset; it doesn't really need to be sculpted to the billion-polygon level.
  11. That is the correct way, Rolig. Ambient Occlusion calculations rely on surface AND vertex normals to determine how exposed a surface is, and therefore its chance to be lit by the environment's diffuse light. Smooth shading is meant to fake roundness, so over there it's trying to average the shading as if those pieces were cylinders.
  12. Basically, yes. There is another thread (here ^) where the OP was having a problem with rigging using either the Avatar Workbench or Avastar; in there I give a full explanation of how the avatar should be oriented to work without Avastar's intervention. Once the avatar in your scene complies with the listed requirements, it should work for exporting animations too, although you can't expect extreme precision: the scale plays a big role, but it would work.
  13. I haven't used Blender in a long time, so I can't help with working materials. However, you didn't mention whether you were using the Avastar add-on. If you're trying to animate with an Avastar rig without it being installed, it won't work, for a few reasons: 1) the Avastar orientation is wrong for SL; it gets adjusted by Avastar's scripts upon export; 2) the Avastar animation rig is a utility skeleton, meant to drive the deforming skeleton (the true SL avatar skeleton). Those bones' names differ from SL's because Blender needs a specific naming convention to manage left/right side recognition. Plus... 3) this skeleton uses a structure that allows better animation than the deforming default skeleton (like the inverse hips, the COG and the Origin), and therefore the hierarchy is recognized as invalid; 4) you're exporting with the default BVH exporter on a meter-scale character, while a BVH file for SL describes a character in inches. In Blender, nothing really works except Avastar when it comes to animation.
  14. If you're using the Avatar Workbench for Blender, from the Machinimatrix website, make sure of two things: 1) your skeleton should face +X: rotate it in Object Mode and CTRL+A -> Apply Rotation; 2) the root bone (mPelvis) should be named "hip" (without quotes, of course). Then you can export with the native Blender BVH exporter, as sketched below.
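
     A minimal Blender Python sketch of both steps plus the export, where the armature name and output path are hypothetical:

         import bpy

         arm = bpy.data.objects["Armature"]           # hypothetical armature name
         bpy.context.view_layer.objects.active = arm  # 2.8+ API; 2.7x uses scene.objects.active

         # 1) Freeze the rotation so the skeleton faces +X at rest.
         bpy.ops.object.transform_apply(location=False, rotation=True, scale=False)

         # 2) Rename the root bone for the BVH exporter.
         arm.data.bones["mPelvis"].name = "hip"

         # Export with the native Blender BVH exporter.
         bpy.ops.export_anim.bvh(filepath="/tmp/my_anim.bvh")
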
  15. Because a cube only has 8 vertices, it needs more subdivisions along its length in order for the bones to take control over their part of the mesh; see the sketch below.
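
     A minimal Blender Python sketch, assuming the cube is the active object and the cut count is a placeholder (loop cuts restricted to the length would be the more targeted manual alternative):

         import bpy

         # Subdivide so each bone has vertices to deform.
         bpy.ops.object.mode_set(mode='EDIT')
         bpy.ops.mesh.select_all(action='SELECT')
         bpy.ops.mesh.subdivide(number_cuts=8)
         bpy.ops.object.mode_set(mode='OBJECT')
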
  16. Texturing is a skill VERY close to storytelling, which is my only advice instead of resources. There are plenty of tutorials for achieving certain effects you may need for your project; what you're really missing, however, is the method to work from scratch. And the method is being an artisan from all standpoints. Let me expand on that. Say your helmet is a medieval, or even more ancient, piece of armor. You are the storyteller who followed it from the forge to its current state. So, layering is key. You begin by laying out simple and clean base textures for the involved materials (say, steel and bronze), then start putting some hammer prints here and there (the blacksmith shaped it) and add some darkness (tempered steel tends to darken before it is polished again, but its carbon content increases in the process; adding the darkness as a layer lets you make changes more easily later, for tweaking). Then the pieces are combined (add rivets, with Ambient Occlusion from both actual geometry and texture details) to give an initial blacksmith product to put on the shelf, where it sat for, say, 4 months, during which dust from the street accumulated in the crevices (the reason for having texture AO along with geometry AO; it helps with the placement when used as a mask). It comes from medieval streets, so it's brownish rather than the dark grey our modern polluted cities provide. Then it is sold to an apprentice warrior, who eventually got killed (add some scratch details, bumps here and there, even heavy ones, along with another layer of AO for those details), and his equipment was looted by his killer, who threw it in a chest along with other loot (add stains of undefined stuff, later unsuccessfully cleaned off) before it ends up in the next owner's hands. This is a very summarized and simplified example of how, for me, a quality texture should be conceived, and it can be applied to any type of context: if the helmet were sci-fi looking, you would just change the involved materials, and what happened to the helmet, to something suitable for a sci-fi environment.
  17. It is not clear what exactly you have keyframed for location in your Blender scene, so I will go with a general overview. Location keyframes get exported based on the COG bone's keyframed position (I'm assuming you're using Avastar) relative to the Origin bone, the one on the floor with arrows. That is considered to be the avatar's location, as if your avatar were contained in a box whose feet sit exactly where that controller is. This box moves the avatar around while it performs the animations (walk and run in place are an example). Moving the Origin bone around doesn't affect the animation, because the animation is exported relative to that bone, location, rotation and scale included. Hence you can't see the movements you designed in Blender: the Origin bone moves in the "world"/scene, but the avatar doesn't; it moves within the Origin bone's local space. It keeps doing its thing relative to this bone which, as far as the other avatar bones are concerned, is the reference object for location, rotation and scale, not the center or orientation of your Blender scene. To animate it properly, you need to leave the Origin bone where you want the avatar's sit target, then animate the turn-around, as sketched below. I know it's not as easy as using a parent to draw the path; I'm sorry.
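
     A minimal Blender Python sketch of keyframing the COG rather than the Origin, with hypothetical rig and bone names ("Avastar", "COG"):

         import bpy

         rig = bpy.data.objects["Avastar"]   # hypothetical rig object name
         cog = rig.pose.bones["COG"]         # hypothetical COG controller name

         # Keyframe the COG's location in the Origin bone's local space;
         # this is the motion that actually ends up in the exported animation.
         cog.location = (0.0, 0.5, 0.0)
         cog.keyframe_insert(data_path="location", frame=1)
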
  18. You're making an assumption of your own there. If you thought that the whole surface area should be covered by the same amount of square centimeters per 1024 texture, you're missing the point by not including the re-scale factor introduced by viewing on a monitor. My advice is to build in modules, where smaller surface areas (like the window frame) get their own, smaller texture, so that the main surface, the wall, can get more area on a bigger texture of its own (and the window becomes a prefab you can reuse somewhere else in the same design). Indeed it doesn't. There are sizes other than 1024: you can make some parts' UVs catch at least the same number of pixels they had in a combined 1024 map in one or two 512s as separate materials, for example, gaining resolution at a reasonable price. Not only can you achieve a smaller object using multiple smaller textures, for higher pixel density and savings on texture memory(**), but you can have a series of assets that keep a consistent design, instead of a varied array of looks. Plus you get a prefab template to reuse across different items in the same project, same texture, uploaded once and duplicated inworld: you save on upload fees and on texture load. A back-of-the-envelope comparison is sketched below. ** saving on texture memory is meant in comparison to just splitting into more, same-sized textures. Which is a choice that allows you to step onto different routes. My comment was meant to point out that any surrounding item should then account for this and use the same or a similar pixel density on its textures. You can use colored meshes and make beautiful cartoon stuff with very small textures and a bit of shading; what's important is that it all blends together. Too much resolution difference between objects isn't good to see.
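
     A back-of-the-envelope Python comparison of the pixel densities involved, with made-up surface sizes (a 4 m wall and a 1 m window frame strip) and a made-up 80/20 split of the shared map's width:

         # Linear pixel density (px/m): shared 1024 atlas vs. dedicated textures.
         wall_m, frame_m = 4.0, 1.0

         # Shared 1024 atlas, wall and frame splitting the map's width 80/20.
         atlas_px = 1024
         wall_density_atlas = (atlas_px * 0.8) / wall_m    # ~205 px/m for the wall
         frame_density_atlas = (atlas_px * 0.2) / frame_m  # ~205 px/m for the frame

         # Dedicated textures: the wall gets a full 1024, the frame its own 512.
         wall_density_solo = 1024 / wall_m                 # 256 px/m
         frame_density_solo = 512 / frame_m                # 512 px/m

         print(wall_density_atlas, frame_density_atlas,
               wall_density_solo, frame_density_solo)
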
  19. You should be aware that a 1024x1024 image equals a 36.12 cm square sheet of paper, at a density of 28.346 pixels/cm (72 DPI). The baked lighting can be very good, but it's gonna be very blurry.
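
     The arithmetic behind those numbers, in Python, for anyone who wants to rerun it with other sizes:

         # 72 DPI expressed in pixels per centimeter, and the resulting sheet size.
         px_per_cm = 72 / 2.54         # 28.346 px/cm
         side_cm = 1024 / px_per_cm    # 36.12 cm: the "paper size" of a 1024 texture
         print(px_per_cm, side_cm)
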
  20. To evaluate a UV map's packing, it would be useful to also know the actual 3D size of the object (and, along with that, its surface area). If there are too many pieces, one texture might not be sufficient; it all depends on how many inworld square meters this texture has to cover.
  21. A mesh tattoo was mentioned, meaning that the mesh IS the drawing and the drawing is the mesh, not an alpha image on a mesh. Tattoos easily get intricate, and that kind of intricacy as actual mesh would be a rendering killer.
  22. You're right, I was thinking of the mesh devkit, my bad. Now THAT is something you shouldn't have come up with. If this trend ever does emerge, we are all entitled to blame you.
  23. You should know that each brand has its own devkit; why the heck would one brand include another brand's devkit in their own? Your asking for links leads me to think that the Maitreya kit you've got is not original, and so would be the devkits you're looking for. In that case, I guess this isn't the right place to ask, just sayin'.
  24. Animations are script-triggered; an animation is not something a script alone can produce. So you will have to get the animation(s) you need beforehand, then script the described behavior. Looking in your library, there is an object called "Torch!" that contains an example script that partially does what you need: it fires an animation when the item is attached and stops it when detached, which is a good starting point for adding a touch event for another animation.
  25. There aren't Avastar meshes to copy weights from. Scripts are very picky, and your JOMO meshes aren't the meshes Avastar is referring to, so scrap that. Do a simple binding and then copy the weights / do the data transfer yourself; it's your best bet. Also, since you modeled the rivets as separate parts, be careful with them and, if necessary, separate them from the rest of the harness for an easier weight copy/manual weighting. //derail start// This is something I always said to the Avastar devs: with too much automation, things quickly become incomprehensible or, at the very least, much more complicated than they actually are or need to be. Beginners may be led to believe that a few clicks in a given order, without any previous background in the field, would be enough with all that automation, but as you can see there are some pitfalls one can easily fall into. Do this for a few years of improvements, and you end up with a panel unreasonably packed with buttons and a myriad of issues and user assistance requests. To me, when you make a plug-in you "win" if it is so simple in its layout that one short series of videos is enough to NOT receive any assistance requests. //derail end// Keep up the good work, man!