Everything posted by Drongle McMahon

  1. You could try - (1) Make each layer of hair assigned to a different material in Blender - they can still have the same texture. (2) Make sure any edges between hair layers are split with the edge split modifier. (3) Make sure that each material in the hair has less than 21844 triangles assigned to it. These are things that might work, depending on whether the effect is caused by some known problems. No guarantee that they are, or that these will work.
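     (For item (3), here is a minimal sketch you could run in the Blender Python console to count the triangles assigned to each material of the active object; it assumes the hair is the active object and the layout is just illustrative.)

     ```python
     # Minimal sketch: count triangles per material slot of the active object,
     # to check none exceeds the 21844-triangle limit mentioned above.
     import bpy
     from collections import Counter

     obj = bpy.context.active_object
     counts = Counter()
     for poly in obj.data.polygons:
         counts[poly.material_index] += len(poly.vertices) - 2   # an n-gon gives n-2 triangles

     for index, tris in sorted(counts.items()):
         slot = obj.material_slots[index] if index < len(obj.material_slots) else None
         name = slot.material.name if slot and slot.material else "(no material)"
         print(name, tris, "triangles", "-- over 21844!" if tris > 21844 else "")
     ```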
  2. Here's an old thread about this sort of thing. Look at the triangle/vertex counts in message 17 to get some idea what you need to aim for. Actually, now that we can edit normals in Blender, I would go for something with more geometry in the highest LOD to give realistic edge highlighting under advanced lighting mode. That would probably push the LI up to 2 or 3, but would be worth it.
  3. "Note that there is a limit of 65536 vertices in a single object to import to SL" Although it says so in the wiki, this is actually untrue, as you found with your imported object. This is because the test for 64k vertices, although there in the code, is never reached because the 21844 triangle limit is always reached first. It's the triangle limit that causes the strange texturing effects, by creating "secret" materials, and allows the bypassing of the 64k vertex limit. Also, both limits apply per material, not per object.
  4. The limit for the hi-poly secret extra material effect is 21844 triangles. So you have exceeded that by far. With 47106 triangles, you will have three materials instead of one, if they start out as one material. The limit applies per material. So it is possible to exceed it for the whole model, but only if you have less than 21844 triangles assigned to each material. Try reducing it to below 21844 per material and see what happens.
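     (As a worked example of that arithmetic - a minimal sketch, assuming the uploader simply splits each material into chunks of at most 21844 triangles:)

     ```python
     # Minimal sketch of the arithmetic, assuming each material is split into
     # chunks of at most 21844 triangles by the uploader.
     import math

     TRI_LIMIT = 21844

     def materials_generated(triangles):
         return math.ceil(triangles / TRI_LIMIT)

     print(materials_generated(47106))   # -> 3 (two extra "secret" materials)
     print(materials_generated(21844))   # -> 1 (at the limit, still a single material)
     ```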
  5. "Analyze" works by converting the mesh it is given into a set of convex hulls, which are handled more efficiently by the physics engine than the triangle mesh (unless the triangles are large). With any viewer, it will often struggle with mesh that have complex concavities and/or holes. One way to get around these problems is to provide a physics mesh that is already a set of convex hulls that don't overlap each other. Then the function doesn't really have to do anything. Here, for example, is a possible physics mesh for your model. It consists of seven convex hulls with 66 total vertices. It will have a physics weight of 2.92. It could be simplified further if that is too high. Any viewer should be able to use this sort of mesh without any problems. Also, you control the shape completely, as the "Analysze" doesn't have to do anything except convert it to the internal convex hull format. 
  6. The thread that Christhiana refers to does contain a workaround for Blender (in case that's what you are using) that requires editing of the dae file. However, in the same thread Gaia mentioned the forthcoming introduction of custom normal editing to Blender. This is now available* and provides a much more effective way of producing the same effect. The technique in Blender, using the Data Transfer modifier, is quite complicated, but once learned it allows accurate matching of the normals at the edges of different objects (as well as other useful things). I think Avastar has a much more accessible interface to do this (and could do it even before it was introduced into Blender). So if you are using Avastar, it should be easier. If you aren't using Blender, then I believe other advanced programs have at least equally effective methods for editing the normals. *eg here and here.
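     (For reference, a minimal Blender Python sketch of the Data Transfer setup described above; "HighPolyReference" is just a placeholder for whatever object you want to copy normals from, and the same settings can of course be made through the modifier panel instead.)

     ```python
     # Minimal sketch: transfer custom normals from a reference object onto the
     # active object with a Data Transfer modifier. "HighPolyReference" is a placeholder name.
     import bpy

     target = bpy.context.active_object
     source = bpy.data.objects["HighPolyReference"]

     # custom split normals only take effect with Auto Smooth enabled (pre-4.1 Blender)
     target.data.use_auto_smooth = True

     mod = target.modifiers.new(name="MatchNormals", type='DATA_TRANSFER')
     mod.object = source
     mod.use_loop_data = True                    # per-face-corner (loop) data
     mod.data_types_loops = {'CUSTOM_NORMAL'}    # specifically the custom normals
     mod.loop_mapping = 'POLYINTERP_NEAREST'     # interpolate from nearest source face
     ```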
  7. "SL uses 32 bit" Unfortunately, this isn't the case for uploaded mesh data. For the positions within each mesh, the upload data only uses 16 bits for the position on each axis. These positions are scaled by a factor depending on the size of the mesh in each domension, so that the whole 16 bit range is used for the extent of the mesh. This means the absolute accuracy of the positions depends on the size of the mesh. Because the scaling factors are variable (65536/extent), it isn't necessarily going to help aligning to mm. To be sure poistions won't be subject to rounding, you would have to make sure all your vertices fitted exactly with 1/65536 of the bounding box in each axis. I suppose you could do this if you set the bounding box to, say, 65.536m and then made all vertex positions exact in mm. With a different extent, the mm alignment would not avoid rounding errors. Also, if you have two mesh pieces with different sizes, and therefore different scaling factors, the rounding errors in reduction to 16 bits might well affect two coincident points differently. Perhaps the best advice here is to choose splitting boundaries where these errors will have the least obnoxious effect?
  8. Specular highlights are highly dependent on both the lighting and the camera angle, but for baking there is no camera. Without using material nodes, the old Blender render engine simply ignores specular reflections when you do a full bake. If you use material nodes, then the Extended Material node has a spec output socket which contains something like specular highlights. This can be mixed/added into the output. It then gets included in the full bake output. Similarly, with the Cycles render engine, which always uses nodes, you can also get the same highlight-like output in the combined or glossy bakes. However, the highlights from these bakes are not those you would see from any normal camera, as the baking doesn't use one. Instead they are the highlights you would get by using a different camera for each pixel, pointing at the pixel along the normal at the corresponding point on the mesh surface. This is not like any physically conceivable camera. The result depends only on the lighting. So the baked highlights are not what you see in any rendered image using a particular camera position. Nevertheless, if you want to see highlights with Advanced Lighting turned off, you can often use these bakes to get acceptable effects. As they are fixed, unchanging with either lighting or camera movements, they may conflict with dynamic lighting effects seen when using the Advanced Lighting system with appropriate specular maps and glossiness and environmental reflection parameters.
  9. Hmm. One thing that will undo the effects of the edge split modifier after it is applied is to move the mesh in edit mode if you have "Automerge Editing" turned on (Mesh menu - effectively applies remove doubles). That sounds unlikely to be the problem here. You say the effect still happens if you switch the mesh to flat shading. That is very strange indeed. The only thing I can think of that does that, as well as undoing the edge split, is if you check "Generate Normals" in the uploader. For your model, you would also have to set the crease angle to greater than 90 degrees. I don't suppose you did either of those, did you? Another point that is also very unlikely to be your problem - if you set the shading to sharp in Blender by switching on Auto Smooth (Object data properties/Normals) and lowering the angle, that will not work in the uploaded model, because the auto-smoothed normals are not exported* (although the ones you see using custom normal editing with data transfer are exported, even though they look similar). *unless the exporter has been changed recently.
  10. First, you need to be aware that the lower LOD (and physics) models will be stretched and/or squeezed so that their xyz axis bounding boxes fit the bounding box of the highest LOD model exactly. So you need to make them fit the whole of that box if you want to avoid distortion. Knowing that, if you make the LOD models separately, the easiest way to align their parts is to align them exactly with the high LOD model along each axis and adjust them. If you align all three axes at once, so that the models are superimposed, that's easy, but you can't always see well enough. Alternatively, for each axis in turn, you can place one model behind the other along that axis and align them in the other two. Provided you use orthographic projection, you can then align parts along those two axes. Wireframe view can be useful there. If all that isn't enough, then you can get exact measurements by checking the Edge Info: Length checkbox in the Mesh Display section of the properties panel on the right of the 3D view. Once again using orthographic projection along the major axes, you can temporarily add rectangles, stretch them to the bounds of the part you wish to measure, and read the lengths of the selected edges. As MistaMoose says though, you can avoid all these complications by actually making the low LOD models from the high LOD (or vice-versa).
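     (A minimal sketch for a quick check of the first point - printing each LOD object's bounding-box dimensions so you can see whether the lower LODs span the same extents as the high LOD and so won't be stretched on upload; the object names are just placeholders.)

     ```python
     # Minimal sketch: compare bounding-box dimensions of LOD objects.
     import bpy

     for name in ("House_LOD0", "House_LOD1", "House_LOD2"):   # placeholder names
         obj = bpy.data.objects[name]
         x, y, z = obj.dimensions
         print(f"{name}: {x:.4f} x {y:.4f} x {z:.4f} m")
     ```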
  11. They use different kinds of primitives in the physics engine to detect collisions. Triangle-based shapes are just that, a collection of triangles. These don't work very efficiently, especially if (any of) the triangles are small. So their physics weights get larger the smaller the mesh is. When you click "Analyze", the uploader generates a set of convex hulls that are supposed to approximate the shape. The physics engine can work more efficiently with convex hulls than with triangles, and since their size makes no difference, the physics weights are independent of size (mostly). There is a crossover point, at about the size of walls of houses, where the triangle-based shape can start to have a lower physics weight than the hull-based shape. The hull generator can't deal very well with complex shapes. So it is usually best to present it with a set of blocks that are already convex hulls. In other words, do the work before it gets the chance to mess it up. There is a lot of discussion on optimising either kind of shape in previous threads in this forum.
  12. Is the house one mesh when you upload it, or is it a linkset?
  13. One possibility for the unreproducible LI: The LOD generator used by the uploader is non-deterministic. That means that identical input can produce different results on different occasions. The differences can sometimes affect the download weights. If the LOD meshes are used for the physics, then the physics weight can also change. You didn't tell us whether you used automatic LODs. So we can't tell whether this might be the source of the effect. Also, you haven't said whether you used triangle-based (no Analyze) or hull-based (Analyzed) physics. If it was triangle-based, then there are some strange characteristics of the uploader that can produce quite unexpected results. If it was hull-based, then if it used auto-LOD, the physics weights could differ because of the non-determinism. Was it the download weight or the physics weight that changed? Differences in UV mapping can also have quite large effects on the download weight. So if your texture changes involved changes to the UV mapping, that's another possibility. This would be reflected in different uploader vertex counts in the uploads before and after. Did you record these? There is no way to obtain the Prim-type physics weights before uploading and rezzing. If you need to experiment, you might want to do this on the beta grid (Aditi), so that you don't waste real L$ on upload fees. Can't help with the downloading to Blender, as I have no idea how that works.
  14. It might be worth noting how the deviation of length decreases with the increase in accuracy of the curve by increasing the number of segments in the full circle. 8->0.2706; 16->0.2549; 32->0.2512; 64->0.2503; 128->0.2501; 256->0.2500.
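     (A minimal sketch of where those numbers come from, on the assumption that the edge being measured runs from a vertex of the n-segment ring to the matching vertex of the ring inset by 0.25, which makes its length 0.25 divided by the cosine of half the angle turned at that vertex, i.e. 0.25/cos(pi/n) for a full circle:)

     ```python
     # Minimal sketch reproducing the numbers above: radial edge length of a
     # 0.25 inset on a regular n-segment circle is 0.25 / cos(pi/n).
     import math

     for n in (8, 16, 32, 64, 128, 256):
         print(n, "->", round(0.25 / math.cos(math.pi / n), 4))
     ```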
  15. Oh yes ... I guess that's when it's very tiny?
  16. Yes and no. Hmm. I think you are stretching the meaning a bit there. I'd still call it a linkset even if one member is invisible.
  17. Yes. Here's another picture. They are exactly what's expected from the angle of the trapezium. Haven't checked the ones transitioning to the straight edges though.  ETA: The angle between the straight sides and the last segment is exactly half of the angle between two segments. The 0.2503 edge is what's expected when that is divided by 2. So again, it's what is expected. Just another consequence of the fact that curves are approximated by straight-line segments.
  18. Yes. Sculpty download weights are size-dependent, like mesh. Although capped, they can be as small as 1.3. Then they increase as they are stretched, but stop at 2.0. That can reduce their contribution to linksets, but they can never be less than 2 LI on their own because of the physics weight, as you say.
  19. Here's a diagram. First, the shape was made by Alt-selecting the top edge, typing F to make a face, then typing i0.25 to inset the face, then deleting the central face. You get exactly the same geometry if you use the solidify tool with thickness 0.25. The lower inset shows why the radial edges are not 0.25. It's because that is the radial distance between the middle of the inner and outer edges. Because the segments are trapezoids, not curved, the actual edges have to be slightly longer. At the corners, the diagonal edge is sqrt(2)*0.25=0.3536 (sqrt(2)=1/cos(pi/4)) because it's a 45 degree diagonal.
  20. Under new accounting, an unlinked sculpty has its download weight capped at 2. Its physics weight is 1.8 and is independent of size*. So it shouldn't ever have LI less than 1.8, however small it is (except where it uses the old accounting - no normal or spec map, physics type Prim - when it will be 1). If it is linked to another prim and is not the root, then its physics shape can be set to None. In that case the download weight or the server weight will set the LI. The "Prim" shape is actually a convex hull.
  21. Do the vertical posts protrude below the level of the square rim, but the physics shape doesn't? The physics shape gets stretched (or squashed) to fit the x, y and z dimensions of the visible mesh model. So that could explain why the rim of the physics gets pushed down to where the bottom of the posts are. You can overcome this by making the physics shape fit the dimensions of the visible mesh, for example by making the rim thicker, or by adding a small cube underneath that matches the length of the protruding posts. Also, what is the small yellow-highlighted rectangle in the picture of your physics mesh?
  22. If a linkset consists of only traditional prims with no normal or specular maps etc., with prim-type physics shapes, then its LI is calculated with the legacy system as 1.0 per prim. If this is then linked to mesh, or a normal map etc. is added, it switches to the new LI system, where each prim can be as little as 0.5 (server weight) if it is simple enough. So if you unlinked the last object from the linkset that could induce new accounting, the LI of the remaining simple prims could as much as double. This might explain the effect. Try setting the normal (bumpiness) map to "Blank" on one of the remaining prims and see if the LI is reduced as a result. But you say it's 100% mesh - in which case that can't be the explanation.
  23. What you describe - different selectable faces reverting to the texture and colour of a dominant one - sounds like a phenomenon that happens when the relevant materials in SL have the same name. There are at least two known causes. First, when the names in the Collada file are actually the same, or they get truncated by the uploader at included spaces so that they become the same. Second, when a material/submesh has more than 21844 triangles, the uploader creates an additional face that has the same name. As far as I know, Blender excludes duplicate names and the Blender exporter replaces spaces with underscores. Those effects should exclude the first cause. Neither does it sound quite consistent with the second. Nevertheless, it is likely to have something to do with these effects. If you have any material with more than 21844 triangles assigned to it, then the situation becomes more complex, as the extra generated materials can take the total above eight and cause the separation of the mesh into multiple objects. So recommendations are (1) make sure you have no materials in any one mesh with more than 21844 triangles (that would be grossly excessive for this kind of object anyway, for performance reasons). (2) Make sure material names are clearly distinct, preferably without spaces. If these don't work, then there must be yet another cause for this phenomenon.
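     (A minimal sketch for checking recommendation (2) - it lists the material names used by the selected objects and flags names containing spaces, or pairs of names that would collide if the uploader truncated them at the first space, which is the assumed behaviour described above.)

     ```python
     # Minimal sketch: flag material names with spaces, or names that would
     # collide if truncated at the first space (assumed uploader behaviour).
     import bpy
     from collections import defaultdict

     truncated = defaultdict(set)
     for obj in bpy.context.selected_objects:
         for slot in obj.material_slots:
             if slot.material:
                 name = slot.material.name
                 truncated[name.split(" ")[0]].add(name)

     for full_names in truncated.values():
         if len(full_names) > 1 or any(" " in n for n in full_names):
             print("check these names:", sorted(full_names))
     ```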
  24. At first glance, at least five years ago (eg here). Can you explain why you consider "material" is wrong? I doubt if you will change my ingrained habit, but you can have a go if you like. Inside SL the relevant entity is a list of triangles, a subset of those in a mesh, that are distinguished by a set of rendering attributes (textures, colours, etc.) normally known as material attributes. In SL and LSL this entity is called a "face", but that conflicts with the normal meaning of "face" in other contexts. (Hence the terms "SL face" or "texture face" to resolve the ambiguity.) In Collada, the same entity, that becomes the "face" in SL, is a subset of polygons that share (only) a distinguishing "material" attribute. In Blender (and, I believe, in other 3D programs) this entity is created by assigning a material to a subset of faces of the mesh. So the term "material" applies sensibly in all three contexts. In contrast, the terms "SL face" or "texture face" don't seem to have any sensible immediate meaning in the contexts of Collada or the authoring tools. "Material" is not entirely satisfactory because the same set of material attributes can be applied to subsets of faces in different meshes, so that a specific "material" doesn't obviously refer to the subset of a specific mesh, although we often use it for that purpose. Also, there is some ambiguity with the use of "material" to encompass the physical attributes settable in the SL edit dialog. I think we used to use the term "submesh", which doesn't have those drawbacks, but I guess it fell out of use because it didn't have any direct association with the rendering attributes that were the target of interest when referring to this entity.
  25. The order doesn't matter any more because the uploader now sorts the materials by name. You don't have enough triangles to run into the secret material problem (that only happens at >21844 tris/material). So my guess is that you might not have triangles assigned to each of the materials at each LOD. The thing that matters in the collada file is the name of the "material" attribute inside the <polylist...> (or <polygons...> or <triangles...>) tags. There must be the same number of these tags inside the <geometry> tags of each object, and they must have the same "material" attribute values. It is not sufficient to have the materials listed in the materials section of the collada file if they are not actually assigned to some triangles. If you have used more than eight materials in one object, the uploader now deals with that by splitting it into multiple objects, each with eight or fewer materials. I haven't seen an example, but it is possible that this splitting may not be done in a compatible way in each of the LOD files. If that's the problem, and you really have to have so many materials, then you would be advised to split the models yourself rather than leaving it to the uploader, so you can make sure the split objects have the same materials at each LOD. A preferable solution would be to simplify the texturing so that you don't need more than eight materials.
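     (A minimal sketch for checking that, assuming each LOD is exported to its own Collada file: it lists the "material" attributes of the <triangles>/<polylist>/<polygons> tags under each <geometry>, so you can compare the sets across the LOD files. The file names are just placeholders.)

     ```python
     # Minimal sketch: list the material symbols actually assigned to primitives
     # in each <geometry> of a Collada file, for comparison across LOD files.
     import xml.etree.ElementTree as ET

     NS = "{http://www.collada.org/2005/11/COLLADASchema}"

     def materials_per_geometry(path):
         result = {}
         for geom in ET.parse(path).getroot().iter(NS + "geometry"):
             mats = set()
             for tag in ("triangles", "polylist", "polygons"):
                 for prim in geom.iter(NS + tag):
                     mats.add(prim.get("material", "(none)"))
             result[geom.get("id")] = sorted(mats)
         return result

     for path in ("model_high.dae", "model_med.dae", "model_low.dae"):   # placeholder names
         print(path, materials_per_geometry(path))
     ```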