
Everything posted by Drongle McMahon

  1. The dae file accessed through your second link contains a single <mesh> with no fewer than 122 <polylist>s. These are the sections created for each material assigned to parts of a mesh object in Blender (you have more than 130 materials defined; presumably some are not used). The large number of materials is almost certainly the source of your problems. In the good old days, a single mesh object was limited to a maximum of 8 materials. If you provided more, the geometry assigned to the ninth and thereafter was simply omitted from the uploaded data. More recently, an ill-advised kludge was added which allows the upload of dae files with objects with more than 8 materials per mesh. It didn't really drop the restriction. Instead, it splits the object into multiple objects, each of which has 8 or fewer materials. These objects are then uploaded as a linkset. You have no control over the splitting. So your mesh must get split into (at least) 16 separate objects. You haven't given precise details about how you made the physics shapes, which makes it difficult to infer what would happen with the split objects. If you are providing explicit physics meshes, then you need to be aware that you need one for each of the (split) objects in the high LOD mesh. If you don't, the physics becomes rather unpredictable. Otherwise, I haven't studied in detail how the uploader makes the physics after it has split the mesh into multiple objects, but I would guess this is where your problem arises, whichever method you may have used. For example, if a split object is less than 0.5m in any dimension (a single wall), then the triangle-based physics will always behave as Convex Hull, with no door openings, even if set to "Prim". There are a lot of possibilities here, but here are some recommendations. First, don't use more than 8 materials in any mesh. If you really can't make do with 8, then don't rely on the unpredictable splitting by the uploader.
Instead, split the mesh into multiple objects yourself. Then you know exactly what's happening and you can make the physics meshes to fit. Bear in mind that using large numbers of materials means using large numbers of textures (unless they are the same, in which case they can be the same material!), all of which have to be downloaded by every viewer observing your building. This is a terrible drain on everyone's resources. Secondly, don't rely on any of the internal physics model generation in the uploader. You can always do better with a custom physics mesh file. There are many threads in this forum with the details of doing this for buildings while keeping the doorways unobstructed.
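A quick way to see how many materials each mesh in a dae file actually uses is to count its <polylist> sections. A minimal sketch, assuming a Collada 1.4 namespace; the embedded SAMPLE_DAE string and its material names are hypothetical stand-ins for a real file:

```python
# Count the <polylist> sections (one per used material) in each <mesh>
# of a Collada file -- a quick check against the 8-material limit.
import xml.etree.ElementTree as ET

NS = {"c": "http://www.collada.org/2005/11/COLLADASchema"}

# Hypothetical cut-down example; with a real file, use ET.parse("model.dae")
SAMPLE_DAE = """<?xml version="1.0"?>
<COLLADA xmlns="http://www.collada.org/2005/11/COLLADASchema">
  <library_geometries>
    <geometry id="wall">
      <mesh>
        <polylist material="brick-material"><p>0 1 2</p></polylist>
        <polylist material="plaster-material"><p>3 4 5</p></polylist>
      </mesh>
    </geometry>
  </library_geometries>
</COLLADA>
"""

def materials_per_mesh(dae_text):
    """Return {geometry id: [material attributes of its <polylist>s]}."""
    root = ET.fromstring(dae_text)
    result = {}
    for geom in root.iterfind(".//c:geometry", NS):
        mats = [pl.get("material")
                for mesh in geom.iterfind("c:mesh", NS)
                for pl in mesh.iterfind("c:polylist", NS)]
        result[geom.get("id")] = mats
    return result

for geom_id, mats in materials_per_mesh(SAMPLE_DAE).items():
    flag = "  <-- will be split by the uploader!" if len(mats) > 8 else ""
    print(f"{geom_id}: {len(mats)} materials{flag}")
```

Anything reporting more than 8 materials per mesh is a candidate for splitting by hand before upload.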
  2. "Yes probably, but it's hard to see why." :matte-motes-shocked: You are expecting a rational explanation? How long have you been here? :matte-motes-smile:
  3. Interesting. I suspect this is a deliberately introduced effect, like the switch to convex hull for walls at 0.5m thick. Similarly, that one only shows up in the navmesh test view, not in the physics shape display, because the server doesn't tell the viewer it's changed. It is sensitive to the path cut length. Here are two thin cylinders (0.6m high) with x=y=62 and 95% path cut: top 0.84-1.00, bottom 0.85-1.00. It doesn't seem to depend on hollow %. 
  4. Have a look at this thread, which describes the effect you are seeing. It's on glasses rather than clothes, but is surely the same alpha glitch phenomenon. There are methods for avoiding, or at least minimising, the problem using materials and/or sharp edges.
  5. Do you mean something like this box prim? The angle of the top is set by how much the path cut values differ from 0.375 and 0.875. The height of the mid-point is half what the Y value is set to. The only difficulty is even texturing. Here the texture parameters are set using Select Face, as follows: front+back (in view; hor scale, vert scale, rotation): 0.25, 5.0, 0.0. Sides: 0.25, 5.0, 90.0. Top: 0.25, 0.13*, 90.0 (with offsets adjusted). These would change as you change the relative dimensions. The advantage of using a prim instead of a mesh is that you retain complete independent control of the dimensions and the angle. *Opposite sign on each half (to allow alignment of mortar). PS: To include an image, just click on the camera symbol at the top, labelled "Photos".
  6. Ah. There's something else wrong here. The uploader isn't recognising the model at all. I think the material error message is probably a secondary effect.
  7. I'm not sure what you mean by "trying to get LOD1 uploaded". Do you mean that you are supplying only the highest LOD file? In that case, what have you tried with the rest? Did you try just using the default generated lower LODs, and/or did you try selecting "Use LOD Above" for all the lower LODs? The "official" (i.e. internal developers') names for the LODs are highest=LOD3, medium=LOD2, low=LOD1, lowest=LOD0. This can be deduced from the wiki. So I guess it's also possible you mean just the mesh you are going to use in the low slot. Can you search your collada file for "<polylist>", for "<triangles>" and for "<polygons>", and show us all the lines that contain these tags? That would give us the exact "material" attribute strings, which might offer a clue. (I use notepad++ for this sort of thing, but any text editor should do.) It might also be relevant to know whether you have the "slm" mechanism active - you can inactivate it most simply by resetting the upload dialog, by deleting the ".slm" files, or by using a new filename. You can also turn it off permanently with a debug parameter (whose name I forget :matte-motes-frown:, "somethingUseSLM")
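The tag search suggested above can also be scripted instead of done in a text editor. A minimal sketch; "model.dae" is a hypothetical filename, and the sample lines here are made up for illustration:

```python
# Print every line of a Collada file containing the geometry tags the
# uploader cares about, so the "material" attribute strings can be read.
import io

TAGS = ("<polylist", "<triangles", "<polygons")

def grep_geometry_tags(handle):
    """Yield (line_number, line) for lines containing any geometry tag."""
    for n, line in enumerate(handle, start=1):
        if any(tag in line for tag in TAGS):
            yield n, line.rstrip()

# Made-up sample input; with a real file use open("model.dae")
sample = io.StringIO(
    '<mesh>\n'
    '<polylist material="wood-material" count="12">\n'
    '</polylist>\n'
    '<triangles material="glass-material" count="8">\n'
    '</triangles>\n'
    '</mesh>\n'
)
for n, line in grep_geometry_tags(sample):
    print(f"{n}: {line}")
```

The material attribute on each matched line is the string the uploader compares against the other LOD files.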
  8. Adding to Chic's instructions for baking the separately applied textures into a composite single UV map, which is not so simple, you may also need the easier step of making a rectangular pattern fit onto your tie shapes. Here I will assume that you want the same pattern on all of the ties. This would be easier if you had a more regular geometry, but it looks as if you have some peculiar curvy geometry in the ties. So we can't just make everything straight. It usually saves a lot of time if you do your UV mapping before duplicating features like these ties. However, that's too late now. So here is roughly how I would do it starting from where you are. 1. Select your ties (as you have shown) and make a new image in the UV editor at 512x1024. Move the ties in the UV editor so they are as exactly on top of each other as possible. Center them, then stretch them to fill the whole UV area. That should look like the left picture panel, except that you have a more irregular vertex geometry. 2. Now select just the topmost row of vertices, for all the ties [yellow box 1], and switch on proportional editing with linear mode [yellow box 2]. Type S X and stretch (mouse) till the edges of the ties fit the edges of the image. You will probably have to do this several times with different ranges, adjusted with the mouse wheel, to get the edges to fit all the way down. If you need very fine fitting, and they are not very equal, you might even have to select the ties separately while you do this. You can switch on constraint of UVs to the image area to make sure you don't stretch anything outside the image. 3. Now you can load the texture image into the UV editor. 4. The pattern should now fit the ties, suitably squeezed to fit at the top. This is shown using Texture Solid display mode in the 3D view. These adjustments to the UV map should be the same in Cycles. It only differs in how you use the UV map to apply the texture.
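For anyone unsure what the proportional stretch in step 2 actually does to the UV coordinates, here is the arithmetic as a sketch (plain Python, not bpy; all numbers are hypothetical): a scale in X whose strength fades linearly to zero at the influence radius, measured from the selected row.

```python
# Sketch of proportional editing with linear falloff on UV x-coordinates:
# a scale about u=0.5, full strength at the selected row of vertices,
# fading linearly with distance in v up to the influence radius.
def proportional_scale_x(uvs, selected_v, scale, radius):
    """Return UVs with u scaled about 0.5, weighted by linear falloff in v."""
    out = []
    for u, v in uvs:
        falloff = max(0.0, 1.0 - abs(v - selected_v) / radius)  # linear falloff
        factor = 1.0 + (scale - 1.0) * falloff
        out.append((0.5 + (u - 0.5) * factor, v))
    return out

# Tie edge vertices: the top row (v=1.0) gets the full stretch,
# rows further down are stretched progressively less.
uvs = [(0.4, 1.0), (0.6, 1.0), (0.45, 0.5), (0.55, 0.5)]
print(proportional_scale_x(uvs, selected_v=1.0, scale=2.0, radius=1.0))
```

Adjusting the radius with the mouse wheel in Blender corresponds to changing `radius` here, which is why several passes with different ranges are needed to fit an irregular outline.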
  9. Face Issue

     "GIMP uses an Alpha Channel by default which cannot be deleted" Layer->Transparency->Remove Alpha Channel (GIMP 2.8.2, at least)
  10. I mean it's not there in the operator panel at the bottom of the tool shelf, after you do Mesh->Faces->Solidify, not the modifier, which has lots of options. I seem to remember it being there before. The tool is easier to use if you only want to solidify a subset of faces, as you don't have to make a vertex group first.
  11. In Blender, the default rendering of faces is two-sided. In SL, everything is rendered one-sided. So if you want a top and a bottom, you have to duplicate everything and invert the normals so that the faces point in the opposite direction. To see what's going on in Blender, you can turn on "Backface Culling" in the Shading section of the 3D View's Properties panel. That will make everything one-sided. There is a quick way to do the duplication by selecting the relevant faces and using Mesh->Faces->Solidify. Or you can use the Solidify modifier, which gives more control, including choosing whether to add edge faces*. With the modifier, you have to define, populate and use a vertex group if you only want part of the mesh solidified, but I think that's not the case with your roof. Anyway, you can always apply the modifier and then remove any bits you don't want. *I think that used to be an option with the tool, but it seems to have disappeared (2.77a).
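The "duplicate and invert normals" step comes down to index arithmetic: a face's visible side is determined by the winding order of its vertex indices, so reversing the order gives a back-facing copy. A minimal sketch with hypothetical face data, outside Blender:

```python
# One-sided rendering fix as index arithmetic: SL shows a face only from
# its front, and the front is set by the winding order of the vertex
# indices. Appending a reversed-winding copy of each face gives the
# mesh a visible back as well.
def add_backfaces(faces):
    """Return faces plus a reversed-winding copy of each (two-sided sheet)."""
    return faces + [tuple(reversed(f)) for f in faces]

roof = [(0, 1, 2), (0, 2, 3)]            # single-sided quad as two triangles
print(add_backfaces(roof))
# -> [(0, 1, 2), (0, 2, 3), (2, 1, 0), (3, 2, 0)]
```

This doubles the triangle count, which is why it's worth doing only for faces that will actually be seen from both sides.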
  12. 1) I think the appearance will be better using smooth shading. However, if you do, you have to make the edges sharp using the edge-split modifier*, to avoid shading artifacts in the direction of your corrugations. That will mean there is very little gain in download weight. 2) You have not used at all excessive geometry. The six segments per cycle of the corrugations is the minimum needed for a reasonable appearance when the camera is close. 3) I used half your original panel (i.e. removed the array modifier from the larger object) to do some experiments. I closed off the edge faces at the ends first, then made some lower detail versions. First, dissolving two thirds of the edge loops so that the edge is a zig-zag instead of a smooth curve. Second, dissolving all non-edge edge loops to make a flat panel. Note that with smooth shading and edge-split modifiers, both of these look completely flat. The zig-zag just keeps the wavy appearance of the edge where you see its silhouette. To look good, these would have to use at least a normal map to give the corrugated shading. That would only show under advanced lighting. So some other faked/baked lighting effect would be needed (as well) to look good under ordinary lighting. 4) Using the panel with default LODs and default physics gave me an LI of 11. The difference between flat and smooth shaded was tiny. Replacing the medium LOD with the zig-zag and the lower two LODs with the flat version gave me an LI of 5. Using the flat version for all LODs except high gave an LI of 1. I suspect the latter is what most people would do. 5) None of those LIs is anywhere near the hundreds you mention. You didn't tell us whether it was the download or the physics weight that was high. So I have to guess it was the physics weight. Using the high LOD for the physics, if I didn't "Analyze", the resulting triangle-based physics shape is less than 0.5m thick. That means it stays as Convex Hull even if you set it to Prim. 
So this did not give me a high physics weight. The only way I could get a very high physics weight was to use the high LOD mesh and "Analyze". That gave me a physics weight, when set to Prim, of 69. So that's the only thing I can think of that would give you such a high physics weight. This is, of course, completely unnecessary**. The physics can be the flat slab I made for the lowest LOD, weight 0.36. Even using the default convex hull it's only about 0.5. *You could get the same effect by using Generate Normals in the upload dialog. **Unless you want realistic rolling on/across the roof corrugations. In that case you will have to pay a lot for physics, even if you use the zig-zag version.
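The triangle budget behind those LOD swaps is easy to estimate. A rough sketch under stated assumptions: the panel is a simple strip, each segment across it is one quad (two triangles), the full version has six segments per corrugation cycle and the zig-zag two; the cycle count of 20 is a made-up example, not the poster's actual panel:

```python
# Rough triangle budgets for the corrugated-panel LOD versions discussed
# above. Assumes a one-quad-deep strip: each segment is one quad, i.e.
# two triangles. The cycle count is an illustrative guess.
def panel_triangles(cycles, segments_per_cycle):
    """Triangles in a one-quad-deep strip of corrugations."""
    return cycles * segments_per_cycle * 2

cycles = 20  # hypothetical panel width in corrugation cycles
for name, segs in [("full (6 seg/cycle)", 6), ("zig-zag (2 seg/cycle)", 2)]:
    print(f"{name}: {panel_triangles(cycles, segs)} triangles")
print("flat slab: 2 triangles")
```

The steep drop from the full version to the flat slab is why swapping the lower LOD slots makes such a difference to the download weight.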
  13. Please tell us how many triangles and vertices it has, as reported by the upload dialog, how many corrugations it has, its dimensions, whether the physics weight or the download weight is higher (More Info link on edit dialog), how you made the physics shape (if you did), and whether it is using smooth shading in Blender. That information would make it easier to give you advice.
  14. Most likely you have uploaded with a triangle-based physics shape (i.e. you didn't click "Analyze" on the physics tab) and your wall is less than 0.5m thick. In that case, the server secretly converts the physics shape to Convex Hull even if you have set it to Prim. That fills in the hole. You can test to see if that's the problem by stretching the wall in the thickness direction so that it's more than 0.5m thick. If you select your high LOD mesh and click "Analyze", that should avoid the secret conversion. There are sometimes problems with "Analyze" if your mesh is at all complex. In that case, instead of using the visual mesh, you can make a special one and use its file for the physics instead. Best is a collection of non-overlapping boxes. Sometimes you can get a lower physics weight by using a triangle-based physics shape, as long as you use a version of the mesh with the narrow faces along the edges of the wall removed. In that case you need to overcome the secret conversion, either by making the wall thicker than 0.5m, or by adding some geometry, possibly underground, to make the whole mesh at least 0.5m thick.
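The 0.5m rule described above can be written as a one-line predicate. This is a sketch of the observed behaviour, not actual server code, and the example dimensions are hypothetical:

```python
# Observed server behaviour: a triangle-based (non-Analyzed) physics
# shape behaves as Convex Hull whenever the object is thinner than
# 0.5m in any dimension, even if the shape type is set to "Prim".
def physics_forced_to_convex_hull(dimensions, analyzed):
    """True if a shape set to 'Prim' would still act as Convex Hull."""
    return (not analyzed) and min(dimensions) < 0.5

print(physics_forced_to_convex_hull((4.0, 3.0, 0.2), analyzed=False))  # True
print(physics_forced_to_convex_hull((4.0, 3.0, 0.6), analyzed=False))  # False
print(physics_forced_to_convex_hull((4.0, 3.0, 0.2), analyzed=True))   # False
```

The stretch-the-wall test in the post corresponds to pushing the smallest dimension over 0.5, which flips the first case to False.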
  15. Indeed, the distortion of the custom normals is a Blender problem, not an exporter problem. Last time I tried, choosing the triangulate option on export also distorted the custom normals. I have just avoided the triangulate option since then, so if you have fixed it I wouldn't have noticed. I will make sure I've got the latest Blender release and try again, then add the result to this post. Should I use something later than 2.77a? Here are the results... 1) Square plate with one-segment bevels and normals transferred with the Data Transfer modifier from a similar plate with two-segment bevels with profile=1.0: the triangulate option distorts the normals. So does the triangulate modifier if it's applied after the Data Transfer modifier, but not if it's applied before. This is the same behaviour observed before. 2) Cylinder with/without either the edge-split modifier or auto-smooth, with angle 30 degrees. Now the auto-smooth normals are correctly exported after import into SL. That is new. Last time I tried, it ignored the results of the auto-smoothing. So I guess the problem with not exporting the auto-smooth normals is fixed, which is excellent, but the problem with triangulation of transferred normals is still there. If there is only a data transfer modifier, that would be easy to fix, because the effect of the ordering of the triangulate and data transfer modifiers suggests that it is only necessary to do the triangulation before applying the data transfer. However, triangulating before other preceding modifiers might interfere with them. So I guess you can't just do it before applying all modifiers. Some more complicated ordering might be necessary. Maybe inserting a triangulation modifier before the data transfer modifier? For me, it's fine as it is, because I can leave the triangulate option off and either do it explicitly in Blender or rely on the viewer to do it. I suppose others may not be as happy with that. 
Here's a picture of the normal corruption by the triangulate option. Option on at left, off at right. Interesting to note that the triangulation done in the viewer appears not to interfere with the custom normals. So it's not an impossible task. ETA: I guess Lithium have completely messed up the image handling. Can't see the normals properly here. Here's a close-up... 
  16. "I save the object from this program as an OBJ. file as it cant save as a DAE. I try opening this NEW file in Blender." I have to make some assumptions here - I am guessing you saved an obj file from SL, although I have no idea how. Then you opened it with Blender, maybe using the Windows Explorer right-click menu. If you do that, Blender will try to open it, but discover that it's not a Blender file. So it will fail, and instead of what you want, it opens with the default file, which is just a cube. Blender can't "Open" obj files; it has to "Import" them. What you need to do is to open Blender, then Import the obj file. That's done with the File menu, Import submenu, click the Wavefront (.obj) option. If my guesses are wrong ....
  17. I think you need a light source. Move the lamp I can see out in front of the house, or put a new one there, so that some light falls on the textured walls. If you experiment with the display and view rendering options, you should also find settings that will show you the materials without needing lighting.
  18. In Blender, edited normals are also exported correctly, but if you use the "triangulate" export option, apply modifiers after the normal-editing ones, or make any other changes to the geometry after editing the normals, they get badly distorted (unless the mesh is already only triangles). In general, custom normals are not stable in the face of subsequent changes to the mesh geometry. The "Auto-smooth" thing on its own apparently uses a different mechanism, such that the altered normals are not recognised by the exporter functions, but if you copy them onto a duplicate of the mesh, then they use the edited-normal mechanism and are therefore recognised.
  19. "the collada files can become a bit smaller" True, although maybe not really much benefit because the collada files are only read locally by the viewer. If you do this, please leave it as <polylist> if the triangulate option isn't selected, even if the mesh is all triangles*. That way it will be possible to triangulate explicitly in Blender before upload and still upload as <polylist>. It's a very long time since I uploaded anything with <triangles>, since the old python exporter. So I haven't got too much faith in the <triangles> function (although I guess Arton has). There is a more severe problem with the triangulate though, as it destroys custom normals. If you could fix that in the exporter at the same time, that would be great. The Blender people told me there were no plans to make custom normals survive any changes in geometry. So I don't know if that's even possible. While I seem to be compiling an exporter wishlist :matte-motes-nerdy:, there's the normals from activating Auto-smooth. Any chance of exporting those? It is possible to get them exported by duplicating the mesh and transferring the normals, but that's not exactly intuitive. (I'm assuming you haven't done this already). *sounds like what you intended - so that should be ok.
  20. "Doesn't this make people believe that higher poly counts become more tolerable?" That would be my worry, as far as SL is concerned. <triangles> also introduces some uncertainty, because the threshold number of vertices would depend on smooth/flat shading and UV mapping, while the <polylist> function doesn't vary. However, it would reduce the problem for some users. I would rather see an explicit option, use <triangles> or <polylist>, instead of a hidden response to triangulation. As well as giving more control, that would make sure the results of uploading existing files don't change unless the option is selected. Of course the "correct" resolution would be for the code to be fixed. At the least, the <polylist> function could be based on the vertex count instead of the index count, like the <triangles> function, giving a higher limit. Better still, as far as I can see, these tests could be completely removed. That would mean they no longer pre-empted the 65536 vertex limit, elsewhere in the code, that raises an error and stops the upload. That way the originally intended limit of 65536 vertices per material would be enforced, and the whole secret material thing would disappear, along with the effects it has on objects with more than eight materials. Unfortunately, that would probably also break a lot of existing files. It's also unlikely because the jira has been there for several years without any response.
  21. Ah yes. My mistake. The extra material won't be triggered with <triangles> until you get 65532 vertices, not at 21844 triangles. I edited the post above to give the correct numbers. Your cushion is a bit short of the threshold for <triangles> files. By the way, I didn't get it yet, but it doesn't matter now.
  22. ETA - this was mistaken ... see above. I don't think you can have those numbers without getting the extra materials*. As long as you only apply one texture or colour to the whole thing, you won't see the effects of the extra materials. To see if they are there, you have to activate "Select Face", click on the mesh, and then change the colour. By clicking on different places, you can eventually make each secret material a different colour, as long as it's not the "master" material, which will recolour all the others. The patchwork colouring will not be visible to other observers and will disappear if you relog. In those cases, the "master" always overrides. If you want to give me a copy of the cushion on Aditi, I would be interested in checking it out. *That is unless your dae uses <polygons>, in which case I don't know what happens.
  23. I guess so, although staying inside the 21844 triangle limit will always avoid the problems with the secret extra materials, because a triangle can't have more than three vertices.
  24. ETA: corrected after Arton detected a mistake. This is quite true. There is an explanatory note in a comment on the BUG-1001 jira. The relevant line of code is different in the two functions that are used to load the dae file data... In the <polylist> function, it's if (indices.size()%3 == 0 && indices.size() >= 65532). In the <triangles> function it's if (indices.size()%3 == 0 && verts.size() >= 65532). In either case, if the result is true, that triggers the new secret material. The "indices" list is a list of indices into the "verts" list of vertices. Every triangle must have three entries in the "indices" list, one for each of its vertices. This means that the <polylist> function always triggers the new material after 21844 triangles. However, entries in the "verts" list can be shared between multiple triangles, as long as the position, normal and UV coordinates are the same. This means the limit in the <triangles> function depends on the extent of sharing of vertices between triangles. For an extended sheet of smooth shaded triangulated quads, I think this is about two triangles per vertex. So the limit would then be reached at 65532 vertices, which would be 131062 triangles! So in general, the trigger for a dae file using <triangles> should be three to six times higher than the 21844 triangle count for the <polylist> file. In any case, the 65532 limit will still be reached before the 65536 vertex limit which raises an error. So the extra secret material effect should still happen eventually. Having said that, I haven't experimented with <triangles> since abandoning Blender 4.x, which used them. So some experimental (in)validation of these statements would be worthwhile. There is a third function that loads the data from a <polygons> section instead of the other two. The structure of this code is quite different, and I don't know if or where equivalent limits are imposed. 
I don't know of any, but the code mentions that some fbx to dae converters use this kind of data. There's no reason obvious to me why the code should impose these limits on the number of triangles or vertices. Restriction of the number of distinct vertices to 65536 is a requirement of the upload data format, in which the three pointers for each triangle into the vertex list are limited to 16 bits, but these limits are always more stringent*. There aren't any pointers into the triangle table, as far as I know, so there is no obvious reason to limit its length, unless there is a 16-bit counter somewhere I am unaware of. *Except for highly contrived structures with redundant triangles, I think.
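The two trigger conditions quoted above, and the thresholds they imply, can be reproduced as a short sketch. This restates the quoted checks in Python; the smooth-shaded-sheet ratio of about two triangles per vertex is the post's own estimate, not something verified here:

```python
# The two secret-material trigger tests quoted above. The index list
# grows by 3 per triangle; the vertex list grows by 1 per *distinct*
# (position, normal, UV) combination, so it can be shared.
LIMIT = 65532

def polylist_triggers(triangle_count):
    """<polylist> check: based purely on the index count."""
    indices = 3 * triangle_count
    return indices % 3 == 0 and indices >= LIMIT

def triangles_triggers(triangle_count, vert_count):
    """<triangles> check: based on the count of distinct vertices."""
    indices = 3 * triangle_count
    return indices % 3 == 0 and vert_count >= LIMIT

print(LIMIT // 3)                         # 21844: <polylist> triangle threshold
print(polylist_triggers(21844))           # True
# Smooth-shaded sheet, ~2 triangles per vertex: roughly 131062 triangles
# before <triangles> trips the same 65532 limit.
print(triangles_triggers(131062, LIMIT))  # True
print(triangles_triggers(21844, 10924))   # False: same mesh, fewer distinct verts
```

The asymmetry between the two checks is exactly why the <triangles> threshold floats between about three and six times the fixed <polylist> one, depending on vertex sharing.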