Drongle McMahon

Advisor
  • Posts

    3,539
Everything posted by Drongle McMahon

  1. "What can and what can you not do with mesh?" You can do almost anything. You can't use flexi. You can't animate by switching the mesh geometry, but you can switch the visibility of parts. "Errmm.... how big can you build and import in mesh, and why do you guys prefer mesh over building from scratch?" The maximum x, y or z dimension of a mesh is 64m. Mesh IS building from scratch; other prims are ready-made parameterised meshes. Mesh gives you better render efficiency (mostly), better control over LOD changes, and better texturing.
     1. "How big of an import can I bring from Blender to SL? Once it's in SL, can I resize and adjust as needed? It's one object, right?" For a single mesh prim, 64m (above). You can stretch and rotate the whole mesh in each dimension, limited by the 64m maximum. You cannot use any of the other distortions available for ordinary prims. You can import multiple mesh prims in one upload; they will form either a single linkset object or a set of coalesced objects.
     2. "Can I actually create corridors and other complex rooms in my mesh? Or will it just simply be a 3D building, scenery more or less?" Yes, you can make buildings that you can walk inside. To do so, you need to learn about making physics shapes, which is quite complicated.
     3. "After the mesh is built, can I add scripting to any of it? Such as a blinking light on top of the tower, or will I be forced to add it as a new prim?" You can put scripts in mesh prims. They can change the texture and other properties of the faces, just as with ordinary prims. Each mesh can have up to eight faces/materials. Unlike ordinary prims, the faces can be made up of discontinuous patches of the mesh surface. You cannot change the geometry of the mesh with scripts, although you can change the visibility of faces (with invisible textures) and the size of the whole thing.
     4. "Can you tell the difference between Mesh and..." ...and what?
     5. "I understand they charge you depending on the size and complexity of the mesh. Is there a maximum limit or does it just keep going up?" I think the upload fee is not explicitly capped. There is a maximum complexity of 65536 vertices per material, but you would be most unwise to try to import anything remotely approaching that limit; imports of very complex meshes well short of it generally fail for unknown reasons. There is another accounting system which determines the "cost" of the mesh against the parcel prim allowance. This is called the land impact. It depends on visual complexity, physics representation and size. Learning to control the land impact is the major challenge of SL-specific mesh design.
     6. "I understand that you can generally build bigger and better things with mesh, with fewer prims, than building from scratch? Is this true?" Those are subjective judgements depending on varying personal criteria. Many would say yes, with a few exceptions.
     7. "What is the meaning of life (2nd)? lol" 42 vertices.
     8. "Anything else you wish you knew as an aspiring mesher?" Exactly how the server calculates physics weights.
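The numeric limits mentioned above (64m maximum dimension, eight faces/materials per mesh, 65536 vertices per material) can be expressed as a quick pre-upload sanity check. This is a minimal illustrative sketch, not part of any SL tool; the function and constant names are mine.

```python
# Hypothetical helper illustrating the SL mesh upload limits discussed above:
# max 64 m in any dimension, up to 8 materials (faces) per mesh, and at most
# 65536 vertices per material. Names and structure are illustrative only.

MAX_DIMENSION_M = 64.0
MAX_MATERIALS = 8
MAX_VERTS_PER_MATERIAL = 65536

def check_mesh_limits(dimensions, verts_per_material):
    """Return a list of limit violations (empty list = within the hard limits)."""
    problems = []
    for axis, size in zip("xyz", dimensions):
        if size > MAX_DIMENSION_M:
            problems.append(f"{axis} dimension {size} m exceeds {MAX_DIMENSION_M} m")
    if len(verts_per_material) > MAX_MATERIALS:
        problems.append(f"{len(verts_per_material)} materials exceeds {MAX_MATERIALS}")
    for i, verts in enumerate(verts_per_material):
        if verts > MAX_VERTS_PER_MATERIAL:
            problems.append(f"material {i} has {verts} vertices "
                            f"(limit {MAX_VERTS_PER_MATERIAL})")
    return problems

# Example: a 70 m tower with one over-dense material fails on two counts.
print(check_mesh_limits((70.0, 10.0, 10.0), [70000]))
```

Note that, as the post says, passing these hard limits is no guarantee of a successful upload: very complex meshes well inside them can still fail.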
  2. What are you using to import it? It looks like it is using the wrong stitching mode. What is the stitching mode in SL? (sphere/cylinder/torus/plane). What happens if you change that? Does the importer have an option to specify which to use?
  3. WEB-4587 is one of that minority of jiras that gain high exposure, for obviously understandable reasons, and then become overrun with comments that are at best not really useful to the solution, and at worst aggressive. I think we have to assume that the changes have been introduced to avoid having to deal with this kind of jira. I don't mean to imply that the frustration expressed there is not legitimate, only that it doesn't contribute to the solution, which is what the jira is for. However, it should be possible to find a less drastic solution to this. That jira could surely reasonably have been closed to comments some time ago. Perhaps some other way for people to express their frustration could be found? I remember seeing film of some Japanese companies who kept rubber models of their executives in a basement room, together with baseball bats for employees to hit them with. A digital equivalent, maybe with Linden-bots (even a Rodvik-bot?) in the stocks, could be set up somewhere inworld. There's plenty of abandoned mainland available; perhaps some could be dedicated to this? Probably better if it's not personal. The effigies could be identified by issue identifiers hung round their necks, and a count of the blows received used to measure the extent of frustration per issue. A small charge for setting up a new effigy could be adjusted to discourage potential over-use. I would have made this a feature request, but those are not available any more.
  4. I have to emphatically agree with you here. Interactive refinement of the reproduction and extensive testing by the originator, all after triage and acceptance, and all at the request of Lindens, have often been involved in jiras I have participated in. Often the refinement depended on non-originator contributions, and often those contributions were in the form of generalisation to further cases and relationships to other jiras (which they will now not see). Is it possible the changes were designed by someone who is just not aware of the way the jira is constructively used, only of how it is occasionally abused?
  5. I wouldn't worry about uploading the textures with the model. It doesn't save you anything; in fact it means you have to pay the mesh upload fee whenever you modify the texture. The only problem with applying textures inworld is when you have a low-LOD texture that is hidden at the higher LODs. You can get round that by lowering RenderVolumeLODFactor to bring the lower LODs into view (0 for the lowest LOD). Can't help you with the Maya-specific stuff, I'm afraid.
  6. Here is the sequence I just checked, assuming you have the unwrapped UV. The order is important.
     1. Go to Edit mode / face select mode and select any face [you should see the stippled active-face display].
     2. In the 3D view, select all [you should see the whole UV map; if you don't do this, only the selected face will bake].
     3. (Optional) In the UV editor view, View -> View Properties: select "dashed" [lets you see the AO better].
     4. Image -> New, leave black [to see the AO better].
     5. Shading buttons (F5); World; Ambient Occlusion panel: set raytrace with 12 samples, energy = 1.6, rest default.
     6. Render -> Bake Render Meshes -> Ambient Occlusion. You should see the black image being overwritten with the AO. When it is finished, it should appear on the mesh if you select textured display, and you can save the image from the UV/Image editor window. Then you can combine it with the diffuse texture either inside or outside Blender.
  7. Can you supply a bit more info? ... Is this Blender? What version? What does your UV map look like? How are you applying the texture? What do you see if you apply a grid texture?
  8. One aspect of this doesn't seem to have been mentioned ... When people come to the mesh forum looking for explanations and/or solutions, and the answers are not immediately obvious, my first action is almost invariably to look at the jira to see if it's a known issue. More often than not it is, and the jira provides all the detailed information needed to explain and/or suggest solutions, say whether there is a fix coming, or say there is one already there in a new viewer. This kind of accessible and up-to-date information, concerning the nature and status of issues, is indispensable for providing reliable help to those who seek it, whether here in the forums or in Answers. It depends both on visibility and on comments from non-originators. This move to obfuscate the jira has completely destroyed that source of information. I don't know if I will choose to go on offering help; there is no satisfaction in relying on rumour, guesswork and supposition. If others persevere, the quality of the help they can provide will be compromised. Given that LL long ago decided to rely entirely on user-provided help, this must result in a significant diminution in the quality of the help available to users, including those who have never even heard of the jira. That in turn can only decrease user satisfaction and weaken retention.
  9. "How in the world can we avoid duplicate JIRA's?" Easy ... just don't submit any!
  10. "it was LL's decision to merge the customer issues JIRA with the development team's JIRA." But as far as I could see, they always cloned issues into a private jira when they started work on them. So I can't see that as the explanation.
  11. "Where, if it wasn't for the jira, would people have been able to discover issues like SVC-8124, ...?" This is an appallingly retrograde decision. It demonstrates contempt for contributors to the jira. It removes at a stroke a most valuable user resource. I seriously regret having put substantial effort into jira issues in the past, only to be met with such a dismissive response. There is clearly no point in doing so in the future. In answer to your question, it must surely be that they want to prevent people knowing about such issues. "Shut up and eat what's put in front of you!"
  12. Any chance there could be a doubled row of vertices, with a set of very thin upward-facing faces between them? Then the smooth shading could cause that sort of lighting effect. Perhaps this is caused by distortion introduced by the rigging? If you apply a regular grid texture, any such distortion should be immediately apparent.
  13. If you use orthographic view, you can put them far away from the center to avoid the clutter.
  14. I tried some silver using this as a starting point. Maybe you are using Mirror in the texture with low/no diffuse texture, so that when there's nothing to reflect, it's black. I think you need to have something to reflect if you want to see the shininess. I surrounded this with a huge inside-out sphere with a procedural texture on it. All the lights and the camera have to be inside the sphere. If you know where the object is going to be, in a fixed position, you can use (an approximation of) the actual surroundings to get realistic reflections. ETA - better picture! I forgot to turn the subsurf on ..... and better texture and lighting.
  15. It's not your shadow, it's Ruth's. How did you get a red cloud? Mine is always white.
  16. Mesh is the best solution. You could make excellent toes with 30 times fewer faces to render than a sculpty, and let them merge together at lower LODs to save even more. Meanwhile, the sharp edge between "merged" sculpties is often because of the abrupt change in normal of the adjacent smooth-shaded surfaces. You can minimise this by using plane or cylinder sculpts with open edges where they meet and the minimum angle between them. See below for examples of open and closed joints.
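To put a rough number on the "30 times fewer faces" claim: a sculpty always renders as a fixed vertex grid at the highest LOD (commonly cited as roughly 2048 triangles), regardless of how simple its shape is, while a hand-modelled toe might need only a few dozen triangles. A back-of-envelope sketch, where the 64-triangle toe mesh is an assumed figure of my own, not from the post:

```python
# Back-of-envelope comparison of render load, sculpty vs custom mesh.
# A sculpty is rendered as a fixed quad grid at the highest LOD (the 32x32
# figure is the commonly cited tessellation); the 64-triangle toe mesh is a
# hypothetical, illustrative figure.

SCULPTY_GRID = 32                                # quads per side of the sculpt grid
sculpty_tris = SCULPTY_GRID * SCULPTY_GRID * 2   # two triangles per quad

toe_mesh_tris = 64                               # assumed hand-modelled toe

ratio = sculpty_tris / toe_mesh_tris
print(f"sculpty: {sculpty_tris} tris, mesh: {toe_mesh_tris} tris, "
      f"ratio: {ratio:.0f}x")
```

With those assumptions the ratio comes out around 32x, the same order of magnitude as the "30 times fewer" in the post.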
  17. Agreed. Really I tried the conversion to mesh to see if I could then use the result to bake a texture onto a simple mesh. The results were horrible though. So I must use just an image of the rendered particles instead. In that sense, it's just an experiment in using Blender to draw a hair texture, and there are plenty of other, perhaps better, ways to do that. I might try having some of the mesh carrying an alpha material to give that sort of layered effect, but I worry about the alpha layering bug if there's too much of it.
  18. I'm trying to make some new hair to replace my present 100-torus one. It will just be a simple mesh with the rendered texture from Blender. I'm still doing experiments with that. Complicated isn't a strong enough word. I got started by looking at (it's 2.48, but....). You will see many more if you search YouTube for "Blender hair". I did try making the particle hair into a mesh ... you can convert it into hairs that are one edge, then make it solid by extruding 0.003 in x, y, -x, -y and removing doubles. The solidify in Blender doesn't seem to be able to cope with something this complex. Anyway, it loses a whole lot of segments; not sure why. But that ended up with over half a million quads, and I don't want to put something like that in SL, even if it was possible.
  19. Kwakkelde, it's C4D, not Blender. Madelfieste, in Blender that is the problem you would get if you rotated one relative to the other in Object mode and didn't apply the rotations - even though they look the right way round in Blender. As Kwak says, the rotated box in LOD4 is then stretched and squashed to fit it into the bounding box of the high-LOD mesh. I don't know whether there's something similar to Object mode rotation in C4D. Congratulations on solving it.
  20. What happens when you look at the lowest LOD of that? (Set RenderVolumeLODFactor to 0.) Does the mesh have more than one material? If so, are the repeat settings different on the different materials? (I ask because there is a bug with material-to-face allocation on LOD switching that has not been fixed.)
  21. I was waiting for someone else to answer. The only thing I can think of is that the UV map for LOD4 is not right. It's the sort of thing that happens in Blender when you remove edge loops by merging vertices and forget to adjust the UV map: the vertex movement happens on the model but not on the UV map. Now we have the dissolve function, which avoids this problem. As you are using C4D, I can't give any specific advice, but it may be something similar.
  22. Not at all. Properly made and optimised for SL, meshes are usually more resource-friendly than sculpties. That is not reflected in the LI, for reasons of backward compatibility. Also, there are plenty of unoptimised meshes around that do waste resources, just as there are plenty of excessively used sculpties.
  23. Just select the faces you want to solidify and do Mesh->Faces->Solidify. Adjust the thickness in that panel at the bottom of the tool shelf on the left. This will make the connecting faces too. (That's an entirely amateur suggestion, by the way).
  24. "Is it possible to bake that into a texture that can be used in SL?" Well, after a bit more playing .... It seems that in principle you can, because you can convert the particle tracks into mesh, make them solid and then bake that into a texture on a simpler mesh. That's a bit involved and too slow on the machine I'm using at the moment. I'm not sure how well it would work. Meanwhile, it's easy to grow the hair out of a plane, adjust clumpiness, curliness etc. and get things like these. Then use those suitably mapped onto a hair mesh. With a bit of work, this might be a better route. (You might want to change the color, I suppose!)
  25. You provoked me into playing with it. Now I will never get anything done!