NaomiLocket (Resident) · Posts: 94
Everything posted by NaomiLocket

  1. You do have a point there and there was certainly some interest.
  2. It might be worth noting the specific contexts in which this is actually true and relevant, or rather underlining and stressing that. It is nice that you've taken the time to run a basic test and find something out, but to be realistic, it is missing some fundamental points. A stripped-down basic flat shader will render hundreds of thousands to millions of triangles without sneezing, even on old outdated hardware. Though if the controlling program or shader is written particularly poorly, it might not; that is a hard one to get wrong. In terms of SL, different windlight settings will impact the same viewed content, even exponentially. The content is clearly not the problem in those cases. That is to say, the same content that renders fine at 30-60 fps can, by changing scene settings, crawl to a snail's pace. We can observe that out in the wild. Other teams and software packages set the bar for now-outdated "nextgen" at 10k triangles per character years ago. You're talking 4.4 frames per second. No one ever realistically feels a framerate until it drops under fifteen, though some may argue twenty. I've stubbornly played an FPS game averaging 8 frames per second; I know from experience when and what that kind of pain is. The number of triangles you are talking about in your test is fundamentally tiny and doesn't relate to the issue of using high-poly content that was not designed for real-time rendering, which is probably the more significant topic. Textures, being images, are just a table of data. A shader need never use all of it; I doubt the underlying library even does, explicitly, all the time. OpenGL has had many revisions over the years and is supported/implemented by NVIDIA. That doesn't necessarily mean that SL uses them, I guess; I haven't actually looked at the code to be sure. But it will come down to how it is written and used. In that sense, texture size for those of us who do not touch the viewer's rendering pipeline is mainly a time-to-download concern.
Circle of control and circle of concern. Of course, most reasonable creators are not totally suicidal, and if it doesn't work on their own viewer and machine, it clearly doesn't work. Just a few things to consider. It is worth being reasonable and wise with triangle counts, but it is not imperative or necessary to stress over them.
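As a rough sanity check on those framerate figures, the frame-budget arithmetic can be sketched in a few lines of LSL (the fps values are just the ones discussed above; the script is purely illustrative, not part of any viewer code):

```lsl
// Frame budget per framerate: milliseconds available for each frame.
// 4.4 fps leaves roughly 227 ms per frame, versus ~16.7 ms at 60 fps.
default
{
    state_entry()
    {
        list rates = [4.4, 8.0, 15.0, 30.0, 60.0];
        integer i;
        for (i = 0; i < llGetListLength(rates); ++i)
        {
            float fps = llList2Float(rates, i);
            llOwnerSay((string)fps + " fps = "
                + (string)(1000.0 / fps) + " ms per frame");
        }
    }
}
```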
  3. Some of the decisions people make are based on personal convenience, or on solving for a particular thing. I have my doubts something like this is as hugely technical as many felt, or that it marks a creator as good or bad based on someone's opinion, though it does suggest they are not trimming as much as possible at the end of the workflow. Some of the basics of SL's mesh implementation dictate that the bounding box will be stretched and filled to; or at least I am pretty sure that was mentioned as a caveat when doing custom physics shapes, and why they may not match the visual geometry. A person's preference for which object to pad the ratio of the geometry with will likely be based on their selection tastes in the tool they are using. A tiny triangle that may be backface-culled is harder to work with and place than a cube. SL's rendering of non-masked transparencies would cover the entire surface, or may impact click-through ability in some conditions, so limiting the surface area and keeping the object humanely convenient to handle at speed seems a likely factor. And that leaves out points of origin and snaps used for translation, well, until now; I just mentioned it. Though the previously explored possibilities may still weigh into it. My instincts immediately go to workflow, human sanity, a hybrid between tool and target, and time, before anything deeply technical and mathematical.
  4. Build with all the tools. Preferably with the LOD/object detail setting at 2, but build with all the tools as often as you can muster. They all have their purposes and strengths. Unfortunately there was maybe some politics in their implementation along the way, but push on with them all, and do it with little or no apology. Even the Unreal engine, for a long time if not still, had/has its own dynamic building blocks; even though you can build everything there with static mesh, it isn't recommended. Same thing with Second Life. The prims are a mutable construct created procedurally/programmatically (as far as I have felt) from a defined set of parameters. They are not always optimised. Sculpties are an image-data-packed 3D lattice array: basically a subdivided plane, which is synonymous with the terrain, but as a modifiable object. It was pretty much an obvious genius no-brainer back in the day, given that Second Life's typical upload and asset database use was textures (images). People used it for as much as they could. And for whatever reasons, its implementation never grew to solve actual typical static mesh uses, such as physics, multiple faces, and rigging. It had to have a specific shader to create the representing image from other software packages. It is probably the single thing I know of that Second Life really pioneered deeply. I've defended sculpties in debate with some of my friends before, within limitations. Mesh is a love-hate thing. It is the opportunity for a custom prim, but it is also static. The implementation isn't really ideal, and the costing is weird in some places. The whole prim-equivalence debate was probably the worst thing for it. I've had the importer dialog suggest to me that a UV mapping point was an extra vertex, which just undermines and penalises attempts to "optimise" mesh anyway. Still, there is a lot of freedom that comes with "mesh".
It will just take you a while to get to grips with "acceptable costs" and getting around the tax issues. Project Bento being introduced finally opened up new opportunities for Second Life to grow a bit, at least on the avatar front. It will simply take time to find out what each item is best at. My suggestion is to play with them all. Don't worry too much about optimisations and lag; just be level-headed about it. There are times and places where optimising for a system that is far from optimised becomes a bit silly. Learn, practice, and have fun. Find what works, and what does not.
  5. I alternate between Sublime-Text, Atom, and VS Code.
  6. https://code.visualstudio.com/ VS Code is a smaller, cut-down version on the editor side of things, i.e. minus the gigabytes of SDKs. In other words, MS's Sublime. I don't think anyone has made an LSL debugger for it yet, so it just has a couple of syntax highlighting extension projects so far, which may be slightly behind the latest keywords/constants. It has an integrated terminal window, so you can run your external lslint or whatever. So Sublime Text 3 might still be some people's preference to date. The software itself has some decent in-editor tooltip documentation features that need a bit of attention and love. The highlight package I went with doesn't really use them yet.
  7. An artist. Specifically, one trained in creating art assets for interactive entertainment. Various games' credits lists can give insight into some of the typical categories and headings different specialisations or generalisations fall under.
  8. wherorangi wrote: "i would probably look at using llGetAttachedList on attach then read some unique property of any attachments with the same name http://wiki.secondlife.com/wiki/LlGetAttachedList" I would try wherorangi's approach first, personally, even something as basic as the same name. The first attachment will never find a duplicate; the second one would. The only time I can figure we'd have more than two is an Add race, and they only need to find one other to detach.
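A minimal LSL sketch of that approach, assuming duplicates share the exact object name (the detach-on-match logic here is my own illustration, not wherorangi's code):

```lsl
// On attach, scan the wearer's other attachments for one with the same
// object name; if one is found, ask for permission and detach this copy.
// llGetAttachedList returns visible (non-HUD) attachments only.
default
{
    attach(key id)
    {
        if (id == NULL_KEY) return; // we were detached; nothing to do

        string myName = llGetObjectName();
        list others = llGetAttachedList(id);
        integer i;
        for (i = 0; i < llGetListLength(others); ++i)
        {
            key k = llList2Key(others, i);
            if (k != llGetKey() // skip this object itself
                && llList2String(llGetObjectDetails(k, [OBJECT_NAME]), 0) == myName)
            {
                // One match is enough; detach this copy.
                llRequestPermissions(id, PERMISSION_ATTACH);
                return;
            }
        }
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_ATTACH) llDetachFromAvatar();
    }
}
```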
  9. High and low poly are not relative terms. They have never been relative. They are descriptive of the characteristics and physical properties of two different kinds of mesh, and they come about from the results of tools and workflows. A previous poster already explained that LowPoly meshes are made up of vertices, edges and faces that contribute in some tangible way to the actual silhouette (shape). It does not matter how many that is, because the object could be any size and made of any number of components. As long as it follows that basic principle, it may make use of texturing techniques derived from a high-resolution model; it is still LowPoly. LowPoly objects are game-optimised: they take up part of the overall budget, and you can have a certain number of them until things slow down. HighPoly objects have no constraints, are rendering-specific, and are dense for detail above function. Just because a Quake 1 model is chunkier and uses less than a few hundred triangles while a modern-day Unreal character uses tens of thousands doesn't make either different from the other; they are both LowPoly. HighPoly objects are in excess, and able to be represented close to true form by a more efficient asset. In terms of Second Life, anything that is uniformly dense, not tessellated toward key areas to control low shader shadows, and is just a tightly packed grid, is HighPoly. If it's hard to see through the wireframe, it's HighPoly (or part of it is).
  10. Sassy Romano wrote: "Either way, your CPU question is now irrelevant." A CPU question is never irrelevant. Games not built for multiple cores have a habit of stalling or crashing unless forced through settings to run on a single core. So asking whether the viewer would benefit beyond a certain number of cores is always valid, from both a sanity and a financial point of view.
  11. I haven't tried many CPUs, so I can't answer that question. But as for what people have been saying about graphics cards: the two major loads you'll need to consider are processing shadows and projection lighting. They tend to be the heavy hitters. But that is irrelevant if you don't run those settings and still end up with a slow PC. I would be more concerned about other bottlenecks, like how often the HDD is being read and written to. If processing even the most basic things comes to a crawl, it all gets kind of moot.
  12. Sometimes, Rolig, when I've copied an object, I have forgotten which one I still have selected: the original or the copy. Do you suppose these other scripters you remember forgot that they had the original still selected, and that the original is what moves when you shift-copy? The copy might have still reset.
  13. Sections 4.1 and 4.2 cover it. It's pretty heavy reading and takes a while to wrap your head around what is said versus what someone wants to hear. 4.1 makes it sound like you are not to share your account with any third party (assigning an account); disclosure of login credentials is effectively an assignment. 4.2 makes it sound like technically authorized access is permissible, but you are solely responsible for everything, and they may terminate the account should there be a reason. The level of trust needed for this sort of thing is pretty much unconditional and of epic proportions. That doesn't grow easily.
  14. It was just a yes-or-no question, and the answer was yes. Though if people want to be helpful, they'd just have to start explaining an introduction to making a bumper counter game, with score theft, and why that would or would not work. (He/she only asked how to start a [any] game.)
  15. I am not sure of the why. I am sure that I and my friends have had to deal with the same thing now and then, on many worn products. It could be memory, or a streaming issue, or any other related technical whatever. What we have done in the past is use some viewers' texture refresh option, some viewers' ability to move the camera far away to unload, or simply reattaching the affected item. It has usually not been permanent; just that somehow the texture could not be displayed in full. If you are using the SL viewer, I suggest seeing if taking the item off and putting it on again helps any. You probably already have, but as the camera can't really be moved far there, that kind of rules that option out. The main thing is to try to force the viewer to reload and refresh the texture over others it is preferring. I think.
  16. It may depend on the UV map layout of the mesh in particular, as in where the face boundaries lie. But if you turn off the stretch-textures option in the build panel at the top, that should do it for you as the object stretches. At least off the top of my head; I could be remembering wrong, but if the texture is right, it should displace the seams when it repeats. *crosses fingers*
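If a scripted route is ever more convenient than the build panel, the equivalent repeat settings can be applied with llScaleTexture; the 4x2 repeat values here are just example numbers:

```lsl
// Set the texture to repeat 4 times horizontally and 2 times vertically
// on every face, so it tiles instead of stretching with the prim.
default
{
    state_entry()
    {
        llScaleTexture(4.0, 2.0, ALL_SIDES);
    }
}
```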