
NaomiLocket

Resident
  • Posts

    91
  • Joined

  • Last visited

Everything posted by NaomiLocket

  1. The problem is that code comes first. Everyone knows that. Performance is entirely at the mercy of code; it always has been. Code dictates how data is handled. As a graphics professional you also know that, and you know there is more to art and context than a blanket "this is bad and not optimised", because you know what a budget is, and basic maths. Hiding the problem with overly reduced content is not optimising, and it does not address any underlying problem. It's not in line with necessity being the mother of invention either: when that played out, the Lab found single points of failure, or that people had memory leaks and crashed visiting stores, not that the mesh was bad. There are probably areas where we'd agree, but I won't agree with blowing a problem out of proportion. That doesn't mean I'm unaware there are meshes so dense you can't see through the wireframe, but it also doesn't mean they're crashing my tiny 2 GB card, because they're not using excessive memory. Yes, people have made content in ways I would not. Some 'less poorly' made content lagged my system more than high-density mesh beyond what I would make, because that is the raw nature of the content and the code that acts on it. Half the people who rail against someone over a high-detail LOD don't properly understand what the high-detail LOD, and the rest of the LOD system, actually do; they just abuse someone to the point they delete threads and go back to the professionals instead, where they have accounts. You need to be more flexible about optimisation and remember that more than one road leads to Rome.
  2. That seems a gross overstretch of the imagination. There is nothing wrong or incorrect with expecting a developer to actually do software engineering. That is what professionals do, as demonstrated by id Software: they accepted the criticism that Doom (2016), for all its technological marvel and its success in preventing many memory issues via a texture buffer, didn't give enough detail clarity in its shipped state, so they changed it for Doom Eternal like grown-ups. In contrast, when developers don't quite get it, you have Fallout 76's idea of code reuse and OOP propagating vulnerabilities all over, costing their customers money, and "it just works" as a meme.
  3. Seeing as you can upload something upwards of 60 LI, and skyrocketing, without following their rules for a given reason, I'd say they did exactly as you said. My point is, content creators who make the same or similar goods in a kinder fashion also need to be merchants doing a merchant's job. You can't complain if someone who owns venue x buys item y, which looks great to them at a price they don't mind, and puts it in venues x and z of their choice. Ignoring that misses the point of SL's design and function along the entire chain.
  4. It doesn't need to work. There is a point where something is not the system's problem, and the system doesn't need to make it its concern. Skyrim modders not understanding that Skyrim was made before 4K was a thing has nothing to do with when the next remaster hits. Second Life, being built on freeform content, doesn't have the same argument any other platform does for sitting on its engine too long either, but that is another topic. The fact of the matter is you do not need to, and should not, coerce an ideal of creation at this level. It is simply not the way. Using a more suitable decimation solves the originally described point about collapsing objects that don't need to be collapsed fully; basically, it shifts the bottom end of the threshold. When you strip out all the nonsense that was tacked onto LOD, and the call to ignore balanced settings, that is the matter the OP leaves you with, and going no further is required.
  5. @ChinRey Pretty much, and one up to the actual task wouldn't be a walk in the park, for sure.
  6. Once I noticed a seller had mis-permed a script in a gift. In that gift happened to be that typical generation method coupled with the default suggested addition. I forget off the top of my head whether that addition was 100 or 1000, but it was whatever value was suggested on the forum at the time it was a thing. It is probably worth a shot in the dark sometimes.
  7. The only thing you do (if you do anything) is implement a better decimator that ensures a minimum hull integrity for auto-generation. No social engineering; no degrading loss of ethics and moral standing by abusing responsibility through marketing twists, deceits, sarcasm, or trying to be 'clever'. You just ensure a minimum hull integrity for auto decimation, and leave manual hulls with full control. In doing so, you would also be expected to ensure the lower limit of cost has not risen. Yes, that means some who will do it anyway will do it externally; that solves the importer. It leaves the newbies, intermediates, and well practised alone, and leaves the effectiveness of good content to its creators' own marketing responsibility and the strength of its merit. It is not the responsibility of the import dialog to push any one piece of content over another. It has one job: to validate and accept a valid mesh. That is its only job. I would fundamentally agree with Molly and Beq on this topic, and also with ChinRey's last post.
  8. It is a nice way of putting it, but unless it is the camera specifically at the end of it all, I would still expect issues when camming far from the avatar. The biggest driving point of it is that, no matter where you cam, there is no longer a perceivable error in where the surfaces of prims, or the faces that make them, sit relative to each other.
  9. No we don't, and that is why we don't. The simulators by themselves are a static container with an origin corner that never moves. That is old school and, by definition, no shift of the zeropoint. For a zeropoint shift, the origin must move, figuratively or philosophically speaking. We have global coordinates because, when we have contiguous regions, we have to be able to know how to draw child regions in reference to the simulator we occupy. Then we have a camera with a position that moves, as an observer, not as an origin. To zeropoint shift properly, the camera would effectively never move; the rest of the world would, in a sense. That eliminates the rendering bugs and the loss of depth-buffer precision at high altitudes, because all objects have a consistent, vanishing margin of error the closer they get to the camera, approaching no error at all. The camera is no longer an observer; it becomes the origin, clientside. The day we see that happen is the day things are no longer wibbly thousands of meters above ground level. ETA: I believe Stationeers implemented, or planned to implement, zeropoint shifting for the very specific problem of travel distance versus display integrity. I don't believe they came up with it originally, but learned of it in the course of things; just a point of reference. Any simulation or game with vast distances has to deal with "what to do as you approach or exceed the initial play area". The short answer to that problem is to never exceed it and never have a boundary.
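A minimal Python sketch of why a zeropoint (floating-origin) shift helps. This is not SL or viewer code; it only round-trips doubles through IEEE-754 single precision to stand in for the 32-bit floats a renderer typically works in, and the coordinates are hypothetical. Far from the origin, single precision can no longer tell two nearby surfaces apart; the same gap expressed relative to a nearby camera survives.

```python
import struct

def to_f32(x: float) -> float:
    # Round-trip a Python double through 32-bit float, the precision
    # a renderer typically works in.
    return struct.unpack('f', struct.pack('f', x))[0]

# Two surfaces 1 mm apart, at a large (hypothetical) global coordinate.
a = to_f32(300000.000)
b = to_f32(300000.001)
print(b - a)  # 0.0 -- the 1 mm gap is quantised away; hence the wibble

# The same surfaces expressed relative to a nearby camera origin,
# with the subtraction done in double precision first:
cam = 300000.0
a_rel = to_f32(300000.000 - cam)
b_rel = to_f32(300000.001 - cam)
print(b_rel - a_rel)  # ~0.001 -- float precision is densest near zero
```

The design point is the same as in the post: keep the numbers that reach the renderer small by making the camera the effective origin, and the error margin shrinks toward zero exactly where the viewer is looking.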
  10. As a side tangent, it's a shame we don't have zeropoint shifting like some other places (for rendering).
  11. After reading the very introductory lines on the epsilon topic, I doubt it compares a difference and passes, seeing as you can feed a valid six-place float into a vector, get back a five-place value (that used to be six) from a prim's colour, and fail to return true for a difference of 0.000001. @Wulfie Reanimator It's the "When is Orange not Orange" problem, and the failure to have a single band of colour that acts as its own case. Unless I completely misunderstand the point of it.
  12. You need them to be extremely precise, exact in nature, and able to be debugged true to their value. A conditional test needs to succeed as expected, because all logic and action following it depends on that outcome.
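To illustrate the comparison problem from the two posts above, here is a Python sketch, not actual LSL: the `to_f32` round-trip through single precision stands in for LSL's 32-bit floats, and the six-place value is just an example constant. An exact equality against a hand-written constant can fail once the value has passed through single precision, which is why an epsilon tolerance is the safer conditional test.

```python
import struct

def to_f32(x: float) -> float:
    # Stand-in for LSL's 32-bit floats: round-trip through single precision.
    return struct.unpack('f', struct.pack('f', x))[0]

stored = to_f32(0.123457)   # e.g. a colour channel handed back by the system
expected = 0.123457         # the hand-written constant in the script

print(stored == expected)   # False: single precision perturbed the value

EPSILON = 1e-5              # tolerance comfortably above one float32 ulp here
print(abs(stored - expected) < EPSILON)  # True: "close enough" counts as equal
```

The trade-off is exactly the tension in post 12: an epsilon test succeeds predictably, but the values are no longer "debugged true to their value"; you have decided how wide the band of sameness is.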
  13. That just leaves me with the hard task of convincing them.
  14. What they did specifically was take the vector from getlinkprimitiveparams into a list, take the value from the list into a vector (list2vector), capture the listfindlist result (passing the vector wrapped as a single-element list, [ variable ], as a direct argument) into an integer, and do a ~ on it. They expected it to match their hand-written colour variable (which was 6 places). The [ variable ] was spat out as 5 places. I am guessing now that they tried to set it with 6 and it was perhaps truncated.
  15. No, that is fine. They ran the ~ on an integer result captured from the typical listfindlist.
  16. Oh derp. I'm going to go peek at it again. (they just had a scripter group in world go over it and stuff)
  17. They were still unable to get their script to pass under a typical ~ for the case, i.e. 0.123457 != 0.12345.
  18. If you have a friend who happens to be premium, knows how you script, and is willing to let you be an experience contributor then you might avoid the premium requirement.
  19. And type casts. DumpList2String suggests we can use (string) to our heart's content and get an "identical" result, while the typecast page tells us that precision is now at 5 places. A friend passed me a situation, and I did a simple dump of a quick mock of vectors showing that LSL is still doing 6 places of precision sometimes, and my friend's if(~value) tests are failing when they should not (if precision were consistent). Can someone amend the wiki page to reflect the situation, instead of assuming people will go checking the typecast page? (It still happens even when people are not doing typecasts themselves.)

     Object: [GetPrimitiveParams] // Raw llDumpList2String
     <1.000000, 1.000000, 1.000000> 1.000000
     Object: <avector> // Typecast
     <1.00000, 0.53654, 0.00000>

     My friend related their recent experience: the precision worked as expected under LSO, but not under Mono, when pulling colour vectors from parameters and comparing to a hardcoded vector. For now they expect to limit precision themselves and round off, but it is an annoying surprise. I'd have put this on a wiki discussion page, but you know why that isn't happening.
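As a rough illustration of the mismatch, here is plain Python mimicking the two formatting widths described above; the real behaviour is LSL's, not this, and whether LSL rounds or truncates the dropped digit is exactly the kind of detail the wiki should pin down. Formatting the same value to six and to five places yields strings that no longer agree, and the five-place form cannot reproduce the original for an exact comparison.

```python
value = 0.123457

dump_style = f"{value:.6f}"   # six places, like the raw llDumpList2String line above
cast_style = f"{value:.5f}"   # five places, like the typecast line above

print(dump_style)   # 0.123457
print(cast_style)   # 0.12346 -- the sixth digit is gone (rounded, in Python's case)

# Round-tripping the five-place string cannot recover the six-place value,
# so an exact comparison against the original constant fails:
print(float(cast_style) == value)   # False
```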
  20. The most important thing, in my opinion, is actually not LSL at all. That may sound counter-productive at first, but give it a chance. The most important thing I would suggest is learning anything and everything about objects and linksets. All scripts run live in them as containers. So everything to do with a linkset (its lifecycle, its limits, capability, utility, when it can be detected, when it is lost, and contiguous-region behaviour) is fundamentally important, and points to different functions in LSL to learn along the way. Picking up the structure of LSL follows from the desire to make things do things. Getting too concerned about doing it well can be stifling and stop people trying, which is seldom the point of it. Particles can be fun. If you enjoy them, keep learning them and their limits. Find the good ideas, and learn from the not-so-great ones.
  21. Forgetting about explicit code for a bit, and even thinking of touch events as distinct events: it feels as though touch_start would have a chance to not happen in the presence of event chance, but touch_end would require a touch_start or touch to pre-exist. Just thinking out loud tonight.
  22. That would be more ideal, seeing as you wouldn't have to position yourself or your camera the way you expect, or learn an image-based programming language for screen-based macros.
  23. It basically boils down to the algorithm being bunk the whole time, and animats' observation is basically correct. It isn't possible for an LOD to cover an m² area that is not displayed, or one it isn't rendered for. The prim equivalence doesn't exist either, and the data transferred is not increased in size, so the metric shouldn't be repurposed. In the same way that a function should ideally be responsible for one thing, so too should the words and metrics used to describe an asset.
  24. Arbitrary can help sure, but a guarantee is better I think.
  25. You would need to do something with Windows API calls and VBA, or an equivalent.