
Aeonix Aeon

Resident

Posts posted by Aeonix Aeon

  1. Just wanted to add that when it comes to virtual worlds in a web browser, you end up being a jack of all trades and master of none by default. That's usually the tradeoff, and ultimately why I've never been excited about that particular flavor of virtual world implementation.

    Thanks for the mention, Indigo...

  2. As an aside, the concept I was referring to in relation to procedural textures is this: if we generate the textures from the same underlying data, extrapolating the detail with a mathematical routine, then a lower-fidelity result takes less time to generate than a high-fidelity one.

    In a static world, we choose a preset level of fidelity and force the clients to generate to that level (or to preset levels), which I readily agree would be costly for lower-end systems to maintain. The idea I was getting at, however, is that procedural textures can be generated at lower as well as higher fidelity depending on the computation time given to the routines up front. That lets us create an automatic adjustment routine tied to something like FPS: if the GPU load is too high and the frame rate drops below an acceptable threshold, the level of detail for procedural texture generation is lowered to compensate, balancing FPS against fidelity.
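The feedback loop described above can be sketched roughly as follows. This is a minimal illustration, not any real engine's API: the octave-based noise generator, the function names (`generate_texture`, `adjust_octaves`), and the specific thresholds are all hypothetical, standing in for whatever detail knob a real procedural pipeline exposes.

```python
# Hypothetical sketch: tie procedural-texture detail to frame rate.
# "Octaves" of noise stand in for any fidelity parameter where
# more detail costs more computation.
import math
import random

random.seed(42)
NOISE_TABLE = [random.uniform(-1.0, 1.0) for _ in range(256)]

def value_noise(x, table):
    """1-D value noise with smoothstep interpolation."""
    x0 = int(math.floor(x)) & 255
    x1 = (x0 + 1) & 255
    t = x - math.floor(x)
    t = t * t * (3.0 - 2.0 * t)  # smoothstep
    return table[x0] * (1.0 - t) + table[x1] * t

def generate_texture(width, octaves):
    """More octaves = more detail = more computation per texel."""
    texels = []
    for i in range(width):
        value, amp, freq = 0.0, 1.0, 1.0 / 16.0
        for _ in range(octaves):
            value += amp * value_noise(i * freq, NOISE_TABLE)
            amp *= 0.5   # each octave contributes less amplitude...
            freq *= 2.0  # ...at double the frequency
        texels.append(value)
    return texels

def adjust_octaves(octaves, fps, target_fps=30, lo=1, hi=8):
    """Feedback rule: drop detail when FPS sags, restore it when
    there is clear headroom (20% above target, to avoid oscillation)."""
    if fps < target_fps and octaves > lo:
        return octaves - 1
    if fps > target_fps * 1.2 and octaves < hi:
        return octaves + 1
    return octaves
```

Each frame, the client would measure its FPS, call `adjust_octaves`, and regenerate (or progressively refine) textures at the new detail level; the same underlying noise data serves every fidelity tier.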

    In this manner, we create a procedurally generated world (at least as far as textures are concerned) wherein the same data used for high-end, movie-quality rendering on gaming rigs can serve lower-end systems with less streaming computation. While the fidelity of the output drops in the process, that is an acceptable trade-off, in much the same way we wouldn't expect all systems to comfortably handle Ambient Occlusion and Shadows maxed out in quality.

    As far as I am aware, systems such as Allegorithmic Substance stream the procedural computation through the GPU in real time based on similar criteria, and also lower the fidelity of the output based on distance from the camera (because obviously we don't need high-definition output for items in the distance). The end result is real-time procedural texture streaming that can scale up and down in fidelity based on the client computer running it.
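The camera-distance scaling mentioned above amounts to a simple mapping from distance to a detail level. The sketch below is an illustrative guess at such a mapping, not Substance's actual scheme; the `near`/`far` bounds and the linear falloff are assumptions.

```python
# Hypothetical distance-to-detail mapping: objects near the camera get
# full procedural fidelity, distant objects get the cheapest version.
def octaves_for_distance(distance, near=10.0, far=200.0, max_oct=8, min_oct=1):
    """Linearly interpolate detail (in noise octaves) between near and far."""
    if distance <= near:
        return max_oct
    if distance >= far:
        return min_oct
    t = (distance - near) / (far - near)  # 0.0 at near, 1.0 at far
    return max(min_oct, round(max_oct - t * (max_oct - min_oct)))
```

Combined with an FPS-driven global cap, this gives the two axes of scaling described in the post: per-object detail falls off with distance, while the overall budget rises and falls with the client's rendering headroom.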


    Thank you for the mention, Indigo :)
