I've seen a few threads talking about Unreal Engine, UE5, and game-engine technology, with interest in seeing this kind of innovation in Second Life Viewers. I thought I would spend a moment to take a shallow dive into these technologies and put them into the context of Second Life.

1. Lumen is an incredible feature that could add significant lighting realism to Second Life, if there were a UE5-based SL Viewer, because it could add a real-time global illumination approximation to existing SL geometry. It does require a fairly beefy modern GPU, and it would also need some lighting adjustments to rebalance light around secondary-bounce effects. However, the results can be quite impressive, even with existing legacy geometry. For a taste of how legacy low-poly geometry can look with Lumen, check out some of the many "World of Warcraft remastered in UE5" videos. Lumen is a runtime feature, and it can handle runtime-generated (and downloaded) geometry, such as occurs in SL.

How is this different from current SL lighting? Current Second Life viewers render only primary light sources. The reason SL looks "flat" compared to games is that most games use an expensive "lighting bake": a slow, edit-time raytrace of all secondary light bounces into a "lightmap" for the entire scene. However, this not only freezes the time of day, it also freezes the position of every object in the scene. Second Life does not do this for many reasons, but mostly because objects can be moved, and each time one moved, the lightmaps would look wrong and broken until some computer re-ran a potentially long lighting bake to produce new ones (see the sketch below for what a bake conceptually involves). Lumen, by contrast, computes an approximate raytraced lighting solution in real time, even for dynamically generated geometry.

Is it possible to implement Lumen-like tech in existing SL Viewers? Of course, but it would take years of funding and expert graphics developers. Lumen merges previous research into voxel global illumination, screen-space global illumination, and light probes into a remarkably well-crafted solution. While each of those technologies is relatively simple (and open source), combining them into Lumen took years of work by expert graphics engineers at Epic, whose creative solutions to the edge cases produced a real-time GI solution that not only has an acceptable level of artifacts but is also incredibly efficient. To get a sense of its complexity, you can watch the 2021 SIGGRAPH presentation of Lumen by one of its developers. The next milestone we should be watching for is *any* other engine shipping technology comparable to Lumen. It is possible this might not happen before we have hardware capable of full real-time path tracing (probably in 3-6 years).
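To make the cost of a lighting bake concrete, here is a minimal, compilable C++ sketch of what a baker conceptually does for each lightmap texel. Every name in it is illustrative only (it is not any engine's actual API), and the ray tracer itself is stubbed out:

```cpp
// Conceptual sketch of an offline lightmap bake. Illustrative only.
#include <cmath>
#include <cstdlib>

struct Vec3 { float x = 0, y = 0, z = 0; };

static Vec3 add(Vec3 a, Vec3 b)    { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 scale(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Pick a random direction in the hemisphere above the surface.
static Vec3 randomHemisphereDir(Vec3 normal) {
    auto r = [] { return rand() / (float)RAND_MAX * 2.0f - 1.0f; };
    Vec3 d{r(), r(), r()};
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    d = scale(d, 1.0f / len);
    if (d.x * normal.x + d.y * normal.y + d.z * normal.z < 0.0f)
        d = scale(d, -1.0f); // flip below-surface samples upward
    return d;
}

// Stand-in for the expensive part: a recursive ray trace that follows
// light as it bounces around the scene before reaching this point.
static Vec3 traceIncomingRadiance(Vec3 /*pos*/, Vec3 /*dir*/) {
    return {0.5f, 0.5f, 0.5f}; // a real baker traces the whole scene here
}

// One lightmap texel = the average of many bounced-light samples.
Vec3 bakeTexel(Vec3 worldPos, Vec3 normal, int samples) {
    Vec3 sum;
    for (int i = 0; i < samples; ++i)
        sum = add(sum, traceIncomingRadiance(worldPos, randomHemisphereDir(normal)));
    return scale(sum, 1.0f / samples);
}
```

A full bake repeats this for millions of texels, with hundreds of rays each, and the stored result bakes in both the light positions and every object's position. Move one object, and every texel its shadow or bounce light touched is stale, which is exactly the freeze Second Life cannot afford.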
2. Nanite geometry is an alternative to traditional polygon meshes, providing runtime level-of-detail selection so efficient and fine-grained that LOD becomes effectively perfect and automatic. This technology is not easy to make use of in a Second Life viewer, because Nanite geometry cannot be generated at runtime inside a UE5 game client; converting a mesh to Nanite is a slow step that only happens in the UE5 Editor.

What is Nanite? Nanite allows both ultra-high-detail objects to be used with reasonable performance and incredible scene complexity to be rendered with reasonable performance. This would be an incredible addition to Second Life, but it's not one that can come "for free" in a Second Life Viewer. To use Nanite in UE5, every SL object, after it is uploaded, would need to be processed into the Nanite format **inside the Unreal Engine Editor**. While these objects could also exist in a legacy mesh form for other clients, this kind of reprocessing of meshes into an alternate format (and serving that alternate format to a Nanite-capable viewer) is a much more extensive undertaking than writing a Second Life Viewer client.

3. Don't be fooled by the path-tracing examples. We've all seen shocking UE5 scenes, sometimes with absolutely massive numbers of trees (millions!), like the "UE5 Sierra Nevada" scene. These images are *not* rendered in real time. They are produced with the UE5 Editor's progressive path tracer, a technology similar to Blender's Cycles progressive path tracer. If you watch these users working in the UE5 Editor, you can see "path tracing" enabled, and when they move the camera the image pixelates before filling in over time. Equally important to note: scenes with an absolutely huge number of instanced objects, such as millions of trees, can *not* be rendered in real time in a game. In other words, just because you see people building these absolutely massive scenes in the Unreal Editor does *not* mean they would work in real time in an Unreal Engine based game.

4. Large view distances. Modern game technology, including UE4 and UE5, can support fast rendering of vastly larger view distances than you see in Second Life. However, this is as much a product of the way those terrains are built as of the game-engine technology. Many Second Life locations are built by users in performance-inefficient ways: for example, using tons of uniquely different "ground" objects instead of the terrain height map, or placing a massive number of unique objects to give the scene a certain visual look. Scenes built like this could cause any game engine (without Nanite) to struggle and render at low FPS. When working on a game, designers make choices for performance as much as for aesthetics, and SL does not require, enforce, or even encourage a workflow with these kinds of limitations. The SL solution is to always keep a relatively short view distance, which lets users add scene complexity right in front of them.

5. Large object count. In game rendering, one thing that slows rendering down is the number of uniquely drawn objects (the draw-call count). Modern graphics APIs, like Vulkan, are lowering this cost, but there is still a cost. In a game engine like Unreal, the many different pieces of an object are often merged into a single mesh in the Unreal Editor, and many objects placed in a scene are merged into the scene's background mesh. This is one of the reasons so many games have bookshelves and props that are immovable and indestructible: they are already merged into the background mesh. It is possible for any Second Life Viewer to do this kind of "mostly static" mesh merging at runtime to optimize drawing (and it's possible some do, I have no idea), but it isn't something game engines do out of the box, so it's very possible that a "Second Life Viewer implemented in UE5" without runtime mesh merging would have performance issues rendering scenes with very large numbers of unique placed objects (a sketch of the merging idea follows below). This is related to #4 above, and is another reason I suspect view distances are kept relatively short in SL.
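Here is a minimal C++ sketch of that "mostly static" mesh-merging idea: many small meshes sharing a material are concatenated into one vertex/index buffer, so the whole group costs one draw call instead of hundreds. The types and names are hypothetical, not any viewer's actual code:

```cpp
// Illustrative runtime static-mesh merging to reduce draw calls.
#include <cstdint>
#include <vector>

struct Vertex { float px, py, pz; float u, v; };

struct Mesh {
    std::vector<Vertex>   vertices; // assumed pre-transformed to world space,
    std::vector<uint32_t> indices;  // so no per-object transform is needed
    int materialId = 0;
};

// Concatenate every mesh that uses `materialId` into a single mesh.
Mesh mergeStaticMeshes(const std::vector<Mesh>& scene, int materialId) {
    Mesh merged;
    merged.materialId = materialId;
    for (const Mesh& m : scene) {
        if (m.materialId != materialId) continue; // one material per batch
        uint32_t base = (uint32_t)merged.vertices.size();
        merged.vertices.insert(merged.vertices.end(),
                               m.vertices.begin(), m.vertices.end());
        for (uint32_t i : m.indices)
            merged.indices.push_back(base + i);   // re-base indices
    }
    return merged; // one draw call instead of one per source object
}
```

The trade-off is the same one games accept: once merged, objects can no longer be moved, culled, or deleted individually without rebuilding the buffer, which is exactly why merged game props are immovable, and why doing this in SL, where anything can move at any time, is nontrivial.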
I'm sorry for how long that turned out, and how technical some of it is. I hope some of this explanation helps set expectations for these technologies showing up in SL Viewers, and also inspires some possibilities for things we might see in the future.