New Firestorm Version September 2022


arabellajones



On 9/14/2022 at 10:31 PM, Wulfie Reanimator said:

This, specifically, was what I didn't understand.

If I'm in a scene that is practically unchanging (everything has rezzed/decoded and my camera is static), the framerate limiter can easily cut off 20 FPS if I adjust it juuust right.

For example, when I average ~51 FPS without the limiter and set it to 48 FPS, Firestorm will (as explained) often render frames at ~30 FPS instead.


After doing some more digging (thanks @Beq Janus), it really does seem like normal behavior and not just wonky implementation. I can also see why Henri suggested triple buffering after reading this wonderful wall of text (which I would put up as recommended reading on the topic).
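To put numbers on that ~51 → 30 FPS drop: a minimal sketch of the arithmetic, assuming a 60 Hz display where a presented frame always waits for the next whole VSync slot (illustrative C++, not viewer code):

#include <cstdio>

// Illustrative only: why a limiter target just below the measured rate can
// snap the presented rate down to 30 FPS on a 60 Hz, VSync-aligned display.
int main() {
    const double refresh_ms = 1000.0 / 60.0;      // 16.7 ms per refresh
    const double targets[] = { 60.0, 51.0, 48.0, 30.0, 20.0 };
    for (double fps : targets) {
        double wanted_ms = 1000.0 / fps;          // requested frame time
        // Round up to a whole number of VSync slots.
        int slots = static_cast<int>(wanted_ms / refresh_ms);
        if (slots * refresh_ms < wanted_ms) ++slots;
        std::printf("limiter %4.0f FPS -> %d slot(s) -> %4.1f FPS shown\n",
                    fps, slots, 1000.0 / (slots * refresh_ms));
    }
    return 0;
}

A 48 FPS target (20.8 ms) does not fit in one 16.7 ms slot, so every frame takes two slots: exactly the ~30 FPS described above. This only applies when presentation is VSync-aligned, which matches the behavior reported here.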

I am glad I found this thread. I have been tearing my hair out trying to find what's wrong with the latest updates. Something suddenly seems seriously broken.

Testing: I am standing on a platform in the sky. Just the platform visible and my prefab system avatar. I used to have easily 150+ fps at this place. But now, with the latest Firestorm viewer, only 21, and with the latest SL viewer only 40? And it doesn't matter if you put the slider on Low or Ultra. Same statistics, with a frame time of about 51 ms and a huge jitter visible, like in Wulfie's picture above. The jitter is worse when you turn your view. I used to have a straight vertical line of red dots in my statistics display, at like 150-200 fps. Now the frame times are all over the place, even on this stationary platform.

This is with an Intel i7-10700K and an RTX 2070 Super. So I replaced the GPU with an RTX 3080 Ti. That was a worse experience! On the mainland, framerates dropped to 8 fps after a while and seemed to be locked to that value. If you drive a vehicle, that's totally unplayable. So I went back to the 2070 Super, because the experience is still bad, but better. Go figure..

The things I tried to improve and diagnose:

  •     VSync off in the FS viewer and in the NVIDIA control panel
  •     Maximum performance set in the NVIDIA control panel
  •     Framerate limiter off in FS
  •     Graphics settings to Low (Low or Ultra doesn't matter much, especially not on the 3080 Ti)
  •     Used only one main display, disconnected the secondary
  •     Found devices and drivers with high latencies and interrupt usage, and removed all unnecessary software (Razer.. ughh)
  •     Re-installed drivers, tried different versions, older and newer. No change.
  •     Used DDU (Display Driver Uninstaller) to wipe the graphics drivers, re-installed the newest
  •     Re-installed Firestorm and the SL viewer from scratch, wiping all related caches, registry entries and directories
  •     Ran several DirectX and OpenGL benchmarks to check my system - all give the expected performance

Standing on the platform, I notice that my CPU usage is rather high, around 20% of the 16-thread i7 CPU. That would mean about 3 threads are running at 100%. Is that normal, the SL & FS viewers using 3 full cores while standing on a platform?

So, also noting that graphics settings don't matter much, something is heavily CPU bound, and that holds back the GPU. What could that be?

Is there a location where I can download older Second Life clients that can still log in?

Thanks for any suggestion, because I am stymied right now and can't really enjoy SL anymore.

-----------
CPU: Intel(R) Core(TM) i7-10700K CPU @ 3.80GHz (3792 MHz)
Memory: 16303 MB
Concurrency: 16
OS Version: Microsoft Windows 10 64-bit (Build 19044.2006)
Graphics Card: NVIDIA GeForce RTX 2070 SUPER/PCIe/SSE2
Graphics Card Memory: 8192 MB
Windows Graphics Driver Version: 31.0.15.1740
OpenGL Version: 4.6.0 NVIDIA 517.40

Second Life Release 6.6.4.575022 (64bit)

 


I tried to edit the above but can't find the edit button.

Surprise! The Cool VL Viewer runs fine! >100 fps, pretty smooth. Similar to what the SL viewer and Firestorm used to be.

Cool VL Viewer v1.30.0.21, 64 bits, Oct 15 2022 10:53:18

It has this "Core GL profile" setting in the graphics panel, but that doesn't do much; I switched it off and still got >100 fps. Then I enabled the "high DPI setting", which warns that it will halve your fps. It doesn't! Still between 90 and 150 on the platform (with the RTX 2070 Super). The jitter in frame times is still there, but at high FPS it is less noticeable. All this at a 2560 x 1361 display size.

So what's going on here? If the Cool VL Viewer runs much faster, it is not the OpenGL driver. Possibly some recent optimization in the SL viewer that went into the FS viewer may not always work as intended. But what..

Edit: The SL viewer now also runs close to 60 fps. That is my monitor refresh rate, and there is no way I can get rid of the vsync; is that built in? Why the change? Maybe because the Cool VL Viewer installed some settings in the same appdata folder?

Firestorm still sucks and is limited to around 27 fps, on Low or on Ultra settings, all the same. So, guess it's time to go to the FS forums and use Cool VL instead.

Edited by Anatova Akina
Update new info.

4 hours ago, Anatova Akina said:

Surprise! The Cool VL Viewer runs fine! >100 fps, pretty smooth. Similar to what the SL viewer and Firestorm used to be.

Cool VL Viewer v1.30.0.21, 64 bits, Oct 15 2022 10:53:18

The Cool VL Viewer has always been faster... 😜

4 hours ago, Anatova Akina said:

It has this ”Core GL profile” setting in the graphics panel, but that doesn't do much; I switched it off and still got >100 fps.

Very strange... The core GL profile provides a massive boost for me with NVIDIA cards (no gain whatsoever with Intel or AMD, and even a slight performance degradation with them), tested on three different PCs with a GTX 460, a GTX 660 and a GTX 1070 Ti, under Linux, Win7 and Win11.

Did you enable threading in the NVIDIA control panel 3D (system-wide) settings?

4 hours ago, Anatova Akina said:

Then I enabled the ”high DPI setting”, which warns that it will halve your fps. It doesn't!

The warning is merely for GPUs much weaker than yours (yours could likely deal with a 4K display at the same frame rates).

4 hours ago, Anatova Akina said:

The SL viewer now also runs close to 60 fps. That is my monitor refresh rate, and there is no way I can get rid of the vsync; is that built in?

Again, check your NVIDIA system-wide settings, and verify that VSync is set to ”let the application decide” or something in that vein. That panel also lets you set per-application settings: check that Firestorm or the SL viewer do not have VSync overrides set in there...

4 hours ago, Anatova Akina said:

maybe because the Cool VL Viewer installed some settings in the same appdata folder

No. The Cool VL Viewer uses separate files (different file names) for all its settings (and a different cache folder too, for assets, textures, objects, etc).

Edited by Henri Beauchamp

15 hours ago, Henri Beauchamp said:

Again, check your NVIDIA system-wide settings, and verify that VSync is set to ”let the application decide” or something in that vein. That panel also lets you set per-application settings: check that Firestorm or the SL viewer do not have VSync overrides set in there...

Thank you for your reply! If you look at my first post above, there is a long list of options I tried. Indeed, I spent a lot of time playing with the driver settings in the NV control panel, both the system-wide and the application settings. Threading is enabled. With vsync off, the Cool VL Viewer runs fine without FPS limits, but the default SL viewer still clips to vsync. And FS does not even reach vsync frame times. I will do a clean driver re-install again to be sure, maybe some setting was still lingering.

15 hours ago, Henri Beauchamp said:

No. The Cool VL Viewer uses separate files (different file names) for all its settings (and a different cache folder too, for assets, textures, objects, etc).

A bit off topic, but yes, the cache in .../AppData/Roaming is in a separate folder, but .../AppData/Local is the same for both viewers? Other browsers create their own folders there too. Even if the files are different, for a clean install you would wipe Local, and that would wipe the settings of both viewers. Good to know, thanks for explaining.

Edited by Anatova Akina
Add: threading enabled

56 minutes ago, Anatova Akina said:

but .../AppData/Local is the same for both viewers?

Yes, and this is done on purpose; this way you can share the same IMs, groups and chat history files for use in my viewer and LL's, for example... Since all other, viewer-specific files and sub-folders bear different names in that folder, there is no risk of any collision.

Quote

for a clean install you would wipe Local and that would wipe the settings of both viewers

The Cool VL Viewer installer never touches your settings and chat/IM/groups log files, and thus always keeps them intact, even when uninstalling the viewer.

Quote

If you look at my first post above, there is a long list of options I tried. Indeed, I spent a lot of time playing with the driver settings in the NV control panel, both the system-wide and the application settings.

About graphics settings, another one you will want to enable is triple buffering (this prevents tearing with VSync off and, if it is kept off, could maybe be what prevents VSync from turning off as well). IIRC, there are two related settings (”allow” and ”enable”, or something in that vein) in the NVIDIA control panel. Same thing for the per-application settings, of course...

Quote

Standing on the platform, I notice that my CPU usage is rather high, around 20% of the 16-thread i7 CPU. That would mean about 3 threads are running at 100%. Is that normal, the SL & FS viewers using 3 full cores while standing on a platform?

Yes, it is normal that today's viewers use a full CPU core at 100%, plus 50% to 100% of another full core (actually used by NVIDIA's threaded driver), when just ”sitting” there with everything rezzed and the frame limiter and VSync both off; with SMT CPUs, it means you will see 3 or 4 virtual cores loaded at 100%.

While rezzing, newer viewers (implementing LL's ”performance viewer” improvements) will eat up even more cores, and the Cool VL Viewer even more than these (up to 100% of your CPU cores), so as to rez things even faster.

To avoid wasting power and to keep the fans quiet(er), you may want to try the frame limiter (for viewers that have one), but AFAIK, only the Cool VL Viewer implements a ”smart” limiter that does not impact rezzing speed at all. In fact, it improves it: at each frame, when there is still some job to do, the ”free time” is used by my frame limiter algorithm to rez even faster rather than just sleeping the CPU, while still keeping a lighter load on the GPU by not uselessly rendering frames that your monitor won't even display.
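A minimal sketch of that idea, with hypothetical helper names standing in for the real work queues (this is not the actual Cool VL Viewer code):

#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Hypothetical stand-ins for real viewer work and rendering:
static int pending_items = 1000;                  // e.g. textures to decode
static bool hasPendingWork() { return pending_items > 0; }
static void doOneWorkItem()  { --pending_items; } // one small unit of work
static void renderFrame()    { /* draw the scene */ }

// "Smart" frame limiter sketch: the frame's free time is spent rezzing
// instead of being slept away, while the GPU is still spared useless frames.
void frameLoop(double target_fps) {
    const auto budget = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / target_fps));
    while (true) {
        const auto deadline = Clock::now() + budget;
        renderFrame();
        while (Clock::now() < deadline && hasPendingWork())
            doOneWorkItem();                      // use the leftover budget
        std::this_thread::sleep_until(deadline);  // no-op if already past
    }
}

The point of the design is that the limiter caps GPU output without ever idling the CPU while rez work is still queued.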

Edited by Henri Beauchamp

6 hours ago, Henri Beauchamp said:

About graphics settings, another one you will want to enable is triple buffering

I understand that games in Windows always use triple buffering? Windows forces V-Sync and triple buffering on all windows, which includes fullscreen borderless windows like those in SL. I never see screen tearing in SL clients, even if they outrun my monitor's refresh frequency. I wonder what the NV setting does and how that works with OpenGL vs Direct3D. Also, triple buffering adds extra lag, so in fullscreen fast shooters, "G-Sync" is the way..

6 hours ago, Henri Beauchamp said:

While rezzing, newer viewers (implementing LL's ”performance viewer” improvements) will eat up even more cores, and the Cool VL Viewer even more than these (up to 100% of your CPU cores), so as to rez things even faster.

To avoid wasting power and to keep the fans quiet(er), you may want to try the frame limiter (for viewers that have one).

Using more of my cores is fine. But a main graphics thread that chokes on one core.. that's not fine. A CPU with fast single-core performance still does wonders for SL. And sure, I use the frame limiter, but I first want to see those high FPS and low jitter again! FS frame rates are low and the frame times are all over the place recently. You would not expect that when it is using more cores.

You did a good job with the Cool VL Viewer, thanks for that! Are you also on the FS dev team? They could learn something, maybe..


5 hours ago, Anatova Akina said:

I understand that games in Windows always use triple buffering? Windows forces V-Sync and triple buffering on all windows, which includes fullscreen borderless windows like those in SL.

No, it's either VSync and double buffering, or triple buffering (using both would be pointless). VSync is by far the worst choice, introducing frame jitter and slowing down the main thread (which the SL viewer is using for both rendering and doing a sh*tload of other things that get delayed at each GL swap call when you enable VSync).
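For reference, a Windows OpenGL application controls VSync per context through the WGL_EXT_swap_control extension; a minimal sketch of what a viewer's VSync toggle boils down to (illustrative, not the actual viewer code):

#include <windows.h>

// From WGL_EXT_swap_control: interval 1 = wait for VSync at each
// SwapBuffers call, interval 0 = swap immediately (may tear unless triple
// buffering or a compositor steps in).
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool setVSync(bool enable) {
    // Must be resolved while an OpenGL context is current on this thread.
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        reinterpret_cast<PFNWGLSWAPINTERVALEXTPROC>(
            wglGetProcAddress("wglSwapIntervalEXT"));
    if (!wglSwapIntervalEXT)
        return false;  // extension not exposed by the driver
    return wglSwapIntervalEXT(enable ? 1 : 0) != FALSE;
}

A driver-level override in the NVIDIA control panel (force on/off) takes precedence over whatever interval the application requests here.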

5 hours ago, Anatova Akina said:

Also, triple buffering adds extra lag, so in fullscreen fast shooters, ”G-Sync” is the way

No, in fact triple buffering is by far the least laggy and smoothest solution to prevent tearing!... Do not get fooled by ”triple”: there are not three frames waiting to be displayed, but just one frame displayed and another being calculated: see this article.

5 hours ago, Anatova Akina said:

Using more of my cores is fine. But a main graphics thread that chokes on one core.

With the Cool VL Viewer, you may reserve a (full) core (i.e. 2 virtual cores on SMT CPUs) for the render thread via the MainThreadCPUAffinity debug setting (this is a bitmap; e.g. 2^0+2^1 = 3 to reserve the first full core of an SMT CPU, or 2^2+2^3 = 12 for the second); all the other threads will then be assigned to all the other available cores. Note that this is for Windows; under Linux you will want to reserve two full cores for the main thread (because, unlike on Windows, the graphics driver threads will also use those reserved cores).
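On Windows, such a bitmap maps directly onto the Win32 affinity API; a minimal sketch of what a setting like this boils down to (illustrative only, not the actual viewer code):

#include <windows.h>

// Pin the calling thread to the logical cores set in 'mask'.
// mask = 3 (binary 0011) -> logical cores 0+1, i.e. the first physical
// core of an SMT CPU; mask = 12 (binary 1100) -> the second one.
bool pinCurrentThread(DWORD_PTR mask) {
    // SetThreadAffinityMask() returns the previous mask, or 0 on failure.
    return SetThreadAffinityMask(GetCurrentThread(), mask) != 0;
}

// e.g. pinCurrentThread(3); from the render thread, leaving the worker
// threads to be scheduled on the complementary cores.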

5 hours ago, Anatova Akina said:

Are you also on the FS dev team

No, but any viewer team may borrow my Open Source code as long as credit is given; some do or did... It's their choice, not mine.

Edited by Henri Beauchamp

On 10/16/2022 at 10:23 AM, Henri Beauchamp said:

Yes, and this is done on purpose; this way you can share the same IMs, groups and chat history files for use in my viewer and LL's, for example... Since all other, viewer-specific files and sub-folders bear different names in that folder, there is no risk of any collision.

The Cool VL Viewer installer never touches your settings and chat/IM/groups log files, and thus always keeps them intact, even when uninstalling the viewer.

I think Ana is referring to when the user manually deletes these folders. It is... a very common thing for users to immediately go scorched earth rather than deleting only the settings file, for instance.

I've had several instances where I was offering to look into their settings and fix them manually (before asking them to delete the settings, with a backup made beforehand), and before I could even finish my sentence they were already in the Roaming folder deleting everything, because that's what they were always taught coming from Firestorm.


On 10/16/2022 at 10:29 PM, Henri Beauchamp said:

No, it's either VSync and double buffering, or triple buffering (using both would be pointless).

I think it is a bit more complicated, and confusing, in Windows 10. The Desktop Window Manager (DWM) forces render ahead, with a default of 3 frames, in any window. That is similar to triple buffering but a bit different, since no frames can be dropped. If you disable V-Sync, the framerate will be uncapped, but because of the render ahead there is no tearing, never ever, in any Windows 10 window. And if you have V-Sync on, the frames per second will still be capped by your monitor refresh rate, which may be desirable to limit GPU power consumption, but not if you fancy low keyboard/mouse input lag. So vsync still makes sense in a game running in a Windows 10 window, which has a forced "triple buffering". Like SL. Also, double buffering DOES NOT EXIST in a game running inside a (borderless) window. No tearing. Never.

There is a lot of discussion on the confusion between triple buffering and render ahead; see for example:

https://www.anandtech.com/show/2794/4

https://linustechtips.com/topic/577522-does-windows-10-force-triple-buffering-in-windowed-mode/

I found somewhere: "If a game has no tearing with Vsync off, it's because it's not running in exclusive fullscreen mode, it's in borderless window (which has its own problems). With Vsync off in exclusive fullscreen mode, there is no way to avoid tearing (unless using something like G-Sync)."

I think that sums it up nicely.


4 hours ago, Anatova Akina said:

I think it is a bit more complicated, and confusing, in Windows 10.

.../...

You are confusing DirectX games and OpenGL ones...

In the viewer, triple buffering is requested at OpenGL context creation (and if you do not configure your driver to allow it, the context will fall back to double buffering only); without it, you always use double buffering (i.e. one draw buffer is displayed while the other is rendered, and they are swapped either on VSync, when enabled, or when the second buffer is ready, which would cause tearing).

As for ”render-ahead”, this does not exist in OpenGL...

Here is an old but excellent article about triple buffering and why you want to use it instead of VSync. You will also see, at the end of this article, in the ”update”:

Quote

Microsoft doesn't implement triple buffering in DirectX, they implement render ahead (from 0 to 8 frames with 3 being the default).

But OpenGL is not DirectX...

Edited by Henri Beauchamp

I'll have to come back and read all this slowly ...    some of it sounds familiar and some of it ~~  right over my head!   😲

 

I do tinker with my settings and sometimes I break things that way - so I try to be very careful.   I want the best performance without doing any damage to my nice new expensive pc.  😁

 

Thanks to all for sharing your knowledge.   My head hurts now.  


8 hours ago, Henri Beauchamp said:

You are confusing DirectX games and OpenGL ones...

In the viewer, triple buffering is requested at OpenGL context creation (and if you do not configure your driver to allow it, the context will fall back to double buffering only); without it, you always use double buffering (i.e. one draw buffer is displayed while the other is rendered, and they are swapped either on VSync, when enabled, or when the second buffer is ready, which would cause tearing).

As for ”render-ahead”, this does not exist in OpenGL...

Here is an old but excellent article about triple buffering and why you want to use it instead of VSync. You will also see, at the end of this article, in the ”update”:

But OpenGL is not DirectX...

Interesting. Not only did I learn that the "composite rendering" (as everyone called it back then) in DWM that prevents tearing without VSync is called render ahead (which explains the driver setting), it also doesn't exist in OpenGL. Now that begs the question: if it's not render ahead that is preventing tearing in windowed OpenGL applications, what is? SL will never ever show tearing, regardless of your settings or VSync, as long as you do not start SL in exclusive fullscreen (which to my knowledge is not possible in any viewer, and was removed a decade ago because it was a crashy mess), so something is still preventing it from happening. Any insight on that?

Edited by NiranV Dean

3 hours ago, NiranV Dean said:

if it's not render ahead that is preventing tearing in windowed OpenGL applications, what is?

Triple buffering, NOT to be confused with DirectX ”render ahead” (which is actually a ”render queue”, and thus introduces as much lag as there are draw buffers in that queue). With triple buffering, you are still at worst ”one frame late” (the two back buffers are simply swapped in such a way that no tearing can happen whenever the VSync pulse happens).
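A conceptual sketch of that bookkeeping (buffer indices only, not driver code): the renderer never blocks, a finished frame replaces any still-unshown one, and the display flips to the newest complete frame at each VSync, so the shown image is at worst one frame old and never torn.

// Conceptual triple-buffering bookkeeping with three buffers, indices 0..2.
struct TripleBuffer {
    int front = 0;     // buffer currently scanned out by the display
    int drawing = 1;   // buffer the renderer is filling right now
    int pending = -1;  // newest finished buffer awaiting VSync (-1 = none)

    // Called when the renderer finishes a frame; it never waits.
    void frameFinished() {
        if (pending >= 0) {
            // A finished frame was never shown: drop it, reuse its buffer.
            int stale = pending;
            pending = drawing;
            drawing = stale;
        } else {
            pending = drawing;
            drawing = 3 - front - pending;  // the remaining third buffer
        }
    }

    // Called at each VSync pulse: flip pointers, never mid-scanout.
    void vsyncPulse() {
        if (pending >= 0) {
            front = pending;  // the old front implicitly becomes the spare
            pending = -1;
        }
    }
};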


14 hours ago, Henri Beauchamp said:

Triple buffering, NOT to be confused with DirectX ”render ahead” (which is actually a ”render queue”, and thus introduces as much lag as there are draw buffers in that queue). With triple buffering, you are still at worst ”one frame late” (the two back buffers are simply swapped in such a way that no tearing can happen whenever the VSync pulse happens).

...but triple buffering is off by default.

[screenshot: NVIDIA Control Panel 3D settings with triple buffering set to Off]

I don't have an override for my viewer (or any, for that matter) either; it's globally off. So unless SL forces it, that's not it either.


7 hours ago, NiranV Dean said:

...but triple buffering is off by default.

Then you will have tearing... unless you use VSync/G-Sync. Try turning your avatar left or right without moving forward/backward, and you should notice it (it is sometimes hard to notice, especially at high frame rates).

Edited by Henri Beauchamp

5 hours ago, Henri Beauchamp said:

Then you will have tearing... unless you use VSync/G-Sync. Try turning your avatar left or right without moving forward/backward, and you should notice it (it is sometimes hard to notice, especially at high frame rates).

Every other application immediately shows tearing, very noticeably. SL does not, and has not in the 15 years that I have been in SL now. I'd see it; I spot even the slightest tearing blindfolded, I hate it so much.


On 10/18/2022 at 11:54 PM, Henri Beauchamp said:

You are confusing DirectX games and OpenGL ones...

And you are challenging me to do more research! So I did a deep dive into the inner guts of the Windows Display Driver Model (WDDM). I collected 10+ good reads on WDDM, DWM, and how Direct3D and OpenGL play together on Windows since WDDM was introduced with Vista. I will share some here.

First this picture: https://learn.microsoft.com/en-us/windows-hardware/drivers/display/windows-vista-and-later-display-driver-model-architecture

This shows the three graphics paths to the kernel and to your graphics card: the Direct3D runtime, the OpenGL runtime, and the legacy Win32 GDI for 2D drawing. However, it does not show how you could mix these on a single screen or desktop.

Wikipedia has more: https://en.wikipedia.org/wiki/Windows_Display_Driver_Model, an interesting read but not directly related to OpenGL. The biggest change is that graphics devices are completely virtualized.

A major change in Vista was the introduction of the Desktop Window Manager (DWM) to render your desktop. This is a native Direct3D application that collects the internal front buffers (!) of all applications, together with all the window meshes, borders and alpha values, then composites them and writes the result to the GPU front framebuffer. This is all accelerated by the GPU.

On the OpenGL site I found an explanation of the changes to OpenGL on systems with the new WDDM, i.e. after Vista. An interesting read: https://www.opengl.org/pipeline/article/vol003_7/ :

"Graphics applications now have to share resources with the 3D-accelerated window manager. Each OpenGL window now requires an offscreen frontbuffer, because there's no longer direct access to the surface being displayed: the desktop. This is also true when the Desktop Windows Manager (DWM) is off."

So, OpenGL writes to an internal buffer only. The final flip to the front buffer is done after compositing by the DWM Direct3D application, which uses a render ahead of 3. When we talk about settings for vsync and fps limiters, we need to look both at what OpenGL is doing and at what the DWM application is doing.

What about native OpenGL applications and games like DOOM? Well, they can get exclusive control of the (still virtualized) graphics hardware, similar to Direct3D. I found this article with another nice graph explaining how Direct3D, the DWM and OpenGL play together:
https://www.opengl.org/pipeline/article/vol003_9/

I annotated that picture in the context of our discussion:

[annotated diagram, based on the article above: how Direct3D, the DWM and OpenGL fit together under WDDM]

I think it is essential to understand the WDDM (post-Vista) architecture when talking about vsync, triple buffering and framerates in windowed OpenGL applications like the Second Life viewer. I am not saying I fully understand it myself, but it makes more sense now.

Now coming back on topic, the questions still unanswered are:

1. How exactly are the NVIDIA Control Panel settings for vsync, triple buffering, the framerate limiter etc. used by the application, by the opengl32 API, by the ICD user-mode driver, by the DWM, and finally by the kernel-level graphics driver?
2. The same for the vsync application setting in FS, which must be an OpenGL setting or a Direct3D setting (for the DWM) or both. Where in the graphics chain does it have an influence?
3. What happened to the SL viewer render pipeline recently, causing a major loss in framerates for some but not for others? Why can vsync get stuck? Where do we find the settings to get back our high FPS? Why does a reinstall sometimes work and sometimes not?
4. Why does the Cool VL Viewer not have the same limiting behavior as SLV and FS? At least not for me..

Unfortunately, there is enough still to research. It would be a major effort to dive into all these source codes; I wouldn't know where to start, really. I am secretly hoping a Linden dev or FS dev will read this..

EDIT: I tried SLV Release 6.6.5.575749 from yesterday. No change, even if I disable vsync in the SLV debug settings (RenderVSyncEnable=FALSE). I also re-tried the Cool VL Viewer: now that viewer also seems FPS limited, no 100+ fps anymore regardless of what settings I try. Every viewer now seems limited, regardless of NVCP or application settings. *sighs*

 

Edited by Anatova Akina
More testing

Well, Windoze is Windoze... It sucks rocks. Period. It also keeps destroying every standard (here, OpenGL), to implement their ”own way” of doing (or rather failing) things...

My advice is: install a Linux dual boot, and use the viewers under Linux: they will fly at 200+ fps, and your experience will be waaaaaaay smoother (with far fewer fps dropouts and hiccups). I recently (a few days ago) stress-tested my viewer under Windows 7 & 11 by ”Yava-podding” (YavaScript Pod tours) at 250% speed and 256 m DD, and geez, what a hiccupy and bumpy ride compared with what I get under Linux on the same machine!!!

As a last piece of advice (because I certainly do not see any 100 fps limit on my systems under Windows 7 or 11): did you disable the Windows 10/11 game mode?... If not, do disable it, and see how it fares.


16 hours ago, Henri Beauchamp said:

Well, Windoze is Windoze... It sucks rocks. Period. It also keeps destroying every standard (here, OpenGL), to implement their ”own way” of doing (or rather failing) things...

In my opinion, that's a bit harsh in the current context. First, MS does not provide any specific OpenGL drivers; the IHVs / hardware vendors do. Enough OpenGL games prove that it works fine. However, the SL viewer graphics implementation has not been updated in years, and OpenGL development has halted in favor of Vulkan. Still, why didn't we see a fullscreen SL viewer version with exclusive-mode access after the WDDM driver model was released? That's years ago. It should not be that difficult, since the SL viewer already draws everything it needs by itself; no window manager is needed at all. Then you would have complete low-level OpenGL hardware control. Alternatively, since OpenGL is a dead end anyway, port the SL viewer to Vulkan, please.

16 hours ago, Henri Beauchamp said:

My advice is: install a Linux dual boot

That's an option, even if only to test. It's been years. Any advice on which distribution is current for gaming?

16 hours ago, Henri Beauchamp said:

As a last piece of advice (because I certainly do not see any 100 fps limit on my systems under Windows 7 or 11): did you disable the Windows 10/11 game mode?

Yes, that software is already gone/disabled, but I will check again in case some update re-enabled it.

Also I will start anew with Windows 11 on a new PC in a couple of weeks.

Thanks for the feedback so far!


46 minutes ago, Anatova Akina said:

In my opinion, that's a bit harsh in the current context.

Not harsh at all. Purely factual (but the truth can indeed be cruel, sometimes). Comparing the performance of the viewer (and many other programs) under Windows and Linux makes that fact obvious and undeniable...

46 minutes ago, Anatova Akina said:

Still, why didn't we see a fullscreen SL viewer version

You may already (try to) use the viewer in full screen mode (if you are lucky enough to not cause it to crash, due to viewer+driver+OS bugs), at which point you would find yourself in the ”Doom” case, with Windoze minding its own business and not interfering any more with OpenGL screen buffering...

46 minutes ago, Anatova Akina said:

Alternatively, since OpenGL is a dead end anyway, port the SL viewer to Vulkan, please.

I keep advocating for a Vulkan renderer, even if it means implementing a double (OpenGL+Vulkan), switchable renderer in the viewer for a transition period, the time for everyone to migrate to a system with proper Vulkan support.

But this would require a heavy manpower investment from LL to fully re-code the renderer for Vulkan (which is much less ”programmer-friendly” than OpenGL, due to how much closer it is to the hardware, with fewer pre-made ”building blocks” available at the driver level); AFAIK, LL is indeed considering a Vulkan renderer, but they first want to update the OpenGL one and get it to a state where it would be easier to port to Vulkan...

46 minutes ago, Anatova Akina said:

Any advice on which distribution is current for gaming?

Nowadays, every Linux distribution will perform just fine for gaming.

Your options might however get narrowed by Windows 11 and its ”secure boot” requirement, meaning the Linux kernel must be signed with M$'s key (how secure is that, huh? 🤪)... Of course, you can convince W11 to install itself on a PC without TPM (or with TPM disabled), no secure boot, in BIOS/CSM mode, on an MBR partition, like I am doing, but it requires some tinkering.

Some distributions do sign their kernels: Ubuntu, for sure, OpenSuse too, I think...

Edited by Henri Beauchamp

On 10/22/2022 at 11:50 AM, Henri Beauchamp said:

You may already (try to) use the viewer in full screen mode (if you are lucky enough to not cause it to crash, due to viewer+driver+OS bugs), at which point you would find yourself in the ”Doom” case, with Windoze minding its own business and not interfering any more with OpenGL screen buffering...

Thanks, I tried. No difference. FS runs at about 30 fps (in the sky, no objects in sight) where before that would be like 200+; SLV and Cool about 60. But yes, it looks like it indeed takes exclusive control now, not a borderless window anymore. Playing with driver settings like vsync did not help though (in NVCP and FS). I did another test: the benchmarks in OpenGL Extensions Viewer 6.3.7.0 all run at up to 1000 fps, and in that test too, vsync in NVCP has no influence at all. That could again be because it is running through the DWM, and if this little proggy can do that, why not the SL viewer? What is wrong with that? Using a full CPU core up to 100% doing tasks, for what? Just to render the sky very, very slowly while my GPU is doing nothing but picking its virtual nose. Highly frustrating.

Thanks again. I will stop nagging and assume it's only me. I will look into a complete system change and re-install, also Linux and W11, in due time.

EDIT: with a clean install of W10 on a spare disk, all drivers, and just the SL viewer applications, the problem seems to be resolved. That kinda sucks indeed, agree on that.

Edited by Anatova Akina
Clean install..
