Anatova Akina

Resident
  • Posts: 8
  • Joined
  • Last visited

Reputation: 1 Neutral

  1. Thanks, I tried. No difference. FS is running at about 30 fps (in the sky, no objects in sight) where it used to be 200+; SLV and Cool at about 60. But yes, this does look like it takes exclusive control now, not a borderless window anymore. Playing with driver settings like vsync (in NVCP, for FS) did not help though. Did another test: the benchmarks in OpenGL Extensions Viewer 6.3.7.0 all run up to 1000 fps, and vsync in NVCP has no influence at all on that test. That could again be because it's running through the DWM, and if this little proggy can do that, why not the SL Viewer? What is wrong with that? Using a full CPU core at up to 100%, for what? Just to render the sky very, very slowly while my GPU is doing nothing but picking its virtual nose. Highly frustrating. Thanks again. I will stop nagging and assume it's only me. Will look into a complete system change and re-install, also Linux and W11, in due time. EDIT: with a clean install of W10 on a spare disk, all drivers, and just the SL viewer applications, the problem seems to be resolved. That kinda sucks indeed, agree on that.
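For anyone who wants to check the compositing part themselves, here is a tiny test I put together (my own sketch, not anything from the viewer sources). It asks Windows whether DWM composition is enabled; on Windows 8 and later it always reports TRUE, because composition can no longer be switched off, which is exactly why every windowed app ends up going through the DWM:

```cpp
// Minimal check: is the DWM compositor active? (Always TRUE on Win8+.)
#include <windows.h>
#include <dwmapi.h>
#include <cstdio>
#pragma comment(lib, "dwmapi.lib")

int main() {
    BOOL composited = FALSE;
    if (SUCCEEDED(DwmIsCompositionEnabled(&composited)))
        std::printf("DWM composition: %s\n", composited ? "ON" : "OFF");
    return 0;
}
```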
  2. In my opinion that's a bit harsh in the current context. First, MS does not provide any specific OpenGL drivers; the IHVs / hardware vendors do. Enough OpenGL games prove that it works fine. However, the SL viewer graphics implementation has not been updated in years, and OpenGL development itself is halted in favor of Vulkan. Still, why didn't we see a full-screen SL Viewer version with exclusive-mode access after the WDDM driver model was released? That's years ago. It should not be that difficult, since the SL Viewer already draws everything it needs by itself; no window manager is needed at all. Then you would have complete low-level OpenGL hardware control. Alternatively, since OpenGL is a dead end anyway, port the SL Viewer to Vulkan, please. That's an option, even if only to test. It's been years. Any advice on what distribution is current, for gaming? Yes, that software is gone/disabled already, but I will check again in case some update re-enabled it. Also, I will start anew with Windows 11 on a new PC in a couple of weeks. Thanks for the feedback so far!
  3. And you are challenging me to do more research! So I did a deep dive into the inner guts of the Windows Display Driver Model (WDDM). I collected 10+ nice reads on WDDM, the DWM, and how Direct3D and OpenGL play together on Windows after WDDM was introduced with Vista back in 2007. Will share some here.

First this picture: https://learn.microsoft.com/en-us/windows-hardware/drivers/display/windows-vista-and-later-display-driver-model-architecture It shows the three graphics paths to the kernel and to your graphics card: the Direct3D runtime, the OpenGL runtime, and the legacy Win32 GDI for 2D drawing. However, it does not show how these can be mixed on a single screen or desktop. The wiki has more: https://en.wikipedia.org/wiki/Windows_Display_Driver_Model, an interesting read but not directly related to OpenGL. The biggest change is that graphics devices are completely virtualized. A major change in Vista was the introduction of the Desktop Window Manager (DWM) to render your desktop. This is a native Direct3D application that collects the internal frontbuffers (!) of all applications, collects all the window meshes, borders and alpha values, composes them, and writes the result to the GPU front framebuffer. This is all accelerated by the GPU.

On the OpenGL.org site I found an explanation of the changes to OpenGL on systems with the new WDDM, so after Vista. An interesting read: https://www.opengl.org/pipeline/article/vol003_7/ : "Graphics applications now have to share resources with the 3D-accelerated window manager. Each OpenGL window now requires an offscreen frontbuffer, because there's no longer direct access to the surface being displayed: the desktop. This is also true when the Desktop Windows Manager (DWM) is off." So OpenGL writes to an internal buffer only. The final flipping to the front buffer is done after compositing by the DWM Direct3D application, which uses a render-ahead of 3. If we talk about settings for vsync and fps limiters, we need to look at both what OpenGL is doing and what the DWM is doing.

What about native OpenGL applications and games like DOOM? Well, they can get exclusive control of the (still virtualized) graphics hardware, similar to Direct3D. I found this article with another nice graph explaining how Direct3D, the DWM and OpenGL play together: https://www.opengl.org/pipeline/article/vol003_9/ I annotated that picture in the context of our discussion. I think it is essential to understand the WDDM (post-Vista) architecture when talking about vsync, triple buffering and framerates in windowed OpenGL applications like the Second Life viewer. I am not saying I fully understand it myself, but it makes more sense now.

Now, coming back on topic, the questions still unanswered are:
     1. How exactly are the Nvidia Control Panel settings for vsync, triple buffering, framerate limiter etc. used by the application, by the opengl32 API, by the ICD user-mode driver, by the DWM, and finally by the kernel-level graphics driver?
     2. Same for the vsync application setting in FS, which must be an OpenGL setting or a Direct3D setting (for the DWM) or both. Where in the graphics chain does it have an influence?
     3. What happened to the SL Viewer render pipeline recently, causing a major loss in framerates for some but not for others? Why can vsync get stuck? Where do we find the settings to get back our high FPS? Why does a reinstall sometimes work and sometimes not?
     4. Why does Cool VL Viewer not have the same limiting behavior as SLV and FS? At least not for me...

Unfortunately, there is enough left to research. It would be a major effort to dive into all these source codes; I wouldn't know where to start, really. I am secretly hoping a Linden dev or FS dev will read this... EDIT: tried SLV Release 6.6.5.575749 of yesterday. No change, even if I disable vsync in the SLV debug settings (RenderVSyncEnable=FALSE). Also re-tried Cool VL Viewer. Now that viewer also seems FPS-limited, no 100+ fps anymore regardless of what settings I try. Any viewer seems limited, always, regardless of NVCP or application settings. *sighs*
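While digging I also put together a small sketch (again my own test code, nothing from the viewers) that queries the compositor's timing through DwmGetCompositionTimingInfo. It reports the refresh rate the DWM is pacing to, which is the clock a composited OpenGL window's swaps ultimately end up synchronized against:

```cpp
// Query DWM composition timing: the refresh rate the compositor paces to,
// plus how many frames it has composed and how many were presented late.
#include <windows.h>
#include <dwmapi.h>
#include <cstdio>
#pragma comment(lib, "dwmapi.lib")

int main() {
    DWM_TIMING_INFO ti = {};
    ti.cbSize = sizeof(ti);
    // NULL hwnd = composition-wide timing (required on Windows 8.1+).
    if (SUCCEEDED(DwmGetCompositionTimingInfo(nullptr, &ti)) &&
        ti.rateRefresh.uiDenominator != 0) {
        double hz = (double)ti.rateRefresh.uiNumerator /
                    (double)ti.rateRefresh.uiDenominator;
        std::printf("DWM refresh rate: %.2f Hz\n", hz);
        std::printf("frames composed: %llu, late: %llu\n",
                    (unsigned long long)ti.cFrame,
                    (unsigned long long)ti.cFramesLate);
    }
    return 0;
}
```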
  4. I think it is a bit more complicated, and confusing, in Windows 10. The Desktop Window Manager (DWM) forces render-ahead with a default of 3 frames in any window. That is similar to triple buffering but a bit different, since no frames can be dropped. If you disable vsync, the framerate will be uncapped, but because of the render-ahead there is no tearing, never ever, in any Windows 10 window. And if you have vsync on, the frames per second will still be capped by your monitor refresh rate. That may be desirable to limit GPU power consumption, but not if you fancy low keyboard/mouse input lag. So vsync still makes sense in a game running in a Windows 10 window, which has a forced "triple buffering". Like SL. Also, double buffering DOES NOT EXIST in a game running inside a (borderless) window. No tearing. Never. There is a lot of discussion on the confusion between triple buffering and render-ahead; see for example: https://www.anandtech.com/show/2794/4 https://linustechtips.com/topic/577522-does-windows-10-force-triple-buffering-in-windowed-mode/ I found somewhere: "If a game has no tearing with Vsync off, it's because it's not running in exclusive fullscreen mode, it's in borderless window (which has its own problems). With Vsync off in exclusive fullscreen mode, there is no way to avoid tearing (unless using something like G-Sync)." I think that sums it up nicely.
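To make concrete where the application-side vsync knob lives in a windowed OpenGL program: on Windows it is normally the WGL_EXT_swap_control extension. Below is a minimal standalone sketch of that mechanism (my own code, not FS's; it assumes the driver exposes the extension, and NVCP can still override whatever the app requests):

```cpp
// Minimal sketch: requesting vsync from a windowed OpenGL app via the
// WGL_EXT_swap_control extension (the usual layer behind a "VSync" checkbox).
#include <windows.h>
#include <cstdio>
#pragma comment(lib, "opengl32.lib")

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int);
typedef int  (WINAPI *PFNWGLGETSWAPINTERVALEXTPROC)(void);

int main() {
    // A small hidden window is enough to create a GL context for the demo.
    HWND hwnd = CreateWindowA("STATIC", "gl", WS_OVERLAPPEDWINDOW,
                              0, 0, 64, 64, nullptr, nullptr, nullptr, nullptr);
    HDC dc = GetDC(hwnd);

    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

    HGLRC rc = wglCreateContext(dc);
    wglMakeCurrent(dc, rc);

    auto swapInterval =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    auto getSwapInterval =
        (PFNWGLGETSWAPINTERVALEXTPROC)wglGetProcAddress("wglGetSwapIntervalEXT");

    if (swapInterval && getSwapInterval) {
        swapInterval(1);  // 1 = wait for vblank on each SwapBuffers, 0 = uncapped
        std::printf("swap interval now: %d\n", getSwapInterval());
    } else {
        std::printf("WGL_EXT_swap_control not exposed by this driver\n");
    }

    wglMakeCurrent(nullptr, nullptr);
    wglDeleteContext(rc);
    ReleaseDC(hwnd, dc);
    DestroyWindow(hwnd);
    return 0;
}
```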
  5. I understand that games in Windows always use triple buffering? Windows forces vsync and triple buffering on all windows, which includes fullscreen borderless windows like those of SL. I never see screen tearing in SL clients, even if they outperform my monitor sync frequency. I wonder what the NV setting does and how that works with OpenGL vs Direct3D. Also, triple buffering gives extra lag, so in fullscreen fast shooters G-Sync is the way... Using more of my cores is fine, but a main graphics thread that chokes on one core, that's not fine. A CPU with fast single-core performance still does wonders for SL. And sure, I use the frame limiter, but I first want to see that high FPS and low jitter again! FS framerates are low and frametimes all over the place recently; you would not expect that when it is using more cores. You did a good job with the Cool viewer, thanks for that! Are you also on the FS dev team? They could learn something, maybe...
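Since the frame limiter came up: the core idea is just filling the leftover frame budget with a sleep. A toy sketch of that idea (hypothetical, certainly not how FS implements theirs):

```cpp
// Toy frame limiter: cap a render loop at a target fps by sleeping away
// whatever is left of each frame's time budget.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double target_fps = 60.0;  // assumed cap for the sketch
    const auto frame_budget = std::chrono::duration<double>(1.0 / target_fps);

    for (int frame = 0; frame < 300; ++frame) {
        auto start = clock::now();

        // ... render the frame here ...

        // Real limiters often spin for the last millisecond instead,
        // because Sleep() granularity adds exactly the kind of frame-time
        // jitter discussed above.
        auto elapsed = clock::now() - start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
    }
    return 0;
}
```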
  6. Thank you for your reply! If you look at my first post above, there is a long list of options I tried. Indeed, I spent a lot of time playing with the driver settings in the NV control panel, both the system-wide and the application settings. Threading is enabled. With vsync off, Cool viewer runs fine without FPS limits, but the SL default viewer still clips to vsync, and FS does not even reach vsync frametimes. I will do a clean driver re-install again to be sure; maybe some setting was still lingering. A bit off topic, but yes, the cache in .../AppData/Roaming is in a separate folder, but .../AppData/Local is the same for both viewers? Other browsers create their own folders there too. Even if the files are different, for a clean install you would wipe Local, and that would wipe the settings of both viewers. Good to know, thanks for explaining.
  7. I tried to edit the above but can't find the edit button. Surprise! The Cool VL Viewer runs fine! >100 fps, pretty smooth, similar to what the SL viewer and Firestorm used to be. Cool VL Viewer v1.30.0.21, 64 bits, Oct 15 2022 10:53:18. It has this "Core GL profile" setting in the graphics panel, but that doesn't do much; I switched it off and still got >100 fps. Then I enabled the "high DPI" setting, which warns that it will halve your fps. It doesn't! Still between 90 and 150 on the platform (with the RTX 2070S). The jitter in frametimes is still there, but at high FPS it is less noticeable. All this on a 2560 x 1361 display size. So what's going on here? If Cool VL Viewer runs much faster, it is not the OpenGL driver. Possibly some recent optimization in the SL Viewer that went into the FS Viewer does not always work as intended. But what... Edit: the SL Viewer now also runs close to 60 fps. That is my monitor refresh, and no way I can get rid of the vsync; is that built in? Why the change? Maybe because Cool VL Viewer installed some settings in the same AppData folder? Firestorm still sucks and is limited to around 27 fps, on low or on ultra settings, all the same. So, guess it's time to go to the FS forums and use Cool VL instead.
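To double-check that a suspiciously stable cap really matches the monitor, you can ask Windows for the active display mode (a small test of mine, nothing viewer-specific):

```cpp
// Print the primary display's current mode and refresh rate, to compare
// against a stable fps cap (~60 in my case).
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODEA dm = {};
    dm.dmSize = sizeof(dm);
    if (EnumDisplaySettingsA(nullptr, ENUM_CURRENT_SETTINGS, &dm))
        std::printf("Primary display: %lux%lu @ %lu Hz\n",
                    dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    return 0;
}
```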
  8. I am glad I found this thread. Tearing my hair out to find what's wrong with the latest updates. Something seems suddenly, seriously broken.

Testing: I am standing on a platform in the sky, just the platform visible and my prefab system avatar. I used to get easily 150+ fps at this place. But now... with the latest Firestorm viewer only 21, with the latest SL viewer only 40? And it doesn't matter if you put the slider on low or ultra: same statistics, with a frame time of about 51 ms and a huge jitter visible, like the picture of Wulfie above. The jitter is worse when you turn your view. I used to have a straight vertical line of red dots in my statistics display, at like 150-200 fps. Now the frame times are all over the place, even on this stationary platform. This is with an Intel i7 10700K and an RTX 2070S. So... I replaced it with a 3080 Ti. That was a worse experience! On mainland, framerates dropped to 8 fps after a while and seemed to be locked down to that value. If you drive a vehicle, that's totally unplayable. So I went back to the 2070S, because the experience is still bad, but better. Go figure...

The things I tried to improve and diagnose:
  • vsync off in the FS viewer and in the Nvidia control panel
  • maximum performance set in the Nvidia control panel
  • framerate limiter off in FS
  • graphics settings to Low (Low or Ultra doesn't matter much, especially not on the 3080 Ti)
  • using only one main display, disconnected the secondary
  • found devices and drivers with high latencies and interrupt usage, and removed all unnecessary software (Razer... ughh)
  • re-installed drivers, tried different versions, older and newer; no change
  • used the DDU driver remover to wipe the graphics drivers, re-installed the newest
  • re-installed Firestorm and the SL viewer from scratch, wiping all related caches, registry entries and directories
  • ran several DirectX and OpenGL benchmarks to check my system; all give the expected performance

Standing on the platform, I notice that my CPU usage is rather high, like 20% of the 16-thread i7 CPU. That would mean about 3 threads are running at 100%. Is that normal, the SL & FS viewer using 3 full cores while standing on a platform? So, also noting that graphics settings don't matter much, something is heavily CPU-bound, and that holds back the GPU. What could that be? Is there a location where I can download older Second Life clients with which you can still log in? Thanks for any suggestion, because I am stymied right now and can't really enjoy SL anymore.

-----------
CPU: Intel(R) Core(TM) i7-10700K CPU @ 3.80GHz (3792 MHz)
Memory: 16303 MB
Concurrency: 16
OS Version: Microsoft Windows 10 64-bit (Build 19044.2006)
Graphics Card: NVIDIA GeForce RTX 2070 SUPER/PCIe/SSE2
Graphics Card Memory: 8192 MB
Windows Graphics Driver Version: 31.0.15.1740
OpenGL Version: 4.6.0 NVIDIA 517.40
Second Life Release 6.6.4.575022 (64bit)
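One more diagnostic idea: measuring frame times yourself with QueryPerformanceCounter, the same kind of numbers the statistics bar plots. A sketch of that (the Sleep(16) is only a stand-in for a frame's worth of work); with a genuinely CPU-bound viewer you would see the long, jittery times no matter what the graphics slider says:

```cpp
// Per-frame wall-clock timing with QueryPerformanceCounter; this is the
// raw version of the frame-time statistic the viewers display.
#include <windows.h>
#include <cstdio>

int main() {
    LARGE_INTEGER freq, prev, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&prev);

    for (int frame = 0; frame < 100; ++frame) {
        Sleep(16);  // stand-in for one frame's worth of work

        QueryPerformanceCounter(&now);
        double ms = 1000.0 * (double)(now.QuadPart - prev.QuadPart) /
                    (double)freq.QuadPart;
        prev = now;
        std::printf("frame %3d: %6.2f ms\n", frame, ms);
    }
    return 0;
}
```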