A 3070 report and fps :D



On 4/8/2021 at 1:00 AM, Mollymews said:

the article gets the basics right.  I got my understanding of FPS from how FRAPS does it, as mentioned in the article

i have my NVidia set to Adaptive VSync. Which for me works out pretty ok in SL.  More here: https://www.nvidia.com/en-us/geforce/technologies/adaptive-vsync/technology/

This is absolutely TLDR...

The opening paragraph makes sense to me: "Nothing is more distracting when gaming than frame rate stuttering and screen tearing. Stuttering occurs when frame rates fall below the VSync frame rate cap, which is typically 60 frames per second, matching the 60Hz refresh rate of most monitors and screens. When frame rates dip below the cap VSync locks the frame rate to the nearest level, such as 45 or 30 frames per second. As performance improves the frame rate returns to 60."

When frame rates fall below the refresh rate, it's necessary to repeat frames until a new one is available. This happens when converting 24fps film to 60Hz video. In that case, both the film and video rates are stable and synchronized (so no tearing), and the machinations required to convert from one rate to the other result in "judder". Judder is different from stutter in that the visible frame cadence, though uneven, is constant over time and not terribly objectionable. We're all accustomed to 24fps judder from watching films in theaters, regardless of whether the film is actually film or a conversion to digital.

Stutter is random, and like tearing, is highly objectionable. To reduce both tearing and stutter when rendering at frame rates below the refresh rate, it's important to synchronize to a sub-multiple of the refresh rate (a multiple of the refresh time). The easiest case to imagine is 30fps (33.33ms), where you'd simply play each rendered frame twice, synchronized to the 60Hz VSync (16.67ms), since 16.67ms x 2 = 33.33ms.
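If it helps, here's a quick Python sketch of the sub-multiple idea (the function name submultiple_rates is just mine for illustration): these are the frame rates at which every frame can be held for a whole number of 60Hz refreshes.

def submultiple_rates(refresh_hz=60, max_repeat=6):
    # Frame rates at which each frame is shown for an integer number of refreshes.
    return [refresh_hz / n for n in range(1, max_repeat + 1)]

print(submultiple_rates())   # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]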

At 45fps (22.22ms), you need four 60Hz refresh cycles (66.7ms) to render three frames. You might refresh the first rendered frame twice and the next two once each. In this case the refresh rate is 60Hz, the frame rate is 45fps and the judder rate is 30jps. I'd not be surprised to find other tricks being played, such as doing motion estimation between 45Hz frames so that no frame is repeated exactly during the up-conversion. High frame rate televisions do this on existing 60fps content.
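Here's a minimal sketch of that cadence, assuming the display simply shows the newest rendered frame available at each refresh (frame k taken to be ready at time k/frame_hz; integer arithmetic avoids rounding trouble at the exact boundaries).

from collections import Counter

refresh_hz, frame_hz = 60, 45
# Which rendered frame is the newest one available at each of the first 8 refreshes?
shown = [(r * frame_hz) // refresh_hz for r in range(8)]
print(shown)            # [0, 0, 1, 2, 3, 3, 4, 5]
print(Counter(shown))   # frames 0 and 3 are held for two refreshes, the others for one

Setting frame_hz to 24 produces the 3-2-3-2 cadence of the film-to-60Hz conversion mentioned earlier.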

Why not synchronize 50fps (20ms) to 60Hz? The lowest common multiple of 20ms and 16.67ms is 100ms, which produces 10Hz judder. That's much more objectionable than the 30Hz judder of 45fps. Even 20fps might look better, as it produces 20Hz judder, with each frame played three times. The "levels" of NVIDIA's "VSync locks the frame rate to the nearest level" are chosen to minimize judder.
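A quick sketch of that lowest-common-multiple arithmetic in Python (the function name judder_pattern_hz is mine, not NVIDIA's): for whole-number rates, the LCM of the frame period and refresh period is 1/gcd(frame rate, refresh rate) seconds, so the rate at which the repeat pattern recurs is just the GCD of the two frequencies.

from math import gcd

def judder_pattern_hz(frame_hz: int, refresh_hz: int) -> int:
    # The cadence repeats every LCM of the frame and refresh periods,
    # which for integer rates is 1 / gcd(frame_hz, refresh_hz) seconds.
    return gcd(frame_hz, refresh_hz)

for fps in (50, 20, 24):
    print(f"{fps} fps on 60 Hz -> cadence repeats at {judder_pattern_hz(fps, 60)} Hz")
# 50 fps -> 10 Hz (the 100 ms pattern above)
# 20 fps -> 20 Hz (each frame held for exactly three refreshes)
# 24 fps -> 12 Hz (film's familiar 3:2 pulldown on 60 Hz video)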

Okay, I get all that, but then NVIDIA goes on to confuse me by stating "When VSync is disabled in-game, screen tearing is observed when the frame rate exceeds the refresh rate of the display (120 frames per second on a 60Hz display, for example)."

Tearing occurs at ANY frame rate that's not synchronized to VSync. At 40fps, it takes 25ms to draw a frame. At 60Hz, it takes 16.7ms to refresh. If rendering begins the instant the screen starts refreshing, the refresh will complete while there's still 8.3ms of frame left to render. As a result, the next screen refresh will spend its first 8.3ms showing the old frame and its remaining 8.3ms showing the newly completed one, tearing the screen across the middle. The refresh after that shows only the newer frame, with no tear.
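Here's a rough model of that in Python, assuming the buffer is swapped the instant a frame finishes and the display scans top to bottom once per refresh; exact fractions keep the boundary cases honest.

from fractions import Fraction as F

refresh_t = F(1, 60)   # 16.7 ms scanout
frame_t   = F(1, 40)   # 25 ms to render one frame (40 fps)

for refresh in range(4):
    start, end = refresh * refresh_t, (refresh + 1) * refresh_t
    # Buffer swaps that land strictly inside this scanout produce a tear.
    k = start // frame_t + 1
    tears = []
    while k * frame_t < end:
        tears.append(float((k * frame_t - start) / refresh_t))  # fraction down the screen
        k += 1
    print(f"refresh {refresh}: tears at {tears}")
# refresh 0: []   refresh 1: [0.5]   refresh 2: []   refresh 3: []

Swapping frame_t for F(1, 120) reproduces the 120fps-on-60Hz case in the next paragraph, with a tear in the middle of every refresh for this particular alignment.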

When the frame and refresh rates are exactly the same, you can still have tearing if they aren't synchronized. The tear line will be stationary on the display at some vertical position that's proportional to the misalignment between rendering and refreshing. At 120fps and 60Hz, you're guaranteed to have either one or two tears separating two or three frames. Here's the example image from the NVIDIA page, right after the above quote.

[NVIDIA's example image: a screenshot showing two tear lines.]

That image is 326 pixels tall, and I measure the distance between the two tear lines at 177 pixels. The narrow slice of scene above the first tear is from the earliest of three frames. The widest slice is the second frame, and the slice at the bottom is the third. If a game frame takes 177/326ths of a 16.7ms refresh period (60Hz), the game is rendering at about 111fps.
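The arithmetic, for anyone who wants to check it (the pixel numbers are my measurements from the screenshot):

image_px, tear_spacing_px = 326, 177
frame_time_s = (tear_spacing_px / image_px) / 60   # frame time as a fraction of one 60 Hz scanout
print(1 / frame_time_s)                            # ~110.5 fps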

I've no idea why NVIDIA makes the claim they do unless there is something else going on behind the scenes they don't discuss.

On 4/8/2021 at 12:46 PM, Kathrine Jansma said:

After all, it's totally useless (if you go by https://en.wikipedia.org/wiki/Nyquist–Shannon_sampling_theorem ) to render more than 2x the Hz of your display in fps (fps is basically Hz), so setting some limit in the display update range is useful to save energy. 

Nyquist is involved, but not quite in the way you suggest. Reality is the thing being "sampled" here, and that's done by rendering, at the frame rate. If you want to avoid the aliasing problems we see in movies (stagecoach wheels appearing stationary or spinning backwards while the coach moves forward), you must render at least 2x faster than the fastest repetitive motion in the scene, usually things that spin. For most scenes, there's not much of that. Higher frame rates do produce smoother motion and, in gaming situations, lower latency.
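A small sketch of that folding effect, treating the wheel's spin rate as the signal and the render rate as the sample rate (rates in rotations per second, spoke symmetry ignored; the function name is mine):

def apparent_spin_hz(true_hz: float, frame_hz: float) -> float:
    # Aliased rotation rate after sampling at frame_hz (Nyquist folding).
    folded = true_hz % frame_hz
    return folded if folded <= frame_hz / 2 else folded - frame_hz

for spin in (10, 25, 50, 59):
    print(f"{spin} rot/s sampled at 60 fps looks like {apparent_spin_hz(spin, 60):+.0f} rot/s")
# 10 -> +10, 25 -> +25, 50 -> -10 (spinning backwards), 59 -> -1 (nearly stationary)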

Rendered frames then get sampled by the display at the refresh rate. But this isn't quite the same kind of sampling. Any "reality" that happens between rendered frames has already been lost, or accommodated in some way (motion blur) by the rendering. There's no need to oversample the rendering. If you render at 60fps, nothing is lost by refreshing at 60Hz.

There is one potential advantage to oversampling, hinted at by @Wulfie Reanimator. If a game locks everything to the frame rate, latency will be reduced by drawing faster, even if most of the rendered frames are tossed. Imagine a game rendering (and taking mouse input, computing physics, etc.) at 60fps and displaying on a 60Hz monitor. At the moment a frame is completed and sent to the display, the parameters that drove the rendering are now 1/60th of a second old. If the game is rendering a million frames a second, the parameters will be only one millionth of a second old. Either way, it'll still take 1/60th of a second for the image to appear on the display, but it is possible to cut latency roughly in half by rendering far, far faster than you can display and tossing all but the most recent frame at refresh time. Increasing the display refresh rate reduces latency in the actual display of the frame, so that's also worth pursuing for reasons beyond scene motion quality.
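Putting rough numbers on that with the simple model from this paragraph: input is sampled when a frame starts rendering, the display grabs the newest completed frame at each refresh, and scanout takes one refresh period. Illustrative figures only, not measurements.

def input_to_display_ms(render_hz: float, refresh_hz: float = 60.0) -> float:
    render_age_ms = 1000.0 / render_hz   # parameters are one render period old when the frame is done
    scanout_ms = 1000.0 / refresh_hz     # plus one refresh period to get it on screen
    return render_age_ms + scanout_ms

print(input_to_display_ms(60))         # ~33.3 ms
print(input_to_display_ms(1_000_000))  # ~16.7 ms, roughly half, as described
print(input_to_display_ms(60, 144))    # a faster display shrinks the other term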

Displays that refresh on command (variable refresh rate) provide the best performance overall, by eliminating the tradeoffs that go into minimizing judder and tearing while also improving power consumption during static scenes. That's where mobile devices are going.

Edited by Madelaine McMasters

Some things in this thread no longer sound up to date.

I can use either a VSync mode that uses VSync when the fps is above the monitor frequency and switches it off when it's lower, so there is no drop to 30Hz or anything like that (Adaptive VSync/Fast VSync).

Or I can use a variable monitor frequency that adapts to the framerate (within limits).

The reason players say a higher fps looks better is that there is no constant fps in games. Even if you run at an average of 100 fps, some frames take much longer, and that's what the eye catches. A constant 60 fps will look better than an average 100 fps.

For SL, Fast VSync is fine: no tearing, and frames above the monitor frequency are dropped. That is 100Hz in my case. I get around 100fps at home with shadows, and that place has some complexity. I run at 3440x1440, which is about 60% of the pixels of 4K. In full clubs I need to switch to my club setting to keep the fps at 30. The GPU load usually shows 30-35% (an NVIDIA 2070), so I don't think high-end GPUs give much benefit for SL, not even at 4K.
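For reference, the pixel counts behind that 60% figure (a quick check in Python):

ultrawide = 3440 * 1440   # 4,953,600 pixels
uhd_4k    = 3840 * 2160   # 8,294,400 pixels
print(f"{ultrawide / uhd_4k:.0%}")   # ~60%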

 

