
New Firestorm Version September 2022


arabellajones


Recommended Posts

9 hours ago, Jackson Redstar said:

This back-and-forth argument reminds me of the days I worked as a UI designer for a major corp - and how we would always kinda laugh when developers thought they could design a UI. It always turned out to be a UI only other developers could possibly understand

Funny thing is, as an RL developer, we always facepalm at the ideas of the UI/UX team, because their ideas are often a "stylish regression and step back" - they have no idea of how the current applications are being used! But then I just don't care enough to emphasize the fundamental flaws in their design after giving them initial feedback... 😂


15 hours ago, Aishagain said:

You see...there's the fundamental misunderstanding. SL is not played solely by gamers who know how their rigs work. It is also embraced by folk who otherwise might not have known much more than how to turn a computer on and send an email. I wasn't much different myself 15 years ago when I first explored this virtual world (via On Rez and some sort of CSI game). We never stop learning, but we do need the complexities explained now and then.

SL is played by 15-year-old accounts that still don't know how to open chat after all these years.

Other things most people don't know, which are universal across all Viewers:
Basic Camera Controls (Alt-Zoom, Panning, Orbiting)
Movement Controls (Arrow Keys/WASD, E/PageUp, C/PageDown, F/End)
Common Shortcuts (CTRL + P = Preferences, CTRL + I = Inventory, CTRL + C = Copy, CTRL + V = Paste, CTRL + B = Build, CTRL + 1/2/3/4 = Build Modes)
Common Functions (Enter = Open/Send Chat, Right-Click > Reset Skeleton, Right-Click > Hover Height)
Useful UI Functions (Arrow Keys to move through dropdowns, Enter to select, Arrow Keys in sliders for the smallest increment changes)

I keep making fun of that one incident where someone asked me over IM how to chat in local chat (after 13 years of Second Life), but come to think of it, it's a sad matter that after 10+ years lots of people still don't know the basics.

I really can't repeat it often enough, but after just under a year you MUST know the basics; there is really no excuse not to. Learning the basics is essential and takes a week at most - they are so basic you are going to use them practically all day.

12 hours ago, Jackson Redstar said:

The first mistake in designing a UI would be to assume all users are 'gamers' LOL

Followed by assuming the user can read.

And finally, last but not least, assuming the user can apply common sense.

Combine all three of these mistakes and you've essentially covered practically all UI design that is not "like Firestorm", because "like Firestorm".
I think we can neatly wrap all the mistakes up as "it's not like Firestorm".

Edited by NiranV Dean
  • Like 1

On 9/11/2022 at 1:10 AM, Beq Janus said:

For my money, Vsync is a poor frame rate limiter. I have posted why on another thread, with caveats to say that I may be totally interpreting this the wrong way. What you will most likely get with vsync on is far choppier performance. (As noted on the other thread, I would be very happy to have this proven incorrect, and my misunderstanding of the OpenGL documentation clarified.)

 

On macOS, application output (even at fullscreen) is composited to the desktop at 60 fps, regardless of how many fps the application is capable of generating or actually generates. If the viewer generates more than 60 fps, you will see tearing when panning the scene. So right now, Mac users had better keep vsync on.

In the upcoming macOS 13 (Ventura), Apple supports monitors that can vary the frame rate between 3 and 120 Hz, which will be used on their portable HW to conserve battery (and in the iPhone and Apple Watch for always-on display support). The effect of this is that the viewer code for vsync no longer works on Ventura, and the viewer will run at the fastest frame rate the HW can generate.

The current desktop compositor in macOS is written in Metal 2; Metal was first introduced in macOS 10.11.

The original Metal implementation was flaky in that it allowed games (and also the viewers) to interact directly with the OpenGL HW in the machine, resulting in them overwriting GPU memory that the Metal compositor used and therefore crashing the GPU. This was only completely fixed/blocked in the macOS kernel in macOS 12.6, and is the reason why it is an exceptionally bad idea to build the viewer with a deployment target of less than macOS 11.13.

The Ventura compositor is written in Metal 3.

On Windows, which does not have the double-buffered output and OS-level desktop compositing of macOS, this obviously works differently.

Edited by Gavin Hird
  • Thanks 2

The bottom line is: leave Vsync enabled if you're using a 60Hz display or your CPU/GPU aren't relatively modern. If you have a high-refresh-rate monitor and your hardware (and cooling solution) are up to it, try turning it off. It's not going to break anything, but it will make your system work harder. Whether it makes a real-world noticeable difference is up to you to decide. A steady 60Hz refresh is pretty good, given SL isn't an FPS game and the input precision isn't really that good.

  • Like 2

47 minutes ago, Crim Mip said:

The bottom line is: leave Vsync enabled if you're using a 60Hz display or your CPU/GPU aren't relatively modern. If you have a high-refresh-rate monitor and your hardware (and cooling solution) are up to it, try turning it off. It's not going to break anything, but it will make your system work harder. Whether it makes a real-world noticeable difference is up to you to decide. A steady 60Hz refresh is pretty good, given SL isn't an FPS game and the input precision isn't really that good.

Bottom line is: keep it off on Windows. There is zero reason to use it: it slows down the viewer's internal processes, causes additional input delay, and is a bad framerate limiter. On Windows it's impossible for windowed applications to have screen tearing, thanks to the desktop compositor, which acts as a natural vsync without an fps limit. If you want to limit your fps, use the in-viewer solution.

  • Like 3
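
For reference, the "desktop compositor" meant here is the Windows DWM. A minimal C++ sketch of how an application could check it (DwmIsCompositionEnabled is the real Win32 call; the wrapper around it is purely illustrative, and on Windows 8 and later composition is always enabled anyway):

    #include <windows.h>
    #include <dwmapi.h>
    #pragma comment(lib, "dwmapi.lib")

    // True when the DWM is compositing the desktop, i.e. windowed
    // applications are presented tear-free regardless of vsync.
    bool dwm_composition_enabled() {
        BOOL enabled = FALSE;
        return SUCCEEDED(DwmIsCompositionEnabled(&enabled)) && enabled;
    }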

12 hours ago, Ansariel Hiller said:

Funny thing is, as an RL developer, we always facepalm at the ideas of the UI/UX team, because their ideas are often a "stylish regression and step back" - they have no idea of how the current applications are being used! But then I just don't care enough to emphasize the fundamental flaws in their design after giving them initial feedback... 😂

[my reply was possibly a bit too snarky, sorry, removed]

Edited by Katherine Heartsong

45 minutes ago, NiranV Dean said:

Bottom line is: keep it off on Windows. There is zero reason to use it: it slows down the viewer's internal processes, causes additional input delay, and is a bad framerate limiter. On Windows it's impossible for windowed applications to have screen tearing, thanks to the desktop compositor, which acts as a natural vsync without an fps limit. If you want to limit your fps, use the in-viewer solution.

100% agreed. To restate what we've said here: Vsync is not "capping your fps at your monitor refresh"; it will cap at some unit fraction of your refresh rate, depending on the actual frame time of the scene you are rendering, and has the potential to cause significant jitter as a result. The built-in limiter achieves the result in a far more consistent manner and results in a better all-round experience.

  • Like 2
  • Thanks 3
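
To put numbers on the "unit fraction" behaviour described above, assume a 60Hz display: a vsynced buffer swap blocks until the next refresh pulse, so each frame always occupies a whole number of pulses. A small illustrative C++ function (not viewer code):

    #include <cmath>

    // Effective frame rate under vsync: a frame that misses a refresh
    // pulse must wait for the next one, so the rate is always the
    // refresh rate divided by a whole number of pulses per frame.
    double vsynced_fps(double frame_ms, double refresh_hz) {
        double pulse_ms = 1000.0 / refresh_hz;            // 16.7ms at 60Hz
        return refresh_hz / std::ceil(frame_ms / pulse_ms);
    }

    // vsynced_fps(15.0, 60.0) -> 60 (frame fits in one pulse)
    // vsynced_fps(18.0, 60.0) -> 30 (just misses the pulse: rate halves)
    // vsynced_fps(35.0, 60.0) -> 20 (every third pulse)

A scene averaging 18ms per frame would run at around 55 FPS uncapped, but drops straight to 30 FPS with vsync on; there is no in-between, which is the jitter being described.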

6 hours ago, Beq Janus said:

100% agreed. To restate what we've said here: Vsync is not "capping your fps at your monitor refresh"; it will cap at some unit fraction of your refresh rate, depending on the actual frame time of the scene you are rendering, and has the potential to cause significant jitter as a result. The built-in limiter achieves the result in a far more consistent manner and results in a better all-round experience.

A question, then: there is a frame limiter in the Rendering options, and in the Nvidia control panel you can also set an application-specific max frame rate. Is one of those options better than the other? Regardless of 'tearing' issues (which I've never seen) with v-sync on, I had been using v-sync because I really have no need for anything more than 60 fps in SL, so there is no need for my graphics card to work so hard to give me something I really don't need.


On 9/10/2022 at 4:55 PM, Aishagain said:

Just my two penn'orth on this release. We were warned about two bugs in this release, inherited from LL. One affects right-clicking objects inworld, I was told... for me this just does not seem to happen, except for a minimal freeze on right-clicking a scripted object. For me it is so slight it is negligible. I gather it is worse for others, so I wonder if it is dependent on CPU speed.

The other is a crash to desktop if you attempt to save a 360° snapshot. Well, for me that is not an issue, since I have NO idea what use I could make of it!

In short, the new Firestorm 6.6.3.67470 is simply the best viewer I have used in SL to date. Busy clubs no longer force me to lower my quality level or reduce my FPS to slide-show speeds.

The base Linden Viewer changes were good, this is simply better!

Thank you, FS developers and testers - you have given me my SL back.

Thanks, you answered the question I thought of asking here.

I have a problem with right-clicking objects for editing, not only scripted objects. In edit, when I copy a prim, move the prim, or delete it, the whole screen locks for a second.

When moving or copying a prim it's about half a second. Very irritating, as I build now and then.

Yes, FPS is good on high settings, but I'll soon go back to the older FS. Lower FPS is better than this.


On 9/12/2022 at 8:49 PM, Jackson Redstar said:

A question, then: there is a frame limiter in the Rendering options, and in the Nvidia control panel you can also set an application-specific max frame rate. Is one of those options better than the other? Regardless of 'tearing' issues (which I've never seen) with v-sync on, I had been using v-sync because I really have no need for anything more than 60 fps in SL, so there is no need for my graphics card to work so hard to give me something I really don't need.

The frame limiter in the viewer sets a time budget for each frame. If I set it to 20 FPS then it will set my limit at 50ms. If my frame takes 30ms to draw, then it will sleep for the remaining 20ms. If a frame takes more than 50ms, it doesn't do a thing. It is a proper cap on performance and has no impact on frames below that cap.
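
In rough C++, the behaviour described above amounts to the following (a minimal sketch using the 20 FPS example; the function is illustrative, not Firestorm's actual code):

    #include <chrono>
    #include <thread>

    // Sleep away whatever is left of the frame's time budget.
    void limit_frame(std::chrono::steady_clock::time_point frame_start,
                     double max_fps) {
        using namespace std::chrono;
        auto budget  = duration<double, std::milli>(1000.0 / max_fps); // 50ms at 20 FPS
        auto elapsed = steady_clock::now() - frame_start;              // time spent drawing
        if (elapsed < budget)
            std::this_thread::sleep_for(budget - elapsed);             // e.g. 20ms of a 30ms frame
        // Frames over budget fall through untouched: the cap never
        // slows an already-slow frame down further.
    }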

I'm not sure what the frame limiter in the NVidia tools does, but as it cannot force the viewer to sleep, I presume it must block on the screen swap, similarly to vsync but without the syncing. If I am right in that assertion then it will be better than vsync, but the built-in limiter is arguably best as it is localised to the viewer - I could probably argue the other case if I wanted 🙂

Vsync is just a poor choice because, as we've seen, it is not a cap at a fixed rate but can cap at various lower rates, and it causes jitter.

 


8 hours ago, Beq Janus said:

The frame limiter in the viewer sets a time budget for each frame. If I set it to 20 FPS then it will set my limit at 50ms. If my frame takes 30ms to draw, then it will sleep for the remaining 20ms.

There are smarter ways than just sleeping for 20ms straight: doing so still leads to slower rezzing, since the CPU sleeps when it could process more data (decoding object update messages, polling the texture-fetching threads, recalculating texture priorities and updating them for the fetch and decode threads, recovering textures from the decode threads and launching GL image creation for them in the GL threads, etc.).

In the Cool VL Viewer, the frame limiter I implemented uses the free CPU time in 1ms steps: polling and checking for newly fetched/decoded images, yielding to coroutines, etc., at each step, updating threads until no work is pending, then sleeping again for 1ms, and so on until the minimum frame duration is reached. There is almost no impact on CPU power consumption compared with one monolithic long sleep, and it still saves the same amount of power on the GPU side. The net result, however, is that the viewer rezzes things much faster (even faster than when the frame rate is unlimited, because then a larger proportion of CPU time is spent rendering extraneous frames rather than decoding textures, etc.).
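
A rough sketch of that approach as described above (do_pending_work is a stand-in for the real per-step polling of texture fetches, coroutine yields, etc., not an actual Cool VL Viewer API):

    #include <chrono>
    #include <thread>

    // Spend the spare frame time in 1ms slices of useful work rather
    // than one long sleep; nap for a slice only when no work is pending.
    void limit_frame_busy(std::chrono::steady_clock::time_point frame_start,
                          double min_frame_ms, bool (*do_pending_work)()) {
        using namespace std::chrono;
        auto deadline = frame_start + duration<double, std::milli>(min_frame_ms);
        while (steady_clock::now() < deadline) {
            if (!do_pending_work())                           // nothing to process...
                std::this_thread::sleep_for(milliseconds(1)); // ...sleep one slice
        }
    }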

I encourage the Firestorm developer team (and all other viewer teams) to ”borrow” my code (as long as due credit is given, any such reuse of the Open Source code I produce is most welcome).

8 hours ago, Beq Janus said:

Vsync is just a poor choice because, as we've seen, it is not a cap at a fixed rate but can cap at various lower rates, and it causes jitter.

Indeed. VSync is the worst choice of all. And for people worried about tearing, there is a far better way to solve it: triple buffering.

Edited by Henri Beauchamp
  • Like 4

14 hours ago, colleen Criss said:

Before the update my PC ran just fine at 100 fps or better; now it won't run above 60 no matter what I'm doing. I log into mud-like textures, and crowds tank me to 12 fps. Never happened before.

The only issue I had noticed was that my Shadow Quality was set to 3.8, and with the previous Firestorm version that ran just fine. In the new version there's no chance of running it at 3.8, because it slows down extremely. However, setting it from 3.8 to 2.0 solved it for me (and it runs super smooth now, way better than ever before), and the shadows still look good. Possibly they've changed something here as well.

Edited by xDancingStarx

29 minutes ago, xDancingStarx said:

The only issue I had noticed was that my Shadow Quality was set to 3.8, and with the previous Firestorm version that ran just fine. In the new version there's no chance of running it at 3.8, because it slows down extremely. However, setting it from 3.8 to 2.0 solved it for me (and it runs super smooth now, way better than ever before), and the shadows still look good. Possibly they've changed something here as well.

Without seeing the stats it is hard to be certain, but in general a shadow quality that high will cause a lot more data to be shuffled between CPU and GPU, causing far larger SwapBuffer times (this is the delay enforced by the graphics driver while it gets all its work done). With the perf changes we are now utilising the GPU a lot more heavily, so in conjunction with the extra load of high-detail shadows it does not surprise me that you'd see a degradation like that, which, as you say, is reduced by lowering the shadow quality setting. A fair chunk of supposition on my part in this statement, but I suspect it is a reasonable bet.


3 hours ago, Wulfie Reanimator said:

Can somebody explain why vsync is so uniquely terrible in SL viewers? (Or rather, why hasn't it been improved after all this time?)

SL viewers perform a sh*tload of tasks in their main thread beyond rendering the 3D world (in particular downloading, decoding, and adding to the 3D world, in real time, the objects, avatars, etc.). Even if many of these tasks are nowadays partly threaded (partly, because each of those threads still takes its input from, and reports its output to, the main thread, which also has to prioritize each task based on your avatar's camera moves), there is still a lot of CPU code executed in the main thread whose execution time cannot be predicted at all, causing a lot of jitter in each frame's calculation time.

Besides, to quote myself:

Quote

.../..., the SL render engine is not about dealing with pre-made, optimized contents, but about rendering anything its users could have decided to create or upload, and then throw at it! We are speaking here of totally random meshes, textures, animations, and combinations thereof, in any variable amount/concentration at any given place.

This is unlike AAA games, where all the required data resides on your disk or in memory, where all objects are pre-made and well optimized, with all textures and meshes adjusted so that their level of detail is exactly what it needs to be to ”look good” when you see them in already-scheduled situations, and where the coders have carefully balanced the number of objects/details you will see in each scene to keep the average rendering time the same for each frame.

Even games able to generate the world ”on the fly” use pre-made contents and algorithms to keep things in line as far as rendering time is concerned.

With an SL viewer, a frame can render in as little as a single millisecond (for example, in a skybox, or when sailing on a sea away from the coast) or in as much as 100ms (at which point you lament dropping to single-digit frame rates), and this is totally unpredictable.

Vsync makes things even worse because of the delays (it blocks the main thread on the GL buffer swap call when it needs to wait for the next VSync pulse, wasting CPU time that could be used to load more textures and rez more objects), and because of the additional jitter it introduces (the worst cases are when you need slightly more than your monitor's Vsync pulse interval to render each frame, or even slightly more than half that time).

Bottom line: don't use it!
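
To make the blocking concrete, here is where it happens in a classic double-buffered Windows GL loop (a sketch; update_world and render_scene are stand-ins, not the actual viewer main loop):

    #include <windows.h>

    void update_world();  // stand-in: network messages, texture decodes, rezzing...
    void render_scene();  // stand-in: issue the GL draw calls

    void frame(HDC hdc) {
        update_world();
        render_scene();
        // With vsync on (swap interval 1), SwapBuffers does not return
        // until the next vsync pulse: the whole main thread stalls here,
        // doing none of the work update_world() could be doing.
        SwapBuffers(hdc);
    }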

Edited by Henri Beauchamp
  • Like 3
  • Thanks 1

3 hours ago, Wulfie Reanimator said:

Can somebody explain why vsync is so uniquely terrible in SL viewers? (Or rather, why hasn't it been improved after all this time?)

The problems being described here don't seem to exist anywhere else -- regardless of the engine/frameworks being used.

These "problems" are well-known and exist across the board at least for openGL solutions, I've not checked whether other engines do the same. The implementation of VSync is an openGL feature not a viewer feature.

Vsync should not be confused with FPS limiting; most people are using it for exactly that, and that mistake underlies pretty much this entire conversation.

VSync, by definition, has to sync to the end/start of a frame, therefore it can only ever run at some fraction of the frequency of your monitor; you cannot have a 60Hz monitor and run at a constant 40 FPS. The game/viewer/whatever has to wait for the frame transition in order to sync, meaning it is syncing each frame, every other frame, every third frame, etc. Without that, it would not be in "sync".

G-Sync and FreeSync technologies alter this equation by adjusting the monitor refresh to the frame rate, though how well that works for SL, which typically runs in windowed form, I have no idea. Not everyone has G-Sync or FreeSync in any case.

Read the following article, for example, which suggests a number of external frame rate limiters but concludes with the statement that, in general, if you are lucky enough to have a built-in limiter, then use it: https://www.gpumag.com/fps-limiters/

Meanwhile, a short Google excursion will yield many results relating to VSync behaviour in all kinds of games.

Take this Reddit thread, for example; many similar things exist.

  • Like 1
  • Thanks 1

1 hour ago, Henri Beauchamp said:

it blocks the main thread on the GL buffer swap call when it needs to wait for the next VSync pulse

the worst cases are when you need slightly more than your monitor's Vsync pulse interval to render each frame, or even slightly more than half that time

This, specifically, was what I didn't understand.

If I'm in a scene that is practically unchanging (everything has rezzed/decoded and my camera is static), the framerate limiter can easily cut off 20 FPS if I adjust it juuust right.

For example, when I average ~51 FPS without the limiter and set it to 48 FPS, Firestorm will (as explained) often render frames at ~30 FPS instead.


After doing some more digging (thanks @Beq Janus), it really does seem like normal behavior and not just a wonky implementation. I can also see why Henri suggested triple buffering after reading this wonderful wall of text (which I would put up as recommended reading on the topic).

Edited by Wulfie Reanimator
  • Thanks 1

23 hours ago, Beq Janus said:

The frame limiter in the viewer sets a time budget for each frame. If I set it to 20 FPS then it will set my limit at 50ms. If my frame takes 30ms to draw, then it will sleep for the remaining 20ms. If a frame takes more than 50ms, it doesn't do a thing. It is a proper cap on performance and has no impact on frames below that cap.

I'm not sure what the frame limiter in the NVidia tools does, but as it cannot force the viewer to sleep, I presume it must block on the screen swap, similarly to vsync but without the syncing. If I am right in that assertion then it will be better than vsync, but the built-in limiter is arguably best as it is localised to the viewer - I could probably argue the other case if I wanted 🙂

Vsync is just a poor choice because, as we've seen, it is not a cap at a fixed rate but can cap at various lower rates, and it causes jitter.

 

Thanks for the insights, Beq. In the Nvidia panel it is called Max Frame Rate; for now I set that to 80. I've just always been kinda skittish about running a CPU or a GPU at full throttle for extended periods of time. This 3070 I have now has really good cooling, though, and the max temp I see at 100% usage on the GPU is about 70-75C. Soon it will be winter and the house will be a lot colder, so I expect it will cool down by a few degrees, and I may not even need to worry about capping the frame rate then anyway.


On 9/15/2022 at 12:51 AM, Jackson Redstar said:

Thanks for the insights, Beq. In the Nvidia panel it is called Max Frame Rate; for now I set that to 80. I've just always been kinda skittish about running a CPU or a GPU at full throttle for extended periods of time. This 3070 I have now has really good cooling, though, and the max temp I see at 100% usage on the GPU is about 70-75C. Soon it will be winter and the house will be a lot colder, so I expect it will cool down by a few degrees, and I may not even need to worry about capping the frame rate then anyway.

With energy prices as they are here in the UK, I can embrace the FPS and be warmer 🙂

  • Like 1

On 9/20/2022 at 8:00 AM, Beq Janus said:

With energy prices as they are here in the UK, I can embrace the FPS and be warmer 🙂

There's not such a lot of power used, even by a powerful desktop.

There are a lot of changes happening to electricity metering in the UK. The label to be wary of is "smart metering": you don't always have predictable timing for the different rates. Better to pay attention to keeping the heat in the house, whatever rate is charged. I have a place wired for the old Economy 7 system, with storage radiators connected through a time clock to use cheap-rate power. Only thing is, I am now getting charged the same rate on the two meters.

Not really relevant to Second Life, but it looks as though people are going to get caught out by such changes. I saw a letter to a newspaper extolling the advantages of overnight cheap-rate electricity; somebody is going to get a surprise. My computer has an idle/sleep option, which I use now. Second Life uses a lot less power than something such as video coding.

 


On 9/20/2022 at 9:00 AM, Beq Janus said:

With energy prices as they are here in the UK, I can embrace the FPS and be warmer 🙂

Well, the ”good news”, at least for wintertime (and bad news in summer), is that everything your computer sucks up from the mains power socket gets, in the end, converted into heat. So yes, it will actually contribute to the heating of your room, and that's as much power as your actual heating installation will not have to spend. 😜


5 hours ago, Henri Beauchamp said:

Well, the ”good news”, at least for wintertime (and bad news in summer), is that everything your computer sucks up from the mains power socket gets, in the end, converted into heat. So yes, it will actually contribute to the heating of your room, and that's as much power as your actual heating installation will not have to spend

I still have three PDP-11s and a MicroVAX that in an emergency can be plugged in and used as room heaters :)

Going back to the original point, though: I've played around with the new Firestorm quite a bit now and am happy with it. I haven't looked at the FPS figure (I never do); there is no sense of sluggishness as I move around, which is, to my mind, the critical measurement, albeit a subjective one.

  • Like 2
