Max Performance graphics on Firestorm


bbkeh

Recommended Posts

I am a keen Firestorm user; however, I lost some of my settings due to a laptop change.

I currently have an Alienware with an Nvidia GTX 1060.

I basically don't care about heat, I just want 30-40 FPS without lag.

Does anybody have any tips? What should I check and uncheck?

Thanks,

Bea

 


1 hour ago, bbkeh said:

I basically don't care about heat

I really hope you realize that you can, and likely will, burn out your power supply if you don't pay attention to the HEAT situation. You can also burn out your motherboard if your power supply goes. Research is your friend.


3 hours ago, Chic Aeon said:

I really hope that you realize you can and likely will burn out your power supply if you don't pay attention to the HEAT situation.  You can also burn out your motherboard if your power supply goes.  Research is your friend. 

3-year full warranty + cooling pad.

Anybody have any setting tips?

 


- View range: in a club it only needs to cover the distance to the walls; when you admire the landscape while sailing you will need more. View range has a major effect on your fps.

- Avatars: set the number of impostors. How many depends on your hardware. I personally don't render high-ARC avatars over 200k, so those avatars already fall out of the calculation. The number of impostors has a major effect on your fps - in a crowded area, that is - if you are alone ... not 😎

- Shadows: they add a lot of geometry and cost quite some fps. I reduced the shadow resolution to 0.5 - that reduces the shadow quality but increases the fps significantly. Reduced shadows are much better than no shadows. 😎

- Fog: in some scenes fog can noticeably reduce fps. If you need every fps you can get, activate a fog-free windlight.

All other settings are not worth mentioning - minimal or no effect. Tweaking 20 little things may still add up to a few fps, though. This may differ on different hardware, of course. For me, Catznip is not faster than FS (in high-load areas and under 60 fps); I see the same fps if I apply the same settings.

There is no single setting for all situations - so make a few setups and switch depending on where you are and what you are doing.
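The "make a few setups and switch" idea above can be sketched as a simple preset table. The preset names and numbers here are my own illustrative assumptions, not real Firestorm values:

```python
# Illustrative sketch of situation-based graphics presets (values invented).
PRESETS = {
    "club":    {"view_range_m": 64,  "impostors": 12, "shadow_res": 0.5},
    "sailing": {"view_range_m": 256, "impostors": 4,  "shadow_res": 0.5},
    "photo":   {"view_range_m": 128, "impostors": 20, "shadow_res": 1.0},
}

def settings_for(situation):
    # Fall back to the conservative "club" preset for unknown situations.
    return PRESETS.get(situation, PRESETS["club"])
```

In practice you would do the same thing with Firestorm's saved graphics presets rather than code, but the principle is identical: short view range where walls hide everything anyway, long view range only when you actually look at the landscape.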


You cannot utilize your GPU to the fullest. Firestorm simply doesn't offer options that are capable of using your GPU to its maximum.

Options that can ramp up GPU utilization quickly:

  • The old original high quality depth of field at maximum resolution and high blur will quickly use 100% of your GPU. (I'm unsure whether FS has the old DoF, probably not)
  • Screen Space Reflections is a pure GPU shader and will drive GPU utilization up quickly if you turn up the resolution. (Black Dragon exclusive)
  • Volumetric Lighting is a pure GPU shader, although dependent on shadows it is a purely GPU dependent effect and will also utilize your GPU a lot if you raise its resolution. (Black Dragon exclusive)

To clarify, these options do not utilize your GPU (or only a little):

  • Shadows: Most of their impact comes from them being CPU dependent, not GPU. The GPU only does a little when it comes to shadows; most of the work is done by the CPU.
  • Draw Distance: Mostly just CPU. You are essentially tossing more draw calls at your CPU, which it can't handle, and it will fall behind your already under-used GPU.
  • Avatars: Same as draw distance.
  • Fog: A pure shader effect, but it doesn't impact your FPS at all; its calculation is so fast and easy on your hardware you could essentially run it on a pocket calculator.

In short, all options that just toss "more of the same stuff" at the Viewer go to the CPU and will likely reduce your framerate while your GPU idles. Everything done post-process in a shader runs on the GPU. Settings counting into this are: Screen Space Ambient Occlusion, Depth of Field, Screen Space Reflections, Volumetric Lighting, Exodus's post processing (tone mapping, color correction), and shadow maps being translated and layered onto your world - half of which Firestorm doesn't even possess.
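The bottleneck argument above can be summed up in a toy cost model. All numbers here are made up for illustration; the point is only that frame time is set by the slower of CPU and GPU, so extra draw calls slow the frame while the GPU sits idle:

```python
# Toy frame-time model (all costs invented): the frame is only as fast as
# the slower of the CPU's draw-call work and the GPU's shader work.
def frame_time_ms(draw_calls, cpu_ms_per_call=0.01, gpu_shader_ms=5.0):
    cpu_ms = draw_calls * cpu_ms_per_call
    return max(cpu_ms, gpu_shader_ms)

def fps(draw_calls):
    return 1000.0 / frame_time_ms(draw_calls)
```

With these made-up costs, 100 draw calls leaves the GPU as the limit (200 fps), while 2000 draw calls pushes the CPU to 20 ms per frame and drops the result to 50 fps even though the GPU's 5 ms of shader work never changed.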


After a bit of sleep I'm back to explain further when shadows actually ramp up GPU usage.

In the very last part of shadow rendering, the shadow maps are translated and applied to the world; that's the part that runs on the GPU. Before it does these calculations, however, it does a simple check against the lighting. If the point in question is on the opposite side from where the sun is coming - say you pulled the sun all the way down below the terrain and the moon hasn't yet risen enough to be shown - the shader simply "early outs" with "we're facing away from sunlight, we MUST be in shadow". There's no need to calculate where the shadow is if the point in question isn't in sunlight, so we can skip the rest and assume the point is shadowed. Any and all surfaces pointing away from the sun are automatically considered shadowed, which saves a lot of processing power with high-resolution shadows and/or far shadow distances.

However, there is a super edge case: a sun angle so low that the shader doesn't yet consider everything in shadow, yet the entire world is still shadowed because everything has a shadow drawn across it. This is the post-apocalyptic scenario you never want: possibly every single point has to calculate shadows fully, check against the shadow map, and decide how shadowed it is. This scenario can often be produced by very foggy or low-sunlight, high-ambient-fog windlights, which basically eliminate any direct sunlight and leave the entire world shadowed. That makes the shadow calculation run rampant, which is why @Nova Convair here might see such a huge framerate impact with "fog".

Note that said shader runs on the GPU, so a good GPU will negate most of this. But keep in mind that this shader also does a crazy amount of extra texture lookups, which can be costly, plus comparisons against 4 shadow maps - possibly 2 at the same time (where the shadow maps overlap), or more in weird unseen instances. Combined with higher shadow resolution (in all Viewers except Black Dragon this usually comes from a higher screen resolution), this can quickly and massively inflate the number of times the calculation runs, possibly taxing the GPU to the point where it can't keep up anymore.

I thought about moving shadows from the pipeline into shaders to transfer more of the load from CPU to GPU, but I haven't looked into the pipeline shadow calculation that much. Most of it should just be calculations the shader could do too... I wonder if it would be worth moving those into shaders.
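The "early out" described above can be sketched in a few lines. This is not the viewer's actual shader code (which is GLSL), just an illustration of the control flow: surfaces facing away from the sun skip the expensive shadow-map lookup entirely.

```python
# Sketch of the shadow "early out" (not actual viewer code): a surface
# facing away from the sun is assumed fully shadowed without ever
# sampling the shadow map.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shadow_factor(surface_normal, sun_dir, sample_shadow_map):
    # Facing away from the sunlight: skip the costly shadow-map lookup.
    if dot(surface_normal, sun_dir) <= 0.0:
        return 0.0  # fully shadowed
    # Otherwise do the full, expensive shadow-map comparison.
    return sample_shadow_map()
```

The pathological windlight case is exactly the one where the first branch almost never triggers: everything still nominally faces the light, so every point falls through to the expensive `sample_shadow_map()` path.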


Getting the best performance and the best quality render is a balancing act. Personal preference is a big factor in deciding which balance is right.

Much of what has been written about optimizing for SL is old and out of date. Once upon a time, anti-aliasing (AA) was a huge factor in performance. With newer video cards like the 1060 and new ways to handle AA, it is mostly a non-issue. I can't see a difference in FPS between 2x and 16x AA on my 1060, nor do I see a perceptible visual difference. So these days I tell people to try AA at max and at 2x, to see what it does to their performance and whether they can see a difference.

I wrote Graphics Tweaking for SL some time ago (2010), when I was trying to squeeze all I could from an older computer. That article explains what the various settings do.

When I built my newer computer (2016-17), I went through the process of figuring out what is best for my current machine. In the process I came across NVIDIA's recommendations for running SL, NVIDIA 2016 SL Settings. These are a combination of Windows settings and NVIDIA setup. Part of it is setting up a gaming profile for SL, as there are none included with newer video cards.

NiranV gets into the meat of what loads the viewer. The two articles are the long story.


  • 7 months later...
On 6/21/2019 at 2:58 AM, NiranV Dean said:

After a bit of sleep i'm back to explain further when shadows actually ramp up GPU usage. [...]

I noticed this exact behavior while benchmarking fps as I browsed through various windlight settings. However, I use a GTX 780 (GPU usage maxed out), whereas my friend, who doesn't have this issue, has a GTX 1070. Would you at any point recommend the newer AMD GPUs over Nvidia, or is that a non-starter? Thanks for the valuable information. Also, would your viewer favor one product over the other due to shader specs etc.?


2 hours ago, Ormand Lionheart said:

Would you at any point recommend the newer AMD gpu's vs Nvidia? Or is that a non-starter? [...] Also would your viewer favor the specs of one product over the other due to "shaders" specs etc.?

Nvidia is always better if you are aiming for SL. AMD is infamous for bad OpenGL driver support, and OpenGL just so happens to be what SL uses.

My Viewer is no different there; generally everything will run, but Nvidia will obviously have fewer potential problems.


What I have never understood is why on some sims the GPU ramps up to 90-100% and FPS is generally pretty good, while on other sims, given the same settings and windlight, the viewer chokes on high CPU use with very little GPU load and lower FPS. I know of course it is not apples to apples, but what makes one sim run more on the GPU and another more on the CPU? I don't know if it is different servers, different content, or what. On some sims you enter and everything chokes - CPU high, FPS super low - until the scene gets more processed; then the GPU comes up along with the FPS. On other sims, right away the GPU is cooking and FPS is great. I really have no idea what is running 'under the hood'.


When your framerate craps out, try sliding the particle slider down to zero. A lot of particles you don't even know are on the sim, and they tend to be a hard hit. It seems to me that most of the time your GPU is being bottlenecked by the CPU, even though it looks like your CPU usage is low. If you look at the cores individually, one thread will be close to maxed out; the overall number you see is an average of all threads, and the rest tend to be close to zero. SL does seem to be trying to use more of the other threads, I think. I have compared my video card to people with 1080 Tis on the same sim looking at the same objects, and they tank as well. I think your weapon of choice is the best CPU you can get, but they had 8700s and still had issues depending on where they were. SL is just too inefficient due to non-professional builders. I'm sure if they knew, they would optimize or polish their skills. Some places are wicked fast. If I built a laggy sim, I think I would find out why and try to fix it.
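The point about overall CPU usage hiding a single maxed-out thread is easy to see with numbers. The per-core percentages below are invented for illustration:

```python
# One pegged render thread hiding behind a low overall average
# (per-core load percentages are made up for illustration).
def average_usage(core_loads):
    return sum(core_loads) / len(core_loads)

core_loads = [98, 5, 4, 6, 3, 5, 4, 7]  # 8 cores: one saturated, seven idle
overall = average_usage(core_loads)     # reported "CPU usage": 16.5%
bottleneck = max(core_loads)            # the core that limits frame rate: 98%
```

So a task manager showing ~16% total CPU can still mean the viewer's main thread is completely saturated, which is why the GPU ends up waiting.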


  • Moderators

Greetings!

If people wish to have a conversation about lag, including its definition, please take it somewhere else.

Hijacking a user's thread to have a discussion about another topic is not only rude and extremely immature, it is against the forum rules.

http://wiki.secondlife.com/wiki/Linden_Lab_Official:Community_Participation_Guidelines

 


7 hours ago, Dakota Linden said:

Greetings!

If people wish to have a conversation on lag, including the definition, take it somewhere else. [...]

 

Uh, what? You can't talk about graphics and performance without talking about lag (assuming that any "lag" mentioned in this topic is, of course, performance-related lag). Performance is linked to lag: the absence of performance is lag, so the moment you talk about either one, you talk about both. The OP clearly asked how to reach a certain performance threshold, which is impossible to achieve - and impossible to explain to the OP without mentioning lag and where it comes from. A simple "you can't" isn't a sufficient answer and could even be seen as a childish troll attempt.


Howdy folks, and thanks for the fountain of knowledge. I have another question re: graphics performance. Does anyone recommend a Quadro? The hardware specs seem lower than their GTX counterparts and they are excessively expensive, but from what I've read they perform very well with 3D programs. Isn't that what SL is? I've read some forums that suggest they are "jaw-droppingly fast", but never with a good reference to compare against. Does anyone have any experience? If I win the lotto I might give one a run.

thanks

Ormand


2 hours ago, Ormand Lionheart said:

Does anyone recommend a Quadro? The hardware specs seem lower than their GTX counterparts and they are excessively expensive, but from what I've read they perform very well with 3D programs.

Generally, no, I think GeForce cards are better performers and much better values for SL. Personally, I think there are two good use cases for Quadro cards in SL. First is upgrading an old office PC that can't power a GeForce card. Old Quadros will run in practically anything. The second is if someone hands you the corporate credit card and tells you to go nuts (or the lotto thing).

The Quadros I have are all older models I bought on eBay for pretty good prices, so take it for what it's worth. No RTX 8000s here. None of them are "jaw droppingly fast" compared to GeForce cards in SL. Sorry, as far as I can tell, the stories that Quadros have OpenGL secret sauce that make all your SL dreams come true are myths.


On 2/19/2020 at 7:54 PM, Lyssa Greymoon said:

Generally, no, I think GeForce cards are better performers and much better values for SL. [...]

thanks for the info. Which Quadro models do you have and which GTX models were you comparing to?


4 hours ago, Ormand Lionheart said:

thanks for the info. Which Quadro models do you have and which GTX models were you comparing to?

The Quadros I have are a K600 (don't bother), a Quadro 2000 (an underclocked GTS 450) and a K2200 (more or less a 4GB GTX 750). The GeForce cards are a GTS 250 (1GB), GTX 275, GT 1030, GTX 560 (2GB) and GTX 970 (RIP).

The K2200 is really nice and runs SL about on par with the GTX 560. That's about what I'd expect, but pretty disappointing if you're looking for "jaw-droppingly fast". I don't have a GTS 450 for a direct comparison with the Quadro 2000, so it gets to fight it out with the GTS 250, and that ends really badly for the Quadro.

 


Quadro cards are not optimized for game environments. Their main target is 3D design software, i.e. CAD applications used in industrial design, where they produce robust 3D models without errors in geometry. That's why Quadro cards are rather expensive compared to GTX cards, which are optimized for game environments. There is no advantage to using a Quadro card in SL; it's just wasted money. For the same money a Quadro costs, you can get a much faster GTX or RTX card, which are intended for gaming.

