
Is it safe to wear fitted mesh yet





Madelaine McMasters wrote:

I wonder if there's a max frame rate debug setting. If there is, it should be set to the display frame rate.

I wish there was. Currently the only way to enforce it for Second Life is a setting in the Nvidia control panel (I don't use AMD cards, but there should be an equivalent setting). It can have side effects in other applications, so it's kind of a pain to enable system-wide.

It's commonly referred to as Vsync.
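For anyone curious how an application opts into this itself rather than relying on the driver panel: here's a minimal sketch of my own (assuming a GLFW/OpenGL program; this is a hypothetical illustration, not Second Life viewer code). glfwSwapInterval(1) asks the driver to hold each buffer swap until the display's next refresh, so the frame rate can never exceed the refresh rate.

// Minimal Vsync sketch, assuming GLFW/OpenGL (hypothetical example,
// not the viewer's actual code).
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(800, 600, "vsync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);
    glfwSwapInterval(1);  // wait for one vertical blank per swap: Vsync on

    while (!glfwWindowShouldClose(window)) {
        // ... render the scene here ...
        glfwSwapBuffers(window);  // with Vsync on, this blocks until the next refresh
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}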


Madelaine McMasters wrote:

I recall a couple years back that some game geek guy was bragging that his video card rendered 3000 FPS. I don't think he was interested to learn that 98% of those frames were overwritten before they could be seen.

Excellent. :D I once achieved a similar feat (shortly before crashing, don't try this at home); if you create a basic application in C++ and don't impose any limits, it will occupy 100% of GPU and CPU time. Much like building a for/while loop that never terminates, but with the lingering scent of burning components. Speed is not always best.
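In case anyone wants to see just how silly the numbers get, here's a tame, hypothetical version of that experiment (an empty loop standing in for rendering, so nothing actually burns):

#include <chrono>
#include <cstdio>

// Hypothetical illustration of an unthrottled "render" loop: with no
// vsync, sleep, or frame cap, it spins as fast as the hardware allows
// and pins a CPU core at 100%. The frames here are empty, so the count
// is absurd.
int main() {
    using clock = std::chrono::steady_clock;
    const auto start = clock::now();
    unsigned long long frames = 0;
    while (clock::now() - start < std::chrono::seconds(1)) {
        // ... draw calls would go here; with none, nothing slows us down ...
        ++frames;
    }
    std::printf("%llu 'frames' in one second\n", frames);
    return 0;
}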


There is a max frame rate debug setting; it's called MaxFPS


The latest Firestorm viewer exposes this as a slider in the graphics settings under the hardware tab if I remember correctly.

The "faster than you can see" thing is an anomaly since humans don't see in "fps".  Persistence of vision and the ability to identify a rapidly changing scene or event such as in the periphery are quite separate things.  This debate came up a while ago since it's a common idea that we only "see" around 30fps but that's just not the case.  Persistence of vision and fooling the brain into resolving static but regularly changing pictures such as with cine film does not preclude us from seeing faster events.

That said, I lock my GTX 680 to 25fps purely because, when allowed to run away, it uses around 60W of extra power, the fans speed up, and it gets noisier for little benefit. That choice of frame rate isn't based on anything else.
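For the curious, a frame cap like MaxFPS can be as simple as sleeping away whatever is left of each frame's time budget. This is only my sketch of the general technique, not the viewer's actual implementation:

#include <chrono>
#include <thread>

// Sketch of a generic frame limiter (an assumption about how a cap like
// MaxFPS might work, not the viewer's real code): if a frame finishes
// early, sleep for the rest of its budget so the CPU and GPU can idle.
int main() {
    using namespace std::chrono;
    constexpr int max_fps = 25;
    const auto frame_budget = duration_cast<steady_clock::duration>(
        duration<double>(1.0 / max_fps));

    for (;;) {  // stands in for the application's main loop
        const auto frame_start = steady_clock::now();

        // ... render the frame here ...

        const auto elapsed = steady_clock::now() - frame_start;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
        // If elapsed >= frame_budget we're already behind, so don't sleep.
    }
}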



Sassy Romano wrote:

"it's a common idea that we only "see" around 30fps but that's just not the case.  Persistence of vision and fooling the brain into resolving static but regularly changing pictures such as with cine film does not preclude us from seeing faster events."

The human visual system is a remarkable conglomeration of completely whacked out optics, sensors, and processing. Clearly the whole is greater than the sum of its parts.

I have an unusually high flicker fusion threshold (>80Hz frequency, <1% modulation). I can walk into Best Buy and sort the plasma from the LCD TVs from across the store with unfailing accuracy. I hate old-style fluorescent lamps and CRTs, especially in TVs.

Gamers who rave about 120Hz monitors are not imagining the improvement; they're seeing it. And we see the difference between 60/120/240Hz HDTV, so long as the interpolation algorithms work well. Curiously, high frame rate TV can give movies a look many (including me) don't like, called the "soap opera" effect.

There's a hell of a lot going on when we watch motion, and we're still learning about it. We're analog creatures in an increasingly digital world that's increasingly good at modeling analog.


That said, I lock my GTX 680 to 25fps purely because, when allowed to run away, it uses around 60W of extra power, the fans speed up, and it gets noisier for little benefit. That choice of frame rate isn't based on anything else.

Now that you mention this, I remember a discussion about it. Didn't someone say that MaxFPS is actually ignored in the latest release? And as I think harder, I believe I tried changing this and saw no change in power consumption.

I'd probably limit my MaxFPS like you, to save power. But I'd set it at 24fps, simply to recall memories of the old 1916 Motiograph projector on which I watched movies in my barn when young.



Sassy Romano wrote:

There is a max frame rate debug setting; it's called MaxFPS

The latest Firestorm viewer exposes this as a slider in the graphics settings under the hardware tab if I remember correctly.

The "faster than you can see" thing is an anomaly since humans don't see in "fps".  Persistence of vision and the ability to identify a rapidly changing scene or event such as in the periphery are quite separate things.  This debate came up a while ago since it's a common idea that we only "see" around 30fps but that's just not the case.  Persistence of vision and fooling the brain into resolving static but regularly changing pictures such as with cine film does not preclude us from seeing faster events.

That said, I lock my GTX 680 to 25fps purely because, when allowed to run away, it uses around 60W of extra power, the fans speed up, and it gets noisier for little benefit. That choice of frame rate isn't based on anything else.

It was about two years ago that there was a long discussion here about that. I've searched twice now, unsuccessfully, for that thread. The discussion may have been a tangent from the original subject, but it was jam-packed full of good info.

What we want to see is a matter of perception. We want to see fluid motion. Tricks such as motion blurring are employed to achieve this.

When frame rates drop (lag), what is happening is that processing all the information in an individual frame is taking longer and longer: more time than is actually allotted between frames. That's what causes the perceived jerkiness or stuttering in a scene.
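To put rough numbers on that (my own illustration): at 25 fps the budget is 1000 ms / 25 = 40 ms per frame. If a busy scene pushes processing to 50 ms, only 1000 / 50 = 20 frames fit into each second, and because some frames finish on time while others overrun, the spacing between them gets uneven. That uneven spacing is what reads as stutter.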

I've been trying to logic this out, but I simply don't know enough. Still, you may possibly be making yourself more susceptible to lag. If the servers are sending you 45 FPS and you are running at 25 FPS, then somewhere along the line data has to be tossed out.

Of course, and I'd never dispute this, if you are happy with those settings and what you see, that is the thing that counts the most.

It's a fascinating subject.


I'm replying to Maddy, but primarily because she's at the top of the list; I'm really replying to Freya and Sassy and Perrie as well. Thanks, all. I suppose this is the price I pay for having deliberately ignored anything to do with computer/digital graphics despite having been involved with things technical for many years.

Every single product I was ever connected with used only the most rudimentary graphics. Monitors were just another way to display system information, and in a great many cases that information was primarily text; still is, even with the things I work with now. I concentrated on learning how the stuff that mattered worked. The downside is that I didn't learn much about other things technical. The upside is that I was really, really good with the stuff we were making.

I will for sure lock my Nvidia frame rate down if for no other reason than to reduce power consumption.

 



Dillon Levenque wrote:

I'm replying to Maddy, but primarily because she's at the top of the list; I'm really replying to Freya and Sassy and Perrie as well. Thanks, all. I suppose this is the price I pay for having deliberately ignored anything to do with computer/digital graphics despite having been involved with things technical for many years.

Every single product I was ever connected with used only the most rudimentary graphics. Monitors were just another way to display system information, and in a great many cases that information was primarily text; still is, even with the things I work with now. I concentrated on learning how the stuff that mattered worked. The downside is that I didn't learn much about other things technical. The upside is that I was really, really good with the stuff we were making.

I will for sure lock my Nvidia frame rate down if for no other reason than to reduce power consumption.

 

I'm in a similar boat to yours, so on a lot of things I'm learning as I go.

What's really odd is I am surrounded by computer/technology degrees in my family.

I've always enjoyed using computers but it wasn't until after I started SL that I got interested in what was going on inside of them.



Madelaine McMasters wrote:

 

Now that you mention this, I remember a discussion about it. Didn't someone say that MaxFPS is actually ignored in the latest release? And as I think harder, I believe I tried changing this and saw no change in power consumption.

I'd probably limit my MaxFPS like you, to save power. But I'd set it at 24fps, simply to recall memories of the old 1916 Motiograph projector on which I watched movies in my barn when young.

I brought it up because it wasn't working in the Firestorm beta but is fully functional in the current 4.6.1 release.



Madelaine McMasters wrote:

Woo hoo! I've set MaxFPS to 24 and FSLimitFrameRate to TRUE.

I'll spend the saved watts on my electric blanket. Winter still has its teeth in my ankle.

Thanks, Sassy!

;-)

 

I had to steal some time to get inworld and try this, and saw no change. I looked and didn't see anything to toggle to apply my MaxFPS, but I see, now that I've actually read your comment, what I must have missed: turning on FSLimitFrameRate. I'll take another look tonight. I'm going with 29 to match my SL age.

