Mac related: Why is framerate so much worse when enabling shadows?


Moni Duettmann

You are about to reply to a thread that has been inactive for 3013 days.

Please take a moment to consider if this thread is worth bumping.


 I've got a Mac Mini with a 2.3 GHz quad-core Intel i7 processor, 8 GB RAM and an Intel HD 4000 graphics card with 1.5 GB of memory. So not bad at all. It runs super-smoothly as long as I don't enable shadows (advanced lighting model). I know that for every graphics feature you add there will be some slow-down in frame rate, but this is insane: without shadows my frame rate is five times (5x!) higher than with shadows. So from 30 fps, which is perfect, I drop to 6 fps when I enable shadows. Can it be THAT dramatic? I can't believe it! Why is that? A friend who plays SL on a Windows PC has a slower CPU and less graphics memory than I have, yet his SL runs much better. What's wrong? Is it my graphics card? Or is the Mac viewer not programmed properly to drive the graphics card? Nobody has yet been able to explain to me why there is such a tremendous difference in performance when it comes to the advanced lighting setting.



Moni Duettmann wrote:

 I've got a Mac Mini with a 2.3 GHz quad-core Intel i7 processor, 8 GB RAM and an Intel HD 4000 graphics card with 1.5 GB of memory. So not bad at all. It runs super-smoothly as long as I don't enable shadows (advanced lighting model). I know that for every graphics feature you add there will be some slow-down in frame rate, but this is insane: without shadows my frame rate is five times (5x!) higher than with shadows. So from 30 fps, which is perfect, I drop to 6 fps when I enable shadows. Can it be THAT dramatic? I can't believe it! Why is that? A friend who plays SL on a Windows PC has a slower CPU and less graphics memory than I have, yet his SL runs much better. What's wrong? Is it my graphics card? Or is the Mac viewer not programmed properly to drive the graphics card? Nobody has yet been able to explain to me why there is such a tremendous difference in performance when it comes to the advanced lighting setting.

Graphics and shadows are largely a function of the GPU itself - Intel's integrated graphics chips aren't really intended to run heavy 3D graphics, and even a mid-range dedicated Nvidia/AMD graphics card will outperform them. Is the difference still there with advanced lighting on but shadows off? (You can turn shadows on and off separately within Advanced Lighting.)

If shadows are being cast, one big factor is how many avatars are present and how they're dressed. It takes a lot of calculation to cast the shadow of an avatar wearing mesh clothing and/or a mesh body, because the shadow has to be recomputed every time the rigged mesh items change shape, which is constantly for an avatar that is being animated. Some third-party viewers have a setting that simplifies avatar shadows while leaving the rest of the shadows unchanged.


I'm trying to answer your questions as precisely as I can:

The main load seems to be the advanced lighting setting itself. When I disable shadows, the frame rate gets a little better, say from 6 to 10 fps. Still only a third of my 30 fps without "advanced".
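To put those numbers in perspective, it helps to think in frame time rather than frame rate. A quick sketch (illustrative arithmetic only, nothing measured from the viewer):

```python
# Frame rate is the reciprocal of frame time, so a 30 -> 6 fps drop
# means each frame takes five times as long to draw.
def frame_time_ms(fps):
    """Milliseconds spent drawing one frame at a given frame rate."""
    return 1000.0 / fps

without_alm = frame_time_ms(30)   # ~33.3 ms per frame
with_alm = frame_time_ms(6)       # ~166.7 ms per frame
extra = with_alm - without_alm    # ~133.3 ms of added lighting/shadow work
print(round(without_alm, 1), round(with_alm, 1), round(extra, 1))
```

Seen that way, advanced lighting isn't "losing 24 fps" - it is adding roughly 133 ms of extra work to every single frame, which the graphics chip has to absorb.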

I don't see how avatar shadows influence the frame rate. When I disable them entirely, the frame rate just stays where it was - no effect whatsoever. I'm trying to keep my frame rate tests comparable: there are no other avatars around and the landscape is quite low-prim. I always keep the same draw distance (128) to get comparable results.

When you say the graphics work is done by the CPU: that would be the best part of my computer. A quad-core i7 is more than many of my SL friends who run SL on Windows have. If you are right, then it must have something to do with the viewer software not making use of the power of the computer.

Actually, I've heard people say that the Mac version of SL isn't programmed very well. But that's beyond my tech skills. Seriously, I doubt it, because advanced lighting would be such a basic thing to build into a viewer - they would not miss it. Others say it's the graphics card model. However, the graphics cards that Linden itself recommends have less memory than mine. 1.5 GB may not be a gamer monster card, but it isn't so bad either.

So still looking for a better solution...


The problem is that you don't have a graphics card at all, just an underpowered on-board graphics chip which was not made for 3D gaming. As you are using a Mac mini, there is nothing you can do to get a better graphics chip for your machine. So either live with the restrictions of your on-board graphics, or get a computer that suits the demands of a graphics-intensive 3D application like Second Life better than your current Mac.

LL's hardware recommendations are completely out of date, by the way, and there is much more than just the GPU's memory that makes a GPU or an on-board graphics chip suitable for running a viewer.

J.


As a couple of people have told you, that HD 4000 is the problem. Those chips are made for streaming video; they are not made for graphics 'rendering'.

Streaming is about getting data from a file (local or remote) through the system and into video RAM. The only calculation is decompressing an image and maybe some anti-aliasing.

Rendering is calculation-intensive. The shape of a mesh item is worked out, then where light is blocked by it is calculated to give a shadow shape, and then the area within that shape is color-shifted, which is yet another calculation. A CPU can do those calculations, but the CPU is optimized for other purposes. Hence the need for graphics cards that are optimized for rendering 3D objects into a 2D image.
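The "where is light blocked" step is usually done with a shadow map: the scene is first rendered from the light's point of view, recording only depth, and then every visible point is checked against that map. A toy CPU-side sketch of the per-point test (a generic illustration of the technique, not the viewer's actual code):

```python
# A shadow map stores, for each direction from the light, the distance to
# the nearest surface. A point is lit only if nothing sits closer to the
# light along that direction.
def in_shadow(point_depth_from_light, nearest_depth_in_map, bias=0.01):
    """True if some other surface is closer to the light than this point.

    The small bias avoids a surface shadowing itself due to precision
    limits in the stored depth.
    """
    return point_depth_from_light > nearest_depth_in_map + bias

# A surface 5.0 units from the light, with an occluder at 3.0 recorded
# in the shadow map, is shadowed:
print(in_shadow(5.0, 3.0))   # True
# With nothing in front of it, it is lit:
print(in_shadow(5.0, 5.0))   # False
```

The GPU runs a comparison like this for every pixel on screen, every frame - and re-renders the depth map whenever anything that casts a shadow moves or changes shape.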

Read up on GPU, CPU, and APU. I think the main takeaway is that a CPU is going to have 2 to 16 cores. An integrated video chip is more about data transfer speed than about cores doing calculation; in fact the Intel HD chips depend almost entirely on the CPU cores for calculations. A video CARD can have hundreds of cores. My nVidia GTX 560 has 336 cores.

So my old quad-core CPU plus 336 video cores gives me 340 separate processing units. Your i7, if an extreme chip, will have 8 cores. So it isn't surprising that your HD 4000 quickly overloads and drops frame rates. You just do not have the calculating power I have. Not even close.
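As a back-of-the-envelope comparison (the resolution is an assumed example, and GPU and CPU cores are not interchangeable one-for-one, so treat this purely as a sense of scale):

```python
# Raw processing-unit counts from the post above:
cpu_cores = 4                      # quad-core CPU
gtx560_cores = 336                 # cores on a GTX 560
print(cpu_cores + gtx560_cores)    # 340 separate processing units

# Shadow mapping adds at least one depth comparison per screen pixel
# per frame. At an assumed 1920x1080 and the hoped-for 30 fps:
pixels_per_frame = 1920 * 1080
comparisons_per_second = pixels_per_frame * 30
print(comparisons_per_second)      # 62,208,000 comparisons per second
```

Tens of millions of extra calculations per second are trivial when spread across hundreds of dedicated cores, but a heavy load when they land on the same few CPU cores that are also simulating the rest of the scene.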

That you get as good a performance as you do is a testament to Intel's design ability.


Thanks for your detailed replies, esp. Natales. I see much more clearly now why "memory" isn't everything about a graphics chip. That was the missing information I needed to get the picture.
I would have gone for a PC instead of a Mac, for Second Life alone, if I hadn't got stuck with a blue screen every three days. ;) That Windows system is really tricky. But for what you pay, you probably get a better deal hardware-wise.


