
The viewer makes my graphics card burn


chrisbreitling

You are about to reply to a thread that has been inactive for 4214 days.

Please take a moment to consider if this thread is worth bumping.

Recommended Posts

Wow, this SL Viewer is pretty intense!!! I have an NVidia GTX285 (1GB VRAM), and when I put the SL Viewer graphical settings at maximum (the slider) the card literally burns! The load indicator goes into RED (overload) and stays like that.

As soon as I move the slider one notch down (I believe to 3 on a scale of 4), it goes into purple or blue, which is heavy but OK.

My mobo/CPU also has a built-in HD4000, which in combination with Virtu MVP should take advantage of both video cards (integrated & discrete), but the load is still heavy.


The GTX 285 is an old card (four series back from the current 600 series), but it's a pretty powerful card within the 200 series, more than capable of running SL at high settings. I don't know what you mean by that "red" or "purple" stuff, but I do know that any video card (new series or old) is going to work very hard with SL set to "Ultra" (the highest settings you can choose). What's the temperature of the card when you see those "red" indicators? That is what "burns" a card: heat. Most video cards and CPUs have an upper heat threshold of around 105 C; once that temperature is reached, the component is in danger of burning up (really, physically destroying itself). The GTX 285 is powerful enough that, even at Ultra settings, it should not burn up. If it does, something else is going on (such as dust buildup on the card). Get hardware monitoring software (there are plenty of good free ones available on the Internet) and monitor your temps; that will tell you infinitely more than some "red" or "purple" indicator color. Temps in the 90 C plus range are temps to be concerned about; less than that, it's pretty much normal.
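If you'd rather not install a full monitoring suite, NVIDIA cards ship with the `nvidia-smi` command-line tool, which can report the core temperature directly. Here's a minimal sketch of reading it and classifying the result against the thresholds mentioned above; the function names and the "warm" cutoff of 70 C are my own choices, not anything from the viewer.

```python
import subprocess

DANGER_C = 90  # per the advice above: 90 C and up is cause for concern
WARM_C = 70    # assumed cutoff: heavy load, but not yet worrying

def classify_temp(temp_c):
    """Map a GPU core temperature (Celsius) to a rough status string."""
    if temp_c >= DANGER_C:
        return "danger"
    if temp_c >= WARM_C:
        return "warm"
    return "normal"

def read_gpu_temp():
    """Query the GPU core temperature via nvidia-smi (NVIDIA cards only).

    This will raise if nvidia-smi is not installed or no NVIDIA GPU
    is present, so call it only on a machine with the driver tools.
    """
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return int(out.decode().strip().splitlines()[0])

# Demo with sample readings (no GPU required):
for t in (45, 75, 95):
    print(f"{t} C -> {classify_temp(t)}")
```

Running `read_gpu_temp()` in a loop while the viewer is open will tell you far more than a colored load indicator.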

You are running a laptop whose CPU has an integrated video chipset (the HD4000). You apparently also have the discrete GTX 285 video card. Your computer is designed to auto-switch from the HD4000 onboard chipset to the discrete card when the video demand warrants it; with the more recent CPUs, that auto-switch technology works very well (it didn't used to, because it simply didn't happen reliably, and the HD4000 says your computer is a recent one). It's an either/or thing: the two video adapters do not work together. The HD4000 runs when the video demands are low, and the GTX 285 runs when the demand is high. They never work together (if they do, that might be your problem).


When you ask a tech question, include information about your computer and viewer. Get the info from the viewer's Help->About... and paste it into your post with your question; it provides all the version numbers we need. To add your info to an existing question, use OPTIONS->EDIT, in the upper right of your post.

You can get the free programs CPU-Z and GPU-Z to see what your computer is doing. I can't imagine a 285 loading up all that much with SL. My 560 only uses 30+% of the GPU with shadows on.

You can also see your GPU temperature. If the card is heating up, it may be time to open the case and clean out the heat sink. My card runs at 35 to 45 C. On some benchmarking programs I can push it up to 70+ C with the GPU hitting near 100%.

I doubt the SL viewer can make use of the HD and nVidia cards at the same time.

You may want to look through Graphics Tweaking for Second Life to see if you can improve performance.


Chris, if it's true that you're on a laptop, then you must consider that the mobile version of your GPU is much (much!) weaker than its desktop namesake. I've already destroyed a rather expensive Asus lappy due to overheating. Here are some pro tips to avoid a terrible death for your hardware:

- open the machine and vacuum it out every 3 months

- use it on a cooling pad, never on your lap or, even worse, in bed on a soft mattress

- make sure there is a steady airflow around the lappy, don't block the air ducts

- if you're keeping it on a desk, as you should anyway, make sure it's not a wooden desktop. At least put a tile under the lappy, or use the suggested cooling stand

- do not blame the problems on the viewer. Blame them on SL's best trait, the ability to render your own, non-optimized textures. We can all build stuff in SL, which is great, but it comes at the cost of a lot of amateurish builds.

 


The only thing your HD4000 chip is really useful for is streaming video, which isn't a big advantage in SL. It's poor at 3D calculation so SL is using almost entirely the GTX285. It also sounds like you have an old video card in a new motherboard - hardware and software usually work together most reliably when they're all about the same age.

The "notches" under "Preferences" are far too coarse for fine adjustment of video settings - click on the "Advanced" button and use the individual settings. In particular, turn down your draw distance and don't have the viewer calculate shadows when you're in an area with a lot of avatars.


Did you check how many fps the card generates? In some sims my 670 likes to draw 200 fps, then it gets pretty warm, slightly over 70 C. That's not a real issue, but especially in summertime I don't need a furnace next to me. I capped the fps at 60 and that seems to do the trick, it's in the NVidia settings.

You can see your fps with ctrl-shift-1. If it's over 60, I'd cap it. 60 is most likely the best your monitor can do.

If the fps are normal, you could check the thermal paste and airflow, look for dust, etc.
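The actual cap lives in the NVIDIA driver settings, as noted above, but the idea behind a frame limiter is simple: if a frame finishes faster than the frame budget allows, sleep for the remainder so the GPU idles instead of racing ahead. Here's a minimal sketch of that logic; the function name and parameters are my own, not anything from the viewer or driver.

```python
import time

def run_capped(frame_fn, fps_cap=60, frames=60):
    """Call frame_fn repeatedly, sleeping so we never exceed fps_cap.

    Returns the achieved frames-per-second over the whole run.
    """
    min_dt = 1.0 / fps_cap  # minimum time budget per frame
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        frame_fn()  # render work would happen here
        spent = time.perf_counter() - t0
        if spent < min_dt:
            # Frame finished early: idle instead of rendering another one.
            time.sleep(min_dt - spent)
    total = time.perf_counter() - start
    return frames / total

# Even a do-nothing "frame" gets held to roughly the cap:
print(f"achieved ~{run_capped(lambda: None, fps_cap=200, frames=20):.0f} fps")
```

This is why capping at 60 cools the card: a GPU drawing 200 fps is doing more than three times the work your 60 Hz monitor can ever display.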



Peggy Paperdoll wrote:

You are running a laptop whose CPU has an integrated video chipset (the HD4000). You apparently also have the discrete GTX 285 video card.

You do know that desktop CPUs can have an integrated GPU as well, right? The OP never said anything about using a laptop, and as you noticed, he's using a GTX 285, which is a monster of a card, some 10 inches long, that pulls 204 watts. Do you really think he's using that in a laptop? Jumping to conclusions is bad, m'kay.

 


Peggy Paperdoll wrote:

Your computer is designed to auto-switch from the HD4000 onboard chipset to the discrete card when the video demand warrants it; with the more recent CPUs, that auto-switch technology works very well (it didn't used to, because it simply didn't happen reliably, and the HD4000 says your computer is a recent one). It's an either/or thing: the two video adapters do not work together. The HD4000 runs when the video demands are low, and the GTX 285 runs when the demand is high. They never work together (if they do, that might be your problem).

And here I thought the OP was pretty clear about what program he was using.

 

 



Orca Flotta wrote:

 

- open the machine and vacuum it out every 3 months


Although I'm guilty of doing this myself (so far without any issues), it's highly recommended NOT to: vacuum cleaners can create static charges, potentially ruining your computer. Most of the time I use a toothbrush; if you have compressed air at your disposal, use that.


Yes, I do know that desktop CPUs can have integrated video adapters; most (if not all) Intel i-series CPUs in production today have the GPU incorporated in the CPU. However, the auto-switching function was developed for laptops in an effort to save battery life. Desktops get their electrical power directly from the commercial supply (or an independent power source) over copper wires; they do not depend on stored power from a battery. There is no advantage to such a system if you have a discrete card installed in the computer. Auto-switching should be disabled on a desktop, though I guess you could still use it if you wanted (I have an Intel i5, but it's an older-generation i-series CPU with no integrated graphics, so I can't check it for you). If the OP is using a desktop and has auto-switching enabled, then the solution just might be to turn it off; it's doing no good whatsoever, and because it's enabled it could very well be the source of the problem he/she is experiencing.

I went to that link you posted, and I'm not impressed at all. It's a sales pitch for someone selling software. I find it hard to believe that any software can sync up two very different GPU chipsets and deliver anything more than headaches. Intel and nVidia? I doubt it very seriously. If the OP is using that software, then that is a very big potential source of the problem. An nVidia GTX 285 video card is more than capable of running SL at high settings without breaking a sweat; at Ultra it will work harder, but it's still capable of delivering satisfactory performance (there will be an FPS hit, just as any card on the market today will take one).

My mid-level GTX 550 Ti runs Ultra at a 10 to 15 FPS hit. And at mostly 40 FPS without shadows, a 15 FPS hit still gives rather smooth video. I don't use Ultra because I don't like how it makes SL look (too stark and contrasty).

