I have poor performance, but I use a high-performance PC


nyko Piek


Recommended Posts

I have poor performance, but I use a high-performance PC. It is ridiculous that when I'm in a place with more than 10 people I see 6 fps. I lowered the graphics quality to minimum and even then the performance is very poor.
I have an 8-core AMD processor, 16 GB of RAM, and an NVIDIA GTX 1060. I have a 12 Mbps internet connection. What can I do to improve the fps?


Give us all the details on the computer and system. Open the viewer and click HELP->ABOUT... Copy and paste that information into your post. Do that with any tech question for a faster, more accurate answer.

In general, to improve performance set the Draw Distance to 128m. Depending on the region you are in, the SL environment may be overloading the render process. This is the number one cause of poor performance on high-end gaming rigs in SL.

Set the Avatar Complexity limit to 350k or lower. This setting will render 90+% of all SL avatars and engage the protection against video-crasher attachments.

Set your Max Bandwidth to 80% of your download speed or 1500, whichever is less. This tells the server how much 'update' data to throw at you. It controls ONLY the UDP protocol, which has no error correction. Lost packets are simply lost, which damages performance.
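As a quick sketch of that rule of thumb (assuming, as in the official viewer, that the Max Bandwidth slider is in kbps; this is purely illustrative, not an SL API):

```
# Rule of thumb from the paragraph above: 80% of your measured download
# speed, capped at 1500 kbps. Set the actual value in Preferences.
def recommended_max_bandwidth(download_kbps):
    return int(min(0.8 * download_kbps, 1500))

print(recommended_max_bandwidth(12_000))  # a 12 Mbps line -> 1500
print(recommended_max_bandwidth(1_000))   # a 1 Mbps line  -> 800
```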

Set Avatar Impostors to 12 or 14. This setting stops the viewer from fully rendering avatars farther away from your avatar. It reduces the render load with little if any visible change in your scene.

If you are on a laptop, make sure your system is using the video chip (the GTX 1060). The chip pulls power, and a laptop in power-saving mode will turn the chip off and render everything on the CPU.

What is the speed of the CPU? SL Viewer performance is very sensitive to processor speed. Core speed is more important than the number of cores. CPU and memory speed can kill viewer performance.

My i5 @ 4GHz with a ROG GTX 1060 hits 100+ FPS and usually runs 20 to 80 FPS in crowds.

Be sure you have an NVIDIA game profile for the viewer. See: http://blog.nalates.net/2016/06/05/nvidia-settings-2016/


If you are on Windows 10 and the poor performance issue suddenly started after a recent Windows 10 update then you may be getting bitten by this bug: https://jira.secondlife.com/browse/BUG-37795

It appears that the new Nvidia 373.06 driver may have fixed this bug for those on Nvidia systems - see the last few comments on that JIRA issue.


Not to burst your bubble, but you really have a somewhat mid-range computer rather than what is considered high end these days. SL is basically legacy software and it cannot take advantage of multiple CPU cores. AMD processors rely on many cores, each running at less computational power than Intel's. In fact, you can go into the BIOS and turn off all but two cores and SL will still run the same (but will generate less heat, an old overclocker's trick).

The Nvidia 1060 is roughly equivalent to a Nvidia 980 and is by no means a slouch card. I have a Nvidia 1080 with an Intel i7-5820 CPU and 32 GB of RAM, and I can barely get 30 FPS with 10 avatars nearby; it gets below 10 fps with much more than that, even with impostors activated. Of course I have a lot of eye candy turned on, but turning it down doesn't affect it like you might think it should. The kicker is that even though SL is struggling to do 10 fps, the resources of the GPU and CPU aren't being used. Case in point: my i7 with 6 cores was dragging at 11 fps in a crowded club with lots of mesh furry avatars, and my resource meter for the CPU was 43% on one core and 20% on another while the rest were idle; the video card was basically idling as well.
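If you want to see that one-core-pegged pattern on your own machine while the viewer is busy, a minimal sketch like this works (it assumes the third-party psutil package, which is not part of SL or any viewer):

```
# Print per-core CPU load once per second while SL is running.
# Requires: pip install psutil
import psutil

for _ in range(5):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{p:5.1f}%" for p in per_core))
    # Typically one or two cores sit high while the rest stay near idle.
```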

That's right, SL can't use the mega resources that modern computers can throw at it. Things will get a little better when SL moves to a native 64-bit viewer in the near future, but don't get too excited, as it will only allow more system memory to be utilized. SL code was first written in 1997... back when Redneck Rampage was the state of the art and Tomb Raider II was just released. It wasn't even developed by game makers; the original crew were internet wizards doing things considered impossible at the time. The game was secondary, to be honest. They have been stuck in this trap for a while, and really fixing it would break everything.

 


CPU: AMD FX-8120 Eight-Core Processor (3120.91 MHz)
Memory: 16366 MB
Operating System Version: Microsoft Windows 10 64-bit (Build 14393)
Graphics Card Vendor: NVIDIA Corporation
Graphics Card: GeForce GTX 1060 3GB/PCIe/SSE2

Windows Graphics Driver Version: 21.21.0013.7290
OpenGL Version: 4.5.0 NVIDIA 372.90

 

I'll read your answers and test the performance. I have now installed Firestorm and it runs a little better.
I'll keep trying and will let you know how it goes.
Thank you all for the support and for taking the time and care with your answers.


  • 3 years later...

To avoid a new topic I figured I would bump an old one...

 

On 10/8/2016 at 1:24 PM, Kathmandu Gilman said:


 

You mention it will get better when they move to an updated viewer and 64 bit support...

It's been 4 years, and sadly it looks like nothing has changed...

I just recently powered up SL for the first time since 2012. I have the following build:

Ryzen 9 3900x
32GB 3600 CL16
Vega 64
4K screen
SL installed onto a PCIe 4.0 NVMe FireCuda 520
100 Mbps down / 10 Mbps up on the net
Cache and bandwidth limits maxed in the viewer.
Literally little that could be better, aside from a bit of a GPU upgrade.

I am still seeing just 2 cores used by SL, and my GPU sits idle in a low power state. Decent VRAM usage, but still 3 GB free, and of the 5 GB in use, 3 GB is used by the system. System memory is mostly free...

FPS sits at 10-35 with dips into the single digits.

What gives?... How has this game survived all this time without at least getting a *touch* of optimization?... Is there a non-official viewer that actually uses a modern graphics API with proper multi-core and high-end GPU acceleration support? Can I force the client to load *all* assets for a tile on spawn before dropping in? And can I use a local dedicated server in the States to grab the files from the SL network? I have 100 Mbps down, and would rather wait 3-5 minutes to "load" the entire world in at 60-80 Mbps from a dedicated SL server and then spawn with all assets cached to the SSD proper.

Any advice?...


From the looks of the testing I have done tonight, it seems to be the advanced lighting function. I can set ultra settings, turn that off, and get a solid 45, which seems to be what the game engine is running at on my system according to the statistics viewer. It doesn't matter what combination of the advanced settings under the advanced lighting system I use; I can disable all of them *but* the advanced lighting checkmark itself.

What I don't get is why advanced lighting is an issue... It's not pushing the GPU anywhere near max, in fact rarely even into mid power states...

Without it there's even less load on the GPU...

What's going on?


So it turns out the issue persists. That fix only helped in the individual tile I was in at that time. Other tiles with more load are similarly laggy, and still without any reasonable load on the system resources...

What's the issue here? Is this application literally so outdated that it can't use more than 1 thread and more than a modern iGPU's worth of GPU power?...

I can run any modern AAA title at 4K, but I can't get a solid and consistent 30 FPS in SL? I don't even want 60. I just want a solid 30 without frame dips...

Is it not possible?


Excluding any PC-related issues (like power saving mode being on), SL should easily run at 30 FPS minimum with Advanced Lighting and even Shadows on. It does for me, and has for most of the past 8 years, even with my old FX. I can only assume it's a settings issue, and the first thing I'd dump is your attempt to run SL at 4K. 4K means you are running shadows at 4K resolution too, which requires quite a good chunk of extra CPU power that your CPU simply can't supply because SL cannot use all available cores. You are running in a limited resource pool, and 4K alone uses up a tremendous amount of that pool, not to mention any CPU-bound graphics (that should really run on the GPU only but don't) that scale with your resolution (like shadows do).
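For a rough sense of scale, here is the pixel arithmetic behind that suggestion (simple math, not a benchmark):

```
# Pixels that have to be shaded every frame at each resolution.
uhd = 3840 * 2160   # ~8.3 million pixels at 4K
fhd = 1920 * 1080   # ~2.1 million pixels at 1080p
print(uhd / fhd)    # 4.0 -- four times the pixels, and the shadow and
                    # deferred-render buffers scale with resolution too.
```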

First, lower your SL window resolution to 1920x1080.

Second, lower your draw distance to something appropriate, like ~96-128m max.

Third, make use of the "Max Avatars" and possibly the "Avatar Complexity" options; they are your biggest tools for retaining performance. Avatars are the biggest performance hitters.

From there on we can see what we can do.


6 hours ago, NiranV Dean said:


I am LUCKY on my system in FS, at a sim with 10+ avis at 1920x1080 and 'everything on', to ever get more than about 15 fps; usually it is in the 8-12 fps range. And this is with a 4th gen Core i7, a GTX 1080, and 16 GB of RAM.


1 hour ago, Jackson Redstar said:


Probably either because the SIM is already extremely bad or the avatars are so crazily bad and you just allow them all to render. Unless the SIM is total ***** I'm pretty sure I can reach 30 FPS there with a Ryzen 5 3600 and "everything on" (Deferred, Shadows, SSAO, SSR, Light Softening); you should get even more. Give me a link to the place and I'll check it out.


1 hour ago, NiranV Dean said:


This is when I video weddings, so avatar quality needs to remain up and I can't really de-render folks. And this is in Firestorm; unfortunately BD still isn't suitable for shooting weddings, much as I'd like it to be.


3 minutes ago, Jackson Redstar said:


Well, with everyone needing to be rendered there's nothing you can do. Your performance will drop to poop.


20 hours ago, MasterEddie Nemeth said:


Modern AAA titles are made by professionals who optimize (i.e. limit) them to make your hothouse flower of a gaming computer look impressive. Second Life is "made" by hundreds of people with limited knowledge making selfish decisions for their own benefit, and with no coordination with anyone else.


  • 2 weeks later...

 

 

On 4/1/2020 at 7:56 PM, MasterEddie Nemeth said:


Wrong. Lots has changed since 2012. Delivery of content has moved to HTTP and a CDN. Most viewer communication is via error-correcting HTTP, giving up more and more of the data delivery that used to go over UDP. More processes have moved into separate threads. Mesh, Fitted Mesh, Bento, and BoM have all been added. Region crossing processes have been updated. I could go on. While the viewer has been updated and enhanced for performance, it has also been asked to do more. The result is that the uninformed don't see a net gain in FPS and assume the Lindens haven't done anything. DUH!

Also, Max Bandwidth is for the SL server, not the viewer. Maxing it out is probably not ideal. The FS dev team still recommends a max of 1500. I've had people tell me they find higher values better; I suspect it depends more on the region than on the viewer and your connection.

You aren't looking closely enough. Viewers will run 20 or so threads, but average around 2.5.
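If you want to check the thread count yourself, something like this works (it assumes the third-party psutil package, and the process-name filter below is a guess; substitute whatever your viewer's executable is actually called):

```
# List thread counts for any running process that looks like an SL viewer.
# Requires: pip install psutil. The name filter is an assumption.
import psutil

for proc in psutil.process_iter(["name", "num_threads"]):
    name = (proc.info["name"] or "").lower()
    if "firestorm" in name or "secondlife" in name:
        print(proc.info["name"], "threads:", proc.info["num_threads"])
```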

That is a settings and/or location/avatar problem. Tweaking a viewer with >3,000 settings is an arcane craft. Just to complicate things, even similar computers are different enough that duplicated settings are less than optimal, which requires people to have some knowledge of 3D rendering and to experiment on their own. Fortunately, trial and error works.

You can have a great CPU and graphics card but a weak motherboard, a narrow PCIe bus, too little memory, or slow memory chips, any one of which can kill performance. What the hardware supports is another performance issue. Anti-aliasing, especially at 4K, can be a big hit.

On 4/1/2020 at 9:51 PM, MasterEddie Nemeth said:


That is what a bottleneck does. Some part of your system is holding the other parts back. You can use HWMonitor and MemHistory (free) to chase down the problem. The Windows Resource Monitor is a help too, as is the Sysinternals suite from Microsoft.

You can press Ctrl-Shift-9 (a toggle) to open Fast Timers and see where the viewer is spending its time. That can give you a clue as to what is killing performance. For me, and where I am, shadow geometry and avatars use the single biggest blocks of my render time. Click the PAUSE button and then click inside the rows; the task's name highlights in the tree to the left. You can expand large blocks to see more detail within them.

[Screenshot: Fast Timers panel]

Just remember some regions are built so poorly there is nothing you can do. To test that, jump up to 2,000m in a Linden region where there is nothing and no avatars. This will give the max FPS your system is going to generate in SL. Then visit a region like oRgAsM's, Sands, or any region where avatars are heavily scripted with loads of attachments; landing points at Safe Hubs are good too. Then visit a moderately dense, well-built place like North Yard @ Wastelands (3/17/2020): http://maps.secondlife.com/secondlife/North Yard/34/43/74

These three should give you more of an idea of where the load is coming from and which viewer and system settings to tweak.

 


