You are about to reply to a thread that has been inactive for 94 days.

Please take a moment to consider if this thread is worth bumping.

Recommended Posts

Posted

I believe this is what the classic CPU bottleneck looks like, on the FS PBR viewer. I recently installed a 4070 Ti Super running with my i7-10700K. Not sure an i9 will help or not, but if I do go that route I'll stick with 10th gen so I can keep the same motherboard: 10850K or 10900K.
7b41762531ef054cc3ad28c521486e60.png

Posted (edited)

Does GPU utilization remain that low all of the time? I'd expect 50-80% with a PBR viewer.

I am a bit confused by that image though; a 10700K isn't 4C/8T, it's an 8C/16T processor. Is this a VM or something?

Edited by AmeliaJ08
Posted
2 hours ago, Jackson Redstar said:

Once the CPU settles down, the GPU does often get up to 80-90%. But yeah, I didn't notice that it is supposed to be 8-core; that is pretty strange.

Have a look in the BIOS settings; if this isn't a VM where you have specified this configuration, then it's likely you have some cores disabled in the BIOS.

Not that it would make any real difference in SL, but something's wrong :)
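Before digging through the BIOS, it's worth confirming what the OS actually sees. A minimal check (Python, stdlib only):

```python
import os

# Logical processor count as reported by the OS; a 10700K (8C/16T)
# should show 16. If it shows 8 or fewer, cores or Hyper-Threading are
# disabled in the BIOS, or the viewer is running in a reduced-CPU VM.
print(os.cpu_count())
```

On Windows you can cross-check against Task Manager's Performance tab, which lists cores and logical processors separately.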

Posted (edited)
2 hours ago, Jackson Redstar said:

All cores on now. This is at a sim with about 70 avatars there, though not in camera view, just scattered about. About 22 FPS. PBR viewer.
d67463616069112f245011f750cd2bc0-png.jpg

I'd expect higher utilization numbers for CPU and GPU.

I was having an issue with low utilization (under 30%) recently. I suspect mine was Thunderbolt related, though, since messing with my laptop's power profiles was the solution (I think; I did nuke and re-install the Nvidia driver at the same time, but suspect it was a power management issue). When I fixed it, GPU utilization shot up to 70-80% consistently.

Since you've got a desktop, this likely isn't the issue. I would suggest some driver experimentation; try some older ones if you can. Use DDU between driver changes to make sure each installation is clean.

Do you find the CPU/GPU are able to hit high utilization in other applications/benchmarks?

 

 

 

Edited by AmeliaJ08
Posted
53 minutes ago, AmeliaJ08 said:

I'd expect higher utilization numbers for CPU and GPU.

I was having an issue with low utilization (under 30%) recently. I suspect mine was Thunderbolt related, though, since messing with my laptop's power profiles was the solution (I think; I did nuke and re-install the Nvidia driver at the same time, but suspect it was a power management issue). When I fixed it, GPU utilization shot up to 70-80% consistently.

Since you've got a desktop, this likely isn't the issue. I would suggest some driver experimentation; try some older ones if you can. Use DDU between driver changes to make sure each installation is clean.

Do you find the CPU/GPU are able to hit high utilization in other applications/benchmarks?

 

 

 

In the Heaven benchmark, yeah, it hits 98%, and also when rendering video. It just seems like in SL it can really depend on the sim: some sims can get high GPU usage and others just don't. I just don't know the science behind it. I used DDU and only have the latest 4070 Ti driver. I just use the quick ASUS motherboard CPU overclock and a mild overclock on the GPU. I'm not a gamer at all.

Posted

Every CPU will be a bottleneck for SL, as it doesn't utilize a lot of cores very well. You can brute-force performance by just going for single-thread performance and overclocking, even disabling other cores to OC further. But that gets into territory of heavily specialized min-maxing for SL, where it becomes questionable whether it's really worth it to buy a 14900K, disable half the cores, and OC it to the moon just to get more frames in Second Life.
I did this with my 11900K before downgrading my PC heavily; I don't think it's worth it. You can get pretty close with basically anything that's similar in terms of single-core performance, and that's not always the high-end CPU in the range that's the best value for it, since something like the same generation's i5 will have a single-core rating that's pretty close.
I can think of far better ways to spend hundreds if not over a thousand dollars on SL: a few years of Premium Plus, a rather large bit of land, or basically just buying out the DJs at your favorite club every night to only play music you like.

Posted

Yeah, been doing more research and experimenting, and it never really dawned on me before, but the FPS limiter makes a perfect example. If you set it to, say, 40 FPS and you're only getting 25 FPS, then that is the limit of the CPU, not the GPU. And while having tons of VRAM helps keep things smooth, or so it seems, in SL you usually hit the CPU wall before any decent GPU does. Seems the ideal setup for SL might be an i7, 16 or 32 GB of RAM, and something like a GTX 1080 but with 12 or 16 GB of VRAM.
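The FPS-limiter observation above can be put in frame-time terms, which is how profilers usually express it. A worked example with the same illustrative numbers (40 FPS cap, 25 FPS actual):

```python
# If the limiter allows 40 FPS but you only get 25, the time is being
# spent upstream of the GPU (typically the CPU preparing draw calls).
limit_fps, actual_fps = 40, 25
budget_ms = 1000 / limit_fps   # 25.0 ms allowed per frame at the cap
actual_ms = 1000 / actual_fps  # 40.0 ms actually spent per frame
print(actual_ms - budget_ms)   # → 15.0 ms of CPU-side overrun per frame
```

Those 15 ms per frame have to come off the CPU side (faster single-core clock, fewer avatars in the draw distance) before the GPU cap matters at all.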

Posted
19 hours ago, Jackson Redstar said:

Yeah, been doing more research and experimenting, and it never really dawned on me before, but the FPS limiter makes a perfect example. If you set it to, say, 40 FPS and you're only getting 25 FPS, then that is the limit of the CPU, not the GPU. And while having tons of VRAM helps keep things smooth, or so it seems, in SL you usually hit the CPU wall before any decent GPU does. Seems the ideal setup for SL might be an i7, 16 or 32 GB of RAM, and something like a GTX 1080 but with 12 or 16 GB of VRAM.

SL's rigged mesh implementation is still done on the CPU, whereas other games use the GPU for it. If you are in an area with lots of other people with high-poly rigged mesh, this is at least part of why you will experience lower performance.
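A rough back-of-the-envelope sketch of why CPU-side rigged mesh hurts at crowded events (all numbers below are illustrative, not measured from any viewer):

```python
# Every rigged vertex must be transformed by its bone matrices on the
# CPU each frame before the result is handed to the GPU.
verts_per_avatar = 100_000   # hypothetical heavy mesh body + clothing
avatars = 40                 # a busy event
ops_per_vert = 4 * 16        # ~4 bone influences, 4x4 matrix each

per_frame = verts_per_avatar * avatars * ops_per_vert
print(f"{per_frame:,} matrix ops per frame")  # → 256,000,000 matrix ops per frame
```

Even if the real constants differ, the work scales with avatar count times mesh complexity, which matches the "depends on the sim" behavior described above.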

Posted
54 minutes ago, BriannaLovey said:

SL's rigged mesh implementation is still done on the CPU, whereas other games use the GPU for it. If you are in an area with lots of other people with high-poly rigged mesh, this is at least part of why you will experience lower performance.

I mostly video events, often with 40+ people, so this is just how it is for me. Does anybody know, other than radical overclocking, how to get the CPU to process the draw calls faster and get them to the GPU? Like any Nvidia control panel settings or Windows settings?

Posted
3 hours ago, Jackson Redstar said:

I mostly video events, often with 40+ people, so this is just how it is for me. Does anybody know, other than radical overclocking, how to get the CPU to process the draw calls faster and get them to the GPU? Like any Nvidia control panel settings or Windows settings?

That would be the holy grail setting for SL performance... unfortunately it doesn't exist, and our super-duper modern multi-core CPUs are not being used properly.

 

Posted

My partner still has her i3-7350K around, and it's an example of the shenanigans that can be enacted with single-core-dependent use cases. Around its time (i.e. the 7700K being top tier, quad-core hell), the 7350K was just an enthusiast option to play with. It let you do stupid overclocks without really worrying about toasting an expensive CPU, because it's just a dual-core i3. Intel did that every now and then with stuff like the Pentium G3258, which let people overclock with little consequence, because it's just a low-tier CPU, but multiplier-unlocked.

That CPU at stock speeds has a very similar single-core metric to the 7700K, something like 5% different, as they are both effectively the same cores with a 4.2 GHz stock boost speed; the i3 just has less cache to work with. But the i3 can overclock much harder, having only two cores to manage. She got that thing to 5.1 GHz on water, mostly stable enough to play games, and it would absolutely smoke the 7700K in titles that didn't use multiple cores well, like, at the time, CS:GO and Minecraft.

I kind of want to know how well that would've done in SL compared to other common options of the time, and whether there's ever going to be a product like that again. Imagine a theoretical i3-14150K: an unlocked, relatively low-power quad-core you could put on Z790 and push to 6 GHz. I think the reason Intel doesn't do that is, for one, the new CPUs are dying enough as is, but also that such a product wouldn't have any niche benefit like it used to. Except for SL: the market for the 14150K would be Second Life players who only need single-core performance.

 

End of ramble. I'm gonna dig that board and CPU out of her storage and see if I can replicate that; it sounds funny.

Posted (edited)

I can assure you, using a 1080 Ti or a 4080 makes a big difference.
I need to run SL in 4K though. (Lower resolution than native, without anything like DLSS, would look awful anyway.)
My old rig uses a Ryzen 7 1800X with a 1080 Ti. (Your 10700K is quite a bit faster than a 1800X.)
The newer one uses a Ryzen 9 7950X3D with a 4080 Super.

The difference is about three times the performance, while the CPU usage is minuscule (~8-12% depending on where I am) and the GPU is utilized a lot more (60-80% with an uncapped frame limit).

Personally, I find your CPU usage and GPU usage quite strange. Something you should look further into.

The 1080 Ti was nearly 100% utilized with my settings and would be a horrible choice for them, at least if you are "forced" to 4K.

When I tie SL to only one physical core, it runs awfully. Even with two. Only at four cores does SL stop stuttering and lagging and crippling the GPU too much.

I guess something different is going on with your system regarding SL, and you need to rule things out.
Reinstall everything SL-related, deleting everything, and I mean everything: the Local folder, the Roaming folder included. And try different viewers and monitor their behavior.
Exclude the viewer's files in your antivirus software, and exclude it in your firewall too just to be sure, if not done already.
SL has ongoing network traffic, and such software "overlooking" it CAN slow things down and make your performance worse.
Meaning: if your viewer "waits" for stuff, it "pauses". That, as a result, lowers the performance, if you want to call it that, because it "throttles" down.
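The "tie SL to N cores" experiment above can be reproduced in code. A minimal sketch using the stdlib (note `os.sched_setaffinity` is Linux-only; on Windows, use Task Manager's "Set affinity" on the viewer process, or a third-party library like psutil):

```python
import os

# Restrict the current process to at most four of the logical CPUs it
# already has access to, mirroring the four-core experiment above.
# Pass the viewer's PID instead of 0 to pin a different process.
if hasattr(os, "sched_setaffinity"):
    allowed = sorted(os.sched_getaffinity(0))[:4]
    os.sched_setaffinity(0, allowed)
    print(sorted(os.sched_getaffinity(0)))
```

Intersecting with the current affinity set (rather than assuming CPUs 0-3 exist) keeps this safe inside containers or cpusets that already restrict the process.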

 

PS: I wouldn't consider any Intel CPU of the 13th or 14th generation right now for an upgrade, EVEN with the microcode "fix" for their ring bus voltage "self destruct" ability...
But if you really want to, look into manual undervolting. It's not hard and is a much better workaround than that wannabe fix from Intel itself.

Edited by Feuerblau
Posted

Maybe it's time to ditch this PBR crap. I'm a little tired of seeing my experience greatly degraded for a result that is, strictly speaking, totally *****ty. You no longer control your engine, and as proof my configuration drops to less than 10 FPS (a constant 180 FPS in 4K, full details, normally) with an i9-14900K... In short, NO EXCUSE. If you want to destroy your game in record time, you are on the right track. For information: in 20 years of SL (I arrived in 2004) I have NEVER seen that. Stop the massacre. Besides, most creators get NOTHING from PBR... And it's the same whatever the viewer. React!

