SecondLife tech myths


You are about to reply to a thread that has been inactive for 163 days.

Please take a moment to consider if this thread is worth bumping.

Recommended Posts

15 minutes ago, filz Camino said:

I'm only talking about 2023 computers because that is all I have available on my desk to actually test.

If you'd like to include older computers in the discussion, why don't you run some of the tests I've done on your computer and get back to us to let us know how it compares?

I was just asking, now I know the scope of your assertions. Thanks!

 


3 hours ago, elleevelyn said:

i have 12th gen processor. This year's available is 13th gen. 

Surely we are at 14th generation now? https://www.intel.com/content/www/us/en/newsroom/news/intel-core-14th-gen-desktop-processors.html

I've seen tests for the i9-14900K, I think it has been available for a month or so.

But I agree - your current CPU is already very powerful; I certainly would not bother to upgrade it if I were you. But given that you like to have all the graphics settings maxed out, I guess you would get some benefit from having the very latest processor, which I gather has single-core performance about 20% faster than the processor you currently have.

For SL I would tend to suggest people get the fastest processor available but settle for a mid-range graphics card, because (unlike most games) I think the workload in SL is slanted towards the processor. I think the 4070ti that we both have is already overkill for SL, I would not expect to get much improvement with a 4090 for example. 

Here's the computer workload for the location where your frame rate dropped to 12 fps: https://gyazo.com/249092da23b6879158002fb2e130ee7e

It is the CPU that is maxed out (probably drawing all the shadows), the 4070ti GPU is only showing 34% utilisation.

I think the only real weakness of the 4070ti is its rather low VRAM. If I upgrade it in the next few years, I'd anticipate that being the reason.

 

Edited by filz Camino

6 hours ago, filz Camino said:

I've seen tests for the i9-14900K, I think it has been available for a month or so.

 

For SL I would tend to suggest people get the fastest processor available but settle for a mid-range graphics card, because (unlike most games) I think the workload in SL is slanted towards the processor. I think the 4070ti that we both have is already overkill for SL, I would not expect to get much improvement with a 4090 for example

 

i am only able to buy computer components at the computer builder shop where I live (New Zealand). The i9 14th gen CPU will be available there next year. International supply chains, yes

on the second matter, my experience is different to yours. Same scene - everything to the max. The i9 12th gen is nearly asleep

Win+G (Game Bar)

Screenshot2023-12-25011202.thumb.png.e9036e283f80db9eea00297e30d084ef.png

 

 

other people have shown similar pics on these forums at different times with their i7s and i5s.  In general SL use, the Intels rarely come under pressure

is why the standard general advice has been consistent over the years. For SL the combination to buy is Intel + Nvidia. This is not to say that AMD hasn't made big gains in recent years, is just that, as you showed, your AMD CPU works a lot harder

ps add. Just to be clear: we don't actually need an i9 to run SL. Where it becomes important is when we are also running other 3D apps simultaneously, like Blender, etc. And even then an i7 is fairly capable of doing this

 

Edited by elleevelyn
followon

29 minutes ago, elleevelyn said:

on the second matter, my experience is different to yours.

yes that is very interesting - your GPU is working almost 3 times harder than the same GPU in my computer whilst delivering a frame rate that is not as high. that does make me think that there may be some setting somewhere (either in the computer or the software) that's different between your and my setup.

 


On 12/25/2023 at 2:09 AM, filz Camino said:

yes that is very interesting - your GPU is working almost 3 times harder than the same GPU in my computer whilst delivering a frame rate that is not as high. that does make me think that there may be some setting somewhere (either in the computer or the software) that's different between your and my setup.

 

i had time to check the difference, which I think is Texture Compression. With this enabled I get 20 FPS on the hill


I don't think looking at total CPU usage makes much sense. The CPU has multiple cores; if one of them is maxed out (because that's where the work happens) and the others are idle, the total usage will appear much lower than the reality. It's rare for games to make use of 4-8+ cores at a sustained rate.

When I'm playing a game like Warframe and 5 (of 14) cores are at capacity, the reported total CPU usage is only 15-20%.

Similarly, if I go to a very active club in SL, there's two cores with a constant workload and 2-4 other cores fluctuating between high and low workloads. The total usage is about 15%, which is a misleading number when the working cores are at 50-90% most of the time.

image.png.740536dd2434d03cce1b66ca5f14f238.png
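The arithmetic behind that misleading "total" is easy to sketch. The per-core readings below are invented (not measurements from any machine in this thread), but they reproduce the effect of one saturated core on a 14-core CPU:

```python
# Toy illustration: one busy core hides inside a low "total CPU usage".
# These per-core utilisation readings are made up, not measured.
per_core = [96.0, 88.0, 12.0, 8.0, 5.0, 4.0, 3.0,
            2.0, 2.0, 1.0, 1.0, 1.0, 1.0, 0.0]   # 14 logical cores

total = sum(per_core) / len(per_core)   # what a "total usage" gauge averages down to
busiest = max(per_core)                 # what the render thread actually experiences

print(f"reported total: {total:.1f}%, busiest core: {busiest:.1f}%")
# -> reported total: 16.0%, busiest core: 96.0%
```

A "16% total" here is entirely compatible with the game being hard-limited by one core at 96%.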

And just for context, this is the scene being drawn. I'm blurring most of it for reasons you can imagine.

gyc.thumb.jpg.ea67773ec16f06464b4484d58aee2b43.jpg

There are 50 avatars within 50 meters of me (almost all of them are in view), LOD factor 4, no avatar impostors or Complexity limit, practically no draw distance limit (it's a skybox), and I'm still getting 20 FPS. Not that 20 is "good," but it could be worse, considering the number of avatars. (And I'm not talking about this to compare performance; my main point is about people making assumptions about "low CPU usage.")

Edited by Wulfie Reanimator

5 hours ago, elleevelyn said:

i had time to check the difference, which I think is Texture Compression. With this enabled I get 20 FPS on the hill

OK, perhaps that makes a difference, although I didn't enable texture compression when I tried the Linden Lab viewer, so it is possible that something on the sim has changed between your first 12 fps visit and your and my subsequent 20 fps visits.


4 hours ago, Wulfie Reanimator said:

I don't think looking at total CPU usage makes much sense.

I agree, and for the reasons you stated.

I tend to use the heuristic that many gamers use, which is just to look at GPU usage and assume that if the GPU is maxed out, then the game is GPU limited, but if it is not maxed out, then the game is probably CPU limited.

(Most games can max out the GPU but not all cores of the CPU, so if the game is CPU limited, it is likely to be limited by the single core performance rather than the full multi-core performance of the CPU).
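That rule of thumb is simple enough to write down. The 95% threshold below is an arbitrary choice for illustration, and a real diagnosis would also look at per-core CPU load:

```python
def likely_bottleneck(gpu_util_pct: float, saturated: float = 95.0) -> str:
    """Gamer's heuristic: a pegged GPU suggests a GPU limit; GPU headroom
    suggests the CPU is the limit, usually on a single core."""
    if gpu_util_pct >= saturated:
        return "GPU-limited"
    return "probably CPU-limited (check single-core load)"

print(likely_bottleneck(99.0))   # a saturated GPU
print(likely_bottleneck(34.0))   # e.g. the 34% utilisation reported earlier in the thread
```

It is only a heuristic: frame caps, vsync, and memory bandwidth can all keep the GPU below 100% without the CPU being the true limit.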

 

Edited by filz Camino

On 12/22/2023 at 10:37 PM, Coffee Pancake said:
On 12/22/2023 at 3:30 AM, filz Camino said:
  1. Second Life needs a discrete GPU, integrated graphics aren't good enough: In 2023 this is false, some modern gaming-oriented iGPUs perform as well as lower end discrete graphics cards and are certainly able to provide a good SL experience.

Oh yes it very much does need a discrete GPU, and it will need more of one as the amount of PBR content goes up.

I have no expertise, certainly not like yours. But this statement does seem very dated. Y’all seem to be fighting without educating yourselves on the current state of technology. I bought an M3 max MacBook Pro last month ( not for SL!). But I occasionally spend a little time in SL, one step below ultra. It rocks 120 fps at home alone in Elysion, 90+ with my partner. 35 to 50 in a busy sim (Siren’s with 60+ avatars on sim, 25 or so in the club). At Muddy’s with 45+ avatars and viewing the entire crowd, I have gotten it to drop to about 10 fps… but it jumps back up to 35++ when I narrow my field of view. So I feel this is strong performance and I am delighted over the dramatic improvement from my old 2019 MacBook Pro that was intel based, and did have a cheap discrete graphics card.

i am also not a gamer, but did some research before buying. YouTube is littered with reviews of Apple silicon now playing AAA games at high frame rates.

So how do you reconcile your belief that “it very much does need a discrete GPU”? And are you saying that PBR implementation will render my Mac unable to use SL with the joy I now see? (I only use Firestorm now).

Thank you in advance for your response. Honest questions… I do not want to get involved in y’all’s food fight ;)


25 minutes ago, Crash Sewell said:

I have no expertise, certainly not like yours. But this statement does seem very dated. Y’all seem to be fighting without educating yourselves on the current state of technology. I bought an M3 max MacBook Pro last month ( not for SL!). But I occasionally spend a little time in SL, one step below ultra. It rocks 120 fps at home alone in Elysion, 90+ with my partner. 35 to 50 in a busy sim (Siren’s with 60+ avatars on sim, 25 or so in the club). At Muddy’s with 45+ avatars and viewing the entire crowd, I have gotten it to drop to about 10 fps… but it jumps back up to 35++ when I narrow my field of view. So I feel this is strong performance and I am delighted over the dramatic improvement from my old 2019 MacBook Pro that was intel based, and did have a cheap discrete graphics card.

i am also not a gamer, but did some research before buying. YouTube is littered with reviews of Apple silicon now playing AAA games at high frame rates.

So how do you reconcile your belief that “it very much does need a discrete GPU”? And are you saying that PBR implementation will render my Mac unable to use SL with the joy I now see? (I only use Firestorm now).

Thank you in advance for your response. Honest questions… I do not want to get involved in y’all’s food fight ;)

The GPU in Apple's M series chips is one of the few exceptions to the rule. The others are the relatively capable Iris Xe and integrated Arc in more recent (12th gen+? I can't remember) Intel mobile (and some desktop, I think) CPUs.

On the AMD side of things, the 680M and 780M found in some more recent Ryzen mobile chips are also similarly mostly capable.

None of them are ideal; SL is particularly harsh on any integrated GPU, but it is at least possible to run with advanced lighting etc. enabled on these exceptional integrated GPUs. It wouldn't be the best solution for anyone though. In the case of the Apple Silicon GPU you are running through translation layers, with shared system/video memory (lots of 2023 machines being sold with 8GB 'unified' RAM when 16GB should be considered the minimum) and an uncertain future as far as OpenGL on the platform goes. It will work, it won't work as well as it could, and it isn't likely to work better in the future unless someone comes up with a better solution, which would be a native client and alternative renderer for Apple Silicon. This might well happen; until it does you're on shaky ground, and Apple's attitude towards backwards compatibility would make me wary.

In the case of Iris Xe and Integrated Arc the performance is just about adequate, supposedly. Same goes for Radeon 680M/780M, performance similar to a lower end 10 series Nvidia dedicated card. In each case 16GB unified RAM would be necessary.

The advice against using integrated GPUs comes from the fact that a lot of machines are still being sold with integrated graphics solutions that are simply not capable of adequate performance. This is at last changing, but Intel UHD graphics (for 3D) and older integrated Radeons are still commonly used and can still be found in laptops being sold today; the exceptions mentioned above are not yet universal.

I think for the moment it is sound advice and avoids overloading people who may not be that tech interested with the technicalities and relatively rare exceptions to the rule.

 

 

 

Edited by AmeliaJ08

Thanks Amelia for the response! 

The MacBook Pro I just bought has 48 GB of RAM and 40 graphics cores, so no problem there. I bought my newest for other reasons than SL; it's just a bonus that I am able to get (imo) stunning performance in SL. Also, I have yet to even hear my fans spin up while on SL… so it only gets warm.

If LL suddenly abandoned Mac users I would have the option of buying a cheap PC, or using their new (in alpha supposedly) mobile app. Or, just no longer using SL.

I certainly agree, buying a new Mac just for SL would be foolish, both because of price and because of uncertainties about the future.


1 hour ago, Crash Sewell said:

So how do you reconcile your belief that “it very much does need a discrete GPU”?

here's my theory: reality is a system of opposites - everything has a positive side and a negative side. so how does that apply to knowledge and expertise? 

- the good side of knowledge is obvious, knowledge and expertise is a huge asset for understanding the world and solving problems

- the bad side is less obvious. the problem with knowledge is, the more of an expert someone feels themselves to be the more sure they are that they fully understand the field and the less need they feel to listen to people around them, particularly when that person is saying something that contradicts what they already believe.

so ironically, one quality of knowledge is that it degrades our capacity to acquire new knowledge. 

i'm pretty sure that's all that is going on here. and i don't think any of us are exempt from this phenomenon; i have even observed it in myself in my own domain of expertise. i'm less likely to be open-minded in areas where i already have considerable experience.

it is a bit irritating though when the whole topic of this thread is the fact that in 2023 the situation has changed, and yet even on this thread, people are still just posting yesterday's (out of date) knowledge on auto-pilot.

Edited by filz Camino

40 minutes ago, filz Camino said:

here's my theory: reality is a system of opposites - everything has a positive side and a negative side. so how does that apply to knowledge and expertise? 

- the good side of knowledge is obvious, knowledge and expertise is a huge asset for understanding the world and solving problems

- the bad side is less obvious. the problem with knowledge is, the more of an expert someone feels themselves to be the more sure they are that they fully understand the field and the less need they feel to listen to people around them, particularly when that person is saying something that contradicts what they already believe.

so ironically, one quality of knowledge is that it degrades our capacity to acquire new knowledge. 

i'm pretty sure that's all that is going on here. and i don't think any of us are exempt from this phenomenon; i have even observed it in myself in my own domain of expertise. i'm less likely to be open-minded in areas where i already have considerable experience.

it is a bit irritating though when the whole topic of this thread is the fact that in 2023 the situation has changed, and yet even on this thread, people are still just posting yesterday's (out of date) knowledge on auto-pilot.

Is it fair to say, that both "experts" and those who have "knowledge but are not experts", can be wrong on occasion?

I read your logic above as wiggling out of ever being "wrong".

 


2 hours ago, Crash Sewell said:

Thanks Amelia for the response! 

The MacBook Pro I just bought has 48gb of ram and 40 graphics cores, so no problem there. I bought my newest for other reasons than SL. Just a bonus that I am able  to get (imo) stunning performance in SL. Also, I have yet to even hear my fans spin up while on SL… so it only gets warm.

If LL suddenly abandoned Mac users I would have the option of buying a cheap PC, or using their new (in alpha supposedly) mobile app. Or, just no longer using SL.

I certainly agree, buying a new Mac just for SL would be foolish… due both because of price, and uncertainties for the future.

That's a nice sounding MBP :) I'm sure SL will continue to run fine with it for a while to come, maybe forever if Apple decides to keep the translation layer they are using in macOS, but it does seem the intention is to phase out all OpenGL applications on the platform. If anything, SL is probably one of very few things people might want to use (on a current-day Apple computer) that is still using OpenGL.

I think the mobile viewer is pretty interesting in that it is using the Unity engine so in theory something more modern might come to desktop from it as well eventually, this could mean a native Apple Silicon viewer for example with no legacy stuff. In theory...

 

Edited by AmeliaJ08

 

1 hour ago, Love Zhaoying said:

I read your logic above as wiggling out of ever being "wrong".

i'm starting to think you are just a troll acting in bad faith, because that's not a remotely reasonable reading of anything i have ever said.

my own view of myself is that i'm often wrong. i've also noticed that i'm actually pretty good at acknowledging it when i'm proven to be wrong. 


My take on all this is:

  • Experts are people who gave themselves the title.
  • People repeat outdated first- and secondhand information as if it's eternally relevant.
  • People describe performance of hardware they don't have.
  • People describe performance of software versions they have never used.

So, looks like "people" is what all those have in common.

So, after seeing all this rubbish flying around, I'll add some more.  FRAMES mean nothing.  What's in a frame?  Stuff.  How much stuff?  Dunno.  So, "frames per second" is completely useless as a performance metric BUT is a really good comfort metric, as long as the inter-frame time is fairly consistent.

I remember, long time ago, GPU specifications had stuff like how many triangles it renders per second and what its texture fill rate is.  Second Life Viewer does report triangles per second.  Would triangles per second anywhere be useful to maybe come closer to a comparable metric than frames per second at such-and-such nightclub on December 11th at 2:26 AM wearing high-heels and a French teddy?

Oh, in case that wasn't clear, the myth is that FPS is a useful bragging metric in Second Life.
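One way to put a number on that "comfort metric" caveat is to score frame-time consistency alongside the FPS average. The two traces below are synthetic, chosen so that both average exactly 50 FPS:

```python
import statistics

def summarize(frame_times_ms):
    """Average FPS plus two consistency measures for a list of frame times (ms)."""
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    worst = max(frame_times_ms)                 # the hitch you actually feel
    jitter = statistics.pstdev(frame_times_ms)  # spread around the mean
    return avg_fps, worst, jitter

smooth = [20.0] * 50                  # a steady 20 ms every frame
spiky = [10.0] * 45 + [110.0] * 5     # mostly fast, with five 110 ms hitches

for name, trace in (("smooth", smooth), ("spiky", spiky)):
    fps, worst, jitter = summarize(trace)
    print(f"{name}: {fps:.0f} FPS average, worst frame {worst:.0f} ms, jitter {jitter:.1f} ms")
```

Both traces report "50 FPS", yet one is glassy and the other stutters badly, which is exactly why the average alone is a bragging number rather than a measurement.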

Edited by Ardy Lay

6 hours ago, Ardy Lay said:

FRAMES mean nothing.  What's in a frame?  Stuff.  How much stuff?  Dunno.  So, "frames per second" is completely useless as a performance metric BUT is a really good comfort metric, as long as the inter-frame time is fairly consistent.
 

welllll, FPS does mean something. When I have SL open in a window and another app open which I am working in, the SL window, as a background application, was set out-of-the-box to 20 FPS

which is a bit slow as I can see a skipping stutter between animation frames when my avatar is dancing. This is something I do quite often. Turn on the radio stream, put my headphones on and dance my avatar while I am working. And when a song comes on the stream that I really like then I will pause what I am doing and sing and dance along with my avatar in my chair. Music and dance helps break up my work day

looking into my NVidia graphics card settings, I saw that I can change up the FPS for background applications: Background Application Max Frame Rate

so I changed it up to max 45 FPS and now my avatar dances a lot more smoothly when running as a background application. I think the smoother look comes from 45 FPS being greater than the typical animation frame rate of 30, which most of my dances are set to
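A toy model shows why 20 FPS stutters against a 30 FPS animation while 45 FPS does not. This assumes the viewer simply displays the nearest earlier animation frame at each refresh, with no blending - a simplification for illustration:

```python
def frames_shown(display_fps, anim_fps=30, seconds=1):
    """Index of the animation frame sampled at each display refresh
    (nearest earlier frame, no interpolation - a simplifying assumption)."""
    return [int(t * anim_fps / display_fps) for t in range(display_fps * seconds)]

def skipped(display_fps, anim_fps=30):
    """Animation frames that never reach the screen at this display rate."""
    return sorted(set(range(anim_fps)) - set(frames_shown(display_fps, anim_fps)))

print("30 FPS animation frames never shown at 20 FPS:", skipped(20))  # every third frame dropped
print("30 FPS animation frames never shown at 45 FPS:", skipped(45))  # none
```

At 20 FPS a third of the animation frames are simply never drawn, which reads as a skipping stutter; at 45 FPS every animation frame appears at least once.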

 

 

 

Edited by elleevelyn

9 hours ago, Ardy Lay said:

Would triangles per second anywhere be useful

I take your point about FPS being a badly-specified measure of hardware performance, but is triangles per second (TPS) any better?

It would be easy to test - if TPS is a good specification of hardware performance, then TPS should always remain constant, regardless of the scene content. My hunch is that TPS is probably a bit like FPS in that TPS will vary somewhat depending on the specifics of the scene being displayed.

Otherwise, surely we'd already be using TPS as a hardware performance metric? (I think we all know that FPS fails to differentiate scene complexity from hardware performance; that's why we make an effort to specify the scene when quoting it.)

(And "we" here equals the entire gaming industry).
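That test could be scored like this. The per-scene readings below are placeholders, not measurements; substitute the values from the viewer's statistics bar:

```python
def variability(readings):
    """Max/min ratio across scenes. A metric that measured hardware alone
    would stay near 1.0 no matter what scene is on screen."""
    return max(readings) / min(readings)

# Placeholder readings for two scenes (busy club vs. empty skybox):
tps_by_scene = [750_000, 140_000]
fps_by_scene = [70, 190]

print(f"TPS varies by a factor of {variability(tps_by_scene):.1f}")
print(f"FPS varies by a factor of {variability(fps_by_scene):.1f}")
# Neither stays constant across scenes, so neither isolates hardware from content.
```

Whichever metric shows the smaller factor is the one that better separates hardware performance from scene content.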
 

Edited by filz Camino

I think the reason FPS is used by everyone is that it defines how fluid the game experience is. So it is meaningful in the sense that it is a good indicator of whether a game is playable or not.

TPS doesn't have to be high for a game to be fluid and playable, a game with a low triangle count and high fps might not look great, but will be perfectly playable. 

In a sense, FPS is a reasonable performance specification of the game software combined with the computer hardware. And by taking either the game software or the hardware performance as an axiom, a measurement of FPS can be used to critique the performance of the other.

E.g. low FPS in a game when using an RTX 4090 is evidence of a badly optimised game.

Whereas low FPS on a game known to run well on low end hardware is evidence of a poorly performing computer.

Edited by filz Camino

Just tested TPS, I get a far higher reading for TPS on a complex sim with lots of avatars than I do in a simple skybox, and TPS variability factor between the different scenes (5.3) actually exceeds the variability factor for FPS (2.4). So whilst neither is a perfect measurement of absolute hardware performance, FPS does seem to be a slightly more reliable measurement than TPS.

Busy club (748,625 tps, 70fps):

https://gyazo.com/6996775c3b9f37aad7e24c0fe8d73a29


Skybox (142,039 tps, 193 fps):
https://gyazo.com/5ceb9525f95a1e937a8d5ca8bf574808
 

Edited by filz Camino

On 12/30/2023 at 11:13 PM, elleevelyn said:

which is a bit slow as I can see a skipping stutter between animation frames when my avatar is dancing. This is something I do quite often. Turn on the radio stream, put my headphones on and dance my avatar while I am working. And when a song comes on the stream that I really like then I will pause what I am doing and sing and dance along with my avatar in my chair. Music and dance helps break up my work 

I only work from home two days a week now, but I thought I was the only one who did that.


16 hours ago, Janet Voxel said:

I only work from home two days a week now, but I thought I was the only one who did that.

Nope, you're not alone. I'm completely remote, and have been wandering in ISON buying clothes while in a work Zoom meeting I had to, but didn't really need to, attend. Shhhh. ;)


There are tools for finding out what's really going on.

slowframe2.png

Putting a microscope on a slow frame in Sharpview's rendering library, Rend3/WGPU.

This is a tool called Tracy. You can record several minutes of activity, and then zoom in on the log down to the nanosecond level. The line at the top shows how long each frame took. You can pan and zoom and see in detail what happened in each frame. You can build the SL viewer with this level of tracing enabled. It will slow down by a few percent.

Unless you're doing viewer development, you probably don't want this level of detail. The point here is that performance issues are not forever mysterious and un-knowable. There are ways to find out.
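Tracy itself instruments C++ (and Rust, which is how Rend3/WGPU is traced), but the core idea - nested, named timing spans recorded inside each frame - can be sketched in a few lines. This is an illustration of the concept, not Tracy's API, and the section names below are invented:

```python
import time
from contextlib import contextmanager

@contextmanager
def span(log, name):
    """Record how long a named section of the frame took, in milliseconds."""
    start = time.perf_counter()
    try:
        yield
    finally:
        log.append((name, (time.perf_counter() - start) * 1000.0))

log = []
with span(log, "frame"):
    with span(log, "shadows"):
        time.sleep(0.010)   # stand-in for shadow rendering work
    with span(log, "draw"):
        time.sleep(0.005)   # stand-in for the main draw pass

for name, ms in log:        # inner spans complete (and are logged) first
    print(f"{name:8s} {ms:6.2f} ms")
```

A real profiler does this with near-zero overhead and a timeline UI, but the principle is the same: a slow frame stops being a mystery once each section of it is timed.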

