Performance Comparison (FPS) Intel i3 + i5 / AMD Ryzen 5 + 7 [stock + OCed]


You are about to reply to a thread that has been inactive for 421 days.

Please take a moment to consider if this thread is worth bumping.


My goal was to find out which CPU is better for SL (without paying for a $500 CPU like the i9-9900K).
I'm aware SL doesn't use multiple cores, only a single core, so in theory it benefits from the highest clock frequency.


I finished my CPU performance comparison, FPS-wise, in SL with the same hardware/settings/viewpoints/avs/etc. for:

Ryzen 5 3600 vs. Ryzen 7 1700x [stock + overclocked]
check it out here: https://youtu.be/VDpN-w-0sts


Intel i3-9350kf vs. i5-9600k [stock + overclocked]
check it out here: https://youtu.be/my-J89BM6Nk


Conclusion below the vids.


@Lillith Hapmouche 
Not an average, but the minimum difference.
In other words: you either get the experience of smooth fps (above 24 fps, like cinema) if you go for the i5 and OC it... or not so much.

@Maryanne Solo
FYI, end of November 2019 price-wise:
i3-9350kf = €145
Ryzen 5 3600 = €191
Ryzen 7 1700x = €194
i5-9600k = €222

i7-9700k = €375
i9-9900kf = €499

So a different ballgame, at least in price.
I'll try to compare the i7 + i9 later on as well.
 


Right about now is the prime time to buy first-gen Ryzen, btw.

It got outclassed super hard by 3rd gen, so they're all dirt cheap. A Ryzen 5 1600 costs like $100 new in the US and can definitely still hold its own in any AAA game, let alone SL.
Starting with a B450 motherboard and 3600 MHz DDR4 gives you room to upgrade in the future to a 3rd-gen Ryzen 7, or, if you went for a higher-end board, a 3rd-gen Ryzen 9.

 


@cheesecurd

I don't think you watched the video(s).
Let's say a Ryzen 5 1600 performs as well as the Ryzen 5 3600 (which it won't, but still)...

You get 20 fps max. in busy sims then (if overclocked).
With the i5-9600k you get 27 fps max. in busy sims (if overclocked).
That's an fps gain of 35%.

Since we all tend to feel OK with 24 fps (like cinema) and above, but NOT with less(!), it's the difference between a smooth picture and a choppy one.

Conclusion:
Best bang for your buck, fps-wise, for SL = i5-9600k (when overclocked to 5 GHz) 😎
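The gain quoted above is simple relative-change arithmetic; here's a two-line check (illustrative only, using the fps figures from this post):

```python
# FPS figures quoted above: OC'd Ryzen 5 vs. OC'd i5-9600k in busy sims.
ryzen_fps = 20
intel_fps = 27

# Relative gain = (new - old) / old
gain_pct = (intel_fps - ryzen_fps) / ryzen_fps * 100
print(f"fps gain: {gain_pct:.0f}%")  # -> fps gain: 35%
```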

 


13 minutes ago, Bas Curtiz said:

SL wise = i5 9600k

That's the important part here. SL-wise.

A 9600k is a worse choice in literally every other way. If your PC exists solely to play SL, then it's the better choice, but if you do quite literally anything else, or value your money in the slightest, then the R5 3600 is the better choice by far, and the $100 R5 1600 is a good way to get onto that platform. Cheaper motherboards, an actual upgrade path, cheaper initial processor cost.


Also consider that SL runs like literal garbage regardless of what you play on. 


 

@cheesecurd

I think the stated topic was very clear: "My goal was to find out which CPU is better for SL."
(Not for any AAA game, 3D AutoCAD rendering, unzipping, crypto-mining, or whatever.)

Nevertheless, I can't argue with Ryzen 3rd gen being a better choice for all those other jobs (though the i5 will still have a slight advantage game-fps-wise), since those are extensively tested and documented all over the web already.
Unlike... Second Life.

And that's why I had to set up such a test comparison myself.

Common case scenario:
"Hey guys, I want to play SL with the best performance possible, without paying an arm and a leg. I check my mail and surf the web, print things out, and play SL.
Which CPU should I get?
"

...And those people finally have their long-awaited answer (& proof).


@Bas Curtiz and I do understand your point, you’re benchmarking for SL. But there’s more to a PC than just raw fps.

Nobody recommends 9th-gen Intel at the moment, with the exception of the 9900K, and especially not the 9600k. It's inferior to the Ryzen lineup in every other way.

And that’s not just fanspeak, that’s the message of dozens of tech reviewers, YouTube channels and other “professional critics”. 
If someone were to build a PC where the only thing it would do is play Second Life, and they never want to upgrade it, then sure, go for it. But nobody uses their PC for just Second Life, and anyone buying a system with any of those processors intends to upgrade at some point rather than just toss it and buy a new system.

I could not argue to anyone that 7 fps is ever worth buying into a dead-end platform with an overpriced processor.

Then there’s just that SL runs on anything, but doesn’t run _well_ on anything. All you’re seeing is slightly better single threaded performance for cpu based rendering tasks, you’d probably see very similar numbers out of a high clocked 6700k, since very little has changed since.

 


2 hours ago, Love Zhaoying said:

I hope they have decent internet, or the CPU is irrelevant..

Wrong.

See: 

 

Conclusion:
No difference in FPS when you have a slow internet connection.
So internet speed is not tied to fps; the CPU remains relevant.
Only a bit slower in rezzing the world.
 


Fair benchmarking in SL is next to impossible. There are just too many variables that can alter FPS. 

If you're looking for the "best" desktop CPU for SL, the Intel 9900K (or KS) will outperform any CPU out there. Below that, the AMD Ryzen 3000 series is the best way to go. Hopefully the Ryzen 4000 series will catch up to Intel in frequency. Laptops are a completely different story, as they sacrifice performance for energy savings.

For an excellent rundown on which CPUs and GPUs are best for gaming and/or productivity, see the following YouTube channels: Gamers Nexus (nothing but raw data from different use cases; if you're not a total geek, this channel may induce a sudden case of narcolepsy), Tech Deals (what is the best value for your budget and/or use case), and Linus Tech Tips (entertaining, and explains the technology well).

What operating system you're using also plays a major role. My beloved 2015 iMac only gets half the frame rate in macOS compared to running Windows in Boot Camp, and Linux on that iMac is 33% worse than macOS (I assume because of the AMD driver for the R9 M395X). My laptop, however, with an Nvidia GPU (RTX 2080 mobile, not Max-Q), has about a 20% performance improvement in Linux compared to Windows. I do not have any info on discrete AMD Radeon GPUs, with the exception of my iMac.


@MarissaOrloff
As my videos show, the 9900k is indeed the best, but my question was which is best for SL without paying for a $500 CPU.

I'm aware any av with high complexity at unusual values (300k+, for example) would invalidate this test.
Hence I used the Exhale club, with loads of avs in one spot but the same number of avs, where those with too much complexity/scripts/etc. (the red ones on the board) get auto-ejected after 2 minutes.

So, all in all, I could look at the numbers from the usual sources you describe above, but that wouldn't say how it would be for SL.
Hence I think I've come close to a fair benchmark for SL.

-

Great info on the iMac + Boot Camp performance degradation; that's useful for someone.

Also interesting that Linux has a 20% performance improvement on your 2080 compared to Windows.
Would love to see videos of that, like I made, since everyone else is just talking without any real evidence for SL.

 


8 minutes ago, Bas Curtiz said:

 

Also interesting that Linux has 20% performance improvement on your 2080 compared to Windows.
Would love to see videos of that, like I did, since everyone is just talking without any real evidence for SL.

 

I was a bit surprised by that as well. I only use self-compiles, and Windows uses OpenJPEG 1.4 vs. Linux with OpenJPEG 1.5.1 (release versions of Firestorm and the official LL client use KDU). Linux also has performance improvements when compiled with AVX2 extensions, though according to Ansariel (FS dev) it should actually degrade performance, and it does on Windows. Also, reading the output from nvidia-smi, Linux uses much more GPU power than Windows will. On "average" Linux will use about 38% of the GPU vs. about 15-20% of the GPU in Windows, at one notch down from Ultra. Using Ultra, however, GPU usage doubles or more. That could very well explain the performance increase as well.

However, I've found KDU doesn't really improve FPS, but it does seem a bit more stable. That's just an observation, though.

As for macOS: Apple treats OpenGL as the redheaded stepchild and uses OpenGL 2.1 (from 2005!?) for SL, as opposed to OpenGL 4.2 (on my iMac) for AMD and OpenGL 4.6 for Nvidia on Windows and Linux. I think with the next version of macOS after Catalina, OpenGL will be gone completely. Hoping (fingers crossed) LL will upgrade the client to Vulkan to avoid that nightmare.

About the videos... I can make them, but I signed up for the Windows Insider program, which has given Windows a serious performance cut. Reinstalling Windows is on the schedule but not a priority right now. I rarely use it, and with all the drivers I need to load, the development libraries and my programs, it just takes a good whole afternoon to reinstall, unlike Linux, which takes about 20-30 minutes.


10 hours ago, Bas Curtiz said:

I'm aware any av with high complexity at unusual values (300k+, for example) would invalidate this test.

Hence I used the Exhale club, with loads of avs in one spot but the same number of avs, where those with too much complexity/scripts/etc. (the red ones on the board) get auto-ejected after 2 minutes.

So, all in all, I could look at the numbers from the usual sources you describe above, but that wouldn't say how it would be for SL.
Hence I think I've come close to a fair benchmark for SL.

So in other words you converted Second Life into your 'supposed' perfect-world situation (which, technically, you didn't). This in itself will skew your benchmarks, as Second Life does not exist in a perfect-world situation. A game with completely optimised content will, but Second Life cannot. Additionally, whilst they might eject people over a certain ARC at that club, you cannot guarantee that all the people rezzed within that area (i.e. the entire sim) were at the same ARC level each time (irrespective of you changing it to load only 40 avatars). If you want your test to be valid, you would need to ensure that every one of those avatars in the club has exactly the same ARC level as well as the same textures, mesh, objects, scripts etc., and control people coming in and out so that each run is exactly the same. The same applies to every sim you went to.

Furthermore, in all situations you haven't controlled the environment, and therefore fps is affected accordingly. This is evident even in your latest videos, where you have lower-GHz CPUs at 140 fps and higher OC'd versions at half the non-overclocked CPUs' rate, due to having avatars in view in some runs and not in others.

Did you clear the cache each time, or was this a fresh login after the cache had been cleared? Did you clear the cache before each teleport to a different sim? Did you take the loading of inventory at first startup into account in your calculations? Did you take the simulator time and fps at each sim into consideration and work that into your calculations? How about turning on AO and shadows in your viewer as well? Where was your cache located? What storage device was your cache on: SSD, high-RPM HDD, normal HDD or M.2?

Active scripts in some of those sims differ across test cases, and looking at that avatar impact board at the club goes to show your test environment is skewed.

The way you tested this is as if someone benchmarking rendered a different scene in Blender each time they changed systems, or had one system play only BF5 and the other play only Minecraft, then concluded that because PC 1 rendered BF5 at 110 fps and the Minecraft system rendered at 200 fps, the Minecraft system is far superior.

As a non-OC'd gen-1 Ryzen owner, I can tell you now that the amount of time it takes your Ryzen test PCs to load textures etc. is way less than what it takes mine. My FPS with no shadows and AO on is 80+, yet somehow yours is as low as 30 at times with no shadows and no AO, at near-empty sims? Additionally, I can have shadows on and still get 50 fps in a moderately occupied sim. This is all with a ping of 280-480 and using an older RX 580 8GB compared to your Titan. In other words, something is very wrong with your tests.

10 hours ago, Bas Curtiz said:

Wrong.

See: 

Conclusion:
No difference in FPS when you have a slow internet connection.
So internet speed is not tied to fps; the CPU remains relevant.
Only a bit slower in rezzing the world.
 

Completely wrong, and you only took their statement as meaning upload/download speed. Firstly, you cannot just limit the download/upload rate and think that that is all that matters. You have to account for latency, packets in and out, etc. There is far more to it than that.

That said, if you have managed to overcome the internet tied to fps issues then Google and their Stadia team are desperately waiting for you to contact them.


 

Quote

So in other words you converted Second Life into your 'supposed' perfect-world situation (which, technically, you didn't). This in itself will skew your benchmarks, as Second Life does not exist in a perfect-world situation. A game with completely optimised content will, but Second Life cannot. Additionally, whilst they might eject people over a certain ARC at that club, you cannot guarantee that all the people rezzed within that area (i.e. the entire sim) were at the same ARC level each time (irrespective of you changing it to load only 40 avatars). If you want your test to be valid, you would need to ensure that every one of those avatars in the club has exactly the same ARC level as well as the same textures, mesh, objects, scripts etc., and control people coming in and out so that each run is exactly the same. The same applies to every sim you went to.

My perfect-world situation? I just tried to have the same situation.
Which indeed is near impossible, unless I created 40 alts and put them on a sim I bought myself.
Which, then again, wouldn't show a "real-life situation" like I've shown in the vids.

 

Quote

Furthermore, in all situations you haven't controlled the environment, and therefore fps is affected accordingly. This is evident even in your latest videos, where you have lower-GHz CPUs at 140 fps and higher OC'd versions at half the non-overclocked CPUs' rate, due to having avatars in view in some runs and not in others.

Not sure what you mean by "controlling the environment", but I bet you mean draw distance? Or derezzing other avatars?
Yep, "real-life situations" are shown: when there are avatars around, the fps will drop, hence I took a few different locations/cam POVs and re-recorded the same thing over and over to see if the results were consistent. And they are.
There's simply a difference in which CPU is used and what FPS it reaches.

 

Quote

Did you clear the cache each time, or was this a fresh login after the cache had been cleared? Did you clear the cache before each teleport to a different sim? Did you take the loading of inventory at first startup into account in your calculations? Did you take the simulator time and fps at each sim into consideration and work that into your calculations? How about turning on AO and shadows in your viewer as well? Where was your cache located? What storage device was your cache on: SSD, high-RPM HDD, normal HDD or M.2?

I didn't clear the cache, since that's tied to how fast stuff rezzes, not to the FPS rate.
All sims in my tests ran at 45 fps.
I used a WD Green SSD, which again isn't important when we're talking FPS comparison.

 

Quote

Active scripts in some of those sims differ across test cases, and looking at that avatar impact board at the club goes to show your test environment is skewed.

The way you tested this is as if someone benchmarking rendered a different scene in Blender each time they changed systems, or had one system play only BF5 and the other play only Minecraft, then concluded that because PC 1 rendered BF5 at 110 fps and the Minecraft system rendered at 200 fps, the Minecraft system is far superior.

Right, and that's why you can see in my vids the same consistent, overall higher FPS on one CPU compared to the other.
I wasn't showing Second Life vs. Minecraft, nor did I show totally different locations or viewpoints.
I showed the same set of locations and viewpoints on each system.

 

Quote

As a non-OC'd gen-1 Ryzen owner, I can tell you now that the amount of time it takes your Ryzen test PCs to load textures etc. is way less than what it takes mine. My FPS with no shadows and AO on is 80+, yet somehow yours is as low as 30 at times with no shadows and no AO, at near-empty sims? Additionally, I can have shadows on and still get 50 fps in a moderately occupied sim. This is all with a ping of 280-480 and using an older RX 580 8GB compared to your Titan. In other words, something is very wrong with your tests.

And there you're trying to compare - I don't know what your location is / what you are seeing, etc. - to mine.
Something is very wrong in that comparison ^.^

 

Quote

Completely wrong, and you only took their statement as meaning upload/download speed. Firstly, you cannot just limit the download/upload rate and think that that is all that matters. You have to account for latency, packets in and out, etc. There is far more to it than that.
That said, if you have managed to overcome the internet-tied-to-fps issues, then Google and their Stadia team are desperately waiting for you to contact them.

I can't test a really crappy internet connection, since I don't have one.
But it at least shows speed isn't the culprit; the CPU is, however.

-
Feel free to do a better comparison (with evidence) than I did 😃

We could, for instance, go inworld, same spot, same POV and same settings, to compare your rig against mine.


1 hour ago, Bas Curtiz said:

Stuff was not here.

Please don't edit/respond inside the quotes to answer; it makes it extremely difficult to reply, as the quote box comes up empty otherwise.

Point 1 - You are benchmarking CPUs to find out which one is better. You can't do this without a baseline, which you do not have. Yes, if you want to compare CPUs you have to have exactly the same baseline. So, as you said: 40 alts on your own sim, or control the environment within set parameters. That is the only way you can identify how each CPU performs against the others. If you don't have the exact same baseline, your data will be skewed.

Point 2 - Controlling the environment means making sure the baseline across all systems is the same. You are benchmarking, not testing real-world situations. Everyone knows that when an avatar enters a sim, or is on the sim, there is an FPS drop. That tells nothing about FPS performance across multiple CPUs. You say there is a difference in fps, but how did you determine this? It seems all you did was teleport to random locations on the same system, without taking anything else into consideration. If an avatar has a higher ARC, that will affect the fps and skew the data.

Point 3 - If texture loading, texture size etc. don't have any effect on your fps or CPU, then why talk about limiting avatar ARC, or limiting avatars to a max of 40 in an area like that club? What is it about an avatar being on the sim that you think reduces FPS, if texture loading or textures don't matter? If texture loading doesn't matter, why do you see an increase in fps once the textures on avatars have fully loaded?

Point 4 - No, I saw an fps reduction when there were avatars present and higher fps when there weren't. You showed the same locations, but at different times and in different conditions, i.e. avatars on the sim or loading in, etc. How can one determine which CPU is better when the base test field differs across systems? Granted, the OC'd CPUs were higher in most situations, but anyone could have told you that would be the case.

Point 5 - No, I'm not trying to compare at all. I simply said that my system, which is inferior to yours, runs Second Life far better than yours in all situations. You say location and viewpoint matter, yet fail to see how avatars rendering within those same fields do too?

Point 6 - You say that internet speed doesn't matter as far as FPS goes, yet according to Google Stadia not only does speed matter, so does latency, and that is with a standard game with no custom content. But hey, it's Google, what do they know.

Let me put it this way: my system has a non-OC gen-1 Ryzen 3.7 GHz CPU, an RX 580, 32 GB of RAM, a normal HDD for cache, and with a 250 ms ping I can run at 20-30 fps with your graphics settings in a full club sim, loading all 70 avatars on that sim. I can also run 120+ on a sandbox sim, or generally any other sim, and 40-60 fps on a mall sim, depending on where I look, due to high-res textures being present in certain areas. But apparently textures don't matter for fps, so *shrug*.

As to your offer, I have better things to do than try to work out which CPU is better for Second Life, and to be honest I don't care, as I'm with @cheesecurd. Anyone who buys a PC solely for SL and nothing else is just being silly, and to be honest, apart from very rare circumstances, that just wouldn't happen. Especially considering that content creation within SL requires all that additional high-end software.

That said, even with your skewed data, if someone buys a CPU you recommend at a higher price, one that doesn't necessarily perform better for other programs, based on a 2-7 fps difference, they are just being silly, and that is why I posted in this thread: to show how skewed your results are. If most tech reviewers are saying Ryzen is currently better across the board and cheaper, will only get better next year, and are themselves switching from Intel to Ryzen, that should say it all.


1 hour ago, Drayke Newall said:

You say that internet speed doesn't matter as far as FPS goes, yet according to Google Stadia not only does speed matter, so does latency, and that is with a standard game with no custom content. But hey, it's Google, what do they know.

Trusting Google is never a good idea; they are the same kind as Facebook, which is "totally not collecting and selling your information". Especially when it comes to selling their shiny new (and pretty terrible) thing, which will eventually be shut down like countless other services they brought out (and bought up) before.

The only way a bad connection (serious packet loss) should affect FPS in SL is "pop-ups". The viewer will get content from the CDN in bigger chunks when it can, since the normal flow is interrupted by packet loss, and this will cause some fps drops while the CPU works through those. It's much the same effect as teleporting to a new place: you get lower fps while everything rezzes at once, then it reaches somewhat stable numbers (minus busy places like popular stores and clubs, where people tp in and out non-stop).

I did use SL over an unstable mobile connection a few times (same hardware/software). Slower rezzing, grey textures, rubberbanding when moving, etc., sure, but the fps was pretty much the same.

Although, to be fair, I do remember a few cases where terrible code in some online games caused the CPU to do extra work to compensate for a too-high ping (usually over 450-500) and/or packet loss. Some kind of prediction, just in case the client doesn't receive the next packet in time. It wasn't really noticeable in most cases, but on low-end machines it was, because even without the extra tasks their CPUs were already maxed out.

I do agree with the rest, though. To benchmark different hardware, the environment should be identical. There's no real point going to a busy place and watching the fps, even if there's the same number of avis around. People will teleport in and out, and their avis will also differ a lot in triangles, vertices and VRAM usage, which can and will cause very big fps fluctuation.

And yep, the better the CPU, the better SL will run; no surprise there. Intel or AMD purely depends on your needs, budget and preferences. For pure gaming, an overclocked 9900k/9900ks (the 9700k as well, if the game/app doesn't utilize more than 8 threads, which is the majority) is still the best for the time being; top Ryzens are up there as well, and better if you do stuff like rendering or streaming. That's not considering the platform-upgrade factor, but I rather doubt that people who buy a $500 CPU are too concerned about keeping the same socket for as long as possible. For the mid/low range, Ryzen is better, cheaper and more future-proof.


1 hour ago, Drayke Newall said:

Point 6 - You say that internet speed doesn't matter as far as FPS goes, yet according to Google Stadia not only does speed matter, so does latency, and that is with a standard game with no custom content. But hey, it's Google, what do they know.

I mostly agree with you but here you're equally misinformed. The quality or bandwidth or latency of your connection has no impact on FPS, because there's absolutely no networking involved in rendering on your computer.

Stadia is a streaming service, where the rendering happens somewhere else and the resulting frames are sent to you over the internet, which is why FPS might be lowered by Stadia to reduce the size of the stream when they detect you can't handle it.

There might be some residual side effects from a slow connection because of other systems, like texture loading, but even that shouldn't be a thing because if there's little texture data coming in, there's also nothing to decode, which means no lag from loading.


On 11/24/2019 at 10:05 AM, Bas Curtiz said:

My goal was to find out which CPU is better for SL (without paying for a $500 CPU like the i9-9900K).
I'm aware SL doesn't use multiple cores, only a single core, so in theory it benefits from the highest clock frequency.

Let's correct this point...

Take it from a Linden or run System Internals yourself.

But the render engine is all in one thread :/ So CPU speed is way more important than the number of cores. IMO, memory speed is the second most important factor.

20 hours ago, Love Zhaoying said:

I hope they have decent internet, or the CPU is irrelevant..

Internet speed has almost no effect on FPS. It has a major effect on scene render time. Slow internet = grey world.

You may even see FPS go up on a slow connection, as grey renders faster than textures... but that is a brief and temporary case.

16 hours ago, MarissaOrloff said:

However, I've found KDU doesn't really improve FPS, but it does seem a bit more stable. That's just an observation, though.

As for macOS: Apple treats OpenGL as the redheaded stepchild and uses OpenGL 2.1 (from 2005!?) for SL, as opposed to OpenGL 4.2 (on my iMac) for AMD and OpenGL 4.6 for Nvidia on Windows and Linux. I think with the next version of macOS after Catalina, OpenGL will be gone completely. Hoping (fingers crossed) LL will upgrade the client to Vulkan to avoid that nightmare.

KDU provides more consistency in uploaded textures. Before Emerald/Phoenix/Firestorm added KDU, we had lots of texture render issues.

Vulkan... Oz Linden has told us a couple of times this year that a switch to Vulkan is not on the list of things to do. I take that with a grain of salt. The Lab did take on two new employees specializing in rendering; we've met them at various UG meetings. Their present job is to solve some sticky EEP problems. But after that... we can hope they will look at some upgrade to the SL render engine. I keep hoping an upgrade will get us into VR range.

7 hours ago, Drayke Newall said:

So in other words you converted Second Life into your 'supposed' perfect-world situation (which, technically, you didn't). This in itself will skew your benchmarks, as Second Life does not exist in a perfect-world situation.

True enough. SL is way too complex when it comes to getting decent benchmarks.

You left out that the viewer adjusts itself to the hardware it is running on. We think we can override those viewer-made tweaks, which I think is debatable. So not only does the render task change with location and population, but what the viewer needs to render changes based on its opinion of the computer.

So everyone approximates their benchmarks. I walk out on my front porch (in SL) and check FPS, jump up to a skybox and check again, then visit a Safe Hub to check with avatars present. But from time to time my neighbors change, and the Hub never has the same people present. Still, generalizing is good enough.


10 hours ago, Wulfie Reanimator said:

I mostly agree with you but here you're equally misinformed. The quality or bandwidth or latency of your connection has no impact on FPS, because there's absolutely no networking involved in rendering on your computer.

Stadia is a streaming service, where the rendering happens somewhere else and the resulting frames are sent to you over the internet, which is why FPS might be lowered by Stadia to reduce the size of the stream when they detect you can't handle it.

There might be some residual side effects from a slow connection because of other systems, like texture loading, but even that shouldn't be a thing because if there's little texture data coming in, there's also nothing to decode, which means no lag from loading.

I will grant you that there isn't a massive fps difference, but as Nalates has stated, grey renders quicker than a texture and can therefore have an impact on fps, which was my point, and is why I stated that speed as well as latency matters, as these all determine how quickly your textures are streamed to your PC. The textures themselves behave rather like a streaming service: until they are streamed over the network and rendered, the fps can and generally will change accordingly, depending on how far you are from the servers and how fast your internet is.

Whilst SL is not a fully fledged streaming system such as Stadia, it still does stream content to your PC, which can affect fps. Although, as Nalates states, it is a temporary effect governed by how quickly your scene renders, with faster connections and lower latency meaning you may not even notice it. Living in Australia, however, with high latency and reasonable internet, it is noticeable to some degree.

This is also why I responded to @Bas Curtiz, as it was he who said that slow internet speed does not affect fps, which it can. That said, I do agree that there is no tie between internet speed and the CPU or the system as far as fps goes.


1 hour ago, Drayke Newall said:

is why I stated speed as well as latency matters as these all constitute how quickly your textures are streamed to your PC.

Even if I grant you that grey textures are somehow bad for FPS on their own (I don't agree), latency does not affect how fast something is downloaded on the overall scale.

You can have 10000+ ping and still download things at gigabytes per second. (Theoretically speaking. It's an extreme example to bring the point across.)

Latency is a measure of the time it takes for one packet (or a round trip) to travel from A to B (or A->B->A). When you're downloading contiguous data like a texture file, latency won't affect the download speed after the first packet has arrived, as the rest will follow just as quickly regardless of distance, because they were already on the way right behind the first packet. (Similarly, the viewer won't wait for one texture to finish downloading before starting the next; it receives multiple file downloads at once, so there's no delay between textures.)

Latency is only important for things like communicating inputs. If it takes 1 second after pressing W before your avatar starts moving... that's not pleasant, but you wouldn't notice the ping just from looking at how fast textures are loading.
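A rough transfer-time model makes the distinction concrete (hypothetical numbers; the model ignores TCP slow-start and congestion, so it's a sketch, not a measurement):

```python
def transfer_time(size_bytes: float, bandwidth_bps: float, rtt_s: float) -> float:
    """Latency (one round trip) is paid once per stream; after that,
    throughput dominates for contiguous data like a texture file."""
    return rtt_s + size_bytes / (bandwidth_bps / 8)

# A 1 MB texture over a 100 Mbit/s connection:
fast = transfer_time(1_000_000, 100e6, 0.020)  # 20 ms ping
slow = transfer_time(1_000_000, 100e6, 0.500)  # 500 ms ping

# ~0.10 s vs ~0.58 s: the extra ping is a one-off cost per stream,
# not a per-byte slowdown, which is why high ping alone doesn't
# throttle a long download.
print(f"{fast:.2f}s vs {slow:.2f}s")
```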

1 hour ago, Drayke Newall said:

Whilst SL is not a fully fledged streaming system such as Stadia, it still does stream content to your PC, which can affect fps.

The act of streaming itself doesn't cause FPS issues. It's the CPU/HDD time spent decoding a finished download. Let's say that, hypothetically, you start downloading all the textures on a sim at the same time (ignoring viewer restrictions), but due to a disconnect, or to throttling your internet speed down to something like 1 Kb/s, you're not going to experience any lag from textures. Why? Because your computer can only render what's already on your computer, and rendering no textures is super easy. I guess this is a bad example, because you think the streaming itself causes problems, and a disconnect/1 Kb/s means there's no streaming. But just open some debug consoles in your viewer and watch where the slowdowns happen. It's not proportional to current network activity.

No textures ever finish downloading = No lag from decoding them into usable data = No texture lag.

And then you get disconnected from SL because 1Kb/s is not enough to sustain you.

