
Why is Second Life so laggy now compared to the past?


Rohan Dockal

Recommended Posts

6 minutes ago, Alyona Su said:

Turning off Advanced Lighting also disables shadows. I run on Catznip at full Ultra 100% of the time (as it's the best-performing of all viewers) - but a simple switch-off of Advanced Lighting in crowded places (dance clubs, shopping events, etc.) does wonders. As for newbies... the context of most of this thread has nothing to do with them. :)

You can selectively turn off shadows though. Shadows require Advanced Lighting to be on, but Advanced Lighting doesn't require shadows to be on.

And that may be true, but it's still somewhat relevant, as newbies will often be facing it if the default viewer decides their computer is 'good enough' to set everything to high automatically. And as I said, shadows are probably the most taxing graphics option in SL.

Edited by Digit Gears

13 minutes ago, Drayke Newall said:

I'm sorry, I must say it is a pet peeve of mine that people think FPS is the same as lag. It's not. Any drop in FPS is solely due to your PC, nothing more, nothing less. If you don't have a good enough PC to handle game settings, then don't run everything on high. This includes shadows and the max number of impostors. Now, I will say one could argue that the SL graphics aren't well optimised, but that will only reduce your FPS. Yes, large textures like 4K will drop your FPS, and that is one reason not to use them in SL. Even high-end gaming systems can't handle 4K textures for everything, even in modern games.

Lag, on the other hand, in SL's case, is the speed in which the data is transferred over the internet. So @kiramanell, I respectfully disagree with you. High-res textures do cause lag, as do high-poly meshes, multiple textures, etc., as these all need to be transferred to your computer from LL's servers. In the case of Second Life, lag would be defined as exactly what you say it isn't: blurry textures.

 

I respectfully disagree. I'm sorry, but I must say it's a bit of a pet peeve of mine that people think any type of slowdown is the same as lag. It isn't. And no, lag is NOT 'the speed in which the data is transferred over the internet' (although superficially it may look like it is). What matters is how fast relevant data can be transferred, like positional data (of residents, [moving] objects, etc.). The moment you're walking and are basically out of sync with the fixed 24/sec framerate regions run at (essentially skipping frames), then, and only then, are you experiencing true lag. Or when dragging an object and it's suddenly a few meters ahead; that's lag, as it means your viewer client wasn't able to catch all position packets (UDP?) of the object in time, or vice versa, causing a type of rubber-banding.

A slowly loading texture is not lag. After all, every texture takes time to load -- sometimes it just goes faster than other times (texture size being the main cause, of course). By your definition, everything that takes time to load would be lag. Well, it isn't. Nor, therefore, is a blurry texture a sign of lag. There are essentially two reasons a texture is blurred: it may not have fully loaded yet; or (and I feel you didn't quite grasp my above point on this one) it's getting a 'bias' from the viewer, basically because you don't have enough texture memory to support your entire surroundings (within your draw distance). The latter is pretty annoying. When Firestorm only offered 1 GB of texture memory, many textures in my scenes would continually shift in and out of blurriness because of it, to the point where I even considered leaving SL. Now, with 2 GB, even the most materials-enabled scenes (potentially tripling the texture memory requirements) stay nicely in memory.
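
To make that 'bias' idea concrete, here's a rough, purely illustrative sketch (made-up numbers and function names, not actual viewer code) of how a fixed texture-memory budget forces the viewer to step textures down in resolution until a scene fits:

    # Purely illustrative: not viewer code. Shows how a texture-memory budget
    # can force a resolution "bias" (the blurriness described above).
    def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
        """Approximate memory for one uncompressed texture; mip chain adds ~1/3."""
        base = width * height * bytes_per_texel
        return int(base * 4 / 3) if mipmaps else base

    def required_bias(textures, budget_bytes, max_bias=4):
        """Halve every texture's resolution (bias += 1) until the scene fits."""
        bias = 0
        while True:
            total = sum(texture_bytes(w >> bias, h >> bias) for w, h in textures)
            if total <= budget_bytes or bias >= max_bias:
                return bias, total
            bias += 1

    # A scene full of 1024x1024 bakes, checked against a 1 GB and a 2 GB budget.
    scene = [(1024, 1024)] * 900
    for budget_gb in (1, 2):
        bias, total = required_bias(scene, budget_gb * 1024**3)
        print(f"{budget_gb} GB budget -> bias {bias}, ~{total / 1024**2:.0f} MB resident")

With the smaller budget the same scene ends up two bias steps down (noticeably blurrier); double the budget and it only needs one.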


1 hour ago, kiramanell said:

I respectfully disagree. I'm sorry, but I must say it's a bit of a pet peeve of mine that people think any type of slowdown is the same as lag. It isn't. And no, lag is NOT 'the speed in which the data is transferred over the internet' (although superficially it may look like it is). What matters is how fast relevant data can be transferred, like positional data (of residents, [moving] objects, etc.). The moment you're walking and are basically out of sync with the fixed 24/sec framerate regions run at (essentially skipping frames), then, and only then, are you experiencing true lag. Or when dragging an object and it's suddenly a few meters ahead; that's lag, as it means your viewer client wasn't able to catch all position packets (UDP?) of the object in time, or vice versa, causing a type of rubber-banding.

A slowly loading texture is not lag. After all, every texture takes time to load -- sometimes it just goes faster than other times (texture size being the main cause, of course). By your definition, everything that takes time to load would be lag. Well, it isn't. Nor, therefore, is a blurry texture a sign of lag. There are essentially two reasons a texture is blurred: it may not have fully loaded yet; or (and I feel you didn't quite grasp my above point on this one) it's getting a 'bias' from the viewer, basically because you don't have enough texture memory to support your entire surroundings (within your draw distance). The latter is pretty annoying. When Firestorm only offered 1 GB of texture memory, many textures in my scenes would continually shift in and out of blurriness because of it, to the point where I even considered leaving SL. Now, with 2 GB, even the most materials-enabled scenes (potentially tripling the texture memory requirements) stay nicely in memory.

Agree to disagree. Lag is anything that causes a delay between the server and the player. You are using lag in the sense of a typical game, which SL is not. All textures are downloaded from the servers to your computer, causing a delay. A normal FPS, MMO, etc. won't do this, as all assets are already on your computer; hence the lag seen there is solely positional, similar to the lag in SL when crossing a region, though that is server-side as well as client-side.

The way you talk about slow-loading textures is as if they are already on your machine. If they were already on your machine, I would agree that a slow-loading texture isn't lag, just your computer not having the memory. But when a high-res texture has to transfer to your machine before your machine can even process it, that is lag (the server says "texture A needs to go in this position", and your computer then has to download that texture and place it in that location). The delay between your computer and the server is lag.

EDIT: This is also what they are currently trying to solve as far as cloud gaming goes. In a multiplayer scenario with a light client similar to SL, the user would have to download everything live. This causes issues, as a person closer to the servers can download it faster and therefore process the textures, video and positioning quicker than someone a long way away, giving them the advantage. It is also why, currently, only single-player games have worked well with cloud gaming, or games where all data is downloaded first prior to accessing the cloud game, which defeats the purpose.

Conversely, the time it takes for the server to process and send you the image may mean that by the time the image is sent through and rendered, you are already dead; not because of positioning or input lag, but purely because the image from the server didn't render in time on your PC and therefore you didn't see the enemy.

As far as your issue with 1 GB goes, I never had that issue even on the default viewer when the limit was 512 MB. Once a texture was loaded from my cache, it always stayed sharp and never shifted in and out of blurriness, unless there was texture thrashing or my cache size wasn't large enough to keep the textures in there. But then I also didn't have my draw distance at 256 m, the equivalent of loading three extra sims if you're standing at the corner junction point where they all meet.

Edited by Drayke Newall

In my experience, SL should be at least okay on anything more than the cheapest of laptops with the most potatoey of potato GPUs. They probably weren't hacking it in '09 either. If a Core i7 with a GPU meant to run 3D graphics is giving you 2 fps staring at the floor, I'd suspect there's something else going wrong. Either it's throttling spectacularly badly or something is chewing up all the CPU cycles. (I'm looking at you, Microsoft antimalware agent.)


1 hour ago, Digit Gears said:

You can selectively turn off shadows though. Shadows require Advanced Lighting to be on, but Advanced Lighting doesn't require shadows to be on.

And that may be true, but it's still somewhat relevant, as newbies will often be facing it if the default viewer decides their computer is 'good enough' to set everything to high automatically. And as I said, shadows are probably the most taxing graphics option in SL.

Yes, we know this. You are apparently misunderstanding the context of the comments, so I'll just leave things as they are for now.


On 2/25/2019 at 8:20 AM, Alyona Su said:

My standard response to people with complaints about SL performance is always the same: Firestorm ain't all that when it comes down to it. You need to set your own priorities. If it's performance, then do something about it. If it's so-called "features" (that you likely never use most of) then so be it - pick one and stop bi**hing about the other.

I'm just saying that complaining about the same things over and over and doing nothing about it is, well... 

I'll leave it at that.

No offense meant here Alyona, but I just don't get this attitude.

So we are supposed to stop using what we like to use, or shut up about our hopes of said things ever improving, and not give our opinions? "They are already using our product, so who cares what they think"?

Most of the bi**hing that goes on in the discussion of a product eventually leads to improvement in that product. It's called feedback.

Oh and Firestorm ain't all that...according to....You?

Nobody likes Firestorm for the full slate of features. We like it for the features we want, which vary from user to user. One might like feature A, one might like feature B. Kudos to Firestorm for offering both, even if I might never look at either one, because perhaps I like feature C. See how that works?

One last thing - performance issues occur for different people for different reasons and in different ways. The official viewer gives performance issues just as Firestorm does.


1 hour ago, Adam Spark said:

So we are supposed to stop using what we like to use, or shut up about our hopes of said things ever improving, and not give our opinions?

Context is everything, sir. No, this is not what is being asked. You are being asked to choose your own priority because you cannot have your cake and eat it, too, it is a technological hurdle that must be crossed and that threshold is pretty high ($5,000 U.S. Computer? You're over the hurdle. $350 Computer? Pick one or the other priority, you cannot have both.)

SO; Pick your priority: performance or features - performance equals lightweight, small nimble footprint, whereas feature-priority equals creep toward "heavy bloat". Iceberg ahead! Which has the better chance, a four-seater motorboat or the Titanic? Oh, that one is answered already, never-mind. BUT, I'll bet they were all griping about the steering performance since they wanted to go *fastest*, right? LOL 

The point being: if UR biotching about performance, then solve your own problem and use a better-performing viewer. If you refuse because you demand the features (50% of which most people never use) then quicherbichen.

Edited by Alyona Su

3 hours ago, Alyona Su said:

Context is everything, sir. No, this is not what is being asked. You are being asked to choose your own priority because you cannot have your cake and eat it, too, it is a technological hurdle that must be crossed and that threshold is pretty high ($5,000 U.S. Computer? You're over the hurdle. $350 Computer? Pick one or the other priority, you cannot have both.)

SO; Pick your priority: performance or features - performance equals lightweight, small nimble footprint, whereas feature-priority equals creep toward "heavy bloat". Iceberg ahead! Which has the better chance, a four-seater motorboat or the Titanic? Oh, that one is answered already, never-mind. BUT, I'll bet they were all griping about the steering performance since they wanted to go *fastest*, right? LOL 

The point being: if UR biotching about performance, then solve your own problem and use a better-performing viewer. If you refuse because you demand the features (50% of which most people never use) then quicherbichen.

Never ask for an improvement to a product you choose, because said improvement is in a product you otherwise don't care for.

Got it.

Remind me not to give feedback or complain to XBox if my suggestion is for something that exists in Playstation. I guess I should just switch consoles.

Edited by Adam Spark

9 hours ago, kiramanell said:

 

I respectfully disagree.

Textures themselves don't cause lag. Lack of memory, however, does (personally, I have a GTX 1080 Ti with 11 GB of memory, and 32 GB of machine RAM; so, not an issue for me). The only thing that really matters about a texture is its size. Too many big textures won't cause lag either, though (even though loading times may slightly increase). It will result in blurring, though (aka, the viewer will either apply a bias, to make it use less video memory, or it will simply throw out the texture altogether, which leads to continual loading of textures whenever you walk to another area). Strangely enough, video memory usage appears to be capped at a max of 2 GB (which is really kind of odd, as every modern graphics card now starts at 4 GB). But with 2 GB, even with all-materials-enabled scenes, you can still manage comfortably in SL.

Also, textures with baked-in shadows are actually highly preferred over enabling shadows, performance-wise. In fact, the single most dramatic drain on your FPS will be your shadow quality setting. Try setting it to 4.0, and see what happens. :) Real shadows are way prettier, of course; and, indeed, you get weird effects when a baked shadow starts to interfere, visually, with the way the shadows are actually cast by the render engine. But shadows are a performance killer, really. Still, a drop in FPS is not necessarily the same as lag: on a busy sim, your FPS may drop drastically, for sure, because of lag; but with too much graphics candy enabled, your video card may simply have to work too hard.

In GTA V, for instance, if you examine a typical ymap, you'll see they're using some form of faux ambient occlusion:

      <artificialAmbientOcclusion value="255"/>

That's clever. After all, ambient occlusion tends to be pretty uniform. Like where a wall meets the floor (and vice versa), you know you'll get a rather standard gradient from light to darker. Adding faux ambient occlusion like that to SL would already all but eliminate the need for baked textures, as the absence of ambient occlusion is really what makes older builds look so 'off.'
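
For what it's worth, the effect of such a value is easy to sketch. This is only my guess at the general idea in plain Python, not GTA's or SL's actual shading code: a single 0-255 'artificial AO' factor damps the ambient term only, so you get a cheap corner-darkening effect without SSAO or baked textures:

    # Hedged sketch of the general idea, not any engine's real shader:
    # a per-surface 0-255 "artificial AO" value scales only the ambient light.
    def shade(albedo, ambient, direct, artificial_ao=255):
        """albedo/ambient/direct are (r, g, b) tuples in 0..1."""
        ao = artificial_ao / 255.0  # 1.0 = unoccluded, 0.0 = fully occluded
        return tuple(a * (amb * ao + d) for a, amb, d in zip(albedo, ambient, direct))

    wall = (0.8, 0.8, 0.75)
    print(shade(wall, (0.3, 0.3, 0.35), (0.6, 0.6, 0.55)))                     # open wall
    print(shade(wall, (0.3, 0.3, 0.35), (0.6, 0.6, 0.55), artificial_ao=128))  # near the floor junction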

Lightmapping in video games is NOTHING LIKE BAKING SHADOWS INTO TEXTURES. Lightmapping is way more efficient and doesn't use nearly as much memory because the textures themselves are not baked, only the light data, and it's done at a very low resolution.

Low-end computers can't use realtime shadows because they usually don't have a good GPU, but if they don't have a good GPU they also don't have a lot of RAM or VRAM, so burning shadows into textures solves absolutely nothing; it just makes downloads longer, makes travel impossible, and overwhelms the viewer cache.

Implying viewer performance is purely a "personal problem" is neither intelligent nor cool.


1 hour ago, Kyrah Abattoir said:

Lightmapping in video games is NOTHING LIKE BAKING SHADOWS INTO TEXTURES. Lightmapping is way more efficient and doesn't use nearly as much memory because the textures themselves are not baked, only the light data, and it's done at a very low resolution.

 

No, that's never true, of course. Pre-baking shadows into textures may be quite CPU/GPU intensive, but it's done on people's own home PCs (aka, from within their copy of Blender). Once a beautiful baked home is imported into SL, the extra impact of those baked shadows is literally zero. After all, a texture with a baked shadow on it is just like any other texture: just an image. So whatever lightmapping you do in-world, or real shadows, or ALM, will always cost more resources than just a texture alone.

 

Quote

Low-end computers can't use realtime shadows because they usually don't have a good GPU, but if they don't have a good GPU they also don't have a lot of RAM or VRAM, so burning shadows into textures solves absolutely nothing; it just makes downloads longer, makes travel impossible, and overwhelms the viewer cache.

Implying viewer performance is purely a "personal problem" is neither intelligent nor cool.

 

Whence this outright hostility? I never implied viewer performance is purely a "personal problem." Unless you meant, of course, where I said lack of memory is the predominant cause of low performance. Which it is, of course (disk/VRAM/RAM, etc.). For example, the first advice Microsoft gives to everyone complaining about performance issues in Windows is: "Add more RAM!" True story.

 

Quote

so burning shadows into textures solves absolutely nothing; it just makes downloads longer, makes travel impossible, and overwhelms the viewer cache.

 

Again, what's on a texture has no performance impact, whatsoever, on the SL servers, or your connection: it doesn't make downloads take longer, doesn't overwhelm your cache more (or less, for that matter). It's just a picture: might be of a cat, might be a wall with baked ambient occlusion at the edges. Makes no difference.

So pre-baked ambient occlusion in modern mesh homes is precisely a very good solution for people with computers that aren't too powerful. It doesn't tax the SL servers, and it doesn't tax their own system. Real shadows, like I said, are the real performance killer.

Sooo.... why are you so rabidly against my proposal to have LL implement some form of middle road? Something more sophisticated than the outright baked shadows you see in modern homes (which suffer from the drawback of visually clashing with the real shadow system, amongst other things), yet far less CPU/GPU intensive than true shadows? (aka, the 'artificialAmbientOcclusion' I mentioned in GTA V).


7 hours ago, kiramanell said:

A slowly loading texture is not lag.

I can live with this semantic distinction, Firestorm and other informal use notwithstanding, but the distinction between streaming and rendering delays also represents a system-level design trade-off. The amount of rendering lag is less because we've decided to live with horrific texture streaming delays. That's not actually a difficult or controversial decision in the case of SL because it has always needed to run on midrange or lesser graphics hardware. (On the other hand, though, it also must run across a planet-worth of internet with pockets of pretty dreadful bandwidth; SL basically throws those potential users over the rail.)

In SL it's an important decision because the grid is effectively streamed but for the occasional cache hit. That direction may have some underpinning in Philip Rosedale's background at RealNetworks, where streaming was the business.

(Just in passing, though: If rendering FPS is the only slowdown that should be called "lag", I'm gonna need to find a new term for "script lag".)

5 minutes ago, kiramanell said:

Again, what's on a texture has no performance impact, whatsoever, on the SL servers, or your connection: it doesn't make downloads take longer, doesn't overwhelm your cache more (or less, for that matter). It's just a picture: might be of a cat, might be a wall with baked ambient occlusion at the edges. Makes no difference.

Since I brought it up, maybe I should try to explain. Look again at those two images. The one from the distant past uses easily an order of magnitude less streaming texture bandwidth because surfaces that depict the same substance all share the same texture, and can use lower resolution repeated maps because they don't have baked lighting. Mesh on the other hand almost always uses baked lighting textures at super high resolution, never repeated across surfaces. 

Now, I suspect that even with theoretically infinite memory and streaming bandwidth, any GPU will suffer some rendering performance degradation with ten times the amount of texture complexity to wrap around object surfaces. But I could be convinced otherwise.


7 minutes ago, Qie Niangao said:

I can live with this semantic distinction, Firestorm and other informal use notwithstanding, but the distinction between streaming and rendering delays also represents a system-level design trade-off. The amount of rendering lag is less because we've decided to live with horrific texture streaming delays. That's not actually a difficult or controversial decision in the case of SL because it has always needed to run on midrange or lesser graphics hardware. (On the other hand, though, it also must run across a planet-worth of internet with pockets of pretty dreadful bandwidth; SL basically throws those potential users over the rail.)

In SL it's an important decision because the grid is effectively streamed but for the occasional cache hit. That direction may have some underpinning in Philip Rosedale's background at RealNetworks, where streaming was the business.

(Just in passing, though: If rendering FPS is the only slowdown that should be called "lag", I'm gonna need to find a new term for "script lag".)

 

For starters, I like this type of conversation: civil, and on-topic. :) That's what the 'Thanks' was primarily for.

As for the part I bolded, definitely, script lag is a very real thing! And when the servers are so bogged down, lag will indeed start to occur. Not sure we need a new term for that, though. I mean, when the server no longer has enough CPU cycles to perform its own tasks even (like polling scripts for possible outstanding events), then everything starts to suffer. I would assuredly call that lag too.

 

Quote

Since I brought it up, maybe I should try to explain. Look again at those two images. The one from the distant past uses easily an order of magnitude less streaming texture bandwidth because surfaces that depict the same substance all share the same texture, and can use lower resolution repeated maps because they don't have baked lighting. Mesh on the other hand almost always uses baked lighting textures at super high resolution, never repeated across surfaces. 

Now, I suspect that even with theoretically infinite memory and streaming bandwidth, any GPU will suffer some rendering performance degradation with ten times the amount of texture complexity to wrap around object surfaces. But I could be convinced otherwise.

 

You make a good point there about repeatable textures vs. huge ones with baked-in shadows. Can't argue with that. :) 

In my last post, though, we were foremost talking about lag. In my earlier post, however, I was talking about it more in terms of supporting my position that "Textures with baked-in shadows are actually highly preferred over enabling shadows, performance-wise." And I think that remains true: seen from the perspective of a lighting system, baked-in shadows should always win over any real lighting system in effect (even though there are potentially, and likely, bigger textures involved with baked-in shadows).

As for 'some rendering performance degradation with ten times the amount of texture complexity to wrap around object surfaces,' I think it's fair to say it's true. Same could probably be said for using compressed textures (although I don't know of anyone enabling that): the complexity of some images will simply make some textures more CPU-intensive to compress; but those effects should be infinitesimal, I think, and are largely academic; and probably fall outside the scope of this thread a bit.

Memory remains the thing, though, IMHO. I have VMs running even on a Celeron that perform badly under normal circumstances (like off a hard disk); but run them on an SSD with, like, 16 GB, and they suddenly fly. :)


2 hours ago, kiramanell said:

For starters, I like this type of conversation: civil, and on-topic. :) That's what the 'Thanks' was primarily for.

Was that a jab at me.. 😛 I swear I was civil and on topic with my posts.

Quote

"Textures with backed in shadows are actually highly preferred over enabling shadows, perfomance-wise." 

If done right (despite my hating to see shadows baked onto a surface going the opposite direction to the sun angle), and I would say it depends on how the light that creates the shadow is implemented. Sure, you can't stop the somewhat minimal performance hit of shadows cast by the sun, but I also think a lot of the performance issues people may suffer with shadows on come down to how an artificial light source is created.

There are two ways in Second Life to create a light source. The simplest, and almost exclusively used, way is to tick the light tick box in the build menu, leave the settings as a 10m-radius light, and hey presto, I have a new lamp that can light up my whole house. This method will cast a shadow on everything, even through walls in some cases, which, in a realistic environment as well as a typical game-engine environment, just doesn't happen. This means you can't control the amount of shadows or the performance hit it creates, as it is rendering shadows for everything in a 10m radius.

The second option is to use the spot-lighting method and adapt from there. Spot lighting allows you to use a texture whereby the light only comes out from that texture space. This means I can directionally light something without it necessarily casting shadows on other objects or avatars. You then refine this by reducing the radius, fade, etc., so that the light-source radius (or, in the spotlight example, the distance) only casts light as far as is needed to illuminate the surface.

Shadows will always be a performance hit, as you say; that's why there are also settings in SL that allow you to manage them, and even more so in viewers like Black Dragon. I generally just use the shadows-by-sun-and-moon option, and if I'm taking pictures, etc., I will then set it to everything.


9 hours ago, Adam Spark said:

Most of the bi**hing that goes on in the discussion of a product eventually leads to improvement in that product. It's called feedback.

Unless LL are actively soliciting feedback about a feature (like BoM or EEP), it really doesn't work that way.

Which, sadly, is also the answer to the vast majority of pet theories about why SL performs the way it does.

 

You want :

  • a bug fixed - file a JIRA (or download the source, fix it, and submit that to LL)
  • a performance improvement - download the source and have at it.
  • something done differently - download the source , put in the client side stuff, fake the server side that LL would need to do, and write a proposal.
  • a game engine - get the source and a game engine and ... 
  • a fundamentally different client server architecture - try something else .. maybe sansar

 

Sorry to be blunt, but that's the state of play right now. LL have many ongoing SL projects, a limited number of developers, and different priorities.

Endless threads on the forums blaming textures, avatars, creators, LL, scripts, whatever, do nothing and motivate no one to do anything. This whole mess isn't new, the way LL develop isn't new, we're 16 years in a hole. This is the hole. Shouting about the hole is redundant.


10 hours ago, kiramanell said:

No, that's never true, of course. Pre-baking shadows into textures may be quite CPU/GPU intensive, but it's done on people's own home PCs (aka, from within their copy of Blender). Once a beautiful baked home is imported into SL, the extra impact of those baked shadows is literally zero. After all, a texture with a baked shadow on it is just like any other texture: just an image. So whatever lightmapping you do in-world, or real shadows, or ALM, will always cost more resources than just a texture alone.

The impact of BAKING textures by the content creator is irrelevant to the discussion; why are you even bringing it up? The impact of those textures is the RAM/VRAM they are hogging (a rough estimate is sketched below the list):

  • Option A: House with a single 512x512 stone texture repeated on the walls. 6 MB of VRAM; will look prettier with ALM/Shadows/SSAO.
  • Option B: The same house, the same texture, but it has been baked into sixteen 1024x1024 pre-lit textures. 144 MB of VRAM; will probably look just the same, if not worse, with ALM.
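
Rough back-of-the-envelope sketch of that gap, assuming uncompressed RGBA plus a ~1/3 mipmap overhead (the exact megabyte figures above may assume different channel counts or formats, so take the ratio as the point rather than the totals):

    # Back-of-the-envelope only; real viewer accounting will differ.
    def vram_mb(width, height, count=1, bytes_per_texel=4):
        """Uncompressed texture memory in MB, including a ~1/3 mipmap overhead."""
        return count * width * height * bytes_per_texel * (4 / 3) / 1024**2

    option_a = vram_mb(512, 512, count=1)     # one repeated stone texture
    option_b = vram_mb(1024, 1024, count=16)  # sixteen unique pre-lit bakes
    print(f"Option A: ~{option_a:.1f} MB, Option B: ~{option_b:.1f} MB "
          f"({option_b / option_a:.0f}x more for the same house)")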

And no. Lightmapping is processed at the shader level and is so light and efficient that all cellphone games use it. That's what all 3D games were using since at least as far back as Quake. And Quake was doing it without the benefit of hardware acceleration.

On the other hand, the impact of pre-lit TEXTURES is that they essentially get rid of the main reason we use textures in the first place instead of a one time "pixel atlas": recycling and repeating, which is one way to get nice sharp details for cheap.

10 hours ago, kiramanell said:

Again, what's on a texture has no performance impact, whatsoever, on the SL servers, or your connection: it doesn't make downloads take longer, doesn't overwhelm your cache more (or less, for that matter). It's just a picture: might be of a cat, might be a wall with baked ambient occlusion at the edges. Makes no difference.

You are completely missing the point: what's on the texture doesn't matter, since compression is (mostly) irrelevant to VRAM usage; what DOES matter is how many textures there are and how big they are. The official viewer allocates up to 768 MB of VRAM to textures alone (the rest is used for other things) before it begins the great "texture juggle".

Camming inside a friend's "average" furnished house makes me shoot past that after a few seconds (Firestorm has a higher limit but there will always be a limit).

[Screenshot: Firestorm test build, texture memory statistics inside the furnished house]

And guess what? This is just one house on a 1700 sqm parcel, regions can contain dozens of those, and much worse ones.

And obviously I'm not getting into avatars; those are far, far worse.

As for how much of an improvement you can get from texture recycling...

[Screenshot: Firestorm test build, texture memory statistics with recycled textures]

10 hours ago, kiramanell said:

Sooo.... why are you so rabidly against my proposal to have LL implement some form of middle road? Something more sophisticated than the outright baked shadows you see in modern homes (which suffer from the drawback of visually clashing with the real shadow system, amongst other things), yet far less CPU/GPU intensive than true shadows? (aka, the 'artificialAmbientOcclusion' I mentioned in GTA V).

I wouldn't be against a region-centric lightmapping solution, but Linden Lab has rejected this option in the past (when I brought it up) and I can imagine many reasons why.


13 hours ago, CoffeeDujour said:

Unless LL are actively soliciting feedback about a feature (like BoM or EEP), it really doesn't work that way.

Which, sadly, is also the answer to the vast majority of pet theories about why SL performs the way it does.

 

You want :

  • a bug fixed - file a JIRA (or download the source, fix it, and submit that to LL)
  • a performance improvement - download the source and have at it.
  • something done differently - download the source , put in the client side stuff, fake the server side that LL would need to do, and write a proposal.
  • a game engine - get the source and a game engine and ... 
  • a fundamentally different client server architecture - try something else .. maybe sansar

 

Sorry to be blunt, but that's the state of play right now. LL have many ongoing SL projects, a limited number of developers, and different priorities.

Endless threads on the forums blaming textures, avatars, creators, LL, scripts, whatever, do nothing and motivate no one to do anything. This whole mess isn't new, the way LL develop isn't new, we're 16 years in a hole. This is the hole. Shouting about the hole is redundant.

Yeah, the uproar here had nothing to do with them rescinding the group limit reduction announced recently. Oh, and they completely ignored the forums during the Tilia confusion, didn't they?

Second Life and the Lab have changed drastically in 16 years. You can't paint it all with one brush and say "It went this way 16 years ago, so it can't go another way today".

And besides, last I checked voicing frustration wasn't against forum rules, and neither was skipping past threads you consider redundant.


I'm not techy enough to know if that'd be possible, and if so, how hard it would be to implement, but what about some sort of ability to scale down textures on objects?
For example: I have a potted succulent. I really like it, it looks beautiful, but it has 2 textures: 1024 x 1024 for most of it - it's a small plant, I should mention - so that's rather excessive; half of that would be more than enough. The second texture is for the succulent's leaves, which are a small part. Also a 1024 x 1024 texture, when possibly a 256 x 256 would have sufficed.
I like the textures and would prefer to keep them - mainly because my own texturing skills are meager at best, but also because those are no-copy objects. But they're mod at least, so the possibility is there. It would make things so much easier if I could just scale the textures down to the sizes I want them to have. :/
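
If you have the original image file yourself (for textures bought in-world you usually won't, so this mostly helps creators preparing their own uploads), downscaling before upload is a one-liner with the Pillow library. The filenames here are placeholders:

    # Needs Pillow (pip install Pillow). Filenames are hypothetical.
    from PIL import Image

    src = Image.open("succulent_leaves_1024.png")   # hypothetical source file
    small = src.resize((256, 256), Image.LANCZOS)   # high-quality downscale
    small.save("succulent_leaves_256.png")
    print(small.size)  # (256, 256)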


2 hours ago, Sukubia Scarmon said:

I'm not techy enough to know if that'd be possible, and if so, how hard it would be to implement, but what about some sort of ability to scale down textures on objects?

Making a smaller texture is a trivial task for a computer to do. You can't do it for no-mod objects in SL, though. The viewer tries...

SL's textures are sent as JPEG2000, which encodes multiple sizes of the image in a progressive way, so the more data you get, the higher the resolution of the image (rather than a regular JPEG on a website, which is just one image you load from start to finish) - the smaller image you want to make already exists.

SL gets image data in chunks, trying to fetch only enough data to have the texture at the optimum resolution on screen; it won't go straight for the max-res version. In theory you could have unlimited-resolution images and SL would be clever and only get enough data to show the texture on your screen at the size needed.
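
As a purely conceptual sketch (made-up numbers, not the viewer's actual algorithm), the "only fetch enough for the on-screen size" decision looks roughly like picking a discard level:

    # Conceptual only: pick how many halvings of the full texture are acceptable
    # for the size it currently occupies on screen, then fetch/decode only that.
    def discard_level(full_res, on_screen_pixels, max_discard=5):
        level = 0
        while full_res // 2 > on_screen_pixels and level < max_discard:
            full_res //= 2
            level += 1
        return level

    for on_screen in (1024, 300, 90, 20):
        lvl = discard_level(1024, on_screen)
        print(f"{on_screen:4d}px on screen -> discard level {lvl}, decode at {1024 >> lvl}px")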

This entire system was designed back when we all had a lot less bandwidth and seems like a really smart idea.

And then it all falls apart... Progressive JPEG2000 is useless to your graphics card as-is, so it has to be decoded before it can be used, and this is slow. The viewer will keep fetching more data and decoding higher resolutions as needed, but it can't step a texture back down a notch if the currently loaded image is too big (say you zoom in on your pot plant and then zoom out). Mainly because, in order to do that, it would have to bin the current image and start decoding from scratch up to the resolution needed; while that could be added, it's so expensive that the performance gain is obliterated by all the extra decoding.

 

So we have half a great idea that's train wrecked by the implementation.

 

LL have started work on remaking the image cache so it will store decoded data rather than JPEG2000. This will use a lot more disk space, but each texture will only ever be decoded once. Make the cache able to contain separate decoded images for various resolutions, and now we can have the viewer step resolution up and down as needed without a lot of expensive decoding slowing everything down.

This will be further helped by your OS's built-in file caching. Potentially it will be a LOT faster .. like OMG WOOSH faster (for images you have in your cache).

The viewer will have the spare CPU time to be smarter. Smarter about what images are cached (like always caching your home location!), and smarter about what images are kept loaded and given to the graphics card. It also means you can throw hardware at the problem to improve performance further; a bigger cache on a faster drive will make a real difference.
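
A hedged sketch of what such a cache might look like in principle (my own toy example, not LL's design): decoded pixels keyed by (texture id, discard level), so stepping resolution up or down never repeats the expensive JPEG2000 decode:

    # Toy example only; not LL's implementation.
    from collections import OrderedDict

    class DecodedTextureCache:
        def __init__(self, max_entries=256):
            self._cache = OrderedDict()   # (texture_id, level) -> decoded pixels
            self.max_entries = max_entries

        def get(self, texture_id, level, decode_fn):
            key = (texture_id, level)
            if key in self._cache:
                self._cache.move_to_end(key)        # recently used
                return self._cache[key]
            pixels = decode_fn(texture_id, level)   # expensive decode, cache misses only
            self._cache[key] = pixels
            if len(self._cache) > self.max_entries:
                self._cache.popitem(last=False)     # evict least recently used
            return pixels

    cache = DecodedTextureCache()
    texels = cache.get("some-texture-uuid", level=2, decode_fn=lambda tid, lvl: b"\x00" * 64)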

 

 tl;dr  .. yes, it can be done automatically .. and we're just waiting on LL to find some development time to get things moving again.

 


7 hours ago, Selene Gregoire said:

Sometimes you have to shout for help to get out of the hole.

Or you can just keep digging.

I think LL needs to stop digging and start yelling for help.

The hole is a lot of fundamental SL design decisions that date back to way back when. The stuff that makes SL unique is also the stuff that hurts it the most; there are lots of good reasons why no one else does it like SL.

LL did stop digging. They designed a new platform building on the lessons learnt from SL's architectural design limitations. That would be Sansar.

Only we're all in the old hole, we like this hole, and there is no way we're going to try some newfangled hole. The devs left on SL are furiously trying to shore up the walls, and we're yelling at them to bring on the TNT. That was my point.


Another big issue that causes texture multiplication is that SL has shifted towards more of a competitive environment than a collaborative one.

I know a circle of people who have a shared pool of textures that they try to use as much as possible to reduce the number of unique textures they use. But your typical content creator is not going to share his assets with anyone else, even if they are highly reusable.

In fact, I'd wager that some see baking textures as a form of "soft DRM", since the texture becomes unusable on its own...

Edited by Kyrah Abattoir

9 hours ago, CoffeeDujour said:

LL did stop digging. They designed a new platform building on the lessons learnt from SL's architectural design limitations. That would be Sansar.

I don't think Sansar is built from lessons learned; everyone who actually founded the bases SL rests upon is gone, and the new people seem either unable to bring Sansar up to par, or refuse to go down "that path".

In a way it reminds me of Ultima Online: it was ambitious and flawed, and every MMORPG that followed it seems to be doing everything in its power 'not' to follow that path.

SL is flawed, obviously, but a lot of those flaws could be corrected if LL didn't have to take legacy content into account.

  • Close all the holes that scripters work around by tying resource consumption to avatars or parcels rather than the naive per-script/per-object approach they ended up with.
  • Redesign avatar and land limits to take everything into account, from server load to client impact; nothing should be displayed or tax the servers "for free".
  • Make a new system avatar that actually follows the deform bones rather than the weird hybrid thing we have now.

Every single problem SL has is due to the lack of reasonable limits on what is fair resource consumption. I can put thousands of scripts in a single prim... why?


On 7/31/2019 at 2:59 AM, Drayke Newall said:

...There are two ways in Second Life to create a light source. The simplest, and almost exclusively used, way is to tick the light tick box in the build menu, leave the settings as a 10m-radius light, and hey presto, I have a new lamp that can light up my whole house. This method will cast a shadow on everything, even through walls in some cases, which, in a realistic environment as well as a typical game-engine environment, just doesn't happen. This means you can't control the amount of shadows or the performance hit it creates, as it is rendering shadows for everything in a 10m radius...

This is incorrect. A point light created in this way NEVER casts shadows. That's why it shines through walls. The only things that can cast shadows in SL are the sun/moon and projector lights. Objects with the light-emitting property and no projector texture set, as you refer to below, are simply a ranged light source, with no consideration of whether that light is occluded by anything and no participation in generating shadows.

Quote

...The second option is to use the spot-lighting method and adapt from there. Spot lighting allows you to use a texture whereby the light only comes out from that texture space. This means I can directionally light something without it necessarily casting shadows on other objects or avatars. You then refine this by reducing the radius, fade, etc., so that the light-source radius (or, in the spotlight example, the distance) only casts light as far as is needed to illuminate the surface.

Shadows will always be a performance hit, as you say; that's why there are also settings in SL that allow you to manage them, and even more so in viewers like Black Dragon. I generally just use the shadows-by-sun-and-moon option, and if I'm taking pictures, etc., I will then set it to everything.

You can't set "everything" - the most you can choose is "sun moon and projectors" - light-emitting objects without a projector texture will still never cast shadows even at the highest setting. Not even in Black Dragon, which is my regular viewer.

Shadows == "none": Daylight and moonlight come through your roof and walls lighting up your interior. Projector lights pass through objects, lighting up everything to their max range. Point lights pass through objects, lighting up everything to their max range.

Shadows== "sun/moon": Daylight and moonlight don't pass through roof and walls -You will need to light up your interiors. Projector lights pass through objects lighting up everything to their max range. Point lights pass through objects lighting up everything to their max range.

Shadows == "Sun/moon + projectors": Daylight and moonlight don't pass through roof and walls -You will need to light up your interiors. Projector lights are occluded by objects and cast shadows. Point lights pass though objects lighting up everything to their max range.

When building and designing you should always plan for all three, because you don't know how the user seeing it has their viewer set. To account for lower graphics settings you will want to bake in a small amount of light-based shading, but any time you bake in more than a little ambient occlusion shading you're making your object look worse at higher graphics settings which is probably not what you intend.


Sigh. Kyrah, you seem to be conflating a great many separate issues, then drawing conclusions that belong to one set of criteria and applying them to another. That makes your arguments rather confused. So, let me spell out the various issues involved:

1) GPU/CPU performance issues due to choosing a lighting system vs. using baked textures;

2) Performance issues as a result of using high-res textures;

Here already you are confusing the two, when you say

"The impact of those textures is the ram/vram they are hogging."

That belongs to item 2), whereas I went out of my way to even reiterate, to Qie, that, regarding computers with weak GPU/CPU hardware, "Textures with baked-in shadows are actually highly preferred over enabling shadows, performance-wise." That is true under 1); 2) is simply a different issue.

3) Performance issues between very old homes and newer ones;

4) Performance issues related to using textures with baked shadows;

3) and 4) are somewhat related, but not in the absolute way you portray the matter. Yes, textures with baked-in shadows tend to be larger (4), but not just because of the shadows per se: much newer homes simply have to look way better, regardless of shadows (3). And yes, there are many modern homes that even come with the option to disable the baked-in shadows; and yet, of course, these homes use the same amount of texture memory in both cases.

5) The amount of texture memory available.

Having little texture memory available (to you) is not in itself a valid general argument against using this-or-that lighting system. It MAY be a personal reason not to prefer buying stuff with huge baked textures (2, 4); another person, however, with enough texture memory available (2 GB in Firestorm) but a slow video card, might very well prefer the GPU/CPU performance increase (1) of going the baked route.

And finally, you seem to continue to argue from the perspective that all of SL must somehow conform to the lowest common denominator. I fundamentally disagree. If you want fancy homes, you simply need fancy hardware. And nobody is forcing you to buy the fancy homes. So you can choose not to. A pretty lighting scheme with real shadows will also require decent hardware. And yes, if 'fancy' means having to use more texture memory too, then so be it!

