
I think we should, collectively, be somewhat embarrassed.




So I've been part of SL since 2005, and in that time I've watched the evolution of the grid the whole way through.

And early on, it made sense that SL was laggy and performed terribly because, hey, everyone's PCs were so much less powerful.

But since then the SL Viewer standard has remained pinned to minimum requirements that are nowhere near realistic in scope.

And what's more, the rendering technology, even for a streamed-asset platform of this type, is entirely built around standards that use literally none of the advanced CPU or GPU architecture available to us in 2022.

SL looks better than ever, but it remains unable to pair those looks with acceptable performance because the Lab appears absolutely married to the minspec approach. And even in that case, new advances in Intel GPU architecture mean that even stuff like UHD integrated graphics and the new Arc series of integrated chips will have none of their advanced features used.

With the low-poly nature of Collada mesh and the low-resolution assets SL is married to (2048x2048 is not an HD resolution), the fact that my PC in 2022 gets the same 5-10 FPS on Ultra with a 256m draw distance in a sim that isn't EMPTY is literally embarrassing. It really needs to be addressed if this platform ever wants to engage the modern user base that is out there but avoids SL because it runs terribly and looks like a ten-year-old game even at its best.

And while I understand not everyone in the world has a high-end PC, there's no reason not to have support for the large percentage of power users in SL who do. In fact, there is no reason why the minspec/integrated-graphics design of SL should be mutually exclusive with SL actually starting to use modern CPU and GPU capabilities.

I'd like to see this platform last me another 15 years, but I'm starting to think it's a waste of money to even upgrade my hardware any more, other than for the tax write-offs.

[Attached images: the poster's system specifications]


Just now, Solar Legion said:

That's nice.

Want to take a crack at rewriting the Viewer?

Sure, I mean, if Phillip will hire in some id Tech devs to handle the OpenGL overhaul... I'm just fine with getting involved with the viewer.


Yeah, not likely going to happen.

The entire point of my response was this: if you want such changes, you're better off making them yourself, or at least finding out what it would take to redo the entire Viewer while maintaining compatibility with existing content and allowing existing users to continue using their hardware.

It's oh so nice that you have/had the money to build such a system. You're not the first person with an overpriced system to make complaints.

It'd be lovely if the Viewer had kept pace - now it is catch-up time.


Just now, Solar Legion said:

Yeah, not likely going to happen.

The entire point of my response was this: if you want such changes, you're better off making them yourself, or at least finding out what it would take to redo the entire Viewer while maintaining compatibility with existing content and allowing existing users to continue using their hardware.

It's oh so nice that you have/had the money to build such a system. You're not the first person with an overpriced system to make complaints.

It'd be lovely if the Viewer had kept pace - now it is catch-up time.

Well, I have this system because SL made it possible, much like a lot of creators here. That aside, I can't fix things from the viewer side, and I know this because I was peripherally involved with Singularity, and you can find a lot of that branch in Alchemy now. There are just things you can't fix from the viewer side in this regard, which is why I feel it's on the Lab to modernize.

It's been almost 20 years, and what we have now is basically a bunch of feature upgrades stapled onto a rendering architecture that is older than half the people using SL. It has to be optimized and modernized at some point; it can't keep running like this and expect to continue to attract a new market.


And not a single mention was made concerning keeping it where it is right now. The Region/Simulator side is only half of the equation whereas the half where your hardware actually matters is the Viewer end.

If you've been involved as you say you have, you should know this - and the distinction - already.

Very little can be done on the server side to optimize performance enough to matter or make a single difference in regard to what hardware you have.

If your issue was with the server side then listing your computer specifications was a useless bit of information to have.

I'll leave you to converse or quibble with someone else.


2 minutes ago, Solar Legion said:

And not a single mention was made concerning keeping it where it is right now. The Region/Simulator side is only half of the equation whereas the half where your hardware actually matters is the Viewer end.

If you've been involved as you say you have, you should know this - and the distinction - already.

Very little can be done on the server side to optimize performance enough to matter or make a single difference in regard to what hardware you have.

If your issue was with the server side then listing your computer specifications was a useless bit of information to have.

I'll leave you to converse or quibble with someone else.

I know what you mean, and I am not saying you are wrong, so I'm not entirely sure what the hostile stance is about.

The problem is that the viewers are all pinned to the same branch of code that the main viewer is on; you can't deviate too far from what is already there without it basically becoming incompatible.

Take Black Dragon, for example: it's effectively a higher-end graphics solution, but it does nothing for performance. Singularity/Alchemy have slightly stronger performance focuses, but graphically they don't improve on anything.

And NONE of them do anything to better utilize modern GPU capabilities because the main branch has no support for it.


If you get only 5-10 FPS with that PC, then you have a very badly configured system... For example, you did enable multi-threading in the NVIDIA driver, right ?

In any case, with a 5.0GHz 9700K (locked on all cores), a GTX 1070 Ti and a 1920x1200 screen, with the Cool VL Viewer and under Linux, I never, ever fall below 20fps, even with ALM + shadows in a busy sim, and I'm usually in the 60-250fps range in most places with shadows off (ALM either on or off and it does not make a huge difference: barely 10% less fps with it on). As for DD, I am always at 256m in sims with neighbouring regions and 512m in others.


18 minutes ago, Henri Beauchamp said:

If you get only 5-10 FPS with that PC, then you have a very badly configured system... For example, you did enable multi-threading in the NVIDIA driver, right ?

In any case, with a 5.0GHz 9700K (locked on all cores), a GTX 1070 Ti and a 1920x1200 screen, with the Cool VL Viewer and under Linux, I never, ever fall below 20fps, even with ALM + shadows in a busy sim, and I'm usually in the 60-250fps range in most places with shadows off (ALM either on or off and it does not make a huge difference: barely 10% less fps with it on). As for DD, I am always at 256m in sims with neighbouring regions and 512m in others.

Hey I get 150+ FPS in light rendering areas, and 70 to 80 in small builds that aren't terribly intense.

I'm in a 3-sim setup that is all very high-end design, though, and yes, that's where I drop to 5 to 10 FPS. And it's not even the assets in use, because I've been very ***** about that, and the entire texture base I've made for it runs at 512 or 1024 as a standard, because I've been making these textures since 2006. I think my system is configured fine; it's just that it doesn't give me any particular advantage when pushing SL to its maximums, and since SL's maximum is literally 20 years behind industry standard, I feel that is embarrassing.

10 minutes ago, Mollymews said:

To see where Linden is working on making the viewer perform more efficiently, have a look at the Performance Improvement candidate viewer: https://releasenotes.secondlife.com/viewer/6.6.0.569349.html

 

 

Thank you, Molly, I will have a look at that. I appreciate the information.


Got to love how the tone is either 

  • SL Sucks, it's supposed to suck, suck it up.
  • You're doing it wrong.

Till LL do something about the single-threaded asset fetch/decode/cache pipeline and the sequential processing of rigged avatar attachments, it's not going to get better.

It is well beyond the scope of any third-party project to overhaul the viewer in those areas as an unpaid hobby!
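
To make that bottleneck concrete, here is a minimal sketch in plain C++ (hypothetical names, purely illustrative, not actual viewer code) contrasting a strictly sequential decode loop with the same work handed off to background tasks:

```cpp
// Illustrative only: decode_asset is a stand-in for fetching + decoding one
// asset (e.g. a JPEG2000 texture). Not Second Life viewer code.
#include <chrono>
#include <future>
#include <iostream>
#include <thread>
#include <vector>

int decode_asset(int id) {
    std::this_thread::sleep_for(std::chrono::milliseconds(50)); // fake work
    return id;
}

int main() {
    const int kAssets = 32;

    // 1) Single-threaded: every asset blocks the next one (and, in a viewer,
    //    this work would be competing with the render loop for the same thread).
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < kAssets; ++i) decode_asset(i);
    auto t1 = std::chrono::steady_clock::now();

    // 2) Concurrent: decode on background tasks and collect the results.
    std::vector<std::future<int>> jobs;
    for (int i = 0; i < kAssets; ++i)
        jobs.push_back(std::async(std::launch::async, decode_asset, i));
    for (auto& j : jobs) j.get();
    auto t2 = std::chrono::steady_clock::now();

    using ms = std::chrono::duration<double, std::milli>;
    std::cout << "sequential: " << ms(t1 - t0).count() << " ms\n"
              << "concurrent: " << ms(t2 - t1).count() << " ms\n";
}
```

A real pipeline would also cap the number of workers, prioritise what is on screen, and hand finished results back to the render thread, but the shape of the difference is the same.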


2 hours ago, Coffee Pancake said:

Got to love how the tone is either 

  • SL Sucks, it's supposed to suck, suck it up.
  • You're doing it wrong.

Till LL do something about the single-threaded asset fetch/decode/cache pipeline and the sequential processing of rigged avatar attachments, it's not going to get better.

It is well beyond the scope of any third-party project to overhaul the viewer in those areas as an unpaid hobby!

Yeah, it's kinda weird how people communicate about issues we all experience. That aside, I couldn't possibly do the whole viewer overhaul myself.

But I know for a fact that the Lab and Tilia make a ton of money just on me making money, so I'm pretty sure they can afford to find people who can do it. Which is kinda my point.


I’ve asked this question a time or two but the other day I was given a brilliant response.

 

“Legacy Code”

 

The English we speak today derived from Germanic, which was influenced by Latin. From a code standpoint you can view Second Life's original code as Latin. As a language develops, common words change. For instance, "Hello" can easily be substituted with "What's up" or "Oi", but this is just an extension of today's code. The issue lies within the Legacy Code itself. If someone were to alter the Legacy Code, what would happen to the newer code?

As Second Life has grown, the code has too, but it all revolves around that original code. Given how Second Life is more of a user-based platform, a minor change might have a giant domino effect. It's a feat they were able to move all their servers, data, and other things over to the AWS servers. I'm curious as to how many of the current devs know this legacy code. Do the newer devs understand the various footnotes and information littered throughout the old code? If you think about how Second Life was coded in terms of language, it was written in various forms as it evolved. It would be quite an undertaking to alter the core.

 

While I'm not embarrassed by it, I've wondered why they haven't done more to "update" Second Life. But there are modern games which have been coded so poorly they won't run on some of the newest tech; this isn't an issue limited to just Second Life. I'm sure plenty of WoW users feel the same. It's also a question of accessibility. While plenty of users are running Second Life on a toaster (there is nothing wrong with that), how many could afford to upgrade in the current market? If Second Life were to change overnight, how many would get left in the dark? People who pay for sims, Lindens, and memberships, even if they can't enjoy it like someone using a high-end rig.

 

I do think Linden Lab needs to be more engaging with their community. What can they do better? How do we attract new users? But when it comes to "changing" the code drastically... rather than move the mountain, it might be easier to make a new one. The answer isn't as easy as we think it should be.

 


5 hours ago, DalNiente said:

I’ve asked this question a time or two but the other day I was given a brilliant response.

 

“Legacy Code”

 

The solution to legacy code is to replace it with better, more advanced systems. Sure, there might be some breakage, but assuming we get more new shiny out of the deal, it's not the end of the world.

SL creators will quickly replace everything that was lost with newer things that accomplish the same result, only better.

A good example would be bodies. If LL had replaced the default avatar body with a better mesh body a decade ago like we wanted, and then continued to revisit and improve upon it, we wouldn't be in the mess we're in now. Yes, we would have lost the old default body, which is a real pain for those yearly "dress like an oldbie" contests, but someone would have made a replacement just for those days.


Linden Lab, and the rest of us, are lucky to be still able to use OpenGL. They've had years of warning, and now OpenGL is officially deprecated on some hardware platforms. That's been its status for about three years. The Lindens seem to have done sweet Fanny Adams.

I suppose we should use the technical meaning of "deprecated": it still works, but there is no certainty that it will work in the next release. There's a lot of overlap with standard English dictionaries, and you can call this usage jargon: specialised, but not secret.


22 minutes ago, arabellajones said:

Linden Lab, and the rest of us, are lucky to be still able to use OpenGL. They've had years of warning, and now OpenGL is officially deprecated on some hardware platforms. That's been its status for about three years. The Lindens seem to have done sweet Fanny Adams.

It's only deprecated by Apple trying to convince people it is "outdated technology" while in reality they were too lazy to fix their broken implementation of it and push everyone into using their Metal API for advanced vendor lock-in effects and squeezing more money out of their users.


20 hours ago, Henri Beauchamp said:

If you get only 5-10 FPS with that PC, then you have a very badly configured system... For example, you did enable multi-threading in the NVIDIA driver, right ?

In any case, with a 5.0GHz 9700K (locked on all cores), a GTX 1070 Ti and a 1920x1200 screen, with the Cool VL Viewer and under Linux, I never, ever fall below 20fps, even with ALM + shadows in a busy sim, and I'm usually in the 60-250fps range in most places with shadows off (ALM either on or off and it does not make a huge difference: barely 10% less fps with it on). As for DD, I am always at 256m in sims with neighbouring regions and 512m in others.

How would one go about enabling multi-threading in their NVIDIA driver? I feel like I should be getting better performance on my Alienware laptop, but I have only the most basic idea of how to reconfigure it.


2 minutes ago, Persephone Emerald said:

How would one go about enabling multi-threading in their NVIDIA driver? I feel like I should be getting better performance on my Alienware laptop, but I have only the most basic idea of how to reconfigure it.

 

Hey, not sure if you will get a response or not, but you can go to the GeForce control panel and set specific options and overrides for your viewer of choice, as shown in this shot.

 

[Screenshot: GeForce/NVIDIA control panel program settings for the viewer]


57 minutes ago, Ansariel Hiller said:

It's only deprecated by Apple trying to convince people it is "outdated technology" while in reality they were too lazy to fix their broken implementation of it and push everyone into using their Metal API for advanced vendor lock-in effects and squeezing more money out of their users.

Quote

"Deprecated"

And it was just used in 2020 for Doom Eternal, so... Also, the Mac market share for personal computing is not that high compared to their mobile share.

OpenGL 4.6 is a pretty capable rendering tech for as old as it is. It just requires people working on that side of the system and no one has for a very long time.


12 hours ago, Ansariel Hiller said:

It's only deprecated by Apple trying to convince people it is "outdated technology" while in reality they were too lazy to fix their broken implementation of it and push everyone into using their Metal API for advanced vendor lock-in effects and squeezing more money out of their users.

Whatever Apple's motivation for deprecating and removing OpenGL, it does not help to stick your head in the sand for years and pretend nothing will happen. Sure as the bank, Apple will switch off OpenGL, most likely this fall, when they have replaced the last two machines with Intel processors (the Intel Mac mini config and the Mac Pro) with Apple Silicon configurations after the M2 processor is introduced at WWDC in June.

Then what? Is the lab happy to lose 15% of their active users and perhaps as many as 25% of the creators overnight? – If that is the case, then please tell us, because we Mac users can surely take our business elsewhere. 

 

Add to that: the only reason OpenGL currently runs on Apple Silicon is that their Intel-based implementation runs under Rosetta 2 translation over a small HAL (hardware abstraction layer) to the Apple GPU. There does not exist an Apple Silicon native implementation of OpenGL, and there never will be.

The reason Apple will only support Metal is that, now that they design their own CPU+GPU complex from scratch, they don't want any dependency on companies such as Khronos Group, Microsoft or NVIDIA in their implementation of the processors. They also don't want to share any IP with these companies about how they have built their GPU.

For Linden Lab to stay on Apple devices, they can either reimplement their renderer in Metal, or find a renderer that already supports Metal and jerry-rig the viewer onto that renderer; but in doing so they introduce a dependency on a third party that may or may not be in sync with Apple hardware and system software releases.


11 hours ago, Lyssa Greymoon said:

OpenGL hasn't gotten an update since 2017. I don't think Apple is the only one that's left it for dead.

Does that make it automatically bad? The problem is not that the OpenGL spec hasn't been updated since 2017, but that the viewer implementation still largely depends on tech from 2005. And Apple is still to blame for not releasing a proper compatibility profile back with OpenGL 3.2, effectively locking the viewer at OpenGL 2.1, since you can't mix newer features with the old fixed-function pipeline stuff still in the viewer (which will hopefully change with the performance viewer, but I still found some old FF stuff in there...).

3 hours ago, Gavin Hird said:

Then what? Is the lab happy to lose 15% of their active users and perhaps as many as 25% of the creators overnight? – If that is the case, then please tell us, because we Mac users can surely take our business elsewhere.

You mean the more like 5% of people who would be unable to run SL anymore, in contrast to the larger number of people with hardware that doesn't support something like Vulkan? Take a guess at the business decision. (On a side note: if you are a big creator in SL making lots of money and making a living from it, you'd probably get a Windows PC if Mac support were gone rather than saying bye-bye and going out of business.)

3 hours ago, Gavin Hird said:

The reason Apple will only support Metal is that, now that they design their own CPU+GPU complex from scratch, they don't want any dependency on companies such as Khronos Group, Microsoft or NVIDIA in their implementation of the processors. They also don't want to share any IP with these companies about how they have built their GPU.

Of course they don't wanna share anything - it's Apple! The only thing Apple is sharing is their walled garden, and they let you pay for that "experience"! You can't even swap out an SSD anymore, so they can charge you ridiculous prices for standard hardware labeled as an "Apple certified quality product", made in the same fab on the same machines that produced the same thing for the Windows/Linux PC plebs in the other shift!


4 minutes ago, Ansariel Hiller said:

Does that make it automatically bad? The problem is not that the OpenGL spec hasn't been updated since 2017, but that the viewer implementation still largely depends on tech from 2005. And Apple is still to blame for not releasing a proper compatibility profile back with OpenGL 3.2, effectively locking the viewer at OpenGL 2.1, since you can't mix newer features with the old fixed-function pipeline stuff still in the viewer (which will hopefully change with the performance viewer, but I still found some old FF stuff in there...).

You mean the more like 5% of people who would be unable to run SL anymore, in contrast to the larger number of people with hardware that doesn't support something like Vulkan? Take a guess at the business decision. (On a side note: if you are a big creator in SL making lots of money and making a living from it, you'd probably get a Windows PC if Mac support were gone rather than saying bye-bye and going out of business.)

Of course they don't wanna share anything - it's Apple! The only thing Apple is sharing is their walled garden, and they let you pay for that "experience"! You can't even swap out an SSD anymore, so they can charge you ridiculous prices for standard hardware labeled as an "Apple certified quality product", made in the same fab on the same machines that produced the same thing for the Windows/Linux PC plebs in the other shift!

No wonder Firestorm runs like a train-wreck on macOS with that attitude. 😉


7 minutes ago, Ansariel Hiller said:

And Apple is still to blame for not releasing a proper compatibility profile back with OpenGL 3.2, effectively locking the viewer at OpenGL 2.1, since you can't mix newer features with the old fixed-function pipeline stuff still in the viewer (which will hopefully change with the performance viewer, but I still found some old FF stuff in there...).

Apple only offers two OpenGL profiles: the OpenGL 3.2 profile, and the Legacy profile, which the viewer currently uses.

To enable the OpenGL 3.2 profile you have to get rid of all the BOOLs in the viewer code (12k+ of them); otherwise you cannot link the viewer with the necessary libraries. This is a hurdle that LL could have gotten rid of 3-4 years ago.

Next, Apple has documented what you need to do to transform your application to Metal, and the intermediary steps, one of which is upgrading your code to the OpenGL 3.2 profile, with specific attention areas documented for this to happen. The documentation was issued over 4 years ago, including multiple sessions and workshops at WWDC, and there is no evidence in the current viewer code that these recommendations have been followed, even in the performance update. Rather, they stupidly try to go directly to the graphics hardware and enable OpenGL 4.1, which is unreachable for any normally running code on macOS.

IMO, rather than going ahead with the expensive and disruptive EEP project (mainland is still on WindLight), LL should have started the work to upgrade the viewer to OpenGL 3.2, and only then maybe built EEP on top of it.
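
To make the profile situation concrete, here is a minimal sketch (using GLFW, purely illustrative, not viewer code) of requesting the OpenGL 3.2 core profile that macOS offers. The moment an application asks for this, the fixed-function entry points the viewer still relies on (glBegin/glEnd, the matrix stack, and so on) are no longer available, which is exactly the lock-in described above:

```cpp
// Illustrative only: request a 3.2 core, forward-compatible context via GLFW.
#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;

    // On macOS anything newer than the Legacy (2.1) context must be a core
    // profile, forward-compatible context: there is no compatibility profile,
    // so the fixed-function pipeline is unavailable once you ask for this.
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE); // required on macOS

    GLFWwindow* win = glfwCreateWindow(640, 480, "core profile test", nullptr, nullptr);
    if (!win) {
        std::puts("3.2 core context not available");
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    std::printf("Got GL version: %s\n",
                reinterpret_cast<const char*>(glGetString(GL_VERSION)));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

On macOS this would typically build with something like "c++ demo.cpp -lglfw -framework OpenGL" (assuming GLFW is installed, e.g. via Homebrew); the same source requests a plain 3.2 core context on Windows or Linux as well.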

