
Linden Performance viewer and Texture thrashing



Recommended Posts

15 hours ago, Lyssa Greymoon said:

It shrinks your dong too.

I think what you are referring to is something usually described as a "turn-off", the opposite of something that turns you on, but yes, it will do that too.

Edited by NiranV Dean

On 7/23/2022 at 12:16 PM, Jenna Huntsman said:

The Linden viewers all have an arbitrary cap on VRAM of 512MB, which simply isn't enough. LL is working to fix this, but those patches haven't surfaced yet.

512MB for textures, not total. The Linden viewer will use additional VRAM for meshes and so on.

The problem has always been that there is no "100% dependable" way to get the total VRAM on all systems. Because no one-size-fits-all solution could be found, the entire problem just got shelved as "won't fix". This is, lamentably, the exact same reason group chat is still wonky, and probably the reason for dozens of other SL issues.

Edited by Coffee Pancake

2 hours ago, Coffee Pancake said:

entire problem just got shelved as "won't fix"

Any time I want to do something against the rules in SL, I start using the words "hard work" a lot. This ensures LL will avoid it and I never get caught.


3 hours ago, Coffee Pancake said:

The problem has always been that there is no "100% dependable" way to get the total VRAM on all systems. Because no one-size-fits-all solution could be found, the entire problem just got shelved as "won't fix". This is, lamentably, the exact same reason group chat is still wonky, and probably the reason for dozens of other SL issues.

DRTVWR-563 is going to (try to) address this and will use all available VRAM, using operating-system methods to determine (or guesstimate, if you are a poor Mac user) the current VRAM usage. There will be no texture memory slider anymore - which also means Linux users should prepare to get bent over hard unless some Linux fanboy implements those VRAM detection methods on Linux... 😁


19 hours ago, Coffee Pancake said:

512MB for textures, not total. The Linden viewer will use additional VRAM for meshes and so on.

The problem has always been that there is no "100% dependable" way to get the total VRAM on all systems. Because no one-size-fits-all solution could be found, the entire problem just got shelved as "won't fix". This is, lamentably, the exact same reason group chat is still wonky, and probably the reason for dozens of other SL issues.

16 hours ago, Ansariel Hiller said:

DRTVWR-563 is going to (try to) address this and will use all available VRAM, using operating-system methods to determine (or guesstimate, if you are a poor Mac user) the current VRAM usage. There will be no texture memory slider anymore - which also means Linux users should prepare to get bent over hard unless some Linux fanboy implements those VRAM detection methods on Linux... 😁

I was gonna say. That's total BS. Every goddamn game manages to do this, you can't tell me we can't. Also, why would "other" systems stop LL from implementing a solution that works for one system? It's better to have a system that works for some rather than something that doesn't work at all. Also, reading the total VRAM isn't exactly hard, nor a secret, unless you are using an AMD GPU - on Windows at least. I've been doing this for years now and it has been a really good solution.
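
For illustration: on Windows, total VRAM can be read by asking DXGI for the adapter description. A minimal C++ sketch of that approach (not actual viewer code; it naively picks adapter 0):

```cpp
// Minimal sketch: total dedicated VRAM of the primary adapter via DXGI.
// Illustrative only; real code should select the adapter actually in use.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK) { // adapter 0 = primary
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        std::printf("Dedicated VRAM: %llu MB\n",
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```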


8 minutes ago, NiranV Dean said:

I was gonna say. That's total BS. Every goddamn game manages to do this, you can't tell me we can't. Also, why would "other" systems stop LL from implementing a solution that works for one system? It's better to have a system that works for some rather than something that doesn't work at all. Also, reading the total VRAM isn't exactly hard, nor a secret, unless you are using an AMD GPU - on Windows at least. I've been doing this for years now and it has been a really good solution.

When I asked questions about this, a few years ago, I was told that the combination of Apple's OS and AMD's drivers made knowing how much VRAM is installed in an Apple computer "too hard to do reliably because we use OpenGL and Apple and AMD lie to the OpenGL API about product capabilities." Another answer I got was less pointed and included some mumbling about "integrated graphics". I suggested having the program ask the user when the hardware/driver/operating system combination proves inscrutable, or maybe for all combinations. At this point in the meeting the Product Owner's hair burst into orange flames and I left the area.

Can't say that I learned much there, aside from reinforcing my old training.  I was taught that one doesn't answer the question "What computer should I purchase?" until after "List all software you need to use, its operating system and hardware requirements" has been completed.  We would then find some intersection of the sets to suit the customer's needs.  Sometimes that meant a combination of systems.

Personal Computing switched all that around backwards. Now many people buy what looks cute to them, then try to haul heavy cargo around with it.

Who do we blame?  Will identifying the correct entity to blame solve the problem?  No.  The problem is always "The other guy."  I think owning the solution would be more glamorous than owning the problem and am thus perplexed as to why it hasn't happened yet.


1 hour ago, Ardy Lay said:

When I asked questions about this, a few years ago, I was told that the combination of Apple's OS and AMD's drivers made knowing how much VRAM is installed in an Apple computer "too hard to do reliably because we use OpenGL and Apple and AMD lie to the OpenGL API about product capabilities." 

The information is reported accurately in the System Report. I'm pretty sure it has been for years, and it can't be that difficult to access. This whole thing has that "we couldn't figure out how to do it in 2004 and no one ever tried again" vibe.

Edited by Lyssa Greymoon
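
For what it's worth, the per-renderer VRAM figure that System Report shows has long been readable programmatically as well, for example through CGL's renderer info. A minimal sketch (the API is deprecated, and Apple-silicon unified memory is reported differently):

```cpp
// Minimal sketch: VRAM per renderer via CGL renderer info on macOS.
// Build with: clang++ vram.cpp -framework OpenGL
#include <OpenGL/OpenGL.h>
#include <cstdio>

int main() {
    CGLRendererInfoObj info;
    GLint count = 0;
    // 0xFFFFFFFF = consider renderers for all displays
    if (CGLQueryRendererInfo(0xFFFFFFFF, &info, &count) != kCGLNoError)
        return 1;
    for (GLint i = 0; i < count; ++i) {
        GLint vramMB = 0;
        if (CGLDescribeRenderer(info, i, kCGLRPVideoMemoryMegabytes,
                                &vramMB) == kCGLNoError)
            std::printf("Renderer %d: %d MB VRAM\n", i, vramMB);
    }
    CGLDestroyRendererInfo(info);
    return 0;
}
```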

24 minutes ago, Lyssa Greymoon said:

The information is reported accurately in the System Report. I'm pretty sure it has been for years, and it can't be that difficult to access. This whole thing has that "we couldn't figure out how to do it in 2004 and no one ever tried again" vibe.

There are also edge cases where some systems will straight up lie and everything's fine till you try to use the VRAM on offer.


1 hour ago, Wulfie Reanimator said:

Why do they lie?

Leaving aside the obvious answers (fun, profit, avoiding the issues), there are a lot of graphics cards on the market that came out of China with a fiddled BIOS telling the OS "I'm a this" when the actual hardware isn't that at all. A utility called GPU-tech will identify such fakes as "Fake", although it can't actually tell you what the base GPU is that the cunning little (Prince Phillip Pejorative) modded.

I've played around with a couple of these cards out of curiosity, and what I found is interesting: What I would regard as GPU-heavy programs like the TC3-TS2009-TS2012 Auran Train Simulators run perfectly fine on them, but go into a region in SL with a few pretty-pretty mesh avatars and watch the black screen arrive. Even just turning around too quickly in an unpopulated region will cause them to crash, where an old but genuine GT520 can manage to hold it all together.

This isn't a texture-size problem: many of the Auran assets have 1024x1024 textures, the mesh in there is also quite complex, and the size of the area being scanned and displayed can be three times the area of a typical SL region. Some of the Devs who've posted above might know more on this.


2 minutes ago, Profaitchikenz Haiku said:

Leaving aside the obvious answers (fun, profit, avoiding the issues), there are a lot of graphics cards on the market that came out of China with a fiddled BIOS telling the OS "I'm a this" when the actual hardware isn't that at all. A utility called GPU-tech will identify such fakes as "Fake", although it can't actually tell you what the base GPU is that the cunning little (Prince Phillip Pejorative) modded.

Oh it's not the weirdo cards that are the problem, it's stuff with a brand name that's been shoved in a laptop.


3 hours ago, NiranV Dean said:

I was gonna say. That's total BS. Every goddamn game manages to do this, you can't tell me we can't. Also, why would "other" systems stop LL from implementing a solution that works for one system? It's better to have a system that works for some rather than something that doesn't work at all. Also, reading the total VRAM isn't exactly hard, nor a secret, unless you are using an AMD GPU - on Windows at least. I've been doing this for years now and it has been a really good solution.

My understanding is that most games treat their VRAM settings as hard limits based on the total available from the hardware, regardless of the amount currently free (i.e. they let the OS manage graphics memory as required).

SL, on the other hand, is attempting to be "smart" and limit itself to the free memory pool so it doesn't conflict with other applications. Neat idea in theory, although to my knowledge nothing else does this (maybe with good reason, but hey).

Not really my field of knowledge so I can't really make any insightful comments on the matter.


17 minutes ago, Jenna Huntsman said:

My understanding is that most games treat their VRAM settings as hard limits based on the total available from the hardware, regardless of the amount currently free (i.e. they let the OS manage graphics memory as required).

SL, on the other hand, is attempting to be "smart" and limit itself to the free memory pool so it doesn't conflict with other applications. Neat idea in theory, although to my knowledge nothing else does this (maybe with good reason, but hey).

Not really my field of knowledge so I can't really make any insightful comments on the matter.

Games operate with the assumption that they have exclusive control over the system and will just happily go for broke and use everything the OS serves up... and then some, depending on your settings. Quite often the VRAM usage is determined by in-game settings rather than any clever management, and you can quite easily pick settings that bring the party to an end.

SL could work that way, but that would be in conflict with how people actually use SL, that is, with a browser, or Blender (etc.), or an actual full-fat game running on the side.
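
As an aside, Windows does expose exactly this kind of "how much may I use right now" number: DXGI reports a per-process video memory budget that the OS shrinks as other applications compete for the GPU. A minimal sketch of querying it (illustrative only, not viewer code):

```cpp
// Minimal sketch: the OS-managed VRAM budget for this process via DXGI 1.4.
// The Budget field drops when other applications are using the GPU.
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory4), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter1 = nullptr;
    IDXGIAdapter3* adapter3 = nullptr;
    if (factory->EnumAdapters1(0, &adapter1) == S_OK &&
        SUCCEEDED(adapter1->QueryInterface(&adapter3))) {
        DXGI_QUERY_VIDEO_MEMORY_INFO mem = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &mem);
        std::printf("Budget: %llu MB, current usage: %llu MB\n",
                    (unsigned long long)(mem.Budget >> 20),
                    (unsigned long long)(mem.CurrentUsage >> 20));
        adapter3->Release();
    }
    if (adapter1) adapter1->Release();
    factory->Release();
    return 0;
}
```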


5 hours ago, NiranV Dean said:

I was gonna say. That's total BS. Every goddamn game manages to do this, you can't tell me we can't. Also, why would "other" systems stop LL from implementing a solution that works for one system? It's better to have a system that works for some rather than something that doesn't work at all. Also, reading the total VRAM isn't exactly hard, nor a secret, unless you are using an AMD GPU - on Windows at least. I've been doing this for years now and it has been a really good solution.

Read - Think - Write!

I am pretty sure I was talking about available VRAM, not total physical VRAM. The latter is easy, the former is not. Since only NVIDIA offers an OpenGL extension for that (with the exception of ancient ATI GPUs), you have to use a platform-specific solution to reliably determine the amount of available VRAM. The Windows API offers a pretty good solution for this; the OSX solution LL came up with looks a bit like a "guess and hope for the best" approach to me. And for Linux, LL made - oh wait! LL hasn't been supporting Linux for years now. But I'm pretty sure they will come up with a solution for Linux - NOT!

As for other games: they have pretty much full control over what is going to be rendered, and the VRAM needed for textures is commonly determined by the detail level of the textures. Thus the maximum amount of VRAM needed for textures is pretty much fixed and known beforehand. Also, they usually assume the needed VRAM is actually available. If it's not, expect random crashes of random applications - not necessarily the game actually allocating more VRAM than available.
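
For reference, the NVIDIA extension mentioned above is GL_NVX_gpu_memory_info, and the ancient ATI one is GL_ATI_meminfo. A minimal sketch of the fallback chain (assumes an active OpenGL context and that extension support was already checked):

```cpp
// Minimal sketch: available VRAM via vendor OpenGL extensions.
// Both extensions report values in kilobytes.
#include <GL/gl.h> // <OpenGL/gl.h> on macOS

// Tokens from GL_NVX_gpu_memory_info and GL_ATI_meminfo.
#define GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX 0x9049
#define TEXTURE_FREE_MEMORY_ATI                      0x87FC

// Returns available VRAM in KB, or 0 when neither extension is supported.
GLint availableVRAMKB(bool hasNVX, bool hasATI) {
    GLint kb = 0;
    if (hasNVX) {
        glGetIntegerv(GPU_MEMORY_INFO_CURRENT_AVAILABLE_VIDMEM_NVX, &kb);
    } else if (hasATI) {
        GLint info[4] = {}; // [0] = total free texture memory in KB
        glGetIntegerv(TEXTURE_FREE_MEMORY_ATI, info);
        kb = info[0];
    }
    return kb; // 0 -> fall back to a platform API (DXGI, CGL, ...)
}
```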


17 hours ago, NiranV Dean said:

I was gonna say. That's total BS. Every goddamn game manages to do this, you can't tell me we can't. Also, why would "other" systems stop LL from implementing a solution that works for one system? It's better to have a system that works for some rather than something that doesn't work at all. Also, reading the total VRAM isn't exactly hard, nor a secret, unless you are using an AMD GPU - on Windows at least. I've been doing this for years now and it has been a really good solution.

If it is such BS, why don't you whip up the code and show the rest of us how it's done? – Windows and macOS.


12 hours ago, Ansariel Hiller said:

the OSX solution LL came up with looks a bit like a "guess and hope for the best" approach to me.

macOS (and not OSX) does not really give you any tools to tell how much VRAM is available, because all the Apple systems currently marketed have unified memory, where the CPU and GPU share and allocate the same memory. So depending on your system configuration, the GPU can allocate close to 120 GB of memory if it wants to in a system with 128 GB of main memory.

The above also means that GPU-allocated memory is subject to system paging, which in theory means the GPU memory used by all applications running on the system could be overcommitted.

What the Apple programming guidelines advise is for the application to subscribe to low-memory system events and act accordingly, either reducing memory use or terminating gracefully. Because of other issues with the viewer code, subscribing to such events is really not possible...
Edited by Gavin Hird
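
The low-memory events referred to above are delivered to ordinary processes through libdispatch; a minimal sketch of the kind of subscription Apple's guidelines describe (illustrative only, with placeholder reactions):

```cpp
// Minimal sketch: react to macOS memory-pressure notifications.
// Build with: clang++ pressure.cpp (blocks are enabled by default on macOS)
#include <dispatch/dispatch.h>
#include <cstdio>

int main() {
    dispatch_source_t src = dispatch_source_create(
        DISPATCH_SOURCE_TYPE_MEMORYPRESSURE, 0,
        DISPATCH_MEMORYPRESSURE_WARN | DISPATCH_MEMORYPRESSURE_CRITICAL,
        dispatch_get_main_queue());

    dispatch_source_set_event_handler(src, ^{
        // Placeholder reactions; a viewer would shrink its texture cache here.
        if (dispatch_source_get_data(src) & DISPATCH_MEMORYPRESSURE_CRITICAL)
            std::printf("critical memory pressure: release caches now\n");
        else
            std::printf("memory pressure warning: trim caches\n");
    });
    dispatch_resume(src);

    dispatch_main(); // run the main queue; never returns
}
```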
