chalox

2060 6GB: FS isn't reading the VRAM

Question

18 answers to this question

Posted (edited)

@chalox,

I'm not sure that it matters if your GPU's RAM isn't stated in the "About Firestorm" readout.

Try checking in Preferences instead...

[screenshot: the graphics memory setting in Firestorm's Preferences]

Edited by Lindal Kidd


2048MB is the max amount of texture memory you can set on Firestorm. Your system information looks fine.


The limit is built into the viewer's render engine. The FS team pushed it to 2GB; the Linden viewer is limited to 512MB.

How much VRAM you use pretty much only affects when texture thrashing sets in (clear, blurry, clear, blurry, etc.).

The limit also affects those with older computers. The Lindens have talked about raising it, but haven't, to stay compatible with older systems. Yet FS is used by many people with older computers. So it is pretty fuzzy how old is old enough to have a problem.

14 hours ago, Nalates Urriah said:

So it is pretty fuzzy how old is old enough to have a problem.

I can say from personal experience that a 1GB 8800GT had absolutely no issues with textures in any location I've been, an 896MB GTX 275 would have problems with texture loading in super detailed places, and a 768MB Quadro FX 4600 had issues loading textures in less complex places at times.

I've got several cards that are capable of running SL smoothly with lower and lower amounts of video memory, if anyone would care to see that: a 512MB ATI Radeon X1650 Pro, a 256MB 7800GS or 7300GT, a 128MB 6600GT, etc.

8 hours ago, cheesecurd said:

I can say from personal experience that a 1GB 8800GT had absolutely no issues with textures in any location I've been, an 896MB GTX 275 would have problems with texture loading in super detailed places, and a 768MB Quadro FX 4600 had issues loading textures in less complex places at times.

I've got several cards that are capable of running SL smoothly with lower and lower amounts of video memory, if anyone would care to see that: a 512MB ATI Radeon X1650 Pro, a 256MB 7800GS or 7300GT, a 128MB 6600GT, etc.

With viewers designed around a 512MB VRAM card, they are expected to run on those cards, not necessarily well, but run. If the viewer design were changed, we could expect many of those cards to have problems. How many is where it gets fuzzy.

On 8/22/2019 at 5:31 PM, Lindal Kidd said:

@chalox,

I'm not sure that it matters if your GPU's RAM isn't stated in the "About Firestorm" readout.

Try checking in Preferences instead...

[screenshot: the graphics memory setting in Firestorm's Preferences]

You do know that isn't your VRAM, right? That is the game's VRAM limit, and it doesn't change. I have a 6GB and an 8GB card and it always says 2048MB.

On 8/25/2019 at 9:41 AM, Nalates Urriah said:

With viewers designed around a 512MB VRAM card, they are expected to run on those cards, not necessarily well, but run. If the viewer design were changed, we could expect many of those cards to have problems. How many is where it gets fuzzy.

Still doesn't answer why SL isn't reading my VRAM.

On 8/25/2019 at 9:41 AM, Nalates Urriah said:

With viewers designed around a 512MB VRAM card, they are expected to run on those cards, not necessarily well, but run. If the viewer design were changed, we could expect many of those cards to have problems. How many is where it gets fuzzy.

And you're talking about old video cards. I have a new video card, a 2019 card, the new tech, and your 10fps has nothing on what I'm getting.


@chalox

Did you bother to read Nalates' post in this thread?

On 8/24/2019 at 10:53 AM, Nalates Urriah said:

The limit is built into the viewer's render engine. The FS team pushed it to 2GB; the Linden viewer is limited to 512MB.

How much VRAM you use pretty much only affects when texture thrashing sets in (clear, blurry, clear, blurry, etc.).

The limit also affects those with older computers. The Lindens have talked about raising it, but haven't, to stay compatible with older systems. Yet FS is used by many people with older computers. So it is pretty fuzzy how old is old enough to have a problem.

You seem to be jumping all over the map in an attempt to discredit every piece of advice that's been offered.  If you already know it all, why are you asking us?


2048 is the HARD limit imposed by the client; you cannot use more than what the client is designed for. "BUT SL AIN'T SEEING IT": it never will unless it's coded to do so in the client. If you want it using all your VRAM, then go get the source code, figure out where the 2048 limit is set, and adjust as necessary.
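
Purely as an illustration (not the actual viewer source; the names and values here are made up), a hard cap like that usually boils down to a single clamp, so "adjusting" it means finding and changing one constant:

    // Illustrative sketch only; hypothetical names, not copied from the viewer code.
    #include <algorithm>

    constexpr int kMaxTextureMemMB = 2048;  // the client's hard cap

    int clampTextureMemory(int requestedMB, int detectedVramMB)
    {
        // Never allow more than the card actually reports...
        int allowed = std::min(requestedMB, detectedVramMB);
        // ...and never more than the client's built-in limit.
        return std::min(allowed, kMaxTextureMemMB);
    }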

Posted (edited)
13 hours ago, Lindal Kidd said:

@chalox

Did you bother to read Nalates' post in this thread?

You seem to be jumping all over the map in an attempt to discredit every piece of advice that's been offered.  If you already know it all, why are you asking us?

My question isn't getting answered; y'all are giving bad info. I was asking why it isn't reading my VRAM, and y'all don't understand. Please give the answer to my question, not advice about old stuff that has nothing to do with it. All of these answers are like calling tech support and the first thing they ask is "did you try turning it on?" Like, wow, really.

Edited by chalox

Posted (edited)
10 hours ago, bigmoe Whitfield said:

2048 is the HARD limit imposed by the client; you cannot use more than what the client is designed for. "BUT SL AIN'T SEEING IT": it never will unless it's coded to do so in the client. If you want it using all your VRAM, then go get the source code, figure out where the 2048 limit is set, and adjust as necessary.

I'm not asking to use more, just asking why it isn't reading the 6GB or 8GB of dedicated VRAM. Is it because I have a 2060 6GB and a 2070 8GB, and SL hasn't caught up yet?

Edited by chalox

2 hours ago, chalox said:

I'm not asking to use more, just asking why it isn't reading the 6GB or 8GB of dedicated VRAM. Is it because I have a 2060 6GB and a 2070 8GB, and SL hasn't caught up yet?

If you think it is a bug, file a JIRA Bug Report.

CPU: Intel(R) Core(TM) i5-6600K CPU @ 3.50GHz (3504 MHz)
Memory: 32708 MB
OS Version: Microsoft Windows 10 64-bit (Build 18362.295)
Graphics Card Vendor: NVIDIA Corporation
Graphics Card: GeForce GTX 1060 6GB/PCIe/SSE2

Some time back, the Lab changed how they handle graphics cards. Before that change, the graphics card data was kept in a file downloaded with the viewer. Keeping up with new cards required updating that file, which became too much work, so the system was changed to 'probe' the graphics card and determine its capabilities; the card information is pulled from the card itself. There is a command line option (http://wiki.secondlife.com/wiki/Viewer_parameters) to turn off the hardware probe, as it can block some people from logging in. You may want to check the command line options in your launching icon.
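
For example, on Windows you can add the option to the end of the Target field of your Firestorm shortcut. If I remember the spelling right (check the wiki page above to be sure; the path here is just a placeholder for wherever Firestorm is installed), it looks something like:

    "C:\Path\To\Firestorm.exe" -noprobe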

To answer your OP question: you don't. What you are reading in HELP is built into the card and viewer. You can use GPU-Z (free) to read the name embedded in the card; in my case the embedded name is 'NVIDIA GeForce GTX 1060 6GB'. So the viewer can capture that text and use it. I haven't read the code to see how it does that or how it assembles the text for the HELP panel, but I suspect your problem is an NVIDIA issue, not a viewer problem. You can try other viewers... though they probably all use the same code... 🤔
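
If you're curious how a probe like that can work, here is a minimal sketch of my own (not the viewer's actual code, and it assumes an OpenGL context is already up): the renderer string is where a name like "GeForce GTX 1060 6GB/PCIe/SSE2" comes from, and on NVIDIA cards the GL_NVX_gpu_memory_info extension reports the dedicated VRAM.

    // Minimal sketch, not viewer code. Assumes a current OpenGL context on an
    // NVIDIA card whose driver exposes GL_NVX_gpu_memory_info.
    #include <cstdio>
    #ifdef _WIN32
    #include <windows.h>   // GL/gl.h needs this on Windows
    #endif
    #include <GL/gl.h>

    #ifndef GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX
    #define GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX 0x9047
    #endif

    void probeGpu()
    {
        // Card name as reported by the driver, e.g. "GeForce GTX 1060 6GB/PCIe/SSE2".
        const GLubyte* renderer = glGetString(GL_RENDERER);

        // Dedicated video memory in kilobytes, per the NVX extension (NVIDIA only).
        GLint dedicatedKB = 0;
        glGetIntegerv(GL_GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX, &dedicatedKB);

        std::printf("Renderer: %s\n", renderer ? reinterpret_cast<const char*>(renderer) : "unknown");
        std::printf("Dedicated VRAM: %d MB\n", dedicatedKB / 1024);
    }

Whether the viewer builds its HELP text from exactly these calls, I can't say; this is just the general idea.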

You had a problem getting the answer you wanted because no one thinks what is in HELP is all that important... it is helpful, not important. It certainly doesn't matter to the operation of the viewer. Everyone expected you to be interested in performance, not pretty text in an informational panel; no one is THAT silly. So we provided performance-related answers. Your insisting that we answer the question exactly as worded, without expanding on it, just frustrated everyone. Just hope the key people here didn't put you on their idiot lists and avoid answering you in the future.

2 hours ago, chalox said:

I'm not asking to use more, just asking why it isn't reading the 6GB or 8GB of dedicated VRAM. Is it because I have a 2060 6GB and a 2070 8GB, and SL hasn't caught up yet?

First up, you never want to give a single application all of your VRAM. The OS needs some, and Chrome and whatever else you're running will need some.

There is almost no practical benefit to having more than 2GB dedicated to SL. With a reasonable draw distance (one that allows you to move smoothly), you will be hard pushed, even in busy locations, to actually use the full 2GB (that has been my experience at least). There is certainly no performance benefit to having surplus VRAM dedicated to SL.

The official client caps at 512MB mainly because going higher causes problems on lower-end hardware, especially hardware that lies about how much VRAM it has. It turns out there isn't a single surefire way to ask and get a dependable answer. I do expect a slightly raised official cap from LL as part of other changes this year.

Going beyond 2GB raises a number of other problems with the viewer, and due to the negligible practical use, none of the third party viewers have dedicated developer time to solving those problems.

On 8/27/2019 at 7:42 PM, chalox said:

Still doesn't answer why SL isn't reading my VRAM.

Because that's not what it's doing. It's telling you how much of your card's VRAM is available for Second Life, not how much your card has. This is simply a misunderstanding about what the viewer is doing. 

