Firestorm vs the LL Viewer


cykarushb

On 10/12/2018 at 1:52 PM, Nalates Urriah said:

So, if you are implying SL is a hog from inefficiency, I disagree.

SL IS a hog from inefficiency, but not on the part of LL, sort of... It's due to merchants and people using 1024 x 1024 textures for things like buttons, doorknobs, jewelry, nails and eyes. There is no need for it. If it's so small you have to cam in tight to see it, use a 256... or even a 128.

  • Like 3
  • Thanks 1

9 hours ago, Drake1 Nightfire said:

SL IS a hog from inefficiency, but not on the part of LL, sort of... It's due to merchants and people using 1024 x 1024 textures for things like buttons, doorknobs, jewelry, nails and eyes. There is no need for it. If it's so small you have to cam in tight to see it, use a 256... or even a 128.

This has absolutely nothing to do with the framerate of SL; you would just see these massive textures load slowly. An uncompressed 1024x1024 gradient texture is 4 MB, and you will never see uncompressed textures, nor are they all rainbow gradients (which would create the max file size). Most are going to be 100-300 KB each.

So even if you have literally 10,000 of these textures visible, that's only 3 GB of memory used for textures. The regular viewer caps at 512 MB of texture memory because you're rarely going to see more than 1 GB of textures in one place anyway. Even if it were all stored in GPU VRAM (which it isn't), nobody should be using less than a 1 GB video card in 2018 anyway.
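For concreteness, here is the arithmetic behind those figures as a quick Python sketch; the 300 KB average is the estimate from this post, not a measured value:

```python
# Back-of-the-envelope for the totals claimed above.
textures_visible = 10_000
avg_compressed_bytes = 300 * 1024              # ~300 KB per texture, as estimated above

total_bytes = textures_visible * avg_compressed_bytes
print(f"{total_bytes / 2**30:.2f} GiB")        # ~2.86 GiB, i.e. roughly the "3 GB" quoted
```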

TL;DR: no, large textures do not affect performance in SL.

This is a massive misconception by people who don't know what they're talking about, trying to find something to blame for SL's terrible performance. Plain and simple, the game is not utilizing the hardware of modern PCs well, if at all. High-end GPUs sit in this game at sub-30-50% usage, and people go "huh, I wonder why that could be." It's because the majority of mesh object rendering is still handled by the CPU, almost entirely; pretty much everything but textures. Shadows, the framing, lighting as a whole, bump mapping and shiny/specular effects: it's all CPU-bound, and to top it all off, SL's engine doesn't like to use multiple cores/threads very much. The GPU is really just handling textures and some of the more finicky math related to object placement and alignment.

 

  • Confused 2

On 15 October 2018 at 1:07 AM, cykarushb said:

An uncompressed 1024x1024 gradient texture is 4 MB, and you will never see uncompressed textures, nor are they all rainbow gradients (which would create the max file size). Most are going to be 100-300 KB each.

So even if you have literally 10,000 of these textures visible, that's only 3 GB of memory used for textures

File formats for images (compressed or uncompressed) are ONLY relevant for STORAGE of images, say on a hard disk, or for transmission of images.

When textures are loaded into memory for display in an image viewing app, or used in a render engine, they get UNCOMPRESSED into BITMAPS, so a 1024 x 1024 RGBA image would be 4 MB, REGARDLESS of what the image actually is or how complicated it might be.
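To make that explicit, the decoded size is just dimensions times bytes per pixel; a minimal sketch:

```python
# Decoded ("bitmap") size in memory depends only on dimensions and bit depth,
# never on how visually complex the image is.
def raw_bitmap_bytes(width: int, height: int, bytes_per_pixel: int) -> int:
    return width * height * bytes_per_pixel

print(raw_bitmap_bytes(1024, 1024, 4) / 2**20)  # 32-bit RGBA: 4.0 MiB
print(raw_bitmap_bytes(1024, 1024, 3) / 2**20)  # 24-bit RGB:  3.0 MiB
```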

On 15 October 2018 at 1:07 AM, cykarushb said:

people who don't know what they're talking about

Like the people who assume that graphics cards' GPUs are hardwired to understand the thousands of different image file formats out there, instead of leaving that to software to decode and dump on them in a bitmapped format?
 

  • Haha 1

2 hours ago, Klytyna said:

File formats for images (compressed or uncompressed) are ONLY relevant for STORAGE of images, say on a hard disk, or for transmission of images.

When textures are loaded into memory for display in an image viewing app, or used in a render engine, they get UNCOMPRESSED into BITMAPS, so a 1024 x 1024 RGBA image would be 4 MB, REGARDLESS of what the image actually is or how complicated it might be.
 

That's just not how any of that works. You could've done a literal 10-second Google search and read even a single sentence to get more accurate information than this.

Before I explain the rest of that: this kind of posting (big words, bold font, argumentative demeanor, and the confused reaction) is why I state "people who don't know what they're talking about". It's seriously harmful to these forums when people who pretend to know what they're saying try to answer the questions of those who genuinely need help. Like the time a certain someone pulled up a JIRA report from 2007 about a mobile ATI graphics chipset having problems when someone said their newer laptop was showing graphical artifacting...

Anyway, textures remain compressed in SL; in fact, I can't really think of many things that would use uncompressed textures, outside of some specific professional programs and, I guess, any 3D modeling software. There isn't any reason to uncompress them. Compressed images don't show a drastic reduction in quality, and they're much easier to manage for a game that stores a lot of them. They also don't become bitmap files; they're not converted, and the texture you see remains in the format the file originated as. The GPU in this game doesn't do much, but it's definitely not spending its time uncompressing textures, which is a hugely CPU-bound task, not by designation but in pure efficiency. The GPU is terrible at compression tasks. You can actually test this yourself with 7-Zip's benchmark tool if you don't believe me. Compression methods rely on CPU clock speed; lots of weaker cores running at a lower speed (i.e., a GPU) don't make for very good compression/decompression.

And just because a file is uncompressed does not mean it's always going to be the same file size regardless of its content. The more complicated the image, meaning the more colors that differ from one another and are not next to each other, the more complex the image data and the larger the file. The most complicated image is a 24-bit RGB gradient, where every pixel is a different color. In regular old PNG format, that gives you your 4 MB image. Hell, make it a TIFF if you want to be certain it's a pain to compress.

The less complex the image, the smaller the file size. An entirely black or entirely white image is going to be drastically smaller: with image metadata, in PNG format, under 50 KB. Less if you can strip the metadata.

Remove all pixels and all metadata and you can have a 1024x1024 image that's under 1 KB. I have demonstrations of these images if you would like, though I cannot post an entirely blank PNG with no metadata because it causes most web browsers to crash out.
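The content-dependence described above is easy to reproduce; a minimal sketch (assuming Pillow and NumPy are installed) comparing an all-black PNG against incompressible noise at the same 1024x1024 dimensions:

```python
import io

import numpy as np
from PIL import Image

def png_size(pixels: np.ndarray) -> int:
    """Encode an HxWx3 uint8 array as a PNG and return the encoded byte count."""
    buf = io.BytesIO()
    Image.fromarray(pixels).save(buf, format="PNG")
    return buf.getbuffer().nbytes

black = np.zeros((1024, 1024, 3), dtype=np.uint8)
noise = np.random.randint(0, 256, (1024, 1024, 3), dtype=np.uint8)

print(png_size(black))  # lossless compression collapses flat color to a few KB at most
print(png_size(noise))  # random pixels barely compress: roughly the 3 MiB raw size
```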

 


Demonstration here with compressed images: a PNG of a 1024x1024 section of the sRGB hued colorspace. This is a 214 KB image. Note that I can't really make it any more than 750x750, so to avoid spamming up the place it's going to be shown at 300x300. The image is here: https://stuff.mit.edu/~kenta/one/color-lab/dwvcvceg/manylab.png

[image: manylab.png]

This image cannot be compressed further without losing image quality, because every pixel is a different color. Even the spots that look entirely white or entirely black to the naked eye are actually different colors. The only places with runs of identical pixels are the very top and very bottom single rows, which are entirely black and entirely white.

If we allow some loss of image quality, basic compression brings it down to 164 KB, but it's noticeable with a gradient like this. I'm leaving this one at 750x750 to show that.

[image: manylab.png, recompressed to 164 KB]

This is an entirely black 1024x1024 PNG. It is 14 KB.

[image: black.png]

This is the same black PNG compressed down to 250 bytes.

[image: black-min.png]

Chances are you are never going to see an uncompressed image used for anything in SL, or most games for that matter, since most of the time the textures you see aren't color gradients. For a more realistic scenario, here's a 1024 texture of some wood.

[image: wood1.jpg]

This is a basic texture; it's 284 KB.

Compressing that down to 229 KB brings us this result.

[image: wood1-min.jpg]

They look identical.

Let's compress it down even more, to 189 KB.

[image: wood1-min.jpeg]

Looks the same.

 

 


And for further clarification on how any 1024x1024 image would become 4 MB uncompressed, and how it has nothing to do with color...

Here's a screenshot from SL as both a JPEG and a 24-bit BMP:

[image: 10241.jpg, the screenshot as JPEG]

[image: 1.png, file properties showing size and size on disk]

 

Definitely larger, but definitely not its theoretical max of 4 MB, because of all the color consistency that's there, and definitely not what your PC is handling hundreds of at a time.

  • Confused 1

2 hours ago, cykarushb said:

It's seriously harmful to these forums when people who pretend to know what they're saying try to answer the questions of those who genuinely need help. Like the time a certain someone pulled up a JIRA report from 2007 about a mobile ATI graphics chipset having problems when someone said their newer laptop was showing graphical artifacting...

Nice attempted "smear" attack there... A certain someone...

Certainly wasn't me...

2 hours ago, cykarushb said:

Anyway, textures remain compressed in SL

They remain in JPEG2000 format on the asset servers, in the CDN delivery system, and essentially in the disk cache, it's true, but... GPUs don't understand JPEG2000 as a rule; they deal in video-memory dump format, aka bitmaps.

Literally, a dump of the memory, with a header to specify the pixel resolution and colour bit depth.

2 hours ago, cykarushb said:

Compressed images don't show a drastic reduction in quality, and they're much easier to manage for a game that stores a lot of them.

The operative words there being "manage" and "store"...

2 hours ago, cykarushb said:

The GPU in this game doesn't do much, but it's definitely not spending its time uncompressing textures

We know; in fact, that was kind of my point... GPUs do not deal in file formats; they deal in bitmap image data, passed to them by the software running on the CPU...

2 hours ago, cykarushb said:

And just because a file is uncompressed does not mean it's always going to be the same file size regardless of its content. The more complicated the image, meaning the more colors that differ from one another and are not next to each other, the more complex the image data and the larger the file.

Again, the operative phrase there is "file size", which only matters when dealing with non-bitmap file formats.

I'd really like you to show us a 1024 x 1024 24-bit RGB BITMAP that isn't 3 MB... Really, see if you can find one...

No cheating by using 8-bit or anything: a 24-bit, 1024 x 1024 bitmap image.
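Taking that challenge literally, a sketch (again assuming Pillow and NumPy) showing that a 24-bit BMP is a fixed-size container, so content never changes its size:

```python
import io

import numpy as np
from PIL import Image

def bmp_size(pixels: np.ndarray) -> int:
    """Encode an HxWx3 uint8 array as a 24-bit BMP and return the byte count."""
    buf = io.BytesIO()
    Image.fromarray(pixels).save(buf, format="BMP")
    return buf.getbuffer().nbytes

black = np.zeros((1024, 1024, 3), dtype=np.uint8)
noise = np.random.randint(0, 256, (1024, 1024, 3), dtype=np.uint8)

# Both prints show the same number: 1024*1024*3 bytes of pixel data plus a
# small fixed header, regardless of what the pixels contain.
print(bmp_size(black))
print(bmp_size(noise))
```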

2 hours ago, cykarushb said:

[image: 1.png, file properties showing size and size on disk]

Oh, and FYI... the "size" there is the actual size of the image file, to the EOF marker, compared to the "size on disk", which is the amount of space in the whole number of hard-disk blocks used to store the file.

It has NOTHING at all to do, directly, with the memory size of the image when it's being used or displayed on your screen by your gfx card.

For some self-proclaimed "Tech-Guru", you are using some poor image diagnostic tools.

[image: Whowaswrong.jpg, image diagnostic tool showing "current memory size"]

Note here that the "current memory size" for a PNG file is the amount of memory it uses when unpacked for display on a screen by your GPU...

3 hours ago, cykarushb said:

this is why I state "people who don't know what they're talking about"

/me coughs...

3 hours ago, cykarushb said:

You could've done a literal 10-second Google search and read even a single sentence to get more accurate information than this.

Take your own advice...

...

If we want to know whether the $2500 GeekTech 9000-series GFX card is 0.000257% faster on direct-draw calls than the $2250 CyberFool 8670-series... we'll call you... That's the sort of tech talk you're good at...


 

  • Thanks 1

  • 3 weeks later...
On 1/28/2018 at 4:52 PM, cykarushb said:

Just wondering, what are your preferences for the LL viewer vs Firestorm? Is there any performance benefit for you and your hardware configuration?

Originally, Firestorm was first to 64-bit, so that it could use more of the computer's memory, whereas Linden Lab's official viewer was 32-bit for the longest time. Not sure I can run the Linden Lab 64-bit viewer; too many blockers in it. So if the official viewer does not work, there are alternatives. If you want to play on mainland and mega sims, it is probably going to exceed double the memory 32-bit can handle, at least.

I haven't used the official viewer in a long time, but I heard it was locked to 512 MB of VRAM. Pretty deprecated and dilapidated that the official viewer doesn't support video cards newer than 2010.

2010's 400-series cards have 1 GB+ of memory:

https://en.wikipedia.org/wiki/GeForce_400_series

Firestorm is capped at a deprecated, dilapidated 2 GB and doesn't support any video cards newer than 2013:

https://en.wikipedia.org/wiki/GeForce_700_series

Firestorm just supports a computer better by allowing it to use 4x the memory, so performance and stability are far greater than the official viewer; hence why I do not return to, install, touch or look at the official release anymore. Pointless lost cause.

For my configuration, both viewers gimp my performance with the artificial restrictions in place, with no option or ability to change the settings already in place.

Why all the effort from Linden Lab to still support 32-bit viewers side by side with 64-bit, when AMD and Nvidia no longer support 32-bit?

AMD

https://www.tomshardware.com/news/amd-drops-32-bit-support-radeon-drivers,37977.html

https://www.google.ca/search?ei=RATjW8PDD-XujwSiwqPIDw&q=does+amd+support+32+bit&oq=amd+support+32

Nvidia

https://www.engadget.com/2018/04/09/nvidia-32-bit-os-support-drivers/

https://nvidia.custhelp.com/app/answers/detail/a_id/4604/~/support-plan-for-32-bit-and-64-bit-operating-systems

https://www.google.ca/search?ei=YwTjW7bLHqXjjwSW0Z6oCA&q=does+nvidia+support+32+bit&oq=does+nvidia+support+32

  • Haha 1
  • Confused 2

35 minutes ago, iceing Braveheart said:

i haven't used the official viewer in a long time

Which, to me, explains why your post is more off than on with regard to the OP's question. One example: why does Microsoft still support 32-bit apps in Windows 10? (Rhetorical question, there.)

Firestorm is a really good viewer. But so is Catznip, Kokua, and yes: the official LL viewer. As for the video memory "limit": I've seen someone much smarter than I explain how that's used, and it's nothing like what you allude to.

My recommendation to anyone and everyone: if you want to know the differences between viewers and their features and performance, your best option is to download them and try them out. The performance will vary *wildly* between you and anyone else, as your specific computer, configuration, Internet connection and many other things specific to *you* will all play a role.

Only *you* can decide which viewer is best for you. I have chosen a third-party viewer myself, and I've decided it's not Firestorm for me. You need to make your own decision. All of them are awesome viewers.

Edited by Alyona Su
  • Like 4

4 hours ago, Alyona Su said:

My recommendation to anyone and everyone: if you want to know the differences between viewers and their features and performance, your best option is to download them and try them out. The performance will vary *wildly* between you and anyone else, as your specific computer, configuration, Internet connection and many other things specific to *you* will all play a role.

Only *you* can decide which viewer is best for you.

This.

 

For me, it's Firestorm, hands down. It actually runs cooler, GPU-temperature-wise, than some of the games I play, such as Empyrion: Galactic Survival, which can run up to 80C in the summer. Firestorm runs SL at around 62C, which is about what Sims 4 runs at.

And just for comparison:

ASUS M51BC Series
Windows 10 Home 64bit
AMD FX(tm)-8300 Eight-Core Processor, 3.3 GHz, 4 Core(s), 8 Logical Processor(s)
RAM 24 GB
ASUS GeForce GTX 1050 OC 2GB

 


On 11/7/2018 at 11:09 AM, Alyona Su said:

Firestorm is a really good viewer. But so is Catznip,

OK, so I looked up Catznip.

Does this mean that I get 3760 MB (3.7 GB) of VRAM, instead of Firestorm's 2 GB or Linden Lab's 512 MB?

https://ibb.co/nFJboV

The page here says:

 so even though your card might have 2Gb or more of RAM the slider is conservatively capped at 1/3rd of total.

http://catznip.com/index.php/Catznip_R12_Release_Notes#Increased_texture_memory

3.7 GB is really weak, but still a lot more than Firestorm: 1.7 GB more, almost 100% more, double Firestorm's memory, and worlds more than the Second Life official viewer.
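(Going by the release note quoted above, the cap is simply a third of detected VRAM; a one-line sketch, where the detected figure is a guess that would reproduce the screenshot:)

```python
# Catznip R12: texture slider capped at 1/3 of total VRAM.
detected_vram_mb = 11_280       # hypothetical card; 11,280 / 3 matches the screenshot
print(detected_vram_mb // 3)    # 3760 MB
```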

 

Catznip update:

One viewer crash; the viewer evaporated.

FPS feels high, but it doesn't display it like Firestorm does. Fewer toolbar buttons than Firestorm has, and the UI feels awful.

Using the Ultra preset, Catznip feels like garbage, like the official Linden Lab viewer does. I went to stores and nothing would render or focus in; I cannot use the viewer. Defective: Linden Lab official and Catznip are a blurry fuzzball when looking at the poster ads in stores, while Firestorm loads and renders instantly. Linden Lab and Catznip look, feel and run nearly identically, and they also feel like depth of field is set so high that nothing is capable of loading: you wait hours and hours and nothing loads, renders, focuses in or works. Catznip: trash, garbage. Can't say I didn't give it a chance. Is Catznip just a clone of the official viewer? All the exact same identical bugs from decades ago on the official viewer are present in Catznip's newest release. Catznip, 3.7 GB: nothing works. Linden Lab, 512 MB: nothing works. Firestorm, 2 GB: works. Guess these two viewers have endless, countless problems beyond just memory. Are Catznip and Linden Lab optimizing for cell phones? It kind of feels like it with how broken those two viewers are. When's the mobile reveal, like Blizzard just had, or has it already happened? Catznip has double the memory of Firestorm and so many problems that it makes you want to break your own keyboard over your leg and throw it through the monitor screen.

Edited by iceing Braveheart
  • Haha 3
  • Confused 2

9 hours ago, iceing Braveheart said:

Is Catznip just a clone of the official viewer? All the exact same identical bugs from decades ago on the official viewer are present in Catznip's newest release.

Catznip hasn't been updated since the beginning of the year, so yeah, it's getting a bit behind. Not decades behind, though, and since SL hasn't existed for multiple decades, I'll just assume this is the hyperbole of juvenile exuberance.

Also, you'll do yourself a favor by ignoring everything you think you know about video RAM. It is not the root of all your problems.

You also asked about the "mobile reveal." Some blog sleuthing will uncover that LL has indeed been hiring mobile engineers specifically for SL. I asked if they could supply any more details at the next Town Hall on the 15th, so we'll see if they take that question. I certainly hope they get a mobile platform viewer sooner rather than later. (Also VR, specifically cheap mobile VR, which appears to be all that will survive of the VR hardware market in a couple of years.)


11 hours ago, iceing Braveheart said:

OK, so I looked up Catznip.

Your diatribe forces me to be blunt and call out your post as nothing more than some kind of rage-rant.

Nobody cares about all the techy specs you describe. They really don't. They care about experience. If you want to be a "fanboi" of one particular brand, then be one; nothing wrong with that. But it's no reason to bash all the rest. It's not a race, there is no "winner"; there is only what works for you, and the rest is irrelevant.

So, since you gave Catznip such a deep review, I fully expect your next post to be one on Kokua or another, then followed by the others on the SL Third-Party Viewer list. If it's not forthcoming, then you are disingenuous and your post is nothing more than adolescent-sounding hot-air angst, which will speak volumes about you.

  • Like 1

21 hours ago, iceing Braveheart said:

OK, so I looked up Catznip.

Does this mean that I get 3760 MB (3.7 GB) of VRAM, instead of Firestorm's 2 GB or Linden Lab's 512 MB?

https://ibb.co/nFJboV

The page here says:

 so even though your card might have 2Gb or more of RAM the slider is conservatively capped at 1/3rd of total.

http://catznip.com/index.php/Catznip_R12_Release_Notes#Increased_texture_memory

3.7 GB is really weak, but still a lot more than Firestorm: 1.7 GB more, almost 100% more, double Firestorm's memory, and worlds more than the Second Life official viewer.

 

Do some searches on these forums, or even out on Google, and you'll find out why the viewers limit how much of your VRAM you can use. There really is a good reason for it. It has been explained in detail a few times here on the forums.

  • Like 1

Skipping the word diarrhea further up: I could explain that there is a good reason texture memory is limited. Like... because the texture pipeline also keeps copies of about twice the images held in VRAM in actual system memory (did somebody complain Firestorm is using so much system memory? Now what could be the reason for that? Anyone?)... Or maybe because things like framebuffers and vertex buffers and other things use VRAM? Or the pointless Windows desktop, or other applications daring to use VRAM as well? But I could just remove the limitation and then have Windows swap VRAM into system RAM, and with a bit of luck even to the swap file on disk in the worst case. OK, that would be slow as hell and people would start whining that everything is lagging even more, but you could use waaaaaaaaaaaaaaaaaaaaaay more texture memory, even more than the dedicated VRAM on the GPU. Even creators would love it; they could just plaster everything with 1024x1024 pixel textures! Woot!
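To put rough numbers on that point, a sketch; the roughly 2x system-RAM mirroring factor comes from the post above, and the remaining figures are illustrative guesses:

```python
# Illustrative only: why a modest texture slider already commits a lot of memory.
vram_total_mb = 2048          # hypothetical 2 GB card
texture_slider_mb = 1024      # what the user asks the viewer to spend on textures

mirrored_system_ram_mb = 2 * texture_slider_mb   # ~2x copies kept by the texture pipeline
other_vram_users_mb = 512                        # framebuffers, vertex buffers, desktop, other apps (guess)

headroom_mb = vram_total_mb - texture_slider_mb - other_vram_users_mb
print(f"system RAM held by texture copies: ~{mirrored_system_ram_mb} MB")
print(f"VRAM headroom left: {headroom_mb} MB")   # before Windows starts swapping to system RAM
```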

  • Like 3

24 minutes ago, iceing Braveheart said:

It would be nice to see Firestorm add a slider for memory that is actually useful and helpful rather than worthless. 16 GB would be a good start, as that is the current consumer-grade GPU on the market, which could become outdated at any time.

You obviously still haven't done any reading on WHY the setting is throttled. You really would not like life if the viewer were set to use all of your VRAM, or even 2/3 to 3/4 of it.

  • Like 3

1 hour ago, iceing Braveheart said:

It would be nice to see Firestorm add a slider for memory that is actually useful and helpful rather than worthless. 16 GB would be a good start, as that is the current consumer-grade GPU on the market, which could become outdated at any time.

16 GB VRAM... on a consumer GPU... in 2018... out of date soon... I almost spilled my drink! Let's check how much memory the current "mainstream consumer GPU", the Nvidia RTX 2080 Ti, has for a reasonably cheap 1300 Euros: https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080-ti/

11 GB, wow! The first 5 GB probably already vanished towards its successor...

Edited by Ansariel Hiller
Changed link to en-us version

23 hours ago, Ansariel Hiller said:

16 GB VRAM... on a consumer GPU... in 2018... out of date soon... I almost spilled my drink! Let's check how much memory the current "mainstream consumer GPU", the Nvidia RTX 2080 Ti, has for a reasonably cheap 1300 Euros: https://www.nvidia.com/en-us/geforce/graphics-cards/rtx-2080-ti/

11 GB, wow! The first 5 GB probably already vanished towards its successor...

Nvidia is not the only manufacturer of GPUs:

https://www.amd.com/en/graphics/workstations-radeon-pro-vega-frontier-edition

More moron than tech-savvy, this forum is.

  • Haha 1

1 hour ago, iceing Braveheart said:

Nvidia is not the only manufacturer of GPUs:

https://www.amd.com/en/graphics/workstations-radeon-pro-vega-frontier-edition

More moron than tech-savvy, this forum is.

That's not a consumer GPU in the slightest.
Note how much video memory is actually used by most games: https://www.tweaktown.com/tweakipedia/90/much-vram-need-1080p-1440p-4k-aa-enabled/index.html

Which is next to nothing; cards these days have way more video memory than they actually need. Most of that is future-proofing, but some is definitely about "big numbers" to make a product more appealing, like why the RX 480/580 has 8 GB variants: nothing a 480/580 would normally be capable of would really be able to use that much video memory.

On 11/10/2018 at 1:30 PM, iceing Braveheart said:

Thus far Firestorm looks better but doesn't exactly outperform them all; Firestorm was completely and utterly destroyed performance-wise by Catznip with the extra memory, though the rendering bugs may have helped Catznip as well; can't say for sure, as they do not update. Firestorm: it is not 2010 anymore, and I don't have hardware that old anymore.

I have hardware that old and older, if you want to see SL run on hardware that's limited by having under 512 MB of video memory. I actually played SL on its minimum and recommended requirements and found that the 896 MB GTX 275 (the recommended GPU according to LL's requirements) was perfectly capable of playing SL with no issues. Fantastic framerates? Nah, but very manageable with some settings tweaked. And I did this on a dual-core from 2009. I could try this on a modern PC to entirely eliminate any CPU bottleneck (because SL is extremely CPU-bound), if anyone would like to see how far back in PCIe GPUs you can go before video memory and the sheer performance of the card become a problem.

 

 

Edited by cykarushb

14 hours ago, cykarushb said:

 

I have hardware that old and older, if you want to see SL run on hardware that's limited by having under 512 MB of video memory. I actually played SL on its minimum and recommended requirements and found that the 896 MB GTX 275 (the recommended GPU according to LL's requirements) was perfectly capable of playing SL with no issues. Fantastic framerates? Nah, but very manageable with some settings tweaked. And I did this on a dual-core from 2009. I could try this on a modern PC to entirely eliminate any CPU bottleneck (because SL is extremely CPU-bound), if anyone would like to see how far back in PCIe GPUs you can go before video memory and the sheer performance of the card become a problem.

 

 

Yeah, you have to really turn down the settings on Firestorm, because Firestorm was designed with only its creator's personal computer in mind, or FPS and performance really tank hard sideways if you try to turn anything on or up graphically. Firestorm is like using someone's personal private mod that wasn't designed for public release or public consumption with a wide variety of hardware in mind, i.e. better and newer hardware than what the creator of the software had at the time of making it.

Because Firestorm is artificially restricted to the developer's personal computer and Catznip is not, on Catznip you can really crank up everything and max out all those settings without losing FPS or breaking a sweat performance-wise with newer and better hardware.

 

I never noticed Second Life utilize the CPU: an AMD Phenom II 1100T and an Intel i7 4770K ran Second Life identically performance-wise with an Nvidia 780, so it definitely wasn't CPU-bound back then.

However, Guild Wars 2 with the AMD 1100T and Nvidia 780 could not maintain 30 FPS, while the Intel i7 4770K with the Nvidia 780 could achieve over 150-200 FPS.

  • Haha 3
  • Confused 2
