Searching for reviews regarding the behavior of AMD 7000 series graphics cards (7800 XT, 7900 XT, 7900 XTX) in Second Life


You are about to reply to a thread that has been inactive for 112 days.

Please take a moment to consider if this thread is worth bumping.


23 hours ago, Henri Beauchamp said:

I'm more under the impression that you are seeking just one favourable testimony to use as an excuse to follow your personal feeling/belief that an AMD card would be better suited for you...

No, not at all, it's just that I never base my choice on a single opinion.

23 hours ago, Henri Beauchamp said:

Just go ahead, buy whatever suits your own needs/preferences, and take responsibility for it. Just don't come back here to complain that ”we” gave you bad advice, should you find out you made a mistake. 😜

I'm counting on it but with full knowledge of the facts 🙂

23 hours ago, Henri Beauchamp said:

As for graphics card prices, it might be wiser/smarter to wait a little bit: NVIDIA's cards are already seeing price adjustments as a result of AMD's newest card releases (competition is a Good Thing™). It will take some time to propagate to France (but you could just as well buy from a more reactive German supplier), and prices are going to drop a bit in the coming weeks. The second half of October is usually a good moment to buy computer hardware (long enough after people return from Summer vacations, soon enough before Christmas). There is also the option of waiting for a sale/opportunity on the previous card generation (even an RTX 3070 is plenty powerful enough for SLing).

You're right, it is better to wait, because Nvidia is worried about the catastrophic sales of its 4070 and especially about the release of its most dangerous competitor, the 7800 XT. Here in France the cheapest RTX 4070 is €649 (a good €100 too expensive for this card). At AMD it's barely better: the 7800 XT, which was €570 on its first day of sale, is now at €600.

Edited by kyte Lanley

On 9/17/2023 at 2:05 PM, Wulfie Reanimator said:

I haven't had any SL related crashes with the 7900 XTX. I can say the same about my previous RX 6600.

If you can afford the 7900 XTX, buy it.

Thank you for your testimony, Wulfie. I know I'm asking a little too much of you, but have you had the opportunity to compare the performance of your AMD cards with Nvidia's?

Edited by kyte Lanley

Recently I have owned an RTX 3080, a 7900 XTX and an RTX 4090. The 7900 XTX is a great card for Vulkan and DirectX, but in OpenGL, which you need for Second Life, it is the worst of the three performance-wise (for SL). It is stable, but if SL is your primary focus for your computer, you are much better served by Nvidia. In OpenGL the 7900 XTX was 20% lower in fps than the 3080 and 40% lower than the 4090 (I'm using a 5900X, so with a current-gen 13900K or 7900X it would do much better in SL).


11 hours ago, kyte Lanley said:

Very interesting. This means the 7900 XTX performs worse than an RTX 4070 (which is roughly equivalent to the 3080) in Second Life. I think I can give you the Oscar for most informative testimony 😄.

If you are not doing ray-traced AAA gaming, OpenGL, or GPU compute tasks (Blender, hardware video encodes), the 7900 XTX is a good value and will perform similarly to a 4080/4090 at better pricing. In ray tracing, OpenGL and GPU compute, Nvidia is far ahead of AMD. I'm not saying the 7900 XTX is a bad card; I'm just pointing out the differences.


Your testimonies are very interesting; it's nice to see that my call for more feedback from AMD card owners has been heard.

It appears that for all games (except Second Life) AMD has better performance (stronger rasterization) but lacks optimization on the driver side.

Concerning Second Life, you all completely agree that Nvidia cards perform much better than AMD ones.

It's a shame, because AMD cards are cheaper and have more VRAM (16 GB on the 7800 XT compared to 12 GB on the RTX 4070).

Keep the feedback coming; I'm interested.


I'm not sure it's SL-specific, really; it's more of an OpenGL thing, and the fact that AMD just doesn't seem to care much about OpenGL beyond providing compatibility.

That makes sense given you won't find many (or any) games using it these days. Of course, SL is not even a good test of a graphics card in general: zero optimization means all you can ever do is throw horsepower at it, and even then you'll find the latest and greatest 500 W rig/space heater chugging below 30 FPS in a suitably busy scene.

Nvidia just holds the crown for the most mature, stable and best-performing OpenGL implementation, but I can't really blame AMD for not caring much, since OpenGL is a pretty irrelevant thing these days, especially in the world of gaming/consumer graphics hardware.

 

 


Unity, the engine being used for the mobile client, can also build DirectX versions for Windows and Vulkan/Metal versions for Linux and macOS, so I think OpenGL might not be relevant for Second Life forever. But buying anything on a future promise is also a terrible idea. Regardless, I agree with Amelia: OpenGL isn't really all that relevant anymore, having been replaced by Vulkan, Metal, and DirectX. SL's problem is more that it is a 15-year-old game engine using outdated technology, and the question is which graphics card handles that outdated stuff better.

Personally I think we'll see the 7800 XT hit $350 to $450 by April 2024. I'm going to wait, but I use Linux, and the open source AMD drivers are extremely good. The graphics card market is really messed up right now, i.e. everything is overpriced and still selling out very quickly.


On 9/20/2023 at 5:46 PM, Flea Yatsenko said:

Unity, the viewer being used for the mobile client, can also make DirectX Windows and Vulkan/Metal versions for Linux and OSX. [...]

Curious: how does Nvidia's much-maligned (rightly) Linux driver perform? Is it as good as on Windows?

I definitely feel something has to give with SL, though. An engine update is more than overdue, and we're already seeing a pretty unacceptable difference in performance between brands, despite roughly equal hardware, due to this reliance on what is effectively a dead graphics API. The counter to this is that you don't fix what isn't broken, but how long can we be sure OpenGL will remain relevant enough that both Nvidia and AMD bother including compatibility? Probably a long time to come, but it does seem like something that will disappear eventually.

 

Edited by AmeliaJ08

11 hours ago, AmeliaJ08 said:

Curious: how does Nvidia's much-maligned (rightly) Linux driver perform? Is it as good as on Windows?

Maligned rightly?... Only by stupid people, I'm afraid...

The proprietary NVIDIA drivers under Linux work beautifully (and around 10% faster than under Windows), with first-class, super-long-term support: all the bugs I reported to NVIDIA in the 19+ years I have been using their drivers have been addressed, most of them quite promptly (first class indeed, especially compared with the AMD and ATI cards I owned in the distant past, whose Linux support was abysmal), and today I can still run my old GTX 460 (a 13-year-old card!) with the latest Linux LTS kernels and the latest Xorg version.

They are also super-stable, and adhere strictly to the OpenGL specs.

The Vulkan drivers and the CUDA stack are great too (with CUDA much faster, and often better supported under Linux, than OpenCL: e.g. in Blender, which only recently started implementing OpenCL support when CUDA had been supported for years).

It should also be noted that NVIDIA open-sourced their drivers for their recent GPUs, and that while AMD and Intel contribute (or used to contribute) more open source to Linux, they still rely on the Mesa folks for their Linux drivers (meaning lower performance than a closed-source driver, because the Mesa maintainers do not have access to all the secret architectural details of the GPUs), and you still need closed-source software ”blobs” to run their GPUs under Linux...

Edited by Henri Beauchamp
Typos

12 hours ago, Henri Beauchamp said:

Maligned rightly ?... Only by stupid people, I'm afraid... [...]

Good lord, so Linus Torvalds is stupid according to you? And to this day the Nvidia driver is full of missing and broken features, even on the most recent GPUs.

Settle down and answer the simple question about GL performance, since I was curious, if you can; but don't call people stupid.

 

 

 


1 hour ago, AmeliaJ08 said:

Good lord, Linus Torvalds is stupid according to you

When he is giving the finger, yes, he definitely looks like the stupidest man in the world... Linus Torvalds is no god, and while quite intelligent, he can also be totally stupid at times, like everyone (us included): giving people the finger, for whatever reason, is one of the stupidest and most pointless things to do (and will likely achieve the exact opposite of what the person giving it would expect/hope)!

1 hour ago, AmeliaJ08 said:

to this day the Nvidia driver is full of missing and broken features even on the most recent GPUs

Oh, and what would those be, please?... I have been using NVIDIA cards and their proprietary drivers for over 19 years (my first NVIDIA card was a 6600GT), and I have never missed a single feature!

1 hour ago, AmeliaJ08 said:

Settle down

Settle down yourself, pretty please... I am not the one spreading FUD...

1 hour ago, AmeliaJ08 said:

and answer the simple question about GL performance

I already answered this question, but of course, if you only read the first sentence of my previous post, you missed it... Read again: it was in the second sentence... 🫣

Edited by Henri Beauchamp

3 hours ago, AmeliaJ08 said:

Settle down and answer the simple question about GL performance
Here are some quick benchmarks comparing OpenGL performance (not for Blender, of course) in Linux and Windows, using Valley (I'm not running Windows 8, Valley is just that old! :P), Superposition and the Blender Benchmark. I ran Superposition 3 times in Windows, but the result is so far off (almost half of the Linux result) that I don't trust it. I also included an old 7900 XTX Superposition Linux benchmark for comparison.

[Screenshots: Blender Benchmark, Valley and Superposition results under Linux and Windows, plus an older 7900 XTX Superposition run under Linux]

Edited by MarissaOrloff

22 minutes ago, MarissaOrloff said:

I ran Superposition 3 times in Windows but the result is so far off (almost half of the Linux result) I don't trust it.

Unigine Superposition is far from optimized for OpenGL... You'd get better results under Windows with DirectX than under Linux with OpenGL, even though Windows' OpenGL performance with it is indeed abysmal. So yes, better not to trust its OpenGL results too much.

The Valley results are however perfectly in line with what I get with the viewer: around +10% fps in favour of Linux.

22 minutes ago, MarissaOrloff said:

I'm not running Windows 8, Valley is just that old!

In fact, you'd get better results with Windows 7/8 (less overhead than Win10 or Win11)... The problem being that you won't find valid drivers for it with such a modern GPU...

Edited by Henri Beauchamp

On 9/21/2023 at 4:35 PM, AmeliaJ08 said:

Curious how does Nvidia's much maligned (rightly) Linux driver perform? is it as good as on Windows? [...]

Any time you introduce closed-source binary packages from a third party into a Linux build, whatever libraries that proprietary package was built against can conflict and cause problems. I had an Nvidia card a long time ago. This has probably been fixed, but what would happen is that Xorg would update before the graphics drivers, and the drivers would break because they didn't support that version of Xorg.

Also, once your card gets dropped from support, you have to stay on the same version of Xorg or switch to the open source drivers.

I built a 24-core server board a while ago for baking. It had an ATI Rage Pro 128. The open source Radeon drivers worked out of the box (but it was so slow you wouldn't have known unless you checked glxinfo, lol).

People complain about the Nvidia driver even though it's the better-performing, better-working driver compared to nouveau. It's just never fun to add proprietary software to an open source ecosystem, unless you're running it in a container or something; and the video driver is so embedded into crucial parts of your desktop environment that it gets difficult. I haven't had a Linux and Nvidia build in a very long time, and things have probably changed a lot. But I remember many system upgrades ending with broken 3D, and having to roll back packages until Nvidia updated their driver.
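Flea's glxinfo tip is worth automating: the vendor/renderer strings are the quickest way to tell whether you are actually on the hardware driver or on a software fallback like llvmpipe. A small Python sketch, assuming `glxinfo` (from mesa-utils) is installed and on the PATH:

```python
import re
import subprocess

def parse_glxinfo(text):
    """Pull the vendor/renderer/version strings out of glxinfo output."""
    info = {}
    for key in ("OpenGL vendor string",
                "OpenGL renderer string",
                "OpenGL version string"):
        m = re.search(rf"^{re.escape(key)}: (.+)$", text, re.MULTILINE)
        if m:
            info[key.split()[1]] = m.group(1).strip()  # 'vendor', 'renderer', 'version'
    return info

def active_gl_driver():
    """Run glxinfo and report the active OpenGL implementation."""
    out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
    # 'llvmpipe' in the renderer string means 3D acceleration silently
    # fell back to software rendering, exactly the case described above.
    return parse_glxinfo(out)
```

If the renderer string names "llvmpipe" instead of your GPU, the hardware driver is not actually in use.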


5 hours ago, Flea Yatsenko said:

Any time you introduce binary packages into a Linux build from a third party that's not open source, whatever libraries that proprietary package was built against can conflict and cause problems. [...]

I have been using Nvidia with Linux since 2009, and I have never experienced anything like what you describe. The only issue I have had with Nvidia is that sometimes nvidia-dkms will skip over a new kernel I have compiled, requiring me to reinstall nvidia-dkms (a 30-second fix, if that). As for your opening point: sloppy programming is sloppy programming. It makes zero difference whether it's open source or proprietary; if the programmer releases bad code, it's going to behave badly.
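That "did DKMS actually build the module for this kernel?" situation can at least be detected automatically. A hedged Python sketch that checks whether `dkms status` lists the nvidia module as installed for the running kernel (the line format in the docstring is an assumption; it varies a little between DKMS versions):

```python
import platform
import re
import subprocess

def module_installed_for_kernel(dkms_output, module, kernel):
    """True if `dkms status` output shows `module` installed for `kernel`.

    Assumes lines shaped roughly like:
        nvidia/545.29.06, 6.6.8-arch1-1, x86_64: installed
    """
    pattern = rf"^{re.escape(module)}/[^,]+, {re.escape(kernel)}, [^:]+: installed"
    return bool(re.search(pattern, dkms_output, re.MULTILINE))

def nvidia_built_for_running_kernel():
    """Ask dkms whether the nvidia module was built for the booted kernel."""
    out = subprocess.run(["dkms", "status"], capture_output=True, text=True).stdout
    return module_installed_for_kernel(out, "nvidia", platform.release())
```

If this returns False after a kernel update, reinstalling nvidia-dkms is the likely 30-second remedy.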


On 9/23/2023 at 3:51 PM, Flea Yatsenko said:

Any time you introduce binary packages into a Linux build from a third party that's not open source, whatever libraries that proprietary package was built against can conflict and cause problems.

That is mostly FUD. The same is true for any third-party package, even open source ones. If the developers of an open source driver basically abandon it, or do not keep up with kernel development or compiler/library changes, it breaks after some time. The main point is that you need active maintainers.

You may have problems if you are on a rolling-release/bleeding-edge distro and have some bleeding-edge hardware, or some vintage hardware. That's totally true, but it's no different from the Windows experience, actually. Immature drivers are common on both.

The only bonus you get from "in-tree" open source drivers is the convenient fixing by the kernel people when they break the ABI again, or push more stuff behind the GPL curtain.

Once you have a working system, it tends to stay working for quite some time. Sure, if you intend to use hardware for 5-10 years, aiming for ultra-stable in-kernel drivers might be a thing. But a 5-10 year service life for "gaming" systems is not really a thing usually, and 2-3 years is usually far less of a problem.

 


15 hours ago, Kathrine Jansma said:

That is mostly FUD. The same is true for any third party packages, even if open source. [...]

 

This!

The only issue I have with the Nvidia drivers is getting them properly signed for Secure Boot when an update comes out. Ubuntu makes this process automatic, but occasionally I have to do it manually... which is also the case with some other distros, like Debian.
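Before digging into manual signing, it's easy to check whether you are even in that situation. A minimal Python sketch, assuming `mokutil` and `modinfo` are available (the quoted output strings are the documented ones, but treat this as a sketch rather than a turnkey check):

```python
import subprocess

def secure_boot_enabled(sb_state_output):
    """Interpret `mokutil --sb-state` output ("SecureBoot enabled"/"SecureBoot disabled")."""
    return "SecureBoot enabled" in sb_state_output

def nvidia_module_has_signer():
    """True if modinfo reports a signer for the nvidia module (empty if unsigned)."""
    signer = subprocess.run(["modinfo", "-F", "signer", "nvidia"],
                            capture_output=True, text=True).stdout.strip()
    return bool(signer)

def needs_manual_signing():
    sb = subprocess.run(["mokutil", "--sb-state"], capture_output=True, text=True).stdout
    # Secure Boot on + unsigned module = the manual-signing situation described above
    return secure_boot_enabled(sb) and not nvidia_module_has_signer()
```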


Some of the Linux "distributions" that I use are so "stable"* that they still do not support the ASPEED AST-2500 or AST-2600!

*Where "stable" release means feature-frozen, like it used to mean before all the users came along and ASSUMED it means "doesn't crash all the time."  Yes, I have been using Linux that long.  Remember when Linus joked on the mailing list that he was going to rename the kernel to "Instaboot?"

One of these days I would like to try an AMD card, but I think you folks have talked me out of it.  The primary use for my personal computer is running Linden Lab's Second Life Viewer, so I have purchased an Nvidia RTX 4080.  Yes, that's way more than is needed for SL.  I also purchased an Intel i9-13900K, which is also way more than is needed for SL; I would not recommend spending so much money just to run SL, as fewer fast CPU cores would work just as well most of the time.  I also bought 32 GB of DDR5-5600 memory.  I am hoping that, with 16 GB on the GPU's carrier card and 32 GB on the CPU's main board, I will not have to deal with insufficient-memory issues much.

People will ask, "What FPS do you get?"  I know that's one of the metrics we feel the most, with inter-frame jitter probably also in the top two, but really, FPS alone is meaningless as a comparison metric because you don't know what I am looking at.  "58 FPS in a full club" sounds nice, but which club?  Where?  When?  No, no!  Don't tell me!  I would like us to have a set of fixed benchmarks derived solely from Second Life Viewer rendering performance on an unchanging set of scenes.  Once upon a time there were some regions set aside for this, but either they are gone, or people can now go there and build as if they were sandboxes.  I'll close by saying I erred on the side of buying way more "power" than needed, but probably about the right amount of VRAM and DRAM for what's coming soon to Second Life.  Maybe further iteration will prove otherwise.
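The fixed-benchmark idea would also need agreed metrics. Given any capture of per-frame render times, the numbers we "feel" (average FPS, 1% lows, inter-frame jitter) are simple to compute; here's a generic Python sketch, not tied to any real viewer's log format:

```python
import statistics

def frame_stats(frame_times_ms):
    """Summarize a capture of per-frame render times (in milliseconds)."""
    slowest = sorted(frame_times_ms, reverse=True)
    one_pct = slowest[: max(1, len(slowest) // 100)]  # the slowest 1% of frames
    return {
        "avg_fps": 1000.0 / statistics.mean(frame_times_ms),
        "1%_low_fps": 1000.0 / statistics.mean(one_pct),  # the "feels like" floor
        "jitter_ms": statistics.pstdev(frame_times_ms),   # inter-frame jitter
    }
```

Two cards with the same average FPS can still feel very different if their 1% lows and jitter diverge, which is exactly why FPS alone is a poor comparison metric.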


Anyone using AMD's new cards also running into this issue:

https://gyazo.com/cb37ee682549c3d3c519d5f6288ae713

(VRAM isn't the whole 16 GB because other things need a bit as well. So yes, the VRAM filled up entirely here with just the viewer, no matter whether dynamic memory or the vanilla 2 GB option is chosen.)

The first value is dedicated VRAM. Randomly, Firestorm/SL sucks up all the VRAM and starts allocating into shared system memory, if set. It doesn't matter which driver, since last year (there was one version that ran stably, but it is so old now that new games refuse to run if you use it).

Fresh installation, DDU used... it happens sometimes, or not. Sometimes at events, or not; the timing is also random. 10 minutes? 2 hours? It happens more often when you visit places and, for some reason, Firestorm/the viewer of your choice decides not to evict unused data. I'm using a 6900 XT here, but I have read that others, even with 7000-series cards, run into this issue occasionally. (As said, it doesn't happen every time, but often enough that it gets annoying.) I never had this issue with my 1080 Ti, nor have I read about it on Nvidia cards so far.

So it's either a driver issue or a Firestorm-plus-driver issue... I can't pinpoint it. I would therefore think twice about going with an AMD card if you use SL a lot. For short visits it may work well, but over longer sessions it seems to run into a memory leak. Normally the VRAM hovers around 6 GB, but whenever the performance tanks to a third of the normal FPS and I look up the VRAM usage... voilà, the VRAM is full and the viewer needs a restart; otherwise you get a slideshow. (I'm talking 200+ FPS at my home down to 40...)

That was my experience with SL on one of the newer AMD cards so far. I have also talked to other people in SL itself who had similar issues with 6000 or 7000 cards.

Everything else runs just fine, in terms of other games. As said, if you only visit SL on occasion, go for them; price-to-performance is still good.
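To catch a leak like this in the act instead of waiting for the slideshow, the amdgpu kernel driver on Linux exposes VRAM counters through sysfs. A sketch under that assumption (the `card0` path may be `card1` etc. on your system, and there is no Windows equivalent of these files):

```python
from pathlib import Path

# amdgpu sysfs counters (raw byte counts); "card0" may differ on your system
VRAM_USED = Path("/sys/class/drm/card0/device/mem_info_vram_used")
VRAM_TOTAL = Path("/sys/class/drm/card0/device/mem_info_vram_total")

def to_gib(n_bytes):
    """Convert a raw byte count to GiB."""
    return n_bytes / (1024 ** 3)

def vram_usage():
    """Return (used_GiB, total_GiB) read from the amdgpu counters."""
    return to_gib(int(VRAM_USED.read_text())), to_gib(int(VRAM_TOTAL.read_text()))
```

Logging this once a minute alongside the viewer's FPS would show whether usage really climbs monotonically (a leak) or just spikes with scene content.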

Edited by Feuerblau
