Hey all,

In the market for a GPU, would like to use Bazzite, mostly a Steam user with some SteamVR (rare), and have run into nvidia issues with Bazzite and a 3070 previously.

With the recent news of Nvidia’s beta drivers and Plasma’s explicit sync support in beta, I’m newly on the fence about sticking with Nvidia rather than switching to AMD, given Nvidia’s better performance-to-cost ratio, the power usage (a big one for a compact living room system), and the fact that Nvidia already has HDMI 2.1 support, which AMD doesn’t have a solution for yet.

What are community thoughts? I’ll probably hold out for some reports on the new drivers regardless, but wanted to check in with the hive mind.

Thanks!

27 points

Just wait and see how good those drivers are first.

> AMD given nvidia having a better performance to cost ratio

When the fuck?

> and the fact that they have the potential for HDMI2.1 support which AMD doesn’t have a solution to yet.

An open-source solution already exists for Intel; it works via a translation layer between DisplayPort and HDMI. I imagine AMD will do the same thing.

24 points

Why even use HDMI when AMD does DisplayPort 2.0 while Nvidia only does 1.4?

I’d say Nvidia would work fine, but you need to take into account that the drivers can be a bit flaky.

1 point

A lot of displays don’t support DP, unfortunately. I have an LG C2, which is perfect for desktop use and one of the more affordable OLED screens out there, and it does not support DP. The PC-monitor equivalent that uses the same panel is made by Asus, but that one has a $600 markup.

1 point

There are converters these days, albeit with some minor quirks. And they typically only use DP 1.4 anyway, although that’s enough for perceptually lossless 4K@120Hz + HDR.

1 point

Do they support VRR though? Last I heard that was still an issue with these converters.

22 points

> and have run into nvidia issues with Bazzite and a 3070 previously.

As have I. And everyone else I know. Nvidia sucks on Linux because they don’t care.

> given nvidia having a better performance to cost ratio

Uhhhh no? Think you got that the wrong way round.

Regardless, I would pay a lot more just to avoid the issues of Nvidia on Linux. It’s just not something I want to deal with.

5 points

> given nvidia having a better performance to cost ratio

I actually agree. The 6700 XT, for example, was supposed to compete with the 3070, but instead it barely surpassed the 3060 Ti in real-world tests.

But I agree with your main point, and I’d trade that slight drop in performance per dollar for a better experience on Linux. I’m planning my exit strategy from Windows, and I’m still working on accepting that my Nvidia card just won’t feel as nice (until NVK is more mature).

6 points

An Nvidia 3070 costs $420 and benchmarks at 22,403 (53.34 benchmark points per dollar). An AMD 6800 costs $360 and benchmarks at 22,309 (61.97 benchmark points per dollar).

So you get a 0.4% drop in performance for a 14.3% drop in price. That is significantly more performance per dollar.

Or if you go with a 3070 Ti ($500, 23,661 points -> 47.32) vs a 7800 XT ($500, 23,998 points -> 47.97), you get a 1.4% performance increase for free (not really that significant, I know, but still, it’s free performance).

All of the numbers were taken from https://www.videocardbenchmark.net/high_end_gpus.html
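For what it’s worth, the ratios above check out. A quick sketch of the arithmetic (prices and scores are taken from this comment, not independently verified):

```python
# Recomputing the benchmark-points-per-dollar figures quoted above.
cards = {
    "RTX 3070": (420, 22403),  # (price in USD, benchmark score)
    "RX 6800": (360, 22309),
}
for name, (price, score) in cards.items():
    print(f"{name}: {score / price:.2f} points per dollar")
# RTX 3070: 53.34 points per dollar
# RX 6800: 61.97 points per dollar

# Relative differences between the two cards
perf_drop = (22403 - 22309) / 22403 * 100   # ~0.4% less performance
price_drop = (420 - 360) / 420 * 100        # ~14.3% lower price
print(f"{perf_drop:.1f}% perf drop for a {price_drop:.1f}% price drop")
```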

5 points

My numbers were taken from a comparison of real world performance via gameplay (FPS comparisons), not artificial benchmark scores, but those prices are still really good.

The 6800 is better than a 3070 in both artificial benchmarks and real-world ones, and the fact that it’s cheaper means it’s certainly the better option for performance per dollar (it lands somewhere between a 3070 Ti and a 3080).

Here’s an old graph I still have saved:

4 points

Nvidia sucks on Linux because they don’t care.

They really should care, because that market is growing, and the Steam Deck uses Linux. Not sure which GPU the Deck uses, but… it would be cool if they cared just a little bit.

5 points

Deck uses AMD

2 points

Right. So if Nvidia wants even a chance at being dominant in that market as handhelds start to use Linux, they’d better start caring.

18 points

> given nvidia having a better performance to cost ratio,

In what part of the world? I haven’t found that to be true.

> the power usage (big one for a compact living room system),

You might want to do some more homework in this area. I recall AMD having better performance/watt in the tests I read before buying, but it’s hard to declare a clear-cut winner, because it depends on the workloads you use and the specific cards you compare. AMD and Nvidia don’t have exactly equivalent models, so there’s going to be some mismatch in any comparison. In stock configurations, I think both brands were roughly in the same ballpark.

Departing from stock, some AMD users have been undervolting their cards, yielding significant power savings in exchange for slight performance loss. Since you’re planning a compact living room system, you might want to consider this. (I don’t know if Nvidia cards can do this at all, or whether their drivers allow it.)

Regardless of brand, you can also limit your frame rate to reduce power draw. I have saved 30-90 watts by doing this in various games. Not all of them benefit much from letting the GPU run as fast as it can.

> and the fact that they have the potential for HDMI2.1 support which AMD doesn’t have a solution to yet.

AMD cards do support HDMI 2.1. Did you mean Fixed Rate Link features, like variable refresh rate, or uncompressed 4K@120Hz? You’re not going to get that natively with any open-source GPU driver, because the HDMI Forum refuses to allow it. Most people with VRR computer displays use DisplayPort, which doesn’t have that problem (and is better than HDMI in nearly every other way as well). If you really need those FRL features on a TV, I have read that a good DisplayPort-to-HDMI adapter will deliver them.

Another thing to consider: How much VRAM is on the AMD card vs. the Nvidia card you’re considering? I’ve found that even if a card with less VRAM does fine with most games when it’s released, it can become a painful constraint over time, leading to the cost (and waste) of an early upgrade even if the GPU itself is still fast enough for the next generation of games.

I switched from Nvidia to AMD, and have not been sorry.

18 points

If you want a more open platform that works better on Linux and has better value, go AMD.


Linux Gaming

!linux_gaming@lemmy.world
