Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

247 points

Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

26 points

I wish they had started putting out AMD products. PowerColor just doesn't feel like a flagship partner the way EVGA was to Nvidia.

14 points

I would've actually switched to AMD if EVGA had.

25 points

Really?

74 points

Yup. It was something like 90% of their revenue, but 25% of their profit.

-62 points

And now they have 0 revenue and 0 profit.

138 points

Yep, it’s the RAM, but also just a mismatched value proposition.

I think it’s clear at this point Nvidia is trying to have it both ways and gamers are sick of it. They used pandemic shortage prices as an excuse to inflate their entire line’s prices, thinking they could just milk the “new normal” without having to change their plans.

But when you move the x070 series out of the mid-tier price bracket ($250-450, let's say), you'd better meet a more premium standard. Instead, they're throwing mid-tier RAM into a premium-priced product that most customers still feel should be mid-tier priced. It also doesn't help that it's at a time when people generally just have less disposable income.

129 points

GPUs haven’t been reasonably priced since the 1000 series.

And now there’s no coin mining promising some money back.

19 points

You mean Nvidia GPUs? I got my 6750XT for 500€, and I think it’s a good price for the performance I get.

58 points

That's still overpriced, I think, although much less egregious than what Nvidia is doing. Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250. A few years prior, the 4850 started at $200. Even the RX 480 started at only $230. And those were all very decent cards in their time.

5 points

I bought a GTX 780 for $500 MSRP circa 2013. I considered that crazy expensive at the time, but I was going all out on that system. Currently I run a GTX 1080 Ti (bought used) with 11GB of VRAM, and they want me to spend $600 for 1 more GB of VRAM? The PS5 has 16GB of shared memory; 16GB should be the entry level of VRAM for a system that's expected to keep up with this generation of graphics. There's no reason for Nvidia to do this other than to force users to upgrade sooner.

The funny part is the market is so fucked that reviewers are lauding this as a decent deal. I think the 1080 Ti will last me until OLED matures and I finally upgrade from my 1080p monitor. According to the Steam survey, most gamers are in a similar boat.

5 points

Yeah, I think it’s important not to lose perspective here and let expectations slide just because Nvidia are being more awful right now. Make no mistake, value went out the window a long time ago and AMD are also fucking us, just a little less hard than their main competitor. Even adjusting for inflation, what used to get you the top of the line now gets you last-gen midrange.

3 points

"Launch MSRP for the HD 7850, which was in the same category as the 6700 XT today (upper mid-tier), was $250."

There's much more effort involved in producing a modern GPU now. Either way, if Nvidia were truly greedy, they'd shut down the gaming GPU business right away and produce only AI accelerators. You could take the same 4070, add $200 worth of GDDR chips to the layout, and sell it for $15k minimum; that shit would be on backorder.

4 points

Yeah, right? I got my 6700 XT for just over $400 USD. It was a great deal.

2 points

Just got my brand new 6800 XT for $350, upgrading from a 970. Screw Nvidia.

2 points

That's a shit deal when the 4070 is €550.

18 points

The new mining is AI… TSMC is at max capacity. They're not going to waste too many wafers making gaming GPUs when AI accelerators are selling for $30k each.

4 points

ugh

96 points

Nvidia is overpricing their cards and limiting stock, acting like there's still a GPU shortage from all the crypto bros sucking everything up.

Right now, their competitors are beating them at hundreds of dollars below Nvidia's MSRP like for like, with the only true advantage Nvidia has being ray tracing and arguably VR.

It's possible we're approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.

TL;DR: Nvidia is trying to sell a card at twice its value because of greed.

35 points

They're beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding only for Nvidia (at first, anyway).

The raw performance is mostly there for AMD, with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.

6 points

And they beat AMD in efficiency! I'm (not) surprised that people ignore this important aspect, which matters for noise, heat, and power usage.

21 points

Tom's Hardware did a test; the RX 6800 is the leader there. The next card, the RTX 3070, is 4.3% worse. Are their newer cards more efficient than AMD's newer cards?

4 points

Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.

I don't think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.

0 points

And AI. They’re beating the pants off AMD at AI.

2 points

True enough. I was thinking more of the gaming use case. But even beyond AI, for general compute workloads they're beating the pants off AMD with CUDA as well.

16 points

Couldn't agree more! Abstracting to a general economic case: those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn't quite add up, @nvidia :)

Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility. Especially with the emergence of things like SYCL that help programmers avoid vendor lock-in.

86 points

My RTX 4060 Ti has 16GB of VRAM. What on earth makes them think people would go for 12GB?

37 points

Not being a power of 2 gives me displeasure.

13 points

It is in base 6.
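The base-6 reading, spelled out (a quick check in plain Python, nothing assumed beyond the standard library):

```python
# "12" read as a base-6 numeral: 1*6 + 2 = 8, which is 2**3
value = int("12", 6)
print(value)                     # 8
print(value & (value - 1) == 0)  # True (power-of-two check for positive integers)
```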

2 points

And base 3 and 12. But we don’t really use those numbering systems.

8 points

I have a 2060 Super with 8GB. The VRAM is currently enough for FHD gaming - or at least isn't the bottleneck - so 12GB might be fine for that use case. BUT I'm also toying around with AI models, and some of the current models already ask for 12GB of VRAM to run the complete model. It's not that I would never get a 12GB card as an upgrade, but you can be sure I'd research all the alternatives first, and then it wouldn't be my first choice but a compromise, as it wouldn't future-proof me in this regard.

8 points

Do you think there is a large overlap between people who buy $600-$900 cards and people who game at 1080p?

Personally, my 3080 10GB runs out of VRAM at 1440p. I would never get <16GB again.

2 points

Hard to say. I was an early adopter of Full HD and always had the equivalent of an xx80 card. Then I stepped back a bit with the 970, as it was the best upgrade path for me (considering I was only upgrading the GPU, and the CPU would very likely be the bottleneck moving forward). I was planning to move to higher resolutions with my next PC. Then my PSU fried my mainboard, CPU and GPU while COVID and cryptocurrencies caused huge price spikes on almost every component, and I would have had to pay way too much for the performance I'd get. That's why I'm running a 2060 Super now and staying on FHD.

I might consider upgrading the next time I need a new PC, as this left me in an awkward spot: if I want a higher resolution, I need a new monitor. If I buy one, I'd probably need a new GPU too. And since my CPU would then be a bottleneck for the rig, I should also change that in the process. Then I might want a new mainboard, as I'm currently only running DDR4 RAM, and so on… the best way forward is basically a new PC (I might save some money by keeping my NVMe drive, etc.).

I'm not sure what I'm going to do in the future. Up until around the GTX 970, you could get a decent rig that played current games in FHD on ultra or very high and would continue doing so for about 1-2 years - or, if you dropped to medium-high, probably 4-5 years. You could get that easily for ~900-1,000 bucks (or less). Nowadays, the GPU alone can get you close to that price range…

I get it. 1080p is about 2.07 megapixels, while 1440p is already about 3.69 megapixels - that's roughly 78% more pixels, and thus you need considerably more performance to render it (or rather, to rasterize and shade it). But still… I don't like these prices.
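Spelling out that pixel math (plain Python, standard 16:9 resolutions):

```python
# Pixel counts for common 16:9 resolutions
fhd = 1920 * 1080   # 2,073,600 pixels (~2.07 MP)
qhd = 2560 * 1440   # 3,686,400 pixels (~3.69 MP)
uhd = 3840 * 2160   # 8,294,400 pixels (~8.29 MP)

print(f"1440p vs 1080p: {qhd / fhd:.2f}x the pixels")  # 1.78x, i.e. ~78% more
print(f"2160p vs 1080p: {uhd / fhd:.2f}x the pixels")  # 4.00x
```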

1 point

I have a 4090 and I still feel the pinch on VRAM with AI. It's never enough.

1 point

Thanks, that was going to be exactly my question. I don't see anyone choosing low memory for video, but I had no idea what AI needs.

2 points

You can run Stable Diffusion XL on 8GB of VRAM (to generate images). For beginners, there's e.g. the open source software Fooocus, which handles quite a lot of the work for you - it sends your prompt to a GPT-2 model (running on your PC) to do some prompt engineering for you, then uses that to generate your images, and it generally offers several presets etc. to get going easily.

Jan (basically open source software that resembles ChatGPT and lets you use several AI models) can run in 8GB, but only with 3B models or quantized 7B models. They recommend at least 16GB for regular 7B models (which they consider the "minimum usable models"). Then there are larger, more sophisticated models that require even more.

Jan can also run on the CPU in your regular RAM. Since it's chatting with you, it's not too bad when it spits out words slowly, but a GPU is / would be nice here…
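As a rough back-of-the-envelope sketch of why those numbers fall where they do (this is just a rule of thumb, not Jan's actual formula; the 1.5 GB overhead is an assumed allowance for activations and context): the weights alone take roughly parameter count × bytes per weight.

```python
def approx_vram_gb(params_billion: float, bits_per_weight: int, overhead_gb: float = 1.5) -> float:
    """Rough estimate: weight memory plus a flat allowance for activations / KV cache."""
    weight_gb = params_billion * 1e9 * (bits_per_weight / 8) / 1024**3
    return weight_gb + overhead_gb

print(f"7B @ 16-bit: ~{approx_vram_gb(7, 16):.1f} GB")  # ~14.5 GB -> wants a 16GB card
print(f"7B @ 4-bit:  ~{approx_vram_gb(7, 4):.1f} GB")   # ~4.8 GB  -> fits on an 8GB card
print(f"3B @ 16-bit: ~{approx_vram_gb(3, 16):.1f} GB")  # ~7.1 GB  -> squeezes onto 8GB
```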

1 point

I’ve seen people say that card is absurd. I’m not sure who is right there.

