Summary: Meta, led by CEO Mark Zuckerberg, is investing billions in Nvidia’s H100 GPUs to build a massive compute infrastructure for AI research and projects. By the end of 2024, Meta aims to have 350,000 of these GPUs, with total expenditures potentially reaching $9 billion. The move is part of Meta’s push to develop artificial general intelligence (AGI), competing with firms like OpenAI and Google’s DeepMind. The company’s AI and computing investments are a key part of its 2024 budget, with AI positioned as its largest investment area.
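
A quick back-of-the-envelope check on those headline figures, assuming (and this is only an assumption, the summary doesn’t break it down) that the ~$9 billion goes mostly toward the 350,000 H100s themselves:

```python
# Rough per-unit cost implied by the headline figures.
# Assumes the ~$9B is spent mostly on the 350,000 H100s themselves,
# which the summary does not actually break down.
total_spend_usd = 9_000_000_000
h100_count = 350_000

print(f"~${total_spend_usd / h100_count:,.0f} per GPU")
# -> ~$25,714 per GPU, roughly in line with commonly cited H100 price estimates
```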

92 points

The real winners are the chipmakers.

71 points

Gold rush you say?

Shovels for sale!

Get your shovels here! Can’t strike it rich without a shovel!

17 points

I feel like a pretty big winner too. Meta has been quite generous about releasing AI-related code and models under open licenses; I wouldn’t be running LLMs locally on my computer without the stuff they’ve been putting out. And I didn’t have to pay them a penny for it.

7 points

Subsidized by boomers everywhere looking at ads on Facebook, lol. Same with the Quest gear and VR development.

-1 points

Was wondering why my stock was up. AI already improving my quality of life.

35 points

Who isn’t at this point? Feels like every player in AI is buying thousands of Nvidia enterprise cards.

15 points

The equivalent of 600k H100s seems pretty extreme though. IDK how many OpenAI has access to, but it’s estimated they “only” used 25k to train GPT-4. OpenAI has, in the past, claimed that the diminishing returns from just scaling their model past GPT-4’s size probably aren’t worth it. So maybe Meta is planning to experiment with new ANN architectures, or planning a mass deployment of models?

2 points

Might be a bit of a tell that they think they have something.

5 points

Or they just have too much money.

4 points

Which will be solved by them spending it.

3 points

Would that be diminishing returns on quality, or on training speed?

If I could tweak a model and test it in an hour instead of four, that could really speed up development time, right?

4 points

Quality. Yeah, using the extra compute to speed up development iterations would be a benefit. They could train a bunch of models in parallel and either pick the best one to use or use them all as an ensemble or something.

My guess is that the main reason for all the GPUs is that they’re going to offer hosting and training infrastructure for everyone. That would align with the strategy of releasing models as “open” and then trying to entice people into their cloud ecosystem. Or maybe they really are trying to achieve AGI, as they state in the article. I don’t really know of any ML architectures that would allow for AGI, though (besides the theoretical, uncomputable AIXI).
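
For what it’s worth, here’s a minimal sketch of the “use them all as an ensemble” idea, just averaging the probability outputs of several independently trained models; everything here is made up purely for illustration:

```python
# Toy illustration of output-averaging ensembling; the "models" below are
# just stand-ins for independently trained classifiers.
import numpy as np

def ensemble_predict(models, x):
    """Average each model's class probabilities and return the argmax."""
    probs = np.stack([model(x) for model in models])  # (n_models, n_classes)
    mean_probs = probs.mean(axis=0)                   # unweighted average
    return int(mean_probs.argmax()), mean_probs

# Stand-ins for "a bunch of models trained in parallel"
models = [
    lambda x: np.array([0.7, 0.2, 0.1]),
    lambda x: np.array([0.6, 0.3, 0.1]),
    lambda x: np.array([0.5, 0.1, 0.4]),
]

label, probs = ensemble_predict(models, x=None)
print(label, probs)  # -> 0 [0.6 0.2 0.2]
```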

17 points

The estimated training time for GPT-4 is 90 days though.

Assuming you could scale that linearly with the amount of hardware, 600k GPUs is 24 times the estimated 25k, which would bring it down to about 3.75 days. From four times a year to roughly twice a week.

If you’re scrambling to get ahead of the competition, being able to iterate that quickly could very much be worth the money.
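
Taking the numbers in this thread at face value (25k GPUs, ~90 days, 600k H100-equivalents), the back-of-the-envelope math looks like this; perfectly linear scaling is of course an idealization:

```python
# Back-of-the-envelope scaling estimate using the figures from this thread.
gpt4_gpus = 25_000      # rough estimate of GPUs used to train GPT-4
gpt4_days = 90          # rough estimate of GPT-4 training time
meta_gpus = 600_000     # Meta's reported H100-equivalent target

speedup = meta_gpus / gpt4_gpus    # 24x more hardware
scaled_days = gpt4_days / speedup  # assumes perfectly linear scaling
print(f"{speedup:.0f}x the hardware -> ~{scaled_days:.2f} days per run")
# -> 24x the hardware -> ~3.75 days per run
```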

3 points

I’m sure that everybody has some, but to spend billions seems a little premature.

-2 points

Six months from now: “damn, we’re way behind Meta on AI. We should have spent billions six months ago, it’s going to cost way more to catch up.”

0 points

Chips evolve. By the time a billion-dollar contract is fulfilled, they’re two iterations behind.

2 points

Pretty sure they’ll be given insight into the roadmap for that price, and be able to place speculative orders on upcoming generations.

7 points

Just like the Metaverse…this won’t have legs.

14 points

Jensen’s gonna buy so many new leather jackets.

4 points

And spatulas. Don’t forget the spatulas.

2 points

Could just buy Spatula City.
