219 points

Selling shovels during a gold rush is the best way to get rich. :)

19 points

While suing everyone else that makes shovel handles that work with your shovel heads.

124 points

Fuck this stupid world we’ve built.

7 points

You missed out on buying while it was cheaper too, eh?

-2 points

Lmao, stupidity

42 points

Time to sell Nvidia stock. Congrats to Huang for pulling it off. Get out when you’re on top.

21 points

imagine how many leather jackets he can buy now

1 point

Depends on whether they acquire/acqui-hire from here, or whether they don’t and get their lunch stolen by photonics plays.

-4 points

This is not how you do shares… :o

76 points

I didn’t know there were that many PC gamers out there. /s

Seriously, though, the pivot from making video cards to investing in AI and crypto is kinda genius. The crypto thing mostly fell into their laps, but they leaned in. As for the AI thing, I’m not sure how they decided to focus on it or who first pitched the idea to the board, but that was business genius.

18 points

Same as with crypto: the software community started using GPUs for deep learning, and they were just meeting that demand.

38 points

To your point, when I look at both crypto and AI I see a common theme: they both need a lot of computation, call it supercomputing. Nvidia makes products that provide a lot of compute. Until Nvidia’s competitors catch up, I think they’ll do fine as more applications that require a lot of computation are found.

Basically, I think of Nvidia as a supercomputer company. When I think of them that way, their position makes more sense.

3 points

Also, those things are highly parallelizable and mainly deal with vector and matrix data. The same “lots of really simple but fast processing units, optimized for vector and matrix operations, working in parallel” that works fine for modern 3D graphics turns out to also work fine for things like neural networks. In graphics, each point on a frame image can be calculated in parallel with all the other points (in what’s called a fragment shader), and most 3D data is made of 3D vectors while the transforms are small (3x3 or 4x4) matrices. In a neural network, the neurons in each layer are quite simple and can all be processed in parallel (if the architecture weren’t layered, GPUs would be far less effective for it).

To a large extent Nvidia got lucky that the stuff that became fashionable works by doing lots of simple, highly parallelizable computations; otherwise it would have been the makers of CPUs that gained from the rise of these compute-hungry technologies.
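
To make the parallelism concrete, here’s a minimal CUDA sketch of a dense neural-network layer where one GPU thread computes one output neuron, the same one-thread-per-output pattern a fragment shader uses for pixels. The kernel name, dimensions, and values are all hypothetical, chosen just for illustration:

```
#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical sketch: one thread per output neuron of a dense layer,
// illustrating why layered networks map so well onto GPU hardware.
// y[j] = sum_i W[j][i] * x[i] -- every output j is independent, so all
// of them can be computed in parallel, just like a fragment shader
// computes every pixel independently.
__global__ void dense_layer(const float* W, const float* x, float* y,
                            int in_dim, int out_dim) {
    int j = blockIdx.x * blockDim.x + threadIdx.x;  // which output neuron
    if (j >= out_dim) return;
    float acc = 0.0f;
    for (int i = 0; i < in_dim; ++i)
        acc += W[j * in_dim + i] * x[i];
    y[j] = acc;
}

int main() {
    const int in_dim = 1024, out_dim = 1024;
    float *W, *x, *y;
    // Unified memory keeps the sketch short; real code would manage
    // host/device copies explicitly.
    cudaMallocManaged(&W, in_dim * out_dim * sizeof(float));
    cudaMallocManaged(&x, in_dim * sizeof(float));
    cudaMallocManaged(&y, out_dim * sizeof(float));
    for (int i = 0; i < in_dim; ++i) x[i] = 1.0f;
    for (int i = 0; i < in_dim * out_dim; ++i) W[i] = 0.001f;

    int threads = 256;
    int blocks = (out_dim + threads - 1) / threads;  // cover all outputs
    dense_layer<<<blocks, threads>>>(W, x, y, in_dim, out_dim);
    cudaDeviceSynchronize();
    printf("y[0] = %f\n", y[0]);  // expect ~1.024 (1024 * 0.001)
    cudaFree(W); cudaFree(x); cudaFree(y);
    return 0;
}
```

The point is that all 1024 output neurons run at once across the GPU’s simple cores; a CPU would grind through them mostly one at a time.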

24 points

They were doing that for years before it became popular. The same tech built for video graphics just so happened to be useful for AI and big data, and they doubled down on supporting enterprise and research efforts while it was still a tiny field, before their competitors did, and continued to specialize as it grew.

Supporting niche uses of your product can sometimes pay off if that niche hits the lottery.

7 points

Hardware made for heavy computing being good at stuff like this isn’t all that shocking, though. The biggest gamble is whether a new technology will take off at all. Nvidia, just like Google, has the capital to diversify: bet on all the horses at once and drop the losers later.

9 points

To their credit, they’ve been pushing GPGPU for a while. They did position themselves well for accelerators. Doesn’t mean they don’t suck.

11 points
*

They were first to market with a decent GPGPU toolkit (CUDA), which built them a pretty sizeable userbase.

Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.

Like Apple, but worse.

I guess they learned from their gaming heyday that not controlling the abstraction layer (e.g. OpenGL, DirectX) means they can’t do lock-in.
