I’ve said it before, and I’ll say it again: Fuck this “AI” nonsense, the techbros shoving it into everything, and the Bitcoin cryptobros that came before them.
I don’t understand, clean nuclear power has never been easier. Why not just build some current gen nuke plants?
As I understand it, planning and building new grid-scale nuclear power plants takes 10-20 years. That isn’t a reason not to start the process now, but it does mean something has to fill the demand gap until the nuke plants (and other clean sources) come online to displace the dirty generation. The alternative is artificially holding demand down, through usage regulation or techniques like rolling blackouts, all of which I’d imagine is pretty unpalatable.
I think it’s fair to predict energy consumption will continue to rise. With that timescale, it’s basically “the best time to plant a tree was 20 years ago, the second best time is today”. Doesn’t solve the immediate issue, but if we keep not starting new nuclear projects, it’s going to remain an issue forever.
Oh, I totally agree – didn’t mean to give any impression otherwise. Filling the energy demand gap as quickly as possible with the least impactful generation source should be very high on society’s list of goals, IMO. And it seems like that is mostly what’s happening: solar, wind, and storage make up the largest share of what’s being brought online this year.
It takes a long time to get a nuclear plant up and running. While it would be great to replace coal plants with nuclear, it wouldn’t help with all of the power being wasted on AI right now.
Time…
And a lot of concrete.
It takes a long time to see the climate gains from a nuclear reactor.
Hell, depending on size the concrete can take a decade or longer to finish curing, and producing all that cement dumped a huge amount of CO2 into the atmosphere before the first pour.
Very sustainable technology, this AI 😎
I mean, it’s all running on general-purpose hardware. If we decoded 4K video on general-purpose hardware, we’d use more power than every AI company put together, but once 4K became popular we developed chips that decode it in hardware while consuming barely any power.
And the exact same thing is starting to happen with dedicated machine learning chips / NPUs.
Let’s have a round of slow claps for the tech industry.
How fucking convenient.