Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa.
I no longer use mine for much besides controlling one light with an illogically placed switch. I could pretty much replace her with The Clapper™️.
As someone at a company that’s still using free AI credits in its commercial products and hasn’t figured out how it’s going to price the shit once the credits run out… this AI market looks a lot like Uber subsidies…
Like a year or two from now, probably any AI stuff that isn’t self-hosted is going to be 100% inaccessible to normal people due to cost. It’s just a question of how hard they’re going to fight to keep the current freely downloadable LLM models off the internet once that happens.
Something tells me that they’ll still listen to you for free.
I never got the appeal of those things, even ignoring how their design is the antithesis of privacy. It just seems dumb to talk to the computer box as if it were something worth talking to, when it’s just a microphone and software. I simply prefer direct, precise, and silent control of devices.
It’s good for hands-free, device-free control: setting timers while cooking by simply saying “set a timer”, or controlling lights from across the room without fiddling with a phone or remote.
It’s very sci-fi, Star Trek amongst many others from the 80s. If you’re old enough, you’ll remember that this was the stuff of fantasy. I can see why it appeals to people with disabilities, and possibly to kids for homework or something. But I am 1000 percent with you on the privacy part. No thanks.
Running AI may be currently expensive, but the hardware will continue to improve and get cheaper. If they institute a subscription fee and people actually pay for it, they’ll never remove that fee even after it becomes super cheap to run.
> but the hardware will continue to improve and get cheaper.
Eh. I mean sure, the likes of A100s will invariably get cheaper because they’re overpriced AF, but there isn’t really that much engineering going into those things hardware-wise: accelerating massive chains of fmas is a much smaller challenge than designing a CPU or GPU. Meanwhile Moore’s law is, well, maybe not dead but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn’t been true for a while now, and the physics of everything aren’t exactly getting easier, they’re now battling quantum uncertainty in the lithography process itself.
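To make the “massive chains of fmas” point concrete, here’s a minimal C sketch (sizes and values are made up for illustration): the inner loop of a dense layer’s matrix-vector product is just one fused multiply-add per weight, and an accelerator’s job is mostly to run thousands of such chains in parallel while keeping them fed with data.

```c
/* A minimal sketch of "chains of fmas": the inner loop of a dense
 * layer's matrix-vector product is one fused multiply-add per weight,
 * repeated over and over. Sizes and values here are made up. */
#include <math.h>
#include <stdio.h>

#define ROWS 4
#define COLS 8

/* y = W * x, each row computed as a chain of fused multiply-adds */
static void matvec(float w[ROWS][COLS], const float x[COLS], float y[ROWS]) {
    for (int i = 0; i < ROWS; i++) {
        float acc = 0.0f;
        for (int j = 0; j < COLS; j++)
            acc = fmaf(w[i][j], x[j], acc);  /* acc += w[i][j] * x[j], fused */
        y[i] = acc;
    }
}

int main(void) {
    float w[ROWS][COLS], x[COLS], y[ROWS];
    for (int i = 0; i < ROWS; i++)
        for (int j = 0; j < COLS; j++)
            w[i][j] = 0.01f * (float)(i + j);
    for (int j = 0; j < COLS; j++)
        x[j] = 1.0f;

    matvec(w, x, y);
    for (int i = 0; i < ROWS; i++)
        printf("y[%d] = %f\n", i, y[i]);
    return 0;
}
```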
Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it’s not like digital systems can’t make use of that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it’s an arcane art.
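For what “reducing precision” looks like on the digital side, here’s a rough sketch: quantize weights and activations to int8 with a per-tensor scale, do the multiply-accumulates in integer arithmetic, and rescale at the end. The scheme below is a simplified illustration I’ve made up, not any particular accelerator’s number format.

```c
/* A simplified illustration of reduced-precision ("int8") inference:
 * quantize floats to 8-bit integers with a per-tensor scale, do the
 * multiply-accumulates in integer arithmetic, and rescale at the end.
 * The scheme is made up for illustration, not any real chip's format. */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

#define N 8

/* map the largest magnitude in v[] to 127 and round everything to int8 */
static float quantize(const float *v, int8_t *out, int n) {
    float max_abs = 0.0f;
    for (int i = 0; i < n; i++)
        if (fabsf(v[i]) > max_abs) max_abs = fabsf(v[i]);
    float scale = max_abs / 127.0f;
    for (int i = 0; i < n; i++)
        out[i] = (int8_t)lrintf(v[i] / scale);
    return scale;
}

int main(void) {
    float w[N] = {0.12f, -0.40f, 0.33f, 0.08f, -0.25f, 0.50f, -0.07f, 0.19f};
    float x[N] = {1.00f,  0.50f, -1.50f, 2.00f,  0.25f, -0.75f,  1.25f, -0.50f};

    int8_t wq[N], xq[N];
    float w_scale = quantize(w, wq, N);
    float x_scale = quantize(x, xq, N);

    /* full-precision reference */
    float ref = 0.0f;
    for (int i = 0; i < N; i++)
        ref += w[i] * x[i];

    /* integer multiply-accumulate, rescaled back to float at the end */
    int32_t acc = 0;
    for (int i = 0; i < N; i++)
        acc += (int32_t)wq[i] * (int32_t)xq[i];
    float approx = (float)acc * w_scale * x_scale;

    printf("float32 dot product: %f\n", ref);
    printf("int8 dot product:    %f\n", approx);
    return 0;
}
```

The two results land close together, which is exactly the robustness to small perturbations being cashed in, with no analogue arcana required.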
tl;dr: Don’t expect large leaps, and especially not several of them. This isn’t a noughties “buy a PC twice as fast at half the price two years later” kind of situation; AI accelerators are silicon like any other, and they already make use of the progress we made back then.
That is sort of the issue with mixing good conscience and capitalism. Either goods are valued at what we’re willing to pay, or they’re valued at what we think the business’s profit margin should be; mixing the two ultimately leads us to fall for PR crap. Businesses are quick to gather sympathy when margins are low, and we fall for it, but as soon as they own a chunk of the market it turns into raising prices as much as they possibly can.
That being said, Amazon became what it is because Bezos was hell-bent on not rug-pulling customers, at least in the early years, so it’s possible they would eventually decrease prices to gain a market advantage; that’s their whole strategy.