Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec

In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa.

0 points

RIP Bozo, no one will sub for your knock-off Siri, aside from a certain subset of consoomer cattle.

2 points

I love when satire.

88 points

AI is being touted as the solution to everything these days. It’s really not, and we are going to find that out the hard way.

9 points

Hey, that’s only because Amazon, Google, and Microsoft (et al.) just don’t have the money to make it good!!

So what about 9.99 a month?

4.99 if you pay up front for a year?

Euh, or how much can you cough up, like for a year, or at least for Q4? I’m literally on a bad roll here.

16 points

I’m not going to buy into a subscription model for something I’ve already paid for. This subscription model crap is complete bullshit.

They even tried it with heated seats recently: install heated seats in the car, but disable them in software. It’s crazy that companies think they can get away with this.

7 points

I think there’s a massive difference between unlocking a feature that’s already there and requires no maintenance, and a cloud-based service that demands 24/7 uptime, constant developer support, and ongoing feature development.

2 points

While I agree with you, they are 💯 going to get away with it, because your average consumer just doesn’t care.

42 points

I get what you’re saying, but voice assistants are one of the main places LLMs belong.

6 points

Yes, but they could do so much more. An actually useful assistant that could draft emails, set reminders appropriately, create automations, etc. would be worth A LOT of money to me.

2 points

I think if there ends up actually being a version of AI that is privacy focused and isn’t screwing over creators it’d be so much less controversial. Also, everyone (including me) is really, really fucking sick of hearing about it all of the time in the same way that everyone is/was sick of hearing about the blockchain. As in: “Bro your taco stand needs AI/the blockchain.”

4 points

If IBM actually manages to convert COBOL into Java like they’re advertising, they’ll end up killing their own cash cow.

So much still runs on COBOL

2 points

It’s not even A.I. either

8 points

I don’t understand this. Hasn’t Intel or Nvidia (or someone else) been making claims about their next CPUs having AI functionality built-in?

1 point

You can record and edit videos on your own devices, but that doesn’t mean it’s suddenly free for Netflix or YouTube to stream their videos to you.

Surely a local version of Alexa could be developed, but that development would come with its own costs.
Some things simply can’t be done locally, such as a web search. Often your route calculations for a map application are also done in the cloud.

16 points

Having “AI functionality” doesn’t mean they can just get rid of the big, expensive models they use now.

If they are anything like OpenAI’s LLMs, they require very beefy machines with a ton of expensive RAM.
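
To put a rough number on “a ton of expensive RAM”, here is a back-of-the-envelope sketch in Python; the 70-billion parameter count and 16-bit precision are purely illustrative assumptions, not the specs of any model Amazon or OpenAI actually runs:

```python
# Rough estimate of the memory needed just to hold an LLM's weights.
# The parameter count and precision are illustrative assumptions only.
params = 70e9          # assume a 70-billion-parameter model
bytes_per_param = 2    # 16-bit (fp16/bf16) weights

weights_gb = params * bytes_per_param / 1e9
print(f"Weights alone: ~{weights_gb:.0f} GB")  # ~140 GB, before KV cache and activations
```

And that is before serving many users at once, which is where the “very beefy machines” come in.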

-2 points

Well that’s exactly what I was thinking when these companies were making these claims… like HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? Didn’t make sense.

EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.

5 points

“AI” doesn’t use databases per se; these are trained models built from large amounts of training data.

Some models run fine on small devices (like the ones running on phones to improve photos), but others are huge, like OpenAI’s LLMs.
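
For contrast, here is a minimal sketch of what “runs fine on a small device” can look like, using the Hugging Face transformers library with a deliberately tiny model (distilgpt2, roughly 82M parameters); this is just an illustration, not how any phone vendor actually ships its on-device models:

```python
# Minimal local text generation with a small pretrained model.
# Requires: pip install transformers torch
from transformers import pipeline

# distilgpt2 is small enough to run on an ordinary laptop CPU.
generator = pipeline("text-generation", model="distilgpt2")

result = generator("Voice assistants should", max_new_tokens=20)
print(result[0]["generated_text"])
```

A model like that fits in well under a gigabyte of memory; the huge chat models are several orders of magnitude larger.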

1 point

You’re right. Run an LLM locally, adjacent to your application sandboxes and local user apps, and your office will lower its heating bills.

12 points

They won’t be charging me, because I don’t buy shit from Amazon, and don’t use their spy platform.

26 points

Ok. I’ll be the weirdo. If it’s actually useful, I would pay for it.

Not if it’s just the parlor trick that it currently is.

28 points

This is the killer for all this shit right now as far as I’m concerned. All of it lives squarely in “huh… neat” territory. I have yet to see anything I felt was truly necessary. Until that happens, paying is a non-starter for me.

10 points

This is why I’m so confused by Amazon’s approach. I know they’ve already sunk millions if not billions of dollars into this, so why has the user experience not improved in the last 8 years?

I’m not going to buy things with my voice when just getting the lights to turn off or music to play can be an infuriating endeavor. Speech recognition has stagnated.

The third-party integrations are just so clunky too. They could have made money by selling licenses to businesses for access to the service, but again, they haven’t improved that experience at all.

The “Alexa, let me talk to Domino’s” or “Alexa, ask LG to turn off the TV” pattern is just stupidly cumbersome. Why can’t you set up preferred providers? I don’t have to say “ask Spotify to play music”; I just say “play music”, so we know it’s possible. Implementing other preferred service providers would be trivial compared to the overall scale of Alexa.

10 points

I don’t know if you’re in IT at all, but the really crazy thing is that as half baked as Alexa stuff feels…a ton of AWS’s offerings feel the exact same way. Their marketing material is great, and I do believe their engineers are passionate and have the right intentions. But none of it feels “finished”. It all feels like an elaborate beta test. Things don’t work, documentation is out of date or just plain wrong, it’s impossible to get actual expert support from Amazon directly.

AWS is their biggest money-maker, and even that is a cobbled-together, confusing pile half the time. Sometimes it feels like everything is a house of cards.

