itty53

itty53@vlemmy.net
1 post • 50 comments

Fark => Digg => Reddit => Lemmy

That’s great, but you’re not everyone, and you’re not fielding everyone’s calls either.

I’m in healthcare. A massive chunk of our calls is simply “you have an order expected on (date), shipping to (your address), is this information correct? Yes? Awesome, kthxbye”.

That’s it. By using automatic dialers for that kind of thing, we’re freeing up a ton of time for real people to do the more difficult, hands-on customer service.
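
A rough sketch of that scripted flow, just to illustrate; say, listen, and transfer_to_agent are made-up stand-ins, not our actual dialer API:

```python
# Hypothetical sketch of the confirmation call described above: read back the
# order date and address, end the call on a clear "yes", and hand anything
# else to a live agent. All names here are stand-ins, not a real telephony API.

def confirmation_call(order, say, listen, transfer_to_agent):
    say(f"You have an order expected on {order['date']}, "
        f"shipping to {order['address']}. Is this information correct?")
    answer = listen().strip().lower()
    if answer in {"yes", "yep", "yeah", "correct"}:
        say("Awesome, thanks. Goodbye!")   # kthxbye
    else:
        # Anything other than a clear "yes" goes to a real person.
        transfer_to_agent(order)
```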

I’m gonna say it: you’re the same person my great-grandfather was, complaining about ATMs because they were overcomplicated.

Absolutely, 100%. We aren’t just plugging in an LLM and letting it handle calls willy-nilly. We’re telling it, like a robot, exactly what to do, and the LLM only comes into play when it’s trying to interpret the intent of the person on the phone within the given conversation they’re having.

So, for instance, as we develop this for our end users, we’re building out functionality in pieces. For each piece where we know we can’t do that (yet), we “escalate” the call to a real person at the call center to handle. As we develop more, these escalations get fewer; however, there are many instances that will always escalate. For instance, if the user says “let me speak to a person” or something to that effect, we’ll escalate right away.

For things the LLM can actually do against that user’s data, those are hard-coded actions we control; it didn’t come up with them, and it doesn’t decide to run them, we do. It isn’t Skynet, and it isn’t close either.

The LLM’s actual functional use is limited to just understanding the intent of the user’s speech, that’s all. That’s how it’s being used all over (with great results).
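
To make that concrete, here’s a minimal sketch of the split; llm, actions, and escalate are made-up stand-ins for whatever client and call-center hooks are actually in use, not the real system:

```python
# Minimal sketch of the split described above: the LLM only maps the caller's
# words onto a fixed list of intents; the actions are hard-coded and chosen by
# us, and anything unhandled (or an explicit request for a person) escalates.
# `llm`, `actions`, and `escalate` are assumptions, not the real system's API.

ALLOWED_INTENTS = ["confirm_order", "reschedule_delivery", "speak_to_agent", "unknown"]

def classify_intent(utterance: str, llm) -> str:
    """Ask the model for one label; anything off-list collapses to 'unknown'."""
    prompt = (
        "Classify the caller's intent as exactly one of "
        f"{ALLOWED_INTENTS}.\nCaller said: {utterance!r}\nIntent:"
    )
    intent = llm(prompt).strip()
    return intent if intent in ALLOWED_INTENTS else "unknown"

def handle_turn(utterance: str, llm, actions: dict, escalate):
    intent = classify_intent(utterance, llm)
    if intent == "speak_to_agent" or intent not in actions:
        # Not built out yet, or the caller asked for a person: hand off.
        return escalate(utterance)
    # Hard-coded action we control; the LLM never decides what runs.
    return actions[intent]()
```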

I’m actually working on an LLM “AI” augmented call center application right now, so I’ve got a bit of experience in this.

Before anyone starts doomsaying, keep in mind that when you narrow the goal and focus of the machine learning model, it gets exponentially better at the job. Way better at the job than people.

ChatGPT on its own has a massive scope, and that flexibility means it’s going to do the bad things we know it does. That’s why ChatGPT sucks.

But build an LLM focused on managing a call center that handles just one topic, and that’s what’s going on virtually everywhere right now. This article gets that “based on ChatGPT” in for clicks and fear-mongering.
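
As a rough illustration of what that narrowing looks like in practice (a generic pattern, not our actual setup or anything from the article), the model only ever sees one topic and one job, and anything outside it is forced out to a person:

```python
# Illustrative only: the model is pinned to a single topic and a tiny output
# vocabulary. The prompt wording and the chat(system, user) client are assumptions.

SYSTEM_PROMPT = (
    "You handle delivery-confirmation calls and nothing else. "
    "Reply with exactly one word: CONFIRM, RESCHEDULE, or ESCALATE. "
    "If the caller wants anything outside this topic, reply ESCALATE."
)

def narrow_scope(utterance: str, chat) -> str:
    reply = chat(SYSTEM_PROMPT, utterance).strip().upper()
    # Anything the narrow model can't express collapses to a human handoff.
    return reply if reply in {"CONFIRM", "RESCHEDULE", "ESCALATE"} else "ESCALATE"
```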

Last hint, this is the one Spade film with him as the lead that virtually everyone loves.

Removing his ability to play stocks at all is removing his ability to earn money. His investors will leave him and the interest on his loans will liquidate him. We’ve seen more than a few of his type flash up and fade away. Milken. Pickens. Belfort. And of course, Madoff. Just to name a few we know by a single name.

He would still be wealthier than 99.99% of people, but then so are a lot of folks on the planet; that still leaves roughly 800,000 people in that .01%, which isn’t all that powerful at all. Removing him from the .00001% is the goal, and killing his stock market abilities would do that overnight. It’s why he bought Twitter; he had to, because this was the alternative.

Kinda… depends on how things got him there. The thing Elon can’t lose without really going off the rails is his ability to play the markets. Once the SEC kicks him out you’ll see him legitimately calling for open revolt.

I think the deeper generational thing is in the idea that anything “just works”. Like I’m a programmer, right, so I know shortcuts. Ctrl+S saves the file, simple right?

Me when I want to save a file: Ctrl+SSSS. Why? Because I don’t trust that it “just works”. Same reason I don’t trust auto-save. Same reason I’m stunned every time I tell Windows to diagnose and fix the network problem and it actually does.

I grew up in a time where you couldn’t trust any of that shit.

Ego = the self from its own perspective. Makes complete sense, actually. But do they call third-person games “superego games”?
