34 points

Uh, there’s zero chance these big tech companies are selling voices like this. This also sounds very targeted and planned, so there must be more context to it. And why the hell are they on Bluesky?

4 points

I wish I were this naive


They’re on Bluesky because there are more people there who would actually believe this total load of bullshit. Plenty of scams claim to be some vague family member (no names, just “son” or “daughter,” “aunt” or “uncle”), but it’s highly unlikely they’re getting an AI to mimic your voice. Like the vagueness of the family member they claim to be, the voices used may just coincidentally sound similar.

2 points

Oh, they totally are. Maybe not this directly, but there is an awful depth to the data market. One company sells to the next, and the next, and you end up with companies buying up all the data they can so they can sell police surveillance software running off of fucking Candy Crush telemetry.

5 points

Because it’s probably the best Twitter alternative.

5 points

Not even close to Mastodon

1 point

Bluesky is more usable than Mastodon purely because it doesn’t use a decentralized model. Sure, that defeats the purpose, but on the plus side it removes some of the serious inconveniences that the true fediverse comes with. For example, I’ve never been on Bluesky and found myself unable to upvote someone because I’d accidentally left my instance.

55 points

Not sure I buy this tbh.

1 point

Yeah, unless this person runs a YouTube channel or a podcast, it seems implausible. What would you even train an AI on for a normal person?

I could see a situation where you hack a phone, get the contacts and call history, pick the 1st or 2nd most dialed number, have a bot call that person to get samples, then go back to the original phone and try this… I mean, eventually you’d get a hit?

13 points

Yeah, it has some sus vibes. I’m usually far too trusting, but even my bullshit detector went off here.

14 points

You know that old adage “Never attribute to malice that which can be easily explained by stupidity”?

We need a new one along the lines of “Never attribute to truth that which can be easily explained by attention-starved teenagers”

18 points

I don’t know. The “Spanish prisoner” is a scam that seems to get reinvented every time we see even a small change in technology. It wouldn’t take much to fake a person’s voice with a trained model, especially if that person has a public online profile where they post content in their own voice.

3 points

The Spanish Prisoner is also an excellent David Mamet film.

3 points

The plot of the movie is very similar to the basic setup of a 419 scam, which was given the nickname “the Spanish prisoner.”

I’m going to go ahead and check that out tonight. Looks like it has a great cast.

5 points

A: The Spanish Prisoner is also an excellent…

B: …excellent

A: David… David Mamet.

B: What I’m saying is, when it comes to excellent films

A: That’s what I’m saying

B: (simultaneously) David Mamet

A: (simultaneously) Mamet

B: it’s excellent

A: hm. The best.


I could easily conceive of some tricks to get clips of a person’s voice without them realizing. I’d write them out, but… that would be stupid of me. Humans have more vulnerabilities than computers do.

33 points

The only way to train an AI voice model is to have lots of samples. As scummy as they are, neither Microsoft nor Apple is selling your voice recordings with enough info to link them to you specifically. This person probably just forgot about an old social post where they talk long enough for a model to be trained. Still super scary stuff.

13 points

This person probably just forgot about an old social post…

Or recordings made during customer service calls; maybe a disgruntled employee decides to repurpose the data.

34 points

Not true anymore. You can create a reasonable voice clone with like 30 seconds of audio now (ElevenLabs, for example, doesn’t do any kind of authentication). The results are good enough for this kind of thing, especially in a lower-bandwidth situation like a phone call.

6 points

That’s true for training a voice model from scratch, but that work has already been done.

Now we’re just taking these large AIs trained to mimic voices and giving them a 30-second audio clip to tell them what to mimic. It can be done quickly and gives convincing results, especially when hidden by phone-call quality.

15 points

I advise everyone to contact their loved ones and inform them of this possibility. I also advise agreeing on a code word to be used if there’s an emergency and money needs to be sent.

For example: if more than $100 is being asked for, we have to share the code word or we don’t transfer money.

6 points

The code word is Donkeyballs

4 points

Hey is me, your cousin. I’m kinda in a jam and need your help. The contractor cut me a huge check after the fire that took everything but the bank won’t cash it without proof. I just need $101 for proof and I can pay back immediately.

This is for real.

Thanks.

Donkeyballs

2 points

Hey its me your 2nd uncle

I need about $3.50 to pay for shipping on a package from a nigerian prince who needed my help

I know we said over 100 for this but

Donkeyballs

4 points

Pashang.

84 points

Are we sure it’s AI?

I’ve heard of this scam happening maybe a decade ago with my extended family. The voice was a real person overseas with a lot of experience tricking grandparents. The scammers only had basic information.

They act like a freaked-out kid and the victim gets roped in. They scam for thousands of dollars each time, so even succeeding a few times a day would net a big profit. Also, cell connections are low fidelity; I bet that aids their ability to trick the victim.

7 points

Yeah, my dad called me one day asking if my brother was out of the country, because our grandma got a call saying he was kidnapped in Mexico and she needed to put up money for his release.

It’s wild.

14 points

Same. Years ago my grandfather received a call from a guy claiming to be my younger, male cousin saying he was in jail for something and needed bail. Luckily (?), my grandfather was an asshole and told him to call his mother.

20 points

Yeah, this happened to my grandparents. The scammers just say, “I sound like shit because I’ve been crying.”

2 points

A former colleague’s parents were scammed; their “kid” (the scammer) had a “broken nose”.

