Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

as much as the speech-to-text gets wrong on my phone, I can only imagine what it does with doctors’ notes.

one of my million previous jobs was in medical transcription, and it is so easy to misunderstand things even when you have a good grasp of specialty-specific terminology and basic anatomy.

they enunciate the shit they’re recording about your case about as well as they legibly write. you really have to get a feel for a doctor’s speaking style and common phrases to not turn in a bunch of errors.

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers.

Edit: oh yeah, ✨ innovation ✨

While most developers assume that transcription tools misspell words or make other errors, engineers and researchers said they had never seen another AI-powered transcription tool hallucinate as much as Whisper.

Edit 2: it gets better and better

In an example they uncovered, a speaker said, “He, the boy, was going to, I’m not sure exactly, take the umbrella.”

But the transcription software added: “He took a big piece of a cross, a teeny, small piece … I’m sure he didn’t have a terror knife so he killed a number of people.”

A speaker in another recording described “two other girls and one lady.” Whisper invented extra commentary on race, adding “two other girls and one lady, um, which were Black.”

In a third transcription, Whisper invented a non-existent medication called “hyperactivated antibiotics.”

Edit 3: wonder if the Organ Procurement Organizations are going to try to use this to blame for the extremely fucked up shit that’s been happening

I’ve been using Whisper with TankieTube and I’m curious whether these errors were made with the Large-v2 or the Large-v3 model. I suspect it was the latter, because its dataset includes output from the other.

The Whisper large-v3 model was trained on 1 million hours of weakly labeled audio and 4 million hours of pseudo-labeled audio collected using Whisper large-v2.

Snake eating its own tail, etc.
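For anyone running Whisper locally (e.g., via the open-source `openai-whisper` package), a handful of decoding options are commonly suggested to cut down on invented text. This is a minimal sketch, not a tested recipe — the parameters are real arguments to `transcribe()`, but the specific values are illustrative:

```python
# Decoding options often suggested to reduce Whisper hallucinations.
# The parameter names are real openai-whisper transcribe() arguments;
# the values below are illustrative, not a guaranteed fix.

def anti_hallucination_options() -> dict:
    return {
        # Don't feed the previous window's text back in as a prompt;
        # that feedback loop is a common source of repeated/invented text.
        "condition_on_previous_text": False,
        # Greedy decoding instead of temperature sampling.
        "temperature": 0.0,
        # Treat low-energy segments as silence rather than
        # decoding "something" out of them.
        "no_speech_threshold": 0.6,
        # Reject decodes whose average log-probability is very low.
        "logprob_threshold": -1.0,
    }

# Usage (requires `pip install openai-whisper` and a local audio file):
#   import whisper
#   model = whisper.load_model("large-v2")
#   result = model.transcribe("clip.wav", **anti_hallucination_options())
#   print(result["text"])
```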


In your experience, has Whisper large-v3 been much worse than v2?


I haven’t done any comparing; I just went with the apparent consensus, which is that v2 was more accurate and hallucinated less.
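If you did want to compare, a quick way is to run both models on the same clip and diff the transcripts word by word — hallucinated spans show up as runs of insertions. A sketch using only Python's standard library (the two transcript strings below are hypothetical placeholders, not real model output):

```python
import difflib

def transcript_diff(text_a: str, text_b: str) -> list[str]:
    """Word-level diff between two transcripts of the same audio.

    Lines starting with '- ' appear only in the first transcript,
    '+ ' only in the second; an invented span shows up as a '+' run.
    """
    words_a, words_b = text_a.split(), text_b.split()
    return [
        d for d in difflib.ndiff(words_a, words_b)
        if not d.startswith("  ")  # keep only the differences
    ]

# Hypothetical example: v2 vs v3 output for the same sentence.
v2_text = "two other girls and one lady"
v3_text = "two other girls and one lady um which were Black"
print(transcript_diff(v2_text, v3_text))
# → ['+ um', '+ which', '+ were', '+ Black']
```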



How can a transcription tool be so bad? YouTube doesn’t get things this wrong.


Probably audio quality. I can’t imagine the acoustics in a hospital room or the hallway outside are anything close to those of most YouTube videos recorded with a professional mic.


sometimes they go into a tiny little office so they can concentrate better, and it’s so much easier to hear those docs


“He took a big piece of a cross, a teeny, small piece … I’m sure he didn’t have a terror knife so he killed a number of people.”

“two other girls and one lady, um, which were Black.”

Who did they train it on — Trump, Biden, or any of the other geriatric ghouls in DC?


Seems a bit stupid to use a transcription aid that can literally invent things, but when has something completely failing to do what it’s supposed to do ever stopped capitalists from saving a buck?


seems suboptimal, but i’m excited about the future of ai in the medical industry, specifically rich people care


I managed to discover that really quickly with unassisted casual use. People are asleep at the wheel if they try to give important duties to an AI. You don’t let a dog drive your car and hope for the best.

