A judge in Washington state has blocked video evidence that’s been “AI-enhanced” from being submitted in a triple murder trial. And that’s a good thing, given that too many people seem to think applying an AI filter can give them access to secret visual data.

192 points

If you have ever encountered an AI hallucinating stuff that just does not exist at all, you know how bad the idea of AI-enhanced evidence actually is.

-109 points

Everyone uses the word “hallucinate” when describing visual AI because it’s normie-friendly and cool-sounding, but the results are a product of math. Very complex math, yes, but computers aren’t taking drugs and randomly pooping out images, because computers can’t do anything truly random.

You know what else uses math? Basically every image modification algorithm, including resizing. I wonder how this judge would feel about viewing a 720p video on a 4K courtroom TV, because “hallucination” takes place in that case too.
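
For context, here is a minimal sketch of what the resizing case actually does, assuming a plain bilinear upscale (the function, array names, and sizes below are made up for illustration, not taken from any real video scaler): every output pixel is just a weighted average of the nearest input pixels.

```python
# Bilinear upscaling: each output pixel is a blend of (at most) the four
# nearest source pixels. Nothing that isn't already in the source frame
# can appear in the result.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int) -> np.ndarray:
    """Upscale a grayscale image by an integer factor with bilinear interpolation."""
    h, w = img.shape
    out_h, out_w = h * scale, w * scale
    # Map every output coordinate back onto the source pixel grid.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]   # vertical blend weights
    wx = (xs - x0)[None, :]   # horizontal blend weights
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

frame_720p = np.random.rand(720, 1280)      # stand-in for a 720p video frame
frame_4k = bilinear_upscale(frame_720p, 3)  # 2160 x 3840
```

The distinction the replies below point at is that this kind of upscaling can only blend pixels the camera actually recorded, whereas a generative model also draws on patterns learned from other images.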

81 points

There is a huge difference between interpolating pixels and inserting whole objects into pictures.

-41 points

Both insert pixels that didn’t exist before, so where do we draw the line of how much of that is acceptable?

38 points

Has this argument ever worked on anyone who has ever touched a digital camera? “Resizing video is just like running it through AI to invent details that didn’t exist in the original image”?

“It uses math” isn’t the complaint and I’m pretty sure you know that.

36 points

“normie-friendly”

Whenever people say things like this, I wonder why that person thinks they’re so much better than everyone else.

5 points

Tangentially related: the more people seem to support “AI all the things”, the less they turn out to understand it.

I work in the field. I had to explain to a CIO that his beloved “ChatPPT” was just autocomplete. He became enraged. We implemented a 2015-era chatbot instead, and he got his bonus.

We have reached the winter of my discontent. Modern life is rubbish.

-8 points

Normie, layman… as you’ve pointed out, it’s difficult to use these words without sounding condescending (which I didn’t mean to be). The media’s use of words like “hallucinate” to describe linear algebra is necessary because most people just don’t know enough math to understand the fundamentals of deep learning - which is completely fine; people can’t know everything, and everyone has their own specialties. But any time you simplify science to make it digestible for the masses, you lose critical information in the process, which can sometimes be harmfully misleading.

12 points

“computers aren’t taking drugs and randomly pooping out images”

Sure, no drugs involved, but they are running a pseudorandom number generator (one whose output passes statistical tests for randomness) and using that (along with non-random data) to generate the image.

The result is this: ask for the same image twice and you get two different images - similar, but clearly not the same person, sisters or cousins perhaps… but nowhere near usable as evidence in court.

-17 points

Tell me you don’t know shit about AI without telling me you don’t know shit. You can easily reproduce the exact same image by defining the starting seed and constraining the network to a specific sequence of operations.
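
As a toy illustration of both sides of this exchange, using numpy’s generator as a stand-in for a real image model (the function name and sizes are invented): without a fixed seed, the “same request” produces different outputs each run; with a fixed seed and the same sequence of operations, the output is reproduced exactly.

```python
import numpy as np

def fake_image_model(rng: np.random.Generator) -> np.ndarray:
    # Stand-in for a generative model: the "image" is a pure function of
    # the random draws the generator supplies.
    return rng.standard_normal((64, 64))

a = fake_image_model(np.random.default_rng())    # seeded from OS entropy
b = fake_image_model(np.random.default_rng())
print(np.array_equal(a, b))                      # False, with near certainty

c = fake_image_model(np.random.default_rng(42))  # fixed seed
d = fake_image_model(np.random.default_rng(42))  # same seed, same operations
print(np.array_equal(c, d))                      # True
```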

9 points

It’s not AI, it’s PISS: Plagiarized Information Synthesis Software.

0 points

Just like us!

9 points

“computers can’t do anything truly random.”

Technically incorrect - computers can be supplied with sources of entropy, so while it’s true that they will produce the same output given identical inputs, it is in practice quite possible to ensure that they do not receive identical inputs if you don’t want them to.
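
A rough sketch of that distinction, purely for illustration: a seeded PRNG replays the identical stream when given the same seed, while os.urandom draws on the operating system’s entropy pool (interrupt timings, hardware noise, etc.) and is not reproducible.

```python
import os
import random

prng_a = random.Random(1234)
prng_b = random.Random(1234)
print(prng_a.random() == prng_b.random())  # True: same seed, same stream

print(os.urandom(8).hex())  # fresh OS entropy; differs on every call
print(os.urandom(8).hex())
```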

2 points

IIRC there was a random number generator website where the machine was hooked up to a potato or some shit.

8 points

Bud, “hallucinate” is a perfect term for the shit AI creates, because it doesn’t understand reality, regardless of whether math is creating that hallucination or not.

1 point

You know what else uses math? Tripping on acid. From the chemistry used to create it, to the fractals you see while on it, LSD is math.

-2 points

Except for the important part of how LSD affects people. Can you point me to the math that precisely describes how human consciousness (not just the brain) reacts to LSD? Because I can point you to the math that precisely describes how interpolation and deep learning work.
