Tesla braces for its first trial involving Autopilot fatality::Tesla Inc is set to defend itself for the first time at trial against allegations that failure of its Autopilot driver assistant feature led to death, in what will likely be a major test of Chief Executive Elon Musk’s assertions about the technology.

91 points

The headline makes it sound like Tesla is trialing a new ‘fatality’ feature for its Autopilot.

30 points

Well, someone has to invent the suicide booths featured in Futurama. Might as well be him.

-5 points

I really want to believe you’re just making a dark joke, but the sheer concept of suicide booths is a very harsh critique of a failed society. A very failed society. For it to become a joke… Call me square, but that is a joke aimed at whoever laughs at it.

13 points

https://youtu.be/EbmQxZkSswI?si=0lcguQyWQxUggaB5

It’s a joke, but a suicide booth isn’t that bad; assisted, pain-free death is a right everyone should have.

But having it on a street corner for ease of access is pretty fucked

9 points

Mortal Kombat: Vehicle Edition

6 points

Carmaggedon

4 points

The packet says to fight a Honda. (I know, Street Fighter, but still.)

https://piped.video/watch?v=3vPtn1StzA4&t=1

6 points

With how Elon has been acting this is a distinct possibility.

It would probably scream “Xterminate!” before running you over.

6 points

The reality is that they didn’t trial it at all; they just sent it straight to production. In this case, it successfully achieved a fatality.

2 points

I’m literally waiting for the moment when a disproportionate amount of Musk critics die in car crashes.

19 points

Isn’t it a glorified cruise control/lane guidance system, rather than an actual automated driving system? So it would be about as safe as those are, rather than being something you can just leave alone to handle its own business, like a robotic vacuum cleaner.

10 points

The main issue is that they market it as a fully autonomous system, and made it just good enough that it lulls people into a false sense of security that they don’t need to pay attention, while also having no way to verify that they are, unlike other systems from BMW, GM, or Ford.

Other systems have their capabilities intentionally hampered to ensure that you’re not going to feel it’s okay to hop in the passenger seat and let your dog drive.

They are hands-on driver assists, and so they are generally calibrated so that they’ll guide you in the lane but will drift/sway just a bit if you completely take your hands off the wheel, which is intended to keep you, y’know, actually driving.

Tesla didn’t want to do that. They wanted to be the “best” system, with zero safety considerations at any step other than what was basically forced upon them by the supplier so it wouldn’t completely back out. The company is so insanely reckless that I felt shame for ever having wanted to work for them, once I saw and heard many stories about just how bad they were.

I got to experience it firsthand working at a supplier, where production numbers were prioritized over key safety equipment; while everyone else was willing to suck it up for a couple of bad quarters, they pushed it, and I’m sure that has indirectly resulted in further injuries and potentially deaths.

3 points

They wanted to be the “best” system, with zero safety considerations at any step other than what was basically forced upon them by the supplier so they wouldn’t completely back out. The company is so insanely reckless that I feel shame for ever wanting to work for them at one point

What does this remind me of… Oh yeah right, OceanGate

-12 points

This is an absolutely bald-faced lie. Tesla absolutely does NOT market Autopilot as a fully autonomous system. Autopilot is nothing other than lane centering and adaptive cruise control with emergency braking, and that’s it. There is zero ambiguity about it on the vehicle and in the documentation. Plus, it specifically requires the driver to maintain control of the wheel.

You need to stop, drop, and roll or jump in the nearest pool before your pants burn you to a crisp.

3 points

It is just a shit load of if-then-else statements. If the inputs don’t match a corresponding if, it just defaults to doing nothing.
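A tongue-in-cheek sketch of what the comment describes (purely illustrative Python, not anything resembling Tesla’s actual code): a chain of if/elif rules over simplified, made-up inputs, with a do-nothing default whenever no rule matches.

```python
def plan_action(obstacle_ahead: bool, lane_drift: float) -> str:
    """Pick a driving action from a fixed rule chain (hypothetical inputs)."""
    if obstacle_ahead:
        return "brake"
    elif lane_drift > 0.5:       # drifting right of lane center
        return "steer_left"
    elif lane_drift < -0.5:      # drifting left of lane center
        return "steer_right"
    # No rule matched the inputs: default to doing nothing at all.
    return "do_nothing"

print(plan_action(obstacle_ahead=False, lane_drift=0.0))  # do_nothing
```

The punchline is the last line: any situation the rule authors didn’t anticipate falls through to the no-op default.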

-6 points

Driving a car is not safe. 40,000 people die in car crashes every year in the US alone. Nothing in that article indicates that Autopilot/FSD is more dangerous than a human driver, just that they’re flawed systems, as is expected. It’s good to keep in mind that a 99.99% safety rating still means 33,000 accidents a year in the US alone.
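For what it’s worth, the arithmetic behind that last figure checks out if you read it as a 0.01% failure rate applied to roughly one event per US resident (~330 million); that reading is my assumption, not something the comment spells out:

```python
# Sanity check of the comment's figure: 0.01% of ~330 million events.
us_population = 330_000_000
failure_rate = 1 - 0.9999        # "99.99% safety" -> 0.01% failures

accidents_per_year = round(us_population * failure_rate)
print(accidents_per_year)  # 33000
```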

18 points

Former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, said the surge in Tesla crashes is troubling.

“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by The Post.

This would indicate that FSD is more dangerous than a human driver, would it not?

-2 points

That still doesn’t tell us whether those accidents happen more often than with normal cars. If you have good driver-assist systems that can prevent the majority of minor crashes but not the severe ones, then the total number of crashes goes down, but the ones that remain are the bad ones.

-2 points

Depends…did you read that study on Twitter or another source?

11 points

You can’t just put something on the streets without first verifying it’s safe and working as intended. This is missing for Autopilot. And the data that’s piling up is showing that Autopilot is deadly.

-7 points

You can say the exact same thing for people.

-8 points

First of all, what is it that you consider safe? I’m sure you realize that a 100% safety rating is just fantasy, so what is the acceptable rate of accidents for you?

Secondly, would you mind sharing the data “that’s piling up is showing that Autopilot is deadly”? Reports of individual incidents are not what I’m asking for because, as I stated above, you’re not going to get 100% safety, so there will always be individual incidents to talk about.

You also seem to be talking about the FSD beta and Autopilot interchangeably, though they’re different things. Hope you realize this.

0 points

Humans, my friend. We can hold humans accountable. We can’t hold hunks of semi-sentient sand and nebulous transient configurations of electrons liable for anything. So it has to be better than humans, which it is not. If it isn’t better than humans, then we’d rather just have a human in control, because we can argue with the human and hold them accountable for their actions and decisions.

-17 points

Driving is not safe. These systems could be improved upon, but they’ve also saved numerous lives by preventing accidents from occurring in the first place. The example in the OP happened while the driver was sitting behind the wheel watching a movie. The first example in your article also occurred with a driver behind the wheel. If either of them had been driving a 1995 Honda Civic, these accidents would have occurred just the same, but would anyone be arguing that Honda is to blame?

10 points

No, we would (rightfully so) blame the driver for merging into a semi truck that from my understanding was clearly visible.

5 points

but they’ve also saved numerous lives by preventing accidents from occurring in the first place.

There is no data to make this claim. You’re just making this up.

-7 points

Give me a break. You think all these companies are dumping billions of dollars into technology that doesn’t work? You’re making stuff up. Go watch some dashcam videos on YouTube if you want some proof.

25 points

The second trial, set for early October in a Florida state court, arose out of a 2019 crash north of Miami where owner Stephen Banner’s Model 3 drove under the trailer of an 18-wheeler big rig truck that had pulled into the road, shearing off the Tesla’s roof and killing Banner. Autopilot failed to brake, steer or do anything to avoid the collision, according to the lawsuit filed by Banner’s wife.

Is this the guy who was literally paying no attention to the road at all and was watching a movie whilst the car was in motion?

I legit can’t find information on it now, as every result I can find online is word-for-word identical to that small snippet. Such is modern journalism.

I know people like to get a hard on with the word “autopilot”, but even real pilots with real autopilot still need to “keep an eye on things” when the system is engaged. This is why we have two humans in the cockpit on those big commercial jets.

44 points

The way Musk marketed it was as a “self driving” feature, not a driving assist. Yes, with all current smart assists you need to carefully watch what the system is doing, but that’s not what it was made out to be. Because of that, I’d still say Tesla is responsible.

-3 points

“Self driving” is not a defined standard; it’s a buzzword, like “increases your vitality.” The SAE standards for autonomous vehicles do not have a self-driving category.

-4 points

I think you’re referring to the FSD beta and not Autopilot. One is supposed to become the self-driving feature at some point, while the other is simply lane keeping/cruise control. FSD wasn’t even available when this crash happened.

4 points

No, I was referring to Autopilot; just look at the name. I know it’s not capable of self driving (and neither is the even more absurdly named “full self driving”), but to your average person it intentionally sounds as if the car is driving itself rather than being a driving assist.

-12 points

Tesla’s Autopilot is driving assistance. I don’t know where you saw Musk marketing it as a self driving feature. Hell, even for the misnomer “full self driving” they note:

The currently enabled features require a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.

19 points

The feature is called “Autopilot”, meaning that the car automatically pilots itself, rather than using a human pilot. The definition of autopilot is literally “a device for keeping an aircraft or other vehicle on a set course without the intervention of the pilot.” I’m not sure how he could have more explicitly misrepresented the product.

7 points

There are also two pilots, because they know people are people. And don’t brand it as “self driving” and “full self driving,” then.

1 point

It sends shivers down my spine to think that airlines want to eliminate the co-piloting requirement in order to reduce costs. It would make flying increasingly stressful for the pilots, increasing turnover, burnout, and the risk of errors during flights. I would never fly with an airline that makes a single pilot take the brunt of a flight longer than 1 hour. Hell, even quality long-distance bus travel and truck hauling companies have drivers work in tandem, switching every so many hours.

14 points

It seems like an obvious flaw that’s pretty simple to explain. The car is trained to process collision information at a set height, so the opening beneath a truck’s trailer, between its wheels, could be treated as free space. It’s a rare situation, but if it’s confirmed and reproducible, that at least raises the concern of how many other glitches drivers will learn about by surprise.

2 points

In most countries trucks have bars between the trailer wheels, precisely because too many car drivers got an unwelcome haircut by not paying attention.

14 points
Deleted by creator
7 points

I mean, yeah, but I doubt that it’s Tesla’s official stance on the matter

