6 points

This is the best summary I could come up with:


In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today.

The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.

NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.

Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.

Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.

The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.


The original article contains 788 words, the summary contains 158 words. Saved 80%. I’m a bot and I’m open source!

4 points

Cameras and AI aren’t a match for radar/lidar. This is the big issue with Tesla’s approach to autonomy: you only have a guess as to whether there are hazards in the way.

Most algorithms are designed to work and then statistically tested to validate that they work. When you develop an algorithm with AI/machine learning, there is only the statistical step: you have to infer whole-system performance purely from that. There isn’t a separate process for verification and validation; it’s validation alone.

When something is developed with only statistical evidence that it works, you can’t be reliably sure it works in most scenarios, only in the exact ones you tested for. When you design an algorithm to work, you can assume it works in most scenarios if the results are as expected when you validate it. With machine learning, the algorithm itself is obscured and uncertain (unless it’s only used for parameter optimisation).
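To make the “only the statistical step” point concrete: with nothing but test outcomes, the best you can do is bound the failure rate. Here’s a minimal sketch using the exact binomial bound (the “rule of three”); the trial counts are illustrative, not from any real Autopilot test campaign:

```python
def failure_rate_upper_bound(n_trials: int, confidence: float = 0.95) -> float:
    """Upper confidence bound on the true failure rate after observing
    zero failures in n_trials independent tests (exact binomial bound)."""
    alpha = 1.0 - confidence
    # Solve (1 - p)^n = alpha for p: the largest failure rate still
    # consistent with a perfect test record at this confidence level.
    return 1.0 - alpha ** (1.0 / n_trials)

# 10,000 flawless test drives still only bound the failure rate at
# roughly 3 in 10,000 -- and only for conditions like those tested.
print(f"upper bound: {failure_rate_upper_bound(10_000):.2e}")
```

For a designed algorithm you can additionally verify the logic against its specification; for a learned model, a bound like this is essentially all you get, and it says nothing about scenarios outside the distribution you tested.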

Machine learning is never used because it’s a better approach; it’s only used when the engineers don’t know how to develop the algorithm themselves. Once you understand this, you understand the hazard it presents. If you don’t understand this, or refuse to, you build machines that drive into children. Through ignorance, greed and arrogance, Tesla built a machine that deliberately runs over children.

-5 points

This is speculation, but were most of them from people who disabled the safety features?

6 points

Probably not

9 points

No, I doubt most people care enough to disable them.

-7 points

I heard it’s fairly common for people to disarm the feature that requires you to hold the wheel.

Edit: it would be nice if someone explained why I’m being downvoted lol

2 points

Anything remotely supportive of Tesla on lemmy usually results in massive downvotes.

You’ve angered the hive mind by suggesting people are actively trying to bypass Tesla’s safety system so they can be idiots, thus making it not wholly Tesla’s fault.

And yes, many people are actively using bypass devices, but not all.

2 points

You don’t have to disable it to beat the safety system.

They were all pretty much due to inattentiveness, though. Many were drunk drivers.

Many do use defeat devices as well, but not all.

This was all brand new when it first came out, and we didn’t really have proper regulations for it. Things have gotten more restrictive, but people still find ways around it, and there’s no foolproof solution: humans are clever and will find ways around things.

51 points

They just recalled all the Cybertrucks, because their ‘smort’ technology is too stupid to realize when an accelerator sensor is stuck…

24 points

The accelerator sensor doesn’t get stuck; the pedal does. The face of the accelerator pedal falls off and wedges the pedal in the down position.

24 points

Pedal, not petal.

Not trying to be an asshole, just a nudge to avoid misunderstandings (although the context is clear in this case)

12 points

Given the number of other issues in the post I’m going to guess it was hurried and autocorrected wrong. Happens to me all the time.

4 points

I realize it’s the pedal that gets stuck, but the computer registers the state of the pedal via a sensor.

The computer should be smart enough to realize something ain’t right when it registers that both the accelerator and brake pedals are being pressed at the same time. And in that case, the brake should always take priority.
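The brake-priority rule described above is only a few lines of logic. A hypothetical sketch (function and threshold names are invented for illustration, not Tesla’s actual firmware):

```python
def drive_torque_request(accel_pct: float, brake_pct: float) -> float:
    """Resolve pedal sensor readings into a drive-torque request (0-100%).

    If the brake is meaningfully pressed, the accelerator reading is
    ignored entirely -- even if a stuck pedal reports full throttle.
    """
    BRAKE_OVERRIDE_THRESHOLD = 5.0  # percent; filters out sensor noise
    if brake_pct >= BRAKE_OVERRIDE_THRESHOLD:
        return 0.0  # brake wins: cut drive torque completely
    return accel_pct

# Stuck accelerator reporting 100% plus a firm brake press -> no drive torque.
print(drive_torque_request(100.0, 60.0))
```

The same sensor data could also flag the fault itself: an accelerator held at 100% while the brake is repeatedly applied is a plausible stuck-pedal signature worth a driver warning.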

2 points

The stories I’ve heard around the recall have been saying that the brakes do override the accelerator in the Cybertruck.

36 points

Any time now it will be released. Just like the robotaxis, promised seven years ago.

-27 points

There are some real Elon haters out there. I think the cars are ugly as sin, but I’m happy to see more people driving vehicles with all the crazy safety features, even if they aren’t perfect.

You’re in control of a massive vehicle capable of killing people and destroying property, you’re responsible for it.

30 points

You’re in control of a massive vehicle capable of killing people and destroying property, you’re responsible for it.

If only Elon would say something similar when he re-tweets a video of people having sex while the car is on autopilot. Can you guess what he actually said?

8 points

Moran

-14 points

I’m quite certain that there will be some humble pie served to the haters in the not-too-distant future. The performance of FSD 12.3.5 is all the proof you need that an actual robotaxi is just around the corner. Disagree with me all you want; all we need to do is wait and see.

However I’m also sure that the cognitive dissonance is going to be so strong for many of these people that even a mountain of evidence is not going to change their mind about it because it’s not based in reason in the first place but emotions.

15 points

What makes this time any different from the dozens of other times Musk has said we’re six months away from FSD? When do you think Tesla will take responsibility for accidents that happen while using their software?

If they do that in the next year, I’ll gladly eat humble pie. If they can’t, will you?

