This is the best summary I could come up with:
In March 2023, a North Carolina student was stepping off a school bus when he was struck by a Tesla Model Y traveling at “highway speeds,” according to a federal investigation published today.
The Tesla driver was using Autopilot, the automaker’s advanced driver-assist feature that Elon Musk insists will eventually lead to fully autonomous cars.
NHTSA was prompted to launch its investigation after several incidents of Tesla drivers crashing into stationary emergency vehicles parked on the side of the road.
Most of these incidents took place after dark, with the software ignoring scene control measures, including warning lights, flares, cones, and an illuminated arrow board.
Tesla issued a voluntary recall late last year in response to the investigation, pushing out an over-the-air software update to add more warnings to Autopilot.
The findings cut against Musk’s insistence that Tesla is an artificial intelligence company that is on the cusp of releasing a fully autonomous vehicle for personal use.
The original article contains 788 words, the summary contains 158 words. Saved 80%. I’m a bot and I’m open source!
Cameras and AI aren’t a match for radar/lidar. This is the big issue with Tesla’s approach to autonomy: you only have a guess at whether there are hazards in the way.
Most algorithms are designed to work and then statistically tested to validate that they work. When you develop an algorithm with AI/machine learning, there is only the statistical step; you have to infer whole-system performance purely from that. There isn’t a separate process for verification and validation. It’s validation alone.
When something is developed with only statistical evidence of it working, you can’t be reliably sure it works in most scenarios, only in the exact ones you tested for. When you design an algorithm to work, you can assume it works in most scenarios if the results are as expected when you validate it. With machine learning, the algorithm is obscured and uncertain (unless it’s only used for parameter optimisation).
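The distinction can be made concrete with a toy sketch (all names here are illustrative, not from any real autonomy stack): a hand-designed algorithm can be *verified* against properties that must hold for every input, while a learned model can only be *validated* by measuring its error rate on samples it was tested on.

```python
import random

# A designed algorithm: we can verify properties that must hold
# for EVERY input, not just the inputs we happened to test.
def insertion_sort(xs):
    out = list(xs)
    for i in range(1, len(out)):
        j = i
        while j > 0 and out[j - 1] > out[j]:
            out[j - 1], out[j] = out[j], out[j - 1]
            j -= 1
    return out

def verify_sort(xs, ys):
    # Property check: output is ordered and is a permutation of the input.
    return all(a <= b for a, b in zip(ys, ys[1:])) and sorted(ys) == sorted(xs)

# A learned model: all we can do is measure accuracy on held-out data
# and statistically infer how it behaves on inputs we never saw.
def validate_statistically(model, test_set):
    correct = sum(1 for x, label in test_set if model(x) == label)
    return correct / len(test_set)  # an estimate, not a guarantee

random.seed(0)
sample = [random.randint(0, 100) for _ in range(20)]
print(verify_sort(sample, insertion_sort(sample)))  # True, by construction

# Toy stand-in for a trained model: a threshold classifier.
toy_model = lambda x: x > 50
test_set = [(10, False), (60, True), (51, True), (40, False)]
print(validate_statistically(toy_model, test_set))  # 1.0 on THESE samples only
```

Perfect accuracy on the test set says nothing about inputs outside it, which is exactly the commenter’s point about inferring whole-system performance from statistics alone.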
Machine learning is never used because it’s a better approach; it’s only used when the engineers don’t know how to develop the algorithm. Once you understand this, you understand the hazard it presents. If you don’t understand this, or refuse to understand it, you build machines that drive into children, deliberately. Through ignorance, greed and arrogance, Tesla built a machine that deliberately runs over children.
This is speculation, but were most of them from people who disabled the safety features?
I heard it’s fairly common for people to disarm the feature that requires you to hold the wheel.
Edit: it would be nice if someone explained why I’m being downvoted lol
Anything remotely supportive of Tesla on lemmy usually results in massive downvotes.
You’ve angered the hive mind by suggesting people are actively trying to bypass Tesla’s safety system so they can be idiots, thus making it not wholly Tesla’s fault.
And yes, many people are actively using bypass devices, but not all.
You don’t have to disable it to beat the safety system.
They were all pretty much due to inattentiveness, though. Many were drunk drivers.
Many do use defeat devices as well, but not all.
This was all brand new when it first came out and we didn’t really have proper regulations for it. Things have gotten more restrictive, but people do still find ways around it, and there’s no foolproof solution to this, as humans are smart and will find ways around things.
They just recalled all the Cybertrucks, because their ‘smort’ technology is too stupid to realize when an accelerator sensor is stuck…
The accelerator sensor doesn’t get stuck; the pedal does. The face of the accelerator pedal falls off and wedges the pedal into the down position.
Pedal, not petal.
Not trying to be an asshole, just a nudge to avoid misunderstandings (although the context is clear in this case)
Given the number of other issues in the post I’m going to guess it was hurried and autocorrected wrong. Happens to me all the time.
I realize it’s the pedal that gets stuck, but the computer registers the state of the pedal via a sensor.
The computer should be smart enough to realize something ain’t right when it registers that both the accelerator and brake pedals are being pressed at the same time. And in that case, the brake should always take priority.
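The override logic the commenter is describing can be sketched in a few lines. This is a hypothetical illustration of the general brake-override idea, not Tesla’s actual firmware; the function and threshold names are made up.

```python
def requested_torque(accel_pct: float, brake_pct: float) -> float:
    """Hypothetical plausibility check: if the brake is applied, ignore
    the accelerator entirely, even if its sensor reports full throttle
    (e.g. because the pedal is physically wedged down)."""
    BRAKE_OVERRIDE_THRESHOLD = 5.0  # % brake travel that cancels throttle
    if brake_pct > BRAKE_OVERRIDE_THRESHOLD:
        return 0.0  # brake always takes priority
    return accel_pct

# Stuck accelerator at 100%, driver braking: throttle request is cut.
print(requested_torque(accel_pct=100.0, brake_pct=40.0))  # 0.0
# Normal driving, no brake: throttle passes through.
print(requested_torque(accel_pct=30.0, brake_pct=0.0))    # 30.0
```

This kind of pedal-plausibility check has been standard in drive-by-wire cars for years, which is why the reports mentioned in the next comment (that the Cybertruck’s brakes do override the accelerator) are plausible.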
The stories I’ve heard around the recall have been saying that the brakes override the accelerator in the cyber truck.
Any time now it will be released. Just like the robotaxis, promised 7 years ago.
There are some real Elon haters out there. I think they’re ugly as sin but I’m happy to see more people driving vehicles with all the crazy safety features, even if they aren’t perfect.
You’re in control of a massive vehicle capable of killing people and destroying property, you’re responsible for it.
If only Elon would say something similar when he re-tweets a video of people having sex while the car is on autopilot. Can you guess what he actually said?
I’m quite certain that there will be some humble pie served to the haters in the not-too-distant future. The performance of FSD 12.3.5 is all the proof you need that an actual robotaxi is just around the corner. Disagree with me all you want. All we need to do is wait and see.
However, I’m also sure that the cognitive dissonance is going to be so strong for many of these people that even a mountain of evidence is not going to change their minds about it, because their position isn’t based in reason in the first place but in emotion.
What makes this time any different from the dozens of other times Musk has said we’re six months away from FSD? When do you think Tesla will take responsibility for accidents that happen while using their software?
If they do that in the next year, I’ll gladly eat humble pie. If they can’t, will you?