Mercedes-Benz debuts turquoise exterior lights to indicate the car is self-driving | A visual indicator for other drivers
Good idea I think, but these could be mistaken for reversing lights
On that note, can we talk about how shit a lot of reverse lights are? In addition to indicating that you’re backing up, they’re also supposed to function as a sort of rear-facing headlight so you can see what you’re backing up towards, but their size, placement, and brightness on a lot of cars makes them pretty much useless for that in a lot of cases.
I’m not saying they need to be as bright as your regular headlights, that would be serious overkill, but they should probably be noticeably brighter than a turquoise self-driving indicator light would ever need to be.
They’re not supposed to be rear-facing headlights. You don’t even have to have 2 of them (1 is acceptable as long as it’s visible enough). And unlike every other light, there’s no restriction on where it goes. It’s almost like it was an afterthought when they were writing the regulations.
They’re not supposed to be rear-facing headlights.
Blindly backing up in the dark is how you hit/run over things. Back up lights are supposed to provide enough light so you can see where you’re going.
Turquoise is also a shade of blue so I think that may make them illegal in the US since blue lights are only legal on emergency vehicles.
why would i need to know another car is self driving though?
That’s what I thought. I can only imagine idiots will see it and try to fuck with it. Anyone else will be like, “Okay… So just keep doing what I’m currently doing.”
It’s marketing, if anything.
My theory on Audi bringing out animated indicators was that they were quickly getting a damaging reputation of Audi drivers not using indicators; a reputation their competitor BMW is permanently married to. To prevent this, they made the indicators unique and special, something no one else had, so the drivers would want to use them. Thus actively mitigating brand damage before it reached BMW levels.
I would love to have an indicator for adaptive cruise control because the way it only reacts to the car right in front of you rather aggressively means it causes shockwave traffic jams unless the human driver behind you keeps enough distance.
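The shockwave effect described here can be caricatured in a few lines of Python. This is a toy model, not a traffic simulation: every number (the 5 m/s lead-car slowdown, the `overreaction` factors, the 10-car chain) is invented for illustration, not measured ACC behavior. The idea is simply that if each follower brakes a bit harder than the car ahead, a small dip compounds down the chain until someone stops dead; if each follower has enough gap to absorb part of the dip, the wave fades out.

```python
# Toy model of string instability: each follower's speed dip is the dip of
# the car directly ahead scaled by a constant `overreaction` factor.
#   overreaction > 1 ~ aggressive ACC reacting only to the car in front
#   overreaction < 1 ~ a follower whose larger gap absorbs the slowdown
# All parameters are made up for illustration.

def min_speeds(overreaction, n_cars=10, lead_dip=5.0, cruise=20.0):
    """Return each car's lowest speed (m/s) after the lead car slows by lead_dip."""
    speeds, dip = [], lead_dip
    for _ in range(n_cars):
        speeds.append(max(cruise - dip, 0.0))  # speed can't go below zero
        dip *= overreaction                    # dip grows or shrinks down the chain
    return speeds

aggressive = min_speeds(1.3)  # overbraking chain: dip grows car by car
relaxed = min_speeds(0.7)     # big-gap chain: dip fades out

# The overbraking chain grinds to a complete stop a few cars back, while
# the relaxed chain barely notices the disturbance by car 10.
```

The punchline matches the comment: whether a minor slowdown dissipates or becomes a phantom jam depends entirely on whether the cars behind amplify or absorb it.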
Humans are already experts at causing shockwave traffic jams, so I wouldn’t count on them to reduce them.
I can only imagine assholes messing with the car MORE because the lights are on.
So you don’t call the cops when you overtake them and see them eating a bowl of cereal, jerking it, while watching the Flintstones.
If we need warning lights for self driving cars, the technology is not ready.
🤦
if we need warning lights for ANYTHING, humanity is just not ready.
Same purpose as warning labels: to keep ding dongs alive so they can spend more money
Eh, it’s probably good to have regardless?
It’s less about being careful around the car and more about how you might interact with it. For example, honking the horn or flashing your beams wouldn’t have the same effect. On that note, it might be nice to have some way of telling a self driving car to temporarily use elevated sensors or something, the same way a horn tells a driver that something is wrong. As long as there’s a way to prevent abuse of the system.
I don’t know much about these lights, but we COULD use some new standards in general with how many things have changed with cars in recent years. Brake lights on electric vehicles being another thing to consider.
That “gentle horn” everyone wants being another
You’re still the driver in the self-driving car. If someone honks, you have pedals and a wheel in front of you. It always comes down to driver neglect. It’s like blaming the cruise control for speeding, but giving cruise control more responsibilities.
To play the devil’s advocate: early cars needed a guy with a flag in front of them because people were used to horses and carriages and not automobiles. After a while that stopped being a thing.
But yeah, self driving cars are not really ready.
As a Level 3 system, the driver is permitted to take their hands off the wheel, their feet off the pedals, and divert their attention away from the road. […]
The turquoise markers will alert other drivers to the fact that your vehicle is driving itself, so hopefully they won’t be alarmed if they see you doing other things while behind the wheel.
There are warning signs to indicate people learning to drive in ex-Soviet countries (such as yellow triangles to put behind the glass), even though they are driving with an instructor.
Now when I think about it, it’s been some time since I’ve seen that sign.
Somewhat similarly in the Netherlands: if you fail your practical driving exam three times, you can still get a license, but you can only drive cars marked with special yellow number plates.
The technology will never be ready if you don’t test it.
And I would argue we DON’T need warning lights since, while imperfect, most self-driving tech is already vastly better than your average driver. We should have warning lights for cars that DON’T have self-driving.
This is ultimately why we will NEVER have self-driving cars en masse, because society isn’t willing to take the necessary risks to improve the safety of everyone on the road.
How about we:
- Don’t let random customers test it and instead use heavily trained, specialized test drivers
- Require permitting and, e.g., an obstacle course before letting a company’s software be randomly updated and thrown on the road?
Why is there this constant false dichotomy implying that the only way to test self driving cars is a wild west of no regulation?
And also who said that self driving cars are safer than humans? Tesla’s numbers are all statistical lies (in fact Teslas were recently shown to have the most accidents), Cruise just shut down in SF because they were a liability, and Waymo is heavily limited in its time/weather/areas for driving.
Don’t let random customers test it and instead use heavily trained, specialized test drivers
At some point you need to test it on a large scale. Cruise was even running small-scale and was shut down in short order.
Require permitting and, e.g., an obstacle course before letting a company’s software be randomly updated and thrown on the road?
We do.
Why is there this constant false dichotomy implying that the only way to test self driving cars is a wild west of no regulation?
There isn’t.
And also who said that self driving cars are safer than humans?
…everyone?
https://arstechnica.com/cars/2023/12/human-drivers-crash-a-lot-more-than-waymos-software-data-shows/
Tesla’s numbers are all statistical lies (in fact Teslas were recently shown to have the most accidents)
[Citation needed]
Cruise just shut down in SF because they were a liability
This is actually a great example of exactly what I’m talking about: GM will shut down Cruise permanently because they’ve discovered what I just said: society has zero tolerance for literally anyone getting hurt by autonomous vehicles, whereas the tens of thousands of people who are killed on our roads every year by individuals is considered acceptable.
Sure. But we’re jumping into the deep end by legally allowing the driver to be exempt from distracted driving laws. There’s a big difference between testing the technology and relying on the technology.
legally allowing the driver to be exempt from distracted driving laws.
Can you cite the legislation that exempts drivers using driver assistance systems from paying attention while driving?
There’s a big difference between testing the technology and relying on the technology.
No one should be relying on the technology.
Brb, wiring up a set of these so I can blame the car for missing a speed sign
Even if this were a good idea, you can’t just put non-regulated lights on a car. This would need a law change in Germany to be approved and would probably take years of bureaucracy until the exact hue these lights need to emit gets figured out. But I guess Mercedes already wrote that law for our government to copy. How convenient.
Since it’s Mercedes-Benz doing it, they’ll just write the new law themselves and tell the German minister of traffic to push it through.
Yes, the German economy still heavily relies on the car industry.
And it’s not just leverage, they literally employ “consultants” (lobbyists) who draft bills which are then introduced into the legislative process and voted on by members of parliament who have neither the time nor the technical know-how to understand them. German car makers effectively write their own regulations.
I’ve heard a lot about how Germans are strict with their driving laws, but I never expected them to be straight-up boring for no good reason.
So should companies not try to innovate or invent things until the German government tells them it’s ok?
The point is that innovation should always come with regulations. This is not the wild west over here. We like to be alive, and companies usually don’t care about that, only about profits. So it’s a good idea that they can’t just do whatever they want. If they invent something actually new, I’m quite happy that a third party will have a look at it before it’s mounted on a vehicle that could kill me. I know that in the US this is handled the other way around, but I guess the statistics for car accidents agree with me.
This would not be illegal in the US, except some states forbid blue lights because they’re reserved for law enforcement. I haven’t seen any state regulation that rigorously defines “blue” the way the NHTSA does with references to CIE 1931.
They would also have to be distinct enough to not cause confusion with the existing lights.
But I guess Mercedes already wrote that law for our government to copy. How convenient.
How dare a company try to work with governments to create a new safety feature!
How is this a safety feature though? Are they saying we have to be extra careful around self-driving cars? If so then the car shouldn’t be considered to be self-driving. If not, then what’s the use?
I see a lot of people in this thread saying a car that needs any kind of indication of self-driving isn’t safe enough to be on the road, but that implies a single answer to questions like “is it safe enough?” In reality, different people will answer that question differently and their answer will change over time. I see it as a good thing to try to accommodate people who view self-driving cars as unsafe even when they are street-legal. So it’s not really a safety feature from all perspectives, but it is from the perspective of people who want to be extra cautious around those cars.
Personally I see an argument for self-driving cars that aren’t as safe as an average human driver. It’s basically the same reason you sometimes see cars with warning signs about student drivers: we wouldn’t consider student drivers safe enough to drive except that it’s a necessary part of producing safe drivers. Self-driving cars are the same, except that instead of individual drivers, it’s self-driving technology that we expect to improve and eventually become safer than human drivers.
Another way to look at it is that there are a lot of drivers who are below average in their driving safety for a variety of reasons, but we still consider them safe enough to drive. Think of people who are tired, emotional, distracted, ill, etc. It would be nice to have the same warning lights for those drivers, but since that’s not practical, having them only for self-driving cars is better than nothing.
Different regulations apply for the driver when the car is autonomous vs controlled by a driver.
These lights do not indicate driving assists like Tesla’s autopilot but full level 3 and above autonomy. In level 3 for example, Mercedes is responsible for any damages due to accidents - not the driver.
Also, in level 3 the driver may legally use their phone, which would normally be illegal and get them a ticket.
So there IS a legal requirement to find out about the autonomy level of a car from outside.