Expert: “Drivers rely too much on semi-autonomous driving assistants”
Almost all cars today have semi-autonomous driving assistants, e.g. adaptive cruise control with a lane departure warning. According to an American safety expert, these systems are marketed misleadingly, causing people to rely on them too much.
Tesla needs to update more than 360,000 cars in the United States, according to the National Highway Traffic Safety Administration (NHTSA). The agency, the American counterpart of the Dutch RDW, is not satisfied with Tesla's Full Self-Driving Beta system.
Tesla Full Self-Driving is only semi-autonomous
FSD Beta is a collection of semi-autonomous driving assistants and does not turn a Tesla into a self-driving car. It has long been clear that the system does not work optimally (see all the YouTube near-miss videos), but NHTSA is only now intervening.
Recall for over 360,000 Teslas with FSD
According to the organization's website, "FSD Beta may behave unsafely at intersections, such as traveling straight through an intersection from a turn-only lane, not coming to a complete stop at a stop sign, and driving through a yellow light."
NHTSA continues: "In addition, the system may not respond adequately to changes in posted speed limits or to the driver's intervention to slow down."
Elon Musk does not recognize his own hypocrisy
Tesla boss Elon Musk vented on Twitter, calling it unacceptable that NHTSA speaks of a recall. After all, the cars can be updated over the air. Then again, Full Self-Driving isn't actually self-driving either, so he shouldn't complain…
“People rely too much on systems”
In any case, the recall (or over-the-air update) of hundreds of thousands of Teslas highlights a problem in how semi-autonomous systems are developed and promoted, says David Harkey, director of the Insurance Institute for Highway Safety (IIHS).
Research from the IIHS shows that drivers who regularly use Level 2 ADAS (semi-autonomous driver assistance) forget that they are not driving a fully autonomous car.
"The systems need a driver who is alert and ready to intervene at all times," says Harkey. "Our research shows that drivers overestimate the systems' capabilities, despite warnings and despite being aware of accidents involving Tesla Autopilot."
Harkey emphasizes that there is currently no system that allows the driver to take their attention off the road and do other things. Modern driving assistants cannot replace humans.
“Autopilot and fully autonomous driving are misleading”
In addition, the safety expert calls the terms "Autopilot" and "Full Self-Driving" incredibly misleading. He also thinks Tesla isn't adequately protecting drivers from themselves: it's too easy to stop paying attention to the road, he says.