Over the course of 10 months, nearly 400 car accidents in the United States involved advanced driver assistance technologies.
In 392 incidents cataloged by the National Highway Traffic Safety Administration from July 1 of last year to May 15, six people died and five were seriously injured. Teslas operating with Autopilot, the more ambitious Full Self Driving mode or one of their component features were involved in 273 crashes.
The revelations are part of a sweeping effort by the federal agency to determine the safety of advanced driving systems as they become more common. Beyond the futuristic allure of self-driving cars, dozens of automakers have rolled out automated components in recent years, including features that let drivers take their hands off the wheel under certain circumstances and that assist with parallel parking.
In Wednesday’s release, the NHTSA announced that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford Motor, General Motors, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.
“These technologies hold promise for improving safety, but we need to understand how these vehicles perform in real-world situations,” said Steven Cliff, the agency’s administrator. “This will help our researchers quickly identify potential defect trends that are emerging.”
Speaking to reporters ahead of Wednesday’s publication, Dr. Cliff declined to draw any conclusions from the data collected so far, noting that it does not account for factors such as the number of cars from each manufacturer that are on the road and equipped with this type of technology.
“The data may raise more questions than it answers,” he said.
About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver assistance technologies, which may help explain why Tesla vehicles accounted for nearly 70 percent of the reported accidents.
Ford, GM, BMW and others have similarly advanced systems that allow hands-free driving on highways under certain conditions, but far fewer of those models have been sold. These companies have, however, sold millions of cars over the past two decades equipped with individual components of driver assistance systems, including lane keeping, which helps drivers stay in their lane, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic slows in front of it.
Dr. Cliff said the NHTSA would continue to collect data on crashes involving these types of features and technologies, noting that the agency would use it as a guide in creating rules or requirements for how they should be designed and used.
The data was collected under an order issued by the NHTSA a year ago requiring automakers to report accidents involving cars equipped with advanced driver assistance systems, also known as ADAS or Level-2 automated driving systems.
The order was prompted in part by crashes and fatalities over the past six years involving Teslas operating on Autopilot. Last week, the NHTSA expanded an investigation into whether Autopilot has technological and design flaws that pose safety risks. Since 2014, the agency has investigated 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. It had also opened a preliminary investigation into 16 incidents in which Teslas operating on Autopilot crashed into emergency vehicles that were stopped with their lights flashing.
In the order issued last year, the NHTSA also collected data on accidents and incidents involving fully automated vehicles, which are largely still under development but are being tested on public roads. The manufacturers of these vehicles include GM, Ford and other traditional automakers, as well as technology companies such as Waymo, which is owned by Google’s parent company.
These types of vehicles were involved in 130 incidents, the NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 in no injuries at all. Many of the accidents involving automated vehicles were minor fender benders, because the vehicles are mainly operated at low speeds and in urban areas.
Waymo, which operates a fleet of driverless cabs in Arizona, was involved in 62 incidents. GM’s Cruise division, which has just started offering driverless taxi rides in San Francisco, was involved in 23.
The NHTSA’s order was an unusually bold move for the regulator, which in recent years has come under fire for not being more assertive with automakers.
“The agency is collecting information to determine whether these systems in the field pose an unreasonable risk to safety,” said J. Christian Gerdes, a mechanical engineering professor and director of Stanford University’s Center for Automotive Research.
Tesla’s Autopilot System Problems
Card 1 of 5
Claims of safer driving. Tesla cars can use computers to handle certain aspects of driving, such as changing lanes. But there are concerns that this driver assistance system, called Autopilot, is not safe. Here is a closer look at the issue.
Driver assistance and crashes. A 2019 crash that killed a student shows how gaps in Autopilot and driver distraction can have tragic consequences. In another crash, a Tesla hit a truck, killing a 15-year-old California boy. His family sued the company, claiming the Autopilot system was partly responsible.
Shortcuts on safety? Former Tesla employees said the automaker may have undermined safety in designing its Autopilot driver assistance system to fit the vision of its chief executive, Elon Musk. Mr. Musk is said to have insisted that the system rely solely on cameras to track a car’s surroundings, rather than also using additional sensing devices, which most other self-driving systems employ.
The information gap. A lack of reliable data also hinders assessments of the system’s safety. Reports that Tesla publishes every three months suggest that accidents are less common with Autopilot than without, but the figures can be misleading: they do not account for the fact that Autopilot is used mainly for highway driving, which is generally about twice as safe as driving on city streets.
An advanced driver assistance system can independently steer, brake and accelerate vehicles, but drivers must remain alert and ready to take control of the vehicle at all times.
Safety experts are concerned because these systems allow drivers to relinquish active control of the car and give them the impression that their car is driving itself. When the technology malfunctions or cannot handle a particular situation, drivers may not be prepared to quickly take control.
The NHTSA’s order required companies to report crashes in which advanced driver assistance systems and automated technologies were in use within 30 seconds of impact. While this data provides a broader picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce accidents or otherwise improve safety.
The agency has not collected any data that would allow researchers to easily determine whether using these systems is safer than disabling them in the same situations.
“The question is: What is the baseline against which we compare this data?” said Dr. Gerdes, the Stanford professor, who served as the first chief innovation officer for the Department of Transportation, of which the NHTSA is a part, from 2016 to 2017.
But some experts say the goal shouldn’t be to compare these systems to human driving.
“When a Boeing 737 falls from the sky, we don’t ask, ‘Does it fall from the sky more or less often than other planes?’” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.
“Accidents on our roads are equivalent to several plane crashes per week,” he added. “Comparison is not necessarily what we want. If there are crashes that these driver assistance systems contribute to — crashes that otherwise wouldn’t have happened — that’s a potentially solvable problem that we need to be aware of.”