Every three months, Tesla publishes a safety report showing the number of miles between crashes when drivers use the company’s driver assistance system, Autopilot, and the number of miles between crashes when they don’t.
These numbers always show that accidents are less frequent with Autopilot, a collection of technologies that can steer, brake and accelerate Tesla vehicles on their own.
But the numbers are misleading. Autopilot is mainly used for highway driving, which is generally twice as safe as driving on city streets, according to the Department of Transportation. There may be fewer crashes with Autopilot just because it is typically used in safer situations.
Tesla has not provided any data to compare Autopilot’s safety on the same types of roads. Nor have other automakers that offer similar systems.
Autopilot has been operating on public roads since 2015. General Motors introduced Super Cruise in 2017, and Ford Motor released BlueCruise last year. But publicly available data that reliably measures the safety of these technologies is scarce. American drivers — whether they use these systems or share the road with them — are effectively guinea pigs in an experiment whose results have not yet been revealed.
Automakers and tech companies are adding more vehicle features that they claim improve safety, but these claims are difficult to verify. All the while, the number of fatalities on the country’s highways and streets has soared in recent years, reaching a 16-year high in 2021. Whatever additional safety these technological advances provide, it has apparently not compensated for poor decisions made by drivers behind the wheel.
“There is a lack of data that would give the public confidence that these systems, when deployed, will meet the expected safety benefits,” said J. Christian Gerdes, a professor of mechanical engineering and co-director of Stanford University’s Center for Automotive Research, who was the Department of Transportation’s first chief innovation officer.
GM collaborated with the University of Michigan on a study that examined the potential safety benefits of Super Cruise, but concluded they didn’t have enough data to understand whether the system reduced accidents.
A year ago, the National Highway Traffic Safety Administration, the government’s regulatory agency for vehicle safety, ordered companies to report potentially serious accidents involving advanced driver assistance systems like Autopilot within a day of becoming aware of them. The order said the agency would make the reports public, but it has not yet done so.
The safety agency declined to comment on the information it had gathered so far, but said in a statement that the data would be released “in the near future.”
Tesla and its CEO, Elon Musk, did not respond to requests for comment. GM said it had reported two Super Cruise incidents to the NHTSA: one in 2018 and one in 2020. Ford declined to comment.
The agency’s data is unlikely to provide a complete picture of the situation, but it could encourage lawmakers and regulators to take a closer look at these technologies and ultimately change the way they are marketed and regulated.
“To solve a problem, you first have to understand it,” said Bryant Walker Smith, an associate professor in the University of South Carolina’s Law and Engineering Schools who specializes in emerging transportation technologies. “This is a way to get more ground truth as a basis for investigations, regulation and other actions.”
Despite its capabilities, Autopilot does not relieve the driver of responsibility. Tesla tells drivers to stay alert and be ready at all times to take control of the car. The same goes for BlueCruise and Super Cruise.
But many experts worry that because these systems allow drivers to relinquish active control of the car, they may trick drivers into thinking their car is driving itself. Then, if the technology malfunctions or cannot handle a situation on its own, drivers may not be prepared to take control as quickly as necessary.
Older technologies, such as automatic emergency braking and lane departure warning, have long provided drivers with safety nets by slowing or stopping the car or alerting drivers when they deviate from their lane. But newer driver assistance systems reverse that arrangement by turning the driver into the technology’s safety net.
Safety experts are particularly concerned about Autopilot because of the way it is being marketed. For years, Mr. Musk has said the company’s cars were on the cusp of being truly autonomous — driving themselves in just about any situation. The name of the system also implies automation that the technology has not yet reached.
This can lead to driver complacency. Autopilot has played a role in many fatal accidents, in some cases because drivers were not ready to take over the car.
Mr. Musk has long promoted Autopilot as a way to improve safety, and Tesla’s quarterly safety reports seem to back him up. But a recent study by the Virginia Transportation Research Council, a branch of the Virginia Department of Transportation, shows that these reports are not what they seem.
“We know that cars that use Autopilot are less likely to crash than those that do not use Autopilot,” said Noah Goodall, a researcher with the council who investigates safety and operational issues surrounding autonomous vehicles. “But are they driven the same way, on the same roads, at the same time of day, by the same drivers?”
The Insurance Institute for Highway Safety, an insurance industry-funded nonprofit research organization, has analyzed police and insurance data and found that older technologies such as automatic emergency braking and lane departure warning have improved safety. But the organization says studies have not yet shown that driver assistance systems provide comparable benefits.
Part of the problem is that police and insurance records don’t always indicate whether these systems were in use at the time of an accident.
The federal safety agency has instructed companies to provide data on accidents in which driver-assistance technologies were in use within 30 seconds of the collision. This could provide a broader picture of how these systems perform.
But even with that data, security experts said, it will be difficult to determine whether using these systems is safer than disabling them in the same situations.
The Alliance for Automotive Innovation, a trade group for auto companies, has warned that the safety agency’s data could be misinterpreted or misrepresented. Some independent experts have expressed similar concerns.
“My major concern is that we will have detailed data on accidents involving these technologies, without comparable data on accidents involving conventional cars,” said Matthew Wansley, a professor at New York’s Cardozo School of Law who specializes in emerging automotive technologies and was formerly general counsel at an autonomous vehicle start-up called nuTonomy. “It could look like these systems are a lot less safe than they actually are.”
For these and other reasons, automakers may be reluctant to share certain details with the agency. Under its order, companies can ask it to withhold certain data by claiming it would reveal trade secrets.
The agency also collects crash data on automated driving systems — more advanced technologies aimed at removing drivers from cars entirely. These systems are often referred to as “self-driving cars.”
For the most part, this technology is still being tested in a relatively small number of cars with drivers behind the wheel as backup. Waymo, a company owned by Google’s parent company Alphabet, operates a driverless service in the Phoenix suburbs, with similar services planned in cities such as San Francisco and Miami.
Companies are already required to report accidents involving automated driving systems in some states. The federal safety agency’s data, which covers the entire country, should also provide more insight into this area.
But the most immediate concern is the safety of Autopilot and other driver assistance systems, which are installed on hundreds of thousands of vehicles.
“There is an open question: Does Autopilot increase or decrease the crash rate?” said Mr. Wansley. “We may not get a complete answer, but we will get some useful information.”