Last fall, Missy Cummings sent a paper to her colleagues at the National Highway Traffic Safety Administration that revealed a startling trend: When people using advanced driver assistance systems are killed or injured in a car crash, they are more likely to have been speeding than people driving cars on their own.
The two-page analysis of nearly 400 accidents involving systems like Tesla’s Autopilot and General Motors’ Super Cruise is far from conclusive. But it raises new questions about the technologies that have been installed in hundreds of thousands of cars on American roads. Dr. Cummings said the data indicated that drivers were placing too much trust in the systems’ capabilities and that automakers and regulators should restrict when and how the technology was used.
People “rely too much on technology,” she said. “They are letting the cars speed up. And they are having accidents that seriously injure or kill them.”
Dr. Cummings, a professor of engineering and computer science at George Mason University who specializes in autonomous systems, recently returned to academia after more than a year at the safety agency. On Wednesday, she will present some of her findings at the University of Michigan, a short drive from Detroit, the main center of the American auto industry.
Systems like Autopilot and Super Cruise, which can steer, brake and accelerate vehicles on their own, are becoming increasingly common as automakers compete to win over car buyers with promises of superior technology. Companies sometimes market these systems as if they made cars self-driving. But their legal fine print requires drivers to remain vigilant and ready to take control of the vehicle at any time.
In interviews last week, Dr. Cummings said automakers and regulators should prevent such systems from operating above the speed limit and require drivers using them to keep their hands on the wheel and their eyes on the road.
“Car companies, meaning Tesla and others, are marketing this as hands-free technology,” she said. “That’s a nightmare.”
But these are not measures that NHTSA can easily put in place. Any effort to control how driver assistance systems are used is likely to face criticism and lawsuits from the auto industry, especially from Tesla and its chief executive, Elon Musk, who has long chafed at rules he considers outdated.
Safety experts also said the agency was chronically underfunded and lacked enough qualified staff to do its job properly. The agency has also operated without a Senate-confirmed permanent leader for much of the past six years.
Dr. Cummings acknowledged that the rules she called for would be difficult to put in place. She said she also knew her comments could again inflame supporters of Mr. Musk and Tesla, who attacked her on social media and sent her death threats after she was named a senior adviser to the safety agency.
But Dr. Cummings, 56, one of the first female combat pilots in the Navy, said she felt compelled to speak out because “humans abuse technology.”
“We need to put regulations in place that deal with this,” she said.
The safety agency and Tesla did not respond to requests for comment. GM pointed to studies it had conducted with the University of Michigan that examined the safety of its technology.
Because Autopilot and similar systems allow drivers to relinquish active control of the car, many safety experts worry that the technology will lead people to believe the cars are driving themselves. When the technology malfunctions or cannot handle a situation like swerving around a stopped vehicle, drivers may not be ready to take control quickly enough.
The systems use cameras and other sensors to check whether a driver’s hands are on the wheel and whether his or her eyes are watching the road, and they will disengage if the driver is not attentive for a significant amount of time. But they can operate for stretches while the driver is not focused on driving.
Dr. Cummings has long warned, in academic papers, interviews and posts on social media, that this could be a problem. She was named a senior adviser for safety at NHTSA in October 2021, shortly after the agency began collecting crash data involving cars using driver assistance systems.
Mr. Musk responded to her appointment in a post on Twitter, accusing her of being “extremely biased against Tesla,” without citing any evidence. That set off a flurry of similar statements from his supporters on social media and in emails to Dr. Cummings.
She said she eventually had to close her Twitter account and temporarily leave her home because of the harassment and death threats she was receiving at the time. One threat was serious enough that it was investigated by the police in Durham, N.C., where she lived.
Many of the claims were absurd and false. Some of Mr. Musk’s supporters noted that she had served on the board of directors of Veoneer, a Swedish company that sells sensors to Tesla and other automakers, but confused the company with Velodyne, an American company whose laser sensor technology, called lidar, is seen as a competitor to the sensors Tesla uses for Autopilot.
“We know you own lidar companies and if you accept the NHTSA advisory position, we will kill you and your family,” one email sent to her said.
Jennifer Homendy, who heads the National Transportation Safety Board, the agency that investigates serious car accidents, and who has also been attacked by Musk fans, told CNN Business in 2021 that the false claims about Dr. Cummings were a “calculated attempt to divert attention from real safety issues.”
Before joining NHTSA, Dr. Cummings left the Veoneer board, sold her shares in the company and recused herself from agency investigations that involved only Tesla, one of which had been announced before her arrival.
The analysis she sent to agency officials in the fall looked at advanced driver assistance systems from several companies, including Tesla, GM and Ford Motor. When cars using these systems were involved in fatal crashes, they were traveling over the speed limit 50 percent of the time. In crashes with serious injuries, they were speeding 42 percent of the time.
In crashes that did not involve driver assistance systems, those numbers were 29 and 13 percent.
The amount of data the government has collected on accidents involving these systems is still relatively small. Other factors could be biasing the results.
Advanced driver assistance systems are used much more often on highways than on city streets, for example. And the crash data that Dr. Cummings analyzed is dominated by Tesla, because its systems are more widely used than others. That could mean the results unfairly reflect on the performance of systems offered by other companies.
During her time at the federal safety agency, she also examined so-called phantom braking, which is when driver assistance systems cause cars to slow or stop for no apparent reason. Last month, for example, the news site The Intercept published footage of a Tesla vehicle that inexplicably braked in the middle of the Bay Bridge connecting San Francisco and Oakland, causing an eight-car pileup that injured nine people, including a 2-year-old child.
Dr. Cummings said data from automakers and customer complaints showed that this was a problem with multiple driver assistance systems, as well as with robotaxis developed by companies including Waymo, which is owned by Google’s parent company, and Cruise, a division of GM. Now being tested in several cities, these autonomous taxis are designed to operate without a driver and carry passengers in San Francisco and the Phoenix area.
Many crashes apparently occur because people driving behind these cars are not prepared for those erratic stops. “Cars are braking in a way that people don’t anticipate and can’t respond to,” she said.
Waymo and Cruise declined to comment.
Dr. Cummings said the federal safety agency should work with automakers to restrict advanced driver assistance systems through its standard recall process, in which companies agree to make changes voluntarily.
But experts questioned whether automakers would make such changes without a significant fight.
The agency could also set new rules that explicitly control the use of these systems, but this would take years and could lead to lawsuits.
“NHTSA could do this, but would the courts uphold it?” said Matthew Wansley, a professor at the Cardozo School of Law at Yeshiva University in New York who specializes in emerging automotive technologies.
Dr. Cummings said robotaxis were being rolled out at roughly the right pace: They are being deployed in limited tests, and federal, state and local regulators are reining in their growth until the technology is better understood.
But, she said, the government must do more to ensure the safety of advanced driver assistance systems like Autopilot and Super Cruise.
NHTSA “needs to flex its muscles more,” she said. “You shouldn’t be afraid of Elon or moving the markets if there is obvious unreasonable risk.”