Viewpoints

Why the “Hands-off” Approach to Automated Driving Systems?

Michael J. Quinn

June 9, 2021

I was shocked and dismayed when a Washington State legislator told me that cars controlled by automated driving systems needed no regulation because they would undoubtedly have fewer accidents than human-driven vehicles. Regulation and innovation can co-exist, and it is the responsibility of government to protect public safety. Granted, the introduction of automated driving systems will likely reduce the accident rate, but if proper safety regulations can reduce that rate even further, so much the better. And it has now become clear that proper regulation of the safety of automated driving systems is needed.

Let’s start with the good news: A well-designed automated driving system can have a far lower accident rate than human drivers. The success of Waymo is a case in point. Waymo is now offering Waymo One, a fully autonomous taxi service in a geographically restricted area: the East Valley of Phoenix, Arizona. Waymo’s automated driving system, called Waymo Driver, logged more than 6 million miles of public-road driving in 2019 without a single high-severity collision (a collision with at least a 10% chance of someone being seriously hurt) [1].

Moreover, Waymo engineers recently published a study exploring how Waymo Driver could have changed the outcomes of 72 fatal two-vehicle accidents that occurred in that same area near Phoenix over the ten-year period 2008-2017. For each accident, they simulated what would have happened if Waymo Driver had been controlling one of the two vehicles. According to the simulations, Waymo Driver would have avoided the collision in 100% of the scenarios where it replaced the crash initiator (the driver making the first unexpected maneuver leading to the collision). When Waymo Driver replaced the crash responder (the driver responding to the other driver’s unexpected maneuver), an accident would have been avoided in 82% of the test cases. In another 10% of the responder cases, the collision would have been less severe. In the remaining 8% of the responder cases, Waymo Driver would have been unable to change the outcome because its vehicle was being rear-ended [1]. Waymo’s simulation results are credible: according to the National Highway Traffic Safety Administration (NHTSA), human error is the critical factor in 94% of serious crashes [2].

It makes sense, then, that the US Department of Transportation favors the safe development and deployment of automated vehicle technology [3]. What I question is the means the NHTSA is using to achieve that end. The NHTSA has chosen to take “a nonregulatory approach to automated vehicle technology safety,” issuing only voluntary guidance for the development of automated driving systems [3]. Remarkably, a manufacturer requires no approval from the federal government before it begins testing or deploying automated driving systems in the United States.

In the years since the voluntary guidance was published, not every developer of an automated driving system has been as successful as Waymo. In particular, Tesla has not deployed its technology in a safe manner. In 2015 Tesla released an SAE Level 2 automation system called Autopilot, which consisted of traffic-aware cruise control, lane-keeping functionality, and the ability to change lanes. Tesla wrote on its Web page: “While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car” [4]. With that disclaimer, Tesla gave its vehicle owners a great deal of latitude to engage the system. Tesla did not restrict the use of Autopilot to freeways, and it allowed drivers to set cruising speeds well above the speed limit [5]. Most important, Tesla did not implement systems inside the vehicle to track whether the driver was paying attention to the road ahead. These decisions contributed to the May 2016 death of Joshua Brown, whose Tesla Model S, with Autopilot engaged, failed to brake when a semitrailer truck turned in front of it in clear daylight on a divided highway in Florida [6].

More recently, on May 5, 2021, a Tesla crashed into an overturned semitrailer truck on the 210 freeway in Fontana, California, killing the Tesla’s driver, Steven Hendrickson, and seriously injuring a man who was helping the truck driver out of his vehicle. Although it is unclear whether Autopilot was engaged at the time of the accident, Hendrickson was a member of a Tesla club who had posted videos of himself riding in the driver’s seat without his hands on the wheel or his foot on the pedal as the Tesla cruised down a freeway [7].

On May 15, 2021, a sheriff’s deputy in Washington State parked his patrol car on a road shoulder with its lights flashing, then got out of his car to investigate another vehicle that had run into a power pole. Less than a minute later, a Tesla Model S with Autopilot enabled crashed into his patrol car [8].

As these examples show, vehicles equipped with automated driving systems have killed or injured their drivers and innocent members of the public. That matters for adoption: people are more likely to adopt systems they trust [9], and safety and reliability are widely recognized as essential attributes of trustworthy AI systems [10, 11, 12, 13]. A recently published monograph, How Humans Judge Machines by César Hidalgo et al., provides evidence that people are less forgiving of accidents caused by machines than of those caused by humans [14]. The experiments documented in the book reveal that people judge the actions of other people based on their intentions, but they judge the actions of machines based on their outcomes. Hence it is particularly important that automated driving systems demonstrate a high level of reliability and safety to win the public’s trust. The NHTSA should not leave it up to individual automobile manufacturers to decide how safe they want their products to be. Instead, all manufacturers should be required to meet minimum safety standards.

The National Transportation Safety Board (NTSB) has investigated several Tesla accidents. In a remarkable letter written to the NHTSA in February 2021, Robert L. Sumwalt, III, the Chairman of the NTSB, calls the NHTSA’s nonregulatory approach to ensuring safety “misguided” [15]. The letter questions whether Autopilot-equipped vehicles should even be on the road. It makes the following recommendation to the NHTSA: “Evaluate Tesla Autopilot-equipped vehicles to determine if the system’s operating limitations, the foreseeability of driver misuse, and the ability to operate the vehicles outside the intended operational design domain pose an unreasonable risk to safety” [15]. The letter also recommends that the NHTSA develop standards for systems that monitor driver engagement, as well as a method to verify that vehicles have safeguards preventing automated vehicle control systems from being used outside the conditions for which they were designed. The NTSB further recommends that once these standards are developed, the NHTSA require driver monitoring systems in all new passenger vehicles with SAE Level 2 automation.

The NTSB’s recommendations are reasonable; the required technology is already in use. For example, Cadillac’s Super Cruise system appears to meet the standards proposed by the NTSB. Super Cruise can be used only on divided, limited-access highways, and it incorporates a Driver Attention Camera system designed to ensure the driver is looking at the road ahead [16]. The independent testing organization Consumer Reports concluded that Super Cruise “does the best job of balancing high-tech capabilities with ensuring that the car is operated safely and that the driver is paying attention” [17].

The NHTSA’s hands-off approach to automated driving systems is wrong. We do not need to choose between innovation and regulation; we can have both. It is naïve to expect that companies will always do the right thing, and the NHTSA’s nonregulatory approach has led to unnecessary injuries and deaths. The NTSB recommendations are reasonable and will reduce accidents, increase public trust in automated driving systems, and help ensure there will not be an accident-fueled backlash when more cars equipped with these systems hit the road.

 

[1] John M. Scanlon, Kristofer D. Kusano, Tom Daniel, Christopher Alderson, Alexander Ogle, and Trent Victor. “Waymo Simulated Driving Behavior in Reconstructed Fatal Crashes within an Autonomous Vehicle Operating Domain.” Waymo. 2021. https://storage.googleapis.com/waymo-uploads/files/documents/Waymo-Simulated-Driving-Behavior-in-Reconstructed-Collisions.pdf

[2] National Highway Traffic Safety Administration. “Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey.” DOT HS 812 115. US Department of Transportation. February 2015. https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812115

[3] National Highway Traffic Safety Administration. Automated Driving Systems 2.0: A Vision for Safety. US Department of Transportation. September 2017. https://www.nhtsa.gov/sites/nhtsa.gov/files/documents/13069a-ads2.0_090617_v9a_tag.pdf

[4] “Your Autopilot Has Arrived.” Tesla Motors. October 14, 2015. https://www.tesla.com/blog/your-autopilot-has-arrived

[5] Alice Truong. “Elon Musk Is Going to Pull Back on Autopilot Mode to Keep Tesla Drivers from ‘Doing Crazy Things.’” Quartz. November 5, 2015. https://qz.com/542618/elon-musk-is-going-to-pull-back-on-autopilot-mode-to-keep-tesla-drivers-from-doing-crazy-things/

[6] “Collision Between a Car Operating with Automated Vehicle Control Systems and a Tractor-Semitrailer Truck Near Williston, Florida, May 7, 2016.” National Transportation Safety Board. NTSB/HAR-17/02, PB2017-102600. September 12, 2017. https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1702.pdf

[7] Daisy Nguyen. “Tesla Driver in Fatal California Crash Had Posted Videos of Himself in Vehicle.” Los Angeles Times. May 16, 2021. https://www.latimes.com/california/story/2021-05-16/tesla-driver-in-fatal-california-crash-had-post-videos-of-himself-in-vehicle

[8] Kate Duffy. “A Tesla Running on Autopilot Smashed into a Deputy’s Patrol Vehicle in Washington State, Police Say, Causing ‘Significant Damage.’” Business Insider. May 18, 2021. https://www.businessinsider.com/tesla-autopilot-crash-model-s-deputy-vehicle-washington-state-damage-2021-5

[9] Microsoft. The Future Computed: Artificial Intelligence and Its Role in Society. Redmond, Washington. February 8, 2018. https://blogs.microsoft.com/wp-content/uploads/2018/02/The-Future-Computed_2.8.18.pdf

[10] “Responsible AI” (webpage). Microsoft Corporation. https://www.microsoft.com/en-us/ai/responsible-ai. Accessed June 1, 2021.

[11] “Artificial Intelligence at Google: Our Principles” (webpage). Google. https://ai.google/principles/. Accessed June 1, 2021.

[12] The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, First Edition. IEEE, 2019. https://standards.ieee.org/content/ieee-standards/en/industry-connections/ec/autonomous-systems.html

[13] Pontifical Academy for Life, Microsoft, IBM, FAO, and Italian Ministry of Innovation. “Rome Call for AI Ethics.” Vatican City. February 28, 2020. https://www.romecall.org/wp-content/uploads/2021/02/AI-Rome-Call-x-firma_DEF_DEF_con-firme_.pdf

[14] César A. Hidalgo, Diana Orghian, Jordi Albo-Canals, Filipa de Almeida, and Natalia Martin. How Humans Judge Machines. The MIT Press. 2021. https://www.judgingmachines.com/

[15] Robert L. Sumwalt, III, Chair, National Transportation Safety Board. Letter to US Department of Transportation. February 1, 2021. https://downloads.regulations.gov/NHTSA-2020-0106-0617/attachment_1.pdf

[16] “Designed to Take Your Hands and Breath Away” (webpage). Cadillac. https://www.cadillac.com/ownership/vehicle-technology/super-cruise. Accessed June 6, 2021.

[17] Patrick Olsen. “Cadillac Tops Tesla in Consumer Reports’ First Ranking of Automated Driving Systems” (webpage). Consumer Reports. October 4, 2018. https://www.consumerreports.org/autonomous-driving/cadillac-tops-tesla-in-automated-systems-ranking/