This week’s question is from Marianne K. from Pacific Heights, who asks: “I read your column last week about the first Uber robot-car pedestrian death in Arizona. Then, the next day, there was a fatal crash involving a Tesla on ‘Autopilot.’ What is going on? Who is responsible when a car on Autopilot crashes?”
Dear Marianne, this series of fatal tragedies is a huge wake-up call. The technology is not yet ready for mainstream introduction into our roads and highways. Driving is a complex, dynamic activity that utilizes all of our faculties. While carefully developed autonomous vehicles of the future may increase safety, current automated driving features may not be as safe as touted.
Sadly, this is not the first Autopilot death. In May 2016, Joshua Brown, a former Navy SEAL, died near Williston, Florida, when his Tesla Model S collided with a tractor-trailer while the “Autopilot” mode was engaged. The crash happened when the truck made a left turn across his path while he was traveling at 74 mph. The Model S, which relied heavily on cameras in its operation, did not recognize the trailer because its white side blended into the bright, overcast sky. A major National Transportation Safety Board (NTSB) investigation ensued, resulting in the June 2017 issuance of a 500-page report on the crash. The NTSB found no system failures and reported that, during a 37-minute period of the trip when Brown was required to have his hands on the wheel, he apparently did so for just 25 seconds. While dispelling the urban myth that Brown was watching a Harry Potter video on the Model S control panel, the report found that Brown received a visual warning reading “Hands Required Not Detected” on seven separate occasions, each lasting one to three seconds.
In September 2016, before the NTSB report came out, Tesla, having reviewed the data from the Brown crash, announced “improvements” to its Autopilot system, adding new “restrictions” on its hands-off driving features and making greater use of radar, changes its chief executive officer said likely would have prevented the crash. The updated system was said to temporarily block drivers from using Autopilot if they did not respond to audible warnings to take back manual control. Now, it seems, the improvements were not enough to prevent another death.
Tesla has been criticized for branding the driver assistance package as “Autopilot.” Both Consumer Reports and the German government asked Tesla to stop using the Autopilot moniker because it conveys a false sense of security, leading to driver inattention and abdication of operational control. Indeed, German transport minister Alexander Dobrindt asked Tesla to drop the term, arguing that it can lead consumers to think the car has greater abilities than it does. Tesla refused, stating, “Just as in an airplane, when used properly, Autopilot reduces driver workload and provides an added layer of safety when compared to purely manual driving.” While the mechanics of Tesla’s response may be true, it dodges the reality: drivers are treating the feature as if it were an autopilot that can assume control while the driver “multitasks.”
Tesla is already on the defensive (by mounting a strong offense), releasing information from the data it records and saying the driver, Wei Huang, should have had about five seconds and 500 feet of unobstructed view of the concrete barrier he struck. According to Tesla, Huang did not have his hands on the wheel for the six seconds prior to impact. Tesla also reports that, earlier in the drive, Huang had been given multiple visual warnings and one audible warning to put his hands back on the wheel.
Tesla also released photos showing that a crash-absorbing barrier, which could have attenuated the impact, was missing. It seems the barrier may have been removed following an earlier crash and had yet to be replaced. In short, Tesla is saying the crash was either Huang’s fault or Caltrans’ fault.
Legally, the analysis will examine the relative fault associated with the collision. What is Tesla’s responsibility for its Autopilot failing to merge right or left at the Y in the highway and, instead, heading into a concrete wall faced with a reflective panel? Did the reflective material on the warning sign interfere with the radar that Tesla came to emphasize after the Brown fatality in Florida? Was there a problem with the mapping software, which may have been thrown off by a change in conditions once the barrier was removed? Likewise, Mr. Huang’s conduct will be evaluated: was he unreasonable in his reliance on a system touted as an “autopilot”? Finally, what is Caltrans’ responsibility, if any, for failing to replace the crash absorption system? Did its failure to act create a “dangerous condition of public property” as defined by Government Code Section 830?
As the technology is still in its infancy, it is unclear who will bear responsibility. One thing is clear, though: the courts will play a pivotal role, as they always have, in apportioning responsibility. Let’s just hope that not too many more people have to die before the courts or the legislature work out the regulatory bugs.