A look at the semi-autonomous road ahead
As the big debate on driverless cars revs up, Florida-based Eric D. Ruben, Senior Counsel, Clyde & Co. shares his insights on the current landscape and assesses the risks moving forward…
When the US Transportation Secretary unveiled President Obama’s plan to provide a 10-year, US$4 billion investment in autonomous vehicle development earlier this year, America signaled its faith in technology to save us from ourselves.
Even with all of the incredible advancements in vehicle safety, the National Highway Traffic Safety Administration (NHTSA) reported 32,675 people died in automobile accidents in the US in 2014 and a staggering 2.34 million more suffered injuries. Further, an NHTSA survey conducted from 2005 to 2007 and presented to Congress in 2008 concluded that driver error was the critical reason for 94 per cent of all crashes involving light vehicles. That means that on average a person dies in a car crash in America about every 16 minutes and more than four people are injured in accidents every minute of every day; and, to be frank, the finger of blame can almost always be pointed in one direction – us.
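Those frequencies follow directly from the annual totals. As a rough back-of-the-envelope check, assuming a 365-day year (525,600 minutes):

\[
\frac{525{,}600 \text{ minutes}}{32{,}675 \text{ deaths}} \approx 16.1 \text{ minutes per death}, \qquad
\frac{2{,}340{,}000 \text{ injuries}}{525{,}600 \text{ minutes}} \approx 4.5 \text{ injuries per minute}
\]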
Let’s set aside the overwhelming human carnage that has no doubt somehow affected every reader of this article. A 2011 AAA study found accidents in the US cost roughly US$300 billion per year in deaths, health care and property loss (yes, BILLION). That same study estimated the costs of traffic congestion, including, for example, lost productivity and wasted gasoline, at US$100 billion. In total, the costs of car accidents in terms of injuries, death and human delays are roughly 2.6 per cent of American GDP. In other words, even if you are fortunate enough to be spared the anguish of injury or death to yourself or a loved one, you are still taking a major hit to your wallet.
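That share of GDP is consistent with the dollar figures above. Assuming US nominal GDP of roughly US$15.5 trillion in 2011, a back-of-the-envelope check gives:

\[
\frac{\text{US\$300bn} + \text{US\$100bn}}{\text{US\$15.5tn}} \approx 0.026 \approx 2.6 \text{ per cent}
\]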
But all is not gloom, doom, and despair. The self-driving cars of futuristic Hollywood are already roaming the streets, learning our roads, signs, signals and habits to take the life-and-death responsibility of driving out of human hands.
It is no longer fantasy. One day in the not too distant future, drunk driving will be a thing of the past. Texting and driving will be a misnomer. The elderly and disabled will gain or regain the independence of unassisted transportation. Our auto insurance premiums will take a nosedive or even disappear. The cost of transporting goods will plummet. Car accident attorney referral services will fizzle, and personal injury lawyers will be standing on street corners with a cup. Okay, that last one is a bit of an exaggeration.
However, the car buyer’s options will not be as simple as full autonomy or no autonomy. The May 7, 2016 fatal crash involving a Tesla Model S with “Autopilot” introduced the first resounding thump on the rocky road that is semi-autonomy.
NHTSA has identified five levels of automation: Level 0 is no automation; Level 1 includes function-specific automation, such as adaptive cruise control, with the driver still in overall control; Level 2 (like the Tesla Model S) combines automated functions so that drivers can cede limited control in certain situations; Level 3 is limited self-driving, in which the vehicle handles safety-critical functions under certain conditions but the driver must remain ready to regain control; and Level 4 is full automation (no steering wheel or pedals necessary).
It is the Level 2 and Level 3 vehicles that will likely trigger significant litigation and lead to other growing pains that could stall, disrupt, or even temporarily raise premiums and other costs.
The legal intrigue of Level 2 and Level 3 vehicles lies not necessarily with the technology, but with the very concept of rotating responsibility between man and machine. The NHTSA definition of a Level 2 vehicle, for example, is chock-full of vague, interpretation-prone, litigation-inviting wording. It states that the driver can cede “active primary control in certain limited driving situations,” yet the driver remains “responsible for monitoring the roadway and safe operation and is expected to be available for control at all times and on short notice.” Where does a trial lawyer even begin? What are “certain limited driving situations?” Something tells me plaintiffs and defendants will disagree. Perhaps more ambiguous, what is “short notice?” Is it 30 seconds? Five? An instant?
One look at the 2016 Tesla Model S owner’s manual furthers this point. The Driver Assistance section includes 52 individual warnings and six cautions. Then there is the “catch-all” informing the driver of ultimate responsibility: “Never depend on these components to keep you safe. It is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times.” None of the foregoing should be read to suggest the Model S Autopilot technology isn’t an amazing feat and generally extremely safe when used properly. In fact, Tesla stated that the fatal event in May was the first in over 130 million miles driven with Autopilot engaged, compared with one death roughly every 94 million miles among all vehicles in the US.
But plaintiffs’ lawyers are already salivating over potentially vague statements and arguable consumer expectations, not to mention the typical design and manufacturing defect claims that will be tailored to autonomous vehicles (e.g., autonomous parts out of specification, alleged software or algorithm defects, and the inevitable supposedly better alternative design). For example, Tab Turner, one of the country’s most successful vehicle defect plaintiffs’ attorneys, was recently quoted comparing the Autopilot system to an “attractive nuisance,” arguing it is dangerous to tell drivers they can cede control while simultaneously telling them they had better keep monitoring at all times.
Frankly, the very name “Autopilot” could suggest a total cessation of control more suited to at least a Level 3 vehicle. Consumer Reports has urged Tesla to disable Autopilot until additional safeguards are in place, saying the word “Autopilot” can give consumers a false sense of security. John Krafcik, CEO of Google’s self-driving car project, seems to agree; he recently stated Google abandoned development of Level 2 vehicles after finding test drivers were not paying attention even when told to do so.
As things stand today, Americans are not eager to hand total control of the wheel to a machine. A University of Michigan study published in May 2016 (the same month as the fatal Tesla crash) found that almost half of respondents had no interest in self-driving cars, and only 15.5 per cent wanted a fully autonomous vehicle. Perhaps movies like Terminator and The Matrix have conditioned us to mistrust machines with our safety. Then there is the general human desire to maintain even a false sense of control.
Regardless of the reason, we are likely to see Level 2 and Level 3 vehicles in significant numbers before Americans accept fully autonomous vehicles. That leaves insurers unclear as to whether, and to what extent, drivers could be held responsible for crashes. We can then prepare for intricate, conflicting arguments of comparative negligence: who was in control at the time of the accident, who should have been in control, did the vehicle perform properly, and was it properly maintained?
Autonomous vehicles, from Level 1 to Level 4, will save lives and save money. But there is reason for caution on the semi-autonomous road ahead, which could fill Americans with questions and doubt as sensationalised stories follow each crash, leading ultimately to a future of litigation in which juries listen to “he said, it said.”