Have you ever imagined being on a busy highway surrounded by student drivers? It’s a scary thought, isn’t it? In certain US cities, it is now a dangerous reality: public roads have become a testing ground for self-driving vehicles. Leaders in the transportation industry have had little choice but to speak out on the matter. Uber Technologies CEO Dara Khosrowshahi put it this way: “Ultimately, self-driving cars will be safer than humans. But right now, self-driving cars are learning. They’re student drivers.”

Several companies are developing self-driving technologies, and each is taking a different approach. Some, such as Tesla, GM, and Volvo, are testing limited self-driving features that assume the human driver will keep paying full attention to the road and stay ready to take control of the vehicle. Others, such as Google’s Waymo and Cambridge-based nuTonomy, are testing vehicles designed to navigate without any assistance from a human driver.

These fully and semi-autonomous cars have been on American roads for a few years now, but they have recently been making headlines at an increasing rate. In the last month, two people have died in collisions involving self-driving technology: a California man was killed when his Tesla Model X, operating in semi-autonomous mode, collided with a roadside barrier, and an Arizona woman was struck and killed by a self-driving Uber as she crossed the street. While the causes of each crash remain under investigation, there is evidence that neither driver was paying attention to the road.

According to Tesla, its system warned the California driver several times to place his hands on the wheel, and he had at least five seconds to retake control before the collision. Video released from the Arizona crash shows that the Uber’s safety driver had her hands off the wheel and was not looking at the road in the moments before the vehicle struck the pedestrian.

These crashes highlight the trouble with semi-autonomous cars: they need an attentive driver. But because these vehicles can make so many decisions on a driver’s behalf, it is easy for the driver to become distracted. The less a driver has to do, the less aware they become, and their attention all too easily wanders from the task at hand: driving safely.

Fully autonomous vehicles are even more experimental. They are not yet available to the general public, but companies such as Uber, Google, and Volvo are vigorously developing the technology. Since 2016, the Commonwealth of Massachusetts has worked with manufacturers to expand testing, and the City of Boston runs a program that permits these cars to be tested in the Seaport and Fort Point neighborhoods. That program was temporarily suspended after the Arizona crash.

Little regulatory oversight

As Uber’s CEO suggests, self-driving cars are similar to student drivers.  Unlike (human) student drivers, however, they are not heavily regulated.  With so many companies developing unique self-driving technologies, regulations concerning car safety will need revision.

The primary source of law governing automobile design safety is the Federal Motor Vehicle Safety Standards (FMVSS). These regulations fall into three categories: crash avoidance, crashworthiness, and post-crash survivability. The FMVSS regulates car safety features ranging from seat belts and child seat anchoring systems to electronic stability control and crash data recorders. But though the FMVSS covers many areas of automobile manufacturing, it does not (yet) include any regulations for the systems that allow self-driving cars to avoid collisions.

While we expect manufacturers to work with federal and state regulators to improve self-driving vehicle safety, many people will likely be injured by these experimental vehicles before those standards arrive. In those cases, legal teams that specialize in motor vehicle negligence and products liability law will test the question of who is responsible for self-driving car crashes.

There is growing demand for laws to govern self-driving vehicles of every kind. This isn’t an entirely new problem for legislatures and the courts: they have already adapted and applied existing law to technologies such as airplane autopilot, surgical robots, and cruise control.

In the meantime, buckle up and brace yourself for more “student drivers.” Cases involving self-driving vehicles are likely to involve complex technology, multiple parties, and murky questions of fact.

If you or someone you know is injured in a crash involving a self-driving vehicle, SUGARMAN’s personal injury attorneys have experience in products liability and motor vehicle negligence cases.  Call us at 617-542-1000, email info@sugarman.com, or fill out a Contact Form.