The Impact of Self-Driving Cars

Technological changes in the auto industry have taken several giant leaps forward in the last decade, including lane-change warning systems, emergency braking, and self-driving cars, which may be only a year or two away from hitting the roads in huge numbers. So what does that mean in terms of consumer safety and liability?

The National Highway Traffic Safety Administration (NHTSA) and the Society of Automotive Engineers recognize six levels of driving automation: Level 0 denotes full human control, and Level 5 is a fully autonomous vehicle.

Bryant Walker Smith, an assistant law professor at the University of South Carolina, says that today’s automation is at Level 2 or below, which generally leaves the driver at fault in the event of a car crash. “Anything that’s below level three, it’s clearly a human that’s supposed to be doing part of the driving,” he adds.

As more manufacturers increase the production of autonomous vehicles, the auto industry will likely bear a higher burden of liability. In 2016, attorneys filed a class-action suit in a U.S. District Court in California against Tesla Motors, claiming the manufacturer’s Autopilot system was “flawed” and had resulted in the first known fatality involving a self-driving vehicle.

According to news reports, Joshua Brown, a 40-year-old man from Ohio, was killed when the Autopilot system on his Tesla Model S failed to stop the car from plowing into a tractor-trailer on a Florida highway.
After a yearlong investigation, the National Transportation Safety Board (NTSB) concluded that the Tesla Autopilot system, which can automatically steer and control a car, had “played a major role” in the crash. The board also said the Autopilot system had performed as intended, but lacked safeguards to prevent drivers from using it improperly.

Auto industry experts predict that driver-assistance features will likely reduce human error, making driving safer in the future. Only time will tell.

While it is not uncommon for consumers to test the limits of new technology, drivers may want to think twice before blindly trusting autonomous vehicles.

In January 2017, the NHTSA’s final report said that Tesla’s Autopilot-enabled vehicles did not need to be recalled, which could be interpreted as a victory for the automaker. However, the inquiry focused only on the question of whether any flaws in the system had led to the crash; it found no such flaws, despite the fact that the system failed to distinguish between the white side of the tractor-trailer and the brightly lit sky. Evidently, the technology is not yet reliable in situations like this one.

Additionally, the NHTSA report indicated that Mr. Brown, the driver, was able to use the system on a road for which it was not designed.

Even though the government has opened a clearer path for automakers to develop driverless cars, manufacturers will still need to provide proper consumer warnings.

Referring to the Tesla case, Robert L. Sumwalt, chairman of the National Transportation Safety Board, said, “The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened.” [New York Times, September 2017]

When a consumer takes legal action against a manufacturer for a product defect, many state courts consider the consumer’s reasonable expectations of how the product should perform.

Vanderbilt Law School professor W. Kip Viscusi put it this way: “A reasonable consumer might expect [autopilot] to work better, that you wouldn’t be crashing into a semi that crossed the highway.” [CNN Money, July 2016]

Self-driving technology is not like the spell-check feature on your word-processing program, where you can go back, find your mistake, and then correct it. Do we really want to rely solely on a computer to make life-and-death decisions on the roads and highways for us? Remember, humans are still involved in designing and manufacturing these vehicles and the computers and sensors that help operate them. Judgment is a human trait that is hard to install in a computer.

Don’t get me wrong: I am all in favor of technological advances. But as an attorney specializing in auto accidents and product liability cases, I am concerned that self-driving vehicles may create more careless drivers who put too much faith in a computer.

The Transportation Department recently unveiled voluntary guidelines for testing autonomous vehicles as part of a broader government effort to encourage automakers’ development of self-driving technology. Under these guidelines, it will be left to automakers and other companies to decide whether to submit safety reviews to federal regulators.

Currently, it is unclear how many crashes have occurred while Autopilot was engaged. A Tesla driver who crashed on a Pennsylvania highway said Autopilot was active at the time, but Tesla has not confirmed this. Safety regulators are looking into that incident as well.