Understanding Liability Rules in Car Accidents Caused by Autopilot Systems

February 24th, 2017

Car manufacturers are currently hard at work on technology for self-driving vehicles. The hope is that cars sold to the public will soon be able to drive themselves. These cars are likely to be marketed as a safer alternative to standard cars, since many manufacturers suggest accidents will become less frequent once machines, rather than error-prone human drivers, are in control.

As the technology for driverless cars advances, pressing questions will arise regarding who exactly is liable if an accident happens. If an autopilot system promises a motorist that it can operate the vehicle on the driver's behalf, responsibility for avoiding accidents could shift to that system.

If that system fails, the manufacturer could potentially be held liable for losses either instead of the driver or in addition to the driver.

Scientific American suggests that it is very clear car manufacturers will one day become legally liable for a crash caused by a self-driving vehicle. Scientific American explains: "When a computerized driver replaces a human one, experts say the companies behind the software and hardware sit in the legal liability chain, not the car owner or the person's insurance company. Eventually, and inevitably, the carmakers will have to take the blame."

When a car accident happens, a driver is usually at fault. But if carmakers promise that technology will take over and machines will be in charge, it stands to reason that the technology must have failed for an accident to occur. Product liability laws hold manufacturers responsible when a product causes injury despite being used as intended. Under these laws, a crash could be attributed to a defect in the autopilot system, making the manufacturer responsible.

One of the key issues will come down to whether the car-maker breached its duty to other motorists when a crash happened, or whether it was only the driver who failed to live up to an obligation. The answer to this question could change as the technology advances and court rulings and statutory laws begin to develop a protocol for treating crashes caused by driverless cars.

Currently, many of these driverless cars do not yet claim they can fully take over the driving experience. The LA Times reported on a National Highway Traffic Safety Administration review of a fatal accident involving a Tesla vehicle that had Autopilot engaged, and the article made clear that Tesla's on-screen instructions and Autopilot system both placed responsibility for avoiding accidents squarely on the driver.

The National Highway Traffic Safety Administration (NHTSA) investigated the fatal Tesla accident, as well as other less serious accidents, and determined that no recall was needed. NHTSA cautioned that the driver remains the responsible party in a self-driving car and must stay vigilant to avoid causing accidents.