Ideally, vehicle automation will someday make roads safer, perhaps even accident free. Volvo, for example, has a goal of entirely eliminating serious injuries and deaths in its new automated vehicles by the year 2020. It seems reasonable that if human error is removed from the driving equation, accidents will occur less frequently.
Unfortunately, self-driving cars have not yet reached that point. Since autonomous capabilities are still in development, test cars occasionally crash. Human judgment, faulty as it may sometimes be, is still needed more often than not to prevent collisions.
When a self-driving car accident does happen, who is responsible? Is it the human who is behind the wheel, if there is one? Or does the responsibility lie with the developers who put the autonomous car on the road?
Autonomous Car Crashes in the News
As self-driving cars have become more common, so too has the number of incidents involving them. In late 2017 and early 2018, there were several reported accidents involving self-driving vehicles. These news reports serve as a sobering reminder that autonomous technologies still have a long way to go before cars can take over all driving responsibilities.
For example, a self-driving Chevy Bolt grazed the side of a motorcycle while traveling on a California road in December 2017. The Bolt was changing lanes in autonomous mode when it hit the motorcycle, and the rider sustained an injury.
In January 2018, a Tesla Model S in Autopilot mode hit a stopped firetruck. Neither the driver nor the automated system braked for the firetruck. Even though the car was traveling at 65 mph, no one was hurt in that crash.
A March 2018 crash, however, left a pedestrian dead. The fatality occurred when a self-driving Uber car in Arizona struck a woman who was walking nearby. Although the vehicle was in self-driving mode, an Uber employee was riding in the driver's seat.
Responsibility at Various Levels of Autonomy
Not all autonomous cars have the same capabilities. A vehicle is rated on a scale from Level 0 (no automation) to Level 5 (full automation) to describe its degree of autonomy. At Levels 0 and 1, the human still does all or nearly all of the driving.
The Level 2 designation applies to vehicles that have more than one system that lends a hand to drivers, such as adaptive cruise control paired with lane centering. However, the drivers are still ultimately responsible for the decisions that are made. They can't check out while behind the wheel because they may need to take full control at any moment.
Human intervention lessens at Levels 3 and above. Nevertheless, Level 3 cars still require the driver to be ready to step in at a moment’s notice. A Level 4 car can handle all of the steering, braking, and decision-making in ideal driving environments. At Level 5, the vehicle operates with complete autonomy.
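For readers who think in code, the relationship between these levels and the human driver's responsibility can be summed up in a short sketch. The snippet below is purely illustrative: the class and function names (AutonomyLevel, driver_must_stay_alert) are hypothetical, and it simply encodes the descriptions above rather than any manufacturer's actual classification.

```python
# Illustrative sketch only: encodes the Level 0-5 scale described above.
# Names are hypothetical, not from any real automotive software.
from enum import IntEnum


class AutonomyLevel(IntEnum):
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2      # multiple assist systems; driver stays responsible
    CONDITIONAL_AUTOMATION = 3  # driver must be ready to step in on request
    HIGH_AUTOMATION = 4         # handles driving itself in ideal conditions
    FULL_AUTOMATION = 5         # complete autonomy


def driver_must_stay_alert(level: AutonomyLevel) -> bool:
    """Return True if a human behind the wheel may need to intervene."""
    # Per the description above, Levels 0-3 still rely on a ready human;
    # Levels 4 and 5 are designed to manage the driving task themselves.
    return level <= AutonomyLevel.CONDITIONAL_AUTOMATION


if __name__ == "__main__":
    for level in AutonomyLevel:
        print(f"Level {level.value}: human must stay alert -> {driver_must_stay_alert(level)}")
```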
In general, currently available autonomous cars are rated Level 2. Audi is starting to introduce Level 3 cars, and Waymo has been testing Level 4 vehicles. It’s worth noting, however, that most autonomous car testing is still being done with a human on board. When dangerous circumstances emerge, the driver is generally responsible for intervening to prevent an accident.
Navigating Responsibility
With that in mind, both now and in the future, who should bear responsibility when a self-driving car is in a crash?
It seems that in recent crashes, the driver behind the wheel should carry at least some of the responsibility. Even so, the motorcyclist in the Bolt crash filed a lawsuit against the car's manufacturer. It is the first American lawsuit brought against the maker of an autonomous car, but it surely won't be the last.
Even automakers seem to understand that they bear some responsibility when their self-driving models are involved in accidents. Back in 2015, Volvo stepped up and said that it would cover the costs of crashes involving vehicles using its IntelliSafe system. This sort of arrangement is a gamble that automakers are willing to take because, if all goes well, the overall number of accidents should drop significantly once autonomous cars are the norm.
For now, though, there is an air of ambiguity surrounding who bears ultimate responsibility when self-driving car crashes occur. Legislation has yet to catch up with current automotive technologies. The federal government has passed the responsibility on to individual states, which leaves state legislators to evaluate current policies, consider insurance issues, and pass specific regulations.
In the meantime, this is far from a black-and-white matter, and the responsibility for accidents involving self-driving cars must be determined on a case-by-case basis.