Who’s accountable if a self-driving car has an accident?


As self-driving cars gain momentum in today’s automotive landscape, the question of legal liability in the event of an accident has become increasingly pressing.

Studies of the interaction between humans and vehicles have shown time and again that even driving automation systems – such as adaptive cruise control, which maintains a set speed and distance from the car ahead – are anything but error-free.

Recent evidence suggests that drivers have a limited understanding – known as a mental model – of what these systems can and cannot do, and that this gap contributes to system misuse.

(Embedded video: a webinar on the dangers of advanced driver-assistance systems.)

There are many issues worrying the self-driving car world, including imperfect technology and the lukewarm public acceptance of autonomous systems. There is also the question of legal liability. In particular, what are the legal responsibilities of the human driver and the automaker who built the self-driving car?

Trust and Accountability

In a study recently published in Humanities and Social Sciences Communications, the authors examine the problem of excessive driver trust, and the system misuse that results from it, from a legal perspective. They ask what self-driving car manufacturers should legally be required to do to ensure that drivers understand how to use their vehicles appropriately.

One solution suggested in the study is to require buyers to sign end-user license agreements (EULAs), similar to the terms users must accept when setting up a new computer or software product. To obtain consent, manufacturers could use the touchscreens now installed in most new vehicles.

This approach, however, is far from ideal, and possibly not even safe. A touchscreen interface may not convey enough information, creating confusion about the nature of the consent forms and their implications.

Another problem is that most end users simply don’t read EULAs: a 2017 Deloitte study found that 91% of people agree to them without reading. Among young people, the figure is even higher, with 97% agreeing without reviewing the terms.

Unlike using a smartphone app, operating a car carries significant safety risks, whether the driver is a human or software. Asking human drivers to accept responsibility for the outcomes of the vehicle’s software and hardware is therefore a far more consequential demand.

Warning fatigue and driver distraction are also causes for concern. A driver annoyed by continuous warnings, for example, might choose to simply ignore them, and a message displayed while driving can itself become a distraction.

Given these limitations and concerns, consent obtained in this manner is unlikely to fully protect automakers from legal liability if the system fails or an accident occurs.

Driver training for self-driving vehicles could help drivers fully understand system capabilities and limitations. Such training needs to extend beyond the point of purchase: recent evidence shows that even the information provided by dealerships leaves drivers’ questions unanswered.

Against this backdrop, the road ahead for self-driving cars is unlikely to be a smooth one.

This article by Francesco Biondi, Assistant Professor of Human Kinetics at the University of Windsor, is republished from The Conversation under a Creative Commons license. Read the original article.


Published on December 5, 2020 – 16:00 UTC
