

Self-driving cars: who’s to blame when in an accident?

Autonomous vehicles are slowly becoming more commonplace in today’s motor vehicle landscape. With the rise of self-driving cars comes the issue of legal liability in the event of an accident.

Ongoing research into human-vehicle interaction has repeatedly shown that systems designed to automate driving, such as adaptive cruise control, are far from error-free.

Recent data suggests that drivers’ limited comprehension of what these systems can and can’t do contributes to system misuse.

Beyond the issues already plaguing autonomous driving, such as imperfect technology and the lukewarm reception of autonomous systems, there remain open questions about legal liability. More specifically, what are the legal responsibilities of the human driver, and of the manufacturer that built the autonomous vehicle?


Trust and accountability

In a study published recently in Humanities and Social Science Communications, the authors discuss the issue of over-reliant drivers and the resulting system misuse from a legal perspective. They delve deeper into what the manufacturers of self-driving cars should legally do to make sure that drivers understand how to properly utilise the vehicle.

One proposed solution requires buyers to sign end-user licence agreements (EULAs), similar to the terms and conditions you must agree to when using new software or a new computer. To obtain consent, manufacturers might employ the omnipresent touchscreen, which now comes installed in most new vehicles.

This solution is far from ideal, and far from safe. A further problem lies with the interface itself: it may not give the driver enough information, leading to confusion about the nature of the request for agreement and its implications.

The main problem is that most end users don’t read EULAs. A 2017 Deloitte study found that 91% of people agree to them without reading. Among younger people, the figure is even higher, with 97% agreeing without reading the terms and conditions.

Unlike using a new app or computer, operating a car carries significant safety risks, regardless of whether the driver is a human or software. Human drivers would have to consent to take responsibility for the outcomes of the software and hardware.


Warning fatigue and distracted driving are further causes for concern. For example, a driver who becomes irritated after receiving continuous warnings could decide to ignore the message. Alternatively, if the message appears while the vehicle is moving, it could itself become a distraction.

Given these restrictions and concerns, even if this mode of obtaining consent were to go ahead, it would not fully shield automotive manufacturers from legal responsibility should the system break down or an accident occur.

Driver training is necessary for self-driving cars, to ensure that drivers fully comprehend the system’s capabilities and limitations. This training needs to continue beyond the point of purchase. Recent data shows that relying on information provided by the dealership may leave many pertinent questions unanswered.

The innovation of self-driving cars may not be as straightforward as one would assume.


