
Lucie Clinch and Julian Chamberlayne examine potential pitfalls in the rush towards autonomous vehicles in their article, Racing Ahead, published in the October edition of PI Focus.

The Automated and Electric Vehicles (AEV) Act passed into law in July. Implementation is anticipated via a number of statutory instruments within the next few years, given the government’s aim to have fully automated vehicles on our roads by 2021.

In this article, we look at some of the Act’s potential omissions in the context of liability and injury involving autonomous vehicles (AVs), and the void in the law relating to the current and near-future breed of semi-autonomous vehicles.

 

The Automated and Electric Vehicles Act 2018

The Act will require the Secretary of State to maintain a list of motor vehicles that:

  1. are ‘designed or adapted’ to ‘be capable of safely driving themselves’; and
  2. may lawfully be used when driving themselves on roads or other public places in Great Britain.

It is not yet clear who will bear the expense of collating and maintaining this list. Once the list is published, the Secretary of State will have two years to report to parliament as to the effectiveness of that listing.

The Act is split into two parts: the first relates to the liability of insurers where an accident is caused by an automated vehicle (Part 1), and the second relates to issues around public charging or refuelling points for electric vehicles (Part 2). This article deals solely with points arising in relation to the liability aspects of Part 1 of the Act, which applies to self-driving vehicles that ‘may lawfully be put in self-drive mode on roads or other public places in Great Britain’.

The key provision within Part 1 of the Act is section 2, which states that an insurer will be directly liable for an accident caused by an AV where:

  1. the vehicle is ‘driving itself on a road or other public place in Great Britain’;
  2. the vehicle is ‘insured’; and
  3. an ‘insured person or any other person suffers damage as a result of the accident’.

There has been much publicity about the fact that section 2 gives innocent victims a direct right of action against the vehicle’s insurer. The intention is for victims of accidents involving AVs to obtain compensation quickly and easily, without prolonging the process with complicated product liability claims against the AV technology manufacturers, or dealing with liability disputes between insurer and manufacturer.

The insurer remains free to pursue the manufacturer for any reimbursement or contribution if it can establish that the manufacturer is liable for the accident in question.

 

Section 2(1)(a) and ‘roads or other public places in Great Britain’

The wording ‘roads or other public places in Great Britain’ mirrors the Road Traffic Act (RTA) 1988 and was inserted by amendment during the parliamentary debates.

Unfortunately, taken at face value this wording means the Act does not cover those injured in AV motoring accidents abroad. The duty of compulsory insurance within the UK under Part VI of the RTA 1988 extends only to motor vehicles on a road or other public place.

However, the CJEU decision in Vnuk v Zavarovalnica Triglav d.d. (C-162/13) confirmed that the duty to insure extended to private property and should not be restricted to roads.

It was anticipated at the time of Vnuk that the RTA 1988 would be amended to remove any restriction on place and use. The European Commission’s recent REFIT review of the Motor Insurance Directive in May this year highlighted that, post-Vnuk, Europe-wide compulsory motor insurance should not be limited to providing cover on roads and other public places, and yet the AEV Act 2018 does limit its applicability.

Stewarts’ Julian Chamberlayne, co-author of this article, wrote about such inconsistencies between UK and EU law in anticipation of driverless technology in early 2017, in ‘The revolutionary road to driverless technology’. That article pointed out that a new Modern Transport Act would be required to deal with the inconsistencies between the Road Traffic Act and the EU Motor Insurance Directives, in order to ensure adequate cover is provided for English citizens who may be injured by European motorists, or when travelling in Europe. (Nicholas Bevan has also written extensive commentary on this issue and the Vnuk decision in his excellent series of articles published in the New Law Journal.)

The wider issues around the Vnuk ruling and the implementation of the Motor Insurance Directives, together with whether UK citizens will continue to enjoy the benefits of cross-border remedies in a post-Brexit world, are outside the scope of this article.

 

The levels of automation and ‘driving itself’

As mentioned above, the direct right of action against the insurer will apply where ‘an accident is caused by an automated vehicle driving itself [on a road or other public place in Great Britain]’. The Act goes on to define ‘driving itself’ simply as ‘operating in a mode which is not being controlled and does not need to be monitored by an individual’.

The Act purposely makes no mention of the varying levels of automation at which an AV may be driven.

Outside of the Act, the levels of automation are currently defined in the motor industry by SAE International (the Society of Automotive Engineers) and categorised into levels 1-5 (with level 1 behaving much like a conventional car with a fully attentive driver, and level 5 requiring no driver input or monitoring at all).

However, when the Bill was discussed in parliament, Baroness Sugg, Under-Secretary of State for Transport, confirmed that the government believes the SAE levels lack precision and that it would be seeking to set safety standards by way of a technical committee operating under the UNECE (United Nations Economic Commission for Europe).

Once these standards are set, the vehicles will be expected to pass an approval process to demonstrate they are capable of safely driving themselves before they can be sold in the UK (much as current vehicles have to pass industry-set technical standards to get to market).

Baroness Sugg confirmed that draft legislation regarding levels of automation will be put before Parliament once there is a better view of the landscape of the new technology and vehicle standards. This would form part of a wider government regulatory programme to ensure motorists and businesses benefit from AVs.

This raises the question of which level of automation will be deemed lawful or safe for the purposes of AV mode and, indeed, for the Act to apply.

It is a disappointing omission that the Act excludes the current and near-future semi-autonomous vehicles in which the driver is expected to monitor the vehicle whilst in AV mode.

Their exclusion from the scope of the Act suggests that accidents involving these vehicles may have to be resolved by the courts, in cases that may well require complex technical evidence and involve the manufacturer. Access to black box information from such vehicles will be crucial (for further reading, see Jack Stilgoe’s article ‘Self-driving car companies should not be allowed to investigate their own crashes’, The Guardian, April 13 2018).

Baroness Sugg stated that the first true (self-driving) AVs are only likely to be used in specific areas, such as motorways with dedicated AV lanes, and that there will be a procedure to safely hand back control to the driver. Finer detail on this point is awaited, pending public and parliamentary consultations.

Such guidance will probably take some time to be passed via secondary legislation attached to the new Act or through separate statutory instruments. In the meantime, these questions remain grey areas of the law.

 

Exclusions on liability

The Act excludes liability to an insured person in defined circumstances, such as when the accident occurs as a direct result of:

  1. software alterations made by the insured person, or with the insured person’s knowledge, that are prohibited under the policy; or
  2. a failure to install safety-critical software updates that the insured person knows, or ought reasonably to know, are safety critical.

Questions arise: how much is the owner/driver meant to know about the technology in the vehicle? Will they be alerted to required updates, and to the limitation of insurance cover if they do not comply? What counts as ‘safety critical’? Who is responsible for imparting this information: the manufacturer, the servicing garage or the insurer? What if the vehicle is involved in an accident on the way to the garage for a software or hardware update?

In some cases, insurers may want to investigate the potential exclusions before agreeing to compensate, which may delay the injured party’s access to compensation at the earliest stage.

 

Contributory negligence and the AV

The Act confirms that incidents involving AVs will be subject to the usual rules of contributory negligence as defined in the Law Reform (Contributory Negligence) Act 1945, where the accident or damage was to any extent caused by the injured party.

Section 3(2) of the Act states that the insurer is, however, not liable to the person ‘in charge of the vehicle’ where the ‘accident that it caused was wholly due to the person’s negligence in allowing the vehicle to begin driving itself when it was not appropriate to do so’.

This was a feature of concern during the Bill’s passage through parliament: when is it ‘appropriate’ to allow the vehicle to ‘drive itself’? There will need to be further guidance on this to ensure injured drivers are afforded sufficient protection.

Without it, insurers will likely seek to use a section 3(2) defence where possible, particularly in relation to a single-vehicle accident where only the AV and the driver are involved.

The Bill envisaged victims would not have to pursue complicated and protracted product liability claims involving the vehicle technology manufacturers.

While the direct right of action removes that concern, drivers of AVs are still left at risk of delayed compensation in single-vehicle AV accidents where they are unable to show that the vehicle itself was at fault.

Further guidance will be needed as to the potential loopholes for owners and drivers regarding the required software updates, and when, where and how it is – and is not – appropriate to allow a self-driving vehicle to drive itself.

 

AVs in the news

Tesla has experienced two recent incidents in the USA involving cars in autopilot mode. And in the UK, a Tesla driver, Mr Patel, has been disqualified from driving for putting his car into autopilot mode on the M1 while he sat in the passenger seat.

A Tesla engineer confirmed in that case that the autopilot was intended simply to provide assistance to a ‘fully-attentive driver’.

The infamous fatal accident in March 2018 involving an Uber self-driving vehicle, and the 18 May 2018 crash involving a Tesla car on autopilot and a stationary police car in California, both raised further questions about drivers assuming autopilot can be relied on, and so not paying the same attention as if they were driving themselves.

It is perhaps only human nature to assume the car is fully automated if it has an ‘autopilot’ setting; a change in the manufacturer’s terminology may be required.

There are examples of AV technology having difficulty detecting stationary objects ahead, as well as making decisions or taking evasive action in the way a human driver would.

The technology can only be programmed to a point, so can it really be a better substitute for a human driver with the ability to think about avoiding danger, even if it takes an unpredictable form?

For example, a human may be better placed to decide whether swerving to avoid a collision, potentially into faster-moving traffic or other vehicles, would cause more damage than simply facing the collision. Can an AV do the same? Or, as this technology develops, can the computer in the AV make better decisions than the average human driver? Even if it can, will society accept occasional ‘computer errors’?

Following the M1 incident, Tesla reminded customers that they must ‘maintain control of the vehicle at all times’.

The difficulty with this, as Mr Patel suggested in court in the UK, is that if your car has the capability to do something ‘amazing’ (his words), why wouldn’t you use it? You only have to look at the way people interact with their mobile phones while walking (or driving) to get a flavour of how technology changes the way humans behave.

Interesting legal issues would arise in semi-automated mode, ie with the driver monitoring the vehicle.

Questions will be asked as to who made what decision: was it the driver/monitor or the vehicle itself? Who or what would be to blame? How do the programmers programme the car to react in the ‘correct’ way? Was the car being properly monitored when an accident occurred? Was the reaction of the vehicle reasonable when compared to a human driver? Was the human driver’s reaction appropriate in taking control at a given time? How long is an appropriate reaction time when in a semi-autonomous mode? The answers to these questions will likely reveal themselves as AVs become a reality on UK roads.

 

What next?

While technological development, testing and legislation push on, public support for AVs remains limited.

Without more dialogue between the public and government as these advances arrive on the UK’s roads, take-up of the vehicles themselves is likely to be slow.

The Law Commission has begun a three-year project reviewing rules and regulations around AVs, with the aim of publishing a consultation paper by the end of this year. The review will examine legal obstacles to the widespread introduction of AVs and highlight the need for regulatory reform.

Unfortunately, the legal difficulties around the new legislation, and attempts to adapt the current law, are likely to lead to years of uncertainty for victims as the use of AVs increases. It is essential that consumers and drivers at the heart of this new world have access to the best information available as to when, where, how and if they can ‘safely’ allow a car to ‘drive itself’, together with clear guidance on relevant software updates that they may be required to have installed or install themselves.

Baroness Randerson (Liberal Democrat Lords Spokesperson for Transport) commented during the debate on the Bill in the Lords that ‘we are trying to imagine ourselves into the future’, and that the ‘government are going to have to be watertight in their approach’.

While the Law Commission review will go some way to dealing with the points raised in this article, the laws surrounding automated and semi-automated vehicles appear leaky in their current form.

This article first appeared in PI Focus (subscription required).