Self-driving cars, or automated vehicles (AVs), are moving ever closer to widespread legal use on our roads. The UK government has been keen to press ahead with the development of AV technology following the lifestyle changes brought about by the Covid-19 pandemic.
In this article, initially written in August 2022 and published in the November 2022 edition of PI Focus, Senior Associate Lucie Clinch reviews personal injury liability and the introduction of self-driving vehicles to public roads. She also considers the latest developments in automated vehicles, including the Law Commission of England and Wales and Scottish Law Commissions’ co-authored ‘Automated Vehicles’ report and ongoing consultations on automated vehicles.
‘Automated Vehicles’ report
The ‘Automated Vehicles’ report, published in January 2022 by the Law Commission of England and Wales and Scottish Law Commission (“the Commissions”), makes proposals on how self-driving vehicles could be safely introduced to UK roads. The report proposes that a ‘self-driving’ vehicle is responsible for how the car drives and operates, which it describes as the ‘dynamic driving task’. This is consistent with the Automated and Electric Vehicles Act 2018 (AEVA), which defines a vehicle as ‘driving itself’ if it is ‘operating in a mode in which it is not being controlled, and does not need to be monitored by an individual’.
Autonomous vehicle users may not be expected to ‘drive’ the car or even proactively ‘monitor’ it. However, they are expected to respond promptly and sensibly to transition demands in an emergency or another scenario where manual driving would be required. When this occurs, a vehicle system would alert the person inside with multisensory signals. How long they would have to take control remains under review, though it has been proposed that it should be within 10 seconds.
The person in the driver’s seat will not be considered a ‘driver’ from a liability standpoint. They will instead be a ‘user in charge’ (UIC), a new legal designation created to refer to autonomous vehicle users. In an incident requiring the UIC to take control of the vehicle, the automated lane keeping systems (ALKS) would issue a transition demand that the UIC would have to accept. Once this ‘handover’ is complete, the UIC assumes responsibility as the driver, including from a liability perspective. If the UIC fails to take over when a request is issued, mitigation against damage or risk of injury would need to be assessed by the regulatory body. Under current plans, an autonomous vehicle should be able to come to a ‘controlled stop’ in an emergency.
Perhaps the most common question posed by members of the public about autonomous vehicles is how safe they will be compared to conventional vehicles. One of the key takeaways from the report was that how ‘safe’ autonomous vehicle use ‘needs to be’ is a ‘political decision’. It is unclear whether the new technologies will improve road safety, particularly in the short term.
The report and liability – a ‘no blame’ culture?
Since the report was published, the Road Collision Investigation Branch (RCIB) has been formed. The focus of collision investigation will be on learning lessons rather than allocating fault, and I would hope that AV collisions fall within the RCIB’s remit. To be effective, investigators must be properly trained and aware of the technological features of an AV.
The report recommends that UICs be immune from any criminal offence or civil penalty. However, this would cease to apply in a situation where the UIC assumed responsibility as the driver following a ‘handover’. The UIC would still be liable if they deliberately caused the Automated Driving System (ADS) to malfunction or if they had wilfully altered the system to operate in a manner unintended by the manufacturer.
These changes would represent a significant shift in liability in the event of a road traffic collision or other incident. Where before ‘blame’ would rest with a driver or other person involved, the use and potential fault of an AV would place blame on a machine. It is also recommended that aggravated offences such as causing death by dangerous driving would not be committed when a vehicle is driving itself. Liability could therefore rest with the manufacturer or body that obtained safety authorisation for the vehicle, also known as the authorised self-driving entity (ASDE). UICs would remain liable for ‘non-dynamic’ tasks, including parking, passenger seatbelt use, vehicle maintenance and insurance, and reporting accidents.
The Commissions’ aim is to promote a ‘no-blame safety culture’ that learns from mistakes, with an emphasis on a duty of candour, allowing ASDEs to take measures to improve safety as required. Regulatory sanctions against the companies are recommended by the Commissions, as opposed to criminal sanctions against the human drivers. Criminal sanctions could apply, including against senior managers, if the duty of candour is breached or misrepresentations are made to the regulator.
The AEVA allows a direct right of action against an insurer (s 2(1)) in an accident caused by an automated vehicle driving itself on a road or other public place, but questions remain over how the courts will apply contributory negligence under the Act. This may complicate proceedings for injury claimants.
The report is clear that victims in accidents involving AVs will not have to prove that anyone was at fault and that the insurer would have to compensate the victim directly. However, evidence in such cases would typically be held by the ASDE, requiring open communication between various parties to progress a claim. In its consultation responses on this subject, Stewarts suggested that the government should give statutory clarification on how contributory negligence would apply as the alternative could leave many injured victims fighting difficult legal battles that allege contributory negligence.
The report deals with the control, use and retention of data in detail, taking into account the considerable privacy concerns connected with it.
In traditional vehicle road collisions, insurers rely heavily on evidence from insured drivers and witnesses to defend and deal with claims. This will become more difficult in an automated environment, where insurers will instead need to rely on vehicle-generated data to assess claims. Such data will be critical when establishing liability: insurers will be able to verify, for example, whether a certain vehicle was in an alleged location, whether automated driving had been engaged and whether a ‘transition demand’ was issued and not actioned by the human driver. Access to this data will therefore be crucial to insurers’ liability assessments.
The report also considers the introduction of a statutory duty requiring those ‘controlling’ automated vehicle data to disclose it to insurers. Legislation imposing such an obligation on those controlling automated vehicle data would be necessary to invoke Article 6(1)(c) of the UK General Data Protection Regulation, where ‘processing is necessary for compliance with a legal obligation to which the controller is subject’.
The key concern in imposing a statutory liability is the potential tension between insurers and manufacturers. On the one hand, there are some fears that ASDEs would be hesitant to release relevant data, particularly if they anticipate an insurer bringing a claim against them. For similar reasons, insurers want ASDEs to be under clear duties, potentially with sanctions, to avoid the likelihood of manufacturers favouring their own insurance partners. On the other hand, there also exists the possibility that insurers will seek to obtain data that goes beyond what is required to appropriately resolve the claim in question (such as seeking data that may assist them in a related product liability claim).
Notwithstanding these potential issues, the report makes two recommendations, given the role of data in establishing liability:
- Where the data is necessary to decide claims fairly and accurately, the new Act should impose a duty to disclose data, and
- In the absence of agreement between insurers and manufacturers, the in-use regulator should have a statutory power to issue a code of practice in relation to automated vehicle data and all those obliged to disclose that data must have regard to that code.
It remains to be seen how effective these recommendations will be until the protocols are in place and the true conflicts between insurers and manufacturers emerge.
Length of retention
The ALKS regulation in relation to data storage provides a guideline of six months’ storage before deletion. This is inadequate in serious or other injury claims where the claimant may not bring a claim immediately. The Commissions support an initial proposal of three years’ data retention, in line with the standard limitation period for bringing a claim. There remain arguments as to whether this is appropriate, for example, in claims involving minors or protected parties.
Despite the cost and practical difficulties associated with data storage, the Commissions’ latest report has revised the initial proposal from 36 to 39 months of data retention. The additional three-month period was considered necessary given how regularly claims arrive on or shortly before the expiry of the limitation period.
Storing data for three years would be a waste of resources if the data were deleted before an insurer had been notified of a potential injury claim or had a genuine opportunity to make a data request of the ASDE. An additional three months represents a reasonable compromise, with the onus on the insurer to be proactive and take steps to obtain the relevant data as quickly as possible. Where a request is made within the relevant period, the required data should be retained until it has been supplied.
The UN Regulation on Automated Lane Keeping Systems requires a vehicle to record when it is involved in a detected collision, and this remains a necessary data point in the report’s proposal. However, the Commissions set out that the authorisation authority should also require data to be collected and stored in line with the statutory duty and retention period. Further, the required data should include the date, time and location of a self-driving automated driving system feature being engaged or deactivated, when a transition demand is issued and when a collision is detected.
The Commissions say this proposal represents the minimum data requirement to process claims fairly and accurately and properly establish liability. It also notes that the required data need not necessarily be limited to the above.
Highway Code update
On 1 July 2022, the Highway Code was updated to explain that while travelling in ‘self-driving mode’, the driver must be ‘ready to resume control in a timely way when they are prompted to’. The amendments also implement rules confirming that a driver can watch media on an inbuilt screen while the vehicle is in control. Meanwhile, it is still not legal to use a mobile phone while driving. As Nicolas Lyes, head of policy at the RAC, said: ‘It’s vital the Highway Code changes covering automated vehicle technology are crystal clear, setting out exactly what drivers can and can’t do when certain features are engaged otherwise there’s a very real risk that drivers will be confused. This itself could lead to avoidable road traffic collisions – especially if a driver hasn’t taken back control of the vehicle after they’ve been told to.’
Changes to the Highway Code alone will not ensure the safe introduction of AVs. The Transport Committee recently sought written evidence on infrastructure, the regulatory framework, and safety and the public perception of safety. It also considered how AVs would interact with pedestrians, cyclists and conventionally driven vehicles. In addition, the Law Commission of England and Wales recently closed a consultation on ‘remote driving’, covering the use of vehicles with no ‘driver’ present and related legislation – including potential issues applying the Road Vehicles (Construction and Use) Regulations 1986 to remotely operated vehicles.
Work continues with the Centre for Connected and Autonomous Vehicles (CCAV), which is consulting on a proposed safety ambition. The CCAV seeks views on the proposition that self-driving vehicles should be expected to be as safe as a competent and careful human driver. I suggest that anything less would not be ‘safe enough’, and there is an argument that AVs should be even safer than we expect human drivers to be.
You can find further information regarding our expertise, experience and team on our Personal Injury pages.