The last few weeks have been a mixed bag of action and inaction on liability for harmful content on online platforms. That said, the clear direction of travel in the UK and the EU remains that platforms face a future of regulation and greater responsibility for that content. The age of the defence of being a “mere conduit” of information appears to be drawing to a close.


EU Digital Services Act

On 2 June 2020, the European Commission kicked off two public consultations seeking views on the rules that its Digital Services Act package should introduce to modernise the legal framework for digital services. The consultations cover the responsibilities of online platforms for content and, separately, the competition challenges posed by large platforms acting as gatekeepers.

The EU legal framework for digital services has remained broadly unchanged since the adoption of the e-Commerce Directive in 2000, albeit supplemented by instruments relating to specific illegal or harmful activities conducted online. Under the e-Commerce Directive, platforms are not liable for illegal or harmful content uploaded by users where they act as “mere conduits” of the information. Broadly, they will only face liability where they have actual knowledge of illegal content and fail to remove it expeditiously.

In the 20 years since the e-Commerce Directive was introduced, the ‘fourth industrial revolution’ has transformed beyond recognition the role that online platforms play in how users communicate and consume information, and how businesses trade. With these changes have come new challenges associated with dangerous or illegal goods, as well as illegal content (such as child sexual abuse material or hate speech) and misinformation. The Combined Evaluation Roadmap and Inception Impact Assessment accompanying the content consultation acknowledge that, in the absence of modernised and harmonised EU rules, Member States are increasingly passing their own laws, with notable differences in the obligations imposed.

The European Commission is keen to tackle this fragmentation, place greater responsibility on platforms, and give companies and governments legal clarity as to the parameters. The first pillar of the Digital Services Act package is therefore likely to include:

  • a proposal to increase and harmonise the responsibilities of online platforms in relation to content and services, including by way of notice and take-down orders
  • upgraded liability rules, including a definition of what is illegal online, and
  • reinforced transparency of, and oversight over, platforms’ content policies in the EU.

In January of this year, lobbyists for Google, Facebook and Twitter wrote to the European Commission warning of the risks of holding them legally liable for all content on their platforms. They argued that any new rules in the Digital Services Act should include a ‘Good Samaritan’ principle, to ensure that legal liability is not triggered because a company is doing its best to tackle harmful content.

This does not mean the European Commission’s proposals are at odds with the platforms across the board. Back in February, at the Munich Security Conference, Mark Zuckerberg called for greater regulation of harmful online content, opining that it is not for companies to decide what counts as legitimate free speech. Since then, Facebook has established an independent oversight board to hear appeals against content removal decisions made by Facebook’s in-house moderation teams. Some may see this as an attempt to demonstrate that a model mixing self-regulation with limited liability could work.

Such views, among others, will no doubt be put forward in response to the consultations, which are open for feedback until 8 September 2020 (the Impact Assessment is open until 30 June 2020). The consultations are supplemented by a variety of other exploratory European Commission projects, including a review of the Code of Practice on Disinformation. The EU plans to propose legislation by the end of the year.


UK Online Harms Bill

In contrast, progress on the UK’s Online Harms Bill has been delayed, ostensibly because Covid-19 has been consuming government bandwidth.

The Online Harms White Paper was published in April 2019. This set out the government’s intention to impose a statutory duty of care on online services to protect users from “online harms” including child sexual abuse material, terrorist content, hate crimes and harassment. This duty would only apply to services that facilitate the sharing of user-generated content or where users interact with each other online. Business-to-business services would not be within scope.

Following the release of the White Paper, a consultation ran from 8 April to 1 July 2019. This received over 2,400 responses from technology companies, both tech giants and small and medium-sized enterprises, as well as academics, think tanks, children’s charities, rights groups, publishers, governmental organisations and individuals.

Frustrated by the ensuing delay, Lord McNally introduced a paving Bill in January 2020, requiring the Secretary of State to publish, within one year, a draft bill appointing Ofcom as the online harms regulator.

On 12 February 2020, the government published its initial response to the consultation. This confirmed its intention to appoint Ofcom as the relevant regulator and to equip it with the powers to ensure companies have appropriate systems and processes in place to fulfil their duty of care. In other words, the duty of care will be systemic, rather than individual. While the duty of care will require companies to remove illegal content from their services expeditiously, they will not have a similar obligation to remove harmful but legal content. Instead, companies will have to state publicly what content and behaviours are and are not appropriate on their services, and to have systems in place to enforce these statements consistently and transparently. The initial response did not define the sanctioning powers that would be available to Ofcom, but suggested that these might include the power to issue fines, impose liability on senior managers, mount raids and seize materials, and order ISPs to block access.

A full response to the consultation on the regime was promised in the “spring”. In evidence to the Home Affairs Select Committee on 13 May 2020, Caroline Dinenage, Minister of State for Digital, Culture, Media and Sport, said that the government remained committed to a “world leading” online harms regime, but that the full response to the consultation had been delayed by Covid-19 and would “probably” be released during the autumn, with the legislation following shortly afterwards.

There are a number of open questions about the scope of the regime, not least as to transparency reporting and whether there will be appeals and super-complaint mechanisms. However, the minister’s evidence did provide welcome clarification on several aspects, including that:

  • the regime will apply to some degree to private channels of communication
  • although not mentioned in the initial response, search engines will be required to comply with the regime, and
  • only the illegal harms of terrorist content and child sexual exploitation will have dedicated codes, while the focus for other harms will be on the systems and processes platforms should put in place to comply with the duty of care.

The select committee noted that Action Fraud and the National Crime Agency believe that online fraud should be included as an online harm, just as the Betting and Gaming Council wants the regime to cover unregulated gambling operators. Fraud was expressly excluded from scope in the White Paper. Ms Dinenage acknowledged that there was a “trade-off” between how comprehensive the legislation will be on the harms covered and the speed of delivery.

Delays in the UK notwithstanding, the next three months will be critical for stakeholders to influence the UK government and European Commission in shaping the new online harms regime(s). One thing seems certain: the responsibilities and liabilities of platforms will be changing.


