We are pleased to introduce our August 2019 media disputes newsletter. In our coverage of this month’s events, we discuss the striking out of a confidence case brought by a husband against his wife during a divorce, scrutiny of the legality of facial recognition systems, and Ofcom being appointed as the “super-regulator” of online harms.
If you would like to sign up to receive the newsletter straight to your inbox, please subscribe here.
Husband’s confidence claim fails against wife in the English High Court
The approved written transcript of the ex tempore judgment in Bhanu Choudhrie v Simrin Choudhrie was published in August 2019. The parties are involved in ongoing divorce-related litigation in the Family Court, but this judgment related to a separate confidence dispute.
In this matter the husband sought from the English High Court a declaration that his estranged wife owed him a duty of confidence in relation to information she received during their marriage, and for a few months after their separation.
Mr Choudhrie also originally claimed that his wife should disclose on oath the names and addresses of all the people with whom she had shared confidential information, together with full particulars of such disclosure, and particulars of the uses that his wife had made of the information. That part of his claim was abandoned on the eve of the strike-out hearing.
Mr Choudhrie did not claim damages, nor did he seek an injunction to stop his wife disclosing confidential information in future.
Mrs Choudhrie applied to the court to strike out her husband’s claim on the grounds that:
- The claim failed to identify with sufficient precision or clarity the information which Mr Choudhrie claimed was confidential;
- Mr Choudhrie was solely seeking a declaration from the court that Mrs Choudhrie owed him an equitable duty of confidence. Mrs Choudhrie’s lawyers argued that this fell far short of what would be needed to amount to a proper cause of action, and that his claim was pointless and unnecessary;
- Mrs Choudhrie had made clear in correspondence that she understood her duty of confidence, had not breached it in the past, and had no intention of breaching it in the future; and
- The claim was an abuse of process because even if it disclosed a technical cause of action, no legitimate purpose could be served by it.
An interim application hearing was held and the judge struck out the claim on the grounds that:
- It disclosed no reasonable grounds for bringing the claim for a declaration; and
- Consequently, it was an abuse of the court’s process under the principles set out in Jameel v Dow Jones & Co Inc [2005] QB 946.
The judgment said:
“I cannot see that any declaration would resolve matters definitively between the parties or would give any real comfort to the claimant… The claim essentially fails on the grounds that there is no clear, crystallised and articulated dispute in relation to which declaratory relief could properly be granted, and any declaration would therefore have no utility whatsoever. There is no actual crystallised dispute but only a contingent, and speculative, dispute.”
The court also ordered Mr Choudhrie to pay Mrs Choudhrie’s legal costs of the application, which it was told were more than £235,000. Mr Choudhrie will, of course, also have to pay his own legal costs.
This case illustrates that while communications between a husband and wife can be protected by confidence, and a duty of confidence can be owed both to the person who confides the information and to third parties to whom it may relate, a claim in confidence must be formulated in a way that is specific, clear, and not speculative.
The EU, the UK’s Information Commissioner and the Mayor of London address concerns about the legality of facial recognition systems
Question marks remain about the legality of the rapid roll-out of facial recognition systems in the UK and across the EU, by both the public and private sectors. There are fears that, at least on some occasions, they could breach data and privacy laws.
The use of this technology is still in its infancy, but it is probably far more widespread than the public realise and its use is increasing rapidly, particularly in London.
People are starting to take note and this month the Mayor of London, Sadiq Khan, and the UK’s Information Commissioner, Elizabeth Denham, focussed on a live face-scanning system being used across and around King’s Cross train station in London.
Mr Khan said there were “serious and widespread concerns” about the legal framework for facial recognition technology. On 13 August 2019 he asked King’s Cross Central Limited Partnership for reassurance that its use of this technology was compliant with the law as it stands. He also said that he had called on the UK government to legislate to provide certainty about exactly how facial recognition technology can lawfully be used in the UK.
On 15 August 2019 Elizabeth Denham, the Information Commissioner, released a statement on the use of live facial recognition technology in King’s Cross. It said:
“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding.
“I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used…
“Put simply, any organisations wanting to use facial recognition technology must comply with the law – and they must do so in a fair, transparent and accountable way. They must have documented how and why they believe their use of the technology is legal, proportionate and justified.
“We support keeping people safe but new technologies and new uses of sensitive personal data must always be balanced against people’s legal rights.”
The King’s Cross site, which was singled out following media scrutiny, is just one location in the UK that uses facial recognition technology, often unnoticed by the general public. The ICO is currently investigating its use there, and without further information it is impossible for us to comment on its legality.
To date, the ICO, the UK courts, and the British government appear to have done little to address the rapid expansion of facial recognition technology across the country, or to consider its legality. Test cases are likely to appear in the future.
Of particular interest is whether the technology complies in every instance with UK data protection laws, the European General Data Protection Regulation (GDPR), and British common law privacy rules.
It is notable that the European Commission is this month reportedly considering introducing a regulation that would impose strict limits on the use of facial recognition technology.
Also this month, the Swedish Data Protection Authority imposed its first fine under the GDPR, which related to a facial recognition system being used in a school to monitor the daily attendance of students.
It will be interesting to see how this develops, how quickly, and whether regulators, the courts, governments, and the EU can rein in a genie that already seems to be out of its bottle.
Ofcom looks set to be the UK’s online harms ‘super-regulator’ from September 2020
The UK media this month reported that Ofcom will be appointed as the interim “super-regulator” to enforce soon-to-be-introduced “online harms” measures. The appointment is likely to commence in September 2020, although an official government statement on the matter is awaited.
As we wrote in April 2019, the government is proposing a new statutory duty of care aimed at reducing “online harms”. The UK government Online Harms White Paper proposes a new statutory duty of care to apply to any “companies that allow users to share or discover user-generated content, or interact with each other online”, regardless of whether they are based in the UK or not.
The white paper is likely to affect social media platforms, file hosting sites, public discussion forums, messaging services, and search engines, among other companies.
When it becomes the online harms regulator, Ofcom will have a suite of powers to take effective enforcement action against companies that have breached their statutory duty of care. This is likely to include the ability to issue substantial fines and to impose liability on individual members of senior management.
Tech and media companies have criticised the proposals as, in their opinion, too draconian. For instance, the Internet Association (IA), the official lobbying arm for companies such as Google, Facebook, Twitter, Amazon and Microsoft, released a statement shortly after the Online Harms White Paper was launched saying that the new regime would “hold back the British tech sector, worsen the quality of internet services for ordinary consumers, undermine privacy, and have a negative effect on freedom of speech”.
This new regime is set to shake up how online material is regulated, and it remains to be seen whether it will be enthusiastically enforced by Ofcom in a year’s time.
A new regime for English media and communications cases
Partner and Head of Media Disputes, Ryan Dunleavy, analyses the impact of new legislation that from 1 October 2019 will create a designated specialist Media and Communications List in the Queen’s Bench Division of the High Court. This amends the Civil Procedure Rules 1998, and changes how media and communications cases are defined and dealt with in English courts.
Read the full article here
You can find further information regarding our expertise, experience and team on our Media Disputes page.
Subscribe – In order to receive our news straight to your inbox, subscribe here. Our newsletters are sent no more than once a month.