Google fail? Heather Rogers QC puts the legal record straight on the first two ‘right to be forgotten’ cases – tried under the twilight data protection regime but with issues far from resolved
‘Google loses landmark “right to be forgotten” case’, ‘Businessman’s court victory over Google has important implications’ and ‘GOOGLE FAIL’ were headlines after the judgment of the Honourable Mr Justice Warby in NT1 & NT2 v Google LLC on 13 April 2018, [2018] EWHC 799 (QB). Well, up to a point.
This was the determination in the High Court of the first two trials in England and Wales over the ‘right to be forgotten’ – claims for the delisting of personal data from Google searches.
Anonymised claims brought by two unrelated claimants were tried together. Both objected to the fact that returns on Google searches revealed ‘spent’ convictions. Their circumstances were very different. NT1 had been sentenced to four years’ imprisonment after being convicted (at a trial at which he did not give evidence) of conspiracy (false accounting), an offence that involved dishonesty and from which he had made very substantial gains. NT2 had been sentenced to six months’ imprisonment on conviction (on a guilty plea) of an offence relating to phone tapping: he had instructed investigators after his business was targeted by ‘malign actors’ and he had made no financial gain. NT2 was remorseful: the offence was a ‘cataclysmic mistake’ for which he took full responsibility. By contrast, NT1 had ‘very obvious difficulties in acknowledging his guilt’ and the conviction and sentence ‘still rankle’ with him.
The cases were tried under the Data Protection Act 1998 (DPA 1998), which gave effect to EU Directive 95/46/EC of October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data. As everyone knows, that legal regime has been superseded. The GDPR – the General Data Protection Regulation, EU 2016/679 – took effect from 25 May 2018, with the new Data Protection Act 2018 having emerged from Parliament on 23 May 2018 after last-minute ‘ping pong’ on the question of whether there should be a ‘Leveson 2’ public inquiry.
As the judge acknowledged, the decision in these cases was made in the ‘twilight’ of the old regime, with the ‘first light of GDPR visible on the horizon’, leading him to observe that it was ‘unlikely’ that his decision would have an impact on other cases [105]. But his approach to a number of difficult issues, in particular, how to balance rights over personal data and the right to freedom of expression, is worth reading. The careful 76-page judgment includes a number of points of interest.
Firstly, it was not an abuse of process to bring a claim in data protection where reputational concerns were involved. The inter-relationship between libel and data protection has been raised in earlier cases (including the Elaph Publishing Ltd case [2017] EWCA Civ 29, [2017] 4 WLR 28 CA). Claimants are entitled to choose how to put their claim and to rely on new data rights (rather than libel), provided that it is not a deliberate attempt to avoid the rules of defamation in a case where protection of reputation was the only objective (the ‘nub’ of the claim) [62-64]. Google’s argument that these claims were an abuse failed.
Secondly, Google admitted that its presentation of search results as a consequence of a search in the name of a data subject constituted the ‘processing’ of personal data for the purposes of the DPA 1998. Google accepted that – subject to the journalism exemption – on receipt of a delisting request, it was obliged to carry out the balancing test prescribed by the CJEU in Google Spain (Google Spain SL v Agencia Española de Protección de Datos [2014] QB 1022).
The judge found that s 32 of the DPA 1998 – the exemption for processing for ‘special purposes’, including journalism – did not apply, since Google’s activity could not be ‘equated with journalism’, even though ‘journalism’ should be broadly defined (see the CJEU in Satamedia C-73/07; and see also the ECtHR at (2018) 66 EHRR 8) [98].
This is interesting: on the one hand, results produced by Google’s internet search engine (ISE) are wholly automated, governed by computer-based algorithms [100] (indeed, Google reserved its position on whether its activities were ‘caching’ for the purposes of the E-Commerce Directive, implemented in the UK by SI 2002/2013 [50]); but, on the other hand, internet intermediaries act as ‘bridge-builders between content providers and internet users’, playing a ‘crucial’ role in the information society [115]. So Google plays a vital role in a democratic society in conveying information to the public in the exercise of the right to freedom of expression. Like journalists and the media, but not ‘journalism’?
Quite apart from failing at that threshold, the judge found that Google could not satisfy the essential ‘reasonable belief’ requirements for the s 32 exemption, because no-one formed any ‘belief’ in the automated process. After Google received notice of a complaint or request for delisting, it looked at the substance of the matter and assessed the public interest, but there was no evidence of it forming any belief as to whether compliance with the requirements of the DPA 1998 in continued processing would be ‘incompatible’ with the purposes of journalism [102].
Section 32 has passed into history, but exemption for processing for the purposes of journalism exists under the new regime. (Readers wishing to don their DPA anorak will note that, going forward under the new DPA, the exemption no longer requires that processing be ‘only’ for the purposes of journalism.) This issue is far from over.
Thirdly, the claimant has the burden of proving that data are ‘inaccurate’ and must specify the inaccuracies [79]. Words must be read in their context and the meaning(s) conveyed identified, using defamation principles [82-84]. The court has a range of options as to remedy, if inaccuracy is shown [86]. NT1 failed to show that his personal data were inaccurate (and the judge was critical of his evidence) [91-92]; but NT2 had no difficulty in showing that his data were significantly misleading [190].
"In short, every case is going to turn on its facts – there are no ‘bright lines’. There are, as the judge recognised, competing interests at stake in a changing legal landscape"
Fourthly, the ‘right to be forgotten’, as acknowledged by the CJEU in Google Spain, recognises that there might be free speech justifications for disclosing sensitive personal data (even without consent) [104]. Information about criminal convictions constituted sensitive personal data under the DPA 1998. Processing might be justified under Schedule 3, condition 5 (data made public as the result of a step taken by the data subject, that is, by committing a criminal offence) [106-113]; and under Schedule 2, condition 6 (processing ‘necessary’ for the legitimate interests of the data controller and not ‘unwarranted’) [115].
The detail of the judge’s reasoning was based on statutory requirements that have been superseded. But the wider considerations of principle will continue to be relevant. In essence, the decision on the claim for delisting turned on the outcome of the balancing test between competing interests: the data protection (privacy) rights of the data subject and the freedom of expression rights of the data controller and the public [115] and [132-135]. Articles 8 and 10 of the European Convention on Human Rights were in play and an ‘intense focus’ on the comparative rights, based on the specific facts of the case, was required. Importantly, the starting point was an equal balance between delisting and continued processing [132], though the fact that a conviction had become ‘spent’ would be a weighty factor [166(2)].
In short, every case is going to turn on its facts – there are no ‘bright lines’. There are, as the judge recognised, competing interests at stake in a changing legal landscape. There are legitimate interests in rehabilitation, which led to the passing of the Rehabilitation of Offenders Act 1974 (long before the internet); the courts have developed individual privacy rights under the Human Rights Act 1998 (the origins of the tort of misuse of private information lie in the 2004 House of Lords decision in the Naomi Campbell case [2004] 2 AC 457); with Google Spain in the CJEU in 2014 being the landmark decision on the ‘right to be forgotten’ (expressly included in Art 17 of the GDPR). The questions of whether or when a criminal conviction (determined at a public hearing and a matter of public record) becomes a private matter (in relation to which the individual has a reasonable expectation of privacy) – and when information about that conviction (even when spent) may be published – are difficult and subtle questions. (The judge summarised a series of public law cases considering spent convictions [48].)
In NT1’s case, the balance was clearly in Google’s favour. The relevant factors included the length of his sentence (if one day longer, it would never have become spent); the offence involved serious dishonesty on a substantial scale; his case on harm was not compelling; he had published misleading information, making ‘crude attempts to re-write history’; and he could not be trusted to provide an accurate account of his business background or credentials to those with whom he came into contact in business [167-169]. By contrast, NT2 had shown a credible case on harm; the information was out of date and irrelevant; and there was no sufficient interest in the data on the part of users of Google’s ISE to justify the continued processing [223].
Fifthly, the question of actual consent arose in NT2’s claim, since he had given interviews in which he had referred to the criminal conviction (before it became spent), as part of an attempt to put adverse publicity into context. Now he had withdrawn consent and wished those interviews to be delisted by Google. It was not suggested that there was any legal obstacle to his withdrawing consent (no contractual obligation or estoppel) [220]. Consent plays a crucial role in data protection claims and it is clear that a defendant will not be able to rely on prior publications, made voluntarily by the data subject, as constituting consent to continuing publication.
Sixthly and beyond, there are more points of interest outside the scope of this short case note. For example: the convergence of causes of action in data protection, misuse of private information, and defamation – in which competing Article 8 and Article 10 rights arise. And, of course, it will be borne in mind that these claims were only in relation to the delisting of information by Google – there was no attempt to seek the removal or erasure of the same data from the primary sources identified in the searches ([160] and [221]). Note, however, that the judge’s finding that certain data about NT2 in a newspaper article were inaccurate suggested that the newspaper would not be entitled to continue to publish that article. Claims against Google – particularly those which might determine whether data are accurate – will be closely followed by media defendants. But that is for another day.
Having played two in NT1 & NT2, Google lost one and won one. For the successful NT2, the judge made an order for ‘delisting’, but declined to order any damages since Google had taken ‘reasonable care’. As for NT1, his data protection and misuse of private information claims failed. He was given permission to appeal but, whatever the interest in the legal issues, the facts appear to be against him.
There is everything to play for and one thing is for sure: data protection will run and run.
Contributor Heather Rogers QC is a barrister at One Brick Court.