You may remember that around this time last year I wrote a rather critical analysis of the newly established Right to be Forgotten which resulted from the Google Spain decision. You may also remember that Julia Powles and Rebekah Larsen collected a great deal of commentary (available here) from all sides of the debate on this topic, including, I am flattered to say, mine. Apart from anything else, this collection of commentary from all perspectives helped me re-analyse my own position on the Right to be Forgotten (RTBF), and move from being staunchly against it to being critical of how it was implemented. A year down the line, Julia Powles and Ellen Goodman managed to round up signatures from the lot of us, and composed an excellent Open Letter to Google, asking them for more transparency in how exactly they handle RTBF requests.
This letter was also published by The Guardian, and sets out a number of key requests for Google:
Here is what we think, at a minimum, should be disclosed:
- Categories of RTBF requests/requesters that are excluded or presumptively excluded (e.g., alleged defamation, public figures) and how those categories are defined and assessed.
- Categories of RTBF requests/requesters that are accepted or presumptively accepted (e.g., health information, address or telephone number, intimate information, information older than a certain time) and how those categories are defined and assessed.
- Proportion of requests and successful delistings (in each case by % of requests and URLs) that concern categories including (taken from Google anecdotes): (a) victims of crime or tragedy; (b) health information; (c) address or telephone number; (d) intimate information or photos; (e) people incidentally mentioned in a news story; (f) information about subjects who are minors; (g) accusations for which the claimant was subsequently exonerated, acquitted, or not charged; and (h) political opinions no longer held.
- Breakdown of overall requests (by % of requests and URLs, each according to nation of origin) according to the WP29 Guidelines’ categories. To the extent that Google uses different categories, such as past crimes or sex life, a breakdown by those categories. Where requests fall into multiple categories, that complexity too can be reflected in the data.
- Reasons for denial of delisting (by % of requests and URLs, each according to nation of origin). Where a decision rests on multiple grounds, that complexity too can be reflected in the data.
- Reasons for grant of delisting (by % of requests and URLs, each according to nation of origin). As above, multi-factored decisions can be reflected in the data.
- Categories of public figures denied delisting (e.g., public official, entertainer), including whether a Wikipedia presence is being used as a general proxy for status as a public figure.
- Source (e.g., professional media, social media, official public records) of material for delisted URLs by % and nation of origin (with top 5–10 sources of URLs in each category).
- Proportion of overall requests and successful delistings (each by % of requests and URLs, and with respect to both, according to nation of origin) concerning information first made available by the requestor (and, if so, (a) whether the information was posted directly by the requestor or by a third party, and (b) whether it is still within the requestor’s control, such as on his/her own Facebook page).
- Proportion of requests (by % of requests and URLs) where the information is targeted to the requester’s own geographic location (e.g., a Spanish newspaper reporting on a Spanish person about a Spanish auction).
- Proportion of searches for delisted pages that actually involve the requester’s name (perhaps in the form of % of delisted URLs that garnered certain threshold percentages of traffic from name searches).
- Proportion of delistings (by % of requests and URLs, each according to nation of origin) for which the original publisher or the relevant data protection authority participated in the decision.
- Specification of (a) types of webmasters that are not notified by default (e.g., malicious porn sites); (b) proportion of delistings (by % of requests and URLs) where the webmaster additionally removes information or applies robots.txt at source; and (c) proportion of delistings (by % of requests and URLs) where the webmaster lodges an objection.
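For readers unfamiliar with the mechanism the letter's last point refers to, "applying robots.txt at source" means the original publisher instructs search-engine crawlers not to fetch the page at all, which over time removes it from search results for everyone, rather than merely delisting it from searches on one person's name. A minimal, purely hypothetical sketch (the file path is invented for illustration):

```
# robots.txt, served from the publisher's site root (hypothetical example)
# Tells all compliant crawlers not to fetch this URL at source,
# in contrast to Google delisting it only for name-based searches.
User-agent: *
Disallow: /archive/2010/old-story.html
```

Note that a robots.txt Disallow rule blocks crawling by well-behaved bots; it is a voluntary convention, not an enforcement mechanism, which is partly why the letter asks how often webmasters actually take this step after being notified.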
TechCrunch has also picked up on the Open Letter, pointing out that it focused on Google specifically because Google was, realistically, receiving the lion’s share of RTBF delisting requests. It also highlights the signatories’ frustration that, although Google has made some attempt to be transparent about the procedure, “the information that Google has released thus far has been ‘anecdotal’ and does not allow outsiders to judge how representative it is.”
Incidentally, Google have already, sort of, responded to our open letter: though nothing concrete was promised, they assured Wired.co.uk that they would take the proposals under advisement. Though Google have been (understandably) critical of the RTBF decision, they have made efforts to implement it fairly and somewhat transparently. As the Google spokesperson, talking to Wired.co.uk, stated, they “… launched a section of [their] Transparency Report on these removals within six months of the ruling because it was important to help the public understand the impact of the ruling … [their] Transparency Report is always evolving and it’s helpful to have feedback like this so we know what information the public would find useful.” As Katie Collins puts it:
Google’s Global Privacy Counsel has openly said the company is “building a rich programme of jurisprudence”, but as the letter points out, “it is jurisprudence built in the dark”. The meaning of the ruling, it says, “deserves much greater elaboration, substantiation and discussion”.
Sophie Curtis, in an article for The Telegraph, captures this balance well: on the one hand, disapproval of the decision by Google and others; on the other hand, the fact that Google has nonetheless made attempts to be transparent; and on the third hand (don’t ask), the fact that more could be done to be open and transparent about how exactly the Right to be Forgotten works, or should work, in Europe.
The letter also points to recommendations by Google’s own Advisory Council, as well as the Article 29 Working Party, that “data controllers” such as Google should be as transparent as possible about the way they deal with right to be forgotten requests.
It is of course important to keep in mind that this could all change if the fabled and long-awaited General Data Protection Regulation is ever finalised and comes into effect. By its strictly applicable, EU-Regulation-y nature, it would fully harmonise large aspects of data protection and privacy within Europe, including perhaps a new or revised version of the Right to be Forgotten, more along the lines of the previously proposed right to erasure (or, unlikely as it is, a backtrack on the RTBF altogether).
There hasn’t been an official response from Google at the time of writing, but not for want of pestering, so I shall endeavour to update this post if and when there is one. In the meantime, spread the word; the original letter can be found here.
Image attribution: Bee on Forget-me-Nots by Henry Hemming | Flickr
“European Parliament legislative resolution of 12 March 2014 on the proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation)”
Neil Brady, “Does the ‘right of erasure’ pose a bigger threat than the ‘right to be forgotten’?”, The Guardian, http://www.theguardian.com/media-network/media-network-blog/2014/jul/10/right-forgotten-google-data-protection
Katie Collins, “Google ‘considers’ more ‘right to be forgotten’ transparency”, Wired.co.uk, http://www.wired.co.uk/news/archive/2015-05/14/google-right-to-be-forgotten-transparency-letter
Sophie Curtis, “Google under fire from academics over ‘right to be forgotten’ transparency”, The Telegraph, http://www.telegraph.co.uk/technology/google/11605425/Google-under-fire-from-academics-over-right-to-be-forgotten-transparency.html
Jemima Kiss, “Dear Google: open letter from 80 academics on ‘right to be forgotten’”, The Guardian, http://www.theguardian.com/technology/2015/may/14/dear-google-open-letter-from-80-academics-on-right-to-be-forgotten
Shane McNamee, “Europe and the Right to be Forgotten: a Memorable Victory for Privacy or Defeat for Free Speech?”, The Undisciplined, https://theundisciplined.com/2014/05/17/europe-and-the-right-to-be-forgotten-a-memorable-victory-for-privacy-or-defeat-for-free-speech/
Ellen P Goodman, “Open Letter to Google From 80 Internet Scholars: Release RTBF Compliance Data”, Medium, https://medium.com/@ellgood/open-letter-to-google-from-80-internet-scholars-release-rtbf-compliance-data-cbfc6d59f1bd
Cambridge Code, “Google Spain Commentary”, http://www.cambridge-code.org/googlespain.html
LSE Media Policy Project Blog, “In Open Letter to Google, 80 Technology Scholars Press for More Transparency on Right to Be Forgotten Compliance”, http://blogs.lse.ac.uk/mediapolicyproject/2015/05/14/in-open-letter-to-google-80-technology-scholars-press-for-more-transparency-on-right-to-be-forgotten-compliance/