To mark Privacy Week 2016, NZLS journalist and law graduate James Greenland considers the European Union's new "right to be forgotten" – more accurately, a right to have unwanted personal information deleted from a search engine's result listings – and asks whether New Zealand's existing privacy law would benefit from recognition of a similar right.
In 2014 the Court of Justice of the European Union (CJEU) in its landmark Google Spain decision recognised a so-called "right to be forgotten", allowing individuals in the EU to make prejudicial but true information about them less accessible to anyone searching their name online. Less a right to be forgotten, it's more a right to hide undesirable personal information.
Shortly after the CJEU's decision, New Zealand Privacy Commissioner John Edwards wrote a blog post asking New Zealanders where they thought the balance should lie between the right to privacy and the rights to freedom of expression and information. He said Google Spain had been met with "criticism, astonishment, suspicion, relief and applause" – despite the implications of the ruling not being "as clear cut as you might think".
The Google Spain facts
In 1998 a Spanish newspaper published two articles about one Mr González, a mere 35 words each, recording that Mr González's home was to be auctioned to pay off his debts. The articles were lawfully published online. Nearly two decades later, in the events leading to the CJEU decision, Mr González realised that any internet user who typed his name into Google was likely to receive links to the articles. The past was over, he argued, and it was prejudicial to his present and future that information about his debt history continue to be recalled and remembered. So he demanded that Google break the links, and that the original articles be removed by their publisher. Only partly successful, Mr González won the right to have the links removed from online circulation. The original articles remain online – they're just much more difficult to find. The "right to be forgotten" "takes us back to a time when people had to go to the library to research past debts rather than instantly downloading them," commentators have observed.
So, while Google Spain was not a resounding victory for Mr González, its implications are massive, and managing the change of law has proved a mammoth task – one that has so far been conveniently, if questionably, outsourced to Google itself.
"Do No Evil" – Google's power to decide
The day following Google Spain more than 10,000 people in the EU requested that links to information relating to them be taken down. Amongst the first to seek to enforce the newly recognised right were a convicted child pornographer and a doctor who'd received a poor review of his medical practice.
Here's how Google – whose marketing mantra is "don't be evil" – currently handles take-down requests.
An online web form allows individuals to request the removal of unwanted search result links. As the process is controlled by Google, a private company, there is (ironically?) little information publicly available about how decisions that must balance fundamental rights to privacy and information are made.
What is known is that such decisions are made by Google's own internal "senior panel", which discusses and votes on whether links should be removed, taking into account "the characteristics of the individual, the publisher of the information, and the nature of the information available via the link".
If Google deems removal justified, it will "delist" links only insofar as they are displayed against a search of the data subject's name, and then only from the country-specific domain where the individual lives (although Google is in the process of extending the right to be forgotten so that links are removed from other countries' domains too).
As with other privacy law, an individual's public profile will also be relevant to the assessment of privacy rights – the more famous you are, the less you can expect a right to privacy.
An aside: an Italian court recently decided that an individual is also entitled to "edit" the snippet of information that appears below links listed in Google's search results if it is deemed misleading. Automated algorithms currently produce that content without human intervention, but soon Google might be required to take on an editing capacity too.
Google's responsibility has been described as "quasi-judicial" – it has the power to make the sorts of decisions that are usually made by Governments and Courts.
Even Google's own chairman Eric Schmidt doesn't like the position his company has been put in, publicly decrying: "we didn't ask for it". Perhaps unsurprisingly – considering the principles of free information behind his revolutionary online encyclopaedia – Wikipedia founder Jimmy Wales is also amongst the most vocal opponents of the "right to be forgotten" and has lobbied the European Parliament to immediately override the CJEU's Google Spain decision, deriding it as a "right to censor some information that you don't like".
Privacy concerns or cosmetic censorship?
Since launching its "take-down" web form service in July 2014 Google has evaluated more than 1 million links and removed 41% of those from its search results. Time has shown that Google will almost certainly remove links to articles naming the victims of crime, and it's likely to remove links revealing a person's involvement with minor crime or quashed convictions.
Perhaps of more concern is Google's willingness to remove links to articles containing minor yet sensitive personal details, such as residential addresses and opinions. An article detailing a competition entered by an individual when they were younger has been effectively hidden. The same goes for an article about minor crimes committed ten years previously by a school teacher.
More widely acceptable is the fact that Google seems especially reluctant to remove links when they lead to information about an individual's professional capacity and activities. Google has chosen not to remove links to information about a couple arrested for business fraud, a professional's arrest for financial crimes, a doctor's botched procedure and an individual's dismissal for sexual crimes committed at work.
As a Dutch court put it when faced with interpreting the new law, the "right to be forgotten" is not a right to remove articles which "may be unpleasant, but [are] not unlawful" from the eyes of the public. Although that is arguably exactly the right the appellant won in Google Spain.
It is a natural consequence of high profile crime, for example, that criminals receive public notoriety. Offenders must be prepared to live with the consequences of public knowledge of their decisions and actions, which can be far longer lasting, possibly permanent, in the internet age – but are arguably no less deserved. Do the crime, do the time, essentially.
But what about when people want the world to "forget" their less serious indiscretions, to make it more difficult for undesirable but true facts from their past to be known in the present?
But I've got nothing to hide …
Why should lawyers in their professional capacity care about the right to demand Google breaks links to unwanted information?
Don't most professionals wish to be remembered for their legacy of service, not forgotten to the pages of "practical obscurity"?
And do we really want our doctors hiding evidence of malpractice, lawyers erasing records of misconduct, sex offenders deleting references to past crimes, rehabilitated drug addicts removing evidence of their addiction, or former prostitutes deleting the record of their past work?
Where do we draw the line? What other kinds of information might people want to conceal?
Lawyer rating and review websites are increasing in popularity. A right to remove links to old, undesirable legal service reviews might become incredibly important to lawyers who want to maintain an untarnished professional reputation.
Is a "right to be forgotten" even that new?
The Privacy Commissioner blogs that the so-called "new" right has its origins in pre-internet jurisprudence and principles, and makes plain his distaste for the phrase "right to be forgotten".
"It is inaccurate, imprecise, and impossible".
Really it's a right to "practical obscurity" – a concept recognised more than 30 years ago by the US courts – which appreciates that while information may have been "publicly available" (such as on record at a courthouse or local council office) and open for inspection, "the passage of time and geographical obstacles to overcome for most people to gain access to the information guaranteed a degree of privacy in relation to that material".
Even New Zealand's High Court in Tucker – a leading case in our privacy law jurisprudence – recognised that privacy could potentially "grow back" over previously public information. Tucker was about a man with a bad ticker, who – ingeniously for the 80s – attempted to "crowdfund" the heart transplant that would save his life. He might have been successful, had media not published details of Tucker's past convictions for sexual offending against children, derailing support for his operation. The case was never fully heard, and it remains academic whether New Zealand's courts would have recognised that Tucker had an enforceable right to have those details of his past offending "forgotten" or suppressed.
A right to be forgotten in New Zealand?
The EU "right" was "developed to remedy the dichotomy between ephemera and permanence created by the exponential power of the digital age", writes Victoria University student Anna Fraser in her research paper "Should there be a right to be forgotten (the right to make search engines forget about you) in New Zealand? An analysis of Google v Spain".
Google's search engine has the power to create a digital profile of an individual using the aggregation of disparate data about them. Taken individually, this data is mostly meaningless – a record of our thousands of discrete actions and appearances/references online. Taken together these disparate data points can paint an intimate picture of a person's life, public and private.
The ubiquitous nature of internet record keeping makes it difficult for others to "forget" information that would previously, pre-Google, have faded from memory.
The "core issue arises when information generated by Google is seen as prejudicial" by an individual who wants It removed, Ms Fraser writes. "How should their right to privacy be balanced with the rights of freedom of expression and access to information...?"
Ms Fraser argues that a similar, but more limited, right to be forgotten should be created or read-in to New Zealand's existing privacy law. By suggesting a framework for the enforcement of such a right, she is directly responding to the Privacy Commissioner's question: where do you think the balance should be?
The HDCA and existing law can handle it
Ms Fraser says "a right to be forgotten should aim to reflect social mores about what a person is entitled to put behind them and what remains society's business ad infinitum". But decisions balancing the rights to expression and privacy shouldn't be made by a private company, she adds.
She says the Harmful Digital Communications Act (HDCA) – due to come into force next year, and designed to "deter, prevent, and mitigate harm caused to individuals by digital communications; and provide victims … with a quick and efficient means of redress" – should be sufficient to protect New Zealanders' legitimate rights to online privacy.
The HDCA will establish an independent agency, whose experts could conduct the types of ethical analyses Google currently undertakes in the EU, she says. Where "sensitive personal facts" published online can be proved to be harmful to individuals, links to those facts will be taken down. A right to appeal through the judicial hierarchy would exist as usual.
Wellington barrister and privacy law lecturer Steven Price agrees the HDCA has potential to address these issues, and says that a "right to be forgotten" is "silly".
"There is no such right and there can't be one.
"There can be a power to remove material from certain places if it is found to be interfering with legal rights. Pretty much all countries have that in some form."
New Zealand already has relatively well-developed privacy laws that give individuals some power over how accessible their private information is to the public.
But under the developing European law, "the rules for removal are vague" and their "application is, in practical terms, up to Google".
"In NZ we already have the ability to have material removed for various privacy reasons via a variety of mechanisms - the courts can grant injunctions preventing the publication or continued publication of private facts where publication is highly offensive and there's no countervailing public interest.
"The Press Council, Online Media Standards Authority and Broadcasting Standards Authority can rule on complaints and the first two can order removal of material in some cases.
"The Harmful Digital Communications Act will add to this when its civil regime comes into force, in particular because it contains the power to make takedown orders.
Crucially, though, it is not designed to expand the reach of the law so much as to make the remedies offered by courts more accessible, Mr Price says. It's about a better way of enforcing existing privacy law and principles.
"It's not entirely clear how it will be interpreted, and some are worried that it may get out of hand.
"But it contains a range of measures aiming to limit the takedown powers, and expressly says that any takedown order must be consistent with the NZ Bill of Rights Act, which protects freedom of expression except where a particular ruling is reasonable and demonstrably justified in a free and democratic society."
The Big Question
What information should an individual be entitled to remove from the digital record of history, in a free and democratic society that values the freedom of expression – especially when that information is accurate?
"A free internet has effectively "harnessed the world's interests, creativity, and intelligence to produce a colossal archive of everything," those against the EU law say.
"People should have the right to know true information."
If construed too widely, a "right to be forgotten" becomes a right to rewrite the past. And doesn't that undermine many of the benefits and promises of the internet age? Wouldn't it destroy the archive of everything?
Isn't it a bit 1984?
If knowledge is power, isn't it dangerous to give anyone, especially a private company like Google, the ability to decide what information will and won't be archived online?
How would you balance the rights to privacy and the rights to expression and free access to information?
Where would you draw the line?
New Zealand will soon have to decide this for itself.