On June 25, 2013, the Opinion of the Advocate General Niilo Jääskinen (AG) in case C-131/12, Google Spain v. Agencia Española de Protección de Datos, was published. This case, which is pending at the Court of Justice of the European Union (CJEU), is being closely watched because one of the questions presented to the court is about the right to be forgotten by search engines. This question implicates the proper balance of freedom of expression and protection of personal data and privacy under EU law.
The case is also interesting because it is the first time the CJEU has been asked to interpret the 1995 Data Protection Directive vis-à-vis search engines. When the CJEU finally reaches a decision in this case, it will be binding not only on the Spanish courts, but on the national courts of all 28 Member States of the European Union.
Facts of the Case
In 1998, a Spanish newspaper published, both off-line and online, information about a court-ordered foreclosure auction to pay social security debt. In 2009, the debtor, who had since paid his debt, discovered that 'googling' his name led to a link to the online notice.
He asked the newspaper to take the information down, but the editor refused as the publication had originally been made by order of the Ministry of Labor and Social Affairs. He then asked Google Spain to stop referencing the link in its search results and also complained to Spain's Data Protection Authority, the Agencia Española de Protección de Datos (AEPD).
The AEPD asked Google to stop indexing the link, but refused to ask the newspaper editor to take the original information down, as the publication was legally justified. Google appealed, and on March 9, 2012, the Audiencia Nacional of Spain issued a reference for a preliminary ruling to the CJEU. This procedure allows judges from the Member States to ask the CJEU how particular EU laws should be interpreted and applied.
Three Questions Asked to the CJEU
There were three main categories of questions referred to the CJEU. The first was about the territorial application of the EU Data Protection Directive, the second was whether a search engine should be considered a data controller under the 1995 Data Protection Directive, and the third was whether there is a right to be forgotten by a search engine.
1. The Territorial Scope of the Data Protection Directive
Under article 4(1) of the Data Protection Directive, the Directive is applicable to a "data controller" that processes data in the territory of a Member State or uses equipment situated in the territory of a Member State, even if the controller is not established there.
Google's business model is based on keyword advertising, and in the course of selling targeted advertisements to people living in the EU, Google processes personal data. It has subsidiaries in several Member States. If these subsidiaries act as a bridge between the referencing service and the advertising market, they are establishments within the meaning of article 4(1) of the Directive. Not much surprise here.
2. The Liability of Search Engines
A more interesting question was the applicability of the Data Protection Directive to a search engine. There is no doubt, under the 2003 CJEU Lindqvist case, that the publisher of web pages containing personal data is a data controller. But should a search engine also be considered a data controller?
According to the Advocate General, search engine activities are indeed personal data processing (at 75). However, Google should not be considered a "controller" under the definition of article 2(d) of the Directive, which defines the controller as the person who "determines the purposes and means of the processing of personal data." That is, search engine providers merely supply the tools used to locate information, but do not exercise control over personal data. They cannot distinguish personal data from non-personal data, and cannot change information on host servers. Rather, they crawl the web to retrieve and copy web pages in order to index them; this is a "passive relationship to electronically stored or transmitted content" (at 87). Accordingly, the AG concluded that a search engine is generally not a data controller (at 89).
However, according to the AG, a search engine provider can be considered a controller when it exercises discretionary control over the search engine's index (as opposed to automatically caching content from other web pages) (at 91-92). For instance, a search engine sometimes blocks certain search results or does not display some URL addresses. A search engine might also be considered a controller when it deliberately caches content despite an exclusion code on a web page, or refuses to update the cached version of a web page at the website's request (at 93). In these circumstances, it has to comply with articles 6(c) and 6(d) of the Directive, which state the principles of adequacy, relevance, proportionality, accuracy and completeness when processing such data (at 94-98).
The Spanish data subject had asked Google to suppress his name from its index. That was not a complaint about Google's exercise of control, but about automatically cached content. The AG concludes that even if a search engine, such as Google, actually processes some personal data, it generally is not a controller under the Data Protection Directive. Therefore a national data protection authority cannot ask a search engine to take down personal data, unless the search engine posted that data notwithstanding exclusion codes or ignored a request by a website to update its cache (at 99-100). The AG deferred on the issue of whether a "notice and takedown" procedure for illegal or inappropriate content might apply to search engines, finding that to be a question of national law based on grounds unrelated to the protection of private information (at 99).
3. Right to be Forgotten
The last of the three questions referred to the CJEU was whether there exists a right to be forgotten, although the answer to that question under the Data Protection Directive would be relevant only in the limited circumstances where a search engine could be considered a data controller. Nevertheless, the AG considered whether either the Directive or the Charter of Fundamental Rights of the EU gave rise to such a right.
- Does the Data Protection Directive Provide for a General Right to Be Forgotten?
Article 12(b) of the Directive provides the data subject with a right to rectify, erase or block the processing of incomplete or inaccurate data. Article 14(a) provides the data subject with the right to object to the processing of his data. The Spanish court asked whether these rights include the right to ask a search engine provider to prevent the indexing of personal information, even lawfully published data, which would give data subjects a 'right to be forgotten.'
According to the AG, article 12(b) only applies to incomplete or inaccurate information, which is not the case here. Because the information had been lawfully published, article 14(a) is also inapplicable. The AG is of the opinion that "the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests" (at 108).
- Is the Right to Be Forgotten a Fundamental Right?
The AG then considered whether denying a right to be forgotten is compatible with the 2000 Charter of Fundamental Rights of the EU, which sets forth all the civil, political, economic and social rights of European citizens and all persons residing in the EU.
Among these rights is the protection of personal data, which must be, according to the Charter's article 8(2), "processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law." According to the AG, this does not add any new element to the analysis.
But article 7 of the Charter also provides that "[e]veryone has the right to respect for his or her private and family life, home and communications," a right which is also protected by article 8 of the European Convention on Human Rights. Under CJEU case law, the right to private life with regard to the processing of personal data covers all information relating to an individual, both in his private and in his professional sphere (the AG quoted the 2010 Volker und Markus Schecke case).
As Google indeed processes personal data, there is an interference with the right to privacy guaranteed by article 7, and thus, to be lawful, this interference must be provided for by law and necessary in a democratic society under both the Charter and the European Convention on Human Rights.
But the Charter, in its article 11, and the European Convention on Human Rights, in its article 10, also protect freedom of expression and information. Internet users have the right to seek and receive information on the Web; this is even stated to be "one of the most important ways to exercise [the] fundamental right to receive information" (at 131). That right would be compromised if search results were somehow sanitized (the AG used the term 'bowdlerized,' after Thomas Bowdler, who prepared an 'appropriate' version of Shakespeare in the 19th century).
The AG quoted the 2010 case of Aleksey Ovchinnikov v. Russia, in which the European Court of Human Rights found that "in certain circumstances a restriction on reproducing information that has already entered the public domain may be justified, for example to prevent further airing of the details of an individual's private life which do not come within the scope of any political or public debate on a matter of general importance."
However, the right to private life must be balanced against freedom of expression and freedom of information. Giving data subjects a right to be forgotten "would entail sacrificing pivotal rights such as freedom of expression and information," and should not be implemented even on a case-by-case basis, such as through a notice and takedown procedure. According to the AG, that would lead either to the automatic withdrawal of links or to an unmanageable number of takedown requests (at 133). It would also amount to censorship of content published on the Web, with the search engine placed in the uncomfortable role of censor.
Notably, the Advocate General started his opinion by quoting the 1890 Warren and Brandeis article "The Right to Privacy." His opinion that there should be no right to be forgotten allowing individuals to require that search results be stripped of their personal data is indeed in line with the general U.S. view on the subject.
The opinion only addressed the issue of the right to be forgotten by a search engine, which is considered an intermediary by the Directive, and not by the original publisher. By using 'exclusion codes' directing search engines not to index or display a particular web page in their results, an online publisher can effectively minimize the effect of publishing information online, or choose to take the information down entirely if asked to do so by the data subject.
This 'right to be forgotten' is hotly debated right now on both sides of the Atlantic, after the EU Commission published in January 2012 a Proposal for a data protection Regulation, which would be directly applicable in all the Member States. Under article 17 of the Proposal, the data subject would have the right to have the data controller erase personal data relating to her and to stop further dissemination of such data. It remains to be seen when and if this article will become law in the EU. However, if there is no right to be forgotten, everything we do, both online and off-line, may remain recorded on the Web in perpetuity.
Marie-Andrée Weiss is a solo attorney admitted in New York, and her admission is pending in France. Her practice focuses on intellectual property, privacy, and social media law. She frequently writes on these topics and on European Union law.
(Image courtesy of Flickr user fake is the new real pursuant to a Creative Commons CC BY-NC-SA 2.0 license.)