Google Spain v. González: Did the Court forget about freedom of expression?

S. Kulk, Frederik Zuiderveen Borgesius
ArXiv.org·2014·Social Sciences·28 citations
8 min read

Read the full paper at the DOI or on arXiv

TL;DR

The paper’s central claim is that Google Spain insufficiently engages freedom of expression and the public’s right to receive information when delisting name-based search results.

Briefing

This paper is a legal-analytic commentary on the CJEU’s landmark decision in Case C-131/12, Google Spain SL and Google Inc. v. Agencia Española de Protección de Datos and Mario Costeja González (“Google Spain”). The authors, S. Kulk and Frederik Zuiderveen Borgesius, ask a focused normative question: did the Court “forget about freedom of expression?” They argue that the judgment’s reasoning and doctrinal framing tilt too strongly toward privacy and data protection, while under-engaging the equally fundamental right to freedom of expression (including the public’s right to receive information) and the rights of information providers and search engine operators.

The question matters because Google Spain operationalized what is popularly called the “right to be forgotten” by recognizing, under certain conditions, a data subject’s ability to require delisting of search results tied to their name—even when the underlying third-party web content is lawful and accurate. This has had wide practical effects on online speech, discoverability, and the architecture of networked information. The authors’ concern is not simply that privacy can outweigh expression; rather, it is that the CJEU’s approach establishes a default hierarchy that treats privacy/data protection as overriding “as a rule” the public’s interest in access, thereby shifting the balancing task away from courts and toward private intermediaries.

Methodologically, the paper is not an empirical study. It is a doctrinal and interpretive analysis grounded in the text of the CJEU judgment, the relevant EU Data Protection Directive 95/46/EC, and comparative human-rights doctrine from the European Court of Human Rights (ECtHR). The authors also draw on related institutional materials and implementation evidence: they discuss the Advocate General’s opinion, the Article 29 Working Party’s prior views on search engines, and Google’s reported compliance statistics and transparency practices (e.g., communications to the Working Party and Google’s transparency reporting). The “data” in this paper are therefore legal authorities and documented implementation figures rather than survey or experimental data.

The paper first summarizes the case facts. In 1998, La Vanguardia Ediciones SL published online announcements about a real-estate auction connected to social security debts, naming Mario Costeja González as the real-estate owner. Although the publisher did not remove the information from its website (because the publication was ordered by the Spanish Ministry of Labour and Social Affairs), González sought delisting of search results that surfaced the newspaper pages when users searched for his name. After a complaint to the Spanish data protection authority (DPA), the DPA rejected the complaint as to the publisher (given the legal basis for publication) but upheld it regarding Google Spain and Google Inc., treating the search engine operator as capable of being required to remove links from its index and prevent future access through search.

The CJEU’s reasoning, as presented here, proceeds in several steps. First, it treats the search engine’s activities of indexing, storing, and displaying links as “processing” of personal data, even when the data are already publicly available and even if the operator does not alter the underlying data. Second, it classifies the search engine operator as a “controller” because it determines the purposes and means of the processing. Third, it establishes territorial reach: processing is carried out “in the context of the activities” of an establishment in a Member State when the operator has a branch or subsidiary oriented toward that Member State (here, Google Spain’s advertising activities). Fourth, it interprets the Directive’s rights and balancing provisions to allow delisting where the data subject’s rights under Articles 7 and 8 of the Charter (privacy and data protection) require it. The judgment emphasizes that search results provide a “structured overview” of information about an individual and can enable detailed profiling that would be difficult without search engines. It also states that even initially lawful processing can become incompatible over time when data become inadequate, irrelevant, or excessive.

The paper’s core critique is that the CJEU’s balancing framework insufficiently accounts for freedom of expression. The authors argue that delisting interferes with multiple expressive interests: (1) publishers’ freedom of expression (because delisting makes lawful content harder to find via name searches); (2) search engines’ expressive interests (because search engines curate and provide access to information); and (3) the public’s right to receive information. They stress that ECtHR doctrine treats privacy and expression as deserving “equal respect” and requires nuanced, case-by-case balancing. In contrast, the CJEU is said to declare that privacy/data protection rights “override, as a rule” not only the operator’s economic interest but also the general public’s interest in access. While the CJEU includes caveats (e.g., where the data subject plays a role in public life), the authors view the “as a rule” language as a departure from ECtHR’s equal-weight approach.

The paper also highlights institutional and governance concerns. By treating search engine operators as controllers, the CJEU effectively creates a private ordering system: delisting decisions are made by the operator under legal pressure, with courts and DPAs acting as review bodies. The authors argue that this is problematic in hard cases because search engines may not be the most appropriate party to balance competing fundamental rights. They cite the Advocate General’s warning that such balancing by the operator could lead to an unmanageable number of delisting requests and could incentivize automatic withdrawal of results. They further compare the dynamic to intermediary liability regimes (e.g., notice-and-takedown incentives), where intermediaries remove content expeditiously to reduce legal risk, potentially chilling expression.

On implementation, the paper provides specific figures from Google’s compliance communications. As of 18 July 2014, Google reported receiving 91,000 requests involving more than 300,000 URLs. Google delisted about half of the URLs upon request; for 32% it decided not to delist; and for 15% it asked for more information. These numbers are used to illustrate the scale of private decision-making and the administrative burden on intermediaries and oversight bodies.

The authors also argue that the CJEU’s controller framing creates doctrinal difficulties regarding “special categories of personal data.” Under the Directive, processing of such data (e.g., racial/ethnic origin, political opinions, religious beliefs, health or sex life) generally cannot rely on the balancing provision (Article 7(f)); instead, explicit consent or specific exceptions are required. Because search engines index many pages that may contain special category data, the authors contend that indexing could be partly illegal under the Directive unless exceptions apply. They note that the Advocate General sought to avoid this consequence by not treating the operator as a controller for processing on third-party pages, but the CJEU allegedly ignores the legal-basis gap.

Limitations are inherent in the paper’s design: it is a commentary rather than an empirical evaluation of outcomes. It does not systematically measure changes in speech, access, or chilling effects after the judgment. Instead, it relies on legal reasoning, doctrinal comparisons, and documented implementation statistics. The authors also acknowledge that their focus is primarily freedom of expression; jurisdictional issues are only briefly addressed.

Practically, the paper suggests that policymakers and courts should not solve all conflicts by imposing obligations on search engines. It proposes alternative solutions such as limiting indexing duration for certain outdated information (e.g., real-estate auction notices) or using technical measures like do-not-index directives (e.g., robots.txt) by original publishers. In the most difficult cases, it argues courts should balance privacy and freedom of expression with equal weight.
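The "do-not-index" mechanisms the paper mentions are, in practice, two separate publisher-side controls: the Robots Exclusion Protocol (robots.txt), which asks compliant crawlers not to fetch certain paths, and the `noindex` robots meta tag, which asks search engines not to list a page they have fetched. A minimal illustration follows; the `/auction-notices/` path is a hypothetical example chosen to echo the paper's real-estate-auction scenario, not something the paper specifies.

```text
# robots.txt at the site root — asks compliant crawlers
# not to crawl anything under this path
User-agent: *
Disallow: /auction-notices/
```

```html
<!-- Per-page alternative: the page may still be crawled,
     but search engines are asked not to list it in results -->
<meta name="robots" content="noindex">
```

Both are voluntary signals honored by cooperating crawlers rather than legal guarantees, which fits the paper's framing of them as lighter-weight, publisher-controlled alternatives to court-ordered delisting by search engines.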

Who should care? The authors’ critique is aimed at legal practitioners, courts, data protection authorities, and policymakers shaping EU data protection and intermediary governance. It is also relevant to publishers and platform operators because it affects discoverability of lawful content and the legal risk calculus that drives delisting decisions. Ultimately, the paper frames Google Spain as not merely a privacy case but a structural decision about who performs fundamental-rights balancing in the online information environment.

Cornell Notes

The paper argues that the CJEU’s Google Spain judgment, while recognizing delisting rights, under-emphasized freedom of expression and the public’s right to receive information. By treating search engines as data controllers, it also shifts complex fundamental-rights balancing to private intermediaries, raising governance and legal-basis concerns (including for special-category data).

What research question does the paper address?

Whether the CJEU in Google Spain gave sufficient attention to freedom of expression when it recognized delisting rights for name-based search results.

Why does the authors’ question matter in the broader context?

Because delisting affects not only privacy but also publishers’ speech, search engines’ role in information access, and the public’s ability to find information online.

What is the paper’s methodology?

Doctrinal legal analysis comparing the CJEU’s reasoning with EU data protection law, the Advocate General’s opinion, and ECtHR freedom-of-expression/privacy balancing doctrine, supplemented by documented implementation figures and transparency practices.

What key legal mechanism did the CJEU establish for delisting?

Under Articles 12(b) and 14(1)(a) of the Directive, delisting can be required when continued linking becomes incompatible over time (data become inadequate, irrelevant, or excessive), subject to a fair balance and exceptions (e.g., public-life role).

How does the paper characterize the CJEU’s balancing approach?

It claims the CJEU treats privacy/data protection as overriding “as a rule” the public’s interest in access, departing from ECtHR doctrine that gives equal respect to privacy and expression.

What specific implementation statistics does the paper cite from Google?

As of 18 July 2014, Google reported 91,000 requests involving more than 300,000 URLs; it delisted about half, did not delist 32%, and asked for more information in 15%.

What governance concern does the paper raise about treating search engines as controllers?

It may create private ordering where operators must balance rights in hard cases, potentially leading to automatic or overly cautious delisting and limited transparency for the public.

What legal-basis problem does the paper identify regarding special categories of data?

Because special-category data processing generally cannot rely on the balancing provision, indexing third-party pages containing such data could lack a lawful basis, making parts of search indexing potentially unlawful under the Directive.

What practical alternatives does the paper suggest?

Instead of relying mainly on search-engine obligations, policymakers could use time-limited indexing for outdated notices and technical measures (e.g., do-not-index/robots.txt), and courts should balance rights with equal weight in difficult cases.

Review Questions

  1. How does the paper argue that the CJEU’s “as a rule” language changes the hierarchy between privacy/data protection and freedom of expression?

  2. What does the paper claim are the consequences of shifting fundamental-rights balancing to search engines as “controllers”?

  3. Explain the paper’s special-category-data argument: why does the controller framing create a legal-basis gap for indexing?

  4. What do the cited Google compliance numbers (requests/URLs and delisting rates) illustrate about the scale and nature of private decision-making after Google Spain?

  5. What alternative policy solutions does the paper propose to address outdated or harmful information on the web without overburdening search engines?

Key Points

  1. The paper’s central claim is that Google Spain insufficiently engages freedom of expression and the public’s right to receive information when delisting name-based search results.

  2. It argues the CJEU’s balancing framework gives privacy/data protection overriding weight “as a rule,” departing from ECtHR doctrine of equal respect between privacy and expression.

  3. By treating search engines as data controllers, the judgment shifts complex fundamental-rights balancing to private intermediaries, raising concerns about chilling effects, incentives for over-removal, and limited transparency.

  4. The authors highlight a governance problem: delisting decisions are made through private notice-and-takedown-like processes, with the public lacking clear oversight over what is removed.

  5. They argue the controller framing creates doctrinal difficulties for indexing “special categories of personal data,” because the Directive generally does not allow reliance on the balancing provision for such data.

  6. The paper uses Google’s reported implementation figures (91,000 requests; 300,000+ URLs; roughly half delisted; 32% not delisted; 15% asked for more info) to show the scale of operator decision-making.

  7. The paper proposes that policymakers and courts should consider alternatives (e.g., time-limited indexing and do-not-index mechanisms) and reserve equal-weight judicial balancing for hard cases.

Highlights

The authors note the CJEU’s holding that privacy/data protection rights “override, as a rule… the interest of the general public in having access to that information upon a search relating to the data subject’s name.”
They report Google’s compliance statistics: “91.000 requests involving more than 300.000 URLs… [with] about half of the URLs” delisted, “32%” not delisted, and “15%” requiring more information.
They contend that delisting “limits the publisher’s freedom of expression, because it makes the original publication harder to find – at least on the basis of a name search.”
They warn that treating search engines as controllers may create “an unmanageable number of delisting requests” and incentivize automatic withdrawal of results (drawing on the Advocate General’s concerns).
They argue that indexing may lack a legal basis for “special categories of data” because such processing generally cannot rely on the balancing provision under the Directive.

Topics

  • EU data protection law
  • Fundamental rights (privacy and freedom of expression)
  • Right to be forgotten / delisting
  • Human rights balancing doctrine
  • Intermediary liability and platform governance
  • Freedom of information and search engines
  • Special categories of personal data
  • Judicial review vs private ordering

Mentioned

  • Google
  • Google Search
  • Google Spain
  • La Vanguardia Ediciones SL
  • Chilling Effects Clearinghouse
  • Article 29 Working Party
  • S. Kulk
  • Frederik Zuiderveen Borgesius
  • Mario Costeja González
  • Advocate General Jääskinen
  • Joris van Hoboken
  • Christopher Kuner
  • Grainne De Burca
  • James Grimmelmann
  • Rosa Julià-Barceló
  • Kamiel J Koelman
  • Jennifer Urban
  • Laura Quilter
  • Wendy Seltzer
  • Axel Springer AG (as a case party)
  • David Smith
  • Kelly Fiveash
  • CJEU - Court of Justice of the European Union
  • DPA - Data Protection Authority
  • ECtHR - European Court of Human Rights
  • ECHR - European Convention on Human Rights
  • E-Commerce Directive - Directive 2000/31/EC on electronic commerce
  • DMCA - Digital Millennium Copyright Act
  • URL - Uniform Resource Locator