EU Pushes for Real-Time Access to Google Search Data

The European Commission plans to access Google’s most sensitive user data. Officially, the aim is to promote competition. In practice, it lays the groundwork for a new form of real-time monitoring.

The European Commission’s push to access Google search data, framed as a competition measure, could enable near real-time monitoring of user behavior at scale. Photo: Carl Court/Getty Images/Gemini

The European Commission has put forward a proposal that would require Google to provide detailed data on how its search engine is used across Europe. Under the Digital Markets Act, the information would be made available via an application programming interface and updated on a regular basis.

Officially, the measure is intended to foster competition by giving rival providers the ability to improve their own search engines. The proposal itself, however, outlines an extensive collection of search behavior data. It describes a data structure that effectively captures the full scope of user activity, at a level of detail that has so far been accessible only to the platform operator.

The plan would record every individual search query, including the original wording and any subsequent modifications. It would also include timestamps, location data, language and device type. The context of each query would be tracked as well, including whether it was made via a browser, an app, voice control or image recognition.
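
To make the scope of the described collection concrete, the fields listed above can be sketched as a single per-query record. This is purely illustrative: the field names, types and the `Channel` categories below are assumptions drawn from the article's description, not the actual schema in the Commission's proposal.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class Channel(Enum):
    """Illustrative entry channels, per the article's description."""
    BROWSER = "browser"
    APP = "app"
    VOICE = "voice"
    IMAGE = "image"

@dataclass
class SearchEvent:
    query: str              # original wording of the query
    refinements: list[str]  # any subsequent modifications
    timestamp: datetime
    location: str           # e.g. a region code
    language: str
    device_type: str        # e.g. "mobile", "desktop"
    channel: Channel        # how the query was entered
```

Even in this reduced form, each record combines content (the query), context (time, place, language) and channel, which is what makes the aggregate so revealing.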

Mapping the User

The data collection would not end with the query itself. All displayed results would also be recorded, from organic listings and advertisements to so-called knowledge panels and short answers. For each result, position, presentation and context would be stored, creating a complete record of what a user sees.

Particularly far-reaching is the collection of interaction data. The proposed interface would not only document clicks, but also their sequence, duration and timing. It would further track scrolling behavior, dwell time, returns to search results and patterns of navigation across content.
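
The interaction signals described above are derived quantities: from a timestamped stream of clicks and returns to the results page, metrics such as dwell time fall out almost for free. The sketch below shows how; the event names and log format are invented for illustration and do not reflect the proposal's actual interface.

```python
from datetime import datetime

# Hypothetical interaction log: (timestamp, event_type, result_position)
events = [
    (datetime(2026, 1, 1, 12, 0, 0), "click", 1),
    (datetime(2026, 1, 1, 12, 0, 40), "return_to_results", None),
    (datetime(2026, 1, 1, 12, 0, 45), "click", 3),
]

def dwell_times(events):
    """Derive per-click dwell time: a click immediately followed by a
    return to the results page yields (position, seconds on result)."""
    out = []
    for (t1, kind1, pos), (t2, kind2, _) in zip(events, events[1:]):
        if kind1 == "click" and kind2 == "return_to_results":
            out.append((pos, (t2 - t1).total_seconds()))
    return out

print(dwell_times(events))  # → [(1, 40.0)]
```

The point is that none of these behavioral metrics need to be collected separately: a sufficiently fine-grained event stream implies them all.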

The planned update frequency is the decisive factor. Rather than a one-off static dataset, the information would be refreshed continuously. That creates the technical possibility of analyzing search behavior in near real time.

This fundamentally changes the nature of the project. Access to search data on this scale would make it possible to track, at any given moment, what people are interested in, which questions they are asking, which topics are gaining traction and how users respond to them. That goes beyond traditional market transparency. It amounts to infrastructure for the continuous observation of social dynamics.

Such an architecture also creates the capacity to respond to those developments immediately. Real-time insight into emerging trends allows them to be interpreted, influenced or actively countered. The line between a competition tool and a politically usable data resource becomes increasingly blurred.

Search queries rank among the most sensitive forms of data. Users often formulate questions that they would not raise with doctors, lawyers or even those close to them, including health concerns, financial difficulties, private conflicts or professional uncertainties. Combined with location, timing and behavioral patterns, such data can form detailed profiles that may allow conclusions about individual users.

Anonymization as a Weak Point

The Commission relies on anonymization. Personal identifiers are to be removed, timestamps coarsened and rare queries filtered out. In practice, such methods are widely considered vulnerable.
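
The techniques the Commission describes can be sketched in a few lines: coarsening timestamps and dropping queries that occur fewer than some threshold number of times (a crude frequency cutoff in the spirit of k-anonymity). The function below is an illustrative sketch of that idea, not the Commission's actual procedure, and the threshold `k` is an assumption.

```python
from collections import Counter
from datetime import datetime

def anonymize(records, k=5):
    """Illustrative anonymization pass over (query, timestamp) pairs:
    coarsen each timestamp to the hour and drop any query observed
    fewer than k times, since rare queries are the most identifying."""
    counts = Counter(query for query, _ in records)
    out = []
    for query, ts in records:
        if counts[query] < k:
            continue  # rare query: filtered out
        out.append((query, ts.replace(minute=0, second=0, microsecond=0)))
    return out
```

The weakness, as the AOL case showed, is that identification does not require a single rare query: a *combination* of individually common queries, times and places can still single out one user, which is exactly what per-query thresholds fail to catch.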

A well-known example is the release of search data by AOL in 2006, when journalists were able to identify a supposedly anonymized user based on her queries. Search data in particular is regarded as highly sensitive, as even individual queries can reveal personal details.

There is also a structural issue. While Google as a platform operator is subject to regulatory oversight, including the General Data Protection Regulation, the sharing of data would increase the number of potential access points. Each additional recipient represents a possible security risk, whether through data leaks, government access or commercial reuse.

It also remains unclear who would ultimately gain access. The Commission refers to “vetted recipients” without defining the term in detail. As a result, it is not transparent which actors may have access to the datasets in future or for what purposes they might use them.

The public consultation runs until early May, with a final decision expected by the end of July 2026. That leaves only a limited window in which the scope and design of the measure can still be influenced.