The European Commission (EC) is considering a “Digital Omnibus” package that would substantially rewrite EU privacy law, particularly the landmark General Data Protection Regulation (GDPR). It’s not a done deal, and it shouldn’t be.
The GDPR is the most comprehensive model for privacy legislation in the world. While it is far from perfect, and suffers from uneven enforcement, complexity, and real administrative burdens, the omnibus package is full of bad and confusing ideas that, on balance, will significantly weaken privacy protections for users in the name of cutting red tape.
It contains at least one good idea: improving consent rules so users can automatically set consent preferences that will apply across all sites. But much as we love limiting cookie fatigue, it’s not worth the price users will pay if the rest of the proposal is adopted. The EC needs to go back to the drawing board if it wants to achieve the goal of simplifying EU regulations without gutting user privacy.
Let’s break it down.
Changing What Constitutes Personal Data
The digital package is part of a larger Simplification Agenda to reduce compliance costs and administrative burdens for businesses, echoing the Draghi Report’s call to boost productivity and support innovation. Businesses have been complaining about GDPR red tape since its inception, and the new rules are supposed to make compliance easier and turbocharge the development of AI in the EU. Simplification is framed as a precondition for firms to scale up in the EU, ironically targeting laws that were themselves once championed as pro-innovation. It might also stave off tariffs the U.S. has threatened to levy, a threat driven in part by heavy lobbying from Meta and other tech industry groups.
The most striking proposal seeks to narrow the definition of personal data, the very basis of the GDPR. Today, information counts as personal data if someone can reasonably identify a person from it, whether directly or by combining it with other information.
The proposal jettisons this relatively simple test in favor of a variable one: whether data is “personal” depends on what a specific entity says it can reasonably do or is likely to do with it. This selectively restates part of a recent ruling by the EU Court of Justice but ignores the multiple other cases that have considered the issue.
This structural move toward entity-specific standards will create massive legal and practical confusion, as the same data could be treated as personal for some actors but not for others. It also creates a path for companies to avoid established GDPR obligations through operational restructuring that separates identifiers from other information, a change in paperwork rather than in actual identifiability. What’s more, it will be up to the Commission, a political executive body, to define what counts as unidentifiable pseudonymized data for certain entities.
Privileging AI
In the name of facilitating AI innovation, which often relies on large datasets in which sensitive data may residually appear, the digital package treats AI development as a “legitimate interest.” This gives AI companies a broad legal basis to process personal data unless individuals actively object. The proposals gesture towards organisational and technical safeguards but leave companies broad discretion.
Another amendment would create a new exemption that allows even sensitive personal data to be used for AI systems under some circumstances. This is not a blanket permission: “organisational and technical measures” must be taken to avoid collecting or processing such data, and proportionate efforts must be made to remove it from AI models or training sets where it appears. However, it is unclear what will count as an appropriate or proportionate measure.
Taken together with the new personal data test, these AI privileges mean that core data protection rights, which are meant to apply uniformly, are likely to vary in practice depending on a company’s technological and commercial goals.
And it means that AI systems may be allowed to process sensitive data even though non-AI systems that could pose equal or lower risks are not allowed to handle it.
A Broad Reform Beyond the GDPR
There are additional adjustments, many of them troubling, such as changes to rules on automated decision-making (making it easier for companies to claim it’s needed for a service or contract), reduced transparency requirements (less explanation about how users’ data are used), and revised data access rights (supposedly to tackle abusive requests). The NGO noyb has published an extensive analysis of these changes.
Moreover, the digital package reaches well beyond the GDPR, aiming to streamline Europe’s digital regulatory rulebook, including the e-Privacy Directive, cybersecurity rules, the AI Act and the Data Act. The Commission also launched “reality checks” of other core legislation, which suggests it is eyeing other mandates.
Browser Signals and Cookie Fatigue
There is one proposal in the Digital Omnibus that actually could simplify something important to users: requiring online interfaces to respect automated consent signals, allowing users to automatically reject consent across all websites instead of clicking through cookie popups on each. Cookie popups are often designed with “dark patterns” that make rejecting data sharing harder than accepting it. Automated signals can address cookie banner fatigue and make it easier for people to exercise their privacy rights.
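To make the mechanism concrete: an existing automated consent signal of this kind is the Global Privacy Control header (`Sec-GPC: 1`), which browsers can attach to every request. The Omnibus does not specify this exact format, so the sketch below is illustrative only, showing how a site backend could honor such a header instead of showing a consent popup.

```python
def tracking_permitted(headers: dict) -> bool:
    """Decide whether a site may run consent-based tracking for a request.

    Illustrative sketch only: it checks the Global Privacy Control header
    ("Sec-GPC: 1"), one real-world example of an automated opt-out signal.
    Whatever standard the Omnibus ultimately mandates may differ.
    """
    # A value of "1" means the user has expressed a universal opt-out,
    # so the site should not track, and no cookie banner is needed.
    opt_out = headers.get("Sec-GPC", "").strip() == "1"
    return not opt_out


# A user whose browser sends the signal is opted out everywhere at once:
print(tracking_permitted({"Sec-GPC": "1"}))  # False: respect the opt-out
print(tracking_permitted({}))                # True: no signal was sent
```

The point of such a signal is exactly what the paragraph above describes: the preference is set once, in the browser, and applies across all sites, rather than being re-negotiated through a dark-patterned banner on each one.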
While this proposal is a step forward, the devil is in the details: First, the exact format of the automated consent signal will be determined by technical standards organizations where Big Tech companies have historically lobbied for standards that work in their favor. The amendments should therefore define minimum protections that cannot be weakened later.
Second, the provision takes the important step of requiring web browsers to make it easy for users to send this automated consent signal, so they can opt out without installing a browser add-on.
However, mobile operating systems are excluded from this latter requirement, which is a significant oversight. People deserve the same privacy rights on websites and mobile apps.
Finally, exempting media service providers altogether creates a loophole that lets them keep using tedious or deceptive banners to get consent for data sharing. A media service’s harvesting of user information on its website to track its customers is distinct from news gathering, which should be protected.
A Muddled Legal Landscape
The Commission’s use of the “Omnibus” process is meant to streamline lawmaking by bundling multiple changes. An earlier proposal kept the GDPR intact, focusing on easing the record-keeping obligation for smaller businesses—a far less contentious measure. The new digital package instead moves forward with thinner evidence than a substantive structural reform would require, violating basic Better Regulation principles, such as coherence and proportionality.
The result is the opposite of “simple.” The proposed delay of the high-risk requirements under the AI Act to late 2027, part of the omnibus package, illustrates this: businesses will face a muddled legal landscape as they must comply with rules that may soon be paused and later revived. This sounds like “complification” rather than simplification.
The Digital Package Is Not a Done Deal
Evaluating existing legislation is part of a sensible legislative cycle, and clarifying and simplifying complex processes and practices is not a bad idea. Unfortunately, the digital package misses the mark by making processes even more complex, at the expense of personal data protection.
Simplification doesn’t require tossing out digital rights. The EC should keep that in mind as it launches its reality check of core legislation such as the Digital Services Act and Digital Markets Act, where tidying up can too easily drift into Verschlimmbesserung, the German term for a well-meant fix that makes things worse, ending up resembling the infamous Ecce Homo restoration.