Overview
On 19 November 2025, the European Commission published two proposed regulations as part of its Digital Package on Simplification: the Digital Omnibus and the Digital Omnibus on AI. These Proposals introduce wide-ranging amendments to existing EU digital, data protection, and privacy laws to simplify and streamline compliance obligations for companies operating in the EU or targeting the EU market. The Proposals focus on five main areas:
The Digital Omnibus
- Cybersecurity incident reporting (the Network and Information Systems 2 Directive (NIS2) and other key cybersecurity laws)
- Data protection (GDPR)
- E-privacy (ePrivacy Directive and updated GDPR)
- Data use and governance (Data Act and a suite of related data governance laws)
The Digital Omnibus on AI
- AI (AI Act)
The Omnibus Proposals form part of the Commission’s simplification agenda under the Competitiveness Compass, which aims to reduce administrative burdens by at least 25% for all companies and 35% for small and medium-sized enterprises (SMEs).
Below, we first look at these sweeping changes, focusing on the Digital Omnibus, identify key proposed changes to data protection, e-privacy, cybersecurity, and data use, and explain what these might mean for businesses.
We have also outlined the next steps the Digital Omnibus will need to undergo before becoming law, and what companies can do now to assess the potential impact on their EU operations.
Stay tuned for the second part of this series, in which we will analyse the proposed Digital Omnibus on AI.
In depth
The Digital Omnibus seeks to modernize and simplify legislation in four focus areas:
- Cybersecurity, introducing a centralised channel, the so-called Single Entry Point (SEP), for notifying competent authorities of incidents, and raising the threshold for GDPR data breach notifications to supervisory authorities to align with the threshold for notifying individuals.
- Data protection, codifying a ‘relative’ approach to personal data, introducing new legal bases for processing sensitive data (e.g., in the context of AI), introducing practical exceptions from notice obligations, unifying the approach to Data Protection Impact Assessments (DPIAs), adding protections against potential abuse of access rights by individuals, introducing a number of exceptions to ease R&D compliance requirements, and making a number of other minor clarifications.
- E-privacy, inserting specific rules on cookie consent into GDPR and expanding the use cases where consent is not required. These provisions apply only when personal data is stored on or accessed from the terminal equipment of natural persons. To try to reduce ‘cookie consent fatigue’ and simplify or eliminate the need for cookie banners, the Proposal also introduces a new obligation for websites and apps to allow data subjects to consent through automated, machine-readable mechanisms. Browser manufacturers must likewise enable users to grant or refuse consent.
- Data use and governance, consolidating and streamlining data-sharing and use-related obligations across a number of acts, introducing additional trade secret disclosure protections, and allowing temporary relief from a number of cloud-switching obligations to selected types of companies and services.
Cybersecurity
A SEP for incident reporting
The Proposal introduces a SEP: a secure, EU-wide reporting platform to be developed and operated by the European Union Agency for Cybersecurity (ENISA). The SEP will serve as a centralised channel for notifying competent authorities of cyber and data incidents under GDPR, the NIS2 Directive, the Digital Operational Resilience Act (DORA), the Electronic Identification and Trust Services (eIDAS) Regulation, and the Directive on the Resilience of Critical Entities (CER). This is done through an amendment to the NIS2 Directive and targeted amendments to DORA, eIDAS, and CER. Future EU reporting legislation may also rely on this platform.
Key changes
The proposed amendments are mostly procedural rather than substantive. They aim to streamline reporting processes across legal frameworks without altering core obligations:
- Single submission channel. Companies will submit notification reports via ENISA’s SEP platform. ENISA will securely route each notification to all relevant national authorities.
- Scope. The SEP applies only to notifications to competent authorities (e.g., supervisory authorities and Computer Security Incident Response Teams – CSIRTs) and does not replace requirements to inform affected individuals, customers, or the public.
- ENISA’s role. ENISA will design, maintain, and operate the SEP, ensuring that the necessary technical arrangements, protocols, and tools are in place to enable entities and competent authorities to access, submit, retrieve, transmit, and process information through the SEP and integrate it within their own systems.
- Notification templates. The Proposal clarifies that when developing common reporting templates for notifications under NIS2, CER, and GDPR, the Commission should take due account of the experience gained and the common templates developed under DORA. For the development of a GDPR template, the European Data Protection Board must, within nine months of the regulation taking effect, submit to the Commission a proposal for:
- A common template for notifying personal data breaches to supervisory authorities
- A list of situations where a breach is likely to pose a high risk to individuals’ rights and freedoms
- Cross-framework recognition. An incident report submitted under Article 14(3) of the Cyber Resilience Act (CRA) that includes the required details will also be accepted as a NIS2 incident notification, reducing duplication and simplifying the reporting process.
- GDPR alignment. The Proposal introduces a higher threshold for data breach notification to the supervisory authority, aligning it with the threshold for notifying individuals: notification would be required only where a personal data breach is likely to result in a high risk to the rights and freedoms of individuals. The Proposal also extends the notification deadline from 72 to 96 hours. Notifications will be submitted through the SEP once it is operational.
Impact on businesses
- In the long term, the SEP will provide a unified, secure, and efficient mechanism for reporting to competent authorities, reducing administrative burden.
- In the short term, the benefits will materialize gradually. Within 18 months of the regulation entering into force, ENISA must pilot the SEP for each EU legal act covered. Incident notifications under each legal act may only begin after successful piloting and once the Commission has confirmed the SEP’s readiness. To do so, the Commission and ENISA will assess the SEP’s functioning, reliability, integrity, and confidentiality. When these conditions are met, the Commission will publish a notice in the Official Journal of the European Union, confirming that the SEP is fully operational. Until the system becomes fully operational, entities must continue to use existing national channels.
In summary, the newly published Digital Omnibus Regulation Proposal marks a decisive step towards simplifying and harmonising EU incident reporting. While implementation will take time, the SEP promises to significantly streamline compliance, reduce duplication, and enhance cross-border consistency in cybersecurity and data incident management.
Data protection
The proposed amendments to GDPR bring not only the changes companies might have anticipated based on the case law of the European courts (e.g., further confirmation of the relative approach to personal data) but also a number of measures specifically aimed at facilitating the effective use of data for research and innovation. Notably, the Proposal introduces changes that support the responsible development and deployment of AI tools, such as allowing reliance on legitimate interests for the development and operation of AI systems and creating an exemption from the obligation to provide notice when personal data is processed for scientific research purposes. These adjustments are designed to create a more enabling framework for organisations seeking to use personal data in innovative and socially beneficial ways.
We have outlined key changes below.
New definition of personal data
Under GDPR, personal data means any information relating to an identified or identifiable natural person (Article 4(1)). The Proposal amends this definition by further clarifying the notion of identifiability. Specifically, it adds that information is not considered personal data for a given entity merely because a potential subsequent recipient may have means reasonably likely to be used to identify the individual to whom the information relates.
In practice, this means that companies must assess for themselves whether they are able to identify individuals using the means reasonably available to them. If they lack such means, the information will not be regarded as personal data from their perspective, even if another party could identify the person.
Recital 27 recalls the established case law of the Court of Justice of the European Union, which held that a means of identifying a data subject is not ‘reasonably likely to be used’ where the risk of identification is, in reality, insignificant, such as when identification is prohibited by law or impossible in practice, for example because it would require a disproportionate effort in terms of time, cost, or labour.
In essence, the Proposal codifies and sharpens this relative approach reflected in cases such as Single Resolution Board and Scania, and by doing so it reinforces the legal weight and practical applicability of that case law.
The European Commission also proposed a fail-safe mechanism, allowing it to adopt an implementing act to further specify means and criteria to determine whether data resulting from pseudonymisation no longer constitutes personal data for certain entities.
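To illustrate the kind of pseudonymisation this relative approach contemplates, the sketch below shows a simple keyed-hash tokenisation in TypeScript. It is a minimal, illustrative example rather than a technique prescribed by the Proposal: the function and key names are hypothetical, and whether a given recipient could still ‘reasonably’ re-identify individuals would depend on the overall circumstances.

```typescript
import { createHmac } from "node:crypto";

// Replace a direct identifier with a keyed hash (HMAC-SHA-256).
// Only the holder of the secret key can reproduce the mapping back to the
// identifier; a recipient of the tokenised dataset alone has no reasonably
// available means of re-identification.
function pseudonymise(identifier: string, secretKey: string): string {
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

// Hypothetical usage: the data holder keeps the key; only tokens are shared.
const token = pseudonymise("jane.doe@example.com", process.env.PSEUDO_KEY ?? "dev-only-key");
console.log(token); // an opaque token, not reversible without the key
```

Under the Proposal’s relative approach, the tokenised dataset might not constitute personal data in the hands of a recipient that has no access to the key or other means of re-identification, while remaining personal data for the key-holding data holder.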
Impact on businesses
- The impact of this amendment is potentially quite high. Businesses would no longer need to treat certain datasets as personal data if they cannot reasonably identify individuals, removing GDPR obligations such as the need for a legal basis (e.g., consent for certain trackers), access rights, and data transfer restrictions.
- Companies would be able to more efficiently share, analyse, or commercialise certain data, provided proper risk assessments are in place and documented.
Amendments to facilitate development and operation of AI systems
(a) Reliance on legitimate interests
The Proposal explicitly allows controllers to rely on legitimate interest when processing personal data to develop and operate AI systems and models.
This addition would make it easier for companies to use personal data for AI-related activities, provided they meet the following conditions:
- Pass the ‘legitimate interest test.’ Companies must document that their interest in developing or operating the AI system or model is not overridden by the interests or fundamental rights and freedoms of the individuals (children in particular) whose personal data they process.
- Implement appropriate technical and organizational measures (TOMs). The Proposal highlights examples such as:
- Minimize data collected when selecting sources, training, and testing of an AI system or model
- Protect against disclosure of residually retained personal data in the AI system or model
- Increase transparency
- Grant individuals an unconditional right to object to the processing of personal data
- Lastly, where EU or Member State law mandates consent, companies will not be able to rely on legitimate interest.
(b) Use of sensitive data for AI development and operation
The Proposal introduces a new legal basis permitting the processing of sensitive personal data (i.e., special category personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership; genetic/biometric data processed for the purpose of uniquely identifying an individual; data concerning health; and data concerning a natural person’s sex life or sexual orientation) for the development, testing, and operation of AI systems and models. This is subject to two conditions:
- Apply appropriate TOMs to avoid collecting and otherwise processing sensitive personal data.
- Remove any sensitive personal data that has been identified, despite the TOMs, in the datasets used for training, testing, or validation, or in the AI system or model.
However, where removal of sensitive personal data requires disproportionate effort, the controller would need to prevent the sensitive data from being used to produce outputs, being disclosed, or otherwise being made available to third parties.
Impact on businesses
- The assured ability to rely on legitimate interest, together with the new legal basis for sensitive data, may be crucial for a number of AI businesses. However, companies will need to implement the measures and related documentation the Proposal requires in order to benefit from these legal bases. In particular, the unconditional right to object is a strong tool available to individuals.
Legal grounds to use biometric data for identification verification
Biometric data for identification verification is used to verify users’ identity, such as with fingerprints or facial recognition. Verification may be needed for multiple purposes, typically including unlocking devices, accessing secure systems, or confirming someone’s identity for financial or administrative purposes. GDPR requires a company to have a legal ground under Article 9, which would typically be consent; obtaining consent is challenging in the employment context.
The Proposal would allow biometric data processing for identification verification without consent when the data is controlled solely by the user. The Proposal explains that this includes when biometric data is securely stored solely on the side of the data subject (e.g., on the data subject’s device) or is securely stored by the controller in a state-of-the-art encrypted form and the encryption key or equivalent means is held solely by the data subject.
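As an illustration of the architecture the Proposal describes (biometric data held solely on the user’s side), the sketch below uses the browser WebAuthn API, where the fingerprint or face template never leaves the user’s device and the server receives only a signed assertion. This is one possible pattern under our assumptions, not a technology mandated by the Proposal; the function name and parameters are hypothetical.

```typescript
// Browser-side sketch: on-device biometric verification via WebAuthn.
// The biometric template stays on the user's device (the authenticator);
// the server receives only a cryptographically signed assertion.
async function verifyWithOnDeviceBiometrics(
  challenge: Uint8Array, // random challenge issued by the server
  credentialId: Uint8Array // ID of a credential registered earlier
): Promise<Credential | null> {
  return navigator.credentials.get({
    publicKey: {
      challenge,
      allowCredentials: [{ id: credentialId, type: "public-key" }],
      userVerification: "required", // ask the authenticator to verify the user (e.g., fingerprint or face)
    },
  });
}
```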
Impact on businesses
- Companies would be able to implement user-controlled biometric authentication (e.g., on-device facial or fingerprint recognition) without the need for further legal grounds, such as consent. This would ease the use of biometrics for identity verification, providing clarity and enabling innovation while still requiring robust safeguards and respect for individual rights.
Further processing for archiving, scientific, research, or statistical purposes
The Digital Omnibus Proposal provides greater clarity than the current framework by explicitly stating that further processing of personal data for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes shall be considered to be compatible with the initial purposes for which the data was collected.
It also makes clearer that when personal data is processed for these specific purposes, there is no need to assess whether the purpose of the further processing is compatible with the original one.
Impact on businesses
- It provides greater legal certainty and facilitates the secondary use of personal data for these limited purposes by clarifying that the conditions for compatible processing (e.g., the need to find a link between initial and secondary purposes) would not be relevant in these cases.
Simplified privacy notice rules and new exception for research
The Digital Omnibus Proposal streamlines privacy notice obligations under Article 13 of GDPR, reducing the burden on controllers to provide certain information on data processing and introducing a new exception for scientific research.
If a controller collects data directly from an individual, no privacy notice is needed when:
- The personal data has been collected in the context of a clear, circumscribed relationship between the individual and a controller
- The controller exercises an activity that is not data intensive (e.g., when it collects a low amount of personal data and its processing operations are not complex; this would not be the case, for example, in the field of employment); and
- There are reasonable grounds to assume that the individual already knows who the controller is and what the intended purpose is, as well as the legal basis for processing personal data
However, this exemption does not apply if:
- The controller transmits the data to other recipients or categories of recipients
- The controller transfers the data to a third country
- The controller carries out automated decision-making, including profiling, or
- The processing is likely to result in a high risk to the rights and freedoms of data subjects within the meaning of Article 35 (e.g., in cases where DPIA is necessary)
The Proposal also introduces an exception for scientific research. Controllers conducting scientific research may also be exempt from providing individual notices when:
- Giving notice is impossible or would involve disproportionate effort or
- Providing the notice would render impossible or seriously impair the achievement of the objectives of the scientific processing
In such cases, the controller must still protect individuals’ rights and freedoms and legitimate interests by taking appropriate measures, including making the information publicly available.
Impact on businesses
- The general exemption will likely apply mainly to small or local businesses engaged in straightforward, low-risk processing, such as managing small customer databases. For example, it could apply to a craftsman and a client, where the processing is confined to the minimum personal data necessary to perform the service.
- It is unlikely to affect larger organisations, which typically operate complex, data-intensive systems and global databases, where the full information requirements will continue to apply.
- The research exception will particularly benefit life sciences and research organisations, where providing individual notices may be impractical or could compromise research objectives.
‘Neutralized’ potential abuse of data subject access requests
Under the Digital Omnibus Proposal, controllers would be allowed to refuse to act on a request or charge a reasonable fee for handling data subject access requests (DSARs) under Article 15 of GDPR if data subjects abuse the requests by using them for purposes other than the protection of their data. Examples of abusive or excessive requests include:
- When an individual intends to cause the controller to refuse an access request in order to subsequently demand the payment of compensation, potentially under the threat of bringing a claim for damages
- When an individual makes excessive use of the right of access with the only intent of causing damage or harm to the controller
- When an individual makes a request but simultaneously offers to withdraw it in return for some form of benefit from the controller
In such cases, the controller must be able to demonstrate that there are reasonable grounds to believe that the request is excessive. The Proposal also suggests that controllers should bear a lower burden of proof regarding the excessive character of a request than regarding the manifestly unfounded character of a request (i.e., the other basis for the refusal of such request under GDPR).
Impact on businesses
- This clarification would reduce the administrative burden associated with DSARs, particularly those filed for purposes other than data protection, such as in employment litigation or termination proceedings.
- It would increase the legal certainty for organisations frequently targeted by abusive access requests.
Consistent DPIAs
The Digital Omnibus Proposal would require the European Data Protection Board to develop a unified EU-wide template and related methodology for conducting DPIAs. It would also be tasked with consolidating currently country-specific lists of processing operations that require or do not require a DPIA.
Impact on businesses
- This change will allow businesses, especially multinational ones, to more easily determine the do’s and don’ts of DPIAs and more easily unify their internal procedures across the EU.
Automated decision-making
Decisions based solely on automated processing, including profiling, are permitted when specific conditions set out in GDPR are met.
Such decisions that have legal or similarly significant effects on individuals may rely solely on automated processing, including profiling, only if one of the following applies:
- The decision is authorised by the EU or Member State law to which the controller is subject and which lays down suitable measures to safeguard the individual’s rights and freedoms and legitimate interests.
- The decision is based on the individual’s explicit consent.
- The decision is necessary for entering into or performing a contract between the individual and a controller. The Proposal clarifies that this exception applies regardless of whether the decision could be taken otherwise than by solely automated means.
Impact on businesses
- This clarification slightly improves legal certainty and flexibility for organisations using automated decision-making. Businesses may rely on automation even when a human could have made the same decision on their own, as long as one of the provided legal bases applies.
ePrivacy
The proposed Digital Omnibus introduces significant changes to GDPR by adding specific provisions governing online tracker (e.g., cookie) consent. The new rules apply in situations where personal data is stored on or accessed from the terminal equipment of natural persons (e.g., computers, mobile phones, or Internet of Things (IoT) devices).
A key element of the Proposal is the introduction of new obligations for data controllers that operate websites and mobile applications. These controllers would need to ensure that their website and app interfaces allow users to give or refuse consent in a way that computers can read automatically (a concept similar to the Global Privacy Control required under the California Consumer Privacy Act (CCPA)). Browser manufacturers will have to include technical tools within the browser that allow users to give or refuse consent. This aims to solve so-called consent fatigue and reduce the need for cookie banners.
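For a sense of what such a machine-readable signal might look like in practice, the sketch below checks the existing Global Privacy Control flag exposed by some browsers before loading any non-essential trackers. This is an assumption-laden illustration: the EU-standardised mechanism has not yet been defined, and the helper function shown is a hypothetical stand-in for a site’s own consent storage.

```typescript
// Browser-side sketch: honour an automated, machine-readable refusal signal
// before loading non-essential trackers. Global Privacy Control is used here
// only as a stand-in for the future EU-standardised signal.
declare global {
  interface Navigator {
    globalPrivacyControl?: boolean; // exposed by browsers that support GPC
  }
}

function nonEssentialTrackersAllowed(): boolean {
  // An automated refusal signal is honoured without showing a banner.
  if (navigator.globalPrivacyControl === true) {
    return false;
  }
  // Otherwise fall back to the site's stored consent choice
  // (hypothetical helper, shown only to complete the sketch).
  return hasStoredConsent();
}

function hasStoredConsent(): boolean {
  return document.cookie.includes("consent=granted");
}

export {};
```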
Key changes
- Enforcement. Under GDPR, enforcement would be subject to the one-stop-shop mechanism. Companies engaging in cross-border personal data processing through online trackers will be able to interact with a single lead data protection authority rather than with the 27 national regulators. The administrative sanctions regime is not explicitly addressed in the Proposal, raising doubts as to whether violations of the online tracker rules would be subject to the maximum fines under GDPR Article 83(4) (2% of global turnover or €10 million) or Article 83(5) (4% of global turnover or €20 million), particularly for the new automated, machine-readable consent recognition mechanisms, including for browser providers.
- Broader consent exceptions. Consent remains required for storing or accessing personal data on a user’s device unless explicitly allowed by GDPR. However, the proposed rules broaden the consent exceptions compared to the current ePrivacy Directive. Specifically, consent is not required in the following situations:
- When providing any service explicitly requested by the user. This exception aims to go beyond the current ePrivacy Directive, which applies the exception only to the narrower category of information society services.
- When storage or access is necessary to maintain or restore the security of a service provided by the controller and requested by the individual. For example, this would cover automatic security updates previously requested by the user. This is an entirely new exception.
- When a controller collects aggregated information to measure the audience of an online service, provided the analysis is carried out solely for the controller’s own use. We understand that in this use case, creation of ‘aggregated information’ requires the use of personal data; otherwise, this scenario would fall under the ePrivacy Directive rather than GDPR. Presumably controllers would still be able to rely on the same exception to the extent that they use a processor (processing personal data solely on their behalf) in line with the GDPR requirement. It is unclear whether ‘measuring the audience’ is limited to simple counting (e.g., number of users) or could extend to more detailed analytics, such as segmenting users by age range or other attributes (e.g., 30–40 years old).
- Online interfaces allowing data subjects to give or refuse consent through an automated, machine-readable mechanism. Data controllers will have to incorporate such interfaces in their websites and apps and honour the data subject’s choices. Providers of web browsers must offer technical tools to allow users to give or refuse consent. The Commission is required to issue a mandate to European standardisation organisations to create harmonised standards for such online interfaces, but not for browsers. These rules would become applicable only 24 and 48 months after adoption, respectively.
- Current ePrivacy Directive rules. The cookie consent rules in the existing ePrivacy Directive (and national cookie laws) are not repealed or significantly amended. They remain applicable, but only to the storage of or access to information on users’ terminal equipment that does not constitute or lead to processing of personal data, or where the user/subscriber is a legal entity (not a natural person). This would cover such situations as when equipment is owned by a legal entity and used in an automated way (e.g., an autonomous vehicle). Ironically, the ePrivacy rules would become more stringent than those applicable to personal data, as they do not include the new exceptions applicable under GDPR.
Impact on businesses
- Websites honouring browser and other automated signals. Once in effect, companies would need to ensure that their website and app interfaces allow users to give or refuse consent in a way that computers can read automatically (including through a browser and other automated signals) across all websites and apps (a concept similar to the Global Privacy Control required under CCPA). This Proposal would eliminate the need to request consent separately for each app or website and potentially remove the need for cookie banners entirely. Given that these obligations will take effect two years after the adoption of the Omnibus, the impact will be long term. However, the Proposal may already be relevant for companies considering significant investments in consent banner solutions.
- Expanded consent exceptions. Upon adoption of the Proposal, expanded exceptions for analytics, security, and requested services will reduce the need for consent in these cases. However, since most sites and apps use cookies for advertising, consent mechanisms will still be required for advertising cookies until they can be replaced by fully automated, unified signals. Even before the Omnibus enters into force, companies taking a risk-based approach may consider these exceptions as indicative of lower regulatory risk for such uses.
- Separate rules for personal and nonpersonal data. The Proposal introduces rules for storing and accessing personal data as distinct from nonpersonal data, which remains subject to the ePrivacy Directive. Companies will need to determine which regime applies to their trackers and act accordingly, adding complexity to the current framework, which does not make this distinction. Additional complexity arises from potentially different supervisory authorities and sanctions for the two categories. Trackers entailing cross-border processing of personal data will be subject to one-stop-shop enforcement, whereas the others will be subject to each Member State’s laws. Companies may find it useful to assess whether the trackers used on their websites and apps process personal data and would therefore become subject to GDPR rather than to the ePrivacy Directive. Given the new definition of personal data, some companies’ trackers may fall under the ePrivacy Directive instead.
Data use and governance
The proposed amendments relate to not only the Data Act itself but also four additional pieces of legislation (the Data Governance Act, Open Data Directive, Platform-to-Business Regulation, and Free Flow of Non-Personal Data Regulation), repealing some parts and amending and consolidating the remaining provisions under the Data Act (e.g., provisions on data intermediaries and mandatory government access to data).
This section focuses primarily on the proposed amendments to the Data Act itself, including the following key changes:
- Cloud-switching obligations. The Data Act requires cloud and other data processing service providers to allow switching to a new provider or to an on-premise solution on no more than two months’ notice, to support the migration of the data, and to phase out switching fees. The Proposal relaxes these rules for SMEs and small mid-cap companies (SMCs) and for custom-made data processing services (e.g., services that are not off the shelf and would not function without prior adaptation to the needs and ecosystem of the user), where these are provided under contracts concluded before 12 September 2025.
- Trade secrets. The data-sharing obligations under the Data Act include requirements for protecting trade secrets (e.g., data holders may protect trade secrets by using TOMs). Under the Digital Omnibus Proposal, data holders would not be required to share trade secrets on a case-by-case basis if, despite adopting appropriate TOMs, they are highly likely to suffer serious economic damage from the disclosure of trade secrets, or there is a high risk that this information could be unlawfully acquired, used, or disclosed to countries outside the EU, or entities under their control, with less robust protections than those provided in the EU. The data holder would need to substantiate its decision to refuse disclosure of such data to the user in writing and notify the competent authority. The Proposal provides examples of how such risks would need to be demonstrated through objective considerations, such as the enforceability of trade secret protection in third countries, the nature and level of confidentiality of the data requested, and the uniqueness and novelty of the connected product.
- Public sector access to private company data. The Data Act established rules for when and how public sector bodies can access data from private companies. The Proposal changes the scope of these obligations, allowing such access only in public emergencies. This is a marked limitation compared to the initial broad scope that applied to any ‘exceptional needs.’
- Smart contracts. Vendors of applications using smart contracts (or persons whose trade, business, or profession involves the deployment of smart contracts for others) will no longer need to comply with statutory essential requirements (e.g., on robustness and access control, safe termination and interruption, and data archiving and continuity).
Impact on businesses
- Smaller businesses and those that provide highly customized data processing services, such as custom software-as-a-service (SaaS), network-as-a-service (NaaS), and infrastructure-as-a-service (IaaS) providers, will benefit from lighter cloud-switching requirements under current contracts.
- All businesses should prepare to assess whether they can avail themselves of the additional layer of trade secret protection under the Data Act; this will particularly benefit businesses subject to data-sharing requirements for which such protections are crucial to business viability.
Next steps
The Digital Omnibus Proposal is expected to move through the legislative process in the European Parliament and Council throughout 2026. In the European Parliament, several committees, including the Committee on Internal Market and Consumer Protection (IMCO), the Committee on Industry, Research and Energy (ITRE), and the Committee on Civil Liberties, Justice and Home Affairs (LIBE), will be involved in shaping the Proposal. While it has been mentioned in the press that the European Parliament may adopt a fast-track procedure, it is currently uncertain whether an expedited procedure will be applied, as the level of urgency typical of such procedures is not clear in this case.
The McDermott team will follow these developments closely. To learn more, reach out to the authors or your regular McDermott lawyer to discuss the potential legal implications for your specific business.