Overview
What new guidance on anonymisation from the UK Information Commissioner’s Office (ICO) means for healthcare and life sciences companies.
In Depth
Harnessing health data is a key priority for life sciences companies as it enables more effective scientific research, better tailored clinical trials and investigations, and targeted drug development and commercial access. One of the key tools for health research is anonymised and pseudonymised data. In recent years, there has been significant debate in the United Kingdom and Europe about whether, and if so how, health data can be effectively anonymised.
The new ICO guidance on anonymisation, issued on 28 March 2025, sets out a helpful framework for approaching this challenge in the United Kingdom. It diverges from the position of some regulators in Europe, which appear to be more cautious about both anonymisation and pseudonymisation, and the sector is eagerly awaiting the appeal decision of the European Court of Justice (ECJ) in Single Resolution Board (SRB) v European Data Protection Supervisor (EDPS).
ICO guidance diverges from the approach taken by some regulators in Europe.
The ICO’s guidance is not a statutory code, but it contains advice on how to interpret UK law together with a series of good practice recommendations.
Whilst there is no penalty for failing to follow these recommendations, the ICO says it will take its guidance into account and is less likely to take enforcement action, including imposing financial penalties, where an organisation can demonstrate that it made a serious effort to comply with data protection law and had a genuine reason to believe that the information was not personal data.
Over recent years, there has been significant academic and legal debate about the status of data when it is transferred to third parties. At the heart of this wrangling is the question of whether data can be anonymous in the hands of one person, but personal (or pseudonymous) data in the hands of another.
Anonymisation Absolutists Versus Relativists
The legal debate focuses on terminology in Recital 26 of the EU General Data Protection Regulation (GDPR), which states: “To determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly.” (Emphasis added.)
Proponents of absolute anonymisation argue that the words “or by another person” mean that if any person is able to identify the data subject, including the original controller who applied the anonymisation techniques and who may retain the means to re-identify the data subject, the data can never be anonymous, even in the hands of a recipient who does not have access to reasonable means of re-identifying the data.
This absolute approach may have significant adverse implications for health and life sciences companies that, as part of health research or post-market activities, often use pseudonymised health datasets, such as cohort data, medical images, and biological samples, without access to other directly identifiable information, such as names.
An absolute approach treats this type of data as never being anonymised if directly identifiable information, such as a code or table that links the identity of the patient or research participant to the random ID assigned to the corresponding dataset, is held by any one of the actors in the chain. The relativists argue that such data can be anonymous in the hands of the other actors in that chain, who do not have access to identifying information, provided certain conditions are met.
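The key-coding mechanism described above can be sketched in a few lines. The sketch below is purely illustrative, and the field names and record shapes are assumptions rather than anything drawn from the guidance: direct identifiers are swapped for random IDs, and the linking table is retained separately by the anonymising party, so the recipient of the output records holds the dataset without the means of re-identification.

```python
import secrets


def pseudonymise(records, id_field="patient_name"):
    """Replace a direct identifier with a random ID (key-coding).

    Returns (pseudonymised_records, key_table). Under a relative
    approach, only the party holding key_table retains the means
    to re-identify the data subjects.
    """
    key_table = {}  # random ID -> identity; held by the anonymising party only
    shared = []     # records safe to pass down the chain
    for record in records:
        pseudo_id = secrets.token_hex(8)           # random, non-derivable ID
        key_table[pseudo_id] = record[id_field]    # retained separately, never shared
        pseudo = {k: v for k, v in record.items() if k != id_field}
        pseudo["id"] = pseudo_id
        shared.append(pseudo)
    return shared, key_table


records = [{"patient_name": "Jane Doe", "diagnosis": "T2 diabetes"}]
shared, key_table = pseudonymise(records)
# 'shared' contains no direct identifier; 'key_table' stays with the controller
```

Whether the recipient's copy counts as anonymous then turns on the legal test discussed in this article, not on the mechanics alone: the relativist view asks whether the recipient has reasonable means of re-identification, while the absolutist view looks at the retained key table.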
The Position in Europe
This was the legal issue at stake in SRB v EDPS, where the European General Court (GC) decided on 26 April 2023 in favour of a relative approach. In this case, SRB transmitted non-directly identifying data to a consulting firm and faced complaints from individuals regarding a breach of data protection laws. The GC considered that the EDPS should have verified whether or not individuals were re-identifiable from the consulting firm’s perspective, through reasonable means, and not solely from the SRB’s perspective.
The case is currently under appeal by the EDPS, joined by the European Data Protection Board (EDPB), to the ECJ. SRB, joined in the appeal by the EU Commission, argues that a relative approach applies and that pseudonymised data may remain personal data for the data controller who pseudonymised it, but it is necessary to examine the actual identifiability of the data subjects in the hands of the recipient. Adopting an absolute approach, the EDPS argues that data is anonymous only where the risk of re-identification is non-existent or insignificant.
On 6 February 2025, the Advocate General published his opinion, which, whilst adopting a slightly different interpretation, reaches a relativist conclusion: where pseudonymisation is sufficiently robust, data subjects may not be reasonably identifiable, and pseudonymised data can therefore be anonymous. This not only confirms a relative approach to anonymisation but goes one step further, allowing for pseudonymisation techniques to render data anonymous, a position that the European data protection authorities have, so far, been very reluctant to adopt.
The ECJ’s decision is eagerly awaited not least because, in January 2025, the EDPB issued draft guidelines on pseudonymisation that take a more absolutist view. These guidelines say that pseudonymised data remains personal data even in the hands of a third party.
In addition, the EDPB guidelines seem to create a two-tier vision of pseudonymisation. There could be a limited level of pseudonymisation, for example, removing direct identifiers to mask the identity, but also a higher level of pseudonymisation, which would meet the legal definition of pseudonymisation under GDPR. Only pseudonymisation procedures that effectively reduce the risk of data being attributed to a specific person could thus benefit from the favourable regime identified by the guidelines regarding pseudonymised data. The EDPB therefore appears to not only reject a relative approach to anonymisation, but seems to be moving towards a stricter approach to pseudonymisation.
The EDPB guidelines seem to create a two-tier vision of pseudonymisation.
“Whose Hands?”: The ICO’s Contextual Test for Anonymisation
The ICO applies a “whose hands?” test, which means data could be personal in the hands of one person but not another. The ICO makes it clear that there is a spectrum of identifiability that depends on a number of factors: at one end of the spectrum, it is impossible for the data to be re-identified; at the other, the data is clearly personal.
The ICO’s approach is that it does not need to be impossible to identify an individual for the data to be considered anonymised. Data can instead be considered “effectively anonymised” if the possibility of identification is “sufficiently remote”.
Along this spectrum, it is important to assess the identifiability risk, taking into account the motivated intruder test as well as a series of other practical steps in assessing whether or not data can be re-identified. Motivation in this context can vary, but for healthcare and life sciences companies handling valuable information in the form of health data, the likelihood of sophisticated and capable intruders is high.
The ICO makes it clear that there is a spectrum of identifiability.
The ICO makes it clear that the context of the data disclosure is important and distinguishes between “release models”: for example, release to the public at large versus release to a limited group of defined people for a particular purpose. The standard of anonymisation for public release is higher because public release carries more risk: the anonymising party loses control of the data, and it may be impossible to retract the information. By contrast, limited release models may be easier to assess and control, although some risk remains. The ICO recommends that any risk assessment take into account the persons to whom the data is being released and any legal or other prohibitions on re-identification, including confidentiality or professional limitations, such as those that apply to clinicians, as well as contractual controls.
Finally, the ICO sets out some helpful and practical guidance about anonymisation techniques, illustrated with worked examples.
ICO Guidance Action Points for UK Health and Life Sciences Companies
Companies that anonymise personal data should consider how they follow the ICO’s recommendations and be able to point to internal governance and accountability measures.
At a minimum, these should include:
- Reviewing transparency information: The ICO makes it clear that privacy notices should explain what data is being anonymised and why, set out the risk mitigation measures in place, and explain the type of release model.
- Lawful basis: Ensuring a lawful basis is in place for anonymisation as a processing activity.
- Risk assessments: Appropriate Data Protection Impact Assessments (DPIAs) specifically tailored to anonymisation as a processing activity should be conducted. Companies that anonymise data on an ongoing basis should consider developing and implementing a re-identifiability risk assessment process tailored to the company’s activities and objectives. These should be reviewed regularly, particularly for sensitive data such as health data, and be updated if technology changes or there are:
  - Changes to what data is publicly available
  - New releases, e.g., to new recipients
  - Attacks or vulnerabilities that may affect the anonymisation or other measures in place.
- Recipient diligence: The ICO emphasises that the anonymising party must work with the recipient, who should also assess the identifiability risk in their hands.
- Governance programmes: Data governance policies, staff training, board reporting structures and procedures to identify and respond to re-identification incidents should be reviewed and updated if necessary. These standards must be flowed down to third parties as contractual safeguards and as part of regular audits.
- Wider legal obligations: Be mindful of legal frameworks and rules that apply alongside data privacy laws, such as the common law duty of confidentiality.