Stephenson Harwood | Data and Cyber Update – February 2026

Welcome to the latest edition of the Stephenson Harwood Data and Cyber Update, covering the key developments in data protection and cyber security law in February 2026.

In data regulation news, the European Data Protection Board (“EDPB”) and the European Data Protection Supervisor (“EDPS”) issue a joint Opinion on the Digital Omnibus Regulation proposal; and the UK Information Commissioner’s Office (“ICO”) publishes its data protection complaints guidance.

In cybersecurity news, Ofcom fast-tracks its decision on online safety proposals, as UK regulators and legislators take urgent steps to improve online safety in response to the Grok indecent images scandal.

In our enforcement and civil litigation update, the Court of Justice of the European Union (“CJEU”) confirms that organisations can challenge binding decisions of the EDPB; and the UK Court of Appeal confirms a controller’s data security obligations for pseudonymised data.

DATA REGULATION

CYBER SECURITY

ENFORCEMENT AND CIVIL LITIGATION

DATA REGULATION

EDPB AND EDPS ISSUE JOINT OPINION ON THE DIGITAL OMNIBUS REGULATION PROPOSAL

On 11 February 2026, the European Data Protection Board (“EDPB”) and the European Data Protection Supervisor (“EDPS”) published a joint Opinion (the “Opinion”) on the European Commission’s Digital Omnibus Regulation proposal (the “Digital Omnibus”). The Digital Omnibus (alongside a separate Digital Omnibus proposal on AI) aims to simplify the EU digital legislative framework, including the EU General Data Protection Regulation (“GDPR”), ePrivacy Directive and Data Act (see overview here).

In their Opinion, the EDPB and EDPS welcome some proposed changes under the GDPR, including the proposed definition of “scientific research” and the introduction of a new derogation to process special categories of data for biometric authentication, where the verification means are under the individual’s sole control. They also support the extension of time for data breach notifications from 72 to 96 hours.

However, the Opinion challenges certain key proposed amendments to the GDPR, including:

  1. Definition of personal data: the most serious concern raised relates to the proposed narrowing of the definition of personal data. The Digital Omnibus seeks to adopt a subjective definition of personal data, focusing on whether a given entity can identify the individual using reasonable means. The EDPB and EDPS strongly urge legislators not to adopt this amendment, arguing that this change would undermine the fundamental right to data protection, create loopholes for controllers, and increase legal uncertainty.
  2. Solely automated decision-making (“ADM”): the Opinion states that the proposed wording that ADM be permitted “regardless of whether the decision could be taken otherwise than by solely automated means” could create confusion. It recommends clarifying that ADM is only allowed when “necessary” for entering into or performing a contract, meaning that “no other equally effective and less intrusive means (automated or not) are available to the controller”.
  3. AI and legitimate interest: the Opinion states that there is no need to include an explicit provision to allow processing for AI development under “legitimate interests”. This is seen as unnecessary and unclear, as reliance on legitimate interest as a legal basis in the context of the development and deployment of AI models or systems is already possible under the current GDPR, subject to applying an appropriate balancing test.
  4. Limitation of Data Subject Rights: the proposed link between “abuse of rights” and the exercise of the right to access for purposes other than data protection is viewed as problematic. The Opinion stresses that access rights protect all fundamental rights, not just data protection rights. The EDPB and the EDPS recommend that the notion of “abusive requests” should be linked with the existence of an abusive intention (i.e. evident intention to cause harm to the controller).

On the ePrivacy Directive, the Opinion supports efforts to reduce “consent fatigue” and cookie banners. However, it warns that applying these rules under the GDPR for personal data but under the ePrivacy Directive for non-personal data may create confusion about which rules apply in practice, undermining the goal of simplification.

The EDPB and EDPS welcome proposals to integrate the Data Governance Act and Open Data Directive rules on the re-use of data and documents held by public sector bodies into the Data Act. However, they oppose the proposal to remove a current rule that personal data can be shared in pseudonymised form with public sector bodies only in cases where anonymous data is insufficient to respond to a public emergency.

Whilst the Opinion is non-binding, it is expected that amendments will be made to the Digital Omnibus to take into account at least some of the suggestions made by the EDPB and the EDPS. We are also monitoring reports of a leaked draft of the proposal from the Council of the European Union, which eliminates the proposed new definition of “personal data” under the GDPR. We will continue to provide updates as the Digital Omnibus makes its way through the EU legislative process before becoming law.

ICO PUBLISHES NEW GUIDANCE ON HANDLING DATA PROTECTION COMPLAINTS

February saw the entry into effect of the main changes to data protection legislation under the Data (Use and Access) Act 2025 pursuant to the Data (Use and Access) Act 2025 (Commencement No.6 and Transitional and Saving Provisions) Regulations 2026 (“Regulations”).

The requirement for all organisations to implement a formal data protection complaints process will now come into effect on 19 June 2026.

This means that, if an individual considers that an organisation has infringed data protection legislation in its handling of their personal data (or the personal information of someone they’re acting on behalf of), they must be able to make a complaint directly to that organisation, before it is escalated to the ICO.

To respond to this requirement, the ICO has published new guidance on how to deal with data protection complaints (the “Guidance”).

The Guidance sets out both expectations and obligations placed upon organisations in relation to the handling of data protection complaints. While these obligations will necessitate a review and, in many cases, the implementation of enhanced internal processes, they also present a chance for organisations to address and resolve complaints directly with data subjects, potentially reducing the likelihood of them being escalated to the ICO.

We have outlined some practical steps an organisation should take to implement and operate a compliant internal complaints-handling process here.

Some of the key points from the Guidance are as follows.

WHAT IS A DATA PROTECTION COMPLAINT?

A data protection complaint is essentially an allegation of non-compliance with data protection legislation by the affected data subject. The Guidance gives some examples of what would not count as a data protection complaint, such as where an individual makes a complaint about a service they have received whilst also exercising their data protection rights.

HOW CAN COMPLAINTS BE MADE?

Whilst organisations have an obligation to implement a clear internal process for handling any data protection complaint, the legislation does not specify this process. Therefore, organisations have discretion over the method, whether that is updating existing complaint procedures to capture data protection complaints, or using online forms, portals, or even live chats to handle complaints. Despite the flexibility, the Guidance does clarify that organisations must accept complaints made through any channel, even social media.

HOW SHOULD COMPLAINTS BE HANDLED?

Organisations must:

  • Acknowledge a complaint within 30 days of receipt.
  • Make enquiries without undue delay and of an appropriate level. An organisation must be able to justify how the complaint was handled.
  • On completion of an investigation, let the complainant know the outcome without undue delay.

Organisations are also advised to:

  • Implement a proportionate process to verify the identity of a complainant and the identity of a third party who is acting on behalf of a complainant.
  • Carry out training and, where necessary, implement procedures so employees can identify and effectively manage the complaint in line with legal obligations.
  • Implement efficient and fit-for-purpose complaint tracking processes; this will help ensure that acknowledgements are sent within 30 days and enquiries are made without undue delay.

CYBER SECURITY

OFCOM FAST-TRACKS DECISION ON ONLINE SAFETY PROPOSALS, AS UK REGULATORS AND LEGISLATORS TAKE URGENT STEPS TO IMPROVE ONLINE SAFETY

This month has seen a number of developments focussed on strengthening online protections for women and girls, who are disproportionately affected by non-consensual intimate image abuse, following the Grok indecent images scandal.

1. OFCOM FAST-TRACKS BLOCKING MEASURES

Ofcom has been consulting on a range of additional online protections designed to push tech platforms to go further in tackling illegal content online. One proposal is a requirement for platforms to block illegal intimate images at source, by using advanced proactive “hash matching” technology to detect and prevent the sharing of non-consensual intimate images. On 18 February 2026, Ofcom announced that it will fast-track its decision on these proposals, in light of the “urgent need” to better protect women and girls online. Its decision will now be announced in May, rather than in the autumn as originally planned, and follows Ofcom’s formal investigation into Grok, launched in January.

2. ICO LAUNCHES FORMAL INVESTIGATION

The ICO launched its own formal investigation into X on 3 February 2026. The probe will assess whether personal data was processed lawfully under UK data protection law and if safeguards were in place to prevent Grok from generating non-consensual sexualised images.

3. CRIME AND POLICING BILL TO BE STRENGTHENED

The UK government has announced that it will amend the Crime and Policing Bill, which is currently making its way through parliament, to require tech platforms to detect and remove intimate images shared without consent within 48 hours of being flagged. Further amendments will make creating or sharing non-consensual intimate images a “priority offence” under the Online Safety Act, treating such abuse as seriously as child sexual exploitation or terrorism.

ENFORCEMENT AND CIVIL LITIGATION

CJEU PERMITS CHALLENGES TO BINDING DECISIONS OF THE EDPB

In a landmark judgment on 10 February 2026, the Court of Justice of the European Union (“CJEU”) set aside a 2022 General Court ruling in WhatsApp Ireland Ltd (“WhatsApp”) v European Data Protection Board (“EDPB”) (T-709/21, EU:T:2022:783), referring the case back to the General Court.

The CJEU’s ruling confirms that organisations can seek annulment of binding decisions of the EDPB, before their lead supervisory authority adopts a final decision.

By way of background, in 2018 the Irish Data Protection Commission (“DPC”) launched a cross-border investigation into WhatsApp’s GDPR compliance. After a lack of consensus between national supervisory authorities regarding the DPC’s draft decision, the DPC referred the matter to the EDPB to resolve the dispute. The EDPB issued a binding decision finding WhatsApp in breach of transparency obligations and ordering the DPC to increase its proposed fine to €225 million. WhatsApp challenged the EDPB decision before the General Court, which dismissed WhatsApp’s action for annulment of the decision on the grounds that the EDPB decision was not open to challenge and was not of direct concern to WhatsApp. WhatsApp appealed to the CJEU.

The CJEU has now clarified that direct legal challenges can be brought against EDPB binding decisions, based on an objective assessment of the decision by reference to its legal effects. Where a decision produces definitive, and not merely preparatory, legal effects, that decision can directly affect an organisation – even if not formally addressed to them – and it is accordingly open to challenge.

This sets a precedent for contestability that could lead to dual-track litigation, with organisations able to challenge EDPB decisions before the General Court, as well as challenging national implementing decisions before national courts in parallel.

The General Court will now assess the merits of the EDPB decision for the first time, the outcome of which could influence further EU GDPR enforcement. We will be keeping a particularly close eye on Meta Platforms’ appeal against a record-breaking EU fine, which was put on hold pending consideration of the above case. By a binding decision in 2023, the EDPB told the DPC to amend another of its draft decisions and impose a €1.2 billion fine on Meta for non-compliance with the EU GDPR in connection with its transfers of personal data from the EU to the US.

UK COURT OF APPEAL CONFIRMS CONTROLLERS’ DATA SECURITY OBLIGATIONS FOR PSEUDONYMISED DATA

In a significant judgment handed down on 19 February 2026, the Court of Appeal has confirmed that controllers remain subject to data security obligations in respect of personal data even where that data, if exfiltrated by a third party such as a cyber attacker, could not be used to identify the related individuals.

The case arose from a pre-GDPR cyberattack on DSG Retail Limited (“DSG”) (the operator of Currys PC World and Dixons Travel), in which attackers scraped millions of individuals’ payment card numbers (but not names) from tills. The ICO fined DSG £500,000 in 2020, the maximum fine available under the Data Protection Act 1998 (“DPA”), for failing to take appropriate technical and organisational measures to protect data. Following appeals by DSG to the First-tier Tribunal and Upper Tribunal, the ICO appealed to the Court of Appeal in 2024, challenging the Upper Tribunal’s ruling that the controller was not obliged to take measures against risks of data incidents caused by a third party who was unable to identify the relevant data subjects.

The Court of Appeal overturned the ruling, finding the result inconsistent with the purposes of data protection legislation. It held that the security duty applies if data qualifies as “personal data” from the perspective of the controller and the controller holds information enabling it to identify the individuals concerned, regardless of whether the data would constitute identifiable personal data to the third party that accessed it. This approach is consistent with the determination made in EDPS v Single Resolution Board (SRB) (C-413/23) (see our September 2025 update here).

Although the case was decided under the DPA, its reasoning is readily transferable to the current UK GDPR, which imposes a structurally similar security obligation and relies on a similar definition of personal data. The case has now been remitted to the First-tier Tribunal to be determined in accordance with the Court of Appeal’s judgment.

ICO FINES REDDIT AND MEDIALAB OVER CHILDREN’S DATA PROTECTION FAILURES

The ICO has taken enforcement action against Reddit and MediaLab.AI, Inc (“MediaLab”), owner of the online platform Imgur – handing down fines of £14.47 million and £247,590 respectively for failing to protect children’s data.

The ICO began an investigation on 3 March 2025 into how Reddit and MediaLab were processing children’s data and, in particular, their use of age assurance mechanisms.

REDDIT

In July 2025, Reddit implemented an age verification measure that required users to self-declare their age to create an account. However, this measure was deemed insufficient by the ICO as it could be easily evaded, putting children at risk of accessing inappropriate content.

IMGUR

Despite Imgur’s terms and conditions requiring children under 13 years old to use the platform with adult supervision, the ICO found that it failed to adopt any age assurance measures to verify the age of users and obtain parental consent. Although Imgur blocked UK access to its platform on 30 September 2025, the fine relates to the period when the platform was available to UK users and failed to comply with UK GDPR requirements.

PENALTY DECISION AND ITS WIDER IMPLICATIONS

In determining these fines, the ICO considered the number of children affected, the degree of potential harm, the duration of the breaches, and each company’s global turnover. For MediaLab, the ICO also considered its commitment to address infringements should Imgur become available again in the UK.

These enforcement actions send a strong message to companies providing online services in the UK that are likely to be accessed by children: adequate age assurance measures must be implemented. As the ICO warns, “[c]ompanies that choose to ignore this can expect to face similar enforcement action”.


Compliments of Stephenson Harwood – a member of the EACCNY