Two years on from the Digital Services Act’s full implementation, our Data & Technology team looks at recent developments and what’s to come in 2026.
Following a lengthy learning and implementation period for providers, the European Commission and national regulators are now gearing up for enforcement of the Digital Services Act (DSA). Later this year, we expect ‘vetted’ researcher access and the protection of minors to be the focus.
Notable developments from year two
The first DSA fine
Online platform X was fined €120 million for breaches of the DSA related to the deceptive design of its ‘blue checkmark’, the lack of transparency of its advertising repository and the failure to provide access to public data for researchers. This was the first fine issued by the European Commission under the DSA and followed a two-year investigation.
The European Commission’s decision was subpoenaed and published by the U.S. House of Representatives. The decision outlines the European Commission’s positions on a range of DSA issues, including the geographical scope of data access under Article 40 DSA and a broad conceptualisation of the data which may be relevant for research into “systemic risks”.
Guidance from the European Commission and the EDPB
The European Commission’s Article 28 Guidelines were published on 14 July 2025. They contain prescriptive requirements for ‘online platforms’ underpinned by a Risk Review to determine which requirements are appropriate for a given service. Of note:
- Age assurance: a sliding scale of age assurance measures is proposed, up to and including age verification for certain providers. Assessments of age should be contestable by users via a mechanism which fulfils the criteria of Article 20 DSA.
- Default settings: minor users’ account settings must be set to the highest level of privacy, safety and security by default. Such settings should be designed so that interaction is limited to approved contacts, and features such as geolocation, tracking, autoplay, live streams and push notifications are turned off.
- Feedback and reporting: in addition to the requirements of Articles 16 and 20 DSA, the European Commission expects platform terms to be amended to cover certain “harmful” content. For example, the Guidelines require reporting and feedback mechanisms for violations of the platform’s terms, which “includes any content, user or activity that is considered by the platform to be harmful to minors’ privacy, safety, and/or security”.
In September 2025, the European Data Protection Board (EDPB) released its Guidelines on the interplay between the DSA and the GDPR. Of note:
- DPIA for proactive moderation: voluntary proactive moderation technology is likely to meet several Article 35 GDPR Data Protection Impact Assessment (DPIA) triggers, including evaluation or scoring, automated decision-making (ADM) with legal or similarly significant effects, and systematic monitoring of data subjects. Legitimate interests under Article 6(1)(f) GDPR will be the most suitable legal basis to rely on.
- Protection of minors as a legal obligation: Articles 28(1) and (2) DSA can constitute a legal obligation under Article 6(1)(c) GDPR for processing personal data, provided the controller can demonstrate (on a case-by-case basis) that such processing is necessary and proportionate to achieve the goals established by Articles 28(1) and (2) DSA.
- ADM within recommendations: a recommender system’s suggestions could be considered an Article 22(1) GDPR ‘decision’, i.e. the ‘decision’ to present specific content to an individual may have an impact that is not necessarily legal “but rather economic and social”.
Irish enforcement commences
Throughout the past year, Coimisiún na Meán (CnaM), Ireland’s Digital Services Coordinator (DSC), engaged with service providers following user complaints alleging non-compliance with the DSA.
Three formal investigations were commenced against X, LinkedIn and TikTok. In X’s case, CnaM acknowledged that its concerns had been supplemented by information provided in DSA complaints. CnaM has also forwarded user complaints to the European Commission to support its investigations into very large online platforms (VLOPs).
Experience to date shows that CnaM is focused on user-facing processes and due diligence under the DSA, including the Article 16 DSA notice and action mechanism, Article 17 DSA statements of reasons and the Article 20 DSA internal complaints process.
What to expect in the coming year
Data Access under Article 40(4) DSA
For providers of very large online platforms and search engines (VLOPSEs), data access for “vetted researchers” will commence in the coming weeks. CnaM, in its role as the DSC for 15 of the 25 designated VLOPSEs, published the results of a survey of prospective researcher applicants in September 2025. Of particular note is the potential volume of research requests, especially for large social and video platforms. From the 116 responses to the survey:
- 103 researchers plan to make applications under Article 40(4) DSA within the first 12 months and are expected to submit a total of 602 individual applications
- 448 of those applications are expected to be made within the first six months, with 317 relating to VLOPSEs established in Ireland
Irish-established VLOPSEs can expect significant engagement from CnaM and researchers over the coming months.
Regulatory engagement on Article 28 DSA
Regulators and platforms have now had over six months to digest and consider the Article 28 Guidelines. We expect that CnaM will engage with platforms on:
- The Risk Reviews expected to be carried out under the Guidelines and
- The appropriate measures put in place to protect minor users
Platforms should be cognisant of the need to evidence why certain measures are relevant, or a priority, given the specific aspects of their service and user base.
On 20 January 2026, the Dutch DSC initiated an investigation into Roblox to assess whether the platform “takes sufficient measures to protect minors on its service”, marking the first such investigation at national level.
The first standardised DSA transparency reports will shortly be published
Over the next few weeks, all services within the scope of the DSA are required, for the first time, to publish transparency reports using the prescriptive templates published under the European Commission’s Implementing Regulation. These reports will allow both regulators and the wider research and academic community to directly compare the content moderation activity undertaken by these services. You can learn more about these reports in our recent article.
This content is provided for information purposes only and does not constitute legal or other advice.
Authors:
• Michael Madden, Partner, Mason Hayes & Curran
• Luke Murray, Associate, Mason Hayes & Curran
Compliments of Mason Hayes & Curran – a Member of the EACCNY