News from Across the Sea #4:
Welcome to our first installment of 2025! One year after the DSA came into full force, we will review what has been achieved so far and some of the challenges that lie ahead. What is behind the decline in user numbers on the main adult content platforms? We will also discuss the debates surrounding the German parliamentary elections and analyze the adoption of new Codes of Conduct.
DSA, Year 1: Achievements and Challenges:
It is time to evaluate the DSA one year after its full implementation. Among its major successes, it is worth noting that the complex system of oversight designed by the law has been almost fully implemented. Nearly all EU countries have appointed their Digital Services Coordinator, the authority that ensures compliance with the DSA, except for the specific obligations of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which are overseen by the European Commission. Additionally, these authorities have established the European Board for Digital Services, which has already held 12 monthly meetings.
In terms of progress, major platforms and search engines are largely complying with their obligations to publish reports and provide information. As a result, we now know more about their operations and content moderation systems than we did a year ago. In addition, 25 VLOPs/VLOSEs have been designated as subject to stricter due diligence requirements, along with 20 trusted flaggers, that is, civil society organizations whose reports of illegal content receive priority treatment from platforms. The European Commission has also initiated multiple enforcement proceedings against potential platform violations. So far, one of these cases led to an agreement to remove the TikTok Lite Rewards Program. Another case, involving X, reached a preliminary conclusion that the company may have violated several provisions of the law, though the process is still ongoing. Delegated legislation has been enacted regarding transparency reporting, risk mitigation in electoral contexts, and independent audits. Six out-of-court dispute resolution bodies have also been certified, so that users can resolve their disputes with the platforms free of charge.
There are deficits in the implementation of platform data access policies for researchers (Article 40), and clear definitions of systemic risks (Article 34) are still lacking. Systemic risk is a central component of the new co-regulatory paradigm advocated by the EU, which we had previously cautioned about and continue to address. It is no surprise, then, that the risk detection and mitigation reports released in November were generic and nearly devoid of data.
VLOPs on the Run:
According to media reports, WhatsApp's “Channels” feature has surpassed 45 million monthly users in the EU, the threshold for designation as a Very Large Online Platform (VLOP) under Article 33 of the DSA. It should be noted that, by the law's own definitions, the messaging function itself is not covered by the DSA. For its part, the navigation platform Waze is also said to have reached the "magic number" that would make it a VLOP. Official designations are expected in the coming months.
On the other hand, the adult sites Pornhub, Stripchat and Xvideos, which were designated as VLOPs in December 2023, have reported a sharp decline in their monthly user counts, now hovering around 26 million. It is unlikely that half of Europe's pornography consumers have suddenly abandoned these platforms, suggesting this could be a maneuver to evade the heightened obligations that come with VLOP designation. Under Article 33(5) of the DSA, a platform can lose its VLOP status if its user count remains below the legally set threshold for an uninterrupted period of one year. According to the platforms, the drastic drop in reported users stems from the difficulty of determining the geographic origin of visitors who access their sites in their browser's incognito mode.
DSA and Electoral Processes:
There have been no significant developments in the European Commission's proceedings against TikTok for potential violations of the DSA in connection with the Romanian presidential elections, which were suspended last December. We still do not know what happened in Romania, but what seems clear is that the DSA alone is not enough to safeguard democracy.
On February 23, Germany held parliamentary elections amid great uncertainty following the events in Romania, pressure from the European Commission on national authorities, and Elon Musk's explicit support for Alice Weidel, the candidate of the far-right Alternative for Germany (AfD) party. Some viewed his endorsement as undue interference that could trigger the application of the DSA. In fact, a Berlin court upheld a lawsuit filed by Democracy Reporting International against X, which had refused to grant access to data that would allow the detection of potential foreign interference in the electoral process. The platform has appealed the decision.
Incorporation of Codes of Conduct into the DSA:
The Code of Practice on Disinformation and the Code of Practice on Illegal Hate Speech Online have been officially incorporated as Codes of Conduct under Article 45 of the DSA. This means that the commitments voluntarily undertaken by the companies that signed these codes will now become binding obligations under the DSA. This is relevant because it allows for a comparison between the vague risk identification and mitigation obligations under Articles 34 and 35 of the law and the more concrete commitments and requirements set out in the Codes.
There are, however, problematic aspects of these Codes. As Griffin and Vander Maelen explain here and here, their voluntary nature is disputed: from the outset, authorities pressured platforms to sign them under threat of stricter regulations, and guided the direction their content should take. Additionally, the authors argue that there are positive and negative incentives that make it too costly for companies to opt out. The Codes can act as “carrots”: signing up is the easiest way to demonstrate compliance with the DSA's risk mitigation obligations. At the same time, they also function as “sticks”: the European Commission could use its authority to investigate non-compliance (and apply hefty fines) to pressure hesitant platforms into joining. Thus, for example, Article 75 of the DSA suggests that signing the Codes could serve as a remedy for non-compliance, while Recital 104 implies that rejecting an “invitation” from the Commission to participate in a Code could be seen as an indicator of non-compliance.
Whether or not the Codes of Conduct are truly voluntary is essential, insofar as they contain some obligations that the State could not legitimately impose on the platforms. For example, the Code of Practice on Disinformation requires the “demonetization of disinformation” and includes commitments to adopt various policies against the spread of disinformation, regardless of whether such content is legal or illegal. As for the Code of Practice on Hate Speech, Alkiviadou explains that the definitions it uses are far broader than current human rights standards on the matter, and that the Code contains metrics that encourage over-removal of content, such as the 24-hour deadline for takedowns and the commitment to review at least 50% of the complaints received.
What Are We Reading?
- The excellent report by AlgorithmWatch and DFRLab quantifying the influence of the “Musk Effect” on electoral discourse surrounding Germany's February parliamentary elections.
- The European Ombudsman's Opinion criticising the general presumption of confidentiality adopted by the European Commission in relation to risk assessment and mitigation reports under the DSA.
Thank you for reading this far—we will meet again soon.
Until next time!