The Digital Services Act (DSA), which establishes the most comprehensive framework for regulating digital platforms in the European Union, has produced its first major enforcement decision. The European Commission has imposed an administrative fine of 120 million euros on the social media platform X (formerly known as Twitter) for violating its transparency obligations.

Alp Mete Şirin
Dec 8, 2025
This decision demonstrates how rigorously the DSA is being enforced in the platform economy, while also signaling a new era in debates about digital rights, algorithmic transparency, and content safety.
The official statement can be accessed here:
https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2934
1. Why Was a Penalty Imposed? Three Critical Violation Areas
The Commission assessed three fundamental areas of violation in its decision. These areas sit at the core of the systemic risk mitigation and transparency obligations that the DSA imposes on platforms.
1.1. Deceptive Design Usage in the Blue Tick Application
The Commission states that X's paid blue tick scheme deceives users.
The platform presents the mark as a sign that an account is "verified" without actually verifying the account holder's identity.
This practice particularly increases the risks of:
identity theft
impersonation accounts
fraud
disinformation
The DSA explicitly prohibits "deceptive designs that distort users' decision-making processes." The Commission found that X violated the DSA by presenting the symbol as an indication of verification without providing genuine verification.
1.2. Lack of Transparency in the Ad Library
The DSA imposes an obligation on platforms to create accessible and queryable ad repositories to ensure the traceability of political or commercial advertisements.
According to the Commission’s findings, X’s ad library:
operates with delays
contains access barriers
does not display the content, subject matter, or the legal entity that paid for the advertisement
These deficiencies make it difficult for researchers and civil society to detect manipulative campaigns, information operations, and fraudulent advertisements.
1.3. Barrier to Public Data Access for Researchers
One of the innovative aspects of the DSA is granting researchers the right to access publicly available data from platforms.
The Commission found that X:
has used its terms of service to restrict many access methods, including scraping
has made the application process unnecessarily complicated
has delayed the approval mechanism
This situation directly impacts the research activities of academics working on disinformation, election security, platform behavior, and algorithmic risks within the EU.
2. Compliance Timeline and X’s Obligations
The Commission’s decision also includes a compliance timeline for X.
Within 60 working days, X must notify the Commission of the measures it plans to take to end the deceptive design violations concerning the blue tick scheme.
Within 90 working days, it must submit a detailed action plan on advertising transparency and researcher data access.
The Digital Services Board will provide feedback on the action plan, and the Commission will make its final decision thereafter.
Inadequate or delayed compliance could lead to additional fines.
3. The Importance of the Decision for the Digital Ecosystem
This decision is not merely a sanction against X. It is also a concrete application of the EU's new regulatory approach to large online platforms.
3.1. Platform Responsibility Reaches a New Standard
The DSA no longer treats platforms as passive intermediaries but as active risk managers responsible for content moderation and advertising transparency.
3.2. Researcher Access as a Fundamental Element of the Democratic Ecosystem
When platforms close off access to publicly available data, they pose serious risks to democracy and public order. This decision aims to drive a transformation that strengthens scientific research against information manipulation.
3.3. A Clear Stance Against Deceptive Design Practices
With the DSA, EU law comprehensively prohibits deceptive designs (dark patterns) for the first time. This penalty underscores that manipulation through design is no longer merely an ethical concern but a legal violation.
4. LANT Perspective: The Beginning of an Era of Transparency and Accountability in Digital Governance
As LANT, we see this decision as a critical step towards raising governance standards in the global digital economy.
The societal impact of platforms is no longer just a matter of technology or product design; new balances are being struck at the intersection of law, ethics, algorithmic transparency, and public order.
This sanction brings the following questions back to the forefront:
How do platform designs shape user behavior?
How does a lack of transparency in the advertising ecosystem increase information security risks?
Why is researcher access for auditing algorithmic decision-making processes indispensable?
In line with LANT's mission, we will continue to raise awareness in these areas and contribute to policy development processes with our community of experts.
Conclusion
The 120 million euro fine imposed on X by the European Commission is the first, and a symbolic, case demonstrating the enforcement power of the DSA.
This approach, centered on transparency, accountability, and user safety, outlines a strong framework for how digital platforms will be governed in the future.
Access to the official statement:
https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2934