EU Accuses Meta and TikTok of Breaching Digital Services Act Transparency Rules

European Commission Finds Meta and TikTok in Preliminary Breach of Digital Services Act (DSA)

INFORMATIONAL
October 25, 2025
Regulatory Policy and Compliance

Related Entities

Organizations

European Commission

Products & Tech

Facebook, Instagram

Other

Meta, TikTok, Digital Services Act (DSA)

Full Report

Executive Summary

The European Commission has issued preliminary findings indicating that Meta (for Facebook and Instagram) and TikTok are in breach of the EU's landmark Digital Services Act (DSA). The primary allegations focus on a failure to comply with transparency obligations, specifically by not providing researchers with adequate access to publicly available platform data. This access is a key requirement of the DSA, intended to allow independent analysis of the platforms' societal risks. Additionally, Meta is accused of implementing a cumbersome and deceptive reporting mechanism for illegal content, making it difficult for users to flag harmful material. These findings are the first step in a formal process, and the companies now have the right to respond before the Commission makes a final determination.


Regulatory Details

The investigation centers on specific articles of the Digital Services Act, a comprehensive law governing online platforms operating in the EU.

Key Alleged Breaches:

  1. Lack of Researcher Data Access (Article 40 of the DSA): The Commission found that both Meta and TikTok may have established "burdensome procedures and tools" that effectively hinder researchers from accessing public data. This prevents independent study of platform risks, such as the spread of disinformation or illegal content.
  2. Burdensome Illegal Content Reporting (Article 16 of the DSA): This finding is specific to Meta's Facebook and Instagram. The Commission alleges that the mechanism for users to report illegal content is not user-friendly or easily accessible. It is accused of using "deceptive design" (dark patterns) and imposing unnecessary steps, which discourages users from flagging harmful content like terrorist material or child sexual abuse material (CSAM).

These preliminary findings were made in cooperation with Coimisiún na Meán, Ireland's Digital Services Coordinator.

Affected Organizations

  • Meta: Specifically its Facebook and Instagram platforms.
  • TikTok: The popular short-form video platform.

Facebook, Instagram, and TikTok are all designated as "Very Large Online Platforms" (VLOPs) under the DSA, subjecting them to the strictest level of regulation.

Compliance Requirements

The DSA mandates that VLOPs must:

  • Provide a clear, simple, and effective mechanism for users to report illegal content.
  • Grant vetted researchers access to platform data to conduct studies on systemic risks.
  • Be transparent about their content moderation and advertising systems.

Failure to comply can result in significant penalties.

Implementation Timeline

  • October 25, 2025: The European Commission announces its preliminary findings.
  • Response Period: Meta and TikTok now have a period to review the investigation files and submit a written response.
  • October 29, 2025: A new delegated act on data access is set to come into force, which will grant researchers even broader access rights to non-public data from VLOPs, increasing the pressure on platforms to comply.

Enforcement & Penalties

If the preliminary findings are confirmed and the companies fail to take satisfactory remedial action, the European Commission can impose significant penalties. Under the DSA, fines can reach up to 6% of a company's global annual turnover. The Commission can also issue legally binding orders demanding specific changes to the platforms' operations. This case serves as a major test of the EU's new digital rulebook and its ability to hold Big Tech accountable.
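To put the 6% cap in concrete terms, the maximum fine scales directly with a company's global annual turnover. The sketch below is purely illustrative; the revenue figure used is hypothetical and does not represent Meta's or TikTok's actual turnover.

```python
# Illustrative only: the DSA caps fines at up to 6% of a company's
# global annual turnover. The turnover figure below is hypothetical.
DSA_FINE_CAP_RATE = 0.06

def max_dsa_fine(global_annual_turnover: float) -> float:
    """Return the maximum possible DSA fine for a given global annual turnover."""
    return global_annual_turnover * DSA_FINE_CAP_RATE

# A hypothetical platform with $100 billion in annual turnover would face
# a maximum fine of $6 billion.
print(max_dsa_fine(100e9))
```

Even as an upper bound, this scale of penalty explains why the preliminary findings carry significant financial weight for VLOPs.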

Impact Assessment

This action signals the European Commission's serious intent to enforce the DSA vigorously. For the platforms, it creates significant legal and financial risk. They may be forced to re-engineer their data access APIs for researchers and overhaul their content reporting user interfaces. For the public and research community, a successful enforcement action could unlock a wealth of data for studying the impact of social media on society. However, a failure to secure meaningful changes could be seen as a sign that the DSA lacks the teeth to truly regulate these powerful platforms.

Timeline of Events

  1. October 25, 2025: The European Commission announces its preliminary findings that Meta and TikTok are in breach of the DSA.
  2. October 25, 2025: This article was published.

Article Author

Jason Gomes


• Cybersecurity Practitioner

Cybersecurity professional with over 10 years of specialized experience in security operations, threat intelligence, incident response, and security automation. Expertise spans SOAR/XSOAR orchestration, threat intelligence platforms, SIEM/UEBA analytics, and building cyber fusion centers. Background includes technical enablement, solution architecture for enterprise and government clients, and implementing security automation workflows across IR, TIP, and SOC use cases.

Threat Intelligence & Analysis, Security Orchestration (SOAR/XSOAR), Incident Response & Digital Forensics, Security Operations Center (SOC), SIEM & Security Analytics, Cyber Fusion & Threat Sharing, Security Automation & Integration, Managed Detection & Response (MDR)

Tags

EU, Digital Services Act, DSA, Meta, TikTok, Regulation, Transparency, Dark Patterns
