The European Commission has issued preliminary findings indicating that Meta (for Facebook and Instagram) and TikTok are in breach of the EU's landmark Digital Services Act (DSA). The primary allegations focus on a failure to comply with transparency obligations, specifically by not providing researchers with adequate access to publicly available platform data. This access is a key requirement of the DSA, intended to allow independent analysis of the platforms' societal risks. Additionally, Meta is accused of implementing a cumbersome and deceptive reporting mechanism for illegal content, making it difficult for users to flag harmful material. These findings are the first step in a formal process, and the companies now have the right to respond before the Commission makes a final determination.
The investigation centers on specific articles of the Digital Services Act, a comprehensive law governing online platforms operating in the EU.
Key Alleged Breaches:
- Researcher data access: Meta (for Facebook and Instagram) and TikTok allegedly failed to provide researchers with adequate access to publicly available platform data, undermining independent scrutiny of the platforms' societal risks.
- Reporting of illegal content: Meta is additionally accused of operating a cumbersome and deceptive mechanism for flagging illegal content, making it harder for users to report harmful material.
These preliminary findings were made in cooperation with Coimisiún na Meán, Ireland's Digital Services Coordinator.
Facebook, Instagram, and TikTok are designated as "Very Large Online Platforms" (VLOPs) under the DSA, subjecting them to the strictest tier of regulation.
The DSA mandates that VLOPs must:
- Provide researchers with access to publicly available platform data so that societal risks can be independently studied.
- Offer users clear, easy-to-use mechanisms for reporting illegal content.
- Assess and mitigate the systemic risks their services pose to society.
Failure to comply can result in significant penalties.
If the preliminary findings are confirmed and the companies fail to take satisfactory remedial action, the European Commission can impose fines of up to 6% of a company's global annual turnover under the DSA. It can also issue legally binding orders demanding specific changes to the platforms' operations. This case serves as a major test of the EU's new digital rulebook and its ability to hold Big Tech accountable.
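To give a rough sense of scale, the sketch below works out the 6% cap against a hypothetical turnover figure; the revenue number is an assumption for illustration only, not a reported figure for either company.

```python
# Maximum DSA fine is capped at 6% of global annual turnover.
DSA_FINE_CAP = 0.06

def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Return the maximum fine the Commission could impose under the DSA cap."""
    return global_annual_turnover_eur * DSA_FINE_CAP

if __name__ == "__main__":
    # Hypothetical example: EUR 100 billion in global annual turnover
    # would put the ceiling at EUR 6 billion.
    turnover = 100e9  # assumed figure for illustration only
    print(f"Maximum fine: EUR {max_dsa_fine(turnover):,.0f}")
```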
This action signals the European Commission's serious intent to enforce the DSA vigorously. For the platforms, it creates significant legal and financial risk. They may be forced to re-engineer their data access APIs for researchers and overhaul their content reporting user interfaces. For the public and research community, a successful enforcement action could unlock a wealth of data for studying the impact of social media on society. However, a failure to secure meaningful changes could be seen as a sign that the DSA lacks the teeth to truly regulate these powerful platforms.
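For illustration only, the sketch below shows the general shape of a researcher-facing data access request, assuming a hypothetical REST endpoint, credential, and query parameters; it does not reflect Meta's or TikTok's actual interfaces.

```python
import requests

# Hypothetical researcher data-access request (illustrative only; not a real
# Meta or TikTok endpoint). The DSA requires VLOPs to give researchers access
# to publicly available platform data; one plausible shape is a paginated,
# keyword-filtered query like this.
BASE_URL = "https://platform.example/researcher-api/v1/public-posts"  # placeholder
API_KEY = "YOUR_RESEARCHER_TOKEN"  # placeholder credential

def fetch_public_posts(keyword: str, page_size: int = 100, max_pages: int = 3):
    """Collect publicly available posts matching a keyword, page by page."""
    posts, cursor = [], None
    for _ in range(max_pages):
        params = {"q": keyword, "limit": page_size}
        if cursor:
            params["cursor"] = cursor
        resp = requests.get(
            BASE_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            params=params,
            timeout=30,
        )
        resp.raise_for_status()
        payload = resp.json()
        posts.extend(payload.get("data", []))
        cursor = payload.get("next_cursor")
        if not cursor:
            break
    return posts

if __name__ == "__main__":
    results = fetch_public_posts("election ads")
    print(f"Retrieved {len(results)} public posts")
```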
