According to 9to5Mac, the European Commission has issued preliminary findings that both Meta and TikTok violated child protection rules under the Digital Services Act. The companies were found to have created burdensome procedures for researchers seeking data on children’s exposure to harmful content and used deceptive interface designs that made reporting illegal content difficult. This regulatory action comes as Meta faces separate legal challenges in the US regarding its handling of teen mental health research.
Understanding the Digital Services Act Framework
The Digital Services Act represents Europe’s most ambitious attempt to create a comprehensive regulatory framework for digital platforms. Unlike previous regulations that focused primarily on data privacy, the DSA specifically addresses platform accountability, transparency, and user protection mechanisms. The act establishes tiered obligations based on platform size, with very large online platforms like Meta and TikTok facing the strictest requirements. These include mandatory risk assessments, independent auditing, and, crucially, researcher access to public data: the very provision at the center of these preliminary findings.
Systemic Failures in Platform Accountability
The core issue extends beyond simple regulatory non-compliance to fundamental questions about platform governance. When companies deliberately complicate researcher access through what the EU describes as “burdensome procedures,” they effectively create information asymmetries that prevent independent verification of their safety claims. This pattern isn’t new: internal documents from Meta’s US litigation reveal a consistent strategy of limiting research visibility to manage legal exposure. The use of dark patterns in reporting mechanisms is even more concerning, as it actively discourages users from flagging harmful content while maintaining the appearance of compliance.
Broader Implications for Social Media Regulation
These findings will likely accelerate global regulatory efforts beyond Europe. The DSA’s approach of mandating researcher access creates a template that other jurisdictions may adopt, potentially leading to fragmented compliance requirements across different markets. For platforms like Instagram, which derive significant engagement from younger users, the operational impact could be substantial. We’re likely to see increased pressure for standardized data access protocols and independent oversight mechanisms that transcend individual platform controls. The timing is particularly significant given increasing scrutiny of how algorithmic content distribution affects developing adolescent brains.
Navigating the Compliance Landscape
The coming months will test whether platform operators will genuinely reform their approaches or continue with compliance theater. With potential fines of up to 6% of global annual turnover, the financial incentives for meaningful change are substantial. The deeper challenge, however, involves cultural transformation within these organizations: moving from legal risk management to genuine safety prioritization. The parallel US litigation, where a judge rejected Meta’s attorney-client privilege claims under the crime-fraud exception, suggests courts are becoming increasingly skeptical of corporate attempts to shield internal safety discussions. This converging pressure from regulatory and legal fronts may finally force the transparency that voluntary initiatives have failed to deliver.
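To give a rough sense of what that 6% ceiling could mean in practice, the sketch below applies the DSA cap to Meta’s reported full-year 2023 revenue of roughly $134.9 billion. The revenue figure is an assumption used only for scale; any actual penalty would be set by the Commission’s final decision, not calculated this way.

```python
# Illustrative only: the theoretical DSA fine ceiling of 6% of worldwide annual
# turnover, applied to Meta's reported 2023 revenue (~$134.9B). The revenue
# figure is an assumption for scale; a real penalty is determined by the
# Commission's final decision.

DSA_FINE_CAP = 0.06                  # DSA maximum: 6% of worldwide annual turnover
meta_2023_revenue_usd = 134.9e9      # Meta's reported full-year 2023 revenue (assumption)

max_fine_usd = DSA_FINE_CAP * meta_2023_revenue_usd
print(f"Theoretical DSA fine ceiling: ${max_fine_usd / 1e9:.1f}B")  # prints ~ $8.1B
```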