EU Regulators Target TikTok Over “Addictive” Design Features
European Commission regulators have launched proceedings against TikTok, claiming the social media platform’s design features may foster addiction, especially among younger users. The investigation represents a significant shift in European regulatory focus, targeting not just harmful content but the fundamental architecture of digital platforms and their psychological impact on users.
The investigation falls under the European Union’s Digital Services Act (DSA), which requires large online platforms to identify and mitigate “systemic risks,” including potential threats to mental health and child safety. European officials have specifically highlighted TikTok’s infinite scroll function, autoplay capabilities, and highly personalized recommendation algorithms as features that may encourage compulsive usage patterns.
“These are not just content moderation issues anymore,” said a Commission spokesperson familiar with the investigation. “We’re examining how the very design of these platforms may shape user behavior in potentially harmful ways.”
The investigation is currently in its preliminary phase, and TikTok has the opportunity to respond to the allegations before the Commission reaches a final determination. The stakes are considerable: companies found in violation of DSA requirements face fines of up to 6% of their global annual revenue, which could amount to hundreds of millions of dollars. Beyond financial penalties, the EU could mandate significant design modifications to curb features deemed potentially addictive.
TikTok has pushed back against some of the preliminary findings, highlighting safety features it has already implemented. “We’ve introduced screen-time reminders, parental controls, and other tools specifically designed to help users maintain a healthy relationship with our platform,” a TikTok representative stated. The company emphasized its commitment to user wellbeing while defending its core product features.
The scientific foundation for “social media addiction” remains complex. Unlike gaming disorder, which the World Health Organization formally recognized in its ICD-11 classification system, social media use has not been classified as a standalone clinical disorder. However, research has increasingly examined how certain design elements – including endless content feeds, algorithmically curated recommendations, and variable reward mechanisms – may reinforce habitual or compulsive usage patterns in susceptible individuals.
Dr. Sarah Meyers, a digital psychology researcher at Oxford University not involved in the regulatory proceedings, noted: “The evidence suggests these platforms aren’t universally ‘addictive,’ but their design certainly leverages psychological principles that can make it difficult for some users to regulate their usage. Younger users may be particularly vulnerable.”
The European regulatory approach stands in contrast to the more fragmented framework in the United States. While the EU has moved toward directly regulating platform design through the DSA, the U.S. lacks comprehensive federal legislation targeting potentially addictive design features. American regulation has instead focused on adjacent issues like data privacy.
The Children’s Online Privacy Protection Act (COPPA), for instance, primarily addresses how companies collect data from children under 13, not how platforms are designed to encourage engagement. Meanwhile, the proposed Kids Online Safety Act (KOSA) would require platforms to mitigate potential harms to minors, including references to design practices that increase time spent online, but the legislation has yet to become law.
The TikTok investigation underscores a broader international debate about the regulation of digital platforms. European officials have positioned themselves at the forefront of efforts to address not just what appears on these platforms but how they are engineered and the behavioral patterns they may encourage.
Industry observers note that the outcome could have far-reaching implications beyond TikTok, potentially establishing precedents for how digital platforms approach engagement-driven design features across the sector. The case represents a significant test of the EU’s regulatory ambitions and could influence approaches to platform governance worldwide.
11 Comments
Interesting to see the EU taking a closer look at the addictive design features of social media platforms. Addressing the psychological impact of these features is an important step towards protecting user wellbeing.
I agree, the EU’s focus on mitigating systemic risks like mental health threats is a welcome shift in the regulatory approach.
Infinite scroll, autoplay, and personalized recommendation algorithms are definitely concerning design choices that can foster compulsive usage. Glad to see regulators scrutinizing these features for their potential to cause harm, especially to younger users.
Absolutely, these platforms need to be held accountable for the negative impacts their design can have. Prioritizing user wellbeing should be paramount.
While digital platforms provide value, their design should prioritize user wellbeing. This regulatory action is a step in the right direction towards more responsible innovation.
It will be interesting to see how TikTok responds to these regulatory charges. Transparency about its design choices and a willingness to make changes will be key.
Agreed, their response will be telling. Hopefully they’ll work constructively with regulators to address the identified issues.
While social media platforms provide entertainment and connection, their design should not come at the expense of user mental health. This regulatory action is an important step forward.
The shift towards examining platform architecture, not just content, is a crucial evolution in digital regulation. Tackling systemic risks head-on is necessary to protect vulnerable users.
I’m curious to see if this investigation leads to broader reforms around addictive design features across the social media landscape. Consistency in standards would be ideal.
Good point. Consistent regulations that address these issues holistically will be important for creating a healthier digital ecosystem.