TikTok is back at the center of the European regulatory debate, as the European Commission has preliminarily concluded that the platform's design violates the Digital Services Act. At the heart of the case is the way the app structures the user experience, creating conditions that, according to the Commission, foster addiction and pose risks to the physical and mental well-being of users, especially minors.
The Commission's preliminary assessment focuses on specific design features that are key elements of the TikTok experience. Unlimited scrolling of content, auto-playing videos, constant push notifications and highly personalized recommendation systems create an environment of constant stimulation. According to the Commission, these elements act as "reward" mechanisms, reinforcing the urge for prolonged use without a conscious decision.
The scientific research cited by the Commission shows that this type of design can push users into an "autopilot" state, in which attention narrows, self-control weakens and behavior becomes more compulsive. The phenomenon is not limited to adult users: it affects minors to an even greater degree, as developmental factors make them more vulnerable to such practices.
According to preliminary findings, TikTok failed to adequately assess the risks arising from these features. The Commission considers that the company did not take into account the potential impact on the physical and mental health of users, especially minors and vulnerable adults. The issue is not only about the duration of use, but also the quality of the experience and its impact on daily life.
Particular mention is made of the neglect of critical indicators of compulsive use. Among these, the Commission highlights the time minors spend on the app during night hours, the frequency with which users open the app during the day, as well as other behavioral patterns that could act as warning signs. The lack of systematic analysis of this data is considered a serious weakness in the platform's risk management approach.
In terms of mitigation measures, the picture presented by the Commission is equally problematic. Existing screen time management tools are considered insufficient, as they can be easily bypassed by users and do not create substantial “friction” that would encourage the restriction of use. Similarly, parental control tools are characterized as having limited effectiveness, since they require additional time, technical knowledge and ongoing involvement from parents.
The European Commission believes that, at this stage, small improvements or individual adjustments are not enough. Instead, it argues that TikTok should rethink the fundamental design of its service. Among the suggestions mentioned are the gradual disabling or limiting of unlimited scrolling, the implementation of genuinely effective breaks in use, especially during night hours, and the readjustment of the recommendation system so that it does not reinforce compulsive behaviors.
Despite the clarity of the findings, the Commission stresses that these are preliminary views and that the investigation is not yet complete. The conclusions are based on an extensive analysis of internal company documents and data, TikTok's responses to repeated requests for information, and a review of the international scientific literature on behavioral addiction. In addition, interviews were conducted with experts from different scientific fields.
In the next stage, TikTok has the right to examine the investigation file and respond in writing to the preliminary findings. At the same time, the European Board for Digital Services, which plays an advisory role in the application of the Regulation, will be consulted. This process is part of the legal guarantees provided for large digital platforms.
If the Commission's views are confirmed at the final stage, the case may lead to a decision of non-compliance. In such a case, the sanctions provided for are particularly severe, as the fine can reach up to 6% of the company's total worldwide annual turnover. The amount of the penalty will depend on the seriousness, duration and repetition of the infringement.
The investigation into addictive design is only part of a broader process of reviewing TikTok's compliance with the Digital Services Act, which began in February 2024. Other issues are also being examined in this context, such as the so-called "rabbit hole effect" in recommendation systems, the risk of minors being exposed to inappropriate content due to false age claims, as well as the platform's obligations for a high level of privacy and safety protection for minors.
Meanwhile, there have already been some developments in other areas. Researchers’ access to public data on the platform was the subject of preliminary findings in 2025, while advertising transparency issues were addressed through binding commitments made by the company at the end of the same year. All of this adds up to a picture of increased pressure on the big platforms for greater accountability.
The TikTok case raises a broader question about the future of social networks in Europe. The issue is not limited to compliance with a regulation, but concerns how the design of digital services affects the daily lives of millions of users. As the investigations progress, the balance between innovation, entrepreneurial freedom and the protection of mental health is expected to be at the heart of the public debate.