European regulators on Feb. 6, 2026 unveiled charges that TikTok has engineered its wildly popular video app to be “addictive,” serving users thousands of new videos in endless succession. The European Commission’s preliminary report, part of a two-year probe under the new Digital Services Act (DSA), cites features like infinite scroll, autoplay videos and constant push notifications as deliberately geared to keep users – especially minors – glued to their screens. TikTok’s owner ByteDance was given a short deadline to “change the basic design” of the app in Europe or face steep penalties: fines of up to 6% of global turnover.
For Europe’s roughly 170 million TikTok users – a large share of whom regulators say are minors – this could mean a very different app experience is on the way. More broadly, the move signals a new era of tech policy: regulators worldwide are forcing social media companies to reckon with the mental-health impact of their products.
What happened
On Feb. 6 in Brussels, the European Commission formally accused TikTok of violating the DSA by failing to curb “addictive” design elements in its app. The charges came at a press conference led by EU tech chief Henna Virkkunen and spokesman Thomas Regnier. The regulators’ 31-page preliminary report details how TikTok’s core features – from its never-ending video feed to its AI-powered recommendations – “fuel the urge to keep scrolling,” shifting users’ brains into “autopilot mode”. In particular, the report singled out TikTok’s infinite scroll and autoplay functions, and said these “reward” users with constant new content at the expense of their well-being.
The Commission said TikTok “did not adequately assess” how its design could harm users, especially minors, and noted the app ignores signs of compulsive use (such as late-night screen time). Regulators have now told TikTok to disable or limit endless scrolling, implement mandatory screen-time breaks (including nighttime shutoff), and overhaul its recommendation algorithm to reduce addictive potential.
TikTok (owned by China’s ByteDance) immediately rejected the claims.
A company spokesman called the EU’s preliminary findings “categorically false and entirely meritless,” saying TikTok will fight the charges “through every means available”. In Brussels, Ms. Virkkunen said that TikTok must act “to protect our minors” and change its app’s design in Europe. The EU has set no hard deadline for a final ruling, but officials noted TikTok can review the evidence and submit a written response before any decision is finalized. If the Commission confirms a DSA breach, TikTok could face fines up to 6% of ByteDance’s annual global revenue – a sum likely to run into the billions of dollars.
The EU’s charges focus squarely on TikTok’s feed mechanics – the looping short-video stream at the heart of the app. Regulators argue that features like endless video scrolling and instant autoplay “reward” users with new clips and push their brains into compulsive viewing. The Commission noted that TikTok’s current screen-time tools and parental controls are “easy to dismiss” and provide only “limited friction,” making them ineffective for curbing binge use. By forcing an “autopilot mode,” the app’s AI-driven recommender system is said to undermine self-control and promote excessive screen time. In effect, Brussels is demanding a near-total redesign of TikTok’s user interface: kill the infinite scroll, insert hard breaks in viewing, and dial back the algorithm that feeds personalized content.
Why it matters
This is a watershed moment for both users and the tech industry. For TikTok’s hundreds of millions of devotees, especially teens, the expected changes could substantially alter daily usage. Imagine a TikTok that forces you to stop watching after a set time or resets the feed instead of endlessly loading – user habits and engagement metrics could shift sharply. That, in turn, threatens TikTok’s core business. The app’s immense screen time is what drives ad views and growth, so regulators are effectively challenging TikTok’s entire model. Parental-advocacy groups have long warned that unlimited scrolling contributes to youth anxiety and sleep problems; regulators now echo those concerns, saying the platform must curb its own design tricks to protect children’s mental health.
The broader implications are significant. EU enforcers are setting a precedent that software itself – not just content – can violate the law if it harms users. Other social apps are watching closely. Last year the EU charged Meta over “dark patterns” and manipulative interface tricks on Facebook and Instagram; it has also asked Google, Apple and Snapchat to detail how they shield kids from harm. Analysts say TikTok’s case shows regulators mean business: EU Vice President Virkkunen has made clear that “platforms [are] responsible for the effects they can have on their users,” warning that addictive designs will no longer escape scrutiny.
In practice, this could spur competitors and new startups to build social media with built-in limits or rewind buttons – aligning with a nascent industry trend toward “healthy tech” features. Tech stock markets showed only mild reaction (none of the companies is solely exposed to TikTok’s fate), but media and privacy advocates hailed the move as a long-overdue stand against always-on social feeds.
Context & background
Europe’s Digital Services Act took effect in 2024. Its goal is to force large tech platforms to police abuse and put user safety first.
The law applies to any online service with more than 45 million users in the EU. TikTok easily qualifies. The platform now has around 170 million European users. EU officials have noted that a large share of them are children.
Under the DSA, regulators can investigate both content and product design. That includes how apps are built to influence user behavior. The charges against TikTok cap a two-year European Commission investigation into whether the app meets the DSA’s risk-mitigation requirements.
TikTok is not alone. Regulators have launched similar cases against other tech giants. In October 2025, Brussels charged Meta over interface practices on Facebook and Instagram. Authorities have also questioned Snap and Google about age verification and teen safety.
Pressure is growing beyond Brussels. Lawmakers in several EU countries are debating hard age limits for social media. Other regions are moving even faster. Australia recently banned users under 16 from most social platforms. In the United States, dozens of states now require social networks to verify users’ ages.
The trend is clear. Regulators worldwide are targeting youth screen time. TikTok now sits at the center of that push. The timing is sensitive. Just this month, the company settled a major U.S. lawsuit over alleged social media addiction.
TikTok’s own history adds context.
The app’s highly personalized algorithm made TikTok the fastest-growing social platform in history. It also drew intense scrutiny. Critics have raised concerns about security, data privacy, and censorship.
In 2023, ByteDance spun off TikTok’s U.S. operations. The move came partly in response to political pressure. The EU action, however, clearly covers TikTok’s entire European footprint.
Regulators have already forced changes. Last year, TikTok agreed under the DSA to publish an advertising library. The goal was to help researchers identify scams and misleading promotions.
Now, the focus has shifted to product design. The ruling on addictive features is the first major test of the DSA’s power. If Brussels can force changes to a dominant app’s feed, the impact could extend far beyond TikTok. It may reshape how social apps are built across the industry.
The backlash is already visible online. Tech blogs and social platforms are flooded with “infinite scroll” memes. Users are debating whether compulsive design should be legal at all.
One EU tech adviser summed it up bluntly. The Commission’s findings, they said, put “the internet’s daily dose of dopamine under the microscope.” For the tech industry in 2026, that microscope has never been sharper.
Reactions
Unsurprisingly, TikTok’s camp blasted the move as overreach. A company spokesperson insisted the platform’s design is not to blame, repeating that the charges were “categorically false and entirely meritless”. On Capitol Hill and Wall Street, observers noted the timing: U.S. regulators and lawmakers have been debating TikTok themselves, even as some criticized the EU action as a potential gateway to content censorship. Meanwhile, EU officials and advocates cheered the announcement. Henna Virkkunen stressed that the law is meant to protect minors: “TikTok has to change the design of their service in Europe to protect our minors,” she told reporters.
EU lawmaker Alexandra Geese – who chairs a youth rights committee – issued a statement calling out the broader danger: “Many social media platforms ruthlessly exploit these (addictive) mechanisms to boost advertising revenue at the expense of the health of children and teenagers. This must come to an end,” she said. Tech analysts note this is a stark contrast to Big Tech’s usual defense of “user freedom” – if even a tech giant like TikTok must bow to user-wellness rules, the norms of app design could be upended. On social networks, some users mockingly pointed out that TikTok’s business model depended on tricks like this, while parents and child-safety groups largely applauded the EU for taking a stand.
What’s next
In the short term, TikTok must prepare its defense and propose remedies. The company has a formal window (typically a few weeks) to review the Commission’s files and submit arguments. If TikTok agrees to make changes, it might begin testing new features: for example, it could enforce stricter screen-time limits, introduce more friction into the feed, or tweak its recommendation algorithm in Europe. If it resists, Brussels could issue a non-compliance order and eventually impose fines – up to 6% of ByteDance’s turnover. EU officials have suggested they expect rulings on this and related cases “in the next weeks and months”, so a final decision on TikTok could arrive by early spring.
Looking further ahead, the outcome will influence more than just one app. Other platforms (Meta, Snap, Google, etc.) will be watching to see how far the EU will go. If TikTok is forced to overhaul its user experience, it sets a precedent that could motivate regulators in the U.S., UK and beyond to demand similar changes. Product teams worldwide are now on notice that making apps engaging can no longer come at the expense of users’ mental health. For users, the changes could mean a less frantic TikTok – if only in Europe first. But if TikTok appeals or courts balk, we could instead see a drawn-out legal battle, raising questions about regulators’ power over software design.
One thing is clear: this case marks a new chapter in tech oversight. Whether this leads to friendlier, safer apps or simply a whack-a-mole of endless regulation remains to be seen. For now, the industry is watching and waiting to see just how finite the infinite scroll will become.
Sources: Official EU charges and expert analysis