Attorneys general from 14 states, including New Jersey, New York, California, and Massachusetts, have filed lawsuits against TikTok, alleging that the social media giant’s practices are harming children’s mental health. The suits claim that TikTok, owned by the Chinese company ByteDance, knowingly designed its platform to encourage compulsive use among young users, fostering addictive behaviors that exacerbate mental health issues in children and teens.
New Jersey Attorney General Matthew Platkin is at the forefront of the legal action, accusing TikTok of deliberately ignoring user safety concerns. His office argues that TikTok has failed to implement features to mitigate excessive use, despite being aware of the platform’s negative impact on minors’ mental well-being. “TikTok’s algorithms and design are not accidental; they are engineered to keep users, especially children, hooked on the platform,” Platkin said.
California’s Attorney General Rob Bonta, who is also part of the multi-state lawsuit, added that TikTok is exploiting vulnerable users. “They know exactly what they are doing,” Bonta said. He accused TikTok of creating a feedback loop that prioritizes engagement over safety, resulting in children spending excessive time on the app and being exposed to content that harms their mental and physical well-being. Bonta’s office, alongside those from Florida, Kentucky, Nebraska, Tennessee, and Vermont, claims that TikTok has violated state consumer protection laws, calling its practices manipulative and dangerous.
TikTok Under Nationwide Scrutiny
The lawsuits are part of a growing wave of legal actions and investigations aimed at reining in the social media industry’s impact on young people. Earlier this year, 43 attorneys general, including Bonta and Platkin, opened an investigation into Meta Platforms, the parent company of Facebook and Instagram, on similar grounds. These state officials allege that platforms like TikTok and Meta deliberately use sophisticated algorithms to maximize screen time, targeting young users with addictive content that can lead to negative emotional, psychological, and physical outcomes.
In the case against TikTok, the states argue that the platform uses persuasive techniques, such as infinite scrolling, push notifications, and tailored content, which increase the time users, especially children, spend on the app. By keeping children engaged, the attorneys general say TikTok increases its advertising revenue while neglecting the well-documented mental health risks associated with excessive social media use.
Legal and Legislative Responses
As the lawsuits move forward, state officials have made clear that they intend to pursue legal action against other social media platforms they believe promote unhealthy habits among youth. Attorney General Platkin has signaled that his office will also target Meta and Discord, two other platforms that have faced scrutiny for their impact on younger users. Platkin reiterated his commitment to holding tech companies accountable, saying, “We will continue to go after platforms that profit at the expense of our children’s well-being.”
For TikTok, this legal pressure adds to mounting challenges to its operations, particularly as it faces scrutiny from the U.S. government over data privacy and national security. The lawsuits raise additional questions about the platform’s responsibility to protect its users, particularly vulnerable young people who may be susceptible to harmful content.
The Broader Debate on Youth and Social Media
The legal action against TikTok is part of a broader national debate about the role social media plays in the mental health crisis among children and teens. Advocates for greater regulation of social media argue that platforms have a duty to prioritize the health and safety of young users, while tech companies often contend that parents and guardians should play a larger role in managing their children’s screen time.
The outcomes of these lawsuits could set significant precedents for how social media platforms are regulated. As the cases proceed, experts believe that rulings in favor of the states could force tech companies to overhaul their practices, making it harder for them to design platforms in ways that exploit children’s engagement.
As the legal battles unfold, many parents, educators, and mental health professionals will be watching closely, hoping for reforms that can curb the negative impact of social media on the youngest and most vulnerable users.