
On October 8, California Attorney General (AG) Rob Bonta filed a lawsuit against TikTok, accusing the social media platform of deliberately targeting young users with addictive and dangerous features.

Leading the charge for 13 states, AG Bonta alleges TikTok violated California’s Unfair Competition and False Advertising Laws, which prohibit dishonest marketing and fraudulent business practices, by collecting young users’ data and misleading the public about the app’s “addictive features” and “harmful content.” The lawsuit seeks “significant penalties” from TikTok, including an injunction and monetary damages against the platform.

The AG’s lawsuit is another attempt in the U.S. government’s growing list of actions to regulate social media companies like TikTok and protect users from online dangers. In April, President Biden signed bipartisan legislation giving Beijing-based parent company ByteDance nine months to sell TikTok or face nationwide prohibition in the U.S. The company is challenging the law in court and continues to operate in the U.S. unabated.

While the federal government’s actions against TikTok are primarily focused on its foreign intelligence threat, the state’s lawsuit centers on the platform’s addictive features, which the AG believes are fueling a rising mental health crisis among young people.

A growing body of studies, including social media companies’ internal research, supports the notion that excessive social media use can harm mental health, including elevated rates of anxiety, depression, self-harm and suicide. A 2024 University of California, San Francisco study, for instance, found that extended screen time for preteens increases the likelihood that they will develop symptoms of severe mental illness, a finding that is especially concerning given that 67% of 13 to 17-year-olds use TikTok and adolescents spend an average of 3.5 hours a day on social media.

Despite mounting state and federal pressure, including hundreds of lawsuits, social media companies have been slow to alter the most dangerous aspects of their platforms or install meaningful safeguards for users under 18. This inaction underscores how critical the teen demographic is to social media’s bottom line: these companies will do anything for profit, even if that means harming their youngest users.

According to internal documents from an ongoing lawsuit against TikTok, company executives were recorded “speaking candidly about a host of dangers for children,” including the absence of safeguards for user time management on the app. TikTok managers commented that the “goal is not to reduce the time spent” but to “contribute to daily active users [DAU] and retention.” Simply put, social media executives want users to spend more time on their apps because it increases profits.


AG Bonta also referenced this point at a San Francisco news conference, saying that TikTok has “chosen profit over the health of our children” and that “youth addiction is a key and central pillar to TikTok’s business model.”

So, what can be done to ensure teens don’t fall victim to these platforms’ negative influences while the powerful social media companies battle with the government and the courts?

Earlier this year, U.S. Surgeon General Dr. Vivek H. Murthy offered one answer to this question: requiring social media companies to include warning labels on their platforms that would explicitly inform users of safety issues, raise awareness about mental health risks and possibly change behavior.

Like other dangerous and addictive products, such as tobacco and alcohol, social media should be treated as a genuine risk to mental health. Adding warning labels to social media platforms would allow parents and younger users to knowingly accept the potential risks of engaging with these sites as currently designed.

Schools can and should also offer teens counseling and information on the dangers of social media. Much like sex and drug education, which for years have been taught in schools from a harm reduction perspective, the hazards of social media should be presented to teens in a practical, nonjudgmental way to minimize mental health risks.

Ultimately, because children cannot be expected to understand all the complexities of social media and because technology has become an increasingly significant part of young people’s lives, it falls to parents to monitor their children’s use of social media and help them establish healthy behaviors online. This includes monitoring screen time and helping children develop self-discipline with social media at a young age, thereby encouraging them to set boundaries and make the right choices on their own as they get older.

Social media companies like TikTok and others must cease exploiting and harming young users and deceiving the public about the dangers of their platforms. However, until that happens, teens and parents need to fully understand the risks and consequences of engaging with these platforms and take proactive actions to protect themselves.
