
California, along with 40 other states, has sued Meta Platforms Inc. for designing harmful features that deliberately addict children and young adults to its apps. The lawsuit alleges that Meta, owner of Facebook and Instagram, exploits young users for “profit” by deploying manipulative features that harm children’s mental health. These addictive features violate consumer protection and child safety laws, according to the state attorneys general. As studies increasingly report that younger generations struggle with worsening mental health, the designs and features of for-profit social media sites stand out as a significant factor endangering young users’ safety.

Meta’s designs include specific features that make self-regulation and disengagement from its platforms nearly impossible. Algorithms that log behavior and collect data on what users want to see recommend content that keeps young users on social media for hours. Constant alerts about likes and posts induce users to return to the app throughout the day, often disrupting normal routines. The ability to “infinitely scroll” through posts further deepens engagement with Meta’s products, which can ultimately foster unhealthy online habits. Meta’s apps also promote negative body image through visual filters that target specific physical features, such as eye color or skin tone, contributing to low self-esteem and depression. Though concealed, these designs are plainly built to make Meta’s apps addictive for younger users at the cost of their mental health and safety.

Despite these broader concerns, the attorneys general’s lawsuit focuses on consumer protection laws, which prohibit deceptive and fraudulent practices in business-consumer sales and transactions. According to the complaint, Meta fails to mention in its Privacy Policy that young users’ data is used to “train its Recommendation Algorithms to induce them to keep using the Platforms.” For young children, who often do not grasp the significance of sharing private information, Meta’s lack of transparency about how personal data is collected and used is extremely dangerous. Though Meta states that data is used for “improving our Products … including personalizing content and recommendations,” it should explain directly how young users’ information is used and shared, as these practices can jeopardize their safety and mental health.

The lawsuit also alleges that Meta’s design violates the Children’s Online Privacy Protection Act of 1998 (COPPA), a law requiring tech companies to obtain “consent from parents before collecting personal information of children online”: young users can enter personal information when creating an account before any parental consent is verified. Last year, the Senate sought to update COPPA by raising the protected age from 13 to 16, but the bill stalled. Though the proposal may seem controversial, its revisions are not drastic; they offer workable ways to stop social media platforms from using personal information to hook young users. For example, COPPA 2.0 would establish a “Digital Marketing Bill of Rights for Teens” to limit the collection of teenagers’ personal information and would require companies to include an “Eraser Button” allowing parents and kids to “eliminate personal information from a child or teen when technologically feasible.” If the youth mental health crisis has revealed anything, it is that youth consumer privacy issues are just the tip of the iceberg.

For its part, Meta claims to have implemented measures addressing these issues, such as parental supervision tools and break reminders. These actions are still not enough, however, especially as the company knowingly retains other harmful features. Documents leaked in 2021 by Frances Haugen, a former Meta employee, showed the company knew Instagram directly contributed to teenage girls’ worsening body image and did nothing to contain the problem. Meta’s decision to keep designs and features known to harm young users shows it is more interested in generating profits, even at the expense of the mental health of its youngest users.

Despite recent Congressional efforts toward user safety and security on popular social media apps like TikTok, the federal government continues to play catch-up with social media platforms on protecting youth online. Broader efforts to regulate tech, including antitrust legislation and new safeguards, have failed in Congress amid immense opposition from the tech lobby, leaving a patchwork of state social media consumer protection laws. This lawsuit is a much-needed, progressive step toward outlining guidelines for social media platforms and reining in companies’ profit-driven behavior. Online child safety in all its forms is a significant issue that demands immediate national attention and action. It’s time the health and well-being of youth were prioritized above all else.
