Social Media Addiction x Meta Lawsuit

At YIP, nuanced policy briefs emerge from the collaboration of six diverse, nonpartisan students.

Abstract

This brief examines the current legal cases against the social media corporation Meta Platforms, Inc. over addictive app features that have harmed the mental health of children and teenagers.

Executive Summary

Meta Platforms is facing lawsuits alleging that it designed addictive social media features targeted at children and teenage users. This brief examines the basis of the case, its history, and the future implications for underage users.

Overview

Meta Platforms is being sued by dozens of US states, including California and New York, for intentionally designing features on Facebook and Instagram that drive young users to develop an addiction to the platforms, endangering young people's mental health and exacerbating the youth mental health crisis. The lawsuits claim that Meta intended to keep young users hooked to boost profits and that it gathers data on children under 13 without their parents' consent, a possible violation of federal law.

Colorado Attorney General Phil Weiser, who has concentrated on the deterioration of young people's mental health and its relationship to social media, states that “the features designed for these apps, the infinite scroll feature, for one example, are done with an awareness that they drive addictive behavior, that there are no guardrails, that young people aren't self-regulating, and that they’re adopting these technologies, to their detriment.”

The case, which was filed in federal court in California, also alleges that Meta frequently violates federal law by gathering data on children under the age of 13 without their parents' permission.

Forty-two states are now taking action, as nine additional attorneys general have filed suits in their states on top of the 33 that had already done so. In its statement, Meta said it shares the attorneys general's commitment to providing children with safe and positive online experiences, and it pointed to the thirty tools it has already built to support teenagers and their families on social media.

This extensive lawsuit follows inquiries conducted by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee, and Vermont. It also comes after scathing newspaper articles first published in the autumn of 2021 by The Wall Street Journal.

The articles chronicled Meta's own research, which revealed that the company was aware of the negative effects Instagram can have on teens' mental health and body image, particularly for adolescent girls. According to an internal survey, 17% of teenage girls said Instagram exacerbates eating disorders, and 13.5% said it exacerbates suicidal thoughts. After the first revelations, a group of news outlets, including The Associated Press, released their own analyses based on documents disclosed by Frances Haugen, a whistleblower who has since testified about her findings before Congress and a British parliamentary committee. Social media use among youth is nearly universal in the United States and many other countries.

History

The rise of modern social media, as the world knows it, began in the early 2000s, with the launch of Facebook in 2004, Reddit in 2005, and Twitter in 2006. Facebook started as a platform for college students and expanded to high school students in the US and the UK. With investment and time, Mark Zuckerberg developed "The Wall," an area of the user's profile where their friends could post public messages, prompting the user to check their feed constantly. With the implementation of other features, such as the like button and timeline, Facebook became a platform that more and more users gravitated towards. In 2018, Instagram showed that it was following a similar path, reaching one billion users that year. 

With the rise in Instagram's use and a minimum registration age of only 13, the platform has become popular among teenagers and young adults, with more than half of Instagram's users under 34. While the platform helps young users stay in contact with friends and form new relationships, social media's easy access and quick rewards can be especially harmful to younger users. This constant pressure can lead to significantly negative consequences for teenagers who struggle to cope with it.

With the rise of TikTok, there has been an increase in concerns over internet and social media safety, especially for teenagers. These concerns center on privacy, on the safety of the content shown, and on how that content is displayed, particularly on how social media algorithms are designed to keep users constantly engaged with the platform. In late 2023, this concern came to a head when dozens of states filed lawsuits against Meta, the parent company of Instagram, for using "powerful and unprecedented technologies to entice, engage and ultimately ensnare young people."

This lawsuit goes back to 2021, when whistleblower Frances Haugen revealed thousands of internal Facebook documents and research, saying the company puts 'profits over safety.' In an interview with CBS News, Haugen said one of Facebook's internal studies showed that Instagram harms teenage girls because the algorithm leads them to see the same triggering content over and over again, without giving them the power to leave the platform.

Then, in late 2021, state attorneys general came together to address the issue of declining youth mental health in tandem with social media. Colorado Attorney General Phil Weiser stated that Meta implemented features knowing they were addictive and also violated the federal Children's Online Privacy Protection Act by marketing to children under 13 and collecting their data without parents' consent.

These years of warnings and research have culminated in the ongoing lawsuits that are the focus of this brief.

Policy Problem

Meta's perpetuation of addictive content raises the question of whether young, impressionable individuals should face such an endless stream of content daily.

While the addictive content itself is a considerable issue, the deeper question is whether Meta deliberately perpetuates it. That question is ultimately at the forefront of the allegations against Meta: if the company purposefully developed addictive algorithms, a strong case for negligence could be made against it.

Record levels of poor mental health among adolescents, coupled with the dynamic, manipulative algorithms of Meta's platforms (namely Facebook and Instagram) that surface ever more addictive content, have served as the impetus for thousands of parents and activists across the country to take action. While the personalization algorithm is beneficial in connecting individuals with people who share their interests or hobbies, it has proven a double-edged sword, simultaneously driving increasingly addictive content.

Although there are technically age restrictions on Instagram and Facebook preventing children younger than 13 from having an account, getting around those restrictions is often accomplished without much hassle, which has been cited as evidence of negligence on Meta's part. Beyond the ease of bypassing age restrictions, a complaint of more than 200 pages filed by multiple states, including Arizona, California, and Colorado, alleged that Meta engaged in a "scheme to exploit young users for profit." The filing proposes that Meta was not only guilty of knowingly using manipulative algorithms but of doing so in pursuit of profit, portraying the company as putting profits over user health.

Even though children and adolescents have, by and large, been most affected by social media's addictive algorithms, it is essential to note that Meta's platforms have had adverse mental health effects on people of all ages, including seniors. Social media use can also generate feelings of social isolation and depression, both of which are recognized risk factors for age-related neurodegenerative diseases like Alzheimer's.

Among its addictive features, constant notifications and unending feeds are two aspects of Meta's platforms that have been shown to harm younger users' mental health. Social isolation, eating disorders, and negative self-perceptions have been reported both by those affected and by those who have witnessed the impact social media has had on their loved ones.

Previous lawsuits against Meta have resulted in payouts in the thousands of dollars or in minor, insignificant changes to Meta's algorithms, outcomes that have ultimately not shifted the collective opinion of Meta's addictive design.

Meta and other social media companies have argued, and continue to assert, that their platforms are protected by the First Amendment right to free speech. Whether that argument will hold up, however, remains to be seen.

Policy Options

After the lawsuit brought by dozens of states, including New York and California, it is clear that a problem lies within the features of Meta's platforms. That problem, which primarily concerns children and teenagers, has been shown to have drastic negative effects on their mental health, so significant change must be implemented within these widely used platforms. Acknowledging the gravity of the situation, policymakers and stakeholders are compelled to explore effective solutions. These solutions must mitigate the harm caused to young people's mental health and counter the addictive features designed into these platforms. Going forward, it is crucial to maintain a balance between preserving the positive aspects of social media and fixing a demonstrably flawed product.

Age-Appropriate Content and Feature Restrictions:

One clear outcome of this mass lawsuit should be stronger age verification. Numerous studies have shown that the addictive features of these platforms cause significantly more harm than good, especially for the 13-and-under demographic. According to the terms of service of these major social media platforms, that demographic should not be using Meta's apps at all. Yet these users easily bypass the age restrictions, leaving millions of underage users on Meta's platforms. This is why there is an outstanding need for stricter age verification to ensure that individuals below a certain age cannot access features known to contribute to addictive behavior.

Whether this is achieved through a release form signed by a parent or guardian or through the scanning of an official government document to prove a user's age, there must be added safeguards on platforms that contain these demonstrably harmful features.

Although this change may hinder the user experience for those "of age," it would benefit society by helping to keep depression and suicide rates down among children.

Under the category of restricting general features within these platforms, it is essential to modify the algorithms to limit younger users' exposure to harmful content and practices. The states' filings allege that Meta cultivates addiction in children and teens to extract every last dollar it can.

Features like the endless scroll and intensive data collection incline users to spend more time on the application by delivering constant stimulation aligned with their interests. Collaborating with child development experts to establish age-appropriate guidelines for content and features would therefore benefit both younger demographics and the broader user base.

Increased Transparency and User Control:

Meta intentionally designed its platform to incorporate addictive features that lure in children at the expense of their mental health and self-esteem. 

These very real statistics and outcomes can no longer fly under the radar. It is vital that Meta provide transparent information about the algorithms it uses to curate content and, most importantly, to keep users on its platforms for as long as possible. This transparency should include accessible, easy-to-understand disclaimers available to all users and required viewing for any user in the teenage demographic. Such far-reaching transparency is needed to build full awareness of the effects these addictive features have on impressionable youth, which in turn would help reduce the elevated rates of depression found in children. To keep this transparency on track, regular and detailed reports on the impact of social media use on mental health would ensure accountability and inform users about the risks associated with prolonged engagement. This should go hand in hand with encouraging parents and guardians of minors to take greater control over content filtering, time limits, and notifications, leading to a healthier social media environment.

Investment in Digital Well-being Research and Education: 

A set of policies requiring Meta to collaborate with independent research institutions to conduct extensive studies on the mental health implications of social media use, particularly among young users, would help Meta develop a more sustainable and user-friendly social media experience. 

An article published by a professor at the University of Pittsburgh found that explaining to teens the addictive nature of social media and the algorithms embedded in it enhanced their awareness and increased their willingness to acknowledge their own overuse of the platforms.

Requiring Meta to collaborate with researchers like this article's author would be a significant investment in digital well-being and would help curb the ongoing rise in youth social media addiction. Given its position of power, Meta's collaboration with such studies would meaningfully advance this crucial awareness. Meta would also have to allocate resources to develop and implement educational programs within its platforms, focusing on responsible online behavior, digital literacy, and strategies for maintaining a healthy balance between online and offline activities. This could be supported by dedicated funds for mental health initiatives, including support for external organizations working on youth mental health, to address broader societal concerns beyond platform-specific issues.

These policy options strive to address the root causes of the harm to young people's mental health while promoting a collaborative approach between the government and Meta to create a safer and more responsible digital environment for users, especially the youth.

Conclusions

This lawsuit, initiated in October 2023, is still ongoing. A separate group of over 40 attorneys general is also leading investigations into TikTok that could bear on the Meta case. Meta has asked for the state attorneys general's lawsuit to be dismissed, along with related lawsuits from consumers suing over alleged harm to teenage and child users. However this case plays out, it could set a new precedent for social media and its relationship to mental health.

Acknowledgment

The Institute for Youth in Policy wishes to acknowledge Michelle Liou, Joy Park, Nolan Ezzet, and other contributors for developing and maintaining the Policy Department within the Institute.

Christine Li

Policy Analyst

Christine is a social policy writer for YIP. Raised in Brooklyn, New York, she loves going on walks and watching late night television shows.

Spencer Samet

Policy Analyst

Spencer Samet is a student at Windward School in Los Angeles, California. He is passionate about current events and plans to pursue political science. Spencer works as a technology policy Co-Lead for YIP and is an active member of his high school's debate team.

Natalie Gelman

Policy Analyst

Tanya Mahesh

Fall 2023 Fellow

Tanya Mahesh is a high school student from Pearland, Texas, with a keen interest in the intersection of business, technology, and policy.

Vaishnavi Moturi

Policy Analyst

Vaishnavi Moturi is a student at Centennial High School and a technology policy analyst at the Institute for Youth in Policy. She is the founder and director of Hello CitizenZ, where she seeks to help create a generation of global citizens while developing technologies that improve public health systems and society’s collective health.

Lyla Renwick-Archibold
