Protecting Minors on Social Media
Clare Reynolds
In September 2021, Stephanie Mistre’s worst nightmare came true: her daughter had taken her own life.[1] While mourning her daughter’s death, Mistre looked through the teen’s cell phone and found that her TikTok “For You Page”[2] was full of videos promoting suicide and detailing methods for it.[3] Joined by several other families, Mistre is now involved in a highly publicized suit against TikTok, alleging that the app’s algorithm pushed these videos to her minor daughter and contributed to her death.[4] The suit is one of many recent claims that social media and other online platforms are not doing enough to protect minors.[5]
TikTok’s response to these claims is that it has sufficient guardrails in place to protect minors.[6] The company argues that its guidelines explicitly forbid mentions of suicide, that it employs over 40,000 trust and safety professionals worldwide, and that users who search for suicide-related videos are referred to mental health services.[7] This response, however, says little about protections for minors in particular. As social media rapidly grows in popularity and accessibility, especially among younger generations, serious concerns have emerged about how these platforms protect minors from damaging, predatory, or otherwise harmful content.
Instagram, TikTok, Snapchat, Facebook, and other social media apps require account holders to be at least 13 years old.[8] Once past that threshold, users under 18 have virtually unlimited access to a plethora of content on social media – including content meant for adults. So the question becomes: what do we do about this? Is the burden on social media companies to protect their minor users from inappropriate content or exploitation of their data? Is it on regulators and lawmakers? Or does it fall on minors’ legal guardians to ensure their safety? Different authorities have suggested different solutions.
One digital privacy expert says that the key to protecting minors lies in enforcing existing privacy legislation and ensuring that large tech providers actually operate and promote their platforms within the bounds of current regulations.[9] Other experts agree, finding that despite international legislation’s stated goal of protecting minors, enforcement has thus far been insufficient.[10] More effective enforcement mechanisms include explicit language defining when violations occur and sanctions for those violations, counterbalancing jurisdictional rules by empowering consumer organizations to bring complaints, and harmonizing existing enforcement structures into a broad, horizontally aligned group with shared goals.[11]
Another group of authors argues that even if current digital privacy laws (namely COPPA[12] in the United States) were perfectly enforced, minors, especially those ages 13 to 17, would be inadequately protected online.[13] Their article suggests that merely expanding existing legislation to cover those up to age 18 would be unsuccessful because of this age group’s technological savvy.[14] Instead, the authors argue that new legislation that “recognize[s] the tendency of adolescents to act impulsively and without considering risks, and that take[s] steps to mitigate the consequences of such behavior online” holds more promise.[15] Policies like California’s “eraser button law” – which allows minors to easily, quickly, and clearly remove their own posts – could help protect minors from themselves.[16] The same state law also prohibits platforms from advertising certain products to minors.[17] The authors recognize, however, that no legislation will be perfect, and that whatever the law says, some onus remains on guardians to protect their minor children from potentially harmful content on the Internet.[18]
Finally, some have suggested that guardians should be the primary catalysts to fill the gap between the law and the reality of minors’ lives online.[19] One Northeastern law student says that although changes in both legislation and enforcement are clearly needed, parents should do their part to protect their minor children and supplement those changes.[20] This is especially true for minor influencers and for parents who post their children online as part of their own personal brand. Scholars and laypersons alike seem to agree that parents who post their children for content should stop doing so for the children’s protection, or should at least set up accounts that let the children benefit financially from the practice while adding safeguards to keep them from harm.[21]
In sum, protecting minors from harmful content on the Internet is critically important to preventing serious harm to their mental health and, as the Mistre case shows, to their lives. This is just one of many areas in which the legal landscape has yet to catch up with the rapidly evolving digital world.
[1] Tom Nouvian, Families sue TikTok in France over teen suicides they say are linked to harmful content, Wash. Post (Jan. 25, 2025) https://www.washingtonpost.com/business/2025/01/23/tiktok-france-trial-suicide-lawsuit/577aad64-d956-11ef-85a9-331436ec61e9_story.html.
[2] A “For You Page” (FYP) is “a personalized feed based on your interests and engagement” and is the first thing you see upon opening the app. TikTok, For You, https://support.tiktok.com/en/getting-started/for-you (last visited Feb. 5, 2025).
[3] Nouvian, supra note 1.
[4] Id.
[5] See, e.g., Isaiah Poritz, Meta Can’t Escape States’ Claims it Hooked Kids on Social Media Platforms, Bloomberg News (Oct. 15, 2024, 6:02 PM) https://news.bloomberglaw.com/litigation/meta-must-face-some-claims-by-state-ags-in-addiction-lawsuit (explaining a lawsuit brought by dozens of state attorneys general alleging that Meta and its platforms caused children to become addicted to social media).
[6] Nouvian, supra note 1.
[7] Id.
[8] Mindaugas Jancis, Social media age restrictions: a comprehensive list, Cyber News (Oct. 21, 2024, 6:33 AM) https://cybernews.com/editorial/mature-content-online/.
[9] Rachel Reed, Candy Crushed, Harvard Law Today (Aug. 12, 2024) https://hls.harvard.edu/today/dojs-lawsuit-against-tiktok-signals-more-aggressive-policing-of-childrens-privacy-online-says-harvard-law-expert/.
[10] M. Cantero Gamito, Do Too Many Cooks Spoil the Broth? How EU Law Underenforcement Allows TikTok’s Violations of Minors’ Rights, 46 J. Consumer Pol’y 281, 283 (2023).
[11] Id. at 299-301.
[12] 15 U.S.C. §§ 6501–6505 (Children’s Online Privacy Protection Act).
[13] Caitlin R. Costello, Adolescents and Social Media: Privacy, Brain Development, and the Law, 44 J. Am. Acad. Psychiatry L. 313, 316 (2016).
[14] Id. at 319.
[15] Id.
[16] Id.
[17] Id.
[18] Id. at 320.
[19] Amber Kolb, Influencing a New Generation: Guardians’ Duties to Protect the Interests and Safety of Children on Social Media, 57 Fam. L.Q. 55, 57 (2023-2024).
[20] Id.
[21] See, e.g., id. at 72; Elizabeth Martinez, You’re on Your Own Kid, Influencers, The Regulatory Review (Aug. 13, 2024) https://www.theregreview.org/2024/08/13/martinez-youre-on-your-own-kid-influencers/.