Online safety advocates are hopeful that recent landmark trial verdicts could bring significant changes to social media platforms long criticized for their impact on children and teens. The trials have resulted in juries holding Meta and YouTube accountable for the harms suffered by young users, a development that advocates say validates their long-standing concerns.
Julianna Arnold, founder of the advocacy organization Parents RISE!, expressed her relief in light of the verdicts, emphasizing the need for action. Arnold’s own advocacy stems from the tragic death of her daughter Coco, which she attributes to Instagram. “Now we have the proof to back up and validate the stories we’ve been telling,” she stated.
The trials marked the first instances where juries of everyday citizens evaluated the safety of social media for minors, leading to troubling findings. A jury in New Mexico found Meta liable for creating an environment conducive to child predation, while another jury in California determined that both Meta and YouTube knowingly designed addictive platforms that failed to inform users and parents about potential risks, ultimately harming a young woman’s mental health.
The damages awarded, though small relative to the valuations of Meta and Google, pose a significant problem for both companies, which face numerous similar lawsuits. Repeated unfavorable verdicts could add up to billions in penalties and force major changes to their platforms. Both companies have indicated plans to appeal. A Meta spokesperson argued that teen mental health is complex and cannot be attributed solely to its app, while a Google representative stressed that YouTube is a responsibly built platform.
Despite the companies’ claims about their investment in user safety features—ranging from parental oversight tools to content restrictions—advocates call for more substantial reforms. Among the changes desired are the elimination of notifications that promote excessive use, particularly those that create anxiety around social interactions, such as Snapchat’s “Snap Streak” feature. Critics argue that these functions are designed to keep users engaged daily and are detrimental to young people’s mental health.
Furthermore, experts have urged social media companies to be transparent about how user data influences content recommendations. Jonathan Haidt, a social psychologist, noted that autoplay features, common on platforms like Instagram and TikTok, contribute to addictive usage patterns and should be reconsidered. Some advocates have proposed redesigning social media environments to limit their addictive qualities altogether.
Arnold is looking ahead, advocating for comprehensive online safety legislation. Lawmakers have discussed such measures for years but have yet to enact them. She expressed a desire for laws requiring companies to prioritize users’ safety, much as makers of other products are required to do under U.S. law. Senators Marsha Blackburn and Richard Blumenthal are currently promoting the Kids Online Safety Act, which would require platforms to safeguard minors’ data and offer protective tools for parents.
Some advocates have called for a system similar to Australia’s, which restricts social media access for children under 16. However, concerns about privacy and age-verification processes complicate this approach. As lawsuits alleging tech negligence mount, the consensus among advocates and watchdog groups is clear: current measures are insufficient, and the status quo must change to better protect young people.


