When Stephen Scheeler took the helm as Facebook's Australia chief in the early 2010s, he was filled with optimism about the potential of social media to foster global connection and democratize knowledge. That enthusiasm has waned significantly over the years, leaving him and many others skeptical of the platforms they once championed. By the time he departed the company in 2017, concern had begun to overshadow that initial optimism, leading him to conclude, "There's lots of good things about these platforms, but there's just too much bad stuff."
Critics have increasingly focused on social media's impact on teenagers, highlighting the mental health risks associated with platform usage. In response, governments worldwide have begun addressing these concerns, introducing measures to limit children's exposure to social media. Among these, Australia's upcoming ban on social media accounts for individuals under 16, set to take effect on December 10, represents the most drastic intervention yet. Major tech companies have vehemently opposed the legislation, arguing that it risks exacerbating child-safety problems and infringing on users' rights.
As noted by Paul Taske of NetChoice, a trade group for several tech giants, this Australian ban may establish a precedent that could influence regulations in other nations. Experts are wary, viewing it as a potential “proof of concept” for similar measures globally.
The social media landscape has seen increasing scrutiny due to various whistleblower accounts and lawsuits claiming that companies prioritize profits over user safety. A significant trial is scheduled to begin in January in the U.S., where allegations against several social media giants—including Meta, TikTok, and Snapchat—suggest that their platforms have been designed for addiction while concealing associated harms. The companies deny these claims, but the stakes are clearly high, with top executives being summoned to testify.
Adding pressure, ongoing legal cases allege that executives, including Meta's Mark Zuckerberg, intentionally halted measures designed to improve teens' safety on their platforms, including rejecting proposals to eliminate Instagram features that contribute to body image issues. Former employees have also testified to Congress about company practices that undermined user safety.
The broader industry has faced backlash not only over mental health but also over its handling of misinformation and extremist content. The rapid spread of violent material on social media platforms has intensified calls for accountability. In a rare show of bipartisan unity, American lawmakers have begun to scrutinize tech firms more closely, demanding greater transparency and responsibility.
As discussions about Australia’s ban evolved, tech companies engaged in dialogue with the government, though critics noted their lack of public communication contributed to an atmosphere of distrust. Instagram’s parent company Meta and Snapchat have argued that age verification should fall to app stores like Apple and Google, asserting that the government has overreached its authority.
Australia's decision to bar under-16s from holding accounts, with no exception for parental approval, sets a new global standard for regulating minors' social media use. Minister Anika Wells has expressed solidarity with other nations looking to craft similar legislation, and countries including Denmark, Norway, and several EU members have already begun exploring their own measures.
In the face of impending regulation, social media firms have rushed to implement child-focused features and marketing strategies. Snapchat and Instagram have introduced accounts tailored for younger users, ensuring heightened privacy and safety measures. Despite this, critics argue that these efforts may not go far enough in addressing the systemic issues relating to youth mental health on social media.
While the consequences of Australia's sweeping ban remain uncertain, industry insiders believe it could prompt further scrutiny of social media practices worldwide. Tech companies face hefty fines for non-compliance, though critics worry that some operators may treat such penalties as a manageable cost of doing business.
As Scheeler suggests, even imperfect regulations may be better than allowing the current unregulated usage to persist. This pivotal moment in social media governance may ultimately serve as a starting point for a broader conversation on how best to protect young users in an increasingly digital world.