A small number of social media platforms in Australia must now take reasonable steps to prevent anyone under 16 from holding an account. The government claims the reform will protect a generation facing bullying, anxiety and a ‘loss of childhood’. It echoes a global panic about smartphones and youth.

A closer look reveals a deeper story: the age ban is a blunt instrument, pushed by old media interests, enabled by social media’s own failings, and built on shaky evidence and a limited definition of harm. It risks pleasing anxious parents while locking young people out of debates about their own lives.

Panic over policy

The move repeats a familiar policy pattern. Just like the News Media Bargaining Code in 2021, the age ban bears the fingerprints of Australia’s legacy media sector. Back then, legacy outlets lobbied hard for a law forcing platforms like Google and Facebook to pay for news, framing the fight as ‘tech giants vs democracy’. In reality, it was a political bargain: governments helped prop up failing business models, and media outlets helped shape a culture war over platforms.

Now the target is the same villain, but the story has changed. Instead of threatening journalism, social media is blamed for harming children. The simplicity of the narrative helps. It makes for punchy headlines. It plays on parental anxieties. And it lets governments claim decisive action against big tech.

Social media also made itself an easy mark. For years, the giants of the sector traded away public trust. Weak moderation, opaque algorithms, failure to uphold hard-fought safeguards and endless engagement tactics undermined the industry’s social licence. It is now seen as irresponsible, even predatory.

But a ban is a poor answer to a complex problem.


The harms linked to social media are real. But they are often vague or over-stated. The evidence for a causal link between smartphone use and youth depression remains contested. Bullying happens online, but also offline. The idea of ‘lost childhood’ sounds dramatic. But childhood now includes digital life: friends, learning, identity and play. A ban cuts young people off from social spaces that matter. It does not fix the roots of harm, nor does it appreciate how childhoods, and children’s media consumption, have changed.

The government sells the ban as common sense. Ministers speak to parents. But the people most affected were never asked. A recent ABC survey of 17,000 Australian teens shows that just nine per cent support the ban. Most instead see it as paternalistic and out of touch. They worry it will cut them off from friends, culture and support networks. And they know that every teenager can route around an age check or quickly locate an alternative platform.

There is a way to make social media safer: start by listening to young people. In our own research and that of our peers, young Australians outlined a clear set of reforms that would make platforms healthier for everyone, not just minors. Their ideas are precise, practical, and in step with debates among academic experts and civil society.

They want stronger tools to deal with abuse. They want more transparency: clear explanations when content is removed or accounts are restricted, so they understand the rules instead of feeling arbitrarily punished.

They want privacy by design, not as an afterthought, with data collection minimised and choices respected. They want genuine avenues for input, platforms that actually listen to their concerns, test new features with young people and act on feedback rather than imposing rules from above. Above all, they want social media that supports their wellbeing, creativity and connection, not just engagement metrics.

These ideas are simple. They touch the business model. They give users choice. They do not punish young users; they reform the system. Australia’s model does none of that. It leaves the system as it is and merely pushes teens away from rights and safety tools.

A warning to Europe

Europe should watch this closely. Some leaders already praise Australia as bold. But a wave of age bans will fix little. It will bring harsh rules and no real change to platform design.

This fight is also about power in small democracies. Old media still shapes the agenda in Australia. It frames platforms as a threat. It uses its reach to push policy. Tech companies now sit on the back foot. They face anger from the public and pressure from lawmakers. Their poor record on safety gives critics an easy target.

If leaders want to help young people, though, they need more than bans. They need to invest in youth services. They need strong digital education. They need to support families and schools. And they need to treat teenagers as citizens with views worth hearing.

Social media is now a firm part of youth culture, offering care and community. Teens use it to build identity and find support. They learn from peers, share art and run national campaigns for change. They form groups that help them feel less alone. A law that cuts access may reduce some harms. It may also cut away these substantial benefits. Young people no longer trust legacy media to speak for them. They are finding their own path online, building news habits through creators and friends, not stale newspapers and TV bulletins.

There is another issue that sits behind all this: observability. We cannot improve platforms if we cannot see how they work. For years, companies have refused researchers access to the data needed for independent study. We cannot see how algorithms spread harmful content, how important moderation decisions are made, or how attention is monetised. Researchers and civil society groups have long worked in the dark, piecing together whatever scraps resourceful colleagues manage to collect.


Europe changed that. The GDPR gave people rights over their data. It allowed access, correction and removal. The Digital Services Act (DSA) bolstered this. It forces large platforms to share data with approved researchers. It requires audits on risks. It brings the feed into public view. These laws have opened a window into the black box.

This access matters. It begins to let us understand how platforms shape politics and culture. It helps us see when and how harmful content spreads. It lets us test whether ‘safety by design’ works. And it gives civil society the beginnings of a way to hold platforms to account.

But this progress faces a threat. The European Commission has proposed a Digital Omnibus reform that could weaken parts of the GDPR and allow companies to refuse access to personal data when the request is made for research. That would end ‘data donation’, where users choose to share their data for public study. This is a major step backwards.

If this change passes, researchers will lose the tools to track harm. The black box will close. This should worry anyone who wants healthy digital spaces. Without data, we cannot locate harms. We cannot reveal patterns of abuse. We cannot see how feeds push some types of content and bury others. Platforms win through silence and obfuscation.

The positive reform ideas from teens need observability to work. Feed control means little if we cannot see what the feed does. Safety tools matter, but we need proof they work. Controls help choice, but we must test if they make a positive difference. Research and public oversight make this possible.

Europe stands at a crossroads. It can lead on digital rights. Or it can undo its best tools. The age ban in Australia shows how fear can drive uninformed policy. It shows how old media still shape the debate.

The question is simple: do we want teens out of social media? Or do we want social media to work for teens? The first path gives a headline. The second gives us options.

Australia chose the headline. Others can choose the future. For that, we must listen to young people, invest in their lives and open the platforms to public view. Only then can social media serve care, connection and culture — not panic.