Australia is pioneering a global shift in how young people access social media, having recently enacted a law restricting access for those under 16. The move, which has seen over 4.7 million accounts blocked in its first month, is prompting similar considerations from countries across Europe and beyond, as concerns grow over the impact of social media on youth mental health and safety.
The Australian legislation, known as the Social Media Minimum Age Scheme, is an amendment to the country’s online safety laws. It requires social media platforms to actively verify users’ ages; platforms that fail to comply face fines of up to 49.5 million Australian dollars (approximately 680 million Czech koruna).
The substantial penalties are driving technology companies to prioritize age verification and combat false age declarations. Meta Platforms, the parent company of Facebook and Instagram, has already removed around 550,000 accounts potentially belonging to users under the age of 16 to comply with the new regulations.
Artificial Intelligence Plays a Key Role
While a standardized technological solution hasn’t emerged, social media platforms are largely relying on artificial intelligence (AI) to verify user ages. Companies are hesitant to publicly disclose their specific methods to maintain a competitive edge.
AI systems automatically scan user profiles, estimating age based on activity and behavior. Advanced systems can even assess age from photographs. When discrepancies are detected, accounts are typically blocked, requiring users to verify their identity with official documentation like a passport or national ID card.
Australian lawmakers say the measures are already proving effective, pointing to the millions of accounts blocked in the first month after the law came into force.
The trend is gaining momentum in Europe. France has approved a bill prohibiting social media use for children under 15, which could take effect in September 2026 pending Senate approval. The legislation emphasizes protecting children’s mental health from cyberbullying and harmful social comparison.
Spain is also planning a ban on social media access for anyone under 16, without exceptions, and is demanding robust age verification systems from platforms. The proposed law also includes stricter penalties for algorithmic manipulation and the spread of illegal content.
“We will change the law so that platform managers are legally responsible for the violations that occur on their networks,” stated Spanish Prime Minister Pedro Sánchez. He likened social media networks to a digital playground created without regard for children’s safety. “Our children are exposed to a space they should never have navigated alone. We will protect them from the digital Wild West.”
Denmark and Germany are also discussing restrictions on social media access for children or stricter age verification, inspired by the Australian model. The development highlights a growing international concern over the potential harms of social media to young people.
Discussions regarding limiting social media access for children are also underway in the Czech Republic.
What Do Social Networks Say?
Technology companies are responding cautiously to the evolving landscape. Representatives generally state their willingness to cooperate with authorities in implementing technical measures, while also emphasizing the technological challenges and costs involved.
Companies are primarily implementing changes in countries where required by law, as few are willing to voluntarily relinquish users, even young ones. Instead, platforms are increasingly focusing on providing parents with tools to control their children’s accounts. YouTube, for example, recently introduced options for parents to limit the time their children under 13 can spend watching short-form videos on Shorts, and even block access entirely.
YouTube Shorts, whose videos last only seconds, can nonetheless add up to hours of daily screen time, particularly among children.
Experts worldwide agree that excessive screen time has negative impacts on child development and well-being. Protecting minors is the primary justification for the proposed and enacted bans.
