NextFin News: On October 16, 2025, the European Parliament's Internal Market and Consumer Protection Committee formally proposed a bloc-wide minimum age of 16 for accessing social media platforms, video-sharing services, and AI companions without parental authorization. The proposal, championed by Danish MEP Christel Schaldemose of the Social Democrats, was adopted by a large majority and is scheduled for a plenary vote by the full Parliament in late November 2025. The measure also sets an absolute minimum age of 13 for any social media access, with mandatory parental consent required for users aged 13 to 15.
The proposal responds to growing concern about the impact of digital platforms on minors, citing risks such as addiction, deteriorating mental health, and exposure to illegal or harmful content. The committee urges the European Commission to fully leverage its powers under the Digital Services Act (DSA) to enforce these protections, including imposing fines on or banning non-compliant platforms. The initiative aligns with ongoing efforts by the European Commission, led by President Ursula von der Leyen, to establish a 'digital majority' age across the EU, with expert recommendations expected by year-end.
Member states within the EU remain divided on the issue, but momentum is building, notably under Denmark’s rotating EU presidency, which advocates for a digital majority age of 15. The committee’s report also calls for banning manipulative platform features such as infinite scrolling, autoplay, disappearing stories, and loot boxes in games accessible to minors, alongside stricter age verification systems that respect privacy rights.
According to a recent Eurobarometer survey, 74% of EU youth aged 15-24 follow influencers on social media, and 65% rely on these platforms as their primary news source, underscoring the pervasive influence of digital media on young Europeans. The committee’s recommendations aim to curb the negative externalities of this engagement by imposing stricter access controls and platform accountability.
From a regulatory perspective, this proposal represents a significant escalation in the EU’s digital child protection framework. It moves beyond voluntary platform policies and non-binding guidelines towards enforceable, harmonized rules across all 27 member states. The emphasis on safety-by-design and banning addictive engagement mechanisms reflects a shift towards proactive platform responsibility rather than reactive content moderation.
Technologically, implementing an EU-wide minimum age of 16 will necessitate robust, privacy-preserving age verification systems. These systems must balance effective enforcement with data protection, avoiding intrusive identity checks that could deter users or raise privacy concerns. The Commission’s ongoing development of such systems will be critical to the proposal’s success.
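One privacy-preserving approach often discussed in this context is attribute attestation: a trusted issuer checks a user's birthdate once and hands the platform only a signed "over 16" claim, so the platform never sees the date of birth itself. The sketch below is purely illustrative, not the Commission's design; the function names are invented for the demo, and the shared HMAC key stands in for the asymmetric signatures and identity-wallet standards a production system would use.

```python
import hmac
import hashlib
import json
from datetime import date

# Illustrative shared secret; a real issuer would sign with a private key
# and platforms would verify against its published public key.
ISSUER_KEY = b"demo-issuer-secret"

def issue_attestation(birthdate: date, today: date) -> dict:
    """Issuer checks the birthdate locally and emits only a boolean claim."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    claim = {"over_16": age >= 16}
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    # The attestation carries the claim and signature, never the birthdate.
    return {"claim": claim, "sig": sig}

def platform_verify(attestation: dict) -> bool:
    """Platform validates the issuer's signature and the over-16 claim."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, attestation["sig"])
        and attestation["claim"]["over_16"]
    )

att = issue_attestation(date(2007, 5, 1), date(2025, 10, 16))
print(platform_verify(att))  # True: user is 16+, birthdate stays with the issuer
```

The design point is data minimization: the platform learns a single boolean, which is the kind of non-intrusive check the committee's privacy requirement points toward.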
Economically, the proposal will impact major social media companies operating in Europe, including Meta (Facebook, Instagram), Google (YouTube), and emerging AI platform providers. Compliance costs will rise due to enhanced age verification and content moderation requirements. However, these costs may be offset by increased user trust and reduced regulatory risks. The proposal also signals potential personal liability for senior management in cases of persistent breaches, increasing corporate governance stakes.
Socially, raising the minimum age to 16 aims to protect adolescent mental health and reduce early exposure to harmful digital content. Research increasingly links early social media use to anxiety, depression, and addictive behaviors. By restricting access, the EU hopes to foster healthier digital habits and safeguard youth development.
Looking forward, if adopted, this policy could set a global precedent for digital age regulation: Australia recently enacted a comparable under-16 restriction, and similar measures are under debate in the United States. The EU's integrated market and regulatory clout mean that platform operators may well implement uniform age controls worldwide to maintain compliance.
However, challenges remain. Enforcement across diverse member states with varying digital infrastructures and cultural attitudes will require coordinated efforts. There is also the risk of circumvention through VPNs or unregulated platforms. Moreover, balancing child protection with digital inclusion and freedom of expression will require nuanced policy calibration.
In conclusion, the European Parliament committee’s proposal to set 16 as the minimum age for social media access marks a pivotal step in digital child protection policy. It reflects mounting political will to address the complex harms of online platforms on youth, leveraging regulatory tools like the DSA and emerging age verification technologies. The proposal’s adoption and implementation will significantly reshape the digital landscape for European minors and potentially influence global standards in the years ahead.
According to Biometric Update, this initiative also includes banning addictive design features and loot boxes in games accessible to minors, emphasizing a comprehensive approach to digital safety. The proposal is part of a broader EU strategy to enforce the Digital Services Act more rigorously and ensure platforms prioritize user safety, especially for vulnerable groups.

