New European push to curb children’s social media use

The European Commission wants to launch an age-verification app next month

From dangerous diet tips to disinformation, cyberbullying to hate speech, the glut of online content harmful to children grows every day. But several European countries have had enough and now want to limit minors’ access to social media.

The EU already has some of the world’s most stringent digital rules to rein in Big Tech, with multiple probes under way into how platforms protect children, or fail to.

There are now demands for the EU to go further as a growing body of evidence shows the negative effects of social media on children’s mental and physical health.

Backed by France and Spain, Greece has spearheaded a proposal for how the EU should limit children’s use of online platforms as fears mount over their addictive nature.

The three countries were due to discuss the plan with their EU counterparts in Luxembourg yesterday.

The proposal includes setting an age of digital adulthood across the 27-country EU, meaning children would not be able to access social media without parental consent.

France, Greece and Denmark believe there should be a ban on social media for under-15s, while Spain has suggested a ban for under-16s.

Australia has passed a ban on social media for under-16s that will enter into force later this year, while New Zealand and Norway are considering similar measures.

‘Better protect children’

France has led the way in cracking down on platforms, passing a 2023 law requiring them to obtain parental consent for users under the age of 15.

But the measure has not received the EU green light it needs to come into force.

France has also been phasing in requirements this year for all adult websites to verify users’ ages to prevent children accessing porn, with three major platforms going dark this week in protest at the move.

TikTok, also under pressure from the French government, on Sunday banned the “#SkinnyTok” hashtag, part of a trend promoting extreme thinness on the platform.

“We have an opportunity that should not be missed, and that’s what I also came to say to the European Commission today: age verification is possible,” French Digital Minister Clara Chappaz told reporters.

She pointed to work “in progress” in France for adult websites. “We want the same thing for social media.”

Chappaz added that although countries pushing the proposal were not aligned on the age for a ban, they all agreed on the need to enforce age verification properly.

The worry is that children as young as seven or eight can easily create accounts on social media platforms, despite a minimum age of 13, simply by giving a false date of birth, she said in Luxembourg.

Her Danish counterpart Caroline Stage Olsen emphasised that children should be as protected online as they are in the real world.

“We need to do something to make sure they are better protected than they are today,” she added.

In-built age verification

France, Greece and Spain expressed concern that the algorithmic design of digital platforms increases children’s exposure to addictive and harmful content, with the risk of worsening anxiety, depression and self-esteem issues.

The proposal also blames excessive screen time at a young age for hindering the development of minors’ critical-thinking and relationship skills. The three countries demand “an EU-wide application that supports parental control mechanisms, allows for proper age verification and limits the use of certain applications by minors”.

The goal would be for devices such as smartphones to have in-built age verification.

The European Commission, the EU’s digital watchdog, wants to launch an age-verification app next month, insisting it can be done without disclosing personal details.

The EU last month published draft guidelines for platforms on protecting minors, including setting children’s accounts to private by default and making it easier to block and mute users; they will be finalised once a public consultation ends this month.

Those guidelines are non-binding, but the bloc is clamping down in other ways. It is currently investigating Meta’s Facebook and Instagram, as well as TikTok, under its mammoth content moderation law, the Digital Services Act (DSA), over fears the platforms are failing to do enough to prevent children accessing harmful content.

And last week, it launched an investigation into four pornographic platforms over suspicions they are failing to stop children accessing adult content.
