The European Commission says it is concerned systems used by apps could ‘exploit weaknesses and inexperience’ of children and stimulate ‘addictive behaviour’.
European Union regulators have opened a formal investigation into Meta for potential breaches of online content rules relating to child safety on its Facebook and Instagram platforms.
The European Commission said on Thursday it was concerned the algorithmic systems used by the popular social media platforms to recommend videos and posts could “exploit the weaknesses and inexperience” of children and stimulate “addictive behaviour”.
Its investigators will also examine whether these systems are strengthening the so-called “rabbit hole” that leads users to increasingly disturbing content.
“In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta,” the bloc’s executive arm said in a statement.
The investigation is being conducted under the Digital Services Act (DSA), a law that compels the world’s largest tech firms to enhance their efforts in protecting European users online.
The DSA has strict rules to protect children and ensure their privacy and security online.
Thierry Breton, the EU’s internal market commissioner, said on X the regulators were “not convinced that Meta has done enough to comply with the DSA obligations – to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram”.
In a statement, Meta said: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them.”