On Thursday, Facebook parent company Meta became the subject of a formal European Union investigation into potential violations of the bloc’s stringent online content law, particularly over child safety risks.
The European Commission, the EU’s executive branch, announced in a statement that it is examining whether Meta’s Facebook and Instagram platforms “may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’.”
The Commission also expressed concerns about age verification processes on Meta’s platforms and privacy risks associated with the company’s recommendation algorithms.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” a Meta spokesperson said.
“This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
The Commission’s decision to launch an investigation follows a preliminary analysis of a risk assessment report provided by Meta in September 2023.
Thierry Breton, the EU’s commissioner for the internal market, stated that the regulator is “not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms.”
The EU emphasized that it will conduct an in-depth investigation into Meta’s child protection measures “as a matter of priority.” The bloc will continue to collect evidence through requests for information, interviews, or inspections.
The initiation of a DSA probe allows the EU to take further enforcement actions, including interim measures and non-compliance decisions, according to the Commission. It added that it can also consider commitments made by Meta to address its concerns.
Meta and other U.S. tech giants have come under increasing EU scrutiny since the introduction of the bloc’s Digital Services Act, a landmark law aimed at addressing harmful content online.
Under the EU’s DSA, companies can be fined up to 6% of their global annual revenues for violations. The bloc has not yet issued fines to any tech giants under this new law.
In December 2023, the EU initiated infringement proceedings against X, the company previously known as Twitter, over suspected failures to combat disinformation and content manipulation.
The Commission is also investigating Meta for alleged DSA violations related to its handling of election disinformation. That probe, launched in April, reflected concerns that the company had not done enough to combat disinformation ahead of the upcoming European Parliament elections.
The EU is not the only authority taking action against Meta over child safety concerns.
In the U.S., the attorney general of New Mexico is suing the company over allegations that Facebook and Instagram facilitated child sexual abuse, solicitation, and trafficking.
A Meta spokesperson at the time stated that the company uses “sophisticated technology” and other preventive measures to eliminate predators.