European Commission investigates Meta over child protection concerns

The Facebook, WhatsApp, Instagram, Threads, Meta Oculus, Facebook Messenger, Meta Business Suite, Meta Creator Studio and Meta Workplace apps from Meta are seen on an iPhone. Christoph Dernbach/dpa

The European Commission announced on Thursday that it has opened investigations into Meta over child protection concerns regarding its Facebook and Instagram platforms.

The investigations concern fears that the two platforms "may exploit the weaknesses and inexperience of minors and cause addictive behaviour," according to a commission press release. Another concern is "rabbit hole" effects that "draw you in to more and more disturbing content," a commission official said. The EU executive is also concerned about minors' access to inappropriate content, as well as their privacy.

The two probes into Facebook and Instagram, respectively, fall under the European Union's Digital Services Act (DSA), a broad online content law that requires large platforms to assess and mitigate various risks arising from the use of their services, especially for children.

"We are not convinced that Meta has done enough to comply with the DSA obligations - to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram," said EU industry commissioner Thierry Breton on X.

A commission official explained that a "rabbit hole effect" is created when a platform's algorithms "feed users with content of a certain type - for example leading to depression, or unrealistic body images - that can foster mental health issues in children."

Meta may also be failing to use age verification tools that are "reasonable, proportionate and effective" to prevent minors from accessing content that is inappropriate for children, the commission's press release said.

The commission will also investigate whether the company is falling short of "DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors."

The EU executive will pay particular attention to the default privacy settings Facebook and Instagram have for minors, as well as the "design and functioning" of "recommender systems" which push content towards users.

If Meta is found to have breached the DSA's risk mitigation rules, it could face fines of up to 6% of its global annual revenue.

The company is already subject to DSA investigations over its handling of political advertising, ahead of the forthcoming European Parliament elections on June 6-9.

The commission is also investigating TikTok over similar child protection concerns.

The DSA's risk mitigation rules apply to "Very Large Online Platforms," meaning those with more than 45 million monthly active users in the EU.

Other DSA provisions - such as obligations to have mechanisms allowing users to flag illegal content - also apply to smaller platforms.

Monitoring smaller platforms' DSA compliance is the responsibility of the EU's 27 member states, whereas the European Commission is the DSA enforcer for the large platforms.

Very Large Online Platforms (VLOPs) have to pay a yearly "supervisory fee" to cover the commission's costs, capped at 0.05% of each VLOP's global annual revenue.