Brussels, Belgium:
The EU launched an investigation into Meta's Facebook and Instagram on Tuesday over concerns that the platforms are failing to combat disinformation ahead of the EU elections in June.
The investigation falls under the EU's new Digital Services Act (DSA), a landmark law that cracks down on illegal online content and forces the world's biggest tech companies to do more to protect users online.
The European Commission said it suspected Meta's ad moderation was “insufficient” and that an increase in paid spots under those circumstances could harm “electoral processes and fundamental rights, including consumer protection rights.”
EU leaders are particularly concerned about Russian attempts to manipulate public opinion and undermine European democracy.
The investigation aims to “ensure that effective measures are taken, in particular to prevent the vulnerabilities of Instagram and Facebook from being exploited by foreign interference,” EU Internal Market Commissioner Thierry Breton said.
“We suspect that Meta's moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures,” European Commission Executive Vice-President Margrethe Vestager said in a statement.
Facebook and Instagram are among 23 “very large” online platforms that must comply with the DSA or face fines of up to six percent of a platform's global turnover, or even a ban in serious cases.
Other platforms include Amazon, Snapchat, TikTok and YouTube.
Meta did not comment on the focus of the investigation, saying more generally that the US company had “a proven process for identifying and mitigating risks on our platforms”.
A spokesperson for Meta added: “We look forward to continuing our collaboration with the European Commission and providing them with further details on this work.”
Meta's wide reach
Brussels is particularly concerned that Meta lacks an “effective” tool to monitor elections ahead of the EU-wide vote from June 6 to 9.
It pointed to Meta's decision to shut down CrowdTangle, a digital tool considered vital in detecting viral falsehoods.
Meta has said it will replace CrowdTangle with a new content library, a technology that is still in development.
The Commission said the company has five business days to explain what actions it has taken to mitigate the risks of CrowdTangle's shutdown.
The EU's concerns stem from the reach of Meta's platforms across the bloc of 450 million people: Facebook and Instagram each have more than 260 million monthly active users.
The focus of the EU investigation is broad and also includes Meta's move to reduce political content in Facebook and Instagram's recommendation systems.
Brussels fears that this could conflict with the DSA's transparency rules.
The EU also suspects that Meta's mechanism for flagging illegal content is not easily accessible or user-friendly, the commission said.
There is no deadline by which the investigation must conclude.
AFP currently works in 26 languages with Facebook's fact-checking program, under which Facebook pays to use fact-checks from around 80 organizations worldwide on its platform, as well as on WhatsApp and Instagram.
Multiple probes
The DSA is one law in the EU's strengthened legal arsenal to rein in Big Tech.
Brussels has shown that it is willing to assert its legal power under the DSA by opening investigations into Elon Musk's X, TikTok and Chinese retailer AliExpress.
TikTok, owned by China's ByteDance, bowed to commission pressure last week and suspended a rewards program on its spinoff Lite app in France and Spain after Brussels threatened a suspension.
Another regulation, the EU's law on political advertising, will complement the DSA when most of its provisions enter into force at the end of 2025.
(Except for the headline, this story has not been edited by our staff and is published from a syndicated feed.)