Meta, the parent company of Facebook and Instagram, faces a European Union investigation into its handling of disinformation, with a particular focus on Russian efforts to influence the upcoming European elections. The probe, announced by Brussels and expected to open as early as Monday, stems from concerns that Meta's moderation of political advertising is insufficient, posing a risk to the electoral process.
The European Commission suspects that Meta’s current moderation practices do not adequately prevent the widespread dissemination of misleading political content. Although the upcoming statement from the commission will not specifically name Russia, it will refer to the manipulation of information by foreign actors, indicating a broader scope of concern.
EU officials are also troubled by the accessibility and user-friendliness of Meta’s mechanisms for users to flag illegal content, questioning their compliance with the EU’s Digital Services Act. This landmark legislation, designed to police online content, mandates platforms to be transparent about their efforts to combat misinformation and propaganda.
Should Meta be found in violation of the Digital Services Act, it could face fines of up to 6% of its global annual turnover. The investigation is part of a broader pattern of regulatory action by the commission against major tech companies, as member states grow increasingly anxious that foreign entities could exploit social media to undermine democracy.
The probe will evaluate how Facebook and Instagram manage political content, the discontinuation of tools such as CrowdTangle, which helps track how content spreads, and Meta's methods for monitoring disinformation to support fact-checkers and journalists. Meta has said it has an established process for identifying and mitigating risks on its platforms and that it looks forward to continuing its cooperation with the European Commission.