Facebook Removes More Than 36 Vaccine Misinformation Pages


Facebook Inc announced it had deleted more than three dozen pages that spread misinformation about COVID-19 vaccines, after the White House called on social media companies to tighten controls on pandemic-related misinformation circulating on their platforms.

Social media giants such as YouTube, Twitter, and Google have come under fire from the Biden administration for allowing vaccine misinformation to spread, which the administration says is hampering the pace of vaccination in a nation where many oppose vaccines.

A recent report by the Center for Countering Digital Hate (CCDH) found that just 12 anti-vaccine accounts are responsible for nearly two-thirds of the anti-vaccine misinformation shared on social media platforms.

Among the key pieces of vaccine misinformation the Biden administration is fighting are false claims that the COVID-19 vaccines are ineffective, that they contain microchips, and that they negatively affect women’s fertility, a White House official said last month.

For more information, read the original story from Reuters.


