YouTube is removing content that spreads misinformation about vaccines, extending its existing ban on false claims about COVID-19 vaccines.
The video-sharing giant said all videos claiming that vaccines are dangerous and cause autism, cancer or infertility will be deleted.
Under the new rules, the accounts of anti-vaccine influencers will also be removed.
Tech giants have faced criticism for not doing more to curb false health information on their platforms.
YouTube, which is owned by Google, said it has removed 130,000 videos since the start of the pandemic, when it introduced a ban on content spreading misinformation about COVID jabs.
YouTube said in a blog post that it had seen false claims about COVID vaccines “spill over into misinformation about vaccines in general” and the new policy covers long-approved vaccines, including those for measles or hepatitis B.
However, the company said that personal testimonies about vaccines, content about vaccine policies, new vaccine trials, and historical videos about vaccine successes or failures will not be removed from the site.
For more information, read the original story on the BBC.