According to Axios, the platform has taken down more than 800,000 videos containing false claims about the coronavirus since February 2020. In October 2020, YouTube added vaccination misinformation to its COVID-19 medical misinformation policy. Its AI systems and human reviewers first flag suspect videos, which then undergo another level of review before being taken down.
YouTube defines false claims about the pandemic as any statement that contradicts health experts at reputable authorities such as the World Health Organization. Users who violate these rules enter the company's "strike" system, which can result in their accounts being permanently banned.
As vaccines become increasingly available across the United States, YouTube has seen a spike in videos spreading misinformation about the COVID-19 vaccines. Polls have shown that approximately 30% of Americans remain hesitant to take the vaccines, and many report that their suspicions stem from conspiracy theories perpetuated by online videos.
Many social media platforms, including Facebook and Twitter, have adopted their own policies to minimize the spread of false claims. However, these companies have yet to find a lasting solution, and while they continue to address specific claims, YouTube in particular has only been able to address a small subset of them.
It remains to be seen whether YouTube will find a more durable, long-term solution for reducing misinformation about COVID-19.