YouTube Tweaks Its Misinformation Policy With an Imminent Covid-19 Vaccine in Mind

YouTube extended its policy on medical misinformation related to Covid-19 to include content about vaccinations that runs counter to expert consensus from local health authorities or the World Health Organization.

Facebook took similar steps earlier this week, imposing a ban on ads that discourage people from getting vaccines, as well as encouraging its users to get their seasonal flu shots.

The Google-owned video site said the move was motivated in part by the potential release of a Covid-19 vaccine and a desire to have misinformation policies in place if and when that occurs.

Types of content covered by the policy update include claims that the imminent vaccine will kill people or cause infertility, as well as assertions that microchips will be implanted in people who receive the shots.

Content violating these policies will be removed, and the holder of the account that posted it will be notified via email.

As with other violations of the video site’s community guidelines, first-time offenders will receive a warning, while repeat offenders will have one strike issued against their channels. Three strikes and those channels will be terminated from the platform.

The policy will not apply to general discussion in videos, such as broad concerns about a vaccine.

YouTube said it has removed over 200,000 videos since February for dangerous or misleading Covid-19 information, for reasons including: disputing the existence or transmission of the coronavirus; discouraging viewers from seeking medical treatment; promoting medically unsubstantiated preventive methods; or explicitly disputing guidance from local health authorities or the WHO.

Information panels that appear alongside videos and searches related to the pandemic have tallied over 400 billion impressions, according to YouTube.
