YouTube is banning anti-vax content — but is it too little, too late?


While many have pointed fingers at the Delta variant for the most recent coronavirus surge, rampant misinformation is arguably just as much to blame. Anti-vaxxers have justified their refusal to get the jab with all kinds of ludicrous conspiracy theories. But the anti-vax movement goes way beyond COVID-19 vaccines. Misinformation on vaccines against other diseases has permeated social media for years. Now, YouTube is finally taking serious steps to combat it. In a blog post, the company announced it would ban misinformation on all vaccines. Better late than never, I guess?

The Google-owned platform had already enacted a similar ban on false claims about the COVID-19 vaccines, the New York Times reported. Its new policy goes a step further, banning misinformation about other vaccines approved by local health authorities and the WHO, as well as about vaccines in general. That includes videos claiming that vaccines don’t curb disease transmission, or that they cause cancer, infertility, or autism, per YouTube’s blog post. The Times said the platform would also take down the accounts of anti-vax activists like Robert F. Kennedy, Jr. and Joseph Mercola.

YouTube’s new policy does seem to leave room for nuance, though. According to its blog post, the platform would allow videos about vaccine trials and policies, retrospective looks at vaccine successes and failures, and personal testimonials about vaccines.

Misinformation on social media stretching back years has fueled hesitancy toward vaccines, including those that protect against COVID-19, the Times pointed out. YouTube videos, in particular, are often what goes viral on other social media platforms.

“If you see misinformation on Facebook or other places, a lot of the time it’s YouTube videos,” Lisa Fazio, an associate professor at Vanderbilt University who researches misinformation, told the Washington Post. “Our conversation often doesn’t include YouTube when it should.”

President Biden said in July that social media platforms were partly to blame for disseminating anti-vax misinformation and that they should do more to curb it, according to the Post. But YouTube, like Twitter and Facebook, has long taken a laissez-faire approach to monitoring content, citing the need to protect free speech. The video platform’s latest policy update comes amid criticism from lawmakers, regulators, and users for feeding vaccine skepticism and other societal problems.

Matt Halprin, YouTube’s vice president of global trust and safety, told the Post that YouTube took a while to expand its ban because it had focused on misinformation about the COVID-19 vaccines specifically. The company revised its policy once it realized that false claims about other vaccines were stoking fears about the COVID-19 shot.

The problem is, the anti-vax community has had plenty of time to grow on YouTube and Facebook — which means it can persist even if these sites ban it, Hany Farid, a professor of computer science who studies misinformation at UC Berkeley, told the Post. Anti-vax influencers can simply switch to platforms that don’t crack down as hard on content, like Gab or Telegram, and bring their followers with them. While YouTube’s expanded ban is a welcome change, it does feel like too little, too late.