YouTube announced a total ban on vaccine misinformation on Wednesday.
The video platform also announced it would terminate the channels of several of its most prominent offenders, including those of anti-vaccine activists Joseph Mercola and Robert F. Kennedy Jr.
“We’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO,” the company’s statement reads.
“Our Community Guidelines already prohibit certain types of medical misinformation. We’ve long removed content that promotes harmful remedies, such as saying drinking turpentine can cure diseases. When the pandemic hit, we built on these policies and worked with experts to develop 10 new policies around COVID-19 and medical misinformation,” the statement continues.
YouTube says it has already removed more than 130,000 videos over the past year for violating these rules.
Facebook introduced a similar ban on prominent anti-vaccine activists earlier this year. Since the beginning of the pandemic, the company has removed millions of posts for spreading misinformation and has banned 3,000 accounts, groups and pages.
In July, Surgeon General Dr. Vivek Murthy warned that misinformation was threatening the country’s response to COVID-19 and called on tech companies to be more proactive in curbing its spread.
“As surgeon general, my job is to help people stay safe and healthy, and without limiting the spread of health misinformation, American lives are at risk,” Murthy said in a statement. “From the tech and social media companies who must do more to address the spread on their platforms, to all of us identifying and avoiding sharing misinformation, tackling this challenge will require an all-of-society approach, but it is critical for the long-term health of our nation.”