With a COVID-19 vaccine nearing release, the major tech platforms have agreed to partner on a new program, in conjunction with fact-checking organizations, in order to formulate new, improved approaches to combatting vaccine misinformation.
As reported by BBC:
“Taking part in the effort alongside Facebook, Google-owned YouTube and Twitter are the UK’s Department for Digital, Culture, Media and Sport, the Reuters Institute for the Study of Journalism, Africa Check, Canada’s Privy Council Office and five other international fact-checking organizations.”
Fact-checking charity Full Fact will co-ordinate the anti-misinformation push.
That could see the platforms develop new, more effective ways to counter misinformation, and to detect misleading reports before they gain traction online.
All three companies have already implemented measures to combat anti-vax content. Facebook announced a ban on anti-vax ads in October, expanding its efforts to reduce the reach of such material, while Twitter added warnings to vaccine-related searches in March last year. YouTube has also moved to demonetize channels and videos that share anti-vax rhetoric.
Yet, health experts say that these measures don’t go far enough. Last July, months before the COVID-19 outbreak, a group of more than 60 public health leaders from around the world issued a public plea to the internet giants to monitor and label inaccurate and disproven claims about vaccines, in order to stop the dangerous growth of various anti-vax movements.
The group of medical professionals said that the rise of anti-vax groups had led to serious declines in community vaccination rates, putting millions of people at risk. And again, this was before the COVID-19 pandemic, which has sparked new growth in conspiracy theories around the virus and fueled further anti-vax sentiment as a result.
Indeed, back in July, US medical expert Dr. Anthony Fauci said that the US would face significant difficulties in moving beyond the pandemic due to “general anti-science, anti-authority, anti-vaccine feeling” throughout the community. That sentiment could delay an effective vaccine rollout, which, in turn, would mean ongoing lockdowns and mitigation efforts as communities resist such measures.
Online platforms play a key part in this. Despite its efforts to limit the reach of anti-vax content, Facebook still hosts thousands of anti-vax groups, while it’s easy to find YouTube videos that support anti-vax conspiracies, despite YouTube pulling ads from such content when detected.
Just a few months back, YouTube, Facebook, and Twitter all removed an anti-vax video posted by Breitbart – but not before it reached tens of millions of views across their platforms, spreading anti-science messaging.
These are the areas where the platforms will be looking to improve, and hopefully, through this new, collaborative effort, they’ll be able to formulate plans to not only limit the reach of such content, but eliminate it entirely, in order to ensure a smooth rollout of the COVID-19 vaccine.
It’s a big challenge, but this may be the start of an improved effort on all types of misinformation online.