It certainly seems like a week of reconsideration at Facebook.
Following yesterday’s announcement of a ban on Holocaust denial content, which CEO Mark Zuckerberg infamously said in 2018 he would allow on free speech grounds, Facebook has today announced a new policy that will prohibit ads discouraging people from getting vaccines.
As explained by Facebook:
“We already don’t allow ads with vaccine hoaxes that have been publicly identified by leading global health organizations, such as the World Health Organization (WHO) and the US Centers for Disease Control and Prevention (CDC). Now, if an ad explicitly discourages someone from getting a vaccine, we’ll reject it.”
Facebook announced its first steps against anti-vax content in March last year, including reducing the reach of groups and Pages that shared misinformation about vaccinations, and rejecting ads that included false claims about vaccines. But Facebook still allowed this content on its platform, and will still allow it in organic posts. It just won’t facilitate it in paid promotions. That’s another step in the right direction, though it also underlines a key, consistent flaw in Facebook’s policy efforts.
In this case, the flaw is hindsight. Today, with the imminent need for a COVID-19 vaccine, Facebook has reassessed its policies on anti-vax content and determined that it needs to address the issue before it becomes a bigger problem once a COVID-19 vaccine becomes available.
But anti-vax sentiment is already a problem. Back in June, Dr. Anthony Fauci, head of the National Institute of Allergy and Infectious Diseases in the US, noted that America will face significant challenges in eradicating COVID-19 due to a “general anti-science, anti-authority, anti-vaccine feeling” among the populace.
Various reports have also specifically identified Facebook as a key facilitator in the spread of health misinformation, including COVID-19 conspiracy theories.
As per a report published earlier this year by human rights group Avaaz:
“Studies have shown that anti-vaccination communities prosper on Facebook, that the social media platform acts as a ‘vector’ for conspiracy beliefs that are hindering people from protecting themselves during the COVID-19 outbreak.”
So Facebook has already contributed to the problem. It’s been a concern for some time, and one that various groups have sought to bring to Facebook’s attention. Now, with a bigger, necessary focus on vaccines coming, Facebook has decided to act.
But Facebook has known about this for some time. So why has it dragged its feet?
The same could be asked about QAnon. Last week, Facebook announced a stronger stance against QAnon content, banning all discussion of the conspiracy theory over concerns that it may be inspiring acts of real-world violence. But people had been warning Facebook about this for years; now, with hindsight, Facebook is taking action because it sees an immediate concern. That’s too late for some incidents that have already occurred. Why does it take a significant, pressing concern for Facebook to finally act on something it’s repeatedly been warned about?
The same again applies to Holocaust denial content. This week, Facebook announced a ban on all Holocaust denial posts, citing new data showing an increase in anti-Semitic violence. Which, again, Facebook had absolutely been warned about in the past.
In hindsight, Facebook should have taken action on each issue earlier. Yet, over and over again, the platform fails to act on such warnings, only eventually shifting its stance once real, significant concerns are present on a broader scale.
Are these changes because Facebook really cares about the role it plays in facilitating such discussion, or because of public pressure driven by media scrutiny? Certainly, various investigations have shown that Facebook played a significant role in the rise of QAnon, and Facebook clearly doesn’t want to be linked to anti-Semitic violence, nor to any reduction in the effectiveness of the coming COVID-19 vaccine push.
But Facebook does want all the engagement, it does want to facilitate as much discussion as possible, in order to keep users coming back to its platform.
Does Facebook really have the public interest at heart when it makes these decisions, or is it more public relations?
Certainly, it’s good to see Facebook taking a tougher stance on each front, but it also raises questions about other content that the company still allows on its platforms which could likewise lead to significant real-world harm. Those issues just aren’t as big a deal yet. Climate change denial, maybe? Lies in political ads?
Hopefully, these new changes signal that Facebook is taking a different stance on such content, and revising its approach to misinformation more broadly.
Indeed, Zuckerberg did note in his post about the Holocaust denial policy update that:
“My own thinking has evolved […] drawing the right lines between what is and isn’t acceptable speech isn’t straightforward, but with the current state of the world, I believe this is the right balance.”
Maybe Zuck is starting to see things in a new light, given the links between seemingly harmless chatter on Facebook and real-world consequences.
Freedom of speech is fine, but freedom of facts is not.
In addition to this, Facebook is adding new prompts to raise awareness of the flu vaccine via its Preventive Health Tool, and it’s working with the WHO and UNICEF on a new public health messaging campaign aimed at increasing immunization rates.
These are also good steps, but it just feels like Facebook is always a few steps behind, that it only ever acts once significant damage is done. Facebook’s early motto was ‘Move Fast and Break Things’. Maybe part of that ethos is still in effect, and Facebook would rather accelerate its growth, even if it means breaking a few things, and clean up its mess afterward.
Still, we have to give the platform the benefit of the doubt. No company has ever been in Facebook’s position, in terms of scale and societal influence, which, inevitably, means that it will make mistakes.
It just seems like a lot of those mistakes could have been avoided, had it taken the time to heed earlier warnings.
Facebook says that it’s starting its flu vaccine awareness campaign in the US this week, with expansion to more regions coming shortly.