Facebook Shares Update on Actions Taken Against Coordinated Inauthentic Behavior

With the US Presidential Election now only 26 days away, Facebook’s efforts to remove instances of ‘coordinated inauthentic behavior’, and avoid a similar situation to 2016, are more pressing than ever.

This week, Facebook provided an update on the account removals resulting from its detection processes. Late last month, those processes saw the removal of three Russia-based networks linked to Russia's Internet Research Agency (IRA), the group responsible for seeking to manipulate US voters ahead of the 2016 poll.

Will the same group be able to influence US voters again?

Hopefully not. As per Facebook's data, it's now detecting these networks at a much higher rate. According to Facebook's CIB report, it removed four separate clusters of accounts originating from Russia in September, totaling around 290 Facebook accounts and 20 groups, though not all of these groups were focused on the US election.

But some were, and Facebook’s data shows that it’s been able to remove these groups more effectively, which may negate their impact. Facebook’s also been working with other platforms, including Twitter, to better police such activities, leading to improved outcomes in this respect.

And looking at the overall numbers, it does seem to have quelled any surge in manipulation efforts leading into the US poll.

As you can see in these charts, we've been tracking Facebook's total account removals since it announced its monthly CIB reports back in March (note: there was no CIB report in June). We suspected that removals would increase as the election drew near and related activity ramped up, but evidently, that hasn't occurred, with the number of removals remaining stable, even decreasing, over time.

Of course, that doesn't necessarily mean that Facebook is detecting all such networks, as its reports can't include those that it's not aware of. But the stats do seem to indicate that its increased enforcement measures are generating results.

That, hopefully, will mean less outside influence on the 2020 election. How that impacts the final result, we'll likely never know for sure, but eliminating the impact of social media manipulation will help ensure that the outcome is wholly representative of voter sentiment, as opposed to any externally motivated push.

In September, Facebook also detected one cluster of accounts originating from China, which was likewise focused on interfering with US politics, and another, more locally focused group in the Philippines.

Again, it's difficult to ascertain the full impact of those efforts on subsequent voter actions, but hopefully, these reports show that Facebook's efforts have led to significant improvements on this front.