The new approach, reported here for the first
time, uses the tactics usually taken by Facebook's security teams for wholesale
shutdowns of networks engaged in influence operations that use false accounts
to manipulate public debate, such as Russian troll farms.
It could have major implications for how the
social media giant handles political and other coordinated movements breaking
its rules, at a time when Facebook's approach to abuses on its platforms is
under heavy scrutiny from global lawmakers and civil society groups.
Facebook said it now plans to take this same
network-level approach with groups of coordinated real accounts that
systemically break its rules, through mass reporting, where many users falsely
report a target's content or account to get it shut down, or brigading, a type
of online harassment where users might coordinate to target an individual
through mass posts or comments.
In a related change, Facebook said on Thursday
that it would be taking the same type of approach to campaigns of real users
that cause "coordinated social harm" on and off its platforms, as it
announced a takedown of the German anti-COVID restrictions Querdenken movement.
These expansions, which a spokeswoman said
were in their early stages, mean Facebook's security teams could identify core
movements driving such behaviour and take more sweeping actions than the
company removing posts or individual accounts as it otherwise might.
In April, BuzzFeed News published a leaked
Facebook internal report about the company's role in the January 6 riot at the
US Capitol and its challenges in curbing the fast-growing 'Stop the Steal'
movement. One of the report's findings was that Facebook had "little policy
around coordinated authentic harm."
Facebook's security experts, who are separate
from the company's content moderators and handle threats from adversaries
trying to evade its rules, started cracking down on influence operations using
fake accounts in 2017, following the 2016 US election in which US intelligence
officials concluded Russia had used social media platforms as part of a
cyber-influence campaign - a claim Moscow has denied.
Facebook dubbed this banned activity by the
groups of fake accounts "coordinated inauthentic behaviour" (CIB),
and its security teams started announcing sweeping takedowns in monthly
reports. The security teams also handle some specific threats that may not use
fake accounts, such as fraud or cyber-espionage networks or overt influence
operations like some state media campaigns.
Sources said teams at the company had long
debated how it should intervene at a network level for large movements of real
user accounts systemically breaking its rules.
In July, Reuters reported on the Vietnam
army's online information warfare unit, which engaged in actions including mass
reporting of accounts to Facebook but whose members often used their real
names. Facebook removed some accounts over these mass reporting attempts.
Facebook is under increasing pressure from
global regulators, lawmakers, and employees to combat wide-ranging abuses on
its services. Others have criticised the company over allegations of
censorship, anti-conservative bias or inconsistent enforcement.
An expansion of Facebook's network disruption
models to affect authentic accounts raises further questions about how changes
might impact types of public debate, online movements and campaign tactics
across the political spectrum.
"A lot of the time problematic behavior
will look very close to social movements," said Evelyn Douek, a Harvard
Law lecturer who studies platform governance. "It's going to hinge on this
definition of harm ... but obviously people's definitions of harm can be quite
subjective and nebulous."
High-profile instances of coordinated activity around last year's US election, from teens and K-pop fans claiming they used TikTok to sabotage a rally for former President Donald Trump in Tulsa, Oklahoma, to political campaigns paying online meme-makers, have also sparked debates on how platforms should define and approach coordinated campaigns. © Reuters