Facebook owner Meta is quietly curtailing some of the safeguards designed to thwart voting misinformation or foreign interference in US elections as the November midterm vote approaches.
It's a sharp departure from the social media giant's
multibillion-dollar efforts to enhance the accuracy of posts about US elections
and regain the trust of lawmakers and the public after the outrage that
followed revelations the company had exploited people's data and allowed
falsehoods to overrun its site during the 2016 campaign.
The pivot is raising alarm about Meta's priorities and about
how some might exploit the world's most popular social media platforms to
spread misleading claims, launch fake accounts and rile up partisan extremists.
“They're not talking about it,” said former Facebook
policy director Katie Harbath, now the CEO of the tech and policy firm Anchor
Change. “Best case scenario: They're still doing a lot behind the scenes. Worst
case scenario: They pull back, and we don't know how that's going to manifest
itself for the midterms on the platforms.”
Since last year, Meta has shut down an examination into how
falsehoods are amplified in political ads on Facebook, indefinitely banning
the researchers behind it from the site.
CrowdTangle, the online tool that the company offered to
hundreds of newsrooms and researchers so they could identify trending posts and
misinformation across Facebook or Instagram, is now inoperable on some days.
Public communication about the company's response to
election misinformation has gone decidedly quiet. Between 2018 and 2020, the
company released more than 30 statements that laid out specifics about how it
would stifle US election misinformation, prevent foreign adversaries from
running ads or posts around the vote and subdue divisive hate speech.
Top executives hosted question-and-answer sessions with
reporters about new policies. CEO Mark Zuckerberg wrote Facebook posts
promising to take down false voting information and authored opinion articles
calling for more regulations to tackle foreign interference in US elections via
social media.
But this year Meta has only released a one-page document
outlining plans for the fall elections, even as potential threats to the vote
remain clear. Several Republican candidates are pushing false claims about the
US election across social media. In addition, Russia and China continue to wage
aggressive social media propaganda campaigns aimed at deepening political
divides among American audiences.
Meta says that elections remain a priority and that policies
developed in recent years around election misinformation or foreign
interference are now hard-wired into company operations.
“With every election, we incorporate what we've learned into
new processes and have established channels to share information with the
government and our industry partners,” Meta spokesman Tom Reynolds said.
He declined to say how many employees would be on the
project to protect US elections full time this year.
During the 2018 election cycle, the company offered tours
and photos of its election response war room and publicised head counts. But The
New York Times reported the number of Meta employees working on this year's
election had been cut from 300 to 60, a figure Meta disputes.
Reynolds said Meta will pull hundreds of employees who work
across 40 of the company's other teams to monitor the upcoming vote alongside
the election team, with its unspecified number of workers.
The company is continuing many initiatives it developed to limit
election misinformation, such as a fact-checking program started in 2016 that
enlists the help of news outlets to investigate the veracity of popular
falsehoods spreading on Facebook or Instagram. The Associated Press is part of
Meta's fact-checking program.
This month, Meta also rolled out a new feature for political
ads that allows the public to search for details about how advertisers target
people based on their interests across Facebook and Instagram.
Yet, Meta has stifled other efforts to identify election
misinformation on its sites.
It has stopped making improvements to CrowdTangle, a website
it offered to newsrooms around the world that provides insights about trending
social media posts. Journalists, fact-checkers and researchers used the website
to analyse Facebook content, including tracing popular misinformation and who
is responsible for it.
That tool is now “dying,” former CrowdTangle CEO Brandon
Silverman, who left Meta last year, told the Senate Judiciary Committee this
spring.
Silverman told the AP that CrowdTangle had been working on
upgrades that would make it easier to search the text of internet memes, which
are often used to spread half-truths while escaping the oversight of
fact-checkers.
“There's no real shortage of ways you can organise this data
to make it useful for a lot of different parts of the fact-checking community,
newsrooms and broader civil society,” Silverman said.
Not everyone at Meta agreed with that transparent approach,
Silverman said. The company has not rolled out any new updates or features to
CrowdTangle in more than a year, and it has experienced hourslong outages in
recent months.
Meta also shut down efforts to investigate how
misinformation travels through political ads.
The company indefinitely revoked access to Facebook for a
pair of New York University researchers whom it said had collected unauthorised
data from the platform. The move came hours after NYU professor Laura Edelson
said she shared plans with the company to investigate the spread of
disinformation on the platform around the January 6, 2021, attack on the US
Capitol, which is now the subject of a House investigation.
“What we found, when we looked closely, is that their
systems were probably dangerous for a lot of their users,” Edelson said.
Privately, former and current Meta employees say exposing
those dangers around American elections has created public and political
backlash for the company.
Republicans routinely accuse Facebook of unfairly censoring
conservatives, some of whom have been kicked off for breaking the company's
rules. Democrats, meanwhile, regularly complain the tech company hasn't gone
far enough to curb disinformation.
“It's something that's so politically fraught, they're more
trying to shy away from it than jump in head first,” said Harbath, the former
Facebook policy director. “They just see it as a big old pile of headaches.”
Meanwhile, the possibility of regulation in the US no longer
looms over the company, with lawmakers failing to reach any consensus over what
oversight the multibillion-dollar company should be subjected to.
Free from that threat, Meta's leaders have devoted the
company's time, money and resources to a new project in recent months.
Zuckerberg dived into this massive rebranding and
reorganisation of Facebook last October, when he changed the company's name to
Meta Platforms. He plans to spend years and billions of dollars evolving his
social media platforms into a nascent virtual reality construct called the
“metaverse” — sort of like the internet brought to life, rendered in 3D.
His public Facebook page posts now focus on product
announcements, hailing artificial intelligence, and photos of him enjoying
life. News about election preparedness is announced in company blog posts not
written by him.
In one of Zuckerberg's posts last October, after an
ex-Facebook employee leaked internal documents showing how the platform
magnifies hate and misinformation, he defended the company. He also reminded
his followers that he had pushed Congress to modernise regulations around
elections for the digital age.
“I know it's frustrating to see the good work we do get
mischaracterised, especially for those of you who are making important
contributions across safety, integrity, research and product,” he wrote on
October 5. “But I believe that over the long term if we keep trying to do
what's right and delivering experiences that improve people's lives, it will be
better for our community and our business.”
It was the last time he discussed the Menlo Park,
California-based company's election work in a public Facebook post.