The board, set up by the company to address criticism over
its handling of problematic material, makes binding decisions on a small number
of challenging content moderation cases and provides non-binding policy
recommendations.
Meta has been under scrutiny from lawmakers and regulators
over user safety and its handling of abuses on its platforms across the world,
particularly after whistleblower Frances Haugen leaked internal documents that
showed the company's struggles in policing content in countries where such
speech was most likely to cause harm, including Ethiopia.
Thousands have died and millions have been displaced during
a year-long conflict between the Ethiopian government and rebellious forces
from the northern Tigray region.
The social media giant said it has "invested
significant resources in Ethiopia to identify and remove potentially harmful
content," as part of its response to the board's December recommendations
on a case involving content posted in the country.
The oversight board last month upheld Meta's original
decision to remove a post alleging the involvement of ethnic Tigrayan civilians
in atrocities in Ethiopia's Amhara region. Because Meta had restored the post after
the user appealed to the board, the company had to remove the content again.
On Thursday, Meta said while it had taken the post down, it
disagreed with the board's reasoning that it should have been removed because
it was an "unverified rumor" that significantly increased the risk of
imminent violence. It said this would impose "a journalistic publishing
standard on people."
An oversight board spokesman said in a statement:
"Meta's existing policies prohibit rumors that contribute to imminent
violence that cannot be debunked in a meaningful timeframe, and the Board made
recommendations to ensure these policies are effectively applied in conflict
situations."
"Rumors alleging an ethnic group is complicit in
atrocities, as found in this case, have the potential to lead to grave harm to
people," they said.
The board had recommended that Meta commission a human
rights due diligence assessment, to be completed in six months, which should
include a review of Meta's language capabilities in Ethiopia and a review of
measures taken to prevent the misuse of its services in the country.
However, the company said not all elements of this recommendation
"may be feasible in terms of timing, data science or approach." It
said it would continue its existing human rights due diligence and should have
an update on whether it could act on the board's recommendation within the next
few months.
Reuters' previous reporting on Myanmar and other countries
has investigated how Facebook struggled to monitor content across the world in
different languages. In 2018, U.N. human rights investigators said the use of
Facebook had played a key role in spreading hate speech that fueled violence in
Myanmar.
Meta, which has said that it was too slow to prevent
misinformation and hate in Myanmar, said it now has native speakers worldwide
reviewing content in more than 70 languages, who work to stop abuse on its
platforms in places where there is a heightened risk of conflict and violence.
The board also recommended that Meta rewrite its value
statement on safety to reflect that online speech can pose a risk to the
physical security of persons and their right to life. The company said it would
make changes to this value, in a partial implementation of the recommendation.
-Reuters