Companies that fail to comply with the rules face fines of
up to 6% of their annual income or global turnover, with the exact levels to
be set by EU countries.
The EU executive said Wednesday its proposal aims to replace
the voluntary detection and reporting system currently in place, as voluntary
action by internet companies has proved insufficient to protect children.
According to the EU, there were more than one million reports
of child sexual abuse in the 27-country bloc in 2020, a problem exacerbated by
the COVID-19 pandemic. Reports then rose 64% in 2021 compared with the
previous year. In addition, 60% of child sexual abuse material worldwide is
hosted on EU servers.
"The proposed rules introduce an obligation for
relevant online service providers to assess the risk of abuse of their services
for the dissemination of child sexual abuse material or for grooming," the
Commission said in a statement.
Companies must then report and remove known and new images
and videos, as well as cases of grooming. The European Union Child Sexual Abuse
Center will be set up to act as a center of expertise and to forward reports to
the police.
The rules will apply to hosting services and interpersonal
communication services such as messaging services, app stores, and internet
access providers.
But according to the European Digital Rights lobby group,
the Commission's proposal could compromise end-to-end encryption and open the
door to authoritarian surveillance tactics. Meta's WhatsApp unit voiced
similar concerns.
"It is very disappointing to see a proposed EU regulation
on the internet fail to protect end-to-end encryption," Will
Cathcart, head of WhatsApp, said in a tweet.
"It is important that any measures adopted do not
undermine the end-to-end encryption that protects the security and privacy of
billions of people, including children," a Meta spokesperson said.
The draft rules still need to be agreed with EU
countries and EU lawmakers before they can become law.