Tech platforms will also need to do more to protect children
from exposure to grooming, bullying, and pornography, the government said.
"We are entering a new age of accountability for tech
to protect children and vulnerable users, to restore trust in this industry,
and to enshrine in law safeguards for free speech," Britain's Digital
Secretary Oliver Dowden said.
Governments around the world are grappling with how to better
control illegal or dangerous content on social media, and the European Union
is set to unveil its own package on Tuesday.
Britain's new rules, which will be introduced in legislation
next year, could see sites that break them blocked and senior
managers held liable for content.
Popular platforms will be required to set clear policies on
content that, while not illegal, could cause harm, such as misinformation
about COVID-19 vaccines.
Dowden said the framework would give large digital
businesses "robust rules" to follow.
Facebook and Google said in February they would work with
the government on the regulations. Both companies said they took safety
extremely seriously and that they had already changed their policies and
operations to better tackle the issue.
British media regulator Ofcom will be given the power to
fine companies up to £18 million or 10 percent of global turnover, whichever
is higher, for breaking the rules.
It will also be able to block non-compliant services from
being accessed in Britain.
Online journalism and reader comments on news publishers'
websites will be exempt to safeguard freedom of expression.