As the UK's online safety regulator, we have published a
package of proposed measures that social media and other
online services must take to improve children's safety when
they're online.
In this article, we explain some of the main measures and the
difference we expect them to make. Whether you are a parent,
carer or someone working with children, it can help you
understand what is being done to help children in the UK live
safer lives online.
Protecting children is a priority
Protecting children so they can enjoy the benefits of being
online, without experiencing the potentially serious harms that
exist in the online world, is a priority for Ofcom.
We're taking action – setting out proposed steps
online services would need to take to keep children safer online,
as part of their duties under the Online Safety Act.
Under the Act, social media apps, search services and other online
services must prevent children from encountering the most harmful
content relating to suicide, self-harm, eating disorders, and
pornography. They must also minimise children's exposure to other
serious harms, including violent, hateful or abusive material,
bullying content, and content promoting dangerous challenges.
What will companies have to do to protect children
online?
Firstly, online services must establish whether children are
likely to access their site – or part of it. Secondly, if
children are likely to access it, the company must carry out a
further assessment to identify the risks their service poses to
children, including the risks that come from the design of their
services, their functionalities and algorithms. They then need to
introduce various safety measures to mitigate these risks.
Our consultation proposes more than 40 safety measures that
services would need to take – all aimed at making sure children
enjoy safer screen time online. These include:
- Robust age checks – our draft Codes expect services to know
  which of their users are children in order to protect them from
  harmful content. In practice, this means that all services which
  don't ban harmful content should introduce highly effective age
  checks to prevent children from accessing the entire site or app,
  or to age-restrict parts of it for adults-only access.
- Safer algorithms – under our proposals, any service that has
  systems recommending personalised content to users, and that is
  at a higher risk of harmful content, must design its algorithms
  to filter out the most harmful content from children's feeds and
  downrank other harmful content. Children must also be able to
  provide negative feedback so the algorithm can learn what content
  they don't want to see.
- Effective moderation – all services, such as social media apps
  and search services, must have content moderation systems and
  processes to take quick action on harmful content. Large search
  services should use a 'safe search' setting for children, which
  can't be turned off and must filter out the most harmful content.
  Other broader measures require services to have clear policies on
  what kind of content is allowed and how content is prioritised
  for review, and require content moderation teams to be
  well-resourced and trained.
What difference will these measures make?
We believe these measures will improve children's online
experiences in a number of ways. For example:
- Children will not normally be able to access pornography.
- Children will be protected from seeing, and being
recommended, potentially harmful content.
- Children will not be added to group chats without their
consent.
- It will be easier for children to complain when they see
harmful content, and they can be more confident that their
complaints will be acted on.
Our consultation follows proposals we've already published for
how children should be protected from illegal content and
activity such as grooming, child sexual exploitation and abuse,
as well as how children should be prevented from accessing
pornographic content.
Next steps
Our consultation is open until 17 July and we welcome any
feedback on the proposals. We expect to finalise the proposals
and publish our final statement and documents in spring next
year.