The Online Safety Act 2023 is UK legislation that aims to protect children and adults online. It covers a wide range of issues, including minimising the risk of children seeing harmful and age-inappropriate content, removing illegal content such as child sexual abuse material (CSAM), tackling fraudulent and scam adverts, and introducing age verification for certain online services.
This blog looks at some of the age requirements in the Online Safety Act and what they mean for tech companies, adult sites, gaming companies, social media platforms and dating sites.
What is the purpose of the Online Safety Act 2023?
According to the UK government, the Online Safety Act aims to make the UK ‘the safest place in the world to be online’. It primarily seeks to protect children and adults from online content that is harmful or illegal. The Act aims to create a safer digital environment that prioritises user safety and holds technology companies accountable for their actions. The Online Safety Act became law in October 2023.
How is Ofcom enforcing the Online Safety Act?
As the appointed regulator, Ofcom will ensure companies are proactively assessing the risks of harm to their users and introducing safeguards to keep them safe online.
Ofcom is currently issuing guidance and setting out codes of practice for how companies can comply with the legislation. They have split enforcement of the Online Safety Act into three phases:
- Phase 1: illegal harms duties: this will address the most significant dangers online, such as terrorism and fraud, and includes measures to protect children from child sexual exploitation and abuse.
- Phase 2: child safety, pornography and the protection of women and girls: this will focus on protecting children from legal content that may be harmful to them, including pornography, content relating to suicide, self-harm and eating disorders.
- Phase 3: transparency, user empowerment, and other duties on categorised services: the third and final phase will focus on the additional duties that will apply to categorised services, including those relating to transparency reporting, user empowerment, fraudulent advertising and user rights.
Phase 1:
In November 2023, Ofcom published draft codes and guidance for Phase 1 duties. They will publish their illegal harms statement at the end of 2024, which will include the first edition of the Illegal Harms Codes of Practice and the illegal content risk assessment guidance. Companies will be required to complete their risk assessments by mid-March 2025, with these duties becoming enforceable in the same month.
Phase 2:
In December 2023, Ofcom published its guidance on how to use age verification or age estimation to prevent children from accessing pornographic content. They will issue their final guidance for publishers of pornographic content in January 2025, with enforcement expected to begin around the same time.
In January 2025, Ofcom will also publish their final children’s access assessments guidance for platforms which are “likely to be accessed by children”. Three months later, Ofcom will publish the relevant risk assessment guidance. Companies must carry out their risk assessments by July 2025, at which point, their duties will become enforceable.
Additionally, Ofcom has announced that it will publish its draft guidance on protecting women and girls in February 2025.
Is this the first piece of UK legislation which addresses online safety?
The Audiovisual Media Services Directive (AVMSD), a 2018 EU Directive implemented by the UK, contains provisions similar to those in the Online Safety Act. Its scope, however, was much narrower: it covered only video-on-demand platforms such as Netflix and Amazon Prime, where content is curated by a provider and there are usually no user-to-user functions. Ofcom was appointed as the regulator to enforce the Directive.
In 2021, the Information Commissioner’s Office introduced the Age-Appropriate Design Code (also called the AADC or Children’s Code) as required by the Data Protection Act (DPA) 2018. The Code is enforceable under the UK GDPR and DPA and imposes a set of standards that seek to ensure online services are designed in the best interests of a child.
Who is impacted by age verification requirements within the Online Safety Act?
The Act applies to companies that have a ‘significant’ number of users in the UK, or whose services can be accessed by UK users and present a risk of online harm, such as platforms with user-generated content. This includes gaming companies, dating sites, social media platforms and adult sites. Ofcom suggests more than 100,000 online services could be subject to the new rules.
These companies will have to review, and possibly adapt, the way they design, operate, and moderate their platforms to ensure they meet the aims of the new online safety regulation.
Under the provisions of the Online Safety Act, all regulated services will have a duty of care in relation to illegal content and, if their services are likely to be accessed by children, a duty to protect children from harm.
Certain services will also be designated into one of three categories, depending on their number of users and the functionalities they offer; these categorised services face additional duties.
The Online Safety Act recognises that companies are very different, in terms of their size, resources and the risks they pose. As such, different safety measures will be appropriate for different types of service. Ofcom’s recommendations will vary for services depending on their size and degree of risk.
How will companies comply with the Online Safety Act?
The Online Safety Act imposes a ‘duty of care’ on platforms, which means they have a duty to keep their users safe whilst using their services. They will have to complete risk assessments and put proactive measures in place to address potential harms.
Components of the Online Safety Act include:
- Preventing illegal content from appearing, and removing it quickly when it does
- Preventing children from accessing content that is harmful or inappropriate for their age
- Introducing age thresholds for the use of certain services based on the risk they present, and implementing robust methods to check the age of users
- Introducing a duty of transparency for large platforms and social media sites to disclose the methodology of their risk analysis, the preventative measures they have put in place and how effective those measures are
- Providing parents and children with clear ways to report problems online
How will the age verification requirements in the Act impact social media platforms and tech companies?
Under the regulations, social media platforms and tech companies will have a duty of care to keep children safe online.
They will need to develop systems for detecting and removing harmful content and enforce stricter age limits. They will need to provide filtering tools that give users more control over the content they see. Companies should also provide better protection, particularly for children, from cyberbullying, online harassment, hate speech and child exploitation.
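As a rough sketch of what a user-facing filtering tool could look like under the hood, the TypeScript below hides posts in a feed based on categories a user has chosen not to see. The category names, the Post shape and the applyUserFilters helper are hypothetical, introduced purely for illustration; they assume a separate moderation system has already labelled each post.

```typescript
// Minimal, hypothetical sketch of a user-controlled content filter.
// Assumes posts arrive already labelled with content categories by a
// separate moderation/classification system (not shown here).

type ContentCategory = "violence" | "self_harm" | "adult" | "gambling";

interface Post {
  id: string;
  text: string;
  categories: ContentCategory[]; // labels applied upstream
}

interface FilterPreferences {
  hiddenCategories: Set<ContentCategory>; // categories the user opted out of
}

function applyUserFilters(feed: Post[], prefs: FilterPreferences): Post[] {
  // Hide any post carrying a category the user has chosen not to see.
  return feed.filter(
    (post) => !post.categories.some((c) => prefs.hiddenCategories.has(c)),
  );
}

// Example: a user who has opted out of gambling-related content.
const prefs: FilterPreferences = {
  hiddenCategories: new Set<ContentCategory>(["gambling"]),
};
const visible = applyUserFilters(
  [
    { id: "1", text: "Match highlights", categories: [] },
    { id: "2", text: "Betting odds for tonight", categories: ["gambling"] },
  ],
  prefs,
);
console.log(visible.map((p) => p.id)); // ["1"]
```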
Social media platforms will need to introduce stricter age limits and explain in their terms of service how they enforce them. The Online Safety Act also introduces a number of new criminal offences: a false communication offence; a threatening communication offence; an offence of sending flashing images with the intent of causing epileptic seizures; and a cyber-flashing offence (the sending of unsolicited nude images via social media or dating apps). These offences came into force on 31st January 2024, and shortly afterwards the first person in England and Wales was convicted of cyber-flashing under the Online Safety Act.
How will the age verification requirements in the Act impact adult sites?
Research by the Children’s Commissioner found that the average age at which children first see pornography is 13, and that 38% of 16-21-year-olds have accidentally been exposed to pornographic content online.
The government has chosen to require services that publish or allow pornography on their sites to use age verification or age estimation measures to prevent children from accessing this content. These platforms will be held to a higher standard and must use age checking measures which are highly effective at correctly determining whether or not a particular user is a child, in order to prevent under-18s from accessing pornography.
If a platform has pornographic content – including video, images and audio – it must take steps to ensure children can’t encounter it. Ofcom has stated that weaker methods will not be allowed, including self-declaration and online payment methods which don’t require ID, such as debit cards.
In its guidance, Ofcom has said that facial age estimation and digital identity wallets can be highly effective methods for preventing children from accessing pornography online. Both of these methods are more privacy-preserving than uploading physical identity documents or credit cards. In our experience, over 80% of adults select facial age estimation when given a choice of options.
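To illustrate how offering a choice of methods might translate into an access check, here is a minimal TypeScript sketch. The AgeCheckProvider interface, the method names and canViewAdultContent are hypothetical placeholders rather than a real vendor API or anything specified by Ofcom; the point is simply that access is denied by default and only granted once the user’s chosen check confirms they are 18 or over.

```typescript
// Hypothetical sketch only: the interface and method names below are
// illustrative placeholders, not a real age assurance API.

type AgeCheckMethod = "facial_age_estimation" | "digital_id_wallet" | "id_document";

interface AgeCheckResult {
  method: AgeCheckMethod;
  isOver18: boolean; // whether the check confirmed the user is 18 or over
}

// In practice this would call out to an age assurance provider; stubbed here.
interface AgeCheckProvider {
  check(method: AgeCheckMethod, userId: string): Promise<AgeCheckResult>;
}

async function canViewAdultContent(
  provider: AgeCheckProvider,
  userId: string,
  chosenMethod: AgeCheckMethod,
): Promise<boolean> {
  // The user picks whichever method they are most comfortable with.
  const result = await provider.check(chosenMethod, userId);

  // Default-deny: restricted content is only unlocked once a check
  // confirms the user is 18+. A real flow might offer an alternative
  // method if the first one is inconclusive.
  return result.isOver18;
}
```

The default-deny shape reflects the bar described above: until a highly effective check has confirmed a user is an adult, the safest assumption is that they may be a child.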
These age assurance technologies are ready to use now. We have been working with adult sites for several years, helping them use age assurance to protect children online.
How will the age verification requirements in the Act impact gaming companies?
93% of children in the UK play video games. As such, many gaming companies will be impacted by the Act and have to play their part in keeping children safe online.
Online video games will be in the scope of the Act if they:
- offer user-to-user interaction or allow user-generated content
- contain written chat functionality or group chat options
- have players in the UK
Like other companies impacted by the Online Safety Act, gaming platforms will need to comply with the general duties imposed on all regulated services. Gaming companies will also need to assess whether their game is likely to be accessed by children and, if so, carry out a children’s risk assessment.
Gaming platforms will need to implement age assurance measures so they know the age or age range of their players. Once they know the age of users, they can deliver an age-appropriate experience. This might involve limiting high-risk features, such as voice chat, for specific age groups, or age-gating content that is deemed inappropriate for players under a certain age.
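As a rough illustration of what this could look like inside a game, the sketch below maps a player’s age band (as returned by an age assurance check) to a set of feature flags. The age bands and the particular features restricted in each band are assumptions chosen for the example, not rules taken from the Act or from Ofcom’s guidance.

```typescript
// Illustrative sketch only: the age bands and feature flags are assumptions,
// not requirements from the Online Safety Act or Ofcom guidance.

type AgeBand = "under13" | "13to17" | "18plus" | "unknown";

interface FeatureFlags {
  voiceChat: boolean;
  publicTextChat: boolean;
  userGeneratedContent: boolean;
  matureContent: boolean;
}

function featuresForAgeBand(band: AgeBand): FeatureFlags {
  switch (band) {
    case "18plus":
      // Adult players get full functionality, including 18+ content.
      return { voiceChat: true, publicTextChat: true, userGeneratedContent: true, matureContent: true };
    case "13to17":
      // Higher-risk features are limited for teens in this hypothetical policy.
      return { voiceChat: false, publicTextChat: true, userGeneratedContent: true, matureContent: false };
    case "under13":
    default:
      // Younger players, and players whose age is unknown, get the most
      // restricted experience (default-deny).
      return { voiceChat: false, publicTextChat: false, userGeneratedContent: false, matureContent: false };
  }
}

// Example: a player whose age check returned an 18+ result.
const flags = featuresForAgeBand("18plus");
console.log(flags.voiceChat); // true
```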
Our work with Lockwood Publishing is a practical example of how age assurance can make a game age-appropriate. Adult players on Avakin Life can use our facial age estimation technology to access exclusive age-restricted features. This ensures players aged 18+ can confidently interact and chat with other adult players, while also enhancing the safety and player experience for younger audiences.
How will the age verification requirements in the Act impact dating sites?
The Online Safety Act 2023 includes new criminal offences for cyber-flashing – the sending of unsolicited nude images via dating apps. It will also aim to tackle romance fraud, which sees people tricked into sending money to scammers on dating sites.
The Online Dating Association (ODA) has championed the idea that Ofcom should closely align its Online Safety Act guidance with the ICO’s guidance on the Children’s Code (or Age Appropriate Design Code).
The ICO’s guidance says that if a significant number of children are likely to access a service, even one not designed for children, it should either introduce robust age checks or conform to the standards in the Children’s Code. The guidance includes two dating sector-specific use cases, looking at two different types of dating services – those likely to see attempted access by minors and those that are unlikely to.
Whilst we wait for Ofcom to confirm the guidelines on how dating companies can comply with the Online Safety Act, dating sites could start reviewing their platforms. If they run an 18+ service, they should be considering what percentage of users might be underage, or looking for evidence that indicates whether underage people are likely to engage with members of the site.
A number of dating sites have already started looking into age assurance technology and how it can improve user safety and trust.
How Yoti can help
The UK’s Online Safety Act aims to improve internet safety by making online platforms more responsible for regulating and reducing harmful content. It seeks to balance protecting users, especially children and vulnerable adults, with preserving freedom of expression.
When it comes to the age verification requirements, it will be important for regulated companies to balance effective age checking with privacy. We believe that people should be able to choose between different methods to prove their age online. This will be essential for making sure age checking is inclusive and accessible for everyone.
Yoti offers a range of age assurance options to help platforms of all types and sizes comply with the regulations. We’re already helping platforms like OnlyFans, Instagram, Lockwood Publishing, Yubo and Facebook Dating to keep minors safe and create age-appropriate experiences.
To find out how we can help you comply with age assurance for the Online Safety Act, please get in touch.