Yoti responds to Ofcom’s final guidance on highly effective age assurance for Part 5 pornography providers

Rachael Trotman | 7 min read

Ofcom has published its final guidance on highly effective age assurance for Part 5 providers of pornography under the Online Safety Act. The guidance contains many good principles to help protect children online, and it sets a clear deadline of July 2025 for all sites (whether dedicated pornography sites or social media platforms that allow pornography) to have age verification in place to prevent children from accessing adult content.

We are pleased to see that Ofcom has listed several popular age assurance methods, such as facial age estimation, Digital ID wallets, and document verification, as capable of being highly effective. These technologies offer robust and privacy-preserving ways to confirm that users are over 18.

However, there are certain areas where the guidance falls short. It lacks the detail and clarity the industry urgently needs to build public trust and to ensure platforms clearly understand and comply with the requirements. We explore some of the missing details below.

 

A lack of definitive methods

The guidance does not provide a definitive list of methods, just the kinds of age assurance that are capable of being highly effective. There needs to be a comprehensive list, which could be updated periodically (which Ofcom does recognise), rather than leaving the industry with examples of methods that lack sufficient detail and are open to interpretation.

In particular, one commonly used method, which was noted as insufficient back in 2019, is now not listed at all: checking against publicly available or otherwise easily known information such as name, date of birth and address.

Without a clearer list, it is difficult for platforms to know whether such well-established but unlisted methods are acceptable.

 

A need for set minimum standards

It would be helpful for the guidance to set out clear, defined requirements, and to reference the statistical levels laid out in the IEEE’s international standard.

The guidance does not yet provide the clarity and comprehensive standards the industry urgently needs. Without minimum standards for what platforms must do, the risk of inconsistency and non-compliance is high. A clear and enforceable set of guidelines, with mandatory audit requirements, would not only create a level playing field but also build public trust in the system.

 

Lack of set age thresholds for age estimation

With age estimation methods, Ofcom has suggested that platforms should use a challenge age approach. For example, to access 18+ content, users would need to be estimated as 25 or over (in line with the UK’s Challenge 25 policy). This seven-year buffer reduces the risk of underage users gaining access. However, it will be down to each platform to assess and set its own challenge age, which could create inconsistencies across the industry.
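As a simple illustration, here is a minimal sketch (in Python, with illustrative function names and thresholds, not anything prescribed by Ofcom) of how a challenge-age rule might sit on top of an age estimation result:

```python
# Minimal sketch of a challenge-age rule for age estimation.
# Threshold values are illustrative only.

ACCESS_AGE = 18      # age required to view the restricted content
CHALLENGE_AGE = 25   # Challenge 25-style buffer: estimate must reach 25

def age_estimation_outcome(estimated_age: float) -> str:
    """Decide the outcome of an age-estimation check using a challenge age."""
    if estimated_age >= CHALLENGE_AGE:
        return "allow"  # estimated comfortably above 18
    # Below the challenge age: offer a fallback such as a document or
    # digital ID check, rather than rejecting the user outright.
    return "escalate_to_secondary_check"

print(age_estimation_outcome(31.2))  # "allow"
print(age_estimation_outcome(21.7))  # "escalate_to_secondary_check"
```

The buffer is the parameter left to each platform under the guidance; a different regulator could mandate a different value, as the next example shows.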

Other regulators have chosen to be more specific, rather than simply giving an example of what a threshold could be. In Germany, for instance, adult sites must use a five-year buffer, equating to a threshold of 23 years, when using facial age estimation for online pornography.

Without precise guidance, there’s a risk of inconsistency and non-compliance. Clear, well-defined performance measures within the regulations would support platforms to comply and help build trust in online age assurance.

 

No minimum standards for document verification

For document verification, Ofcom has suggested that measures to prevent circumvention are used, for example to stop a child uploading an image of an ID that does not belong to them. But again, this is guidance rather than a set of minimum standards.

For document checks to be highly effective, minimum standards should require face matching, document authenticity checks and liveness detection. This would ensure the right person is presenting the document, the document is real and authentic, and that a real person is completing the age check. Otherwise there is too much opportunity for successful circumvention.
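To show what such minimum standards could look like in practice, here is a hedged sketch (in Python, using hypothetical helper results rather than any vendor’s actual API) of a document-based age check that only passes when all three elements succeed:

```python
# Illustrative sketch: a document check passes only if the document is
# authentic, the face matches the document, and liveness is confirmed.
# The result fields are hypothetical placeholders, not a real API.
from dataclasses import dataclass
from datetime import date

@dataclass
class DocumentCheckResult:
    document_is_authentic: bool   # security features and tamper checks passed
    face_matches_document: bool   # selfie matched to the document portrait
    liveness_passed: bool         # a real, present person took the selfie
    date_of_birth: date | None

def is_over_18(result: DocumentCheckResult, today: date) -> bool:
    """Return True only if all three checks passed and the DOB shows 18+."""
    if not (result.document_is_authentic
            and result.face_matches_document
            and result.liveness_passed):
        return False  # any single failure leaves the check open to circumvention
    if result.date_of_birth is None:
        return False
    dob = result.date_of_birth
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= 18
```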

 

Lack of detail around age tokens

It is encouraging to see Ofcom recognise anonymised age tokens in the guidance. Age tokens are secure credentials that allow users to verify their age once, and then gain access to a number of age-restricted platforms.

An important detail missing from the guidance is that the initial age check behind an age token should be carried out using a highly effective method; after all, the age token is only as good as the initial check it reflects. Age tokens derived from medium-confidence age checks (such as credit card checks, or third-party authorisation with no additional authentication) should not be deemed highly effective. It is also unclear whether Ofcom has a view on the maximum period before an age token expires.
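As a sketch of the missing detail, an age token could carry both the method used for the initial check and an expiry time, so a relying platform can reject tokens from lower-confidence checks or ones that have lapsed. The field names, method list and 90-day validity period below are assumptions for illustration, not anything taken from the guidance:

```python
# Illustrative age-token payload and validation logic.
from datetime import datetime, timedelta, timezone

HIGHLY_EFFECTIVE_METHODS = {
    "facial_age_estimation", "document_verification", "digital_id_wallet",
}

token = {
    "over_18": True,
    "method": "facial_age_estimation",  # how the initial age check was done
    "issued_at": datetime(2025, 7, 1, tzinfo=timezone.utc),
    "expires_at": datetime(2025, 7, 1, tzinfo=timezone.utc) + timedelta(days=90),
}

def accept_token(tok: dict, now: datetime) -> bool:
    """Only trust tokens backed by a highly effective initial check
    that are still within their validity period."""
    return (
        tok["over_18"]
        and tok["method"] in HIGHLY_EFFECTIVE_METHODS
        and now < tok["expires_at"]
    )

print(accept_token(token, datetime(2025, 8, 15, tzinfo=timezone.utc)))  # True
```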

 

Some methods are open to circumvention

Some of the methods that Ofcom has said can be highly effective could be easily spoofed by children, which means they will not pass the robustness test.

Some elements are fundamentally essential to ensuring age checks are effective and cannot be spoofed, for example using facial age estimation with liveness detection. Ofcom should have been braver and stipulated that platforms using facial age estimation must pair it with liveness detection. Without it, the method is unnecessarily easy to circumvent; with high-quality liveness detection, facial age estimation is very hard to circumvent.

Similarly, when using an identity document, the check should include face matching, document authenticity and liveness checks. Without those steps, the method is open to spoofing: a child could easily use their parent’s driving licence.

By not specifically defining and clarifying the required process elements within certain age check methods, Ofcom is diluting the value of what highly effective, robust age checking means. It is leaving too much open to interpretation, which will lead to inconsistent, less trusted age checking across platforms.

 

In-house age checks allowed and no requirement to delete data

When writing online safety regulations, some countries, such as the US, are including a requirement that age checks cannot be carried out by the adult operator itself and that all age check data must be deleted afterwards.

Ofcom says pornography sites can carry out age checks in-house, provided they comply with the requirements of the UK’s data protection regime and follow a data protection by design approach. Other regulators, however, have decided that letting porn operators capture personally identifiable information (PII) directly from viewers in order to perform age checks is not sensible. Allowing porn sites to capture sensitive information such as passport or driving licence details, or name, date of birth and address, will not help Ofcom build high public trust.

 

Final thoughts

We have a range of highly effective age assurance solutions which allow platforms to know whether someone is an adult (over 18), without collecting any personal information. With our privacy-preserving solutions, we allow someone to prove their age anonymously, completing the age check without sharing their identity or any personal details.

We now urge Ofcom to provide the industry with further clear, practical guidance to ensure successful implementation.

Online age checking is no longer optional, but a necessary step to create safer, age-appropriate experiences online. If you need highly effective age assurance to comply with the Online Safety Act, please get in touch.
