Articles
The importance of transparency for facial age estimation
To protect young people online, businesses need to provide age-appropriate experiences for their users. This could apply to online marketplaces, social media networks, content sharing platforms and gaming sites. But to put the correct measures in place, businesses need to know the ages of their customers. It was previously thought that the only way to confidently establish a user’s age was with an identity document, like a passport or driving licence, or checks to third-party databases such as credit reference agencies or mobile network operators. However, regulators are now recognising facial age estimation as an effective alternative…
Thoughts from our CEO
In this blog series, our CEO Robin Tombs shares his experience while focusing on major themes, news and issues in the world of identity verification and age assurance. This month, Robin gives a summary of recent age verification events, discusses the growing threat of AI voice cloning scams, and explains how Digital IDs can protect you from online scams. Summary of age verification events The last few months have been a really busy time in the age verification world. Here’s a snapshot of some of the key events from the last 10 weeks…
Making age checks inclusive
In today’s world, we can access a range of goods, services and experiences online. As a result, regulations are being passed across the globe to ensure that young people can safely navigate the digital world. From the UK’s Online Safety Act to age verification laws in the US, platforms are being required to check the ages of their users. To do this effectively, the age assurance methods offered must be inclusive and accessible to as many people as possible. Why can’t businesses just use ID documents? Most people tend to think of age assurance as checking a person’s age…
How age assurance builds trust and safety on gaming platforms
There is growing agreement that more needs to be done to improve online safety. Regulators around the world are introducing new laws to make the digital world safer and ensure young people have an age-appropriate experience online. With legislation such as the Age Appropriate Design Code, the UK’s Online Safety Act, and the EU’s Digital Services Act reshaping the industry, gaming companies are facing a new era of accountability and responsibility. From implementing age assurance measures to ensuring age-appropriate content and experiences, gaming companies must navigate the regulatory landscape while prioritising user safety and privacy. This blog explores some…
Protect yourself with peer-to-peer checks
Identity-related fraud is huge, both in the real world and, even more so, online, where it’s easy to hide behind a fake, unchecked profile. Most platforms don’t want to add friction to their onboarding journeys for fear that people will migrate to another site. A Which? survey of 1,300 buyers found that a whopping 32% had been scammed on second-hand marketplaces in the last two years. Whether someone is selling stolen goods, committing romance scams, catfishing, or advertising property rentals that don’t exist, there is a whole host of fraud-related crimes, and identity is often at the centre…
Preparing for the EU’s new AI Act
Artificial intelligence (AI) is changing our world at a speed that, just a decade ago, we never could have anticipated. As AI finds its way into our everyday lives, regulators are racing to catch up with its development. In response, last month, the EU voted to bring in the Artificial Intelligence Act, also known as the AI Act. The Act is expected to enter into force in May or June 2024. This blog looks at what the legislation means for businesses and how they can comply. Why is there an AI Act? In recent years, it seems as though AI…