An overview of the COPPA updates (and what it means for your business)

Yoti · 13 min read

The United States Federal Trade Commission has released its updates to the Children’s Online Privacy Protection Act (COPPA) Rule. It aims to strengthen key privacy protections for children online and better reflect the challenges faced in the modern digital age.

The updates introduce stricter requirements for the collection, use and sharing of children’s data. However, it’s worth noting that the rule doesn’t include an explicit exception for the use of children’s personal information solely for age verification. This complicates compliance for platforms that wish to implement more robust age checking than self-declaration. 

Yoti is ready to assist companies with age assurance to support their COPPA compliance. We’ve taken a look at some of the key developments below.

 

What is COPPA?

The Children’s Online Privacy Protection Act of 1998 (COPPA) is a US federal law. Enforced by the Federal Trade Commission (FTC), it regulates the collection, use and sharing of the personal information of under-13s on all commercial websites and online services, including apps and digital platforms.

Originally implemented in 2000, the law was later updated in 2013 to account for digital and technological advances. A second comprehensive review began in 2019, with the final changes published earlier this year. The updated regulations are legally binding mandates on any business subject to the law.

The new changes will take effect 60 days after publication in the Federal Register. Unless otherwise stated, online services and websites will have one year from the rule’s publication to meet the standards.

 

Key updates to the COPPA Rule

Definition of ‘personal information’

The FTC has recognised that, as technology has advanced, the breadth of what qualifies as “personal information” has expanded.

Alongside the well-recognised original definition, which includes details like names or IP addresses, personal information now includes “biometric identifiers that can be used for the automated or semi-automated recognition of an individual”. This includes, but is not limited to, fingerprints, retina patterns, genetic data, voiceprints and facial templates.

 

Guidance on compliance for “general audience” websites and services

The law splits websites and services into two categories: “general audience” and “mixed audience”.

General audience websites and services are not primarily designed for children under 13. Services in this category include news websites, general social media platforms and e-commerce sites that do not cater to children.

Previously, these sites had no obligations under COPPA unless they had actual knowledge that users were under 13. If actual knowledge existed (such as if a user self-declared that they were under 13), the site would need to comply with COPPA.

The new rule states that general audience services are now subject to COPPA if it’s widely known that those under 13 are using them. This applies even if these sites don’t directly target this age group. The update strengthens enforcement against services that have knowledge of users under 13 but have not taken appropriate action.

 

Definition of “actual knowledge”

Under the existing rule, general audience services only had to comply with COPPA if they had “actual knowledge” of users under 13. A site would only be subject to the legislation if it had proof it was collecting personal data from this demographic, for example if a user entered a date of birth stating that they were under 13.

The updated rule expands on what qualifies as “actual knowledge”. Now, a site must comply with COPPA if it is deemed to have actual knowledge based on a “totality of circumstances”, on a case-by-case assessment. This includes:

  • reports, media coverage or research showing that a significant number of under-13s use the service
  • internal analytics or AI systems indicating child usage
  • user complaints or reports about children using the service

 

Definition of “mixed audience”

The 2013 COPPA amendment introduced the concept of “mixed audience” services and sites. However, the term was never clearly defined, leaving businesses uncertain about whether they were required to satisfy COPPA’s requirements.

In the most recent update, the FTC states that a mixed audience website or service is one that is “directed to children” but “does not target children as its primary audience”. Part of the revised test to determine if a company has a mixed audience is whether the content, features and branding are designed for a broad audience but may still appeal to children. These could include gaming platforms and social media sites that have content appealing to all ages, even if only part of it appeals to children younger than 13. This category is distinct from “general audience” or “primarily child-directed” sites or services.

Mixed audience services must not collect any personal information from users prior to determining the user’s age (other than for the very limited purposes set out within the Act). The purpose of this revision is to require these services to take appropriate steps to identify users who are children prior to providing the service. If the service determines that the user is under 13, COPPA rules will then apply. As part of compliance, the online site or service must obtain “verified parental consent” before collecting any personal data from the child. 

The law is clear that age assurance mechanisms for determining age must not encourage users to falsify their age information. For example, websites can’t imply that visitors over a particular age can access better content or features. Additionally, mixed audience sites can’t completely deny access to users who are under 13.
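As a purely illustrative sketch (hypothetical names and flow, not Yoti’s product or the rule’s mandated design), an age gate that satisfies these constraints would route under-13s to a COPPA-protected child experience rather than blocking them, and would avoid any wording that incentivises entering a higher age:

```python
from datetime import date

UNDER_13_CUTOFF = 13

def age_from_dob(dob: date, today: date) -> int:
    """Age in whole years on a given day."""
    years = today.year - dob.year
    # Subtract a year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def route_user(dob: date, today: date) -> str:
    """Route users neutrally: under-13s get a child experience with
    COPPA protections, never a denial of access, and no copy anywhere
    suggests a higher age unlocks 'better' content or features."""
    if age_from_dob(dob, today) < UNDER_13_CUTOFF:
        # Verified parental consent is needed before collecting
        # personal data from this user.
        return "child_experience"
    return "general_experience"
```

The key design point is that both branches lead to a working service; the under-13 branch simply triggers the consent obligations before any further data collection.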

 

Definition of “directed to children”

The FTC lays out criteria to determine if a platform is “directed to children”. These are factors that the FTC will consider when undertaking reviews of services and websites. A non-exhaustive list of elements include:

  • marketing or promotional materials or plans
  • representations to consumers or to third parties
  • reviews by users or third parties
  • the age of users on similar websites or services

 

Guidance on age verification for “mixed audience” websites and services

Under COPPA, mixed audience services need to take appropriate steps “in light of available technology, to determine whether the visitor is a child”. Previously, users could self-declare their age, such as by ticking a box or entering their date of birth. Further verification was only necessary if there was explicit reason to believe the user was under 13. If a user stated they were under 13, this gave the site “actual knowledge” of these users. Then COPPA protections would apply, if required.

COPPA now states that mixed audience sites can no longer rely on self-declaration to ascertain a user’s age. To support this, the rule allows sites to collect limited personal information without parental consent, solely to determine which version of the service the user can access.

This change was long-sought by privacy and consumer advocates as self-declaration can be easily circumvented by simply entering a false date of birth. This could allow underage users to access content that’s not age-appropriate.

The FTC doesn’t outline what qualifies as appropriate steps or commercially reasonable age assurance approaches; instead, it takes an outcomes-based approach. With this in mind, services should offer a range of highly effective age assurance methods to make access to their platforms as inclusive as possible.

 

Changes to verified parental consent

Under the new rule, services must obtain consent from parents of under-13s for two distinct purposes. The first is to consent to the initial collection of their child’s data. The second is to consent to the disclosure of their child’s information to third parties, such as for targeted advertising. An exception exists if the personal information is disclosed to support the service’s internal operations or is integral to the website.

Verifiable parental consent involves checking the parent’s identity before starting the data collection process. One way to do this is using “knowledge-based authentication”. This requires the parent to answer complex questions where the probability of guessing the correct answer is low.

Another method is by matching a live selfie with the image on a verified government-issued photo ID. After the result is delivered, sites must immediately and permanently delete the information and image reviewed from the document.
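A minimal sketch of that delete-after-result flow, with `faces_match` as a hypothetical placeholder for a real biometric matcher (a trivial byte comparison here purely so the example runs):

```python
def faces_match(selfie: bytes, id_photo: bytes) -> bool:
    """Stand-in for a real biometric matcher; compares raw bytes
    only so the sketch is self-contained and runnable."""
    return selfie == id_photo

def verify_parent_and_discard(selfie: bytes, id_photo: bytes) -> bool:
    """Deliver a match result, then discard the reviewed images.
    A real system would also have to securely and permanently delete
    any stored copies (disk, object store, logs), not just locals."""
    try:
        result = faces_match(selfie, id_photo)
    finally:
        # Drop the local references immediately so no image data
        # outlives the verification step inside this function.
        del selfie, id_photo
    return result
```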

An application to approve facial age estimation as a further method of verified parental consent has not yet been concluded. The FTC “declined without prejudice” the application, submitted by the Entertainment Software Rating Board (ESRB), because the independent benchmarking report on facial age estimation from the US National Institute of Standards and Technology (NIST) had been delayed. As the application was declined without prejudice, it can be refiled in the future.

 

Introducing “text plus” verification

The new COPPA rule introduces a “text plus” verification method, similar to the existing “email plus” option. The FTC authorises companies to collect limited personal data, such as a mobile number or email address, for the express and limited purpose of initiating the process of obtaining verified parental consent.

Here, operators send a text message or an email to the parent to provide consent. The “plus” element refers to the additional steps required for further verification. This could be in the form of a second confirmatory text message, letter or phone call to confirm consent has been granted. In other cases, operators may confirm consent by post after asking for a mailing address.

However, there’s still a risk that children could grant themselves access to these services by using a parent’s device. As such, these options are not available when consenting to the disclosure of personal information to third parties. Messages must also include a notice about revoking consent given in response to the initial message or email.
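One way to picture the “text plus” steps is as a small state machine. The states and method names below are illustrative assumptions, not terminology from the rule:

```python
from enum import Enum, auto

class ConsentState(Enum):
    STARTED = auto()          # initial text or email sent to the parent
    CONFIRM_PENDING = auto()  # awaiting the "plus" confirmation step
    GRANTED = auto()
    REVOKED = auto()

class TextPlusConsent:
    """Toy state machine for a 'text plus' parental consent flow."""

    def __init__(self) -> None:
        self.state = ConsentState.STARTED

    def parent_replied(self) -> None:
        # Parent responded to the initial message.
        if self.state is ConsentState.STARTED:
            self.state = ConsentState.CONFIRM_PENDING

    def confirmation_received(self) -> None:
        # The "plus": a second confirmatory text, letter or phone call.
        if self.state is ConsentState.CONFIRM_PENDING:
            self.state = ConsentState.GRANTED

    def revoke(self) -> None:
        # Messages must explain how the parent can revoke consent.
        self.state = ConsentState.REVOKED

    def may_collect_child_data(self) -> bool:
        return self.state is ConsentState.GRANTED
```

Note that data collection is only permitted in the fully confirmed state, and revocation is reachable from anywhere.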

 

Statement by the Chairman of the FTC

In an unusual move, the Chairman of the Federal Trade Commission, Andrew N. Ferguson, published a separate statement in parallel to the updated rule. He expressed his disappointment about some aspects of the FTC’s updates, including the fact that collecting and using children’s personal data to determine their age still requires verified parental consent. 

Ferguson stated there should be an exception to this requirement, provided the data is deleted once its purpose is fulfilled. This would allow for more robust and accurate age verification, without complicating the process for online services. He believes this should be the case as the rule already contains other exceptions for limited data collection without verified parental consent.

 

Direct notice to parents

Services must make reasonable efforts to provide “direct notice” to parents of their practices regarding under 13s’ data. Operators must inform parents of any changes in the collection, use, or disclosure practices to which the parent has previously consented. 

Additionally, if the service collects any audio files containing the child’s voice, it must describe how it’ll use the files and must commit to deleting the files after using them for the purpose for which they were collected.

If services disclose personal information to third parties, they must describe how the data will be used, why it’s needed, the data retention period, and which third parties will receive it. This transparency ensures parents are informed about how their children’s data is handled.

 

Liability for third-party data collection

Under the new law, platforms may also be liable for third-party collection of children’s personal data that they have facilitated. Previously, responsibility fell only to the third party that collected the data. The hosting platform itself wasn’t liable unless it had actual knowledge of the data collection by the third party.

Now, a service must comply with COPPA if it allows third parties to collect children’s data and benefits from this collection. Otherwise, it risks being held jointly liable for any non-compliance. Examples of benefits include advertising revenue or increased engagement. This rule applies even if the service does not explicitly target under-13s.

 

Strengthening data protection and data retention practices 

The COPPA update requires online services and websites to establish, implement and maintain effective data protection programs. These programs must be adapted according to the sensitivity of the information collected and the operator’s size, complexity, and nature and scope of activities.

Services should have designated employees to coordinate these programs. They must undertake assessments to identify internal and external risks to the “confidentiality, security, and integrity of personal information collected from children and the sufficiency of any safeguards in place to control such risks”. Appropriate safeguards should be implemented and maintained to control the identified risks. These must undergo checks to demonstrate that they successfully prevent unauthorised access to children’s personal information.

The FTC’s rule states that the personal information of children under 13 cannot be retained indefinitely. It should only be retained for as long as it’s necessary to fulfil the purpose for which it was collected. After this point, it should be “securely deleted” when no longer needed.

Services should also have a written data retention policy that is publicly available on their website or platform. This policy should state the reason for collecting the information, why the business needs to retain the information and for how long.
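A hedged sketch of how such a policy might be enforced in code. The purposes and retention periods below are hypothetical examples, not figures from the rule:

```python
from datetime import datetime, timedelta

# Hypothetical purposes and retention periods, mirroring what a
# written, publicly available retention policy might state.
RETENTION_PERIODS = {
    "parental_consent_record": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

def is_due_for_deletion(purpose: str, collected_at: datetime, now: datetime) -> bool:
    """True once a record has outlived its purpose-specific retention
    period. Every purpose has a finite period, so no record is kept
    indefinitely; a scheduled job would securely delete due records."""
    return now - collected_at >= RETENTION_PERIODS[purpose]
```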

 

Updated transparency requirements for safe harbor programs

Safe harbor programs are self-regulatory guidelines which are submitted by businesses and approved by the FTC. Businesses can choose to be part of self-regulatory programs to help them to comply with COPPA. In this case, businesses are initially subject to the review and disciplinary procedures outlined in the program’s guidelines before facing those drawn up by the FTC.

The update includes revised requirements for safe harbor programs. It requires safe harbor programs to publish their member operator lists and to report certain information to the FTC. This includes the program’s “business model, and the technological capabilities and mechanisms that will be used for initial and continuing assessment of subject operators’ fitness for membership in the safe harbor program”. The program must also be able to make available any “consumer complaints alleging violations of the guidelines”.

 

A significant step to enhancing privacy safeguards for children

We welcome the significant steps made towards enhancing online privacy safeguards for children. However, it’s worth noting Chairman Ferguson’s concerns regarding barriers to implementing robust age assurance measures by mixed audience platforms.

We hope these points will be considered in future revisions. This will help to ensure personal data is protected without creating unnecessary barriers for services that don’t primarily target children.

If your service requires age assurance to comply with COPPA, please get in touch.

Please note this blog has been prepared for informational purposes only. You should always seek independent legal advice.
