Understanding the Kids Online Safety and Privacy Act (KOSPA)

Rachael Trotman · 7 min read
Image of a young girl looking at her mobile phone. The accompanying text reads "Kids Online Safety and Privacy Act - United States".

From the UK’s Online Safety Act to Europe’s Digital Services Act, we’re in an era of increasing online safety regulation. In the US, the Kids Online Safety and Privacy Act (KOSPA) is a landmark piece of legislation, passed in the Senate on 30th July 2024. 

KOSPA is a legislative package that merges two bills – the Kids Online Safety Act and the Children’s and Teens Online Privacy Protection Act (also known as COPPA 2.0). 

This blog looks at some of the requirements of KOSPA and what this means for companies.

What is the purpose of KOSPA?

KOSPA marks the first time in nearly 30 years that legislation protecting children online has passed the Senate. It would create heightened protections for teenagers, along with new requirements for the design and safety of online platforms.

President Joe Biden has been calling for legislation to improve online safety for young people for the past three years. He first spoke about online harms to young people in his State of the Union address in 2022, pushing for new privacy protections for children online. He has continued to champion the issue and returned to it in his 2024 State of the Union address, acknowledging that ‘there’s more to do, including passing bipartisan privacy legislation to protect our children online’.

If it becomes law, KOSPA – which includes the Kids Online Safety Act (KOSA) and the Children’s and Teens Online Privacy Protection Act (COPPA 2.0) – could change how the US government regulates technology platforms and child safety.

What is KOSA?

The Kids Online Safety Act (KOSA) was first introduced in February 2022 by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN). It aims to protect children from harm online. It would require platforms to limit addictive features, allow young people to opt out of algorithmic recommendations, and restrict access to children’s personal data. A key part of KOSA involves implementing a duty of care to prevent and mitigate risks faced by children.

What is COPPA 2.0?

COPPA 2.0 aims to update the Children’s Online Privacy Protection Act of 1998 to strengthen protections around the collection and use of personal data of children and teenagers. 

COPPA 2.0 has four key changes:

  • It raises the age of children covered under the law from under 13 to under 17.
  • It updates the law’s definition of personal information to include biometric data such as fingerprints, voiceprints, facial imagery and gait, bringing the law in line with advances in technology.
  • It will close a loophole that allows some companies to track children if they can claim they don’t have “actual knowledge” that their customers are underage. Instead, COPPA 2.0 states that protections should be offered if platforms “reasonably know” young people are using their services (e.g. if users are profiled by age).
  • It moves to autonomous consent for teens, rather than relying on Verified Parental Consent (VPC).

Who does KOSPA apply to?

A core part of KOSPA (arising from KOSA) would set a duty of care to protect ‘minors’ (defined as children under 17) on certain online platforms – including social media, online video games, virtual reality worlds, online messaging and video streaming sites. The bill, however, exempts some entities and services, such as email providers, internet service providers and educational institutions. Regulated or ‘covered’ platforms would need to take “reasonable” measures in how they design their products to limit harm. This might include limiting addictive or harmful features, implementing the highest privacy settings by default and developing more robust parental controls.

The Federal Trade Commission (FTC) would be responsible for enforcing the requirements of the bill. Companies that do not comply, or that fail to prevent or reduce harm online, could face significant lawsuits. State attorneys general would be able to enforce certain parts of the law, including its provisions on safeguards for minors, disclosure and transparency.

Has KOSPA been signed into law?

KOSPA was passed in the Senate on 30th July 2024 by a vote of 91-3. In September 2024, the House Energy and Commerce Committee advanced the two landmark bills in a markup vote, sending the legislation to House leaders for a potential floor vote and bringing the bills one step closer to becoming law.

A number of civil liberties advocates and opponents of the KOSA bill have raised concerns, primarily around privacy, the censorship of free speech and restricted access to essential information, including LGBTQ+ related topics. But the KOSA bill’s supporters say it is not designed to target specific types of content – rather the design of the algorithms that recommend content.

An update to the KOSA bill in February 2024 aimed to address the concerns around free speech and access to information. The updated wording includes a specific definition of a “design feature” – any feature that encourages minors to spend more time and attention on a platform, such as infinite scrolling, notifications and rewards for staying online. By focusing on these types of features, the bill’s supporters aim to make platforms safer by design, rather than targeting the content they host.

What are the current requirements of the bill?

One of the key parts of KOSPA (arising from KOSA) is the introduction of a duty of care to prevent and mitigate risks faced by users under 17. Platforms would need to reduce potentially harmful content, including content on self-harm, anxiety, depression and eating disorders. Platforms must activate the highest levels of privacy and safety settings by default for users under 17, and limit certain design features such as infinite scrolling and rewards for staying online.

Platforms would also have to implement the most restrictive privacy and safety settings by default. KOSPA requires safeguards for children on the internet, such as preventing unknown adults from contacting children or viewing their personal information, restricting the ability to share minors’ geolocation data, and allowing children’s accounts to opt out of personalised recommendations. It would also allow parents or guardians to access children’s privacy and account settings. These safeguards have some similarities with the UK’s Age Appropriate Design Code, and would restrict how platforms collect minors’ personal data.

The current wording of the Kids Online Safety Act (KOSA) states that platforms must also:

  • disclose specified information, including details regarding the use of personalised recommendation systems and individual-specific advertising to minors
  • allow parents, guardians, minors and schools to report certain harms
  • refrain from facilitating advertising of age-restricted products or services, such as tobacco and gambling, to minors
  • provide users with dedicated pages on which to report harmful content

Platforms with more than 10 million monthly active users in the US would also need to report annually on foreseeable risks of harm to minors and the mitigation steps they have introduced.

What’s next for KOSPA?

Now that KOSPA has been passed in the Senate, there are a few more steps before it becomes law. It must next pass the House before going to the President for signing.

With other countries implementing their own online safety regulations, such as the UK’s Online Safety Act and Europe’s Digital Services Act, we will be watching closely to see how KOSPA progresses.

We will continue to update this blog as KOSPA evolves and progresses through Congress. 

Keep up to date with the latest news by following us on LinkedIn.
