From the UK’s Online Safety Act to Europe’s Digital Services Act, we’re in an era of increasing online safety regulation. In the US, the Kids Online Safety and Privacy Act (KOSPA) is a landmark piece of legislation, passed in the Senate on 30th July 2024.
KOSPA is a legislative package that merges two bills – the Kids Online Safety Act and the Children’s and Teens Online Privacy Protection Act (also known as COPPA 2.0).
This blog looks at some of the requirements of KOSPA and what this means for companies.
What is the purpose of KOSPA?
KOSPA is the first significant piece of legislation protecting children online to advance in nearly 30 years. It would create heightened protections for teenagers, and new requirements for the design and safety of online platforms.
President Joe Biden has been calling for legislation to improve online safety for young people for the past three years. He first spoke about online harms to young people in his State of the Union address in 2022, pushing for new privacy protections for children online. He has continued to champion the issue and spoke about it again in his 2024 State of the Union address, recognising that ‘there’s more to do, including passing bipartisan privacy legislation to protect our children online’.
If it becomes law, KOSPA – which includes the Kids Online Safety Act (KOSA) and the Children’s and Teens Online Privacy Protection Act (COPPA 2.0) – could change how the US government regulates technology platforms and child safety.
What is KOSA?
The Kids Online Safety Act (KOSA) was first introduced in February 2022 by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN). It aims to protect children from harm online. It would require platforms to limit addictive features, allow young people to opt out of algorithmic recommendations, and restrict access to children’s personal data. A key part of KOSA involves implementing a duty of care to prevent and mitigate risks faced by children.
What is COPPA 2.0?
COPPA 2.0 aims to update the Children’s Online Privacy Protection Act of 1998 to strengthen protections around the collection and use of personal data of children and teenagers.
COPPA 2.0 has four key changes:
- It raises the maximum age of children covered under the law from 12 to 17 years.
- It updates the law’s definition of personal information to include biometric data such as fingerprints, voiceprints, facial imagery and gait, bringing the law in line with advances in technology.
- It will close a loophole that allows some companies to track children if they can claim they don’t have “actual knowledge” that their customers are underage. Instead, COPPA 2.0 states that protections should be offered if platforms “reasonably know” young people are using their services (e.g. if users are profiled by age).
- It moves to autonomous consent for teens, rather than relying on Verified Parental Consent (VPC).
Who does KOSPA apply to?
A core part of KOSPA (arising from KOSA) would set a duty of care to protect ‘minors’ (defined as children under 17) on certain online platforms – including social media, online video games, virtual reality worlds, online messaging and video streaming sites. The bill, however, exempts some entities and services, such as email providers, internet service providers and educational institutions. Regulated or ‘covered’ platforms would need to take “reasonable” measures in how they design their products to limit harm. This might include limiting addictive or harmful features, implementing the highest privacy settings by default and developing more robust parental controls.
The Federal Trade Commission (FTC) would be responsible for enforcing the requirements of the bill. Companies that do not comply, or that fail to prevent or reduce harm online, would face significant enforcement actions and lawsuits. State attorneys general would be able to enforce certain parts of the law, including its provisions on safeguards for minors, disclosure and transparency.
Has KOSPA been signed into law?
KOSPA passed the Senate on 30th July 2024 by a vote of 91-3. In September 2024, the House Energy and Commerce Committee advanced the two landmark bills during a markup, sending the legislation to House leaders for a potential floor vote and bringing the bills one step closer to becoming law.
A number of civil liberties advocates and opponents of the KOSA bill have raised concerns, primarily around privacy, the censorship of free speech and limits on access to essential information, including LGBTQ+ related topics. But the bill’s supporters say it is not designed to target specific types of content, but rather the design of the algorithms that recommend content.
An update to the KOSA bill in February 2024 aimed to address the concerns around free speech and access to information. The updated wording includes a specific definition of a “design feature”: any feature that encourages minors to spend more time and attention on a platform, such as infinite scrolling, notifications and rewards for staying online. By focusing on these types of features, the bill’s supporters aim to make platforms safer by design, rather than regulating the content they host.
What are the current requirements of the bill?
One of the key parts of KOSPA (arising from KOSA) is a duty of care to prevent and mitigate risks faced by users under 17. Platforms would need to reduce potentially harmful content, including content relating to self-harm, anxiety, depression and eating disorders. Platforms would also have to activate the highest levels of privacy and safety settings by default for users under 17, and limit certain design features such as infinite scrolling and rewards for staying online.
KOSPA also requires safeguards for children on the internet, such as preventing unknown adults from contacting children or viewing their personal information, restricting the sharing of minors’ geolocation data, and letting children’s accounts opt out of personalised recommendations. It would also allow parents or guardians to access children’s privacy and account settings. These safeguards share some similarities with the UK’s Age Appropriate Design Code, and would restrict platforms from collecting children’s personal data.
The current wording of the Kids Online Safety Act (KOSA) states that platforms must also:
- disclose specified information, including details regarding the use of personalised recommendation systems and individual-specific advertising to minors
- allow parents, guardians, minors and schools to report certain harms
- refrain from facilitating advertising of age-restricted products or services, such as tobacco and gambling, to minors
- provide users with dedicated pages on which to report harmful content
Platforms with more than 10 million monthly active users in the US would also need to report annually on foreseeable risks of harm to minors and the mitigation steps they have introduced.
What’s next for KOSPA?
Now that KOSPA has passed the Senate, there are a few more steps before it becomes law: it must pass the House before going to the President for signing.
With other countries implementing their own online safety regulations, such as the UK’s Online Safety Act and Europe’s Digital Services Act, we will be watching closely to see how KOSPA progresses.
We will continue to update this blog as KOSPA evolves and progresses through Congress.
Keep up to date with the latest news by following us on LinkedIn.