Commentary

First Steps Toward Operationalizing Age Assurance Mandates: New York SAFE for Kids Act Proposed Rules

As governments around the world move to require age assurance online, attention is growing around whether and how these mechanisms can be implemented accurately and effectively. Recently proposed rules from the New York Office of the Attorney General represent the most significant effort to date by a US public authority to operationalize an age assurance mandate. KGI’s comments on the proposed rules recommend improvements to make them more technically sound and to better account for privacy and service availability.


Recent controversies surrounding age verification on major platforms – from Discord delaying its age verification rollout to Reddit facing fines in the UK for children’s privacy failures – highlight the challenges of adopting age assurance systems in ways that protect user privacy and experiences.

Across the United States and globally, governments are increasingly considering requirements for platforms to verify or estimate users’ ages as part of broader efforts to protect minors online. The majority of US states have adopted or are considering some form of age assurance mandate. These initiatives mark a significant shift from the longstanding norm that most online services can be accessed without verifying a user’s age, raising complex questions about accuracy, privacy, accessibility, and resistance to circumvention.

Given the complexity of age assurance, some state laws require state Attorneys General or other authorities to adopt implementing guidance or rules before age assurance mandates come into effect. The New York Office of the Attorney General (OAG) was first out of the gate when it announced its notice of proposed rulemaking (NPRM) to implement the Stop Addictive Feeds Exploitation (SAFE) for Kids Act in September 2025. This rulemaking represents the most significant effort to date by a US public authority to operationalize and monitor an age assurance mandate.

KGI welcomed the opportunity to provide input late last year to the New York OAG on the proposed rules for implementing the SAFE for Kids Act, a law which would require social media companies to restrict algorithmically personalized “addictive” feeds and nighttime notifications for users under 18 unless parental consent is provided. With our own landmark report, Age Assurance Online: A Technical Assessment of Current Systems and Their Limitations, now published, we are pleased to share our NPRM comments with the public.

Our submission focuses on the technical and operational requirements of implementing age assurance at scale. Public authorities designing age assurance requirements must navigate trade-offs among accuracy, privacy, circumvention resistance, service availability across the user population, and effects on the production and consumption of speech and information. All age assurance systems confront these trade-offs.

Where age assurance is mandated, its design and implementation should follow current best practices for protecting privacy and ensuring service availability. To that end, KGI’s comments recommend changes to the proposed rules to make them technically sound and better reflect the properties of age assurance systems as they are currently deployed. We suggest ways to improve how the rules balance accuracy with availability given high error rates for some age assurance methods.

To strengthen the proposed rules and ensure they are technically sound and implementable, KGI recommends the following changes: 

  • Move circumvention detection from a quantitative metric to a qualitative certification standard.
  • Ensure certification reports are complete and publicly available for independent verification.
  • Require the use of zero-knowledge proofs (ZKPs) – a cryptographic technology that allows users to prove that they meet an age threshold without revealing their actual age or identity – to support privacy-preserving age assurance once widely supported by mobile operating systems.
  • Clarify privacy, security, and parental consent provisions to make them more implementable, including clearer guidance on what constitutes an acceptable method for verifying parental consent, or a process by which covered operators can confirm that their proposed methods are acceptable.

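To make the zero-knowledge proof concept concrete: the core property is that a prover can convince a verifier that a statement is true (e.g., "I know a secret" or, in the age-assurance setting, "my credential attests I am over 18") without revealing anything beyond that fact. The sketch below is a classic toy Schnorr proof of knowledge with the Fiat-Shamir transform, not the credential-based age-over schemes that mobile operating systems would actually deploy; the group parameters and all names here are demonstration-sized assumptions, not a production design.

```python
# Toy Schnorr proof of knowledge (Fiat-Shamir transform): the prover
# shows it knows a secret x satisfying y = G^x (mod P) without ever
# revealing x. Parameters are tiny demonstration values, NOT secure.
import hashlib
import secrets

P = 2879  # small safe prime, P = 2*Q + 1 (toy size only)
Q = 1439  # prime order of the subgroup of squares mod P
G = 4     # generator of that order-Q subgroup

def keygen():
    """Return (secret x, public value y = G^x mod P)."""
    x = secrets.randbelow(Q - 1) + 1
    return x, pow(G, x, P)

def _challenge(y, t):
    """Derive the challenge by hashing the public transcript."""
    data = f"{G}:{P}:{y}:{t}".encode()
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % Q

def prove(x, y):
    """Produce a non-interactive proof of knowledge of x."""
    r = secrets.randbelow(Q - 1) + 1
    t = pow(G, r, P)      # commitment to fresh randomness
    c = _challenge(y, t)  # challenge (Fiat-Shamir: hash, not verifier)
    s = (r + c * x) % Q   # response; r masks x, so x stays hidden
    return t, s

def verify(y, proof):
    """Check G^s == t * y^c (mod P); learns nothing about x."""
    t, s = proof
    c = _challenge(y, t)
    return pow(G, s, P) == (t * pow(y, c, P)) % P
```

The verifier checks a single equation that holds exactly when the prover knew the secret, yet the transcript reveals nothing about the secret itself. Deployed age-assurance designs apply the same property to signed credentials, proving a predicate such as "birth year is before a threshold" rather than knowledge of a key.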
As these rules may offer the first opportunity in the US for detailed guidance to scaffold an age assurance mandate, they will serve as an early test of whether such guidance can support deployed systems that are simultaneously effective, highly available, and privacy-preserving.

The full submission provides detailed analysis of these issues and recommendations and can be accessed here.
