Read KGI’s comments to the European Commission here.
The protection of children online is a top priority for policymakers around the world. From the United States to the United Kingdom and Australia to Europe, legislators and regulators are advancing a range of policies to strengthen digital safeguards for children online. In May 2025, the European Commission released draft guidelines on the protection of minors online under the Digital Services Act (DSA). These guidelines articulate a forward-looking, evidence-based approach to the governance of online safety for children.
KGI was pleased to provide feedback to the Commission on how to make the guidelines more effective. Our comments explain how to strengthen key areas of the guidelines – required risk reviews, default account settings, and algorithmic recommender system design – in line with evidence from independent research.
The Commission’s guidelines are notable for their reliance on state-of-the-art research and knowledge. KGI commends the Commission for incorporating research and evidence from diverse experts across academia, civil society, international governance bodies, and regulators from other jurisdictions.
Risk Review
A proactive, evidence-based understanding of risks is essential for effectively protecting the privacy, safety, and security of minors online. The Commission and platforms should work towards more robust metrics to understand risks and evaluate the effectiveness of risk mitigations.
The guidelines require that platforms review risks to minors on an ongoing basis. To ensure meaningful assessment of risk and mitigations, the Commission should require verifiable metrics and rigorous methodologies. KGI’s comments highlight resources that the Commission and platforms can use to inform measurement of minor-specific risks and harms across the OECD’s 5Cs typology of risk (content, conduct, contact, consumer, and cross-cutting risks):
- The Integrated Harm Framework, developed by researchers at Stanford, which documents youth-specific social media harms identified in empirical studies as well as the strength of available evidence for specific mitigation strategies.
- A design taxonomy developed by KGI, the Tech Justice Law Project, and the USC Neely Center that maps potential risks to specific platform design choices.
- A recent paper co-authored by KGI and other experts, Social Media Harm Abatement, which describes how to develop a measurement framework grounded in internal and external data sources to effectively assess risks and mitigations.
Protecting minors online requires a proactive, evidence-based understanding of risk. The guidelines’ requirement of ongoing platform risk review is an important step. To strengthen its effectiveness, we recommend that the Commission prioritize the development and use of robust, consistent metrics to evaluate risks and mitigations.
Default Account Settings and Interface Design
The guidelines focus on default account settings, which impact how minors engage online and may exacerbate risks of harm, including unwanted contact, excessive use, and unwanted explicit content. The guidelines should enable safe default account settings as well as effective review of interface design risks and mitigations.
The guidelines include a strong focus on default account settings and interface design, which play critical roles in shaping the privacy, safety, and overall well-being of minors online.
Default account settings significantly shape how and when features are used. Indeed, a growing body of research shows that default settings substantially influence how young users interact with platforms and their exposure to sources of harm.
Broad account visibility, discoverability, and automatic recommendations that connect minors with strangers have been identified as key vulnerabilities that can put minors in harm’s way. A survey commissioned by Snap as well as an internal Instagram survey show that a significant number of teens experience unwanted contact – including sexually explicit messages – highlighting the urgent need for safer default settings. Defaults and interface design are also key mechanisms by which design can contribute to excessive use of platform products, including through features and techniques such as autoplay, intermittent and variable rewards, and gamification.
The guidelines rightly highlight the critical role of default account settings and interface design in shaping how minors interact with online platforms. To meaningfully protect minors, however, the guidelines should also clarify how platforms are to assess and measure risk in default account settings and interface design.
Recommender System Design
Recommender system design is foundational for ensuring a safe online environment for minors. While the guidelines are a major step forward, they can further strengthen recommender system transparency and better optimize for long-term user value through explicit user-provided signals.
The guidelines’ focus on recommender systems represents a meaningful step forward. The recommender system sections of the guidelines are grounded in empirical evidence and align with many of the main findings from KGI’s recent Better Feeds report.
To further strengthen effectiveness, the guidelines should require recommender systems to be optimized for long-term user value instead of attention-maximizing designs focused on engagement (such as clicks, likes, and saves). Better Feeds defines long-term user value as outcomes aligned with users’ deliberative, forward-looking aspirations or preferences. The guidelines helpfully encourage platforms to prioritize explicit user-provided signals over implicit engagement-based signals. To make this guideline more comprehensive, the Commission should recommend how platforms can best leverage predictions based on explicit user preferences. The guidelines should clarify how platforms can practically ensure that the feedback minors give about content actually has a “lasting impact” on their feeds.
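To illustrate the distinction between attention-maximizing and value-oriented optimization, the sketch below shows one simple way a ranking score could up-weight explicit user-provided signals over implicit engagement predictions. This is a hypothetical illustration only – the signal names, weights, and scoring function are our own assumptions, not a design drawn from the guidelines or from any platform’s actual recommender system.

```python
from dataclasses import dataclass

@dataclass
class ItemSignals:
    """Hypothetical per-item predictions a recommender might produce."""
    p_click: float              # implicit: predicted probability of a click
    p_like: float               # implicit: predicted probability of a like
    p_explicit_interest: float  # explicit: predicted match to user-stated interests
    p_not_interested: float     # explicit: predicted "not interested" feedback

def long_term_value_score(s: ItemSignals,
                          w_explicit: float = 3.0,
                          w_engagement: float = 1.0) -> float:
    """Illustrative score that up-weights explicit user-provided signals
    over implicit engagement predictions, and treats negative explicit
    feedback as a strong penalty so user feedback has a lasting impact
    on ranking. Weights are arbitrary for demonstration."""
    engagement = s.p_click + s.p_like
    explicit = s.p_explicit_interest - 2.0 * s.p_not_interested
    return w_engagement * engagement + w_explicit * explicit

# A clickbait-style item: high predicted engagement, negative explicit feedback.
clickbait = ItemSignals(p_click=0.9, p_like=0.4,
                        p_explicit_interest=0.1, p_not_interested=0.6)
# An item matching the user's stated preferences, with modest engagement.
preferred = ItemSignals(p_click=0.3, p_like=0.2,
                        p_explicit_interest=0.8, p_not_interested=0.0)

# Under this weighting, the preference-aligned item outranks the clickbait item.
assert long_term_value_score(preferred) > long_term_value_score(clickbait)
```

The point of the sketch is the design choice, not the specific numbers: when explicit signals carry more weight than engagement predictions, content a user has marked "not interested" is durably demoted even if it reliably attracts clicks.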
The guidelines should also further spell out transparency expectations for recommender system design, as recommended in Better Feeds, including the disclosure of specific input data and weights used in recommender system design.
To strengthen accountability, the guidelines should also introduce expectations for platforms to conduct long-term assessments of their impact on minors’ well-being. Many platforms exempt a group of users from design changes (creating a “holdout group”) and compare how the changes affect outcomes of interest between the holdout group and the rest of the user base. The guidelines should recommend that holdout groups be used to evaluate how design changes affect outcomes related to the privacy, safety, and security of minors. These holdout groups should include a focus on at-risk populations.
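The holdout-group comparison described above can be sketched as a simple two-sample analysis. The data and metric below are entirely hypothetical – invented numbers standing in for a safety outcome such as unwanted-contact reports – and real evaluations would use pre-registered metrics and proper statistical inference.

```python
from statistics import mean, stdev
from math import sqrt

def holdout_effect(treated: list[float], holdout: list[float]) -> tuple[float, float]:
    """Estimate the effect of a design change as the difference in mean
    outcomes between users who received the change and a holdout group
    that did not, with a simple standard error for that difference.
    (Illustrative only; not a substitute for rigorous inference.)"""
    diff = mean(treated) - mean(holdout)
    se = sqrt(stdev(treated) ** 2 / len(treated) +
              stdev(holdout) ** 2 / len(holdout))
    return diff, se

# Hypothetical weekly unwanted-contact reports per teen account.
holdout_group = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]  # old defaults retained
treated_group = [1.9, 2.2, 1.8, 2.1, 2.0, 1.7]  # design change applied

diff, se = holdout_effect(treated_group, holdout_group)
print(f"estimated change: {diff:+.2f} reports/week (SE {se:.2f})")
```

A negative difference here would indicate the design change reduced the harm metric relative to the holdout group; running the same comparison separately for at-risk subpopulations is what the guidelines should ask platforms to do.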
Recommender system design is foundational for creating safer online spaces for minors. While the guidelines represent a significant advance, they can be further strengthened by promoting greater transparency, prioritizing long-term user value via explicit user-provided signals, and conducting long-term assessments – such as holdout experiments – that measure how design changes affect minors’ well-being over time.
Conclusion
The European Commission’s guidelines to protect minors online are a significant step forward. As the Commission revises and implements them, the guidelines will offer important lessons and insights about risk mitigation, safer defaults, and better feeds.
KGI appreciates the European Commission’s forward-looking guidelines to create a safer and healthier online environment for minors. As lawmakers and regulators around the world develop regulations that seek to protect children online, learning across jurisdictions will be critical. Robust metrics and meaningful assessment are foundational for this effort and will allow for effective implementation.
Do you have further thoughts on the Commission’s approach for protecting minors online under the Digital Services Act? Please reach out to knightgeorgetown@georgetown.edu.