The European Commission’s new guidelines on the protection of minors online are a significant step forward for children’s online safety in Europe. Developed under the EU’s Digital Services Act (DSA), the guidelines incorporate a range of expert feedback, including from the Knight-Georgetown Institute (KGI). Importantly, they focus on the design of digital platforms, including the features, defaults, and algorithms that shape minors’ online experiences. The guidelines also underscore the importance of reviewing design risks and mitigation effectiveness to advance a safer online environment for children.
While the guidelines are not binding, they will influence how the Commission assesses compliance with the DSA’s expectations and offer platforms guidance on how digital products accessible to children should be designed. They will inform the Commission’s enforcement of DSA Article 28, which requires online platforms operating in Europe to ensure a high level of privacy, safety, and security for minors.
KGI welcomes the new guidelines and applauds the Commission on its leadership in advancing digital safety for children. This commentary highlights the guidelines’ emphasis on product design, risk review, and measurement, and offers recommendations for effective implementation.
A Design-Centered Approach to Safety
The guidelines focus on safer default account settings and more user control for minors. This scope is grounded in evidence, reflecting the fact that children’s digital experiences are significantly shaped by design defaults.
Under the guidelines, platforms should set minors’ accounts to the highest level of privacy, safety, and security by default. Persuasive design features that may promote overuse or compulsive behaviors, like infinite scroll and autoplay, should be off by default. Defaults should restrict opportunities for unwanted contact by unknown accounts. Platforms should refrain from employing manipulative designs to encourage engagement (for example, through intermittent or random rewards). These recommendations are informed by independent research demonstrating that platform design choices can be misaligned with the interests of minors. Indeed, a study from late 2024 found that the 17 platforms classified as Very Large Online Platforms under the DSA deploy over 500 manipulative design features.
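The default settings described above can be pictured as a simple configuration object. This is a hypothetical sketch, not an API from the guidelines; the field names and structure are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class MinorAccountDefaults:
    """Illustrative 'safe by default' settings for a minor's account.

    Field names are assumptions for the sketch; the guidelines describe
    the outcomes (highest privacy, persuasive features off, restricted
    contact), not a specific schema.
    """
    profile_visibility: str = "private"            # highest privacy by default
    autoplay_enabled: bool = False                 # persuasive feature off by default
    infinite_scroll_enabled: bool = False          # persuasive feature off by default
    contact_from_unknown_accounts: bool = False    # restrict unwanted contact
    intermittent_rewards_enabled: bool = False     # no manipulative engagement loops

# A new minor account starts from the most protective configuration.
defaults = MinorAccountDefaults()
```

The key design point is that protection is the starting state: a minor (or guardian) may loosen individual settings deliberately, but never has to discover and disable risky features one by one.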
Beyond defaults, the guidelines recommend that minors be given more control over their online experiences. Whether it’s adjusting a feed or changing notification settings, minors should have meaningful options to shape their experience. The guidelines specifically highlight the importance of user controls for recommender systems, account settings, and engagement-promoting features like autoplay and infinite scroll. They recommend that such user feedback have a “lasting impact” on a minor’s feed.
Recommended Next Steps: The Commission should provide clearer definitions of the design features relevant to the privacy, safety, and security of minors. This will help create a consistent starting point for reviewing risks and tailoring mitigation measures to the unique digital experiences each platform is designed to provide.
Better Recommender Systems
One of the most innovative aspects of the guidelines relates to their focus on recommender systems. For the first time, a regulatory framework distinguishes between implicit engagement signals (like scrolling or clicks) and explicit user input (like selecting preferences, providing feedback, or completing surveys). The guidelines’ expectations for recommender systems align with the key findings of KGI’s Better Feeds report, particularly on the important distinction between explicit and implicit signals.
The guidelines expect platforms to refrain from designing their recommender systems to process implicit signals of user engagement, unless doing so is in the best interests of the child. Platforms should instead rely to a greater extent on user-provided, explicit signals, such as responses to surveys and selection of topic preferences. Emphasis on these types of signals will give minors better control over their feeds and provide opportunities for deliberative reflection on their product use. These tools can help to mitigate automatic behaviors online, which can be associated with negative health outcomes.
Recommended Next Steps: The Commission can help to drive a shift toward explicit signals by providing technical guidance about recommended sources of input data for recommender systems, including surveys, selection of topic preferences, and user reporting tools (e.g., “show me more/less”).
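The distinction between implicit and explicit signals can be made concrete with a toy scoring function. This is a minimal sketch under stated assumptions: the signal names, weights, and overall structure are invented for illustration and do not come from the guidelines or any platform’s actual recommender.

```python
def score_item(explicit: dict, implicit: dict, minor_account: bool = True) -> float:
    """Score one candidate item for a feed.

    For minors, only explicit, user-provided signals contribute, mirroring
    the guidelines' expectation that implicit engagement signals not be
    processed unless doing so is in the child's best interests.
    Signal names and weights are illustrative assumptions.
    """
    explicit_score = (
        2.0 * explicit.get("topic_preference_match", 0.0)   # chosen topics
        + 1.5 * explicit.get("show_more", 0.0)              # "show me more" feedback
        - 1.5 * explicit.get("show_less", 0.0)              # "show me less" feedback
        + 1.0 * explicit.get("survey_interest", 0.0)        # survey responses
    )
    if minor_account:
        return explicit_score

    # For non-minor accounts in this sketch, implicit engagement signals
    # (dwell time, clicks) may also contribute.
    implicit_score = (
        0.5 * implicit.get("dwell_time_norm", 0.0)
        + 0.3 * implicit.get("click_rate", 0.0)
    )
    return explicit_score + implicit_score
```

The design choice the sketch illustrates is that “lasting impact” is straightforward to implement when explicit feedback is a first-class input to ranking, rather than a signal that merely nudges an engagement-optimized model.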
Risk Review and Measurement
The guidelines articulate an evidence-based approach. Platforms are to assess risks to minors through regular risk review and robustly measure the effectiveness of mitigation measures.
Risk reviews are to focus specifically on children’s well-being. Under the guidelines, platforms should classify risks as low, medium, or high, and then develop and implement mitigation measures proportionate to their severity.
Alongside risk review, platforms should demonstrate that mitigation measures are effective. This must be done through quantifiable measures of risk and mitigation. The guidelines specify that platforms should establish practices for regular collection of data on harm to minors.
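The review-and-measure loop described above can be sketched in a few lines. The severity thresholds, the prevalence metric, and the effectiveness criterion below are all illustrative assumptions; the guidelines require quantifiable measurement but do not prescribe specific numbers.

```python
def classify_risk(prevalence: float) -> str:
    """Map a measured harm-prevalence rate to a severity tier.

    The 3% and 10% thresholds are placeholder assumptions, not values
    from the guidelines.
    """
    if prevalence >= 0.10:
        return "high"
    if prevalence >= 0.03:
        return "medium"
    return "low"

def mitigation_effective(before: float, after: float,
                         min_reduction: float = 0.2) -> bool:
    """Judge a mitigation effective if it cuts measured prevalence by at
    least min_reduction (an assumed 20% relative-reduction bar)."""
    if before == 0:
        return True  # nothing to reduce
    return (before - after) / before >= min_reduction
```

In practice, the “before” and “after” figures would come from the regular data collection the guidelines call for, and the thresholds themselves would be set and validated with independent researchers.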
Recommended Next Steps: Risk reviews and measurement can be strengthened through technical guidance, including best practices for identifying and quantifying risks to minors. This should include greater clarity about the kinds of metrics (e.g., experience surveys, content or behavior metrics) and methods (empirical research and long-term holdout experiments) that should be used. Measurement should include collaboration with independent researchers, who help ensure that platform analysis of risks to minors is comprehensive and actionable.
Towards Effective Implementation
The Commission’s new guidelines demonstrate how design can advance online safety for minors, addressing defaults, recommender systems, and user controls. By limiting manipulative features, maximizing default protections, and creating more agency for youth, the guidelines offer a comprehensive vision of children’s online safety.
Effective implementation will require ongoing collaboration between regulators, platforms, and independent researchers. To that end, the Commission should provide clear technical guidance and highlight best practices to help platforms review and measurably mitigate risks to minors. By fostering a culture of iterative improvement, the EU will help set a global benchmark for child online safety.