Commentary

KGI Comments to the European Commission on Guidelines to Enforce the Protection of Minors Online

KGI highlights three areas where emerging and established research can inform strategies to guarantee the privacy, safety, and security of minors online: Access to Platform Data, Platform Design Features, and Engagement-based Recommender Systems.

The European Commission recently requested evidence to inform the development of Guidelines on the protection of minors online under the Digital Services Act (DSA). When finalized, these Guidelines will recommend strategies for online platforms to effectively guarantee high levels of privacy, safety, and security for minors online. The Commission’s focus on the experiences of minors is laudable, and many of the strategies to improve the privacy, safety, and security for minors online will benefit broader social media users.

The Knight-Georgetown Institute (KGI) submitted comments to the Commission that highlight increasing evidence demonstrating that platform design impacts how users, communities, and societies experience the benefits and harms from platforms. Indeed, the pitfalls of content moderation have spurred a large and growing movement in public policy, academia, and civil society focused on how content-agnostic platform design choices can support more prosocial interactions and user well-being.

You can read KGI’s full comments to the Commission here.

KGI’s focus on platform design aligns with submissions from other research and advocacy groups, including the Integrity Institute, the University of Southern California Neely Center, European Digital Rights (EDRi), the European Federation of Psychologists’ Associations (EFPA), and People vs Big Tech, among others.

Platform design choices affect the experiences of all users online, minors and adults alike. Minors and adolescents, however, have unique vulnerabilities that may shape their online experiences, and research is increasingly focused on adolescents’ experiences with social media. Some scholars have found that social media can deliver concrete benefits to young people, including connection, expression, information and learning, and entertainment. Research also finds, however, that certain features of social media can be harmful to the health of some adolescents. Research focused on adolescents is showing evidence of risks related to: social comparison; body image dissatisfaction and disordered eating; displacement of healthy behaviors; and feelings of sadness, anxiety, depression, and stress.

Taking into account these broad risks, KGI’s comments to the Commission highlight three areas where emerging and established research can inform strategies to guarantee the privacy, safety, and security of minors online:

  1. Access to Platform Data,
  2. Platform Design Features, and
  3. Engagement-based Recommender Systems.

In addition to DSA guidelines, these areas can also inform ongoing discussion of the potential for a new Digital Fairness Act to advance consumer protection in the European Union.

Access to Platform Data

The lack of comprehensive independent access to platform data frustrates efforts to develop effective, evidence-based digital platform policies, including those aimed at protecting the privacy, safety, and security of minors. Meaningful researcher and public access to platform data is essential both for sound policymaking and for monitoring the ongoing effectiveness of those policies. Article 40 of the DSA articulates steps to guarantee public access to public data under Article 40(12), as well as vetted-researcher access to non-public data under Article 40(4)-(11).

Progress with data access has been slow. Over the last year, Meta shut down access to the public insights tool CrowdTangle; Twitter/X moved to restrict API access for researchers; and Reddit updated access to its Data API in ways that researchers say will limit effectiveness. The Commission has been forced to open multiple investigations into platform non-compliance with Article 40, including against TikTok and Twitter/X.

The ongoing opacity of online platforms makes it particularly difficult for researchers and the public to fully understand how platforms affect minors in both positive and negative ways. Ensuring meaningful independent public and researcher access to platform data is foundational and will strengthen the development and ongoing monitoring of the Guidelines, as well as the DSA regime as a whole.

Platform Design Features

A range of design features can promote or undermine privacy, safety, and security on digital platforms.

Specifically, profile design features and usage-extending design features, including “deceptive patterns,” are increasingly an area of focus for researchers. Our comments highlight risks associated with specific design features for the Commission to consider when developing Guidelines for the protection of minors.

Default account visibility and the recommendation of accounts by platforms may pose particular risks to minors. Expansive visibility defaults may enable bad actors to target and/or mass contact minors’ accounts, with significant negative impacts. Filters and lenses that alter images and user appearance may also present risks to minors, including identified risks around body image dissatisfaction and depression.

Usage-extending design features, including infinite scroll, autoplay, and ephemeral content, may encourage users to spend more time on social media than they otherwise would and make users feel less in control of their use. Platforms sequence notifications, social feedback, and other design features to maximize engagement and, by extension, time online. Platforms further “gamify” engagement through visible indicators (likes, reactions, like counts, reaction counts) as well as streaks and goals that encourage continued use. Platforms use notifications to pull users back to the app or site. A 2020 study of social media users found that notifications contributed to increased use, and platforms often make changing default notification settings complex.

As independent studies demonstrate, platform design features can impact the privacy, safety, and security of users of digital platforms. By accounting for the role of design features, the Guidelines can better ensure the privacy, safety, and security of minors online.

Engagement-Based Recommender Systems

The design and functionality of recommender systems shape the digital experiences of platform users, including minors.

Recommender systems are the algorithms that select, filter, rank, and personalize content. Platforms typically design recommender systems to maximize engagement with platform content. In practice, this usually means showing users content that is likely to induce clicks, likes, comments, reshares, dwell time, watch time, or other metrics representing desired user behavior. Since indicators of engagement are a reliable proxy for user attention, online platforms monetized by advertising are incentivized to incorporate engagement metrics into the design of recommender systems.
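The ranking logic described above can be sketched in simplified form. The signal names and weights below are hypothetical illustrations for clarity, not any platform’s actual model:

```python
# Hypothetical sketch of engagement-based ranking.
# Signal names and weights are illustrative, not any platform's real system.

# Each predicted engagement signal contributes to a single ranking score.
ENGAGEMENT_WEIGHTS = {
    "predicted_click": 1.0,
    "predicted_like": 0.5,
    "predicted_comment": 2.0,
    "predicted_reshare": 3.0,
    "predicted_dwell_seconds": 0.01,
}

def engagement_score(item: dict) -> float:
    """Combine predicted engagement signals into one score."""
    return sum(weight * item.get(signal, 0.0)
               for signal, weight in ENGAGEMENT_WEIGHTS.items())

def rank_feed(items: list[dict]) -> list[dict]:
    """Order candidate items so the most engaging appear first."""
    return sorted(items, key=engagement_score, reverse=True)
```

Because the score is a weighted sum of predicted clicks, reshares, and similar proxies for attention, any content that reliably provokes those behaviors, whatever its quality, rises in the feed.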

Engagement-based recommender systems are linked with more harmful content. Research suggests that harmful content is more likely to engage users and, as a result, be ranked more highly in feeds curated by engagement-based recommender systems. Indeed, engagement-based recommendation poses risks to users in multiple ways, including exposure to abuse, polarization, lower quality news diet, and reduced user satisfaction.

Research is identifying ways to optimize recommender systems for values other than engagement. For example, one high-potential approach is “bridging systems,” which promote content that resonates across diverse groups of users and are already widely deployed to surface community notes on Twitter/X and YouTube. Across emerging empirical research, theoretical development, tools, and industry deployments, recommender systems optimized for bridging, quality ratings, and other values are demonstrating the benefits of non-engagement-based approaches.
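To illustrate the intuition behind bridging, here is a deliberately minimal sketch: instead of summing engagement, content is scored by its lowest approval rate across user groups, so it must appeal beyond its own side to rank highly. This toy scoring rule is an assumption for illustration; production systems such as Community Notes use more sophisticated methods.

```python
# Hypothetical sketch of a "bridging" ranking signal: content is rewarded
# for approval across different user groups, not for raw total engagement.

def bridging_score(ratings_by_group: dict[str, list[int]]) -> float:
    """Return the minimum average approval (0/1 ratings) across groups,
    so content must appeal to every group to score highly."""
    averages = [sum(r) / len(r) for r in ratings_by_group.values() if r]
    return min(averages) if averages else 0.0
```

Under this rule, a polarizing item loved by one group and rejected by another scores near zero, while an item with moderate approval from all groups scores well.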

Moving Forward

We commend the Commission for issuing its call for evidence. Meaningful collaboration between regulators, platforms, and independent researchers will be critical to addressing the most pressing policy issues facing the online information environment. Bridging divides between these communities is core to KGI’s mission, and KGI is developing collaborative processes to advance effective platform policies. Through our Gold Standard for Publicly Available Platform Data working group, for example, KGI is supporting efforts to define a proactive vision for how platform data should be made publicly available, under what circumstances, and in what format.

We invite stakeholders in research, policy, civil society, and industry to subscribe to our email alerts.

KGI appreciates comments on our submission from scholars including Bryn Austin from the Harvard T.H. Chan School of Public Health, Corbin Evans at the American Psychological Association, and Katarzyna Szymielewicz at the Panoptykon Foundation.
