
Better Feeds: Algorithms That Put People First

As state, federal, and global policymakers grapple with how to address concerns about the link between online algorithms and a variety of harms, KGI’s latest report, authored by a distinguished group of researchers, technologists, and policy leaders, is a roadmap for design and governance solutions that put users first. Product designers and policymakers can promote better algorithms through detailed design transparency, user choices and defaults, and assessments of long-term impact.

Download PDF

A How-To Guide for Platforms and Policymakers

The algorithmic recommender systems that select, filter, and personalize content across online platforms and services play a significant role in shaping what users see, read, and watch. That influence fuels debates about their potential to amplify harmful content, foster societal division, and prioritize engagement over user well-being. In response, some policymakers have turned to blanket bans on personalization or to the promotion of chronological feeds. But suggesting that users must choose between today’s default feeds and chronological or non-personalized feeds is a false choice; there are many better alternatives.

This report, prepared by the KGI Expert Working Group on Recommender Systems, offers comprehensive insights and policy guidance aimed at optimizing recommender systems for long-term user value and high-quality experiences. Drawing on a multidisciplinary research base and industry expertise, the report highlights key challenges in the current design and regulation of recommender systems and proposes actionable solutions for policymakers and product designers.

A key concern is that some platforms optimize their recommender systems to maximize certain forms of predicted engagement, which can prioritize clicks and likes over stronger signals of long-term user value. Maximizing the chances that users will click, like, share, and view content this week, this month, and this quarter aligns well with the business interests of tech platforms monetized through advertising. Product teams are rewarded for showing short-term gains in platform usage, and financial markets and investors reward companies that can deliver large audiences to advertisers.
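To make this concrete, here is a minimal, purely illustrative sketch of a ranking “value model” as a weighted sum of predicted signals, contrasting an engagement-heavy weighting with one that up-weights proxies for long-term value. The signal names, weights, and survey proxy are hypothetical examples, not taken from the report or from any specific platform.

```python
# Illustrative only: a ranking "value model" as a weighted sum of predicted
# signals. All signal names and weights below are hypothetical examples.

# Engagement-optimized weighting: short-term interaction predictions dominate.
ENGAGEMENT_WEIGHTS = {
    "p_click": 1.0,       # predicted probability the user clicks
    "p_like": 0.5,        # predicted probability the user likes
    "p_share": 2.0,       # predicted probability the user reshares
    "p_watch_30s": 1.5,   # predicted probability of a 30-second view
}

# A long-term-value-oriented weighting: some engagement signals remain, but
# proxies for lasting satisfaction and negative feedback carry more weight.
LONG_TERM_WEIGHTS = {
    "p_click": 0.2,
    "p_share": 0.3,
    "p_survey_worth_time": 3.0,   # predicted "this was worth my time" survey response
    "p_hide_or_mute": -4.0,       # predicted negative feedback is penalized
}

def score(predictions, weights):
    """Combine per-item model predictions into one ranking score (weighted sum)."""
    return sum(w * predictions.get(name, 0.0) for name, w in weights.items())

# The same candidate item ranks very differently under the two value models.
item = {"p_click": 0.30, "p_like": 0.05, "p_share": 0.02, "p_watch_30s": 0.40,
        "p_survey_worth_time": 0.10, "p_hide_or_mute": 0.08}
print(round(score(item, ENGAGEMENT_WEIGHTS), 3))  # 0.965
print(round(score(item, LONG_TERM_WEIGHTS), 3))   # 0.046
```

Under the first weighting, short-term interaction probabilities dominate the ranking score; under the second, a predicted “worth my time” response and predicted negative feedback carry most of the weight, so the same item can be promoted by one model and demoted by the other.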

Concerns have been raised about the relationship between this design approach and a range of individual and societal harms, including the spread of low-quality or harmful content, reduced user satisfaction, problematic overuse, and increased polarization. Available evidence underscores the need for a shift towards designs that optimize for long-term user satisfaction, well-being, and societal benefits.

To achieve this, the KGI Expert Working Group on Recommender Systems proposes that policymakers and product designers adopt the following:

  • Detailed transparency in the design of recommender systems, including the public disclosure of input data sources, value model weights, and metrics used to measure long-term user value. Platforms must also publicly disclose the internal metrics used to assess product teams responsible for recommender system design.
  • User choices and defaults that allow individuals to tailor their platform experiences and switch between different recommendation systems. Minors must be provided with default recommender systems optimized to deliver them long-term value.
  • Assessments of long-term impact, where platforms continuously test the effects of algorithmic changes over extended periods. Platforms must conduct these assessments by running so-called “holdout” experiments that exempt a group of users from design changes for 12 months or more (a minimal assignment sketch follows this list). Aggregated experiment results must be disclosed publicly and subjected to independent audits for accountability.
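As a rough illustration of the holdout mechanism above, the sketch below deterministically assigns a small, stable share of users to a long-term holdout that keeps a frozen baseline ranker while all other users receive ongoing design changes. The 5% share, hashing scheme, and function names are hypothetical choices made for illustration, not requirements drawn from the report.

```python
import hashlib

HOLDOUT_FRACTION = 0.05  # hypothetical: hold out ~5% of users for 12+ months

def in_long_term_holdout(user_id, experiment_name="long-term-holdout"):
    """Deterministically assign a user to the long-term holdout group.

    Hashing (experiment_name, user_id) yields a stable assignment, so the same
    users remain held out for the full 12+ month window.
    """
    digest = hashlib.sha256(f"{experiment_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash into [0, 1]
    return bucket < HOLDOUT_FRACTION

def choose_ranker(user_id):
    # Holdout users keep a frozen baseline ranker; everyone else receives the
    # continuously updated production ranker. Comparing satisfaction, retention,
    # and well-being metrics between the two groups reveals long-term effects.
    if in_long_term_holdout(user_id):
        return "baseline_ranker_frozen_at_experiment_start"
    return "production_ranker_with_ongoing_changes"
```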

This report provides a how-to guide for implementing each set of proposals, lighting a pathway toward higher-quality designs that may still be personalized or leverage some forms of engagement data, but that overcome the design flaws of engagement-optimized systems.

By following this expert working group’s guidance, summarized below, platforms and policymakers can help to address the harms associated with recommender systems while preserving their potential to enhance user experiences and societal value. This report serves as a roadmap for any policymaker or product designer interested in promoting algorithmic systems that put users’ long-term interests front and center.

Core Policy Guidance

Design Transparency
  • Platforms must publicly disclose information about the specific input data and weights used in the design of their recommender systems.
  • Platforms must publicly disclose the metrics they use to measure long-term user value.
  • Platforms must publicly disclose the metrics they use to evaluate product teams responsible for recommender system design.

User Choices and Defaults
  • Platforms must offer users an easily accessible choice of different recommender systems. At least one of these choices must be optimized to support long-term value to users.
  • Platforms must provide easily accessible ways for users to set their preferences about types of items to be recommended and to be blocked. Platforms must honor those preferences.
  • By default, platforms must set minors’ recommender systems to be optimized to support long-term value to these users. If platforms have insufficient information about long-term value to minors, they must default to non-personalized recommender systems.

Long-Term Holdout Experiments
  • Platforms must run long-term (12-month or longer) holdout experiments on a continuous basis.
  • Platforms must report the aggregate, anonymized results of the holdout experiments publicly.
  • Holdout experiments must be subject to an audit by an independent third party.

Global Policy Guidance

Public Content Transparency
  • Platforms must continuously publish a sample of the public content that is most highly disseminated on the platform and a sample of the public content that receives the highest engagement.
  • Platforms must continuously publish a representative sample of public content consumed during a typical user session on the platform at any given time.

User Defaults
  • By default, platforms must optimize users’ recommender systems to support long-term user value.

Metrics and Measurement
  • Platforms must measure the aggregate harms to at-risk populations that result from recommender systems and publicly disclose the results of those measurements.
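As a purely hypothetical illustration of the Design Transparency guidance above, the snippet below assembles and prints one possible machine-readable disclosure covering input data sources, value model weights, and long-term value metrics. Every field name, signal, weight, and metric is invented for illustration; the report does not prescribe a disclosure schema.

```python
import json

# Hypothetical example of a public design-transparency disclosure.
# All names, signals, weights, and metrics below are invented for illustration.
disclosure = {
    "recommender": "home_feed_ranker",
    "input_data_sources": [
        "explicit follows and stated user preferences",
        "item metadata (topic, language, author)",
        "aggregate engagement history",
        "user survey responses",
    ],
    "value_model_weights": {
        "p_survey_worth_time": 3.0,
        "p_share": 0.3,
        "p_click": 0.2,
        "p_hide_or_mute": -4.0,
    },
    "long_term_user_value_metrics": [
        "28-day retention relative to the long-term holdout group",
        "quarterly user-satisfaction survey scores",
        "rate of 'don't recommend' and mute feedback",
    ],
    "product_team_evaluation_metrics": [
        "movement in the long-term value metrics above, not raw engagement alone",
    ],
}

print(json.dumps(disclosure, indent=2))
```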

Download the report

Explainers

Recommender Systems 101
Taxonomy of User Signals
Long-Term User Value

Policy Briefs

US Policy Brief
Global Policy Brief
