
Modular Legislative Components for Better Algorithmic Feeds

Modular legislative components published by the Knight-Georgetown Institute support lawmakers developing legislation that encourages better algorithmic feeds.


Introduction

These modular legislative components are based on the guidelines in the Knight-Georgetown Institute’s report, Better Feeds: Algorithms That Put People First. The components are designed to offer ideas that could be adopted into legislation in a variety of ways. The components are not a stand-alone bill proposal.

The modular legislative components focus on algorithmic optimization of recommender systems. This optimization is distinct and separate from any content moderation, filtering, or application of content-based rules, policies, or procedures that a covered online platform may undertake. The modular components focus on recommender system optimization that the platform designs prior to, and independent of, the delivery of content to users.

The modular components are designed to apply to any online platform that uses recommender systems in the design of its product.

The modular components use square brackets [ ] to delineate areas where jurisdiction-specific language or processes may be appropriate. Several of the Better Feeds guidelines are technically complex and lend themselves to specific rules that, depending on the jurisdiction, may be more effectively crafted in rulemaking than in legislation. These aspects are identified in square brackets.

Definitions 

Accessible user interface. An interface that requires minimal user interactions (such as clicks, taps, or similar actions) for a user to input data, make a choice, or take an action while using a covered online platform.1

Algorithmic recommender system. A computational process used to determine the selection, order, rank, relative prioritization, or relative prominence of items provided to a user on an online platform, including search results, ranking, recommendations, display, or any other method of automated selection.2

Covered minors. [Can be defined to align with jurisdiction law. Better Algorithmic Feeds modular components are drafted to align with definitions based on any mechanism of age estimation, assurance, or verification.] 

Covered online platform. An online platform that uses one or more algorithmic recommender systems to recommend, display, share, provide, or otherwise convey items to users. 

Default. A preselected option adopted by a covered online platform for a specific service, product, or feature.3

Engagement. User interaction with items on a covered online platform, including clicks, taps, comments, reshares, watching, dwelling, indications of approval or disapproval (such as likes, dislikes, upvotes, or downvotes), or any other form of interaction.4

Engagement data. Information that a covered online platform collects about engagement on its platform, not including user survey data.

High-value data. Any user-provided data, or predictions of user-provided data made by a covered online platform.

Holdout group. A group of users of a covered online platform that are exempted from the application of algorithmic recommender system design changes.

Item. An element eligible for display by a recommender system, which may include individual posts, accounts, groups, pages, channels, products, advertisements, text, images, videos, audio files, or other media.5

Long-term holdout assessment. A process in which a covered online platform maintains a holdout group for a duration of at least 12 months.

Long-term user value. Outcomes that align with individual users’ deliberative, forward-looking preferences or aspirations as expressed to a covered online platform through user-provided data.

Non-personalized algorithmic recommender system. A recommender system whose output provided to a given user does not depend on processing that user’s personal data.

Online platform. A website, online service, online application, or mobile application. 

Personal data. Any information, including derived data and unique identifiers, that is linked or reasonably linkable, alone or in combination with other information, to an identified or identifiable individual or a device that identifies or is linked or reasonably linkable to an individual. “Personal data” does not include de-identified data or publicly available information.6 [Can alternatively be defined to align with jurisdiction law in jurisdictions with existing laws regulating personal data.]

User. A user of a covered online platform who is located in [insert name of jurisdiction].7 A user does not include the operator of a covered online platform or a person acting as an agent of the operator of a covered online platform.8

User-provided data. Any of the following categories of information collected by a covered online platform:

  • Information expressly and explicitly provided by the user, including user preferences, settings, search queries, prompts, and any other information expressly and explicitly provided by the user that is not engagement data;
  • User survey data;
  • Indicators or ratings expressly and explicitly selected by the user that are not engagement data; or
  • Other categories of data or more specific definitions of the above categories of data as may be defined by [jurisdictional authority with rulemaking power] via rulemaking.

User survey. A structured data collection tool administered to users of a covered online platform.

User survey data. User responses to questions that a covered online platform, or a third party acting on the covered online platform’s behalf, poses to users.

Weights. Individual numeric settings that control the output of a recommender system at a high level across a covered online platform’s user base, such as the relative contributions of different factors to an item’s ranking.

Design Transparency

A.  A covered online platform that deploys an algorithmic recommender system shall prominently and conspicuously provide on its website, service, or application:

1. A list of each algorithmic recommender system in use by the covered online platform;

2. A description of each source of raw information used in each algorithmic recommender system; and

3. The weights used in each algorithmic recommender system, categorized into four groups according to each weight’s importance in contributing to the system’s output: weights in the 0-25th percentile range, weights in the 26-50th percentile range, weights in the 51-75th percentile range, and weights in the 76-100th percentile range.

B.  [In jurisdictions where a public authority can be tasked with rulemaking, rulemaking to further define the above disclosure requirements can be added here.]

C.  A covered online platform shall disclose, on an annual basis, the high-level objectives, key results, and performance metrics it uses to evaluate product teams responsible for algorithmic recommender system design.9
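For illustration only, and not as part of the model text, the sketch below shows one way the percentile grouping described in subsection A.3 could be computed. All weight names and values here are invented, and the use of absolute magnitude as the measure of a weight’s importance is an assumption; a jurisdiction’s rulemaking could define importance differently.

```python
# Illustrative sketch: grouping hypothetical recommender weights into the
# four percentile ranges described in subsection A.3. Not a definitive
# implementation; weight names/values and the importance measure are assumed.
from statistics import quantiles

weights = {
    "predicted_click": 0.9,
    "predicted_reshare": 0.4,
    "predicted_dwell_time": 0.7,
    "survey_satisfaction": 1.2,
    "report_penalty": -0.3,
    "author_follow": 0.2,
    "comment_quality": 0.5,
    "long_term_value": 1.0,
}

# Quartile cut points over the absolute magnitude of each weight, so that
# negative (penalty) weights are ranked by how strongly they contribute.
magnitudes = sorted(abs(v) for v in weights.values())
q1, q2, q3 = quantiles(magnitudes, n=4)

def percentile_group(value: float) -> str:
    """Assign a weight to one of the four disclosure groups."""
    m = abs(value)
    if m <= q1:
        return "0-25th"
    if m <= q2:
        return "26-50th"
    if m <= q3:
        return "51-75th"
    return "76-100th"

disclosure = {name: percentile_group(v) for name, v in weights.items()}
```

Under this sketch, the platform would publish only the group each weight falls into, not the underlying numeric values.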

User Choice and Defaults

A.  For all services, products, and features where a covered online platform makes use of an algorithmic recommender system, the covered online platform shall offer users an option of at least one algorithmic recommender system design that is optimized on the basis of high-value data. 

B.  A covered online platform shall provide an accessible user interface that enables users to expressly and unambiguously communicate their preferences about the types of items to be recommended and to be blocked in the output of the covered online platform’s algorithmic recommender systems. The covered online platform shall ensure that the output of its algorithmic recommender systems is consistent with those preferences.

C.  Nondiscrimination. A covered online platform shall not withhold, degrade, lower the quality of, or increase the price of any product, service, or feature to a user, other than as necessary for compliance with the provisions of this article or any rules or regulations promulgated pursuant to this article, due to the covered online platform being required to provide recommender system choice to users or otherwise comply with this article.

Covered Minors

A.  By default, a covered online platform shall optimize its algorithmic recommender systems provided to covered minors on the basis of high-value data. 

B.  If a covered online platform has insufficient information to optimize its algorithmic recommender systems for covered minors on the basis of high-value data, the covered online platform shall provide a non-personalized algorithmic recommender system to covered minors by default.

Long-Term Assessments

A.  A covered online platform shall run long-term holdout assessments on a continuous basis in accordance with the rules established under subsection (C). 

B.  On an annual basis, a covered online platform shall make publicly available, in an easily accessible location, a Long-Term Holdout Assessment Disclosure that includes:10 

1. The metrics the covered online platform uses to measure long-term user value;

2. The aggregate, anonymized measurements of each metric across the holdout group(s); and

3. The aggregate, anonymized measurements of each metric across the rest of the user base of the covered online platform.

C.  [Require rulemaking to further define how holdout groups shall be constructed and the above disclosure requirements.]

D.  A covered online platform shall, at its own expense and at least once a year, obtain an independent audit of the long-term holdout assessment(s) on its platform and of the Long-Term Holdout Assessment Disclosure. Independent audits shall conform with the following requirements:11

1. Independent auditors that prepare reports under this subsection shall follow inspection and consultation practices designed to ensure that reports are comprehensive and accurate.

2. Covered online platforms shall provide to the independent auditors that prepare reports required under this subsection full and complete cooperation and access to information and operations required to ensure that the report is comprehensive and accurate.
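As an illustration of the disclosure contemplated in subsection B, and not as part of the model text, the sketch below computes aggregate, anonymized measurements of two hypothetical long-term-value metrics for a holdout group and the rest of the user base. All user records, group labels, and metric names are invented.

```python
# Illustrative sketch: aggregate, anonymized measurements for a hypothetical
# Long-Term Holdout Assessment Disclosure. Records and metric names are assumed.
from statistics import mean

# Each record: (group, metrics), where group is "holdout" (exempted from
# design changes) or "treated" (the rest of the user base).
users = [
    ("holdout", {"reported_satisfaction": 0.72, "weekly_active_days": 4.1}),
    ("holdout", {"reported_satisfaction": 0.68, "weekly_active_days": 3.8}),
    ("treated", {"reported_satisfaction": 0.61, "weekly_active_days": 5.2}),
    ("treated", {"reported_satisfaction": 0.64, "weekly_active_days": 5.0}),
    ("treated", {"reported_satisfaction": 0.59, "weekly_active_days": 4.9}),
]

def aggregate(group: str) -> dict:
    """Mean of each metric over one group; no per-user data is retained."""
    rows = [m for g, m in users if g == group]
    return {k: round(mean(r[k] for r in rows), 3) for k in rows[0]}

disclosure = {
    "holdout_group": aggregate("holdout"),
    "rest_of_user_base": aggregate("treated"),
}
```

Publishing only group-level means is one simple way to keep the disclosure anonymized; rulemaking under subsection C could require different aggregation or minimum group sizes.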

 


Citations
  1. Minnesota “Prohibiting Social Media Manipulation Act” 2024.
  2. KOSA 2023-24. See also Vermont S.69 2025, Minnesota “Prohibiting Social Media Manipulation Act” 2024, PATA 2023-24.
  3. Vermont S.69 2025 and California AB-2273 2021-22. These are more specific than Maryland HB 603 2024.
  4. PATA 2023-24.
  5. New York Senate Bill 7694 2023-24.
  6. Vermont AADC 2025-26.
  7. Reset Model AADC Bill.
  8. California SB 979 2023-24.
  9. This provision would benefit from being accompanied by clear legislative findings that explain its justification. Platforms may track hundreds or thousands of different metrics that can be used to evaluate many different forms of engagement, revenue, and ad impressions, as well as quality and integrity metrics. Requiring the disclosure of all of these metrics would produce a sea of information with no guide as to how the metrics are traded off against each other or which ones carry the most weight when platforms decide to make design changes. Focusing on the metrics used to evaluate product teams provides a narrower window into what the platform views as most important. Making these metrics transparent should incentivize platforms to incorporate employee and team evaluation criteria that better align with user value. Product team metrics that are solely and consistently focused on engagement, and that do not include metrics related to user value, satisfaction, or harm mitigation, should be cause for alarm.
  10. Maryland HB 603 2024, Colorado SB25-086 2025, and KOSA 2023-24.
  11. Reset Model AADC Bill.
