
Modular Legislative Components for Better Algorithmic Feeds

Modular legislative components published by the Knight-Georgetown Institute support lawmakers developing legislation that encourages better algorithmic feeds.


Introduction

These modular legislative components are based on the guidelines in the Knight-Georgetown Institute’s report, Better Feeds: Algorithms That Put People First. The components are designed to offer ideas that could be adopted into legislation in a variety of ways. The components are not a stand-alone bill proposal.

The modular legislative components focus on algorithmic optimization of recommender systems. This optimization is distinct and separate from any content moderation, filtering, or application of content-based rules, policies, or procedures that a covered online platform may undertake. The modular components focus on recommender system optimization that is designed by the platform prior to and independent of content being delivered to users.

The modular components are designed to apply to any online platform that uses recommender systems in the design of its product.

The modular components use square brackets [ ] to delineate areas where jurisdiction-specific language or processes may be appropriate. Several of the Better Feeds guidelines are technically complex, and lend themselves to specific rules that may be more effectively crafted in rulemaking than in legislation, depending on the jurisdiction. These aspects are identified in square brackets.

Definitions 

Accessible user interface. An interface that requires minimal user interactions (such as clicks, taps, or similar) for a user to input data, make a choice, or take an action while using a covered online platform.1

Algorithmic recommender system. A computational process used to determine the selection, order, rank, relative prioritization, or relative prominence of items provided to a user on an online platform, including search results, ranking, recommendations, display, or any other method of automated selection.2

Covered business. A sole proprietorship, limited liability company, corporation, association, or other legal entity that owns (including as a joint venture or partnership composed of businesses in which each has at least a 40% interest in the joint venture or partnership), operates, controls, or provides a covered online platform, except that a federal, state, tribal, or local government entity in the ordinary course of its operations shall not be considered a covered business.

Covered minor. A user who a covered business knows or should know, based on knowledge fairly implied under objective circumstances, is a minor.

Covered online platform. An online platform that:

  • Conducts business in this State; and
  • Uses one or more algorithmic recommender systems to determine the selection, order, rank, or relative prominence of items provided to a user in whole or in part based on the user’s personal data, unless the data is:
    • Based on user-selected settings, or technical information concerning the user’s device; or
    • A search query, provided that the query is not associated with the user in the online platform’s data storage and is only processed to convey items in direct response to the user’s search.

Default. A preselected option adopted by a covered online platform for a specific service, product, or feature.3

Engagement. User interaction with items on a covered online platform, including clicks, taps, comments, reshares, watching, dwelling, indications of approval or disapproval (such as likes, dislikes, upvotes, or downvotes), or any other form of interaction.4

Engagement data. Information that a covered online platform collects about engagement on its platform, not including user survey data.

High-value data. Any user-provided data, or predictions that a covered online platform makes from user survey data.

Holdout group. A group of users of a covered online platform who are exempted from the application of algorithmic recommender system design changes.

Item. Any media eligible for display by a recommender system, which can include individual posts, accounts, groups, pages, channels, products, advertisements, text, images, videos, or audio files.5

Long-term holdout assessment. A process in which a covered online platform maintains a holdout group for a duration of at least 12 months.

Long-term user value. Outcomes that align with individual users’ deliberative, forward-looking preferences or aspirations as expressed to a covered online platform through high-value data.

Long-term user value metrics. The metrics a covered online platform uses to measure long-term user value.

Online platform. A website, online service, online application, or mobile application. 

Personal data. Any information, including derived data and unique identifiers, that is linked or reasonably linkable, alone or in combination with other information, to an identified or identifiable individual or a device that identifies or is linked or reasonably linkable to an individual.6

User. A user of a covered online platform who is located in [insert name of jurisdiction],7 but does not include the operator of a covered online platform or a person acting as an agent of the operator of a covered online platform.8

User-provided data. Any of the following categories of information collected by a covered online platform:

  • Information expressly and explicitly provided by the user that is not engagement data, including user preferences, settings, search queries, and prompts;
  • User survey data;
  • Indicators or ratings expressly and explicitly selected by the user that are not engagement data; or
  • Other categories of data or more specific definitions of the above categories of data as may be defined by [jurisdictional authority with rulemaking power] via rulemaking.

User survey data. User responses to questions that a covered online platform or a third party acting on the covered online platform’s behalf poses to users.

Weights. Individual numeric settings that control the output of a recommender system at a high level across a covered online platform’s user base, such as the relative contributions of different factors to an item’s ranking.

Design Transparency

A. A covered online platform that deploys an algorithmic recommender system shall prominently and conspicuously provide on its website, service, or application:

1. A list of each algorithmic recommender system in use by the covered online platform;

2. A description of each input to each algorithmic recommender system and the source of the data of each input; and

3. The weights used in each algorithmic recommender system, categorized into four quartile groups according to each weight’s relative importance in contributing to the system’s output.
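As one illustration of the quartile grouping that subsection (A)(3) contemplates, a platform might rank its weights by relative importance and split the ranking into four bins. The sketch below is a minimal, hypothetical example; the weight names and values are invented, and using absolute magnitude as the importance measure is an assumption, not something the text prescribes.

```python
# Hypothetical sketch: bucket recommender-system weights into the four
# quartile groups that a disclosure under (A)(3) might require.
# Weight names and values are invented for illustration only.

def quartile_groups(weights: dict[str, float]) -> dict[str, int]:
    """Assign each weight to quartile group 1 (most important) through 4."""
    # Rank weights by absolute magnitude as a proxy for relative importance.
    ranked = sorted(weights, key=lambda k: abs(weights[k]), reverse=True)
    n = len(ranked)
    groups = {}
    for i, name in enumerate(ranked):
        # Integer arithmetic splits the ranked list into four roughly equal bins.
        groups[name] = (4 * i) // n + 1
    return groups

example_weights = {
    "predicted_click": 0.9,
    "predicted_dwell_time": 0.6,
    "explicit_like": 0.4,
    "recency_decay": 0.2,
}
print(quartile_groups(example_weights))
# {'predicted_click': 1, 'predicted_dwell_time': 2,
#  'explicit_like': 3, 'recency_decay': 4}
```

Note that disclosing only the quartile group, rather than the raw weight values, is what allows the disclosure to convey relative importance without exposing exact system parameters.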

B. The Attorney General [or relevant state agency] shall, on or before [date], adopt rules to further clarify the information required to be disclosed under subsection (A) of this Section.

C. A covered online platform shall disclose, on an annual basis, the high-level objectives, key results, and performance metrics it uses to evaluate product teams responsible for algorithmic recommender system design.9

User Choice and Defaults

A. For all services, products, and features where a covered online platform makes use of an algorithmic recommender system that uses personal data, the algorithmic recommender system shall be configured, by default, to maximize one or more long-term user value metrics.

B. A covered online platform shall provide an accessible user interface that enables users to expressly and unambiguously communicate their preferences about the types of items to be recommended and to be blocked in the output of the covered online platform’s algorithmic recommender systems. The covered online platform shall take all reasonable steps to ensure that the output of its algorithmic recommender systems is consistent with those preferences.
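One reasonable step toward the consistency that subsection (B) requires is filtering blocked item types out of a recommender system's candidate list before display. The sketch below is a hypothetical illustration; the item structure and type labels are invented.

```python
# Hypothetical sketch of honoring expressed item-type preferences under
# subsection (B): remove items whose type the user has blocked from the
# recommender's candidate list before display. Item fields are invented.

def apply_preferences(candidates: list[dict], blocked_types: set[str]) -> list[dict]:
    """Return only the candidate items whose type the user has not blocked."""
    return [item for item in candidates if item["type"] not in blocked_types]

feed = [
    {"id": 1, "type": "politics"},
    {"id": 2, "type": "sports"},
    {"id": 3, "type": "politics"},
]
print(apply_preferences(feed, blocked_types={"politics"}))
# [{'id': 2, 'type': 'sports'}]
```

A post-ranking filter like this is only one design; a platform could instead incorporate preferences earlier, during candidate retrieval or ranking.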

C. Nondiscrimination. A covered online platform shall not withhold from a user, or degrade, lower the quality of, or increase the price of, any product, service, or feature because the user exercised any right contained in this article, including the user’s selection of any algorithmic recommender system option or expressed preferences about types of items to be recommended or blocked, other than as necessary for compliance with the provisions of this article or any rules or regulations promulgated pursuant to this article.

Covered Minors

Any algorithmic recommender system provided by a covered online platform to a covered minor shall be configured, by default, to maximize one or more long-term user value metrics applicable to minors. 

Long-Term Assessments

A. A covered online platform shall maintain at least one holdout group and make all changes to the design of an algorithmic recommender system subject to a long-term holdout assessment, in accordance with rules promulgated by the Attorney General [or relevant state agency] under subsection (C) of this Section.

B. On an annual basis, a covered online platform shall make publicly available in a location that is easily accessible a Long-Term Holdout Assessment Disclosure that includes:10

1. The covered online platform’s long-term user value metrics;

2. The aggregate, anonymized measurements of each metric across the holdout group(s); and

3. The aggregate, anonymized measurements of each metric across the rest of the user base of the covered online platform.
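The disclosure in subsection (B) amounts to reporting each long-term user value metric as an aggregate over the holdout group and, separately, over the rest of the user base. The sketch below illustrates one way to do that; the hash-based holdout assignment, user identifiers, and metric values are illustrative assumptions, not anything the text requires.

```python
# Hypothetical sketch of the subsection (B) disclosure: aggregate a
# long-term user value metric separately over a holdout group and the
# rest of the user base, reporting only anonymized aggregates.
import hashlib
from statistics import mean

def in_holdout(user_id: str, fraction: float = 0.01) -> bool:
    """Deterministically assign roughly `fraction` of users to the holdout group."""
    # Hashing the user ID keeps assignment stable across the assessment period.
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return int(digest, 16) % 10_000 < fraction * 10_000

def aggregate_metric(metric_by_user: dict[str, float]) -> dict[str, float]:
    """Report only aggregate means, never per-user metric values."""
    holdout = [v for u, v in metric_by_user.items() if in_holdout(u)]
    rest = [v for u, v in metric_by_user.items() if not in_holdout(u)]
    return {
        "holdout_mean": mean(holdout) if holdout else float("nan"),
        "rest_mean": mean(rest) if rest else float("nan"),
    }
```

A stable, deterministic assignment matters here because the definition of a long-term holdout assessment requires the same group to remain exempted for at least 12 months.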

C. The Attorney General [or relevant state agency] shall, on or before [date], adopt rules for the operation of long-term holdout assessments as required under this Section, including:

1. The construction of holdout groups when carrying out long-term holdout assessments under this section;

2. The requirements for Long-Term Holdout Assessment Disclosures as required under subsection (B) of this Section; and

3. In the Attorney General’s [or relevant state agency’s] discretion, exempting from the long-term holdout assessment requirements in this Section any change to the design of an algorithmic recommender system that serves to reduce or prevent direct and immediate harms to users without increasing user engagement or revenue for the covered business.

D. A covered business operating a covered online platform shall, at its own expense and at least once a year, obtain an independent audit of the long-term holdout assessment(s) on its platform and of the Long-Term Holdout Assessment Disclosure. To comply with this requirement:11

1. The independent auditor preparing reports under this subsection must follow inspection and consultation practices designed to ensure that reports are comprehensive and accurate; and

2. The covered online platform must provide the independent auditors that prepare reports required under this subsection with full and complete cooperation and access to the information and operations required to ensure that the report is comprehensive and accurate.


Citations
  1. Minnesota “Prohibiting Social Media Manipulation Act” 2024.
  2. KOSA 2023-24. See also Vermont S.69 2025, Minnesota “Prohibiting Social Media Manipulation Act” 2024, PATA 2023-24.
  3. Vermont S.69 2025 and California AB-2273 2021-22. These are more specific than Maryland HB 603 2024.
  4. PATA 2023-24.
  5. New York Senate Bill 7694 2023-24.
  6. Vermont AADC 2025-26.
  7. Reset Model AADC Bill.
  8. California SB 979 2023-24.
  9. This provision would benefit if accompanied by clear legislative findings that explain the justification for the provision. Platforms may track hundreds or thousands of different metrics that can be used to evaluate many different forms of engagement, revenue, and ad impressions, as well as quality and integrity metrics. Requiring the disclosure of all of these metrics would provide a sea of information with no guide as to how the metrics are traded off against each other or which ones carry the most importance when platforms decide to make design changes. Focusing on the metrics used to evaluate product teams provides a narrower window into what the platform views as most important. Making these metrics transparent should incentivize platforms to incorporate employee and team evaluation criteria that better align with user value. Product team metrics that are solely and consistently focused on engagement metrics and do not include metrics related to user value, satisfaction, or harm mitigation should be cause for alarm.
  10. Maryland HB 603 2024, Colorado SB25-086 2025, and KOSA 2023-24.
  11. Reset Model AADC Bill.
