Platform Governance

KGI’s work on platform governance addresses how platform design choices shape user behavior, risks, and societal outcomes. KGI works across policy contexts with independent researchers, civil society partners, industry experts, litigators, and journalists to inform the development of online platform accountability, transparency, and regulation.


KGI’s platform governance work focuses on platform design in policy and industry, platform design litigation, and public data access.

Platform Design in Policy and Industry

Each day, billions of people use online tech platforms, scrolling through social media feeds, search results, and streaming recommendations that shape what they see, read, and watch. The design of these online platforms, including their algorithmic feeds, determines how users experience the online information environment, wielding enormous influence over their online experiences and, increasingly, their lives offline.

Federal, state, and global policymakers have proposed and adopted a variety of approaches to regulate platform designs and their impacts, from strengthening transparency and user control of algorithmic feeds to addressing deceptive design features or requiring age verification. The maturation of policymaking in this area requires bolstering scientific consensus about which platform design changes effectively mitigate which harms, how to understand the tradeoffs, and how best to measure and evaluate design changes, among other questions.

KGI is working to deepen research-to-policy connections and convene stakeholders in support of this agenda. Learn more about our work on platform design in policy and industry here.

Platform Design Litigation

Litigation is a battleground for platform accountability around the world. Lawsuits across US states now target the design choices behind online platforms – from extended use designs and algorithmic manipulation to privacy violations. Many of these cases employ legal theories grounded in consumer protection and product liability to attempt to make platforms answerable for the design of their products. As US lawsuits advance to critical discovery and remedy phases, there is a growing need to foster collaboration between three communities whose work sits at the intersection of platform design and the law: litigators, technology researchers, and legal scholars. 

KGI has two litigation-oriented projects: Litigating Platform Design and Empirical Research in Tech Litigation. Learn more about our work on platform design litigation here.

Public Data Access 

Access to public data about online discourse – the reach of viral posts, the connections between social media accounts, and so much more – is essential for accountability and informed public conversations. Public data access enables independent research and investigation, informs evidence-based policymaking, and advances collective understanding about the role of online platforms in our lives.

Thanks to transparency advocates, various transparency regimes have taken hold – voluntary, self-regulatory, and regulatory – requiring platforms to share information about their activities, algorithms, and processes with vetted entities such as researchers, regulators, and business competitors, and sometimes with the broader public. Other organizations, companies, and researchers have developed automated tools to independently collect and analyze public platform data.

Yet the tools that once allowed researchers, journalists, and civil society to study platforms are disappearing. Platforms restrict researcher access even as public data is monetized for advertisers, data brokers, and AI training. This imbalance – where companies profit but independent researchers are left in the dark – undermines transparency and weakens oversight.

KGI advances efforts to expand public access to public platform data, drawing on the research community’s experience with platform datasets to push for practical and policy changes that improve transparency and accountability. Learn more about our work on public data access here.

Latest Work

Europe Unveils New Evidence-Based Guidelines to Advance Safer Platform Design for Minors

Commentary /

The European Commission’s new guidelines to protect minors online mark a step forward in online child safety, offering recommendations for how platforms are designed, including limits on manipulative design techniques, defaults to maximize protection, more agency for children, and regular risk reviews.

The UK’s Ofcom Releases a Roadmap to Advance Researcher Access to Platform Data

Commentary /

A new report by Ofcom, the UK’s communication services regulator, underscores the challenges researchers face in accessing platform data and proposes a roadmap to improve researchers’ access to data in support of online safety.

Modular Legislative Components for Better Algorithmic Feeds

Model Text /

Modular legislative components published by the Knight-Georgetown Institute support lawmakers developing legislation that encourages better algorithmic feeds.

New EU Guidelines Set to Strengthen Digital Safety for Children in Europe

Commentary /

New draft guidelines by the European Commission aim to shape a better and safer online environment for minors. The Knight-Georgetown Institute submitted comments to the European Commission on its guidelines for protecting minors online under the Digital Services Act – underscoring the need for an evidence-based approach that focuses on strengthening risk reviews, refining default account settings, and improving algorithmic recommender system design.

Systemic Risk Assessment under the Digital Services Act

Commentary /

The EU’s Digital Services Act risk assessments and audits create the possibility of increased platform transparency and accountability, but the first round falls far short of realizing that potential. The first assessments and audits largely fail to sufficiently consider the role of platform design in relation to risk, and they lack specificity about the data, metrics, and methods used to evaluate risk and mitigation effectiveness. External analysis of risks and mitigations is further undermined by an ongoing lack of access to data.

See All From Platform Governance
