This brief provides an overview of emerging lessons and gaps from the first round of systemic risk assessment and audit reports required by the EU’s Digital Services Act (DSA).
In late 2024, nineteen designated Very Large Online Platforms and Search Engines published systemic risk assessment and audit reports as required by the European Union’s Digital Services Act (DSA). Risk assessment, mitigation, and audit are core components of the DSA’s governance framework. The 2023-2024 reports provide a first glimpse into how platforms are interpreting the DSA’s requirements in relation to systemic risk and effective mitigation.
The first round of assessments represents an important step forward toward transparency and accountability for digital platforms in Europe. For the first time, major online platforms assessed and reported on how their services may contribute to systemic risks in the EU, including threats to fundamental rights, the spread of illegal content, and problematic overuse or other harms to minors. This transparency regime lays the foundation for more informed online user experiences and a better-informed public conversation about risks. It further supports evidence-based and effective oversight by encouraging digital platforms to proactively identify, communicate, and mitigate risks.
The DSA’s risk assessments and audits create the possibility of increased platform transparency and accountability, but the first round falls far short of realizing that potential. The first assessments and audits largely fail to sufficiently consider the role of platform design in relation to risk and lack specificity about the data, metrics, and methods used to evaluate risk and mitigation effectiveness. External analysis of risks and mitigations is further undermined by an ongoing lack of access to data.
Although platform design shapes users’ experiences online, the systemic risk assessments and audits fail to effectively address platform design in relation to risk or mitigation. Many of the assessments focus primarily on risks associated with content produced by users and hosted on the platform, an approach that ignores how platform design itself can contribute to systemic risks.
To learn more, see KGI’s reviews of the risk assessments and audits as they relate to the recommender systems used by platforms. The DSA Civil Society Coordination Group, coordinated by the Center for Democracy & Technology, has also reviewed the assessments.
There is an urgent need to incorporate consistent and reliable definitions and measures into risk assessments, mitigations, and audits. The first round of systemic risk assessments and audits has not provided new information on platform risks or the effectiveness of mitigation measures.
In the absence of formal guidance from the European Commission, there is little uniformity or standardization across the assessments and audits. A Delegated Regulation provides guidance for auditors but does not include recommendations on standard definitions, methodologies, or datasets. Each platform and auditor has largely taken its own approach, and there is a noticeable lack of reporting on specific metrics and data in both the assessments and audits.
There is relevant work for the risk assessments to build on and incorporate. The European Commission, the research community, and civil society, however, need to articulate specific guidance for platforms.
Users and the research community face considerable challenges in securing access to platform data. DSA Article 40 requires platforms to enable independent research with both publicly accessible data and non-public data to further the study of systemic risks. A draft Delegated Regulation on vetted researcher access is an important tool to enable more meaningful assessment and triangulation of systemic risk and mitigations. The Commission should urgently prioritize enabling access to private platform data under Article 40(4) as well as access to publicly available platform data under Article 40(12). These tools are important complements to systemic risk assessment and audit, as are the Commission’s own investigations into platform non-compliance.
To reach their potential, risk assessments should systematically evaluate the role of platform design in relation to risk. Platforms also need clear guidance on the specific types of methodologies and measures of risk that should be incorporated into the risk assessment and audit process. The Commission, Member States, platforms, the research community, and civil society should come together to identify priority metrics and methodologies for the next round of assessments.