
Systemic Risk Assessment under the Digital Services Act

This brief provides an overview of emerging lessons and gaps from the first round of systemic risk assessment and audit reports required by the EU’s Digital Services Act (DSA). You can download a PDF version here.

In late 2024, nineteen designated Very Large Online Platforms and Search Engines published systemic risk assessment and audit reports as required by the DSA. Risk assessment, mitigation, and audit are core components of the DSA’s governance framework. The 2023-2024 reports provide a first glimpse into how platforms are interpreting the DSA’s requirements in relation to systemic risk and effective mitigation.

Strengths 

The first round of assessments represents an important step toward transparency and accountability for digital platforms in Europe. For the first time, major online platforms assessed and reported on how their services may contribute to systemic risks in the EU, including threats to fundamental rights, the spread of illegal content, and problematic overuse or other harms to minors. This transparency regime lays the foundation for more informed online user experiences and public conversation about risks. It further supports evidence-based and effective oversight by encouraging digital platforms to proactively identify, communicate, and mitigate risks.

Major Gaps

The DSA’s risk assessments and audits create the possibility of increased platform transparency and accountability, but the first round falls far short of realizing that potential. The first assessments and audits largely fail to sufficiently consider the role of platform design in relation to risk and lack specificity about the data, metrics, and methods used to evaluate risk and mitigation effectiveness. External analysis of risks and mitigations is further undermined by an ongoing lack of access to data.

Platform Design, Risk, and Mitigation

Although platform design shapes users’ experiences online, the systemic risk assessments and audits fail to effectively address platform design in relation to risk or mitigation. Many of the assessments focus primarily on risks associated with content produced by users and hosted on the platform. This approach ignores how platform design can contribute to systemic risks. For example: 

  • Facebook and Instagram’s assessments consider physical and mental well-being as cross-cutting risks relevant to multiple systemic risk areas. Yet the assessments fail to meaningfully consider how Meta’s own product features could incentivize problematic or harmful use or contribute to threats to fundamental rights.
  • Snap’s risk assessment devotes seven pages to physical and mental well-being risks, but it largely focuses on risks that arise from content. The assessment fails to consider how Snap’s design – including the centrality of ephemeral content, recommendation of adults to minors through Snap’s Quick Add feature, and Snap Map – could contribute to risks by incentivizing problematic use and unwanted contact or content. 
  • TikTok’s initial risk assessment considers some well-being risks and highlights how TikTok’s Daily Screen Time tool allows minors and parents to set daily limits for time spent on TikTok. Neither the assessment nor the audit considers the effectiveness of these tools. Documents released in litigation in the US show that TikTok’s internal teams believed time management tools would have little impact on addressing risks of harmful or problematic use by minors.
  • YouTube expressly considers how its design and features could contribute to behavioral addiction in children. The assessment and audit reference screen time limits and other design tools to limit time spent on YouTube. The assessment does not, however, describe whether and how YouTube’s time management tools actually contribute to mitigating the risk of behavioral addiction, particularly for minors. 

To learn more, KGI has conducted reviews of the risk assessments and audits as they relate to the recommender systems used by platforms. The DSA Civil Society Coordination Group, coordinated by the Center for Democracy & Technology, has also reviewed the assessments.

Methodologies and Metrics

There is an urgent need to incorporate consistent and reliable definitions and measures into risk assessments, mitigations, and audits. The first round of systemic risk assessments and audits has not provided new information on platform risks or the effectiveness of mitigation measures.

In the absence of formal guidance from the European Commission, there is little uniformity or standardization across the assessments and audits. A Delegated Regulation provides guidance for auditors, but does not include recommendations around standard definitions, methodologies, or datasets. Each platform and auditor has largely taken its own approach, and there is a noticeable lack of reporting on specific metrics and data in both the assessments and audits. 

There is relevant work for the risk assessments to build on and incorporate. The European Commission, the research community, and civil society, however, need to articulate specific guidance for platforms.

  • Meaningful Metrics: Consistent and reliable metrics will considerably strengthen risk assessment and mitigation. Platforms are not well positioned to establish effective and comparable measures on their own, but important work is underway to forge consistent measures of harm. An Integrated Harm Framework, for example, considers youth-specific social media harms as well as the strength of available evidence for specific mitigation strategies. New Risk Assessment Guidance envisions more consistent methodologies.
  • User Surveys: User surveys, already used by a range of platforms including Snap and Instagram, are an important tool for understanding user experiences related to risk and harm. These surveys are administered both by platforms and by independent organizations, and they should be incorporated into risk assessment.
  • Product Experimentation: Risk assessment should consider high-level platform product team goals and aggregated product design experimentation results. Reviewing the high-level goals and metrics of platform design teams would allow stakeholders to understand how platform growth goals are aligned (or misaligned) with risk prevention and mitigation goals.

Transparency and Data Access

Users and the research community face considerable challenges in securing access to platform data. DSA Article 40 requires platforms to enable independent research with both publicly accessible data and non-public data to further the study of systemic risks. A draft Delegated Regulation on vetted researcher access is an important tool to enable more meaningful assessment and triangulation of systemic risk and mitigations. The Commission should urgently prioritize enabling access to private platform data under Article 40(4) as well as access to publicly available platform data under Article 40(12). These tools are important complements to systemic risk assessment and audit, as are the Commission’s own investigations into platform non-compliance.

The Way Forward

To reach their potential, risk assessments should systematically evaluate the role of platform design in relation to risk. Platforms also need clear guidance on the specific types of methodologies and measures of risk that should be incorporated into the risk assessment and audit process. The Commission, Member States, platforms, the research community, and civil society should come together to identify priority metrics and methodologies for the next round of assessments. 

 
