
The European Union’s standard for online content governance – the Digital Services Act – is now a reality after the European Parliament passed the legislation overwhelmingly earlier this week. The final hurdle, a formality only, is the signing of the text by the European Council of Ministers in September.
The good news is that the landmark legislation includes the broadest transparency and platform accountability obligations to date. It will give users real control and insight into the content they engage with, and provide protection against some of the most pervasive and harmful aspects of our online space.
As the European Commission begins to work out enforcement mechanisms in earnest, the focus now turns to the implementation of the sprawling law. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, known in this case as Digital Services Coordinators (DSCs). It will rely heavily on the creation of new roles, the expansion of existing responsibilities, and seamless collaboration across borders. What is clear is that, so far, the institutional capacity to enforce this legislation effectively simply does not exist.
In “Sneak Peek,” the Commission offers a glimpse into how it has overcome some of the more obvious implementation challenges – like how it plans to oversee large online platforms, and how it will try to avoid the issues plaguing the GDPR, such as out-of-sync national regulators and selective enforcement – but its proposals only raise new questions. A large number of new staff will need to be hired, and the new European Algorithmic Transparency Centre will need to attract world-class data scientists and experts to help enforce a wide range of new algorithmic transparency and data accessibility obligations. The Commission’s initial vision is to organize its oversight responsibilities by subject area, including a social issues group that will oversee some of the new due diligence obligations. The lack of resources here is worrisome and ultimately threatens to turn these hard-earned obligations into empty tick-box exercises.
An important example is the obligation on platforms to conduct assessments addressing systemic risks stemming from their services. This is a complex process that takes into account all the fundamental rights protected by the EU Charter. To do this, tech companies will have to develop a Human Rights Impact Assessment (HRIA) – an assessment process designed to identify and mitigate potential human rights risks stemming from a service or business, in this case a platform – something civil society urged them to do throughout the negotiation process. In parallel, the Board, composed of the DSCs and chaired by the Commission, will annually assess the most prominent systemic risks identified and outline best practices for mitigation. As someone who has contributed to the development and evaluation of HRIAs, I know this is no mean feat, even with independent auditors and researchers involved.
To have an impact, the assessment requires a comprehensive baseline, specific impact analyses, assessment procedures, and a stakeholder engagement strategy. The best human rights impact assessments embed a gender-sensitive approach and pay particular attention to systemic risks that will disproportionately affect people from historically marginalized communities. This is the surest way to ensure that all potential rights violations are covered.
Fortunately, international human rights frameworks, such as the UN Guiding Principles on Business and Human Rights, provide guidance on how best to conduct these assessments. Nonetheless, the success of the provision will depend on how platforms interpret and invest in these assessments, and even more on how the Commission and national regulators enforce these obligations. At their current capacity, these agencies’ ability to develop guidelines and best practices and to assess mitigation strategies is nowhere near the scale the DSA requires.