This is an investigation into the implementation of Article 27 of the AI Act, which requires a Fundamental Rights Impact Assessment (FRIA) before high-risk AI systems are deployed in public services, banking, and insurance. The investigation was conducted in partnership with the Federación de Consumidores y Usuarios (CECU). We carried out expert interviews with academics, public administration officers, representatives of civil society organisations, and an investigative journalist, and consulted a wide range of relevant documentation.
The initial output is a report and an advocacy brief addressing how to ensure that Fundamental Rights Impact Assessments are completed in a way that meaningfully protects fundamental rights. We highlight that diverse public participation in these processes, together with effective transparency mechanisms to ensure oversight and enforcement, is an essential component of a meaningful Fundamental Rights Impact Assessment.
From 2022 to 2025 I co-organised the group Conversations with Practitioners under EAAMO Bridges. We interviewed people who work directly with marginalised groups about their day-to-day realities and challenges. In 2024–2025 we worked closely with one organisation, Chayn, which works in the gender-based violence space. Together we developed a Principles to Practice framework based on Chayn's trauma-informed design principles, started a 'Feminist AI' working group, and engaged in deep exploration of questions of safety and harm when using AI technologies to support survivors.