In this final, special episode of Designing Cities for All, we will explore the disproportionate impact of AI systems on marginalized groups, with a focus on LGBTQIA+ individuals. While the deployment of algorithmic models is increasing rapidly, their inclusivity is not improving at the same pace. In some cases they actively worsen discrimination, going as far as attempting to detect a person’s sexual orientation from their facial features. This makes the trajectory of these technologies all the more urgent, as they are increasingly used to govern societies. Since gender and sexual identity are crucial parts of who we are, we will explore whether data structures could ever become complex enough to accurately ‘decode’ this multiplicity.
About the programme
Surveillance systems collect our biometric data (physical, physiological, and behavioral characteristics), leaving no part of our bodies unanalyzed. This information is automatically sorted into rigid classifications with no room for ‘error’ or divergence, and in this translation from data to meaning, queer individuals are systematically excluded. Numerous incidents show how misidentification has detrimental effects on LGBTQIA+ communities in everyday life. The speakers in this programme will uncover the intersections between AI, the body, and identity, and discuss to what extent identification systems can be developed inclusively, and where they fail.
About the speakers
Masuma Shahid (she/her) is a researcher and lecturer (AI, Queer Data & LGBTQ+ rights) at the Erasmus School of Law (Erasmus University Rotterdam, The Netherlands) and one of the co-coordinators of the LGBTQI Working Group at the Berkeley Center on Comparative Equality and Anti-Discrimination at Berkeley Law. She is also the author of the chapter ‘AI and LGBTQ+ rights’ in the handbook “Artificial Intelligence and Human Rights” by Quintavalla & Temperman (eds.), Oxford University Press, 2023. Shahid is currently working with ILGA-Europe to research the impact of AI on LGBTQ+ rights in Europe and to make policy recommendations to the European Union and the Council of Europe.
Radical Data is a collective that creates technology for liberation and joy. Bringing practices from data science, socially engaged art, and Latin American activism, they build community-based projects that redefine the role of technology in our present and future. Rayén Jara Mitrovich (they/them) is a Chilean artist, activist and researcher whose work focuses on intimacy, bodies and their interaction with technology. Jo Jara Kroese (they/them) is an English-Dutch mathematician, artist and technologist. Their work uses data and technology to create tools for resistance and real utopias.
Megan Thomas (she/her) is a researcher on human rights activism, specializing in gender rights and queer activism. Megan has a background in French Literature, Human Rights and Democratization, and International Crimes, Conflict and Criminology. She has researched protection policies for human rights defenders and the stigmatization of LGBTIQ+ activists worldwide. In a recently published report on AI and disinformation against LGBTIQ+ communities, developed together with Meredith Veit and Forbidden Colours, she highlights the risks of AI for LGBTIQ+ communities online.
Nico Voskamp (he/him) is responsible for internal operations, partnerships, and fundraising at Bits of Freedom. With an academic background in the cultural and computer sciences, he has focused his research on the use of algorithms and artificial intelligence in Dutch municipalities. Bits of Freedom is dedicated to defending human rights in the context of digital technology, with a particular focus on privacy and freedom of communication. Based in Amsterdam, the NGO works through legal action, advocacy campaigns, and research to shape policy and enforcement in both Brussels and The Hague.
About Designing Cities for All: RE-generation
Over the first two years of Designing Cities for All (DCFA), we’ve learned about exclusion by design and the (re)design of inclusive cities. Along the journey, one question kept popping up: what exactly does ‘for all’ entail? After focusing mostly on the ‘who’, DCFA is rebooting as Designing Cities for All: RE-generation. This time around, the series also incorporates the ‘what’ by looking through the fresh lens of regenerative design. This emerging field may well hold a promising answer to the challenges of our time, as it focuses on designing products, services, systems, and processes that lead to both social and ecological recovery and keep those systems healthy.
Artificial Intelligence is going to change our world; that much is inevitable. But how it changes our world is still up to us. As LGBTQ people, often marginalised by traditional systems, we need to be wary of how AI could filter us out. If we aren’t, it could tell our story incorrectly and leave us behind as the technology expands.
The risks of AI-powered oppression of sexual and gender diversity are already here.
Numerous states across the globe have deployed unregulated AI systems to assess welfare claims, monitor public spaces, or predict someone’s likelihood of committing a crime. These technologies are often branded as ‘technical fixes’ for structural issues such as poverty, sexism, and discrimination. They draw on staggering amounts of sensitive data, fed into automated systems that decide whether individuals should receive housing, benefits, healthcare, and education, or even be charged with a crime. Yet instead of fixing societal problems, many AI systems have flagrantly amplified racism and inequality, and perpetuated human rights harms and discrimination.