Professor Noopur Raval | October 17 | The Algorithm is a Lie: revisiting blackboxing, transparency and user engagement with AI
Abstract | With the rise of complex computational systems and their real-world applications across all domains of life, including work, hiring, socializing, and criminal justice, we are often told that algorithmic systems are inherently complex, require high degrees of technical expertise, and are thus unknowable to a degree. We are also often informed that harmful outcomes of algorithmic systems are caused by "bias," and that if we control for bias in training data and design, we can make these systems better, or at least less harmful, racist, and sexist. In response, computational and legal scholars and activists have developed auditing tools to unpack "blackboxed" algorithmic systems and reveal how they function, in the hope of producing better social and economic outcomes.
This talk draws on years of ethnographic work across projects and mobilizes case studies to demonstrate how the claimed expertise and unknowability of algorithmic systems are a ruse and a distraction from fundamental questions of values and pedagogy in Computer Science and associated fields. These fundamental challenges are more than bias or glitches that can be fixed technically. The talk will also argue that algorithmic systems can be investigated as socio-technical systems by non-expert users. In doing so, the talk invites scholars across LIS fields to develop experiential, participatory, archival, and experimental methodologies that tackle algorithms as social objects.
Speaker Bio | Dr. Noopur Raval is an assistant professor in Information Studies at UCLA. She is a feminist, interdisciplinary, interpretivist researcher and educator with a deep interest in how people create and maintain a good life and how technology can support visions of good work and life across the world. She received her PhD in Informatics from UC Irvine and has previously held positions at UC Santa Cruz, the AI Now Institute at NYU, Microsoft Research, and Xerox Research.