Critical Bestiaries of AI: Reimagining Monsters

[Image: A psychedelic-inspired, distorted picture of a monster in saturated colours.]

Project Leads:

Nina Hallowell

Professor of Social and Ethical Aspects of Genomics


Rachel Douglas-Jones

Associate Professor of the Anthropology of Data and Infrastructure


Nicholas Pitt

Public Engagement Officer




About the project:

The Bestiaries Project is a collaboration between the Wellcome Centre for Ethics & Humanities at the University of Oxford and the ETHOS Lab at the IT University of Copenhagen. The project's goal is to draw public attention to aspects of Artificial Intelligence (AI) that require oversight and regulation. We will do this by creating an AI bestiary, a collection of AI beasts or monsters representing areas of AI technology that are in need of governance.

Teams of Oxford researchers who work on AI from various perspectives (for example, those who create AI-based computer programs, or those who study the ethical and social implications of introducing AI into society) and on monster studies will create the bestiary during a workshop facilitated by an artist. In addition to identifying AI beasts, or areas of AI in need of governance, these researchers will develop a series of collages representing the beasts they have imagined.

This collection of AI beasts will then form the basis for workshops in Oxford primary schools, where students will make their own beasts, taking inspiration from what the artist and researchers have created. The students' work will culminate in an exhibition in central Oxford, where more children and their parents will also have the opportunity to make beasts of their own.

The aim of the Bestiaries Project is to engage different audiences - academic researchers, primary school children and members of the public - in thinking more critically about AI and its governance: where the gaps are, and how they can or should be filled.
This project is funded by the Minderoo AI Challenge Fund.