
Alex Taylor: What we research and how we go about doing research are intimately connected!

Alex Taylor is a Reader in Design Informatics at the University of Edinburgh. He has been contributing to the areas of Science and Technology Studies and Human-Computer Interaction for over twenty years and has held positions in both academic and industrial research, most recently as Centre Co-Director of HCID at City, University of London, and in the past at the University of Surrey, Goldsmiths, Xerox, Google and Microsoft Research.

Alex is a Digital Futures Scholar in residence from October 2023 to October 2024, hosted by Rob Comber, Associate Professor in the Division of Media Technology and Interaction Design at KTH.

Hi Alex, having recently joined the Institute of Design Informatics at the University of Edinburgh, could you share some insights into your current focus and objectives in the realm of Science and Technology Studies and Human-Computer Interaction within this new academic environment?

– My interests continue to be in the always-emerging relations between actors of all kinds. That might be between, for example, people and digital technologies or between nonhuman critters and their surroundings. What interests me here is how these relations influence how we engage with and make sense of the world. In this way, I would call the relations ‘world making’ – the relations create the conditions for very particular worlds. For me, this thinking draws heavily on diverse scholarly influences including feminist technoscience, animal studies and, more recently, intersectional and black studies.

With a diverse background in both academic and industrial research, how has your experience shaped your approach to understanding the entanglements between social life and machines? Are there specific lessons or perspectives from industry that have influenced your academic work?

– That’s hard to answer. The sceptic in me wants to say that industrial research is driven by fiscal growth and market share. In past experiences in industrial research – even though I’ve had a lot of freedom – I found that I was ultimately answerable to how my research related to ‘the business’. Being in academia has been liberating because that burden has been lifted. I find my research directions can now be led by their potential impact on society, whether that be in terms of knowledge production or through more direct, concrete interventions.

However, the interplay between the constraints and potentiality for research is of course much more complex than this. I value my experiences in industry because they’ve helped me understand that the entanglements between social life and machines ‘always come with their worlds’. That is, there’s never a pure or unadulterated setting in which research is done, and we’re always making sense of social and technical relations from particular positions and with particular stories in mind.

What I’ve come to understand in my research on the entangled relations between social life and machines is that it matters what positions we take and what stories we are part of and help to tell. So, it’s not so much whether you’re in industry or the academy, but how you make yourself open to and accountable for the worlds you get behind.

Your long-term research ambition revolves around the concept of reparative AI. What inspired you to explore this perspective?

– Thinking with ideas like world making, and the worlds we’re accountable for, I’ve had a long-held interest in fairness, equity and justice. As we know, technologies risk cementing long-held inequities and in many cases amplifying them. There are some clear cases of this in the burgeoning use of algorithms and AI. Borrowing from Jenny Davis, Apryl Williams and Michael Yang*, I see the word ‘reparative’ as important because it immediately highlights the intrinsic biases in computational systems. It also suggests that we can’t achieve neutral solutions to bias, but are always, as I’ve said above, taking a position. To take the position of reparation in AI, then, is to actively resist social disparities and inequities and to commit to the work of making more just worlds possible. It’s this active and reparative approach to the entangled relations between technology and society that really motivates me in my research.

As the Centre Co-Director of HCID at City, University of London, you likely engaged in various projects. Could you highlight a specific project or research endeavour that you found particularly intriguing or impactful and the outcomes that emerged from it?

– As a co-director at City, University of London, alongside the amazing Steph Wilson, it was less the projects that drove me and more the environment we created for the research to happen. What was important to me was fostering a sense of collective care and commitment to the sorts of values I’ve touched on above. For me, it means very little to do research on topics like equity, justice and reparation without at the same time creating flourishing environments that promote such values. I see what we research and how we go about doing research as intimately connected.

It’s important for me to add that what I’ve learnt is that this ‘project’, if you will, of building caring research environments that share a collective commitment to justice isn’t straightforward, and there’s no final endpoint. It is an unending endeavour that requires experimentation, involves knock-backs, and most importantly must be collective. We wouldn’t have the special environment that exists in HCID at City without the special people ready to put in the hard work.

In your role as a Digital Futures Scholar, you plan to work with the Digital Futures environment members to explore the concept of reparative AI and investigate concrete cases. What specific challenges or opportunities do you foresee in this collaborative exploration?

– I feel very fortunate to be working with people based at KTH and Stockholm University as part of Digital Futures. People like Airi Lampinen and Rob Comber are leading the way when it comes to careful and critical scholarship examining technology in society. They’re also surrounded by incredible scholars, both junior and senior, pushing hard on the status quo and troubles in computing and beyond.

The challenge for us will be to use our relatively short time together to build something more substantial that might set the stage for the sorts of research environments I’ve described and that might allow us to take reparation seriously. With research being so dictated by small pots of money and short horizons, it’s hard to do the structural work required to achieve real change. This is what I really want to do through Digital Futures and, alongside people like Airi and Rob, the challenge is going to be to build our story and get behind it with way too little time.

*Davis, J. L., Williams, A., & Yang, M. W. (2021). Algorithmic reparation. Big Data & Society, 8(2).