Image credit: Arie Kers

One example of such an application would be monitoring social unrest (e.g. riots and crime) in individual neighbourhoods. Gabriele Jacobs, Dean of Erasmus University College, coordinates the AI-MAPS project. AI-MAPS, short for Artificial Intelligence for Multi-Agency Public Safety Issues, is a study of the safe and ethical application of artificial intelligence for public safety.

Spectre

The idea of municipal governments, the police, tech companies and researchers all collaborating on artificial intelligence might evoke the spectre of ‘Big Brother is watching you’, but that is exactly what Jacobs and her fellow grant applicants are seeking to prevent with this study. “Several municipalities already use artificial intelligence, which can detect (among other things) whether protests are getting out of hand and whether violence is being advocated in certain chat rooms. We want the development of applications such as this one to start with all stakeholders being involved. So we don’t want to do things the way they are usually done: a tech giant develops something, people start using it, and only then do others start asking how desirable all of this is. We want to be involved right from the start and see how we can allow ordinary people, tech companies, non-governmental organisations and government agencies to have their say on a product as it is developed in real-life situations.”

Jacobs realises that her team is seeking to get a very large number of stakeholders involved – basically, our entire society. This may result in the process becoming extremely unwieldy. However, figuring out how this is supposed to be done is actually one of the objectives of the study. “We want to find out how we can get that many stakeholders to collaborate. How does a municipal government find the right tech company? And how can we get a tech company to collaborate with legal experts in coming up with creative ideas regarding the legal and ethical aspects of artificial intelligence? We hope to develop a method for this. What is special is that we explicitly wish to focus on the role and meaning of experience and the imagination.”

Walk

Illustration for the AI-MAPS artificial intelligence research project, featuring Gabriele Jacobs.
Image credit: Elzeline Kooy

Jacobs explains that while they were drafting the grant application, many of the stakeholders met several times. “It was a magical experience! We had a tech company, the police and the Willem de Kooning Academy all in the same room, and somehow we were all completely on the same page. Basically, the application procedure was an example of the method we hope to develop in the study.”

Isn’t there a risk that the main contributors to the project will turn out to be companies and government agencies, while it proves hard to get regular people involved? Jacobs acknowledges that this is a real risk. “For instance, we wanted to get activist movements involved in our study as well, but they were unwilling to join us. They’re not likely to talk to the police and the university. But we have developed – partly through the Willem de Kooning Academy – all sorts of methods to get people involved. For instance, we offer a walk through the city in which people have particular experiences at strategic spots and are asked questions about the city.” So basically, they are a focus group of sorts, but one that is actually taken to the streets.

Convergence

Jacobs says that the project is a good example of joint research with a societal impact – exactly what modern science is supposed to be all about. In addition to EUC, the parties involved include ESL, ESSB, Delft University of Technology, Leiden University, The Hague University of Applied Sciences and others. “This transcends one academic discipline or one faculty. We have stated the problem and now we are trying to find the right experts.”