Research by Dutch universities should help prevent scandals like the benefits affair

More and more governments and companies are relying on artificial intelligence and decision-making algorithms, but negligence and invasions of privacy must be prevented. The Dutch Research Council (NWO) recently allocated 21.3 million euros to the project ‘The Algorithmic Society’, in which various Dutch universities participate. The project examines automated decision-making in, for example, the judicial system and hospitals.

Most people now know the scandal: the childcare benefits affair, in which the Dutch tax authorities violated human rights by using algorithms that led to ethnic profiling. The example clearly shows that taking ethical considerations into account when using algorithms and artificial intelligence is indispensable for the healthy functioning of professional organizations. After all, disturbing situations can arise when an organization focuses too heavily on computer results and bureaucracy alone.

“People are quick to trust systems and to assume they do what they are supposed to. But do they also know on what basis the data is made available to them? This is often not the case,” says José van Dijck, professor of media and digital society at Utrecht University (UU). She is part of the new project at UU. “We will explore how we can guarantee important values such as privacy, equality and security.”

Combining forces

The ten-year project is led by University of Amsterdam (UvA) professors Natali Helberger and Claes de Vreese. In addition to Utrecht University, the other participants are Erasmus University Rotterdam, Tilburg University and Delft University of Technology. Van Dijck hopes the research can, among other things, help raise awareness among professionals who deal with automated processes on a daily basis. “How can we ensure that they are constantly aware of safeguarding public values? By understanding how the rollout of algorithms in different sectors of society is progressing, we can ultimately clarify this.”

Human rights

Utrecht University will receive approximately 3.5 million euros of the total budget. One of the projects the money will be spent on is the Impact Assessment for Human Rights and Algorithms (IAMA), developed by Janneke Gerards at the Utrecht Data School. Together with colleagues, she created an instrument, a kind of manual, that supports organizations in decision-making around the development and use of algorithms. It describes, step by step, the points of discussion that must be addressed before an algorithm is implemented.

When following the IAMA, one of the first questions policymakers should ask themselves is what the specific goal of using the algorithm is. The manual also states that people should always have the freedom to reject an algorithm’s decisions. By laying out the course of a careful decision-making process in this way, the manual can help prevent problematic situations such as the benefits affair.

Research into algorithms within administrative bodies is just one of the focus areas of The Algorithmic Society. Van Dijck is enthusiastic about the various areas of expertise the universities bring together in the project. “In Utrecht we focus on governance issues, in Amsterdam on research into the impact of algorithm applications, and in Rotterdam on applications in the healthcare system. The research focuses on three sectors (media, justice and health), but is widely applicable.”

A moving target

One of the challenges facing the researchers is the ongoing development of technology in society, which makes it difficult to predict how algorithms will be used in the future. Take voice assistants, for example: they are increasingly used in schools and hospitals. Who knows what effect that will have on how we handle algorithms in practice.

“We are studying a moving target. That is why we must always keep our eyes open for change.”
