Creating space for dignity and empathy: examining algorithms in government decisions
Automated government decisions can be cost-effective. But using algorithms also makes understanding the reasons behind decisions particularly complicated. Dr Melanie Fink, Assistant Professor at Leiden Law School, has received a Veni grant to fund her research into this.
Over the next three years, Dr Fink will investigate how we can create space for dignity and empathy when automating government decision-making. She will do this by working out how humans can compensate for machines' inability to give reasons for their decisions.
One example of this is an asylum application being denied, for instance because the applicant is considered to pose a risk to public order. Dr Fink: ‘The law is based on the assumption that these impactful decisions are taken by humans, whose reasoning behind the decision we can understand – even if we disagree with it. But when algorithms are used instead, such assumptions no longer hold true. This affects the contestability and legitimacy of the decision, but also the human dignity of the individual affected.’
Dehumanising effect of automation
Dr Fink says that automation practices that reduce the complexity of a human life to a series of data are ‘dehumanising’. This was the case for the algorithmic profiling used by authorities to identify suspicious recipients of childcare benefits and social assistance benefits in the Netherlands. Dr Fink continues: ‘People no longer have trust in the fairness of these processes, which ultimately undermines trust in administrative and judicial procedures as well.’
Humans and machines working together
Dr Fink wants to explore how humans can once again have a role in decisions, while continuing to enjoy the benefits of automation. After all, an algorithm can predict the likelihood that someone will commit fraud or assess their eligibility for asylum and benefits. It also saves the government money. ‘That raises questions,’ says Fink. ‘What could humans do, which tasks should we assign to machines and in which situations? Which individuals and legal experts are suited to the task of giving substance to the human dignity aspect of government decisions that involve algorithms?’
Expanding the duty to give reasons
Under European law, which is Dr Fink’s specialisation, there has long been a duty to give reasoned decisions. Over time, legal scholars, legal experts and governments have come to view this obligation narrowly, mainly as a tool to ensure the contestability of a decision. ‘Of course, this is important,’ says Dr Fink. ‘After all, without knowing the reasons for a decision, you cannot defend yourself against it. But this obligation can be so much more. It can form a bridge between the authorities and the individual, helping someone feel seen or heard as a human being.’
The duty to give reasons forms the basis for Dr Fink’s research into how the government and administrative authorities can arrive at better decisions. As she explains: ‘The project touches on access to justice – an issue that’s close to my heart – in a new way for me. People first need to trust in the fairness of the procedures, and a key condition for that is being treated like a human being who understands how and why the government has arrived at a particular decision.’
A gatekeeper role for administrative authorities
Dr Fink feels that civil servants should also want to treat citizens and other interested parties as human beings, rather than simply as numbers. She continues: ‘They can do this by providing meaningful justifications with their decisions. Civil servants should act as gatekeepers between humans and machines – they should be the gateway to humanity.’
Dr Melanie Fink’s research comes at a good time, as the European Union recently adopted new regulations on the use of artificial intelligence. This is exciting for legal experts, as these legislative instruments are yet to be fine-tuned. Dr Fink concludes: ‘My hope is that some of the results from this project will contribute to this process.’
Photo: Igor Omilaev via Unsplash