Bart Custers discusses facial recognition on Nieuwsuur
Despite a lack of specific legislation on this issue, Dutch Minister of Justice and Security Dilan Yeşilgöz is allowing the national police force to experiment with facial recognition technology.
Since 2019, the Dutch police had not been allowed to experiment with facial recognition technology due to a lack of legal and ethical safeguards. That changed about a year ago, when the police published their own document setting out basic principles for the use of facial recognition technology. The Dutch minister considered this sufficient grounds to allow experimentation with the technology. Bart Custers, Professor of Law and Data Science at eLaw, the Center for Law and Digital Technologies, gave his critical view on this in the Dutch news programme Nieuwsuur broadcast on 5 January 2024.
The EU Artificial Intelligence Act, which will regulate artificial intelligence across the EU, is currently being drafted. It will ensure that the use of AI and facial recognition in police investigations is better regulated, something the police's current framework does not adequately reflect.
Custers says that many cases never make it to court, particularly those in which someone is not guilty and has done nothing wrong, even though their rights may have been violated in the process. This makes the police's current internal review insufficient: external experts should also be involved in matters relating to privacy and other fundamental rights, for example.
Another concern is that current facial recognition technology is not always reliable. The Nieuwsuur broadcast features someone whom the technology wrongly identified as a suspect. Custers points out that such misidentifications occur on a regular basis and that the technology is less reliable for certain ethnic groups, because AI is often trained on biased datasets.
More information on the document (in Dutch) drawn up by the police can be found on the Nieuwsuur website, where the broadcast of 5 January 2024 is also available.