Universiteit Leiden


GPS blunders and security risks: why do we blindly follow technology?

Computer says no: end of story. Twenty years ago, a hilarious line in the British TV series Little Britain, now a reality. We all blindly follow technology at times, with varying consequences. For ISGA lecturer and researcher Daan Weggemans, it's a subject worthy of a PhD.

Have you ever gotten into trouble by blindly following technology?

Yes, many times! To give you an example: the day I submitted my book in May, I wanted to celebrate in the city. It was very cloudy, so I checked Buienradar to see if it would rain. It said it wouldn’t, so I went out without a coat. Well, I have never been caught in the rain so badly in my life. And that, after spending years working on a book about blind trust in technology and extreme obedience.

What have I learned from it? It’s so deeply ingrained in us, which is why it’s important to understand this phenomenon. No matter how aware you are, you still end up following technology more often than not. It’s much more than just a tool—it guides us far more than people tend to think.

Daan Weggemans proudly shows his book: 'Computer says no'.

And digital developments are advancing rapidly.

New digital developments often help us make better choices. This makes life genuinely easier and helps us deal with the complexity of the world. The downside is that technology can also make mistakes, or we might use it incorrectly and then blindly follow it. That’s how you end up soaked in the rain or arriving at the wrong destination with your GPS. The difference now is that these technologies are also used in highly critical areas, such as healthcare, the police, customs, and the tax authorities. In those cases, the consequences of the same small errors can be huge—or even life-threatening.

Time for your research, then.

We need to start thinking about the impact of digital technologies right now because this is just the beginning. Everywhere around us, we find advising and communicating technologies telling us what to think or do.

'A kind of cultural void has emerged, and technology has stepped in to fill it'

With new systems, it’s crucial to ensure that humans retain some level of control, especially in key organisations. The human should make the final decision, but there are cases where they can’t actually oversee the process at all. Take the example of a parking enforcement officer in a scan car—this role has become far more dependent on technology. You might argue that the human is still present, but what power do they actually have?

The technology makes no exceptions, I read in your book.

Yes, it is very consistent. No means no. The question in my book is: how can we better understand and interpret our interactions with such technologies?

My answer to this is: perhaps we should start looking at technology as an authority. That might sound strange because authorities are people, right? Like police officers, teachers, or individuals in positions of power or knowledge.

For a long time, the idea was that authorities couldn’t be things. We tend to think of technology as passive—tools that act on our behalf. But that’s not entirely true. Technology can mislead us, persuade us, or even force us to behave in a certain way. However, what we didn’t really have an explanation for was why people would abandon their own judgement, without coercion or debate.

That’s what this book is about. Authority gets things done, based on trust or respect. But can technologies also possess this capability? And if the answer is yes, how?

Do you also find yourself thinking about the future, wondering: where is this heading?

It's only becoming more extreme. The idea of authority carries a negative connotation. Think of the Milgram experiments and wars, which revealed how far people could go in obeying authority.

But authority relationships are often very useful as well. There are simply things you don’t know, so it’s helpful when a professor, parent, or police officer provides an answer. Or when a judge can make a ruling.

'The idea of authority carries a negative connotation—think of war'

Now we see that the guiding role of digital technologies in our lives is becoming increasingly significant, independent, and sophisticated. They are behaving more and more authoritatively, and in a complex world, it is becoming increasingly difficult to say no to them. Organisations have access to vast amounts of data, so if your screen tells you that someone is a threat and shouldn’t be allowed on a plane, or that they might be a fraudster—how do you, as a human, verify that?

But does a human make an exception in such cases?

Not always. There are international cases where young children were not allowed to board planes because they were mistakenly flagged as terrorists. If a customs officer then decides to follow the system anyway, what does that mean for all the other cases—like if I were to arrive there?

With the digitalisation of our society, we’re going to see many more cases like this. Technologies hold unique positions of information, and we increasingly give them names that convey authority—‘Expert System,’ ‘Socrates,’ and so on. You think, ‘Well, they must know best.’

And beyond that, there’s also a cultural layer beneath this world of technology. It provides answers to questions about uncertainty in our daily lives. In the past, we had religion; the priest, minister, or village elder provided the answers. A kind of cultural void has emerged, and technology has stepped in to fill it.

Is technology the new preacher?

In a way, perhaps it is—or the new police officer or teacher. Technology can take on all these roles, and that is quite intriguing.

How can we best navigate this development?

For countries or organisations, the task now is to establish how to handle this effectively: when is compliance beneficial, and when should we set limits to prevent potential errors?

This can be done by training people, raising awareness, and giving individuals the space to make different choices: ‘computer says no, but I don’t care.’ But it also means accepting that mistakes will be made.

When things do go wrong, there needs to be a mechanism in society to resolve issues as quickly as possible and compensate people if they end up on a certain list unfairly. We’re not there yet. How do we ensure accountability and liability?

Little Britain: Computer always says no to Carol Beer
