
Anger management in the world of AI

2 min read

Written by: Luc Brandts


As I have a rather long commute, I’ve turned to audiobooks and podcasts to make travel time more useful and pleasant. After finishing Geert Mak’s book on the recent history of Europe in Dutch, I moved on to another book in English. That’s where the trouble started: every three or four minutes, Siri thought I was asking a question, and worse, a question it couldn’t quite answer. The reading was interrupted, Siri bleeped and apologized that it couldn’t find that destination, or something along those lines.

Now, this is not a blog complaining about Siri, Alexa, Google Assistant or any of the other technologies out there. It’s quite amazing what these things are already capable of, and there’s a lot more coming. It’s also not a blog about my incorrect usage of Siri.

No, this is about the behavior it provokes. While I like to think I’m generally known as calm and composed, and certainly not someone who tries to win an argument by raising his voice, I have to admit that after the third time Siri interrupted my book, “calm” was no longer the best way to describe me. Frustratingly, just as in the real world, raising my voice didn’t help at all. And Siri stays utterly calm, which doesn’t make its behavior any easier to accept. (I certainly expected a sincere apology.)

Apparently, it’s more acceptable to raise our voices at a virtual assistant than at another person; at least the threshold is lower. Still, it didn’t quite feel right. There’s an ongoing debate about aggressive games having a bad influence on young people: the concern that first-person shooters encourage violent behavior in kids has led governments to impose age limits on many of these games. We may now have another interesting topic at hand: does extended use of Siri lead to anger management issues? Don’t get me wrong, I don’t think an age limit for Siri is necessarily a good idea.

Maybe this is one of the jobs of the future: a job that doesn’t exist yet but will one day be normal. I do see a future role for anger management coaches specializing in artificial intelligence. The better these assistants become, the more human they feel, and the more relevant such coaching will be. After all, we don’t want the shouting at Siri to spill over into the real world. Welcome to the uncanny valley.

Author:

Luc Brandts

Group CEO

