Although facial recognition technology uses artificial intelligence, it’s not always very intelligent. The UK Government’s automated passport photo checker recently rejected a perfectly compliant photo of a young black man because it thought his mouth was open.

Most of us have submitted a photo to have a passport made, and you know how it goes: a neutral expression, looking straight at the camera, mouth closed, no smile. When Joshua Bada submitted a photo just like this, the automated check rejected it with the message “it looks like your mouth is open.” He chose to submit the photo anyway, adding an explanation in the comment box: “My mouth is closed, I just have big lips.”

Joshua explains that he was a bit annoyed, but not surprised. He had run into a similar problem on Snapchat, where the filters didn’t quite recognize his mouth. As he puts it, it’s obviously because of his complexion and “just the way [his] features are.” When he shared what had happened with his friends, they all agreed that it’s funny, but that it shouldn’t be happening. I would say the same.

The Race Equality Foundation called this and similar cases “technological or digital racism.” The fault lies in the algorithm, but the cause is a failure to test it properly to make sure it works for black and ethnic minority people.

This isn’t the first time artificial intelligence has proved to be “racist.” Big companies like Nikon and Google have had the same problem with their products. Nikon cameras from about ten years ago would ask “Did someone blink?” when an Asian woman was taking photos of her family, and more recently the Google Photos app tagged African Americans as “gorillas.”

Now, I understand that artificial intelligence is, as the name says, artificial. It makes mistakes humans probably wouldn’t, and while some of them are utterly hilarious, others turn out to be offensive to users. AI won’t always be right, so it’s up to us humans to train the algorithms on more diverse data and test them properly (a sketch of such a check follows below) in order to avoid situations like this. [via the Telegraph; image credits: Joshua Bada/PA]
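For the curious, here’s what “testing properly” could look like at its simplest: compare how often an automated check rejects photos from different demographic groups. This is a minimal illustrative sketch in Python, not the passport checker’s actual code; the group labels and sample data are made up.

```python
from collections import defaultdict

def rejection_rates(results):
    """results: iterable of (group, passed) pairs, where `passed` says
    whether the automated check accepted the photo."""
    totals = defaultdict(int)
    rejected = defaultdict(int)
    for group, passed in results:
        totals[group] += 1
        if not passed:
            rejected[group] += 1
    # Share of photos the checker rejected, per group.
    return {group: rejected[group] / totals[group] for group in totals}

# Hypothetical evaluation records, for demonstration only.
sample = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]
print(rejection_rates(sample))  # {'group_a': 0.25, 'group_b': 0.75}
```

A gap that large between groups is a red flag that the system wasn’t evaluated on a diverse enough set of faces before it went live.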