Google’s AI labels what it sees in your photos, and it doesn’t always do the best job. Now Google has announced a change: its Cloud Vision API tool is going gender-neutral. Instead of labeling people in photos as “man” or “woman,” the tool will now play it safe and label them simply as “person.”

Business Insider obtained a recent email reportedly sent to Google developers. According to the email, the company’s AI tool will no longer use “gendered labels” to tag images. Google says the change is being introduced because “a person’s gender cannot be inferred by appearance.” Indeed, I believe that we can all agree on this one.
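To illustrate what the change means in practice, here is a minimal sketch of a post-processing step that collapses gendered labels into a neutral one. This is my own illustration, not Google’s actual implementation; the `neutralize_labels` helper and the assumed label set are hypothetical:

```python
# Hypothetical post-processing step mimicking the announced change:
# any gendered label is replaced with the neutral label "person".
# This is an illustration only, not Google's actual Cloud Vision code.

GENDERED_LABELS = {"man", "woman", "boy", "girl"}  # assumed label set

def neutralize_labels(labels):
    """Replace gendered labels with the neutral label 'person'."""
    return ["person" if label.lower() in GENDERED_LABELS else label
            for label in labels]

# Example: labels as a Vision-style API might have returned them before
print(neutralize_labels(["woman", "bicycle", "street"]))
# → ['person', 'bicycle', 'street']
```

In other words, the detector still recognizes that there is a human in the frame; it simply stops guessing which gender that human is.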

The email further reads that the changes are being made to align with Google’s own Artificial Intelligence Principles, “specifically Principle #2: Avoid creating or reinforcing unfair bias.”

Perhaps you remember the scandal from a few years ago, when the Google Photos app tagged African Americans as “gorillas.” Google “fixed” the issue almost three years later by simply removing the “gorilla” label from its algorithm. I guess it was the simplest and possibly the safest solution.

As for the latest “gender-neutral” tagging system, as I already mentioned, I agree that someone’s gender can’t be determined from their looks. So I guess this is a good call, even though I feel that Google played it safe again. However, with dozens of genders recognized today, who can blame them? It’s better to just play it safe and tag someone as a “person” than to assume their gender. After all, the only certain (and the most important) thing is that that’s what we all are – people.

[via The Verge, Business Insider]