Facial recognition and video surveillance can be put to many uses, good and bad. Government officials in a Chinese city chose the latter: they photographed people wearing pajamas in public and published the photos to shame them. And as if that weren’t enough, names and other personal data were published along with the photos.

On Monday, officials in Suzhou, Anhui province, published images of people wearing pajamas in the street on their official WeChat account. The officials called it “uncivilized behavior,” explaining that the city was competing for a national “civilized city” title, and added that residents were banned from wearing pajamas in public. Pajamas weren’t the only target: other publicly shamed behaviors included “lying [on a bench] in an uncivilized manner” and handing out flyers.

As if public shaming weren’t bad enough, the photos also included sensitive data such as the names and ID card numbers of the people caught on camera. Naturally, this drew disbelief and criticism. According to the BBC, some people argued that there was nothing wrong with wearing pajamas outside; others pointed to a much bigger problem: with this kind of public shaming, the officials violated people’s privacy. The BBC writes that the officials later “sincerely apologized,” though they still maintained that they wanted to “put an end to uncivilized behavior.” They added that “of course [they] should protect residents’ privacy,” but the damage was already done.

Now, wearing pajamas outside isn’t really my style, but I still don’t think it’s grounds for public shaming. First of all, people do far worse things that actually deserve shaming. Second, wearing pajamas doesn’t hurt anyone at all. But the much bigger problem here is the public exposure of sensitive data such as names and ID numbers. Just imagine the ways that information could be misused, and it was published for what? Mocking people for wearing pajamas. I’m pretty sure this isn’t how facial recognition should be used! [via BBC]