On Sunday, Brooklyn-based programmer Jacky Alciné uncovered a serious flaw in Google Photos' auto-labeling system. The program's algorithms, which are perpetually learning to identify what's in your photos so you can search for them by subject, had sorted his photos into categories: skyscrapers, airplanes and cars.
And then, over a photo of Alciné and a friend, who are both black, Google had given them the label "Gorillas."
Alciné alerted Google's team through Twitter, and Yonatan Zunger, Google's chief architect of social, reached out to help.
While the programming is advanced — it can identify photos of structures like the Eiffel Tower and tell you where the photos were taken, according to Yahoo Tech — Zunger went on to explain that it still has at least one awful issue.
"Searches are mostly fixed, but they can still turn up photos where we failed to recognize that there was a face there at all," Zunger wrote to Alciné via Twitter. "We're working on that issue now. ... We're also working on longer-term fixes around both linguistics" — that is, "words to be careful about in photos of people" — and "better recognition of dark-skinned faces."
Zunger told Alciné the issue had been fixed. But the program still mislabels people of all races: until recently, it confused white faces with dogs and seals. "Really interesting problems in image recognition here: obscured faces, different contrast processing needed for different skin tones and lighting, etc.," he wrote to Alciné. "We used to have a problem with people (of all races) being tagged as dogs, for similar reasons."
Google Photos likely wasn't programmed to be racist. Some of the responses to Alciné's post sure are, however.
What's worse, this isn't the first time an auto-labeling system has failed its users. In May, the advanced image-recognition system of the photo-hosting site Flickr auto-tagged a photo by photographer Corey Deshon of a man named William with "animal" and "ape." The tags have since been removed.
As PetaPixel pointed out, Flickr also labeled a photo of the concentration camp Dachau with "jungle gym" and "sport." And in 2009, HP's webcams allegedly weren't able to identify black faces, but could easily pick up white faces, only motion-tracking the latter.
Trying to make computers act like humans is no small feat, and plenty can go wrong. To prevent mistakes like these in the future, more attention needs to go into building image databases that represent all kinds of people. The program is only as powerful as the data it's trained on, and to find out where things are going wrong, it needs to know how diverse the world actually is.