Facial recognition can't classify transgender or non-binary people, study finds

Debates around facial recognition technology have continued to heat up, and cities like San Francisco have even banned the tech's use by local government. Now, a new study has confirmed yet another of the technology's shortcomings: facial recognition can't classify transgender or non-binary people. As more government agencies and private companies explore facial recognition, the technology's failures leave already vulnerable populations open to increased risk.

The study, conducted by Jed Brubaker, Jacob Paul, and Morgan Klaus Scheuerman from the University of Colorado Boulder's Information Science department, tested popular face analysis services from Amazon, Clarifai, IBM, and Microsoft.
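These services typically treat gender as a single binary attribute in their API responses. As a rough illustration, here is a minimal sketch of how one of the tested services, Amazon Rekognition, is commonly queried via the boto3 client (the image path is a placeholder, and the other providers' APIs differ in detail, but all share the binary output):

```python
import boto3

# Rekognition is one of the four services the study tested.
rekognition = boto3.client("rekognition")

# "face.jpg" is a placeholder path for any portrait photo.
with open("face.jpg", "rb") as image_file:
    response = rekognition.detect_faces(
        Image={"Bytes": image_file.read()},
        Attributes=["ALL"],  # request the full attribute set, including Gender
    )

for face in response["FaceDetails"]:
    gender = face["Gender"]
    # The Gender attribute can only ever be "Male" or "Female" plus a
    # confidence score; there is no way to report a non-binary identity.
    print(gender["Value"], gender["Confidence"])
```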

Researchers collected 2,450 pictures from Instagram, each labeled by its owner with a hashtag indicating their gender. This yielded seven groups of 350 images each: #woman, #man, #transwoman, #transman, #agender, #genderqueer, and #nonbinary.

The systems were most accurate with cisgender men and women, classifying them correctly 97.6 and 98.4 percent of the time, respectively. For trans and non-binary individuals, however, the results took a sharp turn for the worse.

Trans men were misgendered as women up to 38 percent of the time. People who are agender, genderqueer, and nonbinary were misgendered 100 percent of the time.

“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders,” Scheuerman, the paper's lead author, said. “While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.”

Part of the problem is that facial recognition can only recognize what it has been taught. If a system is only ever shown gender as a binary, then that is all it can see, and it will force everybody into one of those two categories, whether or not that label is appropriate.
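To make that concrete, here is a minimal, hypothetical sketch (not any vendor's actual model) of why a classifier trained on two labels must place every face into one of them: the argmax over two class probabilities always returns "woman" or "man", no matter whose face the embedding describes:

```python
import numpy as np

LABELS = ["woman", "man"]  # the only categories the model was ever taught

def softmax(logits):
    # Convert raw scores into probabilities that sum to 1
    exps = np.exp(logits - np.max(logits))
    return exps / exps.sum()

def classify(face_embedding, weights):
    # weights: a (2, d) matrix learned from binary-labeled training photos
    probs = softmax(weights @ face_embedding)
    # argmax forces every input into one of the two buckets,
    # regardless of whether either label is appropriate
    return LABELS[int(np.argmax(probs))], float(probs.max())

rng = np.random.default_rng(0)
weights = rng.normal(size=(2, 128))  # stand-in for trained parameters
face = rng.normal(size=128)          # stand-in for any face embedding
label, confidence = classify(face, weights)
print(f"{label} ({confidence:.0%})")  # always one of the two labels
```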

While this study is important, trans people have raised the alarm about facial recognition before. Os Keyes, a PhD student at the University of Washington's Department of Human Centered Design & Engineering, tackled automatic gender recognition by examining 30 years of facial recognition research.

In a November 2018 research paper, Keyes wrote that the models used to approach gender "fundamentally erase transgender people, excluding their concerns, needs, and existences from both design and research."

The issues posed by facial recognition's failure to see trans and non-binary people are severe. For example, landlords in New York are already trying to use facial recognition to replace keys in rent-stabilized apartment buildings. Keyes told VentureBeat that this could lead to trans people being flagged and law enforcement being called, which is especially dangerous for trans people of color.

These concerns are not far-fetched, especially given that facial recognition already fails to see Black people. Those who live at various intersections, whose humanity is already rendered invisible in society, then become invisible to the machines society produces.

Many have expressed concern that facial recognition will only codify existing forms of oppression, and research is making that alarmingly clear. The solution is not simply to add trans and non-binary people to the datasets used to train facial recognition; it may instead be to ask whether this technology needs to exist at all.