How a Simple Google Image Search Exposes Society's Most Harmful Stereotypes

A collage of results from a simple Google Image search

Google's mission is to organize the world's information. In doing so, it serves as an inadvertent mirror for our cultural stereotypes and ideals.

The company sends millions of digital "spiders" around the Internet to scan pages and store data for indexing. In Google Search, websites are ranked by relevance, determined largely by the number of links connecting the billions of pages on the Web. The more citations a source accumulates, the more authoritative it is considered.
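The link-counting idea behind this ranking can be illustrated with a simplified PageRank-style computation: a page scores higher when many pages, or highly ranked pages, link to it. This is a toy sketch with an invented four-page web, not Google's actual, far more complex algorithm.

```python
# Toy PageRank-style ranking. A page's score grows with the number
# and importance of the pages that link to it. Illustrative only --
# not Google's real ranking system.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page passes a share of its rank to the pages it links to.
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical web of four pages: "a" receives the most inbound links.
web = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # "a" ranks highest: most in-links
```

The damping factor (0.85 here, a conventional choice) models a reader who occasionally jumps to a random page instead of following links; it keeps rank from pooling in closed loops.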

In July 2001, the company introduced Image Search, which ranks visual material by relevance to the entered term. Images earn relevance by living on highly trafficked sites, by appearing on pages next to the searched-for term and through about 200 other factors.

Google says the function of its search algorithms is to "take your questions and turn them into answers." These answers, regardless of the computational process behind their generation, are not objective. 

"Search engines function as gatekeepers, channeling information by exclusion and inclusion as well as hierarchization," explained Institute for Network Cultures researcher Miriam Rasch in August. "Their algorithms determine what part of the web we get to see and their omnipresence fundamentally shapes our thinking and access to the world."

We see this at work every day with Google. Previous studies have shown that Google searches can expose "racial bias in society," where names "typically associated with black people were more likely to produce ads related to criminal activity."

Whereas web search points users to answers elsewhere on the Internet, Image Search returns data-derived mosaics that are often sufficient "answers" in themselves. These image sets collectively communicate a Platonic ideal of the searched-for term. In some cases they can be seen to reflect the stereotypes, preconceptions, attitudes and ideals of the American Internet-browsing public.

According to Google's algorithms, "beauty" is best embodied by a young, thin, white brunette, preferably near a flower.

The fact that "beauty" returns only women reflects the fact that we pay more attention to the physical appearance of women than to that of men in a variety of contexts. That many of the photos representing "beauty" showed women applying makeup reveals the intensity of this ideal: Women have to change themselves to achieve it.

Here are the results for "powerful person":

You get the predictable Obamas and Putins. But Eric Cantor and Michael Jordan also turn up, while no women do. You have to scroll through the results for a while before you encounter one.

Some of the most revealing search results are for occupations. "Doctor" surfaces a platter of mostly white, mostly male faces:

So does "engineer":

Same for "executive," although there are noticeably more women represented here than in previous searches:

Some results are a bit more nuanced. For example, "teacher" returns almost exclusively women:

But "professor" returns almost exclusively men:

Image Search also indicates that the negative connotations of the word "thug" are most often applied to black men.

Why it matters: Roughly 3.5 billion Google searches are conducted every day, and an estimated 65% of the population are visual learners. Google exerts broad influence over our access to information, and Image Search is a significant part of that influence. Even academics have been shown to consult Google Image Search regularly in the course of peer-reviewed research.

Image Search isn't just a neutral conduit of information. How it chooses to display results constitutes an act of authorship. The Google algorithm wasn't handed down by God: Behind any algorithm are human hands. Google says it changed its search algorithm 665 times in 2012 alone. (The company has not responded to requests for comment on the process behind Image Search.)

The meanings behind the above mosaics transcend the formulas set by Google's engineers. All of these images were created, uploaded, distributed and redistributed by humans who don't work for Google — that is, the rest of us. These images' coalescence into discrete semantic units is an expression of our collective attitudes and preconceptions. And while it's ultimately on us to address the cultural and social stereotypes we inadvertently or unconsciously deploy on a daily basis, Google's engine is having a real impact on how people understand themselves and see others.