The Secret Experiment Behind Facebook's "I Voted" Sticker


The news: That "I Voted" button at the top of your Facebook feed isn't a badge of civic duty — it's part of an experiment you didn't ask to be part of.

The button made its debut in 2008 as a way for users to signal to their friends that they voted in the national election. But since the 2010 midterm elections, it has been tied to the social network's secretive research into whether it can influence voting patterns, described as "a 61-million-person experiment in social influence and political mobilization." Researchers estimated that in 2010, Facebook ultimately "increased turnout directly by about 60,000 voters and indirectly through social contagion by another 280,000, for a total of 340,000 votes."

The 2010 study reached a chilling conclusion: 

It is possible that more of the 0.60% growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook.

As Mother Jones reports, Facebook conducted yet another voting experiment for the 2012 elections. Three months prior to Election Day, it altered the News Feeds of 1.9 million users to include more hard news stories rather than gushy personal statuses. A Facebook data scientist told Mother Jones this change "measurably increased civic engagement."

This is great news, right? On the surface, sure. Elections can have dire consequences if they're not taken seriously by average voters: After all, it's political and social elites who are more likely to vote and consequently have their interests represented in Congress. So in that regard, Facebook's efforts to mobilize potential voters are a positive development in a country that ranks 120th in the world in voter turnout.

However, there's a darker side to Facebook's democratic experiment. Jonathan Zittrain, the Harvard professor of Internet law and computer scientist, argued in a New Republic article that Facebook could potentially manipulate its algorithm not only to get people to vote, but to get people to vote for Mark Zuckerberg's preferred candidate.

Zittrain imagined a close election in which "Zuckerberg makes use of the fact that Facebook 'likes' can predict political views and party affiliation." In such a scenario, Zittrain asked what would happen if Zuckerberg used Facebook's tools to mobilize just the people with his preferred political leanings. If such a thing happened in a close enough election, Facebook could influence the results. After all, as Zittrain says, George W. Bush won Florida by just 534 votes in the 2000 presidential election.
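The arithmetic behind Zittrain's worry can be sketched with the article's own figures. This back-of-envelope calculation uses the 2010 study's numbers (340,000 extra votes among 61 million users) to derive an average per-user turnout lift; the one million targeted users is a purely hypothetical figure for illustration, not anything from the study.

```python
# A rough sketch of Zittrain's scenario, using only the figures reported above.
# The per-user lift comes from the 2010 study (340,000 extra votes / 61 million
# users); the number of targeted partisan users is a hypothetical assumption.

EXTRA_VOTES_2010 = 340_000          # direct + social-contagion votes (2010 study)
USERS_2010 = 61_000_000             # users included in the experiment

lift = EXTRA_VOTES_2010 / USERS_2010   # average per-user turnout lift (~0.56%)

targeted_users = 1_000_000          # hypothetical: one candidate's likely supporters
extra_votes = targeted_users * lift

MARGIN_FLORIDA_2000 = 534           # Bush's winning margin in Florida, 2000

print(f"Per-user turnout lift: {lift:.2%}")
print(f"Extra votes from selective targeting: {extra_votes:,.0f}")
print(f"Exceeds the {MARGIN_FLORIDA_2000}-vote margin? {extra_votes > MARGIN_FLORIDA_2000}")
```

Even under these crude assumptions, selectively showing the button to a single partisan slice of users yields several thousand extra votes, an order of magnitude more than the 2000 Florida margin.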

Facebook could be the deciding factor in a future election that's just as contested.

Should we be scared? Michael Buckley, Facebook's vice president of global business communications, told Mother Jones the tweaks to voter behavior were implemented "in a neutral manner." He also promised "greater transparency whenever we encourage civic participation in the future."

However, these words come from the same company that manipulated user emotions without permission earlier this year. And as The Atlantic observed, "social networks skew young and female: two reliably progressive-leaning demographics. Even if Facebook distributed the button equally to its users, it might still bring more liberal users to the polls than conservative ones."

Nothing happens by accident on Facebook. Everything we're doing and feeling is being shaped by a shadowy algorithm we know nothing about and cannot control. Unfortunately, the only thing you can do about it is to be aware.