In the wake of Donald Trump's victory over Hillary Clinton in the 2016 presidential election, many are accusing Facebook of turning a blind eye to false stories that may have misled voters and swayed the outcome.
The accusations have been so frequent and so damning that even top Facebook executives have questioned the social network's role in the election, and Facebook employees have reportedly formed a task force to combat fake news on the platform — despite CEO Mark Zuckerberg's dismissal of the issue.
Facebook is arguably the most influential news source in the country. According to the Pew Research Center, 62% of American adults get at least some of their news from social media, and Facebook is the leading platform among them.
The continued national outcry may lead to consequential change in the way Facebook curates content in its News Feed — which could have a major effect on what Americans read.
How Facebook's News Feed works
The News Feed is Facebook's algorithmic tool that constantly refreshes news stories, status updates, photos, friend activity and so on. As Facebook says, a user's News Feed is "influenced" by information relevant to the connections one interacts with the most. If you frequently comment, like or share stories from a specific page or person, Facebook will display more updates from that source.
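Facebook's actual ranking model is proprietary, but the behavior described above — surfacing more posts from sources a user already engages with — can be illustrated with a rough sketch. Everything here (the names, the scoring formula, the weights) is invented for illustration, not Facebook's real algorithm:

```python
# Hypothetical sketch of engagement-weighted feed ranking.
# Posts from sources the user likes, comments on or shares most
# are scored higher and therefore shown first.
from dataclasses import dataclass

@dataclass
class Post:
    source: str     # page or person that posted it
    recency: float  # 0.0 (old) .. 1.0 (just posted)

def rank_feed(posts, engagement):
    """Order posts by past engagement with each source, then recency.

    `engagement` maps a source name to a count of the user's prior
    likes, comments and shares on that source's posts.
    """
    def score(post):
        return engagement.get(post.source, 0) + post.recency
    return sorted(posts, key=score, reverse=True)
```

Under this toy model, a friend the user interacts with daily outranks an unfamiliar outlet even when the outlet's post is newer — which is exactly the feedback loop that can narrow a feed toward familiar, agreeable sources.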
When it comes to politics, it's easy to see how this system can create an echo chamber — especially one that's filled with misinformation.
Throughout this polarizing election, political affiliation has weighed heavily on personal relationships. "Delete me if you're voting for X" statuses populated scores of Facebook accounts, which likely led to more homogenized News Feeds. That would certainly make it easier for blatantly biased publications such as Breitbart or Occupy Democrats to spread within social circles. But the real danger lies in the completely fictitious sites that some argue had real repercussions for the American electorate.
How the News Feed could change
Facebook employees want to correct the News Feed problem, regardless of what Zuckerberg says. The New York Times obtained private message threads in which employees called for staff meetings to address concerns amid allegations that the platform distributed false news stories. Since Saturday, Facebook has restated its intentions to crack down on fake sites by updating its policy on displayed advertisements — the primary moneymaker for these publications, according to NBC News. Even if Facebook can't vet each article posted, the loss of ad revenue could threaten the survival of those fake sites.
"In accordance with the Audience Network Policy, we do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news," a Facebook representative told NBC. "While implied, we have updated the policy to explicitly clarify that this applies to fake news. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance."
It's a simple but effective step from Facebook. The next challenge will be enforcing the policy and filtering out the remaining fake news sites with enough financial capital to survive a drop in advertising dollars. Even if Facebook improves its screenings, the damage is done. Tech giants will have to grapple with the notion that their products may have changed the course of the election — and, perhaps, history.