After a 2016 presidential election in which social media was inundated with disinformation and foreign interference, the companies suddenly at the center of our democracy are finally taking some steps to protect against a similar outcome this year. On Thursday, Facebook announced that it will make a number of changes to its platform in order to limit the amount of misinformation voters are exposed to. That includes restricting political ads the week before Election Day, combating posts that encourage people not to vote, and preventing politicians from falsely claiming victory if the results don't go their way. Gee, I wonder who that policy was put in place for?
The decision to ditch political advertisements for the final week of election season is unlikely to satisfy anyone. Under this temporary rule change, Facebook will not allow politicians to place new ads on its platforms, including Facebook and Instagram, starting October 27. However, the company will allow politicians to continue running existing ads, and will let campaigns adjust whom those ads target and how much is spent pushing the content into users' feeds. New ads will be allowed to run again after Election Day.
This change amounts to a half-measure at most, particularly given Facebook's history of questionable policy decisions when it comes to political advertisements. The company has refused to apply its fact-checking efforts to ads placed by politicians, leaving candidates free to run paid disinformation campaigns. Elizabeth Warren tried to show how absurd this policy is by running an ad campaign claiming that Facebook founder Mark Zuckerberg had endorsed President Trump for re-election. Trump has already used Facebook's lax policy to his advantage, running ads containing false information, like the claim that Joe Biden offered Ukraine $1 billion in aid if the country removed from office a man who was investigating a company that Biden's son worked for. Facebook might avoid some particularly desperate and shameless ads in the final week of what is sure to be a contentious election cycle, but it appears to have no plan to stop campaigns from paying to spread falsehoods, should they so choose.
The policy falls short in another way, one that Facebook itself seems to recognize, judging by its other policy changes. The company will lift the ad restriction as soon as Election Day passes, but this year's election may not produce a result for several days after November 3. Experts have already warned that, given the amount of mail-in voting expected, we will likely have to wait days, if not longer, for full results. In fact, it's possible that President Trump will appear to hold a clear lead on election night, only for the picture to flip dramatically as mail-in ballots are counted. A recent survey found that Biden voters are twice as likely as Trump supporters to vote by mail. That leaves open the possibility that Trump will declare victory on election night and challenge the legitimacy of the election should Biden end up winning.
Facebook appears to be aware of this, noting that it is putting policies in place to combat false claims about the election outcome. The company said it will attach an "information label" to any content on its platform that tries to delegitimize the outcome of the election or falsely claim that lawful voting methods will result in fraud. That's another tactic Trump has already tried, claiming that mail-in voting is rife with fraud; it was also the first of the president's posts that Twitter fact-checked. Facebook also said it will link to official election results if a candidate tries to declare victory before a race has been called, in an effort to show users the true status of the race.
In addition to these efforts to label misinformation about election results, Facebook plans to remove posts that attempt to disenfranchise voters or block people from voting. The company has previously removed posts that actively discouraged voting, but it will now also take action against content that misleads people about things like ballot access or voter ID requirements. In 2016, for example, an ad campaign on Facebook targeting Hillary Clinton voters suggested that it was possible to vote by text message; that would be taken down under the new policy.
While the company has come a long way from Zuckerberg saying that it was "a pretty crazy idea" to suggest false information on his company's platforms could affect an election, Facebook still hasn't fully reckoned with its role in swaying the vote. These steps are better than nothing, but it's unlikely that Facebook will escape this election cycle without finding itself involved in controversy.