It's not quite an exodus, but it's certainly a movement. On Tuesday morning, another Facebook employee resigned from the company over its failure to adequately address the rampant hateful and racist speech on the platform. Ashok Chandwaney, a software engineer who spent about five and a half years at Facebook, said in a resignation letter posted on the company's internal employee network and obtained by the Washington Post that they could "no longer stomach contributing to an organization that is profiting off hate in the US and globally."
In the 1,300-word post, the 28-year-old Chandwaney laid out their reasons for jumping ship from Facebook, despite finding the company to be a "pleasant and mutually respectful workplace." The decision largely boiled down to a simple conclusion, in Chandwaney's view: Facebook, despite the efforts of its own employees, external groups, and ongoing pressure from advertisers, has decided to keep placing profits over social good. "Facebook is choosing to be on the wrong side of history," they wrote.
Chandwaney was not short on examples. Using Facebook's five core values ("Be bold," "Focus on impact," "Move fast," "Be open," and "Build social value") as guideposts, they showed the variety of ways the company has failed to live up to its own aspirations.
The company failed to "be bold" when it chose not to take action to address the growing amount of hate speech on its platform, even as advocacy groups like the Anti-Defamation League and major corporations including adidas, Coca-Cola, Ford, Starbucks, Unilever, and Verizon demanded Facebook address the problem. "To me being bold means seeing something that's hard to do but, knowing it’s the right thing to do, rolling up my sleeves, and diving in," Chandwaney wrote. "Boldness is not, on the other hand, taking a pass on implementing the recommendations from organized civil rights advocates ... and even our own civil rights auditors."
Facebook failed to "focus on impact" repeatedly, in Chandwaney's assessment, even when the company was directly confronted with the devastating effects the platform has had in sowing hate and violence. There is perhaps no clearer example of Facebook's harmful impact than the genocide in Myanmar, during which the Myanmar military targeted and killed the Muslim Rohingya minority group living within the nation's borders. The ethnic cleansing was incited and organized on Facebook, and the company was slow to react as the killings were occurring. Similarly, Facebook failed to recognize its impact when right-wing militia groups in the US started organizing and calling for acts of violence on its platform. These groups were brought to the attention of Facebook moderators repeatedly but allowed to operate until one of their members shot and killed two people in Kenosha, Wisconsin, finally prompting Facebook to ban them.
On moving fast, Chandwaney noted that Facebook had every opportunity to stop these types of events from occurring, but chose not to. While the company often claims to move faster than its competition in addressing hate speech on its platform, Chandwaney said they more often saw the company drag its feet. "Feedback is supposed to be a gift, yet despite the enormous feedback (and multiple lawsuits for discriminatory ads) very little action has been taken," they said, going so far as to say that the changes the company has made are ones that could have been done "instantly" if Facebook truly saw those changes as a priority. "The actions that have been taken are easy, and could be interpreted as impactful because they make us look good, rather than impactful because they will make substantive change," they wrote.
Similarly, Facebook has failed to live up to its principle to "be open." Chandwaney said that the company has chosen to "hide the receipts" after it was made public that Facebook removed strikes against conservative publications for spreading misinformation. While Facebook has made some gestures toward openness and addressing its failures, including commissioning and publishing a civil rights audit of its own platform, even that has been lacking. It took two years for the report, which found Facebook had "real world consequences that are serious setbacks for civil rights," to actually be released and the company offered little insight into the process while the audit was taking place.
All of these failings led Facebook to fall short on its final core value: "build social value." Given that the company has not only allowed hate speech but in some cases actively helped it spread, it is hard to conclude that Facebook's role in providing connectedness is actually a social good. "To this day, the meaning of this value escapes me," Chandwaney wrote. "In all my roles across the company, at the end of the day, the decisions have actually come down to business value."
Facebook did not respond to a request for comment on Chandwaney's departure.
Chandwaney is not the first employee to exit Facebook over the company's failure to take action on hate speech. Earlier this year, engineer Timothy Aveni left the company after CEO Mark Zuckerberg publicly stated he would not take action against a post by President Trump in which he said that "looting" would lead to "shooting." That decision also led to thousands of Facebook employees participating in a virtual walkout. More employees may have left the company quietly in response to its inaction, but it seems likely that others will follow Chandwaney and Aveni, offering sharp critiques and warnings on their way out.