At Facebook, Race Is an Issue That Goes Far Beyond the Company Wall

Racial tensions boiled over at Facebook this week when CEO Mark Zuckerberg posted an internal memo telling employees that malicious racist behavior would not be tolerated at the company.

The memo, obtained by Gizmodo, was written in response to Facebook employees crossing out instances of "Black Lives Matter" written on whiteboards around its headquarters in Menlo Park, California, and replacing the message with "All Lives Matter." Zuckerberg called this behavior "malicious" and "disrespectful" before schooling his staff on the meaning of Black Lives Matter.

"There are specific issues affecting the black community in the United States, coming from a history of oppression and racism. 'Black Lives Matter' doesn't mean other lives don't — it's simply asking that the black community also achieves the justice they deserve," Zuckerberg wrote. 

According to a source within Facebook, the issue has been ongoing for weeks. (Facebook declined to comment on this story.)

This sudden wave of "All Lives Matter" sentiment at Facebook may have been prompted by two visits in early February: one, from members of the Black Lives Matter movement; the other, from Shaun King and other black activists, who spoke at a company town hall. After these meetings, "Black Lives Matter" started appearing on some of the company's whiteboards.

Free speech vs. hate speech

Malkia Cyril, founder and executive director of the Center for Media Justice and a member of Black Lives Matter, manages several Facebook pages. Recently, she has come up against an overwhelming amount of profane, racist and threatening messaging on the public pages she manages. 

When she posts a video, it might take her three hours to prune her comments, she told Mic. "And I come back three hours later and it's just as many. I was like, 'This is not working.'" 

Cyril organized a meeting with Facebook's content and policy change team at the company's Menlo Park headquarters. Representatives from Black Lives Matter, the Center for Media Justice, Color of Change and the Ella Baker Center met with Facebook head of global policy management Monika Bickert, community engagement manager Aaron Moses, senior counsel of global infrastructure Bari Williams, policy communications agent Parisa Sabeti and public affairs associate Matt Steinfeld. 

At the meeting, Cyril said, she and her colleagues highlighted some of the issues they were experiencing. The first was a double standard in how Facebook enforces its terms. For instance, she told Mic, black women will post photos of their naked breasts to draw attention to the killings of black women under the hashtag #SayTheirNames. Each time, Cyril said, Facebook is quick to pull those photos down in accordance with its community standards. In context, it's clear these photos are not porn; they use nudity to make a political statement.

"And yet, being called a nigger, which is equally an explicit violation of Facebook's policies, not only can take weeks to get addressed, but also frequently we're told, 'It doesn't violate our code of conduct.'" Cyril told Mic. The company did attempt to clarify how it moderates its platform last year, but instances of perplexing content deletion and unaddressed hate speech persist. 

When she spoke to Facebook, the company's community standards team acknowledged that the unaddressed hate speech did, in fact, violate its community standards. One possible reason the posts weren't taken down is that the people who respond to the requests may not have understood them. 

According to Cyril, the company explained that because takedown requests are monitored by employees all over the world, the company removes any accompanying text, because the person reviewing may not understand the language. Text that gets flagged goes through a separate moderating process altogether.

An image of a white baby next to a picture of a baby ape doesn't violate Facebook's community standards. However, that same image with text that says "white babies and black babies are not the same" would. And while comparing black people to apes is a notorious racist trope, not everyone in the world knows that. 

"You're having people evaluate hate speech who have no context for the hate speech," Cyril told Mic. "On the other hand, let's keep it real. People are racist against black people all over the world. It's not limited to the United States."

"You're having people evaluate hate speech who have no context for the hate speech," Cyril told Mic. "On the other hand, let's keep it real. People are racist against black people all over the world. It's not limited to the United States."

During the meeting, Cyril and her associates agreed to send over some recommendations for how to deal with this issue. The group advised the company to put together a task force to address racism on the platform that includes black users. They also suggested the company implement policy changes and technical solutions that address some of the failings of their human moderators. 

But Cyril isn't sure that these proposals will be enough to effect change on the platform.

"You have an almost all-white company, and under those conditions, you cannot expect anything different to happen," Cyril told Mic. "They cannot effectively address anti-black racism in their company or on their platform without some change."

So where does that cultural change begin?

Facebook's staff is overwhelmingly white and male.

Since releasing its first diversity report in 2014, Facebook has made very public overtures toward diversity, but the numbers don't yet reflect those stated values. Black employees comprise 2% of Facebook's total staff and only 1% of its tech employees, which likely leaves few voices inside the company to push for these issues to be resolved. 

In 2014, the company announced it had partnered with the Anita Borg Institute, Girls Who Code, the National Society of Black Engineers and the Society of Hispanic Professional Engineers, promising to expand internship opportunities for underrepresented groups. 

A year later, the percentage of black employees had not risen, which is not surprising for a company of Facebook's size. But Facebook, at least publicly, seemed determined to show it could usher in more diverse hires by implementing a rule that ensured minorities would be represented in each batch of job candidates up for a new position. Nearly a month after the report, the company took its diversity measures a step further, releasing a series of training videos on how to manage unconscious bias in the workplace. The company will release its third diversity report this summer.

The diversity training was widely seen as a step toward changing Facebook's culture and welcoming a wider spectrum of backgrounds to the company. A recent Harvard study, however, demonstrated that diversity programs don't necessarily lead to more diverse hires, and can even cause white men to feel discriminated against.

Facebook's responsibility as a company, as a platform

There is a big question around Facebook's responsibility in moderating people's freedom to express their every thought — racist or otherwise.

Cyril argues that we need to hold Facebook accountable the same way we would a media entity like CNN. "People have done all manner of research on how television news, especially cable news, has influenced behavior, policy and the culture in general," she told Mic.

Facebook, with its 1 billion users, arguably wields as much influence as cable news. To that point, a Pew study from 2015 indicates that millennials receive most of their political news through Facebook. 

Furthermore, what's shared on Facebook may have consequences that extend offline. Tanya Faison, head of Black Lives Matter in Sacramento, California, told Mic that racists who troll the organization's local Facebook page also show up at rallies and other events, sometimes toting guns. Though no violence has erupted at Black Lives Matter events in Sacramento, problems persist elsewhere. At the end of 2015, five people were shot at a Black Lives Matter protest in Minneapolis. 

Faison said she has blocked harassers before, but she can't stop them from posting on her organization's events pages, which are public by necessity. Faison would like to see Facebook mitigate some of these technical issues, and she was present at the meeting with the social network. But, she said, she was disappointed with the outcome of the meeting. 

"There was no agreement on any resolutions at the time of the meeting," Faison told Mic.

Meanwhile, Zuckerberg seems committed to battling hate speech, at least in spirit. At a town hall in Berlin, Germany, he announced Facebook was determined to get better at removing hate speech on its platform, according to Bloomberg News. In this case, he was specifically referring to posts that denigrate refugees. As for Black Lives Matter and other black activists, Facebook has promised to stay in conversation about its hate speech policies going forward.

While Cyril is optimistic about potential changes, she's convinced that to effect change on the platform, Facebook must first address racial issues inside the company. When employees are crossing out "Black Lives Matter" in favor of "All Lives Matter" in the workplace, their colleagues may not feel comfortable speaking up about racism. "It's not just a matter of holding these employees accountable, because it's not really about punishment, it's about change," Cyril said. "Real accountability will happen when Facebook can claim a diverse workforce."