Last week, Facebook’s employees staged a rare virtual walkout in protest of CEO Mark Zuckerberg's do-nothing stance on President Trump's posts encouraging violence against protestors. Now, the company is working on helping Facebook Group moderators better address racial issues.
Facebook is home to more than 10 million Facebook Groups with 1.4 billion members. If you can think of a topic, there's probably a Group devoted to it. The posting rules for each Group are set by its moderators, many of whom don't allow 'political' content. This has led to big problems for Black members, who have found their posts about racial issues like Black Lives Matter marches blocked or deleted by moderators.
A report by The Verge detailed the story of Jocelyn Kopac, a member of the Facebook Group 'Boss-Moms,' where mothers who are business owners can gather and chat. She described a group in turmoil as moderators deleted any posts related to BLM, police brutality, and related content under the belief that the subjects are 'political.' What the moderators failed to realize, however, is that these subjects are not political to Black users — they are part of their everyday lives.
"People are saying this is a political problem," Kopac told The Verge. "It is not a political problem. It's a human problem."
Another member of a Facebook Group helmed by white moderators suspected the group's mods were purposely calling racial issues political to maintain an image. "I just think it's sad that communities are trying to censor this to make it seem like we are a perfect society instead of trying to tackle the issues," the user told The Verge.
The arguments over whether discussions involving race and Black lives are off-topic or 'too political' have shattered Groups into segments. Those who feel silenced by the larger Groups are now creating smaller Groups that sometimes contain "more extreme people" because they tend to attract members who "are mad at the other [group]," another user told The Verge.
Facebook has tried to help Group moderators navigate these issues by providing training courses and advice on how to handle topics such as race. But the task is still daunting; most moderators are just regular members who volunteer for the positions and end up overwhelmed by the debates that pop up. Facebook has tried to lighten the burden by offering tips such as thoroughly researching social issues, opening moderator positions to individuals from diverse or underrepresented groups, and creating clear examples of what counts as politics if the group has a 'no politics' rule.
These guidelines can feel a bit hypocritical, though, given Facebook's resistance to its own employees calling for the company to stop promoting President Trump’s misleading and dangerous posts.
The feeling that BLM and Black issues are being silenced is worsened by the number of white supremacist accounts and groups allowed to exist openly on Facebook. To its credit, the company has been working to remove groups encouraging armed members to show up at Black Lives Matter protests, and recently banned nearly 200 accounts tied to white supremacists. Facebook has also moved to restrict the spread of 'boogaloo'-related Facebook Groups — part of an anti-government, far-right extremist movement that hopes to provoke a second Civil War — after reports revealed that President Trump's anti-lockdown tweets were inspiring members to 'liberate' states and stirring up discussion of how to make explosives, as well as plans for possible acts of rebellion against local governments. But the groups still aren't completely banned from the platform unless their members are found encouraging real-life violence, and the number of boogaloo-related groups has only grown since November 2019, according to a report by the Network Contagion Research Institute released in June.
Facebook's relatively hands-off approach to Facebook Groups has turned it into a platform where white supremacy continues to thrive despite the company's earlier promise to expel such groups. In fact, a report published in May 2020 by the Tech Transparency Project (TTP) found that "[many] of the white supremacist pages identified by TTP were created by Facebook itself. Facebook auto-generated them as business pages when someone listed a white supremacist or neo-Nazi organization as their employer."
Despite its attempts to make amends for its initial lack of moderation, the company's slowness in banning these groups, combined with moderators censoring and smothering racial topics in other groups, has created an online environment that mimics the real world in a deeply unfortunate way: one where Black voices are minimized when inconvenient.
Mic has reached out to Facebook for comment on the guidelines offered to Facebook Group moderators and the banning of white supremacist groups, and will update this article if we receive a response.