Last November, writer Lindy West reported a harrowing tweet hurled her way, showing an image of a Thomas the Tank Engine character and the words, "CHOO CHOO MOTHERFUCKER THE RAPE TRAIN'S ON ITS WAY, NEXT STOP YOU." In response, she wrote on the Daily Dot, Twitter notified her that it found the comment was "currently not violating the Twitter rules."
This is just one of countless examples of not only the horrendous harassment and abuse women especially face on social media, but the refusal of social media companies to address it. The organization Women, Action & the Media is working to tackle this pervasive problem.
The group released a report Wednesday about Twitter-based harassment, based on data the organization collected while granted the status of "authorized reporter" by Twitter in November. This status, according to the report, enabled third-party organizations to report on behalf of individuals and allowed Twitter to prioritize those reports differently.
The goal of the report is "to inspire constructive discourse as well as systemic and structural change to make Twitter and other platforms safer for all voices, especially those who are targeted the most — women," Jamia Wilson, the executive director for Women, Action & the Media, told Mic via email.
And the report certainly revealed findings worthy of conversation.
While those experiencing harassment may be routinely ignored by social media companies, WAM! found that this doesn't extend to other users of the platform. Bystanders and delegates (authorized agents of the person receiving harassment) composed the majority of reporters, the report found, which speaks to the concept behind HeartMob, a platform built to address online harassment by harnessing the power of bystander intervention.
According to the report, of the 317 genuine harassment reports submitted to Women, Action & the Media, hate speech and doxxing (releasing private information) were the most common incidents. Interestingly, 19% reported "other," which led the group to conclude that Twitter should better define what constitutes online harassment and abuse — and therefore increase accountability for more types of harassing behavior.
In 55% of the 161 cases of abuse WAM! submitted to Twitter, the social media platform deleted, suspended or warned the reported account. Given anecdotal experiences like West's — and even Twitter's own CEO admitting the company "sucks" at handling online abuse — this rate seems like an encouraging start to a more rigorous policy that doesn't tolerate abuse.
Twitter acted on requests by WAM! more frequently than it declined them, according to the report, even suspending or warning users who made threats of violence three times more often than it declined to act. Similarly, Twitter acted on more cases of hate speech than it declined, suggesting that the group's intervention was, overall, beneficial to those seeking recourse for the harassment they faced.
"While we appreciate the reforms Twitter has made, there are much-needed structural and policy changes that we urge Twitter to implement," Wilson told Mic.
To this end, the report proposes concrete solutions Twitter can implement, including expanding the ability for users to filter abusive mentions, updating the abuse reporting interface to account for the potential trauma targets experience and diversifying Twitter's leadership.
While this report centers on Twitter-based harassment, 17% of reports mentioned harassment also persisting on platforms such as Facebook, Instagram, Tumblr, Reddit and YouTube, Wilson told Mic, adding, "We believe our study can be useful for other platforms as well."