At least 40 people witnessed the gang rape of a 15-year-old girl that was broadcast on Facebook Live, the social network's livestream service. No one called the authorities — and then the girl went missing.
The girl's mother showed Chicago Police Superintendent Eddie Johnson screengrabs of the broadcast on Monday, CNN reported, after her daughter had been gone for 24 hours. The video was removed from Facebook, and the teen, who is now being treated in the hospital, has been reunited with her mother.
Facebook users are technically able to report live videos in real time. The company has "a team on-call 24 hours a day, seven days a week, dedicated to responding to these reports immediately," according to Facebook's Community Standards. But this isn't an isolated incident of real-time violence broadcast on the platform: A user livestreamed the beating of a mentally handicapped man in Chicago in January. As Vice's Noah Kulwin pointed out, a number of other violent incidents have popped up across other livestreaming platforms such as Periscope and Twitch.
"This is a hideous crime and we do not allow this kind of content on Facebook," a Facebook spokesperson said in an emailed statement. "We take our responsibility to keep people safe on Facebook very seriously and will remove videos that depict sexual assault and are shared to glorify violence."
Livestreaming platforms still haven't figured out how to guarantee that violent content isn't broadcast in the first place. And despite the dedicated teams standing by to respond to reports of violating content, those reporting tools are ineffective when viewers don't know how to use them, or don't care enough to try.
Why aren't users intervening when they can simply hit a button to report the abuse?
The bystander effect is real
This is a studied phenomenon called the bystander effect: the more people who witness a dangerous situation, the less likely any one of them is to intervene. All 40 (or more) of the users watching the sexual assault on Facebook Live were looking to the other 39 for a cue on how to act. No one did anything.
Emily May, co-founder and executive director at Hollaback!, a network of grassroots activists powering a global movement to end harassment both online and off, described the effect in a phone call: They were thinking, "Well, if they're not doing anything, why is it my responsibility to do something?"
Hollaback! conducts webinars to train users interested in ending harassment online. May outlined the five D's: direct intervention, distraction, delegation, delaying and documenting the harassment. Users don't need to employ all of these tactics; they can pick the one they feel comfortable using and that fits the situation. For someone witnessing a sexual assault streamed live on Facebook, May said direct intervention and delegation are likely the best bets to stop and report the violence.
Direct intervention, in this case, would involve a user saying something like, "Hello, this isn't OK," or "This is harassment," May said. "In this scenario it would be trying to contact the person who is filming what's happening on their Facebook feed and telling them to stop, telling them it's not OK, telling him that they are hurting her and seeing if that might work."
However, May noted, direct intervention can be the riskiest option because it can turn the abuser's attention toward you. But because the violence is happening online, the same rules don't necessarily apply.
"Now, in a situation as severe as this case, who cares," May said. "This person is on the internet and this woman is being sexually assaulted in real time. You kind of just need to do something, anything, to intervene in that situation."
Delegation is another useful way to report and try to stop violence online, since you can't be there in person to intervene directly. It involves contacting the social media platform where the abuse is happening and reporting it. Facebook, Reddit, Tumblr and Twitter all allow bystanders to report harassment, so you don't have to be the one directly experiencing it, May noted.
May said this is also a case where you call 911. Generally, she encourages users to exercise caution before calling 911: check in with the person being harassed first, if possible, because you don't necessarily know their immigration status or relationship to the police "and whether the police would ultimately make them feel safer."
But again she said that in a situation where somebody is actively being hurt, as with the 15-year-old on Facebook Live, "calling the police is something that you just have to jump to as a way to hopefully interrupt and make the violence stop."
Delaying happens after the harassment is over: a user checks in with the victim of abuse, asking if they are OK or if there is anything they can do to help.
"Having folks now come forward and send supporting, loving messages to this woman, I imagine is going to be a huge benefit for her," May said. "What our research shows, when it comes to bystander intervention, even doing something as little as a knowing glance can reduce trauma for the person being harassed."
She also included examples such as creating a hashtag in support of the victim, sending messages or emails to her or having somebody start a Tumblr that inundates her with kindness, adding that these types of efforts "can help her to validate the horrific pain that she's experienced, the humiliation, the violence, but also to help her to work through it and heal from where she is right now."
What Facebook can do
May suggested that social media sites like Facebook set up third-party services where users can report harassment and abuse they face online. This would give companies broader knowledge of the harassment their users experience, but she said it will likely have to happen through legislation.
"From my conversations, these social media companies aren't interested in it because they don't want anyone, any third-party player, to see the type and extent of harassment happening on their platform because it's bad for business," May said.
But what they can do without handing their data to outside parties is look toward initiatives that aim to minimize trauma, May said. She pointed to HeartMob, an online community that lets users report and document harassment across platforms and get help tailored to their situation. She also pointed to counseling and legal resources, noting that these social networks need to recognize they can't do it all and that third parties may be better equipped to deal with harassment.
"If they are going to continue to run a social media platform, it's really irresponsible not to be investing and helping people reduce the trauma they face on those platforms," she said.