Let’s say you’ve just posted a selfie. Or maybe you’ve shared a political opinion online or reacted to a piece of news. All are exceptionally common acts most of us do without even really thinking about it. Typically you’ll get a good reaction — positive reinforcement from friends — or maybe a spirited back-and-forth debate. Someone might disagree with you, which is fine. (They’re wrong, but it’s fine.) Unfortunately, that’s not always as bad as it gets. Insults about your appearance or hateful comments about your identity might start rolling in. “Log off, bitch.” “You’re ugly.” Or worse: “I know where you live.”
Sadly, online harassment has become an almost inescapable part of our daily lives. A Pew Research Center study published in July found that 41% of Americans had been harassed online, with 66% witnessing it happening to others. Eighteen percent said they’d been subjected to especially pernicious forms of abuse, like sustained harassment, stalking or threats of physical violence.
More often than not, that type of behavior is taking place on social media platforms. And while 79% of us, according to the Pew study, think online services have a duty to take action when harassment occurs, we all know it doesn’t always work that way. No matter how often you might report what seems to be clearly harassing behavior, it might not technically violate the terms of service of Twitter or Facebook, which can be byzantine and frustratingly literal. When that happens, it compounds the target’s sense of distress, a recent study from the University of Michigan found, because they’re essentially being told their trauma isn’t real.
There may not yet be a one-size-fits-all solution for ridding the internet of trolls and harassers, but in the meantime, groups like HeartMob hope to help by simply being there. HeartMob, an outgrowth of the anti-street harassment group Hollaback!, is a community meant to offer support, resources for documenting abuse and, perhaps most importantly, a sympathetic audience when someone might need it the most.
As a study by Hollaback! and the Worker Institute at Cornell University found, when someone experiences harassment on the street, something as simple as a sympathetic statement or glance from a bystander can have a positive influence on their experience. HeartMob brings that idea to the internet. The platforms might not care about your abuse, HeartMob says, but we do and we’re here to listen.
“It’s a platform where folks can report their experience with harassment and either leave it there, share their story or ... start to tap folks within their own network or in our network to help them out with that harassment,” HeartMob co-founder Emily May said in a phone interview.
In short, HeartMob is a social media network for people who need help or who want to help people in a time of emotional distress; a sort of bat-signal you can send for backup to arrive. That might mean getting help reporting a particularly nasty troll or compiling instances of abuse for later documentation, in case it’s being deleted too fast. It could also just mean people sharing messages of positive reinforcement.
A video on the site (seen above) explains exactly how HeartMob works. In it, a woman has just posted a selfie and soon begins seeing negative comments in her feed: messages saying that she’s ugly, she looks like a boy. She turns to HeartMob next, explaining what happened, selecting “picture harassment” from a drop-down menu with different types of abuse.
“People are bombarding me with threats in response to my selfie. I’m afraid some of them might be real,” she writes, choosing from a series of tags that apply: verbal harassment, hate speech, threats of violence and non-consensual porn. The incident is then headlined on the landing page under the category of abuse at hand, whether it’s racist, transphobic, ableist and so on. Soon thereafter, a series of supportive messages roll in.
It may not cut off the problem at its root, but it’s a lot better than going through it alone.
According to May, many of the users on HeartMob are being harassed by a single person, and they may or may not know who that person is.
“A lot of times, in those situations, folks are really aiming for the ability to share their story and get support, particularly in the form of reporting messages,” May said. “That’s the most common kind of report we get.”
People who are getting swarmed with abuse are rarer, she said, because those tend to be people with larger profiles, who naturally have people coming to their defense in the moment.
“A lot of who HeartMob ends up serving are people who don’t have people, a community, or whose abuse is happening in silence,” May said.
It is, by design, a safe space. And as such, HeartMob is selective about who it allows onto the site. You have to submit a social media account bearing your real name to be admitted, and applicants can be denied for any number of reasons. I was turned down on my first try. That’s because, May said, the site is an obvious target for trolls and harassers, and they don’t want it to be infiltrated.
Since HeartMob launched as a pilot in 2016, it has grown slowly and deliberately. To date, users have documented around 800 incidents of harassment, and a few thousand bystanders are engaged on the site.
It would be great, May said, if platforms took abuse more seriously, but that doesn’t seem to be happening any time soon.
An approach that targets platforms is a huge part of the solution for online harassment, May said. But waiting for companies or the government to create social change can be a slow and arduous process. “When it comes to forms of violence like online harassment, what we see [as having the] biggest impact is people having each other’s backs.”