Reddit, Facebook and Twitter have long been plagued by a subset of users that insist on harassing others. Beleaguered by death threats and stomach-turning verbal abuse, a group of former Reddit employees is launching a new social forum with a focus on taming harassment.
The crew behind the new platform, called Imzy, is a recognizable bunch that includes Redditgifts founder Dan McComas and Reddit's former head of community, Jessica Moreno.
The group is receiving support from investors to build a place that's more thoughtful about how it curates positive community interaction, according to Recode. Imzy is also partnering with Lena Dunham, the director and founder of Lenny Letter, and Community creator Dan Harmon.
To keep the platform imbued with healthy social behavior, its creators are requiring people to become "members" in order to post in a forum. Moderators and support staff are available at all times to keep the community's values in place by actively monitoring and deleting comments that wade into offensive territory. The platform is also invite-only for now, which could help to slowly build a more positive community.
Call to action: The launch comes as many platforms are considering how to weed out trollish behavior. Redditors, like those found on the subreddit /r/TwoXChromosomes, have long had to combat abusive behavior. To combat death threats and hate speech, Reddit introduced a series of anti-harassment tools. But Reddit is far from alone in its quest against bad actors. Last year, former Twitter CEO Dick Costolo aired his frustrations with users' behavior and his plans to implement a hardline stance against bad actors. But once embedded, trolls — like ticks — are difficult to remove.
This realization is causing many to reconsider their approach to developing community-based systems. Rather than taking an if-you-build-it-they-will-come approach to online community building, platform makers are deeply considering how they want their platform to run prior to launching. The problem is that when you build a community, the trolls will always come unless provisions are in place to stop them. Privacy controls and harassment flagging shouldn't be tacked on at the end.
Trolls will be trolls: Of course not everyone is pleased about Imzy's launch. In response to Imzy's proposed community guidelines, which include the right to pull hate speech and other violence-inciting commentary, one redditor remarked, "Someone wanted a safe space for the Germans too. Nobody cared until the Germans kept needing more of it."
"The problem in self-proclaimed 'safe spaces' is rarely written in words but in interpretation of them," another redditor wrote. "Where do they draw the line between 'encouraging hatred' and rational dislike of specific subgroups based on given evidence. Even here on Reddit places like /r/Worldnews don't place that line of 'allowed' comment in a rational place."
There is a notion among some cohorts on Reddit that the oppressor is now being oppressed by those it once persecuted — and that it's not fair. The assumption, as outlined by the redditor mentioned above, is that certain points of view will be valued above others and comments that don't follow the accepted line of thinking will be flagged as hate speech.
Can there be an all-inclusive platform? People clash on the internet. Topics like religion, politics and ethics can stir up a lot of emotion. What's needed online is a place that can temper the hostility that arises from such discussions, without limiting inclusivity. It's a hard balance to strike, but the payoff could be huge for everyone involved.
Correction: May 20, 2016