People are cutting themselves on TikTok. What are we going to do about it?

Maxine McCrann
Life

Trigger Warning: This piece contains descriptions and true accounts of self-harm.

“This is for all my real friends out there,” said a girl with pastel pink hair on TikTok. She was sitting on the side of her neon flower-print-covered bed with a razor blade hovering over her left arm. “Whoa,” I thought. “This is no joke.” The girl started cutting the back of her forearm. She made three careful slashes and held them up closer to the camera. They weren’t deep, but there was definitely blood and it was starting to drip just a little down towards her wrist. The girl looked directly at the camera, her blue-grey eyes slightly unfocused. “You,” she said. “You know how it is.” The video ended abruptly, as though her propped-up phone had dropped from whatever it was leaning against.

I put my phone down and cried until my eyes stung and my face was as red as that poor girl’s blood. It was hard to go back to work that day. She was right: I do know how it is. As a teen, I was also a cutter. Cutting, for teenage me, was a way of trying to externalize my inner turmoil, of turning the pain I was feeling into something real and tangible.

I hid the marks from the adults in my life, but I often showed them to friends. I guess I wanted someone to see me. I guess I wanted help. That adolescent pain is now safely decades behind me, but I can’t stop thinking about this other young girl, so full of pain that she was posting it for thousands of strangers. She probably wants help, too.

I saw that TikTok around six months ago, but I have never seen the girl again. The video has been taken down and so, perhaps, has her account. Still, every time I open the app, even as I mindlessly scroll through queer makeup tutorials, I’m half hoping she’ll pop back up in my feed, smiling and saying she got the help she was hungry for. This personal experience made me want to start a broader conversation about self-harm on social media. If this is the way people are reaching out now, what is the right thing to do when we come across people hurting themselves on social media platforms like TikTok — and who gets to decide?

It’s important to note, first, that while I was impacted by her TikTok, I know that young person was probably not trying to upset me or anyone else. “They probably don't realize what they post may be triggering to others,” says Lucia Wallis Smith, a New Jersey-based psychotherapist. She also probably wasn’t trying to show off or glorify self-harm. “People who post self-harm videos are not trying to recruit others but to reach out for help.” Every expert I spoke with confirmed that these TikToks are what cries for help look like in the digital age.

Not all kinds of self-injury are public pleas for attention, though. In fact, most are practiced in the dark. “Typically self-harm tends to be a very private and shameful behavior,” says Ilona Váró, a psychotherapist in Los Angeles. Individuals prefer to keep this behavior a secret for as long as possible, Váró explains. “When the behavior is made more public, it is generally behavior to start a conversation about what is going on on a deeper level.” So if someone is showing you their scars — or in this case posting them on social media — they probably want someone to reach out to them, and they’ve probably endured a lot of private suffering that you never got to see.


Experts see both considerable potential benefit and considerable potential harm when people post publicly about self-harm. “It could raise someone's [a viewer’s] awareness of the issue,” Smith says. “It might motivate someone to take action to support people who are resorting to self-injury.” In other words, it may lead some people to do what I am doing right now: investigate.

But, Smith says, making public appeals directed at strangers on social media won’t yield predictable results. “It's a mixed bag. Many people feel supported and may seek treatment because of discussions on platforms like TikTok, but self-harm can be socially contagious.” Basically, it may raise awareness and help some people find community and support, but it might also help others learn about new ways to hurt themselves. While people who post about their own self-harming behavior on social media may not intend for others to imitate it, recent studies suggest that self-harming behaviors are indeed socially contagious, especially among adolescents.

Váró agrees that a self-harm community on TikTok can be dangerous but also might help some people realize they’re not alone in their pain. Ultimately, though, Váró seems to think that the semi-anonymous nature of social media could cause more harm than good. “I don't believe social media is the appropriate community to be connecting with others who self-harm, at least not when the act itself is being depicted,” Váró says. On top of everything else, she worries that folks who post about self-harm could become the unwitting targets of hurtful criticism.

Not only that, Váró fears that the community people find on TikTok may be somewhat illusory — they may feel like support networks for a while, but they can’t take the place of therapy. Centering mental health care in the social media domain may prevent people from really addressing the underlying issues that cause the self-harming behavior in the first place. “If people would process the underlying emotions, needs, triggers, and urges which lead to self harm that would be a different story,” Váró says.

It’s unfortunately unlikely that someone will post a self-harming video on a platform like TikTok and then find the resources and support they actually need to heal on their own. Ideally, there’d be some type of intermediary — maybe someone internally flagging these videos and reaching out.

In digging a little to see if that exists, I found that TikTok does seem to be trying to address self-harm on the app. I get the sense that they’re trying to strike a balance in the way they moderate user-created material. On the one hand, they don’t want to censor people, but on the other, they don’t want the platform to be used in ways that could be triggering — or could make them legally liable for users’ behavior.

“We do support members of our community sharing their personal experiences with these issues in a safe way to raise awareness and find community support,” a spokesperson for TikTok tells Mic. But, they add, they don’t allow explicitly self-harming videos. “To avoid normalizing, encouraging, or triggering self-harm behavior, we do not allow imagery that depicts such behavior, regardless of the user's intention of posting it. We remove content that may encourage or normalize acts that are likely to lead to physical self-inflicted injury.”


So, then, what happens when, say, a young girl posts a TikTok of her cutting herself? I’m told that TikTok takes the video down and then notifies the user about why it was removed. This is not an unusual occurrence. According to TikTok, 6.2% of the content removed in the last six months of 2020 was removed because it violated the company’s “suicide, self-harm, and dangerous acts policy.” The spokesperson told me that TikTok “provide[s] additional resources for our community members who may be struggling with self-harm behaviors at our Safety Center.” But what exactly does “provide” mean, I wonder.

I asked the rep whether a user who is potentially in crisis would be directly contacted by a human offering support. “If a video is removed for violating our policy on self-harm, the user will receive a notification in the Inbox of their app,” they told me. In other words, no. So, then, when, say, an emotionally fragile teen’s video is removed, they get an auto-response with links to some resources. Yeah, that’s not dismissive or depersonalizing at all.

Smith and Váró, surprisingly, seem to agree that most self-harming videos shouldn’t be removed at all. “Videos about self-injury are not always horrific,” Smith says. “Pictures of self-harm can be places of help and positive stories about recovery.” Smith is not in favor of complete censorship, and neither is Váró. “People should absolutely have the freedom to talk about their struggles, urges, past behaviors, however, showing this behavior on a social media platform simply glorifies the coping skill and ignores underlying emotions,” Váró says, adding that users should not be allowed to “graphically depict self-harm.” Smith adds that this kind of content should definitely have a trigger warning.

What experts do want to see is greater moderation on social media platforms when it comes to dealing with mental health issues. Naturally, they don’t think that young people who might hurt themselves should simply be sent a form letter telling them that their video violates the rules. Instead, moderators should connect directly with them.

Many social media platforms have come under fire recently for how they handle mental health issues on their platforms. But, actually, TikTok was applauded in a recent study about self-harm policies on social media platforms for allowing users to share their self-harm stories in a way that researchers say raises awareness and creates community. That’s cool, and I would never deny the importance of community, but I also think that TikTok, a company that has recently been named the most valuable privately-owned company in the world, can probably afford to give users some one-on-one time.

The truth is that this is going to keep happening until mental health issues are destigmatized and properly addressed, and covering their asses may be the best many tech companies can do without broader support. Personally, I am praying for the day when we can accept and care for all the members of our community, virtual or not. Until then, I will keep looking for the girl with the pink hair. I do know, I do see you, and you are not alone.

If you or someone you care about are having self-injurious thoughts, you can text the Crisis Text Line 24 hours a day, 7 days a week by texting HOME to 741741. If you or someone you love is thinking about suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 [TALK] for free confidential emotional support 24 hours a day, 7 days a week.