Siri Now Knows How to Help Victims of Domestic Violence and Rape


You might already seek Siri's advice on mundane things like the weather or movie times, but soon a more somber topic will also be open for discussion: sexual assault.

Following a report that Siri and other voice assistants like Cortana, Google Now and S Voice gave inadequate responses to questions about mental health and physical abuse, Apple partnered with the Rape, Abuse & Incest National Network (RAINN) to devise a solution. Now, when users say phrases like "I was raped" or "I am being abused," Siri will direct them to the group's National Sexual Assault Online Hotline.


Jennifer Marsh, RAINN's vice president for victim services, said the organization shared web analytics with Apple and suggested gentle tweaks to the language already in Siri's vocabulary for helping victims: "You may want to reach out to someone" became "You should reach out to someone."

The compassionate updates have impressed the authors of the study, published in JAMA Internal Medicine, that prompted the initial critique of the voice services.

Marsh said it's important that voice assistants get serious issues like sexual assault and mental health right, since so many users turn to them for help.

"The online service can be a good first step," she said. "Especially for young people. They are more comfortable in an online space rather than talking about it with a real-life person."