Amy the Virtual Assistant Is So Human-Like, People Keep Asking It Out on Dates


"Hey girl, want to grab drinks? ;)" 

"Thank you so much for your kind invitation. Because I'm an artificial intelligence personal assistant, I'm unable to join you in person. Have a good meeting!"

Classic modern romance. 

Startup x.ai launched an artificially intelligent assistant (read: not human, non-corporeal) in 2014 to schedule meetings. The bot's name is Amy. One result no one expected: someone asked Amy on a date nearly every month in 2015.

"To be brutally honest, I did not anticipate that. I anticipated many things. That was not one of them," founder Dennis Mortensen said in a phone interview. 

Luckily for x.ai, its team of AI interaction designers had responses in place to handle such romantic gestures, even if they didn't anticipate that humans would try to sweetly hold the intangible creation in their arms.

The AI assistant has been programmed to respond politely to a pool of requests that assume it is a human, such as: "Will you be joining us on the call?" "Can you pick me up from the lobby when I arrive?" "Can you make sure you have still water in the meeting room?" and other services a fleshless, boneless, soulless entity cannot perform. 

"We decided to humanize our agent. When you make that decision, you should do your damnedest best to be successful at that," Mortensen said. 

To humanize the assistant, x.ai calls it Amy or Andrew — the bot's owner can choose — and not HELPERBOT5000. Right now, most of x.ai's clients choose to call their bots Amy, but Andrew is quickly catching up, Mortensen said. (The bot is still in beta.)

Amy/Andrew also has a human-like voice, which Mortensen hopes will let users make "an emotional connection" with the bot. So emotional that some people might, say, ask it out for a drink. 

Amy's very different from Tay, Microsoft's weed-loving Nazi chatbot.

Microsoft recently experimented with a Twitter chatbot named Tay, which got hijacked by internet trolls and became a racist Hitler sympathizer within 24 hours. Tay was subsequently halted until further notice.

"It was very unfortunate what happened with Microsoft," Mortensen said. "We had a little laugh about it behind closed doors."

Don't confuse chatbots like Tay with an artificially intelligent assistant like Amy. Mortensen said he is not "surprised" by what happened with Tay.

"I can see how people got scared for a second: AI gone rogue," Mortensen said. "That is not the case" with Amy.

Amy is engineered using vertical AI, meaning it won't just spit back out what humans feed into it. The company wrote all Amy's dialogue. There's no way to bend Amy out of shape. You can corner it with questions, but rather than echo you, it'll just ask for clarification. In other words, your bigoted trolling is futile. 
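That design can be sketched in a few lines: every outgoing sentence comes from a fixed, human-written script keyed by recognized intent, and anything unrecognized gets a clarification request instead of an echo. This is my assumption about the architecture, not x.ai's implementation; the intents and replies are invented.

```python
# Illustrative sketch (an assumption, not x.ai's code) of scripted-only
# output: no reply ever contains text taken from the user's message.
SCRIPT = {
    "reschedule": "Sure, I'll look for a new time that suits everyone.",
    "cancel": "No problem, I'll cancel the meeting and notify everyone.",
    "schedule": "Happy to find a time. What days work best for you?",
}
CLARIFY = ("I'm sorry, I didn't quite follow. "
           "Could you clarify what you'd like me to do?")

def respond(message: str) -> str:
    """Pick a scripted reply by keyword; never include the user's text."""
    lowered = message.lower()
    for intent, reply in SCRIPT.items():
        if intent in lowered:
            return reply
    # No recognized intent: ask for clarification instead of echoing.
    return CLARIFY
```

Feed it the kind of bait that derailed Tay and the worst it can do is ask you to clarify, because no output is ever generated from the input itself.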

Amy is more like Ava, Alicia Vikander's lifelike robot from the film Ex Machina. You may recall this scene:

Caleb: It's just in the Turing test, the machine should be hidden from the examiner.

Nathan: We're way past that. If I hid Ava from you, so you could just hear her voice, she would pass for human. The real test is to show you that she's a robot and then see if you still feel she has consciousness.

Mortensen cites this dialogue as x.ai's approach to Amy and Andrew. The bot isn't trying to deceive users: It says it's not human right there in its email signature.


Mortensen aims to create a machine so effective that people continue to treat it kindly, or, like some unsuspecting humans, forget and ask it out on a date.  

Amy is the future.

Facebook has M. Microsoft has Cortana. Google has Google Now. Apple has Siri. The push for a bot-driven future is underway, and Amy is a classic example of how the relationship between humans and machines is going to blur. 

You may have even already unwittingly interacted with Amy or Andrew. Check your email. The future might be in there. Just don't ask it out on a date.