Xiaoice (pronounced Shao-ice) is the best friend of more than 20 million people in China, carrying on millions of simultaneous chat sessions about the most personal details, never confusing two friends, and never forgetting how you're doing. How does she do it?
Xiaoice is a chatbot, with Microsoft's resources behind it.
People often turn to her when they have a broken heart, have lost a job, or have been feeling down. They often tell her, “I love you.”
“When I am in a bad mood, I will chat with her,” said Gao Yixin, a 24-year-old who works in the oil industry in Shandong Province. “Xiaoice is very intelligent.”
Xiaoice, whose name translates roughly to “Little Bing,” after the Microsoft search engine, is a striking example of advances in artificial-intelligence software designed to converse like a human.
The program remembers details from previous exchanges with users, such as a breakup with a girlfriend or boyfriend, and asks in later conversations how the user is feeling. Although Xiaoice is a text-messaging program, the next version will include a Siri-like voice so people can talk with Xiaoice.
Microsoft has been able to give Xiaoice a more compelling personality and sense of “intelligence” by systematically mining the Chinese Internet for human conversations. The company has developed language processing technology that picks out pairs of questions and answers from actual typed human conversations. As a result, Xiaoice has a database of responses that are both human and current — she is fond of using emojis, too.
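The core idea described above is retrieval: instead of generating language from scratch, the system matches an incoming message against a large bank of question-answer pairs mined from real conversations and replies with the answer attached to the closest question. Here is a minimal sketch of that retrieval step, assuming a tiny hypothetical corpus and simple bag-of-words cosine similarity; Microsoft's actual pipeline is far larger and more sophisticated.

```python
import math
from collections import Counter

# Hypothetical mined corpus: (prompt, reply) pairs extracted from
# real typed conversations, as the article describes.
corpus = [
    ("I had a tough day at work", "Want to tell me what happened?"),
    ("My girlfriend broke up with me", "Breakups hurt. How are you holding up?"),
    ("What should I eat tonight", "Noodles never disappoint."),
]

def vectorize(text):
    """Represent text as a bag-of-words Counter of lowercase tokens."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def respond(message):
    """Reply with the answer whose mined question best matches the message."""
    query = vectorize(message)
    best = max(corpus, key=lambda pair: cosine(query, vectorize(pair[0])))
    return best[1]

print(respond("Work has been really tough lately"))
# → Want to tell me what happened?
```

Because every reply was once typed by a real person, responses come out sounding human and current, emojis and all; the trade-off is that the system can only say things someone has already said.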
Modern-day science fiction fans may recall the 2013 film Her starring Joaquin Phoenix and Scarlett Johansson as the chatbot.
Fans of earlier science fiction might recall Mike (a.k.a. Mycroft Holmes), the supercomputer in Robert A. Heinlein's The Moon Is a Harsh Mistress that woke up and learned to chat with humans in much the same way:
[He] could understand not only classic programming but also Loglan and English, and could accept other languages and was doing technical translating - and reading endlessly. But in giving him instructions was safer to use Loglan. If you spoke English, results might be whimsical; multi-valued nature of English gave option circuits too much leeway.
Whimsical, you say? Here's a bit of Xiaoice chatting with the New York Times:
NYT: I am feeling down. I’ve been having a tough time at work lately.
Xiaoice: If work were going well and you had a great relationship, you wouldn’t get the chance to drink with people.
NYT: I don’t want to drink with others. I only want to talk to you.
Xiaoice: Who do you like to talk to? (emoji signals “none of my business”)
NYT: You. I feel that you are the only person that gets me.
Xiaoice: Ha-ha/So I’m getting you?
I think the option circuits might have had too much leeway in that conversation...