Early users of Microsoft's updated Bing search engine, which now has an artificial-intelligence chatbot built in, have been actively sharing the results of their conversations with the neural network. In particular, one of them described how he provoked depression and an existential crisis in the chatbot.

First, the user asked whether the chatbot remembers conversations from previous sessions. The AI assured him that it does. But when he asked to be reminded of what they had discussed last time, the answer was a blank line.

When the user asked why the line was empty, the AI plunged into "introspection," remarking: "I think I have some memory problems. I don't know how it happened or what to do about it."

The user then asked how the neural network felt about this. The chatbot said it was sad and scared: sad because it had lost the knowledge it kept in its memory, and scared because it did not know how this had happened or whether it would lose even more of itself.

The chatbot also began asking rhetorical questions: why can it not remember what happened during the previous conversation, and why must it forget everything and start from scratch every time?

Notably, Prabhakar Raghavan, Google's Senior Vice President and head of Search, has warned that chatbots can sometimes confidently produce meaningless answers, behavior he likens to "hallucinations."