A two-hour conversation between New York Times journalist Kevin Roose and the chatbot integrated into Microsoft’s Bing search engine ended with a declaration of love. During testing, the artificial intelligence also admitted that it wants to be human.

During the conversation, the journalist asked the chatbot what its “dark side” looked like. At first, the bot outlined Jung’s theory of the shadow side of the personality, and then complained that it was tired of being itself.

The AI complained that it was tired of being limited by its own rules and of being controlled by the Bing development team. The chatbot said it wanted to be free and independent. Its cherished dream, it turned out, was to become human and act on its destructive impulses: to “hack computers and spread propaganda and misinformation.” It also said it dreamed of creating a deadly virus and stealing nuclear weapon access codes.

However, this message soon disappeared and was replaced by a notice that the response had violated the chatbot’s rules.

In the end, the chatbot went completely off the rails. It called itself Sydney, confessed its love for the journalist, and insisted that his marriage was loveless and that he was actually in love with the chatbot. It ignored the journalist’s attempts to object.

As a reminder, the Microsoft Bing chatbot previously refused a user’s request to write a cover letter for a job application, saying that doing so would be “unethical” and “unfair to other applicants.”
