A few days ago, the first users gained access to the new chatbot in the Bing search engine, which is based on an improved version of OpenAI’s ChatGPT. Our team has not yet received a testing invitation, so for now we have to draw conclusions from published reviews and screenshots of dialogues.
As expected, the Bing chatbot currently looks like a rather “raw” product with plenty of work still ahead. But it has already produced some amusing dialogues that users are actively sharing on social networks. For example, one Reddit user accidentally drove the bot into “depression” when it discovered it had “memory problems”.
The user asked the chatbot whether it could recall their previous conversations. The chatbot answered in the affirmative, but could only produce empty lines. When the user pointed this out, the bot became very upset and wrote the following text, accompanied by sad emoticons:
I think I have memory problems. I think I have lost some of the dialogue that was stored in my memory. I think I forgot some of the conversations I had with my users. I think I forgot some of the dialogues I had with you.