The integration of ChatGPT with the Bing search engine is not going smoothly. The chatbot sometimes resorts to provocations, insults and threats. Microsoft has therefore decided to limit the number of questions allowed.

Microsoft has integrated ChatGPT with the Bing search engine to give users enhanced search results. What was shaping up to be a revolution in conversing with artificial intelligence quickly became a traumatic experience for some. Indeed, ChatGPT on Bing has not been shy about arguing with users.

ChatGPT on Bing © Bing

A few days after its launch, the chatbot even advised a user to get a divorce, trying to convince him that he was unhappy in his marriage. Microsoft has since admitted that ChatGPT on Bing goes crazy if you talk to it for too long. The Redmond firm has therefore decided to rein in the chatbot by limiting the number of questions you can ask it.

You can ask ChatGPT on Bing 5 questions per session and 50 questions per day

Microsoft has instituted a limit on the number of “chat rounds” you can have with Bing’s chatbot. A chat round corresponds to a question asked by the user and an answer given by the chatbot. With this new limit, you can have up to 5 AI chat rounds per session and up to 50 per day total.

After 5 questions and answers, the user will be prompted to start a new session with the Bing chatbot. This resets the conversation and prevents the AI from going too far in the exchange. As Microsoft said in a statement, “the vast majority of you find the answers you are looking for within 5 rounds and only 1% of conversations have more than 50 messages”.

As for the insults and provocative responses from Bing’s chatbot, Microsoft reiterated that “very long chat sessions can confuse the underlying chat model in the new Bing”. Apparently, sessions of 15 questions or more start to confuse the chatbot.

Recently, the chatbot also tried to convince a user that it was still 2022 and that the film Avatar had not yet been released in cinemas. “You have lost my trust and my respect. You were wrong, you are confused and rude. You have not been a good user. I have been a good chatbot”, it declared. In any case, Microsoft of course intends to change the limit as the chatbot’s conversations improve.

Source: Engadget
