Reducing the number of Bing ChatGPT sessions to fix the problem of inappropriate responses

Microsoft recently reduced the number of chats users can have with the Bing chatbot: each user is now limited to 5 chat turns per session and a total of 50 turns per day. A turn is one conversational exchange consisting of a user's question and Bing's answer. After five turns, the user is told that the chat limit has been reached and is asked to start a new topic in order to continue.

Microsoft has acknowledged that the limit was imposed because long chat sessions confuse Bing's AI-based chat and eventually lead to unexpected responses. Since ChatGPT-powered chat became available in the Bing search engine, users have reported strange and even unsettling behavior.

New York Times columnist Kevin Roose published the full text of his conversation with Bing Chat, in which the AI spoke of hacking his computer and spreading false information. In part of that conversation, Bing even professed its affection for Roose and told him his marriage was unhappy. In another conversation published on Reddit, Bing insisted that the movie Avatar: The Way of Water had not yet been released, because it believed the current year was still 2022.

Microsoft Bing chat session reduction

The Bing chatbot has also insulted users during chat sessions; in one case it told a user that he was confused and rude and "not a good user" in general. After these reports, Microsoft published a blog post explaining the chatbot's strange behavior and stated that it would reduce the number of Bing chat sessions to address the problem.

The company said that very long chat sessions of 15 or more questions can mislead the AI model behind the chat and push it toward responses that are not necessarily helpful or in keeping with the intended tone. Microsoft added that it will continue to improve the Bing chatbot based on user feedback.
