To keep conversations on track and the chatbot from going haywire, Microsoft has limited Bing’s AI chatbot to five responses per conversation.
According to a blog post, Bing Chat will now respond to up to five questions or statements in a row in each conversation before prompting users to start a new topic. In addition, users will be limited to 50 total replies per day. Microsoft says the change is needed to keep the bot’s behavior grounded and stop it from going off on tangents.
The restrictions are intended to keep conversations from becoming strange. “Very long” chat sessions, according to Microsoft, “can confuse the underlying chat model.” The company also apologized for the chatbot’s behavior and promised to take steps to avoid similar incidents in the future. The conversation cap went into effect on Friday, and Bing users now get a prompt to start a new topic once a limit is reached.
“At the end of each chat session, context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start,” according to the post.
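The limits described above amount to two counters: a per-session turn cap that resets when the user starts a new topic, and a daily reply cap that does not. The sketch below is purely illustrative, assuming a hypothetical `ChatLimiter` class; it is not Microsoft’s implementation, only a minimal model of the reported 5-turn and 50-reply limits.

```python
# Hypothetical sketch of the reported caps: 5 replies per session,
# 50 replies per day. Names and structure are assumptions for
# illustration, not Bing's actual code.

SESSION_TURN_LIMIT = 5
DAILY_REPLY_LIMIT = 50

class ChatLimiter:
    def __init__(self):
        self.session_turns = 0
        self.daily_replies = 0

    def allow_reply(self) -> bool:
        """True if the bot may answer one more turn."""
        return (self.session_turns < SESSION_TURN_LIMIT
                and self.daily_replies < DAILY_REPLY_LIMIT)

    def record_reply(self):
        self.session_turns += 1
        self.daily_replies += 1

    def new_topic(self):
        # Starting a new topic (the "broom icon") clears the session
        # context and its counter, but not the daily total.
        self.session_turns = 0

limiter = ChatLimiter()
for _ in range(5):
    limiter.record_reply()
print(limiter.allow_reply())   # session cap hit
limiter.new_topic()
print(limiter.allow_reply())   # fresh session, daily budget remains
```

After five replies the session cap blocks further answers until a new topic resets the session counter; the 50-reply daily budget keeps accumulating across sessions.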
The five-response limit is part of Microsoft’s larger effort to make its AI-powered chatbots more realistic and natural. The company has been collaborating with OpenAI to create more sophisticated chatbots that can interact with users in a more human-like manner.
The sources for this piece include an article in Ars Technica.