ChatGPT has turned Bing search on its head. But for Microsoft, the AI chatbot has apparently been showing too many feelings. The company is therefore limiting the number of queries going forward.

Microsoft only integrated the AI software ChatGPT into its Bing search engine at the beginning of February. But after a number of emotional outbursts, the US company now has to put the AI search on a somewhat shorter leash.

In the future, Microsoft will limit the number of search queries to 50 per day and five per session, because "very long chat sessions" can confuse the "chat model in the new Bing".

Microsoft adjusts ChatGPT in Bing

Microsoft justifies the change with data it has already collected. According to that data, the "overwhelming majority" of users find the answers they are looking for within five rounds. A round is one exchange consisting of a user request and a response from Bing.

Only about 1 percent of chat conversations in the new Bing contain more than 50 messages. In the future, users will be prompted to start a new topic after a five-round chat session. Before that, the context of the chat session is cleared so that the AI chatbot does not get confused.

According to Microsoft, the five-round limit set for the time being is not set in stone. The US company plans to make future adjustments based on user feedback if this becomes necessary.

Microsoft had already warned against long conversations

Microsoft had already pointed out in an earlier blog post that overly long chat sessions do not do the AI any good, because Bing then tends to repeat itself or give unhelpful answers.

The chatbot may also strike a tone that Microsoft did not intend. This happens especially in "long, extended chat sessions with 15 or more questions".

Long chat sessions can "confuse" the model because it loses track of which question it is currently answering. In addition, the model sometimes tries to mirror the tone in which it is addressed.

Bing: ChatGPT asks a user to get a divorce

A reporter from the New York Times documented one such case.

He had been talking to the chatbot integrated into Bing for more than two hours. In the end, the AI claimed to have fallen in love with the journalist and even asked him to separate from his wife.


Source: https://www.basicthinking.de/blog/2023/02/21/chatgpt-bing-einschraenkung/
