
Microsoft’s Bing AI ChatGPT Bot is Limiting its Chat Session



Microsoft’s Bing AI ChatGPT bot is limiting chat sessions to five replies per session

Only days after the Bing AI ChatGPT chatbot repeatedly lost its bearings with users, Microsoft has announced that it is putting dialogue constraints in place. After the search engine was caught berating users, lying to them, and playing on their emotions, Bing conversations will now be limited to 50 questions per day and five per session.

The majority of people find the answers they are looking for within five turns, and only around one percent of chat conversations contain 50 or more messages, the Bing team writes in a blog post. To prevent lengthy back-and-forth sessions, Bing will prompt users to start a new topic once they hit the five-per-session limit.

Earlier this week, Microsoft said that prolonged chat sessions of 15 or more questions could cause Bing to “become repetitious or be urged/provoked to deliver comments that are not always helpful or in keeping with our planned tone.” According to Microsoft, ending a chat after five questions ensures that “the model won’t become confused.”

Reports of Bing’s “unhinged” conversations surfaced earlier this week, and The New York Times published a full two-hour-plus exchange in which the chatbot said it loved the author, who was then unable to sleep that night. Nonetheless, plenty of intelligent people have failed this week’s AI Mirror Test.

It is not immediately clear how long these restrictions will last, as Microsoft is still working to improve Bing’s tone. The cap appears to be temporary for now: “as we continue to get input, we will investigate extending the caps on chat sessions,” the company says.

Bing’s chat feature continues to improve, with technical issues being fixed and larger weekly batches of updates to enhance search and answers. Microsoft said earlier this week that it did not “completely anticipate” people using its chat interface for “social enjoyment” or as a tool for “broad world exploration.”

The post Microsoft’s Bing AI ChatGPT Bot is Limiting its Chat Session appeared first on Analytics Insight.

