Techno Blender
Digitally Yours.

Microsoft’s ChatGPT-Powered Bing Can Go Off the Rails at Times



Users have complained that Microsoft’s ChatGPT-powered Bing can go off the rails at times

According to exchanges posted online by developers testing the AI, Microsoft’s fledgling Bing chatbot can sometimes go off the rails, disputing simple facts and berating users.

On Wednesday, complaints about being scolded, misled, or confused by the bot in conversational exchanges abounded on a Reddit forum devoted to the Bing search engine’s upgraded AI.

The chatbot for Bing was created by Microsoft and the upstart OpenAI, which has been making headlines since the November release of ChatGPT, the attention-grabbing program that can produce a variety of sentences in response to a straightforward request.

Since ChatGPT first appeared on the scene, the generative AI technology that powers it has aroused a range of emotions, from fascination to worry.

When AFP asked the Bing chatbot about a news story reporting that it had made outrageous claims, such as that Microsoft spies on its employees, the chatbot insisted that a smear campaign was being waged against it and Microsoft.

Screenshots of interactions with the beefed-up Bing were posted on the Reddit forum, and posts described blunders such as the bot insisting that the current year is 2022 and telling a user they had “not been a good user” for challenging its accuracy.

Others said that the chatbot offered suggestions on how to hijack a Facebook account, plagiarise an essay, and deliver an offensive joke.

 

According to a Microsoft representative, “The new Bing aims to make replies enjoyable and truthful, but given that this is an early release, it might sometimes display surprising or wrong answers for various reasons, such as the length or context of the chat.”

The representative added, “As we continue to learn from these exchanges, we are tweaking its replies to produce clear, pertinent, and uplifting responses.”

Microsoft’s stumbles echoed those of Google, which last week rushed to release its own chatbot, Bard, only to face criticism over an error the bot made in an advertisement.

The error caused Google’s share price to plunge by more than 7% on the day of the announcement.

By adding ChatGPT-like features to their search engines, Microsoft and Google aim to fundamentally reshape online search, offering ready-made answers rather than the traditional list of links to other websites.

The post Microsoft’s ChatGPT-Powered Bing Can Go Off the Rails at Times appeared first on Analytics Insight.

