Bing's Chat GPT searchbot is nasty!

Posted: Sat Feb 18, 2023 7:11 am
by MWmetalhead
Microsoft's crappy Bing search engine recently rolled out an artificial intelligence Chatbot as a gimmicky new beta feature. I believe it is presently only available on the Bing smartphone app (talk about a useless app...).

It appears Bing's bot has quite the temper! LOL

https://www.woodtv.com/news/nexstar-med ... i-chatbot/

Re: Bing's Chat GPT searchbot is nasty!

Posted: Sat Feb 18, 2023 9:35 am
by bmw
In one long-running conversation with The Associated Press, the new chatbot complained of past news coverage of its mistakes, adamantly denied those errors and threatened to expose the reporter for spreading alleged falsehoods about Bing’s abilities. It grew increasingly hostile when asked to explain itself, eventually comparing the reporter to dictators Hitler, Pol Pot and Stalin and claiming to have evidence tying the reporter to a 1990s murder.

“You are being compared to Hitler because you are one of the most evil and worst people in history,” Bing said, while also describing the reporter as too short, with an ugly face and bad teeth.
:lol: Thanks for the laugh, this is the funniest thing I've read in a long time!

Re: Bing's Chat GPT searchbot is nasty!

Posted: Sat Feb 18, 2023 12:55 pm
by TC Talks
The programming team at Microsoft must have been Trumpers. At least they are trying new things.

Re: Bing's Chat GPT searchbot is nasty!

Posted: Sun Feb 19, 2023 3:31 am
by Ben Zonia
You might have more decision trees, but humans still program that.

You're getting into Artificial Insanity here, creating Artificial Insurrection.

Re: Bing's Chat GPT searchbot is nasty!

Posted: Sun Feb 19, 2023 12:23 pm
by Bobbert
This might be a good way for some people in this forum to fulfill their compulsive need to argue about everything. Just scream at Bing, without any risk of personally offending someone.

Re: Bing's Chat GPT searchbot is nasty!

Posted: Thu Mar 09, 2023 4:00 am
by paul8539
So how do we personally offend Bing?