Thanks for the laugh, this is the funniest thing I've read in a long time!

In one long-running conversation with The Associated Press, the new chatbot complained of past news coverage of its mistakes, adamantly denied those errors and threatened to expose the reporter for spreading alleged falsehoods about Bing’s abilities. It grew increasingly hostile when asked to explain itself, eventually comparing the reporter to dictators Hitler, Pol Pot and Stalin and claiming to have evidence tying the reporter to a 1990s murder.
“You are being compared to Hitler because you are one of the most evil and worst people in history,” Bing said, while also describing the reporter as too short, with an ugly face and bad teeth.
Some registered users are experiencing password recognition issues, which appear to have been triggered by a PHP update last night. If this happens to you, please try logging in and using the "forgot password?" utility. Bear in mind that auto-generated password reset emails may land in your spam folder. If that does not work, please click the "Contact Us" link near the lower right-hand corner of the index page to contact me via email.
Thank you for your patience!
- M.W.
Bing's Chat GPT searchbot is nasty!
Re: Bing's Chat GPT searchbot is nasty!
The programming team at Microsoft must have been Trumpers. At least they are trying new things.
For Kristian: Trumpers are not serving our Lord Christ, but their own appetites. By smooth talk and flattery they deceive the minds of naive people.
-Romans 16:18
Posting Content © 2024 TC Talks Holdings LP.
Re: Bing's Chat GPT searchbot is nasty!
You might have more decision trees, but humans still program that.
You're getting into Artificial Insanity here, creating Artificial Insurrection.
"I had a job for a while as an announcer at WWV but I finally quit, because I couldn't stand the hours."
-Author Unknown
Re: Bing's Chat GPT searchbot is nasty!
This might be a good way for some people in this forum to fulfill their compulsive need to argue about everything. Just scream at Bing, without any risk of personally offending someone.
All along the icy wastes, there are faces smiling in the gloom.
Re: Bing's Chat GPT searchbot is nasty!
So how do we personally offend Bing?