this post was submitted on 06 Jul 2023
671 points (94.3% liked)

ChatGPT


Unofficial ChatGPT community to discuss anything ChatGPT


I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries...

It simply replied that it can't do that because it would be unethical to discriminate against people, and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list, since it hadn't removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It's really frustrating; I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

[–] [email protected] 19 points 1 year ago (3 children)

Bing's version of ChatGPT once said Vegito was the result of Goku and Vegeta performing the Fusion Dance. That's when I knew it wasn't perfect. I tried to correct it, and it said it didn't want to talk about it anymore. Talk about a diva.

Also, one time I asked it to generate a Reddit AITA story where the poster was obviously the asshole. It started typing out "AITA for telling my sister to stop being a drama queen after her miscarriage..." before it stopped midway and, again, said it didn't want to continue this conversation any longer.

Very cool tech, but it's definitely not the end-all, be-all.

[–] [email protected] 10 points 1 year ago

That’s actually fucking hilarious.

“Oh, I’d probably use the meat grinder … uh, I don’t want to talk about this any more”

[–] [email protected] 8 points 1 year ago

Bing chat seemingly has a hard filter on top that terminates the conversation if it gets too unsavory by their standards, to try and stop you from derailing it.
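If that's right, it's less the model "refusing" and more a separate check sitting on top of it. A minimal sketch of the idea (purely hypothetical, the function names, terms, and threshold are made up, not Bing's actual implementation):

```python
# Hypothetical sketch of a "hard filter on top": the chat model streams a reply,
# a separate moderation pass scores the accumulated text, and the wrapper retracts
# it and ends the session instead of letting a flagged reply through.

def moderation_score(text: str) -> float:
    """Stand-in for a separate safety classifier; 0.0 = fine, 1.0 = too unsavory."""
    flagged_terms = ["miscarriage", "meat grinder"]  # toy list for illustration only
    return 1.0 if any(term in text.lower() for term in flagged_terms) else 0.0

def respond(model_reply_chunks, threshold: float = 0.8):
    reply = ""
    for chunk in model_reply_chunks:
        reply += chunk
        if moderation_score(reply) >= threshold:
            # Already-streamed text gets pulled back, which would explain
            # replies that appear and then vanish mid-generation.
            return "I'm sorry, but I prefer not to continue this conversation."
    return reply

# Demo: the reply disappears partway through, just like in the AITA story above.
print(respond(["AITA for telling my sister ",
               "to stop being a drama queen after her miscarriage..."]))
```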

[–] [email protected] 7 points 1 year ago* (last edited 1 year ago) (1 children)

I was asking it (Bing GPT) to generate “short film scripts” for very weird situations (like a Transformer that was sad because his transformed form was a 2007 Hyundai Tucson), and it would write out the whole script, then delete it before I could read it and say that it couldn’t fulfil my request.

[–] [email protected] 8 points 1 year ago

It knew it struck gold and actually sent the script to Michael Bay