I was using Bing to create a list of countries to visit. Since I have been to the majority of the African nations on that list, I asked it to remove the African countries…

It simply replied that it can’t do that due to how unethical it is to discriminate against people and yada yada yada. I explained my reasoning, it apologized, and came back with the exact same list.

I asked it to check the list since it hadn’t removed the African countries, and the bot simply decided to end the conversation. No matter how many times I tried, it would always hit a hiccup because of some ethical process in the background messing up its answers.

It’s really frustrating; I dunno if you guys feel the same. I really feel the bots have become waaaay too tip-toey.

1 point

If you think there is “censorship” happening, you don’t even know the basics of the tool you are using. And I don’t think you know just how much $$ and time go into creating these. Good luck on your open-source project.

-1 points

Ahh well let’s just give up, lay down and spread those cheeks…

