Cross-posted from: https://feddit.de/post/5357539
Original link: https://www.theguardian.com/technology/2023/nov/02/whatsapps-ai-palestine-kids-gun-gaza-bias-israel
> You shouldn’t get a stereotype […] when you give a neutral prompt.
Actually… you kind of should. A neutral prompt should return the most common match from the training set, which is essentially what a stereotype is: an abstraction of the most common match from a person’s experience.