Bias in training data on display in a weird way

So I was working on this tabletop roleplaying game project, and for my own amusement I told two different video-generating AI models to generate
"a '90s toy commercial featuring boys and girls of different races in halloween costumes saying "I've got the urge to be a pirate" "ive got the urge to be a ninja!" or spy or whatever they are dressed as"
That's it, that's the exact prompt. The two models gave me very different products, but both had zero girls, and in both the pirate was a Black boy, the ninja an East Asian boy, and the spy a white boy. Makes perfect sense in hindsight, but I really didn't see it coming, and the most surprising part (for me) was the Black child as the pirate. Kind of arbitrary, but it must be reflecting something in the data. Anyway, I found that kind of enlightening, maybe you will too. Bye.

submitted by /u/Immediate_Tooth4437
