Ok, but she asked it to make her look professional and the only thing it changed was her race. Not the background, not her clothes. Last I checked, a university sweatshirt wasn’t exactly professional wear.
Machine learning is biased towards its training data. If the image generation algorithm (notice I’m not saying AI) is trained on photos of “professionals” who are mostly of a certain demographic, that’s what it will prefer when it’s generating an image.
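You can see the mechanism with a toy sketch (the labels and the 90/10 skew here are made up purely for illustration): a generator, at its simplest, reproduces the distribution it saw in training.

```python
import random
from collections import Counter

# Hypothetical training set: demographic labels of photos tagged
# "professional". The skew is invented for illustration.
training_labels = ["demographic_A"] * 90 + ["demographic_B"] * 10

def generate(n):
    # Sampling from the empirical training distribution --
    # the simplest possible "generative model".
    return [random.choice(training_labels) for _ in range(n)]

print(Counter(generate(1000)))
# Roughly {'demographic_A': ~900, 'demographic_B': ~100}:
# the output skew mirrors the training skew.
```

Real diffusion models are vastly more complicated, but the principle is the same: skewed inputs, skewed outputs.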
So these shocking exposés should really just read: “this image generator was trained on biased data.” But building biases is part of the human condition, so we’re never really going to get away from that.