Systemic prejudices showing up in datasets, causing generative systems to spew biased output? Gasp… say it isn’t so.
I’m not sure why this surprises anyone anymore. This is literally expected behavior unless we get our shit together and get a grip on these systemic problems. Everything else is just patchwork and bandages.