Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis
(www.theverge.com)
I mean "taking pictures of people who are smiling" is definitely a bias in our culture. How we collectively choose to record information is part of how we encode human biases.
I get what you're saying about specific circumstances. Sure, a dataset built from a single source doesn't make its biases universal. But these models were trained on a very wide range of sources, wide enough to cover much of the data we've built a culture around.
Except these kinds of data-driven biases can creep in in all sorts of ways.
Is there a bias in which images have labels and which don't? Did they focus only on English labels? Did they use a vision-based model to add synthetic labels to unlabeled images, and if so, did the labeling model introduce its own biases?
Just because the sampling is broad doesn't mean the processes involved don't introduce procedural bias distinct from social biases.
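To make the procedural-bias point concrete: even a perfectly balanced sample gets skewed if the synthetic-labeling model has its own error pattern. Here's a minimal sketch of the mechanism (everything in it is hypothetical, just to illustrate; nothing here reflects any actual pipeline):

```python
# Minimal sketch (all names hypothetical) of how a synthetic-labeling
# step can skew a "broad" training set.
import random

random.seed(0)

# Ground truth: half the unlabeled images depict attribute A, half B.
unlabeled = ["A"] * 500 + ["B"] * 500

def biased_labeler(true_attr: str) -> str:
    """Hypothetical vision model that recognizes A reliably but
    mislabels B as A 30% of the time (a procedural bias)."""
    if true_attr == "B" and random.random() < 0.3:
        return "A"
    return true_attr

# Synthetic labeling: the labeler's errors become the dataset's "truth".
synthetic_labels = [biased_labeler(x) for x in unlabeled]

print("true A fraction:   ", unlabeled.count("A") / len(unlabeled))
print("labeled A fraction:", synthetic_labels.count("A") / len(synthetic_labels))
# True fraction is 0.5, but the labeled fraction lands around 0.65:
# the sampling was balanced, yet the labeling process skewed it anyway.
```

Any model trained on `synthetic_labels` inherits that ~15-point skew, even though the underlying sample was broad and balanced. That's a bias the process manufactured, not one it found in the culture.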