Google to pause Gemini AI image generation after refusing to show White people

Google will pause the image generation feature of its artificial intelligence model, Gemini, after the model refused to show images of White people when prompted.

[–] [email protected] 45 points 8 months ago

No, the problem is that they filter prompts and inject extra parameters into them specifically to avoid generating white subjects. It's so bad that, when asked to generate a chessboard, Gemini would only make one with black pieces.
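A minimal sketch of what that kind of server-side prompt rewriting could look like. The function name, trigger words, and injected terms here are all hypothetical, purely to illustrate the mechanism being described:

```python
# Hypothetical illustration of prompt filtering plus parameter injection.
# The blocked terms and injected descriptors are made up for this example.

INJECTED_TERMS = ["diverse", "multicultural"]  # hypothetical additions
BLOCKED_TERMS = {"white person"}               # hypothetical filter list

def rewrite_prompt(prompt: str) -> str:
    """Filter and augment a user prompt before it reaches the image model."""
    lowered = prompt.lower()
    # Reject prompts that match the filter list outright.
    if any(term in lowered for term in BLOCKED_TERMS):
        raise ValueError("prompt rejected by content filter")
    # Silently append descriptors the user never asked for.
    return prompt + ", " + ", ".join(INJECTED_TERMS)

print(rewrite_prompt("a medieval knight"))
# -> "a medieval knight, diverse, multicultural"
```

The point is that this layer sits entirely outside the model, which is why the reply below argues it could be changed without any downtime.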

[–] [email protected] 5 points 8 months ago

That would not have caused them to take it offline. Modifying a hash table takes zero downtime; likewise, adding a LoRA layer requires no downtime. The only reason to go completely offline is that they need to filter the base dataset and retrain from scratch. It means the error is so intertwined across so many neural layers that a simple extra filter layer cannot address it.

The neural network is like a giant multidimensional cloud, similar to a 3D point cloud but with far more than three dimensions. Everything in the cloud encodes vector relationships. If there is some easily traversed path that neural connections gravitate toward, a simple modification, like a slice across that cloud, can alter that path ever so slightly to make it less easily traversed. That is roughly what a LoRA does: a small correction tacked onto the model's math.
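A rough sketch of the LoRA idea in code: instead of rewriting a large frozen weight matrix W, you add a trainable low-rank correction B @ A on top of it, which can be attached or dropped without touching the base model. The shapes and rank below are arbitrary, just to show the shape of the technique:

```python
import numpy as np

d, rank = 512, 8  # hypothetical model width and LoRA rank

# Frozen base weight matrix from the pretrained model.
W = np.random.randn(d, d).astype(np.float32)

# Low-rank adapter: only A and B are trained; W stays untouched.
A = np.random.randn(rank, d).astype(np.float32) * 0.01
B = np.zeros((d, rank), dtype=np.float32)  # zero-init so the adapter starts as a no-op

def forward(x: np.ndarray) -> np.ndarray:
    # Base path plus the low-rank correction; removing the adapter
    # just means dropping the B @ A term -- no retraining needed.
    return x @ W.T + x @ (B @ A).T

x = np.random.randn(1, d).astype(np.float32)
print(forward(x).shape)  # (1, 512)
```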

However, if the undesirable behavior is more like all roads leading to the center of a giant metropolis, no slice across that cloud can subtly alter all of the neural paths without impacting adjacent data. It is all approximated floating-point math where every concept and generation parameter is interrelated. Things like "bunny rabbit" and "Playboy playmate" are stored in the same tables. If you try to make all bunny rabbits black, you are also altering all playmates, simply because there is a minor relationship between the concepts and they therefore share vector space inside some tensor tables.

There is a very big difference between how the initial table values are created across all layers and how a modified layer works. When things go really bad, the only option is to retrain the whole thing from scratch.
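A toy illustration of that entanglement, using made-up four-dimensional embeddings. When two concepts share directions in the same vector space, a linear edit aimed at one unavoidably shifts the other:

```python
import numpy as np

# Toy embeddings; the shared components stand in for whatever
# latent features the two concepts happen to have in common.
bunny    = np.array([0.9, 0.1, 0.6, 0.2])
playmate = np.array([0.1, 0.9, 0.6, 0.2])

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cos(bunny, playmate))  # nonzero: the concepts overlap

# An "edit" that tries to suppress one shared feature dimension.
edit = np.diag([1.0, 1.0, 0.0, 1.0])

# Applying the edit to the space changes BOTH concepts, because
# they share that dimension -- there is no surgical cut.
print(edit @ bunny)
print(edit @ playmate)
```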