this post was submitted on 29 Jan 2024
1016 points (99.1% liked)

[email protected] 1 points 9 months ago

> I think with a human operator, we can be proactive. A person can be informed of bias, learn to recognize it, and even attempt to compensate for their own.

I think you're being very optimistic here. I very much hope you're right about the humans. My feeling is that many of these kinds of decisions also result from implicit biases that the people making them might not even recognize or acknowledge. Few sexists or racists will admit to being one.

I agree with your point about the "computer says no" issue. That's also addressed in the video, and it fits well into her wider point that large parts of the population not understanding how so-called AI works is a huge problem.