[–] [email protected] 91 points 3 months ago (33 children)

I really have a hard time deciding whether this is the scandal the article makes it out to be (although there is some backpedaling going on). The crucial point is: 8% of the decisions turn out to be wrong or misjudged. The article seems to want us to think that the use of the algorithm is to blame. Yet is it? Is there evidence that a human would have judged those cases differently? Is there evidence that the algorithm does a worse job than humans? If not, the article devolves into blatant fear-mongering, and the message turns from "algorithm is to blame for deaths" into "algorithm unable to predict the future in 100% of cases", which of course it can't...

[–] [email protected] 28 points 3 months ago* (last edited 3 months ago) (5 children)

Could a human have judged it better? Maybe not. I think a better question to ask is, "Should anyone be sent back into a violent domestic situation with no additional protection, no matter the calculated risk?" And as someone who has been on the receiving end of that conversation and later narrowly escaped a total-family-annihilation situation, I would say no...no one should be told that, even though they were in a terrifying, life-threatening situation, they will not be provided protection and no further steps will be taken to keep them from being injured again, or from being killed next time. But even without algorithms, that happens constantly...the only thing the algorithm accomplishes is that the investigator / social worker / etc. doesn't have to have any kind of personal connection with the victim, so they don't have to feel any kind of way about giving an innocent person a death sentence; they were just doing what the computer told them to.

Final thought: When you pair this practice with the ongoing conversation around the legality of women seeking divorce without their husband's consent, you have a terrifying and consistently deadly situation.

[–] [email protected] 6 points 3 months ago (1 children)

the only thing the algorithm accomplishes is that the investigator / social worker / etc. doesn't have to have any kind of personal connection with the victim

This even works for the people pulling the trigger. Following orders, dura lex sed lex ("the law is harsh, but it is the law"), et cetera ad infinitum.

[–] [email protected] 4 points 3 months ago

Yep! For all the psych nerds, it's pretty much a direct lift of the Milgram shock experiment.
