this post was submitted on 20 Feb 2024
66 points (100.0% liked)
World News
Automated resume screening tools have always been harmful, and many companies have relied on them for years. They exist to filter applications at scale, but that creates a paradox: the same companies complain about a lack of qualified candidates after rejecting them all, which pushes those candidates to apply elsewhere and floods everyone's pipelines. If companies hired less-than-perfect candidates instead of being so trigger-happy with rejections, there would probably be far fewer applications to review in the first place, making automated screening tools less necessary.
The bias question is more relevant now that companies are using more complex AI models. I'm glad the article raised it, since it's genuinely difficult to quantify how biased a model is toward some groups and against others, and even harder to pin down where in the model that bias comes from.
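To make that measurement problem concrete: one standard group-fairness check is comparing selection rates across groups, the idea behind the US "four-fifths rule" for adverse impact. Here's a minimal sketch; the group names and screening outcomes below are entirely made up for illustration, and real audits use far more careful statistics than this:

```python
def selection_rates(outcomes):
    """outcomes: dict mapping group name -> list of booleans (passed screen?)."""
    return {group: sum(passed) / len(passed) for group, passed in outcomes.items()}

def disparate_impact_ratio(outcomes):
    """Ratio of the lowest group selection rate to the highest.
    By the four-fifths rule of thumb, values below 0.8 are conventionally
    treated as evidence of adverse impact."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

# Fabricated screening results: True = resume passed the automated filter.
results = {
    "group_a": [True, True, False, True, True, False, True, True],      # 6/8 = 0.75
    "group_b": [True, False, False, False, True, False, False, False],  # 2/8 = 0.25
}

ratio = disparate_impact_ratio(results)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.25 / 0.75 ≈ 0.33, well below 0.8
```

Even this toy metric only catches bias in the final accept/reject outcomes; it says nothing about where inside the model the bias originates, which is the harder question.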