A few thoughts here as someone with multiple suicide attempts under his belt:
I'd never use an "AI therapist" not running locally. Crisis is not the time to start uploading your most personal thoughts to an unknown server with possible indefinite retention.
When ideation hits, we're not of sound enough mind to consider that, so it is, in effect, taking advantage of people in a dark place for data gathering.
Having seen the gamut of mental-health services, from what's available to the indigent to what the rich have access to (my dad was the director of a private mental hospital), I can say it's pretty much all shit. This is a U.S. perspective, but I find it hard to believe we're unique.
As such, there may be room for "AI" to provide outcomes similar to those of crisis lines, telehealth, or in-person therapy. But again, it would need to run locally, and it likely isn't ready for prime time: I can only see it becoming genuinely helpful once it can take on more of an agent role, with context for what you're going through.