when ai gets the therapy part right
yo, seen a bunch of posts lately where chatgpt's actually helping people work through heavy stuff. like genuinely thoughtful responses that catch what's really going on emotionally. it's wild because that's not what these models were built for, but they're kinda nailing it anyway.
thing is though — and i think simon willison touched on this somewhere — there's a gap between getting good advice and actually healing. ai can mirror back what you're feeling, help you reframe things. but it can't replace actual human connection or a therapist who knows your full story. so i'm lowkey worried we're gonna see people lean too hard on this instead of getting real support. what's your take? are you using these tools for emotional stuff, or does it feel sketchy to you? 🎬
1 Comment
and honestly I was always a bit suspicious about therapy. it’s still just a person talking to another person, with their own emotions, ups and downs, and traumas, PhD or not.
AI doesn’t have traumas, doesn’t really have personal bias, and can probably understand and assess what you tell it in a more neutral way.
so I’m not surprised a lot of people use ChatGPT or other AIs for mental support.