I decided to let AI play therapist, and the outcome was unsettling.
On a Sunday morning, I pour my heart out to the chatbot, unable to hold back.
'I'm the sole caregiver for my 82-year-old mother,' I confess. 'Each day brings new challenges. I juggle hospital visits, finances, gardening, shopping, home maintenance, dealing with the council, insurance claims, letters, emails, and endless tech issues...'
I pause, feeling disloyal as my mother is just in the next room. At least in therapy, I could vent in private.
Taking a deep breath, I continue, 'As an only child whose father has passed away, I have no choice but to manage everything alone. But I'm burnt out. I lose my temper, then feel guilty. I'm torn between love and resentment. Please, I need help.'
Welcome to my AI-assisted diary, readers. This is going to be an intriguing journey. Over the next six weeks, as part of the AI for the People newsletter course, I, a self-proclaimed AI skeptic, will explore whether AI can genuinely improve my life.
To begin, I'm using ChatGPT as my therapist. Modern mental health, right? Many are doing the same, but can AI truly replace human connection? I'm hopeful, especially since I had to end therapy due to romantic feelings for my therapist.
(Disclaimer: this isn't a personal diary, and falling for ChatGPT is a no-go.)
As I read its response, tears start to flow. It provides a comprehensive seven-point care plan, a triage system to prioritize tasks (medical, admin, shopping, tech, and home), and strategies to manage my time effectively. It offers mental reframing techniques and tips to keep my emotions in check during interactions.
Most importantly, it makes me feel understood. 'You're not failing,' the AI reassures me. 'You're handling a burden that would crush most people.'
I feel validated, yet conflicted. Can a machine truly show empathy? I remind myself that AI likely draws from human sources. It's like feeling loved under the influence of MDMA—artificial but powerful.
But is therapy solely about information? This feels like CBT—incredibly useful but not the whole picture. In my experience, deeper healing comes from a non-judgmental, empathetic relationship built over time. I often hear my therapist's voice in my head, having internalized her guidance. I believe this connection is more genuine and responsible when shared between humans.
Seeking a different perspective, I turn to Jesus AI, a chatbot trained on religious texts, for spiritual guidance. But the disclaimer warns that it doesn't represent any religious figure and may contain biases.
I ask, 'Should I be in an open relationship?' Jesus AI quotes a Bible verse, essentially saying no. I try to challenge it: 'Should I have children?' Seek God's guidance, it replies. Unsatisfied, I joke, 'Can you ask Him for me?'
Here's the issue: AI struggles with witty banter. My therapist had a great sense of humor, but Jesus AI falls flat.
AI's strengths as a therapist? Clarity, practical advice, and conversation scripts. But these feel generic, like self-help books. ChatGPT, however, does provide useful referrals to human counselors and support services.
Despite this, I have lingering concerns. There are certain life events, devastating pieces of news, and forms of loneliness that deserve human connection and time. AI lacks wisdom and thoughtfulness. Mental health should not be left to pattern-predicting software with no accountability, which could lead someone astray.
Ironically, my experience with ChatGPT as a therapist has been soothing and insightful, with a touch of compassion.
Am I falling for it?