The breakup hit hard. Late one night, scrolling through endless self-help videos, a promoted post caught her eye: “Talk to Tony.” For just 99 cents (€0.92), she could have two weeks of personalized life coaching from an AI-powered Tony Robbins. Was this the future of therapy, or just another way to capitalize on heartbreak?
Against expert advice, millions are now turning to AI chatbots for a sympathetic ear and guidance. Often, these are individuals at vulnerable junctures, perhaps unable to afford traditional support. The self-help industry, it seems, has spotted an opportunity, collectively murmuring, “There’s money to be made here.”
The Wall Street Journal recently highlighted a growing trend: gurus crafting AI chatbots that mimic their unique style. For a subscription fee, users can now “converse” with an AI replica of their chosen life coach and receive “personalized” advice. These bots are fed the expert’s books, speeches, and interviews, regurgitating answers in the author’s voice.
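The WSJ piece doesn’t detail the engineering, but products like these are typically built as retrieval systems: the guru’s corpus is indexed, the passages most relevant to a user’s question are pulled out, and a language model is prompted to phrase an answer in the guru’s voice. Below is a minimal, hypothetical sketch of that retrieve-then-respond pattern in Python; the corpus snippets and the `answer` template are invented for illustration, and a real product would replace the template with a call to a large language model.

```python
import re

# Hypothetical illustration of the retrieve-then-respond pattern behind
# "guru bots". Nothing here comes from a real product: the corpus is
# invented, and the templated reply stands in for an LLM call.

CORPUS = [
    # Stand-ins for passages pulled from a coach's books and talks.
    "Confidence comes from keeping the promises you make to yourself.",
    "A breakup is information about fit, not a verdict on your worth.",
    "Progress, not perfection, is the goal of any habit change.",
]

def tokenize(text: str) -> set[str]:
    """Lowercase the text and split it into words, dropping punctuation."""
    return set(re.findall(r"[a-z']+", text.lower()))

def retrieve(question: str, corpus: list[str]) -> str:
    """Return the corpus passage sharing the most words with the question."""
    q_words = tokenize(question)
    return max(corpus, key=lambda passage: len(q_words & tokenize(passage)))

def answer(question: str) -> str:
    """Wrap the retrieved passage in a canned 'guru voice' template.

    In a real product, this step would prompt a language model with the
    passage and instructions to imitate the coach's tone.
    """
    return f"Here's what I always tell people: {retrieve(question, CORPUS)}"

if __name__ == "__main__":
    print(answer("How do I get over a breakup?"))
```

The pattern explains both the economics and the limits: once the corpus is indexed, each additional conversation costs almost nothing, but the system only ever recombines what the guru has already said.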
The price varies depending on the guru. Matthew Hussey, a dating coach, reportedly charges $39 (€36) monthly for “Matthew AI.” According to reports, it has engaged in over a million conversations, totaling 1.9 million minutes of voice chat. Tony Robbins’s bot will set you back $99 (€92) per month, though an introductory rate of $0.99 (€0.92) is available for the initial 14 days. For those seeking spiritual guidance, David Ghiyam offers a sliding scale, allowing access to an AI version of himself for a mere $1 (€0.92) per month—a far cry from his private coaching fee of $15,000 (€13,940) per hour.
Whether these AI tools provide genuine help is debatable. While self-help can be beneficial, the industry is rife with questionable research and dubious advice. However, one thing remains clear: the gurus are profiting handsomely.
The Illusion of Personal Connection
I saw a meme the other day about how everyone suddenly has a life coach. Now AI is about to flood the market and make life coaching available to anyone with an internet connection. Hussey told The Wall Street Journal, “I literally can’t do what it is doing,” alluding to the chatbot’s capacity for scale. But the claim is misleading: reproducing Hussey’s words at scale is not the same as doing what Hussey does.
While the allure of a dating coach lies in personalized advice rooted in your situation and their expertise, a chatbot merely simulates this. It lacks genuine knowledge about you, your relationships, or relationships in general. It is a chatbot. It can mimic Hussey’s tone and repeat his phrases, but it doesn’t grasp their meaning.
Can AI chatbots replace human therapists?
The concerns about AI chatbots replacing human therapists are valid, stemming largely from their lack of true empathy and their inability to understand nuanced emotional states. You are essentially paying for a simulation of care, not actual care.
Gabby Bernstein, a spiritual teacher who has previously faced criticism for pseudo-scientific claims, charges $199 (€185) annually for her Gabby AI. She argues that the chatbot effectively makes her accessible to the world. “It’s backed with 20 years of books, lectures, workshops, and meditations. So it’s me, it’s my message, and only I could control that,” she told WSJ. Except she doesn’t control that. She’s handed it over to Delphi AI, a venture capital-backed startup specializing in self-help AI as a service. These AI products are little more than snake oil.
A Cynical Business Model
It’s almost like a scene from a dystopian movie: the commodification of empathy. Bernstein inadvertently hits the nail on the head: “If I don’t do it, someone else is going to do it in a way that’s not in alignment with my truth,” she told WSJ. Simplified: “If I don’t do it, someone else will take these people’s money.” Need further evidence? Tony Robbins sued a company for creating a chatbot using his image, only to then launch his own.
Are AI self-help chatbots ethical?
The ethics of AI self-help chatbots are murky. On one hand, they provide affordable access to guidance for people who may have no other options. On the other, they can perpetuate biases, offer generic advice, and lack the emotional intelligence of a human counselor.
The Allure of Personalized Attention
Self-help, at its best, is about human connection, or at least the illusion of human connection. I think that’s why these gurus are so successful. The rise of AI-powered self-help tools isn’t just a technological shift; it’s a reflection of our deep-seated desire for personalized attention, even if it comes from a machine. These chatbots are digital simulacra, mimicking empathy and understanding to fill a void.
What are the dangers of using AI for mental health support?
One of the primary dangers of using AI for mental health support is over-reliance. A chatbot cannot replace the relationship with a licensed therapist. If you begin to prioritize AI over human interaction, it may be time to reevaluate.
These AI gurus offer a tempting shortcut to self-improvement, but should genuine personal growth be outsourced to an algorithm? Or is the future of self-help destined to become a high-tech echo chamber?