
September 30, 2025
Writers: Abby Alvarado, Kevin Miko Buac
Editor: Tobey Calayo
Researchers: Abby Alvarado
Creatives: Jia Moral, Ian Stephen Velez
Moderators: Tobey Calayo
Introduction
In the age where algorithms can listen, the question remains – can they truly understand?
We have reached an age where artificial intelligence is everywhere and used for everything. The AI trend has long since settled into our world, from social media portraits to restaurant waiter robots. Yet it was only around 2023 that AI garnered significant traction and became mainstream [1].
ChatGPT, specifically, is a popular AI chatbot that continuously develops and rolls out updates, letting users experience endless possibilities – and by “endless possibilities” we mean “infinite answers” to every question the user throws at it. This very fact started pulling people in like a good Netflix show – who wouldn’t want literally all the answers to all your questions at your fingertips? Be it an exam question, a grammar correction, or a relationship problem – AI’s got your back! And as AI continues to evolve, it is no surprise that people are beginning to rely on chatbots for something as personal as psychotherapy.
AI as Healers
Talking to a robot might have felt off to most people a decade ago, but it has become the norm today. These days, people turn to AI for the simplest of things: questions such as “What is the weather like today?”, “Can you help me write an email to a client?”, and “How do I make a matcha latte at home?” are common prompts typed by chatbot users. This only goes to show that AI is increasingly woven into people’s everyday lives and decisions. So when people say they talk to AI about their personal problems and whatnot, it no longer raises eyebrows.
A 2024 study indexed in the National Library of Medicine reported that about 28% of respondents had turned to AI for quick support and as a substitute for personal therapy [2]. A viral Reddit post by a ChatGPT user stated, “They don’t project their problems onto me. They don’t abuse their authority. They’re open to talking to me at 11 pm,” referring to the chatbot [3]. This makes it evident that the convenience and flexibility of AI play a great role in why people are drawn to it.
Another ChatGPT user claimed that AI helped them more than “15 years of therapy.” The user followed that although they had undergone both inpatient and outpatient care, what helped them most with their mental health was the daily chats they had with OpenAI’s language model [3].
There is no doubt that access to mental health care is difficult in most areas of the Philippines. There are roughly 1,700 licensed psychotherapists in the country, with only around 800 practicing, resulting in a ratio of about one therapist for every 125,000 Filipinos [4]. The demand far exceeds the supply, which pushes the fee for an hour of psychotherapy to anywhere from 1,000 to 4,500 pesos [5]. For a Filipino earning an average minimum wage of 363.90 pesos per day [6], it is no surprise that mental health needs get swept under the rug in favor of something cheaper – better yet, free. That is when AI chatbots enter the picture: for people who lack access to a therapist, whether due to financial barriers, time constraints, or reluctance to seek professional help, AI appears to be an appealing, low-cost alternative to traditional therapy.
AI as Harmers
Despite AI chatbots’ benefits, they remain a double-edged sword. Some people may experience short-term relief from AI-based support. Still, Cranston Warren, a clinical therapist at Loma Linda University Behavioral Health, cautions that relying on it for ongoing mental health care raises serious concerns and tends to offer only shallow benefits [7]. Warren gives three reasons to back up this claim:
First, AI’s lack of human touch. According to Warren, “AI doesn’t know when to push, back off, or simply hold space for someone.” While AI can provide general guidance and even simulate the structure of a therapy session, it falls short in the areas that matter most – emotional understanding and genuine compassion. A chatbot cannot read body language, ask thoughtful follow-up questions, or explore deeper layers of emotion the way a trained therapist can. This gap becomes even more concerning when dealing with clinically diagnosable conditions, where relying on AI poses significant risks. As Warren points out, mental health treatment requires a wide range of approaches: from solution-focused methods like Cognitive Behavioral Therapy (CBT) and Dialectical Behavior Therapy (DBT) to more reflective styles such as Psychotherapy or Person-Centered Therapy. Making those clinical judgments, however, is something AI simply cannot do [7].
Second, the tendency to misdiagnose and create a false sense of security. Because AI cannot provide a medical diagnosis, relying on it as a replacement for therapy in cases of serious mental health conditions can be dangerous. Conditions such as schizophrenia or bipolar disorder require close clinical oversight, particularly regarding medical management – something AI is not equipped to handle. Warren states, “For someone struggling with distorted thinking, such as catastrophizing or minimizing their struggles, AI may reinforce that perspective rather than correct it.” Even in milder cases, Warren warns of the danger of overdependence. Unlike a licensed therapist, AI cannot determine whether someone’s perception of reality is accurate, which can create a false sense of reassurance. In times of crisis, the limitations become even clearer: AI cannot recognize when a person needs urgent intervention, call emergency services, or step in to ensure safety. As Warren emphasizes, that vital human layer of protection is missing [7].
In relation to this, real-life incidents also highlight the risks of relying on AI for mental health. In 2023, a Belgian man ended his life after weeks of confiding in an AI chatbot about his eco-anxiety, with his widow later telling local media that without those conversations, “he would still be here.” More recently, in April 2024, a 35-year-old man in Florida who reportedly struggled with bipolar disorder and schizophrenia was fatally shot by police after believing that an entity named Juliet was trapped inside ChatGPT. According to his father, this delusion contributed to the tragic confrontation [8].
Lastly, the privacy concern. As Warren warns, “Your interaction with AI is not guaranteed to be private. Everything you feed into the model is being analyzed for data.” Unlike licensed therapists bound by laws like HIPAA, AI developers aren’t required to safeguard user information. This lack of regulation leaves room for privacy breaches, misuse of sensitive data, and unintended consequences for vulnerable users [7].
In summary, while AI offers convenience and accessibility, its role as a substitute for professional therapy is fraught with risks. From the lack of empathy and clinical judgment to concerns about privacy, safety, and the potential for harmful outcomes, its disadvantages highlight limitations that cannot be ignored. AI can be a valuable tool for individuals facing minor mental health challenges, particularly when quick advice or stress-management support is needed. It may offer short-term, surface-level assistance for those experiencing mild depression, anxiety, or mood shifts: it can suggest coping strategies, track emotions and behaviors over time, provide structured exercises or journaling prompts, and even offer general encouragement between therapy sessions [7]. At best, however, AI may serve as a supplemental tool for support. It cannot replace the depth, responsibility, and human connection that trained therapists provide.
Algorithms may listen, respond, and even guide, but true understanding – the kind that heals – still belongs to human connection. If you or someone you know needs mental health consultation, kindly refer to our directory of mental health facilities, services, and organizations around the Philippines: https://mentalhealthph.org/directory/


Guide Questions:
- As AI chatbots like ChatGPT become more advanced, do you think people are turning to them out of convenience or out of desperation due to the lack of accessible mental health services?
- Can an AI algorithm truly understand human emotion or does it only simulate empathy?
- How might AI reshape how the next generation views mental health care?
References
[1] https://ourworldindata.org/brief-history-of-ai
[2] https://pmc.ncbi.nlm.nih.gov/articles/PMC11488652
[3] https://fortune.com/2025/06/01/ai-therapy-chatgpt-characterai-psychology-psychiatry/
[4] https://opinion.inquirer.net/164710/board-of-psychology-goes-overboard
[5] https://www.moneymax.ph/personal-finance/articles/cost-of-therapy-philippines
[6] https://tradingeconomics.com/philippines/minimum-wages
[7] https://news.llu.edu/health-wellness/can-i-use-ai-my-therapist-truth-about-turning-chatbots-therapy