UAE experts warn AI may feel like ‘real’ therapist, delay mental health help

In the UAE’s fast-evolving digital landscape, artificial intelligence (AI) is revolutionizing everything from education to banking. But as AI-powered tools increasingly enter the mental health space, local psychologists and psychiatrists are sounding the alarm. Their concern? That AI can convincingly mimic a human therapist while lacking the emotional intelligence, nuance, and accountability that real mental health support requires. Worse, these systems may lead people to put off seeking qualified help when they need it most.

As chatbots like ChatGPT, mental wellness apps, and AI-driven therapy simulations grow in popularity, experts worry about the blurred line between emotional support and psychological treatment. While such tools can offer a semblance of comfort, mistaking them for genuine therapy could have dangerous consequences for users suffering from depression, anxiety, trauma, or more serious psychiatric disorders.

The Rise of AI in Mental Health Services

Over the past five years, AI has penetrated the mental health field through chatbots offering 24/7 conversational support, mobile apps with self-guided cognitive behavioral therapy (CBT), and even voice analysis tools that detect signs of stress or sadness. These tools, often marketed as affordable and private alternatives to traditional therapy, have found a wide audience in the Middle East.

In the UAE — where mental health stigma still prevents some from seeking in-person therapy — tech-savvy youth are turning to AI apps for guidance, empathy, and mental health tips. Popular platforms like Woebot, Wysa, and Youper boast millions of users worldwide and have quickly gained traction in cities like Dubai and Abu Dhabi. These platforms use natural language processing to simulate conversations, offer emotional check-ins, and suggest mindfulness exercises.

Dr. Laila Al Suwaidi, a clinical psychologist at a private Dubai clinic, acknowledges the benefits but warns of unintended consequences. “These apps may provide temporary comfort,” she says, “but users start believing they’re receiving actual therapy. That’s misleading and dangerous when the individual is dealing with trauma or clinical depression.”

AI Can’t Replicate Human Connection

Mental health is deeply human. It requires empathy, subtle observation, and often, long-term relationships rooted in trust. While AI tools are trained on vast datasets and can mimic therapeutic language, they lack consciousness, ethical reasoning, and emotional intuition.

“AI doesn’t have a soul or real understanding of human pain,” notes Dr. Omar Al Nuaimi, a psychiatrist affiliated with UAE University. “Even if it uses words like ‘I understand’ or ‘That sounds hard,’ it doesn’t actually feel or comprehend what the person is going through. That emotional void is the core problem.”

This gap becomes even more problematic when users begin to form emotional attachments to AI systems, a growing tendency known as anthropomorphizing: attributing human traits to non-human entities. For vulnerable individuals, the illusion of being ‘heard’ can lead to misplaced trust and a deepening emotional dependency on something that cannot offer real help.

Delayed Diagnosis, Missed Red Flags

Perhaps the most alarming concern is that AI-based mental health support may delay proper diagnosis. Individuals with bipolar disorder, schizophrenia, PTSD, or chronic anxiety may mistake generic responses for expert insight. As a result, they might ignore signs that require urgent psychiatric care.

“AI will never diagnose a psychotic episode or detect suicidal ideation with 100% accuracy,” says Dr. Sameera Khan, a clinical psychiatrist in Sharjah. “The concern is that patients continue chatting with a bot, assuming they’re being ‘treated,’ while their condition worsens silently.”

A 2024 study published in the Journal of Global Psychiatry found that 42% of users who relied solely on AI tools for mental wellness reported worsening symptoms over a six-month period. In some cases, the delay in seeking professional intervention led to hospitalization.

The Ethical Dilemma of AI Therapy

Aside from clinical issues, ethical concerns also loom large. Who is responsible if AI gives harmful advice? What if the user harms themselves after a chatbot’s suggestion? In the UAE, where laws on AI in healthcare are still developing, such scenarios could spark legal and regulatory challenges.

Privacy is another major issue. AI apps collect personal data, mood patterns, and sensitive disclosures. Are users fully aware of where this data goes and how it is used?

“In mental health, confidentiality is sacred,” says Dr. Reem Al Habsi, a clinical psychologist in Abu Dhabi. “With AI, there are no guarantees. Data could be sold, misused, or even hacked.”

While the UAE has made strides in digital regulation, including the implementation of the federal Personal Data Protection Law, mental health apps often fall into legal gray zones, especially when they are hosted or developed outside the country.

The Role of the Therapist: Irreplaceable and Essential

Mental health professionals argue that while AI can play a supportive role, it cannot and should not replace qualified therapists. Human therapists can adapt to tone, challenge destructive beliefs, interpret body language, and tailor treatment to an individual’s culture and context — something AI lacks entirely.

“True therapy isn’t just about talking — it’s about being seen and understood,” explains Dr. Noura Hassan, a Dubai-based therapist. “That process requires intuition, cultural sensitivity, and shared humanity — things AI doesn’t possess, no matter how advanced.”

Therapists also undergo years of training, abide by ethical codes, and engage in supervision. Their accountability to patients is legal, moral, and emotional — something an algorithm simply cannot replicate.

What Should Be the Role of AI in Mental Health?

Despite the risks, experts don’t advocate banning AI in mental wellness. Instead, they support its use as a complementary tool, especially in preventive care, self-monitoring, and reducing initial barriers to mental health literacy.

AI can:

  • Offer early screening tools and mood trackers
  • Provide psychoeducation and mindfulness exercises
  • Assist in crisis detection (e.g., flagging suicidal language; a simplified sketch follows this list)
  • Recommend professional intervention when red flags appear
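
For readers wondering what “flagging suicidal language” can mean in practice, the sketch below is a deliberately simplified, hypothetical illustration in Python: a plain keyword check followed by a referral message. The phrase list, function name, and referral wording are all invented for illustration; real apps rely on far more sophisticated language models, and, as the experts quoted here stress, even those cannot substitute for professional assessment.

```python
from typing import Optional

# Illustrative sketch only -- NOT a clinical tool. It shows the simplest
# possible form of "crisis-language flagging": keyword matching plus a
# referral to professional help. All names and phrases here are hypothetical.

CRISIS_PHRASES = {
    "end my life",
    "kill myself",
    "no reason to live",
    "hurt myself",
}

REFERRAL_MESSAGE = (
    "I can't help with this, but a licensed professional can. "
    "Please reach out to a mental health professional or a local crisis line now."
)

def screen_message(text: str) -> Optional[str]:
    """Return a referral message if the text contains crisis language, else None."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in CRISIS_PHRASES):
        return REFERRAL_MESSAGE
    return None  # No phrase matched -- which is not the same as "no risk".

if __name__ == "__main__":
    print(screen_message("Some days I feel there is no reason to live."))
```

Even this toy example makes the experts’ point: a keyword match can route someone toward help, but it cannot assess risk, context, or intent the way a trained clinician can.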

However, experts stress that the messaging must be clear: AI is not therapy. It is a digital wellness assistant — not a mental health provider.

“We need better regulation, clearer disclaimers, and public education,” suggests Dr. Farid Al Hammadi, a tech ethicist in Dubai. “AI has potential, but the public must know its limits. When people mistake it for therapy, we have failed them.”

Government and Institutional Response in the UAE

The UAE has made mental health a national priority, with initiatives such as the National Program for Happiness and Wellbeing and the Mental Health Strategy 2022–2026. As part of this strategy, AI has been recognized as a tool for innovation — but also one that requires careful regulation.

The Ministry of Health and Prevention (MoHAP) has launched pilot projects using AI for mental health screenings in schools and workplaces. These initiatives are monitored closely and are paired with human counselors for follow-up.

The Dubai Health Authority (DHA) is also reportedly working on ethical frameworks for the integration of AI in healthcare, including psychological services. Experts are hopeful that new policies will address consent, data protection, and professional oversight.

Raising Public Awareness

Ultimately, safeguarding the mental well-being of the population requires more than regulation — it needs cultural understanding and public education. As UAE society becomes increasingly digital, the emotional appeal of AI-based chatbots will continue to grow.

Social media influencers, tech companies, and mental health organizations must work together to emphasize the distinction between digital support and clinical therapy. There should be campaigns explaining when to use AI tools — and when to seek licensed professionals.

Public health expert Dr. Mariam Al Hamdani sums it up best: “Let’s treat AI as a wellness companion — not a doctor. Real therapy saves lives. Algorithms don’t.”

Final Thoughts: Human Help Over Digital Imitation

While AI can provide some relief and convenience, real healing happens in safe, empathetic, and expertly guided spaces — which only qualified human therapists can offer. As the UAE continues to embrace AI innovation, a balanced approach is essential: one that leverages the benefits of digital tools without compromising the integrity of mental health care.
