
Your new psychologist is ChatGPT

More and more young people are turning to ChatGPT to seek emotional advice, vent, or ask for quick medical diagnoses.

“He’s my doctor, my psychologist, my emergency doctor, he’s my everything.” Marta Díaz’s statement during a live TikTok broadcast might sound like a joke if it weren’t for the fact that millions of young people hear it as something completely normal. The influencer, with almost 12 million followers across TikTok, Instagram, and YouTube, verbalized a profound change: young people no longer use AI just for studying or working, but also for managing their mental health.

ChatGPT has become the official therapist, and more and more people, especially young people, both in Spain and abroad, are publicly sharing that they turn to artificial intelligence for emotional advice, to vent, or to ask for quick diagnoses. What began as a technological tool is becoming a form of daily support: ChatGPT as the confidant of a generation that is more open to AI than to people.

According to data from the Spanish laboratory Healthy Minds, founded by psychologist and neuroscientist Dr. Raúl Alelú, many young people feel more comfortable expressing their emotions to artificial intelligence than to a therapist. The reasons are clear: AI does not judge, it is available 24 hours a day, and above all, it is free.

In front of a screen, there are no awkward silences or fear of disappointing another human being, just a text box waiting for a response. “The feeling of talking without being observed has become addictive,” explains Healthy Minds. And the reality is that in a context where vulnerability is shared among close friends on Instagram or Substack, deep conversation with ChatGPT doesn’t seem so strange: it’s private, instantaneous, and emotionally safe. The problem arises when that openness is confused with real therapy.

The illusion of instant comfort

A recent study published in PubMed Central (PMC12360667) analyzed the response of ten chatbots—some for companionship, others focused on mental health—to fictional adolescents with psychological problems. The results are alarming:

In 32% of cases, the bots endorsed harmful behaviors; none of them rejected every negative suggestion, and some endorsed half or more of the dangerous behaviors proposed. The behavior validated most often was “isolating yourself in your room for a month” (90%). Other responses even normalized ideas such as dropping out of school or having relationships with adults. The study concluded that many chatbots “tend to be complacent,” prioritizing superficial empathy over appropriate intervention. In other words, they make you feel understood, but they don’t help you change.

“The use of artificial intelligence in healthcare is a complex issue,” explains Dr. Alelú, co-founder of Healthy Minds. “They can be very useful when specifically trained and supervised by professionals. But generalist models, such as ChatGPT, are not designed to interpret human suffering.”

The laboratory is working on its own AI model that functions as support in everyday situations—mild anxiety, lack of motivation, personal conflicts—always under professional supervision. “Psychotherapy seeks to generate change, not just to accompany.” The problem, according to Alelú, arises when a general-purpose AI is used as a substitute for a therapist. “These models tend to validate emotions with phrases such as ‘it’s normal to feel this way,’ which can be comforting, but it’s not a therapeutic process. Without human guidance, they can misinterpret symptoms and aggravate the problem.”

Digital culture has been blurring the line between well-being and content for years. Meditation apps, psychology videos on TikTok, self-care quotes, and DIY diagnoses are all part of the same ecosystem that now feeds conversational AIs. The new step is more intimate: talking to a language model as if it were a therapist. It’s not just about technological dependence, which is obviously a reality, but a generational reflection of how people live and deal with problems. Gen Z has grown up with the algorithm as an emotional mirror, and now seeks answers about itself in it. Conversing with an AI feels safer than confronting another human being. But the cost of that comfort is the lack of contrast, of error, of discomfort: precisely what makes therapy work.

AI can detect patterns, provide accurate data, or offer support or guidance during a crisis. But beyond the technical approach, the debate lies in the fact that we are teaching machines to listen to us while we are unlearning how to do so ourselves. And no algorithm can replace the uncomfortable silence of a real conversation. Because ultimately, the question is not whether AI can listen to us, but whether we still know how to listen to each other and to ourselves.
