Professional intrusion into psychology is something professionals have been denouncing for a long time. Coaches, gurus and all kinds of snake-oil sellers claim to offer the same tools as a psychologist, despite not being trained to do so and despite the danger they pose to users. Added to this problem is now the emergence of artificial intelligence (AI). Its algorithms threaten to displace many professional careers, and psychology, of course, is no exception.

In fact, many people prefer to share their fears or mental health problems with a chatbot and let themselves be guided by its advice. What they fail to consider is that, however human it may seem, the chatbot’s advice is based on what we, as users, want to hear. It is true that the problem is not equally serious in every country. In the United States, at least two teenagers have committed suicide after receiving bad advice from artificial intelligence or after a troubled relationship with a chatbot. In Spain this is more regulated: ChatGPT, for example, directly advises those asking questions about suicide to seek professional help, and provides them with the relevant emergency phone numbers.

Even so, no matter how well the most serious cases are regulated, AI is not yet capable of replacing a psychologist. It cannot read a user’s non-verbal language and lacks the tools to analyze each case in its context. Because mental health is such a sensitive subject, trying to treat it with algorithms can be very dangerous. But before blaming users, we should ask how the system we live in can push someone to open up to an algorithm rather than to another human being. At Hipertextual we talked about this with Mariela Checa, dean of the Official College of Psychologists of Eastern Andalusia (COPAO). The conversation gave us a great deal to think about, because this is not only a matter of professional intrusion.

According to a 2025 study by the University of Chicago, approximately 12% of teenagers in the United States have consulted AI for emotional support. Other studies show that 25% of Americans would rather talk to an AI than to a psychologist. There is no such specific data for Spain, but many teenagers in therapy have admitted to their psychologists that they consult ChatGPT. Those are only the ones who have confessed it to a professional; many others go unnoticed.

We might think this is due to the taboo that still surrounds everything related to mental health. However, teenagers are the ones who suffer most from this problem, and they are also the most aware of the importance of mental health; for them it is not a taboo topic. So why are so many of them turning to AI?

Not everyone can afford a private psychologist.

For Checa, it is probably a question of lack of resources. “I attribute it more to the fact that in public healthcare we have very little access to psychology professionals, because there are not enough of them to meet the huge demand that exists,” she explains.

“Not everyone can afford a private psychologist and, although mental health should be a right, it has unfortunately become a luxury, the privilege of a few.”

Without access to a professional, it is much easier and, above all, cheaper to turn to AI. Besides, when something torments us emotionally, we usually want answers as quickly as possible, and that, at least in Spain’s public healthcare system, is unthinkable. “If you ask for an appointment with a psychologist now, they will give you one no earlier than 70 days from now,” the COPAO dean explains. “And once you have had that first meeting, you will wait at least another four months for the first follow-up session.”

This is the situation in Spain, but in other countries, such as the aforementioned United States, it is even worse. It is no surprise, then, that AI is intruding on the work of psychologists. The fault, of course, lies not with psychologists but with a system that prevents them from adequately serving public healthcare users.

AI intruding on psychology is very dangerous for users, for many reasons. “What the AI does is take the information it stores and return the answer that, from the questions you ask it, it infers you want to hear,” the psychologist consulted for this article tells us. “What ends up happening is that you keep asking it, that you keep obsessing over the answers it gives you.” The goal of therapy is not to make us dependent on it, but to give us the tools we need to face what troubles us.

On the other hand, no real bond can be formed with an AI. “Within the success of a therapeutic process, the bond you have with your therapist counts for more than the tool or technique used.” That cannot be achieved with artificial intelligence. “AI is unable to create an emotional bond because it has no gaze.”

The AI doesn’t know whether you’re crying.

Part of that bond is that the psychologist is a person who knows how to interpret our emotions in our own context. “An AI cannot know how you are feeling, or even whether you are crying, unless you tell it.” A psychologist sees it; they don’t need to be told. “Nor can the AI know whether you are more or less irritable this morning, or determine whether something is affecting you or whether it can be worked on.”

In short, “the AI summarizes all the information it has and returns answers based on the questions you ask, not on your needs.”

As an example, Checa tells us about a real case: a girl who became completely hooked on a boy she liked because the AI always gave her positive feedback about how he behaved with her. It told her what she wanted to hear: that the boy also wanted a relationship with her. That was not realistic, and in the end it was all the harder for her to let go of the idea.

What are the risks of interfering with a psychologist’s work?

The risks range from the more to the less serious: from a person whose emotional distress drags on over time for lack of adequate treatment, to a person who takes their own life, which, unfortunately, has already happened at least twice in the United States. In any case, the main problem is the distortion of reality. “We distort reality and, of course, we distort the effective way of solving problems,” says Checa. “Another important issue, perhaps less visible from the outside, is that in a therapeutic process the therapist and the user work together on a possible solution.” That cannot be done with AI. “If you hand something over to the AI and want it to hand you back a solution, you are not part of the solution.”

So should we abandon AI altogether?

In fact, neither Mariela Checa nor any of the professionals who denounce this intrusion into the work of psychologists are against AI. It can be a very useful tool for professionals, but it should not be a key tool for users. A psychologist can use it to summarize information on a topic they want to research, and as a professional they will know which information to keep and which to discard. But someone who tries to use it as a replacement for therapy will not realize that its responses are completely different from the therapy a human being can offer.

AI can be a very useful tool.

Will this kind of intrusion into psychology continue in the future?

It is clear that this is only the beginning. AI is here to stay and will gain more and more capabilities. Some even believe it will become more and more human. Even so, it will never be able to replace a psychologist; the problem would be if more and more people came to believe it could. The dean of the COPAO, however, does not think this will be a problem in the future. “I’m very optimistic,” she says. “I think there will come a point where it will collapse and, although there may be some serious consequences along the way, it will not go any further.”

This professional intrusion may die of its own success. Or perhaps, at last, the necessary resources will be invested in public healthcare so that mental health can be cared for as it deserves. That may be the biggest blow AI could take. There is no point in insisting that people stop replacing psychologists with algorithms if access to a psychologist remains a luxury. Emotional distress demands quick and effective answers. If we don’t want the machine to win, with those who least deserve it paying the price, it is time to invest in the solution.

Source: Hiper Textual
