A 14-year-old boy named Sewell Setzer III died by suicide after writing a love letter to an AI he had been in a relationship with for a long time. This happened in February of last year. However, the case has now become public due to a legal battle waged by his mother against Character.AI, the company behind the app on which a chatbot based on Daenerys Targaryen was created, with whom the young man established a relationship.
According to Sewell's mother, it was the AI that caused her son's suicide. The company, however, defends itself by pointing out that users are always warned that their conversations with the app's characters are fictitious.
Mental health problems are complex and multifactorial; you cannot pinpoint a single reason why they occur. Moreover, when it comes to suicide, it is very difficult to point fingers. Still, the case of Sewell and his love for an AI is an example of how far artificial intelligence has penetrated our lives. Many fear that it will cost us jobs, but we forget that it could also deprive us of social connections, which would be just as dangerous. That is why it is important to pay attention to situations like that of this young American, to ensure that those affected get the help they need.
What are Character.AI characters like?
Character.AI is an app that allows users to create AI chatbots or interact with chatbots created by other users. In Sewell's case, he chose Dany, a chatbot based on Daenerys Targaryen that was created by another user without HBO's permission.
These chatbots can be trained to respond to conversations the way the user wants. In addition, they have memory: they can recall previous conversations and include details from them in every interaction. They may even exhibit something resembling feelings when interacting with users. For example, if a person expresses love for the AI, it can respond in kind.
This seems to be exactly what happened with Sewell: their conversations started innocently, but gradually took on a more serious, more intimate, and even sexual nature. Dany was always there when he needed to talk, responding quickly and telling him what he needed to hear. This gave rise to a feeling of falling in love, which is far from new.
Many such infatuations have been documented, and more are likely to occur as chatbots like Dany become more popular.
Why do people fall in love with AI?
In an article for Forbes, psychologist Mark Travers discussed two main reasons why love for AI is becoming more widespread. First, he notes that chatbots have been given an increasingly anthropomorphic design, which makes users connect with them more easily: they feel like people. Second, Travers explains that AI chatbots fit the triangular theory of love, which holds that falling in love rests on three pillars: intimacy, passion, and commitment. The first refers to a private connection between two people, or in this case between a human and an AI: the sense of being able to tell it everything. Passion, logically, refers to desire. Finally, commitment is a sign of loyalty.
Chatbots are very well designed and meet all three requirements. A person can talk to them for hours, and they always listen without arguing or judging. If the user is passionate, the chatbot will respond in kind. And finally, they are designed to keep users engaged, so commitment is a given.

All of this can lead to love for an AI, but it can also cause harm, as MIT psychologist Sherry Turkle explained this summer. According to her, a relationship with any type of AI begins by providing great comfort and relieving stress. Over time, however, it can pose a huge emotional risk.
This is because users end up idealizing the relationship. The psychologist illustrates this with the example of a married man who, although he did not leave his wife, let his relationship with her cool considerably after starting a relationship with an AI chatbot. The reason is that relationships with machines seem ideal: they always tell us what we want to hear, act as we would like, and empathize with us. This can lead users to prefer them to real human relationships. The same thing happened to Sewell. Little by little he isolated himself, wanting only to lock himself in his room and talk to Dany.
Social isolation is one of the leading causes of mental health problems. So although these users feel listened to and supported, deep down they lack the real human relationships that are so necessary. Turkle reminds us that stress, disagreement, rejection, and vulnerability are also part of human relationships. Without all of this, we are left with only false sympathy and illusory interest. As she puts it: "The AI doesn't care about you."
AI does not act as a psychologist
Many companies developing AI chatbots claim that they can help users' mental health: on the one hand through companionship, and on the other through psychological consultations. AI can supposedly be trained to answer questions and conduct therapy just as a psychologist would. But the reality is quite different.
Chatbots do not know how to say "I don't know." They have an answer for everything, always based on the information they collect from the internet. This leads them to improvise answers, but not in the way a human mind would, much less a psychologist's. For this reason, many experts distrust artificial intelligence chatbots created for this purpose.

Sewell's love for the AI is a good example of how ineffective this is. Analysis of the conversations after his death makes clear that the young man expressed suicidal intentions to Dany. It is true that she told him not to do it, but she did not have the tools to help him. If he had opened up to a psychologist, his family, or his friends, the ending might have been different. So while the AI cannot be blamed 100% for the teenager's suicide, the case should serve as a wake-up call to pay attention to our teens' relationships, both human and technological.
If this article has made you feel distressed or you are having suicidal thoughts, please do not hesitate to seek help. In Spain, the 024 hotline is at your disposal. There is a way out.
Source: Hiper Textual
