Beware of using ChatGPT or other chatbots as a shortcut to reply to friends, or even to colleagues. You run the risk of not being liked as much anymore. A new study from Ohio State University shows that using artificial intelligence tools to write personal correspondence, such as letters and text messages, provokes a deeply negative reaction in the people who receive it.
The researchers presented 208 adults with a fictional new “best friend” named Taylor. They asked participants to write to Taylor about one of three possible scenarios: they were burned out and needed support; they had a conflict with a colleague and needed advice; or Taylor’s birthday was approaching. The prompt indicated that the two had known each other for many years.
The experiment was conducted online. Everyone had to write a short message to Taylor describing their situation, and everyone received a reply. The researchers told them that Taylor had written a draft of the response. However, there was a catch.
Some participants were told that their “new best friend” had used an artificial intelligence chatbot to help find the right tone for the message. Others were told that a member of a writing community had helped Taylor make revisions. The third group was told that Taylor had done all the editing alone. Everyone then had to rate how they felt about Taylor’s reply.
Is it really so bad that your friend uses ChatGPT?
The reply itself was identical for all participants: everyone received the same thoughtful message, regardless of the scenario presented. Yet people reacted differently depending on whether Taylor had received help or not.
In cases where they were told that their “friend” had used a chatbot such as ChatGPT to edit the reply, participants reacted much more negatively. Most of these participants, for example, disagreed with the statement “Taylor meets my needs as a close friend.”
They also trusted the relationship less: they were less confident in the statement “Taylor likes me as a close friend.” Overall, they rated Taylor’s reply as less appropriate than did those who were told the response was written solely by Taylor.
You might think these people simply object to using technology for something so personal. However, the researchers found something important: people reacted just as negatively when they were told that Taylor had enlisted the help of another person, a member of an online writing community.

Perceived effort to maintain relationships
So this has more to do with the perception of how much effort a “friend” puts into the relationship than with the use of an AI chatbot as such. “Receiving help from another person was not significantly different from using artificial intelligence in terms of perceived partner effort, relationship satisfaction, uncertainty, and perceived importance,” the study report noted.
“Effort is very important in a relationship,” said Bingjie Liu, lead author of the study and professor of communication at The Ohio State University. “People want to know how much you’re willing to invest in your friendship, and if they feel like you’re taking a shortcut by using artificial intelligence to help, that’s not good,” Liu said in a statement.
The worse participants rated their friend Taylor’s effort, the less satisfied they were with the relationship and the more insecure they felt about the friendship. Of course, you probably won’t admit to your friends that you’re responding to them with a chatbot. But Liu believes that as ChatGPT and similar tools become more popular, people will get better at sensing when it’s not a human interacting with them.
Another study, published in 2016, analyzed the use of Facebook’s “Like” button and reached a similar conclusion: receiving a single click instead of a carefully considered, thoughtful response made people evaluate their relationships more negatively.
The main takeaway: it’s better to handle your relationships on your own, Liu insisted. “Don’t use technology just because it’s convenient. Sincerity and authenticity are still very important.”
Source: Hiper Textual
