
AI can help write a message to a friend -- but don't do it

Use of AI can make partners less satisfied, more uncertain

Date:
September 11, 2023
Source:
Ohio State University
Summary:
Using artificial intelligence applications to help craft a message to a friend is not a good idea -- at least if your friend finds out about the use of AI, a new study suggests.

Using artificial intelligence applications to help craft a message to a friend is not a good idea -- at least if your friend finds out about the use of AI, a new study suggests.

Researchers found that people in the study perceived that a fictional friend who used AI assistance to write them a message didn't put forth as much effort as a friend who wrote a message themselves.

That perception may be understandable, but the effect goes beyond the message itself, said Bingjie Liu, lead author of the study and assistant professor of communication at The Ohio State University.

"After they get an AI-assisted message, people feel less satisfied with their relationship with their friend and feel more uncertain about where they stand," Liu said.

But to be fair to AI, it wasn't just the use of technology that turned people off. The study also found negative effects when people learned their friend got help from another person to write a message.

"People want their partners or friends to put forth the effort to come up with their own message without help -- from AI or other people," Liu said.

The study was published online recently in the Journal of Social and Personal Relationships.

As AI chatbots like ChatGPT become increasingly popular, issues about how to use them will become more relevant and complex, Liu said.

The study involved 208 adults who participated online. Participants were told that they had been good friends with someone named Taylor for years. They were given one of three scenarios: They were experiencing burnout and needed support, they were having a conflict with a colleague and needed advice, or their birthday was coming up.

Participants were then told to write a short message to Taylor describing their current situation in a textbox on their computer screen.

All participants were told Taylor sent them a reply. In the scenarios, Taylor wrote an initial draft. Some participants were told Taylor had an AI system help revise the message to achieve the proper tone, others were told a member of a writing community helped make revisions, and a third group was told Taylor made all edits to the message.

In every case, people in the study were told the same thing about Taylor's reply, including that it was "thoughtful."

Still, participants in the study had different views of the message they had supposedly received from Taylor. Those who received a reply written with AI assistance rated what Taylor did as less appropriate than did those who received the reply written by Taylor alone.

AI replies also led participants to express less satisfaction with their relationship, such as rating Taylor lower on meeting "my needs as a close friend."

In addition, people in the study were more uncertain about their relationship with Taylor if they received the AI-aided response, being less certain about the statement "Taylor likes me as a close friend."

One possible reason people may dislike the AI-aided response is that they see technology as inappropriate for, and inferior to humans at, crafting personal messages like these.

But results showed that people responded just as negatively to responses in which Taylor had another human -- a member of an online writing community -- help with the message.

"What we found is that people don't think a friend should use any third party -- AI or another human -- to help maintain their relationship," Liu said.

The reason, the study found, was that participants felt Taylor expended less effort on their relationship by relying on AI or another person to help craft a message.

The less effort participants felt Taylor had put in by relying on AI or another person, the less satisfied they were with the relationship and the more uncertain they felt about the friendship.

"Effort is very important in a relationship," Liu said.

"People want to know how much you are willing to invest in your friendship and if they feel you are taking shortcuts by using AI to help, that's not good."

Of course, most people won't tell a friend that they used AI to help craft a message, Liu said. But she noted that as ChatGPT and other services become more popular, people may start running a Turing Test in their minds as they read messages from friends and others.

The phrase "Turing Test" is sometimes used informally to describe trying to tell whether an action was taken by a computer or a person.

"It could be that people will secretly do this Turing Test in their mind, trying to figure out if messages have some AI component," Liu said. "It may hurt relationships."

The answer is to do your own work in relationships, she said.

"Don't use technology just because it is convenient. Sincerity and authenticity still matter a lot in relationships."

Liu conducted the study with Jin Kang of Carleton University in Canada and Lewen Wei of the University of New South Wales in Australia.


Story Source:

Materials provided by Ohio State University. Original written by Jeff Grabmeier. Note: Content may be edited for style and length.


Journal Reference:

  1. Bingjie Liu, Jin Kang, Lewen Wei. Artificial intelligence and perceived effort in relationship maintenance: Effects on relationship satisfaction and uncertainty. Journal of Social and Personal Relationships, 2023; DOI: 10.1177/02654075231189899

