Personal info and privacy control may be key to better visits with AI doctors
Patients like AI doctors that offer privacy and remember their social information, researchers report
- Date: October 30, 2024
- Source: Penn State
- Summary: Artificial intelligence (AI) may one day play a larger role in medicine than the online symptom checkers available today. But these 'AI doctors' may need to get more personal than human doctors to increase patient satisfaction, according to a new study. Researchers found that the more social information an AI doctor recalls about patients, the higher the patients' satisfaction, but only if they were offered privacy control.
Artificial intelligence (AI) may one day play a larger role in medicine than the online symptom checkers available today. But these "AI doctors" may need to get more personal than human doctors to increase patient satisfaction, according to a study led by researchers at Penn State. They found that the more social information an AI doctor recalls about patients, the higher the patients' satisfaction, but only if they were offered privacy control.
The research team published their findings in the journal Communication Research.
"We tend to think of AI doctors as machines that are antiseptic and generic," said S. Shyam Sundar, Evan Pugh University Professor and the James P. Jimirro Professor of Media Effects at Penn State. "What we show in this study is that it's important for these AI systems to not just talk about a patient's medical history but also to individuate them socially by recalling certain non-medical information about them, such as their occupation and hobbies. At the end of the day, AI doctors may guide patients through telehealth visits just as well as human doctors."
To see whether a doctor's knowledge of a patient's social or medical history increases patient satisfaction, the researchers asked 382 online participants to interact with a medical chatbot over two visits spaced about two weeks apart. Participants were told that they were interacting with a human doctor, an AI doctor or an AI-assisted human doctor. During the first visit, the "doctor" -- in reality, a pre-compiled script that the researchers created for consistency -- chatted with patients about topics related to diet, fitness, lifestyle, sleep and mental health, and asked for personal information about their occupation, relationship with their family, dietary habits and favorite activities. Then the doctor offered general recommendations for diet, exercise and mental health management.
During the second visit, the doctor either recalled the patient's medical or personal information or asked the patient to remind them of this information. The doctor then gave health advice similar to that given in the first visit and, at the conclusion of the session, offered half of the patients the option to put their visit on the record and save their information to the online platform. Participants then completed an online questionnaire to assess their satisfaction with the service.
The researchers found that patients gave higher scores to AI doctors that recalled the patient's social information as long as the doctor offered privacy control before concluding the visit. The human doctor, on the other hand, did not need to recall either social or medical information for patients to feel that they had a close relationship with the doctor.
"When an AI doctor recalls a patient's social information, it is perceived as putting more effort into individuation, which leads to higher patient satisfaction, but only when the patient has privacy control," said Cheng "Chris" Chen, lead author and assistant professor of communication design at Elon University who graduated from the doctoral program of mass communications at Penn State. "This was surprising because AI systems treat all data the same, but patients see it differently. They perceived it as the doctor putting in more effort to recall the patient's social information."
The process still requires effort on the patient's part, Sundar added.
"Patients still want the AI system to provide them privacy control," he said. "It's like, as long as you give me control over my data, I appreciate you knowing about my social life and appreciate the effort you put in."
The study has implications for the design of AI systems in the medical field, according to the researchers.
"Recalling patient social information may lead to better satisfaction and patient compliance and more positive health outcomes," Cheng said.
According to co-author Joe Walther, Bertelsen Presidential Chair in Technology and Society and distinguished professor of communication at the University of California, Santa Barbara, the research also addresses larger questions about what it means to know someone, or rather to feel known by someone or something.
"When I tell a student her homework was better than many other students', it's just a numerical comparison," Walther said. "When I tell her that her homework is better than she did earlier in the semester, she knows I know her, that she's not just a number. The same goes for doctors: Am I just the latest lab tests, or am I unique?"
Other contributors to the study include Mengqi Liao of the University of Georgia.
Story Source:
Materials provided by Penn State. Original written by Francisco Tutella. Note: Content may be edited for style and length.
Journal Reference:
- Cheng Chen, Mengqi Liao, Joseph B. Walther, S. Shyam Sundar. When an AI Doctor Gets Personal: The Effects of Social and Medical Individuation in Encounters With Human and AI Doctors. Communication Research, 2024; 51 (7): 747 DOI: 10.1177/00936502241263482