Doctors more likely to use negative language describing Black and Hispanic patients in electronic health records, study suggests

A new study of patients’ electronic health records found that doctors were more likely to use negative words when describing visits with Black and Hispanic patients than with white patients, a pattern the researchers say could both reflect bias and contribute to unequal treatment.

Photo of doctor and patient
Photo credit: 123rf.com

The study, titled “Examining Linguistic Differences in Electronic Health Records for Diverse Patients With Diabetes: Natural Language Processing Analysis,” analyzed the medical records of Black, white and Hispanic or Latino patients seen by 281 physicians in a large metropolitan area. The researchers wanted to know whether doctors showed bias in their language choices when describing patients in post-visit reports.

“Previous studies have shown that care providers’ biases may be part of the reason for racial disparities in health,” said Eden King, the Lynette S. Autrey Professor of Psychological Sciences at Rice University and one of the study’s lead authors. “We wanted to know whether we could detect such biases in the language providers use in health records, and we did.”

The summaries from doctors for Black and Hispanic patients contained significantly more negative adjectives (such as “unkind,” “negative” or “stupid”) and significantly more fear and disgust words (such as “intimidate,” “attack,” “cringe” and “criticize”) than those for white, non-Hispanic patients. The notes for Hispanic or Latino patients included significantly fewer positive adjectives (such as “supportive,” “kind,” “great” and “nice”), trust verbs (such as “affirm,” “advise,” “confide” and “cooperating”) and joy words (such as “admiration,” “elated,” “glad” and “pleased”) than those for white, non-Hispanic patients.
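
Analyses like this are typically lexicon-based: each note is tokenized, words are matched against category dictionaries, and usage rates are then compared across patient groups. The sketch below is a minimal, hypothetical illustration of that approach in Python, using the example words quoted above as stand-in lexicons; it is not the study’s actual code, and the real dictionaries behind such analyses are far larger.

```python
from collections import Counter
import re

# Hypothetical mini-lexicons built from the example words quoted above;
# real sentiment/emotion dictionaries contain thousands of entries.
LEXICONS = {
    "negative_adjectives": {"unkind", "negative", "stupid"},
    "fear_disgust": {"intimidate", "attack", "cringe", "criticize"},
    "positive_adjectives": {"supportive", "kind", "great", "nice"},
    "trust_verbs": {"affirm", "advise", "confide", "cooperating"},
    "joy_words": {"admiration", "elated", "glad", "pleased"},
}

def category_rates(note: str) -> dict:
    """Return each category's usage rate per 1,000 words in a note."""
    tokens = re.findall(r"[a-z']+", note.lower())
    counts = Counter(tokens)
    total = max(len(tokens), 1)  # avoid division by zero on empty notes
    return {
        category: 1000 * sum(counts[w] for w in words) / total
        for category, words in LEXICONS.items()
    }

# Usage: rates computed per note could then be averaged by patient group
# and compared statistically.
note = "Patient was kind and cooperating; glad to report improvement."
print(category_rates(note))
```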

“Understanding that providers’ language may indicate bias points to an opportunity to interrupt it,” King said. “If we can perfect algorithms to detect such bias, we can raise awareness in the moment of the patient-provider conversation. That awareness may be enough to encourage more equitable health care.”

King and her fellow researchers hope their work will enable physicians and other researchers to identify and mitigate bias in medical interactions with the goal of reducing health disparities stemming from bias.

Co-authors of the study include Isabel Bilotta, Deutser; Scott Tonidandel, Belk College of Business, University of North Carolina at Charlotte; Winston Liaw, Tilman J. Fertitta Family College of Medicine, University of Houston; Diana Carvajal, University of Maryland, Baltimore; Ayana Taylor, University of California, Los Angeles; Julie Thamby, Duke University School of Medicine; Yang Xiang, Peng Cheng Laboratory; Cui Tao, Mayo Clinic; and Michael Hansen, Baylor College of Medicine.

The study was published in JMIR Medical Informatics and is available online. It was supported in part by a grant from the Rice Race and Anti-Racism Research Fund.
