
Patients are increasingly turning to AI chatbots like ChatGPT for healthcare advice and answers. In fact, over a four-week period in 2025, users submitted more than 580,000 health-related questions to ChatGPT each week.
As generative AI tools become a more common source of health information, evaluating them should go beyond accuracy to consider how clearly and compassionately information is communicated, especially for stigmatized conditions like opioid use disorder (OUD).
A recent study in The American Journal on Addictions compared ChatGPT-generated responses to FAQs from major U.S. health organizations. The analysis offers insight into how AI can support patient education in a context where both readability and tone are critical.
Researchers took OUD questions from the FAQ sections of state and federal public health agencies, professional societies, and American medical centers, entered them into ChatGPT, and compared the chatbot’s answers with the original FAQ answers. Using the National Institute on Drug Abuse (NIDA) “Words Matter” framework, they measured the frequency of stigmatizing terms, lexical density, and syntactic complexity in the responses. The ChatGPT responses were found to be too complex and clinical for the average reader with an opioid use disorder.
Terms flagged by the NIDA list occurred in 9.6 percent of the ChatGPT responses, compared with just 6.0 percent of the organizational FAQs. According to study co-author Akhil Anand, MD, the takeaway of the research is the need to prioritize accessible, non-stigmatizing communication as part of OUD treatment. “Communication is not neutral; it shapes trust, stigma, and willingness to seek treatment,” he explained. “Although we found no significant increase in stigmatizing terminology, increased complexity alone may constitute a barrier to care,” the Cleveland Clinic psychiatrist added.
Providers should use clear, easy-to-understand language when speaking with patients with substance use disorders. According to the study, ChatGPT’s answers also used longer words and carried a heavier syntactic and informational load, exceeding the recommended sixth-to-eighth-grade reading level for patient materials. Researchers also noted that the chatbot’s responses lacked motivational language and did not convey the empathic tone necessary for sensitive health topics.
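For readers curious what checks like these look like in practice, the metrics the researchers describe can be approximated in a few lines of code. Below is a minimal sketch in Python, assuming the third-party textstat package for the Flesch-Kincaid grade level and a short, hypothetical term list standing in for NIDA’s actual “Words Matter” vocabulary. This is not the study’s instrument, only an illustration of the kind of screening involved.

    # A minimal sketch, assuming the third-party "textstat" package
    # (pip install textstat). The flagged-term list below is illustrative
    # only; it is NOT NIDA's actual "Words Matter" list.
    import textstat

    ILLUSTRATIVE_FLAGGED_TERMS = ["addict", "abuser", "junkie", "habit", "clean"]

    def screen_response(text: str) -> dict:
        """Score one answer for reading level and stigmatizing-term count."""
        words = text.lower().split()
        return {
            # Flesch-Kincaid grade level; patient materials are typically
            # recommended to land between grades 6 and 8.
            "grade_level": textstat.flesch_kincaid_grade(text),
            "flagged_terms": sum(words.count(t) for t in ILLUSTRATIVE_FLAGGED_TERMS),
        }

    sample = "Opioid agonist pharmacotherapy attenuates withdrawal symptomatology."
    print(screen_response(sample))  # a clinically dense sentence scores well above grade 8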
Peter Vernig, PhD, MBA, ABPP, the Vice President of Mental Health Services at Recovery Centers of America, said that “medical jargon can be a covert form of gatekeeping.”
“When a person experiences a medical crisis, it’s already more challenging for them to process information, and the scope and presentation of medical knowledge provided by AI may be beyond their ability to understand. Someone struggling with opioid use disorder doesn’t need a textbook; they need empathy, understanding, and support,” Dr. Vernig explained.
Black communities in rural areas have been significantly impacted by the opioid epidemic, with rising opioid-related hospitalizations and overdose rates that are higher than in urban areas. Contributing factors include unemployment, poverty, and the widespread availability of opioids. The lack of healthcare infrastructure in rural communities often means residents must travel long distances for treatment, and when they do receive needed care, they often face discrimination and stigma.
“Research has shown with remarkable consistency that Black Americans shoulder the burden of a heavy bias in the healthcare system,” Dr. Vernig explained. “They face disparities in pain management, are prescribed life-saving medications like buprenorphine less frequently, and often have to deal with a system that lacks cultural competency.”
Fifty-five percent of Black Americans report having had negative experiences with healthcare providers, contributing to ongoing distrust of the medical system. This mistrust is deeply rooted in a history of unethical practices and exploitation, including the cases of Henrietta Lacks and the Tuskegee Syphilis Study, which continue to shape community perceptions of healthcare and medical research today.
“The resulting mistrust of the medical field has driven many to bypass the medical system altogether and seek answers online, the most recent iteration being AI chatbots,” Dr. Vernig explained. “Setting aside issues of accuracy and lack of context, which are inherent in this application of artificial intelligence, the language used by chatbots can be a barrier for many users.”
While the study did not assess key aspects of addiction care or how patients interpret and use ChatGPT-generated information, the central point remains: AI responses can be difficult for patients to understand. “Large language models can simplify text when explicitly prompted,” Dr. Anand said. “But this study shows that if you use them ‘out of the box,’ you may get content that’s technically sound yet overly complex.”
The study’s researchers emphasized the importance of a hybrid approach that uses AI while ensuring that patient education is grounded in “human judgment, health literacy standards, and person-first language.”
“In OUD care, where engagement can be fragile, and stakes are high, plain language is not a stylistic preference – it is a clinical intervention,” Dr. Anand said. “And for now at least, the art of clear communication in addiction care remains a distinctly human responsibility.”