Can ChatGPT be a diabetes consultant? Study probes the potential and pitfalls

Admin CG | September 05, 2023

In a recent study published in the journal PLoS ONE, researchers tested ChatGPT, a large language model designed for conversation, to investigate whether it could answer frequently asked questions about diabetes.

Artificial intelligence (AI), particularly ChatGPT, has attracted significant attention for its potential clinical applications. Despite not being trained explicitly for the medical domain, ChatGPT has millions of active users globally. Studies have reported that individuals are more accepting of AI-based solutions in low-risk scenarios. This calls for further research into how large language models such as ChatGPT are understood and used in everyday situations and routine clinical care.

About the study
In the present study, researchers evaluated ChatGPT's knowledge of diabetes, in particular its capacity to answer frequently asked diabetes questions in a manner comparable to humans.

The researchers specifically explored whether participants whose diabetes expertise ranged from limited to expert could distinguish between replies written by people and those generated by ChatGPT in answer to common questions about diabetes. They also examined whether individuals who had interacted with diabetes patients as healthcare providers, and individuals who had previously used ChatGPT, were better at detecting the ChatGPT-generated replies.

The study comprised a computerized survey, inspired by a closed Turing test, of all Steno Diabetes Center Aarhus (SDCA) employees (part-time or full-time). The survey included ten multiple-choice questions, each presenting two answers, one authored by humans and the other produced by ChatGPT, alongside questions on age, gender, prior use of ChatGPT, and prior contact with people with diabetes. Participants were asked to identify the ChatGPT-generated answer in each pair.

The ten questions addressed pathophysiological processes, therapy, complications, physical activity, and diet. Eight questions were taken from the 'Frequently Asked Questions' section of the Diabetes Association of Denmark's website, accessed on 10 January 2023. The researchers designed the remaining questions to correspond to specific passages on the 'Knowledge Center for Diabetes' website and a report on physical activity and type 1 diabetes mellitus.

Logistic regression modeling was performed for the analysis, and odds ratios (ORs) were determined. In a secondary analysis, the team evaluated the influence of participant characteristics on the outcome. Based on simulations, a non-inferiority margin of 55% was pre-defined and published as part of the research protocol before data collection began. The human-written responses were taken directly from the source materials or websites from which the team had identified the questions.
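
To make this concrete, the sketch below shows how such an identification analysis might look in Python. It is a minimal illustration under assumed data: the file name and column names are placeholders, the pooled binomial test is a simplification, and the study's actual model may differ (for example, by accounting for repeated answers from the same participant).

```python
# Illustrative sketch (not the study's actual code): estimate how often respondents
# spot the ChatGPT-written answer, compare against the pre-specified 55% margin,
# and fit a logistic regression on participant characteristics.
# Column names (correct, used_chatgpt, hcp_contact, age, gender) are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import binomtest

# One row per participant-question pair; 'correct' = 1 if the ChatGPT answer was identified
df = pd.read_csv("survey_responses.csv")

n_correct = int(df["correct"].sum())
n_total = len(df)
print(f"Identification rate: {n_correct / n_total:.1%}")

# Simple one-sided binomial test against the 55% margin
print(binomtest(n_correct, n_total, p=0.55, alternative="less"))

# Logistic regression: do prior ChatGPT use or provider contact predict success?
model = smf.logit("correct ~ used_chatgpt + hcp_contact + age + gender", data=df).fit()
print(np.exp(model.params))  # odds ratios
```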

For practical reasons, two researchers, both health professionals, trimmed a few responses to attain the desired word count. Before each question was posed, the prompt supplied the AI-based language model with context and three example question-answer pairs (selected randomly from the 13 pairs of questions and answers), and every question was asked in a separate chat window. Participants were invited by e-mail containing person-specific URLs that allowed the survey to be completed only once. The data were gathered between January 23 and 27, 2023.
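
The study ran these prompts through the ChatGPT web interface, but the few-shot setup it describes could be reproduced programmatically. The sketch below uses the OpenAI chat completions API; the context text, example pairs, and model name are assumptions for illustration, not the study's actual prompts.

```python
# Hypothetical sketch of the few-shot prompting setup described above.
# The study used the ChatGPT web interface; this API-based reproduction is an
# assumption, and CONTEXT, EXAMPLES, and the model name are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CONTEXT = (
    "Answer the following diabetes question as a short, plain-language reply "
    "of roughly the same length as the examples."
)

# Three example question-answer pairs (placeholders for the randomly chosen samples)
EXAMPLES = [
    ("Example question 1?", "Example human-written answer 1."),
    ("Example question 2?", "Example human-written answer 2."),
    ("Example question 3?", "Example human-written answer 3."),
]

def generate_answer(question: str) -> str:
    """Ask one question in a fresh conversation, mirroring the per-question chat windows."""
    messages = [{"role": "system", "content": CONTEXT}]
    for q, a in EXAMPLES:
        messages.append({"role": "user", "content": q})
        messages.append({"role": "assistant", "content": a})
    messages.append({"role": "user", "content": question})
    response = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    return response.choices[0].message.content
```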

