Don’t trust ChatGPT for cancer care advice
Correct and incorrect recommendations were intermingled in about one-third of the chatbot’s responses, making errors difficult to spot.
The internet has transformed the way people gather information, and for those seeking medical knowledge, this has proven to be a game-changer. But with the rise of advanced technologies like ChatGPT, the question arises: can artificial intelligence provide accurate medical recommendations? Researchers from Brigham and Women’s Hospital delved into this very question, revealing intriguing findings in a recent study published in JAMA Oncology.
ChatGPT, a sophisticated AI chatbot, has been making its mark as a source of medical advice. However, the study discovered that in around one-third of cases, ChatGPT provided recommendations for cancer treatment that didn’t align with the respected National Comprehensive Cancer Network (NCCN) guidelines. This underscores a crucial lesson: while technology is a valuable resource, it has limitations that must be recognized.
Dr. Danielle Bitterman, a radiation oncologist in the Artificial Intelligence in Medicine (AIM) Program at Mass General Brigham, emphasized: “ChatGPT responses can sound a lot like a human and can be quite convincing. But, when it comes to clinical decision-making, there are so many subtleties for every patient’s…