Oncologists Warn Against Patients Using AI for Cancer Treatment Advice
Medical professionals are increasingly concerned about cancer patients turning to artificial intelligence platforms for treatment advice, according to Dr. Derek Tsang, a radiation oncologist at Princess Margaret Cancer Centre, one of Canada’s premier cancer treatment facilities.
Dr. Tsang recently highlighted the growing problem of patients consulting AI tools for medical guidance, noting that these interactions “often end up being a source of misinformation for patients and their families, and is fraught with hazard.”
His observations come amid the rapid proliferation of generative AI tools like ChatGPT and Google’s Gemini, which have become increasingly accessible to the public. While these platforms can provide general information on various topics, medical professionals warn they lack the specialized knowledge and clinical judgment necessary for personalized cancer care.
The issue has gained enough prominence to warrant attention in the Canadian Medical Association Journal (CMAJ), where author Ningjun Shao explores the complex relationship between artificial intelligence, medical misinformation, and patient care in an article titled “Medical AI, misinformation, and the value of engaging with patients.”
This trend reflects a broader pattern in healthcare, where patients increasingly seek information from online sources before or after consulting medical professionals. A 2023 survey by the Canadian Cancer Society found that over 65% of cancer patients search for information online, with nearly 30% reporting they’ve used AI tools to research treatment options.
The emergence of AI as a source of medical information presents unique challenges for oncologists and other healthcare providers. Unlike traditional internet searches, AI platforms can deliver information with an authoritative tone that may convince patients of its accuracy, regardless of whether the content is medically sound or appropriate for their specific condition.
“The danger lies in the confidence with which these AI systems present information,” explains Dr. Sarah Henderson, an oncology researcher not involved with Dr. Tsang’s statement. “Patients may not realize these systems can ‘hallucinate’ or generate plausible-sounding but inaccurate information, particularly about cutting-edge cancer treatments or clinical trials.”
Healthcare providers are now adapting to this new reality by proactively asking patients about their AI research and addressing potential misinformation during consultations. Some cancer centers have begun developing guidance for patients on evaluating online information sources, including specific recommendations about the limitations of current AI systems.
Princess Margaret Cancer Centre, where Dr. Tsang practices, serves thousands of patients annually and stands among North America’s largest cancer treatment facilities. As part of the University Health Network, it combines clinical care with cutting-edge research, making its practitioners particularly attuned to the intersection of technology and cancer care.
Medical experts emphasize that while AI may eventually play a valuable role in healthcare, current consumer-facing systems are not designed to replace professional medical advice. They lack access to individual medical records, cannot perform physical examinations, and are not programmed with the specialized knowledge required for cancer treatment decisions, which often involve complex risk-benefit calculations.
Patient advocacy groups have echoed these concerns while acknowledging patients’ desire for information. “We understand patients want to be informed participants in their care,” says Michael Harrison of the Canadian Cancer Patient Coalition. “But we recommend patients discuss information found online with their healthcare team rather than making decisions based on AI outputs alone.”
As AI technology continues to evolve, medical institutions and technology companies face mounting pressure to develop standards for medical information provided by these platforms and to clearly communicate their limitations to users seeking health advice.
For now, oncologists like Dr. Tsang continue to emphasize the importance of direct patient-physician communication and reliable medical sources when making critical treatment decisions.
7 Comments
This is a concerning development. Patients need to understand the limitations of AI and rely on doctors for personalized, evidence-based cancer care. The spread of misinformation through these platforms could have serious consequences.
I agree. Oncologists are right to raise awareness about this issue and urge patients to consult qualified medical experts rather than turning to AI for complex medical decisions.
While AI can be a useful tool, it should never replace the guidance of trained medical professionals, especially for high-stakes health decisions. Patients must be cautious about using AI for cancer treatment advice.
I can see how the accessibility of AI tools could lead to patients seeking treatment advice outside of proper medical channels. Oncologists are right to sound the alarm on the risks of misinformation and improper guidance.
Absolutely. AI may have its place in providing general health information, but for critical decisions around cancer treatment, patients need the expertise and clinical judgment of qualified doctors.
This is a concerning trend. Patients should be wary of relying on AI for complex medical advice, especially for serious conditions like cancer. Consulting qualified medical professionals is crucial for personalized, evidence-based care.
The rapid proliferation of AI tools is a double-edged sword. While they can provide useful information, they lack the specialized knowledge and clinical judgment needed for personalized cancer treatment. Patients should be wary of relying on them.