

Using AI chatbots for Medical Information – How to Prevent Fake Medical News?
Do you still use an online search engine to find medical information?
AI chatbots like ChatGPT are taking over traditional search engines by offering fast, conversational responses. They now understand queries better and can even summarise online content.
But using AI tools to search for health information may carry risks. The AI chatbot’s knowledge is limited to its training data, so it may offer outdated and false healthcare advice. Crucially, it cannot verify the truth of its own output.
All too often, AI chatbots present errors with excessive confidence. One research study showed that GPT-4 excelled at clinical questions yet still "hallucinated" facts without citing sources. In medicine, such mistakes can be dangerous: misreading symptoms, trusting unproven treatments, or following flawed guidance can delay proper care by a qualified medical doctor. The World Health Organization (WHO) has warned that health misinformation spreads quickly, especially during medical crises, and can undermine public health responses.
In addition, some AI chatbots generate false medical information. In one research study published in the Annals of Internal Medicine, several chatbots gave false answers 100 per cent of the time.
To reduce the risk of fake medical news when using AI chatbots, consider the following:
Always ask for sources
Trustworthy answers should cite reputable sources such as WHO, the U.S. Centers for Disease Control and Prevention (CDC), or peer-reviewed medical journals. If no sources are given, treat the response with caution.
Use AI as a starting point, not as a diagnosis
AI chatbots can help summarise information, but they should never replace professional medical advice. The U.S. Food and Drug Administration (FDA) has emphasised that AI tools should not be used for medical diagnosis or treatment decisions without human oversight.
Understand their limitations
Most AI models are trained on data up to a fixed point in time. An AI chatbot trained in 2023 may not be aware of updated guidelines or medications newly approved in 2024 or later. Even AI tools that offer healthcare advice may draw on irrelevant information and give inconsistent recommendations.
Learn to spot misinformation
Be alert to overly simplistic explanations, exaggerated claims, or absent sources—these are common hallmarks of misinformation. Improving digital health literacy is essential to navigate health information online safely.
WHO: “Let’s flatten the infodemic curve” (WHO Infodemic Tips)
The World Health Organization (WHO) recommends seven practical steps to appraise health claims:
- Assess the source – Is it a recognised health authority or peer-reviewed journal?
- Check publication dates – Is the information up to date?
- Cross-verify facts – Compare with official sites like PubMed or the WHO.
- Consult fact-checkers – Use dedicated services (e.g., FactCheck.org).
- Look for transparency – Are authors, funding and conflicts of interest declared?
- Evaluate evidence level – Are recommendations based on controlled trials or expert consensus?
- Be skeptical of sensationalism – Beware of click-bait headlines and dramatic language.
Promote human oversight
Several hospitals now deploy AI chatbots whose replies are vetted by clinicians to ensure accuracy. A Stanford Medicine study found that doctors using a chatbot matched the AI's performance and outperformed those relying on web searches alone, highlighting the strength of a "human + AI" approach.
Technology can support access to health information, but it must not replace personal caution and professional judgement.
This article was produced solely for the purpose of healthcare and medical knowledge. Not all innovations are available or approved for clinical use. AsiaMD may receive financial or non-financial sponsorship from the companies or institutions involved in these innovations. However, AsiaMD does not endorse any specific products or services in the article, in addition to the Terms and Conditions for the use of our AsiaMD.com website. Please consult your healthcare professional if you need more information.