Healthcare AI News 8/6/25
News

OpenAI will improve ChatGPT’s ability to detect signs of mental health issues or emotional stress after reports that it sometimes reinforces user delusions. The company says that AI can feel more personal and responsive than other technologies, which can be problematic for someone who is experiencing mental health issues.
Clinicians credit Epic’s AI, which flags keywords in radiology reports, with helping identify lung cancer in a patient who was initially diagnosed with sinus issues.

Google says that Med-Gemini did not hallucinate when it cited the non-existent term “basilar ganglia” in a research paper, but instead relied on radiology reports for training in which “basal” was often erroneously transcribed as “basilar.”

A watchdog group finds that ChatGPT will advise teens on how to get drunk or high, hide eating disorders, and write a suicide note.
Business

Health tech and AI marketplace operator Elion raises $9.3 million in seed funding. The company says that 60% of US health systems have used its service.
Research
Researchers find that LLMs hallucinated 50% to 82% of the time when a single false element, such as a bogus lab result or a nonexistent condition, was inserted into simulated clinical notes, warning that “adversarial hallucination” poses a serious risk to real-world AI uses such as clinical decision support.

Mass General Brigham researchers develop FaceAge, an AI tool that estimates age from facial photos. They found that cancer patients often appear five or more years older than their actual age, with the most aged-looking patients having the lowest odds of survival. They say that doctors already use visual assessments when considering ordering chemotherapy or radiation and the tool will help them quantify that assessment.
A study finds that LLMs can screen EHR data to identify clinical trial candidates but sometimes perform poorly on specific eligibility criteria, leading the researchers to instead score patients by the percentage of requirements they meet.
Other
Google DeepMind CEO Demis Hassabis predicts that AI may eventually take over aspects of diagnosis and decision-making that doctors typically perform, but that it will never replace nurses because it can’t provide empathy, emotional support, and human connection.
Contacts
Mr. H, Lorre, Jenn, Dr. Jayne.
Get HIStalk updates.
Send news or rumors.
Follow on X, Bluesky, and LinkedIn.
Sponsorship information.
Contact us.