
Readers Write: AI Can’t Feel Emotions, But It Can Be Designed to Care

March 4, 2026

By Richard Mackey

Richard Mackey, MBA is CTO at CCS.


AI-assisted chronic disease management is becoming a reality. Some of the biggest AI companies have set their sights on healthcare with the launch of solutions like ChatGPT for Health and new personal health data management tools like those offered by Claude from Anthropic.

Chronic diseases like diabetes, heart disease, and depression require not just medical oversight, but consistent engagement, trust, and behavioral support. AI tools are starting to offer just that, both inside and outside of the traditional care environment.

Still, if those AI interactions feel cold, impersonal, or judgmental, they can drive disengagement, the opposite of what’s needed to improve long-term patient outcomes.

Done poorly, AI can amplify the very problems that it is supposed to solve. Done well, empathetic AI becomes a force multiplier, extending the reach of human care, building trust at scale, and helping people feel supported, even when interacting with a “machine.”

When Empathy Is a Design Challenge

Empathy in AI isn’t about simulating emotions or pretending to be human. AI shouldn’t try to pass as human, but it does need a working understanding of the emotional context of each interaction and the ability to respond in a way that feels respectful, supportive, and authentic. In other words, empathy in AI is a design problem, one that spans data, UX, language, and intended purpose.

Consider the example of a patient managing type 2 diabetes. If a patient stops using their continuous glucose monitor, a typical automated system might flag it as noncompliance. But an empathetic AI agent that is trained not just to process the data but also to understand human behavior might recognize subtle signals in the data that indicate emotional burnout or socioeconomic barriers, and adjust the tone of outreach accordingly. That could mean offering reassurance instead of reminders, or escalating the case to a human clinician or social worker for follow-up.
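To make the distinction concrete, the logic described above can be sketched as a simple decision rule. This is a minimal, hypothetical illustration, not a description of any real product: the signal names, thresholds, and outreach categories are all assumptions invented for the example.

```python
from dataclasses import dataclass


@dataclass
class PatientSignals:
    """Hypothetical signals a monitoring pipeline might surface."""
    days_since_last_cgm_reading: int
    missed_outreach_replies: int      # unanswered check-in messages
    recent_negative_sentiment: bool   # e.g., inferred from prior messages


def choose_outreach(signals: PatientSignals) -> str:
    """Pick an outreach style instead of flagging 'noncompliance'."""
    # A long silence combined with unanswered messages or negative
    # sentiment may indicate burnout or barriers: escalate to a human
    # clinician or social worker rather than send another reminder.
    if signals.days_since_last_cgm_reading > 14 and (
        signals.missed_outreach_replies >= 3
        or signals.recent_negative_sentiment
    ):
        return "escalate_to_clinician"
    # A shorter lapse gets reassurance, not a compliance nudge.
    if signals.days_since_last_cgm_reading > 3:
        return "send_reassurance"
    return "no_action"
```

The point of the sketch is the branching itself: the same data point (a lapsed device) routes to different tones and different humans depending on context, rather than to a single automated "noncompliance" flag.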

Striking the right tone in the agent’s communication, for example, leading with understanding or encouragement rather than correction, will make a meaningful difference in whether a patient reengages or shuts down.

The ROI of Empathy

In value-based healthcare, where providers and health plans are financially accountable for outcomes, empathetic AI that is embedded in chronic disease management workflows can have measurable impact. AI can use sentiment analysis or behavioral cues to help identify patients who are at risk of disengagement or decline, triggering proactive interventions from human outreach staff.
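The sentiment-driven triage described above might look something like the following sketch. The scoring range, thresholds, and risk labels are assumptions for illustration only; a real system would tune these against outcomes data.

```python
def disengagement_risk(sentiment_scores: list[float]) -> str:
    """Classify disengagement risk from recent message sentiment.

    Scores are assumed to fall in [-1, 1], one per recent patient
    interaction. A sustained downward trend, not a single bad day,
    is what should trigger proactive human outreach.
    """
    if len(sentiment_scores) < 3:
        return "insufficient_data"
    recent = sentiment_scores[-3:]
    avg = sum(recent) / len(recent)
    # True only if each score is at or below the one before it.
    trending_down = all(b <= a for a, b in zip(recent, recent[1:]))
    if avg < -0.3 and trending_down:
        return "high_risk_notify_outreach_team"
    if avg < 0:
        return "monitor"
    return "stable"
```

Note that the high-risk branch hands off to human outreach staff rather than triggering more automated messages, consistent with the role the article assigns to AI: surfacing the patients who need a person.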

AI can also handle routine administrative tasks with appropriate tone and timing and without clocking out at the end of an eight-hour shift, freeing up human clinicians to focus on complex, relationship-based care that fosters engagement and sustains motivation.

The result is fewer hospitalizations, higher therapy adherence, improved satisfaction scores, and ultimately, a better chronic care experience and better health outcomes at lower cost.

Designing for Trust in the Age of Automation

As AI becomes more embedded in the healthcare ecosystem, its ability to convey empathy in a transparent way must be a priority. Research has already shown that it’s possible, with human respondents rating AI responses as more empathetic and engaging across scenarios ranging from crisis situations and cancer care to everyday communications from healthcare providers.

The consumer world is quickly operationalizing this approach, with companies like beauty brand Sephora and airline Qatar Airways scoring accolades for their AI assistants’ optimal blend of digital efficiency, personalization tools, and engagingly empathetic personality. As companies like OpenAI and Anthropic turn their attention to healthcare, they are likely to lean into a similar empathy-first approach to assist individuals with healthcare-specific tasks.

The key to success will be maintaining transparency and trust in the AI-powered healthcare ecosystem as we leverage the technology’s seemingly near-limitless potential. The bottom line is that we don’t need AI to have feelings, but we do need it to understand ours, especially at the moments when patients most need support and care.
