When AI Hallucinates: Ferreting Out Falsifications From Facts

Reginald the Rooster - Original art by Heather Thompson, created with the assistance of the generative AI Midjourney

GETTING AN AI TO CONFESS


It's a little sad how profusely some LLMs apologize for their mistakes, but considering how confident they are while falsifying information, I suppose it's appropriate.


When an AI generates misinformation, it is called a "hallucination." When it hallucinates on healthcare topics, the consequences can be severe. In this week's experiment, one AI delves into its own limitations.


New Article in the Home Care Technology Report

Heather Thompson is Home Care Tech Report's AI Specialist. In addition to AI, her focus areas are growth strategy and the Agency of the Future. Previously known as Heather Rooney, she started her career at OCS, a pioneer in benchmarking and business intelligence for healthcare at home. She later founded her own firm, Heather L. Rooney Strategy & Marketing. Heather has over 20 years of experience in home health, hospice, and private-duty home care. She is a nationally recognized thought leader, keynote speaker, and respected voice in major publications, with a solid reputation for helping organizations position themselves for dynamic shifts and emerging trends. She is also an award-winning artist, contemplative theologian, and disability/rare disease advocate published worldwide. Heather is excited to be back with the home care community during this unprecedented moment in technological history.


