A lot of people today have trouble trusting artificial intelligence not to become sentient and take over the world à la "The Terminator." For the time being in healthcare, one of the big questions for clinicians is: Can I trust AI to help me make decisions about patients?
Today’s podcast seeks to provide some answers with a prominent member of the health IT community, Dr. Blackford Middleton.
Middleton is an independent consultant, currently working on AI issues at the University of Michigan Department of Learning Health Systems. Previously, he served as chief medical information officer at Stanford Health Care, CIO at Vanderbilt University Medical Center, corporate director of clinical informatics at Partners HealthCare System, assistant professor at Harvard Medical School, and chairman of the board at HIMSS.
Like what you hear? Subscribe to the podcast on Apple Podcasts, Spotify or Google Play!
Talking points:
- One of the biggest considerations the industry faces with AI is trust.
- How can executives at healthcare provider organizations convince clinicians and others to trust an AI system they want to implement?
- What must vendors of AI systems for healthcare do to foster trust?
- It's extremely important to ensure all parties involved are comfortable with collaboration between man and machine for decision-making.
- How do healthcare organizations foster such comfort?
- What must provider organization health IT leaders know about patient-facing AI tools?
- What do the next five years look like with AI in healthcare? What must CIOs and other leaders brace for?
More about this episode:
- Where AI is making a difference in healthcare now
- UNC Health's CIO talks generative AI work with Epic and Microsoft
- Penn Medicine uses AI chatbot 'Penny' to improve cancer care
- Healthcare must set guardrails around AI for transparency and safety
- How ChatGPT can boost patient engagement and communication
Follow Bill’s HIT coverage on LinkedIn: Bill Siwicki
Email him: bsiwicki@himss.org
Healthcare IT News is a HIMSS Media publication.