
Viewpoint: AI may boost our diagnostic abilities, but it’s not ready to replace human doctors

I read a recent article in Nature Medicine about new inroads in deploying artificial intelligence (AI) in pediatrics. In the article, researchers report their success in using AI to mine electronic health records as a diagnostic tool. As the authors point out, this is not just a matter of making clinicians’ lives easier: missed diagnoses and misdiagnoses occur with disturbing frequency, leading to increased morbidity and mortality and higher costs.

However, I must sound a note of caution. While AI may be helpful in diagnosis, unless a day comes when machines can fully replicate human thought and emotions, we should be wary of allowing AI to move beyond diagnosis and actually make medical management decisions for us.


From the perspective of pediatric palliative care, incorporating AI is tempting, but I worry that radical integration would be shortsighted. The key to navigating complex decisions with families is a careful examination of hopes, fears, values and goals.

Should AI be used to augment medical decision making? Absolutely, and we are only beginning to explore the forms that augmentation might take. But replacing human beings is a separate issue altogether.

Read full, original post: Where AI in Medicine Falls Short

The GLP aggregated and excerpted this article to reflect the diversity of news, opinion, and analysis. Click the link above to read the full, original article.
