Tuesday, October 10, 2023
The ‘model-eat-model world’ of clinical AI: How predictive power becomes a pitfall
By Katie Palmer, Oct. 10, 2023
https://www.statnews.com/2023/10/10/the-model-eat-model-world-of-clinical-ai-how-predictive-power-becomes-a-pitfall/?utm_campaign=morning_rounds&utm_medium=email&_hsmi=277609794&_hsenc=p2ANqtz--xNN9rtCaK9enf_gIs52onEGb_G3DjJ6-g0kbZoEbKELOx-f_1TU5cxoqvNzzhAL4MO8FL0Mi9Bzv7U9rhcpcCMA1Pvg&utm_content=277609794&utm_source=hs_email
This is both counterintuitive and concerning. Just as medicine applies more AI tools to predict serious events like strokes or sepsis, those models can fall victim to their own success: as their performance plummets, their inaccurate results can cause harm instead of preventing it. “There is no accounting for this when your models are being tested,” said Akhil Vaid, author of a new paper in the Annals of Internal Medicine on models that predict death and kidney injury in ICU patients. “When it starts to work, that is when the problems will arise.”
It’s called data drift. Successful predictive models create a feedback loop: the AI tool helps keep patients healthier, so electronic health records come to reflect lower rates of kidney injury or mortality, and when that shifted data is later used to retrain the models, the associations they learned no longer hold. STAT’s Katie Palmer explains.
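The feedback loop Palmer describes can be made concrete with a small simulation. The sketch below is not from the article; it is a minimal, synthetic illustration (made-up risk weights, a hypothetical 90% intervention success rate, an assumed 30% flagging threshold) of how retraining on post-deployment records degrades a model: the original model flags high-risk patients, clinicians intervene and prevent most of those events, the health record then shows those patients as event-free, and a model retrained on that record performs worse on fresh, untreated patients.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression; last weight is the bias."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return sigmoid(Xb @ w)

def auc(y, scores):
    """Rank-based AUC: probability a positive case outranks a negative one."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y == 1
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

# Made-up "true" risk relationship for the synthetic patients.
true_w = np.array([1.5, -1.0, 0.8])

def simulate(n):
    """Synthetic patients: 3 features, untreated outcome drawn from true risk."""
    X = rng.normal(size=(n, 3))
    y = (rng.random(n) < sigmoid(X @ true_w - 1.0)).astype(float)
    return X, y

# 1. Train on pre-deployment data, where outcomes are untreated.
X0, y0 = simulate(5000)
w_orig = fit_logreg(X0, y0)

# 2. Deployment: clinicians intervene on the model's top-risk patients and
#    prevent most of those events, so the record stores those labels as 0.
X1, y1 = simulate(5000)
risk = predict(w_orig, X1)
treated = risk > np.quantile(risk, 0.7)            # top 30% flagged (assumption)
prevented = treated & (rng.random(len(y1)) < 0.9)  # 90% effective (assumption)
y1_ehr = np.where(prevented, 0.0, y1)

# 3. Retrain on the post-deployment record labels.
w_retrained = fit_logreg(X1, y1_ehr)

# 4. Evaluate both models on fresh, untreated patients: the retrained model,
#    having learned from labels its predecessor helped suppress, scores worse.
X2, y2 = simulate(5000)
print(f"original  AUC: {auc(y2, predict(w_orig, X2)):.3f}")
print(f"retrained AUC: {auc(y2, predict(w_retrained, X2)):.3f}")
```

The drop happens because the retrained model sees its predecessor's best predictions contradicted in the data: the very feature patterns that signal high risk are now paired with "no event" labels, so the retrained weights are attenuated toward the patterns the first model missed.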
AI gone astray: How subtle shifts in patient data send popular algorithms reeling, undermining patient safety
By Casey Ross, Feb. 28, 2022
Data analysis by Adam Yala, Janice Yang and Ludvig Karstens — Jameel Clinic, Massachusetts Institute of Technology
https://www.statnews.com/2022/02/28/sepsis-hospital-algorithms-data-shift/?utm_campaign=morning_rounds&utm_medium=email&_hsmi=277609794&_hsenc=p2ANqtz-8PAB7qbK4sKeAexcVK2xgiA9KTPe6gWeN4rcF6pByhfsBoIMi93ToYWrG0oGAP_e5LyOA7h2vtJSSlwA7hLzgmZSiPqg&utm_content=277609794&utm_source=hs_email