Mainstream commentators find it difficult to understand why healthcare is one of the sectors most resistant to “innovation”. The enterprises running it at full steam need someone to blame when it comes to “certifications”. Meanwhile, healthcare costs keep rising, driven by bloated bureaucracy and insulation from competition. Against this backdrop, AI has been hailed as a solution to “transform” healthcare and “speed up” diagnosis and workflows.
In specific medical settings, AI is already proving to be a valuable tool. Pathologists, for example, increasingly lean on AI when carrying out cancer diagnoses. Machine learning techniques, a subset of AI, have been applied to histopathology with the aim of speeding up cancer diagnostics by identifying and grading tumours.
Regina Barzilay, a computer scientist at the Massachusetts Institute of Technology, developed a machine-learning model to identify women at high risk of developing breast cancer that was more accurate than that used by physicians. A researcher in France last month developed an AI model along with start-up Owkin that predicts the risk of breast cancer patients having a metastatic relapse, which the group argues could be used to help avoid unnecessary chemotherapy. (emphasis mine)
Models come and go like seasons. These algorithms require wider validation, along with attention to cost and standardisation. The coverage reads like a memo from the executive team pushing the agenda for AI adoption. Who pays for the cloud or on-premise infrastructure? What is the average turnaround time? Who is responsible for false negatives (which matter more than false positives)? What is the certification process?
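The asymmetry between false negatives and false positives is worth making concrete. A minimal sketch, with entirely hypothetical screening numbers: a false negative is a missed tumour and a delayed diagnosis, while a false positive is a false alarm usually caught on follow-up testing. Validation therefore has to report sensitivity (cancers caught), not just headline accuracy.

```python
# Toy illustration (hypothetical numbers) of why false negatives matter
# more than false positives in cancer screening.

def confusion_counts(y_true, y_pred):
    """Count true/false positives and negatives for binary labels
    (1 = cancer present/flagged, 0 = clear)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    return tp, fn, fp, tn

# Hypothetical screening cohort of ten patients.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

tp, fn, fp, tn = confusion_counts(y_true, y_pred)
sensitivity = tp / (tp + fn)  # share of real cancers the model catches
specificity = tn / (tn + fp)  # share of healthy patients correctly cleared

print(f"missed cancers (false negatives): {fn}")  # the costly error
print(f"false alarms (false positives): {fp}")    # caught on follow-up
print(f"sensitivity: {sensitivity:.2f}, specificity: {specificity:.2f}")
```

A model with one false negative and one false positive looks symmetric on paper, but the two errors carry very different clinical costs, which is exactly why the certification and liability questions above need answers before deployment.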
AI won’t magically fix process inefficiencies; the basics have to be sorted out first. Open up EMR solutions to competition. Enforce common standards and transparent pricing for lab procedures. Trim the excess bureaucracy. Establish quantifiable parameters.
These are common-sense solutions. Common sense, though, is uncommon.