The corona pandemic has not even reached its peak, and it poses an exciting challenge to researchers: can AI solve this problem? We might see several peaks of the outbreak, as the virus tends to remain undetected and may spread from asymptomatic individuals to others.
I had been part of a Tweetstorm to find novel ways to understand how hospitals and healthcare systems are rising to the challenge. It was apparent that hospitals would need to make their capacity public: number of beds, ventilators, trained personnel, personal protective equipment and the like. Yet even if someone mashed up that data to make sense of the emerging pandemic, given the rather narrow definition of who gets tested, we would still be staring into an abyss. I think it was a somewhat optimistic projection, with a firm Twitter resolve to “stop the surge”.
Impossible, because healthcare systems are balkanised, work in silos and profit only by maintaining information asymmetry. It is next to impossible to execute contingency plans when the sudden surge has to be managed at home.
What this crisis has shown is that making use of the technology is, at its heart, a data problem. The algorithms need to be trained on large amounts of information before they can be put to work. A novel coronavirus, by definition, is not something that has been seen before, which hampers their use. But there are also many institutional and technical barriers that make it hard to collect and apply information in the population-wide, real-time way that is needed.
The author gives the example of the NHS but at the same time identifies critical operational lacunae: hospitals work in data silos. Data interoperability is a long way off, and it would likely be resisted because centralising the data would create a single point of failure.