Machine learning is the new snake oil for “personalised medicine”. Let’s start with this scenario: a genome-wide study is done to identify “potential inhibitors”, but where are the drugs? What additional inputs are we providing to get a meaningful outcome, measured in terms of overall survival?
The following excerpt is from a physics article, but it should make us pause and reflect: why is everything being overhyped?
The flaws in these particular studies are not huge in themselves, but the way they have been reported is symptomatic of a problem that is in fact serious. Such overblown reports lend false credence to the idea—most notoriously promoted in Chris Anderson’s 2008 infamous article, “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete,” and increasingly part of the general zeitgeist—that AI generally and deep learning theory will soon replace all other approaches to computation or even to knowledge, when nothing of the sort has been established. The media enthusiasm for deep learning is sending the wrong impression, making it sound like any old problem can be solved with a massive neural network and the right data set, without attention to the fundamentals in that domain.
The truth is that many of the hard, open problems in the world require a great deal of expertise in particular domains. For the sort of problems in these two papers, this means that anyone who wants to make a useful contribution to the solution of the three-body problem or similar problems has to spend serious time studying the sophisticated science of differential equations, numerical computation, and dynamical systems. In the domain of natural language understanding, it might similarly mean incorporating a great deal of work in linguistics and psycholinguistics, rather than just gathering a large data set and a large assembly of sophisticated computers.
The widespread misimpression is that data + neural networks is a universal formula.
via Are Neural Networks About to Reinvent Physics? – Issue 78: Atmospheres – Nautilus