Pierre Duhem wrote a monograph, *To Save the Phenomena*, in which he discusses the Galilean revolution in physics that made mathematics the language of physics. Before that revolution, a proper scientific theory not only had to fit the observed phenomena; the model also had to “make sense.” To illustrate the last point: one example in Duhem’s essay is that an acceptable model is one that could be built by a skilled carpenter. Astronomy, when it used mathematical models merely to fit the observations, failed this test of being properly scientific. Abstract mathematical models were not considered “real.”

In a November 2011 interview in *The Atlantic*, Noam Chomsky makes an argument similar to the pre-Galileans’ in his criticism of statistical and machine-learning techniques. This time around, the “acceptable” model of scientific method is the algorithm or the mathematical model. Statistical analysis of big data to discover connections and relations is fine for getting results (i.e. *it saves the phenomena*), Chomsky argues, but it does not give us insight into what is really going on.

It will be interesting to see where this debate takes our concept of what proper science is.