Islamic Statistics: A Disruptive Technology

A disruptive technology is an innovation that significantly alters the way that consumers, industries, or businesses operate. A disruptive technology sweeps away the systems or habits it replaces because it has attributes that are recognizably superior. I am now actively looking for teachers of introductory statistics courses throughout the Islamic world. If you would like to learn a revolutionary approach to teaching statistics, based on Islamic epistemological principles, please sign up via the Google Form: https://forms.gle/FhbmVcoZkQTT9wku5

In previous posts, Preface to Radical Statistics and Why an Islamic Approach to Statistics?, I have explained how we can develop a new approach to statistics, one which rejects a century of developments based on a methodology created by Sir Ronald Fisher, the father of modern statistics. It is my hope that this will be a disruptive technology: it will eventually sweep away the entire structure of knowledge which has been built up under this name, and replace it with a radically different alternative. This structure of statistics (which I studied in detail and depth in the Ph.D. Statistics program at Stanford University) is currently being taught in universities around the globe. It is deeply deficient for a number of reasons, the most important of which is the logical positivist philosophy used to create the foundations of the discipline in the early 20th century. We briefly describe the key problems this created.

First, a very brief introduction to the Logical Positivist philosophy. Over a century of brutal and massively destructive religious warfare between Christian factions made it necessary for European intellectuals to find an alternative to religion on which to build a secular political science. Bitter experience had shown them that Christianity was highly unsuitable for this purpose. Ultimately, over a period of more than two centuries, this led to the creation of a whole body of knowledge which forms the foundations of secular modernity. Given the warfare among Christian factions, it was a necessity for secular knowledge to command consensus without reference to an agreed-upon body of religious or moral knowledge. Therefore, modern secular knowledge claimed to be a body of objective knowledge, universally applicable to all human societies, and derived on the basis of publicly confirmable observations and uncontestable logic alone. This theory of secular knowledge received its final polished form in the early 20th century with the development of the philosophy of logical positivism (LP). According to LP, valid knowledge could be based only on observations and logic. Science was based on observations and logic and was the only form of valid knowledge. Religion was based on unobservables like God, the afterlife, the Day of Judgment, angels, etc., and hence was not a form of valid knowledge. For a more detailed account, see The Emergence of Logical Positivism.

In fact, the task facing the European intellectuals was an impossible one. Any body of knowledge which informs us about appropriate political, economic, and social institutions and regulations must necessarily be built on moral foundations. But violent disagreements among Christian factions made it impossible to find suitable moral foundations on which consensus could be achieved. This led to the use of "reason" as a code-word for developing knowledge: reason could be re-defined in ways suitable to the desired epistemological goals. Thus Kant used reason to arrive at Newton's laws of gravity, and also to devise a universal moral code of behavior, applicable to all "rational" human beings. Continuing in this fashion, economists today define "rationality" in a peculiar way, endorsing as rational only behavior which seeks to maximize worldly pleasure; belief in the unseen is considered irrational. The punchline is that secular modern knowledge claims to be objective, universal, and applicable to all human societies, but it conceals a large base of Eurocentric assumptions. For a more detailed account of what has been concealed beneath the facade of "reason", see The Puzzle of Western Social Science and Origins of Western Social Sciences.

At the heart of Logical Positivism is a rejection of unobservables as a basis for knowledge. This stands in dramatic contrast to the message of the Quran, which opens by explaining that Taqwa (God-consciousness) requires faith in the unseen. It is this rejection of unobservables that leads to serious problems at the heart of modern statistics. The concept of probability refers to unobservable events which might have happened, instead of what actually occurred. Statistics is built on flawed foundations because of Logical Positivist efforts to remove these unobservables, in accordance with the misconception that science is based purely on observables. In probability, the awkward and incoherent frequency definition, and the equally awkward and incoherent Bayesian definitions, are in use in textbooks around the world. Neither refers to probability as being about what might have happened, because such terminology is meaningless according to LP. Equally serious and fundamental is the problem that causality is never observable; only correlations are observable. So despite clarity on the fact that correlation is not the same as causation, statisticians continue to equate the two, because they have no satisfactory theory of causation (see Causality as Child's Play). Within the philosophical framework of LP, no satisfactory theory of probability or causation can ever be developed, because both concepts relate fundamentally to unobservables.
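As a purely illustrative aside, the frequency definition criticized above can be sketched in a few lines of Python. The sketch assumes a fair coin (p = 0.5) and shows the relative frequency stabilizing over many repeated trials; by construction, such a definition can say nothing about the probability of a single, unrepeatable event.

```python
import random

random.seed(1)

# The frequency definition identifies P(heads) with the long-run
# relative frequency of heads over repeated, identical trials.
def relative_frequency(n_flips: int) -> float:
    """Fraction of heads in n_flips simulated tosses of a fair coin."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# As the number of trials grows, the frequency settles near 0.5.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))

# What the definition cannot express: an unrepeated event (e.g. rain
# tomorrow) has no long-run frequency at all, which is one source of
# the incoherence discussed above.
```

The simulation only displays what the definition asserts; it does not resolve the conceptual problem, since the "long run" itself is an unobservable idealization.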

In addition, a third fundamental defect in modern statistics was created by the lack of computing capabilities. Since large data sets were difficult to handle, Sir Ronald Fisher defined statistics to be about the reduction of data to a small number of "sufficient statistics". The technique he devised for this purpose was to imagine that the data is a random sample from a hypothetical parent population. A well-chosen parent population can be characterized by a few parameters, and modern statistics became the art of learning about these few parameters from the data. Modern textbooks continue to base statistics on this methodology, even though advances in computational capabilities have rendered it obsolete. By careful choice of the imaginary parent population, an expert statistician can make any data set appear to have any pre-selected characteristics desired. With modern computers there is no need to reduce the data, since we can analyze enormously large data sets directly, without imposing simplifying assumptions upon them. This simple fact has gone largely unnoticed, and statistics continues to be taught in the Fisherian mold.
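The contrast described above can be sketched in a few lines of Python (the data set, the assumed normal parent population, and the chosen percentile are all purely illustrative). A skewed data set reduced to the two sufficient statistics of an assumed normal parent population loses information that a direct computation on the full data set retains.

```python
import random
import statistics

# Illustrative data: 10,000 draws from a skewed (exponential) process.
random.seed(0)
data = [random.expovariate(1.0) for _ in range(10_000)]

# Fisherian reduction: assume a normal parent population, so the sample
# mean and variance are sufficient statistics; all other detail in the
# data is discarded.
mean = statistics.fmean(data)
var = statistics.variance(data)

# Direct analysis: with modern computing we can query the full data set
# itself. An empirical 95th percentile needs no parent population.
p95_direct = sorted(data)[int(0.95 * len(data))]

# Under the assumed normal parent population, the 95th percentile would
# instead be estimated as mean + 1.645 * std, which ignores the skewness
# that is plainly present in the full data set.
p95_normal = mean + 1.645 * var ** 0.5

print(p95_direct, p95_normal)
```

The point of the sketch is not that one number is right and the other wrong, but that the reduction step builds the analyst's distributional choice into every subsequent answer, while the direct computation does not.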

These fundamental defects in the foundations of modern statistics render it ripe for a revolution. This is what my new textbook in statistics, entitled Real Statistics: A Radical Approach, attempts to accomplish. In a previous post (Preface to Radical Statistics) I described some of the key features of the new textbook and provided links to the first draft. The first draft basically compiles a large number of lectures which were created independently, though with a view to coherence and sequencing. Nonetheless, these posts contain repetition and are written in an informal manner. I am now in the process of creating the second draft, which should be a nearly final and polished draft, ready for submission for publication. As I go through the revisions, I will be putting the revised sections up on this blog for comments and feedback. Teachers and students of statistics are especially encouraged to participate in this venture by following along and offering comments on the posts regarding clarity, coherence, or any other relevant issue. It is worth noting that this textbook will be published for the purpose of spreading knowledge, and will be made available freely, or at minimal cost to cover expenses, to as large an audience as possible. In the next post, I will discuss the political power/knowledge issues which arise when attempting to deploy a disruptive technology: one which renders previous approaches obsolete and calls for a revolution in how statistics is taught around the world.


About Asad Zaman

Asad Zaman (BS Math, MIT, 1974; Ph.D. Econ, Stanford, 1978) has taught at leading universities such as Columbia, U. Penn., Johns Hopkins, and Cal Tech. Currently he is Vice Chancellor of the Pakistan Institute of Development Economics. His textbook Statistical Foundations of Econometric Techniques (Academic Press, NY, 1996) is widely used in advanced graduate courses. His research on Islamic economics is widely cited and has been highly influential in shaping the field. His publications in top-ranked journals such as the Annals of Statistics, Journal of Econometrics, Econometric Theory, and Journal of Labor Economics have more than a thousand citations as per Google Scholar.
