JR's Free Thought Pages
What Is a Theory?

By JR, July 2022

Introduction

What is often described as the “scientific method” is a set of strategies, methodologies, heuristics and procedures, from whose investigative results a “theory” may emerge. But since the 1960s, when Thomas Kuhn argued in his influential book The Structure of Scientific Revolutions that the criterion of “falsification” formulated by Karl Popper as the demarcation between science and pseudoscience is insufficient on its own to determine the scientific validity of an idea or theory, many scientists have come to hold that there is no single methodology or criterion for distinguishing scientific theories from non-scientific ones. Kuhn himself added confusion to the scientific landscape, especially with his notion of “paradigm shifts”, which has greatly contributed to the widespread epistemic relativism we now call postmodernism. Kuhn further muddied the waters in that book by rejecting the established rules for determining scientific validity, expanding the conception of science to include economics [1] (called the “dismal science” by many members of the “hard sciences” such as physicists, chemists and biologists) and much of psychology, including Freud’s theories of psychoanalysis and the discredited “chemical imbalance in the brain” theory of psychological maladies such as depression and psychosis. The problem with this, as even Kuhn had to admit, is that it makes it very difficult to distinguish between science on the one hand and pseudo-science, intellectual rubbish and religious mumbo jumbo on the other. A glaring example of the consequences is the country of guns and bibles, the United States of Jesus and MAGA, where numerous conspiracies prevail and creationists have for decades farcically claimed that Creation Science and Darwinian Evolution should be given equal time in high school biology classes.
On the other hand, theoretical physicists have produced arcane ideas such as String Theory, justified primarily by its mathematical rigour and elegance, with precious little empirical and experimental evidence. String Theory and the Big Bang perhaps ought to be considered hypotheses rather than bona fide theories.

The Scientific Concept of Theory

In everyday use the word “theory” has been grossly misconstrued from the perspective of the scientist. People who are not sufficiently knowledgeable in the sciences generally consider a theory to be an untested hunch, conjecture or hypothesis – even a guess – without supporting facts, investigation, experimentation and evidence. In some respects “theory” is conceptually related to the concept of a “theorem” in mathematics. All logical arguments, both inductive and deductive, are based on one or more assumptions (or premises) that are generally derived from the intuitively obvious. In mathematics these are called axioms or postulates, of which “a + b = b + a” is an example from elementary algebra. A theorem in mathematics is a statement established by a sequence of deductive reasoning steps, called a proof, based on axioms and, in most cases, other previously proven theorems. Many readers are likely familiar with the proof of Pythagoras’ Theorem from high school Euclidean geometry. All arguments, of course, succeed or fail based on the quality of their premises. The Random House College Dictionary defines theory as “a coherent group of general propositions used as principles of explanation for a class of phenomena and observed facts” and cites Newton’s Theory of Gravitation and the Theory of Evolution as common examples. This is the generally accepted definition pertaining to the corroborated results of scientific inquiry, as opposed to the mutated popular conception of “guess or conjecture”, which within science is a potential or probable explanation called a hypothesis.
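The axiom and theorem notions mentioned above can be put side by side in a minimal sketch (my own illustration, using standard textbook examples, not anything from the dictionary entry):

```latex
% A minimal sketch of the axiom/theorem relationship. The axiom is assumed
% without proof; the theorem is derived by deductive steps and, unlike an
% empirical scientific theory, is certain given the axioms.
\documentclass{article}
\usepackage{amsthm}
\newtheorem*{axiom}{Axiom}
\newtheorem*{theorem}{Theorem}
\begin{document}

\begin{axiom}[Commutativity of addition]
For all numbers $a$ and $b$, $a + b = b + a$.
\end{axiom}

\begin{theorem}[Pythagoras]
In a right triangle with legs $a$, $b$ and hypotenuse $c$,
\[ a^2 + b^2 = c^2. \]
\end{theorem}

% A concrete instance: the 3-4-5 right triangle, since
% $3^2 + 4^2 = 9 + 16 = 25 = 5^2$.

\end{document}
```

The contrast is the essay's point in miniature: a theorem is certain relative to its axioms, whereas a scientific theory is always provisional relative to the evidence.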
In certain branches of mathematics we have number theory, game theory, probability theory and the theory of limits and the concept of infinity, the latter two of which are prerequisites for understanding calculus. These examples are foundational principles involving definitions, axioms, theorems and formulae for an area of specialized mathematical investigation. Consider how “theory” is generally understood within the notion of “conspiracy theories” in our world of social media and misinformation, in which many people do not even understand the distinction between fact and opinion. Yes, in our so-called “democracies” we are entitled to our opinions, however irrational, but not to our own facts. In almost every case, conspiracy theories such as QAnon or the claim that vaccines cause autism are NOT genuine theories in the scientific sense of empiricism, testing, experimentation, logical coherence and consistency, and a host of other characteristics required by cogency of argument and genuine scientific inquiry. The disturbing truth is that most people tend to believe what ideologically and emotionally appeals to them and what they’ve been taught by their families and culture, filtering out information that conflicts with or disconfirms deeply held beliefs. This cognitive flaw especially afflicts the religious, but all of us are vulnerable to some degree, and it is something we need to be actively aware of if we are to have any chance of overcoming false beliefs. What feels to us like a rational position might not be anything of the sort; frequently it is instead an emotional decision dressed in the borrowed garb of rational thought, entangled with the very fabric of how we define ourselves. This makes us resistant to changing our minds, even when the available facts and evidence conflict with our beliefs.
As Jonathan Swift once observed, “reasoning will never make a man correct an ill opinion, which by reasoning he never acquired.” Clinging to irrational and false beliefs is inevitably detrimental to us and, I believe, has a moral dimension, what has been called the “ethics of belief”. Whether the issue is religious claims, climate change, health policy or even politics, we need to be able to evaluate the available information critically, without the lens of ideology or cultural bias distorting our ability to know the truth. Surely not caring about the truth is within the ethical realm. While we may hold incredibly strong personal convictions, the universe or reality doesn’t give a damn what we believe. And if we persist in choosing ideology over scientific research, evidence and theory, we diminish and even endanger ourselves and others. We are entitled to our own opinions, but not our own facts. [2]

For professional scientists (for example someone with a PhD in physics, chemistry or biology), “theory” has nearly the opposite of its everyday meaning. A theory is distinct from a hypothesis, which may become a theory following experimentation, the collection of facts and a cogent, well-substantiated explanation of those facts; a theory explains an aspect of the natural world and can incorporate laws, hypotheses and facts. The theory of gravitation, for instance, explains why apples fall from trees rather than heading skyward and why astronauts, as in the movie 2001: A Space Odyssey, float about in space. Similarly, the Theory of Evolution brilliantly explains how plants and animals evolved and mutated (as covid-19 is doing today), some becoming very similar and some very different, as in isolated Australia, and how and why they came to exist on earth now and in the past, often documented by the fossil record. One of the strengths of science is that theories are always provisional; like all science they are probabilistic and never make claims to certainty.
If you want certainty, there are the palliatives and supernatural delusions of religion. A theory not only explains known facts; it also allows scientists to make predictions of what they should observe if the theory is correct. A theory must be testable and falsifiable – that is, there must be a possible way of demonstrating it false. That is why religion and other supernatural, paranormal and pseudo-scientific rubbish such as astrology are not scientific even at the level of a hypothesis. Testability and falsifiability are thus two of the most important criteria of scientific theories. In addition, new evidence ought to be compatible with a theory; if not, the theory is modified, refined or even rejected. The longer the central claims of a theory persist, the more observations it predicts, the more tests it passes and the more facts it explains, the stronger the theory. But not all theories are equal: the Big Bang, say, is not as evidentially strong as the Theory of Gravitation or the Theory of Evolution and the genetics that flowed from it, the latter of which is the foundation of all modern biology. Many advances in science, such as the development of genetics following Darwin's death, have greatly enhanced evolutionary thinking and resulted in huge advances in medical science. Yet even with these new discoveries, the theory of evolution still persists today, much as Darwin first described it, and is universally accepted by all genuine scientists. The trap of infinite regress notwithstanding, the religious claims about deities and creation myths (including the ludicrous “intelligent design” Trojan horse, which is just a code phrase for “creationism”) are not science, and they were in most cases banished from the science class decades ago.
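The provisional, probabilistic character of theories can be sketched with a toy Bayesian calculation (my own illustration; the probabilities are invented assumptions, not data): each successful prediction strengthens a theory, yet confidence never reaches certainty.

```python
# A toy sketch of evidence accumulating probabilistically. Each successful
# prediction raises our provisional confidence in a theory via Bayes' rule;
# a failed prediction would lower it. All numbers are illustrative.

def bayes_update(prior, p_obs_if_true, p_obs_if_false):
    """Posterior probability that the theory is true after one observation."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1 - prior))

confidence = 0.5  # start agnostic between the theory and its rival
# Suppose the theory predicts each observation with probability 0.9,
# while a rival "no effect" view predicts it with probability only 0.3.
for _ in range(5):  # five successful predictions in a row
    confidence = bayes_update(confidence, 0.9, 0.3)

print(round(confidence, 3))  # 0.996: high confidence, but never certainty
```

Note that the posterior approaches but never equals 1: in this picture even a well-tested theory remains open to revision by the next observation.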
But at least 75% of the world's population believe that pre-scientific and pre-Enlightenment supernatural entities (gods, angels, demons and god-men) exist, along with the silly doctrines reflected in Christianity, Islam and Hinduism.

Notes:

[1] As the late Canadian economist and acolyte of John Maynard Keynes, John Kenneth Galbraith, once quipped, “The only function of economic forecasting is to make astrology look respectable”.

[2] Conspiracies do of course exist and are more common than most of us would care to admit. For example, throughout history wars have been based on nationalistic and patriotic bullshit, deceit and outright lies, primarily designed to convince the masses of mainly working class men who fight and die in them that they are noble and justified. It’s axiomatic that “the first casualty of war is the truth”. It needs to be clarified that a conspiracy theory – or more accurately a conspiracy fantasy – is not, in fact, a theory in the scientific understanding of “theory”. Endless bullshit, propaganda, indoctrination and conspiracies propounded by the ruling classes have existed throughout recorded history, and sadly, with the emergence of the internet, including the so-called “social media” that have become multi-billion dollar corporate entities, the spread of intellectual rubbish and conspiracy fantasies conflating fact with opinion has become a toxic epidemic of mind viruses. Today everyone can publish their views, and with this more and more conspiracy fantasies appear, proliferating in our undemocratic, overpopulated, screwed up world controlled by capitalist oligarchs and their sock puppets in our so-called “democratic” governments. One of the underlying premises of many conspiracy delusions is that “nothing happens by chance”; there is always some definite cause, such as god or even the fallacious notion of fate. Conspiracy fantasies often attempt to invent causal connections between events that are not connected by causation.
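That habit of inventing causal connections can be made concrete with a toy calculation (my own illustration; the data are invented): two unrelated quantities that both happen to trend upward over time will correlate strongly, inviting a false-cause inference.

```python
# A toy illustration of correlation without causation: two invented series
# that both rise over time correlate almost perfectly, even though neither
# causes the other (a lurking third factor, time, drives both).

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

years = range(10)
ice_cream_sales = [100 + 10 * t for t in years]      # rises each year
drownings = [20 + 2 * t + (-1) ** t for t in years]  # also rises, with noise

r = pearson(ice_cream_sales, drownings)
print(round(r, 3))  # close to 1, yet neither variable causes the other
```

The correlation here exceeds 0.98, which is exactly why post hoc reasoning feels so persuasive and why it must be checked against a causal mechanism.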
Causation is often conflated with correlation, and the fallacy of false cause (post hoc ergo propter hoc), confirmation bias and dozens of other logical fallacies proliferate like a brain virus. The fact that something can seemingly happen out of a vacuum is impossible to comprehend for the advocates of conspiracy fantasies, and even more so for those who invent, sell and broadcast conspiracy fantasies and useless wastes of time and money such as lotteries and online gambling, the latest mass swindle being peddled in non-stop TV ads. Second, most events cannot be accurately predicted, and what are called “black swan” events occur more often than most will admit. How many economists predicted the global 2007-09 financial crash? Very few, and the economists who did were generally anti-capitalists outside the mainstream, often smeared and reviled. Third, conspiracy fantasies thrive on the hallucination that nothing is quite what it seems – whether at first sight, or even after a second look. Fourth, creators of conspiracy fantasies are also convinced that everything that occurs is the result of some secret cabal like the Illuminati. But conspiracy theories can arise from almost any offhand remark that, once disseminated, becomes the single cause of a certain event – what is sometimes called the “reductive fallacy” – often directing historical events. This happened after World War I. Irish physicist David Robert Grimes describes what happened in his excellent book Good Thinking:

“In 1918, at the twilight of the First World War, the highest echelon of the German army, the Oberste Heeresleitung (OHL), was a de facto military dictatorship. By the end of the spring offensive on the Western Front, it was clear to the high command that the war was all but lost. Seeing inevitable defeat on the horizon, the OHL rapidly implemented a transition to a rudimentary parliamentary system. Under this new civilian authority, a peace accord was reached and the war ended.
But the armistice of November 1918 was to throw nationalistic right-wing elements of the German establishment into disarray; how could the might of the imperial war machine have been so thoroughly overcome? Their shame was compounded by the terms of the Treaty of Versailles, which laid blame for the conflict firmly at German feet. The break-up of the once-proud German military and navy and the stiff financial cost the failed war effort incurred were deemed incredibly humiliating by the militaristic contingent of the German empire. Many of them simply refused to even countenance the multitudinous factors shaping German military decline. From the ashes of wounded national pride and the complex realities of a bloody war arose a terrible myth: The German defeat must be due to traitorous elements on the home front that had conspired to destroy Germany from within. The myth was adopted wholeheartedly by many, even those who should have known better, such as General Erich Ludendorff. When dining in 1919 with the British general Sir Neill Malcolm, an impassioned Ludendorff reeled off a rambling litany of reasons why the German army had been so thoroughly routed the year prior. In this frenzy of excuses, he dropped the now-infamous canard that the home front had failed the military. Historian John Wheeler-Bennett recounts the conversation between the two military men: Malcolm asked him: “Do you mean, General, that you were stabbed in the back?” Ludendorff’s eyes lit up and he leapt upon the phrase like a dog on a bone. “Stabbed in the back?” he repeated. “Yes, that’s it, exactly; we were stabbed in the back.” And thus was born a legend which has never entirely perished. Following this fallacious epiphany, Ludendorff became the leading evangelist for the Dolchstoßlegende, or stab-in-the-back myth. This convenient fiction placed the blame squarely on the shoulders of lurking saboteurs, and was adopted eagerly by many in German society. 
The identity of these nefarious elements varied with the prejudices of the believers: Bolsheviks, Communists, pacifists, trade unionists, republicans, Jews– sometimes combinations of all these detested types. It resonated with ultranationalists, echoing the symbolism in Richard Wagner’s opera Götterdämmerung of Hagen burying his spear in Siegfried’s exposed back. The early democratic leaders of the Weimar Republic and signatories of the German armistice were denounced as the “November criminals” by rabid right-wing reactionaries. These feelings ran angry and deep; signatory Matthias Erzberger was assassinated by the ultra-nationalistic Organization Consul in 1921, with foreign minister Walther Rathenau murdered by the group the following year. Of course, the simplistic betrayal explanation was devoid of any substance, thoroughly refuted by scholars both inside and outside Germany. But a complete lack of veracity is rarely an impediment to an easily grasped story’s taking firm hold. Believers in the myth cherry-picked alleged instances of “betrayal” by rogue elements. For instance, Kurt Eisner, a Jewish journalist, was convicted of treason for inciting a strike at a munitions factory in 1918. Eisner himself was assassinated by a nationalist the following year. As Ludendorff must have known, such actions were inconsequential to German defeat. By 1918 Germany was already out of reserves and for an array of reasons completely overwhelmed. But admitting that Germany’s defeat had several complex influences didn’t jibe with the same reassuring simplicity that the stabbed-in-the-back narrative provided. The legend gave believers something else too: a scapegoat for perceived failings. From this face-saving fiction something even more poisonous emerged: a virulent new strain of anti-Semitism and deep-seated political hatred. This twisted alternative history found a charismatic mouthpiece in the form of a young Austrian firebrand named Adolf Hitler.
Hitler embraced the myth completely, fusing it seamlessly with his own growing anti-Semitism and anti-communist beliefs. In Mein Kampf, he blamed Germany’s defeat on the noxious influence of international Jewry and Marxist elements. Nazi propaganda denigrated the democratic Weimar Republic it overthrew as an agent of betrayal, decrying it as “a morass of corruption, degeneracy, national humiliation, ruthless persecution of the honest ‘national opposition’–fourteen years of rule by Jews, Marxists and ‘cultural Bolsheviks.’” When Hitler took power in 1933, the Dolchstoßlegende became not just a fringe view but Nazi orthodoxy, taught as inerrant truth to schoolchildren and citizens alike. Jews especially were singled out for blame, branded disloyal elements who had betrayed Germany from the inside. This charge in turn became a license to dehumanize, and under Hitler the Nazi state rebranded Jewish citizens as parasites and traitors. This myth-fueled dehumanization laid the groundwork for the most staggering and unfathomable deliberate destruction of innocent life in history. By the end of the Second World War in 1945, approximately 6 million Jews had been systematically executed by the Nazi state, and up to a further 11 million others had lost their lives, victims of what the Nazi machinery called the “Final Solution”–what we now know as the Holocaust. Murder on this scale is simply impossible to comprehend, an ugly reminder of the human cost when sinister narratives take hold in a nation’s psyche. We will never completely understand the bizarre mindset employed to justify such genocide, and we must be careful not to commit the reductive fallacy ourselves in trying to elicit answers to these horrifying questions. Still, it is fair to say reductive narratives played an ominous role in the callous scapegoating of Jews and others, reinforcing the prejudices of the perpetrators and collaborators. 
Causal reduction fallacies come in multitudinous flavors, and perhaps the most pervasive of these are false dilemmas or false dichotomies. These assert a binary choice between extreme options, even when an entire ocean of options may exist. Despite their intrinsic hollowness, false dichotomies are supremely well suited to demagoguery, narrowing spectra of possibilities down to just two or so choices. If this inherently reductive rhetorical sleight of hand is accepted by an audience, the orator can readily present the outcomes as alternatively desirable or contemptible. Consequently, false dichotomies are inherently polarizing and not amenable to compromise. The Machiavellian trait of this fallacy is that it can be used to force the unaligned or nonpartisan to ally themselves with the speaker or lose face. It carries with it an implication that those not entirely in agreement with the proposal of the speaker are implicitly (or sometimes, incredibly explicitly) deemed the enemy. This is nonsensical but surprisingly powerful, with a magnet-like ability to align the unwary in the direction the speaker wishes. Predictably, it has a long history of political deployment, most notably in the form of “you’re either with us or against us” pronouncements, across all divisions of the political spectrum. Vladimir Lenin, speaking in 1920, declared: “It is with absolute frankness that we speak of this struggle of the proletariat; each man must choose between joining our side or the other side. Any attempt to avoid taking sides in this issue must end in fiasco.” Worlds apart politically, over eight decades later, President George W. 
Bush would use the exact same gambit in addressing a joint session of Congress in the wake of the 9/11 attacks, warning all nations listening that “Either you are with us, or you are with the terrorists.” While Lenin and Bush would balk with contempt at each other’s politics, there’s a pleasing irony in the fact that neither had qualms about employing naked rhetorical falsehood to silence all but the most polarized views. The long, ignoble pedigree of the false dilemma is impressive; historical examples would fill the rest of this book and volumes more besides. Arthur Miller’s play The Crucible is set during the Salem witch trials, written in 1953 as a brilliant allegory for the overpowering hysteria of the then-prevailing anti-communist panic. In it, Deputy Governor Danforth invokes the fallacy, warning that “a person is either with this court or he must be counted against it, there is no road between.” Outside politics, false dilemmas are used on emotive topics to push specific narratives, with logic often rendered unsound by the existence of other valid positions on a spectrum between the two extremes posited. By their very nature, false dichotomies are antithetical to rational discourse, fostering extremism. The inherent polarization of a false dilemma can poison pragmatic solutions and dash constructive dialogue. Its deep intrinsic appeal lies in its ability to compress an entire spectrum down to simple, mutually opposed extremes, explaining its long-standing appeal to despots and demagogues. It is, however, rather telling that its corrosive influence has not reduced with time; it is still employed in a wide range of fields, with tedious predictability. Social media is rife with precisely this phenomenon, where complex topics with a wide scope for nuanced views get distilled down to a shouting match between two binary and diametrically opposed interpretations. In these forums, the spectrum of opinion becomes curiously bimodal. 
The appeal of reductive fallacies is relatively easy to grasp: They offer simple, soothing explanations for complex phenomena. The illusion of understanding is reassuring and affirming, a psychological comfort blanket and totem of protection in a confusing world. The urge to understand cause and effect is something primal and intrinsic to the human condition–this enduring desire has been the engine that has driven mankind’s development and intellectual appetite for millennia. It has led us to everything from taming fire to formulating quantum mechanics. Without this irrepressible drive to understand, we would be bereft of vast swaths of art and science. Yet, for at least as long as we’ve had the desire to understand, so too we have fallen victim to causal fallacies – it is written in the lingo of our superstitions, our rituals, and even our religions. As we will see in the next chapter, however, it can be remarkably difficult to separate cause from effect and far too easy to err, to our collective detriment.” (pp. 76-81)