




                          JOHN L. REBMAN





                           Social and Educational Studies


    We accept this thesis as conforming to the required standard







                                      March 1994      

                          © John L. Rebman, 1994





One of the primary aims of education is to enable students to secure reliable standards and procedures by which they can acquire beliefs that are, if not true, at least likely to be true. The questions of belief acquisition and the manner in which those beliefs are held, although epistemic, are also distinctively ethical. Implicit within epistemological concepts such as truth, justification and objectivity are ethical concerns such as honesty, integrity and responsibility. In response to the question "What ought I to believe?" any serious critical thinker must examine the reasons for holding (or not holding) a belief, and ascertain whether or not they are good reasons. Good reasons involve attention to rational or intellectual standards such as evidential support, objectivity, justification and truth. My discussion of the moral dimensions of epistemological questions will follow the path delineated by W.K. Clifford (1877) in his essay "The Ethics of Belief". Within the context of the notions of intellectual virtues and vices, I will argue that intellectual integrity and epistemic responsibility entail the acceptance of the aforementioned standards and an avoidance of credulity.

Recently, however, the Enlightenment project of rationality has come under serious attack from feminist philosophers, neo-pragmatists, postmodernist philosophers and proponents of the "sociology of knowledge" who, in their efforts to avoid dogmatism, claim that knowledge lacks foundations, that truth is relative to culture or "conceptual scheme," and that objectivity is a myth. Although a thorough treatment and discussion of the views advanced by these groups far exceeds the scope of this thesis, their claims are, I shall argue, self-refuting and entail a destructive relativism and possible descent into radical skepticism. For the most part, I will focus my criticisms on Pragmatism, particularly the variety espoused by Richard Rorty, arguably the most influential contemporary philosopher.

If the extremes of radical skepticism and dogmatism are to be averted, educators must adopt the premise that knowledge is possible but at the same time accept the fact that much of what we claim to know is uncertain. Hence, many of our beliefs should be regarded as transitory and, therefore, held tentatively. I shall argue that by assuming a posture of humility in the face of knowledge claims, holding to a realist and fallibilist theory of knowledge, entertaining beliefs with a healthy skepticism and abandoning the "quest for certainty" (as Dewey has asserted), we can avoid dogmatism, indoctrination and the intellectual vice of credulity. If we value autonomous critical thinkers as an important component within a liberal democratic society, then these dispositions ought to be fostered in our students. I shall refer to this dispositional approach to critical thinking as constructive skepticism and will argue that it is a necessary requirement for any serious critical inquirer.



                     TABLE OF CONTENTS


1. Skepticism

          1.1 Ordinary Meaning of Skepticism

          1.2 Philosophical Skepticism

          1.3 Methodological Skepticism: Descartes

          1.4 Varieties of Skepticism

          1.5 Mitigated Skepticism

          1.6 Twentieth Century Epistemology

          1.7 Wittgenstein

          1.8 Constructive Skepticism


2. The Ethics of Belief

          2.1 Ancient and Medieval Sources

          2.2 Enlightenment Skepticism

          2.3 Locke's Ethics of Belief

          2.4 Epistemic and Ethical Concepts                                                      

          2.5 W.K. Clifford's "Ethics of Belief"

          2.6 Clifford's Normative Epistemology: Evidence

          2.7 Clifford's Normative Epistemology: Responsibility

          2.8 Clifford's Normative Epistemology: Authority

          2.9 Primitive Credulity and Suspension of Judgement


3. Belief, Pragmatism and Truth

          3.1 The Concept of Belief

          3.2 Belief and Truth

          3.3 Belief, Faith, and Pascal's Wager

          3.4 William James' "Will to Believe"

          3.5 Pragmatism and Science

          3.6 Pragmatism and Relativism

          3.7 Pragmatism and Self-Refutation

          3.8 Wittgenstein: Coherence of Beliefs


4. Rationality, Objectivity and Truth

          4.1 Conceptions of Rationality

          4.2 Rational Principles

          4.3 Objectivity and Rationality

          4.4 Objectivity as a Normative Notion

          4.5 Siegel's Epistemology

          4.6 Objectivity and Truth: Dewey

          4.7 Objectivity and Truth: Rorty

          4.8 Dewey's Notion of Truth

          4.9 Habermas


5. Critical Thinking, Constructive Skepticism: Conclusions

          5.1 Educational Aims

          5.2 The Problem of Indoctrination

          5.3 Educators are Concerned with Belief

          5.4 Credulity, Truth and Constructive Skepticism

          5.5 Belief and Critical Thinking

6. Fallibilism and Constructive Skepticism


7. Endnotes


8. References



However much education must be involved in promoting values, its primary function is to influence beliefs. But the questions of how beliefs are acquired and how they are held are clearly value-laden.

The essence of the liberal democratic outlook and the scientific temper lies not so much in what beliefs are held as in how they are held. Bertrand Russell has written:

The essence of the liberal outlook lies not in what opinions are held, but how they are held: instead of being held dogmatically, they are held tentatively, and with consciousness that new evidence may at any moment lead to their abandonment. This is the way in which they are held in science, as opposed to the way in which they are held in theology.[1]

As educators we should make every effort to avoid the extreme positions of radical skepticism and dogmatism. Russell claimed that dogmatism and radical skepticism are both absolute philosophies, one certain of knowing, the other of not knowing. What education should eschew, Russell argues, is certainty, whether of knowledge or ignorance. He likened the demand for certainty to a sort of neurosis in which all questions of ultimate concern can be explained within a hermetically sealed cognitive circle. It is necessary not only to realize that most of what passes for knowledge is, to a greater or lesser degree, uncertain or vague, but also to learn to act upon the best hypothesis or inference to the best explanation without dogmatically believing it.

An empiricist, fallibilist theory of knowledge and a realist metaphysical stance can provide a halfway house between the extreme positions of dogmatism and radical skepticism. "Knowledge of all good things", says Russell, "is difficult but not impossible; the dogmatist forgets the difficulty, the [radical] skeptic denies the possibility."[2] Almost all knowledge, except perhaps for such things as belief in the existence of an external world and phenomenal reports of our immediate sensory experience, is held to be doubtful in varying degrees. Education should not acquiesce in radical skepticism, but neither should it resort to indoctrination nor promote credulity. It should leave a student with an ability to separate real knowledge from the "intellectual rubbish" and dogmatism of which he will find "an abundant diet, in our own age as in any other."[3] Credulity in the face of repeated groundless assertions and proselytising is an intellectual vice and one of the curses of our modern "mass media" and "information" society. One must be critically aware of the potential deceptions and dubious nature of what is served up by the media as "knowledge" or useful information. The rampant dissemination of pseudo-scientific and paranormal beliefs is due, in large part, to the rapid emergence of the mass media on a global scale. These media have virtually replaced the schools, colleges and universities as the chief conveyors of information. And it would seem that the primary interests of most media conglomerates are entertainment rather than factual information, profit rather than truth, and selling products rather than contributing to the sum of human knowledge.
Mainly because of the media, large sectors of the public simply assume that angels and Satan exist, that astrology is true, that psychic powers are real, that it is possible to modify material objects by the mind, that physical diseases can be cured by miraculous faith healing and prayer, and that the earth is visited regularly by extraterrestrial entities who are engaging in sexual biogenetic experiments with humans. But unfortunately these quasi-religious phenomena are rarely exposed to skeptical scrutiny by those same media because we live in a culture where criticisms of the uncorroborated claims of religions are generally considered to be ill-advised or in bad taste. Hence, to guard against credulity by fostering in our students the rational virtues and the "critical spirit" should be one of the chief aims of education. I will argue that a critical thinker strives after goals that are normative, such as truth. The outcome of any critical inquiry ought to be judged in terms of epistemological rather than rhetorical standards and in terms of what is, or at least what is likely to be, the case, as opposed to what is merely expedient, convenient or coveted. Truth is independent of human wishes and desires - the world is what it is and would remain so even if mankind disappeared.

Many postmodernist philosophers and proponents of the sociology of knowledge, however, argue that there cannot be any objective standards of rationality because all arguments and truth are distorted or rendered relative by vested interests, ideological or cultural frameworks, gender bias or desire for power. Others have argued that rationality itself is strictly instrumental. I will try to show that these views are unsound and argue that rationality and logically related concepts are inherently normative, insofar as we consider a person rational if she is able to provide good reasons for her beliefs. Hence, the vital question for any serious, responsible and honest thinker is unequivocally: "What ought I to believe?"

Even if we accept the arguments of postmodernist philosophers such as Richard Rorty that there is no Archimedean point, no God's eye view or neutral position from which to validate our claims to knowledge, and that knowledge lacks any foundational basis, it does not necessarily mean we must resign ourselves to a pernicious epistemological relativism and nihilistic subjectivity. We are not mere prisoners of our own conceptual schemes or "language games." The Enlightenment project has left us with a rich tradition and valuable legacy of rationality and objectivity, as well as standards and procedures for open-minded inquiry that enable us to justify many of our beliefs and determine their truth. The practical task of the Enlightenment project was, to a large extent, an educational task: to develop in people the universal power of reason and thereby emancipate them from the dictates of ignorance and superstition created by the religious and political institutions of the old despotic social order, so that they might become rationally empowered to transform themselves and the social world in which they live. But now we seem to be witnessing a reactionary drive to jettison the last vestiges of Enlightenment educational theorizing under the banner of postmodernism, neo-pragmatism, feminism, deconstructionism and other anti-Enlightenment movements that have called into question not only the universality of logic and rationality, but have raised common sense beliefs to the status of self-authorizing truth. The arguments of people like Rorty and Paul Feyerabend, for example, entail a fragmented notion of rationality in which truth becomes compartmentalized as sociological fact, subject to the vagaries of history, particularity, gender and circumstance, and acceptance of their ideas has serious implications for education. Beliefs, truth, and their justification simply become cultural and conceptual contingencies.
But if postmodern conceptions of rationality insist that we abandon notions of objectivity, truth, and the universality of rational discourse, then surely they must do so without adopting the view that any beliefs and practices are as good as any others. It was Richard Foley, I believe, who stated, "rationality is what stands between us and a chaotic disagreement in which anything goes."

Those who attempt to deny the universality of rational discourse, for example, ought to consider whether they argue against reason with or without invoking reason. If with reason, then they confirm those very principles of rationality they are striving so diligently to refute. But if they argue without reason which, in the interest of consistency with their own views they surely must do, they put themselves outside the purview of rationality thereby propelling themselves into a fanciful domain in which all opinions are equal - Foley’s “anything goes” zone.

     The central arguments of this thesis are summarized as follows:

          (1) Ethics is important in epistemological considerations and there are normative principles in epistemology that are connected to ethical principles. Epistemologies make normative claims; they tell us that one should meet certain standards to obtain the best kinds of belief. I will argue that there is a strong ethical component to rational belief acquisition that arises within the context of conflicting knowledge claims. Evaluation of knowledge claims involves epistemic responsibility and intellectual integrity - a respect for evidence, justification and truth. We should realize that our effort to make sense of the world, to objectively see things the best way we can - to see things as they "really are" - is an activity constrained by the nature of our human cognitive equipment and by the nature of reality itself. Epistemic responsibility, the obligation to "know well", is to be found in the intellectual virtues. These virtues are dispositions, sensitivities, abilities, and desires to understand, to get at the "truth" without resorting to fantasy, superstition, illusion or self-deception.

      (2) As educators, we ought to desire reliable processes in belief acquisition and, more importantly, we ought to demand that our beliefs be true or, at least, probably true. Moreover, it is important how our beliefs are held. All beliefs should be accessible and susceptible to intellectual scrutiny and, consequently, vulnerable to modification or rejection.

          (3) As educators we should be vitally concerned with helping students think critically about their belief acquisition, to avoid credulity, and encourage them to regularly examine and re-evaluate their beliefs. This end, I will maintain, can best be achieved by an approach to critical thinking which I will call constructive skepticism. Dogmatism and indoctrination of any form are antithetical to democratic principles and hence should be avoided. However, it must be made clear that I am not suggesting any necessary conflation of or logical connection between credulity and theism or dogmatism and theism, since theism is potentially one of several possible paradigm cases of credulity. Moreover, perhaps it could be argued that both theism and atheism are dogmatic in that they each make claims to certainty.

          (4) Rationality and intellectual virtue must be of central concern to teachers and, in fact, are important educational aims. As educators, we should promote the liberal democratic ideal of rationality that entails openness to argument, objectivity, impartiality and respect for students as autonomous rational agents. Rationality is a precursor to all meaningful argument and discourse, and truth, the object of rationality, should be viewed as having intrinsic value. Neutrality and objectivity (rejected by the "sociology of knowledge" as impossible ideals) imply a certain approach to the truth - that the source of an argument or point of view is, by the standard of rationality, irrelevant to its truth, falsity or validity. Intellectual virtue involves notions such as open-mindedness, honesty, tempered skepticism and "Socratic" humility - the courage not to pretend to know what one does not know and to accept the transitory nature and fallibility of much that we claim to know. It involves the capacity for self-reflection and the will to overcome a kind of repressive intellectual akrasia or reluctance to challenge and reconsider our web of treasured beliefs. Intellectual virtue involves what Sartre called authenticity. Appeals to transcendent, dogmatic authorities are tantamount to an abdication of one's moral and epistemic responsibility to oneself - an escape into the Sartrean self-deception of the être-en-soi.[4] Recently, writers on critical thinking such as John Passmore, Harvey Siegel, William Hare, Matthew Lipman, Richard Paul and others have emphasized the importance of fostering these intellectual dispositions in our students.

          (5) If we desire "best belief" (i.e., beliefs that are true or likely true), then critical thinking should employ what I have already referred to as constructive skepticism. Skepticism is a necessary antecedent to any inquiry since, as Wittgenstein, Peirce and Dewey have rightly pointed out, if one were certain of a belief or proposition, there would be no need for the inquiry. What concerns me is what I perceive to be the "primitive credulity" (to use Peirce's term), not only of my students, but of the general public. Our young people are the constant targets of politicians, commercial advertisers and proselytizing apologists such as faith-healing evangelists, religious cults, astrologers, self-help gurus, psychics, and "New Age" mystics. It is, therefore, important that our young people be introduced to critical thinking at the earliest possible opportunity and be encouraged to adopt a disposition of a healthy, mitigated, methodological skepticism as a defence against outrageous claims. Hence, Bertrand Russell's "Will to Doubt" ought to override William James' "Will to Believe." I will critique the pragmatic theory of truth, particularly as espoused by James and Richard Rorty (and to a lesser extent, John Dewey), and argue that it rests on a confused notion of truth.

          (6) We should accept the dictum "To err is human." Fallibility is one of our most distinctively human characteristics and we should, as Dewey has argued, give up the "quest for certainty." What Dewey called "the quest for certainty" in modern philosophy begins with Descartes, who claimed that indubitable knowledge could be acquired by turning inwards and judiciously appealing to the pure thought of "clear and distinct ideas," an intuitive process of vigilant meditation "free from the fluctuating testimony of the senses." Although we should remain open to the possibility, our search for absolutes is quite likely a pretentious, futile endeavour. As teachers we should foster in our students a sense of temperance and humility concerning our claims to knowledge and encourage them to accept the fallible, transitory nature of much of what we claim to know and to accept a world which is, for the most part, contingent and uncertain.

          (7) I will argue for a humanistic conception of education which will stress the need for educating autonomous critical thinkers, persons who are skeptical of appeals to overly facile, immutable, and transcendent or absolutist approaches to the solution of complex human problems. Our education system's overemphasis on the content of thought (rather than its standards and procedures) and its concomitant didactic methodology, the repression of creative and critical inquiry, over-specialization, the predominance of instrumental reason, bureaucratization, and the overemphasis of one-dimensional technical or metaphysical solutions to complex social, political and environmental problems are just a few of the many issues that educators and others in positions of authority and responsibility ought to address.

To philosophize is to doubt - Michel de Montaigne

                           1 SKEPTICISM

          1.1 Ordinary Meaning of Skepticism

Quite obviously most of us are skeptics to a certain degree, each of us possessing varying tolerance levels of credulity and doubt, but we may not agree where the limits should be drawn. One who is either credulous or skeptical in an absolute sense would have a difficult time functioning in the real world. However, when I read in the newspaper[5] several months ago that in a recent poll 53% of the respondents believed the second coming of Jesus Christ will occur sometime in the next millennium, it became clear that there is no scarcity of credulity among the general public. The results of a recent poll conducted in the United States by Gallup reported that: (1) one in every four Americans believes in ghosts; (2) more than half believe in the Devil, and one in ten claim to have talked to the Devil; (3) three in four at least occasionally read their horoscopes in the newspaper, and one in four say they firmly believe in the tenets of astrology; (4) one in every four Americans believes they have had a telepathic experience in which they have communicated with another person without the use of the traditional five senses; (5) more than 70% believe in a life after death; and (6) one in five believes in reincarnation.[6]

Skepticism, in the general sense, is nothing very esoteric. We encounter it every day when we are aghast at the outrageous claims of the headlines in the National Enquirer magazine at the supermarket checkout, when we listen to the logical fallacies and deceptions of most television commercials, and when we listen to the vacuous rhetoric of many of our politicians. We are appalled at the antics of evangelists on Sunday morning television and we realize that there is a price to pay for untrammelled credulity on a used car lot.

The popular definition of a skeptic in the Oxford Dictionary is "one who maintains a doubting attitude with reference to some particular question or statement." The Random House Dictionary defines a skeptic as (1) "a person who maintains a doubting attitude, as towards values, plans, statements, or the character of others" and (2) "a person who questions the validity or authenticity of something purporting to be factual." Hence, skepticism in the general sense is the state of mind, temperament, or attitude possessed by those who call themselves skeptics. The word skeptic evolved from the Greek word skeptikos, which meant "thoughtful or inquiring", or "to question, consider or examine". Skeptic (capital "S") also refers to any member or follower of the philosophical school of the ancient Greek Pyrrho (360-270 B.C.), who held that "there are no adequate grounds for certainty as to the truth of any proposition whatever."[7] But this position, which I will discuss at greater length later in this chapter, seems clearly sterile and inert and hence unproductive. It is held by virtually no one, except perhaps a few confused solipsists who doubt even their own existence. Such skepticism is itself a positive assertion about knowledge, and thus, turned on itself, cannot consistently be held. If a person is skeptical about everything, then he would have to be skeptical about his own skepticism. Like a decaying sub-atomic particle, this extreme form of skepticism uncoils and spins off the viewing screen of our intellectual cloud chamber. The philosophical definition of skepticism, which I will discuss next, refers to the doctrine that "the truth of all knowledge must always be in question and that inquiry must be a process of doubting."[8] Philosophical skepticism ranges from complete, total disbelief in everything to a tentative doubt in a process of arriving at knowledge.

          1.2 Philosophical Skepticism

Skepticism, as a critical philosophical attitude, questions the reliability of the knowledge claims made by philosophers, scientists and others. Philosophical skeptics have been engaged in inquiry into alleged human achievements in different fields to see if any knowledge has been or could be gained by them. They have questioned whether any necessary or indubitable information can actually be gained about the nature of things. Skeptics have organized their questioning into systematic sets of arguments aimed at raising doubts as to whether anything can be known at all or, in another form, claiming that knowledge of some things can only be attained with difficulty and given certain precautions. In this second form it supports a methodological policy of reserve and circumspection in the formation of beliefs - its opposite is dogmatism. A methodological skeptic is one who uses the technique of doubt as a device to assist him in his quest for knowledge, in contrast to the theoretical, radical (or terminal) skeptic for whom skepticism represents the theory or position on which he takes his stand.

          1.3 Methodological Skepticism: Descartes

The quintessential example of methodological skepticism is, of course, Descartes. Although not a skeptic himself, Descartes is nevertheless responsible for an argument that must be classified among the most imaginative and profound in the entire arsenal of skepticism. The argument appears in his first Meditation, and its steps are well known. Having systematically doubted everything he could, including the existence of the external world and other minds, Descartes came to the conclusion that he could not doubt the existence of his own mind. Hence the one certainty: that he doubts, and therefore thinks and exists - the cogito ergo sum. Descartes was never really skeptical about there being a definite procedure for attaining a complete deductive knowledge based upon this indubitable truth. He believed it is possible to rise above skepticism and find knowledge that is absolute, certain, necessary, and self-evident, which would serve as the foundation for all other knowledge and especially for knowledge of reality.

Descartes' aim was to provide foundations for both Science and Religion, and the full title of the Meditations may suggest that the religious motivation was paramount, since included in the book one finds circuitous "proofs" of the existence of God and the immortality of the soul. Descartes insists that our knowledge of the contents of our own minds is indisputable and self-evident while knowledge of external things is not - but any attempt to propose a criterion which authorizes acceptance of some ideas or appearances is doomed to failure. Once the criterion is itself subjected to skeptical challenge, the defender is faced with an infinite regress of skeptical challenges unless he argues in a circle. The apparent circularity of Descartes' argument for God's existence hinges on his claim that "anything which is clearly and distinctly perceived is true." Our assurance that God exists rests upon our clearly and distinctly perceiving that He does; yet our right to trust our clear and distinct ideas depends upon our assurance that God exists and is benevolent.

A question, which arises partly from the previous discussion, is whether or not it is rational for a person to believe something for which there is little or no evidence. In order to confront this question, we must ask what kind of thing "evidence" is. A proposition may itself be evident, or it may be a proposition which is the end product of an inference or implication which is supported by other propositions. Propositions which are themselves evident may be either self-evident (e.g., mathematical axioms such as a + b = b + a and a + 0 = a[9]), evident to the senses (e.g., I am listening to Rachmaninoff as I write this), or evident to memory (e.g., I played tennis earlier this morning). These propositions might be called "basic beliefs" - non-reflective common sense beliefs which are universally held, unavoidable, necessary conditions for action and such that the likelihood of their truth cannot and need not be increased by additional evidence. Hence, if it is rational to believe what is self-evident or evident to the senses, then it is rational, on occasion, to accept a proposition without evidence. It would seem, then, that a belief is rational if it is belief in a proposition which is self-evident, evident to the senses (and could be corroborated by others), or sufficiently supported by the evidence provided for it by such "basic" propositions. I will return to this question of the rationality of basic beliefs and beliefs in general at a later point in this thesis. Let it be said at this point that no scientific or metaphysical theory that presupposes these common sense basic beliefs can contradict them and remain consistent.

Descartes saw belief in God as basic and foundational, the end result of a process of vigilant meditation free from the "fluctuating testimony of the senses." By turning inward and judiciously appealing to the criterion of "clear and distinct ideas", he arrived at what he perceived to be indubitable, personalized and self-evident intuitive knowledge. Descartes' individualistic criterion of "whatever I am clearly convinced of is true" renders new inquiry superfluous and ignores the social dimension of the inquiries of others. But "truth", as Peirce and Dewey would put it, is a fixed limit toward which inquiry tends, that opinion which investigators are bound to arrive at "in the long run," and the object of their convergent opinion constitutes the meaning of "reality." Descartes' methodological criteria, by contrast, are laid out in advance as the antecedents for the data selected for investigation, and hence preclude empirical evidence or inter-subjective appraisal. Forms and rules of thought precede and are detached from the particulars of content and experience. Theory of judgement guides the questioning of issues and affairs under examination, whether they have to do with sub-atomic particles, human behaviour, social interaction, or God's existence. These theoretical criteria, embodied within the rules of method, at once provide the foundations for epistemic justification, leading to the general, the universal, the immutable and the unquestionable.

This Cartesian, context-independent, theory-centred approach to rationality and evidence which focuses on means which are independent of empirical procedures and human experience and action continues to be an accepted standard of rationality in the Twentieth Century. Stephen Toulmin (1990) states that Descartes' understanding of reasoning was that "...if everyone cleaned their slate, and started from the same sensory "impressions" or "clear and distinct ideas", there would be no need to ask what personal or cultural idiosyncrasies each of them brought to their common debate."[10]

This decontextualized ideal was a central demand of rational thought among "modern" thinkers until well into the 20th century. In due course, further variants joined it: the economist's equation of "rationality" with efficiency, for example, and Max Weber's view of the "rationalization" of social institutions. But rationally adequate thought or action cannot, in all cases equally, start by cleaning the slate and building up a formal system: in practice, the rigor of theory is useful only up to a point, and in certain circumstances. Claims to certainty, for instance, are at home within abstract theories, and so open to consensus; but all abstraction involves omission, turning a blind eye to elements in experience that do not lie within the scope of the given theory, and it is this omission that guarantees the rigor of its formal implications.[11]

The bifurcation of epistemology since Plato, reaffirmed by Descartes, into intellect, theory, rationalism and certainty on the one side and action, practice, empiricism and contingency on the other, is attacked by John Dewey in The Quest for Certainty. Man's self-deceptive effort and desire to transcend his fallibility, finitude, and contingency and "escape from the vicissitudes of existence which do not demand an active coping with conditions"[12] have resulted in an appeal to the consolatory certainties of fixed, immutable metaphysical systems and universal, demonstrative pure reason. Dewey writes:

For in spite of the great, the enormous changes in the subject matter and method of the sciences and the tremendous expansion of practical activities by means of arts and technologies, the main tradition of western culture has retained intact this framework of ideas. Perfect certainty is what man wants.[13]

The upshot of this "quest for certainty" for philosophy is that epistemology has been dominated by Herculean efforts to devise rationalistic methods and theories of knowledge as some correspondence between our experiences and some antecedent, transcendent essence or Kantian "thing-in-itself." The effect of this dualism is that:

We are so accustomed to the separation of knowledge from doing and making that we fail to recognize how it controls our conceptions of mind, of consciousness and of reflective inquiry. The common essence of all these theories, in short, is that what is known is antecedent to the mental act of observation and inquiry, and is totally unaffected by these acts; otherwise it would not be fixed and unchangeable. A spectator theory of knowledge is the inevitable outcome. Such has been the characteristic course of modern spiritualistic philosophies since the time of Kant; indeed, since the time of Descartes, who first felt the poignancy of the problem involved in reconciling the conclusions of science with traditional religious and moral beliefs.[14]

Dewey attacks the Cartesian epistemological view that reverses what he sees as the proper logical order for acquiring meaningful knowledge. Modern philosophers err by constructing a priori theories about the nature of knowledge, which then ultimately determine and dictate cosmological theories - "a procedure which reverses the apparently more judicious method of the ancients in basing their conclusions about knowledge on the nature of the universe in which knowledge occurs."[15] This process, according to Dewey, by impairing the ability of an inquirer to pursue open-minded, open-ended investigations, results in a dogmatic approach.

Just as belief that a magical ceremony will regulate the growth of seeds to full harvest stifles the tendency to investigate natural causes and their workings, so acceptance of dogmatic rules as bases of conduct in education, morals and social matters, lessens the impetus to find out about conditions which are involved in forming intelligent plans.[16]

Descartes appears to allow that, in principle, the whole of science could be established a priori in a system of derivations from axioms, the limits of which are intuitively evident. Descartes' theory of science was a mixture of ancient and modern principles. In its logical structure, its correspondence theory of truth, and its insistence on immaterial cognitive operations, it retained classical themes, though in a significantly altered form. But in its rejection of tradition, its reliance on hyperbolic doubt, its radical distrust of the senses, its concept of representative awareness, its ontological and epistemological assumptions about mind, and in its espousal of theoretical pragmatism, it was distinctively modern. It combined the visual imagery of Plato with the receptive intellect of Aristotle, the classical language of substance and essence with the modern idiom of the veil of ideas and the external world. This eclectic amalgam of things old and new, jumbled together in the name of clarity and certainty, was a philosophical mixture far too powerful to ignore but so thoroughly confused that it took centuries to unravel. By making ideas rather than objects the starting point of philosophical reflection and scientific inquiry, he subordinated metaphysics to the theory of knowledge. Descartes had hoped that theoretical reason, liberated from the influences of tradition and authority, could achieve demonstrative knowledge of nature and man. But the unintended effect was to intensify the spirit of skepticism.

There appears to be an essential tension between Descartes' conception and understanding of the purpose of theoretical as opposed to practical reason. He clearly viewed the latter as subordinate to the former, but his ambivalence about the normative standards of reason deeply affected the modern interpretation of science. If the theoretical discoveries of science must be certain and indubitable, then the skeptic's reservations about their epistemic validity are warranted. If the canons of rationality require that all beliefs pass the test of radical skepticism and hyperbolic doubt, then few, if any, of the conclusions of modern science would survive. The skeptic who adopts the Cartesian ideal of certainty as the true standard of reason can put all of modern science and practice into question. But if the test of rationality is an increase in probability and practical security, rather than the attainment of certainty, then our appraisal of the scientific enterprise clearly will be more favourable. Skeptical doubts about the rationality of science are mitigated when a pragmatic fallibilistic conception of reason and inquiry becomes the norm. Rational judgements within a pragmatic context are comparative appraisals of conflicting beliefs; they no longer require the submission of truth claims to the severe uncompromising epistemic standards imposed by Descartes.

C. S. Peirce was one of the first philosophers to argue against the Cartesian foundationalist view of science. Scientific theories are fallible, and theories currently found plausible may subsequently be modified or even refuted; but the practice of placing our confidence in theories currently held "certain", and subjecting them to severe critical tests, enables the scientific community to eliminate error and to make continual progress towards the truth.[17] In "The Fixation of Belief", Peirce criticizes what he calls "the a priori method", "the method of authority", and "the method of tenacity" for alleviating doubt and acquiring knowledge, and he favours the "method of science" - "a method ... by which our beliefs may be caused by nothing human, but by some external permanency."

 Its fundamental hypothesis, restated in more familiar language, is this: There are real things, whose characters are entirely independent of our opinions of them; those realities affect our senses according to regular laws, and, though our sensations are as different as are our relations to the objects, yet, by taking advantage of the laws of perception, we can ascertain by reasoning how things really are, and any man, if he has sufficient experience and reason enough about it, will be led to one true conclusion.[18]

Doubts concerning the method of inquiry may be general, on the ground that there is no infallible way of acquiring knowledge and that all methods have failed at one time or another. More usually, however, skepticism about methods is partial, depreciating the reliability of one recognized source of knowledge in the interest of another. Rationalism and empiricism have been set against one another, and again jointly defended against the pretensions of authority, revelation, and intuition. On the other hand, defenders of faith such as Pascal and Kierkegaard have voiced skepticism about the claim that religious belief can be based upon demonstrative knowledge, empirical investigation, or some other set of rational principles uncovered by philosophical reflection. Since, on this view, there are neither metaphysical nor epistemological foundations for religious belief, one must take a "leap of faith" - a position now referred to as fideism.

          1.4 Varieties of Skepticism

What is it that some extreme skeptics are denying when they claim that knowledge does not exist? What is knowledge? Without probing into the depths of epistemology for the various definitions that have been offered, which would require investigations far beyond the scope of this thesis, I will accept the classical conception of knowledge as "justified true belief".[19] If one is to know, then what he believes - or asserts, claims, affirms, accepts - must be true; it must accurately describe or refer to the nature of the world or some part of it, for we cannot know what is false. But truth, though a necessary condition for knowledge, is not a sufficient one, for we may believe or assert what is in fact true simply by accident; in such a case we cannot be said to know. Before we can claim knowledge, we must establish that what we believe or assert is true; that is, justify the truth of our belief or assertion. If we can do that, and only if we can do that, can we legitimately conclude that we know.
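The tripartite analysis just described can be set out schematically. The notation below is a standard textbook rendering of the classical account, not drawn from the sources cited here:

```latex
% Classical "justified true belief" analysis of knowledge:
% a subject S knows a proposition p if and only if
%   (i)   p is true,
%   (ii)  S believes that p, and
%   (iii) S is justified in believing that p.
K_S(p) \iff T(p) \wedge B_S(p) \wedge J_S(p)
```

Each conjunct is individually necessary and, on the classical view, the three are jointly sufficient; the lucky-guess case above is one in which (i) and (ii) hold while (iii) fails.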

The premise of some philosophers that skepticism about knowledge claims cannot in general be answered is by no means obviously correct. There are, of course, many different varieties of skepticism, and it is true that not all of them can be answered with equal confidence. This is most clearly the case for the most thorough-going form, referred to as "terminal", "theoretical", "radical", "general", "wholesale" (Hamlyn)[20], "excessive" (Hume) or "extreme" (Popkin, 1979) skepticism.[21] This version simply rejects all empirical and a priori knowledge and all premises, assumptions, or modes of reasoning that might be used against it. Two exemplars of extreme skepticism are Cratylus and Gorgias, two early Greek philosophers who appear as characters in Plato's dialogues. Cratylus, possibly influenced by Heraclitus, held that everything is in a state of flux or perpetual change. He therefore became convinced that communication was impossible because, since the speaker, the auditor, and the words were changing, whatever meaning might have been intended by the words would be altered by the time they were heard! Cratylus concluded that one cannot say anything about anything and one should not try. Hence, he apparently refused to discuss anything and only wiggled his finger when someone said something, to indicate that he had heard the utterance but that it would be pointless to reply, since everything was changing.

The Sophist Gorgias, who held to a sort of epistemological nihilism, raised even more serious skeptical doubts. Gorgias is reported to have doubted whether anything exists at all, and to have offered the argument that if anything did happen to exist, we could not know it; and if we did know it, we could not communicate it.[22] Radical or total skepticism, for the very reason that it is an extreme position, presents a challenge that threatens the vital interests, not only of epistemologists, but of us all. If it is correct, the boundaries of our knowledge are not merely restricted, something most of us would be willing to acknowledge, but close in upon us and completely overwhelm us. If a philosopher asserts that we can know nothing, say, about the existence of God, his views may cause consternation in the ranks of some theologians but hardly raise an eyebrow among philosophers of science. Or if he asserts that reason alone can never produce knowledge, he cannot expect outraged reactions from empiricists. But if he asserts that none of us can know anything, he is raising an issue that no philosopher can ignore. Two more obvious objections to radical skepticism are that it is impossible to live it in practice (this was David Hume's point), since action requires belief; and that it is incoherent because its own principles require commitment, and not ambivalence, from those who follow them.

If radical skepticism were true, then any one belief would be as credible as any other. It would be impossible to criticize, improve upon, or to make more reliable what anybody believes. Science and pseudoscience, Darwin's evolutionary theory and creationism, history and myth, medicine and faith healing, reflective judgement and rabid prejudice, would all be on an equal footing. Moreover, rational argument as a method of settling disputes would be replaced by force and propaganda. If extreme skepticism were accepted, life, as Thomas Hobbes described it, would be "solitary, poor, nasty, brutish, and short." Bertrand Russell described extreme skepticism as "psychologically impossible" and said that "...there is an element of frivolous insincerity in any philosophy which pretends to accept it."[23] An extreme skeptic could also quite properly be defined as a dogmatist since (1) he makes a knowledge claim (i.e., "There is no knowledge")[24], and (2) this claim cannot be justified; that is, he can provide no reasons on its behalf. In general, it can be argued, if one is to have any reasonable ground for skepticism there must be something of which one is not skeptical. The usual reason for skeptical doubt is the experience or possibility of failure or error in claims to knowledge. As Nietzsche has stated, the ultimate truths of mankind are "his irrefutable errors."[25] Failure reveals itself through inconsistency, and to recognize this we must be aware that contradictory statements have been made and that the law of non-contradiction is true. Furthermore, past experience of failure or error in a given type of thinking is only relevant to one's future confidence in it if the rationality of inductive argument is assumed. A person could conceivably exhibit complete skepticism by refusing to claim any knowledge at all. What he could not do is offer a rational defence of his procedure.
Thomas Nagel has written that "skepticism is really a way of recognizing our situation, though it will not prevent us from continuing to pursue something like knowledge, for our natural realism makes it impossible for us to be content with a purely subjective view."[26]

          1.5 Mitigated Skepticism

Although skepticism is of ancient origin, relatively few philosophers in the Western tradition have followed the extreme path marked out by Gorgias and Cratylus. Rather, most have rejected this form of skepticism in favour of an epistemology based on the conviction that, however limited its scope, knowledge does exist.[27] To use the term that has become standard since Hume, these philosophers espouse mitigated skepticism.[28] Of course we are left with a problem, because it seems safe to say that, to some degree, all (or at least most) of us are mitigated skeptics. Most of us would impose some limitations on human knowledge, admitting that there are things that we do not know, or even that we cannot know. But if everyone is a mitigated skeptic, to speak of mitigated skepticism as a particular philosophical position is somewhat misleading: the term denotes no distinctive view at all. This problem can, I think, be resolved by noting that the issue really turns on a question of degree. Within certain general boundaries, which are admittedly quite vague, one can acknowledge one's ignorance of things without earning the title of skeptic. Just because all of us are (I hope) "skeptical" about some things, it does not follow that we are skeptics, even in the mitigated sense. However, the denial of knowledge beyond a certain point surely must land one in the skeptical camp. Where this point lies, or where the dividing line should be drawn, is a matter about which philosophers do not all agree.

If we accept the term mitigated skepticism as a name for the general view that denies absolute knowledge without going to the extreme of denying its existence entirely, we can distinguish among three different forms that the denial might take: (1) Subject matter skepticism: In this category we have philosophers who deny that we are capable of real knowledge in certain subject areas such as religion, ethics, metaphysics, history, etc. (2) Substantive (or specific object) skepticism: This is the stance taken by those who deny knowledge of certain objects or phenomena such as other minds, God, supernaturalism, matter, causality, etc. (3) Functional (or faculty) skepticism: This form denies that we have any ability to gain knowledge by some process or function, through the employment of some faculty or capacity we are held to possess. Hence, we find skepticism concerning reason and skepticism about sensory perception. Rationalists, for example, tend to embrace skepticism about the senses, and empiricists are inclined toward skepticism about pure reason; both have a propensity to skepticism concerning intuition and revelation.

Carneades (c. 213 - 128 BCE) presented a wealth of brilliant skeptical arguments against the reliability of perception, claiming that all we can ever have are images or copies of an external world[29], and he seems to have developed a verification theory and probabilistic view resembling those of many twentieth-century pragmatists and logical positivists. Carneades claimed that absolute truth does not exist; only degrees of probability exist. Probability is the only guide to life, and the individual does not need certainty or truth in order to function and understand. Some beliefs are more probable than others, and the degree of probability of a belief is related to its intensity and immediacy in our experience, and to its relationship to other intense and immediate experiences. The lowest degree of probability attaches to a belief that has no ground in our experience at all. Bertrand Russell sounds much like Carneades when he says:

To my mind the essential thing is that one should base one's arguments upon the kind of grounds that are accepted in science, and that one should not regard anything one accepts as quite certain, but only probable in a greater or less degree. Not to be absolutely certain is, I think, one of the essential things in rationality. When one admits that nothing is certain one must, I think, also add that some things are much more nearly certain than others. It is much more nearly certain that we are assembled here tonight than it is that this or that political party are in the right.[30]

Sextus Empiricus restated Carneades' infinite regress argument. The argument casts doubt on any claims by dogmatic philosophers to have gained knowledge of the naturally non-evident world. Any criterion used to judge what is naturally non-evident, such as logical inference or presumed causal connection, can itself be challenged by asking whether the criterion is evident. The fact that there are disputes about everything that is not observable shows that it is not obvious what criterion should be adopted. The dogmatist is thus faced either with begging the question, by using a questionable criterion to establish the justification of whatever is true, or with an infinite regress involving finding a criterion for judging his criterion, and a criterion for this, and so on. Anthony Quinton has stated the problem as follows:

If any beliefs are to be justified at all...there must be some terminal beliefs that do not owe their credibility to others. For a belief to be justified it is not enough for it to be accepted, let alone merely entertained: there must also be good reason for accepting it. Furthermore, for an inferential belief to be justified the beliefs that support it must be justified themselves. There must, therefore, be a kind of belief that does not owe its justification to the support provided by others. Unless this were so no belief would be justified at all, for to justify any belief would require the antecedent justification of an infinite series of beliefs. The terminal beliefs that are needed to bring the regress of justification to a stop need not be strictly self-evident in the sense that they somehow justify themselves. All that is required is that they should not owe their justification to any other belief.[31]
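Quinton's regress can be put schematically as follows; this is a sketch in conventional notation, not taken from Quinton's text:

```latex
% If a belief b_1 is justified only inferentially, its justification
% rests on a further justified belief b_2, whose justification rests
% in turn on b_3, and so on:
J(b_1) \Leftarrow J(b_2) \Leftarrow J(b_3) \Leftarrow \cdots
% Unless the chain terminates in some belief b_n whose justification
% does not derive from any other belief, no belief in the chain is
% ever justified.
```

The terminal beliefs that stop the regress need not justify themselves; it is enough that their justification is not owed to further beliefs.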

Had it not been for the dissemination of absolute foundationalism and certitude in matters of knowledge and truth since Descartes (i.e., the move from cerno to certo), modern-day skeptics would have little to be skeptical about. It would seem that the demand for certainty is inevitably disappointed, leaving skepticism in control of all knowledge claims. Postmodernism, like modern skepticism, is to a considerable extent shaped by a reaction to the claim for certainty and absolute foundationalism in modern epistemology; in fact, one might well construe postmodernism as a culturally enhanced version of skepticism.

          1.6 Twentieth Century Epistemology: Answering the Skeptic

In the twentieth century, epistemology has been defined, or otherwise explained, in terms of philosophical skepticism. This conception seems to have originated with Bertrand Russell early in the century and to have been reaffirmed by such notable philosophers as A. J. Ayer, W. V. Quine, D. W. Hamlyn, Ludwig Wittgenstein, and J. L. Pollock. Russell has consistently defined epistemology, and philosophy in general, in terms of answering the skeptic:

The essential characteristic of philosophy, which makes it a study distinct from science, is criticism... Descartes' "methodical doubt," with which modern philosophy began...is rather the kind of criticism which we are asserting to be the essence of philosophy...This is the kind of criticism which constitutes philosophy.[32]...These problems are all such as to raise doubts concerning what commonly passes for knowledge; and if the doubts are to be answered, it can only be by means of a special study, to which we give the name "philosophy."[33]

A. J. Ayer's conception is not unlike that of Russell:             

Having maintained that to say that one knows a fact is to claim the right to be sure of it, I show how such claims may be disputed on philosophical grounds. Though their targets vary, these sceptical challenges follow a consistent pattern: the same line of reasoning is used to impugn our knowledge of the external world, or of the past, or of the experience of others. The attempt to meet these objections supplies the main subject matter for what is called the theory of knowledge.[34] 

The theory of knowledge is primarily an exercise in scepticism; the advancement and attempted rebuttal of arguments, which are intended to prove that we do not know what we think we know.[35]

John L. Pollock also sees problems of knowledge arising from skeptical arguments:

Skeptical arguments generate epistemological problems. Apparently reasonable assumptions lead to the conclusion that knowledge of a certain sort (e.g., knowledge of the physical world, or knowledge of other minds) is impossible. Faced with such an argument, our task is to explain how knowledge is possible. The problem is not to show that knowledge is possible; that much we can take for granted. What we must do is find the hole in the skeptical argument that makes it possible to have the knowledge we do..."How do you know that P?" This is the general form of an epistemological problem. The question "How do you know that P?" is a challenge - a demand for justification. The task of the epistemologist is to explain how it is possible for us to know that P, i.e., to explain what justifies us in believing the things we do.[36]

          1.7 Wittgenstein

To the followers of Wittgenstein, philosophical skepticism is a symptom of conceptual confusion and disorder, an indication that language is being misunderstood and put to an improper use. It is argued that we can only learn what "knowledge" and "certainty" mean by hearing them used in connection with material objects, past events, people's feelings, etc., and that it is useless to inquire whether these paradigm cases are genuine instances of knowledge and certainty. Philosophers have had a proclivity for looking for uniformity and simplicity where none exist, and hence have ignored the important differences in function between such superficially similar statements as "God created the universe" and "Beethoven created the Ninth Symphony". The attempt to assimilate one function of language to another, or to treat one as a paradigm to which others must conform, is, for Wittgenstein, the source of many of our perennial philosophical problems. Bad philosophy results when language is detached from its everyday functions, and the aim of good philosophy is to bring out the misunderstandings that give rise to the problems in the first place.

For Wittgenstein, language is essentially social. In his famous attack on the idea of a "private language", he tried to show that it would be impossible for anyone to develop such a language, one which it is, in principle, impossible to teach to anyone else. Wittgenstein argued that, if there were private events, we would be unable to categorize or talk about them. For it to be possible to name or categorize something, there must be rules of correct naming and categorization. Without the possibility of a public check, there would be no distinction between our feeling that we reported such events accurately and our really doing so, so nothing could count as our doing so correctly or incorrectly. And where such a criterion is impossible, then, according to Wittgenstein, there are no genuine rules at all, and hence no genuine language. If he is right about this much-debated issue, then the more extreme forms of skepticism, which call into question the existence of anything or anyone independent of one's own mind, are ruled out by the mere fact of a language in which to formulate them; and if we think out the implications of there being a whole society of language users, we are taken a long way back toward a common sense view of the world.[37]

Wittgenstein, in On Certainty, clearly distinguished between what might be termed grounded and ungrounded beliefs. Grounded beliefs are logically parasitic on the ungrounded sort, although not in the sense that they are demonstratively inferred from them. Ungrounded beliefs (or what I referred to earlier as "basic beliefs") are those common sense beliefs that "stand fast", providing a necessary structural basis for the grounded variety. Included among these would be our belief in an external world, in other minds, in causation, and in the general reliability of inductive reasoning. The very possibility of anything that may be called the "scientific method", or of any sort of communication, investigation or inquiry, is dependent upon them. In order to avoid an infinite regress of justifications, there can be no justified judgements unless some beliefs are ungrounded. When we push back reasons, evidence and grounds for a belief we come to beliefs that we will not give up even though we do not use reasoning, evidence and grounds to justify their "certainty."

This class of groundless beliefs is included, for the most part, within the domain of empirical propositions, beliefs that are unassailable and exempt from doubt. Wittgenstein thinks that these are irrefutable propositions and that we should say "Rubbish!" to anyone who denies them, although he also thinks that they are not proper objects of knowledge claims because they preclude inquiry or justification. If he is right, many propositions including G. E. Moore's common sense propositions like "The earth has existed for over one hundred years", "Cats do not grow on trees", and "Automobiles do not grow out of the earth", never come into question, and hence, never become objects of discussion or inquiry.[38] Ungrounded beliefs such as these, says Wittgenstein, "lie apart from the route travelled by inquiry."[39] This is a class of non-inferential propositions or beliefs that are immune to evidence, if only because no other propositions or beliefs are more certain than the beliefs themselves. In other words, skeptical criticism of these basic beliefs establishes no others in their place. The holding of these basic groundless beliefs does not necessarily entail acceptance of a foundationalist epistemology, but they are nonetheless an unavoidable part of the noetic structure of every human being and could not be abandoned without causing havoc to that structure. "Things do not vanish without cause", "Human beings need food and water to survive" and "Objects thrown ten metres into the air will come down to earth" are beliefs that must be accepted as properly basic by any rational human being. Belief in God, for example, which Descartes held to be axiomatic, surely does not enjoy epistemic parity with the aforementioned commonsense beliefs, and our noetic structure would surely not come crashing down upon us if we ceased believing in Him.

Wittgenstein also claims that one cannot speak of knowing when doubt is out of the question - "knowledge" makes sense only when doubt makes sense. The absolute nature of the term "certainty" precludes us from saying of anyone that he is ever certain of anything. Doubt invites the challenge to explain how we know, a challenge which is most appropriate when the claim concerns something describable as an hypothesis; and we can intelligibly claim to know something only where the possibility of being mistaken makes sense. When I identify an object in my hand as a pen, for example, none of these conditions is satisfied: I do not seriously admit the possibility that I have made a mistake; that it is a pen is not a hypothesis that I proceed to test; and there is no intelligible response to a request that I explain how I know this. We might suppose that we can give grounds for identifying this object as a pen: we can see that it is! But it is questionable whether this offers any support for our knowledge claim. Using a different example, Wittgenstein writes:

My having two hands is, in normal circumstances, as certain as anything I could produce as evidence for it. So I am not in a position to take the sight of my two hands as evidence for it.[40]

If a person questioned my claim to have two hands I would treat that as evidence that his eyesight is defective before I entertained doubts about my hands. Hence, when we find ourselves unable to offer justifications, only explanations or demonstrations, we know we have reached the level of groundless beliefs and values, a "pre-rational" level of beliefs and values, which themselves cannot be rationally justified.

          1.8 Constructive Skepticism

As we have seen, when philosophers talk about skepticism, they are usually referring to the kind of radical or methodological skepticism directed against epistemological claims such as the existence of the external world, other minds and standards of rationality. However, the speculations about the evil demon (in Descartes' Meditations) and the brain in a vat (Putnam, 1981, Chap. 1) are merely dramatic devices to express our skeptical thoughts, our private doubts, in their most radical form. Since there is no neutral position from which we can defend our most basic intellectual foundations, the study of epistemology quite inevitably leads to skeptical anxiety, including concerns about rational procedures and the scientific method. No matter what procedures or standards we employ, how much reflection and inquiry we engage in or how much evidence we gather, there do not exist intellectual guarantees to prevent the possibility of error. Although we find ourselves defending rationality and the methods of induction on their own terms (i.e., we employ our rational procedures in their own defence), some questions, such as the reliability of our fundamental methods of inquiry, can, it seems, be rightfully begged. It seems odd to demand a justification for rationality because the notion of justification itself is clearly a concept within rationality. Any attempt to stand outside the framework of rationality in order to pass judgement on it requires that one remain inside, and this is clearly incoherent.[41] Hence, as I will argue throughout this thesis, there are principles of rational belief and rational action that are universal, and there are general ways in which we can appraise institutions, practices, and worldviews with respect to their rationality without falling into ethnocentrism, relativism or some tendentious ideological stance.

The skepticism that I will endorse is not so much a philosophical position as it is an attitude, frame of mind or disposition - a psychological mechanism to combat the intellectual vices of credulity and pretentious dogmatism. I will use the term constructive skepticism to describe the skeptical temperament that entails a perspicuous conscientiousness and judiciousness regarding demands for evidence, a propensity to suspicion about extraordinary claims, and a desire for more argument and persuasion than would satisfy the majority of people. It is a form of skepticism that is not abstract, but rather selective and contextual. It rejects the nihilism and pessimism of radical skepticism and accepts that there is reliable knowledge about the world that can be known by rational, epistemically responsible agents employing reliable rational methods, not only in the sciences, but in the normative realm as well.

Constructive skepticism is a protection against pretension, false hopes, illusion, self-deception and "un-worldliness" - an acceptance of a world in which nothing is permanent except change, and change at an ever more rapid rate. Hence, one attempts to see the world and reality as they are now, recognizing that the future is tenuous and unpredictable, save what science may be able to tell us. This does not mean that we must accept the arbitrary and the contingent, although much of who we are, and what we have become, is a function of accident. As Thomas Nagel has so aptly pointed out:

Two things, neither of them easy to assimilate, strike me about my birth: its extreme contingency and its unimportance... we are here by luck, not by right or by necessity...My own existence or that of any other particular person is extremely gratuitous...  Just as we can't evade skepticism by denying the pretensions of our beliefs about the world and interpreting them as entirely relative to a subjective or personal point of view, so we can't evade the impact of objective detachment by denying the objective pretensions of our dominant aims in life. This would simply falsify the situation. The problem of the meaning of life is in fact a form of skepticism at the level of motivation. We can no more abandon our unqualified commitments at will than we can abandon our beliefs about the world in response to skeptical arguments, however persuasive we may find them, as Hume famously observed. Nor can we avoid either problem by refusing to take that step outside ourselves which calls that ordinary view into question.[42]

All the choices we have made in life and must continue to make imply that we could have chosen and can choose otherwise - we are "condemned to be free" as Sartre has proclaimed, and we are responsible for who we are in spite of the fact that we are always more a product of our contingency than we are of our accomplishments. Real purpose and meaning in life can only come from within us and our immediate surroundings - our friends, our family, our work, our recreational activities, and so on.

The constructive skeptic is a stoical optimist who has the capacity to laugh at failures and disappointments. Although the skeptic avoids false hopes and illusions (that stock is going higher), he too has plans and projects that often do not come to fruition. Life is too short to be taken too seriously, since one is at the mercy of many forces that are beyond one's control. The perfect antidote is humour - to laugh at the stock I bought yesterday at $1 that trades today at 50 cents. Hence, the constructive skeptic is a fallibilist who avoids the concept of "perfection" - all life's projects are fallible, imperfect, incomplete, and human. The moral life, for example, should be construed as an appeal to the best-reasoned ethical principles that our culture has to offer, principles that promote well-being for all; all ethical discourse should be viewed as rational and contextual. One must tolerate (and even laugh at) what is accidental and contingent, because living with what is accidental and contingent is not a failure to achieve absolute perfection or the transcendental, but is our normal situation. The purpose of life is life itself, and to ask for some transcendent purpose is to not know what "purpose" is; to ask for the ultimate goal of the tennis player is to be interested in something other than tennis.

The constructive skeptic rarely contemplates extremes such as suicide because he accepts his fate (perhaps after some crying) with stoical resignation and humour - life must go on. But the life of the constructive skeptic is the life of reason; a rational person can examine means and ends and, hence, exercise some personal control regarding his future and the future of those to whom he is responsible. The life of reason is the rejection of the attempt to remain ignorant. The constructive skeptic is suspicious of a priori theories, grand metaphysical or utopian schemes, and "salvation plans." Oversimplified views of the world and ideological abstractions, including oversimplified solutions to real human problems, are to be distrusted, as are appeals to absolute, transcendent or immutable closed systems of thought (absolute truth, absolute choice, absolute authority, absolute knowledge, absolute ethical principles, absolute reality, absolute life, etc.). Man's search for absolutes is an attempt to escape intellectual and moral responsibility - to escape life, the negation of reality.

Many postmodernist philosophers have rejected the rational tradition of the Enlightenment simply because its project has not resulted in perfection, utopia or the absolute. We cannot entirely reject our traditions and social practices merely because we doubt their efficacy. Although all beliefs are subject to critical scrutiny, choice requires an appeal to an existing structure of mores, accumulated reliable knowledge, social practices, etc.; a revolution rarely works because we end up sinking "Neurath's boat."[43] Criticism is, above all, conflict between social practices. To be capable of constructive criticism of social practices and institutions, one must know social practices and institutions, as well as rules of rational discourse. Because they are asking people to change, the burden of proof for reformulation or rejection of existing social mores and practices must always be on the advocates of change. The constructive skeptic, however, rejects the unexamined belief in the status quo and further rejects the acceptance of the premise that we are prisoners within our own "language games" or "conceptual scheme" (I will deal with this issue later in the thesis). All beliefs, values and social institutions are subject to criticism. The constructive skeptic is also critically aware of authority, in a world in which one is increasingly dependent on experts and specialists rather than direct personal experience (perhaps real personal experience is becoming obsolete).

Constructive skepticism is not equivalent to cynicism, pessimism and lack of hope, relativism, or nihilism. But neither is it a resigned acceptance of the Leibnizian "best of all possible worlds." Evil and suffering do not exist, as apologists like Leibniz have claimed, merely so that one can come to understand the "Good"! The constructive skeptic avoids the intemperance of seeing the world through "rose-tinted spectacles." In any event, the universe is morally neutral and indifferent to our plans, projects and values. Value and meaning in life are not externally imposed a priori, and even if they were, our existence can, in the relevant sense of "end", "purpose" or "meaning", have no other end, purpose or meaning than what we as responsible human beings give it by our own deliberate rational choices, decisions and value judgements.[44] And as Thomas Nagel has written, "What makes doubt inescapable with regard to the limited aims of life also makes it inescapable with regard to any larger purpose that encourages the sense that life is meaningful."[45] We do not eo ipso establish that something is good or ought to be done by discovering that I or others approve of it, like it, desire it, ought to do it, strive for it, seek it, and the like. Moreover, "X is good" and "Y ought to be done" cannot be inferred from "That transcendent entity whom we call Z says X is good" or "Z wills Y" unless we independently judge that whatever Z says is good is good, or that whatever Z says ought to be done ought to be done.[46]

Constructive skepticism does not dismiss metaphysics. Rationality does not and likely will not give us all the answers to our questions and problems. Metaphysics is the cognitive department of the "realms of the unknown" - the "awe factor." Metaphysics is the realization, amid infinite speculative possibility, that our knowledge is never complete and our problems never completely "solved." A major part of being human is to be faced with problems, enduring problems that are never really "solved," but only dealt with or perhaps mitigated. As Dewey has pointed out, we are always looking at ends that ultimately become means to further ends, and we are continuously dealing with unfinished business. Problems never have only one solution, and one who gives only one solution to a problem, and who thinks he has solved the problem easily or absolutely, falls prey to self-satisfied dogmatism.

To the constructive skeptic, metaphysics is not the unreflective acceptance of a priori or authoritatively given world views; nor is it the creation of immutable and absolutist systems of thought so that we can resign ourselves to the comforts, consolations and "certainty" of a system. Metaphysics is the attempt to understand, to make sense of the world, to see, as best we can, how things really are, to get things right, and not to escape reality by engaging in quixotic quests for certainty. Metaphysical theories are, as John Kekes has stated, not "gratuitous speculations of idle minds, but passionate attempts to make sense out of reality."[47] An adequate metaphysical theory must provide a rational, conceptually coherent, and comprehensive interpretation and explanation of how things really are, and a reasonable view of reality and man's place in it. The possession of such a world-view is an essential component of the humanistic outlook and makes rational action possible. Plausible, yet facile, superficial and highly speculative transcendental worldviews are often a function of dogmatism, self-deception or unreflective wishful thinking. Outlandish forms of transcendent, supernatural metaphysical theorizing should heed Wittgenstein's closing words of the Tractatus, namely, "Whereof one cannot speak, thereof one must be silent." Speculative theories should be falsifiable and subject to criticism by experience and experimentation. The fabrication of vague, ambiguous ontological entities and propositions that have no way of being tested is vacuous. It is difficult to conceive of a bona fide fact claim with no empirical consequences, and if the more conceptually oriented approaches to metaphysics do not work either, there is an excellent case for incoherence.

I have come to believe that people can be placed in three main "metaphysical" camps. First, there are people who take the world for granted: what you see is what you get, it is obvious that this is how things are, talking about it is not going to change anything, and so why bother? Metaphysics is a waste of time, so let's get on with our lives. This appears to be the outlook of most people.

Second, there are the religious (in the "theological" or "ecclesiastical" sense). To Christians and Muslims, for example, this life is a preparation for better things to come, which will be provided by the God who has made them and this world and has given them immortal souls; their purpose and meaning in life can be discovered only through God. All the questions, contradictions, paradoxes, ironies and "accidents" of this world can be explained by the Almighty - He has all the answers. Thus, we should quit pestering ourselves with questions and put our faith and trust in Him. Our questions will only be answered when we die. I find the attitude of this group as complacent and incurious as that of the first; they simply offer different reasons for evading the real problems and questions of life, and equally do not really seem to feel the problems. They have lulled themselves into a smug sense of security with a story which may or may not be true but which they have no substantive evidence or grounds for believing.

Third, there is the group who condemn the previous two groups for their credulity and intellectual slothfulness. This third group is in awe of the very fact of our existence and possesses a natural curiosity about the mysteries of the universe, refusing to accept simplistic answers and explanations. This group questions both the way things are and our traditional social and religious beliefs. They challenge the adherents of the other two groups for proof, or at least good evidence, justification or argument. Within this group are two subsets or subgroups. Subgroup #1 consists of those who believe that everything is explicable and solvable at the bar of reason, that rational inquiry will eventually answer all our questions. However, they forget that the perplexity and insolubility of most pressing human problems are brought into existence by the application of rational thought and seemingly cannot be removed by it. This unreflective "faith" in the power of reason tends to elevate rationality to the status of a religion or ideology. If there is a God, however, His gift of reason is surely His greatest gift; but it is not an infallible one. Subgroup #2 agrees with the criticisms which Subgroup #1 directs at the previous two groups and accepts rationality as our most valuable and useful human attribute, but its members maintain a stance of skepticism, fallibilism and humility concerning both what we claim to know and the sovereignty of rationality. It is my contention that we should, as educators, encourage and foster in our students the attributes and dispositions characteristic of Subgroup #2, the group within which a constructive skeptic would be found.


One who has not been scrupulous in knowing cannot be scrupu­lous in doing - Lorraine Code                                              

 The foundation of morality is to give up pretending to believe that for which there is no evidence, and repeating unintelligible propositions about things beyond the possibilities of knowledge - T.H. Huxley


                    (2) THE ETHICS OF BELIEF

          2.1 Ancient and Medieval Sources

The central problem of epistemology is the individual's concern with what to believe and how to justify those beliefs. Hence, many philosophers have held that there is an important connection between epistemic concepts such as belief and ethical concepts such as justification. Plato regarded the Form of the Good, i.e., Goodness itself, as the ground not only of all goodness, but also of all being and all that is knowable. Many medieval philosophers, such as Thomas Aquinas, regarded goodness and truth as two of the transcendentals, coextensive and ranging across all categories.

Partially in reaction to the presumptuous rationalism and dogmatism of the Church, sixteenth-century humanists such as Montaigne argued that it was best to suspend judgement about matters of general theory, and to concentrate on accumulating a rich perspective on both the natural world and human affairs as we encounter them in our actual experience. This respect for the possibilities of human experience was one of the chief merits of the Renaissance humanists, but they were also sensitive to the limits of human experience and knowledge. Modesty and humility alone, they argued, should teach reflective Christians how limited their ability is to reach unquestioned Truth or unqualified Certainty over all matters of doctrine.[48]

          2.2 Enlightenment Skepticism

Francis Bacon argued at length that material, human and social progress had been retarded for centuries by false philosophies that pandered to human credulity, superstition and what he called the "Idols of the Mind".[49] The pervasiveness of the "Idols" and the all-too-human tendencies toward intellectual laziness, credulity and fanaticism give plausibility to the skeptic's arguments for the fallibility of knowledge. If Bacon had to choose between the skeptic and the dogmatist, he would choose the former, since the dogmatist curbs or conceals the doubts that are essential to genuine inquiry.[50] Bacon's solution to our predicament was the cultivation and eventual resolution of doubt through diligent application of inductive procedures of inquiry and verification. The cognitive-ethical thrust of Bacon's position is that genuine knowledge and the prospects it holds are possible only if his standards and procedures for open-minded, disciplined inquiry are adopted. To fail to do so, for people to indulge their credulities and cognitive inadequacies, would rob them of their promise for a satisfying intellectual life. For Bacon, the proper choice between dogmatic certainty and radical skepticism seemed clear.

          2.3 Locke's Ethics of Belief

John Passmore describes the tight connection between the normative and the epistemological in modern western thought as follows:

Modern philosophy was founded on the doctrine, uncompromisingly formulated by Descartes, that to think philosophically is to accept as true only that which recommends itself to Reason. To be un-philosophical, in contrast, is to be seduced by the enticements of Will, which beckons men beyond the boundaries laid down by Reason into the wilderness of error. In England, Locke had acclimatized this Cartesian ideal. There is "one unerring mark," he wrote, "by which man may know whether he is a lover of truth for truth's sake:" namely "the not entertaining any proposition with greater assurance than the proofs it is built upon will warrant." Nineteenth-century agnosticism reaffirmed this Lockean dictum, with a striking degree of moral fervour.[51]

H. H. Price has supported Locke's normative principle of belief, putting it forward as a definition of rationality. Price states that "the degree of our assent to a proposition ought to be proportioned to the strength of the evidence for that proposition."[52]

Locke's two doctrines, then - that assent has degrees, and that the degree of assent ought to be proportional to the strength of the evidence - may easily seem platitudinous. One would be happy to think that they are. For if they are false, our human condition must be both more miserable and more intellectually disreputable than we commonly suppose... It would be more miserable, because we so often need to be able to assent to propositions on evidence that is far less than conclusive; and therefore, we need to be able to assent to them with something far less than total or unreserved self-commitment, if we are to have any guidance for our subsequent thoughts and actions... But... we do not always have to choose between an inert agnosticism - a helpless "wait and see" attitude - and a total and unreserved self-commitment. When our evidence for a proposition, although not conclusive, is favourable, or favourable on balance when any unfavourable evidence there may be is taken into account, we can assent to that proposition with a limited degree of confidence; and we can then conduct our intellectual and practical activities "in light of" the proposition, though not without some doubt or mental reservation.[53]

Any degree of belief involves the elements of commitment, responsibility, and intellectual integrity. The degree of commitment may vary: a belief held too strongly may bring with it an attitude of disregard, though not complete disregard, of alleged evidence which conflicts with the belief. One must remain open-minded and open to counter-evidence. All beliefs ought to be based on evidence, and the degree of commitment should be, as Hume and Price have claimed, in proportion to that evidence. According to W. K. Clifford, whom I will discuss in greater detail later, what counts as evidence for one person must be confirmable by any other person under similar circumstances. Beliefs based upon personalized, internal Cartesian "intuitions" or "revelations" which cannot be publicly corroborated are highly suspect. Moreover, a priori postulations of metaphysical entities that cannot be empirically verified, or hypotheses that cannot be falsified, are deemed intellectually irresponsible. Karl Popper's "falsifiability principle" is relevant here. A proposition is considered falsifiable if we can know what would have to happen, be happening or be going to happen, in order to prove that the proposition or doctrine is not, after all, true.[54] Also, indeterminateness or vagueness in the meaning of metaphysical constructs is often intended to disarm potential criticism and serious skeptical inquiry. But those who, like Bertrand Russell's "pedant", have an aversion to self-contradiction and prefer their statements to be true will not tolerate unfalsifiability or any other kind of obscurity and indifference to truth. Russell, in making a point in one of his skeptical arguments, claimed that one could not prove that the universe was not created five minutes ago - but neither is there any evidence for this claim. Similar points can be made concerning the hypothesis of the existence of God.
According to Clifford, anyone who believes in a supreme deity believes on insufficient evidence and hence violates the "ethics of belief" by displaying the vice of credulity. If there is evidence for God's existence, it is evidence that should be available to everyone and open to refutation.

The degree of supporting evidence needed for a belief is clearly contingent upon the urgency and importance of the proposition under examination. As the stakes are pushed higher, one might want to examine more thoroughly the grounds for a certain belief. For example, hearsay evidence or the testimony of my neighbour may be sufficient for me to believe that the Canucks won the hockey game last night, but may be clearly deficient in determining my assent to the propositions "Moose Pasture Resources is going higher" and "This mushroom is edible." If the subject matter or conjecture is important, it is clearly irresponsible to hold a belief on slight evidence or in the face of unexpected contrary evidence. A person may be simply naive and gullible (or stupid), but it is the power of the skeptic's intellectual scrutiny that can embarrass him and induce an awareness of his own bewilderment, carelessness and dishonesty. The power of critical and constructive skepticism, as I will argue later, lies in its ability to suppress the propensity to credulity and in the force of its arguments against the claims of dogmatism.

          2.4 Epistemic and Ethical Concepts

According to Aristotle, both intellectual and moral virtues are character traits, but intelligence and wisdom are the supreme virtues, with ethics tied to knowledge and intelligence inseparably tied to ethics. Intellectual virtue helps one to select the correct goal, and intelligence to choose the correct means (Ethics, Book 6, Sec. 13). An intellectual virtue must be very closely connected with an ethical one, and must interact with it. C. S. Peirce, in referring to our reasoning, our believing and concluding, stated "we have here all the main elements of moral conduct; the general standard mentally conceived beforehand, the efficient agency in the inward nature, the act, the subsequent comparison of the act with the standard."[55] Peirce further asserts that "what is more wholesome than any particular belief is integrity of belief, to avoid looking into the support of any belief from fear that it may turn out rotten is quite as immoral as it is disadvantageous."[56]

In twentieth-century philosophy the connection between ethical and epistemic terms has been discussed at some length. A. J. Ayer defines knowledge as including "the right to be sure."[57] He writes of "being entitled" to talk about something being true[58] and of someone's "right to reproach me" if my epistemic credentials do not meet certain standards.[59] C. I. Lewis refers to the "sense in which cognitive rightness is itself a moral concern"[60] and holds that any belief that explicitly or implicitly has the character of an inferred conclusion - any belief such that the test of its correctness will "involve test of some inference implicit in it" - is either "justified, warranted and right" or "unjustified and wrong."[61] One is naturally assuming, of course, that the belief is within the subject's control. As Lewis states later, "the subject must have no reason to be unjustified and non-rational or irrational."[62] Roderick Chisholm has noted that epistemic reasoning and discourse are very much like ethical reasoning and discourse, and makes the strong claim that "when a man fails to conform to the ethics of belief he is, ipso facto, behaving irrationally."[63] He believes that many characteristics that philosophers "have thought peculiar to ethical statements also hold of epistemic statements"[64] and that "presuppositions of the theory of evidence are analogous, in fundamental respects, to the presuppositions of ethics."[65] Chisholm does not, in other words, attempt to conflate epistemic principles with ethical principles or to reduce the former to the latter, but points out that there are similarities in the process of justification.[66] One might ask, for example, whether it is ethical to believe anything, provided that it does not harm anyone.
John McDowell (1979), Ernest Sosa (1980, 1985), Alvin Goldman (1986), and Lorraine Code (1987) have held that, at least in part, epistemology should be thought of as an account of the intellectual virtues and, not unlike Chisholm, that there are important similarities between epistemic and moral evaluation. Code establishes connections between intellectual virtue, epistemic responsibility and wisdom, where wisdom "has to do with knowing how best to go about substantiating beliefs and knowledge claims, where best means with intellectual honesty and due care."[67] This process of justification involves "a willingness to let things speak for themselves, a kind of humility toward the experienced world."[68]

          2.5 W.K. Clifford's Ethics of Belief

The motivation and inspiration for this thesis have come from many sources, including the influence of Bertrand Russell and twenty-five years' experience teaching senior high-school Mathematics, but W. K. Clifford's perspicuous and incisive essay "The Ethics of Belief" (1877) provided the major impetus. W. K. Clifford (1845-1879), English mathematician and philosopher, was a brilliant student at Cambridge who, in his very short life,[69] became a distinguished professor (he was elected a fellow of Trinity College at the age of 23), public lecturer, and member of the London Mathematical Society as well as of the most prestigious intellectual society of the day, the Metaphysical Society. Clifford's examination of the basis of belief in the natural sciences led him to a more general analysis of belief. In fact, it was this general analysis of belief, and the agnostic and humanistic conclusion to which it led, that induced vehement opposition on the part of William James in his essay "The Will to Believe." Clifford argues that the survival of civilization itself depends upon the habit of forming only justified beliefs. Credulity, the propensity to hold unjustified beliefs, he asserts, threatens the very foundations of society - "the credulous man is father to the liar and the cheat."[70] He ultimately concluded, "It is wrong always, everywhere, and for anyone, to believe anything on insufficient evidence."[71]

In order to make his point and secure this unconditional imperative within the context of real-life events, Clifford tells several stories. One of these concerns the owner of an unseaworthy emigrant ship. The owner laid all anxieties aside, ignoring contrary evidence in order to believe in the ship's seaworthiness. He then authorized the voyage and collected the insurance when the vessel went down. Clifford declares that the ship owner "had no right to believe on such evidence as was before him"[72] and that, even if the ship had returned safely, his guilt would be diminished "not one jot."[73]

When an action is once done, it is right or wrong forever; no accidental failure of its good or evil fruits can possibly alter that. The question of right or wrong has to do with the origin of his belief, not the matter of it; not what it was, but how he got it; not whether it turned out to be true or false, but whether he had a right to believe on such evidence as was before him.[74]

 Hence, the ship owner is open to censure not merely because of what he did and its consequences, but also because of the manner in which he arrived at the belief upon which his action was based. The action itself is immoral, but the belief is epistemically irresponsible and morally reprehensible because "it is not possible to sever the belief from the action."[75]                                              

According to Clifford there are normative imperatives for the proper formation of our beliefs, just as there can be normative requirements for the control of our actions. These imperatives might be deemed duties to oneself - moral action done in accordance with beliefs formed for good reasons and not merely prudential or self-serving reasons. An action is not considered moral if done unintentionally or for self-interested reasons. Beliefs affect actions in so far as they embody expectations about what the results of those actions will or would be. Clifford might be asking why we punish a person for his irrational behaviour yet do not punish him for his irrational beliefs - a practice that our democratic freedoms of thought and expression would, in any case, prohibit. If certain things are believed to be true, it is considered rational to act in a certain way, yet thinking and action are treated as mutually exclusive by our system of justice: one can believe what one wants provided the belief does not spill over into action, although a belief that never issues in action is extremely rare. Nevertheless, the ship owner is unethical because his unwarranted belief in the ship's seaworthiness was motivated by his selfishness and self-deception. It was Adlai Stevenson, I recall, who once said, "we judge others by their acts, but we judge ourselves by our motives." For example, a person would be judged a saint for saving a drowning child even if his private motive for doing so was to engage in some aerobic exercise. If the ship had not gone down, the ship owner, as Clifford has argued, would still be guilty because of his unworthy motives. The ship owner distorted evidence and engaged in rationalization and self-deception in order to arrive at conclusions that were more self-serving and congenial to him.

Clifford's story of the ship owner reminds me of the 1948 movie All My Sons, in which a young man, played by Burt Lancaster, discovers that his father (played by Edward G. Robinson) was responsible for his company's shipment of defective aircraft parts to the U.S. Air Force, resulting in 21 deaths. The son initially succumbs to rationalization and refuses to accept evidence of his father's guilt because he is blinded by his affections, sense of commitment and familial attachment. The trust and commitment of intimate relationships often rule out strict conformity to the rules of evidence, but surely this does not mean that love and friendship should permit us to slide into a reckless and arbitrary flight from rationality and truth. Was it Tolstoy who once said that it is better to be deceived than to be skeptical? Surely this is bad advice on a used car lot or when listening to the appeals of politicians during an election campaign, yet it does seem to represent a sound prima facie maxim for love and friendship. However, in spite of the emotional bonds and the "will to believe" in his father's innocence, the son conducted his own investigation into the matter and came to the inescapable conclusion that his father was guilty, that "...he lied to himself and he doesn't know - he's got to see it and be his own judge. There are some things that are bigger than yourself and your family." It is clear that this "bigger thing" is the truth, and the truth, at least in this story, does eventually prevail. A more recent true story is the case of Christine Lamont and David Spencer, who were charged with kidnapping a São Paulo millionaire and subsequently sentenced to 28 years in a Brazilian jail. In spite of their conviction and more recent compelling evidence of their guilt, the parents of both of these young people remain convinced of their innocence.
We may admire the loyalty of these parents, who go on believing that their children are innocent of a crime in the face of evidence of their guilt; we may even admire the patriot whose slogan is "My country, right or wrong". Certainly there is merit in loyalty that does not give way too easily; but there is also ample room for the concept of misguided loyalty and misguided devotion.

Lorraine Code (1982, 1987) provides an example similar to the above, telling the true story of Philip Gosse, who falsified his scientific discoveries because they conflicted with his deeply held religious beliefs. Gosse was a nineteenth-century biologist who "chose to discount the findings of the new biology because of their incompatibility with his belief in the literal truth of the creation story as set forth in the Book of Genesis."[76] Gosse accepted the conclusion that Archbishop Ussher had arrived at from his study of the chronology of the Old Testament: that the world was created in 4004 B.C. In order to get around the conflict between Ussher's analysis of the Genesis account and his own scientific findings, Gosse maintained that God had indeed created the world in 4004 B.C., but had filled it with delusive signs of a much older world in order to test people's faith. The strength of the geological evidence, he argued, was proportional to the lengths to which the Deity was prepared to go in carrying out this test. Hence, according to this view, fossils are real, but their appearance of old age is illusory. What good, however, can this logical possibility do for the creationist's story? If God is a deceiver, as the possibility under consideration implies, then his words in the Bible are quite possibly deceptions as well. Even if we allow Gosse's postulation of a Creator and a beginning in time, there is no way of refuting his position: any evidence which geologists uncover can be assimilated and explained away by his "theory." But if the findings of science are to be accepted in other fields, it seems hardly plausible to assume that its laws break down at just that point. Gosse's failing resulted from his disinclination or inability to modify his beliefs when the evidence available to him indicated that he ought to do so.

Code describes Gosse as a man who is "quite unaware of his own dogmatism", which led to a "failure of integrity, wisdom, and epistemic responsibility."[77] In another sense, Gosse's failure was a failure of judgement. Inquirers of good epistemic character attempt to arrive at sound judgements by exercising their discretion appropriately in seeking out and assessing the worth of both evidence and counter-evidence. Some contemporary physicists with a "spiritual view" of the universe have distorted Quantum Physics and the Big Bang theory to argue for the existence of God.[78] What would be impressive, however, as Robert Nozick has pointed out, is some physicist with strictly materialist preconceptions being forced to conclude, by the weight of the evidence but contrary to his own personal biases and desires, that "the universe is at base spiritual."[79]

Philip Gosse's failing is explained in the psychological literature as cognitive dissonance - a propensity to defend oneself actively by distorting and denying evidence that disconfirms deeply held beliefs. An important study conducted by C. D. Batson showed that "a person confronted with irrefutable disconfirmation of a personally significant belief will frequently emerge, not only unshaken, but even more convinced of the truths of his belief than before."[80] Subjects in the study who both expressed the belief and accepted the veracity of the disconfirming information subsequently expressed a significant increase in the intensity of the belief. Cognitive dissonance and its partner, confirmation bias, may well be functions of tenacious beliefs that have been acquired by some process of indoctrination, a matter that I will deal with later.

          2.6 Clifford's Normative Epistemology: Evidence

By holding that a certain way of proceeding with respect to one's beliefs is "wrong" (but not in the sense that a certain mathematical procedure is wrong), Clifford is clearly proposing a moral thesis or principle. He maintains that believing on insufficient evidence leads to a variety of harmful consequences that include depravation of character, undermining of public confidence, irresponsibility and self-deception. Clifford puts sufficiency of evidence forward as a necessary condition of the legitimacy of belief or, at least, insufficiency of evidence as a sufficient condition of the immorality of belief. But what is the sufficient condition for a morally sound belief? For an ethics of belief to be tenable, it must be determined whether legitimacy is a function only of evidence and, if so, under what conditions the evidence is sufficient and the grounds adequate to affirm a belief. There are no clear-cut rules that we can assume a priori - it depends partly on the context of inquiry, the unique facts of the case and whether the relevant reasons adduced to support a conjecture or theory are considered to be sufficient.

Clifford's moral argument for basing belief only on epistemic reasons is, in a sense, ironic. His reason for not accepting beneficial reasons in justifying belief is itself ostensibly based on one type of beneficial reason - the undesirable moral consequences of doing so. Perhaps Clifford should have argued that there is a freestanding epistemological obligation to base one's beliefs on purely epistemic reasons. He could have argued that there is a prima facie epistemic duty to believe what appears to be true in the light of the evidence and, moreover, that there are general beneficial reasons for believing only what is implied by the evidence. This creates a strong presumption that one should believe something is true only on the basis of epistemic reasons. In addition, there is a presumption that beneficial reasons will be invoked only when there are no epistemic reasons for disbelief. In other words, beneficial reasons may be invoked to decide whether to believe some proposition p or to believe ~p when there are equally strong epistemic reasons for p and ~p. Although there are circumstances (which I will discuss shortly) in which this presumption can be overridden, they are uncommon. Hence, there is both a moral duty and an epistemic duty not to believe something on insufficient evidence unless there are good reasons to believe it.

Bertrand Russell offers a belief prescription that might be thought of as a mitigated version of Clifford's, namely, "that it is undesirable to believe a proposition when there is no ground whatever for supposing it true."[81] Russell's precept is less restrictive, for it applies only when there is "no ground whatever," whereas Clifford extends his prescription more aggressively to "insufficient evidence," suggesting disbelief whenever the grounds are inadequate. Russell's imperative, however, seems so persuasive that no rational person would deny it; it is difficult to imagine a belief for which one has no clues or indications whatsoever, not even a trace of evidence. But there do seem to be some things that we are justified in believing, such as free will and causation, that are not a direct function of "evidence". We are quite justified on some occasions in taking action based on beliefs for which we cannot have sufficient evidence. William James is right when he claims that "our passional nature not only lawfully may, but must decide an option between propositions, whenever it is a genuine option that cannot be decided on intellectual grounds."[82] In other words, few important decisions can be made by applying clearly formulated rules or maxims to objective, fully verified empirical evidence. Moreover, it does seem to me that there are some beliefs we are morally obliged to hold even though we do not have sufficient evidence on which to base them. In other words, there appear to be weak epistemic or non-epistemic occasions for believing - a manner of faith that is morally acceptable in the sense that we extend someone "the benefit of the doubt" in the absence of conclusive evidence. It is reasonable in science, for example, to give tentative assent to a hypothesis that is merely plausible yet lacking in evidential support, and it is reasonable in social contexts to trust others before there is sufficient evidence that the trust is justified, since trust is necessary for cooperation.
                                      

Giving the other person the benefit of the doubt can help one avoid bias in making moral judgements and is consistent with open-mindedness. For example, a wife might feel that she ought to believe that her husband did not cheat on her, or a person might feel that she ought to continue to trust a life-long friend whom she suspects may have betrayed her - even in the face of what would be considered, in the absence of the special relationship, adequate evidence to the contrary. Also, one could argue that the content of a belief might morally justify believing against the evidence. For example, although one must question the value of such research, suppose compelling evidence were found to support a claim of racial superiority. I think there are certainly cases, like the ones just mentioned, in which value considerations should bear upon beliefs that would otherwise be justified by purely factual evidence. Hence, rather than our usual philosophical concern over whether factual statements can justify value statements (the proverbial Humean "no ought can be derived from an is"), we might legitimately ask whether or not value considerations can help justify beliefs about purely factual matters (can an "is" be derived from an "ought"?).[83] If we accept that "ought" implies "can", then perhaps Clifford has no business telling us that we ought to believe only on the basis of sufficient evidence unless he takes belief to be, in some sense, a voluntary matter. I think it was Kant who stated somewhere in The Metaphysics of Morals that we have no obligation to believe anything. If I understand him correctly, Kant claimed that one has the right to believe (but not to "know") that God, soul, immortality, justice and freedom exist, not as metaphysical necessities, but as pragmatic moral necessities. We have the right to consider these notions as synthetic a priori truths if doing so will make us better, more successful people.
Kant's position on this issue is not unlike that of William James, whom I will deal with in the next chapter.

There are instances for which it could be argued, on practical or psychological grounds, that one should act as if one believed a proposition, even without sufficient evidence. For example, suppose that I have won only 20% of my tennis matches against Ralph. I am playing him on Saturday morning and, based on the evidence before me, I ought to believe that I will lose. However, believing that I will win (i.e., believing against the evidence) increases my confidence and hence creates evidence in favour of my winning the match. These psychological devices, "the power of positive thinking" and "mind over matter", can be useful, as many self-help and religious gurus have demonstrated. As William James has stated, "faith in a fact can help create the fact."[84] Someone seriously ill, for example, could justify his belief in recovery on the basis of (1) the salutary psychological or pragmatic causal effects of his belief that he will recover and (2) his physician's diagnosis/prognosis and laboratory tests. But surely only (2), and not (1), is epistemically relevant to the question of whether or not he is likely to recover from the illness. The question that must be asked, however, is: when do these mechanisms spill over into exercises in self-deception and neurotic vanity? If a person wants to be reasonable and realistic about his projects, relationships and beliefs, then fantasy, illusion and self-deception must be suppressed as much as is humanly possible. Akrasia is often a function of one's failure to confront the claims of reason and one's fear of facing the truth. Should I believe that I can defeat Pete Sampras or Stefan Edberg at tennis? Hardly! Eamon Callan views incontinent behaviour - akrasia, self-deception and wishful thinking - as dispositional, a function of the failure of intellectual autonomy and an unwillingness to face the "real world as it is."

What often causes confusion in this area is a tendency to drive a sharp wedge between the things we choose and the things that "simply" happen to us. It does not make much sense to say that one chose to believe such and such, but neither does it make much sense to say that a person's beliefs or feelings are given facts about himself in the sense that his mortality is. For what one comes to believe and feel is immensely influenced by how one chooses to direct one's mental energies. One can decide to face disagreeable facts or engage in wish-fulfilling fantasies instead; one can passively indulge in longings for what one has come to see (or what one should see) as futile, imprudent or evil, or else one can focus one's attention on how to live in the real world in which one finds oneself.[85]

          2.7 Clifford's Normative Epistemology: Responsibility

How and what one comes to believe, Clifford argues, involves not only an important obligation to oneself but also a responsibility to the rest of society. This manifests itself in two ways. First, some beliefs are not actively demonstrated, perhaps because of their insignificance, others because their expression would be in some way inimical, threatening or embarrassing to both the believer and society. However, Clifford does not accept the view that some beliefs, such as religious beliefs, are thus epistemically entitled. The reason he gives is that to hold such beliefs, no matter how trivial, disposes the mind to accept others like them and leads to a reckless, self-perpetuating attitude toward truth itself.[86] Vulnerability to bad arguments in one domain, for example, may open the door to being manipulated in another.

No real belief, however trifling and fragmentary it may seem, is ever truly insignificant; it prepares us to receive more of its like, confirms those which resembled it before, and weakens others; and so gradually it lays a stealthy train in our inmost thoughts, which may someday explode into overt action, and leave its stamp on our character for ever. Every time we allow ourselves to believe for unworthy reasons, we weaken our powers of self-control, of doubting, of judicially and fairly weighing evidence. But a greater and wider evil arises when the credulous character is maintained and supported, when a habit of believing for unworthy reasons is fostered and made permanent.[87]

The claim made here is that to deplore the communication of false beliefs is to assume that it is imprudent and immoral to believe what is not true. Generally speaking, this is a reasonable prima facie assumption, but, it could be argued, it is not universally correct. There are some things, such as the precise date of our death, of which we would rather not know the truth. The general principle that true belief is beneficial rests to a large extent on two notions: that truth has intrinsic value, and that most of what we do is done not for its own sake but as a means to some further desired or chosen end. Second, Clifford argues that not only are the actions I am led to by false or unreasonable beliefs harmful to others; so too is the possession and dissemination of false or unreasonable beliefs by others. For teachers, this dictum has important implications.

If I let myself believe anything on insufficient evidence, there may be no great harm done by the mere belief; it may be true after all, or I may never have occasion to exhibit it in outward acts. But I cannot help doing great wrong towards humankind, that I make myself credulous and lose the habit of testing things and inquiring into them. The harm which is done by credulity in a person is not confined to the fostering of a credulous character in others, and consequent support of false beliefs. Habitual want of care about what I believe leads to habitual want of care in others about the truth of what is told to me. People speak the truth to one another when each reveres the truth in his own mind and in the other's mind, but how shall my friend revere the truth in my mind when I myself am careless about it, when I believe things because I want to believe them, and because they are comforting and pleasant? [88]

Hence, one of the prices one pays for credulity, faulty reasoning and unwarranted beliefs is the familiar problem of the slippery slope. How do we prevent the occasional acceptance of a belief on insubstantial or flimsy evidence from influencing our habits of thought more generally? Thinking straight about the world is a valuable, yet difficult, process that must be rigorously cultivated and fostered. By attempting to turn our rational faculties and critical intelligence on and off at will, we risk losing them altogether, and this impairs or endangers our ability to see the world clearly and truthfully. Moreover, by failing to fully develop our critical faculties, particularly the will to doubt, we become vulnerable to the arguments and exhortations of those with other than honourable intentions. And as Stephen Jay Gould has noted, "When people learn no tools of judgement and merely follow their hopes, the seeds of political manipulation are sown."[89]

          2.8 Clifford's Normative Epistemology: Authority

Clifford's concern for the ethics of belief contains two moral elements: the first is expressed in the language of ethical obligation (one ought not to believe on insufficient evidence); the second is expressed in the language of intellectual or epistemic virtues and vices (reverence for the truth, persistent care regarding one's believing and avoidance of credulity). I will discuss these two elements later, but it should be stated that a proper examination of these two normative components of belief would involve a thorough philosophical analysis of the relevant epistemic and moral concepts, which far exceeds the scope of this thesis (I will, however, later discuss at some length the notions of truth and objectivity). Clifford ventures some distance into these areas in the latter half of his essay, pointing out the fallibility and limits of inductive inference, the necessity for "the assumption of a uniformity in nature", and the fact that, in dealing with our everyday needs and circumstances, we find that we must act on inconclusive evidence and probabilities. In order to avoid slipping into an undesirable state of universal skepticism, he states that

there are many cases in which it is our duty to act upon probabilities, although the evidence is not such as to justify present belief; because it is precisely by such action, and by observation of its fruits, that evidence is got which may justify future belief. We therefore have no reason to fear lest a habit of conscientious inquiry should paralyse the actions of our daily life.[90]

Clifford admits that the majority of what we believe and claim to know results from the acceptance of testimony and appeal to authorities. But why is it ever rational to accept anything another person tells you? Clifford states that our acceptance of authority is contingent upon three factors: the witness's veracity, his knowledge and his judgement.

In what cases, then, let us ask in the first place, is the testimony of a man unworthy of belief? He may say that which is untrue either knowingly or unknowingly. In the first case he is lying, and his moral character is to blame; in the second case he is ignorant or mistaken, and it is only his knowledge or his judgement that is in fault. In order that we may have the right to accept his testimony as grounds for believing what he says, we must have reasonable grounds for trusting his veracity, that he is really trying to speak the truth so far as he knows it; his knowledge, that he has had opportunities of knowing the truth about this matter; and his judgement, that he has made proper use of the opportunities in coming to the conclusion which he affirms.[91]

In other words, most of the time we cannot directly examine the evidence for a belief and must accede to the testimony of an authority or so-called "expert" (e.g., my belief that there is a country called Japan, where I have never been). This second-order inquiry, which is parasitic upon other people's inquiries, is how most of our knowledge is acquired. But this does not mean that we abdicate our epistemic autonomy and intellectual responsibility. Deference to expert opinion requires rational judgement and critical thought. Hence, in evaluating the testimony of another, Clifford insists that we ask three questions: (1) Is he moral (i.e., does he usually tell the truth)? (2) Is he a reliable source? (3) Did he arrive at his conclusions using acceptable methods of inquiry? One might be reluctant, for example, to judge as intellectually virtuous a teacher who is epistemically responsible in professional matters but dogmatic or credulous in private life.

Moreover, a person's testimony gains in credibility insofar as that person has nothing substantial to gain by being believed, and perhaps even something to lose. It should be pointed out, however, that it is fallacious to assume that an assertion should be dismissed as false, or an argument discredited as unsound, simply because it is uttered or presented by an interested party. But that a statement is uttered or an argument presented "by someone who is in a position to know, and has no motive for trying to deceive us, is for us, who are not in a position to know, better evidence for believing that it is true than the same assertion" [or argument] "made by someone in an equally good position to know, but with opposite interests."[92] The arguments of many postmodernist philosophers, neo-pragmatists and proponents of the sociology of knowledge are fallacious (and self-refuting) in this way. They argue that there cannot be any objective standards of rationality because all truth is distorted and rendered relative by vested interests, gender, ideological or cultural frameworks, the desire for power and domination, and so on. (I will be dealing with these arguments later in this thesis.) However, they do make one good point: one who is "in authority" is not necessarily "an authority". "One of the most important and difficult steps in learning who can be trusted is realizing that authority cannot create truth."[93] Students must learn to judge what and who are worthy and reliable sources of knowledge. We should value testimony only from those persons who have demonstrated reliability, intellectual honesty, critical inquiry, a cool and judicious skepticism, and who encourage others to question their claims.

          2.9 Primitive Credulity and Suspension of Judgement                  

To learn to distinguish the plausible from the implausible is to develop one part of wisdom; it leads as well as anything can toward true belief. David Hume developed principles of rational criticism and laid down important standards for separating wisdom from credulity, which he referred to as the "love of wonder and surprise" and "...the strong propensity of mankind to the extraordinary and marvellous."[94] H. H. Price, echoing C. S. Peirce, states "the natural tendency of the human mind is to believe any idea which comes before it, unless and until that idea is contradicted directly and obviously by sense experience."[95] This tendency he calls "primitive credulity." In order to counter this "natural tendency" and hold it in check, Price states that: 

The power of suspending judgement, of asking questions and weighing evidence, the power on which reasonable assent depends, is not something we possess from the beginning. It is an achievement, which has to be learned, often painfully. To put it in another way: the attitude of "being objective" about a proposition which comes before one's mind and assenting to it only with the degree of confidence which is warranted by the evidence, and of suspending judgement unless and until these conditions are fulfilled - this attitude is something which "goes against the grain" of our natural tendencies. We have to acquire this attitude of being "objective" and impartial, much as we have to acquire the power of controlling our instinctive desires.[96]  

The "suspension of judgement" is an option that Clifford does not seem to consider. He feels that we must find the threshold between belief and disbelief and decide one way or the other. Insofar as a person's ends are epistemic, one really has only three options - believing, disbelieving or suspension of belief. Disbelief is believing a sentence false, i.e., it is a case of belief. To believe a sentence false is to believe the negation of the sentence. For example, to disbelieve in clairvoyance is to believe that clairvoyance is false. Suspension of judgement could be construed as unbelief or non-belief, i.e., neither believing a sentence nor believing it false.[97] Although one could perhaps argue for degrees of truth, it sounds odd to make a statement such as "I believe that P is fairly true", but reasonable to say "I believe that Q is probably true." Consequently, when there exists some doubt or if the supporting evidence for what we are asserting is inconclusive, it surely makes sense to say "I am inclined to believe Q." It should be noted that a person must be as responsible in her disbelief as in her claim to believe or to know and often suspension of judgement, pending further evidence, is the most responsible alternative.


Beliefs are about the world and their truth determined by it, not by us: that the fit is of belief to the world, not the world to belief - Michael Stocker


                 (3) BELIEF, PRAGMATISM AND TRUTH                

          3.1 The Concept of Belief  

Belief is not an activity but a cognitive state of mind; some beliefs are mutable, while others endure. A belief is not an affective state like an emotion or desire, but a disposition to respond or act in certain ways when the appropriate situation arises.[98] Many people are disposed to credulity - a tendency to be easily deceived and to accept propositions too readily or on weak or insufficient evidence. Credulity is a second-order disposition concerning how we arrive at beliefs, and one that can lead to subjectivity. It is displayed in unqualified assent to propositions, or in belief in propositions that are not sufficiently grounded to justify belief in them. Credulity may be a function of natural ignorance or an uncritical insensitivity in assessing evidence. It may result from a readiness to accept the prescriptions of authority or pre-eminence, from a submissive or acquiescent personality, or from mental self-manipulation, self-indulgence or intellectual sentimentality.

We often try to convince people to believe things ("Believe in God and you will have everlasting life.") or implore them not to ("How can you believe in the nonsense of astrology?"), and although believing would hardly make sense if it were not a matter of decision, it cannot be construed as an action. It is characteristic of both believing and acting that they can be easy or hard and that we provide reasons for them. But if believing were an activity or exercise, then when it ceased to occur we would stop believing. For example, it makes sense to say "I will play tennis, but not right now", but it does not make sense to say "I will believe that Clyde won the match, but not right now".

It is generally accepted that believing is easy, and knowing is hard. It takes something more to know because knowledge requires, besides mere belief, some reliable coordination of internal belief with external reality. If one takes no thought for whether a belief is true or false, reliable or unreliable, then believing itself is simply an arbitrary game with no rules - a sort of mental helter-skelter. Knowing is hard, but the cognitive demands of believing should not be child's play either. To speak of simply deciding to believe something, independently of any reasons real or imagined, is to stretch the notion of "belief" beyond belief. After all, is not care in managing our beliefs exactly what the study of reasoning and philosophy is supposed to teach us? Suppose, for example, that the richest man in the world will grant you a billion dollars if you believe in the Tooth Fairy and disbelieve in gravitation. Suppose also that this rich man has telepathic powers by which he can decipher the contents of your mind and ascertain what you "really" believe. Is it possible for belief to be an act of the will in this way? I do not think so. To believe in the Tooth Fairy and reject gravitational theory, a person is going to have to make serious modifications to the remainder of his belief system and come to believe a whole range of other propositions that will become epistemically irrational for him. The nature of belief prevents this, since belief surely cannot be a simple non-epistemic act of the will, although those who speak of a "leap of faith" reject this hypothesis.

Is it then possible for us to believe something while holding there is no more reason to believe it than its contrary? Is the notion of a leap of faith psychologically intelligible at all? These are difficult questions. Moreover, since a leap of faith can be made to any one of an infinite number of metaphysical positions and worldviews, what criteria are to be used in the process of selection? The notion, recurrent in Kierkegaard's writings on Christianity, that belief is subject to the will is a highly problematic one in philosophy, and is plagued by ambiguities. We can surely decide to act as if we believed a particular proposition to be true, leaving the matter of its actual truth-value, provisionally at least, undetermined. This is clear enough, and it often legitimately occurs. Kierkegaard seemed to suggest, as did Pascal, that personal involvement, commitment and perseverance in action would have the consequence that a person will eventually come to believe the proposition in fact and not merely hypothetically. What is less likely is that we can, consciously and directly, set ourselves to believe something tout court, irrespective of any grounds we might have for supposing it to be true and even perhaps in the face of what we see to be overwhelming evidence to the contrary. These difficulties are, moreover, exacerbated if what we are asked to believe is stated to be inherently paradoxical - not only lacking in objective foundation, but intrinsically offensive from a rational standpoint. In what sense can I undertake to believe something that I recognize to be inconsistent, self-contradictory, opaque, contrary to experience or literally unthinkable?
But this, it would seem, is the essence of Kierkegaard's thesis - the very intelligibility of the claim that one can believe what one at the same time recognizes to be necessarily or demonstrably false.[99] Many beliefs and belief systems cannot in fact be epistemically justified, since they are culturally inculcated conditioned responses, a function of habit, or perhaps the product of a doctrinaire upbringing. It is no coincidence that, out of all the belief systems and worldviews, the overwhelming majority of people just happen to choose the one that their parents or society adhere to rather than the belief system that is the most coherent and plausible, has the most evidential support, the most equitable moral code, the best miracles, and so on.

It could be argued, however, that all beliefs grounded in authority are to some extent a function of "faith," particularly if there is some element of risk in acting on those beliefs or putting them into practice. For example, it would seem odd to say that one has faith that "2+3=3+2" or that "the sun will rise tomorrow" but quite sensible to say "I have faith in Allah and the tenets of the Koran." Faith, it would appear, is a species of belief, and it makes sense to speak of a credulous or "blind" faith (e.g., "The Lord will provide" or "My country, right or wrong") and a rational faith (e.g., "I have faith in my physician's diagnosis of a peptic ulcer"). "Blind faith" can justify anything, but "rational faith" (or perhaps "trust" would be a better word) must have some significant degree of evidential support. To use faith as if it were an alternative way to the truth cannot bypass the crucial question of whether such results really have any likelihood of being true.[100]

          3.2 Belief and Truth                                                                     

Belief is not independent of truth since: (1) what is believed must either be true or false (since the formal object of belief is always a proposition), and (2) what is believed, even if it happens to be false, is believed to be true. If we value the truth, then "psychological" assent to a proposition, without a commitment to determining whether or not it is true, comes at a heavy cost to intellectual integrity. Intellectual integrity requires that one be on guard against what one wishes to be true and that one pursue an argument even if it leads to conclusions that are judged to be regrettable. For example, some may desire that materialism be false, despite the strong arguments in its favour. I have also suggested in the previous section that the concept of belief construed as an act of the will is problematic. If the object of belief is truth, then

If I could acquire a belief at will, I could not acquire it not knowing whether it was true or not; moreover I would know that I could acquire it whether it was true or not. If in full consciousness I could will to acquire a "belief" irrespective of its truth, it is unclear that before the event I could seriously think of it as a belief, i.e., something purporting to represent reality.[101]

It could be said that an increase or decrease in knowledge is an increase or decrease in its extent, whereas sometimes an increase or decrease in a belief is an increase or decrease in its intensity. But the intensity of a belief cannot be counted on to reflect its supporting evidence any more than its causes can.[102] One obvious test of evidence is this: "would it still be taken to support the belief if we stripped away all motives for wanting the belief to be true?"[103] W. K. Clifford has stated:

The fact that believers have found joy and peace in believing gives us the right to say that the doctrine is a comfortable doctrine, and pleasant to the soul; but it does not give us the right to say that it is true. And the question that our conscience is always asking about that which we are tempted to believe is not, "Is it comfortable and pleasant?" but, "Is it true?"[104]

Many beliefs are, of course, a function of a process of rationalization, wishful thinking or self-deception, and it remains an open question whether all those processes are morally wrong. Nevertheless, "to maintain any belief while dismissing, or refusing to give due weight to, reasonable and relevant objections, is to show that you are more concerned to maintain that belief than really to know whether it or some other is, after all, true."[105] As I have pointed out earlier, there is a strong inclination, particularly among adults, to believe what they want to believe, to see what they want to see, to conclude what they expect to conclude and to ignore or discount disconfirming evidence. A person's motivations influence his beliefs via the subtle ways he chooses a comforting pattern from the fabric of evidence. A person's preferences influence not only the kind of evidence he considers, but also the amount he examines. When the initial evidence supports an individual's preferences, he becomes self-satisfied and terminates the inquiry. Conversely, when the initial evidence is unfavourable, he resumes his search for confirmatory evidence to reveal reasons to believe that the original evidence was faulty. For example, when Jane loses 6-1, 6-0 to Joan in tennis, rather than accepting the rather clear evidence of her deficiencies as a tennis player, she will attempt to account for the loss by searching for further "evidence" such as the string tension of her racquet, external distractions, her biorhythms, and so on.

          3.3 Belief, Faith and Pascal's Wager

Religious faith, which H. L. Mencken quite rightly defined as "the illogical belief in the highly improbable" and which Nietzsche defined as "not wanting to know what is true,"[106] is relevant to the present discussion. A colleague recently tried to convince me of the belief in the existence of the Christian God by arguing that "you can't prove that God does not exist"[107] and explained why this particular belief is a "no lose" situation. I countered by stating that the burden of proof for exceptional claims such as the existence of God rests with the believer and quoted T. H. Huxley's well known axiom: "extraordinary claims require extraordinary evidence" and "the more a fact conflicts with previous experience, the more complete must be the evidence which is to justify one's belief in it." Moreover, when there is no good reason for thinking a claim to be true, that in itself is good reason for thinking the claim to be false and, accordingly, a proof of X's non-existence usually derives from the fact that there is no good reason for supposing that X does exist. Michael Scriven subscribes to this view when he states that "the proper alternative, when there is no evidence, is not mere suspension of belief: it is disbelief."[108] Bringing this maxim to bear on theism, Scriven goes on to say that "atheism is an obligatory default position in the absence of evidence of God's existence."[109] If epistemically null conditions could obtain for any proposition p and its denial ~p, then it seems we would be forced to disbelieve p, thereby believing ~p, and to disbelieve ~p, thereby believing p. But this is absurd, so epistemically null conditions cannot obtain both for a proposition and its denial. The absence of a positive epistemic consideration in favour of p will just be a positive epistemic consideration in favour of ~p, and vice versa.

When I stated further that, in my opinion, Pascal's notorious wager[110] and William James' "Will to Believe" are strange distortions of the notion of belief, my colleague responded by saying that he was familiar with neither Pascal nor James. I explained Pascal's wager and argued that, since there is little or no evidence for the existence of God and rational people harbour reasonable doubts about it, surely a just God who values rationality would not punish people for being reasonable. In fact, he/she/it might even reward the skeptics for their independent habits of thought and punish believers for their credulity. In other words, there might be a god who looked with more favour on honest doubters and atheists who, in Hume's words, proportioned their belief to the evidence, than on mercenary manipulators of their own understandings. Indeed, this would follow from the ascription to God of moral goodness in any sense that we can understand. The sort of god required for Pascal is modelled upon a tyrannical narcissistic monarch stupid and vain enough to be pleased with self-interested flattery. We are, in effect, back with the god of the Book of Job, and, whatever we may think of Job himself, there can be no doubt that Jehovah comes out of that story very badly. Moreover, it never seems to be thought that since God has made us flawed in so many ways, He might also have seriously limited our capacity to find out precisely what he wants us to do.[111]

A further quite obvious objection to Pascal's wager is the problem of numerous different versions of theism, many of which promise eternal bliss, all vying for credence. It is even logically possible that there exists a being who promises infinite eternal reward to all and only those who deny the existence of other claimants to worship, including the Christian God, yielding a dilemma equivalent to a practical contradiction: a rational person both ought and ought not to bet on, say, the Christian God. Perhaps the biggest reason why Pascal's wager is a failure is that if God is omniscient he will certainly know who really believes and who believes on the basis of cost-benefit analysis. He will spurn the latter - assuming he actually cares at all whether people believe in him. And finally, it would seem that a fair and just God would judge people on their actions in life and not whether they happen to believe in him.

Pascal's wager could at most give us only psychological reasons (motives) for wishing that we believed or could believe. What we lack are good reasons (grounds) for having this kind of faith. The postulation of a supreme being has, ever since I was a youngster, seemed to me "throwing in the towel", a refusal to take complex problems seriously - a facile, groundless and evasive response to deeply disturbing difficulties. It welcomes the self-comforting delusion that we know what we do not know, and have answers that we do not have, thereby denying the true humility of awe, wonder, mysteriousness, and perhaps, inexplicability of what is. By sheer chance I had what I perceive to be the good fortune to arrive at these conclusions early in life when I became aware of the contradictions between my personal experiences and what I was told at Sunday School. One example of this was the result of my mother's efforts to comfort me following the death of my dog Rusty who was killed after being struck by an automobile. My mother assured me that I would eventually meet Rusty again in a place called "heaven" but later at Sunday school I was informed that dogs do not have "souls." Only humans have souls and can get a ticket to the opaque destination of "heaven". The explanations of both "heaven" and the "soul" provided to me by the Sunday School teacher and my dear mom made little sense to me. Not finding these religious concepts and explanations very comforting or credible, I have since been highly suspicious of facile explanations or solutions to difficult questions. The efforts of my mother and my teachers to answer my pressing metaphysical queries were not very successful, and I could never understand why so many other students were not interested in better answers to questions that seemed so important to me and appeared to have no easy answers. Why not be honest and simply say "I don't know"?
Surely if beliefs persist when there are no reasons for holding them, we should look for causes: look at what makes people believe as they do. Belief in God is irrational - perhaps absurd - but, as Feuerbach, Nietzsche, Santayana and Freud have shown, the psychological need for this construct of the human psyche is so emotionally compelling that in cultures like ours many people must believe in spite of the manifest absurdity of their beliefs. They can see and accept the absurdity in the religious beliefs of primitive tribes and ancient cultures and sometimes, as with Kierkegaard and Pascal, they can partially see it and accept it in their own culture, but the acceptance is not unequivocal and the full absurdity of their own belief remains hidden from them.

Bertrand Russell was highly critical of pragmatic devices like Pascal's wager when he wrote:

The true precept of veracity, which includes both the pursuit of truth and the avoidance of error, is this: We ought to give every proposition which we consider as nearly as possible that degree of credence which is warranted by the probability it acquires from the evidence known to us. The further questions, what propositions to consider, and how much trouble to take to acquire knowledge of the evidence, depend of course upon our circumstances and the importance of the issue. But to go about the world believing everything in the hope that thereby we shall believe as much truth as possible is like practising polygamy in the hope that among so many we shall find someone who will make us happy.[112]

I am inclined to think that Pascal's advice carries with it a large dose of intellectual dishonesty and self-deception. William James, as we shall soon see, appears to advocate the same thing. Brand Blanshard attacks this Pascalian/Jamesian pragmatic defense of religious claims:

...the only evidence that is relevant to the truth of a belief is evidence that is logically relevant... James was thus left in the uneasy position of saying that we were justified morally in accepting what we were clearly not justified in accepting logically; I say uneasy because if we know that we are not logically justified, to say that we are morally justified is to warrant an attempt at self-deception.[113]

Hence, what ought not to be allowed, if rational belief is to be objective, is that it be dependent on the particular value judgements or motives of the believer or on the kind of application or consequences the belief is going to have.                                                                                       

Beliefs are, in many respects, like possessions. An individual acquires material goods because of the satisfaction they provide and one often feels inclined to purchase and retain beliefs in a like manner.[114] This similarity is captured in our language by people referring to beliefs as being "adopted", "inherited", "acquired", "held", "maintained", "lost", and "abandoned." However, in the marketplace of beliefs one must be an astute and discriminating shopper. There are many things one is enticed to believe, and to do so would often be comforting and agreeable. The will to believe in ESP, for example,[115] is likely motivated by the fact that it entails several other comforting corollaries and opens up many inviting prospects such as the prospect of an "afterlife." There are many beliefs to be purchased at bargain-basement prices; but in acquiring some of these consolatory beliefs, one pays a high price in rationality and intellectual sincerity. There are many things we would like to believe but reality gets in the way. Furthermore, many people tend to be quite protective and tenacious about their beliefs and become overly sensitive and defensive when their beliefs are challenged and exposed to intellectual scrutiny and criticism. Others, aware of this neuroticism, are reluctant to openly question the beliefs of another, particularly those beliefs that lack substantial evidential support such as political and religious creeds. Many people of course try to avoid potential conflict with others and often feign agreement with the claims of others in order to "play ball" or to avoid being branded by the "group" as offensive, negative, unfriendly, or hostile. The hidden assumption in statements such as "I trust you won't mind if I'm perfectly frank" is usually false when it comes to criticism of another's cherished beliefs. 
Moreover, with the notion of "political correctness" seemingly dominating and restricting present day rational discourse, many topics such as religious and ethnic beliefs have become societal sacred cows. Many skeptics consider it dangerous politically or socially to apply their critical thinking to scriptural claims, for example. The plight of Salman Rushdie is ample evidence of the paranoia that presently exists, and probably always has existed, in religious communities.

Returning to Pascal, it would seem that with his infamous wager he was employing the “Principle of Insufficient Reason” which John Maynard Keynes, in his Treatise on Probability, renamed the “Principle of Indifference”. The principle can be stated as follows: if a person has no good reason for supposing a proposition to be true or false, then he assigns even odds to the probability of both truth and falsity. The principle has had a long and notorious history, having been applied in such disparate fields as science,[116] statistics, economics, philosophy, ethics and psychic research. Unfortunately, its application often leads to absurdities and paradoxes, if not wholesale logical contradictions. If one assumes that Pascal believed the odds of both the Christian God existing and the Christian God not existing to be even[117], one can easily see how it can lead to inconsistencies.

To illustrate the problematic nature of the principle, consider the following. All theistic religions make claims about the existence of their God or Gods. There have been a multitude of theistic claims throughout history, and it is estimated that there are on the order of 200 theistic religions in the world today. In one of his well-known satirical essays, Memorial Service (1922), H. L. Mencken lists approximately 100 Gods that are no longer with us because of the dissolution of the cultures that believed in them. As Mencken proclaims in the closing statement of the essay, “They were all gods of the highest dignity - gods of civilized people - worshipped and believed in by millions. All were omnipotent, omniscient and immortal. And all are dead.”[118] All these theistic religions claimed exclusivity; that is, each held that its God is the one and only God. Now suppose, following the Principle of Indifference, that the probability of the Christian God’s existence is .5, the probability of the Muslim God’s existence is .5, the probability of the Jewish God’s existence is .5, and so on. Let n represent the number of religions making exclusive claims to a deity. Because of exclusivity, the Christian God exists only if all the rival Gods do not, so the probability of the Christian God’s existence works out to (.5)^n, while the probability of the Christian God’s non-existence remains .5. But the probability that the Christian God either exists or does not exist must equal 1 - an obvious contradiction, since (.5)^n + .5 = 1 if and only if n = 1.
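The arithmetic of this contradiction can be set out as a short derivation (a sketch only; it treats the individual indifference assignments as independent, which is precisely the assumption the principle invites, and writes G1 for the Christian God and G2, ..., Gn for the rival deities):

```latex
% Principle of Indifference applied to n mutually exclusive theisms.
% By indifference, each deity G_i is assigned probability 1/2 of existing.
% Exclusivity: "G_1 exists" entails "G_2, ..., G_n do not exist", so
\[
  P(G_1 \text{ exists})
  \;=\; \underbrace{\tfrac{1}{2}}_{G_1}\,\cdot\,
        \underbrace{\tfrac{1}{2}\cdots\tfrac{1}{2}}_{\neg G_2,\,\dots,\,\neg G_n}
  \;=\; \left(\tfrac{1}{2}\right)^{\!n},
  \qquad
  P(G_1 \text{ does not exist}) \;=\; \tfrac{1}{2}.
\]
% These two outcomes are exhaustive, so their probabilities must sum to 1:
\[
  \left(\tfrac{1}{2}\right)^{\!n} + \tfrac{1}{2} \;=\; 1
  \quad\Longleftrightarrow\quad
  n = 1.
\]
```

For any n greater than 1 the probabilities fail to sum to one, which is the inconsistency described above.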

It might be argued, in defence of Pascal, that he was actually following a course recognized as valid in the theory of utility in Decision Theory (i.e., expected utility = p(outcome) × value(outcome)). The only unresolved issue is that Pascal uses infinity as a multiplier; i.e., "degree of happiness" = ∞ (infinite happiness in heaven), while p (of this bliss) = 1/n (n → ∞), since, based upon subjective empirical probabilities concerning God's existence, it would be reasonable to assign a probability near zero to this event. Such relationships are considered acceptable in Decision Theory. There is, however, some serious question about the use of probability assignments at all. It seems clear to me that the only interpretation of probability relevant and usable here is the subjective one; yet how are even subjective probability assignments supposed to arise out of the mere insistence that theism is not demonstrably impossible? Its mere possibility need not be taken to endow it with any positive probability at all. Even if it could be argued that theism and atheism are in approximate epistemic parity, no decision between them can be made on purely epistemic grounds, and some form of agnosticism would seem to be the appropriate doxastic stance if no considerations other than purely epistemic ones could or should enter into such decisions.[119] It should also be pointed out that Decision Theory by itself is an instrumental theory of best action, not of rational action. For those who are intellectually honest, belief is based upon evidence and plausibility, not upon the power of the will or cost-benefit analysis. Moreover, is it not irrational to gamble on an infinitesimal probability, even though the stakes are high? Purchasing a "Lotto" ticket when the odds of winning anything of significance are on the order of fourteen million to one is a case in point.
There is a reason the names of people who win nothing in Lotto 6-49 are not printed in the newspapers: the list would require several thousand pages of very fine newsprint.
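The fourteen-million-to-one figure is straightforward combinatorics, and working it through also shows why Pascal's wager depends on an infinite prize. The following sketch uses a hypothetical $2,000,000 jackpot and a $1 ticket price purely for illustration; only the 6/49 arithmetic is fixed by the game itself:

```python
from math import comb

# A 6/49 lottery draws 6 numbers from 49; order is irrelevant,
# so the number of distinct tickets is "49 choose 6".
total_tickets = comb(49, 6)
print(total_tickets)  # 13983816, i.e., odds of roughly 14 million to 1

# Expected value of a $1 ticket against a purely hypothetical
# $2,000,000 jackpot with no lesser prizes (illustrative figures only):
p_jackpot = 1 / total_tickets
expected_value = p_jackpot * 2_000_000 - 1  # net of the ticket price
print(round(expected_value, 2))  # about -0.86: a sure loss on average

# Pascal escapes this arithmetic only by making the payoff infinite,
# so that any nonzero probability yields infinite expected utility -
# which is exactly where the objection to his multiplier bites.
```

With any finite prize the wager is a plain losing bet; it is the infinite multiplier, not the decision-theoretic form of the argument, that does all the work.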

          3.4 William James' "Will to Believe"

Central to William James' variety of Pragmatism is the view that there are no epistemic virtues.[120] James asserts that "the true...is only the expedient in the way of thinking"[121]. Truth is "any idea upon which we can ride..., any idea that will carry us prosperously from any one part of our experience to any other part, linking things satisfactorily, working securely, simplifying, saving labour."[122] He explains the connection between belief and truth as follows:

Let me say only this: truth is one species of good, and not, as is usually supposed, a category distinct from good, and coordinate with it. The true is the name of whatever proves itself to be good in the way of belief, and good too for definite and assignable reasons. Surely you must admit this, that if there were no good in life for true ideas, or if the knowledge of them were positively disadvantageous and false ideas the only useful ones, then the current notion that truth is divine and precious, and its pursuit a duty, could never have grown up or become a dogma...If there be any life that is really better we should lead, and if there be any idea which, if believed in, would help us lead that life, then it would be really better for us to believe that idea, unless, indeed, belief in it incidentally clashed with other greater vital benefits. “What would be better for us to believe?” This sounds very like a definition of truth. It comes very near to saying “what we ought to believe,” and in that definition none of you would find any oddity. Ought we ever not to believe what it is better for us to believe? And can we then keep the notion of what is better for us, and what is true for us, permanently apart?[123]

James’ argument appeals to moral advantages but is also a prudential argument in that theoretical justification of one’s religious belief might eventually turn up, given enough time. But to adopt a belief simply because it might do you more good than harm is really to abandon any concern with the truth. “A belief can be a condition of life,” wrote Nietzsche, “and nonetheless be false.” “Among the conditions of life might be error.”[124] “The psychology of error,” writes Nietzsche, results when “cause is mistaken for effect; or the effect of what is believed true is mistaken for the truth; or a state of consciousness is mistaken for the causation of this state.”[125] Nietzsche’s harsh words need to be taken seriously if anyone is tempted vaguely to imagine that a belief passed the test of truth precisely in being shown to have consolatory, “life-enhancing,” or “self-actualizing” power.                   

John Dewey, although influenced by James, held to a coherence theory of truth which, although somewhat muddled, would in no way allow acceptance and assent to a proposition simply because one is comforted or pleased by believing it to be true. In fact, Dewey spoke with a degree of contempt for those who tried to evade reality and cling to compensatory and consolatory values[126] (I will return to Dewey’s notion of truth later in this thesis). Regarding religious beliefs and “religious experience,” Dewey asserted that “...genuinely sound religious experience could and should adapt itself to whatever beliefs one found oneself intellectually entitled to hold”[127] and that “it makes all the difference in the world in the value of a belief how its object is formed and arrived at.”[128] For Dewey questions of method were paramount and he insisted that all beliefs be subject to critical inquiry, employing “the best available methods...and testing as to matters of fact; methods, which are, when collected under a single name, science.”[129] Israel Scheffler, in a critique of James’ pragmatism stated:

The effects of a belief on the believer are altogether irrelevant to the question whether or not the belief is true. That a belief comforts the believer is no count at all in favour of its truth, and that a belief is unpleasant to contemplate is no count against its truth. What counts at all is whether things appear likely, on the evidence, to be as the belief asserts them to be, and clearly this condition is logically independent of the psychological effects of accepting the belief.[130]

The essential and crucial distinction between, on the one hand, truth, and, on the other hand, the power to fulfil, surely needs honest discussion in courses of religious studies, for example. The teacher of religion should certainly make her students aware of the self-protective devices a religion may sometimes employ in order to discourage its believers from questioning its basic claims. If one accepts James' argument, one would need to be extremely careful about what particular religious doctrine one claimed to have a right to believe on ethical grounds. Some doctrines seem more likely than others to have morally bad consequences and the violent history of religion attests to this fact. Certainly it would seem that belief in the doctrine of heaven (reward) and hell (punishment) is a seriously confused, if not depraved, notion of the moral point of view and it seems more likely to make one intolerant in a morally undesirable way. And if one were, for example, attempting a pragmatic justification of belief in God's existence, one would need to show that the "argument from evil"[131] does not constitute sufficient grounds for rejecting the belief. One objection is that if we allow supposed morally advantageous consequences of a belief to count as reasons for accepting it, then it is a small move to allowing alleged morally disadvantageous consequences of accepting a belief as reasons justifying its rejection, even when the belief has strong theoretical justification. Some Creationists, for example, have argued on such grounds that the theory of evolution should be rejected.
It should be pointed out that some of the strongest challenges to religion are moral objections: war, persecution, racism, slavery, cruelty, torture, genocide, prohibitions on the free exercise of reason and inquiry, convoluted arguments about the nature and existence of gods, subordination of women, and (in Christianity) the debasement and enervation of life with a doctrine of original sin and the exploitation of nonhuman life and natural resources.

One way of testing belief, powerful where applicable, is to ask the professed believer to put his money where his mouth is. Acceptance of a wager reveals sincerity, and the odds accepted conveniently measure the strength of the conviction for the belief. This method is applicable only in cases where the believed proposition is one that can eventually be decided to the satisfaction of both parties so that the bet can be settled. Certainty, it could be said, "involves the willingness to risk everything if you are wrong over against no gain if you are right."[132] Moreover, the demand for certainty is often associated with "causes", dogmatism, commitment to an ideal and the need or love of power. The quest for power is often inversely proportional to the quest for critical inquiry. Reason and the critical faculties are often forced into a hasty retreat in the face of an all-consuming ambition to achieve or establish control. Power corrupts and one must be wary of explanations that appeal to ulterior motives, hidden agendas or are distorted by charismatic (and often psychopathic) character traits. We should also be mindful of explanations that seem to work too well, explanations that are un-testable and always seem to be available, such as, "whatever God wills happens."[133] An irreversible commitment to a religion with theistic and other beliefs on the ground of the supposedly beneficial effects of these beliefs is a form of self-betrayal that has potential moral and practical disadvantages of an overwhelming kind. "Convictions are [often] more dangerous enemies of truth than lies," as Nietzsche has claimed[134] and ready-made answers often let one down when subjected to pressure.

Charles S. Peirce intimated that true beliefs are intrinsically valuable and further stated that a belief is something upon which we are prepared to act. In "The Fixation of Belief", Peirce writes:

The person who confesses that there is such a thing as truth, which is distinguished from falsehood simply by this, that if acted on it should, on full consideration carry us to the point we aim at and not astray, and then, though convinced of this, dares not know the truth and seeks to avoid it, is in a sorry state of mind indeed.[135]


          3.5 Pragmatism and Science

Richard Rorty, who espouses a pragmatic theory of truth not unlike that of James[136], denies epistemic realism, conflating it with its naive and representational forms. He also sees this denial as a natural upshot of his rejection of Cartesian dualisms.[137] Following Dewey's lead, but taking a much more radical position, Rorty makes the sole purpose of scientific inquiry a thoroughgoing pragmatic affair. Science simply becomes the most effective means for predicting, controlling and coping with the world, rather than the collective human effort to understand it. The human capacity and desire for disinterested knowledge, which receives vivid expression in the history of science, simply disappears from the scene. But pragmatic theories of science are extremely implausible to anyone who takes scientific practice seriously. No one would reasonably deny the efficacy of science or its utilitarian value, but this narrow view confuses science with technology and pure disinterested inquiry with instrumental reason.

Rorty's radical pragmatic approach to science pales in comparison to the epistemological anarchism of Paul Feyerabend, a malcontent philosopher of science, who writes:

But rationalism has no identifiable content and reason, no recognizable agenda over and above the principles of the party that happens to have appropriated its name. All it does now is lend class to the general drive towards monotony. It is time to disengage Reason from this drive and, as it has been thoroughly compromised by the association, to bid it farewell.[138]

Feyerabend rejects rationality as a "tribal creed" and argues that, even within a cultural context, "objective reasons simply do not exist."[139] All reasons are "subjective" inclinations and as good as their contraries; no one can be judged to have chosen or acted unreasonably. In discussing his "refusal to condemn even extreme fascism," he acknowledges his negative attitude toward fascism, but insists that "the problem is the relevance of my attitude" and "an inclination I follow and welcome in others."[140] There are no objective arguments to combat fascism, but neither are there objective arguments to criticize anything else, not even the burning of the children of "witches" in order to save their "souls."[141] Feyerabend's rejection of the conceivability of objective standards of rationality precludes any possibility of epistemic responsibility or an ethics of belief. If all moral criteria, for example, are solely relative or internal, we must cast out all such terms or expressions as true, false, objective, bias, impartial, progress, advance, improvement, decline, regression or deterioration in our descriptions or explanations of human thought or action, except in areas of technical accomplishment. It would even be difficult to talk of deterioration or greater wisdom in individuals (e.g., Gandhi vs. Hitler) unless we had non-arbitrary grounds for evaluating them. Once we accept the principle that any thought or action is rational or ethical only in accord with the norms of a particular group, we slide inexorably into an acceptance that the perception of each individual must count as being as rational or moral as any other individual's, since no external criteria are available. Both logic and the nature of our cognitive equipment impose restraints on what counts as rational or reasonable, and even though there is no one position, no Archimedean point, from which value-free knowledge can be developed, some positions are clearly better than others.

Feyerabend is, of course, correct when he accuses Reason (in the context of Western Culture/Science) of having been destructive. The ideals of rationality and objectivity have been used as ideological devices and instruments in the pursuit of highly questionable and narrow cultural interests. But to reject rationality (and its attendant intellectual virtues) for this reason is like banning baseball bats because they have been used for purposes other than hitting a baseball. The other curiosity about Feyerabend's "arguments" (albeit not very compelling ones) is that they are just that: arguments. If they are correct, then there is no reason for us to accept his epistemological anything-goes free-for-all; they are rendered impotent by their own self-refutation.

Feyerabend's "anything goes" approach to scientific methodology, involving "the rejection of all universal standards and of all rigid traditions"[142], threatens the important connections between cognitive inquiry, objectivity and truth. Surely science can maintain an open-minded self-critical attitude toward its method of inquiry in order to avoid ideology and dogmatism. But Feyerabend pushes his tolerance for pseudoscience, and for eccentric ways of acquiring useful information about nature; into such an extreme relativism that science ceases to be a rational enterprise any better than that of religion, psychics or mystics. It is simply silly to give a serious hearing to every proposal or hypothesis that comes along. Should astronomers pay attention to the flat earth proponents or Astrology? Ideological differences notwithstanding, should scientists extend a hearing to Creationists, Biblical prophecies, Christian Scientists or to a "back to Ptolemy movement?" It seems to me a frivolous relativism and an open-mindedness that has lapsed into credulity that tells us they must not. There is as much difference between an "open-mind" and a "hole in the head" as there is between "tolerance" and "anything goes." One surely need not feel concerned about being labelled closed-minded simply because she quickly dismisses a claim to precognition, communication with discarnate spirits, an Elvis sighting, or a purported levitation. Many ideas and propositions lie beyond the range of coherence and do not even provide testable hypotheses. A "hole in the head" implies that we abandon all standards of critical inquiry and be willing to assimilate uncritically anything thrown into it. H. L. Mencken's response to outrageous claims such as these was that they did not even deserve to be debunked, declaring "one horse-laugh is worth a thousand syllogisms." 
Although we ought always to be open and amenable to new ideas and creative hypotheses, no matter what their source, not every claimant is a potential Copernicus or Einstein. We may find it difficult to define criteria for scientific progress or advancement but, at least in the hard sciences, is there not clearly such progress? Surely these questions concerning open-mindedness are merely rhetorical, and it is plainly not the case that "anything goes." Open-mindedness can be an intellectual virtue, but not to the point that your brains fall out.

Getting back to "putting your money where your mouth is," Feyerabend was once asked why he takes an airplane instead of a broom. Feyerabend's reply was "I don't know how to use brooms, and can't be bothered to learn."[143] Imre Lakatos once asked Feyerabend why, if he does not believe in objective standards of truth, he never jumps out of a fifty-story building. "Because", he answered, "I have an innate fear of death, not because I can give rational reasons for such a fear."[144]

          3.6 Pragmatism and Relativism

Returning to James' pragmatism, two objections can be levelled against it: (1) it leads to relativism, possibly nihilism, and (2) it is self-refuting. It is relativistic, since the pragmatic assessment of a cognitive context (i.e., the ways of arriving at and holding beliefs) will be sensitive to both the values and the circumstances of the people using it. Therefore, it may turn out that one belief system is, on pragmatic grounds, better than another for me, while that other system is pragmatically better for someone else. One may argue that epistemic relativism is not a bad thing, or is even a good thing, but I do not know what those arguments could be. The natural upshot of epistemic relativism is that it creates a gap between good reasoning, on the one hand, and truth on the other. Moreover, truth and justification tend to be conflated - an important issue I will deal with later. Groups of individuals presented with the same evidence will likely end up with conflicting beliefs and hence never be led to "the truth." Of course it seems clear that if we do not see any intrinsic value in having true beliefs, then the Jamesian pragmatist wins out. Epistemic pragmatism endorses a purely consequentialist or instrumental account of inferential virtue - the value of a system of cognitive processes depends upon the distinct possibility of the system leading to certain consequences. Hence, epistemic pragmatism of the Jamesian variety is typically relativistic, since it is contingent upon the values, environment and aims of the people using it. There is, however, no denying the fact that the output of any particular inferential system will be affected by the social or cultural environment in which it is functioning.

          3.7 Pragmatism and Self-Refutation

Socrates, in the Theaetetus, raises objections similar to those sketched above when he attacks Protagoras' claim that "Man is the measure of all things." Socrates argues that if Protagoras is right in claiming that whatever anyone takes to be true is true, it follows that his opponents are correct in denying that whatever anyone takes to be true is true, since that is what works for them. Hence, we end up with the self-contradictory dilemma in which "P is true for A" and "P is not true for B" are both true. James' claim that truth is "whatever proves itself to be good in the way of belief" results in precisely the same self-contradiction. For example, A argues: "You claim it to be true for you that P, but then you are asserting that it is absolutely true that P is true for you." B replies: "No, I am saying it is true for me that P is true for me," and so on ad infinitum. To assert a proposition is to assert it to be true; therefore, "the true is what is good in the way of belief" has to be asserted to be true, but for James it need only be taken to be "good in the way of belief." It is good to believe that the true is the good, and good to believe that this belief is good, and so on.[145]

In order to reveal the self-refutation one could also argue as follows: "Pragmatism seems false to me, and has never worked for me; hence, on pragmatic grounds, I am quite justified in calling it false!" Suppose, for example, I am evaluating whether my own cognitive system is better than some alternative system. In order to do that I must study both systems and attempt to determine the likelihood of various consequences that might follow if I adopted one system or the other. Of course, in order to do this I must engage in reasoning processes, and I must use my own cognitive system to do so. If I conclude that my own system is better (or worse) than the proposed alternative, then I have used the very system whose superiority (or inferiority) I claim to have established - hence the resulting self-refutation and possible circularity. Bertrand Russell, a vehement critic of pragmatic theories of truth, calls James' claim that "on pragmatic principles, if the hypothesis of God works in the widest sense of the word, it is true" a mere tautology:

For we have laid down the definition: The word true means "working satisfactorily in the widest sense of the word". Hence the proposition stated by James is merely a verbal variant on the following: On pragmatic principles, if the hypothesis of God works satisfactorily in the widest sense of the word, then it works satisfactorily in the widest sense of the word. This would hold even on other than pragmatic principles; presumably what is peculiar to the belief is that this is an important contribution to the philosophy of religion. The advantage of the pragmatic method is that it decides the question of the truth of the existence of God by purely mundane arguments, namely, by the effects of belief in His existence upon our life in this world. But, unfortunately this gives a merely mundane conclusion, namely, that belief in God is true, i.e. useful, whereas what religion desires is the conclusion that God exists, which pragmatism never even approaches.[146]

Russell's point is that the true believer does not accept religious faith because it is useful; he accepts it because of his insistence that it is true. Surely, if Christianity, for example, is saying anything at all, it must be making claims about what is objectively the case. As Roger Trigg has stated, "If religious claims are true, they are true whether people believe them or not, and ought to be accepted by everyone," and "if they are false, they are false for everyone, including Christians."[147] Many theologians, such as Paul Tillich and D. Z. Phillips, have tried to insulate religious beliefs from the scope of reason, scientific evidence and critical inquiry by claiming that religious discourse has its own idiosyncratic meaning and logic and is intelligible only if one is "committed" to its "form of life" or "conceptual scheme." However, these Wittgensteinian attempts to protect religious propositions from criticism are purchased at the price of, as Antony Flew once put it, "a death by a thousand qualifications."[148]

          3.8 Wittgenstein: Coherence of Beliefs

According to Wittgenstein in On Certainty, discussions about the entrenchment of a belief are regulated by considerations of consistency and coherence. A person's conceptual framework or "worldview" must hang together. A person is convinced to the point of certainty only because of the ways in which a given belief stands in relation to a system of other beliefs.

When we first begin to believe anything, what we believe is not a single proposition; it is a whole system of propositions. (Light dawns gradually over the whole.) It is not single axioms that strike me as obvious, it is a system in which consequences and premises give one another mutual support.... What stands fast does so, not because it is intrinsically obvious or convincing; it is rather held fast by what lies around it.[149]

The doubting of a bedrock belief has consequences which reverberate throughout one's worldview. For example, if I doubted the existence of the earth before my birth, I would have to doubt all sorts of things that "stand fast" for me since "this would not fit into the rest of my conviction at all."[150]

If consistency and coherence within a system of beliefs are to be maintained, then there is a price to pay for the beliefs one has. To accept a belief is to accept its consequences however they may fall, and they may come crashing down on our mass of opinions. Hence, our choice of beliefs is not an arbitrary matter. Avoiding the charge of arbitrariness has been an awkward matter for Pragmatism, and since consistency alone is neutral about choices for beliefs, direction must come from the end of achieving a coherent picture of experience. But how does one achieve such control? Once we recognize a conflict or contradiction among our beliefs, we ought to gather and assess our evidence with a view to screening out one or another of the recalcitrant beliefs. As Quine's metaphor so aptly puts it, "a healthy garden of beliefs requires well-nourished roots and tireless pruning."[151] However, Wittgenstein maintains that we cannot "stand outside" any system of beliefs and assess it as a unit, since we have no idea of what it would be like to step outside the framework without concepts and criteria borrowed from our own system to make such appraisals. Wittgenstein's claim suggests that objectivity is not possible, that we cannot escape the biases of our own conceptual or cultural framework. Objectivity, as I will argue later, is a normative term and a matter of degree: there are personal subjective elements and cultural particularities that prevent us from being objective in an absolute sense. Wittgenstein's contention does not preclude self-criticism, however, for surely we can look critically at any component of a belief system and the tradition in which it has evolved. Christopher Coope (1974) stated that Hobbes was aware of certain obvious characteristics of a good tradition:

[First] a certain peace and stability in the community, and the economic conditions which allow certain people to devote themselves to learning. There should be written records of their deliberations, so that ground gained in one generation should not be lost in the next. Above all, there should be a lively awareness of the way human beings, so often fond of their opinions, will readily deceive themselves, reluctant as they are to believe things that are new, frightening, or injurious to their pride. And in consequence of this they will need to have a certain respect for the clarity of expression which makes their errors open to view; and there needs to be a measure of mutual criticism.[152]

That not all communities of inquiry are equally rational was a fact that Dewey never tired of pointing out. The acceptance of the premise that all communities or cultures are inherently equal is thought to be essential to the agenda of cultural pluralism (or multiculturalism) and to a rejection of racism, and is presently deemed "politically correct." Racism, for example, is the advantaging or disadvantaging of certain social groups based on irrational prejudices and biases,[153] but to claim that there is nothing to choose between the beliefs and values of various cultural groups in order to promote anti-racism and multiculturalism is clearly relativistic and nihilistic. Open-mindedness, an awareness that not all cultures are equally rational, just or valuable, a recognition of the need for continual self-examination and self-correction, and an acknowledgement of the fallible nature of the democratic process and its institutions are what protect democratic societies from lapsing into dogmatism and stagnation.

Relativism has, of course, been a perennial concern for philosophers, one regarding which they have felt impelled in two directions. On the one hand, awareness of the cultural diversity of beliefs and values was an early stimulus to reflection on whether it is possible to transcend human differences in order to achieve de-contextualized knowledge and values. Accordingly, like Socrates in The Apology, philosophers are generally reproachful of those who audaciously assume the unquestionable veracity of their inherited world view and are unable or unwilling to take other possibilities seriously. On the other hand, like Socrates in The Republic, they take alarm when awareness of cultural differences and the problems of judging between them encourages the view that beliefs and values can be no more than relative to the individual or his group. Pursued consistently, such views must preclude the exercise of reason in the serious pursuit of truth and virtue. For if all beliefs and values are merely relative, each ultimately no better or worse than the rest, then it becomes absurd even to try to achieve and live by a correct understanding of things. Those who embrace relativism cannot seriously attempt to improve or correct their epistemic or moral judgements, and rational reflection and discourse become not only pointless but impossible. In any area, the possibility of rational thinking and progress towards an improved understanding necessarily presupposes that there are grounds for distinguishing good and bad reasons, relevant and irrelevant considerations, truth and falsity. In denying these grounds, relativism also precludes reasoning. It is not even true, as is often suggested, that relativism implies that one ought to be tolerant of other views. A relativist may choose to be tolerant, but there is no reason within relativism why he ought to.
If all viewpoints and values are relative with no possibility of rationally judging between them, then the value of tolerance is no exception.

The Canadian government's putative endorsement of cultural pluralism (multiculturalism) and the attendant crises with our aboriginal peoples arise in part from an adoption of unconditional relativism as the only perceived alternative to an imposition or assimilation of white European ethnocentric values. If one set of beliefs were as rational, moral, just, explanatory or generally as adequate as any other, there could be no reason why any group should change its beliefs. Although all opinions deserve a voice, not all opinions or beliefs deserve respect, and the fact that one is oppressed does not make one right. It could possibly be argued that the belief systems of many of Canada's aboriginal peoples[154] (and perhaps some other ethnic groups) actually interfere with their acquisition of the level of critical rationality that is a prerequisite for personal autonomy or genuine democratic self-government, let alone for its transformation into anything that might be better. Unfortunately, despite the fact that I personally qualify for legal status as a native Cree, my paternal origins being from the Alberta Lesser Slave Lake reserve, this gloomy but plausible conclusion is likely to lead to my being branded as a racist. This, despite an enlightening visit with my paternal grandmother to these wonderful relatives when I was about ten years old, and the deep pride I have had since then in my aboriginal roots and in supporting the rights of aboriginal Canadians, who have been treated - and continue to be treated - horrifically by the racist, bigoted Christian white man for the past several hundred years.

All justification, according to Wittgenstein, results in an infinite regress of reasons, and at its terminus stands "persuasion."[155] Wittgenstein asks us to think of what happens when missionaries convert natives. People are brought up to believe that there is, or is not, a God and are taught, or acquire, a way of defending their views, of presenting "apparently telling grounds."[156] But after the arguing is over, we are left with an ostensibly unbridgeable conceptual gap, and persuasion is our only recourse. Does this mean there is nothing to choose between incongruous belief systems? If the practice of relying on reasoning, evidence and proof cannot occur independently of the "environment within which arguments have their life," then giving our reasons to people who lack our environment will be of no value. Must we conclude, then, that any beliefs, values or theories are as good as any others? I do not think so. The same argument might be cogent to one person and not to another. All cogent arguments are "persuasive" to the audience that recognizes them, yet not all "persuasive" arguments are cogent or even sound; bad arguments and fallacious reasoning often persuade people. Wittgenstein points to an answer when he suggests that access to a system of thought radically different from one's own requires more than the knowledge that it fails to respect the truth-values assigned by one's own culture. A person must also understand how an alien culture's ways of thinking bear on the truth-values it assigns, and this involves a role reversal, an empathetic shift in one's own view of the world, in order to find some common ground. If Donald Davidson is right in his arguments against the incommensurability thesis of "conceptual schemes," then that common ground can be found.[157] Gilbert Harman has stated:

...in order to see whether certain reasons or reasoning might explain a particular person's beliefs or actions we must try to imagine ourselves in his position, with his antecedent beliefs, desires, moral principles, and so forth, to see whether we can imagine what sorts of conclusions we might draw by reasoning from that position. This appeal to the sympathetic imagination is necessary, because we cannot appeal to explicit principles of the theory of reasoning to tell us what is possible and what is not.[158]

Certainly the notion of communicating with other people presupposes certain basic suppositions: the law of identity, the law of non-contradiction, a norm of truth telling, and so on - and if these basic tenets are denied, communication, even within one's own framework is inconceivable. Is it not contradictory, for example, to say that I want to know how things "really" are, that "I want to understand and know what is true of the physical world, but I do not want to use my senses or use inductive reasoning?"[159] As John Wilson has asserted:

The reason why science and other clearer fields are respectable is not that we can feel incorrigibly certain of particular answers - but that we can feel reasonably certain of the procedures. It is extremely hard to maintain nothing can be said about the application of reason or about sensible procedures in more controversial areas. Getting to know the facts, becoming aware of one's own prejudices, immersing oneself in what is (on any account) relevant to making up one's mind about moral or political or aesthetic questions, talking things over with people of a different persuasion - all these (and many more) are ways in which reason gets brought to bear. And the more closely we look at, and agree upon, what is to count as a question of a particular type - moral, aesthetic, or whatever - the more we come to see what criteria, what "rules of the game", are actually applicable.[160]

Wittgenstein states that reasons are compelling only within a certain "language game", and only when there exists a congruous world-view. He claims, quite rightly, "At certain periods men find reasonable what at other periods they found unreasonable - and vice versa."[161] He further points out that "Very intelligent and well-educated people believe in the story of creation in the Bible, while others hold it as proven false, and the grounds of the latter are well known to the former."[162] The qualifications that must be met in order for one to be called a "Christian", for example, are unacceptable to many "very intelligent and well educated people". Many of these people, I believe, would endorse Christianity if they could reject its supernatural and transcendental elements, but feel that it is intellectually debilitating and dishonest to endorse speculations such as the Genesis story, transubstantiation, the Immaculate Conception and the Resurrection of Christ. Many educated and intelligent people with a critical eye feel that some claims in the Bible are blatant absurdities. But many Christians would argue that if we exclude the supernatural from religion, we leave no difference between Christianity and Humanism. It seems to me that the important question for any serious thinker is not whether one calls his set of beliefs Christian, Humanist or any other label - the crucial question is what beliefs one ought to hold and how they are held. In fact, many people, including myself, consider themselves "spiritual" without believing in God because, as R. M. Hare has stated:

God is bound always to be an idle element in our religious life. His existence or non-existence cannot possibly make any difference, either to what we ought to do, or what is going to be the case. His transcendence logically rules this out.[163]

John Dewey was highly critical of the religions of his day, claiming that religion has "lost itself in cults, dogmas and myths," has become "perverted into something uniform and immutable" and has been "formulated into fixed and defined beliefs expressed in required acts and ceremonies."[164]

 Instead of marking the freedom and peace of the individual as a member of an infinite whole, it [religion] has been petrified into a slavery of thought and sentiment, an intolerant superiority on the part of the few and an intolerable burden on the part of the many.[165]

Dewey argues that supernaturalism and transcendentalism have historically identified themselves with belief in ultimate and immutable truths which are viewed as the only sure foundation for the moral life and social order, and insofar as they have been characterized by such moral absolutism they have involved an undemocratic tendency toward dogmatism, authoritarianism and fanaticism. Dewey finds this tendency especially manifest in the case of religious supernaturalism. He notes that "conflict between truths claiming ultimate and complete authority is the most fundamental kind of discord that can exist" and that "religions in the degree in which they have depended upon the supernatural have been, as history demonstrates, the source of violent conflict and destructive of human values."[166]

In The Quest for Certainty, Dewey argues that "the religious attitude" which complements and harmonizes with naturalism, experimentalism and democracy is "a sense of the possibilities of existence and devotion to the cause of those possibilities." Religious faith in this view is "devotion to the ideal" understood as a possibility of nature rather than an actuality such as the God of theism. Dewey, as a naturalist, completely separates religious faith from belief in the supernatural or transcendental and from belief in some eternal unity of the ideal and the real as is involved in classical theism or absolute idealism. But if religious faith is basically moral faith, an ethical ideal, why use the word "religious" at all? In A Common Faith, Dewey makes it clear that "personally I think it fitting to use the word God to denote that uniting of the ideal and the actual..."[167] Why Dewey insisted on using the word "God" in a naturalistic theory of religious experience remains a mystery, and it was often questioned by his former student and devotee, Sidney Hook. To employ metaphysical vocabulary without a metaphysical foundation seems unnecessary - theistic naturalism is a contradiction in terms. Perhaps he did not want to offend readers, or to reject a word that has become part of our cultural and conceptual framework. Dewey distinguished between "religion" and "religious", reserving the word "religion" to denote institutionalized religion. He rejected the notion of religious experience as something sui generis, separate from all other kinds of experience and involving a distinct religious reality or object - the sacred - completely separate from all natural realities. Dewey had in mind a quality of experience that may be realized within the midst of secular activities and may come to belong to a person's aesthetic, scientific, political, social and moral experience.[168]

Returning to Wittgenstein, it is difficult to understand what he really means when he talks about "language games" and "forms of life", since he often does not make himself very clear. Perhaps he is claiming that theism is justifiable within a certain conceptual framework? In sharp contrast to other analytic philosophers such as Moore, Russell, Ayer, Schlick, Carnap and Quine, the later Wittgenstein rejected Enlightenment secularism and humanism and took exception to those who claimed that the concept of God is meaningless or incoherent, holding that they failed to understand the "language game" of religion. But if reasons are relative to conceptual schemes or world views, then one's choice of a scheme must be a matter of faith or commitment, not of reasons. Like Kierkegaard, Wittgenstein suggests that the religious believer does not even want favourable evidence: "if there were evidence, this would in fact destroy the whole enterprise". In the Judaeo-Christian tradition, individuals are (or, at least, were) exhorted, required, commanded to have faith in God, to believe in Him; i.e., to believe without proof or evidence that He exists. Those who were not compelled to do so were pitied or even reviled, threatened, punished, or killed. They were deemed morally depraved (and this belief still persists today)[169] and told they would be punished in the afterlife by the God in whom they lacked faith. But how can anyone be required to have faith? Faith, if construed as a form of belief (i.e., belief lacking sufficient evidential support), does not seem to be subject to the will: it is not something we do or, therefore, fail to do. It is something we have or lack. If faith, like belief, can be justified, i.e., shown to be warranted, reasonable, etc., it seems to become superfluous, redundant; in fact, impossible.
For example, once I see you arriving for our tennis match, I cannot continue to have faith that you will show up, though I can certainly continue to believe it. In other words, in the face of compelling evidence affirming one's faith in something, faith becomes unnecessary and irrelevant.

Although we may admire the dedication, strength, commitment, calmness of conviction and other qualities associated with pious faith - such things give us reasons for wanting to believe - they do not give us reasons for believing that the beliefs are true. I submit that faith is, at best, epistemologically vague and ambiguous and hence, in most cases at least, offensive to reason and critical thought. Once faith is uncontrolled by evidence, anything goes. The idea that a person must "crucify his reason" and replace it with faith is an abdication of epistemic responsibility, but there is a longstanding Christian tradition behind this idea.[170] Perhaps Wittgenstein was endorsing fideism? Or was he simply claiming that intelligent and well-educated people are quite capable of being non-rational? It seems to me that a large measure of agreement in belief must exist if we are even to claim to understand that other people do not share our conceptual scheme or background beliefs.

I also find it curious that Wittgenstein rejects the notion of epistemic responsibility and intellectual virtue. As Richard Rorty has rightly pointed out, "giving up what Nietzsche calls our metaphysical comforts results in greater, not less, ethical obligation, responsibility and sense of community."[171] Wittgenstein, however, regarded it as something strange and unusual that someone could be blamed for the manner in which he acquires his beliefs. From this it appears to follow that he did not think belief acquisition was subject to the will. But if belief is an attitude or disposition which cannot be given or withheld at will, then there is no room left for the possibility either of criticizing people for holding irrational beliefs, or of altering our own beliefs or those of others with the help of evidence and argument. It seems to me that we can be justly blamed not only for failing to believe what we ought to believe, given our other justified beliefs, but also for lacking supporting beliefs within our belief structure that we ought to possess - a lack owing to some intellectual vice. Similarly, we should be held accountable for beliefs we hold that we ought not to hold. Roderick Chisholm stated that "when a man deliberates and comes finally to a conclusion, his decision is as much within his control as is any other deed we attribute to him" and "if there is any reason to suppose that we ever act at all, then there is reason to suppose that what C. I.
Lewis calls our "believing and concluding" are to be counted among our acts."[172] This attitude fosters and encourages what Lorraine Code calls "a kind of intellectual akrasia, an entrenched reluctance to enquire further lest one face the necessity of having to reconsider a range of treasured beliefs."[173] The idea of reproachable ignorance can be found at least as far back as Aristotle in his discussion of akrasia, which followed directly upon his analysis of the varieties of intellectual virtues and their respective contributions to moral virtue. It is immortalized in such lamentations as "You ought to have known better!"


What in fact has been missing from so much recent controversy in religion, science and other fields, is the notion of objectivity - of things being the case whether people recognize them or not - Roger Trigg

Rationality is a fundamental cognitive and moral virtue and as such should, I believe, form a basic objective of teaching - Israel Scheffler


          4.1 Conceptions of Rationality

The Fontana Dictionary of Modern Thought (p. 721) defines rationality as behaviour that satisfies two conditions: "consistency and fulfilment of certain aims."[174] It regards the mode of rationality as the acceptability of beliefs founded only on "experience and reasoning, deductive or inductive." This definition of rationality, ostensibly the most generally held, is that of the rational person as a "means-end reasoner, a maximizer of expected utility or of subjective preference."[175]

The idea here is simply that one determines what one wants, what one's ends or objectives are, based on one's subjective preference for the various possible outcomes; then one chooses so as to achieve most efficiently those ends, or to maximize utility or the satisfaction of preference. If one's ends involve the achieving of a certain level of professional status, which in turn requires the unfair treatment of one's competitors, then, given those ends, the rational thing to do is to engage in the unfair treatment of one's competitors, for to do so is most efficiently to achieve one's ends.[176]

 Of course, Harvey Siegel must be implying that this selfish behaviour would be carried out in an inconspicuous manner or in such a way that the victim of such behaviour actually believes she is being treated fairly by her competitor. This notion of reason has its origins at least as far back as Hume and is recounted by D. Pole (1972) as follows:

Reason, for Hume, let us remember, is something wholly passive and inert; it stands back, theoretical or contemplative, or at best serves to point means to ends. For it tells us of causal connections (or does so, at least, in Hume's less sceptical moments). Ends themselves are neither rational nor irrational; I, made as I am and not otherwise, de facto desire or dislike things, hence following the well known Humean paradox, which strictly follows: namely, that "reason is only, and ought to be, the slave of the passions". Now the first part of the picture may seem plausible. But that conclusion at least is certainly false.[177]

This Humean focus on prudential and instrumental reason as the whole of rationality is deficient because it overlooks moral constraints on rational choice, ignoring both the distinction between morality and prudence and the conceivability of purely moral reasons. Rationality is multifaceted and can be employed in the interests of either good or evil, as extremely vicious people can be rational in some ways. Rationality in mathematics and physics, for example, is not the whole of rationality, although rationality does involve the acquisition of certain logical skills. More importantly, rationality involves the ability to reason in a variety of ways, such as in the moral realm, and to acquire the appropriate intellectual dispositions and virtues. For example, a virtuous person does the right thing because it is just, not because of some external or ulterior prudential reason or future reward or punishment.

We must also preserve the prospect of judging the rationality of our ends themselves, not only the means to those ends. Robert Nozick (1993) writes that "the notion of instrumental rationality gives us no way to evaluate the rationality of those goals, ends, and desires themselves, except as instrumentally effective in achieving further goals taken as given."[178] The apotheosis of instrumental reason has hence become a form of reason that has lost its concern with goals, and focuses on the "instrument" itself. Charles Taylor (1991) defines instrumental reason as "the kind of rationality we draw on when we calculate the most economical application of means to a given end" and in which "maximum efficiency, the best cost-output ratio, is its means of success."[179] Since the overriding concern of instrumental reason is efficiency, its only measure is quantitative. Hence, it is most efficient when its materials - ends, potential means - are conceived in quantitative terms. Instrumental reason's preferred form of existence is as a calculation of quantitative input and quantitative output.

Instrumental reason is clearly exemplified within the bureaucratic structures of modern capitalist nation states. Bertrand Russell, always the social critic, has argued that we often forget that "politics, economics, and social organizations generally, belong to the realm of means, not ends."[180] What we see in modern bureaucracies, however, is an inclination to what Russell called the "administrator's fallacy" - a reification or hypostatization of bureaucratic systems as "planned organisms" or "systematic wholes" which have become ends in themselves. Russell stated that societies do not exist to "satisfy an external survey, but to bring a good life to the individuals who compose it" and must not be construed as "something having a separate kind of excellence on its own account."[181] This fallacy of composition is an evil of modern societies and institutions that confuses means with ends. Russell writes:

To believe that there can be good or evil in a collection of human beings, over and above the good or evil in the various individuals, is an error; moreover, it is an error which leads straight to totalitarianism, and is therefore dangerous..."The state" is an abstraction; it does not feel pleasure or pain, it has no hopes or fears, and what we think of as its purposes are really the purposes of individuals who direct it. When we think concretely, not abstractly, we find, in place of the "the state", certain people who have more power than falls to the share of most men.[182]

In other words, government and corporate institutions become self-perpetuating organisms that are construed as ends in themselves, rather than as instruments toward further desirable ends that serve the common good. Within our capitalist socio-economic systems, these bureaucratic behemoths consistently violate the Kantian ethical principle of never treating people merely as means to some end. But corporations, which are by their very nature undemocratic, in fact hierarchical and tyrannical, violate this moral imperative with vicious regularity since profit is their solitary objective.

This means-end conception of rationality also poses a problem for properly understanding science, often cited as the ideal paradigm of rationality. If the goals of natural scientific inquiry were taken to be, for example, the optimization of explanatory force, objectivity, problem solving, predictability and control of nature, the depiction of truth and reality and so on; then theory choice in science would be a matter of choosing the theory that satisfies those ends.

But if these goals conflict, as they sometimes do, then the means-ends account will not help to determine the rationality of theory choice. Nor will it help settle disputes about the legitimacy of these several putative goals of scientific inquiry. In short, the means-ends account of rationality, because of its inability to assess the rationality of ends, is inadequate for the resolution of outstanding questions regarding the rationality of science.[183]

Charles Taylor, who laments "the eclipse of ends, in the face of instrumental reason," echoes Siegel's views[184] and those of E. F. Schumacher (1977), who notes that "science for understanding" has been displaced by "science for manipulation".[185] This approach to reason via control and technique within confined disciplines and over-specialized bureaucracies and technologies has failed to deal adequately with the complicated human problems we face in today's increasingly corporatist world. This Cartesian "disengaged reason", which separates our thinking from the complexity of real human problems, explains why so many people find it quite unproblematic that we should conceive human thinking on the model of the digital computer. This self-image is enhanced by the sense of power that goes along with a disengaged instrumental grasp of things.[186]

Schumacher charges that "Western man has become rich in means and poor in ends"[187] and Morris Berman (1989) writes:

The most important change [in the Scientific Revolution] was the shift from quality to quantity, from "why" to "how". The universe, once seen as alive, possessing its own goals and purposes is now a collection of inert matter, hurrying around endlessly and meaninglessly, as Alfred North Whitehead put it. What constitutes an acceptable explanation has thus been radically altered. The acid test for existence is quantifiability, and there are no more basic realities in any object than the parts into which it can be broken down. Finally, atomism, quantifiability, and the deliberate act of viewing nature as an abstraction from which one can distance oneself - all open the possibility that Bacon proclaimed as the true goal of science: control. The Cartesian or technological paradigm is, as stated above, the equation of truth with utility, with the purposive manipulation of the environment.[188]                                                               

Society, in this sense, becomes fragmented, narcissistic and atomized. We have become victims of "technical truths" which are frequently rote, cookbook, and unimaginative formulae with limited application outside highly specialized domains.

The primacy of instrumental reason is evident in the prestige and aura that surround technology, and makes us believe that we should seek technological solutions even when something very different is called for. It is that the institutions and structures of industrial-technological society severely restrict our choices, that they force societies as well as individuals to give a weight to instrumental reason that in serious moral deliberation we would never do, and which may even be highly destructive.[189]

The means-ends account of rationality is also deficient as a basis for the development of an adequate theory of critical thinking. First, according to Siegel, "it is clear that all 'bona fide' reasons are relevant to the critical appraisal of belief and action, yet the means-ends conception threatens to rule out some sorts of reasons in favour of prudential or efficiency considerations."[190] For example, I may decide to believe in God because it provides "meaning and purpose to life", gives me a "warm, secure feeling", compels me to be good, or perhaps simply because I have engaged in some sort of Pascalian wagering or Jamesian pragmatism.[191] Second, the means-ends conception does not account for values, dispositions and character traits central to a full account of critical thinking. Finally, as many feminist philosophers have rightly pointed out, rationality conceived only as instrumental reason has no place for the expressive and relational activity essential to caring; the putative impartiality of instrumental reason does not allow for acts of love and compassion, which are by their nature particular.

Since the Enlightenment, rationality has been understood by means of a series of relations between notions such as "truth", "objectivity", "justification", "certainty", and "reality" as well as a series of practices related to these notions. Commensurate with this conception, philosophy has been the pursuit of a permanent, foundational, a-historical structure from which to secure truth and objective knowledge of the real, a pursuit thought to be assured by the nature and universality of reason itself. Since Kuhn's The Structure of Scientific Revolutions (1962), the rationality of science - which has, since the Enlightenment, served as the prototype and standard of reason generally - has been brought into question by an interpretation of science according to which science proceeds by way of paradigm shifts and revolution rather than by linear progression.

The reason that Western science is still seen as the model and standard for a theory of rationality is its self-regulating, self-correcting nature and its overriding aim at objectivity, truth and reality.[192] John Dewey held the view that science is the embodiment of the highest standards of thought, declaring: "Without initiation into the scientific spirit one is not in possession of the best tools which humanity has so far devised for effectively directed reflection."[193]

According to the most commonly accepted view, a person is said to be rational in her beliefs if she can provide good reasons for holding them. But rationality involves taking into account not only reasons for beliefs but also reasons against them. As Popper has pointed out, confirmations are often easy to find, but the rational person must also make efforts to find refutations of her beliefs. If this definition is accepted, several questions naturally arise: What constitutes "good" reasons for a belief? How are beliefs justified, and what role does evidence play in the process of arriving at a belief? What constitutes sufficient evidence? Do similar methods of justification produce congruent beliefs, and is this of any great import? If an individual is seriously and honestly concerned with understanding the world and securing knowledge that is not only useful but accurate, then she must be concerned with the truth of her beliefs. Above all, she will want to know, now and always, what truly is the case. I will analyze the notions of justification, evidence, objectivity and truth and argue that they are not, as many recent philosophers have claimed, relativistic concepts. Many postmodernist philosophers and proponents of the sociology of knowledge, for example, argue that there cannot be any objective standards of rationality because all arguments and truths are distorted or rendered relative by vested interests, ideological or cultural frameworks, gender bias, or a desire for power. I will try to show that rationality and logically related concepts are inherently normative, insofar as we consider a person rational if she is able to provide good reasons for a belief. Hence, the vital question for any serious, responsible and honest thinker is unequivocally: "What ought I to believe?"

Clearly the person who believes too easily submits to the vice of credulity or gullibility; the person who believes too little is guilty of an excessive skepticism. It is not an easy task to specify the criterion of rationality of belief. Robert Nozick, in a recent work, states that the rationality of belief involves two aspects: (1) support by reasons that make belief credible and (2) generation by a process that reliably produces true beliefs.[194] The criterion for the credulous person could be "believe everything you are told" and, for the radical skeptic, "believe only what you see with your own eyes." But anyone who followed either of these precepts would end up believing too much or too little. It would seem that rationality is, as Aristotle may have conceived it, a mean between skepticism and credulity.[195] My conception of rationality will lean toward the side of skepticism. Wittgenstein, as I have mentioned earlier, maintains that one cannot speak of knowing when doubt has been ruled out; knowledge makes sense only when doubt makes sense. He felt that there are good reasons to believe that it is never rational to be certain. Moreover, the need for certitude is often associated with dogmatic stances, maintaining the status quo and the demand for power. If "certainty" is construed in an absolute sense, in which doubt or the possibility of error are completely ruled out, then it seems to preclude us from saying of anyone that she is ever justified in making a knowledge claim. Doubt invites the challenge to explain how and why we know, a challenge that is most appropriate when the claim concerns something describable as an hypothesis; I can intelligibly claim to know something only where the possibility of being mistaken makes sense.[196]

The classical image of philosophy - the image, from Plato through Descartes and Kant, that has taken knowledge to be accurate representation of reality, philosophy to be the task of grounding knowledge on a-historical foundations, and reason to be the tribunal before which all practices, beliefs, and values are to be judged - has been subjected to attacks by "postmodernist" philosophers such as Derrida, Lyotard, Feyerabend, and Rorty. These recent attacks on classical foundationalism, however, though efforts to avoid dogmatism, have opened the door to the possibility of a pernicious relativism - a relativism of uncritical, unreflective acquiescence to the traditional and customary. If our beliefs and their justifications are simply cultural and conceptual contingencies, then it would appear that they are optional and arbitrary. If postmodern conceptions of rationality insist that we abandon notions of objectivity, truth and the universality of rational discourse, then they must do so without adopting the view that any beliefs and practices are as good as any others.

Certainly dogmatism is to be avoided - this was the motivating force behind philosophical skeptics from Pyrrho to Montaigne to Hume to Russell to Rorty - but without lapsing into a destructive relativism or nihilism. Humanists increasingly have been aware of the limitations of reason since the writings of Berkeley and Hume. But apologists for postmodernism's rejection of universal rationality and foundationalism have elevated these limitations to a dogmatism of their own making by denying the possibility of rationality to establish bases for our beliefs. The question that recent debate over the rationality of science, the rationality of unnatural or "alien" systems of belief, and the plausibility of conceptual relativism poses is whether a pluralistic notion of rationality is consistent with the idea that incompatible systems of beliefs, conceptual schemes and values could, nevertheless, be subject to meaningful criticism and comparative evaluation.

For John Kekes, skepticism about rationality is an issue with serious ethical consequences. Kekes asserts that skepticism about rationality results in the "impossibility of settling conflicts in a civilized manner. It encourages an appeal to prejudice and the use of force, propaganda and dogmatism. It is an attack on what is the finest in the Western tradition."[197] The postmodernist challenges the possibility of a neutral framework for rational criticism, a challenge that results in submission to the authority of our own beliefs simply because they are traditional and customary and reflect current social practices. In other words, arguments can only come to life from within a commitment to a social practice or "language game."

 The dogmatist or fundamentalist reinforces our confidence by maintaining that we have travelled on a progressive, evolutionary path which has led necessarily to the truth of our own system of beliefs, with the conclusion that the veracity of our own views detaches them from the contingency of time and social practice. What is wrong with epistemological theories involving absolute certainty, besides their implausibility, is that they take our present beliefs as immutable truths, and that is dogmatic.[198] So paradoxically, the postmodernist's challenge is also a challenge that implies we are cut off from tradition as a source of beliefs and values since they are ungrounded, and in that respect, optional and arbitrary. But one must be cognizant of the fact that any account of rationality that is tolerant enough to allow beliefs in deities, magic, psychics, astrology, ghosts, precognition, psychokinesis, and other highly speculative metaphysical entities and paranormal phenomena stands little chance of defeating even the moderate skeptic. The problem of rationality, it would appear, is located somewhere between the extremes of dogmatism and radical skepticism.

Rationality can be conceived as a process in which beliefs are accepted, modified or rejected. It is a process that consists in believing things because one has "good reasons" for doing so. If a person is going to determine whether a given action or belief is rational, he must ask whether there are sound reasons or justification for it. Larry Laudan writes: "At its core, rationality - whether we are speaking about rational action or rational belief - consists in doing (or believing) things because we have good reasons for doing so."[199] But what constitutes good reasons? How do we justify our beliefs and actions? What are the necessary and sufficient conditions for rationality? To determine, for example, whether people in the Middle Ages had "good reasons" for believing that the earth was flat, we shall need to know what reasons were available to them. What counts as the most rational belief obviously varies from time to time and from person to person, depending upon what information is available - but the truth does not vary in this way. What we properly call the truth will vary, but what is actually true does not. It is not truth that is relative, but only our best estimates of the truth that are relative. The rationality of a given person's beliefs or actions is relative to the evidence, reasons and patterns of reasoning available to that person, and these vary with historical and social context. What we ought to believe is not guaranteed to be true, and the moment its error appears, our obligation to believe disappears.

Another important feature of the activity of giving reasons is that if I am to give someone a reason for believing Q, it is not enough that I should point to a proposition P which she accepts and which entails Q. Something further is required and here we can appeal to Aristotle's axiom which maintains that the premise of an informative piece of reasoning must be "better known" than the conclusion. Wittgenstein, in On Certainty, holds to the same principle when he writes:

One says "I know" when one is ready to give compelling grounds. "I know" relates to a possibility of demonstrating the truth. Whether someone knows something can come to light, assuming that he is convinced of it.

But if what he believes is of such a sort that the grounds that he can give are no surer than his assertion then he cannot say that he knows what he believes.[200]

          4.2 Rational Principles

In addition to seeking reasons, one must, according to Israel Scheffler, recognize and commit oneself to principles that serve to lend relevance and strength to reasons. Scheffler writes:

Reason is always a matter of abiding by general rules or principles - a matter of treating equal reasons equally, and of judging the issues in light of general principles to which one has bound oneself. If I could judge reasons differently when they bear on my interests, or disregard my principles when they conflict with my own advantage, I should have no principles at all. The concepts of principles, reasons and consistency thus go together and they apply both in the cognitive judgement of beliefs and the moral assessment of conduct. In fact they define a general concept of rationality. [201]

These principles or "standards of rationality are a means whereby we rise above, or check, our own particular hopes, wishes, and biases,"[202] but the principles themselves are not absolutes and must admit of rational justification. Scheffler goes on to say that we, as teachers, are obligated to pass on these rational principles and the traditions in which they are embodied, and in which "a sense of their history, spirit, and direction may be discerned", but:

We need not pretend that these principles of ours are immutable or innate. It is enough that they are what we ourselves acknowledge, that they are the best we know, and that we are prepared to improve them should the need and occasion arise.[203]

This evolutionary approach to rational principles is consistent with Dewey's notion of rationality as located within a context and playing a role as one element along with others, rather than as an external self-supporting Kantian point that settles everything. Principles for Immanuel Kant, whether rational or moral, are like mathematical axioms and theorems, independent of context. For Kant, as for many other Enlightenment thinkers, to engage in rational enlightened thought was to think in accordance with universal principles of rational justification that are independent of particular historical or cultural circumstances and that exhibit the capacity of all human beings for rational objectivity and truth. But since Kant we have discovered that even entire mathematical systems such as Euclidean geometry cannot be appealed to if we are to understand the workings of the universe beyond our own planet.

In his prophetic work The Revolt of the Masses, Ortega y Gasset wrote:

Whoever wishes to have ideas must first prepare himself to desire truth and to accept the rules of the game imposed by it. It is no use speaking of ideas when there is no acceptance of a higher authority to regulate them, a series of standards to which it is possible to appeal in a discussion. These standards are the principles on which culture rests. There is no culture where there are no principles of legality on which to appeal. There is no culture where there is no acceptance of certain final intellectual positions to which a dispute may be referred. There is no culture where economic relations are not subject to a regulating principle to protect interests involved. There is no culture where aesthetic controversy does not recognize the necessity of justifying a work of art. Barbarism is the absence of standards to which appeal can be made.[204]

Ortega y Gasset was attacking the "average man's" propensity and desire for expressing his ideas and opinions, but also his attendant unwillingness "to accept the conditions and presuppositions that underlie all opinion." "To have an idea", he states, "means believing one is in possession of reasons for having it, and consequently means believing there is such a thing as reason, a world of intelligible truths; the highest form of intercommunion is the dialogue in which reasons for our ideas are discussed."[205] The "rules" referred to by Ortega y Gasset are rules of rationality - rules of inference. But how is the validity of the rules determined? The point is that rules and particular inferences are justified by being brought into agreement with each other. A rule (or principle) is amended if it yields an inference we are unwilling to accept; an inference is rejected if it violates a rule we are unwilling to amend. The process of justification is a delicate one of making mutual adjustments between rules and accepted inferences; and in the agreement achieved lies the only justification needed for either.

          4.3 Objectivity and Rationality

I would now like to turn to the notion of objectivity and its relation to truth and rationality. It is a truism that the rational person cares about truth, but the relativist holds to the thesis that objectivity about what counts as truth and knowledge is only possible within some kind of framework - be it conceptual, epistemological, linguistic or cultural. Harvey Siegel sees "framework relativism" as the notion that epistemic judgements are in some sense bound by schemes or frameworks, so that thinkers are limited or trapped by, and cannot transcend or escape from, some sort of fundamental restraints which sharply delimit the possible range of claims that they are able to regard as true or justified. Framework relativism is thus dependent on the notion of a limit or boundary beyond which rationally defensible judgements concerning truth or epistemic worthiness cannot be made. One can perfectly well judge from within one's scheme, utilizing criteria internal to the scheme, but one cannot meaningfully question the scheme or its criteria themselves, for they are necessary for judgements to be made at all. There simply is not, according to the relativist, any framework or scheme-independent vantage point from which to criticize or judge alternative frameworks or schemes.[206]

In other words, propositional belief can be criticized only from within the framework, and truth and rationality become compartmentalized. What is true for one group may not be true for another or even intelligible to them, and what counts as a reason varies from system to system ("It may be rational to believe in God if one is a Christian and irrational if one is a Marxist"[207]). The notion of an all-encompassing rationality must be relinquished, and the whole question of what is true (i.e., the content of a proposition) is reduced to the question of justification for belief. But one can justifiably believe what is false and unjustifiably believe what is true; that is, truth is independent of one's ability to provide rational justification, which is only a fallible indicator of truth. "What people disagree about is what is true, and not what is true for them."[208] It should be noted that rationality is not really a "framework", but a method for solving problems within any "framework", and its justification is the justification of the employment of a method.[209] The most common, and perhaps the most revealing, objection to relativism of this kind is that no such relativism can account for itself. Relativism poses as a truth for all schemes, but in reference to what scheme is relativism to be judged? W. V. Quine makes this point when he writes:

Truth, says the relativist, is culture bound. But if it were, then he, within his own culture, ought to see his own culture-bound truth as absolute. He cannot proclaim cultural relativism without rising above it, and he cannot rise above it without giving it up.[210]

Siegel argues that the "classical" connection between truth and rationality is a "philosophical confusion of cart and horse."[211] The classical connection, he claims, is "an evidential one: p is true, and the fact that p is true counts as grounds for taking our belief in p to be rational." Siegel writes:

This, I believe, is a mistaken explication of the classical connection between rationality and truth, according to which rationality amounts to believing what we have good reason to believe. Such good reason for believing p provides grounds for believing that p is (at least somewhat likely to be) true. Rational belief in p thus is not grounded or evidentially based on p being true; rather, the evidential relation is reversed: judgements concerning the truth of p are grounded on p being rationally believed. To say, in short, that we have good reason to believe p is to say, in effect, that we have good reason to believe that p is true. We could put this point another way, namely, that the import or upshot of rational belief is truth. That is, the import of the fact that we have good reason for believing p is that p is (at least somewhat likely to be) true. We cannot appeal to the truth of p in order to establish the rationality of believing p.[212]

Despite the self-referential problems of relativism,[213] we apparently cannot, as Dewey, Quine, Kuhn, Putnam, Rorty and others have claimed, any longer hold to the correspondence theory of truth. There are not, on the one hand, our theories about the world and, on the other, the world itself; we do not evaluate our theories by seeing how well they correspond to or "mirror" the world. This is because we have no access to a theory-independent world - that is, a world unconditioned by our point of view, our needs, goals, and values. The world we see is theory-laden: it already bears the stamp of our involvement in it. Richard Rorty, however, ventures far beyond this claim in holding to a Jamesian conception of truth, which amounts to nothing more than what Dewey called "warranted assertibility", with justification amounting to no more than "what one's peers will let one get away with saying."[214] Epistemic warrant is thereby reduced to non-epistemic sociological fact.[215] There is no Archimedean point or "God's eye view" outside our own worldview from which to evaluate that view's truth. Although we exist and participate in an ultimate reality, we cannot know this reality objectively in the sense that the Logical Positivists[216] had hoped for.

We cannot, it would seem, assume the detached vantage point of what Thomas Nagel calls "the view from nowhere,"[217] and consequently we must attempt to diminish the generally accepted sharp bifurcation between the objective and the subjective. Nagel argues that objectivity is both "overrated" and "underrated". With respect to the former, it cannot provide a "complete view of the world on its own" and with respect to the latter, it must be regarded as a "method of understanding the world as it is" and not some romanticized notion in which certain subjective values are indispensable.[218] Nagel states that objectivity is the effort to "transcend our particular point of view" by attempting to "get outside of ourselves." He writes:

A view or form of thought is more objective than another if it relies less on the specifics of the individual's makeup and position in the world, or on the character of the particular type of creature he is.[219]

But, it is "impossible to leave one's own point of view behind entirely without ceasing to exist."[220] Hence, the commonly accepted dichotomy between subjectivity and objectivity is not as clearly defined as we would like to think. Self-deceptive efforts to escape oneself and achieve objectivity from some external, transcendent source, some ultimate reality or "view from nowhere," are an abdication of one's freedom and responsibility. They are a rejection of "authenticity" - an escape into what Sartre called the "être-en-soi." Many of the Existentialist writers, such as Sartre, understand objectivity as having this normative dimension, identifying it with action and responsibility.[221]

This traditional understanding of objectivity, which has come under attack from Kuhn, Rorty and other postmodernists, ties it to notions of precision, veracity, disinterestedness, detachment, impartiality and impersonality. Piaget, for example, stated:

Objectivity consists in so fully realizing the countless intrusions of the self in everyday thought and the countless illusions that result - illusions of sense, language, point of view, value, etc. that the preliminary step to every judgement is the effort to exclude the intrusive self.[222]

In this sense, objectivity depends upon the existence of impersonal, autonomous entities and complete self-detachment, and is independent of subject-related influences. According to this view, any rational agent X should be replaceable by any other rational agent Y, provided that both have the same cognitive access to some proposition P.

          4.4 Objectivity as a Normative Notion

A second notion of objectivity is often mistakenly conflated with the one just described. In this second sense, objectivity takes on a normative flavour in the sense of being a matter of judgement - of avoiding prejudiced, biased or dogmatic decisions in favour of a willingness to submit to standards of relevance, evidence and argument regulating ways of resolving disputes and deciding beliefs. As Wittgenstein has pointed out, in measuring a belief we measure more than the belief itself; we measure it together with its supporting cast of other beliefs. Beliefs cannot be justified or grounded in isolation from a matrix of other beliefs constituting our "picture of the world." It should be noted, however, that whatever our criteria for truth demand - coherence or any other requirement - they are not directly connected with the nature of truth. The logical shift from what is held true to "truth" is a very common one in postmodernist camps. "What is real [or what is true] exists whether any rational being can in fact think of it or not...that there is a real world, independent of how we understand it, is both a simple and profound statement of metaphysics."[223] The common denominator of objectivity is a disposition and willingness to be open-minded with respect to conflicting interests and views and to keep discourse open, while valuing an impartial regard for reasoned argument and evidence, leading where it will. To be objective and epistemically responsible is to be constrained by "the nature of human cognitive equipment,"[224] by the fact that there exists an external reality, and by the intellectual virtues and values of critical self-reflection.
Hence, the avoidance of bias, prejudice and irrelevant factors in judgement (e.g., John will get a poor grade on this assignment because he misbehaves in class) and a willingness to be critical, not only of the weaknesses, obscurities and paradoxes of the views of others, but also of the views that we ourselves favour, are virtues associated with objectivity.

What is distinctive about this view is that objectivity connects itself to persons through their beliefs and their actions. What makes a judgement objective is not merely the acceptance of the fact that there is a real world independent of how we understand it, but also something special about people's practices. Hence, in this second sense the normative role of objectivity subordinates the ontological, Kantian role found in the first sense. Objectivity becomes an intellectual virtue - a quality of character - rather than referring to properties of that which is known, or to features of the relation between the knower and the known or between theories and the world. The idea that objectivity is a quality of character has the corollary that people are themselves responsible for the extent to which their actions accord with objective practice, just as they are responsible for the extent to which they are constrained by reason. In a certain sense, the objective person is the rational person. Objectivity is a disposition well within the grasp of rational persons, but it cannot be seen as a virtuous or admirable characteristic of their cognitive proclivities if it has nothing at all to do with their actions. A person can hardly be responsible for being or failing to be objective, or be praised or blamed for it, if the notion of objectivity is restricted to an external property ascribed independently of the desires, motivations and beliefs of any individual person.

But surely the notion of objectivity presupposes the existence of a self-subsistent reality, independent of human thought and language, about which one can be objective. If we are to make any kind of sense at all of the distinction between knowledge and mere conjecture, it is absolutely crucial to maintain a secure hold on the difference between this or that subjective epistemic state and a world of independent objective reality by reference to which it is possible to judge such states to be true or false, correct or incorrect. Failing any such distinction, the epistemological subject can only remain trapped within the circle of his own ideas - either in solipsism, on individualist accounts of knowledge acquisition, or, in the case of social or inter-subjective conceptions, within the confines of a given epistemological community. On any respectable account of knowledge, the notion of objective truth as a significant goal of human inquiry is simply indispensable if we are to have any confidence that our inquiries may actually get us somewhere by way of an understanding of that which exists beyond the otherwise uncertain contents of our own minds.

Many proponents of the sociology of knowledge, and philosophers following Wittgenstein and Rorty, are content to conflate objectivity with intersubjectivity, or to claim that objectivity is only meaningful within conceptual frameworks and hence to reduce objectivity to causal or deterministic explanations of belief, which inevitably involve a tendency to treat all beliefs on an equal footing. The net product of this postmodernist or New Age misalliance of diverse, ill-sorted philosophical and psychological views is an essentially constructivist concept of knowledge acquisition as largely a matter of the fabrication of personal (and, presumably, well-nigh incorrigible) models of reality which have little claim to any sort of objectivity beyond the internal consistency of the experience of whoever constructs them. Moreover, while such constructivist views have exercised widespread influence throughout the school curriculum - even in relation to thinking about the nature of human inquiry in mathematics and the hard sciences - it is easy to see why they should have been especially welcomed in those curricular areas, such as religious education, for which a traditional realist or correspondence account of truth might be considered highly problematic. What entirely disappears from this picture of knowledge acquisition is any substantial criterion of objective inquiry and truth of the sort entertained by epistemological realists. Knowledge is no longer apt for construal as the grasp of an independent objective order by an epistemic agent who, in his attempts to apprehend it, observes certain rational canons and procedures of disinterested and impartial inquiry. In general, then, the direct offspring of the unholy union of pragmatic philosophy and cognitive psychology is a dubious conception of knowledge according to which truth reduces, at best, to some uncertain amalgam of coherence and utility.
It is worth pointing out the dangers and pitfalls which this sort of constructivist thinking about knowledge acquisition, pedagogy and curriculum presents for those who wish to mount a defence of the educational value of initiation into, for example, religious inquiry - especially as it is nowadays common to encounter the defence of something like such a conception of religious education under the general rubric of "personal search". On the constructivist view, of course, no special problems arise about the objective truth of religious knowledge in more or less realist terms, because no such problems can arise about any form of knowledge. Even the so-called "precise" forms of scientific knowledge such as physics, biology, chemistry and even mathematics are to be regarded as merely constructs which are no more or less answerable to some independent objective reality than any of the allegedly less exact moral and social sciences. This constructivist approach to objectivity inevitably blurs the distinction between what a person believes and why he believes it, and seemingly removes the possibility of error from our judgements. If "truth" is merely "what we judge to be true," then objectivity loses its point and thereby undermines all intellectual activity and the impetus for inquiry. Roger Trigg writes:

No intellectual activity worth the name can avoid the distinction between our understanding and what we are trying to understand... Without such a conception of objectivity, philosophy can no longer be distinguished from prejudice. All reasoning requires the idea that there is something beyond itself that can provide a standard of correctness.[225]

Clearly any enterprise with a legitimate claim to be a serious form of human inquiry must operate with some formal analogue of truth as a regulative norm or principle of epistemological procedure. Furthermore, if we are to have any real confidence that the exercise of reason is actually getting us somewhere in terms of an understanding of how things are, as distinct from how we are inclined to take them to be, truth must be regarded as more than a matter of mere coherence between personal cognitive constructs and must involve definite reference to a mind-independent order of objective reality against which such mental constructs may be measured. The holy grail of educational epistemology clearly lies in the development of a realist conception of truth that is also capable of steering a steady course between several unacceptable and undesirable philosophical extremes. As educators we must heed the postmodernist's lesson that it is not sensible to look for the sort of absolute or bedrock metaphysical and other truths upon which traditional theories of knowledge sought to found the whole edifice of human inquiry, and we must therefore recognize the inescapably provisional nature of much, if not all, human knowledge. However, it should also be recognized that to reject the idea of objective truth along with absolutism is to throw out the epistemological baby with the foundationalist bath water - for traditional realism is surely correct in supposing that there can be no knowledge, of even a provisional sort, without the possibility of mind-independent reality, truth and objectivity.

          4.5 Harvey Siegel's Epistemology

Harvey Siegel argues for what could be described as a mitigated "absolutist" epistemology[226] based on the premise that a denial of relativism does not necessarily entail foundationalism,[227] incorrigibility, infallibility, certainty, some necessarily privileged framework, the un-revisability of some class of statements, or dogmatism.[228]

Knowledge claims can be objectively assessed in accordance with presently accepted criteria (e.g. of evidential warrant, explanatory power, perceptual reliability, etc.), which can in turn be critically assessed. Thus an absolutist belief system can be both self-correcting and corrigible. Furthermore, judgement of knowledge claims require at least implicit commitment to "absolutist" presuppositions - not regarding certainty or privilege, but rather the possibility of objective, non-question-begging judgements.[229]

Siegel rejects what he calls "vulgar absolutism" - the desire for certainty, incorrigibility or the "product of a privileged framework."[230] Siegel's "absolutist" thesis does not require that one embrace any particular theory of truth, analysis of knowledge, theory of justification or theory of evidence, but demands a fair, open-minded, objective, inter-subjective, critical approach to the evaluation of conflicting knowledge claims in a "non-question begging way"[231] and "in accordance with criteria which themselves admit of critical assessment and improvement."[232] Siegel's efforts are a step toward a denial of epistemological relativism but, as he readily admits, much more work needs to be done in this area, as well as in ethics, which has been in a perpetual state of chaos since Hume. Siegel's efforts are, in some respects, similar to those of John Rawls and other liberal theorists who argue that one's reasoning can appeal to any rational person - to individuals conceived of as detached from their idiosyncrasies of character, culture, history and circumstance. Moreover, Siegel's approach is consistent with Karl Popper's fallibilist, Peircean view that no statement of fact is ever final in the sense that it is beyond refutability or modification. It maintains that the structure of our knowledge has foundations, but it does not hold that these are immutable and incorrigible. Moreover, the very notion of fallibility, whether empirical or not, presupposes a realist epistemology and ontology in which a distinction is drawn between the object of one's belief and what one believes. "The history of the world is littered with people who were certain and wrong"[233] and the possibility of error, perhaps even massive error, must be taken seriously. Either our beliefs are a reflection of reality or they are not, but uncertainty about reality does not mean that we do not or will not have access to reality.
This view simply rejects the claim that knowledge is "constructed" out of an amorphous, unstructured, chaotic universe and then conditioned and determined by our conceptualizations of these constructs. It rejects the Wittgensteinian view that reason, truth and the self are merely linguistic creations. Language is the tool of our thinking, not its prison.

          4.6 Objectivity and Truth: Dewey

John Dewey, one of Rorty's heroes, would in no way subscribe to the relativistic position of Rorty's form of neo-pragmatism that insists that truth and objectivity entail "conformity to the norms of justification (for assertions and actions) that we find about us"[234]. Rorty thus depicts pragmatists like himself as "those who wish to reduce objectivity to solidarity." The pragmatism to which Rorty subscribes is not the Peircean pragmatism which looks at truth as an intrinsic epistemological ideal to which we aspire, but is a Jamesian pragmatism which defines truth as "the name for whatever proves itself to be good for us to believe."[235] For Dewey, the fact that all justification involves appeal to existing social practices is where the real problems for democracy begin. To find oneself in a cultural tradition is the beginning, not the end, of critical thought. What are the social practices to which we should appeal in any particular context? How do we discriminate the better from the worse? Which ones need to be criticized, reconstructed, reformulated or discarded?[236] Alternative possibilities must be imagined. Richard Bernstein writes:

 Whatever our final judgement of Dewey's success or failure in dealing with what he called the "problems of men", Dewey constantly struggled with questions which Rorty never quite faces - although his whole reading of modern philosophy is one that points to the need for reflective intellectuals to examine them. Sometimes Rorty writes as if any philosophic attempt to sort out the better from the worse, the rational from the irrational (even assuming that this is historically relative) must lead us back to foundationalism and the search for an a-historical perspective. But Rorty has also shown us that there is nothing inevitable about such a move. A primary task is one of trying to deal with present conflicts and confusions, of trying to sort out the better from the worse, of focusing on what social practises ought to endure and which demand reconstruction, of what types of justification are acceptable and which are not.[237]

Although Dewey understood the reality and importance of habit and custom,[238] he also recognized the need for continuous re-evaluation and reconstruction - a second-order habit of intelligent assessment and adjustment of custom to meet arising circumstances before they evolve into either social stagnation or social upheaval. This possibility for continuous reconstruction is generally neglected because we become captivated by the manifest stability and security of our inherited customs.[239] Dewey, like Russell, eschewed the sort of education which suppressed critical modes of thought, keeping them within the bounds of habits of mind formed early in childhood. The aim of education is first and foremost to develop critical methods of thought, directed toward internalizing in the student the discipline of critical, responsible thinking.

...for the most part, adults have been given training rather than education. An impatient, premature mechanization of impulse activity after the fixed pattern of adult habits of thought and affection has been desired. The combined effect of love of power, timidity in the face of the novel and a self-admiring complacency has been too strong to permit innovative impulse to exercise its reorganizing potentialities.[240]

Instead of fostering in our students the creative and critical spirit "just where critical thought is most needed - in morals, religion and politics," we have focused on "retaining and strengthening tendencies toward conformity," avoiding "the shock of unpleasant disagreement" and finding "the easy way out..."[241]

For Dewey, considerations of method were critical; he viewed science, in a very broad sense, as commensurate with a common-sense reasoning that cuts across all cultural peculiarities, and spoke of the "fundamental unity of the structure of common sense and science."[242] Dewey, I am sure, would have rejected Thomas Kuhn's incommensurability thesis - something Rorty accepts uncritically. Rorty, like Wittgenstein, reacts to scientism by mistakenly taking science to be inextricably enveloped within a cultural context - just another "language game," "form of life" or "worldview." Dewey, on the other hand, following Peirce, construed science as a useful mode or paradigm of deliberative inquiry and as a reliable process by which to acquire knowledge and fix belief. Pre-Rorty pragmatists (William James notwithstanding), like the logical positivists and other supporters of the Enlightenment ideal, were aware that there is something special about science, both in our attempts to understand the world and in our attempts to achieve reliable and true beliefs. Or are Feyerabend and Rorty right in claiming that there is no such thing as scientific progress and that our scientists know no more about the world than did Aristotle? Rorty, and in particular Feyerabend with his "anything goes" approach to science, must now surely endorse religious fundamentalism, Numerology, Creationism, Scientology, Astrology and other crank theories as credible hypotheses for scientific inquiry and as possible candidates for implementation in the science curriculum of our secondary schools.

          4.7 Objectivity and Truth: Rorty

I would now like to discuss the work of Richard Rorty who, in his influential work Philosophy and the Mirror of Nature (1979) and subsequent collections of papers, has attempted the deconstruction of the epistemological tradition of analytic philosophy and has declared the end of philosophy as such. His attack, if I read him correctly, is directed at (1) the Platonic tradition concerning truth and knowledge, according to which truth is correspondence with nature and knowledge is a matter of accurately "mirroring" reality and essence; (2) the dualism of Descartes' philosophy of mind as separate and "immaterial," functioning as privileged access, the "inner mirror" in which cognition and access to certain knowledge take place according to some special intuitive process of "clear and distinct ideas"; and (3) the Kantian doctrine which invests philosophy with special status as cultural overseer, setting universal standards of rationality and objectivity and adjudicating all possible claims to knowledge.

I will not presently take issue with either (1) or (2), since I have already dealt with (2) earlier in this thesis; and with respect to (1): why should we necessarily assume, or take it as self-evident, that reality is not hidden from us, that the significance of our experience is all there is, or that ultimate reality lies outside the range of our understanding? I will not elaborate further on this objection to (1), but I do have serious reservations about the severe relativism that results from Rorty's total annihilation of epistemology in (3).[243] I see no problem with Rorty's claim that we should give up the Cartesian quest for certainty and put aside any nostalgia for transcendental Absolutes, Archimedean points and strong foundationalism. But to say that there can be no standards of rationality[244] and objectivity outside of incommensurable conceptual or cultural frameworks[245], and that there is nothing to choose between them, is a step into the quagmire of radical skepticism, self-contradiction and self-refutation. It is incoherent both to hold a point of view and at the same time argue that no point of view is more justified or right than any other. The commonly held view is that philosophy is the pursuit of truth by way of universal principles of rationality, with its attendant normative qualities of intellectual integrity and responsibility, open-mindedness, impartiality, courage to doubt, and respect for persons. The postmodernist's lingering questions of "Whose truth?" and "Whose rationality?" are empirical questions that may receive empirical answers from sociologists, psychologists and others. Philosophy is defined, as is science, by its strategy and methodology, and if we reject the notion of a common rationality, philosophy (and perhaps science as well) will become superfluous.

That we cannot escape from our ethnocentrism, and that we should hence acquiesce to this alleged predicament and achieve communal "solidarity" by remaining loyal and faithful to the values and accepted standards of rationality of our community, seems contrary to the democratic ideal. Social criticism is essential to the health of a liberal democracy. Bertrand Russell comes to mind here, and more recently, Noam Chomsky. Both Russell and Chomsky are good examples of freethinking social critics unencumbered by the weight of excessive theoretical or ideological baggage. Does Rorty mean, for example, that the solidarity of South African Apartheid cannot be criticized from within or without, and that contemporary Sweden or Norway are no more just societies than present-day North Korea, Saudi Arabia or even the United States, provided that there is solidarity and shared understanding within their respective societies? Surely it must be admitted that Rorty's own endorsement of liberal democracy amounts to a reductio ad absurdum of his position, since the attraction of a liberal democracy is its support of humanistic values, its capacity for self-criticism and its efforts to transcend ethnocentrism and provincialism.

Rorty's particular version of pragmatism is graphically illustrated when he asserts, in Jamesian fashion, that Newton was a better scientist than Aristotle "not because his words better correspond to reality but simply because Newton made us better able to cope" (my italics).[246] Hence, science is viewed as a way of "coping with the world" rather than as a search for "truth" about the world. Rorty regards truth and objectivity as ethnocentric and culture-relative, and "truth," says Rorty, "is not the sort of thing we should expect to have a philosophically interesting theory about"[247]. What is true is what could be established to the satisfaction of one's "cultural peers" and "what you can defend against all comers."[248] But truth is not determined by reflections on social convenience or by the majority in a culture. Counting heads is clearly not a good guide to the truth, as the early Christians were proud to acknowledge. On the contrary, social expediency depends upon whether a belief is true.[249] When we ask a question about some proposition or some aspect of how things really are, "we are not asking for a report on the state of public opinion with regard to that question, we are asking to be told the truth about it."[250] If Rorty's notions of truth and rationality are correct, one might legitimately ask what reason we have to accept his views as true, and how we know we are coping. Furthermore, if Rorty purports to be offering anything resembling a philosophic argument against rationality and truth, he must himself be appealing to the very rationality he is calling into question. It seems to me that Rorty must make some drastic concessions if he expects to be taken seriously - namely, that there are, at least, some objective truths that we can come to know.

The reason we do not need theories of truth - and Rorty would agree here - is that we should accept that things are as they are and that we often know and say how they are, without looking for a theoretical explanation of how we are able to do so.[251] In other words, it is the world that determines truth; our beliefs are true because they "fit" reality by approximating reality in its un-interpreted state.[252] The real issue of realism is not whether we know what we know - we must accept the empirical sufficiency of having certain beliefs, given the limitations of our cognitive apparatus - but whether we can offer good reasons for thinking that we do know most of what we claim to know. Rorty seems to think that accepting our epistemic limitations necessitates assuming something that transcends those limitations. Should we accept this assumption? Propositions are "true" when description and belief converge: when we say how things are, the sorts of things we say in language about sentences and beliefs relate to the world beyond language. In short, we should be unwilling to equate the dismantling of the Platonic correspondence theory of truth with the destruction of the very idea of truth as determined by the world.[253]

Empirical reality is always reality as experienced by humans and the problem with a strict empiricism is that it leaves no room for any possible reality beyond that experience. Although it must be tempered with skepticism and plausibility, acceptance of the possibility of the existence of entities beyond our experience may make us less dogmatic in saying what can be experienced. Reality is essentially a metaphysical question rather than an empirical one. Therefore, when the prospect of reality is removed and factors other than reality introduced as the origins of our beliefs, "we become the playthings of our individual or collective history. The way our beliefs are produced becomes more important than what they are about."[254]

When Rorty asserts that truth is "what you can defend against all comers," I am reminded of the 1992 movie A Few Good Men. The central story revolves around two United States marines who are charged with second-degree murder, but who are offered a "plea bargain" which would ostensibly result in dishonourable discharge and a six-month prison sentence. The "murder" was in fact the accidental killing of another marine (who apparently had a heart ailment) - the result of invoking "Code Red," an unwritten code of punishment to be inflicted on fellow marine trainees who are not "carrying their weight." The Code Red injunction was secretly ordered, but denied, by the camp commander, Colonel Jessup (played by Jack Nicholson) - an unscrupulous "man with a cause" who holds to the dictum: "Unit - Corps - God - Country." The two career marines, who were simply following orders and camp protocol, wanted to see the real truth divulged and hence decided to take their chances in court in spite of a lack of evidence in support of their case. Here we have two men who are innocent of a crime but cannot prove it, cannot defend their innocence successfully "against all comers," but who refuse to compromise the truth and their honour to serve pragmatic ends. The truth is: they are innocent! The only correct sense of "true" makes truth independent of how well it can be defended. Its defensibility is a separate matter, which may depend upon a variety of extraneous circumstances. Any person innocent of a crime surely wants the real truth to emerge; and the real truth is all that is normally meant by "true." In a similar vein, I can unfortunately think of too many examples from my years of teaching in which I have contacted a parent concerning his or her son or daughter's absence and had the truth compromised by expediency.
The parent would initially be aghast at the apparent unexcused absence until informed that the student had missed a test and would consequently receive an automatic zero. Then, once the stakes were raised, the parent would suddenly remember why the son or daughter had been away.

Rorty's attempts to reduce normative expressions like "is true" or "is justified" to non-normative expressions of the form "is what your peers let you get away with" (e.g., an argument is valid or a proposition is true if it adheres to the standards of one's social group) fly in the face of what we already understand about how the terms "true" and "justified" function in our everyday language and discourse. Hence, we should not need any grandiose theory or definition of these terms, particularly of the pragmatic or historicist variety, to understand their meaning and everyday usage. Justification can, in a certain sense, be relative to the reasons, evidence, concepts and arguments that are available (e.g., in the twelfth century it may have been rational to assert that the earth is the centre of the cosmos because...), so that one may be free from epistemic censure. We now believe that "slavery is unjust" is true because the arguments from previous authority, such as "the belief that God had designated certain classes of people as slaves after the flood," have not survived critical scrutiny.[255]

Being justified in believing something is a normative relation that exists among a given proposition, the person who accepts it, and a cognitive context. If I am justified in accepting a proposition, my context and I are related in the required way. The relation is as objective as can be, not subject to worrisomely arbitrary subjective manipulation.[256]

This includes "arbitrary subjective manipulation" by the social group to which one belongs. Antony Flew describes this phenomenon as "metaphysical collectivism" which excludes as inconceivable that a dissenting individual or minority could apprehend the truth. He writes:

If the only possibility of objective knowledge does indeed lie "in its being the set of beliefs of a social group;" and if propositions and arguments are not true or false, valid or invalid, altogether independently of whether anyone actually recognizes them to be so; then, certainly, there is no standing ground for any dissident individual.[257]

Hence, "slavery is unjust" is true[258] and "the earth is the centre of the cosmos" is false independent of time and context - accepting their contraries may have been, at one time, justified, but false. However, it must be granted that "just as people can hold false beliefs for good reasons, they can hold true beliefs for bad reasons."[259] True beliefs, for example, can be held for reasons of self-interest, faulty logic, bias, ideological affiliation, religious dogma, rationalization or wishful thinking. These would be unjustified true beliefs.

Rorty must concede that no matter how convinced or justified we are in believing P, it always seems legitimate to ask "But is P really true? Does P describe how things really are?" Our attempts to answer these questions do not, as Rorty suggests, generate useless metaphysical questions; nor do they necessarily mean that we are searching for foundations or certainty. Nor does our assent to P shut off further inquiry regarding the truth or falsity of P. Intellectual humility requires recognition of the fact that some of our present beliefs are false; i.e., some propositions that we are now warranted and justified in believing are not true. Hence, our assent to a belief should be preceded by skepticism and careful, responsible, and open-minded intellectual scrutiny. But the fact that some of our present beliefs might be false does nothing to change the fact that to hold a belief is precisely to hold it to be true, even though we have no privileged access to any distinction between those of our beliefs which are actually true and those which are merely considered true by us. Truth may be defined, in the fashion of Alfred Tarski, as a property of sentences, but truth is not a function of either language or our ability to perceive it. Roger Trigg writes:

When error is impossible, any belief or any theory is as good as any other and it does not matter which one holds. In the world of ideas, at least, the permissive society soon becomes a nihilistic one. If it matters what we believe, we have to face the fact that the price of possibly being right is that we could be wrong.[260]
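The Tarskian treatment of truth as a property of sentences, invoked above, can be sketched via the well-known T-schema. The notation below is a minimal modern reconstruction, not Tarski's original 1933 formulation:

```latex
% Tarski's Convention T: an adequate definition of truth for an object
% language L must entail, for each sentence S of L, the instance
%   "S" is true in L  if and only if  S.
\[
\ulcorner S \urcorner \text{ is true in } L \iff S
\]
% The canonical instance:
%   "Snow is white" is true if and only if snow is white.
```

Note that while truth here attaches to sentences, the right-hand side of the schema concerns the world, not language - which is precisely why the definition lends no support to the view that truth is a function of language.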

Whatever Rorty and William James mean by "good" or "better" beliefs, the pious Muslim or Christian must surely count as having some of the very best: beliefs that bring certainty, security, stability, blissful happiness in a comforting world view and a clear conscience as one persecutes or eliminates one's enemies. Yet is there not a nagging feeling, somewhere, that those beliefs might not be true, and that the impoverished opinions of a postmodernist atheist might have an epistemic advantage over them? Rorty, a self-proclaimed atheist, must believe that this worldview in some way provides a better foundation for the consensual community that he envisages. Yet how does he know that? It is quite clear that, when it comes to the point of believing something, Rorty is as prepared as the rest of us to look beyond the consensus, and to evaluate beliefs on some grounds other than their "goodness", utility or "coping" power. Indeed, if he rejects the belief that God exists, it is because, like any other atheist, he is convinced that there is, in reality, nothing to which that belief corresponds. Moreover, when C. S. Peirce, the founding father of the pragmatist school, espoused fallibilism he did not mean to deprive us of the tests whereby we decide that the "bad" beliefs are false. Even if we are never entitled to declare ourselves certain, we still reject our old beliefs by describing them as untrue, and usually by accepting new beliefs that contradict them. Indeed, the method of refutation is so fundamental to science that it is hard to imagine any assertions that do not presuppose it. And in availing ourselves of concepts like falsehood and contradiction, we are covertly reaffirming our commitment to truth. Finally, there seems to be no clear account of the convergence at which science aims, and which (unlike religion) it also seems to achieve, other than the supposition that it is convergence on the truth.
A Muslim, Hindu, Christian and atheist may disagree on many things, but if they have thought about the matter at all, they agree on the fundamental laws of mathematics and physics.

The stress on human fallibility leads readily to a critique of certainty but not necessarily to an espousal of relativism or nihilism. Relativism denies the universal validity of the claims of reason; fallibilism cautions that such claims are always susceptible to error. When the concepts of truth, knowledge and objectivity lose their intentional import, we steer dangerously close to an endorsement of epistemological relativism. Rorty gives us truth without extra-linguistic sources of truth-value and knowledge without intentional relations between knower and known. Objectivity becomes a matter of compliance with the conventions of the operative language game, and science is reduced to a strategy for coping with the surrounding environment. Rorty's skeptical analysis and critical insights are very impressive and thought-provoking, but I find his pragmatic solutions to the problems of modern epistemology not only exceptionally bleak, but unacceptable.

          4.8 Dewey's Notion of Truth

Although there are some similarities, Dewey's notion of truth should not be confused with the relativistic approaches to truth espoused by William James and Richard Rorty.[261] Dewey's concept of truth is more closely in line with the fallibilistic transitional epistemology of C. S. Peirce. Dewey states:  

...all knowledge, or warranted assertion, depends upon inquiry and that inquiry is truthfully connected with what is questionable (and questioned) and involves a sceptical element, or what Peirce called "fallibilism." But it also provides for probability, and for determination of degrees of probability in rejecting all intrinsically dogmatic statements, where "dogmatic" applies to any statement asserted to possess inherent self-evident truth.[262]

Dewey adds that the only proper criterion for truth is a "theory that finds the test and mark of truth in consequences of some sort." By "consequences", he means only the consequences of the use of inquiry and not consequences unrelated to the content of the instrument such as psychological factors, for example. He describes his theory of truth as "the ideal limit of indefinitely continued inquiry"[263] and goes on to state that the "truth" of any "present proposition" is:

...subject to the outcome of continued inquiries; its "truth", if the word must be used, is provisional, as near to the truth as inquiry has as yet come, a matter determined not by a guess at some future belief but the care and pains with which inquiry has been conducted up to the present time.[264]

Truth and falsity are properties only of that subject matter which is the end; the close of the inquiry...the distinction between true and false conclusions is determined by the operational procedures through which propositions about inferential elements (meanings, ideas and hypotheses) are instituted.[265]

In my view, however, Dewey invites relativism when he conflates truth and justification through his preference for the term "warranted assertability" over "truth"; his theory is also imbued with a form of crass utilitarianism, thus undermining the widely held project of pure inquiry and the pursuit of truth for its own sake. In using this term he is suggesting that what we come to believe (as true) is contextually justified (or warranted) by the particular conceptual scheme or process of inquiry itself.[266] A statement may be regarded as verified, or qualify as a "warranted assertion", but the question still remains: "Is it true?" Dewey would probably suggest, as would Rorty, that this question is an idle one - perhaps it is. But if Dewey means "true" then why use the term "warranted assertability"? It seems clear that "warranted assertability" is about justification - about what is warranted - not about what is true.

As I have mentioned earlier, grandiose theories of truth are not very helpful and do not seem necessary. The attempt to get at an ultimate definition or theory of truth is to enter a seemingly bottomless pit, permeated by venomous circularity and self-reference. But truth requires coherence and internal consistency and is a function of our collective experience with the natural environment. Truth is what truly is the case, what really is, and this, along with the limitations of our cognitive apparatus, constrains what it is reasonable to believe. Coherence, explanatory power and consistency with past experiences are measures of the reasonableness of our beliefs. The key question concerning truth and knowledge is whether or not having justified true beliefs is intrinsically valuable. If they are, then all propositions are true or false whether or not they meet our ends or satisfy our psychological needs. William James, and neo-pragmatists like Rorty, reject truth and justification as having any intrinsic value. But if neither truth nor justification is intrinsically valuable then, based upon the accepted view of knowledge as justified true belief, the value of knowledge itself is brought into question. Certainly this would be one way of answering the radical skeptic: if we cannot know anything, or know whether our beliefs are true - so what? Who cares?

It must also be stated that simply because we take something to be true (to actually be the case) does not imply that it is true in an absolute or dogmatic sense. Dewey was, I think, trying to avoid the absolutist conception of truth and truth as correspondence "to that which is not known save through itself."[267] With respect to the correspondence theory of truth, Dewey wondered "how something in experience could be asserted to correspond to something by definition outside experience, which it is, upon the basis of epistemological doctrine, the sole means of knowing."[268] "How", says Dewey, "can anybody look at both an object (event) and a proposition about it so as to determine whether the two correspond?" This approach to the idea of truth assumes what Thomas Nagel called "a view from nowhere." It would appear, however, that the case against a general conception of truth as involving correspondence has been either dramatically overstated by pragmatists and coherence theorists or, just as likely, grossly misunderstood by those who would claim to have been influenced by them - including, to be sure, many recent educators. For whether it makes any sense to suppose that one might ever be in some epistemically privileged position to test or measure the validity or accuracy of one's entire conception of the world against some independent order of reality (when it is expressly through the window of that conception that the reality in question must be apprehended) is hardly an issue which should reasonably lead us to deny that correspondence plays an indispensable role in distinguishing, via direct observation, what is from what is not actually the case in our experience.

As I have already pointed out, philosophical definitions of truth are not particularly helpful to anyone trying to find out what is true. What one needs are reliable criteria, but the criteria should not be conflated with the nature of truth. Whether our tests for truth involve requirements for coherence, consistency or correspondence with reality, once we accept that a gap exists between our judgement and what is the case, there is room for truth as correspondence with reality. But that room must be occupied by a self-subsistent reality - a reality that is not contingent upon human thought, language or conceptual framework. When we make the move from "reality" to "reality for us" we turn from realism to anti-realism.[269] Such a move is only one step removed from idealism or solipsism. According to Roger Trigg, such a move takes us to a position in which "the notion of evidence collapses into 'what people judge to be evidence,' just as truth becomes 'what is judged to be true' and knowledge becomes 'what is thought to be knowledge'."[270]

In countering Bertrand Russell's objections to his conception of truth, Dewey charged Russell with holding to an epistemology which assumes that "anything that is not certain to the point of infallibility, or which does not ultimately rest upon some absolute certainty" does not constitute knowledge. This charge is not itself a "warranted assertion", to use Dewey's own term - Russell made it very clear in his many papers and essays that he eschewed any form of absolutism, dogmatism or infallibilism. Being certain that P (e.g., that the sun will rise tomorrow) is not claimed as a necessary condition for knowledge; only that the agent be in possession of evidence which makes it reasonable for her to be "certain", to hold a rational firm belief. Moreover, if evidence were logically conclusive (i.e., entailing indubitable certainty), then it does not seem that it would be conceptually correct to call it "evidence."[271]

    4.9 Habermas                                                

If truth, rationality and argument are only a relativistic, context-dependent ideology, then how, as Jürgen Habermas has argued[272], can we be self-critical or attempt to solve what John Dewey called the "problems of men"? Habermas, echoing Dewey, does not believe we must be cornered into either-or dilemmas. Escaping them does not mean committing oneself to any form of absolutism or "pure transcendentalism", but neither does it mean embracing pure historicism or relativism. The latter two, he argues, carry "the burden of self-referential, pragmatic contradictions and paradoxes that violate our need for consistency," and the former is "burdened with foundationalism that conflicts with our consciousness of the fallibility of human knowledge."[273]

Habermas maintains, contrary to Rorty and Wittgenstein, that there is a distinction "between valid and socially accepted views, between good arguments and those which are successful for a certain audience at a certain time"[274] and that members of any society have an interest in self-criticism and in viewing "social practices of justification as more than just such practices."[275] In the fallibilistic, anti-foundational worldview, truth may be eternal in some inconsequential way a la Tarski, but our judgements as to what is or is not true are not. Rorty's contention is that truth is compartmentalized within "an infinite plurality of standpoints", i.e., within historical, cultural, religious, scientific, moral, and aesthetic contexts in which "truth is made rather than found."[276] This position, as I have argued earlier, is unacceptable to those of us with liberal humanistic and realist leanings. It leaves us with a fragmented parochialism in which we have been psychologically set adrift from the Enlightenment humanistic ideal of a common intellectual tradition based on a universal reason. Rather than evaluating arguments from the point of view of their cogency, independent of their source, postmodernists have zeroed in on the relevance of "whose view?", "whose argument?", promoting the view (like Thrasymachus' in Plato's The Republic) that all truth, all knowledge, is a function of what the most powerful factions in society choose to invent and enforce for what they perceive to be their own interest. One might argue that little has changed in this power-elite tradition throughout history and proclaim, along with Robert Michels's "iron law of oligarchy", plus ça change, plus c'est la même chose.

Some feminist philosophers, particularly those influenced by the deconstructionism of Derrida and Foucault, have argued that science and the notions of universal reason, logic and morality are gender-based, a function of patriarchal or male-dominated oppressive power structures. Worst of all, science is accused of setting up an intellectual hegemony - of arrogating for itself a title of epistemological superiority above other "modes of discourse" or "ways of knowing." They seek to reconstitute science from the ground up, replacing it with "feminist science." But this analysis of epistemology and science as power-based is itself entirely dependent upon, and presupposes, a universalized standpoint of reason and an ability to locate a source of judgement and knowledge. It would seem that such efforts, when not entirely vacuous, amount to no more than an old-fashioned attempt to subordinate the methods of science to the demands of ideology. Feminism has a glorious past and there remains work to be done to ensure that women are granted equal opportunity in all areas including, and especially, science and mathematics. But "deconstructing" mainstream science will do little or nothing to further that goal - in fact it will only discourage some women from entering the field. Moreover, if they attempt to hold fast to the most emphatic tenets of these radical views - for example, the stylish assertion that women cannot be scientists or mathematicians under the present order, because society constructs these as mutually exclusive categories, and therefore that scientific practice must be reconstituted along radical feminist lines before women can participate - they will quickly find themselves excluded from serious scientific work.
The ambitions of these feminists, as well as the ambitions of Afrocentrists and other ethnic groups, are backed by a multiculturalism insisting that science must be faulty because it has been dominated, for the most part, by white European males; and by a kind of populism committed to setting New Age mysticism on a par with science because the former is believed by so many well-meaning people.

Even if Rorty is right, surely this does not mean that we are incapable of adequately answering the urgent questions we have about abortion, bioethics, gender equality, anti-gay bigotry, racism, religious fundamentalism, indigenous people's rights, euthanasia, capitalism and the desecration of the natural environment. Rorty's skeptical challenge, however, reminds us that we cannot seek security from the contingencies and problems of our everyday life in a quest for certainty, or by attempting to find answers and comfort in dogma and rhetoric.


Men become civilized not in proportion to their willingness to believe, but in proportion to their readiness to doubt  - H. L. Mencken

I respect faith but doubt is what gets you an education - Wilson Mizner



          5.1 Educational Aims

Education is a process that should not begin or end with institutionalized schooling. Education should be viewed as a lifetime endeavour, and this idea should be fostered in our young people. Children are naturally curious creatures and want to understand both themselves and their environment; they especially want to know why. They are natural philosophers, and this Socratic disposition of curiosity and desire for examination and self-examination should be invited and encouraged. For some reason, however, many of our children lose this natural inquisitive spirit and eagerness for learning quite soon after they have entered the school system. This state of affairs is extremely unfortunate and poses a difficult problem to resolve. But attempts to incorporate critical thinking programs into the public school curriculum have invariably been blocked by powerful conservative forces in our communities, especially the Christian churches.

This problem could perhaps be met by the promotion of sensitive, intelligent discussion of existential and religious questions in the classroom. For example, a mathematics teacher, in presenting a unit on Cartesian coordinate systems, could discuss the life of René Descartes, the innovator of coordinate geometry. Here would be an occasion not only to discuss Descartes' life, his philosophical orientation, and what life was like in his time, but also an opportunity to consider the soundness of his reformulation of St. Anselm's "ontological proof" for the existence of God. There should be nothing in the least offensive to either believers or unbelievers in such a discussion. The eminent eighteenth-century German philosopher Immanuel Kant showed convincingly that the proof demonstrates only the existence of a concept of perfection - not a perfect entity. But the failure of a rational proof does not destroy the logical possibility of God's existence any more than it does for the invisible flying pink unicorn. Further, there is a possibility that if God does exist, he is not necessarily omnipotent and omniscient, and the ontological proof is thereby rendered irrelevant. Similar discussions could be initiated concerning Pascal's response to Descartes with his infamous wager. Pascal's interest in games of chance usually fascinates students, but they rarely hear how he rejected all attempts to prove God's existence and instead focused on a prudential argument involving probability and mathematical expectation in order to induce commitment. It would not be difficult to think of other classroom situations in which the cosmological, teleological and other attempts to argue for God's existence could be discussed. But once again, the politically active Christian churches would soon be staging a revolt.
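The prudential structure of Pascal's wager can be sketched as a comparison of mathematical expectations. This is a standard textbook reconstruction rather than Pascal's own notation, and the payoff symbols are illustrative assumptions:

```latex
% Let p > 0 be the probability that God exists, and let f_1, f_2, f_3 be
% finite payoffs: belief and unbelief if God does not exist, and unbelief
% if He does. Belief, should God exist, is rewarded infinitely.
\[
E(\text{believe}) \;=\; p \cdot \infty \;+\; (1 - p)\,f_1 \;=\; \infty
\]
\[
E(\text{disbelieve}) \;=\; p\,f_3 \;+\; (1 - p)\,f_2 \;<\; \infty
\]
% For any nonzero p, however small, the expected value of belief dominates:
% the wager induces commitment without any proof of God's existence.
```

Students who grasp this structure can also be invited to probe its well-known weaknesses, such as the "many gods" objection: the matrix assumes only one candidate deity and only two options.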

Quite clearly, our vulgar popular cultural influences notwithstanding, the education system drives many students to boredom and despair. The over-emphasis on the content of thought (rather than on its procedures and standards) and its concomitant didactic methodology, the repression of creative and critical inquiry, over-specialization, the predominance of instrumental reason, business models, bureaucratization, and an over-emphasis on one-dimensional technical solutions to complex social, political and environmental problems are just a few of the many problems that I feel educators and others in positions of authority and responsibility must address. Largely due to over-specialization and our dependence on "experts", many people have either abdicated or simply lost their ability to think clearly. We have, to a large extent, given up on the Enlightenment ideal of rationality, not only by endorsing cultural and epistemological relativism, but by turning over important decisions, including those concerning the moral order, to specialists and technocrats who never seem to ask whether a course of action or end in view is right or wrong, but only whether it will work for the dominant hegemonic ideology of capitalism. The concept of an educated public has been replaced by an atomized group of narrow specialists and experts who often know little outside their chosen speciality. Thinking has become the occupational responsibility of specialists such as scientists, lawyers, economists, marketers and accountants. This phenomenon has resulted in the demise of the traditional broad liberal education. We must continually rethink, and possibly redefine, our present conception of education in a rapidly changing, increasingly authoritarian, corporatist and economically unequal world. Many of the problems we face, such as overpopulation, over-consumption, ecological collapse and ongoing militarism and imperialist wars, are of major import.
Our very survival depends upon realizing the gravity of these problems, and their resolution will demand a critical and creative intelligence. Difficult and complex human problems cannot be solved by serving up facile, atomized technical solutions.

It is clear that there cannot be any incorrigible, eternal or ultimate definition of education, just as there cannot be any ultimate aims of education.[277] One could fill several volumes attempting to articulate and address some of the problems mentioned above. It is my belief that educators should attempt to develop a humanistic, fallibilistic conceptualization of education, a "programmatic definition"[278] which will stress the need for fostering autonomous critical thinkers - persons who are skeptical of appeals to superficial, immutable, transcendent or absolutist approaches to the solution of real human problems.[279] Although such a conception may serve only as an ideal to which one might aspire, we desperately need people who can live without the myths of the "quest for certainty" (as Dewey would have it), people who are prepared to take responsibility for their beliefs and actions and accept the realities of our contingency and finitude. Our problems cannot be resolved by resorting to duplicitous businessmen and politicians, psychics, faith-healing evangelists, New Age mysticism, astrologers, or quick-fix "self-help gurus." The proliferation of these and other appalling exercises in self-deception is a sad comment on the present human condition. The status quo is not an option.

The importance of the role played by critical thinking in education depends upon one's philosophic stance on the aims of education. R. S. Peters has asked the question: "Must Educators have an Aim?"[280] Peters maintains that arguments over ultimate aims are more often disputes over the way in which, or process by which, things are done rather than over what the outcomes should be. Surely no one would argue against acquainting our young people with the vast cultural and historical traditions and encouraging rational, responsible, autonomous, and principled intellectual, as well as ethical, behaviour. Education is, in many respects, an abstraction not unlike "happiness", having no ultimate essence or intrinsic, innate quality. As the Zen and Taoist sages have rightly pointed out, the more we desire and pursue such abstract, intangible goals, the more remote and transient they become. Therefore, if Peters is correct, as I believe he is, then it is at least as important to consider the way the game is played as it is to consider its purpose.

          5.2 The Problem of Indoctrination

If education is concerned only with the transmission of basic skills, "factual" information and the accepted cultural dogmas of our age, then perhaps we should indoctrinate, not "educate", our children. The Webster's New Universal Unabridged Dictionary (2nd ed.) defines "indoctrinate" thus: (1) "to instruct; to teach," and (2) "to instruct in doctrines, theories, beliefs, or principles." These definitions are not very helpful. Definition (1) is, if not clearly false, at the very least an anachronism, and (2) does not reveal the pejorative connotation associated with indoctrination. Presently, we think of indoctrination as a particular instructional technique involving the severance of rational, reflective assessment from the logical and moral criteria for teaching. More precisely, indoctrination entails the acceptance of unverifiable and/or contentious premises, acquiescence to authority and suspension of doubt, and an acceptance of the absolute certainty of the beliefs or doctrines, with the objective of giving the "true believer" an unshakeable faith in total solutions and ultimate objective reality. Eric Hoffer asserts that the true believer claims:

the ultimate and absolute truth is already embodied in their doctrine and that there is no truth or certitude outside it. The facts on which the true believer bases his conclusions must not be derived from his experience or observation but from holy writ... To rely on the evidence of the senses and of reason is heresy and treason... it is the certitude of his infallible doctrine that renders the true believer impervious to the uncertainties, surprises and the unpleasant realities of the world around him. Thus the effectiveness of a doctrine should not be judged by its profundity, sublime nature or the validity of the truth that it embodies, but how thoroughly it insulates the individual from his self and the world as it is.[281]

Although Hoffer is referring to the doctrinaire nature of mass movements and their ideologies, these authoritative, deceptive, non-evidential, uncritical means by which the true believer adopts and maintains his beliefs are salient features of indoctrination.

When one is presented with, and adopts, a set of beliefs which can explain away obvious inconsistencies (e.g., "It is God's will", "It's an act of God", or "My astrology charts predicted the outcome.") and which maintains that experience and evidence are irrelevant, then one's beliefs become fixed and permanent. This is the essence of dogmatism. A person who is dogmatic is one who is disposed to indoctrinate - the indoctrinator and the dogmatist are cut from the same cloth. The dogmatic temperament "tends to search for certainty" and dogmatism is "the inability to seriously entertain the possibility that one might be wrong"[282] - it is the attitude that no information, evidence, argument, or experience will ever be seriously entertained and that further inquiry has come to an end. It is a view that sees human existence at the end of some sort of teleology, a path that has led to the necessary truth of our own system of beliefs, thus disengaging the truth of our own views and beliefs from the interplay of time and social practice. But if history has taught us anything, it is that the world is strewn with people who were certain and wrong.

The real problem we face is not the rationality of most of our beliefs, but the possibility of criticizing particular beliefs, values and institutions, particularly if we accept the postmodern assertion that there are no foundations, system-independent criteria, or external frameworks on which to rest rational critique. I have already argued against this position at some length and I do not mean to suggest that, since there is no Archimedean point of ultimate appeal, all forms of knowledge are on an equal footing. Important factors such as plausibility, conceptual clarity, explanatory value, evidence, verifiability, falsifiability and coherence are accepted means of adjudicating knowledge claims which are generally ignored by those who maintain that Creationism is as much a science as Evolutionary Theory and further contend that Evolution and Secular Humanism are themselves religious doctrines. There are also those who, in spite of the paradoxes of self-reference, argue that rationality and critical thinking are indoctrinated dogmas.

Bertrand Russell pointed out many times that the beliefs people hold most intensely are often those with the least evidential support. Unfortunately, the truth of a belief is not commensurate with the degree of passion or zeal with which it is held. Christians and other religious persons, for example, are often highly sensitive and defensive when their beliefs are questioned. They hold their beliefs as though they were congruent with their very being or personhood, and any query regarding these beliefs is taken as a threat to that personhood. The defence of these beliefs often amounts to an appeal to irrelevant external factors, dubious premises and circuitous argument; when adherents are unable to rationally justify their position, they resort to ad hominem attacks or even violence. One of the principal reasons that religious conflicts have been so ferocious and brutal over the last several centuries is that the adherents, generally having very little factual evidence to use against each other, must ultimately resort to persuasion by intimidation and violence. The assassinations of physicians at abortion clinics by the "Right to Life" movement are a more recent case in point. As Bertrand Russell never tired of pointing out, "the most savage controversies are those for which there is very little evidence either way."

The school, ideally, is an environment in which values, beliefs, and opinions can be exposed to critical reflective scrutiny. It is a vital task of education to help students gather evidence, assess arguments, discriminate among authorities, construct counter-arguments, and challenge claims. But why do so many feel that religious beliefs are sacrosanct and immune from classroom discussion? It is an oddity and a paradox that we live in a society that shrugs off the influence of the violence, gratuitous sex, crass materialism and greed displayed daily on television, and worries, instead, that its children will be corrupted by the free discussion of controversial issues in the classroom. One of the deficiencies of our educational system is that it produces graduates who are rarely, if ever, exposed to any serious criticism of the immorality of capitalism or of the cosmological and teleological arguments for the existence of God, and who are unable to conceive of the possibility of a secular morality. It is hard to think of any topic whose treatment in educational institutions has changed so little in the last century. With "political correctness" the order of the day, there is clearly a taboo on open-minded inquiry at least as strong as the resistance in Darwin's day to questioning the authority of the Bible or the rationality of particular religious beliefs. The fear arises, I suppose, from the possibility that children will be induced to question and perhaps reject the beliefs of their parents or church. Religious fundamentalism persists not because of inadequacies in our arguments from reason and science; it persists because it is taboo in our society - indeed, in most places in the world - to promulgate the arguments of reason and science in refutation of most religious beliefs (Salman Rushdie's plight over The Satanic Verses is a case in point).
I cannot remember when I last encountered a rousing refutation of any of the thousands of preposterous religious dogmas on prime-time television, nor have I seen a disclaimer by a major newspaper regarding its astrology column. We need to learn how to discuss sensitive issues without taking up cudgels. These issues can be handled sensitively in the classroom by avoiding the ad hominem vilification and character assassination so common to religious and political argumentation. It seems clear to me that if a particular set of beliefs is so fragile that it cannot withstand intellectual examination and critical scrutiny, it should, indeed, be rejected.

Eamon Callan[283] asks whether parents are entitled to view their children as chattels by indoctrinating them. This includes "the right to send one's children to denominational schools which instil one's own faith."[284] Callan's answer is "no": he maintains that "indoctrination is at least prima facie the same evil whether it is perpetrated by Big Brother or one's dear parents."[285] The inculcation of religious doctrine is a paradigm case of indoctrination in that the majority of the beliefs are accepted as certainties and held on the basis of faith, i.e., held non-evidentially and "immune to criticism and rational evaluation."[286] Harvey Siegel has argued that children should be protected from indoctrination, regardless of its source, maintaining that it is "undemocratic and immoral."[287]

Fundamentalist education, in fact, offers us a classic example of indoctrination. For the aim of such education is to inculcate a set of beliefs in such a way that students never question or inquire into the legitimacy of those beliefs. Indeed, the mark of success of a fundamentalist education is the student's unswerving commitment to the set of basic beliefs inculcated, and a teacher or schoolmaster whose students did not exhibit such a commitment could not be judged successful. It is disconcerting to hear leaders of the Moral Majority and other Christian proselytizers or allied proponents of creationism and fundamentalism claim that parents own their children and should be free to determine their children's education. Such a view denies that children are morally entitled to grow into autonomous thinkers, capable of making independent judgements as to the worth of particular beliefs. This view is both morally repugnant in its flagrant disregard for the rights of children as persons, and anti-American in virtue of its antidemocratic thrust.[288]

For very young children, indoctrination of some sort is probably unavoidable for both moral and prudential reasons. However, the authority of the parent or teacher is probably invoked more often to bring about acceptable behaviour in a child than to inculcate beliefs. Is telling a child that Mount Everest is the highest mountain in the world, or that Evolutionary Theory, the cornerstone of modern biology, is true, indoctrination? Not if the child is encouraged to ask how or why the teacher "knows" this. It might be argued from this example that indoctrination is not logically bound to any particular content, in the sense that a teacher could quite conceivably convince her students that the stories of Noah's Ark, the Virgin Birth or the Resurrection are true by suppressing all counter-evidence and inquiry concerning her claims.

It seems clear that if one is to teach, and not indoctrinate, then as soon as a child reaches an appropriate level of intellectual sophistication (perhaps at the Junior High School level), opposing sides of controversial issues must be entertained and reasons provided, based on the weight of the evidence and argument, for or against either side. Skeptics, by contrast, never seem to appear on shameful television programs such as those hosted by Oprah Winfrey, Phil Donahue and Geraldo Rivera, on which the proliferation of absurdities and credulities appears endless. We would likely have no reason to fear indoctrination, or television programs such as these, if we fostered in our children the appropriate cognitive styles and intellectual dispositions, such as the propensity to question, to doubt and to ask "Why?" We cannot, as Callan argues, undermine our children's "capacity for self-determination", since their "rights as adults may be violated by what happens to them as children."[289] Appealing to the Kantian notion of respect for persons and Joel Feinberg's notion of "anticipatory autonomy rights," he states that we, as parents, do not have the right to obstruct our children's future capacity for open-minded inquiry and their ability to evaluate evidence and argument.

Richard Dawkins, the eminent Oxford evolutionary biologist, argues that young minds are "pre-programmed to absorb useful information at a high rate" but at the same time find it difficult to "shut out pernicious or damaging information." Young minds, Dawkins asserts, are "open to almost any suggestion, vulnerable to subversion" and "friendly environments to parasitic, self-replicating ideas or information." He likens a child's mind to an "immune-deficient patient" that is "wide open to mental infection", and the incoming deleterious, malignant information to a computer virus.[290] Dawkins refers to these infectious ideas as memes,[291] ideational organisms generally having great psychological appeal, spreading from one receptive mind to the next. The survival value of a meme depends upon its ability to provide us with emotionally satisfying answers to our deepest and most disturbing existential concerns and to dissolve our anxiety about the injustices of an indifferent universe. Dawkins cites "belief in the afterlife" and "belief in a supreme being" as memes having high survival value, capable of being passed on from one culture and generation to the next.

It can be argued, for example, that in telling children Santa Claus is a real person we are not really engaging their active imagination; we are propagating a deception, an illusion – in short, a lie. Belief in Santa Claus is convenient and perhaps also charming and enchanting up to a point; but is such charm and convenience worth the price of lying to one's children and discouraging their intellectual curiosity and their respect for truth and honesty? It is no different with critiques of organized religion. Truth is not determined by reflections on social convenience. On the contrary, social expediency depends upon whether a belief is true! To encourage false beliefs and to protect them by discouraging, if not prohibiting, honest discussion and free inquiry may well be inexpedient in the extreme. Those who assume that some beliefs, even if false, are necessary to preserve morality have a peculiar notion of morality, for they imply that dishonesty and rigorous discrimination against honesty are moral. However, parents who teach their children about God, the Devil, Heaven and Hell, Angels and other metaphysical fictions are not knowingly deceiving their children since, in most cases, they are inclined to believe these things themselves. Because children in their "preoperational stage" of development, to use Piaget's phrase,[292] have difficulty differentiating between fact and fiction, we, as parents and teachers, have a responsibility not to take advantage of their cognitive immaturity, vulnerability, credulity, and reliance on us for accurate information and correct, undistorted descriptions of the world. Do we need these myths and deceptions to teach children about love, good will, well-being and the spirit of goodness and generosity? I think not.

One of the dilemmas that humanist liberal educators face is the conflict between their desire for a school environment embracing a purely secular open-minded, autonomous, critical and rational pursuit of the examined life and the freedom of the individual to, on the other hand, choose and commit himself to what ultimately may be an unreflective life of religious faith and unreason. As Eamon Callan has stated, “the moral problems of religious upbringing may grow out of a radical conflict between the twin ideals of educational liberalism. For if the examined life requires something approaching strict fidelity to the rational-critical principle, coming to live that life would make the option of religious practice virtually ineligible; and where that option does more or less disappear, it is not clear that one enjoys an ampler range of choice than the indoctrinated zealot who cannot seriously consider alternatives to his faith.”[293]

Moreover, in a liberal democracy there are serious practical and moral difficulties in any government's taking a strong paternalistic stand on the problem of indoctrination by closely scrutinizing whether or not parents are causing irreversible harm to their children's future ability to make autonomous rational choices. The essential tension between religious faith and the Socratic ideal of the examined life must, however, be "made vividly apparent to children and adolescents as they grow in understanding, even if this obstructs parental efforts to elicit faith in many instances."[294] As Callan has clearly pointed out:

The experience of examining religious propositions in the often harsh light of reason will sometimes, perhaps commonly, lead to their rejection, but without that experience our children remain ignorant of the reality that confronts them in accepting or rejecting lives grounded on such propositions. Those whose faith can survive the experience will not be entirely at home in either Athens or Jerusalem, but if there is faith worth having, they are the ones who have it.[295]


          5.3 Educators are Concerned With Belief

The public education system and its teachers are not merely neutral distributors of information. Real teaching does not simply involve the dispensation of raw data, information and prescriptions without providing justification, supporting evidence and reasoned argument. Non-judgemental neutrality in education is not only indefensible, but impossible, unless one is prepared to accept a position of epistemological relativism. The state, for example, in favouring evolutionary theory over creationism in the school curriculum is not taking a neutral position. The content of public schooling cannot be neutral among conflicting knowledge claims; nor can it be neutral among competing conceptions of the good life, and if it could, surely we would not and should not care to support it. The system chooses evolutionary theory over creationism, for example, because it meets the disciplinary criteria and standards of science. But teaching evolution, or anything else for that matter, must be conducted in an atmosphere of open-minded inquiry and critical skepticism, with students encouraged to examine, question and challenge freely the evidence and arguments offered in support. Indoctrination, it would seem, is more a function of how something is taught than of what is taught.

Forcing teachers to teach creationism, as many fundamentalists would have us do, restricts rational inquiry for the sake of furthering sectarian religion and is therefore repressive. Other judgements require more extended arguments: Is it repressive to teach evolution but to require equal time for creationism? If equal time for creationism entails teaching that it is as reasonable to believe that the world with all its creatures was specially created by a deity some several thousand years ago within a time frame of seven days as it is to believe that it occurred naturally and took much longer, then the demand for equal time is indirectly repressive: it undermines the secular standards of reasoning that make democratic education possible. If public schools are permitted to teach the reasonableness of creationism, then the same principle will allow them to teach the reasonableness of divine punishment for the sins of non-Christians or of any other minority that happens not to control the school curriculum. On the other hand, if teachers may subject creationist ideas to the same standards of reasoning to which other views presented in the classroom are subjected, then the demand for equal time may be benign - or even conducive to democratic education. Of course, this is not the interpretation of equal time that proponents of creationism have in mind. A disturbing footnote to this discussion is a study conducted by R. A. Eve and D. Dunn (1990) concerning the pseudoscientific beliefs of American biology and life science teachers.
In their sample, forty-five percent agreed that "Adam and Eve were the first human beings and were created by God", twenty-five percent agreed that "God created humanity pretty much in its present form within the last 10,000 years" and thirty percent would teach only creationism in their science classes if forced to choose between creationism and evolution.[296] Eve and Dunn concluded that "there is reason for serious concern regarding the quality of science (especially biology) education in the U.S."[297]

As educators, if we are concerned with cultivating autonomous critical thinkers, we must foster in our students the notion that truth is often tentative and transitory. Genuine knowledge is a difficult commodity to secure, but belief is not easily purchased either. If we, as educators, were not concerned with what our children or we ourselves believe - and if this indifference became a societal norm - it seems there would be little need for our services. As John Wilson has recently stated, "what would be the point in spending time, money and effort in changing our pupil's beliefs and attitudes, if we have no reason to believe that we are changing them in the direction of reason and truth?"[298] One of the basic premises of public education is that we care not only about the beliefs that our children come to hold, but, more importantly, about the procedures and standards employed in arriving at those beliefs and the manner in which those beliefs are held. We do not choose our beliefs as we would our clothing or furniture. In this sense, beliefs appear to be, as W. K. Clifford would argue, involuntary - forced upon us by evidence, rational argument, and reliable, impartial authority or, unfortunately, by indoctrination. For surely education must at least be concerned with these procedures and with correcting one's beliefs about various matters, bringing them into line with those accepted by the acknowledged experts in the field under discussion.


          5.4 Credulity, Truth, Constructive Skepticism and Education         

The high level of credulity of the general populace is an unsettling reality which should be of major concern to educators.[299] Our young people are the constant targets and victims of proselytising apologists for a very large range of conflicting views, for some of which a vastly stronger case can be made than for others. If one is aiming to educate, not indoctrinate, one tries to show not only how far particular proposed conclusions are grounded in evidence and well-formulated argument, but also how, generally, to go about testing, confirming or falsifying any proposed conclusion in those fields. As W. K. Clifford has intimated, there are certain intellectual traits or habits of mind - what I have referred to as constructive skepticism - that must be cultivated in order to avoid the vice of credulity. As I have already claimed, we cannot choose our beliefs in the same cursory manner as we choose our furniture or clothing. Lorraine Code has written:

Cherished beliefs pose formidable bastions of opposition to epistemic change. In fact, there is undeniable tension here, for a responsible attitude to knowledge and belief in general is manifested, in part, in caring about what one claims to believe or know. People for whom believing or not believing, knowing or not knowing, are matters of indifference are unlikely to meet even the least stringent requirements of epistemic responsibility. But caring too much, holding on too tenaciously in the face of contradictory evidence is as bad as caring too little. We are led, in the end, to see just how apposite is Aristotle's doctrine of the mean.[300]

In moral education we must be concerned with, not only the question "What if everyone did X?", but also with the question, "What if everyone believed X?" Underlying this conception of education is the ideal of a person who has acquired those intellectual virtues[301] that enable him to believe responsibly, to have reverence for truth and respect for sound judgement, and to have a propensity for self-criticism[302] and open-mindedness.[303] Bertrand Russell maintained that a critical skepticism is commensurate with the concept of liberty, the principle of free expression, open-mindedness and what he called, "truthfulness."

The fundamental argument for freedom of opinion is the doubting of all our beliefs. If we certainly knew the truth, there would be something to be said for teaching it. When the State intervenes to censure the teaching of some doctrine, it does so because there is no conclusive evidence in favour of that doctrine. The result is that the teaching is not truthful, even if it should happen to be true. The difference between truth and truthfulness is important in this connection. Truthfulness, as I mean it, is the habit of forming our opinions on the evidence, and holding them with that degree of conviction that the evidence warrants. This degree will always fall short of complete certainty, and therefore we must be always ready to admit new evidence against previous beliefs.[304]

Russell is not arguing against the concept of absolute truth - there is a difference between absolute truth and the conviction of certainty in one's claims to the truth. To hold that truth is absolute, i.e., a time-independent and person-independent property of ideas or beliefs, is not to suppose that one can ever be certain of possessing it. Hence, "it is logically possible to deny certainty (and, therefore, dogmatism) and yet to uphold an absolute theory of truth."[305] Dogmatism rests on a claim of certainty (or indubitable intuition), not of absolute truth. The denial of certainty is not relativism about truth, but fallibilism. As C. S. Peirce has stated, "Estimation of truth thus alters in the course of our experience, but it does not follow that truth itself is altered or alterable."[306] It seems necessary, therefore, to retain the notion, if not of absolute truth, at least of moving closer to the truth. Karl Popper, not unlike Russell, wrote:

Although I hold that more often than not we fail to find the truth, and do not even know when we have found it, I retain the classical idea of absolute or objective truth as a regulative idea; that is to say, as a standard of which we may fall short.[307]

This view of truth is commensurate with the Platonic ideal that it is only by conceiving of an absolute truth that we can make sense of approximating or approaching the truth or the possibility that we might be in error.


          5.5 Belief and Critical Thinking

Surely one of the key objectives in advancing critical thinking is to make our students aware of the importance of belief management. This involves: (1) knowing how to critically evaluate the reliability of authoritative knowledge, (2) believing what we have good reasons to believe and not believing what we have good reasons not to believe, (3) realizing that the degree of assent to a belief should be proportionate to the strength of the evidence in support of that belief, not to the intensity of the belief or to our self-deceptive desire that it be true, and (4) being prepared to modify or reject beliefs if counter-evidence or counter-argument is disclosed.
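The third requirement - proportioning assent to evidence - can be given a rough probabilistic gloss. This is a modern formalization offered only as an illustration, not a notation used by Clifford or the authors discussed here:

```latex
% Let H be a hypothesis and E the total available evidence.
% Bayes' theorem gives the warranted degree of belief in H:
\[
  P(H \mid E) \;=\; \frac{P(E \mid H)\, P(H)}{P(E)}
\]
% On this gloss, responsible believing means letting one's degree of
% assent track P(H | E), and revising it when new evidence arrives
% (requirement (4)), rather than letting it track the intensity of
% one's desire that H be true.
```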

John McPeck defines critical thinking as the process of "justifying one's beliefs." This process involves assessing "the veracity and internal validity of the evidence and determining whether or not it is coherent and consistent with one's existing belief system."[308] McPeck would agree with Russell when he argues that "reflective skepticism" is a necessary factor in the appraisal of any new proposition or idea and that we should never assent to a belief, but hold it provisionally or tentatively, until it is exposed to judicious critical scrutiny. Furthermore, beliefs should never be held in such a way that they may never be revised, or even rejected, in the light of new evidence and argument concerning their "truth or validity."[309] McPeck's expression "reflective skepticism" is construed as "the kind (of skepticism) we engage in when we have reason to suspect that the normal procedures, or beliefs, leave something to be desired."[310]

Harvey Siegel (1980) and John Passmore (1967) have proposed that we, as educators, must cultivate in our students the "critical spirit" or "critical attitude" - dispositions that are part of the moral outlook, an ideal of character. Harvey Siegel conceives of critical thinking as "an embodiment of the ideal of rationality" which, in turn, is "coextensive with the relevance of reasons."[311] Siegel argues that "to seek reasons is to commit oneself to principles governing such activity" which entail "judging non-arbitrarily", impartiality, and objectivity. Expanding upon Ennis' (1962) conception[312], Siegel concludes:

Critical judgement must, therefore, be objective, impartial, non-arbitrary, and based on evidence of an appropriate kind and properly assessed.[313]

John Dewey's conception of rationality as dispositional is echoed by Israel Scheffler who describes "rational character" as constituting an "intellectual conscience" which "monitors and curbs evasions and distortions...combats inconsistency, unfairness to the facts, and wishful thinking." By exercising control over undesirable impulses, it "works for a balance in thought" - an "epistemic justice."[314]

Barry Beyer, like W. K. Clifford and Bertrand Russell before him, views credulity as an intellectual vice:

The single most important criterion for acceptance as a critical thinking skill must remain that the skill seeks primarily to differentiate truth from falsehood, fact from fiction. A critical thinker approaches information with a healthy skepticism about what is really true or accurate or real as well as with a desire to search through all kinds of evidence to find that truth.[315]

Critical thinking is discriminating, disciplined, and questioning. We often naively assume that the opposite of critical thinking is creative thinking, but its actual opposite is undiscriminating, undisciplined, and unquestioning thought - in short, the gullible acceptance of claims without careful analysis of their bases of evidence, reasons, and assumptions.[316]

Matthew Lipman provides arguments not unlike those supplied by Beyer. He agrees with Beyer's assertion that definitions and predicted outcomes of critical thinking have been, respectively, too vague and too narrow. Lipman states that the outcomes of critical thinking are judgements. If we conceive of education as inquiry - the transmission of knowledge and the cultivation of wisdom - then what is wisdom? Lipman maintains that wisdom is "the characteristic outcome of good judgement and good judgement [is] the characteristic of critical thinking."[317] Critical thinking, he asserts, is skilful, responsible thinking that facilitates good judgement because it (a) relies on criteria, (b) is self-correcting, and (c) is sensitive to context.[318]

The requirement that critical thinking be skillful thinking connects it with epistemological and other normatively relevant considerations of practice. Skillfulness points to the fact that critical thinking is embedded in contexts that furnish reliable information and warranted methodology. Critical thinking is not indifferent to the norms of the various fields of inquiry; rather, it looks to appropriate practice for the standards that have proved useful so far in supporting warranted inquiry of all sorts, and for the most reliable information from which inquiry draws its relevance and strength.

Responsible thinking points to the relationship between the critical thinker and the community that he addresses. The critical thinker sees an obligation to present reasons in light of acceptable standards, or to challenge such standards by relevant and persuasive argument. Such reasons are subject to the judgement of competent members of fields relevant to the issues involved, and the critical thinker is obliged to address such members and reflect upon their judgements when making claims and presenting arguments and analyses.

Through the focus on judgement, critical thinking is seen as directed towards non-routine thinking - thinking that cannot be adequately based on recipes, algorithms or mechanical procedures. It is called for in situations where considerations must be weighed, alternatives assessed, priorities set, and determinations of truth and relevance made.

A criterion, Lipman states, is "a rule or principle utilized in the making of judgements", and he outlines the logical connection between critical thinking, criteria and judgement. Criteria are reasons, but reasons which are reliable - "good reasons." Critical thinkers rely upon criteria that have stood the test of time, such as validity, evidential warrant, consistency, and relevance. Criteria may not enjoy a high level of public assent (many people are not interested in objectivity and a search for truth), but they have a high level of acceptance and respect within a community of inquirers. Lipman distinguishes between criteria and standards, pointing out that standards represent a vast subset of criteria, just as criteria can be thought of as a subset of reasons. Criteria specify general requirements, while standards represent the degree to which those requirements need to be satisfied in particular instances.

Lipman argues for a sort of epistemic responsibility - what he refers to as cognitive accountability and intellectual responsibility. Ultimately, we want students to think for themselves, to develop intellectual autonomy and intellectual empowerment, and this requires accepting responsibility for good thinking and decision making. Meeting this aim requires a component of critical thinking which he calls "self-correcting inquiry",[319] which aims to discover weaknesses in one's own thinking and to rectify what is at fault in the methodology. Finally, Lipman stresses that critical thinking must be sensitive to context. This takes into account: (i) exceptional or irregular circumstances and conditions, (ii) special limitations, contingencies, or constraints, (iii) overall configurations, (iv) the possibility that evidence is atypical, and (v) the possibility that some meanings do not translate from one context to another.

In addition, critical thinking discussions, according to Lipman, are best carried out within a John Dewey style community of inquiry, that is, within a group of individuals for whom the pursuit of inquiry and the norms that it entails are the governing considerations. A critical thinker, through dialogue, strives after truth and other normatively appropriate goals. The outcome of inquiry is to be judged, for example, in terms of epistemological rather than rhetorical norms; in light of moral ends rather than mere expediency or efficacy.

In sum, Lipman insists that good judgement cannot be operative unless it relies upon proficient reasoning skills that can assure competency in inference, as well as upon proficient inquiry, concept formation, and translation skills. Critical thinking conceived as "skilful thinking" - thinking that satisfies relevant criteria - dictates that one orchestrate a vast variety of cognitive skills, grouped in categories such as reasoning skills, concept formation skills, inquiry skills, and translation skills. The philosophical disciplines alone, Lipman claims, provide both the skills and the criteria that are presently deficient in the curriculum.

Because of his emphasis on criteria, Lipman's account perhaps comes closest to bringing out the sense in which critical thinking is "critical". However, even if we grant that Lipman has specified three properties of critical thinking, it is not clear that they define it. A thinker might engage in self-corrective thinking, be sensitive to context, and be guided by criteria, and still fail to be critical. Suppose, for example, that self-criticism refers to the process whereby one looks critically at one's own beliefs, theories, and so forth. This is not enough, since an individual might be quite good at this and yet remain highly resistant to criticism from others. If the capacity to take criticism from others is an essential feature of the critical thinker, then being self-critical is not sufficient. It is equally important to take criticism and learn from it, but unfortunately many people find it extremely difficult to accept criticism of their thinking or of their deeply held beliefs, particularly when those beliefs lack evidential support or plausibility.

Richard Paul argues that we have a natural tendency toward "egocentricity" and "ethnocentricity" - a tendency to assume our perspectives and our culture's perspectives to be the only plausible ones, to resist issues from the perspectives of other persons or cultures. Our "primary nature" is spontaneous, egocentric, credulous, and strongly prone to irrational belief formation.

People need no training to believe what they want to believe, what serves their immediate interests, what preserves their sense of personal comfort and righteousness, what minimizes their sense of inconsistency and what presupposes their own correctness.[320]

The many miracle stories in the Bible are a case in point. These are typical miracle tales of the sort no intelligent Christian would believe for a moment if he came upon them in the Koran.

R. S. Peters states "the irreconcilability of the use of reason with egocentricity and arbitrariness is a reflection of its essentially public character."[321] Impartiality and inter-subjectivity - the appeal to public, impersonal tests - deny any appeals to revelation, intuitive insight, or any other privileged access as criteria for rationality. Reason is, in this sense, impersonal, for by "partiality" we unequivocally mean "the intrusion of irrelevant factors, say private, idiosyncratic associations or more often, private hopes or private fears." Moreover, "rational people have no fondness for miracles which here, in the popular sense, mean arbitrary discrepancies"[322] or what Hume called "a violation of the laws of nature."[323] Premises arising from the aforementioned sources must be rejected as inadequate on the grounds that they involve privileged access, faith or dubious testimony. Hume asserted:

No testimony is sufficient to establish a miracle, unless the testimony be of such a kind, that its falsehood would be more miraculous, than the fact, which it endeavours to establish...whoever is moved by Faith to assent to it is conscious of a continued miracle in his own person, which subverts all the principles of his understanding and gives him a determination to believe what is contrary to custom and experience.[324]

Hume's general principle for the evaluation of testimony - that we must weigh the unlikelihood of the event reported against the unlikelihood that the witness is mistaken, deceived or dishonest - is substantially correct. It is a corollary of the still more general principle of accepting whatever hypothesis gives the best overall explanation of all the available and relevant evidence. For many people, at least in my own experience, beliefs seem to be a function of upbringing and a propensity to believe what one is told. Of course it must be admitted that most of what we know depends upon testimony and authority, but in most cases the chain of testimony must come to an end. We can only learn from testimony what, at some point, was learned by means other than testimony.
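
Hume's weighing principle lends itself to a simple Bayesian gloss. The sketch below is a modern reconstruction with purely illustrative numbers (Hume himself offered no formal calculus, and the function and its parameters are mine): it compares the prior improbability of the reported event against the reliability of the witness.

```python
# A Bayesian reconstruction of Hume's weighing principle.
# Numbers are illustrative only; this is not Hume's own formalism.
def posterior(prior_event, p_testimony_if_true, p_testimony_if_false):
    """Probability the reported event occurred, given the testimony."""
    numerator = p_testimony_if_true * prior_event
    denominator = numerator + p_testimony_if_false * (1 - prior_event)
    return numerator / denominator

# An honest witness (right 99% of the time) reports a one-in-a-million event.
p = posterior(prior_event=1e-6,
              p_testimony_if_true=0.99,
              p_testimony_if_false=0.01)
print(f"{p:.6f}")  # remains far below 0.5: the falsehood of the testimony
                   # is less "miraculous" than the event it reports
```

For an ordinary event (say, a prior of 0.5) the same witness makes belief very reasonable; only when the event's prior improbability swamps the witness's reliability does Hume's maxim bite.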

What is needed, it seems, is a sort of intellectual modesty - a move in the direction of skepticism. As C. S. Peirce states in the "Fixation of Belief", "doubt is an uneasy and dissatisfied state from which we struggle to free ourselves and pass into the state of belief; while the latter is a calm and satisfactory state which we do not wish to avoid, or to change to a belief in anything else. On the contrary, we cling tenaciously, not merely to believing, but to believing just what we do believe,"[325] partly because "the instinctive dislike of an undecided state of mind, exaggerated into a vague dread of doubt, makes men cling spasmodically to the views they already take."[326]

Bertrand Russell declared that credulity is "one of the chief obstacles to intelligence" and, echoing W. K. Clifford, stated "the aim of education should be to cure people of the habit of believing in propositions for which there is no evidence."[327] Continuing his relentless attack on William James' pragmatism, he asserted:

William James used to preach the "Will to Believe." For my part, I should wish to preach the "Will to Doubt". What is wanted is not the will to believe, but the wish to find out, which is the exact opposite.[328]

The "wish to find out" is a disposition of sustained intellectual curiosity and sense of wonder - the "awe factor." It involves sensitivity to unanswered questions and explanatory gaps with the attendant ability to challenge assumptions and detect hidden premises.

          5.6 Fallibilism and Constructive Skepticism

The "egocentric mind", says Richard Paul, requires a moderated skepticism and a willingness to suspend judgement pending evidence - a capacity that he calls a higher order "secondary nature" skill. As Wittgenstein rightly pointed out, without doubt there would be no need for inquiry. To assume certainty and infallibility for a belief is to imply that there is no need for either reflection or inquiry. The strength of constructive skepticism lies not in whether it is tenable as a philosophical position, but in the force of its arguments against the claims of dogmatism. Paul describes what he calls "dialogical thinking" - an ability to look at problems from multiple points of view and different frames of reference. He stresses developing critical thinking in the "strong sense" - teaching it so that students "explicate, understand, and criticize their own deepest prejudices, biases and misconceptions, thereby allowing them to discover and contest their own egocentric and sociocentric tendencies."[329] As with Siegel and Passmore, the normative component of Paul's argument is made clear in his plea for developing in our students intellectual virtues which he calls the "rational passions."

A passionate drive for clarity, accuracy, and fair-mindedness, a fervour for getting at the bottom of things, to the deepest root issues, for listening sympathetically to opposition points of view, a compelling drive to seek out evidence, an intense aversion to contradiction, sloppy thinking, inconsistent application of standards, a devotion to truth as against self-interest. [330]

Here Paul stresses the dispositional requirements of perspicacity, the need for conceptual clarity and understanding of essences, sensitivity to vagueness, ambiguity and superficiality, attention to precision and detail, alertness to error and fallacious argumentation and the importance of meta-cognitive skills.

In my view, the great strength of Paul's account is that it forces us to think about the extent to which critical thinking depends upon the capacity of the individual for reflective self-criticism - the ability to distance ourselves from our beliefs, to be cognitively self-aware. We must avoid self-deception and achieve a sense of humility[331] in the face of the fallibility of most of what we construe as knowledge. Charles Darwin, for example, apparently engaged in the practice of making a note of all possible objections to his theories the moment he encountered them. Darwin claimed that he "followed the golden rule, namely that whatever new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were more apt to escape from the memory than favourable ones."[332] In other words, evidence favourable to our beliefs and hypotheses is readily assimilated and recalled, whereas unfavourable or disconfirming evidence is often ignored and forgotten. Darwin was fully aware of the fallacy we now call confirmation bias. Lorraine Code, drawing heavily upon C. I. Lewis (1956), provides a compelling argument for what she calls "normative realism" - the "aim to understand how things really are."[333]

A kind of normative realism constitutes the implicit ideal of good knowing at the core of correspondence and coherence theories of truth and knowledge. Although actual correspondence relations are difficult, if not impossible to establish, sustaining the effort to do as well as possible is a mark of a virtuous intellectual conduct.[334]

Implicit in normative realism is the view that to be a good knower is to have a fundamental respect for truth. A good knower seeks to achieve knowledge that fits the world of experience, is coherent with rationally established truths, and enables one to live well, both epistemically and morally.[335]

Achieving these goals requires both honesty and humility: honesty not to pretend to know what one does not know (and knows one does not) or to ignore its relevance; humility not to yield to temptations to suppress facts damning to one's theory.[336]

Lorraine Code claims that justification should not be impersonal or objective in the sense of psychological detachment in which the agent is concerned only with propositions and their logical relations, but should focus on the knowing subject. However, this does not preclude the notion of an objective reality, of how things "really are," the possibility of a mind-independent reality which is external and indifferent to our absolute convictions and which determines what is true.

Probability has to be the guide to life, given that we almost always have to act under uncertainty. Without uncertainty there would be no hope, no free will and no ethics. An essential part of wisdom is the ability to determine what is uncertain or implausible - that is, to appreciate the limits of knowledge and to understand its probabilistic nature in many of life's situations. Paradoxically, however, while we strive to reduce the uncertainties and contingencies in our lives and in our social, political, economic and natural environment, ultimate success in this endeavour - that is, total success - would be horrific. If there were no uncertainties about the consequences of behaviour, for example, ethics and morality would become redundant, because uncertainty is a necessary precondition for the very existence of ethical choice: the ethics of a decision is not judged post hoc on the basis of the consequences that happen to follow.

We must face the fact that all but the most basic of our actions are connected to the purposes for which we choose them by beliefs that are only more or less probable. We generally dread uncertainty, however, and a common way of dealing with the uncertainty in life is to ignore it completely, or to invent some "higher rationale" to explain it, often a rationale that makes the uncertainty seem merely apparent rather than real. This "higher rationale" is often found in religion, but many who have abandoned traditional religion assuage the same dread of uncertainty with a belief in some other "higher order" - quasi-religious facile self-help books, astrology, tarot cards, scientology, and innumerable other pseudo-scientific and paranormal systems of belief. These replacements for mainstream religion help many people "make sense" of life's uncertainties and vicissitudes, which are believed to be part of some deep underlying structure that they strive to understand. There is structure in the universe, but is it related to the course of an individual's life? Shaking off the anguish of uncertainty in our lives, and the need to deny its existence, is extraordinarily difficult, even for those who have a profound and compelling intellectual belief that the universe is not constructed to serve human needs. These same people typically wonder what they "did wrong" when some tragedy enters their lives, such as a terminal illness. Furthermore, people even tend to deny the random components in trivial events that we know to be the result of chance, and often treat chance events as if they involved skill and were, therefore, controllable. Observe, for example, the mannerisms of gamblers as they attempt to influence the outcome of a roll of the dice, or the belief that one's own choice of numbers in a lottery will have any effect on the outcome.
One serious problem that needs to be addressed by educators is the lack of understanding by the general populace, not only of science, but of the most basic principles of probability theory. The acceptance of the "gambler's fallacy" by the majority of the general public is a blatant example of this appalling misunderstanding of the most basic elements of mathematical probability.                                 
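
The gambler's fallacy can be exposed with a few lines of simulation. The sketch below (a hypothetical classroom exercise; the function name and parameters are mine) checks whether a fair coin "remembers" a run of tails - that is, whether heads becomes more likely after three tails in a row.

```python
import random

# Does a fair coin "owe" us heads after a streak of tails? Simulate and see.
def heads_rate_after_streak(n_flips=200_000, streak=3, seed=1):
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(n_flips)]  # True = heads
    # Collect every flip immediately preceded by `streak` tails in a row.
    followers = [flips[i] for i in range(streak, n_flips)
                 if not any(flips[i - streak:i])]
    return sum(followers) / len(followers)

print(round(heads_rate_after_streak(), 3))  # close to 0.5: the coin has no memory
```

The observed frequency stays at about one half, streak or no streak - independent trials have no memory, which is precisely what the gambler's fallacy denies.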

William Hare has argued that students and teachers "need to recognize the vulnerability of their beliefs to counter-evidence and counter-argument" and that the "ability to doubt" is crucial if we are to "entertain criticisms of our beliefs."[337] An education, it might be said, should leave students with the ability to doubt rather than the inclination to believe. Hare reminds us of Dewey's observation of the "over-simplified" human tendency to think in terms of either-or, "hard-and-fast alternatives" - a dualistic form of reasoning in which "they assume that an answer must be right or wrong."[338] "This particular dichotomy of ability to doubt versus inclination to believe", Hare claims, "tends to blind us to Hume's insight that belief can be proportioned to the evidence."[339] This probabilistic, dynamic and revisionary approach to knowledge is supported by the assertion that:

Fallibilism recognizes that our claims to knowledge rest on reasons and evidence, and our awareness and understanding of the latter can change. This view, then, is incompatible with the kind of skepticism that regards all claims and interpretations as equally dubious. If, however, those who advocate skepticism as an aim of education really mean to emphasize the point that the last word has never been said, that our assessment of reasons and evidence may in time lead us to a new view, then this is in fact a way of making the point that knowledge is tentative. Clearly this view of skepticism leaves intact the legitimacy of appeals to reason and evidence, since it is in terms of the latter that a new view will be framed. Teachers who embrace fallibilism recognize the possibility of improving their present knowledge and understanding.[340]

Fallibility is one of the universal and inevitable conditions of our humanity. There is no possibility of a choice between fallibility and infallibility. It is plainly our fallibility that is the primary reason why we must be continually open to skeptical scrutiny and rational criticism. It also leaves us vulnerable to error and failure.

A study of fallibility, error, failure and feelings of failure is perhaps healthier than the total denial of failure that seems to pervade the education system today. Of course we want students to succeed and have self-esteem - but not at everything and, certainly, not at everything equally. And there is surely no need to mark them as failures - to expose their shortcomings to the world as a "permanent record." A life-oriented education would assist students in realistic self-evaluation, in seeing both the humour and sadness in failure, and in recognizing that some mistakes and failures, through what we learn from them, contribute to our real talents.

But along with humility and temperance as key intellectual virtues, there is required a sort of intellectual panache - the courage to take intellectual risks, to take the time to examine the grounds for new and adventurous ideas, and to challenge the credentials of things it is customary to believe. I am not here referring to "the power of positive thinking", "wishful thinking", or wasting one's time pursuing propositions which are glaring absurdities. Intellectual courage is the "willingness to conceive and examine alternatives to popularly held beliefs, perseverance in the face of opposition from others (until one is convinced one is mistaken), and the Karl Popper willingness to examine, and even actively seek out, evidence that would refute one's own hypothesis."[341] Karl Popper's account of science is one of walking the tightrope of sharp criticism and testing of bold new hypotheses and theories, with the attendant risk of falling into "the chasm of refutation."[342] However, this is not something we are naturally inclined to do. Recent psychological research on reasoning suggests that the existence of possible counter-examples is not a major consideration when people decide whether or not to accept an inference. People seem more inclined to search out confirmations of new or pre-existing beliefs (confirmation bias), and acceptance of any new candidate for belief is dictated by whether or not it "fits in" with their experiential model and other beliefs. Hence, students should be encouraged to take risks with new ideas and original solutions to problems, and to present their arguments and views in class without fear of error or criticism.

Although one must be open-minded in the manner described by William Hare (1979, 1985), the strength of a mitigated and tempered skepticism lies in the force of its arguments against metaphysical gibberish and the claims of dogmatism. As Hume admitted, one of the characteristics of skeptical argument is that it "admits of no answer, and produces no conviction." Simply to live a superficial and unreflective life according to the dictates of custom, habit or external authority renders "the greater part of mankind...naturally apt to be affirmative and dogmatic in their opinion," but "a small tincture of Pyrrhonism" might imbue "a degree of habit, and caution, and modesty which, in all kinds of scrutiny and decision, ought for ever to accompany a just reasoner."[343] Good philosophical reason for doubt is easily purchased; and when knowledge is conceived as a prized possession, one should not be surprised that it is hard to come by. The role of skepticism is to question whether a person has adequate grounds for his assertions and assumptions, and whether his belief system is free from wishful thinking, fallacious reasoning, contradiction or absurdity. Skeptical arguments tend to be parasitic, in that they assume the premises of the dogmatist and point out the logical inconsistencies and other faulty standards of reasoning in his position. This is the essence of the Socratic approach. The purpose of skeptical scrutiny is to inquire into the evidence for one's beliefs and the adequacy of that evidence.

Skepticism has been a major dynamic force in intellectual history, and without it we could never distinguish superstition, speculation, or emotional responses from meaningful, coherent beliefs. Without skepticism the Enlightenment and the scientific revolution could not have occurred, and we would have continued to suffer intellectual inertia under the domination of religious dogma. As C. S. Peirce has written, "All the progress we have made in philosophy, that is, all that has made sense since the Greeks, is the result of that methodological skepticism which is the first element of human freedom; doubt is not a habit, but the privation of habit."[344] As Peirce pointed out, once beliefs are in place they can prove tenacious, even when the original arguments that convinced us of them are discredited. We must not, as Sartre pointed out, act in "bad faith" by abdicating our rational responsibility for self-criticism and self-assessment, deceiving ourselves into accepting beliefs for which there is no basis in fact. When people defend creationist "science" by insisting that "evolution is only an hypothesis," one suspects that their view is sustained only by a wilful failure to explore the structure of hypotheses and the difference between unverifiable speculation and scientific theory. Such an attitude suggests that we are unable or unwilling to take responsibility for our beliefs by subjecting them to rational criticism. Christians, for example, believe in free will and responsibility, but they often abdicate or exempt themselves from autonomous epistemic responsibility by dogmatically accepting the authority of biblical claims. Moreover, we need to be reminded that a "theory" is an examination and explanation of "facts" - many conflate theory with conjecture or hypothesis.

John Stuart Mill once stated "The fatal tendency of mankind to leave off thinking about a thing when it is no longer doubtful is the cause of half their errors." Bertrand Russell, always the skeptic, though he declared "I wish to guard myself against being thought to take an extreme position,"[345] severely denounced all forms of dogmatism, particularly the political and religious varieties.

William James used to preach the "will to believe." For my part, I should wish to preach the "will to doubt." Every man of science whose outlook is truly scientific is ready to admit that what passes for scientific knowledge is sure to require correction with the progress of discovery; nevertheless, it is near enough to the truth to serve for most practical purposes, though not for all. In science, where alone something approximating to genuine knowledge is to be found, men's attitude is tentative and full of doubt.

In religion and politics, on the contrary, though there is as yet nothing approaching scientific knowledge, everybody considers it de rigueur to have a dogmatic opinion, to be backed up by inflicting starvation, prison, and war, and to be carefully guarded from argumentative competition with any different opinion. If only men could be brought into a tentative agnostic frame of mind about these matters, nine-tenths of the modern world's evils would be cured. War would become impossible, because each side would realize that both sides must be in the wrong. Persecution would cease. Education would aim at expanding the mind, not at narrowing it. Men would be chosen for jobs on account of merit, aptitude and fitness to do the work, not because they followed the irrational dogmas of those in power. Thus, rational doubt alone, if it could be generated, would suffice to introduce the approaching new millennium.[346]

Russell felt that if society could rid itself of superstition and authoritative belief systems, and teach men and women rationality and critical thought, "it would completely transform our social life and our political system" and "would tend to diminish the incomes of clairvoyants, book-makers, bishops, and others who live on the irrational hopes of those who have done nothing to deserve good fortune here or hereafter."[347]

Russell was a severe critic of the education system of his own day and was appalled by the level of credulity of the general public. If education is simply conveying factual information and teaching basic skills, implicitly denying the fallibility of much of what we claim to know, then there is little need for conscientious inquiry, open-mindedness, or healthy constructive skepticism. Russell argued that "Education should have two aims: first, to impart the basic skills and knowledge such as reading, writing, language, mathematics, and so on, and second, ...to create those habits which will enable people to acquire knowledge and form sound judgements for themselves."[348] Russell felt that "one of the chief obstacles to intelligence is credulity," and that "the aim of education should be to cure people of the habit of believing in propositions for which there is no evidence."[349] Russell would endorse critical thinking if he were alive today. He bemoaned the fact that, in his own day, "...it is not desired that ordinary people should think for themselves, because it is felt that people who think for themselves are awkward to manage and cause administrative difficulties."[350] Russell would agree with Nietzsche's claim that the schools "have no more important task than to teach rigorous thinking, cautious judgement and consistent reasoning."[351]

People tend to be "true believers", effortlessly, automatically, and uncritically taking in new ideas without reflection or rational analysis. With the rise of religious fundamentalism and religious cults in recent years we are witnessing the development of a mass state of mind that depends upon a suspension of the critical faculties - on developing the Alice-in-Wonderland capacity to believe six impossible things before breakfast every day. The French mathematician Henri Poincaré once remarked that rampant credulity arises because the truth can often be stark and cruel, and hence we would rather console ourselves by a process of self-deception and delusion. Skepticism also challenges established institutions, and perhaps we fear, as Russell suggested, that if we teach students to be critical thinkers they will not restrict their skepticism to outrageous television programs and commercials, horoscopes, and crystal ball gazers. Maybe they will start asking awkward questions about economic, social, political, and religious institutions.

Yes, skepticism is a risky business - but one must examine the alternatives. How are we to negotiate a very tenuous future if we don't instil in our children the intellectual dispositions and tools to ask the crucial and urgent questions of those in positions of authority in our so-called "democratic" societies? Particularly during difficult times, society tends toward conformity: the herd huddles together and becomes more intolerant than ever, often searching for a scapegoat and for a simple solution to complex problems when what is needed is constructive criticism and creativity. The men who conducted the Greek heresy trials, the Inquisition, the Nazi death camps and the communist witch hunts and "Red Scares", those zealots who went on Crusades and holy wars, were not critical thinkers or constructive skeptics - they were conformists, men of the crowd, true believers. As Eric Hoffer has stated, "the less justified a man is in claiming excellence for his own self, the more ready he is to claim all excellence for his nation, his religion, his race or his holy cause."[352] There is presently enough nonsense and rubbish disseminated by political parties, commercial advertisers and Sunday morning evangelists that the habit of impartial skepticism deserves to be encouraged as a national pastime, like weight loss programs, quack remedies and physical fitness. Education is hardly worth its name if it allows unfounded claims and nonsense to go unchallenged, but it is unworthy if it produces insensitive and spiritually illiterate skeptics. By this I do not mean that debunking should become a national sport. Although skeptics have performed a useful social function by exposing the charlatanism and fraud associated with televangelists, political demagogues, advertisers and the proliferation of paranormal beliefs, skepticism imposes an important responsibility.
It is clear that when one simply and totally closes the door to further inquiry into a phenomenon, there is no room left to study it. Unless informed by the proper intellectual dispositions and virtues - and scientific inquiry provides a paradigm of those virtues and dispositions - such a course is not healthy skepticism. Skeptics should critically examine alternative beliefs and belief systems by first attempting to understand the historical, social and cultural perspective of the people who hold them. If they expect to be taken seriously, particularly by their targets, they should also try, tempting as it is, to avoid emotive epithets like "cranks", "crackpots", "claptrap", "mumbo jumbo" and "drivel" which sometimes accompany their debunking exposures. Otherwise, the materialistic, humanistic assumptions of some skeptics may paradoxically play themselves out in a kind of religious crusade endorsing scientific fundamentalism or a sort of deified rationality.

But even though charges of closed-mindedness against skeptics are often brought in as a shoddy rhetorical device to prop up an otherwise hopeless argument, there may be a serious, though muddled, point lurking somewhere behind such charges. Believers in God and creationism often point to the logical truism that it is impossible to prove the non-existence of something. One must be cautious about using the claim "there is no evidence to the contrary" to establish the truth or existence of something. Maintaining a state of open-mindedness is certainly a good thing, but in cases where there is no physical evidence either way (for example, Russell's "Five-Minute Hypothesis"[353]), a contention about something existing or being true often deflates into a vacuous metaphysical statement or mere speculation, precluding rational, scientific inquiry. But as Thomas Huxley once stated (later reaffirmed by Karl Popper and Carl Sagan), "extraordinary claims demand extraordinary evidence" and "the more a fact conflicts with previous experience, the more complete must be the evidence which is to justify our belief in it." There is also an adage that "anything that can be asserted without evidence can be dismissed without evidence." This is, one hopes, a rule followed in any court of law; to me it is a truism for living a reasonable life. But if critical thinking is to be effective, a delicate balance must be struck between the two conflicting imperatives of skeptical scrutiny and open-mindedness. If one is able to operate in only one of these modes, then critical thought is impossible. All good scientists function in both modes, but many politicians, theologians, advertisers, economists, stock market analysts and much of the general populace rarely seem to do so.

I have attempted to defend the Enlightenment project of rationality and have presented arguments for a normative theory of critical thinking by specifying what I believe are the characteristic dispositions that will cultivate such thinking. Questions remain unanswered, many of which can only be answered by much needed empirical research. How can these dispositions be fostered? Are they innate? Are they a function of intellect? Can they be instilled by example? Dispositions are grounded in belief systems, and we need a convincing culturally based account of their development that is not contingent upon the dubious premises of conceptual or cultural relativism. It is my view that, to a great extent, dispositions are acquired through, and are a function of, organizational and interpersonal social interaction. For example, a child who grows up in a family that is open-minded and anti-dogmatic, that models acceptance of multiple points of view, and that encourages its children to doubt and question, is likely to acquire those attendant dispositions. I also hope to have been convincing in my contention that belief acquisition involves an ethical component. Surely a strong desire for the truth is an intellectual virtue.

W. E. H. Lecky (1866) noted that every major shift in belief or worldview has been preceded by a change in the intellectual climate, and that the success of any particular view depends "less upon the forces of its arguments, or upon the ability of its advocates, than upon the predisposition of society to receive it."[354] Such a predisposition results from the intellectual type which characterizes the age. "A change of speculative opinions," he wrote, "does not imply an increase of the data upon which these opinions rest, but a change of the habits of thought and mind which they reflect. Definite arguments are the symptoms and pretext, but seldom the causes, of the change; they derive their force and efficacy from their conformity with the mental habits of those to whom they are addressed."[355] Lecky goes on to add that the number of persons "who have a rational basis for their beliefs is probably infinitesimal; for illegitimate influences not only determine the convictions of those who do not examine, but usually give a dominating bias to the reasoning of those who do. All that we can rightly infer is that the process of reasoning is much more difficult than is commonly supposed; and that to those who would investigate the causes of existing opinions, the study of predispositions is much more important than the study of arguments."[356] Hence, if Lecky is right, whether arguments command assent appears to depend less upon their plausibility, soundness and logic than upon the climate of opinion in which they are advanced and the dispositional factors of the persons to whom they are directed. This does not mean that we should forget about teaching the sanctity of truth, reasoning skills and how to identify fallacious arguments - these are especially important[357] - but dispositional considerations are perhaps more important than we once thought.

In the classroom, teachers should stress the tentative nature of much of what we claim to know, and that reliable knowledge can be purchased only by appealing to rigorous intellectual standards and practices. One of the central aims of education should be to transform the individual being taught. Like Socrates, the teacher need not know the answers, but can be sure that her questions will transform the way students think about a topic, as well as transform their dispositions, habits, attitudes and beliefs as educated persons. No educational theory that lacks such a Socratic counterpoint can hope to free our young people to think new thoughts, to become new people, and thereby to revitalise the culture. One's education, then, is developmental - a process of renewal and growth in which one comes to view one's life as a creative endeavour, not just an accumulation of factual knowledge, a filling up of the mind. Developing the mind and one's powers of critical inquiry aims at a responsible kind of autonomy for the individual, whereby the student is equipped to make his own judgements using the best rational standards we have evolved as human communities of inquirers in the various forms of knowledge, including the ethical realm. The "content" students learn is secondary to the goal of developing individuals who use their intellectual autonomy responsibly and confidently and who are committed to tolerance, rationality, open-mindedness and constructive skepticism. The modern traditions of humanistic psychology, progressive educational theory, Kantian respect for persons and existential philosophy can serve contemporary education as intellectual and moral grounding for this transformative view.

 Too much of what goes on in the classroom today is the product of a didactic pedagogy directed at ramming information into students' minds to facilitate the reproduction of what they learn on an examination, so that it can be easily quantified. This is not to say that we should not test the extent of a student's "factual" knowledge and understanding, but unless we focus our attention more on how and why we know what we claim to know, we can test little else. In twenty-five years of teaching senior high school mathematics I have found that most of the students I encounter in Grade 11, 12 and my Advanced Placement Calculus classes have limited conceptual understanding of mathematics or its philosophical and historical foundations. Moreover, due primarily to curriculum inflexibility, they are seriously lacking in their understanding of logic (deductive and inductive), reasoning skills and the mathematical foundations and principles that justify any mathematical knowledge that they claim to "know". Students may "know" that X² + X² = 2X² and that X² × X² = X⁴, but they often do not understand the fundamental principles and logical processes that underpin and justify these results; the "why" question is always crucial and of utmost importance in any inquiry. Mathematics is often taught as a book of recipes and formulae without any justification. Moreover, students rarely think of attempting to falsify a general principle that they intuitively take to be correct. For example, in elementary algebra students often wrongly assume that (A + B)ⁿ = Aⁿ + Bⁿ, a misapplication of an exponent law.[358] Students need to understand that polynomial multiplication is justified by a real number axiom called the distributive law, an extremely important algebraic axiom that should be clearly understood before a student exits grade nine.
Instead, however, as I have already pointed out, students are often taught mathematics by a "recipe" approach - the memorization of procedures, processes, algorithms, formulas and mnemonic devices for carrying out algebraic operations. In science, for example, more effort should be directed at understanding its philosophical underpinnings and how and why scientific theories are derived, rather than simply knowing what those theories are. The scientific method (though there is much more to science, including creativity and conjecture) can be employed as a paradigm of intellectual integrity by explaining the standards, principles and practices that have been employed by great scientists such as Charles Darwin, Albert Einstein and Richard Feynman.[359] Students should be encouraged to think and act as scientists, artists, revolutionaries, mathematicians, historians, creative writers and others as they are being initiated into a way of knowing.
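The falsification habit urged above can be illustrated with a short program. This is only an illustrative sketch (the function names and test ranges are my own, not part of any curriculum): a single numerical counterexample suffices to falsify the spurious "exponent law," while the distributive law survives every trial we throw at it.

```python
# A single counterexample falsifies the common student error
# (a + b)**n == a**n + b**n, a misapplied "exponent law".
def spurious_law_holds(a, b, n):
    return (a + b) ** n == a ** n + b ** n

print(spurious_law_holds(1, 2, 2))   # False: (1 + 2)**2 = 9, but 1 + 4 = 5

# The distributive law a*(b + c) == a*b + a*c, by contrast, passes
# every trial; for the real numbers it is an axiom, not a conjecture.
assert all(
    a * (b + c) == a * b + a * c
    for a in range(-10, 11)
    for b in range(-10, 11)
    for c in range(-10, 11)
)
```

No finite run of confirming instances proves the distributive law, of course; the point is only that falsification takes one case, whereas justification rests on the axiom.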

   A dispositional theory, however, is clearly problematic in the sense that it is vulnerable to the charge of circularity and appeals to a vague collection of ill-defined or immeasurable behavioural traits. If my arguments for an ethics of belief, intellectual virtue, constructive skepticism and a dispositional theory of critical thinking obtain, then our present approach to education will need to be re-examined, revamped and revolutionized. A classroom atmosphere of free inquiry, open-mindedness and critical skepticism, following the lead of an education system which promotes independence of thought, will not only require an overhaul of the curriculum, but will demand the employment of teachers who possess the necessary intellectual and dispositional equipment. It will also demand allaying the fears of those students and parents who do not want their precious and treasured (primarily religious) beliefs exposed to intellectual scrutiny - a most difficult task indeed. Surely an important part of an education is challenging long-standing verities, practices, ideologies and worldviews, especially religious superstition. If we are to survive as a species, the status quo is not an option. Unless our schools focus on teaching students how to think rather than what to think, our citizens will become increasingly credulous, and the authoritarianism, injustice and economic inequalities that have been the norm throughout history will prevail and likely deepen, as what threadbare democracy we do enjoy comes under attack from reactionary and oligarchic factions in society. Teachers need to realize that the educated person in a democratic society is not the person who can necessarily answer all the questions, but the person who can question the often calcified answers.


  [1] Bertrand Russell (1950), p. 26.

    [2] Bertrand Russell (1926), p. 52.

    [3] Bertrand Russell (1950), p. 123.

    [4] Jean Paul Sartre (1956), Intro., Chap. 6, pp. 24-32.

    [5] Vancouver Province, Sept. 28, 1992.

    [6] The Skeptical Inquirer (Winter 1991), vol. 15, no. 2, pp. 137-46. It is curious that we spend an incredible amount of time and money assessing exactly what people think but devote little or no effort to trying to understand why they think that way. There seems to be an underlying assumption that the views people hold in these polls are reasoned views.

    [7] From the Oxford Dictionary. Another uncommon definition of skeptic cited is "one who doubts, without absolutely denying, the truth of the Christian religion or important parts of it; often loosely, an unbeliever in Christianity."

    [8] This conception of philosophical skepticism is given by Webster's New Universal Unabridged Dictionary. This definition is consistent with the Twentieth Century analytical philosophy definition of skepticism (e.g., Russell, Ayer, etc.) as a precursor to all philosophical inquiry.

    [9] Perhaps it could be argued that intuitively obvious mathematical principles such as these and some logical principles such as the law of non-contradiction are innate platonic concepts or are a function of training, tradition or authority. In any event, we take them to be self-evident.

    [10] Stephen Toulmin (1990), p. 199.

    [11] Stephen Toulmin (1990), p. 200.

    [12] John Dewey (1929), p. 17.

    [13] Dewey (1929), p. 21.

    [14] Dewey (1929), pp. 23-23, 42.

    [15] Dewey (1929), p. 41.

    [16] Dewey (1929), p. 40.

    [17] Peirce's approach to Science is not unlike that of Karl Popper's deductivism and his falsifiability principle.

    [18] C. S. Peirce, in P. P. Weiner, ed., pp. 107-108.

    [19] see "Is Justified True Belief Knowledge?" by E. L. Gettier in A. P. Griffiths, ed. (1967) for challenges to this notion.

    [20] According to Hamlyn (1970), there are five different versions of "wholesale" skepticism: (1) that knowledge is impossible; (2) that we can never be sure that we have attained it; (3) that it makes no sense to speak of knowledge; (4) that we never know anything; and (5) the possibility of knowledge is questionable (p. 7, 22).

    [21] Some recent philosophers, e.g. Stroud (1984) and Lehrer (1990), have referred to it as "total" or "epistemological" skepticism.

    [22] Peter Unger's thesis (1975) comes dangerously close to that of Cratylus and Gorgias when he concludes that our ignorance summons us to be silent, but having said that, proceeds to write for another seventy pages.

    [23] Bertrand Russell, Human Knowledge, 1948, p. xi.

    [24] To claim "There is no knowledge" is paradoxical, since the skeptic puts forth a thesis, namely that nothing can be known; but in advancing this thesis he is himself making a knowledge claim, and thereby contradicts himself. One cannot consistently know that nothing can be known without falling into the trap of self-referential refutation. Moreover, according to Russell's Theory of Types, general or total skepticism, since it refers to itself, is not capable of meaningful formulation. One also wonders: if the total skeptic doubts everything, does he doubt his own skepticism?

    [25] F. Nietzsche (1887), The Gay Science, Sec. 265, p. 219.

    [26] Thomas Nagel (1986), p. 74.

    [27] Whether there is a privileged set of empirical beliefs or a priori principles which constitute a foundation for knowledge is still an open question. Foundationalists such as Roderick Chisholm (1977) have argued that there are "unmoved movers" within the epistemic realm.

    [28] see Poplin (1979), Chap. 7.

    [29] Similar to the Representative Theory of Perception or "sense datum" theory of Russell, Ayer and others.

    [30] Russell (1962a), pp. 83, 85.


    [31] Anthony Quinton, quoted in L. Bonjour (1985), p. 18.

    [32] Bertrand Russell (1912), pp. 149-51.

    [33] Bertrand Russell (1927), p. 1.

    [34] A. J. Ayer (1956), p. viii.

    [35] A. J. Ayer (1975), p. 1.

    [36] John L. Pollock (1974), p. 5.

    [37] see J. Schulte (1992), pp. 142-146.

    [38] Ludwig Wittgenstein, On Certainty, sections 234, 498.

    [39] On Certainty, sec. 88

    [40] Ludwig Wittgenstein, On Certainty, Sec. 250.

    [41] Roger Trigg (1973), p. 149.

    [42] Thomas Nagel (1986), pp. 211, 213, 218.

    [43] Neurath’s boat is an anti-foundationalist, coherentist epistemic stance originated by Otto Neurath and later adopted by Quine. In this powerful metaphor a system of beliefs is compared to a boat that must be repaired at sea so that we are never able to start at the bottom. Any part can be replaced, provided there is enough of the rest on which to remain afloat.


    [44] To say that our lives can be viewed sub specie aeternitatis and given some external meaning or purpose can only provide an explanation of the facts of existence - it cannot offer a justification of these facts. There is no essential difference, for example, between a teleological explanation of events and a mechanical explanation. Teleological explanations of human behaviour are irreducible because in explaining, teleologically or otherwise, we are still showing how things are; we are not providing any justification. In any event, is it really compatible with human dignity to be made for something? What you are for is an insult! Finally, is it not infantile to go on looking for some authority, some order, that will lift all the burden of creativity and decision from you?

    [45] Thomas Nagel (1979), p. 17.

    [46] Many theists assert that ethics cannot do without religion because the very meaning of "good" is nothing other than "what God approves", thus reducing ethics to a mere tautology. Plato, in the Euthyphro, refuted similar claims more than 2000 years ago by arguing that if the gods approve of some actions it must be because those actions are good, in which case it cannot be the gods' approval that makes them good. The moral irrelevance of the appeal to authority can be brought out best, perhaps, in this way: if there is a God who is the source of moral commands, then either God has reasons for commanding what he does or he does not. If God has no reasons for giving the commands he does, there does not seem to be any good reason why we should obey them other than our fear of His power. This latter view reduces ethics to prudence and makes divine approval entirely arbitrary: if God approved torture of innocent children and disapproved of honesty, then torture of innocent children would be good and honesty bad. But if God commands something because it is good, this implies that something can be good independently of God. Why? Because "God commands it because it is good" implies that God apprehends it to be good and then tells us to do it. But if God does this, then it is at least logically possible for us to see or in some way know or come to appreciate that it is good without God's telling us to do it.

    [47] John Kekes (1976), p. 241.

    [48] The writings of the Renaissance humanists displayed an easy going open-mindedness and skeptical tolerance that were distinctive features of this new lay culture. Their ways of thinking were not subject to the demands of ecclesiastical duty and they regarded human affairs in a clear-headed, non-judgemental light. This led to honest practical doubt about the value of "theory" for human experience - whether in theology, natural philosophy, metaphysics, or ethics. In spirit, their critique was not hostile to the practice of religion, just so long as this was informed by a proper feeling for the limits of the practical and intellectual powers of human beings. These sixteenth century followers of classical skepticism never claimed to refute rival philosophical positions. Such views do not lend themselves either to proof or to refutation. Rather, what they had to offer was a new way of understanding human life and motives. Like Socrates, and Wittgenstein in our own time, they tried to show people how to avoid dogmatism and recognize how philosophical theories often overreach the limits of human rationality.

    [49] Francis Bacon, Novum Organum, pp. 5-10, 19-36.

    [50] Bacon, p. 41.

    [51] John Passmore (1968), p. 95.

    [52] H. H. Price (1969), p. 131. This normative rule was also endorsed by Hume (1748), for example, who states that "a wise man...proportions his beliefs to the evidence." (Enquiries Concerning Human Understanding, Sec X, Part 1, p. 110.)

    [53] H. H. Price (1969), pp. 133, 155-56. Similar views were held by Bertrand Russell (1966) who states: "The true precept of veracity which includes both the pursuit of truth and the avoidance of error is this: We ought to give every proposition which we consider as nearly as possible that degree of credence which is warranted by the probability it acquires from the evidence known to us" (p. 86). And W. V. Quine (1978) writes: "Insofar as we are rational in our beliefs, the intensity of belief will tend to correspond to the firmness of the available evidence. Insofar as we are rational, we will drop a belief when we have tried in vain to find evidence for it" (p. 16).

    [54] Karl Popper (1963), Chapter I.

    [55] quoted in R. Chisholm (1966),  p. 225.

    [56] C. S. Peirce (1877), in P. P. Wiener, ed., p. 111.

    [57] A. J. Ayer (1956), p. 31.

    [58] Ayer, p. 22.

    [59] Ayer, p. 17.

    [60] C.I. Lewis (1969), p. 163.

    [61] C.I. Lewis (1955), p. 27.

    [62] Lewis (1955), pp. 88-89.

    [63] Roderick Chisholm (1966), p. 227.

    [64] R. Chisholm (1969), p. 4.

    [65] R. Chisholm (1980) "A Version of Foundationalism" in Midwest Studies in Philosophy V

    [66] Hilary Kornblith (1983) has stated that "questions of justification are...questions about the ethics of belief." (p. 34)

    [67] Lorraine Code (1987), p. 53.

    [68] Code (1987), p. 20.

    [69] Clifford contracted tuberculosis and, after a long battle with the disease, died on March 8, 1879.

    [70] W. K. Clifford (1877). Lectures and Essays, Vol. 2, 3rd Edi­tion (1901), p. 174.

    [71] Clifford, p. 175.

    [72] Clifford, p. 164.

    [73] Clifford, p. 164.

    [74] Clifford, pp. 164-65.

    [75] Clifford, p. 168.

    [76] Lorraine Code (1987), p. 17. Code's account of Philip Gosse relies heavily on the book authored by his son, Edmund Gosse, Father and Son (Harmondsworth, Middlesex: Penguin, 1970).

    [77] Code (1987), p. 23.

    [78] Many of these attempts, such as the Anthropic Principle, are reformulations and revisions of the old Cosmological and Design Arguments for the existence of God, arguments dispensed with by Hume over 200 years ago in his Dialogues Concerning Natural Religion. The Anthropic Principle states that the Universe is the way it is because we exist; that is, it invokes the presence of humans as an explanation of the way the universe is. It has obviously not occurred to the proponents of this theory that we exist because the Universe is the way it is. The Anthropic Principle really explains nothing, and in the end it reduces to some of our species' oldest intellectual pathologies: egocentrism and the quest for certainty. Scientists such as Paul Davies (1992) who write books about the Universe and God (The Mind of God) would be better off writing more scientifically and less metaphorically and poetically. Davies is a scientist with strong spiritualist and metaphysical leanings who insists that the Universe must have some intrinsic meaning or sense; that is, sense and meaning that is congruent with his notion of sense and meaning. Now, I do not know what the Universe has in mind, but whatever it does have in mind, it is highly unlikely to be anything like what Davies or anyone else on our insignificant planet has in his. And, considering what most people have in their minds, which generally lacks any depth of understanding or profundity, it is just as well. For the theistically inclined, the journey into the world revealed by physics ends with God; for a materialist, the symmetry and order revealed at the heart of existence would only be marred by the addition of an extraneous deity. Both are mistaken. Physics is not a device for discovering a primal simplicity, and the complexity of life is not a veil to brush aside in gaining a vision of True Reality.

    [79] Robert Nozick (1993), pp. 101-102.

    [80] C.D. Batson (1975), p. 179.

    [81] Bertrand Russell (1962b), p. 11.

    [82] William James (1897) "The Will to Believe." in Pragmatism and Other Essays, 1963, p. 200.

    [83]  But one must be careful about inferences such as: "I ought to believe in X, therefore X is the case."

    [84] William James (1897), p. 209.

    [85] Eamon Callan (1984), p. 71.

    [86] This point will be reinforced later when I discuss the dispositional requirements for skepticism, critical thinking and belief.

    [87] Clifford, pp. 169, 173.

    [88] Clifford, pp. 173-74.

    [89] Stephen J. Gould (1987), p. 245.

    [90] Clifford, pp. 177-78.

    [91] Clifford, p. 178.

    [92] Antony Flew (1975), Thinking About Thinking, p. 62.

    [93] Lorraine Code (1987), p. 248.

    [94] David Hume (1748), Enquiries, sec. X, part 1, pp. 117-118.

    [95] Price, p. 213.

    [96] Price, p. 214.

    [97] Quine & Ullian (1978), pp. 12-13.

    [98] For Gilbert Ryle, it is a mistake to think of a belief as any kind of private mental state, activity or occurrence. Beliefs are dispositions, whereas knowledge is more akin to ability. According to Ryle's account in The Concept of Mind (1949), a person has a disposition if he is inclined to speak and behave in a particular way. In light of the limitations of this thesis, I shall sidestep the difficult analysis of the concept of belief and follow the lead of Ryle and H. H. Price. Price (1969) states that "A believes that P" is to attribute a multiform disposition to A which is manifested or actualized in many different ways: not only in actions but in emotional states, feelings of doubt, surprise, confidence, and inferences. Wittgenstein argues that we do not acquire our beliefs by being dragged, kicking and screaming as it were, out of skepticism (skeptics are made, not born). Nor do we carefully weigh the evidence of every proposition recommended to us. Rather, our culture teaches us to organize our experience in certain ways by giving us conceptions, rules of use, names, and so on. We acquire a picture of the world; that is, a loosely connected network of propositions in which the consequences and premises are mutually supporting. (Wittgenstein, On Certainty, p. 21.) It is against this background that doubt arises, either because what we expect is contradicted by our experiences in the world, or because we find ourselves entertaining propositions that are, or whose consequences are, contradictory. In other words, we begin by believing, and we must have grounds for skepticism. Clifford would argue that it should be the opposite.

[99] Kierkegaard, in attempting to create a barrier for Christianity against the presumptions and incursions of rationalism, has done so at the cost of giving no grounds for preferring Christianity to any other religion or system of belief and even robbing it of all serious pretensions to credibility. Obviously a good deal depends here on how the ideas in question are taken. It is one thing to regard acceptance of the Christian faith as commitment to a self-contained sphere or Wittgenstein's “form of life”, not itself finally justifiable by external criteria or modes of assessment; it is another to treat its content as being in some sense essentially paradoxical, avowedly “absurd” or contradictory. In so far as Kierkegaard subscribed to the second, and not merely the first, of these positions, his standpoint has been felt - not unnaturally - to present special problems.


    [100] It is interesting to note that for classical Greek philosophy, as for Plato, faith (pistis) is the lowest form of belief, characteristic only of the wholly uneducated, who fail to reflect critically on what they experience or are told. The Jewish-inspired Christian emphasis on faith struck educated pagan observers with astonishment; it represented, in their eyes, the extreme of anti-intellectualism and foolishness.

    [101] Bernard Williams (1973), p. 148.

    [102] Evidence for a belief must be distinguished from the motives and causes of belief; for some causes of belief can be counted as evidence and some cannot. When someone is said to have some reason for believing a certain proposition, we may need to ask whether this reason is a ground for holding that the proposition is actu­ally true or whether it is a motive for persuading himself of it, irrespective of whether it is true or not. In the former case we can speak of a reason (ground), in the latter of a reason (motive). Many beliefs are caused by social factors such as what we have been taught by our elders, or "picked up from our peers by social osmosis." (Antony Flew (1982), p. 367-69; also Flew (1975), p. 58.)

    [103] W. V. Quine & J. S. Ullian (1978), p. 15.

    [104] Clifford, pp. 183-84.

    [105] Antony Flew (1975), p. 115.

    [106] F. Nietzsche (1895), The Antichrist, Sec. 52, p. 169. Sart­re, in Being and Nothingness (Pt 1, chapter 2, pp. 86-118) equated faith with "bad faith."

 [107] Of course it is a fundamental point of logic that one cannot prove a universal negative. I cannot, for example, prove the non-existence of the Tooth Fairy or of super-intelligent invisible green goblins residing on the planet Neptune.


    [108] Michael Scriven (1966), p. 103.

    [109] Scriven (1966), pp. 103-104. Scriven would argue that agnosticism is really a confused position. The self-styled agnostic who suspends judgement about the existence of God while asserting without hesitation that, of course, Aphrodite, Zeus and Satan do not exist, and that there are no angels or mermaids, is confused: he takes it for granted that Satan and angels have to be conceived anthropomorphically, while God must not be considered that way. However, once we leave the absurdly false but intelligible claims of a very anthropomorphic and religiously and rationally unacceptable theism, we get versions of Christianity and Judaism whose central claims are such that it cannot be ascertained under what conditions they would be false or probably false, and whose logical status, and indeed very intelligibility and coherence, is problematical or anomalous. It therefore seems that the agnostic, unless he is willing to suspend judgement about (rather than deny) extraordinary claims such as the existence of Fairies, Bigfoot, the Easter Bunny, Santa Claus, the Loch Ness Monster, disembodied spirits and a host of other alleged paranormal entities, should also be willing to deny the existence of God. Scriven points out that at least Bigfoot and the Loch Ness monster are not alleged to possess any powers or attributes of an utterly unprecedented sort. But when a claim asserts the existence of something that is greatly at odds with our previous experience and our best scientific knowledge, we rightly regard the claim as very probably false until we are provided with truly strong evidence in its favour. Thus we are not too skeptical when we read in the newspapers that the world record for the 100 metre dash has been lowered by one-tenth of a second. However, one would be considered credulous to accept uncritically a newspaper report of someone sprinting across Canada in one day.
The same would hold for claims that human beings can be cured of a terminal illness by Gregorian chants or by the words of an unhinged evangelist faith healer. The relevance of these considerations to theistic claims of an omnipotent, omniscient deity that does not exist in space or time (but can act in space and time) is obvious.

    [110] Pascal's Wager: "If God does not exist, we can still believe in Him with impunity, but if he does exist, we doubt him at our peril; therefore it is the counsel of prudence to believe in God." (Quine & Ullian (1978), p. 61) It would seem that the argument is aimed at convincing open-minded self-interested skeptics, whose coolness about their prospects for immortality horrifies Pascal, that they should become involved on the side of those Christians committed to immortality - but it seems to me no such person would accept the premises. It seems more likely that the argument is intended not for the skeptic who is satisfied with this world, but rather for the person who is conscious of the miserable human condition. It is certainly consistent for a self-interested rational skeptic to feel unhappy about man's lot. But even if he had the appropriate existential human longings, the rational skeptic must find Pascal's argument invalid. The primary source can be found in Pascal's Pensees and the Provincial Letters, trans. W. F. Trotter, New York, 1941, pp. 79-85 (Sections 233-241 of the Pensees).

    [111] See Walter Kaufmann (1961), pp. 170-72, David Walker (1992), pp. 311-12, Thomas V. Morris (1986), pp. 437-454, and Robert M. Martin (1992), pp. 20-23 for excellent discussions of Pascal's wager.

    [112] Bertrand Russell, Philosophical Essays, p. 86.

    [113] Brand Blanshard (1974). Reason and Belief, p. 424.

    [114] R. P. Abelson (1986), pp. 222-50.

    [115] More people in the United States believe in ESP than in Evolution. ("Gallup Poll of Beliefs" (1989) Skeptical Inquirer, vol. 13 (3), pp. 244-45 and "Scientific Literacy" (1989) Skeptical Inquirer, vol. 13 (4), pp. 343-45.)

    [116] The French mathematician and astronomer Pierre Laplace once used the principle to calculate the odds of the sun rising tomorrow at 1,826,214 to 1.
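Laplace's figure can be reconstructed from his rule of succession: after n consecutive successes, the probability of success on the next trial is (n + 1)/(n + 2), which corresponds to odds of (n + 1) to 1 in favour. A minimal sketch, assuming (as Laplace did) roughly 5,000 years of recorded daily sunrises:

```python
from fractions import Fraction

def rule_of_succession(n):
    # Laplace's rule: P(success on trial n+1 | n consecutive successes)
    return Fraction(n + 1, n + 2)

days = 1_826_213          # roughly 5,000 years of recorded history, in days
p = rule_of_succession(days)
odds_in_favour = p / (1 - p)
print(odds_in_favour)     # 1826214, i.e. odds of 1,826,214 to 1
```

Laplace himself added that the odds are far greater for one who understands the mechanics of the solar system, since repetition alone is then no longer our only ground for expectation.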


    [117] It would seem that Pascal must have considered the probability of the Christian God's existence to be sufficiently high, otherwise his wager argument would carry little weight. In any event, any such probability would be a priori and highly speculative, and if the odds of the Christian God's existence were extremely low, say 1 in 10¹⁰, then the wager argument could not take flight.
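The point of this note can be made precise with an expected-utility calculation. This is an illustrative sketch only: the payoff figures are invented for the example, and a defender of the wager would insist the reward is infinite, in which case any nonzero prior suffices. With finite payoffs, however, a sufficiently low prior sinks the wager:

```python
def expected_utility(p_god, payoff_if_god, payoff_if_not):
    # Standard expected-value calculation over the two possibilities.
    return p_god * payoff_if_god + (1 - p_god) * payoff_if_not

p = 1e-10                                     # the note's "1 in 10^10" prior
eu_believe = expected_utility(p, 1e6, -1)     # great reward vs. cost of piety
eu_disbelieve = expected_utility(p, -1e6, 1)  # great penalty vs. worldly gain
print(eu_believe < eu_disbelieve)             # True: wagering does not pay
```

On these (hypothetical) numbers the expected utility of belief is roughly -1 and that of disbelief roughly +1, so the prudential argument fails unless the payoff is made unboundedly large.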


   [118] H. L. Mencken (1982), p. 98.


    [119] Many would argue that basic religious concepts and propositions are unintelligible or, at best, incoherent and therefore incapable of being rational objects of belief. Hence, for a reflective and concerned human being to possess a reasonable scientific and philosophical understanding of the world, some form of agnosticism or atheism is the most non-evasive option for such a person.

    [120] More recently, Richard Rorty (1979), in his deconstruction of the Kantian tradition of foundationalism, has argued for a more "holistic" approach to epistemology. Rorty, arguably one of the most influential contemporary philosophers, would essentially be in agreement with James' theory of truth and claim that nothing of major importance turns on the distinction between epistemic and non-epistemic criteria for beliefs.

    [121] William James (1896), "Pragmatism's Conception of Truth" in Pragmatism and Other Essays, (1963), p. 98.

    [122] William James (1896), quoted in Scheffler (1974), p. 104.

    [123] William James (1896), "What Pragmatism Means" in Pragmatism and Other Essays (1963), pp. 36-37. Bertrand Russell summed up James' philosophy as "A truth is anything which it pays to believe." (Philosophical Essays, p. 118) He concluded that the only reason a Jamesian pragmatist would believe the proposition "People exist" would be to avoid solipsism. (Ibid., p. 122)

    [124] F. Nietzsche (1888), The Will to Power, Sec. 483.

    [125] F. Nietzsche (1885), Twilight of the Idols, p. 53.

    [126] J. Guinlock, ed. (1976), p. xxxv.

    [127] John Dewey, "Absolutism to Experimentalism" in John J. McDermott, ed., Vol. I, 1973, p. 7.

    [128] John Dewey (1925), p. 427.

    [129] John Dewey (1925), p. 410.

    [130] Israel Scheffler (1974), p. 108.

    [131] The problem of evil poses a very awkward question for anyone who wants to assert, literally, the full traditional set of theistic doctrines. According to traditional theism there is a god who is omnipotent, omniscient and wholly good, and yet there is evil in the world. How can this be? The problem of evil is essentially a logical problem: it sets the theist the task of clarifying and reconciling the inconsistent beliefs which he holds. If God has the power to eliminate evil, but the evil still exists, it must follow that he does not want to eliminate evil. But in that case he cannot be perfectly good. On the other hand, if God wants to eliminate evil, but evil still exists, it must be the case that God cannot eliminate evil. But then he cannot be omnipotent. Finally, if God wants to eliminate evil and has the power to do so, evil will not exist. But evil does exist, and so we have to reject either the claim that God is omnipotent or the claim that God is perfectly good. However, since omnipotence and perfect goodness are part of the definition of the term "God", to surrender either of these claims is to surrender belief in God. Hence, the existence of evil seems to constitute a disproof of the existence of God. The attempts of Christian apologists to answer the argument from evil, beginning with the theodicy of Leibniz, have been subjected to everything from the satirical critiques of Voltaire to compelling counter-arguments from contemporary analytic philosophers such as J. L. Mackie, Antony Flew and Kai Nielsen. Furthermore, if a successful rebuttal to the argument from evil cannot be made, a posteriori arguments like the cosmological and design arguments become superfluous. What good would it do to prove the existence of an Uncaused Cause or Great Designer so long as evil remains apparently unjustified?
In such a case, these arguments, instead of being arguments for the existence of a God, would seem to support the existence of something like Descartes' evil demon.
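The inconsistency this note describes can even be checked mechanically: treat the three theistic claims and the bridge premise as propositions and search every truth assignment. This is a minimal sketch (the propositional encoding of the premises is my own):

```python
from itertools import product

# can   = God can eliminate evil (omnipotence)
# wants = God wants to eliminate evil (perfect goodness)
# evil  = evil exists
premises = [
    lambda can, wants, evil: can,
    lambda can, wants, evil: wants,
    lambda can, wants, evil: evil,
    # Bridge premise: if God both can and wants to, evil does not exist.
    lambda can, wants, evil: not (can and wants) or not evil,
]

# Exhaustively check all 2**3 truth assignments for joint satisfiability.
consistent = any(
    all(p(*vals) for p in premises)
    for vals in product([True, False], repeat=3)
)
print(consistent)   # False: no assignment satisfies all four premises
```

Dropping any one premise restores consistency, which mirrors the note's dilemma: the theist must surrender omnipotence, perfect goodness, the bridge premise, or the existence of evil.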

    [132] Skyrms (1986), p. 195.

    [133] Quine (1978), p. 122. Also see William Hare's (1990) excellent article on the Keegstra case. (p. 386)

    [134] F. Nietzsche (1886), Human, All too Human, sec. 483, p. 179.

    [135] C. S. Peirce (1877) in P.P. Wiener, ed., p. 111.

    [136] I will deal specifically with Rorty's notion of truth later in this thesis.

    [137] It should be pointed out that rejecting Cartesian dualism, in any or all of its various forms, does not necessarily entail rejecting epistemic realism.

    [138] Paul Feyerabend (1987), p. 13.

    [139] P. Feyerabend (1987), pp. 297, 301.

    [140] Feyerabend (1987), p. 309.

    [141] Feyerabend (1987), pp. 309-311.

    [142] Feyerabend (1987), p. 20.

    [143] Martin Gardner (1983), p. 272.

    [144] Feyerabend (1978), pp. 221-22. See Ernest Nagel (1979) in Teleology Revisited for a restrained attack on Feyerabend's views. Also, see Harvey Siegel (1987) in Relativism Refuted.

    [145]  I have presented arguments similar to those offered by John Passmore (1969), Philosophical Reasoning, pp. 63-69.

    [146] Russell (1966), p. 125.

    [147] Roger Trigg (1973), pp. 91, 166.

    [148] The "death by a thousand qualifications" is the attempt to protect an assertion or hypothesis from falsification or refutation by various reinterpretations and redefinitions. In other words, it amounts to explaining away any and all objections to a claim or hypothesis by ad hoc addenda. Hence, once all the relevant or necessary qualifications are added to it, an assertion is virtually killed and its content reduced to vacuity.

    [149] On Certainty, secs. 141, 142, 144. Coherentism, however, encounters considerable difficulties if taken at face value, for there are any number of coherent bodies of belief that are mutually inconsistent. For example, suppose that both Christianity and Hinduism are internally coherent. Yet they cannot both be true, since there cannot be both one God and many. In that case it is very difficult indeed to see how the mere fact of the coherence of Christian belief could render it justified, given that there are other, incompatible belief systems that are equally coherent; there would be no justification for being a Christian rather than a Hindu.

    [150] On Certainty, sec. 102.

    [151] W. V. Quine (1978), p. 126.

    [152] Christopher Coope (1974), in G. Vesey, ed., p. 264.

    [153] A bias is a "disposition to underestimate or overestimate in a particular direction" and "as such, can be recognized and systematically compensated for; just as prejudices can be identified and open-mindedly examined by all those who prefer their beliefs to be, even if uncomfortable, well-evidenced and, hopefully, true." (Antony Flew (1992), p. 208.)

    [154] Emergence from a closed tribal society, in which social arrangements are simply deemed part of a given natural or divine order, imposes severe strains and difficulties on aboriginal peoples, just as it has on every culture that has made, or has failed to make, the transition. Is it reasonable, for example, given the present state of our natural environment, to grant unrestricted hunting and fishing privileges to any group or individual? The argument by our aboriginal peoples that the land and its bounty is divinely bequeathed to them is beginning to wear a little thin.

    [155] Wittgenstein, On Certainty, sec. 612.

    [156] Wittgenstein, On Certainty, sec. 107.

    [157] Donald Davidson (1973-74), pp. 5-20 (reprinted in Meiland & Krausz, pp. 66-80). Davidson states that there is no good reason for accepting Thomas Kuhn's claims of incommensurability. We come to understand the languages of others, including people from very different cultures with very different languages, in basically the same way we come to understand our own language, namely by systematically coming to understand the truth conditions of the sentences in the language in question. To understand the language of another is to follow a systematic method for generating the truth conditions of her declarative sentences.

    [158] Gilbert Harman (1977), The Nature of Morality, p. 131.

    [159] John Wilson (1986), "Relativism and Teaching", p. 91.

    [160] John Wilson (1986), p. 92.

    [161] On Certainty, sec. 336.

    [162] On Certainty, sec. 366.

    [163] R. M. Hare (1992), p. 33.

    [164] John Dewey (1922), pp. 330-331.

    [165] Dewey (1922), p. 331.

    [166] John Dewey, Later Works, vol. 15, p. 58.

    [167] John Dewey (1934), pp. 35-36.

    [168] Dewey (1934), pp. 8-11.

    [169] A recent news item on the sports channel TSN explained why the CFL decided to have a football team in Las Vegas, whereas the NFL had always declined to do so. The explanation was: "there is no reason to be concerned about moral depravity in Las Vegas (despite its reputation for gambling and other moral vices) because there are more churches in Las Vegas per capita than in any other major city in the United States."

    [170] Martin Luther, for example, called reason "the Devil's bride" and "God's worst enemy." Luther's anti-rationalism is a recurring theme in Protestantism, and there are fundamentalists and extremists who take him at his word. Similarly, Kierkegaard regarded philosophy as a great threat, critical thinking as insubordination, and reason as the enemy. Objections to Christianity do not issue from doubt, as many people think - "Objections against Christianity come from insubordination, unwillingness to obey, rebellion against all authority." (Journals, sec. 630) Kierkegaard rejected reason and philosophy because they were unable to tell him what ideas he should live and die by. What he, like millions of others, overlooks is a very simple but important point: reason and philosophy may well safeguard people against ideas for which they might better not live or die. Simply because reason cannot always tell us what to do, should we conclude that deliberation, reflection, evidence and argument are a waste of time and abdicate our intellectual responsibility in favour of faith in absolutes? One of the chief aims of education is to make people more responsible. The person who does reflect on the probable effects of his decisions on the people who are likely to be affected, who relies on reason, evidence and argument, if only to eliminate some choices, acts responsibly even if he later finds that he has done the wrong thing. Moreover, the refusal ever to let argument and evidence bear on faith can be deeply disturbing. Faith such as that of Luther, Kierkegaard or Pascal is blind, in the sense of not being guided, or revisable, by reasoning either from conditions independent of the believer's state of faith or from the counsel of fellow human beings who are not so sure. Religious faith, it would seem, is intellectual closure - a conversation stopper - and in this sense is indistinguishable from fanaticism.
The fanatic is one who is committed to a train of thought or course of action regardless of any contrary counsel or caution, thus breaking off a crucial kind of communication or discourse with others, one that accords them equal respect in the weight of opinion and action. Fanaticism tears at the fabric of community, solidarity and the democratic spirit. So too does fideism, if and insofar as it vilifies the counsel of others deemed beyond the pale.

    [171] Rorty (1982) Consequences of Pragmatism, p. 166.

    [172] Roderick Chisholm (1966), pp. 224-225. Chisholm is not suggesting that beliefs themselves are "acts" but is alluding to the rational practices and standards which ought to be invoked during our deliberations regarding belief acquisition.

    [173] Lorraine Code (1988), p. 161.

    [174] One must not confuse the explication of rationality revealed so far with the philosophical position of Rationalism; i.e., the thesis that the general nature of the world can be established by wholly non-empirical demonstrative reasoning - e.g., Descartes, Leibniz and Spinoza.

    [175] Harvey Siegel (1988), p. 129.

    [176] Siegel (1988), p. 129.

    [177] D. Pole (1972), "The Concept of Reason" in Dearden, Hirst & Peters, eds. Education and the Development of Reason, p. 168.

    [178] Robert Nozick (1993), p. 139. In this recent work Nozick does take steps toward the development of a theory of substantive rationality of goals and desires. (see pp. 140-151)

    [179] Charles Taylor (1991), p. 5.

    [180] Bertrand Russell (1947), p. 73.

    [181] Bertrand Russell (1947), p. 73.

    [182] Bertrand Russell (1947), p. 74.

    [183] Siegel (1988), p. 130.

    [184] Charles Taylor (1991), p. 10.

    [185] E. F. Schumacher (1977), p. 53.

    [186] Taylor (1991), p. 103.

    [187]  E. F. Schumacher (1977), p. 58.

    [188] Morris Berman (1989), p. 34.

    [189] Taylor (1991), pp. 6, 8.

    [190] Siegel (1988), p. 131.

    [191] William James' pragmatism is the philosophy which holds that the only valid test of truth is that "it works", a philosophy which is, for the most part, "preoccupied with means." (Bertrand Russell, Authority and the Individual, p. 72) In other words, truth is synonymous with utility, or at least, utility is sufficient for truth. Russell succinctly summed up James' pragmatic theory of truth as "A truth is anything which it pays to believe." (Philosophical Essays, p. 118), adding that the only reason a Jamesian pragmatist would believe the proposition "people exist" would be in order to avoid solipsism. (Ibid., p. 122)

    [192] Karl Popper views the idea of truth-seeking and objective criticism as key ingredients in rationality in that they make rational discourse possible. But he sees rational discourse as possible only if one assumes the existence of an objective reality, "a challenge to our intellectual ingenuity, courage and integrity" (1983, p. 81). Rationality consists of accepting fallibility and learning from our experiences, especially our mistakes. Popper (1989) states that "... each of us makes mistakes, serious mistakes, all the time. We should remember what Voltaire said: "What is toleration?" asks Voltaire. And he answers: "It is a necessary consequence of our being human [and therefore fallible]. We are the products of frailty: fallible and prone to error. So let us mutually pardon each other's stupidities. This is the first principle of the law of nature [the first principle of human rights]."


   Voltaire's principle of tolerance is, indeed, the basis of all rational discussion. Without it, rational discussion is impossible. And it is the basis of all self-education. Without consciously admitting our fallibility to ourselves, we cannot learn from our mistakes; we become infallible dogmatists." (p. 281)


     Popper endorsed what one would call weak fallibilism, the view that "it is logically impossible to exclude the possibility of error in any conclusion arrived at by reasoning" (Kekes (1976), p. 77). Hence, this view does not exclude rational belief or the possibility of knowledge. Weak fallibilism simply instructs us to be cognizant of error and advises that "all beliefs be tentatively held", whereas strong fallibilism "denies that reasoning can provide grounds for the acceptance of any belief" (Kekes, p. 77).

    [193] John Dewey (1916), Democracy and Education, 1966, p. 189.

    [194] Robert Nozick (1993), pp. 175-76. Nozick proposes two central rules governing rational belief: (1) "not believing any statement less credible than some incompatible alternative - the intellectual component", but (2) "then believing a statement only if the expected utility (or decision-value) of doing so is greater than that of not believing it - the practical component." (pp. xiv, 85-93)

    [195] In the Nicomachean Ethics, Aristotle says that intellectual virtues are not concerned with a mean as the moral virtues are. In the Eudemian Ethics, however, he says that the intellectual virtue of phronesis (practical wisdom) is a mean between cunning and folly.

    [196] Of course, in our everyday conversations we often use the word "certain" in the sense in which we are making a claim to knowledge. When we say that "I know the sun will rise tomorrow" we mean "I am certain that the sun will rise tomorrow", without digressing into the philosophical conundrums concerning Induction.

    [197] John Kekes (1976), p. 256.

  [198] It is worth noting that we never in fact insist upon absolute certainty in our practical lives, no matter how much may be at stake. For example, imagine yourself as a juror on a murder case in a country that retains the death penalty, and imagine that the evidence against the defendant is, as we often say, overwhelming. Nevertheless, as a scrupulous person, you ask yourself whether you really know that the defendant is guilty. Moreover, suppose one of the jurors has proposed an alternative hypothesis - namely, that a group of super-intelligent aliens may have landed on earth undetected and planted all the evidence against the defendant for purposes known only to themselves. This hypothesis is certainly a possible one, in the same way as is Hilary Putnam's hypothesis of the brain in a vat. It therefore demonstrates that your belief that the defendant is guilty is not absolutely certain. But plainly the alternative proposal would not even cause you to hesitate in handing down a guilty verdict, despite the fact that someone's life hangs in the balance.


    [199] Larry Laudan (1977), p. 123.

    [200] Ludwig Wittgenstein, On Certainty, Sec. 243.

    [201] Israel Scheffler (1973), p. 76.

    [202] Robert Nozick (1993), p. xiii.

    [203] Scheffler (1973), p. 80.

    [204] José Ortega y Gasset (1929), pp. 71-72.

    [205] Ortega y Gasset (1929), pp. 73-74.

    [206] Harvey Siegel (1987), Relativism Refuted, pp. 33-34.

    [207] Roger Trigg (1973), p. 151.

    [208] Roger Trigg (1973), pp. 152-53.

    [209] John Kekes (1976), pp. 168, 190.

    [210] Quine, quoted in Siegel (1987), p. 43.

    [211] Siegel (1987), p. 138.

    [212] Siegel (1987), p. 138.

    [213] Similar self-referential paradoxes arise with the question "Why be rational?" It can be argued that the very ability to pose the question displays evidence of rationality, at least to some minimal degree, for to ask the question seriously is to seek, and commit oneself to, reasons which might answer it. This stresses the fact that rationality is, in an important sense, self-justifying; to inquire about its rational status is eo ipso to commit oneself to it. (Siegel (1988), pp. 88, 132, 167.) Nozick (1993) writes that "one answer would be that we are rational, we have the capacity to act rationally, and we value what we are." (p. 40) Karl Popper's argument is that the case for universal rationality cannot be supported by either logical argument or empirical evidence, because only those who already have some commitment to reason will be influenced by them. Despite the force of Popper's argument, this should not be used as an excuse for a pernicious relativism concerning rationality, since frequent exposure to argumentation (both inductive and deductive) and to the use of evidence is, for most people, a precursor to accepting their validity. (Karl Popper (1945), Vol. II, pp. 230 ff.)

    [214] Rorty (1979), pp. 175-76.

    [215] Jeffrey Stout (1988), p. 247.

    [216] Positivism, despite its rejection of Metaphysics, is, in a certain sense, a metaphysical position: it claims for Science the same God's-eye view of reality which had formerly been claimed by metaphysical systems and religion. It does not recognize that Science has not only destroyed the claims of Metaphysics and Religion to this status, it has destroyed the status itself. (see Pole in Dearden et al., p. 124)

    [217] Thomas Nagel (1986), The View From Nowhere.

    [218] Nagel (1986), p. 5.

    [219] Nagel (1986), p. 5.

    [220] Nagel (1986), p. 67.

    [221] See Rorty's discussion of this sense of objectivity in Chapter 8 of Philosophy and the Mirror of Nature.

    [222] Jean Piaget (1972), p. 34.

    [223] Roger Trigg (1989), pp. xxv, xxix-xxx.

    [224] Lorraine Code (1988), p. 160.

    [225] Roger Trigg (1989), pp. 209, 219.

    [226] The denial of absolutes, it should be remembered, entails paradoxes of self-reference and self-refutation. The paradoxes inherent in statements such as "There are no absolutes" and "Everything is relative" were dealt with by Socrates in the Platonic dialogues. See Siegel (1987), pp. 8-9, 18-19.

    [227] The appeal of foundationalist epistemologies appears to be rooted in the elegant deductive structures of mathematical systems. Euclidean geometry, to state a familiar example, has had a dramatic effect on the imagination of many philosophers in the way in which so much can be deduced from so little. The structure of Euclidean geometry suggests that there are some basic truths (axioms, postulates) which serve as the foundations upon which all else rests. But it is a mistake to think that axioms have any special epistemological status - the whole theory stands together and its credibility must be exposed to the bar of informal reason, plausibility and experience.

    [228] Siegel (1987), p. 160.

    [229] Siegel (1987), pp. 161-162.

    [230] Siegel (1987), p. 164.

    [231] Siegel (1987), p. 165.

    [232] Siegel (1987), p. 167.

    [233] Roger Trigg (1989), p. 64.

    [234] Rorty (1979), p. 361.

    [235] Rorty (1991), p. 22.

    [236] Richard Bernstein (1980), p. 771.

    [237] Bernstein (1980), p. 771.

    [238] John Dewey (1922) Human Nature and Conduct, Part I.

    [239] Dewey (1922), pp. 102-104.

    [240] Dewey (1922), p. 96.

    [241] Dewey (1922), pp. 97-98.

    [242] John Dewey (1938), p. 79.

    [243] If there is no epistemology, then certainly this precludes any notion of moral epistemology. If there are no foundations of knowledge, then talk of foundations in morals or politics would be an absurdity.

    [244] According to Rorty (1979), what philosophers call rationality is simply "the philosophical dogmas of the day." (p. 269.)

    [245] Kuhn's incommensurability thesis states that we simply cannot understand the statements uttered by those whose terms are not commensurable with ours and whose rules for reaching rational agreement differ from ours. The upshot is that rationality becomes relative to whatever is accepted as reasonable by a given group or culture at a certain time. Rorty's Wittgensteinian view that rational agreement is mere "persuasion" ignores the fact that any argument rests on the assumption that there is some common ground between the disputants, including rules of rational discourse. Rorty argues that Galileo's eventual victory over the Catholic Church in the debate over "the way the heavens are set up" (Philosophy and the Mirror of Nature, p. 329) was the result of his superior "rhetoric." (pp. 330-31) It seems to me that Rorty confuses producing a "proof" with "persuading" a person. A person may be persuaded by an "abominable argument," just as she may remain unconvinced by considerations which she surely would accept if only she were more rational or intellectually honest. (See Flew (1975), p. 57)

    [246] Rorty (1979), p. 269.

    [247] Rorty (1982), p. xiii.

    [248] Rorty (1979), p. 308.

    [249] W. Kaufmann (1961), p. 279.

    [250] Sabina Lovibond (1983), p. 148.

    [251] C. G. Prado (1987), p. 78.

    [252] Prado (1987), p. 79.

    [253] Prado (1987), pp. 81-82.

    [254] Roger Trigg (1989), p. 206.

    [255] J. Stout (1988), Ethics After Babel, p. 29.

    [256] J. Stout (1988), p. 30.

    [257] Antony Flew (1982), p. 371.

 [258] It is my view that any sane, rational person should not be teased into the untenable position that at least some questions as to what is good or bad for people, what is harmful or beneficial, are simply matters of opinion or taste. That it is a bad thing to be tortured or starved, humiliated or subjected to slavery, is not opinion; it is a fact. That it is better to be loved than despised, free rather than enslaved, is again a fact, a truism - not a matter of opinion. In this sense Hume’s claim that “no ought can be derived from an is” becomes unconvincing.


    [259] J. Stout (1988), p. 31.

    [260] Roger Trigg (1989), p. 187.

    [261] Dewey (1938), in Logic, wrote that "situations that are disturbed and troubled, confused and obscure, cannot be straightened out, cleared up and put in order, by manipulation of our personal states of mind." (p. 106) In other words, truth and evidence are not to be found in individualistic Cartesian musings, nor determined by the satisfactory, consolatory or practical outcomes of a proposition.

    [262] Dewey (1941), p. 172.

    [263] Dewey (1939), p. 572.

    [264] Dewey (1939), pp. 572-73.

    [265] Dewey (1941), p. 176.

    [266] see Dewey (1938), pp. 8-9.

    [267] Dewey (1941), p. 178.

    [268] Dewey (1941), pp. 178-79.

    [269] Roger Trigg (1989), p. 53.

    [270] Roger Trigg (1989), p. 150.

    [271] See Brian Carr (1981-82).

    [272] J. Habermas in R. Bernstein, ed. (1985), pp. 192-98.

    [273] R. Bernstein, ed. (1985), p. 193.

    [274] R. Bernstein, ed., p. 194.

    [275] R. Bernstein, ed., p. 195.

    [276] Richard Rorty (1989), p. 51.

    [277] See R. S. Peters (1964), "Education as Initiation" in Analysis and Education, R. Archambault, ed., New York: Humanities Press.

    [278] This is a term used by Israel Scheffler as defining the way things should be. Jonas Soltis (1977) has asserted that "...a search for the definition of education is most probably for a statement of the right or best program for education, and, as such, is a prescription for certain valued means or ends to be sought in education." (p. 9)

    [279] Maxine Greene (1976) has stated that "...fixed principles, like fixed ends, tend to close off inquiry" and "people who function habitually, according to rules that are seldom reflected upon, cannot think what they are doing." (p. 19)

    [280] in Philosophy of Education, W. K. Frankena, ed., New York: MacMillan, 1965, pp. 44-51. Originally printed in R. S. Peters, Authority, Responsibility, and Education, London: Allen & Unwin, 1963, pp. 83-95.

    [281] Eric Hoffer (1951), pp. 82-83.

    [282] Jonathan Rauch (1993), pp. 94, 28.

    [283] Eamon Callan (1988b), pp. 133-142.

    [284] Callan (1988b), p. 136.

    [285] Callan (1988b), p. 136.

    [286] Harvey Siegel (1984), p. 361.

    [287] Siegel (1984), p. 361.

    [288] Siegel (1984), pp. 360-61.

    [289] Callan (1988b), p. 141.

    [290] Richard Dawkins (1993), pp. 34, 37.

    [291] Richard Dawkins (1976), Chapter 11.

    [292] Jean Piaget (1965), pp. 141-142, 164-166.

    [293] Eamon Callan (1988a), p. 192.

    [294] Callan (1988a), pp. 193-194.

    [295] Callan (1988a), p. 193.

    [296] R. A. Eve & Dana Dunn (1990), p. 14.

    [297] R. A. Eve & Dana Dunn (1990), p. 20.

    [298] John Wilson (1986), "Relativism and Teaching", p. 95.

    [299] In a Gallup poll conducted in 1991, more than half of those surveyed said they believe in the Devil, three in four occasionally read their horoscopes in a newspaper, and one in four said they believe in the tenets of astrology. More than 70% believe in life after death, and only 57% do not believe in reincarnation. (Skeptical Inquirer, Winter 1991, pp. 137-146.)

    [300] Lorraine Code (1987) Epistemic Responsibility, p. 251.

    [301] Ernest Sosa (1985) defines an intellectual virtue as "a quality bound to help maximize one's surplus of truth over error" and "a subject-grounded ability to tell truth from error infallibly or at least reliably..." (p. 243) A faculty is "intellectually virtuous" if it does not "lead us astray in our quest for truth: that it outperforms feasible competitors in its truth/error delivery potential." (p. 229) (Sosa's condition of infallibility is, I would argue, too strong.) Sosa refers to his epistemology as "reliabilism", "the view that a belief is epistemologically justified if and only if it is produced or sustained by a cognitive process that reliably yields truth and avoids error." (p. 239) "What interests us in justification is essentially the trustworthiness and reliability of the subject with regard to the field of his judgement, in situations normal for judgements in that field." (p. 241) The problem with reliabilism, as I see it, is a problem of justification. A person may decide that Astrology is reliable because it has made 4 of 5 predictions correctly, or that prayer is reliable because 4 out of 5 prayers were "answered". Do these results justify belief in prayer or Astrology? Justification is a normative notion and hence precludes the utilitarian nature of reliabilism as a sufficient condition for justification. (Justification is a matter of having good reasons for beliefs.) As Nozick (1993) has stated, "reasons [for beliefs] without reliability seem empty, reliability without reasons seems blind" (p. 64). Nozick points out that a rational principle less reliable than another might be preferred because the latter might "prove disastrous when wrong" (pp. 135-36).

    [302] Robert Nozick makes this point in the preface to Anarchy, State and Utopia (New York: Basic Books, 1974), when he asserts that "intellectual honesty demands, occasionally at least, we go out of our way to confront strong arguments to our views." (p. x).

    [303] William Hare (1979) defines "open-mindedness" as a propensity and desire to formulate and revise one's beliefs in light of evidence and argument. (Chapter I)

    [304] Bertrand Russell (1962b), pp. 135-36.

    [305] Israel Scheffler (1974), p. 112.

    [306] C. S. Peirce quoted in Scheffler (1974), p. 113.

    [307] Karl Popper (1976), p. 21.

    [308] John McPeck (1981), p. 35.

    [309] McPeck (1981), pp. 7, 9, 13, 37, passim.

    [310] John McPeck (1990), p. 42.

    [311] H. Siegel (1980), p. 8.

    [312] Ennis (1962) defined critical thinking as "the correct assessing of statements" (p. 83), but more recently (1991) has defined it as "reasonable reflective thinking that is focused on deciding what to believe or do." (p. 6)

    [313] H. Siegel (1980), p. 8.

    [314] I. Scheffler (1982), p. 142.

    [315] B. Beyer (1985), pp. 272, 275.

    [316] B. Beyer (1990), p. 56.

    [317] Matthew Lipman (1988), p. 38.

    [318] M. Lipman (1988), p. 34.

    [319] Here Lipman draws upon C. S. Peirce's essay "Ideals of Conduct". In this essay Peirce discusses the connection between self-correcting inquiry, self-criticism, and self-control.

    [320] R. Paul (1987), p. 130.

    [321] R. S. Peters (1972), in Dearden et al, eds., p. 211.

    [322] D. Pole (1972), in Dearden et al, eds., pp. 155, 159.

    [323] David Hume (1748), Enquiries, sec. X, part 1, p. 114.

    [324] Hume (1748), Enquiries, sec. X, Part 2, pp. 115-16, 131. The Christian miracles of the Virgin Birth and Resurrection, for example, persist in spite of the fact that even the most conservative accounts are so far outside the normal course of events as to shatter mundane scientific views of the world. But ancient miracles are a dime a dozen. All we can do, as Hume would suggest, is observe that human credulity, fraud, mythmaking, and simple mistakes are far too common to let us take such reports at face value. Arguing over the literal Virgin Birth or Resurrection is about as tedious as discussing biology with creationists. The fact of the matter is that the literal Virgin Birth, Resurrection and Ascension are preposterous, not merely unproven. It is scarcely better than claiming that Hitler was not killed, but was spirited away by his allies from the Hollow Earth, where he plots his revenge. As Hume has pointed out, if someone were to present evidence for their truth, should we not do our best to dismiss or explain that evidence away? For if it were true, it would apparently be wholly inexplicable.

    [325] C. S. Peirce, in P.P. Wiener, ed., p. 99.

    [326] Peirce in P.P. Wiener, ed., p. 102.

    [327] Bertrand Russell (1962b), p. 115.

    [328] Russell (1962b), pp. 104-106.

    [329] R. Paul (1987), p. 140.

    [330] R. Paul (1987), p. 142.

    [331] See William Hare's (1992) essay "Humility as a Virtue in Teaching." Lorraine Code (1987) states that "Humility stands as a safeguard against whimsicality in judgement. Imagination is accorded sufficient scope to see the world and one's own efforts at achieving explanation in a wider context, but humility checks its possible excesses in either direction: toward whimsicality, or toward closed-minded dogmatism, tantamount to a failure of imagination." (p. 234)

    [332] quoted in R. W. Clark (1984).

    [333] Code, p. 136.

    [334] Code, p. 131.

    [335] Code, p. 161.

    [336] Code, p. 137.

    [337] William Hare (1992), p. 229.

 [338] Hare (1992), p. 228.

 [339] Hare (1992), p. 228.

 [340] Hare (1992), pp. 230-31. Mirroring Hare, Lorraine Code (1987) states that "epistemic integrity" is most strongly evident in the ability to be a "fallibilist" in the "Peircian sense", to be "cognizant of one's own potential fallibility even in the most painstakingly won conclusions, even in the nature of things that underlies them. The capacity to serve the intrinsic goods of the practice, to value a just perspective on how things are above one's own reputation and prestige is a significant mark of intellectual virtue." (p. 233)

 [341] James A. Montmarquet (1987), "Epistemic Virtue", p. 484.

 [342] Robert Nozick (1993), p. 174.

 [343] David Hume, Enquiries, Sec XII, Part 3, pp. 161-62.

 [344] C. S. Peirce in P. P. Wiener, ed., p. 189.

 [345] Bertrand Russell (1962b), p. 9.

 [346] Russell (1962b), pp. 104-106.

 [347] Russell (1962b), p. 9.

 [348] Bertrand Russell (1962b), p. 109.

 [349] B. Russell (1962b), p. 115.

 [350] Russell (1962b), p. 109.

 [351] F. Nietzsche (1886), Human, All Too Human, sec. 265, p. 125.

 [352] Eric Hoffer (1951), p. 23.

 [353] Russell, in making one of his logical points, declared that you cannot disprove that the universe was created five minutes ago.

 [354] W. E. H. Lecky (1866), Vol. I, p. vi.

 [355] Lecky (1866), Vol. I, pp. vi, vii.

 [356] Lecky (1866), Vol. I, p. xv.

 [357] Unfortunately, however, precious little room, if any, is made in the curriculum for teaching these important skills. Critical thinking is tossed around as an important educational aim during accreditation sessions, mission statement formulations and subsequent growth plan meetings, but nothing substantive is ever done. And whenever the implementation of a critical thinking program in the curriculum seems remotely likely to succeed, it is challenged by enraged conservatives of all stripes, especially the Christian variety. The separation of church and state is as much of a farce in Canada as it is in the United States, where it is written into the Constitution.

 [358] Rarely do students think of attempting to falsify a general principle such as this. For example, a simple substitution such as (3 + 5)² = 3² + 5² would suffice. (64 is not equal to 34!)

 [359] As the scientific method and the history of science make abundantly clear, science is misconstrued when it is viewed as a body of information or as a set of fixed conclusions. Yet, frequently that misconstrual is what students learn in their science classes.




          Abelson, R. P. (1986) "Beliefs are like Possessions." Journal for The Theory of Social Behaviour, vol. 16, pp. 222-250.


          Armstrong, D.M. (1973) Belief, Truth, and Knowledge. London: Cambridge University Press.


          Ayer, A. J. (1956) The Problem of Knowledge. Harmondsworth, Middlesex: Penguin.


          Ayer, A. J. (1975) The Central Questions of Philosophy. New York: Morrow.                                                                                                                                                      

          Bacon, Francis (1985) Novum Organum. trans. G. W. Kitchin, Oxford.


          Batson, C.D. (1975) "Rational Processing or Rationalization?: The Effects of Disconfirming Evidence on A Stated Religious Belief." Journal of Personality and Social Psychology, Vol. 32, pp. 176-84.


          Bernstein, Richard, ed. (1985) Habermas and Modernity. Cambridge, Mass.: MIT Press.


          Bernstein, Richard (1980) "Philosophy in the Conversation of Mankind.", Review of Metaphysics, XXXIII, no. 4, pp. 745-775.


          Berman, Morris (1989) The Reenchantment of the World. New York: Bantam Books.


          Beyer, Barry (1985) "Critical Thinking: What is it?", Social Education, vol. 29, no. 4, pp. 270-76.


          Beyer, Barry (1990) "What Philosophy Offers to the Teaching of Critical Thinking", Educational Leadership, pp. 55-60.


          Blanshard, Brand (1974) Reason and Belief. London: Allen & Unwin.


          Bonjour, Laurence (1985) The Structure of Empirical Knowledge. Cambridge, Mass.: Harvard University Press.


          Boykoff Baron, Joan & Sternberg, R.J., eds. (1987) Teaching Thinking Skills: Theory and Practice. New York: Freeman.


          Buchler, Justus, ed. (1955) Philosophical Writings of C.S. Peirce. New York: Dover.


          Callan, Eamon (1984) "Liberal Education and the Curriculum." Educational Studies, Vol. 10, no. 1, pp. 65-76.


          Callan, Eamon (1988a) "Faith, Worship and Reason in Religious Upbringing." Journal of Philosophy of Education, Vol. 2, no. 2.


          Callan, Eamon (1988b) "Indoctrination and Parental Rights." In W. Hare & J.P. Portelli, eds., Philosophy of Education, Calgary, Alta.: Detselig, 1988, pp. 133-142.


          Carr, Brian (1981-82) "Knowledge and its Risks", Proceedings of the Aristotelian Society, vol. 32.


          Chisholm, Roderick M. (1977) Theory of Knowledge. Englewood Cliffs: Prentice-Hall.


          Chisholm, Roderick M. (1982) Foundations of Knowing. Minneapolis: University of Minnesota Press.


          Chisholm, Roderick M. (1969) Perceiving: A Philosophic Study. Ithaca: Cornell University Press.


          Chisholm, Roderick (1961) "Evidence as Justification." Journal of Philosophy, Vol. 58, pp. 739-48.


          Chisholm, Roderick (1966) "Lewis' Ethics of Belief" in The Philosophy of C. I. Lewis, Paul Schilpp, ed. La Salle, Ill.: Library of Living Philosophers, Vol. 13.


          Chisholm, Roderick (1980) "A Version of Foundationalism", Midwest Studies in Philosophy V, pp. 543-564.


          Chisholm, Roderick (1986) "The Place of Epistemic Justification."  Philosophical Topics 14, pp. 85-92. 


          Chomsky, Noam (1975) "Toward a Humanistic Conception of Education." in Work, Technology and Education. ed. Walter Feinberg & Henry Rosemont, Jr., Urbana, Ill.: University of Illinois Press, pp. 204-220.


          Clark, R. W. (1984) The Survival of Charles Darwin: A Biography of A Man and an Idea. New York: Random  House. 


          Clifford, W. K. (1877) Lectures and Essays, Vol. 2, eds. L. Stephen & F. Pollock, London: Macmillan, 1901.


          Code, Lorraine (1982) "Father and Son: A Case Study of Epistemic Responsibility". The Monist, vol. 66, no. 2.


          Code, Lorraine (1987) Epistemic Responsibility. Hanover, N.H.: University Press of New England.


          Code, Lorraine (1988) "Experience, Knowledge and Responsibility" in A. Garry & M. Pearsall, eds., Women, Knowledge and Reality.


          Coope, Christopher (1974) "Wittgenstein's Theory of Knowledge" in Godfrey Vesey, ed., Understanding Wittgenstein,  Ithaca, N.Y.: Cornell University Press.


          Davidson, Donald (1973-74) "On the Very Idea of a Conceptual Scheme", Proceedings and Addresses of the American Philosophical Association, 47, pp. 5-20.


          Davidson, Donald (1990) "The Structure and Content of Truth", Journal of Philosophy, Vol. LXXXVII, no. 6.


          Dawkins, Richard (1976) The Selfish Gene. (New ed.) New York: Oxford University Press, 1992.


          Dawkins, Richard (1993) "Viruses of the Mind", Free Inquiry, Vol. 13, no 3, pp. 34-41.


          Dearden, Hirst, & Peters, eds. (1972) Education and the Development of Reason. London: Routledge.


          Dewey, John (1916) Democracy and Education. New York: Macmillan, 1966.


          Dewey, John (1922) Human Nature and Conduct. New York: Modern Library, 1930.                                                                                                  

          Dewey, John (1925) Experience and Nature. New York: Dover, 1958.


          Dewey, John (1929) The Quest for Certainty. New York: G.P. Putnam's Sons, 1960.


          Dewey, John (1933) How We Think. Chicago: Henry Regnery, 1971.


          Dewey, John (1934) A Common Faith. New Haven: Yale University Press.


          Dewey, John (1938) Logic: The Theory of Inquiry. New York: Holt.


          Dewey, John (1939) "Experience, Knowledge and Values." in Paul A. Schilpp, ed., The Philosophy of John Dewey. New York: Tudor Pub. Co., 1951.


          Dewey, John (1941) "Propositions, Warranted Assertibility, and Truth." Journal of Philosophy, Vol. XXXVIII, no. 7, pp. 169-186.


          Dewey, John (1925-1953) The Later Works, ed. Jo Ann Boydston. Carbondale and Edwardsville, Ill.: Southern Illinois University Press, 1981.


          Dewey, John, "Absolutism to Experimentalism" in John J. McDermott, ed., The Philosophy of John Dewey (Vol. I & II), New York: G. P. Putnam's Sons, 1973.


          Dretske, Fred I. (1983) "The Epistemology of Belief." Synthese, vol. 5, pp. 3-19.  


          Elliot, Carl  (1991) "Beliefs and Responsibility." Journal of Value Inquiry, Vol. 25, pp. 339-347.                                                


          Ennis, Robert (1991) "Critical Thinking: A Streamlined Conception." Teaching Philosophy, Vol. 14, no. 1.                                


          Ennis, Robert H. (1962). "A Concept of Critical Thinking." Harvard Educational Review, Vol. 32(1), pp. 81-111.  


          Eve, R. A & Dunn, Dana (1990) "Psychic Powers, Astrology & Creationism in the Classroom." American Teacher, Vol. 52(1), Jan. 1990, pp. 10-21.


          Feyerabend, Paul (1978) Against Method. London: Verso Press.  


          Feyerabend, Paul (1987) Farewell to Reason. New York: Verso Press.      


          Flew, Antony (1975) Thinking About Thinking. London: Fontana/Collins.                                                                                                                                                                                                                                                                                                       

           Flew, Antony (1992) Thinking About Social Thinking. London: Fontana/Harper/Collins.   


           Flew, Antony (1982) "A Strong Program for the Sociology of Belief." Inquiry, vol. 25, pp. 365-385.    


          Flew, Antony (1966) "What is Indoctrination?", Studies in Philosophy and Education, Vol. 4, No. 3, pp. 281-306.


          Foley, Richard (1987) The Theory of Epistemic Rationality. Cambridge, Mass.: Harvard University Press.


          Foley, Richard (1991) "Evidence and Reasons for Belief", Analysis, vol. 51, no. 2, pp. 98-102.


          Gardner, Martin (1982) "Anti-Science: The Strange Case of Paul Feyerabend", Free Inquiry, Winter 1982/83.


          Gasset, José Ortega y (1929) The Revolt of the Masses. New York: W.W. Norton, 1955.


          Goggans, Phil (1991) "Epistemic Obligations and Doxastic Voluntarism", Analysis, vol. 51, no. 2, pp. 102-105.


          Goldman, Alvin I. (1986) Epistemology and Cognition. Cambridge, Mass.: Harvard University Press.


          Gould, Stephen J.(1987) An Urchin in the Storm. New York: W.W. Norton. 


          Greene, Maxine (1976) "John Dewey and Moral Education", Contemporary Education, Vol. 48 (1).


          Griffiths, A. P., ed. (1967) Knowledge and Belief. Oxford: Oxford University Press.


         Gouinlock, J., ed. (1976) The Moral Writings of John Dewey. New York: Macmillan.


         Hamlyn, D. W. (1970) The Theory of Knowledge. Garden City, N.Y.: Doubleday, Anchor.


         Hankinson Nelson, Lynn (1993) "A Question of Evidence", Hypatia, vol. 8, no. 2, pp. 172-89.


        Hare, R. M. (1981) Moral Thinking: Its Levels, Method, and Point. London: Clarendon Press.


        Hare, R. M. (1992) Essays on Religion and Education. London: Clarendon.


        Hare, William (1979) Open Mindedness and Education. Montreal: McGill-Queens University Press.   


        Hare, William (1985) In Defence of Open-Mindedness. Kingston, Ont.: McGill University Press.   


        Hare, William (1990) "Limiting the Freedom of Expression: The Keegstra Case", Canadian Journal of Education, vol. 15 (4).


        Hare, William (1992) "Humility as a Virtue in Teaching", Journal of Philosophy of Education, vol. 26, no. 2, pp. 227-236.


        Harman, Gilbert (1977) The Nature of Morality. New York: Oxford University Press. 


       Heidelberger, Herbert (1963) "On Defining Epistemic Terms", The Journal of Philosophy, vol. 60, pp. 344-48.


       Heil, John (1983) "Believing What One Ought", Journal of Philosophy, vol. 80. 


       Hinton, J. M. (1989) "Skepticism: Philosophical and Everyday", Philosophy, vol. 64, no. 248.


       Hoffer, Eric (1951), The True Believer. New York: Time Inc., 1963.


       Hook, Sidney (1940) Reason, Social Myths and Democracy. Buffalo, N.Y.: Prometheus Books, 1991.                                                                          

       Hume, David. Enquiries Concerning Human Understanding and Concerning the Principles of Morals. L.A. Selby-Bigge, ed. Oxford: Clarendon Press, 1963.


       Hunter, J. F. M. (1980) "Believing", Midwest Studies in Philosophy V, pp. 239-260.                                                                                                                   

      James, William (1896) The Will to Believe and Other Essays in Popular Philosophy. New York: Longmans, Green.


      James, William (1907) Pragmatism and Other Essays. New York: Washington Square Press, 1963.


       Kekes, John (1976) A Justification of Rationality. Albany: State University of New York.


       Kaufmann, Walter (1961) Critique of Religion and Philosophy. Garden City, N.Y.: Doubleday.


       Kornblith, Hilary (1983) "Justified Belief and Epistemically Responsible Action", The Philosophical Review,   vol. 92, no. 1.


       Kuhn, Thomas (1962) The Structure of Scientific Revolutions. 2nd ed. (1970), Chicago: University of Chicago Press.


       Lakatos, Imre. ed. Criticism and the Growth of Knowledge. Cambridge: Cambridge University Press, 1970.


       Laudan, Larry (1977) Progress and its Problems. Berkeley: University of California Press.


       Lecky, W. E. H. (1866) History of the Rise and Influence of the Spirit of Rationalism in Europe. 2 vols. London: Longmans, Green and Co.


       Lehrer, Keith (1990) Theory of Knowledge. Boulder, Colorado: Westview Press.


       Lehrer, Keith and Clay, Marjorie, eds. Knowledge and Skepticism. Boulder, Colorado: Westview Press, 1989.


       Lewis, C. I. (1955) The Ground and Nature of the Right. New York: Columbia University Press.


        Lewis, C. I. (1956) Mind and the World Order. New York: Dover.


       Lewis, C. I. (1969) Values and Imperatives. Stanford: Stanford University Press.


       Lipman, Matthew (1988) "Critical Thinking - What Can It Be?" Educational Leadership, Sept., 1988, pp. 38-43.


       Locke, John. An Essay Concerning Human Understanding. P. H. Nidditch, ed., Oxford: Oxford University Press,  1975.


       Lovibond, Sabina (1983) Realism and Imagination in Ethics. Minneapolis, Minn.: University of Minnesota Press.


       Luper-Foy, S. (1985) "The Reliabilist Theory of Rational Belief", The Monist, vol. 68.


       McDowell, John (1979) "Virtue and Reason", The Monist, vol. 62.


       McPeck, John (1981) Critical Thinking and Education. Oxford: Martin Robertson.


       McPeck, John (1990) Teaching Critical Thinking. New York: Routledge.


       Martin, Robert M. (1992) There are Two Errors.... Peterborough, Ont.: Broadview Press.


       Meiland, J. and Krausz, M., eds., (1982) Relativism: Cognitive and Moral. Notre Dame: University of Notre Dame Press.


       Mencken, H. L. (1982) A Mencken Chrestomathy. New York: Vintage Books.


       Monk, Ray (1990) Wittgenstein: The Duty of Genius. New York: Penguin.


       Montmarquet, James A. (1987) "Epistemic Virtue", Mind, Vol. 96, no. 384, pp. 482-97.


       Morris, Thomas V. (1986) "Pascalian Wagering", Canadian Journal of Philosophy, vol. 16, no. 3, pp. 437-454.


       Nagel, Thomas (1979) Mortal Questions. New York: Cambridge University Press.


       Nagel, Thomas (1986) The View From Nowhere. New York: Oxford  University Press.


       Nielsen, Kai (1989) God, Skepticism, and Morality. Ottawa: University of Ottawa Press.


       Nietzsche, F. (1885) The Anti-Christ/Twilight of the Idols, trans. R.J. Hollingdale, New York: Penguin, 1985.


       Nietzsche, F. (1886) Human, All Too Human, trans. R. J. Hollingdale, New York: Cambridge University Press, 1989.


       Nietzsche, F. (1887) The Gay Science, trans. W. Kaufmann, New York: Vintage Books, 1974.


       Nietzsche, F. (1888) The Will to Power, trans. W. Kaufmann, New York: Random House, 1968.


       Nozick, Robert (1993) The Nature of Rationality. Princeton, N.J.: Princeton University Press.


       Nozick, Robert (1981) Philosophical Explanations. Cambridge, Mass.: Harvard University Press.


       Nozick, Robert (1974) Anarchy, State and Utopia. New York: Basic Books.


       Passmore, John (1968) One Hundred Years of Philosophy. Harmondsworth: Penguin Books.  


       Passmore, John (1973) "On Teaching to be Critical", in The Concept of Education. R. S. Peters, ed. London: Routledge and Kegan Paul.


       Passmore, John (1969) Philosophical Reasoning. New York: Basic Books. 


       Pasquarello, Tony "Humanism's Thorn: The Case of Bright Believers", Free Inquiry, vol. 13, no. 1, Winter 1992/93, pp. 38-42.


       Paul, Richard (1990) Critical Thinking. Rohnert Park, CA: Sonoma State University Center for Critical Thinking and Moral Critique.


       Paul, Richard (1987) "Dialogical Thinking: Critical Thought Essential to the Acquisition of Rational Knowledge and Passions" in Teaching Thinking Skills: Theory and Practice. eds. J. B. Baron & R. J. Sternberg. New York: W. H. Freeman & Co., pp. 127-148.


       Peters, R. S. (1964) "Education as Initiation" in Analysis and Education, R. Archambault, ed. New York: Humanities  Press.


       Piaget, Jean (1965) The Moral Judgement of the Child. New York: The Free Press.


       Piaget, Jean (1972) The Child's Conception of the World. Totowa, N.J.: Littlefield, Adams & Co.


       Pollock, John L. (1984) "Reliability and Justified Belief". Canadian Journal of Philosophy, vol. 14.


       Pollock, John L. (1974) Knowledge and Justification. Princeton: Princeton University Press.


       Pollock, John L. (1986) Contemporary Theories of Knowledge. Totowa, N.J.: Rowman & Littlefield.


       Popkin, Richard A. (1979) The History of Skepticism from Erasmus to Spinoza, rev. ed., Berkeley: University of California Press.


       Popper, Karl (1945) The Open Society and its Enemies, Vol. II, London: Routledge.


       Popper, Karl (1963) Conjectures and Refutations. London: Routledge & Kegan Paul.


       Popper, Karl (1989) "The Importance of Critical Discussion" in On the Barricades: Religion and Free Inquiry in Conflict, R. Basil, R. B. Gehrman & Tim Madigan, eds., New York: Prometheus Books.


       Popper, Karl (1976) Unended Quest. London: Fontana.


       Popper, Karl (1983) Realism and the Aim of Science. London: Hutchinson.


       Prado, C. G. (1987) The Limits of Pragmatism. Atlantic Highlands, N.J.: Humanities International Press.    


       Price, H. H. (1969) Belief. (Gifford Lectures, 1960). London: Allen & Unwin.


       Putnam, Hilary (1981) Reason, Truth and History. Cambridge: Cambridge University Press.


       Quine, W. V. & Ullian, J. S. (1978) The Web of Belief. New York: Random House.


       Rauch, Jonathan (1993) Kindly Inquisitors: The New Attacks on Free Thought. Chicago: University of Chicago Press.  


       Rorty, Richard (1979) Philosophy and the Mirror of Nature. Princeton: Princeton University Press.    


       Rorty, Richard (1982) The Consequences of Pragmatism. Minneapolis: University of Minnesota Press.


       Rorty, Richard (1991) Objectivity, Relativism and Truth. New York: Cambridge University Press.


       Rorty, Richard (1989) Contingency, Irony and Solidarity. New York: Cambridge University Press.


       Russell, Bertrand (1912) The Problems of Philosophy. Oxford: Oxford University Press, 1967.


       Russell, Bertrand (1926) Education and the Good Life. New York: Liveright Publishing Corp.


       Russell, Bertrand (1927) An Outline of Philosophy. London: Unwin Books, 1979.


       Russell, Bertrand (1947) Authority and the Individual. New York: Beacon Press, 1960.


       Russell, Bertrand (1948) Human Knowledge. New York: Simon & Schuster.


       Russell, Bertrand (1950) Unpopular Essays. London: Allen and Unwin.


       Russell, Bertrand (1958) The Will to Doubt. New York: Philosophical Library.


       Russell, Bertrand (1962a) Essays in Skepticism. New York: Wisdom Library.


       Russell, Bertrand (1962b) Skeptical Essays. London: Unwin Books.


       Russell, Bertrand (1966) Philosophical Essays. New York: Simon & Schuster.


       Ryle, Gilbert (1949) The Concept of Mind. London: Penguin Books, 1968.


       Salmon, Wesley (1967) The Foundations of Scientific Inference. Pittsburgh: University of Pittsburgh Press.


       Sartre, Jean Paul (1956) Being and Nothingness. trans. Hazel Barnes, New York: Washington Square Press, 1966.


       Scheffler, Israel (1973) Reason and Teaching. Indianapolis, Ind.: Hackett Pub. 


       Scheffler, Israel (1982) Science and Subjectivity. Indianapolis, Ind.: Hackett Pub.


       Scheffler, Israel (1963) Conditions of Knowledge. Chicago: Scott, Foresman.


       Scheffler, Israel (1991) In Praise of the Cognitive Emotions and Other Essays in the Philosophy of Education. New York: Routledge.


       Schulte, Joachim (1992) Wittgenstein: An Introduction.  W. H. Brenner & J. F. Holley, trans., New York: State University of New York Press.


       Schumacher, E. F. (1977) A Guide for the Perplexed. New York: Harper & Row.


       Scriven, Michael (1966) Primary Philosophy. New York: McGraw-Hill.


       Scriven, Michael (1976) Reasoning. New York: McGraw-Hill.


       Siegel, Harvey (1980) "Critical Thinking as an Educational Ideal", Educational Forum, pp. 7-23.


       Siegel, Harvey (1981) "Creationism, Evolution and Education: The California Fiasco", Phi Delta Kappan, Vol. 63, Oct. 1981, pp. 95-101.


       Siegel, Harvey (1984) "Response to Creationism", Educational Studies, vol. 15, no. 4, Winter, 1984.


       Siegel, Harvey (1987) Relativism Refuted. Dordrecht: D. Reidel.


       Siegel, Harvey (1988) Educating Reason: Rationality, Critical Thinking, and Education. New York: Routledge.


       Siegel, Harvey (1988) "Rationality and Epistemic Dependence", Educational Philosophy and Theory, 20(1), pp. 1-6.


       Skyrms, Brian (1986) Choice and Chance. Belmont, Cal.: Wadsworth.


       Soltis, Jonas (1977) An Introduction to the Analysis of Educational Concepts, 2nd ed. Reading, Mass.: Addison-Wesley.


       Sosa, Ernest (1980) "The Raft and the Pyramid: Coherence vs. Foundations in the Theory of Knowledge", Midwest Studies in Philosophy V, pp. 3-25.

       Sosa, Ernest (1985) "Knowledge and Intellectual Virtue", The Monist, vol. 68, no. 2.


       Stout, Jeffrey (1988) Ethics After Babel. Boston: Beacon Press.


       Stout, Jeffrey (1981) The Flight From Authority: Religion, Morality and the Quest for Autonomy. Notre Dame, Ind.: University of Notre Dame Press.


       Stroud, Barry (1984) The Significance of Philosophical Skepticism. Oxford: Clarendon Press.


       Taylor, Charles (1991) The Malaise of Modernity (CBC Massey Lectures, 1991). Concord, Ont.: House of Anansi Press.


       Toulmin, Stephen (1982) Return to Cosmology. Berkeley: University of California Press.


       Toulmin, Stephen (1990) Cosmopolis: The Hidden Agenda of Modernity. Chicago: University Of Chicago Press.


       Trigg, Roger (1973) Reason and Commitment. London: Cambridge University Press.


       Trigg, Roger (1989) Reality at Risk: A Defence of Realism in Philosophy and Science. London: Harvester Wheatsheaf.


       Unger, Peter (1975) Ignorance: A Case for Skepticism. Oxford: Clarendon.


       Walker, David (1982) "A Lesson in Gambling with Pascal", Teaching Philosophy, Vol. 5, no. 4, pp. 311-312.


       White, Alan R. (1967) The Philosophy of Mind. New York: Random House.


       White, Alan R. (1982) The Nature of Knowledge. Totowa, N.J.: Rowman & Littlefield.