Scientism and its Discontents

Susan Haack

Rounded Globe

About the Author

Susan Haack is Distinguished Professor in the Humanities, Cooper Senior Scholar in Arts and Sciences, Professor of Philosophy, and Professor of Law at the University of Miami.

Her work ranges from philosophy of logic and language, epistemology, metaphysics, philosophy of science, pragmatism—both philosophical and legal—and the law of evidence, especially scientific evidence, to social philosophy, feminism, and philosophy of literature.

Her books include Philosophy of Logics; Deviant Logic, Fuzzy Logic: Beyond the Formalism; Evidence and Inquiry; Manifesto of a Passionate Moderate; Defending Science—Within Reason; Pragmatism, Old and New; Putting Philosophy to Work; Ciencia, Sociedad y Cultura; Evidence Matters: Science, Proof, and Truth in the Law; and (in 2015) Perspectivas Pragmatistas da Filosofia do Direito (São Leopoldo, Brazil: Editora UNISINOS) and Legalizzare l’epistemologia (Milan, Italy: Università Bocconi).

Haack’s work has been translated into French, German, Italian, Spanish, Portuguese, Polish, Russian, Croatian, Danish, Swedish, Romanian, Korean, and Chinese; and she is invited to lecture around the world.

Haack was included in Peter J. King’s One Hundred Philosophers: The Life and Work of the World’s Greatest Thinkers and in the Sunday Independent’s list, based on a BBC poll, of the ten most important women philosophers of all time. Haack’s work has been celebrated in two volumes of essays, Susan Haack: A Lady of Distinctions (2007) and Susan Haack: Reintegrating Philosophy (2016). In 2011 Haack was awarded the degree of Doctor Honoris Causa by Petre Andrei University, and in 2016 the Ulysses Medal, the highest honor given by University College, Dublin.


Introduction: Scientism and Its Discontents

Science is neither sacred nor a confidence trick. How, you might wonder, could anyone possibly think otherwise? Isn’t it just obvious that the work of the sciences has advanced our knowledge of the world immeasurably—and no less obvious that, nevertheless, science is imperfect and limited, as all human enterprises are? The fact is, however, that many people have thought, and do think, otherwise.

“Ours is an age in which partial truths are tirelessly transformed into total falsehoods, and then acclaimed as revolutionary revelations,” wrote Thomas Szasz in 1973;1 and this shrewd observation is certainly no less true now than it was then, perhaps even more so. Anti-scientific cynicism, focusing on the fallibility and limitations of science, transforms one part of the truth—that science is neither infallible nor omnicompetent—into something revolutionary-sounding but dangerously false: that what scientific theories are accepted really depends on nothing but power, politics, rhetoric, negotiation, so that the claim of the sciences to give us knowledge of the world is bogus. And scientism, focusing on the achievements of the sciences, transforms another part of the truth—that over the last several hundred years the sciences have made many remarkable discoveries—into something equally revolutionary-sounding but no less dangerously false: that only science can give us real knowledge, so that non-scientific fields are either ripe for colonization by the sciences or else are illegitimate, best abandoned altogether.

* * *

In Defending Science—Within Reason: Between Scientism and Cynicism (2003), I proposed an understanding of the scientific enterprise that did justice both to its strengths and to its limitations. At the time, anti-scientific cynicism—in the form of postmodernist, feminist, and post-colonialist “science criticism” and of radically dismissive styles of history, sociology, and rhetoric of science—was the height of intellectual fashion; so my critical efforts were largely focused on exposing anti-scientific misunderstandings of science. Before long, however, the tide had turned: encouraged by the boom in evolutionary psychology and neuroscience, energized by a newly-aggressive atheism, scientism was on the rise both in the academy and in our culture more generally; and so, in “Six Signs of Scientism” (2010) I proposed some ways to identify when the line between appropriate respect for the achievements of the sciences and the inappropriate deference to science characteristic of scientism has been crossed.

At the time, I thought that was that. But no: a 2014 conference at the Free University of Amsterdam brought home to me not only that scientism was already much more pervasive and deeply-entrenched in my own discipline, philosophy, than I had realized, but also that the disturbing idea was gaining currency that anyone reluctant to ride the new scientistic wave must have a religious agenda, overt or covert. What was needed, I concluded, was to expose the misunderstandings of science on which scientism rests as thoroughly and as carefully as I had previously exposed the misunderstandings on which anti-science rests.

Happily, an invitation to give the 2016 Agnes Cuming lectures at University College, Dublin provided the incentive to get on with the job of charting the myriad varieties of scientism, figuring out more exactly where they go wrong, and articulating the reasons for avoiding scientism—whatever your religious views, or lack of them. As so often, however, I soon discovered that these seemingly straightforward tasks were larger, more complex, and more challenging than I imagined at the outset; but for that very reason also more rewarding, leading me to unanticipated questions and unexpected insights about the place of science in society, the stresses and strains that threaten its integrity, the rich variety of human cultures, the complexities of the human mind, and even the ethos of the academy and the character, scope, and methods of philosophy itself. The result was the two lectures presented here.

The first lecture, “Science, Yes; Scientism, No,” begins by building on, amplifying, and deepening the account of scientific inquiry developed in Defending Science. After exploring shifts and changes in the scope of the word “science,” I suggest how science as we now know it emerged from everyday forms of inquiry; how we came to classify certain disciplines as sciences but to exclude others; how, over centuries of work, generations of scientists have gradually developed special methods, tools, and techniques for obtaining more evidence and better assessing its worth; why progress in the sciences is uneven and unpredictable; and why, today, the integrity of science is under serious threat from political, commercial, and institutional pressures. The next step is to get some grip on the manifold manifestations of scientism, from simple credulity about anything and everything bearing the label, “scientific,” through the honorific use of “science” and its cognates and preoccupation with the “problem of demarcation,” the “scientific method,” etc., to dismissing non-scientific work as inherently inferior or repudiating it altogether as simply spurious. And then I can show in some detail that every one of these manifestations of scientism betrays a significant misunderstanding of what the sciences do, and how they do it.

Well, almost every one of them. But in the first lecture all I can say in response to the radically scientistic idea that there simply is no legitimate inquiry outside the sciences is that, on the contrary, there are obviously large classes of perfectly legitimate questions—questions of history, law, literary scholarship, mathematics, ethics, epistemology, metaphysics, etc., questions of policy, and everyday questions about where I left the car keys, how to get to the post office, and so on—that not even the most sophisticated future science imaginable could answer. But this, though true enough, isn’t really response enough. Only in the second lecture, where I turn my attention to scientism in philosophy specifically, can I take the first steps towards a fuller and more satisfying answer.

This second lecture, “Scientific Philosophy, Yes; Scientistic Philosophy, No,” begins by looking briefly at some twentieth-century precursors of the scientistic philosophies in vogue today, and then turns to a closer examination of these recent manifestations—“experimental philosophy,” “naturalized metaphysics,” self-styled “scientism,” and the like. None, I argue, is capable of dealing with key questions about how the world must be, and how we must be, if the scientific enterprise is to be possible; in particular, all duck, fudge, or flub crucial questions about the human mind and our capacity to figure out something of how the world is. And this is no accident.

These crucial questions about the mind are, to be sure, empirical; but they can’t possibly be handed over to the sciences to answer, because all scientific work presupposes that they have been satisfactorily answered. They are, by my lights, distinctively philosophical questions. This doesn’t mean, however, that they are purely conceptual. No: however widely it’s taken for granted, it’s just not true that philosophy must either be entirely a priori, or else must look to the sciences to answer its questions; nor, therefore, that we are obliged to choose either the old analytic paradigm of philosophy as pure conceptual analysis or else scientism of one or another stripe. Philosophical questions are neither purely conceptual nor resoluble by the sciences. Philosophy is about the world, and so requires experience; not, however, the recherché experience needed by the sciences, but the familiar, everyday experience of our interactions with the world and with others.

Close attention to that everyday experience reveals a distinctive kind of human mindedness; and getting this in focus reveals that while, to be sure, we wouldn’t have this distinctive mindedness but for our big brains, the brain is by no means the whole story. There’s something deeply social about human mental powers—a kind of virtuous spiral in which culture enables mindedness even as mindedness enables culture. Much of the second lecture is devoted to articulating what this idea amounts to in the specific; and to showing that, if it’s even roughly on the right lines, it was only to be expected that the key questions about human mindedness would prove beyond the reach of scientistic philosophies. No wonder, then, that some scientistic philosophers—apparently not noticing that they’re sawing off the branch they’re sitting on—have gone so far as to claim that there are no beliefs, or even that “the brain does everything without thinking about anything at all.”2

All this, naturally, leaves me wondering why—when it is, in the end, nothing but a confession of philosophical failure—scientistic philosophy should have been so warmly embraced by so many in our profession today. Vaguely sensing that academic philosophy is in bad shape, I suggest, many are bored and restless, casting around for something new and different. And their sense that something’s rotten in the state of philosophy, that we can’t just go on with philosophical business-as-usual is, I’m afraid, well-founded; but the idea that the cause is simply that the analytic paradigm is nearing exhaustion, and the hope that scientism will cure our ills, are way off the mark.

I suggest a very different kind of diagnosis, beginning with the changes in the management of universities that have led to a near-universal reliance on badly flawed surrogate measures of the quality of intellectual work, creating incentives to bustle, boosterism, busywork, and boasting—in short, a focus on the appearance of progress, rather than the real thing. In the sciences themselves, these perverse incentives have contributed to the rise of salami publishing, misleading multiple attributions of authorship, corruption of the peer-review process, an obsession with grant-writing, and a burgeoning scientific bureaucracy, and have encouraged haste, carelessness, and even fraud. And in philosophy they have made the idea that important results can be readily achieved by means of simple surveys, by looking to recent papers in the psychology journals, or by getting your hands on an MRI machine almost irresistibly attractive—especially if you can land grant money for your project, which these days seems to get you more credit than results do. There’s a real irony here: scientistic philosophy is on the rise just as the integrity of science itself is under threat, and for some of the same reasons.

But, as the saying goes, “if it sounds too good to be true, it probably is”; there really is no such thing as a free philosophical lunch. Making real progress in philosophy will take the same kind of patient, painstaking hard work that enabled the sciences to extend their reach and deepen their theoretical understanding, and the same honesty and humility that all serious inquiry, whatever its subject-matter, requires. There are no shortcuts. The way forward is philosophical inquiry undertaken in the right spirit, from a genuine desire to figure things out, and using the right tools, experiential as well as conceptual.

* * *

I am grateful to the department of philosophy at University College, Dublin not only for the invitation that provided the spur to this work, but also for their comments and suggestions after my lectures; to Mark Migotti, my faithful and always-helpful first reader; to Pamela Lucken and Barbara Cuadras, in the law library at the University of Miami, and to Alina Hernandez, my Law School assistant, for their skilful and cheerful support; to Devon Coleman for intelligent and eagle-eyed volunteer assistance with proof-reading; and to Howard Burdick, my sterling But-For-Whom, who helps me keep my head even when, as now, I begin to fear that all about me really are losing theirs.3

Lecture I: Science, Yes; Scientism, No

… the progress of science has seemed to mean the enlargement of the material universe and the diminution of man’s importance. The result is what one may call the growth of naturalistic or positivistic feeling. Man is no lawgiver to nature, he is an absorber. She it is who stands firm; he it is who must accommodate himself.

The romantic spontaneity and courage are gone, the vision is materialistic and depressing. … [But] you want a system that will combine both things, the scientific loyalty to facts and willingness to take account of them, … but also the old confidence in human values …—William James1

In “The Present Dilemma in Philosophy,” the 1906 lecture from which this passage is taken, James suggests that, behind the recurrent disagreements between those who focus on facts, on the concrete, on the scientific, and those who prize poetry, art, religion, values, lies a clash of fundamental philosophical temperaments, the “tough-minded” and the “tender-minded.” Under each heading, he lists a cluster of the philosophical ideas—not always, as he realizes, mutually compatible ideas—to which thinkers of these contrasting temperaments tend to gravitate. What’s needed, James continues, is a philosophy that can accommodate both the focus on facts and admiration for the scientific that appeals to the tough-minded and the literary or religious idealism that appeals to the tender-minded. And while our cultural landscape is very different from James’s, even today his words resonate: we still need a philosophy that accommodates both that admirable “scientific loyalty to facts” and the hankering for those desirable human values—without sacrificing either to the other.

“A man must be downright crazy,” C. S. Peirce wrote in 1903, “to doubt that science has made many true discoveries.”2 Indeed. Thanks to the work of many generations of scientists, we now know far more about the world than we did just a few hundred years ago; and this expansion of our knowledge has enabled us to transform our world, and to lengthen and improve our lives. But science is a human enterprise; and so inevitably combines (as Denis Diderot put it) “insight and blindness, … pettiness and grandeur.”3 Impressive as their achievements have been, the sciences are imperfect and limited, as all human enterprises are.

There’s vastly more the sciences have yet to find out; much of what scientists once thought they knew has by now been found to be false, or only approximately or only partially true; and doubtless some of what scientists now think they know will, likewise, be revealed by future work to have been mistaken. Extraordinary as the edifice of well-established scientific knowledge now is, the trash-heap of discarded ideas and theories is far larger. Scientific discoveries don’t come easily—and, like the rest of us, scientists sometimes cut corners when they are hard-pressed or over-anxious to succeed; like the rest of us, they are sometimes lazy, careless, or biased; like the rest of us, they are susceptible to the pressure of commercial interests, political ideology, and simple careerism. Scientific progress is ragged and uneven, sometimes astonishingly fast, sometimes painfully slow and halting; and there’s absolutely no guarantee that there will always be progress, that the sciences will always or inevitably advance. And, of course, the results of scientific work can be put to bad uses as well as to good.

There are, besides, many questions beyond the scope of the sciences; many other valuable forms of inquiry, such as the historical, the legal, the literary, and the philosophical, besides the scientific; and many other valuable human activities, such as music, art, story-telling, joking, building, and cooking, besides inquiry. Science is a good thing; but it’s an imperfect, human enterprise, and it’s by no means the only worthwhile human enterprise.

Well, yes, you may be thinking, but isn’t all this just obvious—too obvious even to need saying? It is, indeed, obvious. All the same, however, it needs not only to be said, but also to be spelled out. For the fact is that attitudes to science range all the way from naively uncritical admiration through distrust, resentment, and envy to denigration and outright hostility. Many underestimate the sciences, denying, or professing to deny, their achievements; and many overestimate them, denying, or professing to deny, their fallibility or their limitations. In short, both tough-minded admiration for science and tender-minded reservations about it can and all too often do take on indefensible forms: admiration for science can only too easily turn into scientism, and reservations about science into antipathy, resentment, even outright hostility. And there’s more at stake here than a superficial clash of temperaments, for both scientistic and anti-scientific attitudes rest on serious misconceptions of the sciences and their place in culture. We need to understand why—while both admiration for the achievements of the sciences and reservations about their limitations and dangers are entirely appropriate—neither scientism nor anti-science is well-founded.

Sometimes one of these faulty extremes predominates, sometimes the other. For much of the twentieth century, scientistic currents of various kinds—behaviorist, reductionist, etc.—were in the ascendant; until the closing decades, when there was a strong surge of anti-scientific sentiment. That’s why, when I wrote Defending Science—Within Reason (2003)—though I also argued against scientism—I focused primarily on what seemed to be the greater danger, the cynically anti-scientific attitudes of postmodernist, radical-feminist, and post-colonialist “science critics” and newly-ambitious radical sociologists, historians, and rhetoricians of science suspicious of, or outright skeptical about, the claim of the sciences to give us knowledge of the world. Priding itself on having seen through philosophers’ (and scientists’) illusions about science, this style of science-denigration was given to inveighing against “rationalist,” i.e., formal-logical, models of scientific method and scientific reasoning. If the idea was only that these models missed important aspects of the scientific enterprise, it was true enough.4 But it obviously didn’t follow, as so many triumphantly concluded, that there is, after all, nothing more to the sciences than power, politics, and rhetoric; and neither, I argued, is it true.

Before very long, however, the fashion for anti-scientific exaggerations was waning somewhat, and a newly confident scientism was on the rise. To be clear: this new wave of scientism didn’t sweep over us overnight; it crept up on us—though it crept up with remarkable rapidity. It’s tempting to speak of the “new” scientism; but what we’re really seeing is probably better described as a revival, a recrudescence, of attitudes that are far from new—attitudes that were already familiar to James, and that were at the heart of logical positivism, of Karl Popper’s insistence on the crucial importance of the “problem of demarcation,” the need for a criterion to distinguish genuine science from pretenders, and of W. V. Quine’s ambitious but ambiguous program to “naturalize” epistemology. At the same time, as we’ll soon see, the current wave of scientism is neither simply a return to older forms, nor simply an overreaction to the anti-scientific extravagances of those postmodern cynics; other intellectual upheavals—notably, the boom in evolutionary biology and, especially, in neuroscience, along with the rise of a newly evangelical atheism—have shaped the style, and influenced the tone, of the forms of scientism in vogue today.

In “Six Signs of Scientism,” which appeared in 2010, I suggested some ways to identify when appropriate respect for the achievements of the sciences crosses the line into the kind of over-enthusiastic and naively deferential attitude characteristic of scientism. At the time, I thought that was one job I could cross off my list—done. But no: triumphalist scientism grows apace—in the academy, in the legal system, and in our culture more generally; to the point where, by now, as some proudly adopt the word to describe their own positions,5 “scientism” seems gradually to be losing its long-established negative tone,6 and even becoming an honorific term. And these developments oblige me to dig deeper.

The first step is to articulate an understanding of the scientific enterprise that enables us both to appreciate its extraordinary achievements and to acknowledge its inevitable imperfections and limitations (§1); the next is to get some grip on the manifold manifestations of scientism (§2); and the last is to show in some detail that all these manifestations betray significant misunderstandings of what science is, what it does, and how it does it (§3).

1. Setting the Stage: The Scientific Enterprise

The root of the word “science” is the Latin, scientia, “knowledge.” And for a long time the English word “science” had a similarly broad scope, referring to any kind of systematized knowledge or inquiry—as the German word “Wissenschaft” still does; one could, for example, speak without any incongruity of the science of jurisprudence. What we would now call “science” was known, rather, as “natural” or “experimental” philosophy, and what we would now call “philosophy” as the “moral sciences.” But by the latter part of the nineteenth century, usage had gradually changed:7 with the remarkable achievements in physics, chemistry, and biology, the word “science” began to refer to these fields exclusively.8 By now, though there is still some residual resistance, psychology, sociology, economics, anthropology, etc., are normally also classified as sciences—if, sometimes, as “soft” sciences, by contrast with the “hard” natural sciences; but, with this addendum, a relatively narrow usage continues to hold sway.

So the phrase, “the sciences,” as it is now used, and as I shall use it here, refers to a loose federation of interrelated kinds of empirical inquiry including both the natural and the social sciences, but excluding pure mathematics, history, philosophy, jurisprudence, literary scholarship, and such. To be sure, these days scientists engage in many other activities besides inquiry: applying for grants to support their research, reviewing others’ grant proposals, writing up results for publication, refereeing others’ papers, designing experimental apparatus or computer programs to crunch data or simulate the consequences of this or that hypothesis, offering expert testimony in court, and so forth. The point isn’t that inquiry is the only business of the sciences, but that it’s their core business.9 And what does this core business involve? Inquiry, investigation, is an effort to discover the answer—of course, the true answer, or true answers—to some question or questions; as, for example, James Watson and Francis Crick tried, and eventually managed, to solve the structure of DNA, i.e., to come up with a true account of what that structure is.

Advocacy, say, or dancing, writing a novel, promoting a political program, or drafting a business plan aren’t science for the simple reason that they aren’t forms of inquiry. But what, you may ask, distinguishes the sciences from other fields of inquiry not classified as sciences—over and above the sociological fact that deans and librarians group them together? Well, it’s not simply by convention or purely by historical accident that these disciplines and not others are classified as “sciences”; but neither is it in virtue of their sharing some essential characteristics or some unique, distinctive method or procedure. Rather, the sciences form a kind of cluster or family of disciplines. One can say, to be sure, that the fields now classified as sciences are all forms of empirical, descriptive inquiry—which is why such disciplines as pure mathematics, logic, ethics, or aesthetics aren’t included; and that their goal is to explain natural and social phenomena, so that they focus primarily on general laws rather than particular events—which is why history isn’t included. But the boundaries are fuzzy, shifting, and frequently contested.

The boundaries are fuzzy: there are similarities and continuities, as well as differences and discontinuities, between the sciences and other kinds of inquiry—and between scientific inquiry and other human endeavors. Scientific work, no less than writing a novel or designing a skyscraper, requires imagination: to come up with possible explanatory hypotheses, to devise ways to put them to the test, to think what factors might possibly interfere and how to rule them out, to come up with alternative explanations, etc., etc.10 Cosmology and evolutionary biology are historical sciences; astronomy concerns itself with particular heavenly bodies. There’s no sharp line where cosmology ends and metaphysics begins, or where empirical psychology becomes philosophy of mind. And so on.

There are also many interrelations both among the disciplines we call sciences, between the sciences and other kinds of inquiry, and even between the sciences and other human activities. In the early days of the mathematization of science, scientists borrowed techniques developed for purposes of double-entry book-keeping and perspective drawing.11 In the pioneering days of molecular biology, biologists borrowed tools from physics.12 In the late twentieth century, archeologists used neutron analysis showing that jasper found in a settlement in what is now Newfoundland contained trace elements present only in jasper from Greenland and Iceland to confirm that Vikings reached North America long before Columbus;13 and historians borrowed a cyclotron to determine whether the ink in an old bible was the same as that in the Gutenberg Bible of 1450-55.14 And so forth.

The boundaries of science are shifting: over time, new fields and sub-fields develop. Moreover, because of the remarkable successes of the sciences, practitioners of disciplines not at present routinely included in the federation sometimes describe their fields as, also, “sciences.” As a result, the boundaries are often not only fuzzy and shifting, but also contested: there is controversy, for example, over whether forensic sciences such as hair analysis15 or bite-mark identifications16 are really rigorous enough to be included; over whether psychiatric theorizing is genuine science or an unworthy pretender; and—when courts must determine whether creation science or Intelligent Design Theory may constitutionally be taught in public high school science classes17—over whether these are genuinely scientific theories, or thinly disguised religious dogma.

And, even within the large, unruly family of disciplines we now call “sciences,” there is enormous variety. We speak of more and less “mature” sciences, because in some long-standing fields there is by now a substantial body of well-established theory, while in newer fields there may as yet be little more than so-far untested and unsupported speculation. And of course each field has its own special tools, methods, and procedures—and its internal disagreements: think, for instance, of all the elaborate protocols for conducting epidemiological studies and calculating their results developed since the earliest days of the discipline, when John Snow figured out that cholera is waterborne,18 and of the ongoing controversies about methods of meta-analysis of multiple studies.19 And every scientific field evolves, sporting new sub-specialties, new approaches, and new tools.

* * *

So what, if anything, is distinctive in the way scientists go about inquiring? Is scientific inquiry something absolutely unprecedented in the history of the human race, or something familiar and routine? As I see it, it is neither. Inquiry in the sciences has its roots in, and is recognizably continuous with, everyday empirical inquiry; but it has gone far, far beyond it.

Perhaps this idea sounds radical; and indeed, from the perspective of all those twentieth-century debates among proponents of inductivist, deductivist, game-theoretical, Bayesian, etc., models of the Scientific Method, it is radical. But this Critical Common-sensism, as I call it, would have been entirely familiar to Thomas Huxley, according to whom the “man of science … simply uses with scrupulous exactness, the method which we all, habitually and at every minute, use carelessly”;20 to Albert Einstein, who once observed that “the whole of science is nothing more than a refinement of common sense”;21 to John Dewey, who stressed how “[s]cientific subject-matter and procedures grow out of the direct problems and methods of common sense”;22 to Percy Bridgman, who commented that the “scientific method, in so far as it is a method, is doing one’s damnedest with one’s mind, no holds barred”;23 to James B. Conant, who wrote that “what the scientist does is simply to carry over, into another frame of reference, habits that go back to the caveman”;24 and to Gustav Bergmann, who described science as the “long arm” of common sense.25 And, I can now add, it would be entirely familiar to Steven Weinberg, who wrote in 2015 that even “[b]efore history, there was science, of a sort. … Observation of the world led to useful generalizations. … [And] here and there, some people wanted …. to explain the world.” 26

Exactly. From the beginning, in matters of immediate practical concern in everyday life, humans had to figure things out—where the best hunting is, what plants have useful medicinal properties, which beetles make the most effective poison for arrow-heads, and so on. In these early efforts, as in all inquiry, people made the best guesses they could and, so far as they were able, checked them out. Beyond a certain point, though—when it wasn’t possible to check just by looking, listening, smelling, or tasting, or by asking others with sharper senses or better opportunities to observe—people fell back on folklore, myth, mystery and, as civilizations grew, on appeals to religious or secular authority. Surely there were always some who were more curious than others, more given to experiment, more persistent in trying to figure things out. And certainly there were many precursors of what we now call “modern science”—some truly remarkable, such as the work of the Chinese astronomers who, millennia ago, made observations so accurate that astronomers can still rely on them today.27 But, for a variety of reasons,28 these precursors never quite took hold as modern science has done over the last several centuries.

And with the rise of modern science, empirical inquiry took on whole new dimensions. Indeed, as David Wootton shows, the period that saw the emergence of modern science also saw the development of the now-familiar vocabulary for talking about empirical inquiry: “experience,” “experiment,” “discovery,” “fact,” “hypothesis,” “theory,” “observation,” “evidence.”29 The word “scientist” itself is a surprisingly recent coinage. According to William Whewell, it was first proposed by an “ingenious gentleman” at a meeting of the British Association for the Advancement of Science in the early 1830s, as a general term that, like the German “Natur-Forscher,” would refer to people in all fields of science rather than to people in just one.30 (Sydney Ross tells us that the “ingenious gentleman” in question was Whewell himself.)31

Scientists aren’t unique in making educated guesses and checking them out as best they can; but over the centuries they have come up with innumerable ways to extend, amplify, and refine the process, developing new tools, new procedures, new ways to figure things out better—more thoroughly, more persistently, more precisely, more broadly, more deeply, more imaginatively, …, etc. They have devised new instruments of observation and new methods of purification, analysis, excavation, etc., to seek out more, and new kinds, of evidence; new techniques of measurement and calculation to determine more exactly where evidence points, and even techniques of meta-measurement to measure the accuracy and reliability of first-order measurements. And they have built on the results of previous investigations by earlier generations of scientists to make better-informed guesses, better-designed and better-controlled experiments, and better tools and instruments; developed more informative, more exact, and more discriminating theoretical vocabularies; found ways to stretch their unaided imaginative powers; and so on. Over several centuries, generation upon generation of scientists has developed—here I borrow a word from Francis Bacon—a vast and various array of “helps” to inquiry.32

Bacon was a high-ranking lawyer and a remarkable philosophical thinker, but he was no scientist; indeed, as William Harvey—who really was a scientist—complained, he wrote natural philosophy (i.e., science) “like a Lord Chancellor.”33 Nevertheless, for all its many flaws, Bacon’s New Organon is a visionary work. If we are to understand natural phenomena, Bacon saw, we must be willing to leave the study and the library and explore the world, to experiment, manipulate, get our hands dirty. And he also saw that, if we do this, we can anticipate both “light”—a greater understanding of the natural world; and “fruit”34—the power to predict natural phenomena and so to cope with some of the dangers, and take advantage of some of the opportunities, that they present. To us, now, this seems blindingly obvious; but it’s worth remembering that King James, to whom Bacon dedicated the book—and who was at the time one of the finest scholars in Europe—was completely baffled: Bacon’s vision seemed to him “like the peace of God, it passeth all understanding.”35 Only very gradually, as the sciences got on their feet, would the old habits of appeal to authority or resort to folklore and superstition be superseded; indeed, even today they have by no means been entirely banished.

So there is both continuity and discontinuity. Scientific inquiry uses the same underlying procedures and inferences as everyday inquiry; but by now scientists have enormously improved, refined, amplified, and augmented them. Scientific inquiry has been more persistent, more thorough, more searching, more sustained, more rigorous than everyday inquiry; it has been the work of generation after generation, each of which could build on the successes of the last; and it is by now the professional business of large numbers of people, sometimes cooperating, sometimes competing, allowing division of labor and pooling of evidence. And, while there are still “citizen scientists,”36 as in earlier generations there were “gentleman naturalists,” scientific work has become a recognized profession; and scientific communities have gradually developed a raft of internal social mechanisms that have served, up to a point—though only up to a point—to sustain intellectual honesty and encourage the sharing of evidence.

There is, in short, a constantly evolving array of scientific methods, tools, and techniques of inquiry—methods, tools, etc., often local to specific scientific fields, though sometimes proving useful elsewhere, too. Insofar as these methods, tools, and techniques stretch scientists’ imaginative powers, extend their unaided evidential reach, refine their appraisal of where evidence points, and help sustain honesty, provide incentives to the patience and persistence required by scientific work, and facilitate the communication of results, they enable progress: better measurements, better theories, more sensitive instruments, subtler techniques, finer-grained experimental design, more informative terminology, and so on. And nothing succeeds like success. Each step forward enables further steps, as well as allowing scientists to correct their own or others’ previous missteps—which is why it is sometimes said that science is “self-correcting.” But there’s no magic about it; only (only!)—on the part of many people, over many generations—curiosity, imagination, hard work, patience, persistence, attention to detail, and honest willingness to acknowledge failure, learn from it, and start over, perhaps again and again.

Scientific claims and theories are fallible, revisable: any claim or theory, no matter how well-established or how widely accepted, might be shown by new evidence to be false, or only approximately or only partially true. But there’s a kind of continuum, from scientific claims and theories well-rooted in a strong, dense, tightly-interlocking mesh of evidence, through others reasonably well-rooted and others again fairly well-rooted, to the as-yet wholly speculative and the outright wild and woolly. Most conjectures won’t survive as new evidence comes in; only a few will become part of the enduring edifice of scientific knowledge.

In any scientific community, probably, there will be some who are more radical, ready to try a new conjecture when the existing hypothesis most of their colleagues are content to work with encounters difficulties, and others who are more conservative, disposed to keep trying to modify and adapt the old idea; still, consensus will gradually form: this conjecture is probably correct, that idea is almost certainly mistaken. Ideally, we would find consensus among scientists in a field when, and only when, the evidence is sufficient to indicate that it’s probably safe to rely on this conjecture, but probably a waste of time pursuing work on that rival idea. There’s absolutely no guarantee, however, that scientific consensus will always faithfully track the state of the evidence.

The formation of scientific consensus is only too easily distorted by political pressures, by commercial interests, by the demands of litigation: think of Stalin’s sponsorship of Trofim Lysenko’s ideas about plant genetics37 or, at a more commonplace level, of pharmaceutical companies’ routine practice of withholding unfavorable results from publication.38 There may be strong resistance to acknowledging the evidence for a claim perceived as too new, too radical: think of Darwin’s wry comment that admitting he had come to believe that species aren’t fixed and immutable was “like confessing a murder,”39 or of the strongly skeptical reaction when it was first suggested that stomach ulcers might be caused, not by stress or a too-spicy diet, but by a bacterium.40 An individual, or a team, may have so much influence in a field, or so much control over funding or publication, that their approach continues to prevail even if the evidence is weak, as was perhaps the case with Cyril Burt’s work on the heritability of intelligence.41 And sometimes an unfounded idea somehow takes such firm root that everyone in a field just assumes it’s true: think of the “tetranucleotide hypothesis”—Phoebus Levene’s conjecture that DNA is a “stupid,” monotonous molecule in which the four bases occur in a regular sequence—which, though it was nothing more than a conjecture, was once widely taken for granted by molecular biologists.42 Indeed, its grip was so strong that, even after he’d completed his experiments subjecting the “hereditary principle” to every known test for discriminating the two substances, the results of which clearly pointed to this conclusion, Oswald Avery dared not say in print that DNA, and not protein, is the genetic material.43

The sciences have achieved remarkable things; but there can be no guarantee that they will always advance, let alone that they will always advance at a brisk pace; and no guarantee that there won’t be setbacks, wrong turns, and false starts. As I said, progress has been ragged, uneven, and unpredictable. Sometimes scientific advance is cumulative; sometimes it involves big upheavals—I would say, “revolutions,” except that I intend no Kuhnian implications44—when a key new idea emerges, or a central old idea is discredited; and sometimes, if a really bad idea takes hold, a field may go backwards before the bad idea is dropped and progress can again be made.

And neither, of course, are the sciences complete. It’s not just that, in any scientific field, there are questions yet to be answered. It’s also that, as scientific work proceeds, new and unanticipated questions—sometimes, even, whole new fields or sub-fields of inquiry—will emerge; and, most importantly, that the competence of the sciences is limited. Not every kind of question is susceptible to scientific inquiry; even the most comprehensive future science imaginable wouldn’t explain everything.

Moreover, while those technical helps to scientific inquiry (the instruments, computer programs, statistical methods, and so on) usually get better and better over time, it’s very clear that by now the social mechanisms for sustaining honesty and encouraging evidence-sharing (peer-reviewed publication, assignment of research funding, professional certification and standards, and so forth) are under considerable strain. As science grows bigger, more expensive, politically more consequential, and potentially more profitable, things can go badly wrong. The peer-review system, for example, always flawed,45 is by now seriously dysfunctional46—to the point where William Wilson writes despairingly that “if [it] is good at anything, it appears to be preventing unpopular ideas from being published.”47 It is, as Wilson continues, truly ironic that a newly aggressive scientism—he goes so far as to say, a Cult of Science—should be on the rise precisely as careerism and a bloated bureaucracy threaten to undermine the ideals and the integrity of the sciences.48

2. Spotting the Signs: The Many Manifestations of Scientism

Against this background, the distinctive characters of scientism and of anti-science come into sharper focus. As we shall see, however, both are manifested in many and various ways.

Anti-science comes in a whole variety of shapes and guises, not always mutually compatible. So far, my focus has been on the cynical strain encouraged by rivalry between traditional philosophers of science and the ambitious radical sociologists and historians who, in the latter part of the twentieth century, began to turn their attention to the sciences—a great, noisy chorus of academics proclaiming that the sciences are shot through with sexism, racism, and colonialism, that science is driven by power, politics, rhetoric, negotiation, not evidence, that supposedly “objective” facts and supposedly “objective” reality are man-made, scientists’ own creation, even that the concepts of inquiry, evidence, truth are nothing but ideological humbug. (Of course, if there really were no objective truths, it couldn’t be objectively true that science is shot through with sexism, racism, etc.; and if there really were no objective standards of better and worse evidence, there couldn’t be objectively strong evidence that what scientific theories get accepted primarily depends, not on the evidence, but on scientists’ class interests.) But the anti-science camp also includes religious fundamentalists who reject modern cosmology and the theory of evolution, as well as ordinary, non-partisan people disillusioned after reading too often of scientific fraud, of science distorted by politics, of corruption in the scientific peer-review process, or of large grants for what seem to be banal or incomprehensible projects.

Not surprisingly, scientism is no less complex, no less various, and no less rife with internal tensions than anti-science is. Some in the scientistic camp are simply so impressed by the remarkable achievements of the sciences that they are ready to accept any and every scientific claim unquestioningly, the wildly speculative no less than the well-established; to believe the pronouncements of well-known scientists even on matters far outside their professional competence; and to resist all criticism of the sciences or of current scientific orthodoxies.49 But, like anti-scientific dismissiveness, scientism also has its academic wing.

This academic wing has long included those who insist on criteria by which to distinguish real science, the genuine article, from “pseudo-science”; propose formal-logical or probabilistic models of scientific reasoning; and—relegating potentially awkward sociological and psychological factors influencing scientific work to the “context of discovery”—offer bromides about the “rationality” and “objectivity” of science. But now there is, besides, a great noisy chorus of academics newly energized by the rise of evolutionary psychology and neuroscience, the boom in evangelical atheism, and the scientism at work in our culture at large, urging that such hitherto-uncivilized disciplines as jurisprudence, art criticism, philosophy, etc., look to the sciences for answers to their questions. Others, going even further, proclaim that it’s time to abandon these outdated, pre-scientific disciplines altogether, and to pursue genuinely scientific projects instead. “Neurolaw,” we’re told, promises to transform or even displace the outdated field of jurisprudence,50 “neuroart” the outmoded fields of aesthetics, literary criticism, and such,51 and “neurophilosophy” such primitive, pre-scientific disciplines as ethics, epistemology, metaphysics, aesthetics, and the like.52

And, just like anti-scientific cynicism, scientism can harbor contradictions. The earliest advocates of neurophilosophy, hoping to understand human cognitive processes on the model of the workings of the nervous system of the sea-slug, proclaimed—no, like those postmodernist science critics, they boasted—that their approach undermined the legitimacy of core logical and epistemological concepts.53 (Of course, like the all-too-similar claims of anti-scientific cynics, this is self-defeating: if it were true, the neuroscientific discoveries on which they based their extravagant claims couldn’t be well-warranted by strong evidence, and neither could they be true.)54 And, as we’ll see in the next lecture, by now some self-proclaimed supporters of scientism boast of an even more sweeping nihilism that repudiates every kind of value—the moral and the aesthetic, for example, as well as the epistemological.55

If it is evolutionary psychology and neuroscience that have played the largest role in shaping the character of scientism today—its manifestation in, as Raymond Tallis puts it, “Darwinitis” and “neuromania”56—it seems to be the new atheism that has most marked its tone. Of course, there have long been, and still are, religious people who reject cosmologists’ theories about the origin of the universe and evolutionary biologists’ theories about the origin of mankind because they are at odds with scriptural accounts; and for as long as there has been science, probably, there have been atheists who have welcomed scientific discoveries as confirming their position—though it’s salutary to remember that there were atheists long before modern science got on its feet, and that religious people sometimes welcome scientific theories as confirming their position—stressing, for example, the supposed “fine-tuning” of the earth to human life.57

So what’s new, you might ask, about the “new” atheism?—less its content, it seems, than its style, its swagger. Taking for granted, what is a long way from obvious, that by now science has shown that religious claims are groundless, the new atheism often calls on evolutionary or neurophysiological accounts of the religious impulse to explain it away; perhaps more importantly, assuming that religious people are either scientifically ignorant or willfully blind, it seems to pride itself on its intellectual superiority. We, the new atheists proclaim, are the “Brights”58—which, intentionally or not,59 inevitably suggests that religious people are, well, dim. This adds a new layer of confusion: with the new atheists acting as cheerleaders for the revival of scientism, it can come to seem that anyone who resists scientism must, overtly or covertly, have a religious agenda. But this is a serious misperception; as we will see, there are good and sufficient reasons for resisting scientism quite independent of any religious assumptions.

* * *

Of course, there’s no simple formula to determine when the line between appropriate respect for the achievements of the sciences and inappropriate deference to science has been crossed. There are, however, some characteristic indicators, among which I would include:

Excessive readiness to believe in the absence of good evidence is the epistemological vice of credulity.60 This vice comes in many forms: some people are too ready to believe bizarre Hollywood gossip, others the something-for-nothing promises made by aspiring politicians, others again the advertisements for miraculous dietary aids and other medical quackery, etc. And some—including not a few of those who, priding themselves on their “scientific” skepticism, scoff at claims about the Loch Ness Monster, haunted houses, fringe medical treatments, and the like—are too ready to believe any and every claim made by scientists, including the latest headline-catching study or speculation that will, more likely than not, turn out to be just plain wrong.61 They forget that science is an ongoing enterprise, and that much scientific speculation won’t survive the test of time. This kind of credulity about science is the simplest and most straightforward sign of scientism.

Such credulity naturally encourages what is by now a very common and familiar phenomenon, the use of “science” and its cognates as a kind of shorthand for “good, solid stuff.” This is another sign of scientism:

There’s a real irony here: as we saw, “science” originally meant simply “systematic knowledge,” but gradually became restricted to physics, chemistry, biology, etc.—as we would now say, “the sciences”; but those who turn “science” and “scientific” into honorific terms are in effect restricting the meaning of “knowledge” so as to coincide with the newer, narrower meaning of “science.”

As they become honorific terms, “science” and “scientific” soon lose descriptive content and become near-vacuous expressions of approval. Advertisers urge us to buy their new, scientific detergent or to try their new, scientific dietary supplement; a historian criticizes a rival on the grounds that he has no scientific evidence for his claims;62 phrenology and the phlogiston theory are dismissed as pseudo-sciences; and so forth. This honorific usage even entered our jurisprudence when, in Daubert v. Merrell Dow Pharmaceuticals, the U.S. Supreme Court’s first-ever ruling on the standard of admissibility of expert testimony, Justice Blackmun argued for the majority that, in determining whether such testimony is sufficiently reliable to be heard by a jury, judges should determine that it is genuinely “scientific … knowledge.”63 And, inevitably, as “science” becomes an honorific term, practitioners of other disciplines begin to describe their fields as sciences: “Management Science,” “Library Science,” even “Mortuary Science”64—and, of course, “Creation Science.”

When “scientific” is used as equivalent to “epistemologically strong,” of course it seems enormously important to find some way to distinguish genuinely scientific, epistemologically strong work from epistemologically weak “pseudo-science”—i.e., to find some criterion by which to demarcate real science from pretenders. Hence the third sign of scientism:

Karl Popper, most famous of twentieth-century demarcationist philosophers of science, thought he had a simple way to distinguish work like Einstein’s (which he deemed good, genuinely scientific) from Marx’s “scientific socialism” and Freud’s and Jung’s psycho-analytic theories (which he deemed bad, pseudo-science at best): the mark of a genuinely scientific claim or theory is that it is falsifiable. This idea is at work in Justice Blackmun’s first, vaguely Popperian “Daubert factor,” suggesting that, in assessing whether expert testimony is really scientific, and hence reliable enough to be admitted, judges ask: “can it be (and has it been) tested?”;65 and in those constitutional cases where judges reach for vaguely-Popperian criteria to argue that creation science,66 or Intelligent Design Theory,67 isn’t really science at all.

The demarcationist impulse often manifests itself in the form of what you might call “methodism,” the idea that real science, the genuine article, can be identified by means of its distinctive method or procedure of inquiry. Hence the next sign of scientism:

Philosophers argue about whether the Scientific Method is deductive, inductive, probabilistic, Bayesian, game-theoretical, error-theoretical, or what; scientists themselves, if put on the spot to say something about how they do what they do, sometimes parrot some half-understood idea from this tradition—most often, something vaguely Popperian in tenor. Textbooks (and scientific organizations) tend either to offer cook-book, step-by-step instructions so formulaic that they tell you nothing of substance—e.g., “make a hypothesis; design an experiment; conduct the experiment; write up the results; submit the results for peer-review”;68 or else describe the specialized procedures or techniques gradually developed over the years in their particular field of scientific work, so specific that they simply don’t apply in other fields.69

The idea that there is some distinctive method of inquiry used by all scientists and only by scientists inevitably encourages some to adopt what they take to be scientific methods, tools, and techniques as if this were sufficient by itself to make their work rigorous, “scientific” in the honorific sense. All too often, the result is—well, in Bentham’s phrase, it’s “nonsense upon stilts,”70 work lacking in real rigor but disguised in the technical trappings of science. This is another sign of scientism:

Scientists themselves are not immune to this kind of scientism. For example, one of Merrell Dow’s epidemiological studies of Bendectin, the morning-sickness drug at issue in Daubert, though decked out with all the usual statistical apparatus, failed to distinguish women who took the drug during the period of pregnancy when fetal limbs are forming from those who took it at other stages—and then, predictably, concluded that there was no evidence that Bendectin caused limb-reduction birth defects.71 Merck’s VIGOR trial of the arthritis drug Vioxx, as a result of which the FDA approved it for sale in the U.S., though also dressed up with all the standard statistical apparatus, tracked gastro-intestinal effects for longer than it tracked cardiovascular effects—with the result that the several subjects who died of heart attacks and strokes after taking the drug could be excluded from the results, since their deaths occurred outside the study period.72

Dressing up dreck is even commoner, probably, in the social sciences, where lengthy introductory chapters on “methodology” are sometimes only window-dressing, and graphs, tables, and statistics sometimes focus attention on variables that can be measured rather than those that really matter, or represent variables so poorly defined that no reasonable conclusion can be drawn. David Abrahamsen’s Second Law of Criminal Behavior, “C = (T+S)/R,” is a classic: “[a] criminal act is the sum of a person’s criminalistic tendencies plus his total situation, divided by the amount of his resistance.”73 The forensic sciences are also susceptible to this kind of thing. The “ACE methodology” for fingerprint identification, for example, is little more than a list of steps—analysis, comparison, evaluation; and the addition of “V,” for “verification,” is much less reassuring than it sounds, since all it means is “get another fingerprint examiner to check.” “The scientific approach of the ACE-V process was detailed in [a 2009] article by the FBI,” write the authors of a recent report on the accuracy of fingerprint identifications;74 but when you look closer you find that this FBI article simply parrots a completely unhelpful textbook understanding of “scientific method”: make an observation, generate a hypothesis, conduct tests, generate conclusions, confirm through replication, record or present the conclusions.75

And now for my last two signs of scientism:

Because the sciences have made many true discoveries, they enjoy considerable prestige. So, not surprisingly, some people come to imagine that science could solve virtually all our problems—for example, that it could provide well-founded responses to vexing questions of public policy. Moreover, it’s second nature for scientists to press outward, to tackle the new questions that inevitably arise as older ones are answered, to explore hitherto-unexplored phenomena, to try out tools and techniques that have proven useful in one area to see if they can also be helpful in others. So, not surprisingly, some begin to aspire to take over work hitherto left in the hands of less-prestigious non-scientific disciplines, and do a proper job of it.76 And some in those less-prestigious fields, feeling themselves the poor relations in the academy and aspiring to share in the prestige of the “scientific,” begin to call on one or another of the sciences to solve the problems with which they’ve been wrestling unsuccessfully—to provide evolutionary answers to questions of ethics, for example, or neuroscientific answers to puzzles in philosophy of mind.

Again not surprisingly, these scientistic efforts to colonize other areas of culture often fail. And when they do, many respond, not by acknowledging frankly that the sciences have overreached, but by casting aspersions on the questions that prove recalcitrant and the fields that resist colonization. Hence the last sign of scientism on my list:

This takes many forms: from government efforts to focus resources on science education at the expense of other fields, through dismissive attitudes to the study of aesthetic, ethical, or other values recalcitrant to scientific explanation, to outright denial of the legitimacy of whole fields of human endeavor.

These seven signs of scientism roughly parallel the many and various manifestations with which I began this section; and, as we have seen, they are intimately interconnected—the honorific use of “science” and its cognates leading very naturally to a preoccupation with finding a criterion of demarcation of the genuinely scientific, this in turn to a preoccupation with identifying some method distinctive of the sciences, and so on. Of course, these signs also reflect some of the tensions within scientism: e.g., between the concern to fortify the frontiers of genuine science, the real thing, and the hope of extending the domain of science to previously-unoccupied territories; between attempts to colonize other areas of culture, and the impulse to denigrate whatever falls outside the scope of the sciences; and so on. But the key point, as we’ll soon see, is that every one of them betrays some misunderstanding of the scientific enterprise.

3. Mapping the Misunderstandings: The False Presuppositions of Scientism

There’s some temptation simply to point out that, since what the word “scientism” means is excessive or undue deference to the sciences, it’s trivially true that scientism is undesirable. So it is; but this doesn’t get us very far. It may alert us to the need to understand why scientism is a bad thing, but it doesn’t, by itself, throw any light on the matter. There’s some temptation, also, to rely on pointing out that, no less than anti-scientific cynicism, scientism poses real dangers: it threatens to cloud our appreciation of our distinctively human mindedness and of the extraordinary array of intellectual and imaginative artifacts this mindedness has enabled us to create,77 and even to lead to its own kind of nihilism. But just stressing that scientism is a threat to the health of our culture, or even explaining why it is, doesn’t do the whole job, either. No: we can get to the root of the problem only by identifying the false presuppositions on which scientism rests; which is what I shall try to do.

(i) As my phrase “forgetting fallibility” suggests, credulity about scientific claims betrays a serious misunderstanding of how the sciences advance: not infallible step by infallible step, but by fits and starts, with numerous wrong turns and missteps along the way. Plenty of scientific studies and experiments are poorly-conceived, poorly-conducted, or both; and the results even of well-conceived and well-conducted studies and experiments can be misleading. While work on a scientific question is ongoing, capable scientists in a field may quite reasonably disagree about which of the rival approaches is likeliest to work out—and the rest of us just have to wait and see. The sciences have made many true discoveries, yes; but it obviously doesn’t follow that every claim made by a scientist, or every claim made by a scientist in his own field of expertise, let alone every pronouncement of a well-known scientist on whatever subject, will be true. Far from it: most scientific conjectures will probably turn out to be false, many will be found to be only partially or approximately true, and many will prove to be misleading; and of course nobody is an expert outside his own field.

(ii) The honorific use of the words “science,” “scientific,” etc. as generic terms of epistemological praise betrays a similar blindness to the often fumbling and always fallible character of scientific work. True, over many generations the sciences have found ways to inquire better, to overcome or mitigate some natural human cognitive limitations and weaknesses. But this doesn’t justify using “scientific” as shorthand for “strong, reliable, epistemologically good”; and neither does it justify pretending that bad science—poorly-conceived or poorly-conducted scientific work—isn’t really science at all.78

The sciences have devised sophisticated tools and techniques to bring previously-inaccessible evidence within their reach, found ingenious ways to design more-informative experiments, devised mathematical and other methods to appraise evidence more scrupulously, and so on. But it by no means follows that all or that only scientists are good, careful, thorough inquirers; and of course it isn’t true. Even with all these remarkable tools, scientific work is sometimes weak—relying on careless or biased observations or badly-designed experiments or studies, for example, or on botched statistical calculations, or trimmed or fudged results. Indeed, as I suggested earlier, nowadays the severe pressure on scientists to get grants and to publish, along with a burgeoning scientific bureaucracy and a badly broken peer-review system, positively encourage weak, flawed, and even fraudulent work. Moreover, plenty of excellent scientific work has been done without the benefit of sophisticated tools: think of those astronomers in ancient China, who managed without radio-telescopes; or of Charles Darwin, who sometimes checked the size of specimens against his handkerchief.79

(iii) Once you recognize that there can be poor scientific work as well as good, not to mention strong non-scientific work as well as weak, the “problem of demarcation” loses much of its urgency. This is just as well; for the task of identifying the frontiers to be fortified has proven quite intractable.

Popper’s idea that a claim or a theory is scientific just in case it is falsifiable has been enormously influential not only in philosophy of science but also among scientists themselves and even, as we saw, in the U.S. legal system; nonetheless, it is a badly confused idea. Popper purports to offer a theory of “objective scientific knowledge” based on the bold thesis that, while scientific theories can’t be shown to be true, they can be shown to be false; but what he actually gives us is nothing but a thinly-disguised skepticism. Why so? A theory is falsified, he tells us, when a basic statement with which it is incompatible is accepted. But he goes on to insist that what basic statements are accepted and what rejected is entirely a matter of convention, a “decision” on the part of the scientific community. So the fact that a theory has been “falsified,” in Popper’s sense, doesn’t mean that it is false; and Popper’s account implies that scientific theories can no more be shown to be false than they can be shown to be true. No wonder he couldn’t decide whether the theory of evolution is or isn’t science, and vacillated over whether the problem with “scientific socialism” is that it isn’t falsifiable, or that when it was falsified, its proponents didn’t give it up; no wonder, either, that by 1959 he had decided that his criterion of demarcation was itself nothing but an optional “convention.”80

Others have suggested that the distinguishing mark of real sciences is that they involve controlled experiments, that they make successful predictions, that their theories are well-tested, that they grow and progress, or …, etc. But astronomy doesn’t make controlled experiments; evolutionary biology makes no predictions;81 many scientific claims are not, as yet, well-tested; and not all scientific fields are always growing or progressing.

Larry Laudan writes that demarcationist projects “served neither to explicate the paradigmatic uses of ‘scientific’ … nor to perform the critical stable-clearing for which [they were] originally intended.”82 True enough; but I would stress, rather, that the preoccupation with demarcation betrays a seriously oversimplified conception of what is really a dense, complex mesh of similarities and differences among the disciplines we count as sciences, and of continuities and discontinuities between these disciplines and others not so classified. It loses sight of the elements of historical accident and of convention in our classification of disciplines, of the fuzzy, shifting, and contested boundaries of science, and of the sheer variety of the category “non-science.” And it tempts us to forget that “not science” includes many legitimate and valuable enterprises: writing fiction or making art, for example—excluded because they aren’t kinds of inquiry; pure mathematics, legal or literary interpretation, inquiry into moral, aesthetic, or epistemological values—excluded because they aren’t descriptive but normative; not to mention historical research or metaphysics—both, again, by my lights anyway, legitimate and valuable kinds of inquiry but not, in the modern sense, sciences.

Demarcationists often emphasize the importance of distinguishing science from pseudo-science. But this idea obscures more than it illuminates. It uses “pseudo-science” and “pseudo-scientific” as terms of generic epistemological disparagement, in much the same way that “science” and “scientific” are nowadays often used as terms of generic epistemological praise; but what we really want to know is what, specifically, is wrong with the work in question. Calling it “pseudo-science” is no help at all.

You may object that sometimes, e.g., in legal contexts, we really need some way to discriminate genuine science from pretenders. Perhaps the first thing to say is that, even if the law really did need to do this, that wouldn’t mean that the boundaries of science really are sharp and clear; after all, the law really does need to adopt precise definitions of, say, “adult,” or “drunk,” but that doesn’t mean that there really is a sharp line between adolescents and grown-ups, or between someone who’s drunk and someone who’s a bit tipsy. But, in any case, it’s not clear that the legal system really does need to distinguish genuine science from pseudo-science.

The serious issue in those cases about the standard of admissibility of expert testimony should have been, not how to tell whether expert testimony is genuinely scientific, but how to tell whether it’s reliable enough to be presented to a jury.83 The apparent need for a criterion of demarcation of science arose only because Justice Blackmun’s ruling in Daubert confused “reliable” and “scientific.” Similarly, the serious issue in those constitutional cases over the teaching of creation science or Intelligent Design Theory in public high-school biology classes should have been, not whether these are scientific theories, but whether they are religious. The apparent need for a criterion of demarcation of science arose only because one prong of the Lemon test84 for constitutionality under the Establishment Clause requires that a statute have a secular purpose,85 prompting proponents to argue that teaching creation science or IDT does have such a purpose—namely, improving science education; and it was this that obliged their opponents to argue that creation science and IDT simply aren’t scientific theories, and obliged judges to determine whether they are or not.

(iv) “Make an informed guess about what might explain a puzzling phenomenon; figure out what the consequences would be if this conjecture were true; check out how well those consequences stand up to any evidence you have and whatever further evidence you can lay hands on; and then use your judgment whether to accept the conjecture provisionally, modify it, drop it and start over, or just wait until you can get more evidence.” Fair enough; but this methodological advice, if you can call it that, applies no less to historical research, legal scholarship, detective work, or serious everyday inquiry than it does to inquiry in the sciences. “Design a randomized, double-blind, controlled study with a large-enough number of subjects and controls to compare the effects of drug X with the effects of a placebo; use these statistical techniques to calculate the results, those to check for statistical significance; …, etc.” Fair enough—for epidemiologists conducting clinical trials; but this methodological advice is no help to an astronomer, a molecular biologist, a sociologist, or an anthropologist.

And this pattern is no accident: contrary to what methodism assumes, there is no method used by all scientists and only by scientists. Rather there is, on the one hand, the familiar procedure of conjecture and checking common to all serious empirical inquiry; and, on the other, the myriad specialized techniques and procedures devised by scientists in various fields to get more evidence of the kind they need and a subtler sense of where it points. But those underlying procedures aren’t used only by scientists, and those special techniques, which are constantly evolving and often local to a specific field, aren’t used by all scientists.

(v) Those specialized tools and techniques have helped the sciences advance; and they have also sometimes been borrowed and put to good use by inquirers in other fields—such as the historians who borrowed medical-imaging techniques to distinguish traces of writing from the effects of weathering on the lead “postcards” Roman soldiers used to write home.86 But when scientific tools and techniques don’t make the work more rigorous but only disguise its lack of rigor, the trappings of science are mistaken for its substance. Of course, the idea that tables, graphs, mathematics, statistical and other technical jargon, etc., will somehow magically make what you do precise, rigorous, and accurate is an illusion. Borrowing the trappings of serious scientific work doesn’t, by itself, make your work serious—any more than getting your citations in perfect BlueBook form,87 by itself, makes you a serious legal scholar. Dressing up dreck in scientific trappings is, in short, the pretense of serious intellectual work, without the substance.

But, it may be objected, isn’t this exactly why we need the concept of pseudo-science, which you dismissed earlier as unhelpful? I don’t think so. After all, while it’s quite common for people in non-scientific fields to dress up dreck in hopes of making it look more rigorous than it really is, scientists themselves sometimes do the same thing. And a bad epidemiological study where elaborate statistical apparatus serves only to distract attention from a biased design is an epidemiological study nonetheless, albeit a poor one. It doesn’t, as the phrase “pseudo-science” suggests, falsely pretend to be science; it is (bad) science. The problem is pseudo-rigor, not pseudo-science.

(vi) By now, many questions once thought beyond the scope of the sciences have been found to be within their competence after all. In the seventeenth century philosophers debated whether a man born blind, if he were made able to see, would immediately be able to distinguish between a sphere and a cube, previously known to him only by touch, simply by looking at them;88 now, we see this as the kind of question to be settled by medical scientists.89 But it doesn’t follow that every kind of question is, or will eventually be, susceptible to resolution by the sciences, that the sciences can colonize every area of culture. The mistake here is to suppose that, because the sciences have made so many remarkable discoveries, there are no limits to their reach. To be sure, we can’t know now what future scientific work might be able to achieve; nevertheless, it’s clear enough that certain kinds of question are simply not susceptible to resolution by the sciences. Even though, as I argued earlier, the boundaries of science are fuzzy, shifting, and contested, not every legitimate kind of question falls within those fuzzy, shifting, contested boundaries.

This thought, however, needs careful handling. Religious people sometimes say that science can explain how things happen, but not why: how species evolved, for example, but not why, not for what purpose.90 Aggressive atheists predictably respond that these supposed why-questions aren’t really legitimate questions at all. I would put it differently: the theory of evolution is by now very well-warranted by a dense mesh of tightly interlocking lines of evidence, and provides an explanation of the origin of species entirely in terms of past causes, without postulating any purpose, plan, or goal. So in this instance, the answer to “for what purpose?” is: not for any purpose. Similarly, when religious people ask, as some did in the aftermath of Hurricane Andrew, “why did the storm destroy their church, but not ours?” my answer would be: presumably there is a meteorological explanation of why the hurricane hit there rather than here; but there was no reason why it hit this church rather than that—that was coincidence. More importantly, whatever your view about the legitimacy or otherwise of the kinds of question to which theology offers answers or the legitimacy or otherwise of the kinds of answer it offers,91 there are many other kinds of legitimate but non-scientific question.

Take a question of public policy, such as whether we should dam this river at this place. Quite properly, we want such decisions to be made on the basis of the best information available. So we look to specialists in hydro-electric engineering to tell us how much electricity we can expect the dam to supply; we look to environmental scientists to tell us what the effects on the ecology of the region would be; perhaps we ask economists or sociologists to estimate costs and benefits to local communities. The scientistic mistake is to imagine that this could be sufficient to tell us whether or not we should build the dam. It couldn’t; even if we had that hypothetical best account of the benefits and the costs, there would still be something left over: whether the benefits outweigh the costs—and this is a matter of judgment, not to be settled by additional factual information or, for that matter, by any decision-theoretic algorithm.

Or take efforts by evolutionary psychologists or neuroscientists to colonize ethics. Evolutionary psychology may be able to explain why altruism has survival value, or to teach us something about the biological origins of what are sometimes called the “moral sentiments”; neuroscience may be able to teach us something about what’s going on in the brain when we feel disgust, righteous indignation, empathy, envy, guilt, remorse, pride, etc. But none of this could tell us whether helping others is morally desirable or, if so, why it is, or how to weigh it against other morally-desirable things; nor could it tell us which of our hard-wired moral sentiments are truly morally desirable and which not—or, again, why.

(vii) When colonizing efforts fail, as inevitably they sometimes do, those of a scientistic turn of mind—perhaps on the principle that the best defense is a strong offense—may respond by denigrating the fields that resist colonization, suggesting that they are inherently inferior, that they are luxuries we really can’t afford, or even that they aren’t really legitimate fields at all. And sometimes, I suspect, scientistic disdain for the different really is little more than a matter of temperament. At any rate, some of those who look down on non-scientific endeavors seem to feel, consciously or otherwise, that making music or art, telling stories, dancing, and the like are—dare I say it?—effete, inherently inferior to the more manly task of forcing nature to give up her secrets.92 All I can say to this is that, as I see it, our culture, and my life, would be much poorer without the work of scientists, and much poorer, also, without the work of playwrights, poets, novelists, composers, artists, etc.—though poorer, naturally, in different ways; and that the too tough-minded simply fail to appreciate the richness of our many-faceted, intertwining human capacities.

Others of a scientistic bent seem to assume that the intellectual work of historians, musicologists, legal scholars and theorists, literary scholars, philosophers, etc., is inherently soft, squidgy, and weak compared with the clean, hard-edged intellectual work of the sciences. It’s true, as Percy Bridgman observed,93 that successful scientific inquiry demands an honest and unshrinking respect for the facts. (Think of Watson and Crick’s willingness to go back to the drawing board and start over after Rosalind Franklin pointed out that their early model only had room for less than 10% of the water molecules DNA was known to contain.)94 But while in some non-scientific fields the pressure of facts is looser and less direct, nevertheless, the same honesty, the same humility is required of the serious inquirer whatever his subject-matter—of a historian or a legal scholar, for example, no less than of a physicist or a psychologist.

Do I mean, then, to ally myself with those who feel a professional obligation to defend the humanities against the present predilection of our universities for neglecting these areas and devoting resources instead to what the jargon calls “STEM” subjects (science, technology, engineering, and mathematics)? No, not exactly. That predilection is indeed a kind of institutionalized scientism. But by my lights the appropriate response is not to cast around for arguments that a degree in history or religious studies or Sanskrit or philosophy will really be just as useful on the job market as a degree in computer science, petroleum engineering, or accounting, but to think carefully through the complex pressures to which university administrators are responding: federal policy; half-articulated concerns about the needs of the economy; parents’ and students’ worries about graduates’ employability; and, I’m afraid, an inchoate but not entirely unjustified sense that, in recent decades, the humanities have been in serious decline, not to say near-collapse.

And what, you will be wondering, do I have to say to those who, going even further, maintain that non-scientific fields aren’t just relatively weak or just a little frivolous but outright misconceived, that there is no legitimate inquiry outside the sciences? It’s tempting just to repeat that, on the contrary, there’s obviously a whole host of perfectly legitimate kinds of question that not even the most sophisticated future science imaginable could answer. And for now, that’s about all I can do. But in the next lecture, where I turn my attention to the rising tide of scientism in philosophy specifically, I can take at least a few steps towards a fuller and more satisfying response. The scientistic philosophies in vogue today, I shall argue, are hollow at the core, in principle incapable of providing answers to crucial questions about how the world must be, and how we must be, if science is to be possible; and so they leave the very science on which they rely with no rational means of support. And to answer those crucial questions, I will continue, we need a philosophical approach that is neither purely a priori nor scientistic. Of course, as we’ll soon see, the devil is in the details.

Lecture II: Scientistic Philosophy, No; Scientific Philosophy, Yes

The kind of philosophy which interests me and must, I think, interest everybody, is that philosophy, which uses the most rational methods it can devise, for finding out the little that can as yet be found out about the universe of mind and matter from those observations which every person can make in every hour of his waking life. It will not include matters which are more conveniently studied by students of special sciences such as psychology. …

It is true that philosophy is in a lamentably crude condition at present; that very little is really established about it; while most philosophers set up a pretension of knowing all there is to know—a pretension calculated to disgust anybody who is at home in any real science. But all we have to do is to turn our backs upon all such vicious conduct, and we shall find ourselves enjoying the advantages of having an almost virgin soil to till, where a given amount of really scientific work will bring in an extraordinary harvest…. C. S. Peirce.1

In “Scientific Philosophy,” the 1905 paper from which this passage is taken, Peirce urges that, if philosophy is to make real progress, philosophers will need to tackle their distinctive questions and problems in the same spirit, from the same desire to find things out, that has motivated the best work in (as he might say, meaning what we would now call “the sciences”) the “special sciences”; and, like inquirers in the special sciences, they will need to call on experience—though not, like them, on special, recherché experience, but on familiar, everyday experience. As he writes elsewhere, the essential difference between metaphysics and meteorology, linguistics, or chemistry is that it needs no “microscopes, voyages,” etc., but only “such experience as every man undergoes every day and hour of his life.”2

Today’s cultural landscape is very different from Peirce’s—and today’s philosophical climate is very different from the philosophical climate of his day. Our discipline is no longer beholden to theologians, as it was in Peirce’s time, so his complaints about the sham reasoning characteristic of “seminary philosophers”3 no longer resonate as they once did; and his concern to rescue “the good ship Philosophy … from the lawless rovers of the sea of literature”4—though it will surely remind us of the recent but now apparently receding tide of postmodern cynicism—no longer seems so pressing. Moreover, while much philosophy remains as thoroughly a priori and as pointlessly disputatious as ever, there seems to be a growing dissatisfaction with the long-dominant analytic paradigm, and a growing interest in allying our discipline, somehow, with the sciences. So nowadays many philosophers’ response to Peirce’s call for reform would probably be that philosophy already is, or at least is rapidly becoming, scientific; so that, while it may have been necessary in his own day, his advice is now old hat,5 completely out of date.

I couldn’t agree less. By and large, I fear, philosophy is becoming, not more scientific, in the sense Peirce had in mind, but more scientistic. What we see is not sustained, serious efforts to make philosophical inquiry as fruitful and as rigorous as the best scientific inquiry has been, but instead, a raft of sterile exercises in faux rigor, a flimsy pretense that philosophy already is scientific; and not solid and industrious investigation of philosophical questions, but bold promises that this or that result from the sciences will do the job for us—and when, inevitably, these promises go unfulfilled, even bolder claims that this or that philosophical question, or even this or that entire field of philosophy, since it proves recalcitrant to scientific resolution, must be misconceived, and should simply be abandoned. The upshot is, to borrow a word of Peirce’s, “unphilosophical”6 in the extreme. In short, the good ship Philosophy is sinking fast; and Peirce’s advice is more apropos than ever.

Articulating more exactly what the root of the trouble is, however, is challenging to say the least. For—beyond dissatisfaction, overt or covert, with neo-analytic philosophy, and very often an element, overt or covert, of anti-religious sentiment—the scientism presently at work in philosophy is no less various than the scientism at work elsewhere in our culture. Some are proposing to turn philosophy into a kind of descriptive meta-science; others are looking to cognitive psychology, or evolutionary biology, or neuroscience, or physics, or …, etc., to resolve philosophical questions; others again, finding that philosophical questions resist resolution by whichever science they favor, are concluding that these questions must be misconceived; and what looks on the surface like a unified, revolutionary movement in the direction of “experimental” philosophy turns out to encompass several different projects, some potentially radical, others, at bottom, remarkably conventional.

For me, at least, this post-analytic adulation of science is almost as disorienting as the anti-scientific disparagement of science a few decades ago—another deafening din of philosophical axes being ground, and a new cacophony of confusing “isms”: “naturalism,” “reductionism,” “physicalism,” “scientific realism,” “radically naturalistic metaphysics,” and, yes, even “scientism.” Once you step back far enough to hear yourself think, however, you soon realize that all these scientistic proposals—whether the idea is to transform philosophy into meta-science, to invite one or another of the sciences to colonize it, or to abandon it altogether in favor of scientific work—have a common flaw.

The underlying thought is simple enough, though its ramifications—well, they ramify alarmingly, as philosophical ramifications are apt to do. It is this: if successful scientific inquiry is to be even possible, there must be a real world, a world that is independent of how we believe it to be; and this world can’t be a complete chaos of unrelated things and events—there must be kinds of stuff, things, events, etc., natural phenomena, and laws of nature. Moreover, we humans must have the sensory apparatus to perceive particular things and events in the world, and the cognitive capacity to represent those things and events, to form generalized explanatory conjectures and check out how those conjectures stand up to further experience, and to marshal and record what we learn of the world so those who come later can build on it.

All scientific work rests on these presuppositions; but the special sciences can neither explain nor justify them—that task falls to philosophy. And this is why the idea that philosophy should focus exclusively on the sciences lacks a cogent rationale; why efforts to squeeze answers to substantial epistemological, metaphysical, etc., questions out of fundamental physics, psychological experiments, evolutionary theorizing, or those fMRI brain-images of which aficionados of “neurophilosophy” are so fond invariably fall short; and why assuming that only questions resoluble by the sciences are legitimate leaves the very scientific results on which you rely hanging in mid-air with no rationally defensible means of support—in short, why all these forms of philosophical scientism fail.

Now, I fear, some may take me to be defending the idea that the job of philosophy is to provide a priori foundations for the scientific enterprise, and urging that we circle the wagons and retreat to the safety of the old analytic paradigm, relying on our conceptual or linguistic intuitions, insisting on the autonomy of our discipline, and ignoring what the sciences have to say. But this would be a complete misunderstanding—a misunderstanding based on a dichotomy I emphatically reject: that philosophy must either be a purely analytic enterprise, or else turn to scientism. Empirical knowledge includes scientific knowledge, yes—but it includes much more besides: historical knowledge, for example, legal knowledge, culinary knowledge, etc., etc., and the everyday knowledge to which Peirce alludes, the knowledge available to anyone in his daily interactions with the world and with others. And what’s needed to get a grip on the questions scientistic philosophy ignores or evades or dismisses outright is, precisely, to pay close attention to “those observations which every person can make in every hour of his waking life,” and to devote serious reflection to what they reveal.

The first goal of this second lecture is to show that the scientistic philosophies in vogue today are hollow at the core: that, in the name of science, they duck the very questions on the answers to which our capacity to figure out something of how the world is, and hence the possibility of the scientific enterprise, depend (§1). The second goal is, with Peirce’s help, to suggest how—going beyond the limitations of the analytic approach, but avoiding an unphilosophical scientism—we might begin to articulate answers to some of those questions (§2). And then it will be time to turn briefly to Peirce’s thoughts about the motive from which philosophy should be undertaken, and what these thoughts reveal about the perverse incentives partly responsible for the present sad state of our profession; after which it will remain only to show how the proposed approach avoids scientism, and how it explains the seductive illusion that philosophy can be conducted purely a priori (§3).

1. Diagnosing a Disaster: The Hollow Core of Scientistic Philosophy

The focus here will be, not on scientists’ efforts to colonize philosophy,7 but on philosophers’ hopes of handing their discipline over to one or another of the sciences. And I won’t engage in detailed historical exploration of scientistic themes in twentieth-century philosophy. But I will mention two earlier forms of philosophical scientism that set the stage for present trends. One, going back almost a century, is the logical positivists’ effort to banish the traditional problems of metaphysics, aesthetics, ethics, etc., to the realm of the cognitively meaningless and, at the same time, to charge philosophy with the supposedly all-important task of articulating the “logic of science”—making our discipline once again (as Moritz Schlick announced) “Queen of the Sciences,” albeit with a distinctly shrunken empire:8 a self-flattering idea to which, as we shall see, some philosophers seem recently to have returned. Another, going back almost fifty years, is Quine’s “Epistemology Naturalized,”9 in which more than one of the various forms of philosophical scientism in vogue today can already be discerned in embryo.

Even before “Epistemology Naturalized,” Quine’s critique of the analytic-synthetic distinction10 and his skepticism about meaning11 prefigured a shift from older positivists like Schlick or Rudolf Carnap, and a break with the analytic paradigm; and his doubts about intensional concepts, belief among them,12 put epistemology in his sights. But it’s the multiple ambiguities of “Epistemology Naturalized”13 that are most relevant here. On a modest reading, Quine seemed to suggest that epistemology can’t be conducted purely a priori, and that it might have something to learn from the sciences of cognition. On a more ambitious reading, he seemed to suggest that epistemological questions might simply be handed over to psychology, evolutionary biology, or maybe even physics to resolve. And on the most ambitious reading, he seemed to suggest that supposed epistemological problems not resoluble by the sciences are misconceived, and should be jettisoned.

How did Quine manage to suggest three apparently competing positions in one short paper?—in part, by using the word “science” in two quite different ways: sometimes in something like the older, broader sense, to refer to our presumed empirical knowledge generally, and sometimes in the modern, narrower sense, to refer to those specific fields now classified as sciences.14 This made it all too easy to elide the relatively modest claim that epistemology is part of science in the broadest sense, i.e., that it is at least partly empirical, into the much more ambitious claim that epistemological questions can be answered by science in the narrow sense, i.e., by one or another of the sciences. But then the sheer implausibility of the idea that psychology or evolutionary biology, let alone physics, could answer such characteristically epistemological questions as “what makes evidence stronger or weaker?” or “is induction valid?” made it all too tempting to conclude that these aren’t really legitimate questions after all.

On its most modest reading, Quine’s paper was a step in the right direction, towards an acknowledgment that philosophy is, or should be, about the world, not just about our language or our concepts. But it was the more ambitious, scientistic positions that caught on. By the 1980s, Alvin Goldman was announcing in Epistemology and Cognition15 that the cognitive sciences could tell us, for example, whether the structure of epistemic justification is foundationalist or coherentist. Others went still further. Stephen Stich informed us that cognitive science had displaced “folk psychology” by showing that there simply are no beliefs; so that epistemology is entirely misconceived.16 And the Churchlands proclaimed that neuroscience had shown the folk ontology of beliefs and desires to be as mythical as phlogiston; and so, again, that epistemology, which takes this folk-psychological ontology for granted, is nothing but an old, failed pseudo-discipline long overdue for the scrapheap.17

Not surprisingly, these bold scientistic promises and even bolder scientistic pronouncements of the death of epistemology fell flat on their faces. The cognitive-psychological studies that Goldman reported in the second half of his book failed to engage with the philosophical analyses he offered in the first half; and his promise of an experimental resolution of the debate between foundationalism and coherentism was never honored.18 The studies on which Stich relied fell so far short of showing that there are no beliefs that one of them actually bore the sub-title, “The Origin and Accuracy of Beliefs about One’s Own Mental States” (!)19 The neuroscientific work the Churchlands cited—whether focused on the ganglia of the sea-slug, on human infants, on pre-propositional capacities such as recognizing a vowel sound, or on motor capacities like catching a ball—went nowhere even close to establishing their revolutionary conclusions.20 In any case, as I said in the previous lecture, such overweening claims were self-defeating: if epistemology really were misconceived, the idea of there being evidence for believing something could be nothing but sheer superstition, and the science on which Stich and the Churchlands called could have no evidentiary support.

At the time, Goldman’s scientistic hope of colonizing epistemology for cognitive science and Stich’s and the Churchlands’ melodramatic scientistic dismissals of the entire field probably seemed to many, as they did to me, like bizarre aberrations—manifestations of “opportunistic naturalism,” as I put it in 1993,21 philosophers’ hope of jumping on the newest and most prestigious scientific bandwagon.22 By now, however, it’s clear that they were harbingers of the present tidal wave of scientistic philosophy.

* * *

Quine had suggested, specifically, that the theory of evolution might explain why humans’ “innate quality space,” our inborn dispositions to see certain things as alike and others as unlike, might roughly correspond to real natural kinds.23 Of course, even in 1969 the idea of “evolutionary epistemology” was far from new;24 and since then there has been a good deal of work on the biological preconditions of knowledge and inquiry. Sometimes—as with Popper, who tried to persuade us that the method of conjecture and refutation stressed in his philosophy of science was analogous to random mutation and selective retention in biology25—“evolutionary epistemology” wasn’t much more than a metaphor; sometimes—as with the Just-So story Michael Ruse offered by way of “argument” that the scientific method is part of our evolutionary heritage26—it was off-hand and casual. Still, much of what was produced was serious, modest interdisciplinary work, and not, by my lights, scientistic.

But it’s worth pausing for a moment over Hilary Kornblith’s Inductive Inference and Its Natural Ground27 because of its striking combination of insight and blindness. The insight: Kornblith is right on target about the questions that need to be tackled—“What is the world, that we may know it?” and “what are we, that we may know the world?”28 The blind spot: not noticing Quine’s double use of “science,” Kornblith simply follows him in assuming that, since these are empirical questions, they must be questions for the sciences to resolve.29 As a result, he succumbs to a kind of scientism: offering, by way of answer to his second question, the results of psychological research that—though it surely has contributory relevance—couldn’t possibly, by itself, do the job.30

But my main focus here will be today’s newer, and much brasher, styles of scientistic philosophy, beginning in the early years of this century with the first breathless announcements of the birth of “experimental philosophy.” As we saw in the previous lecture, this phrase was at one time the usual term for what we would now call “the sciences”;31 this time around, however, it refers to a philosophical “movement,” as its proponents call it, with a brand name, a logo, and even an anthem.

Especially given the tone of some enthusiasts’ YouTube self-promotion, it’s tempting to dismiss the whole “experimental philosophy” enterprise with Peirce’s mordant observation that “[i]conoclastic inventions are always cheap and often nasty.”32 But no; it’s worth looking more closely because, while it’s clear enough what experimental philosophers are against—armchair linguistic or conceptual analysis relying on the individual philosopher’s own “intuitions”—it’s much less clear what, exactly, they’re for. Moreover, while the talk of a “movement,” the branding (“X-phi”), the logo (a burning armchair), and the anthem (belting out something about “tak[ing] it to the source and find[ing] out who we really are”)33 might suggest that revolutionary change is in the offing, Joshua Knobe and Shaun Nichols’s much tamer introduction to their 2008 anthology of work in experimental philosophy might convey the impression that they’re proposing nothing more radical than adding one more tool to the philosopher’s toolbox, in something like a return to an older tradition from a time before philosophy and psychology had evolved into distinct, separate fields—only (as Knobe and Nichols suppose) in a more rigorous way.34 The phrase “bait and switch” comes to mind.

But what is going on, exactly? The clearest picture I can form is this. The initial impetus was an understandable frustration with the idea that philosophers should rely on their own conceptual or linguistic “intuitions,” and with the inconclusive—and, frankly, terminally boring—disputes that arose when those intuitions turned out to be at loggerheads with each other. This frustration prompted these new-fledged experimental philosophers to try conducting little surveys (not, however, in the usual sense, experiments) as a putatively better way to determine “what we would say if … .” However, some of those social-psychological surveys of what subjects say they would say in these or those circumstances gave mixed results, leading some experimental philosophers to suspect that “our” concepts may be neither so simple nor so culturally invariant as the analytic mainstream imagined;35 and so they began devising slightly more complex surveys to explore factors that might influence such variations.36 And perhaps it was this that suggested the further possibility that the same methods might also serve, more generally, as a way of exploring how the human mind works—a project duly dubbed “experimental philosophy of mind.”

In its initial conception, experimental philosophy wasn’t really, as you might have supposed, a radical alternative to the analytic paradigm; it was analytic philosophy, albeit conducted by other than the usual introspective means. Moreover, these “other means” were nothing new: Arne Naess had conducted just such a philosophical survey, considerably more rigorously than many experimental philosophers today, in the 1930s.37 The next step was potentially more radical, suggesting that there might be real problems with the presuppositions of the analytic paradigm. But this was really nothing new, either; anyone familiar with the classical pragmatist tradition will realize that experimental philosophers of this second stripe might, on the most charitable interpretation, be seen as taking tentative baby steps towards a path that Peirce and his successors had cleared for us well before the analytic tradition took hold38—though somehow they seem to miss the most important point, that our concepts grow richer and deeper as our knowledge grows.

But most immediately to the present purpose is the third instantiation, the idea of an experimental philosophy of mind, because this brings to the surface a crucial ambiguity in “psychology” and “psychological,” exactly parallel to the ambiguity in Quine’s use of “science.” “Psychological” may mean either, broadly, “to do with the workings of the mind” or, narrowly, “falling within the sphere of the science of psychology.”39 People form beliefs, hopes, fears, desires, designs, plans, etc.; sometimes they deceive themselves, managing to believe that what they want to be true, is true; sometimes their judgment is skewed because they’re in the grip of strong emotion, sometimes they’re especially diligent in inquiry because they’re passionately anxious to find something out; what they think they see can be influenced by what they expect to see; etc.—these are all, in the broad sense, psychological truths. These are truths we learn from our everyday experience of the world and our everyday interactions with other people; and playwrights, novelists, etc., have been exploring their complexities for centuries. Psychological (narrow-sense) experiments or surveys may teach us more about the details of the effects of expectation on perception or the mechanisms of self-deception, etc.; but we certainly didn’t need the science of psychology to teach us those familiar underlying truths. So—while some in the big tent of experimental philosophy may be doing decent interdisciplinary work at the borders of philosophy and (narrow-sense) psychology—it’s not surprising that most, disregarding the difference between the broader and the narrower meanings of “psychology,” seem to be pursuing Goldman’s old scientistic fantasy of squeezing substantial philosophical conclusions out of narrow-sense psychological results.

You might wonder why—unlike Goldman, who relied on the work of cognitive psychologists and others in related fields—these experimental philosophers generally conduct the surveys on which they rely themselves.40 Ironically enough, a large part of the answer seems to be that their surveys are often focused on very familiar, very conventional old-chestnut puzzles from the analytic tradition, such as the Gettier paradoxes or the trolley problem, or else on recently-fashionable puzzles in neo-analytic contextualist epistemology;41 people’s intuitions about which are, to put it politely, unlikely to be of burning interest to professional psychologists. You might also wonder why so many of these surveys are apparently conducted in classroom settings.42 A large part of the answer, I suspect—besides, of course, the obvious fact that this kind of survey is both cheap and easy—may be that if you were to ask regular people on the street, rather than a class of meekly compliant students, what they would say about whether, in the scenario described by Gettier,43 Smith knows that Brown is in Barcelona, the most likely response would, understandably, be a baffled “Huh?”

Perhaps it’s unnecessary to add that the suggestion that experimental philosophers are merely returning to an older tradition in which philosophy and psychology weren’t as clearly distinct as they are now is very far from the truth. Take, for example, the work of Alexander Bain, the remarkable Scottish philosopher-psychologist whom Peirce once called the “grandfather of pragmatism.”44 Bain’s ideas are certainly of great philosophical interest; but that’s because, writing before the rise of experimental psychology, he happily took on any and all questions about the human mind, and paid especially shrewd attention to aspects of the human psyche of which everyone has experience but on which few ever seriously reflect. In my estimation, Knobe and Nichols’s anthology isn’t nearly as rewarding as Bain’s The Emotions and the Will,45 first published in 1859.

Still, isn’t today’s experimental philosophy more rigorous, at any rate, than the work of Bain and others like him? Well, it’s true that Bain doesn’t give us graphs and tables, as experimental philosophers do. But, as I stressed in the previous lecture, this doesn’t by itself guarantee rigor; and in fact it looks to me as if experimental philosophers’ graphs and tables often mask significant methodological flaws.46 Moreover, the little “vignettes” to which their questionnaires elicit subjects’ responses often seem sketchy and under-described. And in any case, and more importantly, the survey methods on which they rely would be completely inappropriate to the kinds of question Bain tackled.

My discussion of experimental philosophy has focused primarily on its expression in epistemology and philosophy of mind; now it’s time to turn to metaphysics, and a 2007 book by James Ladyman, Donald Ross, et al., Every Thing Must Go, which proposes a whole other style of philosophical scientism. Ladyman and Ross’s subtitle, “Metaphysics Naturalized,”47 echoes Quine; the opening sentence of their preface tells us that “contemporary analytic metaphysics fails to qualify as part of the enlightened pursuit of objective truth, and should be discontinued”;48 and a few pages later we learn that it’s not only “contemporary analytic metaphysics” that Ladyman and Ross regard as beyond the pale, but a priori metaphysics generally.49 They assure us, however, that they aren’t proposing, like the positivists, to abandon metaphysics entirely but want, instead, like the pragmatists, to reform it.50 This sounds promising; but you don’t have to read much further before you realize that it is, to say the least, misleading.

Even though they occasionally allude to Peirce, Ladyman and Ross have apparently relied on what Putnam says about him, rather than actually reading him themselves;51 and, so far as I can tell,52 the “radically naturalistic metaphysics” they envisage is very different from the scientific philosophy Peirce proposed. It looks to me, in fact, like nothing so much as a repackaged version of the positivists’ hope of making philosophy into meta-science—it even has the same tone of deferentialist triumphalism, kowtowing to the sciences while puffing up philosophers’ importance. For—somewhat as the old positivists insisted that the only legitimate task of philosophy is the articulation of the supposed “logic of science”—Ladyman and Ross insist that the only legitimate task of metaphysics is the search for a “global consilience network,” meaning an account that, instead of trying to “domesticate” what they condescendingly call “folk pictures” of the world,53 will unify the ontologies of the various sciences.54

For present purposes, it’s not necessary to go into the details of what Ladyman and Ross take this unified ontology to be; which is fortunate, because this would mean fighting our way through dense thickets of what they call “dialectical argument”—i.e., protracted criticism of almost every other philosopher who has ever written on this or any related matter. However, it’s worth noting that, maintaining the “primacy of physics,”55 Ladyman and Ross focus largely on the ontology of fundamental physical theory;56 which, according to their “ontic structural realism,” consists of patterns or mathematical models57—presumably, mathematical models of patterns or structures.58 So, you might wonder, where does the concern for “consilience” come in?—apparently, under what they call their “Rainforest Realism,” according to which the ontologies of the “special sciences” (by which they mean, every science except fundamental physics)59 are constrained by, but not reducible to, the ontology of fundamental physical theory. They have remarkably little to say, however, about the specifics; and while they devote many pages to quantum mechanics, serious references to the work of psychologists, sociologists, economists, anthropologists, etc., are notable by their absence.

Physical objects, Ladyman and Ross aver, are merely constructions, apparently mental constructions, made by humans and other intelligent social animals as “second-best tracking devices” of certain really-real patterns.60 This explains their curious title: things are out; patterns are in. But it leaves one puzzled about why, though they deny that there are things, they insist that this doesn’t “impugn the everyday status of objects like tables and baseballs”;61 and why they believe that, because the fact that there are kinds is (they claim) the same fact as the fact that there are relatively stable local patterns, it follows that there are no kinds.62 (Part of the problem is that it’s hard to know when, like Bishop Berkeley, they’re speaking with the vulgar, and when we’re hearing their official story.) But since what primarily concerns us here is Ladyman and Ross’s conception of the relation of philosophy to the sciences, I can set all this aside.

Hinting that only someone desperate enough to turn to “natural theology or speculative [by which they mean, a priori] metaphysics” for answers could possibly deny this, Ladyman and Ross write that “with respect to anything that is putatively a matter of fact about the world, scientific institutional processes are absolutely and exclusively authoritative.”63 This is an astounding statement. Can they really have forgotten the kinds of factual question that require historical research or legal scholarship or detective work or, etc., to answer, and even such everyday kinds of factual question as what building the physics department is in, or what they had for breakfast the day they wrote that extraordinarily incautious, and paradigmatically scientistic, line? Ironically enough, evidently unaware that in ordinary English the word has long been pejorative, they adopt “scientism” as their own word for their approach64—and “scientism” certainly is the mot juste, negative connotations and all.

I trust it’s unnecessary for me to say that I’m not for a moment suggesting that natural theology or a priori metaphysics is any substitute for well-conducted science; nor am I denigrating either the legitimacy or the importance of questions about how the various sciences hang together, or denying that, with respect to many kinds of factual question, our best bet is indeed to look to what the relevant science currently has to say. But generic references to the “institutional processes of science,” which are apparently all Ladyman and Ross have to offer on this score, don’t even begin to explain why looking to the sciences is so often our best bet65—a task that would require real epistemological work, not to mention serious attention to the susceptibility of those institutional processes to corruption.66 And the thesis that psychology is constrained by, but not reducible to, physics67 (which is pretty much all you find when you follow Ladyman and Ross’s index entry for “psychology”), though true enough, doesn’t even begin to explain how states and processes of the brain relate to mental states and processes such as belief and inference, a task that requires real metaphysical work of a kind undreamt of in their scientistic philosophy.

And finally I turn to—oh my goodness!—Alex Rosenberg. Ladyman and Ross are turgidly academic; Rosenberg writes in the breezy, jokey, mildly profane style of the blogosphere. Ladyman and Ross acknowledge that the currently-accepted scientific theories on which they rely may turn out to be mistaken; Rosenberg simply takes these theories for granted. Ladyman and Ross reveal their anti-religious feeling mostly in snippy asides; Rosenberg wears his anti-religious agenda on his sleeve—or rather, on the dust-jacket of his 2011 book, The Atheist’s Guide to Reality. And while Ladyman and Ross’s philosophy is, for sure, scientistic, Rosenberg takes the scientistic game of philosophical chicken to a whole new level, far beyond even the Churchlands’ wildest eliminativist dreams.

Like Ladyman and Ross, and apparently no more aware of the pejorative overtones of the word than they, Rosenberg calls his position “scientism.” But as he uses the word what it refers to is—wait for it—the view that all atheists share.68 This is downright perverse. For one thing, there’s already a perfectly good word for the view that all atheists share: “atheism.” For another, there have been, and still are, plenty of atheists whose atheism has nothing to do with science; many religious scientists; and, I’m sure, many people (myself among them) who don’t buy the idea that, if theological explanations fail, the only possible conclusion is that the sciences must explain everything. His boastful title, “The Atheist’s Guide …,” notwithstanding, Rosenberg certainly doesn’t speak for all of us.

In fact, Rosenberg goes beyond the false dichotomy of religion or science, insisting that, since theological explanations fail, physics can explain everything. (Sometimes, however, he forgets which is the cart and which is the horse, and you find him arguing that physics must explain everything, because otherwise there would be wiggle-room for religious explanations to weasel their way in.)69 Anyhow, according to Rosenberg, what physics tells us is really real is (not patterns or mathematical models, but) fermions and bosons. And physics, he tells us not once but umpteen times, “fixes all the facts”70—including not only the facts of chemistry, but also the facts of biology, and therefore, he claims, all the facts about ourselves.

Anything physics can’t explain, according to Rosenberg, must be an illusion. The universe has no purpose, he begins, and human lives no meaning.71 “Doesn’t this ‘nihilism’ about the physical and biological worlds put us on the slippery slope down to nihilism about the social and the psychological worlds, as well as the moral and political ones?” he asks; and answers, “Yup.”72 The notion that there are moral values is an illusion that evolution has somehow tricked us into accepting; really, all moral claims are false.73 And the same goes, apparently, for values of other kinds, including the epistemological.74 The mind is the brain, Rosenberg avers;75 and almost everything we believe about ourselves and our minds is false. The title of his chapter 8—where our old friends the sea-slugs turn up yet again76—puts it like this: “The Brain does Everything without Thinking about Anything at all.”77 If this were true, the conclusion would be unavoidable: Rosenberg wrote his book, and physicists developed the theories on which he relies, without thinking about anything at all. My reaction might be best expressed in Rosenberg-ese: “OMG, is this guy for real?”

No wonder, these days, I so often find myself thinking with a wry smile of that splendid passage towards the end of Aldous Huxley’s Brave New World, where the Controller asks the Savage if he knows what a philosopher is, and the Savage—he has read only Shakespeare, whose works are banned in the “civilized” world—answers, quick as a flash: “a man who dreams of fewer things than there are in heaven and earth.”78

2. Coping with Complexity: The Path to Scientific Philosophy

So, as usual, I’m the cannibal among the missionaries. For, in the midst of all this scientistic hubbub, I’ve been trying to develop an approach that’s neither purely a priori nor scientistic but, as I put it in Defending Science, “worldly”:79 not restricted to our concepts or our language, but focused on the world, and acknowledging the contributory relevance of results from the sciences, but not expecting them to do our philosophical work for us. And part of this project has been to articulate an understanding of the world and of our distinctive human mindedness that, while acknowledging that the only stuff there is, is physical, is neither reductionist nor eliminativist. As I wrote in 2003, “it’s all physical, all right; but it isn’t all physics.”80

I begin, as my temperament inclines me, and in what I take to be the spirit of Peirce’s recommendation,81 with a host of everyday observations. Everyday experience reveals a world of astonishing variety—on the earth, oceans and deserts, mountains and rivers and plains, jungles and forests and savannahs, a multitude of kinds of physical stuff, plants, animals, reptiles, birds, creepy-crawlies, bugs, slugs, mold, etc. and, beyond the earth, a sun, a moon, stars, etc. It also reveals regularities amidst this vast variety. The sun rises and sets, the moon waxes and wanes, tides rise and fall, and stuff, things, plants, and animals of a kind behave in predictable ways: wood burns, but rocks don’t; acorns don’t grow into pea-plants, or peas into oak trees; crocodile eggs hatch into baby crocodiles, not cardinal birds, and cardinal eggs into baby cardinal birds, not baby crocodiles; wolves eat meat but not grass, rabbits grass but not meat; and so on.

By now, here on earth an astonishing array of human artifacts overlays and interpenetrates this natural reality. These artifacts might be categorized into the physical—huts and hats, books and bombs, cutlery and computers, roads and railways, farms and factories, pepper-mills and power stations, slaughterhouses and spacecraft, laboratories, beakers, and Bunsen burners; the social—economies, currencies, marriage rituals, governments, religions, legal systems, codes of honor and of etiquette, scientific societies, conferences, customs, and conventions; the imaginative—legends, myths, stories, ballads, poems, plays, novels, cartoon characters, soap operas, video games; and the intellectual—codes, maps, diagrams, mathematical and musical notations and theories, philosophical systems, works of history, and scientific concepts and theories. But, as my list already intimated, everywhere there is crisscrossing of categories: a language, for example, is both a social artifact and an intellectual one; Harriet Beecher Stowe’s novel, Uncle Tom’s Cabin, is an imaginative artifact, but copies of the book are physical artifacts, and the system of slave labor and slave trading it depicts so vividly was a social artifact; legal systems are social artifacts, but court-rooms, prisons, books of statutes and rulings, English judges’ wigs, etc., are physical artifacts, and the contents of those legal books and rulings are intellectual artifacts; and there are scientific artifacts in all these categories, including the imaginative.

Everyday experience teaches us that we can make physical artifacts by exploiting the natural properties of natural stuff, putting those properties to some purpose of ours; that knowledge of how to make things is passed from one generation to another and, in the process, prompts further innovations; and that this cultural transmission was enormously amplified by the invention of writing. Moreover, our experience is that sometimes we can explain how people behave, and even predict what they will do—not just that if you push a person off a building, he will fall to the ground, but also that if a Chinese infant is raised in a Spanish-speaking environment, he or she will grow up speaking Spanish, not Chinese; that if people are afraid of mad-cow disease in the beef supply, sales of chicken will go up; that if you give professors raises only if they publish a lot and run around talking at lots of conferences, most will find some way to do what they’re given this incentive to do. And so forth.

By now, thanks to the work of many generations of scientists, much, much more has been found out about our planet, our galaxy, our universe, the composition of the distant stars, the accretion of matter, the evolution of the elements on earth, the origin of species, the commonalities and the differences among human societies, and about ourselves. To be sure, by now commonsense conceptions of kinds, stuff, phenomena, laws, etc., have been rethought and reconfigured by generations of biologists, chemists, physicists, and other scientists; and a good deal of what was once taken to be commonsense knowledge is now known to have been mistaken. But long before there was modern science people knew that the world isn’t a chaos of random events, that there are kinds of stuff, kinds of thing, and regularities in the way they behave.82 And long before there was modern science, people knew, too, that we have some capacity to represent, and to devise possible explanations of various aspects of, the world.

Of course, the serious philosophical work of figuring out “the little that can as yet be found out about the universe of mind and matter” from those everyday observations begins only when we start asking such questions as: What, exactly, is the difference between the real and the imaginary? How does natural reality differ from socially-constructed reality? What’s involved in there being laws of nature? What are kinds, and what kinds of kind are there? How did there come to be kinds and laws? What is inquiry, and what makes it better or worse conducted? What factors determine whether the evidence for a claim or theory is strong or weak? What is the role of perception, of memory, of inference? What, if anything, is distinctive about the human mind? How did our human mindedness come about? Are human infants born minded, or do they become minded, and if so, how? What exactly is going on when someone believes that the earth revolves around the sun, wonders whether peptic ulcers might be caused by a bacterium, figures out how to test the theory that cholera is waterborne, hypothesizes that ours is only one of many universes in a multiverse, etc.? And it’s on the answers to these questions in metaphysics, epistemology, and philosophy of mind, and the many further questions those answers inevitably raise, that the very possibility of scientific inquiry depends.

In what follows, referring you elsewhere for further thoughts on the metaphysical and epistemological dimensions,83 I shall focus on a cluster of questions in philosophy of mind—the most straightforward way to see what’s so wrong-headed about the Churchlands’ supposedly tough-minded eliminativism and Rosenberg’s supposedly even tougher-minded whole-hog nihilistic physicalism; what’s so disappointing about Ladyman and Ross’s rather perfunctory treatment of the relation of mind and matter; and why the survey methods of experimental philosophy couldn’t even scratch the surface of the key questions. Human mindedness, I shall argue, is neither a myth nor a mystery. But it can’t be understood exclusively in evolutionary terms, or exclusively in terms of neurophysiology, and certainly not exclusively in terms of physics; even a halfway adequate understanding will involve an ineliminable socio-historical element.

To say even this much, of course, is already to invite the scorn of scientistic philosophers. Some, doubtless, will dismiss me as a stick-in-the-mud still wedded to the old folk-psychological mythology. But sneering at the idea that we can sometimes explain a person’s actions by reference to what he wants and what he believes by calling it “folk psychology” hasn’t the slightest tendency to show that it’s mistaken; and neither, if my account is even roughly on the right lines, does the failure to locate beliefs, desires, etc., in the brain. Others, doubtless, will object that, unless I’m covertly smuggling in a soul or some crypto-Cartesian mental substance, my position must be incoherent; that, if there is only physical stuff, everything must, ultimately, be explicable by physics. In due course we’ll see that this argument is a fallacy of equivocation. But now I’m getting ahead of myself.

* * *

Let me begin by assuring you that there are no Cartesian or theological cards up my sleeve—none.

Human beings are physical creatures in a physical world, subject to the same physical laws as everything else—laws that, in combination with facts about our build and theirs, explain why (most) birds can fly unaided but we can’t, why cheetahs can run at almost 60 miles an hour84 but we can’t, and so on. Our brains are made of physical stuff;85 and, most to the present purpose, the nature of the physical stuff of which our brains are made enables our mental capacities, because it’s plastic, adaptable, capable of forming complex internal connections and external associations, while at the same time it constrains our capacities, because it’s not infinitely plastic and adaptable.

Moreover, our species is the product of a long process of evolution, a process that explains our ability to walk upright, our vestigial appendices, our big brains, and maybe even such psychological characteristics as our capacity for altruism, our penchant for induction, or (as Peirce thought) our aptitude for making correct abductions more often than chance.86 And, since all species are the product of this same evolutionary process, it’s not surprising that there are striking continuities between humans and other creatures: birds build nests, and bower birds even decorate their nests elaborately; beavers build lodges; rabbits dig warrens. Some primates, and even some crows, use tools.87 And some animals, like the troop of Japanese macaques that picked up the habit of washing their sweet potatoes to get rid of sand before they eat them,88 transmit know-how from one to another. But of course those macaques don’t, like us, grow sweet potatoes, make lists of the pests and diseases to which they are susceptible, or invent recipes for cooking them, let alone publish sweet-potato cookery books. In short: human beings really do have mental capacities far beyond those of even their closest and cleverest primate relatives.

This is not to say that there must be one single, simple capacity that humans have and other creatures don’t; more likely, it’s a combination of characteristics that humans have in significantly greater degree than other animals that explains why we are, as I shall say, “minded”89 in a way no other creatures are: self-aware, able to speak, to read, and to write, to form explicit designs and plans, tell stories, crack jokes, paint pictures, make music, venerate ancestors, relics, and holy sites, etc., etc.—and to devise explanations and theories, including scientific theories.

“If everything is physical,” today’s scientistic philosophers will ask at this point, “what could the explanation of our unique abilities possibly be, if not the greater size and complexity of the human brain?” “And who could possibly tell us about that,” enthusiasts of neurophilosophy will chime in triumphantly, “if not neuroscientists?” I’m tempted to say they’re barking up the wrong tree; but it would be more accurate to say that what we’re looking for isn’t to be found in any tree. To be sure, the human mind would be impossible without the human brain; but the brain isn’t all there is to it. Rather, it’s culture that makes mindedness possible—even as, at the same time, mindedness makes culture possible.90

Perhaps you find that last remark opaque; and it’s certainly in need of much more articulation—at least some of which I’ll supply in due course. Or perhaps you suspect that, after all, I’m smuggling in something non-physical; but on this score I can offer some reassurance right away. “Physical” has a double usage, rather like the double usage of “healthy.” “Healthy” applies in a primary sense to people, animals, and plants; and in a secondary sense to a climate, diet, etc., meaning that it is conducive to healthy humans (or, depending on context, to healthy polar bears or healthy dolphins or healthy soy-bean crops, etc.). In a roughly similar way, physical laws are laws governing physical stuff, physical kinds are kinds of physical stuff, physical phenomena are phenomena involving physical stuff, and physical relations are relations among physical things or kinds or bits of physical stuff. So, when I say “it’s all physical,” this should be understood as shorthand, not for “nothing but physical stuff is real,” but for “all the stuff there is, is physical”91—and as acknowledging that, besides physical stuff and physical things made of physical stuff, there are events involving physical things and physical stuff, physical kinds, physical laws, physical phenomena, physical relations, i.e., kinds, laws, phenomena, and relations of physical stuff.

So the challenge is whether, and if so how—on the assumption that everything is physical in this double-layered sense—we can answer the second of Kornblith’s good questions: “what are we, that we can know the world?” Well, I begin: such knowledge as we have of the world ultimately derives from our experience of it and the conjectures, inferences, and beliefs we form to account for that experience. Perceptual relations, our sensory interactions with things, events, etc., are a sub-class of the innumerable physical relations between humans and the rest of the world.92 But conjectures, inferences, beliefs, and the like, the mental states and processes sometimes classified as “propositional attitudes,” are less straightforward; to understand these, we need to refer to the enormously complex meshes of humans’ semiotic relations to the world: their relations to stuff, things, and events in the world and to words and other signs, and the relations of those words, etc., to that stuff, those things, those events. To call a relation “semiotic” doesn’t mean, however, that it’s not, in the broad sense explained, physical; rather, it is to identify it as a triadic relation involving persons, signs, and things. The semiotic relation of a pattern of word-usage to things and events in the world, for example, is a relation of (i) language-users, (ii) the sounds and marks they make, and (iii) things, events, etc., in the world.

I will focus here on belief—a phenomenon to which my epistemological work has obliged me to give a good deal of thought. I begin with a platitude that, as Bain wrote in 1859, is “admitted on all hands”: the unmistakable test of sincerity, of whether a person really believes what he says, is his “[p]reparedness to act on what he affirms.”93 In accordance with this, the first element in my account of belief is behavioral. But the second element, in accordance with the idea that the only stuff there is, is physical, is neurophysiological. And the last element, the one that accounts for the content of his belief, is social.

So, as a very approximate first approximation: if Tom believes, say, that tigers are dangerous, then, normally:


Since its second clause speaks of “receptors” and “activators” without saying anything more about what these are than that they are physical aspects of the brain and/or central nervous system, this account is only schematic; but, schematic as it is, I hope it’s enough to make clear that we should be looking, not for some neurophysiological kind of brain-matter corresponding to the proposition that tigers are dangerous, but for associations of bits of generic brain-matter with tigers, with things that are dangerous, and with “tiger,” “dangerous,” etc. So far as I know, brain scientists haven’t had much to say about this; but I stumbled by chance on one tiny but intriguing piece of evidence: a study of patients awaiting brain surgery in which neuroscientists reportedly found that in each subject there was just one generic neuron that fired when the patient saw an image of Homer Simpson or heard the name “Homer Simpson,” or even Homer’s catch-phrase, “Doh!”95

Obviously, also, this account addresses just one element of a whole complex mesh of interrelated problems. More work would be needed to move beyond believing that p to other propositional attitudes such as hoping that p, fearing that p, wishing that p, and, most relevant to the work of the sciences, wondering whether p, conjecturing that q, inferring that r, and so on; and much more work would be needed to articulate what exactly is involved in talking of propositions or theories, let alone of culture. And even with respect to the one issue it addresses directly, this account is only the most approximate of first approximations, needing an enormous amount of amplification and many refinements.96

For one thing, the “normally” with which I began needs a lot of work; eventually the story would have to be spelled out in terms of (non-natural) sign-use generally, rather than of language-use specifically. For another, the belief I chose as my example, that tigers are dangerous, made the task relatively easy; a lot more work would be needed to accommodate mathematical beliefs, theoretical beliefs, religious beliefs, etc. The account will need to be made less atomistic, to accommodate the interaction of beliefs both with each other and with other propositional attitudes. And more will have to be said about how, over the first few years of life, a human infant gradually becomes minded as, through his interactions with others and with the world, he learns language; and about our habit of attributing beliefs, or at least “beliefs,” to some animals, those that satisfy some but not all the elements of this account.

Still, even in its present crude and incomplete form, this approach has some explanatory power. For example, it suggests a partial explanation of why it’s so hard to give tidy conditions for the individuation of beliefs: different languages, and even different idiolects of the same language, don’t always map words and world in exactly the same ways. And, without needing to appeal to mysterious non-physical causes of physical movements, it provides a partial explanation of how what a person believes and what he wants can explain what he does. When, for example, wanting a glass of cold orange juice, I go to the cupboard to get a glass and then to the fridge to get the orange juice, it’s those “activators”—which, to repeat, are physical aspects of my brain and nervous system—that get me moving; but which activators fire, and hence what I do, depends on which activators are associated with the things, events, etc., and with the words associated with the things and events, involved in my desire and my belief.

This approach is both worldly and social, giving key roles to the relations of people to things and events and to words, and to relations of word-usage in a linguistic community to those things and events. This is why, as I said earlier, while the human mind would be impossible without the brain, the brain isn’t all there is to it. And, sketchy as this has been, if you’ve followed me this far you’ll see that the survey methods of experimental philosophy, even if they were far more rigorous and sophisticated than they’ve been up till now, are not the way to understand such mental states and processes as belief, conjecture, inference, etc. You’ll see, too, that the reason the Churchlands imagined they’d discovered that there are no beliefs was that they hadn’t thought hard enough about what belief is—and, more generally, that the sub-title of Patricia Churchland’s well-known book, Neurophilosophy, “Toward a Unified Science of the Mind-Brain,” already revealed a crucial misstep. And you’ll also see that, while Ladyman and Ross are quite right to say that the physical constrains, but doesn’t determine, the psychological, this leaves us still in need of an account of the “particular go” of it—something not to be found by looking to physics or even to narrow-sense psychology, but an unavoidably philosophical task, requiring just the kind of articulation of how the behavioral, neurophysiological, semiotic, cultural, etc. factors hang together that it falls to philosophy to provide.

OK, you may say; still, the fact that beliefs, etc., are not reducible to physical states of the believer, but involve complex relations and relations of relations to the world and to others, doesn’t show that the ultimate explanation of these relations couldn’t be given by physics. So how can you be sure that it isn’t, in the end, all physics, that physics-fixes-all-the-facts-ism in the style of Rosenberg can’t possibly do the job? After all, mightn’t the obstacle to reducing these relations to physics be only like the difficulty of predicting exactly where a hurricane will make landfall—not an in-principle irreducibility, just a matter of enormous complexity?

I don’t think so. It’s not just that the idea that physics, even the most sophisticated future physics, could tell us how humans became capable of language-use in the first place, how all the myriad human languages there have ever been evolved, how to interpret Portia’s “quality of mercy” speech in The Merchant of Venice, how to understand Watson and Crick’s article on the structure of DNA, or how to make sense of the 900-odd pages of President Obama’s Affordable Care Act, etc., etc., boggles the mind—though it certainly does. It’s also, and more importantly, that reducing the socio-historical-linguistic loop of human sign-usage to physics would be possible only in a completely deterministic world, the kind of world Laplace imagined, in which there were no probabilistic laws and no elements of randomness. The real world isn’t like that;97 it has a history, a history marked by the singularities of the origin of our universe, the evolution of the elements from hydrogen, the evolution of species on earth, etc., and by contingencies, probabilities, coincidences.

I hasten to add that of course this isn’t to deny that there are real natural laws; it is only to say that there are also elements of chance, of randomness98—as, notwithstanding his repeated insistence that physics fixes all the facts, even Rosenberg seems implicitly to recognize. Physics, he tells us, explains why there are “blind” variations, imperfections in the copying of genes, for natural selection to work on.99 He doesn’t claim, however, that physics explains why mutation resulted in these variations rather than others; and he says explicitly that the emergence of our species was an improbable accident.100 And this is tantamount to admitting that, after all, there are facts that physics doesn’t fix. Moreover, even tiny elements of randomness may have very large consequences.101 At the very least: if not for the random variations that eventually gave rise to humans, there would be no human languages, no human cultures, no human artifacts, and so—unless intelligent life has come about elsewhere in our (or another) universe—no science.

So the argument that, if it’s all physical, all the truths there are must be reducible to the truths of physics is a kind of fallacy of equivocation. It relies on the premise that physics is the science of physical stuff. But in the interpretation in which this is true, it doesn’t yield the conclusion; while in the interpretation in which it would yield the conclusion, it’s not true. It’s true that physics is the science to which we look to understand the nature of matter itself, the processes that created it, and the laws that govern it; but it’s not true that the laws of physics can explain every phenomenon that has arisen in the course of the many contingencies and coincidences in the history of the world and of human civilizations—and this is what would be needed to yield the conclusion. It’s all physical, yes; but physics doesn’t fix all the facts.

3. Adjusting our Attitudes: The Problem of Perverse Incentives

By now you may be wondering why, if it is as fundamentally flawed as I have suggested, scientistic philosophy of one kind or another has been so attractive to so many. My diagnosis would begin by noting that, despite their many differences, one thing today’s scientistic philosophers seem to have in common is an inchoate sense that something’s badly amiss with our discipline, that we can’t just go on with philosophical business-as-usual. And, indeed, I would continue, something is rotten in the state of philosophy. Our discipline becomes every day more specialized, more fragmented into cliques, niches, cartels, and fiefdoms, and more determinedly forgetful of its own history.102 More and more journals are crammed with more and more unread, and sometimes unreadable, articles about what X said about Y’s interpretation of Z’s response to W. Anyone with enough frequent-flyer miles to upgrade to publication-by-invitation is relieved to bypass a relentlessly conventional peer-review process often crippled by tunnel-vision, cronyism, and self-promotion. I won’t even mention the decades of over-production of Ph.D.s, or the disastrous effects of that horrible, and horribly corrupting, “ranking” of philosophy graduate programs.

Combine this with the fact that the neo-analytic establishment, though institutionally still pretty firmly entrenched, seems close to intellectual exhaustion, and it’s certainly no wonder that young Turks—and middle-aged Turks, and elderly Turks, too—are bored and restive, casting around for something new; and no wonder, either, that we’re beset by passing fads and fashions—among them the scientistic fads and fashions I’ve been criticizing here. Unfortunately, far from solving the problems of our profession, this hydra-headed scientism makes things, not better, but worse; for, as we have seen, it is in the end nothing but a confession of philosophical failure.

Up till now, it has been the first of Peirce’s themes—that, because philosophy is about the world, it requires experience, but that it differs from the sciences in requiring close attention to everyday experience rather than elaborate efforts to secure experience of a recondite kind—on which I have relied as I put mindedness, generally, and belief, specifically, under the microscope of philosophical reflection. But Peirce’s other theme—that philosophical inquiry should be conducted in the same spirit, from the same desire to figure out how things are that, he believes, has motivated the best work of the sciences—also has a role to play, this time in my reflections on the causes of the more general malaise of which those scientistic fads and fashions are just some recent manifestations.

Like the serious inquirer in every field, Peirce writes, the serious philosophical inquirer must “[draw] the bow upon truth, with intentness in the eye, with energy in the arm.”103 As this evocative metaphor suggests, if you’re seriously inquiring, you really want the truth, not just some comfortable or convenient conclusion—that’s why you need “intentness in the eye”; and you really want the truth, you don’t just vaguely wish you knew it—that’s why you need “energy in the arm.”104 This is, to say the least, not easy. It doesn’t just require intellect; it demands a brutal honesty, the humility to recognize when you’ve been on the wrong path, the fortitude to pick yourself up and start over when necessary, and the persistence needed to stick with a problem in the face of difficulty and in the full knowledge that you may very well fail.

Isaac Newton was well aware of this, telling an admirer who wanted to know how he had made his remarkable discoveries: “by always thinking unto them.”105 So was Santiago Ramon y Cajal, who wrote in his Advice for a Young Investigator that the most essential thing for a scientist is sustained concentration, what the French call “esprit de suite”;106 and so too was Francis Crick, who wrote in his memoir of his and James Watson’s discovery of the structure of DNA that “if we deserve any credit at all, it is for persistence and the willingness to discard ideas when they became untenable.”107 And so was Peirce, who wryly attributed his own achievements to “peirceistence” and “peirceverance.”108 I’m sorry to say, however, that when I read those experimental philosophers, or Ladyman, Ross et al., or Rosenberg,109 I’m so struck by their remarkable assurance of intellectual superiority that I find another painfully shrewd phrase of Peirce’s coming unbidden to mind: “the vanity of cleverness.”110

None of this is very surprising. For the sad fact is that, these days, almost everything about the way universities are organized conspires against the spirit of serious inquiry. The professional administrators who now “manage” universities stress “productivity,” the need for everyone to be “research-active,” and above all, anything and everything that could possibly be described as “prestigious.” It’s bad enough that professors are constantly distracted by conference calls, requests for referee’s reports on the ever-growing flood of submissions, pointless meetings, time-consuming electronic noise, and such. But the demands for abstracts of the paper or the lecture you haven’t yet written and for proposals spelling out the important discoveries you will make in the next few years, as well as the tyranny of the annual review demanding lists of the honors, the prestigious publications, and the coups in landing grant money you have pulled off over the last twelve months (!) are much more corrupting. For these erode the very virtues needed to get good work done: they positively discourage patience and painstaking and encourage, instead, efforts to create the appearance of progress, genuine or not.111

These problems extend across the entire academy—to every field, including not only the humanities, but the sciences too. Indeed, the perverse incentives I just described are (some of) the same pressures I mentioned in the previous lecture as threatening the health of the sciences, where they have encouraged “salami publishing,”112 those often-misleading multiple attributions of authorship, the corruption and manipulation of the peer review process, the bureaucracy, the endless hours spent “writing grants,” attending seminars on writing grants, reading others’ grant applications, etc., etc. But it’s only to be expected that the consequences for the humanities in general, and for philosophy in particular—where the pressure to accommodate hard facts is looser and more indirect, and there’s a long tradition of never-resolved disputation—have been even worse.

More than a century ago, Peirce wrote movingly of his hopes for the future of philosophy:

We must expect arduous labours [sic] yet to be performed before philosophy can work its way out of the jungle and emerge on the high road of science. But the prospect is no longer so desperately gloomy, if philosophers will only resign themselves to the toilsome procedure of science, and recognize that a single generation can make little headway, but yet may faithfully clear away a few obstacles, and lying down to die, resign the axe to their successors.113

When perverse incentives tempt us from our task, however, the jungle grows thicker every day.

* * *

Almost done!—but there are still a few loose ends.

Peirce’s talk, in the passage I just quoted, of “the high road of science” and “the toilsome procedure of science,” may provoke the objection that the approach I am recommending is itself scientistic—specifically, that Peirce’s phrase, “scientific philosophy,” is a clear case of the honorific use of “science” and its cognates that is one sign of scientism. I reply, first, that far from suggesting that the work of philosophy could be handed over to the sciences, Peirce observes that to conduct an experiment to determine whether induction is valid would be “like adding a teaspoonful of saccharine to the ocean in order to sweeten it”;114 and, far from suggesting that philosophical problems that the sciences can’t handle must be pseudo-problems, he gives a long list of metaphysical questions in need of “solid and industrious [philosophical] investigation.”115

Still, it can’t be denied that he does use the word “scientific” in what looks like an honorific way. When he speaks of “science” here, however, it is in the old, broad sense in which it refers to any kind of serious investigation—which is why he needs the phrase “the special sciences” for what we, today, would call “the sciences.” So when he urges that philosophy become scientific, Peirce is urging, in effect, that it become serious inquiry. True, he takes for granted that, at its best, the work of the special sciences has manifested the spirit in which serious inquiry should be undertaken; but this idea is not, by my lights, scientistic—though a warning note to the effect that the integrity of the sciences is presently under severe strain would surely be in order.

The other concern that needs addressing is that Peirce’s idea that the special sciences need instruments of observation, voyages, excavations, etc., while philosophy does not, might after all provide a criterion of demarcation of science. Well, I reply, it does suggest a rough-and-ready way to draw the line between philosophy and the special sciences (though I would add that, while those ancient Chinese astronomers had none of the fancy instruments of modern astronomy, they were undeniably astronomers, nonetheless). But what’s much more important is that, while Peirce’s approach is helpful both as a way of understanding what’s peculiar about philosophy and as suggesting something of how the sciences grew out of everyday inquiry and the philosophical reflections it prompted, it certainly doesn’t suggest that all or only inquiry in the sciences is good inquiry—the characteristic motivation of the scientistic concern with demarcation; on the contrary, its core theme is that philosophy too can, and should, be a field of “solid and serious investigation.”

But it’s the light that Peirce’s conception of philosophy sheds on what’s peculiar about our discipline with which I will conclude, because his account explains both why the idea that philosophical work can be conducted entirely a priori is an illusion, and why, nevertheless, this idea is so seductive. It’s an illusion: being about the world, philosophy must call on experience. But it’s a seductive illusion: the experience philosophy requires is in no way recherché, but is available to everyone. A philosopher doesn’t need to conduct surveys or experiments, run fMRI machines, go on field trips, etc.; he needs to pay close attention to, and to reflect on, the experience he has every day. I’m sure you’ll be relieved to hear that, if you happen to find your armchair a good spot for such reflection, then after all there’s no need to burn it.


Notes to the Introduction

1. Thomas Szasz, The Second Sin (New York: Anchor Books, 1973), pp. 26-27.

2. This is the title of chapter 8 of Rosenberg, The Atheist’s Guide to Reality: Enjoying Life without Illusions (2011).

3. I echo Rudyard Kipling’s “If” (1910): “If you can keep your head when all about you are losing theirs, and blaming it on you, …, you’ll be a Man, my son.”

Notes to Lecture I

1. William James, “The Present Dilemma in Philosophy” (1906), pp. 15, 17.

2. C. S. Peirce, Collected Papers, 5.172 (1903).

3. Denis Diderot, Addition aux pensées philosophiques (c.1762); I rely on John Gross, ed., The Oxford Book of Aphorisms (1983), pp. 24-25.

4. Indeed, in chapter 2 of Defending Science I argued myself that formal models of “scientific reasoning” are inherently inadequate.

5. See, e.g., Michael Shermer, “The Shamans of Scientism” (2002); James Ladyman and Don Ross, with David Spurrett and John Collier, Every Thing Must Go: Metaphysics Naturalized (2007); Alex Rosenberg, The Atheist’s Guide to Reality: Enjoying Life without Illusions (2011).

6. The Oxford English Dictionary online gives two meanings for “scientism”: “a mode of thought which considers things from a scientific viewpoint”; “extreme or excessive faith in science.” The citations for the former, however, are mostly early; and the citations for the latter, described as “chiefly depreciative,” more recent. Friedrich von Hayek, “Scientism and the Study of Society” (1942), p. 269, describes scientism, the “slavish imitation of the method and language of science,” as a “prejudice.” E. H. Hutten, The Language of Modern Physics (1956), p. 273, describes scientism as “superstitious.” Peter Medawar, “Science and Literature” (1969), p. 23, describes scientism as an “aberration of science.”

7. According to von Hayek, although the earliest example given by Murray’s New English Dictionary was dated 1867, this narrower usage was already coming into play by 1831, with the formation of the British Association for the Advancement of Science. Friedrich von Hayek, “Scientism and the Study of Society,” p. 267, n.2, citing John T. Merz, History of European Thought in the Nineteenth Century (1896), vol. I, p. 89.

8. Something similar happened to the word “logic,” which used to mean “theory of whatever is good in the way of reasoning” but, with the rise of modern formal logic, became narrower in scope, and is now mostly confined to good ways of reasoning that can be formally, i.e., syntactically, represented.

9. The word “science” also sometimes refers, not to scientific inquiry, but to the knowledge that results from such inquiry—to the product rather than the process. In what follows, however, I will use the phrase “scientific knowledge” for the latter purpose.

10. Indeed, science fiction has been a significant influence on some scientists: Carl Sagan, for example, acknowledged that Edgar Rice Burroughs’s “A Princess of Mars” inspired his interest in that planet. See Michael Saler, “The Ship of the Imagination” (2015), reviewing Brian Clegg, Ten Billion Tomorrows (2015) and Matt Kaplan, Science of the Magical (2015).

11. David Wootton, The Invention of Science: A New History of the Scientific Revolution (2015), pp. 163-64 (double-entry bookkeeping), 164ff. (perspective drawing).

12. For example, Max Delbrück came to molecular biology from physics; and Maurice Wilkins (who in 1962 shared the Nobel Prize with Watson and Crick) worked on the Manhattan Project before he turned to DNA. Gunther Stent, “Introduction” to James D. Watson, The Double Helix (1968; critical edition, 1980).

13. Sharon Begley, “The Ancient Mariners” (2000), p. 54.

14. Robert Buderi and Joseph Wisnovsky, “Science: Beaming in on the Past” (1986), p. 75.

15. See, e.g., Spencer S. Hsu, “FBI Admits Flaws in Hair Analysis over Decades” (2015), according to which there were errors in the hair analysis offered by FBI examiners in 95% of the cases studied. The following year, Santae Tribble—exonerated after spending 28 years in prison—was awarded $13.2 million when it was found that the hair analysts who had “matched” hairs found at the crime scene to him were unable even to distinguish human hair from dog hair. Spencer S. Hsu, “Judge Orders D.C. to Pay $13.2 million in Wrongful FBI Hair Conviction Case” (2016).

16. See, e.g., Erica Beecher-Monas, “Reality Bites: The Illusion of Science in Bite-Mark Evidence” (2009).

17. See, e.g., McLean v. Arkansas Bd. of Ed., 529 F. Supp. 1255 (E. D. Arkansas, W. D., 1982), and Edwards v. Aguillard, 482 U.S. 578 (1987) (creation science); Kitzmiller v. Dover Area Sch. Dist., 400 F.Supp. 2d 707 (M.D. Pa. 2005) (Intelligent Design Theory).

18. John Snow, On the Mode of Communication of Cholera (1855).

19. See, e.g., Jacob Stegenga, “Is Meta-Analysis the Platinum Standard of Evidence?” (2011); Massimiliano Copetta et al., “Advances in Meta-Analysis: Examples from Internal Medicine to Neurology” (2013).

20. Thomas H. Huxley, On the Educational Value of the Natural History Sciences (1854), p. 13.

21. Albert Einstein, “Physics and Reality,” in Ideas and Opinions of Albert Einstein (1954), p. 290.

22. John Dewey, Logic: The Theory of Inquiry (1938), p. 66.

23. Percy Bridgman, “The Prospect for Intelligence” (1945), p. 535.

24. James B. Conant, Modern Science and Modern Man (1952), p. 22. More recently, some scientists have suggested that we can see precursors of scientific inquiry even in babies. Alison Gopnik, Andrew N. Meltzoff, and Patricia K. Kuhl, The Scientist in the Crib: What Early Learning Tells us about the Mind (1999).

25. Gustav Bergmann, Philosophy of Science (1957), p. 20.

26. Steven Weinberg, To Explain the World: The Discovery of Modern Science (2015), p. x (my italics).

27. John Maddox, What Remains to be Discovered: Mapping the Secrets of the Universe, the Origins of Life, and the Future of the Human Race (1998), p. 2.

28. For example, according to Sheldon Glashow what led to the decline of Arab science early in the second millennium was the influence of the doctrine of Taqlid, that there are no truths beyond the Koran; and what led to the decline of Chinese science in the fifteenth century was the idea that there was nothing beyond the Celestial Empire worthy of discovery. Sheldon Glashow, “The Death of Science” (1992), p. 28.

29. Wootton, The Invention of Science, pp. 53-54 (the words “experience” and “experiment” began to diverge in the seventeenth century); 103 ff. (the idea of “discovery,” previously used only in the context of geographical exploration, emerged from the Portuguese “descobrimento”); 251 ff. (“fact,” first used in English by Thomas Hobbes); 385 ff. (“hypothesis”); 400 ff. (“evidence”).

30. William Whewell, “On the Connexion of the Physical Sciences, by Mrs. Somerville” (1834), p. 60. “Natur-Forscher” means, literally, “nature-poker.” However, the new word “scientist” caught on only rather slowly, apparently because of its disagreeably mongrel ancestry: the “scient-” part comes from the Latin, but the “-ist” is Greek in origin.

31. Sidney Ross, “‘Scientist’: The Story of a Word” (1962), pp. 71-72 (citing Isaac Todhunter’s biography of Whewell).

32. Francis Bacon, Works, IV, p. 42 (The New Organon [1620], Aphorism II).

33. Editors’ Preface to De Interpretatione Naturae Proemium, Bacon, Works, III, p. 515.

34. Bacon, Works, IV, p. 107 (The New Organon, Aphorism CXXI).

35. My source is J. E. Roe, Sir Francis Bacon’s Own Story (1918), p. 61.

36. See, e.g., Rick Bonney et al., “Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy” (2009).

37. See e.g., George S. Counts and Nucia Lodge, The Country of the Blind (1949), chapter 6; Valerii Soifer, The Tragedy of Soviet Science (1994).

38. David Armstrong and Keith J. Winstein, “Antidepressants under Scrutiny over Efficacy—Sweeping Overview Suggests Suppression of Negative Data has Distorted View of Drugs” (2008).

39. Charles Darwin, letter to Joseph Hooker (January 11, 1844).

40. The discovery was made in 1982, by Australian scientists Barry Marshall and J. Robin Warren. Initially, their idea was very controversial; but in 2005 Marshall and Warren were awarded the Nobel Prize in medicine for this work. Francis Mégraud, “A Humble Bacterium Sweeps This Year’s Nobel Prize” (2005).

41. For a summary of this story and numerous references, see Haack, Defending Science, pp. 199 ff.

42. See Horace Freeland Judson, The Eighth Day of Creation: Makers of the Revolution in Biology (1979), p. 39.

43. Oswald T. Avery, Colin M. MacLeod, and Maclyn McCarty, “Studies on the Chemical Nature of the Substance Inducing Transformation of Pneumococcal Types” (1944).

44. Specifically, I don’t mean to suggest, as Kuhn did, that such revolutionary changes cannot be explained in terms of scientists’ assessment of evidence, but must be thought of in essentially political ways. Thomas Kuhn, The Structure of Scientific Revolutions (1962).

45. Susan Haack, “Peer Review and Publication: Lessons for Lawyers” (2007).

46. See, e.g., John Bohannon, “Who’s Afraid of Peer Review?” (2013) (Prof. Bohannon submitted 304 versions of a hoax article to open-access journals, more than half of which accepted it); Charlotte J. Haug, “Peer-Review Fraud—Hacking the Scientific Publication Process” (2015) (authors are suggesting “peer reviewers” whose e-mail addresses are actually fake accounts they have themselves set up).

47. William A. Wilson, “Scientific Regress,” First Things (2016), p. 42. See also Susan Haack, “The Integrity of Science: What It Means, Why It Matters” (2006).

48. Wilson, “Scientific Regress,” p. 42.

49. And at the most extreme, we see what Wilson calls the “Science Cult” manifested in the popular Facebook site “I f—ing love Science!” and the hashtag “#sciencedancing.” Wilson, “Scientific Regress,” p. 42.

50. See, e.g., the essays in Semir Zeki and Oliver Goodenough, eds., Law & the Brain (2004).

51. See, e.g., Paul Harris and Alison Flood, “Literary Critics Scan the Brain to Find Out Why We Love to Read” (2010).

52. See, e.g., Patricia Smith Churchland, Neurophilosophy: Toward a Unified Science of the Mind-Brain (1986).

53. See, e.g., Paul M. Churchland, Scientific Realism and the Plasticity of Mind (1979), and “Eliminative Materialism and the Propositional Attitudes” (1981); Patricia Smith Churchland, “Epistemology in the Age of Neuroscience” (1987).

54. As I patiently explained a quarter of a century ago. Susan Haack, Evidence and Inquiry (1993; expanded edition, 2009), chapter 8.

55. Alex Rosenberg, The Atheist’s Guide to Reality: Enjoying Life without Illusions (2011).

56. Raymond Tallis, Aping Mankind: Neuromania, Darwinitis, and the Misrepresentation of Humanity (2011).

57. See, e.g., Paul Davies, Cosmic Jackpot: Why Our Universe is Just Right for Life (2007); Robin Collins, “The Fine-Tuning Evidence is Convincing” (2013); Man Ho Chan, “Would God Create our Universe through Multiverses?” (2015).

58. The word was coined by Paul Geisert and Mynga Futrell, apparently in hopes that (like “gay” for “homosexual”) it might, unlike “godless” or “faithless,” convey a positive connotation.

59. This was, to be fair, apparently not Geisert’s and Futrell’s intention. But it was hardly surprising that such an ill-chosen new word would be taken to have this implication. Dennett didn’t help matters by writing that “we brights don’t believe in ghosts or elves or the Easter Bunny—or God.” Daniel Dennett, “The Bright Stuff” (2003); nor did Dawkins by writing that “brights constitute … a stunning 93% of those scientists good enough to be elected to the élite National Academy of Sciences.” Richard Dawkins, “The Future Looks Bright” (2003).

60. See Susan Haack, “Credulity and Circumspection: Epistemological Character and the Ethics of Belief” (2014).

61. See e.g., John P. A. Ioannidis, “Contradicted and Initially Stronger Effects in Highly Cited Clinical Research” (2005); Wilson, “Scientific Regress.”

62. Mary Lefkowitz, Not Out of Africa (1997), p. 157.

63. Daubert v. Merrell Dow Pharm., Inc., 509 U.S. 579 (1993). (The ellipses are in the original: Justice Blackmun has strategically excised some key words from the text of Federal Rule of Evidence 702, which speaks of “scientific, technical, or other specialized knowledge.”) See also Susan Haack, “Trial and Error” (2005).

64. In 1968 C. Truesdell gave a list based on a random search of graduate-school catalogues: “‘Meat and Animal Science’ (Wisconsin), ‘Administrative Sciences’ (Yale), ‘Speech Science’ (Purdue), ..., ‘Forest Science’ (Harvard), ‘Dairy Science’ (Illinois), ‘Mortuary Science’ (Minnesota).” Truesdell, Essays in the History of Mechanics (1968), p. 75. The list, and especially “Mortuary Science,” became famous among philosophers of science when Jerome Ravetz cited it in Scientific Knowledge and Its Social Problems (1971), p. 387, n.25. The trend continues: “Management Science” and “Library Science” are now commonplace, and on a recent visit to Wayne State University I passed a university building bearing the sign “Mortuary Science.”

65. Daubert, 509 U.S. 579 (1993), 593.

66. McLean, 529 F. Supp. 1255 (E. D. Arkansas, W. D., 1982).

67. Kitzmiller, 400 F.Supp. 2d 707 (M.D. Pa. 2005).

68. See e.g., Peter Bock, Getting It Right: R&D Methods for Science and Engineering (2001), p. 168; Stephen S. Carey, A Beginner’s Guide to Scientific Method (fourth edition, 2011), pp. 305 ff. Hugh G. Gauch, Scientific Method in Practice (2003) gives a brief statement (p. 11) of the kind of thing typically found in beginning college texts, but acknowledges its simple-mindedness—though he is himself quite naïve about the supposedly “incisive thinking and penetrating analysis” of Popper and Kuhn. See also the amicus brief submitted in Daubert by the Product Liability Council, telling us that the Scientific Method is: “(1) set forth a hypothesis, (2) design an experiment … to test the hypothesis, (3) conduct the experiment, collect the data, and then analyze those data, (4) publish the results so that they may … be subject to critical scrutiny, and (5) ensure that these results are replicable and verifiable.” Brief of Product Liability Council as Amici Curiae in Support of the Respondent (No. 90-102) 1993 WL 13006388, *3 and n.20.

69. A particularly good example is Sylvia Wassertheil-Smoller, Biostatistics and Epidemiology: A Primer for Health and Biomedical Professionals (third edition, 2004) which, after some brief and very naïve remarks about the “scientific method” in general, turns serious attention to the serious stuff—which, however, applies only in this specific field.

70. Jeremy Bentham, “Anarchical Fallacies: Being an Examination of the Declaration of Rights issued during the French Revolution,” in John Bowring, ed., The Works of Jeremy Bentham (1843), vol. II, p. 501. (Bentham was referring, of course, to the idea of natural rights.)

71. Olli P. Heinonen, Dennis Slone, and Samuel Shapiro, Birth Defects and Drugs in Pregnancy (1977); see in particular the description of the project design and data collection, pp. 8-29. The record in Blum v. Merrell Dow Pharms, Inc., 33 Phila. Co. Rep. 193 (1996), 215-17 shows that Dr. Shapiro admitted under oath that the study had failed to distinguish these two sub-groups of the sample.

72. Claire Bombardier, et al., “Comparison of Upper Gastrointestinal Toxicity of Rofecoxib and Naproxen in Patients with Rheumatoid Arthritis” (2000) (the original article); David Armstrong, “Bitter Pill: How the New England Journal of Medicine Missed Warning Signs on Vioxx—Medical Weekly Waited Years to Report Flaws in Article that Praised Pain Drug—Merck Seen as ‘Punching Bag’” (2006) (explaining what went wrong). See also Susan Haack, “The Integrity of Science: What It Means, Why It Matters”; “Peer Review and Publication: Lessons for Lawyers.”

73. David Abrahamson, The Psychology of Crime (1967), p. 37.

74. Igor Pacheco, Brian Cerchiai, and Stephanie Stoiloff, “Miami-Dade Research Study for the Reliability of the ACE-V Process: Accuracy and Precision in Latent Fingerprint Examinations” (2014), pp. 14-15.

75. FBI, Forensic Science Communications (2009).

76. See, e.g., E. O. Wilson, Consilience: The Unity of Knowledge (1998).

77. Susan Haack, “The World According to Innocent Realism: The One and the Many, the Real and the Imaginary, the Natural and the Social” (2014); “Brave New World: Nature, Culture, and the Limits of Reductionism” (forthcoming).

78. A kind of scientism to which, ironically, William Wilson succumbs when, apparently seduced by snappy rhetoric, he writes that “[t]he problem with science is that so much of it simply isn’t.” Wilson, “Scientific Regress,” p. 37.

79. Nora Barlow, Charles Darwin and the Voyage of the Beagle (1946), p. 151 (referring to Darwin’s Buenos Ayres [sic] notebook recording observations from late 1832 and early 1833, where he describes one specimen as “length: one handkerchief and half”).

80. Haack, “Just Say ‘No’ to Logical Negativism” (2011) gives comprehensive quotations and references for all these points; and shows that the distinguished British scientists who enthusiastically endorsed Popper’s ideas (Sir Peter Medawar, Sir John Eccles, and Sir Hermann Bondi) all misunderstood what those ideas were.

81. It is sometimes said that evolutionary biology does make predictions; but what is meant is only that it predicts what fossils you can expect to find at such-and-such places, not that it predicts how the evolution of species will go in the future.

82. Larry Laudan, “The Demise of the Demarcation Problem” (1983), p. 347.

83. As the Supreme Court finally acknowledged in the third of a trilogy of cases on the standard of admissibility of expert testimony, Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999), which concerned not scientific testimony but the testimony of a supposed expert on tire design. See also Susan Haack, “The Expert Witness: Lessons from the U.S. Experience” (2015).

84. So-called because it was introduced in Lemon v. Kurtzman, 403 U.S. 602 (1971).

85. Ironically, this was a (perhaps unintended) change from the earlier ruling where the Court had introduced the “purpose” idea, but had spoken of “purposes,” in the plural. Abington Sch. Dist. v. Schempp, 374 U.S. 203 (1963), 222.

86. “Wish you were here,” Oxford Today (1998).

87. The Bluebook, the Bible of law librarians and law-review editors, lays down the very elaborate rules governing legal citations.

88. John Locke discusses this question, put to him by William Molyneux, in the second edition of his Essay Concerning Human Understanding (1694), II.ix. See generally Brian Glenney, “Molyneux’s Question.”

89. Recent work on subjects who had cataracts removed indicates that at first they were unable to distinguish the shapes by sight, but could do so when subsequently re-tested. Richard Held, et al., “Newly Sighted Cannot Match Seen with Felt” (2011).

90. Biologist Kenneth Miller (co-author of the biology text at the heart of the case), interviewed in Judgment Day: Intelligent Design on Trial, a Nova documentary on Kitzmiller.

91. See Susan Haack, “Fallibilism and Faith, Naturalism and the Supernatural, Science and Religion” (2005).

92. The idea of forcing nature to give up her secrets is Baconian; but to my surprise it turns out that it wasn’t Bacon himself who wrote of “putting nature on the rack,” but Leibniz, describing Bacon’s view. Leroy E. Loemker, Gottfried Wilhelm Leibniz: Philosophical Papers and Letters (second edition, 1970), p. 465 (from a letter to Gabriel Wagner, 1696).

93. Percy Bridgman, “The Struggle for Intellectual Integrity” (1933), especially section II.

94. The story is told by Watson in The Double Helix (1968), p. 59.

Notes to Lecture II

1. C. S. Peirce, Collected Papers, 1.126-27 (c. 1905).

2. Peirce, Collected Papers, 8.110 (c.1900).

3. Peirce, Collected Papers, 1.129 (c.1905).

4. Peirce, Collected Papers, 5.450 (1905).

5. Or maybe I should say—as Peirce says of Descartes’ distinction of clear and distinct ideas—an “antique bijou” that belongs in the curio cabinet. Peirce, Collected Papers, 5.392 (1878).

6. Peirce, Collected Papers, 7.232 (1873), 5.409 (1878), 6.427 (1878), 7.492 (c.1898).

7. See, e.g., E. O. Wilson, Consilience: The Unity of Knowledge (1998), which is remarkable in part because of the way it combines two books in one: a modest book arguing that there can’t be incompatible truths, that all the truths there are must somehow fit together; and an ambitious, scientistic book suggesting that the truths of ethics, aesthetics, etc., can be derived from the truths of biology. The papers in Antonio Damasio et al., eds., Unity of Knowledge: The Convergence of Natural and Human Sciences (2001), seem mostly to have been inspired by the more ambitious idea; though Stuart A. Kauffman, “Prolegomenon to a Future Biology” (pp. 18-36) is an honorable exception.

8. Moritz Schlick, “The Turning Point in Philosophy” (1930-31), p. 56.

9. W. V. Quine, “Epistemology Naturalized” (1969).

10. W. V. Quine, “Two Dogmas of Empiricism” (1951).

11. See generally W. V. Quine, Word and Object (1960).

12. Quine, Word and Object, chapter 6.

13. Ambiguities identified in careful textual detail in chapter 6 of my Evidence and Inquiry (1993/2009).

14. In Evidence and Inquiry I marked this distinction typographically: “SCIENCE” for the broad sense, and “science” for the narrow (p. 274).

15. Alvin Goldman, Epistemology and Cognition (1986).

16. Stephen P. Stich, From Folk Psychology to Cognitive Science (1983). Subsequently, in The Fragmentation of Reason (1990), Stich acknowledged that we do, after all, have beliefs; but argued that it is nothing but superstition to care whether your beliefs are true. Gosh.

17. Paul M. Churchland, A Neurocomputational Perspective: The Nature of Mind and the Structure of Science (1989); Patricia Smith Churchland, “Epistemology in the Age of Neuroscience” (1987).

18. The argument is made in detail in Haack, Evidence and Inquiry, pp. 226-36.

19. Timothy D. Wilson, “Strangers to Ourselves: The Origin and Accuracy of Beliefs about One’s Own Mental States” (1985; Stich had cited the pre-publication version). Needless to say, there is absolutely no suggestion in Wilson’s paper that beliefs are mythical.

20. Haack, Evidence and Inquiry, pp. 215-26.

21. Haack, Evidence and Inquiry, pp. 210 and 216.

22. As early as 1986, however (the same year that Patricia Churchland’s Neurophilosophy: Toward a Unified Science of the Mind-Brain appeared), we find Thomas Nagel complaining about scientistic tendencies in philosophy, which “[put] one type of human understanding in charge of the universe and what can be said about it” (p. 9), and writing of the “powerful reductionist dogmas which seem to be part of the intellectual atmosphere we breathe.” Thomas Nagel, The View from Nowhere (1986), p. 82.

23. Quine, “Epistemology Naturalized”; “Natural Kinds” (1969).

24. See, e.g., Chauncey Wright, “The Evolution of Self-Consciousness” (1873); Konrad Lorenz, “Kant’s Doctrine of the ‘A Priori’ in the Light of Contemporary Biology” (1941); Donald T. Campbell, “Perception as Substitute Trial and Error” (1956).

25. Karl R. Popper, “Evolution and the Tree of Knowledge” (1972); “Natural Selection and the Emergence of Mind” (1987). (On p. 144 of the latter paper, Popper acknowledges that earlier [“Evolution and the Tree of Knowledge,” p. 241] he had suggested that natural selection was a tautology rather than a genuinely scientific, falsifiable statement.)

26. Michael Ruse, “The View from Somewhere: A Critical Defense of Evolutionary Epistemology” (2009), p. 251.

27. Hilary Kornblith, Inductive Inference and Its Natural Ground (1993).

28. Kornblith, Inductive Inference and Its Natural Ground, p. 2.

29. Kornblith, Inductive Inference and Its Natural Ground, p. 3. I will add, however, that, much to his credit, Kornblith firmly rejects Jaegwon Kim’s misreading of Quine as simply proposing to make epistemology descriptive, which is all too often taken to be canonical. Jaegwon Kim, “What is Naturalized Epistemology?” (1988).

30. Nicholas Rescher also seems to assume, with Quine, that empirical questions must be scientific. Rescher, A Useful Inheritance: Evolutionary Aspects of the Theory of Knowledge (1990), p.74, n.11. But because his main concern is only to argue that epistemology may appeal to evolutionary considerations without falling into a vicious circle, Rescher is not in the same danger of succumbing to scientism as Kornblith.

31. The Wikipedia article on experimental philosophy makes a passing reference to this earlier use. “Experimental philosophy,” p. 1, citing P. Anstey and A. Vanzo, “The Origins of Early Modern Experimental Philosophy” (2012). It’s not clear, however, that many supporters of the current movement have any awareness of this history.

32. Peirce, Collected Papers, 4.71 (1893). Peirce’s target, by the way, is Descartes, who, he writes, marks the time when philosophy “put off childish things and began to be a conceited young man.”

33. If you Google “experimental philosophy anthem” you find a YouTube video showing an armchair gradually burning to ashes, accompanied by the very loud but almost unintelligible anthem. After several painful attempts to listen to it, all I can tell you is that I think I heard these words.

34. Joshua Knobe and Shaun Nichols, eds., “Introduction,” Experimental Philosophy (2008), p. 10.

35. In this context, Stich is sometimes mentioned as having done pioneering work. So I feel obliged to note that when, in The Fragmentation of Reason, Stich claimed that the concept of knowledge was culturally variable, he did so on the basis of nothing more than one article about the Yoruba language! See Evidence and Inquiry, p. 256.

36. Thus far they seem to have focused primarily on western vs. other and on male vs. female subjects. I noticed that the non-westerners were sometimes (presumably) Chinese students in Hong Kong—the most westernized part of China—and sometimes students at U.S. universities from India, Pakistan, and Bangladesh; and that in the latter case we weren’t told whether these students had come to the U.S. from the sub-continent of India, or were Americans of south-Asian descent.

37. Arne Naess, “‘Truth’ as Conceived by Those who are not Professional Philosophers” (1938). Naess’s work is not, by the way, something obscure that experimental philosophers can easily be forgiven for not knowing about; Tarski refers to it in the shorter and less technical of his well-known papers on truth, “The Semantic Conception of Truth” (1944), which was what first led me to it. (At a 2007 workshop, after Prof. Knobe had conducted one of his “experiments” on the participants, I tactfully drew him aside and told him about Naess’s work. I was somewhat shaken that he seemed not in the least embarrassed but, instead, delighted that Naess had, as he put it, anticipated him.)

38. Peirce, in particular, stresses that words acquire meaning and concepts become richer and thicker as our knowledge grows. See, e.g., Collected Papers 7.587 (c. 1866-67), 2.302 (c. 1895). See also Susan Haack, “The Growth of Meaning and the Limits of Formalism, in Science and Law” (2009).

39. In Evidence and Inquiry I also marked this distinction typographically: “PSYCHOLOGY” for the broad sense, and “psychology” for the narrow (p. 210).

40. Indeed, Jesse Prinz proposes this as a distinguishing mark of experimental philosophy, as opposed to empirical philosophy, by which he means philosophy that calls on the work of professional psychologists. Jesse Prinz, “Empirical Philosophy and Experimental Philosophy,” p. 196. (Of course, I reject the idea that “empirical philosophy” can only be scientistic philosophy, Goldman-style.)

41. Joshua Alexander, Experimental Philosophy: An Introduction (2012), gives numerous examples.

42. See, e.g., in Knobe and Nichols, Experimental Philosophy: Edouard Machery, Ron Mallon, Shaun Nichols, and Stephen P. Stich, “Semantics, Cross-Cultural Style”; questions were presented in English to students in the U.S. and Hong Kong (p. 52). Robert L. Woolfolk, John M. Doris, and John M. Darley, “Identification, Situational Constraint, and Social Cognition”; questions were put to 72 subjects, all undergraduates in philosophy classes at the University of California, Santa Cruz (p. 65). Shaun Nichols and Joshua Knobe, “Moral Responsibility and Determinism”; all their studies were conducted on students from the University of Utah (p. 110). Fiery Cushman and Alfred Mele, “Intentional Action”; all the subjects in one of their studies were students at Florida State University (p. 180). I found one exception: Jonathan M. Weinberg, Shaun Nichols, and Stephen P. Stich, “Normativity and Epistemic Intuitions,” where the subjects in one study were all undergraduates at Rutgers University (p. 26), but another study actually approached people on the street in New Brunswick (p. 39). (The latter subjects, we are told, had to be paid to induce them to participate; I could find no indication, however, of how many declined.)

43. Edmund Gettier, “Is Justified True Belief Knowledge?” (1963).

44. Peirce, Collected Papers, 5.12 (1902).

45. Alexander Bain, The Emotions and the Will (1859; third ed., 1875).

46. For example, in many of the studies in Knobe and Nichols the number of subjects (when it is given) was quite small; the authors of the studies described in note 42 above don’t tell us who handed out the questionnaires, or in what terms; but several tell us that they predicted such-and-such result, and this prediction was confirmed—which, in the circumstances, leaves one wondering about the possibility of confirmation bias.

47. James Ladyman and Don Ross, with David Spurrett and John Collier, Every Thing Must Go: Metaphysics Naturalized (2007). In what follows, rather than spend a lot of time trying to figure out exactly who is responsible for what in the book, I will, with apologies to Spurrett and Collier, refer to the authors simply as “Ladyman and Ross.”

48. Ladyman and Ross, Every Thing Must Go, p. vii.

49. Ladyman and Ross, Every Thing Must Go, §1.2.

50. Ladyman and Ross, Every Thing Must Go, p. 27.

51. There is one actual quotation from Peirce (p. 129). But I suspect this may be second-hand; in any case, it is taken entirely out of context, and the bibliographical reference, to “Peirce (1960-6),” is hardly reassuring. There is also one crashing misconception: that Peirce conceived of induction as “inference to the best explanation” (p. 255)—a misconception that seems to come, not from Putnam, but from the kind of folk history of philosophy popular in our now blithely ahistorical profession. If the index can be relied on, there are no references anywhere in the book to William James, John Dewey, or George Herbert Mead.

52. This caveat is meant quite seriously. The book is written in such impenetrable prose, so heavily larded with the painful jargon of recent debates in philosophy of science, and so constantly interrupted by long parenthetical strings of names and dates, that I found it hard to keep going.

53. Ladyman and Ross, Every Thing Must Go, p. 1.

54. Ladyman and Ross, Every Thing Must Go, p. 28. Looking for further explanation of this idea, I discovered that the book has no index entry for “consilience,” and that the bibliography includes neither William Whewell (who coined the word in 1840) nor E. O. Wilson (whose 1998 book, Consilience: The Unity of Knowledge, put the word into circulation).

55. Ladyman and Ross, Every Thing Must Go, pp. 38-45.

56. There is some discussion of biology. But there are no entries in the index of Ladyman and Ross’s book for “sociology,” “social sciences,” or “economics” (or for “mental states,” “mind,” “propositional attitudes,” etc.) and the entry for “psychology” takes you to three pages most of which are devoted to arguing that the mind is not a computer.

57. Ladyman and Ross, Every Thing Must Go, p. 119.

58. I wanted to ask, “patterns of what?” But since they tell us that relata are constructed from relations (p. 154), apparently Ladyman and Ross think this question somehow answers itself.

59. Ladyman and Ross, Every Thing Must Go, p. 195.

60. Ladyman and Ross, Every Thing Must Go, p. 121 and, especially, p. 242.

61. Ladyman and Ross, Every Thing Must Go, p. 5.

62. Ladyman and Ross, Every Thing Must Go, p. 292.

63. Ladyman and Ross, Every Thing Must Go, p. 28.

64. Chapter 1 of Ladyman and Ross, Every Thing Must Go, is entitled “In Defence of Scientism.”

65. Ladyman and Ross’s reliance on appeal to the “institutional processes” of science seems to be motivated by their rejection of the idea that there is a special scientific method (p. 28). I agree that there is no method used by all and only scientists—a point I first argued in print in 1993 (Evidence and Inquiry, p. 187). But Ladyman and Ross tell us nothing about what “institutional processes” they have in mind, let alone what it is about these processes that they believe makes science “authoritative.”

66. See, e.g., Susan Haack, “The Integrity of Science: What It Means, Why It Matters” (2006); “Peer Review and Publication: Lessons for Lawyers” (2007); William Wilson, “Scientific Regress” (2016).

67. Ladyman and Ross, Every Thing Must Go, p. 209.

68. Rosenberg, The Atheist’s Guide to Reality, p. 6.

69. Rosenberg, The Atheist’s Guide to Reality, p. 52.

70. Rosenberg, The Atheist’s Guide to Reality, pp. 20, 81, 113, 162, 194, 219, 220, 222, 223, 242, 313.

71. I get the impression that Rosenberg may think the second claim follows from the first; but this would involve significant fudging. See Susan Haack, “Worthwhile Lives” (2001-02).

72. Rosenberg, The Atheist’s Guide to Reality, p. 93.

73. Rosenberg, The Atheist’s Guide to Reality, p. 109.

74. This is not said explicitly in the book; but it follows from what is said, and at the 2014 conference in Amsterdam to which I alluded in my introduction, Rosenberg seemed to acknowledge it.

75. Rosenberg, The Atheist’s Guide to Reality, p. 195.

76. Rosenberg, The Atheist’s Guide to Reality, p. 180.

77. Rosenberg, The Atheist’s Guide to Reality, p. 164. Subsequent chapters tell us that we can learn nothing from history and that free will is an illusion. Rosenberg recommends that, if we find all this depressing, we try Prozac. (For just a moment, I wondered unkindly if his problem mightn’t be that he’d taken his own advice!)

78. Aldous Huxley, Brave New World (1932), p. 231.

79. Haack, Defending Science, p. 52.

80. Haack, Defending Science, p. 160.

81. Peirce’s route from those observations anyone can make in any hour of his waking life to metaphysics relies on his “phaneroscopy,” or as we would say, “phenomenology” (Collected Papers 1.284 ff., c.1904, 1905); my procedure here, however, will run more directly parallel to his Critical Common-sensism (5.497 ff., c.1905).

82. Indeed, if there weren’t kinds and laws, there couldn’t have been people.

83. On the metaphysical issues, see Susan Haack, “Realisms and their Rivals” (2002); Defending Science, chapter 5; “The Real, the Fictional, and the Fake” (2013); “The World According to Innocent Realism” (2014/2016). On the epistemological questions, see Evidence and Inquiry chapter 4; Defending Science, chapters 3 and 4; “The Integrity of Science” (2006); “Epistemology: Who Needs It?” (2011/2015).

84. “The fastest [cheetah], appropriately named Ferrari, hit a top speed of 59 mph.” See here.

85. As we now know, chemical imbalances can affect a person’s mood, and a brain tumor can change someone’s personality quite dramatically. See, e.g., Jessie A. Seiden, “The Criminal Brain: Frontal Lobe Dysfunction Evidence in Capital Proceedings” (2004), p. 395, which summarizes the story of a Virginia schoolteacher who had “begun collecting child pornography, soliciting prostitutes, and making sexual advances to his stepdaughter.” He was found to have a large tumor displacing his orbitofrontal lobe; when the tumor was removed, “the deviant urges subsided.”

86. Peirce, Collected Papers 6.476 (1908).

87. Thomas Suddendorf, The Gap: The Science of What Separates Us from Other Animals (2013), p. 147.

88. Suddendorf, The Gap, p. 173.

89. I borrow the word from Mead; and those who know his work will see that it has been a significant influence on my thinking about these matters. George Herbert Mead, Mind, Self, and Society from the Standpoint of a Social Behaviorist (1934).

90. Nicholas Rescher wrote in 1990 that evolution may account for the physical processes involved in mental operations, but can’t account for their content; he even added that to understand the intentional we need to refer to the social, to culture. Rescher, A Useful Inheritance, pp. 123-24. Unfortunately, however, he hadn’t a great deal to say about the details of this, which are what will concern me in what follows.

91. Am I claiming that this was what I meant in 2003? Not exactly; if I’d got this far in 2003, I wouldn’t have had to spend so much time over the next decade or so figuring it out! But now I have figured it out, I see that it was implicit in what I said in Defending Science.

92. I suspect, by the way, that when aficionados of “neuroart” and “neuroaesthetics” tell us that artists are really neuroscientists studying the brain and the visual system, they are somehow forgetting that perception is a sensory interaction with something external, in this case an artwork.

93. Bain, The Emotions and the Will, p. 505.

94. This useful concept is borrowed from a now almost-forgotten philosopher, H. H. Price. Price, Belief (1969), pp. 267ff.

95. Robert Lee Hotz, “A Neuron’s Obsession Hints at Biology of Thought” (2009).

96. I’ve been very gradually amplifying and refining the core idea for many years since I first suggested it in 1993, in chapter 8 of Evidence and Inquiry: in chapter 6 of Defending Science; in “Belief in Naturalism: An Epistemologist’s Philosophy of Mind” (2010); and in “Brave New World: On Nature, Culture, and the Limits of Reductionism” (forthcoming).

97. This is the idea Peirce calls “tychism.” Peirce, Collected Papers, 6.7 ff (1891); “Man’s Glassy Essence,” 6.238 ff. (1892).

98. I think Ladyman and Ross acknowledge this; at least, that’s what a brief passage on pp. 25-26 of Every Thing Must Go suggests.

99. Rosenberg, The Atheist’s Guide to Reality, pp. 53 ff.

100. Rosenberg, The Atheist’s Guide to Reality, pp. 88-89.

101. As Peirce writes in “Man’s Glassy Essence,” “protoplasm is in an excessively unstable condition; and it is the characteristic of unstable equilibrium that near that point excessively minute causes may produce startlingly large effects.” Collected Papers, 6.264 (1892). See also “The Doctrine of Necessity Examined,” Collected Papers 6.35-65 (also 1892).

102. See Susan Haack, “The Fragmentation of Philosophy, the Road to Reintegration” (2016).

103. Peirce, Collected Papers, 1.235 (1902).

104. See also Susan Haack, “Serious Philosophy” (2016).

105. I rely on Charles Coulston Gillespie, The Edge of Objectivity: An Essay in the History of Scientific Ideas (1960), p. 117.

106. Santiago Ramon y Cajal, Advice for a Young Investigator (1923; English edition 1999), p. 32. The wonderful French phrase means, roughly, “the spirit of following-through.”

107. Francis Crick, What Mad Pursuit: A Personal View of Scientific Discovery (1988), p. 74.

108. I rely on Joseph Brent, Charles Sanders Peirce: A Life (1993), p. 16.

109. I suppose I’m naïve; but I was stunned when, after his talk at the Amsterdam conference I described earlier, Rosenberg happily filled a silence by observing that if Alex Rosenberg didn’t exist, he would have to be invented.

110. Peirce, Collected Papers, 1.131 (c.1897).

111. See also Susan Haack, “Out of Step: Academic Ethics in a Preposterous Environment” (2013).

112. That is, boosting the number of your publications by cutting your work into many small pieces and publishing the pieces as stand-alone articles—which makes your CV look more impressive to a casual reader, but also makes it harder for others to build on your findings.

113. Peirce, Collected Papers, 8.170 (c.1903).

114. Peirce, Collected Papers, 5.522 (c.1905).

115. Peirce, Collected Papers, 6.6 (c.1903).


Abrahamsen, David, The Psychology of Crime (New York: Columbia University Press, 1967).

Alexander, Joshua, Experimental Philosophy: An Introduction (Malden, MA: Polity Press, 2012).

Anstey, Peter, and Alberto Vanzo, “The Origins of Early Modern Experimental Philosophy,” Intellectual History Review 22, no.4 (2012): 499-518.

Armstrong, David, “Bitter Pill: How the New England Journal of Medicine Missed Warning Signs on Vioxx—Medical Weekly Waited Years to Report Flaws in Article that Praised Pain Drug—Merck Seen as ‘Punching Bag,’” Wall Street Journal, May 15, 2006, A1, A10.

Armstrong, David, and Keith J. Winstein, “Antidepressants under Scrutiny over Efficacy—Sweeping Overview Suggests Suppression of Negative Data has Distorted View of Drugs,” Wall Street Journal, January 17, 2008, D1.

Avery, Oswald T., Colin M. MacLeod, and Maclyn McCarty, “Studies of the Chemical Nature of the Substance Inducing Transformation in Pneumococcal Types” (1944), reprinted in Conceptual Foundations of Genetics: Selected Readings, eds. Harry O. Corwin and John B. Jenkins (Boston: Houghton Mifflin, 1976), 13-27.

Bacon, Francis, Works, eds. James Spedding, Robert L. Ellis, and Douglas D. Heath (London: Longmans, Green and Co., 1857-74).

Bain, Alexander, The Emotions and the Will (London: Longmans, Green and Co., 1859; third edition, 1875).

Barlow, Nora, Charles Darwin and the Voyage of the Beagle (New York: Philosophical Library, 1946).

Beecher-Monas, Erica, “Reality Bites: The Illusion of Science in Bite-Mark Evidence,” Cardozo Law Review 30, no.4 (2009): 1369-1410.

Begley, Sharon, “The Ancient Mariners: Forget the horned helmets: the Vikings were traders as well as raiders, remaking Western Europe—and sailing to America,” Newsweek, April 3, 2000, 48-54.

Bentham, Jeremy, “Anarchical Fallacies: Being an Examination of the Declaration of Rights issued during the French Revolution,” in John Bowring, ed., The Works of Jeremy Bentham (Edinburgh: William Tait, 1843), vol. II, 489-534.

Bergmann, Gustav, Philosophy of Science (Madison, WI: University of Wisconsin Press, 1957).

Bock, Peter, Getting It Right: R&D Methods for Science and Engineering (London: Harcourt Brace, 2001).

Bohannon, John, “Who’s Afraid of Peer Review?” Science (October 4, 2013), available here.

Bombardier, Claire, et al., “Comparison of Upper Gastrointestinal Toxicity of Rofecoxib and Naproxen in Patients with Rheumatoid Arthritis,” New England Journal of Medicine 343, no.21 (2000): 1520–28.

Bonney, Rick, et al., “Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy,” Bioscience 59 (2009): 977-84.

Brent, Joseph, Peirce: A Life (Bloomington and Indianapolis: Indiana University Press, 1993).

Bridgman, Percy, “The Struggle for Intellectual Integrity” (1933), in Bridgman, Reflections of a Physicist, 361-79.

Bridgman, Percy, “The Prospect for Intelligence” (1945), in Bridgman, Reflections of a Physicist, 526-52.

Bridgman, Percy, Reflections of a Physicist (1952; second edition, New York: Philosophical Library, 1955).

Buderi, Robert, and Joseph Wisnovsky, “Science: Beaming in on the Past,” Time, March 10, 1986, 75.

Campbell, Donald T., “Perception as Substitute Trial and Error,” Psychological Review 63 (1956): 330-42.

Carey, Stephen S., A Beginner’s Guide to Scientific Method (1994; Boston: Wadsworth, fourth edition, 2011).

Chan, Man Ho, “Would God Create our Universe through Multiverses?” Theology and Science 13, no.4 (2015): 395-408.

Churchland, Patricia Smith, Neurophilosophy: Toward a Unified Science of the Mind-Brain (Cambridge, MA: Bradford Books, 1986).

Churchland, Patricia Smith, “Epistemology in the Age of Neuroscience,” Journal of Philosophy 84, no.10 (1987): 544-53.

Churchland, Paul M., Scientific Realism and the Plasticity of Mind (Cambridge: Cambridge University Press, 1979).

Churchland, Paul M., “Eliminative Materialism and the Propositional Attitudes,” Journal of Philosophy 78, no.2 (1981): 67-89.

Churchland, Paul M., A Neurocomputational Perspective: The Nature of Mind and the Structure of Science (Cambridge, MA: MIT Press, 1989).

Clegg, Brian, Ten Billion Tomorrows (New York: St. Martin’s, 2015).

Collins, Robin, “The Fine-Tuning Evidence is Convincing,” in J. P. Moreland et al., eds., Debating Christian Theism (Oxford: Oxford University Press, 2013), 35-46.

Conant, James, Modern Science and Modern Man (Columbia University: Bampton Lectures in America, 5) (1952), available here.

Copetti, Massimiliano, et al., “Advances in Meta-Analysis: Examples from Internal Medicine to Neurology,” Neuroepidemiology 42 (2013): 59-67.

Counts, George S., and Nucia Lodge, The Country of the Blind (Boston: Houghton Mifflin, 1949).

Crick, Francis, What Mad Pursuit: A Personal View of Scientific Discovery (New York: Basic Books, 1988).

Cushman, Fiery, and Alfred Mele, “Intentional Action,” in Knobe and Nichols, eds., Experimental Philosophy, 171-88.

Darwin, Charles, letter to Joseph Hooker (January 11, 1844), in Darwin Correspondence Project, letter no. 729, available here.

Davies, Paul, Cosmic Jackpot: Why Our Universe is Just Right for Life (New York: Houghton Mifflin, 2007).

Dawkins, Richard, “The Future Looks Bright,” The Guardian, June 21, 2003, available here.

Damasio, Antonio, et al., eds., Unity of Knowledge: The Convergence of Natural and Human Sciences, Annals of the New York Academy of Sciences, 935 (May 2001).

Dennett, Daniel, “The Bright Stuff,” New York Times, July 12, 2003, available here.

Dewey, John, Logic, the Theory of Inquiry (New York: Henry Holt, 1938).

Diderot, Denis, Addition aux pensées philosophiques (c.1762), in J. Assézat-Tourneux, ed., Oeuvres Complètes (Paris: Garnier Frères, 1863/1875).

Einstein, Albert, “Physics and Reality,” in Ideas and Opinions of Albert Einstein (New York: Crown Publishers, 1954).

“Experimental philosophy,” available here.

FBI, Forensic Science Communications 11, no.4 (2009).

Gettier, Edmund, “Is Justified True Belief Knowledge?” Analysis 23 (1963): 121-23.

Gillispie, Charles Coulston, The Edge of Objectivity: An Essay in the History of Scientific Ideas (Princeton, N.J.: Princeton University Press, 1960).

Glashow, Sheldon, “The Death of Science,” in Richard Q. Elvee, ed., The End of Science: Attack and Defense (Lanham, MD: University Press of America, 1992), 23-32.

Glenney, Brian, “Molyneux’s Question,” Internet Encyclopedia of Philosophy, available here.

Göhner, Julia, and Eva-Maria Jung, eds., Susan Haack: Reintegrating Philosophy (Berlin: Springer Verlag, 2016).

Goldman, Alvin I., Epistemology and Cognition (Cambridge, MA: Harvard University Press, 1986).

Gopnik, Alison, Andrew N. Meltzoff, and Patricia K. Kuhl, The Scientist in the Crib: What Early Learning Tells us about the Mind (New York: William Morrow and Company, 1999).

Gauch, Hugh G., Scientific Method in Practice (Cambridge: Cambridge University Press, 2003).

Gross, John, ed., The Book of Aphorisms (Oxford: Oxford University Press, 1983).

Haack, Susan, Evidence and Inquiry (1993; expanded edition, Amherst, NY: Prometheus Books, 2009).

Haack, Susan, “Worthwhile Lives” (2001-02), in Haack, Putting Philosophy to Work, 229-33 (text) and 310 (notes).

Haack, Susan, “Realisms and their Rivals,” Facta Philosophica 4, no.1 (March 2002): 67-88.

Haack, Susan, Defending Science—Within Reason: Between Scientism and Cynicism (Amherst, NY: Prometheus Books, 2003).

Haack, Susan, “Fallibilism and Faith, Naturalism and the Supernatural, Science and Religion” (2005), in Haack, Putting Philosophy to Work, 199-208 (text) and 306-07 (notes).

Haack, Susan, “Trial and Error” (2005), in Haack, Evidence Matters, 104-21.

Haack, Susan, “The Integrity of Science: What It Means, Why It Matters” (2006), in Haack, Putting Philosophy to Work, 121-40 (text) and 283-88 (notes).

Haack, Susan, “Peer Review and Publication: Lessons for Lawyers” (2007), in Haack, Evidence Matters, 156-79.

Haack, Susan, Putting Philosophy to Work: Inquiry and Its Place in Culture (Amherst, NY: Prometheus Books, 2008; expanded edition, 2013). Page references in the endnotes and the bibliography are to the 2013 edition.

Haack, Susan, “The Growth of Meaning and the Limits of Formalism, in Science and Law,” Análisis Filosófico XXIX, no.1 (May 2009): 5-29.

Haack, Susan, “Belief in Naturalism: An Epistemologist’s Philosophy of Mind,” Logos & Episteme 1, no.1 (2010): 1-22.

Haack, Susan, “Six Signs of Scientism” (2010), in Haack, Putting Philosophy to Work, 105-20 (text) and 278-83 (notes).

Haack, Susan, “Epistemology: Who Needs It?” (first published, in Danish, in 2011), Cicilia Journal of Philosophy 3 (2015):1-15; and Filosofia UNISINOS 16, no.2 (2015): 183-93.

Haack, Susan, “Just Say ‘No’ to Logical Negativism” (first published, in Chinese, in 2011) in Haack, Putting Philosophy to Work, 179-94 (text) and 298-305 (notes).

Haack, Susan, “Out of Step: Academic Ethics in a Preposterous Environment,” in Putting Philosophy to Work, 251-68 (text) and 313-17 (notes).

Haack, Susan, “The Real, the Fictional, and the Fake,” Spazio Filosofico 8 (2013): 209-17.

Haack, Susan, “Credulity and Circumspection: Epistemological Character and the Ethics of Belief,” Proceedings of the American Catholic Philosophical Association 88 (2014): 27-47.

Haack, Susan, Evidence Matters: Science, Proof, and Truth in the Law (New York: Cambridge University Press, 2014).

Haack, Susan, “The Expert Witness: Lessons from the U.S. Experience,” Humana Mente, 28 (2015): 349-70.

Haack, Susan, “The Fragmentation of Philosophy, the Road to Reintegration,” in Göhner and Jung, eds., Susan Haack: Reintegrating Philosophy, 3-32.

Haack, Susan, “The World According to Innocent Realism: The One and the Many, the Real and the Imaginary, the Natural and the Social” (first published, in German, in 2014) in Göhner and Jung, eds., Susan Haack: Reintegrating Philosophy, 33-55.

Haack, Susan, “Serious Philosophy,” Spazio filosofico 18 (2016): 395-407.

Haack, Susan, “Brave New World: Nature, Culture, and the Limits of Reductionism,” in Bartosz Brozek and Jerzy Stelmach, eds., Explaining the Mind (Kraków, Poland: Copernicus, forthcoming).

Harris, Paul, and Alison Flood, “Literary Critics Scan the Brain to Find out Why We Love to Read,” Observer, April 11, 2010: 3.

Haug, Charlotte J., “Peer-Review Fraud—Hacking the Scientific Publication Process,” New England Journal of Medicine 373, no.25 (December 17, 2015): 2393-95.

Heinonen, Olli P., Dennis Slone, and Samuel Shapiro, Birth Defects and Drugs in Pregnancy (Littleton, MA: Publishing Sciences Group, 1977).

Held, Richard, et al., “Newly Sighted Cannot Match Seen with Felt,” Nature Neuroscience 14 (2011): 551-53.

Hotz, Robert Lee, “A Neuron’s Obsession Hints at Biology of Thought,” Wall Street Journal, October 9, 2009, A14.

Hsu, Spencer S., “FBI Admits Flaws in Hair Analysis over Decades,” Washington Post, April 18, 2015, available here.

Hsu, Spencer S., “Judge Orders D.C. to Pay $13.2 million in Wrongful FBI Hair Conviction Case,” Washington Post, February 28, 2016, available here.

Hutten, E. H., The Language of Modern Physics (London: Allen and Unwin, 1956).

Huxley, Aldous, Brave New World (1932; New York: Harper Perennial Modern Classics, 2006).

Huxley, Thomas H., On the Educational Value of the Natural History Sciences (London: John van Voorst, 1854).

Ioannidis, John P. A., “Contradicted and Initially Stronger Effects in Highly Cited Clinical Research,” Journal of the American Medical Association 294 (2005): 218-28.

James, William, “The Present Dilemma in Philosophy” (1906), in Pragmatism (1907), eds. Frederick Burkhardt and Fredson Bowers (Cambridge, MA: Harvard University Press, 1976), 9-26.

Judson, Horace Freeland, The Eighth Day of Creation: Makers of the Revolution in Biology (New York: Simon and Schuster, 1979).

Kaplan, Matt, Science of the Magical (New York: Scribner, 2015).

Kauffman, Stuart A., “Prolegomenon to a Future Biology,” in Damasio et al., eds., Unity of Knowledge, 18-36.

Kim, Jaegwon, “What is Naturalized Epistemology?” Philosophical Perspectives 2 (Atascadero, CA: Ridgeview, 1988), 381-405.

Kipling, Rudyard, “If” (1910), in James Cochrane, ed., Kipling: Poems (London: Penguin Books, 1977), 357-58.

Knobe, Joshua, and Shaun Nichols, eds., Experimental Philosophy (New York: Oxford University Press, 2008).

Kornblith, Hilary, Inductive Inference and Its Natural Ground (Cambridge, MA: Bradford Books, 1993).

Kuhn, Thomas, The Structure of Scientific Revolutions (Chicago: University of Chicago Press, 1962).

Ladyman, James, and Don Ross, with David Spurrett and John Collier, Every Thing Must Go: Metaphysics Naturalized (New York: Oxford University Press, 2007).

Laudan, Larry, “The Demise of the Demarcation Problem” (1983), in Michael Ruse, ed., But Is It Science? The Philosophical Question in the Creation/Evolution Controversy (Amherst, NY: Prometheus Books, 1996), 337-50.

Lefkowitz, Mary, Not Out of Africa (New York: Basic Books, 1997).

Locke, John, Essay Concerning Human Understanding (second edition, 1694), ed. Peter H. Nidditch (Oxford: Clarendon Press, 1979).

Loemker, Leroy E., Gottfried Wilhelm Leibniz: Philosophical Papers and Letters (Dordrecht, the Netherlands: Reidel, 1956; second edition, 1970).

Lorenz, Konrad, “Kant’s Doctrine of the ‘A Priori’ in the Light of Contemporary Biology” (1941), in H. C. Plotkin, ed., Learning, Development, and Culture: Essays in Evolutionary Epistemology (Chichester: Wiley, 1982), 121-43.

Machery, Edouard, Ron Mallon, Shaun Nichols, and Stephen P. Stich, “Semantics, Cross-Cultural Style,” in Knobe and Nichols, eds., Experimental Philosophy, 47-60.

Maddox, John, What Remains to be Discovered: Mapping the Secrets of the Universe, the Origins of Life, and the Future of the Human Race (New York: Simon and Schuster, 1998).

Mead, George Herbert, Mind, Self, and Society from the Standpoint of a Social Behaviorist (Chicago: University of Chicago Press, 1934).

Medawar, Peter, “Science and Literature,” Encounter XXXI, no.1 (1969): 15-23.

Mégraud, Francis, “A Humble Bacterium Sweeps This Year’s Nobel Prize,” Cell 123 (December 16, 2005): 975-76.

Merz, John T., A History of European Thought in the Nineteenth Century (Edinburgh: W. Blackwood and Sons, 1896).

Murray, Sir James, A New English Dictionary on Historical Principles (Oxford: Clarendon Press, 1884-1933) (10 vols.).

Naess, Arne, “‘Truth’ as Conceived by Those who are not Professional Philosophers,” Skrifter Utgitt av det Norske Videnskaps-Akademi i Oslo, II, Hist.-Filos. Klasse, no.4 (1938): 11-118.

Nagel, Thomas, The View from Nowhere (New York: Oxford University Press, 1986).

Nichols, Shaun, and Joshua Knobe, “Moral Responsibility and Determinism,” in Knobe and Nichols, eds., Experimental Philosophy, 105-28.

Pacheco, Igor, Brian Cerchiai, and Stephanie Stoiloff, “Miami-Dade Research Study for the Reliability of the ACE-V Process: Accuracy and Precision in Latent Fingerprint Examinations” (unpublished report of a study supported by the U.S. Department of Justice, 2014), available here.

Peirce, C. S., Collected Papers, eds. Charles Hartshorne, Paul Weiss, and (vols. 7 and 8) Arthur Burks (Cambridge, MA: Harvard University Press, 1931-58). (References given in the endnotes are by volume and paragraph number, followed by the original date of the passage cited, as given by the editors.)

Popper, Karl R., “Evolution and the Tree of Knowledge” (based on a lecture delivered in 1961), in Popper, Objective Knowledge (London: Oxford University Press, 1972), 256-84.

Popper, Karl R., “Natural Selection and the Emergence of Mind,” in Gerard Radnitsky and W. W. Bartley III, eds., Evolutionary Epistemology, Rationality, and the Sociology of Knowledge (La Salle, IL: Open Court, 1987), 139-54.

Price, H. H., Belief (London: Allen and Unwin, 1969).

Prinz, Jesse, “Empirical Philosophy and Experimental Philosophy,” in Knobe and Nichols, eds., Experimental Philosophy, 189-208.

Quine, W. V., “Two Dogmas of Empiricism” (1951), in Quine, From a Logical Point of View (1952; second edition, New York: Harper Torchbooks, 1961), 20-46.

Quine, W. V., Word and Object (Cambridge, MA: MIT Press, 1960).

Quine, W. V., “Epistemology Naturalized,” in Quine, Ontological Relativity and Other Essays, 69-90.

Quine, W. V., “Natural Kinds,” in Quine, Ontological Relativity and Other Essays, 114-36.

Quine, W. V., Ontological Relativity and Other Essays (New York: Columbia University Press, 1969).

Ramon y Cajal, Santiago, Advice for a Young Investigator (1923). English translation by Neely Swanson and Larry W. Swanson (Cambridge, MA: MIT Press, 1999).

Ravetz, Jerome, Scientific Knowledge and Its Social Problems (Oxford: Clarendon Press, 1971).

Rescher, Nicholas, A Useful Inheritance: Evolutionary Aspects of the Theory of Knowledge (Savage, MD: Rowman and Littlefield, 1990).

Roe, J. E., Sir Francis Bacon’s Own Story (Rochester, NY: The Du Bois Press, 1918).

Rosenberg, Alex, The Atheist’s Guide to Reality: Enjoying Life without Illusions (New York: W.W. Norton, 2011).

Ross, Sidney, “‘Scientist’: The Story of a Word,” Annals of Science 18 (1962): 65-85.

Ruse, Michael, “The View from Somewhere: A Critical Defense of Evolutionary Epistemology,” in Michael Ruse, ed., Philosophy after Darwin: Classic and Contemporary Readings (Princeton: Princeton University Press, 2009), 247-75.

Saler, Michael, “The Ship of the Imagination,” Wall Street Journal, December 19-20 2015, C5.

Schlick, Moritz, “The Turning Point in Philosophy” (1930-31), in A. J. Ayer, ed., Logical Positivism (New York: Free Press, 1959), 53-59.

Seiden, Jessie A., “The Criminal Brain: Frontal Lobe Dysfunction Evidence in Capital Proceedings,” Capital Defense Journal 16, no.2 (2004): 395-419.

Shermer, Michael, “The Shamans of Scientism,” Scientific American 287, no.3 (September 2002): 35.

Snow, John, On the Mode of Communication of Cholera (London: John Churchill, 1855); reprinted, with other material, in Snow on Cholera (Cambridge, MA: Harvard Medical Library, 1956).

Soifer, Valerii, The Tragedy of Soviet Science (New Brunswick, NJ: Rutgers University Press, 1994).

Stegenga, Jacob, “Is Meta-Analysis the Platinum Standard of Evidence?” Studies in History and Philosophy of Biological and Biomedical Sciences 42 (2011): 497-507.

Stich, Stephen P., From Folk Psychology to Cognitive Science (Cambridge, MA: MIT Press, Bradford Books, 1983).

Stich, Stephen P., The Fragmentation of Reason (Cambridge, MA: MIT Press, Bradford Books, 1992).

Suddendorf, Thomas, The Gap: The Science of What Separates Us from Other Animals (New York: Basic Books, 2013).

Szasz, Thomas, The Second Sin (New York: Anchor Books, 1973).

Tallis, Raymond, Aping Mankind: Neuromania, Darwinitis, and the Misrepresentation of Humanity (Durham, U.K.: Acumen, 2011).

Tarski, Alfred, “The Semantic Conception of Truth” (1944), in Herbert Feigl and Wilfrid Sellars, eds., Readings in Philosophical Analysis (New York: Appleton-Century-Crofts, 1949), 52-84.

Todhunter, Isaac, William Whewell: An Account of his Writings (London: Macmillan, 1876).

Truesdell, C., Essays in the History of Mechanics (New York: Springer, 1968).

von Hayek, Friedrich, “Scientism and the Study of Society,” Economica, August 1942: 267-91.

Wassertheil-Smoller, Sylvia, Biostatistics and Epidemiology: A Primer for Health and Biomedical Professionals (New York and Berlin: Springer, 1990; third edition, 2004).

Watson, James D., The Double Helix (1968); ed. Gunther Stent (New York: W. W. Norton, 1980).

Weinberg, Jonathan M., Shaun Nichols, and Stephen P. Stich, “Normativity and Epistemic Intuitions,” in Knobe and Nichols, eds., Experimental Philosophy, 17-45.

Weinberg, Steven, To Explain the World: The Discovery of Modern Science (New York: HarperCollins, 2015).

Whewell, William, “On the Connexion of the Physical Sciences, by Mrs. Somerville,” Quarterly Review 51 (1834): 54-68.

Wilson, E. O., Consilience: The Unity of Knowledge (New York: Knopf, 1998).

Wilson, Timothy D., “Strangers to Ourselves: The Origin and Accuracy of Beliefs about One’s Own Mental States,” in John. H. Harvey and Gifford Weary, eds., Attribution: Basic Issues and Applications (Orlando, FL: Academic Press, 1985), 9-36.

Wilson, William A., “Scientific Regress,” First Things (May 2016): 37-42.

“Wish you were here,” Oxford Today 10, no.3 (Trinity 1998): 40.

Woolfolk, Robert L., John M. Doris, and John M. Darley, “Identification, Situational Constraint, and Social Cognition,” in Knobe and Nichols, eds., Experimental Philosophy, 62-80.

Wootton, David, The Invention of Science: A New History of the Scientific Revolution (New York: HarperCollins, 2015).

Wright, Chauncey, “The Evolution of Self-Consciousness” (1873), in Wright, Philosophical Discussions (New York: Henry Holt, 1877), 199-266.

Zeki, Semir, and Oliver Goodenough, eds., Law & the Brain (Oxford: Oxford University Press, 2004).

Cases Cited

Abington Sch. Dist. v. Schempp, 374 U.S. 203 (1963).

Blum v. Merrell Dow Pharms, Inc., 33 Phila. Cnty. Rep. 193 (1996).

Daubert v. Merrell Dow Pharm., Inc., 509 U.S. 579 (1993).

Edwards v. Aguillard, 482 U.S. 578 (1987).

Kitzmiller v. Dover Area Sch. Dist., 400 F.Supp. 2d 707 (M.D. Pa. 2005).

Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999).

Lemon v. Kurtzman, 411 U.S. 192 (1973).

McLean v. Arkansas Bd. of Ed., 529 F. Supp. 1255 (E. D. Arkansas, W. D., 1982).


Text copyright © 2017 Susan Haack

Cover image: Dasha Lebesheva

This version is published under a Creative Commons Attribution-NonCommercial-NoDerivatives licence. This licence allows the content to be downloaded and shared with others, as long as attribution is credited to the original. The content may not be re-used commercially or altered in any way, including re-use of only extracts or parts. To view a copy of this licence, visit here.

About Rounded Globe

Rounded Globe is a publishing venture situated on the border between scholarly research and the reading public. Our goal is to disseminate accessible scholarship beyond the borders of the academic world.

Accessibility has two sides: our ebooks are free from jargon and narrow disciplinary focus; and they are released under a legal license that allows readers to download an ebook for free if they cannot afford to purchase it.

For a list of our titles please visit our website.