In New Mexico, the mountains' naked strata compel historical, even geological, perspectives. The human culture, too, is stratified. Scattered in villages throughout the region are Native Americans such as the Tewa, whose creation myths rival those of modern cosmology in their intricacy. Exotic forms of Christianity thrive among both the Indians and the descendants of the Spaniards who settled here several centuries ago. In the town of Truchas, a sect called the Hermanos Penitentes seeks to atone for humanity's sins by staging mock crucifixions and practicing flagellation.
Lying lightly atop these ancient belief systems is the austere but dazzling lamina of science. Slightly more than half a century ago physicists at the Los Alamos National Laboratory demonstrated the staggering power of their esoteric formulas by detonating the first atomic bomb. Thirty miles to the south, the Santa Fe Institute was founded in 1985 and now serves as the headquarters of the burgeoning study of complex systems. At both facilities, some of the world's most talented investigators are seeking to extend or transcend current explanations about the structure and history of the cosmos.
The mix of nature, ancient culture and contemporary science makes New Mexico an ideal place in which to contemplate the future of science. The precipitous pace of research in New Mexico and elsewhere at the close of the millennium has given rise, paradoxically, to two opposing views of the scientific enterprise. On the one hand, physicists such as the Nobel laureate Steven Weinberg of the University of Texas have proclaimed that researchers are converging on a "final theory," one that will encompass the basic laws governing the physical realm. On the other hand, many philosophers and others question whether science can ever arrive at the truth. Given the speed with which one theory has succeeded another, these skeptics argue, how can scientists ever be sure that their current theories are right?
In this struggle between belief and disbelief, scientists have always had a potent ally: science writers. Most journalists covering science assume that the laws of the universe are "like veins of gold, and that scientists are extracting the ore," as George Johnson puts it. Now Johnson, a reporter for the New York Times, has veered away from the pack with a brilliant new book, one that raises unsettling questions about science's claims to truth.
Fire in the Mind is on one level a conventional--albeit exceptionally well-executed--work of science journalism. The author of three previous books (on memory, artificial intelligence and the role of conspiracy theories in American politics), Johnson provides an up-to-the-minute survey of the most exciting and philosophically resonant fields of modern research. This achievement alone would make his book worth reading. His accounts of particle physics, cosmology, chaos, complexity, evolutionary biology and related developments are both lyrical and lucid. They made me realize, somewhat to my consternation, how poorly I had grasped David Bohm's pilot-wave interpretation of quantum mechanics, or the links between information theory and thermodynamics.
What sets Fire in the Mind apart from other science books is its profound questioning of such theories. Science can also be viewed as "a man-made edifice that is historical, not timeless," Johnson suggests, "one of many alternative ways of carving up the world." His reports on the latest attempts to explain the origin of life or of the entire universe alternate with descriptions of New Mexico's primordial terrain and of the religious views of the Penitentes and the Tewas. These theologies, Johnson notes, stem from the same human passion for order--and the same faith that such order exists--that quantum mechanics or the big bang theory do. To aliens from an advanced civilization, or to human scientists a thousand years hence, might not both sets of beliefs appear almost equally primitive, equally distant from The Truth (assuming that classical concept remains viable)?
It is a testament to Johnson's rhetorical skills, and to his intimate knowledge of both science and philosophy, that he forces us to take such possibilities seriously. Fire in the Mind is a subversive work, all the more so because it is so subtle. Johnson's style is less polemical than poetic: he advances his position through analogy, implication, innuendo. That may be why previous reviewers of Fire in the Mind, including the evolutionary biologist Stephen Jay Gould of Harvard University, seem not to have appreciated just how serious an assault Johnson has mounted against the concept of objective knowledge.
Johnson chips away at science's foundations with tools drawn from science itself. Physicists have demonstrated that even some apparently simple systems are chaotic; that is, minute perturbations of nature (the puff of the proverbial butterfly's wing in Iowa) can trigger a cascade of utterly unpredictable consequences (a monsoon in Indonesia). These arguments also apply to our own mental faculties. Neuroscientists often emphasize that the brain, far from being a perfect machine for problem solving, was cobbled together by natural selection out of whatever happened to be at hand.
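The sensitivity Johnson invokes can be made concrete with a standard toy model of chaos, the logistic map; this sketch is an illustration of the general idea, not an example from the book:

```python
# Sensitive dependence on initial conditions in the logistic map
# x -> r * x * (1 - x), a textbook example of a chaotic system.
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the final value."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)  # perturbed by one part in a million
# After 50 iterations the tiny perturbation has been amplified so much
# that the two trajectories are effectively uncorrelated.
print(a, b)
```

The rule itself is utterly deterministic; it is the amplification of minute differences, step after step, that makes long-range prediction hopeless.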
Johnson proposes that science, too, might be "a construction of towers that just might have been built another way." Although he never mentions The Structure of Scientific Revolutions, Johnson's view evokes the one set forth in 1962 by the philosopher Thomas Kuhn. In that book's coda, Kuhn compared the evolution of science to the evolution of life. Biologists, he noted, have banished the teleological notion that evolution advances toward anything--including the clever, featherless biped known as Homo sapiens. In the same way, Kuhn suggested, scientists should eschew the illusion that science is evolving toward a perfect, true description of nature. In fact, Kuhn asserted, neither life nor science evolves toward anything, but only away from something; science is thus as contingent, as dependent on circumstance, as life is. Johnson espouses a similar view, though more eloquently than Kuhn did.
Fire in the Mind serves as a provocation rather than a definitive statement. It challenges readers to reconsider their assumptions about what is true, what merely imagined. It challenged this reader, at any rate. I first encountered Fire in the Mind while completing my own book, one that takes the quite different view that scientific truth is virtually at hand. Reading Johnson, I would often nod in agreement and then realize that in doing so I was violating one of my own convictions. Conversely, I would reflexively object to something he had written and then question my own position. Other readers will surely come to terms with Johnson's thesis in their own way. But unless they are radical relativists to begin with, they are unlikely to finish the book without undergoing a crisis of faith.
My main disagreement with Johnson is that he plies his doubts too even-handedly. The inevitable result is that all theories, from the most empirically substantiated to those that are untestable even in principle, seem equally tentative. I also cannot accept Johnson's evolutionary, Kuhnian model of scientific progress. Just because the objects of science's scrutiny--including the human mind, from which science springs--are shaped by contingent factors does not mean that science is equally contingent. Natural selection has inclined us to see the world in terms of nouns and verbs, objects and actions, matter and energy. There may well be other ways to categorize reality, but this is the mode of perception we have been granted, and it has been remarkably successful.
Given our perceptual predilections, and given the world in which we live, it seems inevitable that once we invented science we would stumble upon certain facts about the world. The genetic code may be the product of contingent events, but its discovery seems inevitable. Astronomers have established that our sun is just one of many stars, our Milky Way one of billions of galaxies. It is a discovery, as irrefutable as the roundness of the earth, that all matter consists of substances called elements, all elements are made of atoms, and atoms are in turn composed of even smaller entities, the electrons, protons and neutrons. Surely Johnson accepts these findings as facts.
Or does he? Early on in his book, Johnson offers a refresher course on modern physics, beginning with general relativity and quantum mechanics and culminating in the phantasmagoria of superstring theory. While doing full justice to the power and beauty of these achievements, Johnson also suggests that modern physics may be in some sense an ingenious confabulation--an effective one, to be sure, but merely one of many possible alternatives. Time and again, Johnson points out, physicists postulated and then found a particle satisfying some theoretical requirement. "A theory requires a particle and there is a race to find it. The detector is built and then tuned and retuned until, lo and behold, the predicted effect is observed--the effect, not the particle itself, which might not live long enough to leave a track."
Take the neutrino. Theorists first postulated its existence to ensure conservation of energy in certain forms of nuclear decay, and twenty-five years later experimenters found clear-cut evidence for it. In part because its mass and other properties have been so difficult to pin down, the neutrino now plays a central role in speculations about the constitution and ultimate fate of the universe. "Once accepted as real, neutrinos could be used to make sense of other phenomena," Johnson writes. "And so they became woven tighter and tighter into the mesh, the gauze of theory that lay between us and the nuclear world."
What Johnson downplays is that some hypothesized particles and processes were never discovered. Theorists have lusted after magnetic monopoles, for instance, and proton decay, but experiments have failed to provide evidence for either one. Moreover, experiments have also uncovered countless phenomena that no one had anticipated, such as x-rays, radioactivity and superconductivity.
Johnson further attempts to undermine the reader's faith in physics by reviewing ongoing efforts to make sense of quantum mechanics. One of the most puzzling of the notorious quantum paradoxes is the measurement problem. Quantum theory suggests that a particle such as an electron follows many different possible paths, all of which are embodied in the particle's so-called wave function. But the ambiguity suddenly ceases when the particle is measured; at that moment, the wave function "collapses" and the particle assumes one of its possible values. The measurement problem implies, unsettlingly, that the physical realm is defined in some sense by our perception of it.
To solve that and other conundrums, some physicists are now attempting to recast quantum mechanics in the mold of information theory, a discipline created in the 1940s by the mathematician Claude Shannon, then at the Bell Telephone Laboratories in Murray Hill, N.J. Information, as defined by Shannon, can be viewed as the capacity of a system to surprise an observer. Wojciech H. Zurek of Los Alamos and others have proposed that the apparent dichotomy between the observer and the object under observation can be dissolved if all physical processes are viewed as information rather than as strictly physical phenomena.
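Shannon's notion of information as surprise has a precise textbook form: the entropy of a source is the average surprise, measured in bits, of its outcomes. This sketch illustrates that standard definition only, not Zurek's proposal:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: the average surprise -log2(p) of a
    discrete distribution whose probabilities sum to 1."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = entropy_bits([0.5, 0.5])      # each toss is maximally surprising
rigged_coin = entropy_bits([0.99, 0.01])  # outcomes are nearly certain
print(fair_coin, rigged_coin)
```

A fair coin yields one full bit of information per toss; a heavily biased coin, whose outcome rarely surprises us, yields far less.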
But resolving the measurement problem in this way is analogous to solving the mind-body problem by assuming that all matter is a manifestation of mind. Information, after all, only exists if there are conscious information-processors to apprehend it. Johnson, although he seems sympathetic to the information-based paradigm, acknowledges its fatal flaw. "If the human race were wiped from the earth, the computers would keep going until the energy supplies ran down. But without an interpreter, could they really be said to be processing something called information?" The universe, as far as we know, was devoid of life until 4 billion years ago; intelligent, self-aware life, capable of creating science, may have existed for only the last split second of cosmic time. How, then, can information be a fundamental property of reality?
Other interpretations of quantum mechanics also strain credulity, as Johnson shows. According to the many-worlds interpretation, for example, a quantum entity such as an electron actually follows all the paths allowed it by quantum mechanics--in separate universes. Worse yet, none of the most popular interpretations can be empirically differentiated. When physicists care to interpret at all (and many do not, preferring to retreat to instrumentalism), they choose one over the other primarily for aesthetic reasons.
Johnson's real purpose in probing the philosophical underpinnings of quantum mechanics is to cast doubt on its status as an absolute truth, one that physicists have plucked directly from the Platonic realm. Have physicists discovered quantum mechanics, Johnson asks, or imagined it? Are other, perhaps more sensible, models of the microworld possible? One way to address such questions is to accept Johnson's own description of scientific theories as maps of nature. Road maps rely on certain conventions and formalisms, and so do the maps that scientists construct of an atomic nucleus, or the Milky Way, or the human genome. But the fact that these formalisms are invented does not make highways, neutrons, galaxies and genes equally imaginary.
Formalisms, in and of themselves, may not always make much sense. Newton's formalism for gravity, for instance, smacked of the occult to his contemporaries. How can one thing possibly tug at another across vast distances of empty space? (Last year, an editorial in Nature questioned whether Newton would have been able to publish his theory today, given its self-evident preposterousness.) But the important thing is that the formalism works; its truth is not threatened by mere weirdness. So it is with quantum mechanics--and with Einstein's formalisms. What is more shocking to common sense than curved space and compressible time? But those concepts work too, even better than Newton's version of gravity.
The question is, Do those theories of gravity correspond to something real in nature? Are physicists reifying their theories when they speak of gravity as something real rather than invented? After all, we cannot see gravity in the same way that we can see neurons, or bats, or red giant stars; we can only infer gravity's existence. Nevertheless, I believe--and most scientists and even nonscientists believe--that there is something real in nature that both Newton's theory and Einstein's can be said to be about. Newton can justly be said to have discovered rather than invented gravity, although his mathematical formalism is itself an invention.
One may legitimately ask, Are better mathematical and conceptual formalisms possible? Perhaps. But quantum mechanics and general relativity, the premier formalisms of modern physics, already describe the behavior of matter under all but the most extreme circumstances. These formalisms are so uncannily effective that they can be considered as virtual discoveries; that is, their creation seems, in retrospect, every bit as inevitable, and irrevocable, as our discernment of atoms, elements and galaxies.
Johnson is quite right that physicists often become overly enamored of their formalisms. He justly questions whether concepts such as symmetry--which has become the guiding light of particle physics--represent truths in and of themselves or merely useful mathematical tools. An object is said to be symmetrical if it can undergo certain mathematical transformations--such as rotation around an axis--and remain essentially unchanged. It is symmetry that has allowed physicists to show that, just as electricity and magnetism are two aspects of a single force, so are electromagnetism and the weak nuclear force manifestations of a unified electroweak force.
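The definition of symmetry as invariance under transformation can be shown in miniature: a 90-degree rotation maps the vertex set of a square onto itself, leaving the object "essentially unchanged." This is an illustrative sketch of the general concept, not an example drawn from the book:

```python
# Symmetry as invariance: rotating a square by 90 degrees permutes its
# vertices but leaves the square, as a whole, exactly where it was.
def rotate90(point):
    """Rotate a point 90 degrees counterclockwise about the origin."""
    x, y = point
    return (-y, x)

square = {(1, 1), (-1, 1), (-1, -1), (1, -1)}
rotated = {rotate90(p) for p in square}
print(rotated == square)  # prints True: the transformation is a symmetry
```

Each individual vertex moves, yet the set of vertices is identical before and after; it is exactly this kind of invariance, generalized to abstract transformations, that particle physicists exploit.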
In the quest for similar platonic properties, some physicists have speculated that the cosmos had its beginnings in symmetry-rich particles called superstrings. These almost infinitely malleable loops of ur-stuff allegedly gave rise to all matter, forces, space and time. "In the beginning," Johnson writes, "was a world of mathematical purity that shattered to give birth to the world in which we find ourselves. How is this belief so different from the Fall from the Garden of Eden, or the emergence of the Tewa from the heavenly underworld?"
Comparing the myths of the Tewa to superstring theory is not entirely unfair. Superstrings are thought to wriggle about in a microrealm more distant from human intervention, in practical terms, than the quasars lurking at the edge of the visible universe, and they exist not just in the four dimensions we inhabit (three spatial dimensions plus time) but in at least six extra ones as well. There is no direct empirical evidence supporting the existence of superstrings, and there almost certainly never will be. Probing the distance scales and energies where superstrings are thought to dwell would require a particle accelerator 1,000 light years around.
On the issue of superstrings, Johnson and I are in complete agreement. Superstring theory seems to be a product not of empirical investigation of nature but of a kind of religious conviction about reality's symmetrical structure. Some particle physicists have the same view. Sheldon Glashow of Harvard University, one of the architects of the electroweak theory, has ridiculed superstring researchers as the "equivalents of medieval theologians."
Such skepticism is essential. Without it, there can be no progress in science, or in the humanities or in any mode of knowledge. But just how skeptical should we be? When pushed too far, skepticism culminates in the ultimate conversation-stopper, solipsism. In a late-night discussion fueled by controlled substances, a college sophomore may suggest that he is nothing but a brain sitting in a vat of chemicals somewhere, programmed by alien scientists to imagine all his experiences. But as amusing as that assumption may be, it is untestable and therefore unscientific.
Religious myths, in much the same way, are either demonstrably wrong or untestable. Geologists have proven beyond a reasonable doubt that the earth formed not 6,000 years ago (as some fundamentalists believe) but roughly 4.5 billion years ago. Scientists cannot prove that God does not exist (though as the biologist Richard Dawkins has argued, the concept of a benign God certainly seems contradicted by the evidence). But fundamentalists cannot show that God does exist, either. God is thus an unscientific concept.
Johnson is well aware of all that, and yet he still asks us to consider that at some level science, too, may be as much a matter of faith as of reason. His doubts, moreover, are shared by many scientists--Freeman Dyson among them. An eminent physicist at the Institute for Advanced Study, Dyson is fond of asserting that our current knowledge will seem as primitive to future scientists as the physics of Aristotle seems to us.
Why do people so obviously knowledgeable and respectful--indeed, even worshipful--of science as Dyson and Johnson veer so close to relativism? My own theory is that the alternative, for them, is much worse: the end of science. They recognize that science, if it is capable of absolute truth, might soon become a victim of its own success. After all, researchers have already mapped out the entire universe, from the microrealm of quarks and electrons to the macrorealm of galaxies and quasars. Physicists have shown that all matter is composed of a few elementary particles ruled by a few basic forces.
Scientists have also woven their knowledge into an impressive, albeit incomplete, narrative of how we came to be. The universe exploded into existence 15 billion years ago, give or take five billion years, and is still expanding. Some 4.5 billion years ago, the detritus of a supernova condensed into our solar system. During the next few hundred million years, single-celled organisms bearing an ingenious molecule called DNA emerged on this planet. These primordial microbes gave rise, by means of natural selection, to an extraordinary array of more complex creatures, including Homo sapiens.
My guess is that this modern myth of creation will be as viable 100 or even 1,000 years from now as it is today. Why? Because it is true. In fact, given how far science has already come, and given the physical, social and cognitive limits constraining further research, science may have little to add to the knowledge it has so far bequeathed. To the extent that our current knowledge is true, it is that much more difficult to transcend.
In 1965 Richard Feynman, prescient as always, prophesied that science would reach this impasse. "The age in which we live is the age in which we are discovering the fundamental laws of nature, and that day will never come again." After the great truths are revealed, Feynman continued, "there will be a degeneration of ideas, just like the degeneration that great explorers feel is occurring when tourists begin moving in on a new territory."
As Johnson himself points out, science has left some rather important questions unanswered: How, exactly, did the universe begin? Could it have come into being in some other way? How inevitable was the emergence of life on earth and its subsequent evolution? Does life exist elsewhere in the universe? Efforts to answer those questions--and to transcend the received wisdom--have prompted some ambitious, creative scientists to vault past empirical studies, landing them in theories more akin to philosophy or art than to science in the conventional sense.
If superstring theory and other untestable hypotheses signal the demise of true, empirical science, it is a tragedy for humanity. For all its achievements, science has been something of a disappointment. It has left us still plagued with warfare, disease, poverty, ethnic conflict and a multitude of lesser evils. It has crushed the superstitions that made life meaningful without offering us a comforting substitute. As Steven Weinberg himself once put it, "The more the universe becomes comprehensible, the more it seems pointless."
Johnson, I suspect, would agree. For seekers such as he and Dyson, the possibility that science might be nearing closure is terrifying, because truth-seeking, not truth itself, is what makes life meaningful. The skeptical stance allows Johnson, Dyson and others to maintain hope that the great age of discovery is not over, that more revelations and revolutions lie ahead. They are willing to sacrifice the notion of absolute truth so that the truth can be sought forever.
Whether or not these science-loving skeptics turn out to be right, books like Johnson's are precious. Wittgenstein wrote, in his prose-poem Tractatus Logico-Philosophicus: "Not how the world is, is the mystical, but that it is." The most exalted of all human emotions, Wittgenstein knew, consists of pure dumbfoundment before the mystery of existence. By pointing to all the inadequacies of science, to all the questions that it raises but leaves unanswered, Johnson has done his part to ensure that our sense of wonder does not soon vanish.
John Horgan is a senior writer for Scientific American. His book, The End of Science: Facing the Limits of Knowledge in the Twilight of the Scientific Age, is being published this spring by Addison-Wesley.