By Michael Epperson
Founding director of the Center for Philosophy and the Natural Sciences and of the History and Philosophy of Science Program at California State University, where he is a research professor and principal investigator.
PUBLISHED IN iai.tv
A new solution to the crisis in modern physics: Greek American philosopher Michael Epperson brings Plato and Aristotle back into the game.
=====================================
In their pursuit of simplicity, scientists fall into error. They mistake the abstract concepts that describe reality for reality itself: the map for the territory. This leads to dogmatic overstatements, paradoxes, and mysteries such as quantum gravity. To avoid such errors, we should invoke the thinking of philosopher Alfred North Whitehead and conceive of the universe as a universe-in-process, in which physical relations beget new physical relations, writes Michael Epperson.
When celebrity physicists disagree about some fundamental prediction or hypothesis, there’s often a goofy and well-publicized wager to reassure us that everything is under control. Stephen Hawking bets Kip Thorne a one-year subscription to Penthouse that Cygnus X-1 is not a black hole; Hawking and Thorne team up and bet John Preskill a baseball encyclopedia that quantum mechanics will need to be modified to be compatible with black holes. Et cetera, et cetera. And even as we roll our eyes, we’re grateful, because at least some part of us does not want to see these people violently disagreeing about anything.
So when celebrity physicist Lawrence Krauss publicly called celebrity physicist David Albert a “moron” for not appreciating the significance of Krauss’s discovery of the concrete physics of nothingness, it caused quite a stir. In his book, A Universe from Nothing, Krauss argued that in the same way quantum field theory depicts the creation of particles from a region of spacetime devoid of particles (a quantum vacuum), quantum mechanics, if sufficiently generalized, could depict the creation of spacetime itself from pure nothingness. In a scathing New York Times review of Krauss’s book, Albert argued that claiming that physics could concretize “nothing” in this way was at best naïve, and at worst disingenuous. Quantum mechanics is a physical theory, operative only in a physical universe. To contort it into service as a cosmological engine that generates the physical universe from “nothing” requires that the abstract concept of “nothing” be concretized as physical so that the mechanics of quantum mechanics can function. What’s more, if quantum mechanics is functional enough to generate the universe from nothing, then it’s not really nothing; it’s nothing plus quantum mechanics.
This is a familiar maneuver in popular physics books these days—claims of concretizing what is inescapably abstract, usually by way of a purely speculative and untestable assertion costumed mathematically as a testable hypothesis. It is a cheap instrument, as attractive as it is defective, used more often as cudgel than tool for exploration. Fortunately, as we saw with David Albert, few despise its dull edge more than other physicists and mathematicians. During the first years of modern mathematical physics and the construction of its two central pillars, quantum theory and relativity theory, Alfred North Whitehead warned, “There is no more common error than to assume that, because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain.”
Whitehead would later generalize this error as the “fallacy of misplaced concreteness.” It is often oversimplified as merely mistaking an abstract conceptual object, like a mathematical or logical structure (e.g., the number zero, or the concept of “nothingness”), for a concrete physical object. But the fallacy has more to do with what Whitehead argued was the chief error in science and philosophy: dogmatic overstatement. We commit the fallacy of misplaced concreteness when we identify any object, conceptual or physical, as universally fundamental when, in fact, it only exemplifies selective categories of thought and ignores others. In modern science, the fallacy of misplaced concreteness usually takes the form of a fundamental reduction of some complex feature of nature—or even the universe itself—to some simpler framework. When that framework fails, it is replaced with a new reduction—a new misplaced concreteness, and the cycle repeats.
Scientific progress is marked by these cycles because “failure” doesn’t mean the reduction was entirely wrong; it just means it wasn’t as fundamental—as concrete—as previously supposed. Our understanding of nature does increase, just not at the expense of nature’s complexity. In this regard, the reductive mathematization of natural philosophy over the last 500 years has proven to be both its greatest strength and its greatest hazard. The fundamental objects of modern physics are no longer understood as material physical structures but rather as mathematical structures that produce physically measurable effects. The waves of quantum mechanics are not material-mechanical waves; they are mathematical probability waves. The “fabric” of spacetime in relativity theory is pure geometry.