To understand some of the key philosophical problems in modern science, I take as an illustration a recent question asked by a friend, and show, both in the question and in my reply, how the problem it poses exemplifies science's philosophical struggles with its own concepts, concepts born of a long history of redefinition in response to continuous philosophical challenges. So let us begin:
In response to a friend's comments suggesting that mechanistic reductionism is a form of the primacy of consciousness, in that it projects onto the universe a putatively false (or at least incomplete, if not incoherent) mental construct, I wrote:
I appreciate your point, but disagree that limited imagination about what is possible is a species of the primacy of consciousness. The latter requires the belief that consciousness causes reality to come into existence. Being unable to see does not mean that one holds the world to be invisible in principle. All theory is based on a context of knowledge. If we formulate a model of reality based entirely on our years of studying classical and modern physics, and conclude that the world is materialistic and deterministic at all levels (materialist reductionism), this is not a form of the primacy of consciousness. It is a case of scientific knowledge reasonably embracing a deeply and broadly established scientific theory, and thus limiting our capacity to believe that alternative theories may be valid, especially those which are non-reductionistic and/or non-materialist.
Projecting an a priori model of the world, if held as a hypothesis with sufficient grounding, is not irrational. But if that model is projected as mandated by a 'knowable', especially mathematically knowable, reality, it does become a species of the fallacy of prior certainty. And that fallacy, I submit, was the primary error committed by the 17th-century philosopher-scientists, as a consequence of their deep (and ultimately dogmatic) embrace of Plotinian Neoplatonism.
Yet, as you suggest, modern science (indeed, science from the Enlightenment to the present) rests on non-obvious and often a priori assumptions about the nature of physical existence. What has been lost in modern physics (and in the physical sciences generally, including the biological sciences, insofar as they demand materialist reductionism) is deep thinking about all of the issues that Aristotle raised in his writings on the philosophy of nature. These include his treatments of infinity, continuity, void, substance, matter, form, energy, motion, space, causality, and many other ontological concepts. Having rejected virtually all of Aristotle's deeply reasoned views on these concepts, in the name of Neoplatonism and its later cousin, Cartesian dualism, modern science is left with the false belief that the only substance that does and can exist is 'matter' in the form of elementary particles. Thus, according to the emerging materialist view, all of reality must be reducible to these alone and to their local causal principles of aggregation and motion.
From Atoms in the Void to the Void in the Atoms
When modern atomic physics discovered, to its absolute astonishment, that atoms are mostly empty 'space' or 'void', it had to rapidly reexamine its concept of the atom as 'irreducible', 'passive', devoid of self-generated motion, 'solid' and 'impenetrable' (whatever these terms were actually supposed to mean at the time). In particular, modern science had to account for the problems of the 'new' mechanics of atomic structure and dynamics. For example: why do the negatively charged electrons, moving in elliptical orbits rather than straight lines (and thus undergoing continuous acceleration as an angular change in velocity), and bearing a charge opposite to that of the nucleus, never slow down and ultimately spiral or collapse into the positively charged nucleus? And why do electrons accelerating around the nucleus not appear to radiate electromagnetic energy, as classical electrodynamics requires? How is all this possible, and how do electrons maintain their continual existence in the face of the requirements of classical Newtonian mechanics? How is it possible that the fundamental elements, the atoms, are not solid, impenetrable, passive and devoid of self-generated motion?
Field Theory: Non-Material Physical Objects
When the theory of electromagnetic dynamics was formulated by James Clerk Maxwell under the name of 'field theory', and electromagnetic fields were said to be physical but not 'material' and were treated as objects in themselves, scientists were forced to ask themselves: 'What is the ontological status of fields, in the context of classical Newtonian physics, which held that solid, passive particles (featureless corpuscles) constitute the whole of physical existence?' If fields are physical but not material, what is to be done to save materialist reductionism? Field theory stood beside the Newtonian theory of the particulate material universe; in parallel, Newton's gravity was itself remodeled as a field of force. Fields, not being material, could not be modeled as particle dynamics. Yet fields influenced the motion and interaction of matter. How can this be explained? Thus emerged a revolutionary new field/matter dualism.
The only way to have a consistent relativistic theory is to treat all the particles of nature as the quanta of fields…. Electrons and positrons are to be understood as the quanta of excitation of the electron-positron field, whose 'classical' field equation, the analog of Maxwell's equations for the EM field, turns out to be the Dirac equation, which started life as a relativistic version of the single-particle Schroedinger equation. – Robert Mills, Space, Time and Quanta (W. H. Freeman, 1994), Chap. 16.
What is often glossed over, as if it did not represent a fundamental problem for the physicalist interpretation of the natural world, is the very concept of a field. This concept was introduced into physics with the discovery of electromagnetic phenomena. In order to explain how magnetism affects electrons (or any charged particles) without action-at-a-distance (forbidden by strict materialism), it was hypothesized that electrical and magnetic phenomena distribute their causal forces (energy) as spatially extended fields permeating outward from the center of the electromagnetic object.
But fields do not seem to have mass, nor do they physically occupy space. They are, in the words of Faraday, 'real but not physical.' By 'real' Faraday meant that they have an ontological basis. By 'not physical,' he meant that they are not reducible to physical particles or aggregates of such particles; they are not reducible to particulate 'things.' Fields are the space (the locus) in which forces have influence. Thus fields are a spatial mapping of the range of forces interacting between entities manifesting, in this case, electromagnetic properties. Modern quantum mechanics, in fact, has reversed the ontological relationship between 'things' and fields. As one advocate of quantum physics has expressed it:
Fields are not things and fields are not made of things. It is the other way around: things are made of fields. — Art Hobson, University of Arkansas, response to "The Real Scandal of Quantum Mechanics" by Richard Conn Henry, Am. J. Phys. 77 (10), 869–870 (2009)
So here we see the ultimate paradoxical inversion of fields (regions of causal influence) with material objects, asserting that the latter are products of the former. Astounding! How can things, which have mass, be made of fields, which are massless?
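The classical picture of a field, a mapping from each point of space to the force a test body would feel there, can be made concrete with a short numerical sketch. This is only an illustration under textbook assumptions (a static point charge obeying the standard Coulomb law); the charge value is arbitrary:

```python
# A field as a spatial mapping of force: the Coulomb field of a point
# charge, whose magnitude falls off as the inverse square of distance.
K = 8.9875517873681764e9   # Coulomb constant, N*m^2/C^2
q = 1e-9                   # a 1 nC point charge (illustrative value)

def field_magnitude(r: float) -> float:
    """Electric field magnitude (N/C) at distance r meters from the charge."""
    return K * q / r**2

# The field assigns a definite force-per-unit-charge to every point in
# space; doubling the distance quarters the field strength.
for r in (0.1, 0.2, 0.4):
    print(f"E({r} m) = {field_magnitude(r):.2f} N/C")
```

The field here is not a 'thing' at any one place; it is defined everywhere at once, which is precisely what made its ontological status so puzzling to a particle-based physics.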
Chemical Theory: The Irreducibility of Molecules
Modern chemical science discovered that the elements (atoms), contrary to classical Newtonian physics (which held that the elements are by their very nature unchangeable except with respect to location or position), are in fact changed qualitatively when they react with different kinds of atoms at the molecular level, yielding new molecular substances that are not mere aggregates of their constituent atoms, but emergent entities with new ontological, static and dynamic forms giving rise to new causal properties. How is all of this possible in the face of the dogma of Newtonian science?
Modern Physics: Abandonment of Newtonian Concepts
Newton would undoubtedly be astounded and deeply disturbed by what modern physical science has done to his foundational ideas of matter, space, time, gravity and motion. Indeed, modern physics has questioned the very concept of the independence of matter and energy, as well as the independence of matter, time and space. With regard to matter and energy, the new physics codified a revolutionary idea in Einstein's ontological principle of matter-energy equivalence (E = mc²). With regard to matter, time and space, Einstein's general relativity holds that space is not a primary, independent of time and matter; rather, space is a function of matter which, through its universal property of gravitation, has the power to 'bend', 'curve' or otherwise topographically define space itself.
In general relativity Einstein shows that matter and energy actually mold the shape of space and the flow of time. What we feel as the ‘force’ of gravity is simply the sensation of following the shortest path we can through curved, four-dimensional space-time. It is a radical vision: space is no longer the box the universe comes in; instead, space and time, matter and energy are, as Einstein proves, locked together in the most intimate embrace. from Einstein Theory – HyperHistory.com
In Einstein’s own words, regarding his General Theory of Relativity, he states:
[The general theory of relativity] then compels a much more profound modification of the conceptions of space and time than were involved in the special theory. For even if the special theory forced us to fuse space and time together to an indivisible four-dimensional continuum, yet the Euclidean character of the continuum remained essentially intact in this theory. In the general theory of relativity, this hypothesis regarding the Euclidean character of our space-time continuum had to be abandoned… from Einstein's Theory of Relativity by Albert Einstein, February 3, 1929
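The matter-energy equivalence mentioned above is easy to illustrate numerically. A minimal sketch (the one-gram example is merely illustrative):

```python
# Mass-energy equivalence, E = m * c^2.
c = 299_792_458.0  # speed of light in m/s (exact by definition)

def rest_energy(mass_kg: float) -> float:
    """Rest energy in joules equivalent to the given mass in kilograms."""
    return mass_kg * c ** 2

# One gram of matter is equivalent to roughly 9 x 10^13 joules.
print(f"E(1 g) = {rest_energy(0.001):.3e} J")
```

That a static gram of 'matter' is interchangeable with an enormous quantity of 'energy' is exactly the ontological fusion of the two concepts discussed above.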
Quantum Physics: Abandonment of Physical Ontology
All of this started with Max Planck, who very reluctantly posited the heretical idea that perhaps physical reality is not continuous after all, and that both particles and energy exist as discrete 'packets' of 'energy/matter'. Later, the nature of quantized electron energy 'shells' was used to explain why electrons cannot spontaneously change their own quantum energy states, and why their energy is not evenly and continuously distributed within the atom, or at least within the aggregate 'shell' of electron 'presence'. Yet Planck's theory was just the beginning of modern quantum physics, with all of its subsequent ontological and epistemological 'strangeness.' This strangeness in itself reveals the philosophical (ontological/epistemological) inadequacy of quantum physics and its foundational principles of quantum mechanics. Bohr's explanation, that the collapse of the 'quantum wave', a collapse that brings with it a determinate state of an entity, is ultimately caused by man looking at (that is, measuring) the state of the entity, can only be understood as a species of the primacy of consciousness (the very fallacy with which this article began its discussion).
The bizarre nature of modern quantum theory, with its (in Einstein's words) 'spooky' action at a distance, its uncertainty principle, its mind-caused reality, and its entanglement theory, is the consequence of a fundamental failure of the philosophy of science (which died of paralysis from the dogmatic affliction of Cartesian substance dualism). For an excellent (if difficult) book about the influence of philosophy (including theology) on the history of science, with deep emphasis on the importance of Aristotle and the Neoplatonic rejection of many of Aristotle's most profound and still relevant insights about physical existence, please patiently read The Nature of Physical Existence by Ivor Leclerc. You will find it deeply rewarding and intellectually stimulating. It is books such as this that will, hopefully, help us return to a more philosophically correct and richer view of the nature of reality and its laws at all levels of substantial existence (from atomic substances, through living substances, and finally to the ontological nature of consciousness itself).
Quantum Mechanics and Reductionism
Perhaps the most serious error committed by modern physics, in the name of quantum reality, is to assume that the laws of quantum mechanics are the laws of reality: that quantum reality is the ultimate reality, and that only to the extent that science can come to grips with quantum physics can it ever expect to know what reality is, assuming that it is knowable at all.
This quantum mechanical view holds that at the macroscopic level there are no new fundamental laws: the laws of quantum reality are all there is, and all macroscopic objects are merely the complex expression of their underlying quantum mechanical laws.
The problem is that nothing in the macroscopic world can be understood in terms of the world of quantum physics–and vice versa. At the quantum level it is impossible for the scientist to bring to bear any of his scientific knowledge of the macroscopic world. As Professor Richard Feynman expressed this problem:
Newton's laws are wrong–in the world of atoms. Instead, it was discovered that things on a small scale behave nothing like things on a large scale. That is what makes physics difficult–and very interesting. It is hard because the way things behave on a small scale is so "unnatural"; we have no direct experience with it. Here things behave like nothing we know of, so that it is impossible to describe this behavior in any other than analytic ways. It is difficult, and takes a lot of imagination. – from Richard P. Feynman, Six Easy Pieces: Essentials of Physics Explained by Its Most Brilliant Teacher, introduction by Paul Davies, Helix Books, NY, 1995, p. 33.
What Feynman means by the statement that 'Newton's laws are wrong–in the world of atoms' is simply that Newton's laws do not apply and cannot be applied to the actions of atoms and subatomic particles. Feynman goes on to discuss the familiar topic of the uncertainty principle with respect to the momentum and position of subatomic particles. This inherent uncertainty is held to be part of what defines the subatomic world. Actually, it defines our limitations in attempting to observe, control and measure events at the subatomic level. This is fundamentally an epistemological problem or limit. But Feynman argues that it is a manifestation of subatomic laws. He puts it this way:
Quantum mechanics has many aspects. In the first place, the idea that a particle has a definite location and a definite speed is no longer allowed; that is wrong. – from Richard P. Feynman in Six Easy Pieces, p. 34.
By 'no longer allowed' I take Feynman to mean that the idea does not apply at the subatomic level, and that is why it is 'wrong' to think of particles in that way.
Beyond the uncertainty principle with regard to momentum and position, is the far more fundamental principle of quantum indeterminism. Regarding quantum mechanics, Feynman writes:
[It] is not possible to predict exactly what will happen in any circumstance. . . . No, there are no internal wheels; nature, as we understand it today, behaves in such a way that it is fundamentally impossible to make a precise prediction of exactly what will happen in a given experiment. This is a horrible thing; in fact, philosophers have said before that one of the fundamental requirements of science is that whenever you set up the same conditions, the same thing must happen. This is simply not true; it is not a fundamental condition of science. The fact is that the same thing does not happen, that we can find only an average, statistically, as to what happens. – from Richard P. Feynman in Six Easy Pieces, p. 35.
Notice Feynman's universal statement: 'nature, as we understand it today, behaves in such a way that it is fundamentally impossible to make a precise prediction of exactly what will happen in a given experiment.' Nature, i.e., nature itself, not just the nature of subatomic particles. Here we see that Feynman has gone beyond the mere epistemological problems relating to the study of subatomic particles and boldly asserts that unpredictability applies to nature itself (with the qualifier 'as we understand it today'). The reasoning seems to be that if some part of nature, the subatomic domain, is subject to predictive limits, then that limit applies to 'nature as we understand it today.' This is an implicit admission that what is true of the subatomic realm is held to be fundamental to, and therefore to apply equally to, the supra-atomic (macroscopic) realm.
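Feynman's claim, that identically prepared experiments yield different individual outcomes while only their statistics are reproducible, can be sketched in a few lines of simulation. This is a toy model, not a real quantum computation; the outcome probability is an arbitrary illustrative value:

```python
import random

# Simulate repeated measurements on identically 'prepared' systems:
# each trial yields outcome 1 with probability p, otherwise outcome 0.
def measure(p: float, rng: random.Random) -> int:
    """One two-valued measurement on an identically prepared system."""
    return 1 if rng.random() < p else 0

rng = random.Random(42)   # fixed seed so the sketch is repeatable
p = 0.36                  # illustrative outcome probability
outcomes = [measure(p, rng) for _ in range(100_000)]

# Individual outcomes vary from trial to trial, even though every trial
# is prepared identically; only the statistical average is stable.
print("first ten outcomes:", outcomes[:10])
print("average:", sum(outcomes) / len(outcomes))
```

The average converges toward p as trials accumulate, while no single trial's outcome can be predicted, which is exactly the situation Feynman describes.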
Feynman seems to believe that the fundamental principle of The Uniformity of Nature is not fundamental to the philosophy of science, but rather a mere hypothesis, and must be replaced by the new ‘fundamental hypothesis of science’:
What is the fundamental hypothesis of science, the fundamental philosophy? We stated it in the first chapter: the sole test of the validity of any idea is experiment. – from Richard P. Feynman in Six Easy Pieces, p. 36.
Feynman, and other advocates of positivist doctrines of scientific validation, fail to recognize the deep contradictions in their various tests of verification or falsification. All scientific experiments involve the establishment of a hypothesis that is inherently verifiable or falsifiable (provable or disprovable), with an emphasis on the latter, since verifiability and provability are regarded by positivists as too difficult, if not impossible, a task. The process of testing the hypothesis is described in the procedure section of a scientific report; the data are collected in the 'observations' section; and the falsification or confirmation of the hypothesis is stated in the conclusions. But all experiment depends on observations, recordings of those observations, and references to observations (results) reported by other scientists. Yet observation is the central paradox of quantum mechanics, for observation in quantum mechanical experiments causes the very facts under observation to be 'changed by the act of looking', bringing about the so-called 'collapse of the quantum wave', a stochastic wave of probability spanning the entire range of possible outcomes. One would think that this principle of wave-collapse under observation would bring to an end the entire enterprise of science, at least in the realm of the subatomic. But this conclusion is only likely to occur if one foolishly thinks about it, which one must not. For to think about epistemology and ontology is to perform philosophical acts, not the acts of a truly skeptical, 'open minded' scientist. In the end, such scientists, like Feynman, only find such contradictions and paradoxes 'very interesting.'
Quantum Mechanics’ Epistemological Dualism
Quantum mechanical explanations rest on the concept of probability, where probability is used as an ontological concept rather than an epistemological one. It seems to say that at the quantum level of existence, physical properties of particles, such as their momentum and location, are physically probabilistic, i.e., are not quantitatively determinate. I have written elsewhere on this blog:
Based on my layman's understanding of it, quantum logic appears to be an inherently ambiguous concept. It generally refers to the special nature of quantum 'facts'. These include:
1. That no absolute mathematical facts about instances of existents are possible at the quantum mechanical level, since all facts about the measurable properties of existents are stochastic (defined entirely and exclusively in terms of probabilities, not actualities);
2. Probability, in the QM view, permeates the universe thoroughly and metaphysically, implying that at the QM level (which quantum theory holds to be the foundational level of all existence) nothing has an ontologically determinate value, all properties of existents being absolutely ontologically stochastic;
3. That human knowledge (including scientific knowledge) of concrete existents requires measurement (that is, any form of epistemological 'looking'), and that the very act of 'looking' causes the quantum phenomenon under observation to collapse its probabilistic wave into a determinate, transient 'fact'.
There is some confirmation of my conclusions in various publications by modern physicists and philosophers of science. In A Private View of Quantum Reality, an article about the implications of QBism (Christopher A. Fuchs' neologism for Quantum Bayesianism), Amanda Gefter, in her interview of Fuchs, observes:
When the founders of quantum mechanics realized that the theory describes the world in terms of probabilities, they took that to mean that the world itself is probabilistic. Christopher A. Fuchs in A Private View of Quantum Reality – quoted from an interview in Quanta Magazine.
This is the explicit affirmation of my interpretation above, which I repeat:
2. Probability, in the QM view, permeates the universe thoroughly and metaphysically, implying that at the QM level (which quantum theory holds to be the foundational level of all existence) nothing has an ontologically determinate value, all properties of existents being absolutely ontologically stochastic.
Stochastic means, in this context, probabilistic. The quantum probability wave associated with a quantum event is viewed, in classical QM theory, as a physical phenomenon. But this leads to the paradox of the quantum wave collapsing in reaction to a measurement (actually an observation) of the quantum phenomenon. The observation yields a discrete value and not a stochastic range of values. This seems to imply, paradoxically, that observation changes physical reality. That is the classical interpretation of the collapse phenomenon. But QBism offers a simple epistemological solution (explanation) of the collapsing QM wave phenomenon.
Take, for instance, the infamous “collapse of the wave function,” wherein the quantum system inexplicably transitions from multiple simultaneous states to a single actuality. According to QBism, the wave function’s “collapse” is simply the observer updating his or her beliefs after making a measurement. – Christopher A. Fuchs in A Private View of Quantum Reality – quoted from an interview in Quanta Magazine.
So we see, given this QBist interpretation, that to cause a collapse of the quantum wave underlying a quantum phenomenon, it is not sufficient for the observer merely to 'look'; it is further necessary to draw a conclusion in response to that looking, a conclusion that gives the observation a discrete value for the quantum state. This is pure epistemology (remembering that epistemology is the branch of philosophy that answers the question 'How do we know?'). The article summarizes this process as follows:
A quantum particle can be in a range of possible states. When an observer makes a measurement, she instantaneously “collapses” the wave function into one possible state. QBism argues that this collapse isn’t mysterious. It just reflects the updated knowledge of the observer. She didn’t know where the particle was before the measurement. Now she does. – Christopher A. Fuchs in A Private View of Quantum Reality– quoted from an interview in Quanta Magazine.
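On the QBist reading, then, 'collapse' is nothing more than ordinary Bayesian updating of the observer's beliefs. A minimal sketch of that updating (the state names, prior, and perfectly discriminating detector are illustrative assumptions of mine, not Fuchs' own formalism):

```python
# 'Collapse' as Bayesian belief update: the observer's probabilities
# over possible states are revised when a measurement result arrives.
def bayes_update(prior: dict, likelihood: dict) -> dict:
    """Posterior over states, given P(result | state) for the observed result."""
    unnormalized = {s: prior[s] * likelihood[s] for s in prior}
    total = sum(unnormalized.values())
    return {s: v / total for s, v in unnormalized.items()}

# Before the measurement, the observer is maximally uncertain.
prior = {"left": 0.5, "right": 0.5}

# A perfectly discriminating detector reports 'left':
# P(report | left) = 1.0, P(report | right) = 0.0.
posterior = bayes_update(prior, {"left": 1.0, "right": 0.0})

# The observer's belief 'collapses' to certainty; on this view nothing
# physical about the particle need have changed.
print(posterior)  # {'left': 1.0, 'right': 0.0}
```

Note that what changes in this sketch is only the observer's probability assignment, which is precisely the epistemological reading QBism urges.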
The entire problem here with QM theory is its demand for scientific objectivity in the face of knowing that the observing subject and the observed measurement must entangle the subjective observation with the intrinsic object. QM theorists want to preserve the classical model of the physical world by describing the object classically. On the other hand, to explain the quantum collapse, they must apply a different methodology, whereby knowing somehow changes the object of observation. This creates a paradox of methodological (or epistemological) dualism. As Carl Friedrich von Weizsäcker puts it:
If this were its final word, quantum mechanics would in fact be introducing a new dualism into physics, a dualism between the classical and the quantum theoretical mode of description. (emphasis added by bioperipatetic) – from The Unity of Nature (1971) by Carl Friedrich von Weizsäcker, translated (1980) by Francis J. Zucker, Farrar, Straus and Giroux, New York, p. 126
Weizsäcker goes on to argue that the 'dualism disappears' if we take the objective 'document' of the recorded observation as itself a form of 'objectification' of the subjective measurement (or observation).
I see no reason within the framework of quantum mechanics for not extending the quantum mechanical description to the brain of the observer. The Unity of Nature (1971), p. 126
However, this does not seem to resolve the dualism; instead it introduces the additional problem of confusing correlation with identity. The recorded observation, the recording itself, the document itself: these are objective facts. But the recording, observation, and documentation are meaningless unless we understand their intent or purpose. Thus we arrive at the concept of intentionality, a fundamental feature of mind and not of physical matter.
Of course, we know for a fact neither whether quantum mechanics can be applied unchanged to living organisms nor what the relation between the conscious act of perception and the event in the observer’s brain might be. The Unity of Nature (1971), p. 126
Thus, the problem of dualism cannot be solved without the scientific study of the nature of perception and consciousness itself. This issue is discussed at length by Dr. Robert Efron in his now classic paper 'Biology without Consciousness–and its Consequences.' (For a discussion of Efron's paper, see Mind and Brain: The Epistemological Battle.)
Philosophy of Science: Abandonment of Causal Ontology
Modern theories of causality evolved from the philosophy of David Hume. For Hume, causality is purely an expectation based on the statistical history of correlated observed events. There is, according to this view, no ontological basis for causality. One can never validly ask 'why' one outcome occurred rather than another, for causality does not, under this view, consider such issues as the nature of the interacting entities. Thus, that striking a match in a room filled with pure oxygen will result in a massive explosion cannot be known in advance from the very nature of oxygen and the very nature of a lighted match. It is irrelevant that oxygen, by virtue of its chemical nature, vigorously supports explosive combustion in the presence of extreme thermal phenomena or even 'extremely hot objects.' We cannot ask, in the Humean model, what the nature of the physical world is, for Hume would argue that our sensory contact with the world is illusory and unprovable (read Hume's theory of perception for details).
Bertrand Russell believed that the common concept of causation, as held by the philosophers of his day, was false, in that causation does not exist:
All philosophers, of every school, imagine that causation is one of the fundamental axioms or postulates of science, yet, oddly enough, in advanced sciences such as gravitational astronomy, the word "cause" never occurs. Dr. James Ward, in his Naturalism and Agnosticism, makes this a ground of complaint against physics: the business of those who wish to ascertain the ultimate truth about the world, he apparently thinks, should be the discovery of causes, yet physics never even seeks them. To me it seems that philosophy ought not to assume such legislative functions, and that the reason why physics has ceased to look for causes is that, in fact, there are no such things. . . . – from Bertrand Russell, On the Notion of Cause
Russell argued in that paper that the law of causation, held by philosophers to assert a necessary connection between successive events, is simply false, because science does not describe causes but only functional relations between events, such relations being called determinants:
We may now sum up our discussion of causality. We found first that the law of causality, as usually stated by philosophers, is false, and is not employed in science. We then considered the nature of scientific laws, and found that, instead of stating that one event A is always followed by another event B, they stated functional relations between certain events at certain times, which we called determinants, and other events at earlier or later times or at the same time. – from On the Notion of Cause
This all seems an equivocation that wants to keep the laws of successive events as scientific phenomena, but to call them relations rather than causes. In either case, we are talking about external relationships between events, a relationship with respect to time only.
Humean as against Aristotelian Causality
The preceding account of causality as defined by an external, temporal relationship between events is pure Hume. In no case is cause or relation allowed to refer to the natures of the objects underlying the events nor the natures of the events themselves as actualizations of object potentialities. In other words, Humean causality, fully embraced by modern science, represents essentially the total abandonment of Aristotelian causality. What is retained of Aristotle’s four-aspect view of causality is merely the efficient cause, and even here, the ontology of the effective object is omitted and only the temporal domain of the event is retained.
As said before, Aristotle viewed causes as the actualizations of an entity’s potential. This potential and the power to actualize it is inherent in the hylomorphic unity of a substance. For Aristotle, the Law of Causality is an application and extension of the Law of Identity manifest in the actions of a thing. H. W. B. Joseph, a profound admirer of Aristotle, in his magnum opus, ‘An Introduction to Logic,’ expressed the relationship between causality and identity as follows:
[T]he way in which [a thing] acts must be regarded as a partial expression of what it is. It could only act differently, if it were different. As long therefore as it is a, and stands related under conditions c to a subject that is s, no other effect than x can be produced; and to say that the same thing acting on the same thing under the same conditions may yet produce a different effect is to say that a thing need not be what it is. But this is in flat conflict with the Law of Identity. To assert a causal connexion between a and x implies that a acts as it does because it is what it is; because, in fact, it is a. So long therefore as it is a, it must act thus; and to assert that it may act otherwise on a subsequent occasion is to assert that what is a is something else than the a which it is declared to be. – H. W. B. Joseph, An Introduction to Logic, Second Edition, Revised, 1926, Chapter XIX: Of the Presuppositions of Inductive Reasoning: The Law of Causation, p. 402.
Copyright © 2014 by bioperipatetic. Published on: Feb 8, 2014 @ 13:37
Latest revision: October 17, 2015 @ 11:06 pm