The Emergence of Solid Matter

The concept of emergence is a powerful and important addition to scientific thought. Why, then, has emergence, as a broadly embraced serious alternative to traditional materialist reductionism, had such a recent history? In fact, though emergence has been seen as a radical, even ‘unscientific’ view since the early 20th century, the idea, under a different name, originated in the mid-19th century. Such is the nature of science, whose ontology has swung constantly toward and away from reductionist materialism since the time of Newton and Descartes.

John Stuart Mill’s Heteropathic Laws and Emergence

The idea of emergence seems to have originated at least as far back in medical history as the works of the ancient Greek physician Galen, and yet not to have been forcefully identified and articulated until the mid-19th century by John Stuart Mill.

All organised bodies are composed of parts, similar to those composing inorganic nature, and which have even themselves existed in an inorganic state; but the phenomena of life, which result from the juxtaposition of those parts in a certain manner, bear no analogy to any of the effects which would be produced by the action of the component substances considered as mere physical agents. To whatever degree we might imagine our knowledge of the properties of the several ingredients of a living body to be extended and perfected, it is certain that no mere summing up of the separate actions of those elements will ever amount to the action of the living body itself. (John Stuart Mill, A System of Logic, Bk. III, Ch. 6, §1) Cited in ‘Emergent Properties’ by Timothy O’Connor and Hong Yu Wong in the Stanford Encyclopedia of Philosophy

Mill was the first to see that the mechanical model that dominated 19th-century physics, in which the effects of all mechanical causes acting on an object combine independently, so that the object’s resulting motion can be computed by simple vector analysis, embodies what he called the ‘Composition of Causes’ (laws of this kind Mill termed homopathic). Yet, Mill argued, this homopathic approach to causal analysis is not applicable to chemical phenomena, whose laws Mill called heteropathic:

By contrast, the chemical mode of the conjoint action of causes is characterized by a violation of the Composition of Causes: the joint action of multiple causes acting in the chemical mode is not the sum of effects of the causes had they been acting individually. This mode of conjoint action of causes is named after the chemical reactions which typically exhibit it, e.g.:

NaOH + HCl → NaCl + H2O

(Sodium hydroxide + hydrochloric acid produces sodium chloride + water)

The product of this neutralization reaction, water and a salt, is in no sense the sum of the effects of the individual reactants, an acid and a base. These are ‘heteropathic effects,’ and the causal laws which subsume them are ‘heteropathic laws.’ Heteropathic laws and effects correspond to a class of laws and effects that the later British Emergentists dubbed ‘emergent.’ – from Emergent Properties
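To put Mill’s distinction in modern notation (our gloss, not Mill’s or the SEP’s), homopathic causation is simply additivity of effects, the rule that the vector composition of mechanical forces obeys and that chemical reactions violate:

```latex
% Homopathic (mechanical) mode: effects of joint causes add,
% as in the vector composition of forces.
\[
  E(C_1 \,\&\, C_2) \;=\; E(C_1) + E(C_2),
  \qquad \text{e.g.}\quad \mathbf{F}_{\mathrm{net}} = \mathbf{F}_1 + \mathbf{F}_2 .
\]
% Heteropathic (chemical) mode: additivity fails; the joint effect
% is qualitatively new, as in the neutralization reaction above.
\[
  E(C_1 \,\&\, C_2) \;\neq\; E(C_1) + E(C_2).
\]
```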

____________________________________________________

NOTE: Anyone who has worked in inorganic chemistry knows that the reaction of a powerful base (such as sodium hydroxide) with an equally powerful acid (such as hydrochloric acid) is not a simple chemical uniting of two compounds with no noticeable consequence except the inert product of salt water! On the contrary, this reaction is accompanied by a massive, violent release of the energy stored in both the strongly reactive acid and the strongly reactive base. For a chemistry student to mix any substantial undiluted quantity of these two chemicals together is to take their very life into their hands! That is why the proper thermochemical equation is written as follows:

NaOH + HCl → NaCl + H2O,   ΔH ≈ −57 kJ/mol

where ΔH is the enthalpy (heat) of neutralization; its negative sign indicates that the reaction is exothermic, releasing roughly 57 kilojoules of heat for every mole of acid neutralized.

It has always amazed me that this term ΔH is almost always left out of textbook equations. I can only assume that the heat of neutralization is not easily explained within the simplistic model of mechanistic reductionism.
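To put the note’s warning in numbers, here is a minimal sketch (our illustration, not part of the original post; the helper name and scenario are hypothetical) estimating the temperature rise when equal volumes of aqueous NaOH and HCl are mixed, assuming the standard heat of neutralization of a strong acid by a strong base, roughly −57.3 kJ per mole of water formed, and treating the mixture as pure water:

```python
# Back-of-the-envelope sketch. Assumptions: heat of neutralization
# ~ -57.3 kJ/mol (strong acid + strong base, dilute solution); the
# mixture has the specific heat and density of pure water; no heat loss.

DELTA_H_NEUT = -57.3e3  # J per mole of water formed (approximate)
C_WATER = 4.184         # J/(g*K), specific heat of water
DENSITY = 1.0           # g/mL, assumed for the solutions

def temperature_rise(volume_ml: float, molarity: float) -> float:
    """Estimated temperature rise (K) when equal volumes of acid and
    base of the given molarity are mixed."""
    moles = (volume_ml / 1000.0) * molarity      # moles of acid = moles of base
    heat_released = -DELTA_H_NEUT * moles        # J, positive (exothermic)
    total_mass = 2.0 * volume_ml * DENSITY       # g of combined solution
    return heat_released / (total_mass * C_WATER)

print(f"1 M solutions:  ~{temperature_rise(500, 1.0):.0f} K rise")   # ~7 K
print(f"10 M solutions: ~{temperature_rise(500, 10.0):.0f} K rise")  # ~68 K
```

Even on these crude assumptions, concentrated solutions approach the boiling point of water on mixing.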

____________________________________________________

Besides identifying the concepts of homopathic and heteropathic laws, Mill was the first to identify the emergence of new laws at the heteropathic level and to suggest that homopathic causal laws may be counteracted by the newly emergent heteropathic causal laws. He wrote:

Those bodies continue, as before, to obey mechanical and chemical laws, in so far as the operation of those laws is not counteracted by the new laws which govern them as organized beings. (1843, p. 431)

Returning to our original question: why was emergence theory not embraced and applied as a major paradigm by scientists until the late 19th and early 20th centuries? The answer, it would appear, lies in the fact that science since the 17th century has been dominated by the doctrine of materialist reductionism. As its name implies, materialist reductionism rests on the concept of matter. To understand what the concept of matter meant in the 17th century and what it means today, we must examine the evolution of that concept.

A Brief History of the Concept of Matter

Since the birth of the New Science during the Renaissance, science has undergone many fundamental changes in its concept of matter, as well as in the related concepts of substance, continuity, extension, energy, causality, space, time and infinity. These fundamental ideas and their history underlie the history of scientific doctrine from Aristotle through Newton, Descartes, Galileo, Kepler and Bruno, as well as the modern views of such physicists as Einstein, Bohr and Planck. It is our concepts and their definitions that determine how we will think about a given subject matter. This principle applies as well to the concepts embraced by scientists across the ages and into the current era.

On Reductionism: Leibniz contra Newton

An excellent and profound book that deeply studies the history of the concepts underlying the philosophy of nature is The Nature of Physical Existence by Ivor Leclerc. In this book one finds, for example, an important discussion (relevant to the topic of emergence) of the contrast between Newton and Leibniz on the relationship between the character of an aggregate and the character of its substances (atoms). Although Leibniz agreed with Newton that the character of the whole ‘must arise out of or result or derive from the character of its constituents,’ Leibniz disagreed with Newton’s view that ‘the constituents must necessarily have the character of the whole,’ arguing instead that ‘the constituents must have [only] such a nature that the character of the whole can be derivable from the constituents.’ Thus, Leibniz asserts that ‘The constituents of the whole can have ultimately different characters, but the character of the whole can nevertheless be derivable from that of the constituents.’ (Quoted from pp. 246–247 of The Nature of Physical Existence. Emphasis added.)

In the current context, regarding the question of the nature of the apparent solidity of material objects and the issue of their reducibility to the properties and laws of their infrastructure, Leibniz would emphasize that any property found in the aggregate or whole, such as solidity, is not necessarily a character of the constituent atoms from which the character of the whole is derived or, as we would say, emerges. But the views of Leibniz, unfortunately, did not prevail. If 17th-century philosophers of science had discovered the sophisticated concept of emergence and understood the ontological and epistemological limitations of materialist reductionism, they might have been spared hundreds of years of confusion regarding simplistic reductive analysis and doctrinaire materialism. Instead, it was Descartes’ simpler view of matter as res extensa that prevailed and dominated 17th-century science (as well as philosophy), surviving even into the early 20th century.

Cartesian Dualism and Res Extensa

The idea that the ‘real stuff’ of the universe is essentially ‘continuous extendedness’ and that the universe is a plenum (full, having no emptiness or vacuum) was first ordained by René Descartes in his dualist doctrine of the physical and the mental ‘stuff’ of being (in Latin: ‘res’) that composes the universe. The two Cartesian substances are res extensa (physical, material being) and res cogitans (the being of thought or cognition). According to Descartes’ dualist doctrine, what characterizes res extensa essentially and universally is that it manifests extensiveness (geometric spatiality).

Descartes’ Mathematization of the Real (res)

Descartes, being a leading mathematician of his day, thought of reality in mathematical terms; thus the geometric qua mathematical nature of res extensa. Although res cogitans and res extensa exist as two incompatible forms of being, Descartes argued that they are integrated through the cogito, in that the mathematical nature of res extensa can be grasped abstractly as a mathematical idea by res cogitans, even though res cogitans is powerless to interact with and change, or be changed by, res extensa. Thus began the mathematization of science, where mathematics is reified as an active principle uniting mind and matter in a manner that transcends them both.

Descartes’ Mathematization as Cusanian Neoplatonism

This is, of course, the doctrine of Plotinian Neoplatonism (first formulated by Plotinus and then dramatically elaborated by Nicolaus Cusanus) deeply informing the philosophy of Descartes, as it deeply informed most of the philosopher/scientists of the Enlightenment. And thus began the fatal dualism of mind and world that has haunted science from its beginnings.

From Extended Matter to Particulate Matter

The dominant view of modern (pre-quantum) physics was that the universe consisted exclusively of particles all obeying the same simple Newtonian laws.  This assumption was (and still is) the basis for doctrinaire physicalist reductionism.  This premise, in its strongest form, was most clearly and uncompromisingly articulated by one of its critics, C. D. Broad:

[There] is one and only one kind of material. Each particle of this obeys one elementary law of behaviour, and continues to do so no matter how complex may be the collection of particles of which it is a constituent. There is one uniform law of composition, connecting the behaviour of groups of these particles as wholes with the behaviour which each would show in isolation and with the structure of the group. All the apparently different kinds of stuff are just differently arranged groups of different numbers of the one kind of elementary particle; and all the apparently peculiar laws of behaviour are simply special cases which could be deduced in theory from the structure of the whole under consideration, the one elementary law of behaviour for isolated particles, and the one universal law of composition. On such a view the external world has the greatest amount of unity which is conceivable. There is really only one science, and the various “special sciences” are just particular cases of it. – (from The Mind and Its Place in Nature, London: Routledge & Kegan Paul, first edition, 1925, p. 76, quoted in ‘Emergent Properties’ in the Stanford Encyclopedia of Philosophy)

From Extended Matter to Solid Matter

Dissatisfied with Descartes’ res extensa understood entirely as geometric extension, other early modern scientists (especially Pierre Gassendi, Isaac Newton, Robert Boyle and Robert Hooke) soon realized that extension was not sufficient to characterize the physical substance of reality and demanded the addition of the concepts of atomism, impenetrability, continuity and force. (Source: Ivor Leclerc, 1972.) These latter ideas became part of the doctrine of ‘solidity’ (atomism, substantial continuity and impenetrability). The modern view of ‘solidity’ was expressed clearly in Sir Arthur Eddington’s famous lecture, which reflected the transition from the original concept of ‘solidity’ as an empirical fact to the modern scientific explanation of ‘solidity’ as a property emerging from the theory of molecular substances as composed of atoms, together with the assertion that these atoms, by virtue of their universally present outer shells of electrons, exhibit mutually repellent negative charge across their entire outer ‘layers’ or ‘shells’.
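As a concrete gloss on this explanation (a standard textbook model, not from Eddington’s lecture), the effective short-range repulsion between the closed electron shells of neighboring atoms is commonly represented by the steep r⁻¹² wall of the Lennard-Jones pair potential:

```latex
% Lennard-Jones pair potential: the r^{-12} term models the steep
% short-range repulsion of overlapping electron shells, from which
% the macroscopic impenetrability ("solidity") of matter emerges.
\[
  V(r) \;=\; 4\varepsilon\!\left[\left(\frac{\sigma}{r}\right)^{12}
                               - \left(\frac{\sigma}{r}\right)^{6}\right]
\]
```

Here ε sets the depth of the attractive well and σ the effective atomic diameter; it is the steep repulsive wall that makes aggregates of atoms feel solid at the macroscopic scale.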

Field Theory as Non-Physical, Non-Particulate

What is often glossed over, as if it did not represent a fundamental problem for the physicalist interpretation of the natural world, is the concept of a field. This concept was introduced to physics with the discovery of electromagnetic phenomena. In order to explain how magnetism affects electrons (or any charged particles) without action-at-a-distance (forbidden by strict materialism), it was hypothesized that electrical and magnetic phenomena distribute their causal forces (energy) as spatially extended fields spreading outward from the center of the electromagnetic object.
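For concreteness (a standard-physics illustration of ours, not part of the original text), the electrostatic field of a point charge assigns to every point in space the force per unit charge a test charge would feel there, exactly the ‘spatial mapping of forces’ discussed in the next paragraph:

```latex
% Electric field of a point charge q: a vector assigned to every point
% of space, giving the force per unit charge a test charge feels there.
\[
  \mathbf{E}(\mathbf{r}) \;=\; \frac{1}{4\pi\varepsilon_0}\,\frac{q}{r^{2}}\,\hat{\mathbf{r}},
  \qquad
  \mathbf{F} \;=\; q_{\mathrm{test}}\,\mathbf{E}(\mathbf{r}).
\]
```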

But fields do not seem to have mass, nor do they physically occupy space. They are, in the words of Faraday, ‘real but not physical.’ By ‘real’ Faraday meant that they have an ontological basis. By ‘not physical,’ Faraday meant that they are not reducible to physical particles or aggregates of such particles. They are not reducible to particulate ‘things.’ Fields are the space (the locus) in which forces have influence. Thus fields are a spatial mapping of the range of forces interacting between entities manifesting, in this case, electromagnetic properties. Modern quantum mechanics, in fact, has reversed the ontological relationship between ‘things’ and fields. As one advocate of quantum physics has expressed it:

Fields are not things and fields are not made of things. It is the other way around: things are made of fields. — Art Hobson, University of Arkansas, response to “The Real Scandal of Quantum Mechanics” by Richard Conn Henry [Am. J. Phys. 77 (10), 869–870 (2009)]

Physicalism as Pure Cartesianism Omitting res cogitans

So physics has moved gradually away from the particle model, as a concentration of mass at a specified localization, to the notion that even the particle is a field vector whose boundaries are merely those of its capacity to act or be acted upon within an extended spatial region. The particle is no longer a particle as a constitutive ‘atom’ of construction or configuration of material being. So there is no minimal physical ‘thing’ that constitutes the natural world. Thus, the physicalist (materialist reductionist) program has failed. But the physicalist cannot accept this verdict. From his perspective, the failure of monistic materialism must require dualism, specifically Cartesian dualism. Yet it is not Cartesianism that physicalism opposes. In fact, physicalism grew out of and continues to embrace Cartesian res extensa. What physicalism rejects is Cartesian Dualism. In a word, physicalists want to have their Cartesianism and eat it too:

The readiness with which any challenge to physicalism is turned into an accusation of dualism shows how much the physicalist program is formulated within a Cartesian framework. For all its guise of contemporaneity and scientific seriousness, physicalism is simply Cartesianism with one of his two kinds of substance lopped off. That is just another manifestation of how unfortunate has been the framing of issues about basic ontology in terms of a Cartesian mental/physical dichotomy. Further, it shows how deeply a substance metaphysics is entrenched, so that it has become difficult even to conceive how science could proceed if explanations ultimately do not refer to kinds of basic particulars (particles) and their properties.– from Physicalism, Emergence and Downward Causation by Richard J. Campbell and Mark H. Bickhard, p. 12.

From Quantum Physics to Emergence

It is important to ask ourselves to what it was that modern (pre-quantum) physics was attempting to reduce all natural phenomena. Here again, the answer generally is: to Cartesian res extensa, mere extended matter as the primary substance of the physical world. As we showed above, the Cartesian res extensa was forced to take on the properties of particulate matter, with important properties such as shape, impenetrability and incompressibility. Thus the ultra-structure of the physical world was eventually held by physicists to consist of elementary particles. But modern quantum physics has shown that there is no coherent way to hold that elementary particles (or even sub-particles) form the absolute basis for all natural phenomena and laws, in the face of the facts marshaled by both quantum mechanics and Einsteinian relativity, especially special relativity. It appears, therefore, that elementary particles cannot be the real ultra-structure of the physical world, and that substance metaphysics (physicalism) must be abandoned. This seems to imply that whatever level of ontological organization is postulated to underlie all of the physical world, it is logically possible for that level to be itself emergent.

Once we have made the conceptual shift required to free ourselves of the age-old prejudice of a substance metaphysics – of which particle metaphysics is the most recent manifestation – it is far from clear that there is any basic level. That is, there might well be no fundamental plane of organization, ‘lower’ than which it is not possible to go. Still, this much at least is clear: if the organization at some level cannot be shown to be necessary, then one could never have logical grounds which ensure that that level of organization was not itself emergent from a yet more basic one. — from Physicalism, Emergence and Downward Causation, p. 10.

From Materialist Reductionism to Hierarchical Emergence

An important and related concept is the emergence of properties of systems that transcend the properties of the entities of which those systems are composed. In other words, entities that make up a system can, when organized and interacting to form that system, demonstrate phenomena quite different from those they demonstrate when standing alone.

Many phenomena in physics demonstrate that entities are a different thing when they stand alone and when they take part in the creation of a system. These evidences are frequently brought up not only by physicists working on solid state physics and condensed matter physics (e.g. Anderson 1972, Leggett 1987, Laughlin 1998, Healey 2010). Recently, research in quantum theory has also provided similar evidence (e.g. Healey 1991, Silberstein – McGeever 1999, Kronz – Tiehen 2002, Hüttemann 2004). – from Searle on Emergence by Vladimír Havlík

Another related but separate principle associated with emergence is that emergent properties and laws may not depend on the laws of the particular infrastructure from which they emerge: some would emerge as the same properties and laws even if the underlying infrastructure were different.

We also know that while a simple and absolute law, such as hydrodynamics, can evolve from the deeper laws underneath, it is at the same time independent of them, in that it would be the same even if the deeper laws were changed. [Emphasis added.]

Thinking through these effects seriously moves one to ask which law is the more ultimate, the details from which everything flows or the transcendent, emergent law they generate. – Robert B. Laughlin, A Different Universe: Reinventing Physics from the Bottom Down, paperback edition, p. 207.
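Laughlin’s hydrodynamics example can be made concrete (our illustration, not his text): the incompressible Navier–Stokes equation takes the same form whatever the underlying molecules happen to be, with the microscopic physics entering only through two measured coefficients, the density ρ and the viscosity μ:

```latex
% Incompressible Navier-Stokes: the emergent law of fluid motion.
% Identical in form for water, air, or liquid helium; the microscopic
% details appear only through the coefficients rho and mu.
% (Body forces such as gravity are omitted for simplicity.)
\[
  \rho\left(\frac{\partial \mathbf{v}}{\partial t}
            + (\mathbf{v}\cdot\nabla)\mathbf{v}\right)
  \;=\; -\nabla p \;+\; \mu\,\nabla^{2}\mathbf{v},
  \qquad \nabla\cdot\mathbf{v} \;=\; 0.
\]
```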

Some of today’s physicists argue that materialist reductionism is applicable only to limited cases, and that even the fundamental phenomena of physics, such as charge, magnetism, density, waves, gravity, and even space itself, are not reducible to classical (or even quantum) physics, but are instead emergent phenomena possessing their own laws that transcend those of their underlying infrastructure. See in particular A Different Universe by Robert B. Laughlin, winner of the Nobel Prize in Physics for his work on the fractional quantum Hall effect. Here is one of my favorite passages from his book:

“The myth of collective behavior following from the law is, as a practical matter, exactly backward. Law instead follows from collective behavior, as do things that flow from it [i.e. law], such as logic, and mathematics. The reason our minds can anticipate and master what the physical world does is not because we are geniuses but because nature facilitates understanding by organizing itself and generating law.”  [Emphasis added.] – from Robert B. Laughlin,  A Different Universe  paperback edition, p. 209.

Laughlin is making the powerful (and, to classical physicists, shocking) argument that the laws of physics are all emergent and can manifest themselves only when huge quantities of matter (atoms) aggregate. They are laws produced by the effects and consequences of organized aggregation (such as is found in solids, crystals, liquids, gases and plasmas). In sum, rather than being reducible to underlying subatomic laws, the laws of physics (as well as those of chemistry and biology) are all emergent.

Laughlin’s position is not a species of a priori certainty, nor a species of divine synchronization between mind and reality. It is, on the contrary, epistemologically objective. It holds that the laws of nature are not subjective conventions invented by humans to organize the contents of their thoughts; they are not arbitrary, nor merely pragmatic (having no ontological correspondence). On the contrary, laws are the form in which a scientific mind identifies and retains the principles and causal nature underlying the observed and quantified regularity of physical, chemical and biological phenomena. In other words, ontologically, the laws are out there in the world, and it is the job of science to grasp these laws and hold them in conceptual form, a form demanded by the nature of human cognition.

Platonic Essence vs. Aristotelian Substance

To be clear here, lest we be confused with idealists rather than the realists we are: laws are not reified abstractions inhabiting the world in the form of Platonic ideas (or ideal forms). Laws are principles underlying the nature of existents, but they do not have some sort of separate ontological reification or transcendent existence outside of the natures of the existents and systems of existents that manifest these principles, either as individual existents or by virtue of their relationships to other existents. This is the Aristotelian view: essences, principles, laws and universals exist in concrete things and in their relationships in this world. Laws are not ontological idealizations existing in some other realm, such as Plato’s realm of ideas. Nor do they exist only as mathematical models or idealizations of the world. Rather, mathematical models are the epistemological tools and means by which humans seek to understand and conceptualize the laws of nature.

Our point is that emergence theory rests on a realist, not an idealist, view of nature. In contrast with Platonic idealism, emergence theory is much more compatible with Aristotelian realism, with its view of nature as a manifest hierarchy and of substances as directly knowable through our senses, whose power we know how to sharpen and properly use to explore the world around us and to build upon conceptually, through inductive and deductive reasoning, yielding true scientific knowledge. Through his conceptual capacity, man is able to discover, inductively and experimentally, the world’s natural hierarchical structure and dynamics and its corresponding hierarchy of natural laws, each operating at a specific level of that hierarchy. Emergence theory holds that this hierarchy is emergent rather than intrinsic or reductionistic, and that its corresponding laws are emergent as well, resting upon the laws of their sub-hierarchies while being independent of them, and not necessarily reducible to them, either ontologically or epistemologically.


References

(in order of citation)

  1. Timothy O’Connor and Hong Yu Wong, ‘Emergent Properties’, in the Stanford Encyclopedia of Philosophy, 2010.
  2. Ivor Leclerc, The Nature of Physical Existence, Muirhead Library of Philosophy, Humanities Press, 1972.
  3. C. D. Broad, The Mind and Its Place in Nature, London: Routledge & Kegan Paul, 1925.
  4. Richard J. Campbell and Mark H. Bickhard, Physicalism, Emergence and Downward Causation, lehigh.edu, 2010.
  5. Vladimír Havlík, Searle on Emergence, The Academy of Sciences of the Czech Republic, Prague, 2012.
  6. Robert B. Laughlin, A Different Universe: Reinventing Physics from the Bottom Down, Basic Books, 2006.

Published on: Aug 27, 2014 @ 4:15
Copyright © 2014, 2015 by Jack H. Schwartz (a.k.a. bioperipatetic). All rights reserved.
Latest Revision: October 13, 2022 @ 6:18 pm

2 Responses to Emergence of Solid Matter

  1. e says:

    and the emergence of space and time?

    • ‘e’, you make an important point, implied by your comment: ‘and the emergence of space and time?’ The views of modern scientists, especially Einstein, totally reassess the classical Newtonian ideas of space and time. Newton thought of space as absolute, the container of all celestial objects. Time, like space, was equally absolute and independent of mass and matter. Thus things were said to exist in space and time. Einstein argued, in his General Theory of Relativity, that space is a product or property of matter and not properly to be regarded as a container of matter. Matter may shape space itself. Time and space are held to be relative, not absolute, cosmic principles. The relativity of time was thoroughly defended in Einstein’s Special Theory of Relativity.

      Einstein’s General Theory of Relativity was developed to address the problem of gravity, especially the ‘spooky’ action-at-a-distance doctrine of Newtonian gravity. Einstein argued that gravity is the constraint of motion in the vicinity of matter (especially large aggregated matter in the form of celestial bodies). Celestial objects, by virtue of their mass alone, cause gravity wells which change the very fabric of ‘space’. Gravity is then the effect of spatial ‘wells’ that make objects appear to ‘fall toward the bottom of gravity wells’ when, in fact, space is merely constraining and ‘guiding’ the motion of objects or particles near other objects. Objects therefore cannot and do not move in straight lines through space, for space is itself a product of mass, which constrains and controls the passive motion of objects. Motion in straight lines through absolute space is reduced to a rational mathematical concept inherent in Euclidean geometry. Einstein would argue that Euclidean space and natural orthogonal lines do not apply to the real world (at the cosmic or galactic level) but are useful enough to solve local geometric problems, where the influence of relatively small mass upon local space may be safely disregarded.
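      In the mathematics of general relativity (a standard illustration added here, not part of the original reply), this ‘guiding’ of motion by mass-shaped space is the statement that free bodies follow geodesics:

```latex
% Geodesic equation: a free particle's worldline x^mu(tau) is as
% "straight" as the curved geometry allows; the Christoffel symbols,
% determined by the mass-shaped metric, do the guiding.
\[
  \frac{d^{2}x^{\mu}}{d\tau^{2}}
  + \Gamma^{\mu}_{\ \alpha\beta}\,
    \frac{dx^{\alpha}}{d\tau}\,\frac{dx^{\beta}}{d\tau} \;=\; 0
\]
```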

      In this sense, this Einsteinian sense, we may truly say that both space and time are relative to mass and emergent: space is shaped by mass, and time is relative even within gravitational fields.
