I remember making a conscious decision to become a scientist instead of a philosopher because the former offered the opportunity to test ideas using the scientific method.
Once I began my formal training as a working scientist, the most difficult task, beyond formulating a testable, refutable hypothesis, was devising controls for the experiment, which seemed trivial in theory but proved challenging in practice.
A scientific experiment must be controlled. But why? Functionally, controls compensate for methodologic and biologic variation, but they may also have a much deeper philosophical significance.
Controls acknowledge the relativistic nature of our existence, which the theoretical physicist David Bohm called the Explicate Order: the world perceived subjectively through our evolved senses. Bohm held that there is an ideal world beyond this subjective experience, which he termed the Implicate Order. Experimentation is the only quantitative way of transitioning from the Explicate Order to the Implicate, and because the Explicate Order is subjective, reference values, or controls, must be used to compensate for that subjectivity in order to make the transition successfully.
But where did the concept of controlling an experiment start? It is largely attributed to Francis Bacon, who published the Novum Organum Scientiarum in 1620. But it was Charles Peirce who redefined the scientific method in The Fixation of Belief (1877), and more recently Karl Popper who emphasised the necessity of being able to refute hypotheses.
The scientific method has come under fire of late for being unable to solve the big problems in Physics and Biology with regard to their ultimate origins and causation. Back in 2008, Chris Anderson proposed in Wired magazine (The End of Theory: the data deluge makes the scientific method obsolete) that instead of hypothesis-testing science, we should turn to informatics for such answers, based on the informaticist’s credo that if you haven’t answered the question at hand, you just need more data. That attitude assumes that the world’s knowledge is a closed set, which is in direct conflict with Bohm’s thesis that there are two ‘orders’, the Explicate and the Implicate.
The only way we have of determining the Implicate Order is through the scientific method: hypothesis testing using controlled experimentation. Beyond the methodology, controlling an experiment is critical to its success. However, the rationale for controlling the experiment may not be obvious beyond the need to ensure unbiased, objective observation and measurement of the dependent variable.
Nominally, controls, both positive and negative, are material and procedural. But beyond such practical issues there is a philosophical question. When you perform an experiment, it must be ‘controlled’ to avoid procedural and material artifacts. Yet perhaps the need for experimental controls is a subliminal recognition that the Explicate Order is relative and subjective, and that to transcend it and approximate the Implicate Order requires controls as ‘reference points’, so that the experiment can disclose the underlying ultimate interrelationship between energy and mass. That is similar to what Peter Rowlands1 says about everything in the Universe adding up to zero, zero being an attractor in mathematics. The same is true of biology: the cell exists due to the negative entropy within it, appearing as zero energy relative to the entropy of the energy in its external environment.
Another example of the ‘subjectivity’ of our perception of nature is reflected in the level of accuracy we accept for differences between groups. For example, when statistical methods are used to determine the difference between control and experimental groups, a probability of less than or equal to 5% is conventionally considered statistically significant. The efficacy of that arbitrary convention has been debated for decades and was recently deliberated formally by the American Statistical Association2. Suffice it to say that despite such questioning, the significance threshold of p ≤ 0.05 remains the standard, reflecting our inherent acceptance of the subjectivity of our perception of reality.
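The convention can be made concrete with a small sketch. The following is a minimal illustration, not a procedure from any study cited here: it compares a hypothetical control group with a hypothetical experimental group using a two-sided permutation test, built only from the Python standard library, and checks the resulting p-value against the conventional 0.05 threshold. The data values are invented for the example.

```python
import random
import statistics

def permutation_p_value(control, experimental, n_perm=10_000, seed=0):
    """Two-sided permutation test for a difference in group means.

    Returns the fraction of random label shufflings whose absolute
    difference in means is at least as large as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(statistics.mean(experimental) - statistics.mean(control))
    pooled = list(control) + list(experimental)
    n = len(control)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # randomly reassign group labels
        diff = abs(statistics.mean(pooled[n:]) - statistics.mean(pooled[:n]))
        if diff >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical measurements from a control and an experimental group.
control = [4.8, 5.1, 5.0, 4.9, 5.2, 4.7]
experimental = [5.6, 5.9, 5.7, 6.0, 5.8, 5.5]

p = permutation_p_value(control, experimental)
# For these clearly separated groups, p falls well below 0.05.
print(f"p = {p:.4f}, significant at 0.05: {p <= 0.05}")
```

The point of the example is the arbitrariness of the cutoff itself: nothing in the computation distinguishes 0.05 from 0.04 or 0.06; the threshold is a convention we agree to, exactly as the passage above argues.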
From control to consciousness
The purpose of the scientific method is to understand ourselves and our place in the Cosmos objectively. Conventionally, we subscribe to the anthropic principle, that we are ‘in’ this Universe, whereas the endosymbiosis theory, advocated for by the American evolutionary theorist and biologist Lynn Margulis, dictates that we are ‘of’ this Universe as a result of the assimilation of the physical environment. This means that when faced with an existential threat, the organism will endogenise that factor and make it physiologically ‘useful’, like iron as the core of a hemoglobin molecule, or bacteria as the origin of mitochondria.
Applying the developmental mechanism of cell-cell communication to phylogeny3 has revealed these interrelationships, showing evolution from its origin in the unicellular state, evolving multicellularity via cell-cell communication. The ultimate realisation of this systematic approach to evolution is that consciousness is the product of cosmology as the physical origin of both matter and life.
The empiric approach is like an equation in which the observed constituents react to form a product, like E = mc², the equation that sets forth the idea that the whole of the cosmos equals zero. Similarly, various reactants form products, whether they are chemicals or cells, and in either case their relative reactivities can be expressed as ‘valences’ like those in the Periodic Table of the Elements.
Parenthetically, the genius of Mendeleev’s Periodic Table lay not just in his use of atomic weight to organise it, but also in his taking into consideration the chemical reactivities of the Elements4. In the mind’s eye, by introducing chemical reactivity to the Periodic Table, Mendeleev was transcending space-time, rendering the algorithm free of the subjectivity of Alchemy. That is like the ‘fourth wall’ in drama, providing the audience an omniscient, god-like perspective on what is happening on stage. Or like a Henry Moore sculpture with a hole in it, asking the viewer to ‘decide’ which to focus on: the solid matter or the space within it. Or a jazz pianist playing ‘between the cracks’, asking a similar question about reality. In all of these modalities we are transcending the Explicate Order, moving slowly toward the Implicate Order.
The Big Bang and physiology
Empiricism originated in the Big Bang, which occurred some 13.8 billion years ago, exploding the Singularity and generating a cosmos full of dualities. Chemically, such dualities are resolved through reactions that restore the universal balance between energy and mass. Biology similarly resolves dualities, because the cell originated in the ambiguity of negative entropy within its cytoplasm5.
It is because of the cell’s capacity to ‘recognise’ the ambiguity between life and matter that it can reconcile existential threats through endogenisation in order to evolve. This integral relationship between life and matter is seen in the epigenetic mechanism of inheritance of biologic traits directly from the environment. The organism interacts with its surroundings and assimilates them as epigenetic ‘marks’; such marks modify the DNA of the egg and sperm of the host, and thus change the phenotypic expression of the offspring. The offspring, in turn, interact with the environment in response to such epigenetic modifications, giving rise to the concept of the phenotype as agent6, the agency deriving from this relationship in which the organism actively and purposefully interacts with its environment. This concept fundamentally differs from the conventional description of the phenotype as merely a set of biologic characteristics: it renders the germ cells of the host, not the reproductive ability of the adults as depicted by Darwinian evolution, the primary level of selection.
In reverse-engineering this process from the multicellular back to the unicellular state, it has been realised that biologic diversity is actually an artifact of the Explicate Order, whereas the unicell is the primary level of selection in the Implicate Order, striving to remain faithfully proximate to the first principles of physiology7. Think of it like the Red Queen in Through the Looking-Glass, running as fast as she can just to stay in place.
That perspective offers insight into the true nature of consciousness: not ‘mind’ separate from ‘body’, but cosmology made organic by the endogenisation of physical matter, which complies with the laws of Nature. In other words, our consciousness is the physiologic manifestation of our endogenised physical surroundings, compartmentalised, and thus made relevant to our being by forming the basis for our physiology.
This way of thinking about consciousness offers insight into our literal awareness of our physical surroundings as the basis for our self-referential self-organisation. We exist due to the formation of micelles from the lipids carried by the snowball-like asteroids that formed the oceans on the primitive Earth8, the semi-permeable membrane providing access to the cytoplasm for the uptake and integration of physical things, like heavy metals, ions, gases and bacteria, to form our being. The ‘rules and regulations’ for our physiology were innate to those physical traits as laws of Nature, thus forming the conduit from our insides to the cosmos9. Such homologies derive from the common origins of matter and life alike in the Big Bang. It can be speculated that the template for life was the Singularity that existed prior to the Big Bang, since both it and the cell are point sources10.
There has been a plea for a recombining of philosophy and science11, which parted ways with the rise of empiricism. This article recognises the centrality of experimentation for the advancement of thought, and the attendant advance in philosophy needed to cope with so many extant and emerging issues in society. But we need a common ‘will’ to do so. The rationale is provided herein, if only we choose to act on it.
1. Rowlands P. The Foundations of Physical Law. Singapore: World Scientific Publishing, 2015.
2. Wasserstein RL, Lazar NA. The ASA’s statement on p-values: context, process, and purpose. The American Statistician 70: 129-133, 2016.
3. Torday JS, Rehan VK. Evolution, the Logic of Biology. Hoboken: Wiley, 2017.
4. Scerri E. The Periodic Table: Its Story and Its Significance. Oxford: Oxford University Press, 2006.
5. Schrödinger E. What is Life? New York: Macmillan, 1944.
6. Torday JS, Miller WB. Phenotype as agent for epigenetic inheritance. Biology (Basel). 2016 Jul 8;5(3).
7. Torday JS, Rehan VK. Lung evolution as a cipher for physiology. Physiol Genomics. 2009 Jun 10;38(1):1-6.
8. Deamer D. The role of lipid membranes in life’s origin. Life (Basel). 2017 Jan 17;7(1).
9. Torday JS, Miller WB Jr. The cosmologic continuum from physics to consciousness. Prog Biophys Mol Biol. 2018 Dec;140:41-48.
10. Torday JS, Miller WB. The unicellular state as a point source in a quantum biological system. Biology (Basel). 2016 May 27;5(2).
11. Maxwell N. The Metaphysics of Science and Aim-Oriented Empiricism. New York: Springer, 2018.
Author: Professor John S. Torday is Professor and Director of The Henry L. Guenther Laboratory for Cellular-Molecular Biology at Harbor-UCLA and a Professor of Pediatrics at the Los Angeles Biomedical Research Institute. He is also Professor in Residence for Pediatrics, Obstetrics and Gynecology, and Evolutionary Medicine at UCLA.