

Rethinking the experiment: necessary (R)evolution


AI & SOCIETY
Journal of Knowledge, Culture and Communication
1435-5655 (Online)

Received: 2 February 2017 / Accepted: 1 March 2017
© Springer-Verlag London 2017

Abstract

The current assumptions underlying knowledge acquisition have brought about the crisis in the reproducibility of experiments. A complementary perspective should account for the specific causality characteristic of life by integrating past, present, and future. A “second Cartesian revolution,” informed by and aware of anticipatory processes, should result in scientific methods that transcend the theology of determinism and reductionism. In our days, science, itself an expression of anticipatory activity, makes possible alternative understandings of reality and its dynamics. For this purpose, the study advances G-complexity for defining and comparing decidable and undecidable knowledge. AI and related computational expressions of knowledge could benefit from an awareness of what distinguishes the dynamics of life from any other expression of change.

Keywords

Experiment – Reproducibility – Decidability – Non-deterministic – Anticipation

1 Preamble

The scientific community owes respect to the Center for Open Science and to the Science Exchange for the “Reproducibility Project: Cancer Biology.” The first reports [editorialized in Nature (Replication studies offer much more than technical details 2017) and The Scientist (Williams 2017), January 18, and eLife (Kandela et al. 2017), January 19, 2017] have produced experimental evidence for the viewpoint to be expressed herein—the crisis is not one of methodology, but of perspective. The findings need to inform the realization that the dynamics of the living (cancer is a form of change) and the dynamics of the physico-chemical (non-living) are different. So is causality.

Empirical evidence from evolutionary outcomes led to the conclusion that the phase space of the living is continuously changing (Longo 2017). By the way, in the replication attempts made public so far, scientists took note of the genetic drift of the cell lines, in addition to the variability of the microbiome (for example). The variability of tumor formation assays in vivo is also related to the variability of the phase space of the living (healthy or not). Furthermore, the complexity threshold for the living is the undecidable (Nadin 2016a, b, c, d). Hence, the expectation of experiment reproducibility—legitimate in the decidable domains of the non-living (physics, chemistry)—is a goal set in contradiction to the nature of the change processes examined. Evidently, when tumors grow too quickly or too slowly (Horrigan et al. 2017)—given the non-deterministic nature of the processes examined—replication becomes uninterpretable. The only replications that partially met the expectations pertain to the physico-chemical level (Mantis et al. 2017)—peptides that penetrate tumors and are useful for targeting affected cells. But this process (tissue penetration) is below the decidability threshold that characterizes the living. Let it also be noticed that neither the experiments—five Replication Studies have been published so far—nor the attempts at replication have paid any attention to the holistic nature of living dynamics.

The Reproducibility Project, representative of the crisis of reproducibility—to be further discussed in the body of this paper—generated a large body of evidence for alternative views. But nobody seems interested in alternatives. Instead of feeling sorry for themselves because the expectation of repeatability cannot be met, those scientists still in the grip of the “theology” of the reductionist-deterministic view of the world should use their impressive knowledge in order to advance an understanding of the living—in particular the cancer affecting it—grounded in its condition. An editorial in Nature contrasts normal science and revolution in science. The time of the normal is long over.

2 A Ptolemy moment

To save science (Sarewitz 2016), when it is needed more than ever, requires the re-examination of some fundamental assumptions informing scientific activity. Failed reproducibility—not only in cancer research—is symptomatic of the current crisis in science. In the biomedical sciences (Goodstein 2002; Horton 1999, 2015), or, notoriously, in psychology (Clay et al. 2015), irreproducibility affects more than the validation of experiments. As a matter of fact, the experimental method in its standard formulation becomes questionable. If science continues on the same path without questioning the premises that led to the current breakdown, it will self-destruct (Bailey 2016). It is true that “Science has been peculiarly resistant to self-examination” (Ball 2016). However, “metric incentives,” i.e., quantifying goals and rewards, will not change the situation. They would further instrumentalize a questionable perspective. The consequence of “no publication without confirmation” (Mogil and MacLeod 2017) would not be better. This crisis is not about how serious and responsible scientific publications are; it is rather about whether the expectation of experiment confirmation makes scientific sense. The situation in which science finds itself is comparable to the one corresponding to the “flat Earth” view, which retained some currency even after Ptolemy’s Hē mathēmatikē syntaxis (the Almagest) debunked it (by interpreting data accumulated by others).

3 Data regarding the reproducibility crisis

The still-unfolding Reproducibility Project is, like all other failed experimental attempts, a good source of data for those willing to break from the dominant, prejudiced model of knowledge acquisition exclusively through experiment. Awareness of how experiments fall prey to epistemological circularity is bound to increase. Using tools (sensors for data acquisition, data processing methods and procedures, etc.) that themselves carry conceptual assumptions falsifies the data, and thus prompts conclusions based not on how things are, but rather on how they are represented. Just as an aside: the “MatLabization” of experimental science shows how tools are shaping the acquisition and dissemination of knowledge. Pre-developed tools for the production and analysis of large data sets carry with them assumptions that might have been validated for physical phenomena but are not necessarily adequate for biological subjects. The so-called “blind spots” are the result. Therefore, some of those studying living processes prefer to work on the “raw data.” This subject—how data are falsified by our own use of a variety of tools—will preoccupy us as we try to understand not only what undermines the reproduction of experiments, but also why, for certain aspects of reality, the experiment might have to make room for complementary forms of knowledge acquisition, including empirical observation.

Against the background of successful technological innovation grounded in physical science, the life sciences, while seeking legitimacy in the guise of chimeric experimental replication, are delivering below expectations and societal needs. If the object of the experiment is the physical substratum, reproducibility can be expected, provided that the experiment is properly designed and carried out. For instance, one could rewire a genome (introduce new links between unrelated genes), or measure an interaction between biological components with a high degree of precision, similar to how a chemist would study the attractive forces between non-living elements. But such experiments contribute to knowledge of physics or chemistry, not to the life sciences, whose object is change in the living. Moreover, the same approach is not meaningful for exploring, for example, protein folding or anticipatory genetic expression, or, for that matter, evolutionary dynamics. We shall examine why this is the case.

In contradistinction to experiments in “physics or astronomy or geology” (Chomicki and Renner 2016; Baker 2016)—knowledge domains identified as testable and provable by the vast majority (90%) of researchers—failed reproducibility occurs almost exclusively in life science experiments. From the many reports published, we learn that in particular domains, 80% of published results from researchers who earned the respect of their peers proved to be irreproducible. In one review (Pritsker 2012), findings from the biotech company Amgen are detailed. Up to 100 of its researchers attempted to corroborate data from 53 cancer research reports from well-known facilities, published in leading journals. Only 6 of the studies proved to be reproducible (i.e., ca. 10%). The pharmaceutical branch of the Bayer conglomerate had no better luck in seeking validation for research in oncology, women’s health, and cardiovascular medicine.

4 The origin of the problem

For the sake of clarity: this study will add close to nothing to what the scientific community is already aware of by simply rehashing the record of failure that triggered the current crisis. The intention is to propose a different perspective, and to submit hypotheses for a different course of action. By and large, we shall follow the path of abductive reasoning:

The surprising fact C is observed. (Let C be the failed replication of the majority of experiments concerning the living.)
But if A were true, C would be a matter of course. (If the living is different from the non-living, this would explain the successful replication of experiments concerning the non-living, i.e., physics and chemistry.)
Hence, there is reason to suspect that A is true. (The living changes in ways different from the non-living.)

The form of abduction invoked above, in the formulation that C.S. Peirce gave it in 1903, is “inference to the best explanation” (Sober 2008). Peirce maintained that “All of the ideas of science come to it by way of Abduction” [5:145, Peirce (1932)]. Let us exemplify the manner in which these principles will guide us.

Thesis 1 Experiments intended to advance knowledge of the living are useful but not reproducible.

The following summary of research is meant as an argument (obviously, there are many more, but this study is not about their particular findings). No doubt, the distinction living/non-living is the critical aspect of this thesis. Just as a preliminary: experimental evidence, including (but not limited to) functional imaging studies, documents “neural specialization for nonliving and living stimuli (e.g., tools, houses versus animals, faces)” (Mahon et al. 2009). The same “specialization” holds true for the distinctions expressed in the dynamics of plants (different with respect to stones or metallic objects in their respective environments and with respect to other plants) (Baluska et al. 2006). The data extend to specific forms of interaction with non-living or living stimuli: “category preferences in adults who are blind since birth.” Research documents the “specialization” mentioned above, regardless of whether scientists adopt the distinction living/non-living or not: larger blood oxygen-level dependent (BOLD) responses in the medial fusiform gyrus for the non-living, versus differential BOLD responses in the lateral occipital cortex for stimuli associated with interactions with the living. In short, the distinction is not based on visual perception. Domain-specific constraints shape the interaction, i.e., the knowledge acquisition process. Thesis 1 (see above) is the outcome of the abductive reasoning that informs this study. The consequences of ignoring the thesis are obvious.

The crisis of reproducibility has undermined the credibility of leading scientific publications (Nature, Science, eLife, PNAS, etc.)—none eager to address fundamental science. They literally censor contributions on the subject that are not in line with the views they promote. Screening by staff eliminates the opportunity for peer review. For all practical purposes, such publications have become self-styled newsletters for the extremely profitable industry of experiments. They preach to the choir instead of offering a platform for scientific debate. One example: lizard mobility framed within the momentum conservation principle of physics (Libby et al. 2012). A lizard-like robot looks good on a journal cover. It does not matter that the experiment does not address the real subject. The Radio Shack toy conveniently made into a “lizard” robot proves the physics. A two-dimensional model of the tail confirms (through circular reasoning) the false hypothesis. Anticipatory aspects of the natural lizard’s mobility were fully ignored. A review of this experiment, submitted to Nature, never made it past the screening.

The crisis has undermined as well the activity of funding agencies (governmental or private). Some, such as the National Institutes of Health (NIH), rushed to issue new grant guidelines without understanding where the problem lies. Guided by a concept of knowledge acquisition generalized from physical science to the living, they disburse public money for more research equipment, but not for appropriate hypotheses. Stimulating alternative ways of thinking seems to no longer be part of their mission. Talent is wasted on servicing expensive machines for data acquisition instead of advancing new ways of thinking. And when the cheap labor of graduate students is not available, the jobs are outsourced to scientists in countries desperate to have access to new machines. I experienced this in several laboratories, but most painfully at the Bechtereva Institute of the Russian Academy in St. Petersburg, where providing services in data acquisition replaced the once respected original work on advancing hypotheses regarding the brain.

The American National Academies, as well as the British Academy of Medical Sciences, the Max Planck Institute, and many other prestigious institutions were prompted to examine the disturbing facts (Symposium Report 2015). A new field of inquiry, focused on assessing experiment replication and reliability before the experiments are carried out, is establishing itself (Nosek 2015; Fehr et al. 2016). However, we know that when a measure of success, such as metrics, “becomes a target, it loses its significance” (Smaldino and McElreath 2016). Be this as it may, this crisis should not be wasted on more pseudo-science (based on the ever-fashionable probability theory, Bayes, or some fancy mathematics) about bad science, or on how to stifle science by further institutionalizing rules and regulations soon to become a goal in themselves. Without addressing the origins of the problem, the scientific community will only continue to reproduce the never-proven assumptions upon which the majority of its activity is still carried out.

5 Addressing a systemic condition

The replication quandary is an opportunity not to be missed. The relation between various knowledge domains and the need to adapt research methods to the specific dynamics of the subject that scientists attempt to describe is an unavoidable subject. Opening an in-depth discussion of it, as we shall try herein, would be in many ways a promising beginning.

Data dredging, omission of null results (nobody wants to hear about them), underpowered studies, and underspecified methods or weak experimental design are symptomatic of weaknesses brought up in the discussion of replication failure. A small sample size saves effort but undermines robustness. Open data (always desirable), more collaboration, automation—where human error can be avoided without altering the meaning of the outcome—and similar methodical suggestions ought to be considered. But ultimately they are not the answer. Therefore, now more than ever before, the scientific community has to come to a rather sobering realization: eliminating weaknesses such as those mentioned will not change the systemic condition that resulted in failed replication in the life sciences. The lack of reproducibility is only a symptom of a deeper-reaching malaise: the refusal to accept alternative conceptions of reality and its extremely rich dynamic forms.

As opposed to the non-living, the living is endowed with control processes manifest at each of its levels—from cells to organism to interactions with the world. There is freedom at each level (large interaction space), and there are interactions expressed as constraints. What I describe here goes beyond the hierarchy theory (of Pólya, continued by Pattee and Rosen, among others). My preference is the Principle of Minimal Interaction (Gelfand and Tsetlin 1966): interaction among constituents at a lower level of the organism hierarchy follows the path of minimizing external input. For instance, in motoric expression, each joint is under its own neural control. Local interaction among elements is such that the outcome (motoric expression) is minimally dependent on the output of other elements. On the global level, the outcome of the structural unit (e.g., elbow joint) is minimized by changes in the output of other elements (the other joints in the kinematic chain).

To the best of my knowledge, nobody involved in the evaluation of the situation (by no means new) has issued a call to the scientific community to reevaluate the underlying assumptions upon whose basis knowledge acquisition and confirmation are pursued. The scientific community ought to come to the realization that experiments different from those that undergird the progress of physics, chemistry, astronomy, geology, and the like are unavoidably non-reproducible. Science practitioners have tacitly accepted reproducibility since the early stages of the Cartesian grounding of the experimental method, i.e., the reductionist-deterministic model. Experiment was always congenial to inquiry; reproducibility affirmed an expectation that became the epistemological premise: determinism. This was never a matter of philosophy, as some frame it, but one of practical consequences. Replication of experiments, or for that matter of medical treatments, has become a matter of public concern because it is not about one or another scientist, or physician, missing the expected threshold of acceptance. This is about failed science in an age of higher-than-ever expectations, given the significance of knowledge in general, and of knowledge of the living in particular, for the future of humankind. The critical re-evaluation of the epistemological premise is the only rational path left to pursue.

Indeed, machines can be built (and were successfully built) on account of physics and chemistry. They are supposed to be as pre-determined as possible. Their functioning is repetitive. But during the timeline of the machine revolution, the understanding of life has improved only slightly. Everything that lends itself to the building of yet another machine can be reproduced—that is what machines are for. But they are not science about the living. The lever is an extension of the arm, but not a theory of motoric expression, even less of muscles, tendons, bones, joints, etc. The artificial neuron inspired by the living neuron is a mathematical construct of extreme application potential. But it is not knowledge about the neuron that inspired it, and it is not about any neuronal processes. Yet we are still in a rudimentary phase of scientific development regarding aspects of life (such as intuition, emotion, creativity, spectrum diseases, etc.) associated with motoric expression or neuronal activity (to mention only two aspects). At the same time, we benefit from the sophisticated machinery in production facilities or deployed in AI deep learning, inspired by the living neuron. One such machine beat the world champion in Go (a game more complicated than chess, but still one of permutation choices in a finite space); another triumphed over the world’s top professionals in no-limit Texas Hold’em poker. AI also imitates the art of the masters of painting and composing. The procedure is artificial, no doubt about it, but is it intelligent?

Research, instead of speculation, is a shared choice that scientists made—one that gave science its impetus. Nevertheless, the expectation that research is best validated through reproducible experiments, no matter what the subject or purpose, has become questionable. The empirical evidence accumulated suggests the need to re-examine this expectation. In view of progress in science, it is only logical to think that reductionist and deterministic explanations are begging for complementary perspectives. This in itself is an argument, not to be ignored, for the suggested re-examination. The understanding of what Newton called Nature, under which label he aggregated both the physical and the living, might prove as inadequate in our time as it was when it was articulated.

6 A question discarded

After vitalism was debunked and replaced by “mechanism” (Williams 1992), science rejected the distinction between the living and the non-living. This rejection is quite surprising, since in science one does not discard a question because it was improperly answered, or because the alternative answers are not aligned with a dominant view. The foundational works of Walter Elsasser (1998) and Robert Rosen (1991), not to mention Erwin Schrödinger (1944), advanced views of nature different from those of Descartes and Newton and their followers. Their contributions were largely ignored at the time they were published. Elsasser and Rosen articulated their arguments from quite different perspectives: Elsasser within a physics-inspired framework, Rosen in the mathematical language of category theory. Both deserve a closer look at this moment of questioning the research and validation methods of the life sciences. They provided proof that the living is heterogeneous, purposeful, and anticipatory, as opposed to the non-living, which is homogenous, purpose-free, and reactive. These distinctions have certainly earned the attention of scientists—even those who do not read any reports older than five years. If, indeed, to know is to be aware of distinctions—especially those of a fundamental nature, corresponding to different dynamics—such distinctions cannot be eliminated by fiat, or worse, simply ignored.

While physics and physics-based theories adequately describe the non-living, there remains a need for a complementary perspective that expresses the nature of life. What defines this perspective is that the specific causality characteristic of life is accounted for by integrating past, present, and possible future. The living changes in a way different from the non-living. The causality characteristic of the living is much richer than that of the physical, if for no other reason than that the living is both reactive (like the physical) and anticipatory (which the non-living is not). However, the scientific community, conditioned by an education set on the Cartesian foundation at the expense of any alternatives, is reluctant to accept this. Moreover, the description of change in the living calls for particular means and methods to properly capture it. Measurement and, by extension, the limited model of experiment appropriate for describing physical non-living entities return incomplete, and at times confusing, knowledge when applied to capturing change in the living.

Causality, i.e., how and why things change regardless of their specific nature, proved to be richer than what classical determinism ascertained. Under “things” belong not only a stone that eventually turns into sand, but also earthquakes, the sequence of seasons, the day-and-night cycle, women’s monthly cycle, the way humans think, the “intelligence” of plants, and the adaptive nature of the microbiota (to name only a few). The matter from which physical entities (not endowed with life) are made remains the same, subject to what the laws of thermodynamics describe, in particular increasing entropy. The living, from the simplest unicell to the human being, is in an uninterrupted state of remaking itself, a sui generis re-creation of its constitutive cells. It is negentropic. The re-making, i.e., renewal, of cells takes place at various rhythms: some are renewed almost daily; others over weeks, months, or years; and others not at all.

Determinism, the characteristic causality of physical phenomena, is convincingly relevant to the physics and chemistry of the living. Nevertheless, its description through experimental data returns an incomplete explanation of the specific nature of the changing living. To present an example along this line: physical forces (e.g., pulls, compressions and stretchings, distortions) applied to a cell can further affect it, probably more than the inherited genetic code does (Picollo 2013a, b). Taking both physical forces and the genetic code into consideration affords an understanding of cell changes that neither can deliver alone. Non-determinism, describing a relation between cause and effect that takes the form of a multitude of possible outcomes, sometimes contradictory, pertains to change as an expression of something being alive, influencing its own change. Changes due to physical forces applied to cells (think about cutting yourself with a sharp knife, or falling against a rock) and genetic processes governing dynamics are interwoven. There is no way to unequivocally predict whether a cell becomes cancerous or simply divides in a process of self-healing. This is neither randomness nor stochastic expression—placeholders for the notion of non-determinism, which is almost never acknowledged in its full expression (different from randomness).


Fig. 1 The current state of an anticipatory system depends upon previous states, current state and possible future states

To know how the physical changes is to infer from a quantitatively described past state to a future state, under assumptions usually defined as initial conditions (also expressed numerically). Knowledge of the process underlies our ability to predict a future state. One description, sometimes rich in detail, captures the process. To know how the living changes requires more than the physics-based description or what chemistry, usually associated with life, ascertains. The empirical observation of changes in the living always leads to multiple descriptions, corresponding to multiple possible states, sometimes simultaneous, none exclusive of the others. An adequate explanation of change in the living requires the integration of inferences from past states with interpretations of the meaning of possible future states, together with the possible paths to them. Anticipation is expressed in actions informed by possible future states (Nadin 2011) (Fig. 1). The domain of the possible is by no means less real than that of what is (the extant). The ontology of the living entails that of the possible. A suggestive analogy is justified here: potential energy is no less energy than the energy at work in a physical or living process.
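To make the contrast concrete, consider a minimal worked example (added here for illustration, not part of the original argument): for a body in free fall, the deterministic description is a single formula,

x(t) = x0 + v0·t + (1/2)·g·t²,

so that with the initial conditions x0 = 0, v0 = 0, and g ≈ 9.8 m/s², the state at t = 2 s is fully determined: x(2) ≈ 19.6 m. One description, the same for every repetition of the experiment. No comparable single formula pre-states which of the multiple possible states of a living process will be realized.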

The framing of change within its respective consequences, different in the physical and the living, is key to understanding their difference. The causality specific to interactions in the physical realm, transcending Galileo’s world, is described in Newtonian laws—action–reaction, in particular. It was further refined in relativity theory, and for the micro-level of matter in quantum mechanics. The causality specific to interactions in the living, which is purposeful, includes, in addition to what the laws of physics describe quantitatively, the realization of significance—What does it mean?—in connection to the possible future (meaning being ambiguous, of course). For instance, the question “What does it mean?” is the implicit question addressed in each synapse, and the ambiguity is resolved in the context in which the question is posed. Photosynthesis is the outcome of meaningful processes, not of data processing at the molecular level. We can use numbers to describe the quantitative aspects of the process (or at least part of it). At the neuronal level, much guessing (in the ambiguity of meaning) takes place, sometimes felicitous, sometimes not. The same holds true for cell interaction. Protein expression allows for electron transfer across membranes, but this does not make the process machine-like (as Dutton (2015) is inclined to describe it), and does not make electron transfer a characteristic of life. The construct called number (to which we shall return) emerges at a higher level than neural activity or cellular expression. We are more precise in describing neural activity in significant qualitative terms, corresponding to the variety of possible synapses, than in associating numbers with it. Furthermore, the construct called “machine” associated with the neuron is the outcome of mathematics. It is neither a characteristic of nor a metaphor for life.

For the sake of illustrating this thought, we show how one from among the very many types of neurons was abstracted into the artificial neuron and further subjected to the mathematics that describes its monotonic performance. No living neuron, regardless of type, “functions” like the “machine” representing it.

As mentioned above, we can use numbers (for inputs, weights, etc.) to describe machines (Fig. 2), but in doing so, we need to realize that such descriptions (i.e., representations) are incomplete (the result of reductions). Of extreme significance is the fact that the living, in addition to the constraints of physics, or those associated with molecular chemistry, is subject to contingent rules of behavior, expressed as anticipatory action. This is usually brushed aside, or trivialized (as in identifying historicity with contingency (Desjardins 2011)).
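A minimal sketch can make explicit what the abstraction of Fig. 2 amounts to. The code below (an illustration added here; the inputs, weights, and the choice of the logistic function are arbitrary assumptions, not taken from any cited model) shows that the artificial neuron is nothing more than a weighted sum passed through a monotonic function:

import math

def artificial_neuron(inputs, weights, bias):
    # Weighted sum of numerical inputs plus bias, squashed by a monotonic (logistic) function.
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

# Example: three numerical inputs, fixed numerical weights.
print(artificial_neuron([0.2, 0.5, 0.1], [0.7, -1.3, 2.0], bias=0.05))

Every quantity in this description is a number; nothing in it stands for the meaning of a stimulus, which is precisely the reduction discussed above.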


Fig. 2 From the neuron (drawing) to the abstracted artificial neuron and to the “machine” neuron

7 How did we get here? The theology of reductionism

The consensus view was summed up in the Preface to a Committee Report to the National Academy of Sciences in 1970: “Life can be understood in terms of the laws that govern and the phenomena that characterize the inanimate, physical universe, and indeed, at its essence, life can be understood only in the language of chemistry.” This became the official doctrine of how science would be carried out in the USA (comparable to the imposition of dialectical materialism in the Soviet Union). Those who still questioned the reductionist position (they were not fired or jailed for this) received an encouraging face-saving line: “Until the laws of physics and chemistry had been elucidated, it was not possible even to formulate the important, penetrating questions concerning the nature of life.” Philip Handler (1970), at that time president of the National Academy of Sciences, published these thoughts in the Preface to Biology and the Future of Man, a National Research Council report. Thus, what used to be the dominant but not exclusive view among scientists became the dogma of science (like the “flat Earth” was until Ptolemy, and even after him). The 1989 Opportunities in Biology report (National Research Council 1989) obviously reflects a new, “state-of-the-art” focus (mentioning “recombinant DNA, scanning tunneling microscopes, and more”) on the molecular scale of life. Nevertheless, it remains aligned with the position adopted almost 20 years earlier. It should not be surprising that the current state of the art in knowledge acquisition and dissemination pertinent to the living—another 30 years later—is evidently the consequence of the official “theology” of science—a form of science fundamentalism. Of course, such a drastic characterization, which has to be extended to the ever-deepening machine reductionism, begs to be well grounded.

The machine model of reductionist determinism, made explicit in the Cartesian perspective, was the clock, followed by hydraulic, pneumatic, electric, and all kinds of engines. In our days, these have been replaced in their role as the model of the human by the computer (the 1989 report takes note of it), and the reduction is expressed as algorithmic reductionism. It is quite useful to understand the process. Machines originate in the making of tools. Starting with the lever, the pulley, and the inclined plane, machines were made, like tools before them, as extensions of the body itself. They were supposed to ease labor and make it more efficient. All other machines came into existence as constructs of the same nature: imitate some human activity with the aim of augmenting the output of labor; or, as with the computer, imitate the brain and its output. The making of something that seems to acquire qualities beyond those endowed upon it is a subject of dialectic—a reasoning method for establishing the truth of arguments. The Young Hegelians of the nineteenth century extended this discussion to the dialectics of ideas. They were aware of the fact that words, or numbers, or languages (including those of science) could take on the appearance of being independent of those who originated them. In discussing the Christian god, they also took note of the dialectic of making tools and relating to them: tools seemed more powerful than their makers, as though they were endowed with what appeared as superhuman qualities. The same dialectic of self-deception was at work in creating social institutions and political systems. An inversion takes place: “God” was made by humans to explain what appeared as transcending the individual, or even the group sharing in the naming. Such a construct is useful for explaining phenomena of a scale different from those involving direct interaction or human reasoning. After constructing the entity “god,” “God” appears to be the originator of those who made it (or wrote commandments attributed to it), as though it existed independent of the maker, moreover as the Maker of its makers. The concept of law, adopted by science, has the same origin.

The machine was meant to augment human abilities. It is an expression of applied knowledge, informed by human creativity. In performing better than the humans who made them, and even independently of them (the hydraulic machine, for instance, or the pendulum, or the computer), machines appeared as though they were the blueprint of life itself. The understanding of the heart, the lungs, or the brain as machines led to the inverse statement: the heart (or brain, or lungs) is nothing but a machine. Supervenience, in which physico-chemical characteristics of matter determine biological expression, is yet another view of the same origin. With the advent of the “machine of machines”—the computer—the human being became the tool of its own tool—a reduction. This understanding is objectified in ideology—the logic of our ideas—pretty much like that declared in the theology of physics and chemistry, to which the living has been reduced. Like the god concept, it was never proven that the reduction to physics and chemistry—given that the living is embodied in matter—or the reduction to machines affords a better understanding of what life is, or that it represents more than analogy. The belief in a god issuing laws results in self-reinforcing ideas and actions: for instance, living according to commandments, or getting married, or finding peace with oneself. The recently proclaimed “miracle medicine,” extending life by 20–30 years (Vanderweele 2016; Li et al. 2016), exemplifies the idea (based on the just-released account issued by the Harvard School of Medicine, even though the same has been known for a long time). The same process of self-reinforcement holds true for machines and physico-chemical reductionism: it is a placebo effect, which, incidentally, is a particular form of anticipatory expression.

Concepts and views that document the continuity of questioning the premises that resulted in the current crisis of reproducibility, and implicitly in questioning a rather limited understanding of experimental evidence, make up a large body of contributions from scientists and philosophers. Let us take note of the fact that neither Galileo’s mechanics, nor Newton’s theory, nor Einstein’s, nor the still under-defined quantum mechanics is the outcome of experiment. That experimental evidence confirmed them is beyond question; after all, they are laws of physics.

The expressive power of deduction exceeds that of the incomplete inductions embodied in experiment. Each of the descriptions mentioned corresponds to entailed processes in the world. That is the source of their predictive power: a future state can be inferred from a past state, provided that we consider initial conditions. Darwin, aiming to be the Newton of biology, based his theory of evolution on ample empirical observations (his own as well as those made by others over time). Nevertheless, evolution is unentailed (Longo et al. 2012): one cannot pre-state a future evolutionary state, the way physics pre-states the future position of a particle or of a rocket. Gould (1989) dealt with the contingency of evolutionary processes. The “replaying the tape” thought experiment he proposed, in reference to data from the Burgess Shale, implicitly affirmed that there are no biological laws. That is, biology is idiographic1, in contrast to physics, which is nomothetic2 (we shall return to this). Discussions of Gould’s views (Beatty 1995, 1997; Ben-Menachem 1997; Conway Morris 2003; Turner 2009) are relevant to what we discuss because they brought up experiments. One of them is the long-term evolution project [using evolving populations of E. coli bacteria (Lenski and Travisano 1994)]; the other concerns macro-evolution (Losos et al. 1998). Without entering into details, let us take note of the fact that these experiments showed that the closed system of the experiment makes the expression of holistic dependencies, characteristic of the open system of nature, impossible. They are successful at reproducing premises, but not at documenting contingency, or at undermining Gould’s views. “Convergent evolution as natural experiment” (Powell and Mariscal 2015) addresses iterated evolutionary outcomes, conceding that histories can be generalized, but are not law-like generalizations.

1 Pertaining to or involving the study or explication of individual cases or events.

2 Pertaining to or involving the study or formulation of general or universal laws.


Fig. 3 Knowledge expresses the open-ended learning experience of the living

This in itself defines an epistemological horizon: in order to cope with change, including their own, humans describe it, hoping to find in the description clues concerning possible consequences. Literature documents how descriptions evolved from the pictorial shorthand associated with incipient humankind to language and to progressively more abstract representations (Nadin 1997), such as the languages of science.

8 Knowledge is representation

Knowledge, in its variety of forms (transitory, conjectural, implicit, explicit, instinctual, emotional, rational, etc.), is the outcome of learning. The living, in all its known forms, learns continuously. In some ways, evolution is the aggregate expression of how learning supports life in its continuous change. Knowledge acquisition is the expression of anticipatory action: the contingent, possible future explains the need for knowledge. The process is non-deterministic: it can, when the knowledge informing life actions is meaningful, reinforce life changes; or, when it is not appropriate, undermine them.

To explain how knowledge is acquired along the timeline of life is to pursue a never-ending spiral (Nadin 2016). The subjects observing the world are part of the world. Their own phase space (i.e., domain of observables) is variable. Therefore, as opposed to the physical (non-living) reality, we no longer have a reference system (the phase space). Indeed, each observation changes the world; not to mention that the world after a hypothesis was formulated (Newton’s physics, for example) is a different world altogether (Longo and Montevil 2013) (Fig. 3). Knowledge of physics or chemistry—that is, at a level of generality higher than that characteristic of cell activity, of tissues, of organisms, of individuals—is the expression of anticipatory action with regard to the living and non-living components of reality. Physico-chemical aspects of the interaction between the living and the world (once again: living and non-living) are part of a larger dynamics, in which they are necessary but not sufficient conditions for knowledge acquisition. In this process, in addition to processing what is, the living generates a new reality, i.e., representations, upon which it acts.

It is important to point out that in vivo (real-time) whole-cell recordings, such as those performed by Constantinople and Bruno (2013), are a source of experimental data confirming the hypothesis (Nadin 2013b) that the living not only receives sensory information, but also generates information. Sensory information processing in the neocortex is very similar to the processing of non-sensory information (Fig. 4). The aggregate outcome of the process is the representation upon which the living actually acts.


Fig. 4 The living generates information

Representation, whether based on sensory or non-sensory information (propagated as excitation), is not neutral. In the representation, the represented is reduced to whatever is intentionally, or accidentally, of interest, i.e., to what is significant. The illusion that a representation, such as a number, is objective, independent of the representer, is the source of the many “religions” developed inside science over time. A second illusion is that representations are like mirror images of what they represent. In reality, each representation is also constitutive of what it references. Words make reality just as tools make reality. Time representation, as interval clocked by a machine, is one of the most obvious examples. Rosen (1999), fully dedicated to finding more adequate descriptions of life (such as his model theory), worked hard to prove that the origin of the measurement method corresponds to the mystical understanding of numbers in Pythagoras’s idealism. Paradoxically, the idealism of the number (rather, the idolatry of numbers) undergirds the experimental method meant to reject idealism. In the broader epistemological perspective of science, the goal is to achieve objectivity: to know the world independent of who tries to understand and describe it, and independent of the means used to represent it. The fact that the description is ultimately expressed in a language (natural or artificial) makes objectivity dependent on the relation between the expressive power of a language and its precision; this has to be taken into consideration. The more precise language of the two-letter alphabet of Yes (1) and No (0, zero), “spoken” by digital machines, is more “objective” than the ambiguous natural language of a larger alphabet spoken by scientists expressively (one can even say artfully) describing natural phenomena.

In search of objectivity, the “knowing” subject (observer) had to be done away with. For physical processes, such as the falling of a stone or, for that matter, a lunar or solar eclipse, it does not matter who observes them. The equations, the sketch, the picture, the video, the animation, and the sentences describing them capture different aspects of the process. The outcome of experiments pertinent to the laws of physics describing them is independent of the characteristics of human perception. The experiments affirm an understanding of dynamics whose underlying time is nothing other than duration—between cause and effect. The gravity machine called “pendulum” (eventually packed into a clock) labels intervals, which correspond to diurnal time. It objectifies an understanding of time that assumes that there are no differences between the living and the non-living. Late in his life, Einstein realized the error of connecting time to the physical clock (i.e., a machine)—a position more or less consistent with the discovery of the limited speed of light (again, independent of whether it is observed by a machine or a human being). Living time experiences, not reducible to their physics, are not clocked. The continuous renewal of cells in the organism exemplifies the idea. The renewal processes are timed by significance: which sequence of events (physical, chemical, or informational in nature) or which configuration (of matter, energy, symbols, etc.) is significant for the maintenance of life, for reproduction, for creativity. Instead of duration independent of the observer, we have the meaning of time, pertinent to observation and to the observer. Among cells, as well as among neurons, interactions are slow. The speed of light, with respect to which duration (Einstein’s time) is referenced, is actually of no significance at the cell level (cells are tightly packed). The experimental evolution project involved the cloning of bacteria to produce genetically identical populations—which would be an anomaly in nature. Of course, biological time was reduced to duration, and in effect the experiment became one about the duration of adaptation to a new, artificial environment. The bacteria experiment at Michigan State University pre-empted the anticipatory expression, reducing life processes to physico-chemical processes.

Time in the living reflects the physics of gravity. In addition, biological self-preservation prompts the anticipatory action of the organism (as a whole) as it avoids hurt in the process of falling, for example. This anticipatory action takes place at a different timescale: faster than real time, i.e., faster than physical duration, as described and exemplified by recordings on film or digital media [more on this in Nadin (2003)]. The physics is expressed in the description of the fall: always the same. The biology is expressed in the timescale within which anticipatory action (not always successful) allows for avoidance of hurt. We think faster than real time; distributed thinking engages the entire body, with its many integrated “clocks”—to use the machine name for labeling purposes—“ticking” at different timescales, many of them variable.

9 The ideology of science

The incipient machine concept (de la Mettrie, Descartes), as the concrete embodiment of determinism and reductionism, was applied to the larger context of an economic system dependent on machines. Since Descartes’ time, science became the religion of its own never-proven assertion concerning the living as a machine, and causality as determinism. There was a historic necessity to machines, as there was a historic necessity to the attempt to extend the rationality embodied in them to the living. The practicality of the machine understanding of the world reflects that of the machine as a means of production. Since that time, experimental evidence has meant the reproduction of machine reproducibility, i.e., the “production” of knowledge, instead of discovery.

Science, however, is not meant to align everyone, forever, with the religion of immutable concepts. The supplicants of the machine religion (machine fundamentalism) are captive to the circularity of religion: the machine and the machine metaphor are the outcome of human attempts to overcome their limitations, including those affecting the possibility of the knowing subjects to know themselves. Those who claim today that our reality is the outcome of some larger machine [a computation, or a simulation (Bostrom 2003)] are only taking the religion of solipsism further than it ever went: we humans constructed science and now we search for experimental proof that science makes us, as we continue to search for answers relevant to our existence within circumstances so different from those reflected in Descartes’ views. As a consequence of this creed, it is no longer acceptable to maintain in the scientific conversation that the living and the non-living (the physical) are different. Official science (Handler 1970, and the subsequent reports on the state of knowledge in biology) proclaims that they are fundamentally the same. Supervenience views (Jaegwon 2009) even hold that

…every mental phenomenon must be grounded in, or anchored to, some underlying physical base (presumably a neural state). This means that mental states can occur only in systems that can have physical properties, namely physical systems.

Such views are reflected in the beliefs of many practitioners. Whitesides (2015), defining the new chemistry (recall the claim “life can be understood only in the language of chemistry”), describes its new ambitions: What is the molecular basis of life? (“…life is an expression of molecular chemistry”); How does the brain think? (“thought” is simply interacting molecules, and hence chemistry). A “new physics theory of life” (England 2013) ascertains that matter, under some circumstances, acquires life attributes: “You start with a random clump of atoms, and if you shed light on it for long enough, it should not be so surprising that you get a plant.” The “strong ALife” hypothesis adopted von Neumann’s position: “Life is a process which can be abstracted away from any particular medium” (Neumann 1951). The “weak ALife” view still associates life with wet computation. Even Darwin’s theory becomes a special case of a more general physical phenomenon. Tegmark (2014) went so far as to consider the universe made out of mathematics. Barabasi (2009) generalizes from large networks (the Internet, for example) to protein interactions, not realizing that living networks are continuously reconfigured. Berry et al. (2016) find shapes in nuclear astrophysics quite similar to those of a cellular organelle. Of course, for these authors the stacked sheets of neutron stars are more complex. (The jargon of reductionism always implies degrees of complexity, without ever defining them, as complexity is not defined beyond numbers describing sets of interacting entities.) Latash (2016: 136) argues that “Behavior of biological systems is based on basic physical law, common across inanimate and living systems…” That anticipation is definitory of the living—i.e., transcends the features (homeostasis, metabolism, growth, reproduction, etc.) he mentions—does not fit within the view of the living he presents. However, Latash leaves the door open (not unlike the authors of the 1970 report) to “…currently unknown physical laws that are specific for living systems.” Do they have to be physical? This is a question he, and many others sharing this view, does not address.

These are only examples. Examining the position adopted by many researchers, it is clear that the official science program of the National Science Council and of the Academy of Sciences became the underlying ideology. Even foundations claiming to pursue alternatives (such as the well-endowed Templeton Foundation) are ultimately practicing it. Suffice it to point out that, impervious to anticipatory processes, they embraced the terminology of the so-called prospective psychology, laden with experiments impossible to replicate (Report of Positive Psychology Center 2015).

10 The “Making Life” machine

The social science ideology promotes the false premise of reductionism and of causality reduced to determinism. Never proven, but always assumed to be true (as a god is assumed to be), reductionism was instrumentalized in science policy and the associated reward mechanisms. Those seeking support (or at least hoping to be tolerated) have to align with the ideology. Not surprisingly, the machine that makes all machines—i.e., the computer—is exactly what the Church-Turing thesis ascertains: every physically realizable process can be computed by a Turing machine. A simple, deterministic, recursively functioning device does it. This would make the claims of chemists and physicists nothing other than an expression in computation. If “We may be on the verge of creating a new life form…and evolutionary breakthrough…” sounds like a headline [and it is (Goodell 2016)], the declarations of AI practitioners [evinced in “Artificial Intelligence and Life in 2030” (Artificial Intelligence 2016)] make such a breakthrough sound like reality. The inference made by transhumanists that living processes are Turing computations, i.e., algorithmic, entails exactly the assumption of Hilbert’s formalist program: semantics (of mathematical statements) is reducible to syntax. But Turing actually proved the opposite. His machine internalizes all referents to the world in syntactic form. There is no isomorphism of any kind between the state of a machine and that of a living open system.
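To make tangible what “a simple, deterministic recursively functioning device” is, here is a minimal sketch (purely illustrative; the tape encoding and transition table are hypothetical, not drawn from any cited source) of a Turing machine that increments a binary number. Everything it does is local, rule-governed symbol rewriting:

def run_turing_machine(tape, head, state, transitions, blank="_"):
    # Apply the deterministic transition table until the machine halts.
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    while state != "halt":
        symbol = cells.get(head, blank)
        new_symbol, move, state = transitions[(state, symbol)]
        cells[head] = new_symbol
        head += {"L": -1, "R": 1, "N": 0}[move]
    result = [cells.get(i, blank) for i in range(min(cells), max(cells) + 1)]
    return "".join(result).strip(blank)

# Binary increment: start at the rightmost digit, propagate the carry leftward.
transitions = {
    ("carry", "1"): ("0", "L", "carry"),  # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "N", "halt"),   # 0 + carry -> 1, done
    ("carry", "_"): ("1", "N", "halt"),   # ran past the number: new leading digit
}
print(run_turing_machine("1011", head=3, state="carry", transitions=transitions))
# prints "1100" (11 + 1 = 12), obtained purely by syntactic rewriting

The machine never refers to what the digits mean; the semantics remains entirely with the reader of the tape, which is the point made above about syntax internalizing all referents.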

For such an isomorphism to be achieved, the living would have to be associated with a phase space (of variables describing its change) that does not change. Once again: the opposite is the case. Not only Longo, mentioned above, took note of this. Moreover, the algorithmic machine would have to handle the ambiguity of meaning characteristic of living processes. For this, we need appropriate means of representation—unavailable at present—and a new notion of intelligence: action entailed by understanding and resulting in variable degrees of self-awareness. The spectacular performance associated with deep learning and reinforcement learning, as well as with neuromorphic computing, takes place at the syntactic level because algorithmic computation is by its condition data-driven. No such computation, regardless of whether it is called “brute force” (as in the performance in chess of Deep Blue and its successor Watson), AI, deep learning, etc., is associated with the realization of meaning. Machines have no idea what music is, or what a painting is, or what a game is (although they can win in every game embodied in a machine).

Deep neural networks trained along the lines of supervised learning and reinforcement learning do not understand the goal pursued. The search algorithm (combining Monte Carlo simulation with value and policy networks) of AlphaGo (and similar programs) is more effective than previous algorithms, but the outcome is rather the result of the availability of more computational resources than of anything that qualifies as intelligent. No matter how much more computation is aggregated in all kinds of applications, there is no indication of achieving anything comparable to the dynamics of the self-awareness associated with the living. Let us contrast the most advanced methods competing for the limelight with the slime mold. The protoplasmic slime mold, a unicellular organism with multiple nuclei, acts in awareness of the meaning of stimuli and, thus, changes its behavior in anticipation of new situations (Ball 2008).
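The purely syntactic character of such search can be illustrated with a drastically simplified sketch of the Monte Carlo component alone (this is not AlphaGo’s algorithm; the game of Nim and all parameters are stand-ins chosen here for illustration):

import random

def random_playout(stones, my_turn):
    # Play random legal moves (take 1-3 stones) until none remain.
    while stones > 0:
        stones -= random.randint(1, min(3, stones))
        my_turn = not my_turn
    return 0 if my_turn else 1  # 1 if the player "I" took the last stone

def monte_carlo_move(stones, playouts=2000):
    # Score each legal move by the average outcome of random playouts.
    scores = {}
    for take in range(1, min(3, stones) + 1):
        scores[take] = sum(random_playout(stones - take, my_turn=False)
                           for _ in range(playouts)) / playouts
    return max(scores, key=scores.get), scores

move, scores = monte_carlo_move(10)
print("chosen move:", move, "estimated win rates:", scores)

The “evaluation” is an average over random playouts: numbers derived from numbers. Nothing in the procedure represents what winning means, which is why scaling it up with value and policy networks changes its effectiveness, not its condition.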

Thesis 2 Life originates within life.

The abduction at work accounts for the realization, shared almost universally, that machines are closed systems; the living is an open system (Fig. 5).


Fig. 5 Unbounded world as open system

Corollary to Thesis 2 Changes in the living constitute the historic record of evolution.

The abduction is based on the observation that the causality characteristic of evolutionary processes is of a condition different from determinism. The question, “Are we on the verge of witnessing the birth of a new species?” might make those entertaining it feel like pioneers of science and technology. In reality, the question identifies nothing but the theology of machines. This fact still escapes the logic of the preachers of the machine religion. Moreover, there are alternatives to the algorithmic Turing machine—which Turing himself considered (Turing 1948; Eberbach et al. 2004). A relatively detailed presentation of the variety of forms of computation (Nadin 2016) extends to an analysis of the assumption that everything can be measured; that is, that measurement is the only representation of change, regardless of what is changing (i.e., a living or non-living entity).

Rosen, as mentioned already, found fault with Pythagoras for this situation. Mathematical truth is formal in nature, and at best it is the expression of cognitive performance upon a quantified representation of reality. The assumption that mathematical truth corresponds to the dynamics of reality is tantamount to a reduction of all there is to only what can be mathematically represented, i.e., expressed in the language of mathematics. That a “mathematics” or logic of living processes might be an alternative does not fit in the reductionist program. Gelfand (Gelfand and Tsetlin 1966) and Bernstein (Bassin et al. 1966) (to whom we shall return) argued in favor of such a new mathematics. Probably along the same line of thinking is the inference that what does not align with the premise (i.e., is not algorithmic) has to be implicitly wrong. Therefore, non-determinism was declared a form of determinism, and open systems were qualified as canonically perturbed closed systems. The opposite provides a more adequate understanding. Actually, there is a need for a comprehensive understanding of non-determinism, as well as for better descriptions of open systems, and of holism as a characteristic of such systems.

Inspired by Niels Bohr, we would be better off opting for a model of complementarity. This, by extension, applies to the experiment as a knowledge acquisition and validation method (among others) for everything pertaining to the physics of the world, and to empirical evidence as interval series (or history) for the living. On account of interval series, interpretations of the data they afford lead to the category called story, which Kauffman (2000; Kauffman and Gare 2015), among others, called to the attention of the scientific community. This particular aspect deserves more detailed examination (Nadin 2013a, b), if indeed story, in a well-defined sense, should become the outcome of life sciences and inform practical activities (such as healthcare).

11 A distinguishing criterion

Gödel's concept of decidability (the logic pertinent to axiomatic systems used in arithmetic operations) can be applied in defining knowledge domains. It offers the possibility of describing the particular manner in which the physical and the living can be effectively distinguished by an observer (natural or artificial). Gödel's incompleteness theorems ascertain that any consistent formal theory capable of describing elementary arithmetic is incomplete, i.e., not decidable: it contains statements that can be neither proved nor disproved within it. The dynamics of physical reality is by its condition different from the dynamics of an evolutionary system (actually a system of embedded systems). The number of entities involved in physical processes, from the simplest (such as those described in Galileo's mechanics and in Newton's laws) to the more complicated (such as those captured in Einstein's physics or even in quantum mechanics), is finite. This number does not increase along the timeline of change. Living processes are open-ended. The number of entities is in continuous expansion. They are endlessly re-created. That Gödel's theorems concern not only descriptions of reality (formal systems) but also reality itself might not be directly provable within the formalism in which they are expressed. However, they reflect the constructive nature of all human knowledge, about arithmetic as much as about anything else in the world. If mathematics and measurement are consubstantial, logic (as in Gödel's logical theorems) pertains to interpretation, and is consubstantial with representation. Representations of the world change the world. The undecidable becomes more undecidable (to use a figure of speech, not a precise qualifier). Moreover, Heisenberg's uncertainty principle (Heisenberg 1927) shows that extending Gödel's theorems to reality actually reflects our understanding of fundamental properties of material systems (Heisenberg limited himself to quantum systems).
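
For reference, the standard textbook formulation paraphrased above (a general statement, not specific to this paper) is: for any consistent, effectively axiomatized formal theory $T$ containing elementary arithmetic, there exists a sentence $G_T$ such that $T \nvdash G_T$ and $T \nvdash \neg G_T$; hence $T$ is incomplete—undecidable in the sense used here.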

On account of these considerations, we can proceed with an effective distinction procedure. The focus in this alternative view is not so much on Gödel's rigorous logical proof—which can be used in describing reality, and not only the numbers representing it—as on the notion of decidability, extended here from the formal domain to that of reality.

Definition An object of knowledge inquiry is decidable if it can be fully and consistently described. (This is, of course, a generalization of Gödel’s notion.)

Indeed, physics, astronomy, geology (mentioned in Goodstein, as well as in many subsequent reports), knowledge domains where reproducibility of experiments is close to 100%, are descriptions of dynamics (how things change), i.e., representations of change that can be complete and consistent.

Lemma Experiments involving decidable processes are reproducible. Such descriptions undergird predictions: the expected output of science, without necessarily guaranteeing it.

Observation The dynamics of the decidable—i.e., how entities that can be fully and consistently described change—is not necessarily a sufficient condition for making it predictive. The Poincaré problem regarding the dynamics of the 3-body system is probably the best-known example concerning this observation.
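
The observation can be made concrete with a minimal numerical sketch (masses, positions, and step sizes are arbitrary assumptions chosen for illustration): two planar three-body integrations whose initial conditions differ by one part in a billion drift measurably apart, although each run is fully deterministic and, as a computation, exactly reproducible.

import numpy as np

def accelerations(pos, masses, G=1.0):
    # Newtonian gravitational acceleration acting on each body.
    acc = np.zeros_like(pos)
    for i in range(len(masses)):
        for j in range(len(masses)):
            if i != j:
                r = pos[j] - pos[i]
                acc[i] += G * masses[j] * r / (np.linalg.norm(r) ** 3 + 1e-12)
    return acc

def integrate(pos, vel, masses, dt=0.001, steps=20000):
    # Semi-implicit Euler integration; numerical accuracy is not the point here.
    for _ in range(steps):
        vel = vel + accelerations(pos, masses) * dt
        pos = pos + vel * dt
    return pos

masses = np.array([1.0, 1.0, 1.0])
p0 = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])
v0 = np.array([[0.0, 0.3], [0.0, -0.3], [0.1, 0.0]])
p1 = p0.copy()
p1[0, 0] += 1e-9  # perturb one coordinate by a billionth

print("separation of final states:",
      np.linalg.norm(integrate(p0, v0.copy(), masses) - integrate(p1, v0.copy(), masses)))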

Thesis 3 The threshold from the decidable to the undecidable is the so-called G-complexity (G for Gödel, obviously (Nadin 2014)).

The source of undecidability is the interaction through which the living is identified as anticipatory. If the notion of complexity conjures any meaning, it cannot be reduced to numbers or to the language of mathematics, which captures only quantitative aspects of dynamics. G-complexity remedies the generality of the notion of complexity in favor of a distinction—the decidable—that can be probed. The cell is as undecidable as tissue and, moreover, as the organism it makes up.


Fig. 6 Characterizing the physical and the living

Thesis 4 Change above the G-complexity threshold is undecidable.

Interactions are the concrete expressions of G-complexity. In the living, interactions continuously multiply. As the living returns to its physical condition (from senescent states to death), interactions decrease and settle in the decidable domain of physical and chemical interactions.

The living, in its unlimited variety of ever-changing forms, is G-complex, i.e., it is characterized by undecidability. For non-living physical entities, interaction takes the specific form of deterministic reaction, expressed in physical laws (such as those captured in Newton's equations or in Einstein's theory of relativity) or embodied in measurement devices. Time is expressed as duration.

Change in the physical is unreflected: neither a stone nor a volcano is aware of the respective dynamics of how they change—and even less of why they change. They undergo change that can be observed, but which does not afford self-awareness. The observing subject (Fig. 6, see also Fig. 3), whose own condition is changed in the act of observation, takes note of their record of change (the duration sequence of the falling, the speed, the acceleration, the impact, or the detailed sequence of a volcano's eruption). Based on empirical observation, the observer can design experiments pertinent to the dynamics of falling or of a volcanic eruption (or of whatever other physical phenomenon). The living observes and, most important, can affect its own change.

Anticipation, always expressed in action, entails interactions that reshape those which interact. Hence, the never-solved question of the relation between predicting an event and the cause of the event acquires a new perspective. For the living, change is the outcome of interactions in which the physical (the dynamics of action-reaction) is complemented by anticipatory expression: the current state is contingent upon a possible future state as it pertains to preserving life. (Anticipatory processes also underlie successful actions: athletic performance, financial transactions, military strategy, etc.) Living entities (animals mostly) do not simply fall; they "know" (implicit knowledge) how to fall, as long as anticipatory processes take place. The example of falling extends to all organisms. Tardigrades (among other so-called cryptobiotic organisms, i.e., living in states of suspended dynamics), like seeds, also behave in an anticipatory manner. The process can be described (in an anthropomorphic way of speaking) as following the Möbius strip trajectory back to life (Neuman 2006). Put otherwise, there is a continuum of life akin to the geometry of the Klein bottle (surface and interior are on the same plane), as there is in every creative endeavor: from inspiration to artwork (yet another example documented by the entire history of art and literature). The possible future state in the dynamics of cell interaction or in neuronal synapses is of a different nature, and easier to understand, than that of the tick that can remain in "suspended" life (up to 18 years) until the butyric acid of a body in its Umwelt (Uexküll 1934) triggers its action. Change in the living is reflected in the form of its representation through successive states. Living interaction is not reducible to the physical action–reaction sequence (Ellis 2005, 2006).

The description of physical interaction conjures quantity: its observation (measurement) results in data. The description of living interaction conjures quality: its observation results in information, i.e., data associated with meaning (Wheeler 1989). Information, characteristic of life, is not physical (López-Suárez et al. 2016). Lived time and clocked time are different. In the physical, the scale of time is constant; in the living, it is variable. For instance, aging entails, among other changes, a different rhythmic expression. The circadian clock is progressively disrupted (Kuintzle et al. 2017) and, as a result, anticipatory processes are affected (Nadin 2004). The circadian system is part of a sophisticated duration-based control of living processes, extending from the molecular to the cellular, physiological, and behavioral levels. By no means do we describe here clocks, but rather timing processes. (Those who try to produce computational models ignore this and, therefore, contribute to yet another reductionist view.)

Behaviors subject to the timing control processes are the expression of anticipatory processes: the performance of dancers, orchestra conductors, soloists, actors; market dynamics, political leadership. These are examples of anticipatory expression resulting in success or failure. Numbers do not capture the uniqueness, i.e., the meaning of actions driven by the possible future expressed in the anticipation. Those who documented that a drug-induced “high” is similar to the “high” occurring while listening to music, or experiencing a moving theater performance, or having sex, confirmed this idea (Mallik et al. 2017).

12 Change is the outcome of interaction

Physics continuously provides reproducible experimental evidence concerning the cause-and-effect perspective expressed in determinism. If the physical world were to start over (using Gould's suggestive replay of the film), its dynamics would, for all practical purposes, be the same. Evolution provides empirical evidence of its underlying process: anticipation. Successful anticipation drives survival. If life were to begin again (replay the video), the outcome would be different in each such re-beginning, and not predictable. The reactive and the anticipatory are integrated in the dynamics of life (Fig. 7, see also Fig. 1). Anticipation, as the underlying process of evolution, is documented through empirical evidence. It expresses characteristics of the living such as adaptivity, holism, purposefulness, and creativity, and it provides the premise for understanding emotion, the relation between the embodied brain and the mind, and the focus on meaning.

Research (Pethel and Hahs 2011) acknowledges that the living generates information. Attempts were made to "measure the information flow in experimental data from neuroscience, finance and even music". The goal: "…to discover what's predicting and what is causing". Economic bubbles—how they come into being—are, like many other anticipatory processes (e.g., listening to music, participating in religious services or mass rallies, observing one's blood pressure or weight), self-fulfilling prophecies. The back-and-forth coupling between the desired outcome and the data generated by the goal-oriented activity of the living explains adaptive processes to a large extent. Interactions of all kinds explain the dynamics of such coupling.
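
The "information flow" estimates referred to in the work quoted above are typically transfer entropies. A minimal sketch for coarsely quantized time series follows; it illustrates the general idea only, it is not a reconstruction of Pethel and Hahs's estimator, and the two synthetic series are invented for the example.

import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    # Estimate TE(y -> x) = sum p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ]
    # after binarizing both series at their medians (a crude quantization).
    x = (np.asarray(x) > np.median(x)).astype(int)
    y = (np.asarray(y) > np.median(y)).astype(int)
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_next, x_now, y_now)
    xx = Counter(zip(x[1:], x[:-1]))                # (x_next, x_now)
    xy = Counter(zip(x[:-1], y[:-1]))               # (x_now, y_now)
    x_now = Counter(x[:-1])
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_full = c / xy[(x0, y0)]                   # p(x_next | x_now, y_now)
        p_hist = xx[(x1, x0)] / x_now[x0]           # p(x_next | x_now)
        te += p_joint * np.log2(p_full / p_hist)
    return te

# Synthetic example: y drives x with a one-step lag, so TE(y->x) should exceed TE(x->y).
rng = np.random.default_rng(0)
y = rng.normal(size=5000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=5000)
print("TE(y->x) =", transfer_entropy(x, y), " TE(x->y) =", transfer_entropy(y, x))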

13 Decidability and algorithmic processing

To expect experiments involving the living (of interest not only to psychology, but also to the biomedical sciences and many other fields of inquiry pertinent to life) to be reproducible is epistemologically equivalent to reducing the living to its physical substratum, and biology to physics and chemistry (the reductionist doctrine). A particular form of this reduction is the machine model in its algorithmic expression. Humans made the machine-god to replace the omnipotent; they then took the functioning predicated of the machine-god and committed to the practice of a machine religion based on this belief. Experiments within this religion of the machine are reproducible, since they confirm the premise: the machine-god means that everything is a machine, or behaves like one, and hence is fully predictable.


Fig. 8 Pregnancy, as part of the creative aspect of life, exemplifies anticipation at all levels of the life of the mother (and even that of the father)

Another aspect of the same scientific theology is that of conditioning: we created god, defined the commandments, and expect everyone to submit to them as though they came from a higher authority. It is conditioning, with a reward mechanism attached to it: respect the "commandments" as though they were "god-sent". The "god you made up" will reward you by making you more god-like. In the machine-god phase, the same is practiced: the machine rewards those who accept that they are machines—and who behave accordingly—by making them more machine-like. Many experiments turn out to be mere instances of conditioning (psychology outperforms every other discipline in this respect). Debunked many times over, conditioning became a feature of the very large body of "press the button" experiments—act like a machine—meant to legitimize user interfaces, cognitive aspects of perception, and the efficiency of behavioral treatments.

In general, a limited understanding of causality as it pertains to life dominates experiments concerning the living. By contrast, in modern physics, and not only in the quantum mechanics perspective, causality is gradually approached within a broader understanding of determinism and even openness to non-deterministic processes. Awareness of non-linearity and stochastic aspects of physical phenomena permeates such a view.

Mapping from an open system (extending from the cells to the whole human being), of extreme dynamics, to the closed system of the experiment—which by definition is supposed to be decidable—might result in reproducibility. But what is reproduced is a false assumption, not knowledge-bearing hypotheses about change. The validity of some 40,000 fMRI studies, and more broadly the interpretation of neuroimaging results, was recently questioned (Eklund et al. 2016), after the fMRI technology itself (now 25 years old) was critically assessed (Shifferman 2015, among others). False-positive rates of up to 70% for its most common statistical methods, which had not been validated on real data, are actually proof of a replicated but misguided assumption.
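
The mechanism of such inflation is easy to reproduce in simulation. The sketch below is a generic multiple-comparisons illustration with synthetic data and arbitrary dimensions, not a reconstruction of the cluster-level analysis questioned by Eklund et al.: testing thousands of pure-noise "voxels" at p < 0.05 without correction yields at least one "significant" result in virtually every simulated experiment.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
experiments, voxels, subjects = 200, 10000, 20
with_false_positive = 0
for _ in range(experiments):
    data = rng.normal(size=(subjects, voxels))           # pure noise: no effect exists
    _, p = stats.ttest_1samp(data, popmean=0.0, axis=0)  # one t-test per voxel
    if np.any(p < 0.05):                                 # any voxel declared "active"?
        with_false_positive += 1
print("fraction of experiments with >= 1 false positive:",
      with_false_positive / experiments)                 # approaches 1.0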

But even within the mechanistic view of the living, replication is by no means guaranteed. As impressive as the Human Genome Project was, it is a good example of irreproducible experiments (along with its incompleteness). It was generated under the reductionist assumption of a blueprint—published as such (Venter et al. 2001)—of a Homo sapiens that does not change over time, i.e., epigenetics was ignored. What was extracted is a truncated image of gene syntax. The 1000 Genomes Project (2008–2015), aimed at studying variation (initially ignored) and genotype data, is an example of improved understanding, yet still another irreproducible experiment. It affords useful empirical data, such as access to some semantic aspects of gene expression. The goal, probably not yet on the radar of scientific inquiry, should be the pragmatic level, where meaning is constituted in the context of life unfolding in an anticipatory manner. However, that would entail the need to accept that only experiments within the decidable are reproducible. The complementary claim is evident: experiments concerning the dynamics of the G-complexity domain (the living) cannot be replicated. They are of extreme importance to science, but more as a source of data: the time (actually duration) series of the observed phenomenon. The synchronic view (bearing the time stamp of the experiment clock) begs for the complementary diachronic representation of integrated processes—involving a variety of clocks and timescales.

This idea is relatively well illustrated by the entire cycle of reproduction (Fig. 8). Pregnancy (Brunton and Russell 2008) is a convincing example of anticipatory expression underlying creation, i.e., the birth of some entity that never existed before. For instance, anticipatory adaptations that diminish or eliminate the influence on the fetus of the mother's stressful experiences take quite a number of forms—from the flow of energy needed for the new life to triggering lactation. Oxytocin is released in advance of parturition and during lactation. Anticipatory processes underlie the timing and details of the body making milk for the future nursing infant. Maternal behavior is also changed in anticipation of the birth proper. An altered emotional condition parallels new endocrine and cognitive functions. As was already pointed out, the living, pregnant or not, is in a continuous state of remaking itself, sui generis re-creation of its constitutive cells—each different from the other—and thus of the entire organism. The constancy of physical (non-living) entities, even those of extreme dynamics (such as black holes), stands in contrast to the variability of any and all organisms and the matter in which they are embodied.

An assumption similar to that of the Human Genome Project governs the current Connectome project. It will be ten or one hundred times more costly than the Genome Project, but not more adequate in reporting on the variability of the cortex. Brain activity has become the showcase of computational modeling. There is, of course, much to gain from computational models in physics applications—the Juno space mission is only a recent spectacular example. In the biological realm, an intrinsic limitation is ignored: algorithmic computation captures only the deterministic aspects of change. In addition, the premise that such processes are algorithmic in nature was never proven—or even questioned. The algorithmic is decidable; moreover, it is tractable. This means that executing the program that represents the dynamics of the process takes time bounded by a polynomial in the number of steps required (i.e., in the size of the input).
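
In standard complexity-theoretic notation (a textbook convention, not a formula from this paper), "tractable" means polynomial-time bounded: $T_A(n) = O(n^k)$ for some fixed constant $k$, where $T_A(n)$ is the running time of algorithm $A$ on inputs of size $n$; by contrast, exponential running times such as $O(2^n)$ mark the intractable.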

Thesis 5 The decidable can be represented by a tractable algorithmic computation.

Thesis 6 The living is not algorithmic.

Those who for centuries have tried to come up with a “recipe” for making life might produce only an open-ended inductive sequence. The abductive reasoning behind Thesis 6 is based on the consequences of the understanding that the living is couched in G-complexity, that is, it is undecidable.

Thesis 7 The undecidable is not tractable, neither in computational form nor in any other form of data processing.

Even considering infinite computing resources for any kind of computation (analog, algorithmic, interactive, etc.)—which would undermine our current understanding of the relation between energy and data processing—the undecidable would remain intractable. For example, the ants (Philidris nagasau) that "cultivate" plants of the coffee family for accommodation accumulated experience in this particular form of agriculture over millions of years (Chomicki and Renner 2016), eons before human beings did. An inverse computation from today back to when this ant "agriculture" actually emerged might explain the anticipatory characteristics of the elaborate process. But to perform it would take more than the energy involved in the evolution of life over that span of time. The physics of what the ants do is relatively simple; even an abacus would suffice for calculating how the process takes place.

The guaranteed reproducibility of computational neuroscience experiments conjures knowledge and validation not about the brain, whose deterministic and non-deterministic aspects complement each other in its functioning, but about algorithmic computation. Interactive computation, as well as other forms of computation, in line with the dynamics of interaction of the living in general, and of the brain in particular, is rarely considered (Nadin 2016, 2017).

Windelband's (1907) view of nomothetic science (expressed in universally valid laws, such as Newton's laws of mechanics) and idiographic science (diachronic processes subject to empirical observations) could well guide the definition of new methods for gaining knowledge peculiar to the living. Let us recall all those biologists (not only Gould, mentioned above) who have been questioning the assumption that there are laws similar to those of physics that describe the living. Biologists mostly experience the uniqueness of each subject and wonder how this uniqueness (the idiographic characteristic) can be described. There is a direct practical consequence to this distinction: medical care as a reactive praxis of fixing the "human–machine," or individualized care (the art and science of healing) reflecting the awareness of uniqueness. A knee replacement is an example of an experiment replicated many times. A genetic-based treatment, extremely individual in nature, is as unique as the new attempts at immunotherapy in addressing conditions for which reactive medicine is not an option. Moreover, the crisis of the experiment is also the crisis of our understanding of how knowledge is acquired, validated, and shared. Although the languages of visualization, modeling, and simulation are different from those of analytic expression, logic, mathematics, etc., we continue to expect some uniformity, as though the reality we question is by necessity homogeneous. Peirce (1992) suggested diagrammatic thinking as an alternative. So far, his ideas remain outside the mainstream of science. In recent years, Leamer (2009), among others, contrasted "theory and evidence" vs. "patterns and stories." For some reason, neither biological theorists, such as Elsasser, Rosen, Pattee, and more recently Kauffman (Gare 2013), nor philosophers of the subject (in particular the competent Arran Gare) have taken note of these developments. Even when the notion of story is mentioned, work in defining it (in contrast to the narrative) is ignored (Nadin 2013a).

Consequently, story remains a rather undefined candidate, although the uniqueness of life phenomena speaks more in favor of variety (which stories can offer) than of the replication of experiments. The proponents of physics as "the science of everything" are grounded in its constructs. Those who advance alternative understandings of life processes know more about what they reject than about what defines the culture of the subject of biology. For instance, they ignore contributions coming from researchers outside their own context (such as those who worked in what used to be the Soviet Union, and who were severely censored). Just for the sake of the argument (i.e., integrating ideas from outside the culture shaped by the Cartesian view), let us mention that Bernstein (1967) wrote about the "repetition without repetition" characteristic of the living as an expression of its dynamic variability. Machines provide mechanical repetition, which is their expected, and desired, performance. Empirical evidence is yet another argument in favor of finally transcending the machine view of the living characteristic of Cartesian determinism and reductionism. Based also on empirical evidence, Gelfand (along the line of Wigner's remark on the effectiveness of mathematics in physics) stated (Gelfand 2007): "There is only one thing which is more unreasonable than the unreasonable effectiveness of mathematics in physics, and this is the unreasonable ineffectiveness of mathematics in biology." Mathematics captures the decidable. Other descriptions, such as the record of a process, bear testimony to the undecidable. Progress in science makes a "new Cartesian revolution" necessary, placing at the forefront of science's efforts the endeavor to better understand change in the specific manner in which it characterizes life.

Acknowledgements

This study is the outcome of a long-term endeavor. Interactions with distinguished colleagues and many young researchers helped in defining the foundation for this work. The author would like to acknowledge Robert Rosen for his pioneering work in defining the living, and colleagues from the University of California–Berkeley, Professors Harry Rubin (Biology) and Lotfi Zadeh (Electrical Engineering and Computer Science); Professor Solomon Marcus (mathematician, Member of the Romanian Academy), Aloisius H. Louie, Stuart Kauffman, Kalevi Küll (University of Tartu, Estonia), and more recently Arran Gare (Swinburne University of Technology, Melbourne, Australia) for their intellectual openness to new ideas and their encouragement. An anonymous reviewer suggested the inclusion of arguments pertinent to AI (and ALife). Luigi Longo took time to discuss in detail the arguments presented in a preprint version of this study. Both deserve acknowledgment and my gratitude.

Compliance with ethical standards

Conflict of interest There are no conflicting interests to be reported.

References

Artificial Intelligence (2016) Life in 2030 One Hundred Year Study on Artificial Intelligence. Report of the 2015 Study Panel, September 2016. https://ai100.stanford.edu/sites/default/files/ai_100_report_0831fnl.pdf. Retrieved February 13, 2017

Bailey R (2016) Most scientific findings are wrong or useless, reason. August 26, 2016 (reason.com/archives/2016/08/26/most-scientific-findings-are-wrong-or-use)

Baker M (2016) 1500 scientists lift the lid on reproducibility. Nature 533:452–454 (corrected 28 July 2016)

Ball P (2008) Cellular memory hints at the origins of intelligence. Nature 451:385. doi:10.1038/451385a

Ball P (2016) The mathematics of science’s broken reward system. Nat News Comment. doi:10.1038/nature.2016.2097

Baluska F, Mancuso S, Volkmann D, Stefano M (2006) Communication in plants: neuronal aspects of plant life. Springer, Berlin/Heidelberg

Barabasi AL (2009) Scale-free networks: a decade and beyond. Science 325(5939):412–413

Bassin PV, Bernstein NA, Latash LP (1966) Towards the problem of the relations between brain architecture and functions in its modern understanding. Physiology in clinical practice. Nauka, Moscow, pp 38–49 (in Russian)

Beatty J (1995) The evolutionary contingency thesis. In: Wolter G, Lennox J (eds) Concepts, theories and rationality in the biological sciences. University of Pittsburgh Press, Pittsburgh, pp 45–81

Beatty J (1997) Why do biologists argue like they do? Philos Sci 6(4 supp):S432–S443

Ben-Menachem Y (1997) Replaying life’s tape. J Philos 103(7):336–362

Bernstein NA (1967) The coordination and regulation of movements. Pergamon Press, Oxford (see also: Nadin M (ed) Learning from the past. Early Soviet/Russian contributions to a science of anticipation. Cognitive Systems Monographs. Springer, Cham CH2015)

Berry DK, Caplan ME, Horowitz CJ, Huber G, Schneider AS (2016) "Parking-garage" structures in nuclear astrophysics and cellular biophysics. Phys Rev C 94:055801. doi:10.1103/physrevc.94.055801

Bostrom N (2003) Are you living in a computer simulation? Philos Q 53(211):243–255

Brunton PJ, Russell JA (2008) The expectant brain: adapting for motherhood. Nat Rev Neurosci 9:11–25

Chomicki G, Renner SS (2016) Obligate plant farming by a specialized ant. Nat Plants 2:16181. doi:10.1038/nplants.2016.181

Clay R et al (2015) Reproducibility project: psychology. Science 349:6251

Constantinople CM, Bruno RM (2013) Deep cortical layers are activated directly by thalamus. Science 340(6140):1591–1594

Conway Morris S (2003) Life’s solution: inevitable humans in a lonely universe. Cambridge University Press, Cambridge

Desjardins E (2011) Biology and philosophy 26(3):339–364

Dutton L (2015) Nature’s marvelous machines. Research frontiers in bioinspired energy: molecular learning from natural systems. Washington DC: National Academies of Science, Engineering and Medicine. http://nas-sites.org/bioinspired/featured-scientists/les-dutton-natures-marvelous-machines/. Accessed 9 Nov 2016

Eberbach E, Goldin D, Wegner P (2004) Turing's ideas and models of computation. In: Teuscher C (ed) Alan Turing: life and legacy of a great thinker. Springer, Berlin/Heidelberg

Eklund A, Nichols TE, Knutsson H (2016) Cluster failure: why fMRI inferences for spatial extent have inflated false-positive rates. PNAS 113(28):7900–7905. doi:10.1073/pnas.1602413113 (Epub 2016 Jun 28)

Ellis GFR (2005) Physics, complexity and causality. Nature 435:743

Ellis GFR (2006) Physics in the real world. Found Phys 36(2):227–262

Elsasser W (1998) Reflections on a theory of organisms: holism in biology. Johns Hopkins University Press, Baltimore

England J (2013) Statistical physics of self-replication. J Chem Phys 139:12. doi:10.1063/1.4818538

Fehr J, Heiland J, Himpe C, Saak J (2016) Best practices for replicability, reproducibility and reusability of computer-based experiments exemplified by model reduction software. 2016 (arXiv:1607.01191v)

Gare A (2013) Overcoming the Newtonian paradigm: the unfinished project of theoretical biology from a Schellingian perspective. Prog Biophys Mol Biol 113(1):5–24

Gelfand IM (2007) In: Borovik AV (ed) Mathematics under the microscope. Notes on cognitive aspects of mathematical practice. September 5, 2007. Creative Commons. http://eprints.ma.man.ac.uk/844/

Gelfand IM, Tsetlin ML (1966) On mathematical modeling of the mechanisms of the central nervous system. In: Gelfand IM, Gurfinkel VS, Fomin SV, Tsetlin ML (eds) Models of the structural-functional organization of certain biological systems. Nauka, Moscow, pp 9–26 (in Russian; a translation is available in the 1971 edition by MIT Press, Cambridge, MA)

Goodell J (2016) Inside the artificial intelligence revolution: a special report (Pt 1), Rolling Stone, February 29, 2016. http://www.rollingstone.com/culture/features/inside-the-artificial-intelligence-revolution-a-special-report-pt-1-20160229. Retrieved February 13, 2017

Goodstein D (2002) Scientific misconduct. Academe 88(1), 28–31

Gould SJ (1989) Wonderful life: the burgess shale and the nature of history. W.W. Norton, New York

Handler P (ed) (1970) Biology and the future of man. National Academies Press, Washington DC

Heisenberg W (1927) Über den anschaulichen Inhalt der quanten-theoretischen Kinematik und Mechanik. Zeitschrift für Physik (in German) 43(3–4):172–198

Horrigan S et al (2017) Replication study: melanoma genome sequencing reveals frequent PREX2 mutations. eLife 6:e21634

Horton R (1999) Scientific misconduct: exaggerated fear but still real and requiring a proportionate response. Lancet 354(9172):7–8

Horton R (2015) Offline: what is medicine's 5 sigma? Comment. Lancet 385(9976):1380

Jaegwon K (2009) Mental causation. In: McGlaughlin B, Beckermann A, Walter S (eds) The Oxford handbook of philosophy of mind, Oxford Handbooks Online, 40. http://www.oxfordhandbooks.com/view/10.1093/oxfordhb/9780199262618.001.0001/oxfordhb-9780199262618-e-2

Kandela I et al (2017) Replication study: discovery and preclinical validation of drug indications using compendia of public gene expression data. eLife 6:e17044

Kauffman SA (2000) Emergence and story: beyond Newton, Einstein and Bohr? Investigations, Chap 6. Oxford University Press, Oxford

Kauffman SA, Gare A (2015) Beyond Descartes and Newton: recovering life and humanity. Prog Biophys Mol Biol 119:219–244

Kuintzle RC, Choq ES, Westby TN, Gvakharia BO, Giebultowicz JM, Hendrix DA (2017) Circadian deep sequencing reveals stress-response genes that adopt robust rhythmic expression during aging, Nat Commun. Accessed 24 Feb 2017. doi:10.1038/ncomms14529

Latash ML (2016) Towards physics of neural processes and behavior. Neurosci Biobehav Rev 69:136–146. doi:10.1016/j.neubiorev.2016.08.005

Leamer SE (2009) Macroeconomic patterns and stories. Springer, Berlin/Heidelberg

Lenski RE, Travisano M (1994) Dynamics of adaptation and diversification: a 10,000-generation experiment with bacterial populations. Proc Natl Acad Sci 91(15):6808–6814

Li S, Stamfer M, Williams DR, VanderWeele TJ (2016) Association between religious service attendance and mortality among women. JAMA Intern Med 176(6):777–785

Libby T et al (2012) Tail-assisted pitch control in lizards, robots and dinosaurs. Nature 481(7380):181–184

Longo G (2017) How future depends on past and rare events in systems of life. Foundations of Science, 2017, IEA Nantes. http://www.di.ens.fr/users/longo/files/biolog-observ-history-future.pdf

Longo G, Montevil M (2013) Extended criticality, phase spaces and enablement in biology, Chaos, Solitons and Fractals, Emerg Crit Brain Dyn 55:64–79. doi:10.1016/j.chaos.2013.03.008

Longo G, Montévil M, Kauffman S (2012) No entailing laws, but enablement in the evolution of the biosphere. arXiv:1201.2069

López-Suárez M, Neri I, Gammaitoni L (2016) Sub-kBT micro-electromechanical irreversible logic gate. Nat Commun 7

Losos JB, Jackman TR, Larson A, deQueiroz K, Rodriguez-Schettino L (1998) Contingency and determinism in replicated adaptive radiations of island lizards. Science 279(5359):2115–2118. doi:10.1126/science.279.5359.2115

Mahon BZ, Anzellotti S, Schwarzbach J, Zampini M, Caramazza A (2009) Category-specific organization in the human brain does not require visual experience. Neuron 63:397

Mallik A, Chanda ML, Levitin DJ (2017) Anhedonia to music and mu-opioids: evidence from the administration of naltrexone. Nat Sci Rep 7, Article number 41952. http://www.nature.com/articles/srep41952. Retrieved 23 Feb 2017

Mantis C et al (2017) Replication study: coadministration of a tumor-penetrating peptide enhances the efficacy of cancer drugs. eLife 6:e17584

Mogil JS, MacLeod MR (2017) No publication without confirmation. Nat Comment 542:7642. http://www.nature.com/news/no-publication-without-confirmation-1.21509. Accessed 23 Feb 2017

Nadin M (1997) The civilization of illiteracy. Dresden University Press, Dresden

Nadin M (2003) Anticipation—the end is where we start from. Lars Müller Verlag, Basel

Nadin M (2004) Project Seneludens. http://seneludens.utdallas.edu

Nadin M (2011) The anticipatory profile. An attempt to describe anticipation as process. Int J General Syst (special edition), 41(1):43–75. Taylor and Francis, London

Nadin M (2013a) Anticipation: a bridge between narration and innovation. In: Müller AP, Becker L (eds) Narrative and innovation, management—culture—interpretation. Springer Fachmedien, Wiesbaden, pp 239–263

Nadin M (2013b) The intractable and the undecidable—computation and anticipatory processes. Int J Appl Res Inf Technol Comput 4(3):9–121

Nadin M (2014) G-complexity, quantum computation and anticipatory processes. Computer Communication and Collaboration, vol 2:1. BA Press, Toronto, pp 16–34

Nadin M (2016a) Medicine: the decisive test of anticipation. In: Nadin M (ed) Anticipation and medicine. Springer, Cham, pp 1–27

Nadin M (2016b) Anticipation and the brain. In: Nadin M (ed) Anticipation and medicine. Springer, Cham

Nadin M (2016c) Anticipation and the brain. In: Nadin M (ed) Anticipation and medicine. Springer, Berlin/Heidelberg

Nadin M (2016d) Predictive and anticipatory computing. In: LaPlante P (ed) Encyclopedia of computer science and technology, 2nd edn. Taylor and Francis, London, pp 643–659. doi:10.1081/E-ecst2-120054027

Nadin M (2017) Predictive and anticipatory computing. In: Laplante P (ed) Encyclopedia of computer science and technology, 2nd edn. Taylor and Francis, London, pp 643–659

National Research Council Board on Biology (1989) Opportunities in biology. National Academy Press, Washington DC

Nature/Editorial (2016) Reality check on reproducibility. Nature 533:437

Neuman J (2006) Cryptobiosis: a new theoretical perspective. Prog Biophys Mol Biol 92:66

Nosek B (2015) Estimating the reproducibility of psychological science. Science 349(6251):943

Peirce CS (1932) The collected papers of Charles Sanders Peirce. In: Hartshorne C, Weiss P (eds) Cambridge MA: The Belknap Press of Harvard University Press (Following accepted practice, the reference refers to Vol 5, entry 145)

Peirce CS (1992) On the algebra of logic. In: Houser N, Kloesel C (eds) The essential Peirce: selected philosophical writings, vol 1 (1867–1893). Indiana University Press, Bloomington, p 227

Pethel S, Hahs D (2011) Distinguishing anticipation from causality: anticipatory bias in the estimation of information flow. Phys Rev Lett 107:128701. http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.107.128701. Accessed 24 Feb 2017

Picollo S (2013a) Embracing mechanical forces in cell biology. Differentiation 86(3):75–76

Picollo S (2013b) Developmental biology: mechanics in the embryo. Nature 504:223–225

Powell R, Mariscal C (2015) Convergent evolution as natural experiment: the tape of life reconsidered. Interface Focus 5:1–13

Pritsker M (2012) Studies show only 10% of published science is reproducible. What is happening? J Vis Exp. https://youtube.com/watch?v=b-Tz1XofYhs

Replication studies offer much more than technical details (2017) They demonstrate the practice of science at its best. Nature|Editorial 541:7637

Report of Positive Psychology Center (2015) http://www.sas.upenn.edu/psych/seligman/ppcannualreport.pdf

Rosen R (1991) Life itself: a comprehensive inquiry into the nature, origin, and fabrication of life (Complexity in ecological systems). Columbia University Press, New York

Rosen R (1999) The Church-Pythagoras thesis. In: Essays on life itself. Columbia University Press, New York, pp 63–81

Sarewitz D (2016) Saving science. The New Atlantis 49:5–40 Spring/Summer

Schrödinger E (1944) What is life? Macmillan, New York

Shifferman E (2015) More than meets the fMRI: the unethical apotheosis of neuroimages. J Cognit Neuroethics 3(2):57–116

Smaldino PE, McElreath R (2016) R Soc Open Sci 3. doi:10.1098/rsos.160384

Sober E (2008) Core questions in philosophy: a text with readings, 5th edn. Pearson, London

Symposium Report (2015) Reproducibility and reliability of biomedical research: improving research practice. Joint statement of the Academy of Medical Sciences, the Biotechnology and Biological Sciences Research Council (BBSRC), the Medical Research Council (MRC) and the Wellcome Trust. October 2015. http://www.acmedsci.ac.uk/policy/policy-projects/reproducibility-and-reliability-of-biomedical-research/

Tegmark ME (2014) Our mathematical universe: my quest for the ultimate nature of reality. Knopf, New York

The Sequence of the Human Genome (2001) Science "The Human Genome" 291:5507. American Association for the Advancement of Science, Washington DC

Turing AM (1948) Intelligent machinery [technical report]. Teddington: National Physical Laboratory (see also Copeland BJ (ed) 2004 The Essential Turing: seminal writings in Computing Logic, Philosophy, artificial Intelligence, and Artificial Life plus The Secrets of Enigma. Oxford University Press, Oxford)

Turner DD (2009) How much can we know about the causes of evolutionary trends? Biol Philos 24:341. doi:10.1007/s10539-008-9139-5

Uexküll J (1934) Streifzüge durch die Umwelten von Tieren und Menschen. Julius Springer Verlag, Berlin (see also: A foray into the worlds of animals and humans with A theory of meaning (O'Neill JD, trans.). University of Minnesota Press, Minneapolis, 2010)

Vanderweele TJ (2016) Religion and health: a synthesis. In: Peteet JR, Balboni MJ (eds) Spirituality and religion within the culture of medicine: from evidence to practice. Oxford University Press, New York

Venter JC et al (2001) The sequence of the human genome. Science 291(5507):1304–1351

Von Neumann J (1951) The general and logical theory of automata. Cerebral mechanisms in behavior, 1–41 (this statement is cited in many texts, but no precise reference is ever given)

Wheeler JA (1989) Information, physics, quantum: the search for links. In: Proc. 3rd Int. Symp. Foundations of Quantum Mechanics. Tokyo, pp 354–368

Whitesides GM (2015) Reinventing chemistry. Angewandte Chemie Internationale 54(11):3196–3209. doi:10.1002/anie.201410884

Williams GC (1992) Natural selection: domains, levels, and challenges. Oxford University Press, New York

Williams R (2017) Replication complications. An initiative to replicate key findings in cancer biology yields a preliminary conclusion: it's difficult. The Scientist. http://www.the-scientist.com/?articles.view/articleNo/48031/title/Replication-Complications/

Windelband W (1907) Geschichte und Naturwissenschaft. Rectoratsreden der Universität Strassburg (History and Natural Science, Rectoral Address). Mohr, Tübingen, pp 35–379

