Abstract

The tractable history of life records a successive emergence of organisms composed of hierarchically organized cells and greater degrees of individuation. The lowermost object level of this hierarchy is the cell, but it is unclear whether the organizational attributes of living systems extended backward through prebiotic stages of chemical evolution. If the systems biology attributes of the cell were indeed templated upon prebiotic synthetic relationships between subcellular objects, it is not obvious how to categorize object levels below the cell in ways that capture any hierarchies which may have preceded living systems. In this paper, we map out stratified relationships between physical components that drive the production of key prebiotic molecules starting from radiolysis of a small number of abundant molecular species. Connectivity across multiple levels imparts the potential to create and maintain far-from-equilibrium chemical conditions and to manifest nonlinear system behaviors best approximated using automata formalisms. The architectural attribute of “information hiding” of energy exchange processes at each object level is shared with stable, multitiered automata such as digital computers. These attributes may indicate a profound connection between the system complexity afforded by energy dissipation by subatomic level objects and the emergence of complex automata that could have preceded biological systems.

1. Introduction

The tractable history of life exhibits a consistent trend in structural hierarchy, as recorded by the successive emergence of organisms with greater numbers of levels of nestedness and greater degrees of individuation at its highest levels [1–3]. The lowermost object level of this hierarchy is the cell, but it is not clear whether this trend extended to object levels below the cell itself prior to the emergence of the Last Universal Common Ancestor (LUCA). As one of the few trends found across diverse clades of living systems, it is reasonable to infer that some corollaries of this generalized behavior were also central to the chemical processes leading to life’s origins. One critical impediment to investigating life’s origins along these lines is that the delineation of object levels below the cellular level can be done in any number of possible ways: should it consist of a subdivision into the four major polymer types that make up the cell (nucleic acids, proteins, lipids, and carbohydrates), each with its own precursors and settings [4, 5], an assortment of autocatalytic [6] or mutually catalytic [7] sets, an inferred chronological ordering of the appearance of life’s universal chemical constituents [8, 9], or some other logical arrangement altogether? Additionally, how far down in the hierarchy of matter ought object level classifications to extend? Is it sufficient to stop at the level of small molecules?

A synthesis of physics, chemistry, systems biology, and automata theory may provide a constructive means of distilling groups of objects that enable a living cell to emerge. An automaton is a machine that performs a function or set of functions according to a predetermined set of coded instructions, especially one capable of a range of programmed responses to different circumstances [10]. This broad definition encompasses many different physical forms (instantiations) of objects that exhibit automaton behavior [11–13]. For example, one may describe an enzyme as a kind of automaton that carries out a catalytic function in response to input instructions (i.e., a substrate molecule) to produce a specific molecular output [14–16]. Some enzymes carry out catalytic functions only under the circumstance that an appropriate energy activation molecule is available (e.g., nucleoside triphosphates such as ATP or GTP, or cofactors such as NADP(H) and ferredoxins); others do not [17]. At a broader scale, biosynthetic pathways or even the entire metabolic network of a cell may be viewed as exhibiting automaton-like properties in coordinating the uptake of nutrients and the excretion of wastes [18]. In these ways and many others, living organisms carry out complex physical and chemical processes that resemble the workings of automata assemblies across many different scales of structure [19–21].

In this paper, we extend the idea of subsumed complexity [22] to map out stratified hierarchical relationships between physical and chemical objects that produce key prebiotic molecules. These relationships extend from subatomic objects up to cells, where they join contiguously with biological nested hierarchies. The stratification of these object levels reflects the division of input energy among greater numbers of particles within the system, corresponding to an increase in entropy. Multilevel transfer processes across different object levels impart emergent dynamical properties best described with automata formalisms. The circumstances by which physicochemical automata can emerge and direct the production of higher level objects may serve as a useful guide for reconstructing life’s origins.

2. Many Roads Lead to Rome: Continua Connecting Abiotic to Biotic States

Reconstructive efforts in the origins of life have historically been assessed along different continua as new analytical tools, fundamental concepts, and disciplinary advances have been developed. Each assessment is broadly similar in theory and approach, consisting of an attempt to match the characteristics of extant life to the characteristics of nonliving settings or circumstances that may be most conducive to life’s emergence. In the modern era, this assessment has broadly unfolded along environmental [23–26], physical [27–29], and chemical [25, 30–33] axes that attempt to connect nonliving (abiotic) to living (biotic) states of matter in a contiguous fashion. With greater elucidation of universal chemical and physical processes, comparison of possible prebiotic environments has been recast as comparison of physical and chemical processes that are colocated with one another in any particular environment.

Origins research has recently expanded to include computational simulations of automata with emergent properties that mimic or recapitulate living behaviors [34–36]. Abstract in silico automata have uncovered dynamical relationships that, for example, can give rise to the emergence of functional biopolymers [37]. Beyond the scope of in silico model development, though, more basic questions remain with respect to how the fundamental characteristics of automata (in contrast to those characteristics that may be reduced to wholly physicochemical mechanisms) were accumulated and exhibited by prebiotic systems. Indeed, the field of biological computation is built upon the notion that life forms are not just objects that may be approximated by automata; depending upon how they are cultivated and observed, living systems may be described as forms of automata [21, 38–40], which are themselves composed of smaller-scale automata with emergent properties that arise between and among lower object levels [20, 41].

At multiple organizational levels (enzymatic, metabolic network, transcription/translation, and entire cellular systems), biochemical reactions and organismal responses are ultimately structured by architectural information that is stored in an organism’s genome. It is particularly tempting to draw an analogy between the intracellular processing of nucleic acids and the processing of information stored in memory elements of the most complex class of automaton known as a Turing machine: both systems process information stored in a string of symbols built upon a fixed alphabet, and both operate by moving step-by-step along those strings, modifying or adding symbols according to a given set of rules [41].
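To make the analogy concrete, the sketch below implements a minimal Turing-style machine in Python: a head steps along a symbol string and rewrites it according to a fixed rule table, loosely analogous to a polymerase stepping along a template. The states, alphabet, and rules are arbitrary illustrations, not a model of any specific biochemical process.

```python
# Minimal Turing machine sketch: a head moves along a symbol string,
# rewriting symbols according to a fixed rule table. Illustrative only;
# states, symbols, and rules here are arbitrary.

def run_turing_machine(tape, rules, state="S0", head=0, max_steps=100):
    tape = list(tape)
    for _ in range(max_steps):
        if state == "HALT":
            break
        symbol = tape[head]
        # Each rule maps (state, symbol) -> (new state, symbol to write, move)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
        if head < 0 or head >= len(tape):
            break
    return "".join(tape)

# Toy rule set: complement a binary string, then halt at the end marker.
rules = {
    ("S0", "0"): ("S0", "1", "R"),
    ("S0", "1"): ("S0", "0", "R"),
    ("S0", "$"): ("HALT", "$", "R"),
}

print(run_turing_machine("0110$", rules))  # -> "1001$"
```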

Despite key differences between biological systems and devices such as Turing machines, the machine-like functionality of life’s structural components invites the possibility that prebiotic chemical synthesis was made possible by automaton behavior that emerged at the interface between chemistry and physics. The functional requirements imposed by generalized automaton system classes may thus serve as a starting point for assessing whether automata predecessors would have been possible under prebiotic circumstances.

The specific instantiation of a complex automaton that can read, write, and store information is unconstrained; it may consist of molecules, transistors, vacuum tubes, springs, or even colliding objects [42–44]. Similarly, the substance and configuration of a memory element may consist of switches, valves, beads in boxes, genetic sequences, or etched silicon; the only requirement is that the memory element be an arrangement of stable or periodic states [42, 45]. Analogous to lines of inquiry conducted in the physical and chemical disciplines, one may inquire about intermediate states of matter capable of behaving like automata with different capacities, specifically:
(i) Prior to the development of informational polymers, did prebiotic chemical systems exhibit behaviors or network patterns that may be approximated by automata formalisms?
(ii) What conditions or configurations of the environment may promote the development of memory elements, and what form(s) could such elements assume?
(iii) At what stage of prebiotic chemical evolution did substances of any object level acquire attributes functionally equivalent to complex automata capable of chemical computation?

The design of modern digital computers may prove instructive in further breaking down these questions. Computers and living systems are not analogous in function, but the common attributes of their constituent automata can reveal underlying trends that should apply equally to both biological and digital systems [45]. Digital computers are built upon primitive functions involving the reading, manipulation, storage, and output of binary digits (1s and 0s) [46]. High level functions such as conducting complex mathematical operations, running programs, or typing text documents are carried out by abstracting groups of these primitive functions into more and more sophisticated algorithms within a hierarchy of nested object levels. The hierarchical compounding of instruction sets in modern computers includes at least 16 levels of operation (Table 1) [47, 48]. The cumulative effects of so many levels of abstraction are so sophisticated that most computer programming can be done in natural languages or actions comparable to common human communication, to the extent that most users complete tasks in ignorance of the underlying programming languages or specific designs of computing devices.

Living systems, on the other hand, function and behave in ways that are quite distinct from digital computers. The basic unit of life, the cell, is itself a highly complex organic machine capable of high-fidelity replication powered by uptake of nutrients from its environment [49]. The components of the cell, namely, informational polymers such as DNA and RNA and metabolic polymers such as enzymes, are part of an elaborate network of mutually recognizable polymers that are concentrated and contained within the cell membrane [41]. These polymeric components, and their energetic substrates, move via diffusion throughout the cell volume, bringing them into physical contact with one another with sufficient frequency that cell upkeep and growth can be maintained throughout a cell’s lifespan. Multiple copies of the most active polymers are constructed in a cell to ensure frequency of contact and robustness of the cellular metabolic network [50]. A flow of energy through the cell is enforced by specialized and complementary docking surfaces on these polymer types that ensure that contact between polymers is highly coordinated and specific (Figure 1). Living organisms do not compute binary digits but rather bind to and process specific nutrients, excrete waste, and exhibit complex behavior individually and among cellular groups in relation to analog chemical signals, digital readings of genetic polymers, internal feedback loops of interacting compounds, and stimuli from the surrounding environment [51]. Making sense of the cell as an ensemble of automata requires a comprehensive, system-level approach to characterize the interactions and control networks that regulate and drive cellular behavior.

Living systems are built upon physical and chemical interactions arranged in such a way as to exhibit many (if not all) of the functional components associated with the most complex classes of automata: memory elements (genetic sequences, enzyme sequences, or, at a systems level, the entire cell itself), reading devices (the ribosome reads RNA; enzymes “read” substrates and nutrients in the cell), and distinct states (activation and deactivation of enzymes or metabolic intermediates, or the location of the cell itself as a state of local nutrient availability or environmental habitability). This distilled perspective maps the real-world domain into a mechanical domain of automata formalisms [52] which may be used to synthesize a new reconstructive approach to studying life’s origins: was prebiotic chemical complexification also a process of automata complexification?

As a specific example, consider the requirement that the most complex classes of automata, including those that resemble the cell, make use of memory elements. There is no requirement that memory elements take any particular physical or chemical form. At the subcellular level, memory elements correspond to encoded genes. At levels immediately above the cell, it is the cell itself and its system-level status as persisting, reproducing, dying, dead, or translocating in relation to other cells around it that may be considered as a functional memory element. The capacity for memory elements to retain specific information about past, present, and future states of a system is required to manifest stable local structures.

An assessment of memory elements may be extended to conditions that immediately preceded the emergence of the cell. It is highly improbable that the very first abiotically produced nucleic acid and peptide sequences were capable of catalyzing molecular reactions, if only because of the low probability that any randomly generated peptide sequence has such properties [53]. It is even less likely that the first generated sequences were capable of recognizing and coordinating with one another in the diffusion-limited polymer-polymer interaction networks that form the basis for extant cell functionality and individuation [54]. This imposes a functional requirement that pregenetic memory elements, in some form, must have preserved asynchronously produced polymer sequences in close proximity to one another. Sequences must have been preserved over sufficient time that interaction networks could arise from within a subset of many combinations of abiotically produced polymer populations. The sequences located within a memory element reflect information about the spatial and temporal patterns of physicochemical states and environmental conditions within that region of the larger system. This information did not have a biological context until polymer-polymer interaction networks based on mutual polymer recognition produced an individuated, higher object level entity. A general requirement that pregenetic memory elements facilitate diffusion-limited interactions between polymeric compounds narrows the range of settings that can be reasonably expected to connect LUCA with nonliving systems.

Complex behaviors of automata are often manifested across different object levels or scales of observation. Reconstructing the potential for prebiotic automata first requires enumeration of specific subcellular object levels that span energy inputs and outputs.

3. Subsumed Complexity and Subcellular Object Level Recognition

Life is a far-from-equilibrium configuration of different classes of molecules. Its complexity and continuity are a result of polymeric components coming into frequent contact with one another via diffusion within an enclosed cellular environment. The origins of life may be considered as a specific problem involving a much broader question: what is the origin of complexity [55] and does complexity represent a quantity that can be physically measured, compared, or estimated?

Subsumed complexity is the idea that the first living cell, as a complex arrangement of atoms and molecules, was preceded by and derived from an arrangement of matter and energy that was at least as complex as the cell itself. By extension, the process by which life emerged cannot be characterized as a gradual increase in complexity; rather, there was a preexisting background of physical complexity, within which chemical complexity became gradually subsumed until life emerged [22]. The idea is built upon the concepts of “complexity as thermodynamic depth” [56] and the relative entropy of different forms of input and output energy [57].

Thermodynamic depth is defined as the difference between the macroscale ($S_{\text{macro}}$) and microscale ($S_{\text{micro}}$) entropy for a system:

$$\mathcal{D} = S_{\text{macro}} - S_{\text{micro}}.$$

In this analysis, the complexity of a system corresponds to the amount of entropy that has to be produced to generate that object. This definition is versatile (it may be applied to many different physical and chemical systems, at many different scales of observation) and instructive (it is an additive property, showing that copy processes at any scale have almost no thermodynamic depth beyond the depth of the original process that formed the first copy). One of its main limitations is that its classification of “macroscale” and “microscale” measurements is arbitrary, and it is not predictive of how complexity will be manifested in a system with great thermodynamic depth. In other words, all of the descriptive information of a system is reduced to the difference between two numbers across some subjective threshold of measurement. One may infer that a system is likely to be more complex than another using the analysis, but the underlying mechanisms that make a system more complex than another must be enumerated and described separately.

The thermodynamic depth of an energy source can be approximated using a calculation of the entropy production of that energy within a closed system. An approximate calculation for the entropy production in a system with conserved mass [57] provides a means of estimating the difference in microscale energy input and macroscale energy output (i.e., contact with a thermal sink):

$$\Delta S \approx k_{B}\,(n_{\text{out}} - n_{\text{in}}),$$

where $n_{\text{in}}$ and $n_{\text{out}}$ are the numbers of input and output energy quanta. By maximizing the ratio of the number of output quanta produced per number of input quanta, the microscale entropy ($S_{\text{micro}}$) of a system may be minimized, thus maximizing the thermodynamic depth of a closed system in contact with a thermal sink at a fixed temperature. This approximation provides a physical basis for inferring that systems driven by few but highly energetic input particles such as gamma rays are inherently more complex than systems driven by other forms of energy such as infrared or visible light, even if the total amount of energy flowing into and out of the system is the same in both cases.
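A rough numerical illustration of this quanta-counting argument follows; it assumes the approximation above, an entropy of order $k_B$ per emitted thermal quantum, and illustrative values for the energy budget and photon energies.

```python
# Rough illustration of the quanta-counting argument: equal total energy
# delivered as a few gamma photons versus many infrared photons, then
# re-emitted as thermal photons at a 300 K sink. Entropy production is
# approximated (to order k_B per quantum) as k_B * (n_out - n_in).

K_B = 1.380649e-23    # J/K
EV = 1.602176634e-19  # J per electron volt

total_energy = 1.0e-10  # J delivered to the system (arbitrary)
sink_temp = 300.0       # K
thermal_quantum = 2.7 * K_B * sink_temp  # mean thermal photon energy (approx.)

n_out = total_energy / thermal_quantum   # output quanta at the sink

for label, quantum_ev in [("gamma, 1 MeV", 1.0e6), ("infrared, 0.1 eV", 0.1)]:
    n_in = total_energy / (quantum_ev * EV)
    delta_s = K_B * (n_out - n_in)
    print(f"{label}: n_in = {n_in:.3e}, n_out = {n_out:.3e}, "
          f"entropy produced ~ {delta_s:.3e} J/K")
```

For the same energy budget, the gamma-driven case begins with roughly seven orders of magnitude fewer input quanta and therefore produces substantially more entropy on this estimate, consistent with the inference above.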

It follows that, instead of searching for life’s origins among chemical systems that produce key prebiotic molecules of interest (Figure 2(a)), one may instead search for generalized physical conditions in which complexity is inherently high (Figure 2(b)). One may then investigate whether the molecules used in cellular polymers can be plausibly produced within these systems when they are endowed with an initial composition of common materials. The objective is to seek out systems that maximize the number of object levels and therefore maximize the number of energy exchange rules in operation across these object levels. This should maximize the sophistication of energy dissipation pathways while providing suitable conditions for automaton behavior to arise across different object level thresholds. Another objective is to eliminate the influx and outflux of mass, maximizing entropy production per number of massive particles in a system [56, 58]. This also increases the probability that larger, longer-lived molecular structures can form and increase in concentration over time in the system without being either diluted by influx or washed out of the system.

Powerful subatomic radiation such as gamma rays maximizes power input normalized per number of input particles. The most notable effect of using this energy source is that powerful photons enable the perturbation of more finely resolved structures that make up molecules and atoms (Figure 3). The rules that a system follows in relaxing back to a ground state differ depending on the scales at which those structures are being perturbed.

By perturbing more finely resolved structures, the use of more powerful input particles increases the number of descriptive states that are required to approximate the full range of energetic responses of molecules and atoms. Under moderate physical conditions, coarse macroscopic descriptions of molecules and atoms are sufficient to describe most induced perturbations of the system. At more extreme conditions, the outermost valence electrons begin to dissociate and rearrange, emitting light commensurate with photoelectron spectra in the visible to UV portions of the spectrum (or, conversely, when species combust, these are the portions of the spectrum at which light is emitted). The perturbation of inner atomic electrons requires even more energy, typically in the X-ray portion of the spectrum; the characteristic spectra of X-ray emission are intimately associated with identifying specific elements through X-ray spectroscopy. Probing the scale of nuclei requires highly energetic particles such as gamma rays, neutrons, or highly accelerated electrons or atoms that can collide with and transform particles at the very short distances, proximal to the nucleus, that fall under the influence of the strong nuclear force.

The overall picture is that energy in any natural system dissipates according to a natural hierarchy, one in which powerful forms of energy are attenuated by fine atomic and molecular objects as they are converted to greater numbers of less powerful particles. Simultaneously, these greater numbers of less powerful particles are unable to perturb lower level reactions, stratifying the hierarchy and effectively hiding the processes of energy exchange followed at these lower levels as the system equilibrates with a thermal sink. The result is that a system with a given number of massive particles, driven by a given quantity of energy, exhibits a broader and more complex range of behaviors if energy input is constrained to follow “rules” of energy flow through lower level objects. The rules of energy flow are effectively set by the hierarchical arrangement of subatomic, atomic, and molecular components within the system.

The structure of this hierarchy and the relationship of these object levels to one another share some attributes with the hierarchy of digital computers. Computers are constructed to translate basal physical processes into such a high level of abstraction that we are barely aware that we are manipulating the precise placement of small numbers of electrons, or that the circuits themselves are composed of ever-smaller numbers of semiconducting atoms. This level of abstraction is made possible by the high number of intervening object levels which process and repackage signals over the entire architecture of the system [42]. Atomic scale processes are critical to achieving sufficient signal-to-noise performance so that the mechanisms function reliably during every process cycle [49]. This degree of control is made possible because atoms and groups of atoms statistically function in the same way, according to very specific and highly stratified rules, across the lowermost levels of interaction.

Viewing a generalized prebiotic system through a physical lens also demonstrates that these object levels, and by association the rules that govern interaction at any given object level, are themselves stratified in a very specific order: energy flow in natural physicochemical systems tends to proceed from the bottom levels to the top levels through the process of energy dissipation until the temperature of the thermal sink is reached. Interactions at lower object levels involve energy exchanges and transformations on much shorter physical and temporal scales. As a unit of input energy is attenuated and filtered higher into the hierarchy, the energy initially carried by few particles becomes distributed and exchanged between larger numbers of particles. Entropy is manifested as a decreasing likelihood that any single unit of input energy can ever be found in any single state space or particle, as a chain of events is set into motion that affects ever-greater numbers of particles in the system.
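The following minimal sketch caricatures this cascade: a single energetic quantum is repeatedly subdivided as it crosses object levels until the per-quantum energy approaches the thermal sink scale. The branching factor per level is an arbitrary assumption; the point is only that particle counts grow geometrically as per-particle energy falls.

```python
# Sketch of the dissipative cascade: a single energetic quantum is
# repeatedly subdivided as it crosses object levels, until the per-quantum
# energy approaches the thermal sink scale (~0.025 eV at 300 K). The
# branching factor per level is an arbitrary illustrative assumption.

THERMAL_EV = 0.025  # approximate thermal energy at ~300 K, in eV
BRANCH = 10         # assumed number of lower-energy quanta per exchange

energy_ev = 1.0e6   # one 1 MeV input quantum
n_quanta = 1
level = 0
while energy_ev > THERMAL_EV:
    energy_ev /= BRANCH  # each exchange divides energy among more particles
    n_quanta *= BRANCH
    level += 1
    print(f"level {level}: {n_quanta:.0e} quanta at ~{energy_ev:.3g} eV each")
```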

Another important aspect of using subatomic energy to drive system behavior is that the resulting spatial and temporal distributions of the attenuated and scattered energy particles (i.e., Bremsstrahlung X-rays, secondary electrons, UV photoelectrons, Cherenkov UV photons, etc.) carry information about the composition and state of the target atoms and molecules. This information is typically considered from a laboratory or analytical perspective with little bearing on driving the chemical evolution of a natural system. However, the physical forms of these pieces of information also increase the number of rules that describe how the system can be reconfigured as energy moves through higher object levels. This represents a vast and underexplored source of physicochemical sophistication that can drive chemical evolution of abiotic systems as energy is attenuated across different object level thresholds. By utilizing the rules imposed on energy transfer by lower object level interactions, one can obtain greatly increased capacity to output unlikely, far-from-equilibrium configurations of molecules at higher object levels. The question becomes, do any of these far-from-equilibrium configurations resemble those that seem to be required for the production of key biological molecules?

Discerning object levels associated with different energy inputs and outputs enables one to estimate where and how transformation rules at a given level should be capable of affecting energy distribution at higher levels. One possible arrangement of object levels relevant to reconstructing life’s origins is outlined in Table 2. The lowermost object levels are set by the structural hierarchy of subatomic, atomic, and molecular particles and are therefore more rigidly defined. The middle and uppermost object levels are less rigidly defined; the molecular composition of a system driven by perturbation of lowermost objects changes over time, with the result that objects at these levels may not be present, or may exert different effects on the overall system, at different points in time as the system develops.

It would be a mistake to directly equate the physical and chemical hierarchical levels outlined in Table 2 with the levels of abstraction that make up the operating systems of digital computers; there is no direct correspondence between these different systems. Nevertheless, the architectural attributes of the relationships of these object levels to one another and the different formalisms that succinctly relate groups of objects within each level have properties that resemble abstracted computer hierarchy formalisms. Hierarchically nested systems start from a small set of elementary components from which, layer by layer, more complex entities may be constructed [52]. Shared attributes between these different hierarchically arranged systems include the following (attributes (iii) and (iv) are illustrated in the sketch that follows the list):
(i) An operating system is composed of a hierarchically organized control program for selecting and allocating resources among different tasks within a computing system. Chemical compounds are selectable components with a hierarchically organized means of allocating the flow, attenuation, and retention of energy among different objects in a system.
(ii) The lowermost object levels of both systems generate an array of statistically predictable responses of atoms and molecules to energy inputs.
(iii) Each level builds on the levels below but also hides all of the internal details of its operations from the levels above.
(iv) A function describing activity at a given level has access only to functions defined at lower levels.
(v) Stratification ensures that higher level objects can only emerge sequentially from functions carried out at lower levels. This imparts stability on the entire system.
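The sketch below renders attributes (iii) and (iv) in code form: each layer exposes a single function and calls only the layer immediately below it, so a top-level caller never handles lower-level intermediates. The layer names are loose placeholders for the object levels discussed above, not a chemical model.

```python
# Toy illustration of "information hiding" across levels: each layer
# exposes one function and calls only the layer directly below it.

def subatomic_exchange(energy):
    """Lowest level: radiolytic event yielding reactive intermediates."""
    return {"radicals": int(energy / 10)}

def molecular_level(energy):
    """Sees only the output of the level below, not how it was produced."""
    products = subatomic_exchange(energy)
    return {"small_molecules": products["radicals"] // 2}

def polymer_level(energy):
    """Likewise builds only on molecular-level outputs."""
    pool = molecular_level(energy)
    return {"oligomers": pool["small_molecules"] // 5}

# A caller at the top level never handles radicals directly:
print(polymer_level(1000))  # -> {'oligomers': 10}
```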

A distillation of hierarchically nested interactive objects found within any prebiotic physicochemical system would seem to be a critical prerequisite to reconstructing life’s origins as diffusion-constrained instantiations of automata. Without understanding the formalized groups of objects distinguishable by physical and chemical thresholds, the way that entropy production stratifies the thresholds into a hierarchy, or the full scope of intralevel interactivity where emergent behavior begins to manifest, it is impossible to recognize the conditions under which automaton-like behavior can originate.

There is no a priori reason to infer that life must originate from, or be driven by, energy input at any particular hierarchical level. However, simple rules formalized at any single level can give rise to emergent complexity at higher levels separated by structural, spatial, or temporal thresholds. On this basis, it is reasonable to infer that selecting systems with the highest possible number of object levels between energy input and output should maximize the possible complexity of the system. The reason is quite simple: each additional hierarchical level opens a capacity for sophistication through the imposition of new system rules at that level and another potential degree of nestedness that can be organized using objects drawn from the level immediately below. As each level imposes its own unique set of exchange rules on the overall system, greater sophistication should be afforded between system input and system output. Greater sophistication affords fewer external constraints that need be applied to reach a desired output state. By extension, the only external constraints that need to be applied should be the initial composition of atomic and molecular components, the energy input, and the thermal sink temperature.

4. Lower Object Level Threshold Effects on High Level Chemistry

The metabolic basis of all life is the structural attenuation and channelization of ambient energy (chemical or photonic) into the coordinated production of biomass. This coordination has become highly specialized over billions of years of evolution. Prior to the development of encoded genes and decryption protocols that compose the Central Dogma, it is unclear whether or how this level of sophistication and specificity would be expressed in a prebiotic physicochemical system.

Origins of life research as a chemical discipline has been carried out as a search for chemical reaction pathways and conditions that preceded highly specialized biological synthesis pathways. Experiments typically consist of different reactive species that correspond to hierarchical level 3, conducted within a solvent medium such as liquid water [59, 60]. This general experiment architecture is part of a broad-ranging, systematic search for the production of key components of biology such as amino acids [61–63], nucleic acids [5, 33, 64, 65], cell membrane phospholipids [66, 67], and reactive metabolic intermediates that make up the tricarboxylic acid cycle [68, 69] or the gluconeogenesis pathway [70]. These studies have been instrumental in uncovering reactions that produce biological molecules using reactants and conditions that would have been available on the early Earth or in the early solar system with the shortest possible list of process specifications. However, there have been no reports of a prebiotic chemical system that possesses the key hierarchical attributes exhibited by life at the level of the cell and higher.

Driving energy into a system at lower object level thresholds has the potential to impart novel chemical synthesis functionality that relaxes the need to impose external manipulations to produce molecules of interest. The most general output of irradiating a homogeneous liquid mixture is the rearrangement of atoms and molecules to produce a mix of oxidizing and reducing compounds across a wide range of redox states (Figure 4(a)). As a specific example, consider the irradiation of a simple solvent molecule like water (see Figure 4).

A water molecule can respond in a number of different ways (Figure 4(b)), but most of these responses involve entering an excited state (H2O*) or the ejection of an electron (e−), leaving behind a water radical cation (H2O•+) [71]. This process occurs very rapidly, on the order of 10⁻¹⁶ seconds, much shorter than the intervals between typical molecule-molecule collisions. These first-generation products interact with one another or, more likely, with other surrounding water molecules to produce a second generation of products such as H• or •OH radicals or a solvated electron (e−(aq)), which is an electron that is thermalized and surrounded by a loose cage of water molecules. These second-generation products interact with one another and other water molecules on longer time scales to produce an array of third-generation products such as hydrogen peroxide, hydrogen or oxygen gases, OH− and H+ ions, or more solvated electrons. From the initial state to the final state, the processes are essentially unidirectional and occur within a very small radius. All of these reactions are so rapid and so far from equilibrium that they are essentially instantaneous step changes in the states of those atoms and molecules; they occur so rapidly and to such a localized number of atoms and molecules that the overall process is nearly isothermal. In a heterogeneous liquid mixture, the final products persist and diffuse far enough away from the point of formation that they interact with other molecular species; the efficiency of this process depends in part on the energy transfer characteristics of the type of irradiation. In total, this multistep process traverses the subatomic, atomic, and intramolecular bond levels, and the products can then interact with other molecules to form new covalent bonds.
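The generational bookkeeping described above can be summarized schematically as follows; the species groupings and timescales are approximate and given for orientation only.

```python
# Schematic bookkeeping of the water radiolysis generations described in
# the text (timescales and groupings are approximate and illustrative).

RADIOLYSIS_GENERATIONS = {
    # ~1e-16 s: primary ionization/excitation events
    1: ["H2O*", "H2O+", "e-"],
    # ~1e-12 s: reactions with surrounding water molecules
    2: ["H", "OH", "e-(aq)"],
    # ~1e-6 s and longer: diffusing molecular products
    3: ["H2O2", "H2", "O2", "OH-", "H+", "e-(aq)"],
}

for gen, species in RADIOLYSIS_GENERATIONS.items():
    print(f"generation {gen}: {', '.join(species)}")
```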

Energy exchanges that occur at these lowermost levels are fundamentally different from those of higher levels. Subatomic and atomic perturbations are approximately unidirectional processes and are not specific to any particular atomic or molecular species. Unidirectionality arises because energy attenuation occurs on time scales that are much shorter (10⁻¹⁶ s) than intermolecular bombardment times (10⁻¹⁰ s) and the attenuation objects (subatomic particles such as electrons) are much smaller than the characteristic sizes of molecules. Overall charge balance in the system remains essentially constant. This has the effect that the fragmentation of molecules into a higher number of excited or reactive objects in a single high energy event is eventually reversed, recombining back to approximately the same number of starting molecules. This lack of specificity regarding reactions between atomic or molecular fragments, an effect at least partly attributed to the nature of the highly reactive radical intermediates formed, causes these exchange processes to resemble primitive, generalized functions or operations rather than equilibrium chemical reactions. Continued irradiation generates a first generation of products, and as these products accumulate in concentration, the same generalized functions that created the first generation begin to transform a small number of these initial products into a new array of second-generation products. Selectivity of product formation can begin to arise at higher object levels through different rates of reaction between different reactive intermediates and compounds that begin to accumulate over time. Such primitive operations are therefore likely to result in far-from-equilibrium outputs throughout all hierarchical levels of the system, particularly as the energy stored in far-from-equilibrium species at lower levels drives reorganization of chemical structures to create higher object levels. The functional result is that each level builds on top of the levels below, but the internal details of operations at the base levels are effectively hidden from the top levels.

This simple array of lower object level reactions drives the system at higher levels in multiple ways. First, it is not necessary to introduce implausibly high concentrations of reactive compounds to drive the system; the solvent itself is driven to produce highly reducing (e.g., H2 and e−(aq)) and oxidizing (e.g., O2, H2O2, and •OH) species, which creates high chemical potential throughout the solvent. Second, the solvated electron and hydroxyl radical are effective drivers of reducing and oxidizing reactions, respectively, among organic species in a system. The e−(aq) and •OH can act sequentially and quickly on organic compounds composed of many different atomic species and masses, without requiring the introduction of new material to the system [72, 73]. This means that a system that is closed in terms of mass flow, but open in terms of energy flow (powerful subatomic particles in, thermal sink photons out), can be driven far from equilibrium within a short period of time. The action of e−(aq) and •OH is both constructive and destructive of objects found at higher object levels. All of these factors taken together mean that entropy production per molecule can be very high over both short and long time scales without leading the system to a chemically unreactive dead end.

A critical question from the perspective of life’s origins remains: can driving a system in this manner help to align abiotic and biotic chemical synthesis pathways? Compounds derived from the radiolysis of water, combined with the interactions afforded by the solvated electron and hydroxyl radical, drive an array of irreversible reactions involving simple organic and inorganic compounds that produce key biological molecules. The following paragraphs give a few examples of some of the reactions that can take place in geologic settings that combine a source of powerful radiation (uraninite or other radioactive minerals); common molecular species found at the Earth’s surface such as water, dissolved salt, carbon dioxide, and nitrogen gas; apatite as a potential source of phosphorus for nucleotide sequences and energy transduction molecules; and pyrite as a source of the iron-sulfur clusters found in electron transport chains and other key energy transduction metalloproteins [74, 75].

Gamma rays and X-rays can cause inert gases such as N2 and CO2 to fragment such that N and C may recombine with atoms derived from water, leading to compounds like NH3 and HCN [76, 77]. Bielski and Allen showed that gamma radiolysis of aqueous potassium cyanide (KCN) generates formaldehyde (CH2O) through a mechanism possibly involving HCN reduction by the solvated electrons produced in situ [78]. In the same mixture, the hydroxyl radicals (•OH) formed attacked cyanide anions, leading to formamide (FA) and cyanate. The authors also observed glycine and cyanamide (H2NCN) as products of radiolysis. It is insightful to mention that the Sutherland group has demonstrated that solvated electrons generated photochemically by UV irradiation of copper cyanide complexes in the presence of excess cyanide can initiate a Kiliani-Fischer type homologation mechanism for the synthesis of simple sugars, like glycolaldehyde (GA) and glyceraldehyde (GLA) [79], providing an alternative to the formose reaction commonly invoked in prebiotic scenarios [80]. Draganić and coworkers reported their observation of glycolaldehyde, ribose, and glucose arising from radiolysis of aqueous acetonitrile solutions [81]. It is tantalizing to imagine that ionizing radiation like gamma rays may be able to drive a Kiliani-Fischer homologation mechanism [82]; however, such a radiolytic mechanism for the synthesis of simple sugars has yet to be explicitly demonstrated to the best of our knowledge.

The presence of iron pyrite [83] (FeS2) and apatite [84] (Ca10(PO4)6(OH,F,Cl)2) in radioactive deposits opens a pathway to release abundant phosphoric acid. FeS2 acts as a sink for oxidizing compounds such as O2, releasing abundant ferrous iron (Fe2+) and sulfuric acid (H2SO4) and imparting a redox and acid asymmetry to the system [85]. Sulfuric acid can then degrade apatite, releasing soluble phosphoric acid (H3PO4) and precipitating solid-phase gypsum (CaSO4·2H2O) from the solution [86]. The net result of coupling between apatite and pyrite under radiolytic conditions is localized release of dissolved phosphate and degradation of pyrite, which may enable the incorporation of pyrite-derived iron-sulfur clusters as part of protometabolic polypeptides.

Alpha radiolysis of aqueous solutions of common salt (NaCl) has been shown to lead to the formation of sodium hypochlorite (NaOCl) [87]. The reaction of hypochlorous acid (HOCl) with the cyanide anion (CN−) is known to produce cyanogen chloride (ClCN) very rapidly, characterized by a second-order rate constant of 1.22 × 10⁹ M⁻¹ s⁻¹ [88]. ClCN reacts with imidazole to yield diimidazole imine (Im2CNH), an activating agent for ribonucleotides that yields the corresponding phosphorimidazolides [89]; such substrates have been extensively employed in the study of nonenzymatic template-directed synthesis of RNA [90]. This activation chemistry has been shown to occur in one pot, by slowly adding NaCN and NaOCl to a solution containing imidazole and one of the four canonical 5′-ribonucleoside monophosphates. ClCN mixed with ammonia (NH3) is known to yield cyanamide (H2NCN) [91], as does UV and electron radiolysis of aqueous solutions of ammonium cyanide (NH4CN) [92]. The Sutherland group has shown that H2NCN and GA will undergo a cyclization reaction, catalyzed by inorganic phosphate acting as a general base, to form 2-aminooxazole (2NH2Ox) [93]. This five-membered ring will undergo another cyclization reaction with GLA to furnish a mixture of 2-aminooxazoline stereoisomers, including the arabinofuranosyl-aminooxazoline, a key intermediate in the potentially prebiotic ribonucleotide synthetic pathways demonstrated by both the Sutherland and Powner groups [64, 94]. Furthermore, the Szostak group has shown that 2-aminoimidazole (2NH2Im), a potent leaving group in the context of activation chemistry for nonenzymatic RNA replication [95], and 2NH2Ox are related products of the same reaction network, involving H2NCN, GA, and NH3, in which lower pH and higher NH3 concentrations favor greater 2NH2Im production [96]. We speculate that aqueous mixtures of NH3, HCN, and NaCl, when exposed to ionizing radiation, have the potential to form H2NCN, GA, and GLA and thereby higher level products such as 2NH2Ox, 2NH2Im, and arabinofuranosyl-aminooxazoline through a common reaction network taking place in a single mixture. We stress, however, that, to the best of our knowledge, such a reaction network has yet to be reported.
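As a back-of-envelope check on how fast the HOCl + CN− step proceeds, the cited second-order rate constant implies submillisecond cyanide half-lives even at micromolar hypochlorous acid; the concentration chosen below is an illustrative assumption.

```python
import math

# Back-of-envelope use of the cited second-order rate constant for
# HOCl + CN- -> ClCN. With HOCl held in excess, cyanide decays with a
# pseudo-first-order rate constant k_obs = k2 * [HOCl].

k2 = 1.22e9    # M^-1 s^-1, from the text
hocl = 1.0e-6  # M, assumed (held in excess)

k_obs = k2 * hocl              # pseudo-first-order constant, s^-1
half_life = math.log(2) / k_obs
print(f"CN- half-life ~ {half_life:.2e} s at {hocl:.0e} M HOCl")
# -> ~5.7e-4 s: ClCN formation is effectively instantaneous on geologic scales
```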

The resulting arrays of potential reactions are diverse, and the compounds that serve as initial reactants (H2O, CO2, N2, NaCl, pyrite, and apatite) are all plausibly found on an abiotic Earth and require no preparation or treatment prior to irradiation. One significant question regarding the chemical plausibility of the entire system is whether minerals containing radioactive elements such as U or Th would have been present and concentrated within a newly formed planetary crust. There are differing opinions about the likelihood that the early Earth’s crust was highly differentiated or closer in composition to the primitive mantle [97–99]. The only requirement seems to be that there were at least some areas of the Earth’s surface, even relatively small areas, with felsic rock types such as granites that concentrated uranium minerals within a few hundred million years of Earth’s formation [100–102]. These rocks would then have been weathered and reworked just as they have been throughout most of Earth’s geologic history. This possibility opens up multilevel physicochemical configurations and interactions that could have enabled emergent automaton-like behaviors.

5. Discussion

5.1. Geochemical Automata across Object Levels

Automata are rarely discussed in a geologic context. Most geologic events are reducible to physical (sedimentation rates, material transport properties, magma heating and cooling processes, mantle convection cells, etc.) or chemical (mineral crystal formation from magma melts, ion dissolution and transport, mineral alteration under heat and pressure, etc.) analytical approaches. There are, however, phenomena exhibiting feedback or iterative network behavior best approximated by automata-like descriptions. Cellular automata in particular are used to great effect for discretized macroscale systems for which state or phase changes are highly contingent upon localized interactions, such as subsurface flow through a lattice network [103], mineral recrystallization [104], solute transport and mineral dissolution [105], or seismic wave propagation [106]. Investigators have previously acknowledged that automata may best approximate critical prebiotic reaction networks [107, 108], but such observations were not developed in sufficient detail to yield specific physical or chemical hypotheses.
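A minimal example of the cellular-automaton style of description invoked in these studies is sketched below: a one-dimensional lattice of mineral cells that dissolve probabilistically when adjacent to fluid-filled cells, so that state changes are contingent on localized interactions. The rule and probability are illustrative and are not taken from the cited works.

```python
import random

# Minimal cellular-automaton sketch of the kind cited above for geologic
# systems: a 1D lattice of mineral cells that dissolve with some
# probability when adjacent to already-dissolved (fluid) cells.

random.seed(1)
N, P_DISSOLVE, STEPS = 40, 0.5, 10
lattice = ["solid"] * N
lattice[0] = "fluid"  # fluid enters at one boundary

for _ in range(STEPS):
    nxt = list(lattice)
    for i, cell in enumerate(lattice):
        neighbors = lattice[max(i - 1, 0):i + 2]
        if cell == "solid" and "fluid" in neighbors:
            if random.random() < P_DISSOLVE:
                nxt[i] = "fluid"
    lattice = nxt
    print("".join("." if c == "fluid" else "#" for c in lattice))
```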

Many mechanisms or external conditions invoked in prebiotic scenarios are presumed to function like automata even when this is not explicitly stated. There is little in geology that remains constant for long. Temperature fluctuations are forecastable within envelopes but are not precisely repeatable or predictable. A constant-temperature setting, or even a discrete series of changing temperature phases of variable duration, can only be a result of either complete chance for a short period of time or an emergent mechanism that can sense, and precisely respond to, stochastic external conditions over long periods of time. Bringing multiple, differing, and often conflicting steps and external conditional constraints of prebiotic chemistry into alignment remains an ongoing challenge for origins research. As a result, evaluation of prebiotic plausibility typically includes at least a cursory assessment of how broadly permissible a range of conditions may be for a studied reaction or process. If the range is narrow, an assessment will describe environmental settings which include at least one selected variation (among many possible variations) of conditions that adequately conform to a required laboratory regimen.

Enumeration of objects and object level thresholds is a prerequisite to identify automaton-like behaviors in a physicochemical system. Rather than seek a relatively narrow set of functional conditions among many possible stochastically variable natural combinations, or a special array of initial conditions, an alternative approach would be to seek out naturally occurring circumstances in which automata arise by virtue of intralevel emergent properties. Such automata may assist in driving the system through relatively restricted or improbable states that link abiotic and biotic configurations of matter. The remainder of this paper will describe two different kinds of multilevel interactions that resemble attributes or necessary components of automata that may be worthy of further study.

5.2. Automata 1: A Geochemical Thermostat with Repetitive Operation Algorithm

Radiolysis of the selected system is likely to produce intermediate compounds that can lead to oligomer formation under some circumstances, but without heating, the dominant chemical component of the system is likely to remain water. Water promotes the hydrolytic cleavage of the amide bonds in peptides [109] (especially at extremes of pH) and of RNA phosphodiester bonds [110] (particularly in the presence of divalent metal cations) over relatively short timescales and is therefore not an ideal medium for facilitating some of the molecular network reactions that may be required to reach interacting polymer object levels. Additionally, many of the reactions that link these intermediate compounds to the production of nucleotides work best at high compound concentrations and, in the specific case of nucleobase (NB) condensation from FA, in desiccated and heated liquid mixtures [111]. All of these compounds have a boiling temperature greater than that of water. Without a means of increasing the temperature of the system above the boiling temperature of water (but not so high that the compounds themselves are thermally degraded), concentrations of all of these compounds are likely to remain in the millimolar range or less. Cycles of heating + drying and cooling + wetting are often invoked as ideal circumstances for nonenzymatic polymerization [4, 112, 113]. The parameters of the required heating system are stringent in geological terms: a thermostat with a maximum temperature tuned just above the boiling temperature of water, an ambient temperature and pressure that permit liquid water, a duration of heating on the order of tens of minutes to hours that is followed by cooling to ambient conditions, an ability to periodically switch between heating and cooling states, prolonged operation of a repeating heating/cooling cycle over hundreds or thousands of years, and replenishment of reactive precursor molecules.

One means of meeting all of these stringent heating requirements is provided by water-moderated fissioning of uranium [101, 114] (Figure 5). On the early Earth, the fissionable isotope of uranium, 235U, made up in excess of 20% of all planetary U (today, it is less than 1%) [102]. One of the decay paths of 235U is via fission, which emits nuclear fission fragments and an average of just over 2 neutrons per decay event. If U-bearing deposits on the early Earth also contained abundant hydrogen-bearing compounds such as water or reduced organic compounds, the neutrons would have been quickly slowed through neutron-hydrogen collisions, moderating the neutrons and setting up conditions by which another fission event could occur. This continues a chain reaction of events that maintains the release of abundant, highly energetic subatomic particles. The resulting volumetric power density and total amount of energy released are many orders of magnitude above those of even the most energetic redox processes found at the Earth’s surface [115].

A mineral deposit with sufficient 235U and water or organic compounds functions according to a feedback system [116] that is best approximated as a class of automata known as a finite state machine. This machine approximation has two states, On and Off, and it meets the minimum requirement of a periodically operating system by toggling between these states. Neutrons are moderated in the On state, as each emitted neutron induces an average of at least one more fission event in a chain reaction. Each fission event releases approximately 200 mega-electron volts of energy, emitted as a mix of alpha, beta, gamma, and neutron radiation and fission recoil fragments, at flux rates powerful enough to quickly heat the reactor core.

The heating process continues until the temperature rises above the boiling temperature of the water (or that of the moderating fluid if it is composed of a mix of organic compounds), driving the water from the fissioning uranium-rich deposit as pressurized steam. With the loss of neutron-moderating, H-rich fluids, the reactor enters a quiescent Off state. The deposit cools and the moderating fluid condenses and flows back into the U-rich zone, resetting the cycle. The transition between On and Off states repeats itself until water is driven completely from the system and is unable to return, or the 235U fuel is expended, pushing the reactor below criticality thresholds. In the case of the well-documented natural fission zones at Oklo, Gabon, subsurface fission zones operated on cycles of approximately 30 minutes On and 150 minutes Off for over 500,000 years [117]. Power production longevity and density are afforded by the high density of the fission energy source itself. The overall system behavior is robust, and parameters such as peak temperature or duration of On-Off cycling can vary slightly depending on burial depth, permeability of the U-bearing deposit, distance from the fission zone, and the degree to which the water in the system is connected to a larger external reservoir. Fission zones formed within short distances of one another introduce another object level of molecular production, namely, convective exchange between adjacent zones of differing chemical composition or maximum temperature.
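The two-state behavior described above can be caricatured as a finite state machine in a few lines; the heating and cooling rates below are tuned only to reproduce the approximate 30-minute On and 150-minute Off Oklo cycle durations and are not a reactor model.

```python
# Two-state finite state machine sketch of the water-moderated fission
# thermostat. Timescales loosely follow the Oklo figures cited in the text
# (~30 min On, ~150 min Off); the temperature dynamics are caricatures.

T_AMBIENT, T_BOIL = 40.0, 100.0  # degrees C, illustrative
HEAT_RATE, COOL_RATE = 2.0, 0.4  # degrees C per minute, assumed

state, temp = "On", T_AMBIENT
for minute in range(600):
    if state == "On":
        temp += HEAT_RATE       # fission heating while water moderates
        if temp >= T_BOIL:
            state = "Off"       # steam expulsion removes the moderator
    else:
        temp -= COOL_RATE       # quiescent cooling; water seeps back
        if temp <= T_AMBIENT:
            state = "On"        # criticality resumes, cycle resets
    if minute % 30 == 0:
        print(f"t={minute:4d} min  state={state:3s}  T={temp:5.1f} C")
```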

A water-governed fission thermostat increases the robustness of synthesis pathways between lower and higher object levels of interest to life’s origins. Radiolysis is known to produce potential condensing agents such as cyanate and cyanamide, and heating cycles such as this supplement these reactions by driving dehydration/condensation reactions, removing water from the system on scales of minutes to hours. This evaporation of water would also increase the concentration of the reactive intermediate compounds while driving out other compounds with boiling temperatures below that of water, increasing the rates at which higher level molecular objects can form. The geologic duration of such features (>100,000 years) affords a great deal of time in an ensconced, near-surface environment to produce and accumulate key reactive molecules and to form, degrade, and reform many different combinations of polymer sequences using these molecules. An ensconced environment would also be shielded from larger-scale perturbations to the Earth’s surface such as intermittent meteorite impacts, solar flares, or climate variations that may hinder prebiotic system development. The rapid and highly localized heating would set up convective flows that carry the lower level object radiolysis products a few meters from the fission zone, beyond the penetration depth of most forms of subatomic radiation. Convective displacement of the precursor molecules for sugars, amino acids, and polymers increases the likelihood that these higher level object compounds will escape and diffuse from the fission zone into the surrounding rock matrix.

5.3. Automata 2: Hereditary Precellular Compartments—Rock Matrix Pore Spaces as Memory Elements

The most complex forms of automata are able to store information about past time states or functions across multiple time steps in memory elements. There are few formal constraints on the construction of such memory elements; they may be 1-, 2-, or 3-dimensional in arrangement, and they may be composed of nearly any spatially or temporally organized, discrete objects at any scale [45].

Life has optimized a universal polymer-based genetic system that encodes nearly all of the information required to replicate cells across generations. One of the most daunting challenges in the origin of life is simultaneously producing genetic monomers and polymers, polymeric polypeptides, and energy transduction molecules, at high concentrations and in such close proximity that they may all interact with one another at diffusion-limited rates. Noncellular compartments housing mixtures of lower level objects with persistent, hereditary chemical features (i.e., spanning time without becoming well mixed with surrounding volumes) are one means of maintaining high concentrations and localized chemical gradients. Hydrothermal vent chambers [26, 118], self-assembling lipid vesicles [119–121], rock pores [122], pyrite mineral surfaces [32], sediment pore spaces [116, 123], and other structures have been described as possible prebiotic compartments that preceded the emergence of LUCA.

Automata theory, combined with physical diffusion parameters, may provide a means of evaluating the likelihood that these different environments are capable of manifesting some form of complex automaton that includes memory elements that function in this way. It is unclear at what point or in what form memory elements related to the genetic code may have originated. However, the physics of diffusion are general enough that one may assess the constraints by which distinct groups of polymer-based memory elements may interact with one another. One approach would be to focus on the density, dimensionality, and diffusion-limited interconnectivity of compartments as memory elements.

The average diffusion distance as a function of time is defined similarly for 1D, 2D, and 3D systems: it is approximately √(qDτ), where q is proportional to the dimensionality of the system (2 for 1D, 4 for 2D, and 6 for 3D), D is the diffusion constant of a compound, which scales inversely with the compound’s molecular mass and physical size, and τ is the approximate amount of time that a compound may be allowed to diffuse in the system. For all systems, τ may be allowed to become large to connect distant memory elements, but this comes at the cost of reducing the total number of memory elements possible within a system of a given size and, more practically, of decreasing the effective concentration of those molecules in the system. Memory element density, which is related to the total capacity to retain information about past states within a defined characteristic distance, therefore involves a clear tradeoff between system dimensionality, compound size, and time step duration.
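This tradeoff can be made concrete with a minimal sketch of the √(qDτ) relation; the diffusion constants below are assumed, order-of-magnitude values for compounds in water, chosen for illustration rather than taken from this paper:

```python
# Minimal sketch of the mean diffusion distance relation x ~ sqrt(q * D * tau).
# q = 2, 4, or 6 for 1D, 2D, or 3D diffusion; D values are assumed,
# order-of-magnitude diffusion constants in water (m^2/s) for illustration.
import math

Q = {"1D": 2, "2D": 4, "3D": 6}
D = {
    "small molecule": 1e-9,
    "small protein": 1e-10,
    "polynucleotide": 2e-13,
}

def diffusion_distance(q: int, d: float, tau: float) -> float:
    """Mean diffusion distance (m) after tau seconds."""
    return math.sqrt(q * d * tau)

for name, d in D.items():
    x = diffusion_distance(Q["3D"], d, tau=3600.0)  # one hour of 3D diffusion
    print(f"{name}: ~{x * 1e3:.3f} mm per hour")
```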

1D diffusion-limited memory elements would take the form of discrete paths along which an object is subjected to an array of physical and chemical conditions. An example would be the flow paths taken by molecules through a hydrothermal vent chimney (Figure 6), disregarding for a moment the mixing that flow would impart on subdivided portions of the path. From a physical perspective, it is not clear whether memory elements of this configuration could function effectively as part of an automaton. Atoms and molecules flowing along these paths would have only a very short period of time to interact with any other groups of molecules before exiting the system. By definition, adjacent 1D paths can have little conceivable interaction with one another, meaning that the total informational content of each memory element (however it would be defined) would remain isolated from all others. This limitation would effectively impose the constraint that all prebiotic synthesis reactions must occur in the time between the entry of objects into the heart of the chimney and their exit at the oceanic interface. For these reasons, it is difficult to imagine the content or arrangement of characteristics within a 1D diffusion-limited memory element being constructive or cumulative over time.

A 2D diffusion-limited array of memory elements can form at any location where new, reactive material can be introduced at different points of a surface. A sediment-water interface, the mineral faces of pyrite crystals, or the rocky surfaces of exposed crustal rocks would all meet these requirements (Figure 7). These settings line a well-mixed basin that delivers new molecular species to the surface via basin-surface diffusion and via gravitational settling of products produced in the overlying column. Once brought into contact with the surface, reactive molecular species may diffuse outward to interact with other molecular species that have accumulated or been produced through other processes. A practical upper limit on the number of individual sources that may be found on a surface of this type would be set by the physical size of a source molecule and the rate at which new source molecules can be delivered to the surface, which would be correlated with the productivity of the overlying column.

The connectivity and diffusion characteristics of 3D compartments would seem to be best suited for memory elements. Such compartments have multiple axes of interaction with their nearest neighbors, which means that a diffusing compound can reach more adjacent compartments in a shorter period of time, increasing the density of memory elements (Figure 8). A 3D array of elements could also be reduced to 1D or 2D memory elements under certain conditions: inhibiting free flow could reduce a set of 3D elements to adjacent 1D or 2D elements, and phase changes such as evaporation could reduce 3D volumes to a network of 2D surfaces of residual compounds lining pore space walls.

As described in the section above, uranium-rich sediment deposits with water in interstitial void spaces on the early Earth have already demonstrated complex, emergent attributes typically associated with a finite state machine capable of toggling between On and Off states [117]. Void spaces in such deposits may also be predisposed to serving as a 3D diffusion-limited memory array. Many uranium-rich minerals originally derive from the granitic rocks that make up continental crust. Such rocks also include other mineral types central to origins of life research, such as pyrite, apatite, and rutile [111]. One particular kind of early Earth deposit that hosted uranium-rich minerals such as uraninite is known as a heavy mineral placer [116]. Placer deposits are composed of a high proportion of minerals that are dense and resistant to weathering. Placer minerals are sorted by hydrodynamic forces in environments with moving water, such as beaches, rivers, and creeks, often collecting in places where water flows fastest. Typical early Earth placer deposits could also include silica, monazite, ilmenite, magnetite, zircon, and garnet, with grain sizes typically corresponding to fine- to coarse-grained sandstones.

A heterogeneous mix of mineral grains, such as that found in placer sediments, is an advantageous setting for precellular compartments for several reasons. Sandstones readily allow the flow of fluids and the diffusion of dissolved compounds through interstitial pore channels. This ensures that zones of intense radiolytic energy flux or of key molecules are well connected to one another and that the entire volume contributes to the memory capacity of the overall multitiered energy dissipation system. A heterogeneous distribution of mineral grains means that some pores are lined with more source minerals for reactive intermediates, such as the phosphate found in apatite or monazite, while others are lined with catalytic minerals such as rutile, zircon, or pyrite. In some zones, reactions that are net producers of reactive biomolecular intermediates will occur; in others, reactions that are net consumers of such compounds will predominate. In this way, mineral heterogeneity undergirds broader molecular interaction network heterogeneity.

Cells contain within their genomes a precise record of all structural components required to make copies of themselves. Prior to the emergence of a cellular entity, it is unlikely that predecessors of all organelles, metabolic pathways, and polymer types required for LUCA to reproduce on its own would be found in a single void space by chance. It is more likely that a great deal of time, close proximity, and diffusion-driven mixing were required before polymer sequences of different types (i.e., polypeptides and polynucleotides enclosed within lipids; object level 8, Figure 9) formed that were mutually recognizable and reinforcing of one another’s formation.

Void spaces could have served critical roles as memory elements by giving some spaces or zones a consistent identity, defined by the reactions that cause them to serve as net sources or sinks, and by serving as enclosed volumes in which the products of previous time steps could accumulate and interact with the products of adjacent spaces in future time steps. By connecting disparate voids that act as sources and sinks for different portions of a molecular interaction network, the probability of forming a higher object level grouping of interactive polymers is increased.

An estimate of the number of unique combinations of mineral-lined pore spaces of this approximate mineral diversity (n = 10 types) and number of bounding minerals (k, which varies from roughly 6 to 8 depending upon packing configuration) can be calculated. The appropriate count is given by combinations with repetition, C(n + k − 1, k): the order or specific arrangement of minerals surrounding the void space does not matter, and any of the n mineral categories may be drawn any number of times across the k bounding grains. Within these parameters, there are between 5,005 (k = 6) and 24,310 (k = 8) unique combinations of minerals that can line a pore of this configuration. For medium-grained sandstone, the average pore diameter is approximately 130 microns, and the porosity of the overall rock (the proportion of the total volume that is empty space rather than mineral) is approximately 20%. Based on these figures, there are approximately 1.7E11 distinct pore spaces in a single cubic meter of host rock, so each unique combination would likely be sampled many times over within every cubic meter of host sandstone.
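The figures above can be checked with a short sketch; the parameters n = 10 and k = 6 to 8 are those reconstructed from the text, and the pore geometry is idealized as spherical:

```python
# Sketch of the pore-lining combinatorics and pore-density estimate above.
# Unique pore linings are multisets: C(n + k - 1, k) combinations with
# repetition of n mineral types across k bounding grains.
import math

n_minerals = 10                     # mineral diversity reconstructed from the text
for k in (6, 7, 8):                 # grains bounding a single pore
    combos = math.comb(n_minerals + k - 1, k)
    print(f"k = {k}: {combos} unique mineral linings")   # 5005, 11440, 24310

# Pore density for medium-grained sandstone, idealizing pores as spheres.
pore_diameter = 130e-6              # m
pore_volume = (4 / 3) * math.pi * (pore_diameter / 2) ** 3
porosity = 0.20                     # fraction of rock volume that is void space
print(f"~{porosity / pore_volume:.1e} pores per cubic meter")  # ~1.7e11
```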

Referring back to the plot of maximum 3D memory elements in Figure 8(b), at 1E11 elements per cubic meter, freely diffusing small molecules and small proteins would reach adjacent pore spaces within about a minute, while larger compounds such as large proteins and polynucleotides would reach adjacent pore spaces within roughly 1 and 10 hours, respectively. A model system with parameters matching a typical surface geyser or fission-zone thermostat (heating On-Off cycles approximately every hour [117]) would just barely meet the requirement for efficiently diffusing RNA sequences to adjacent pores during each characteristic cycle step of the system.
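These transit times can be cross-checked by inverting the diffusion relation for τ at the stated element density; the diffusion constants are again assumed, illustrative values rather than figures from this paper:

```python
# Rough cross-check of pore-to-pore transit times at ~1e11 memory elements
# per cubic meter, inverting x^2 = q * D * tau for tau (3D, q = 6).
# D values are assumed, illustrative diffusion constants in water (m^2/s).
import math

elements_per_m3 = 1e11
spacing = elements_per_m3 ** (-1 / 3)      # ~2.2e-4 m between element centers

D = {
    "small molecule": 1e-9,
    "small protein": 1e-10,
    "large protein": 2e-12,
    "polynucleotide": 2e-13,
}

for name, d in D.items():
    tau = spacing ** 2 / (6 * d)           # seconds to reach an adjacent pore
    print(f"{name}: ~{tau / 60:.1f} min")  # ~0.1 min up to ~10 h
```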

From a slightly larger perspective, there is an upper limit on the lifetime diffusion distance of a typical RNA sequence, set by the rate of hydrolysis of its bonds. The half-life of a phosphodiester bond in water at room temperature is about one year [110]. The maximum diffusion distance for a free polynucleotide over that time would be less than 1 centimeter. Note that these are all approximations for ideal, diffusion-only scenarios. Convection could shorten molecular travel times but would also greatly dilute the molar concentrations of these compounds, and in real systems surface friction and irregular constrictions would lengthen molecular travel times between pore spaces.
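A corresponding order-of-magnitude estimate of this lifetime diffusion range is sketched below; the polynucleotide diffusion constant is an assumed value, and the result scales with its square root:

```python
# Order-of-magnitude lifetime diffusion range of a free polynucleotide,
# bounded by the ~1-year half-life of phosphodiester bonds [110].
# The diffusion constant is an assumed, illustrative value.
import math

half_life = 365.25 * 24 * 3600      # ~1 year, in seconds
d_polynucleotide = 2e-13            # assumed diffusion constant (m^2/s)

x_max = math.sqrt(6 * d_polynucleotide * half_life)   # 3D mean distance
print(f"Lifetime diffusion range: ~{x_max * 100:.1f} cm")  # under 1 cm
```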

This spatial limitation indicates how severe diffusion-limited transport becomes for the exchange and interaction of the class of compounds commonly invoked in RNA World scenarios. With an effective diffusion range of less than 1 centimeter per year, prebiotically generated RNA sequences must come into contact with reactive prebiotic compounds, substrate molecules, source minerals, reactive catalysts, and other polymers on time scales much shorter than the half-life of bond degradation. The emergence of a self-reinforcing network of catalytic RNA sequences would help to overcome abundance and concentration limitations within a volume, but high abundance and productivity do not fundamentally overcome the diffusion-limited time and distance constraints on RNA sequences migrating out of a source volume and into contact with other polymer types unless the sources of the compounds that compose these sequences are abundant and continuously replenished. For these reasons, it is likely that RNA sequences must also have come into contact with zones containing other polymer types, such as polypeptides, that were generated in a combinatorial manner, and that combinations of these polymer productivity zones must have been explored within millimeters or centimeters of one another. In an ideal system, these polymer productivity zones would be directly adjacent to one another or would include pore spaces where conditions are suitable for the production of multiple polymer types. All of these facets of time, distance, and productivity for prebiotic RNA monomers indicate that source mineral heterogeneity on scales of millimeters to centimeters may be an implicit prerequisite of RNA World scenarios.

6. Conclusions

One of the implicit attributes of “complexity as thermodynamic depth” is that greater thermodynamic depth can correlate with the perturbation of finer object level structures of molecules and atoms. Driving a prebiotic chemical system with energy powerful enough to disrupt subatomic structures perturbs the largest identifiable number of fine-resolution object levels possible under conditions found at Earth’s surface. These object levels are hierarchically nested and stratified by entropy production. Each object level builds on the levels below while effectively hiding the energetic details of physical and chemical exchange processes from the levels above. The energetic and structural thresholds that delineate these object levels provide the requirements for automata to emerge, including complex mechanistic behaviors, such as localized, periodic heat production, that are critical to concentrating reactive compounds and promoting prebiotic polymer condensation reactions. Interstitial mineral grain pore spaces provide mineral heterogeneity and source mineral abundance on spatial scales commensurate with the diffusion distances of RNA sequences over the half-life of phosphodiester bond hydrolysis under ideal conditions. These properties indicate that such void spaces could serve as automata memory elements prior to the emergence of a fully self-enclosed, genetically encoded cell capable of Darwinian evolution. The mechanisms that govern exchange processes across so many levels open the possibility of sophisticated means of channeling energy input at lower levels into far-from-equilibrium outputs at higher object levels. These attributes lend stability to the overall system and enable the sequential emergence of higher level objects that have the potential to link abiotic and biotic states of matter.

It is currently impossible to assess with certainty whether all of the inherent systemic complexity driven by subatomic-scale radiation is required for life’s emergence. It is possible that chemical reactions will be uncovered that obviate the need to rely on this degree of energy transduction to achieve the undirected, abiotic synthesis of biotic polymers. Nevertheless, mapping out the relationships between object level perturbation, chemical synthesis, and energy dissipation indicates that these kinds of systems represent some of the most thermodynamically stable and robust means of linking abiotic and biotic configurations of matter. Systems with these attributes may have served as the progenitors of prebiotic geochemically derived automata that eventually became living systems.

Disclosure

The opinions expressed in this publication are those of the authors and do not necessarily reflect the views of any particular organization. The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This work was supported by a grant from the Simons Foundation (494291, Zachary R. Adam), John Templeton Foundation Grant no. 58562 (Betul Kacar), and National Science Foundation Grant EF-1724090 (Betul Kacar). The authors would like to thank Marta Dueñas Diez and Juan Perez-Mercader for stimulating discussions that led to this paper. This article is the product of a workshop titled “Proto-computation and Proto-life,” held at Harvard University on December 10-11, 2016.