#### Abstract

Reaction networks are a general formalism for describing collections of classical entities interacting in a random way. While reaction networks are mainly studied by chemists, they are equivalent to Petri nets, which are used for similar purposes in computer science and biology. As noted by Doi and others, techniques from quantum physics, such as second quantization, can be adapted to apply to such systems. Here we use these techniques to study how the “master equation” describing stochastic time evolution for a reaction network is related to the “rate equation” describing the deterministic evolution of the expected number of particles of each species in the large-number limit. We show that the relation is especially strong when a solution of the master equation is a “coherent state”, meaning that the numbers of entities of each kind are described by independent Poisson distributions. Remarkably, in this case the rate equation and master equation give the exact same formula for the time derivative of the expected number of particles of each species.

#### 1. Introduction

A “reaction network” describes how various kinds of classical particles can interact and turn into other kinds. Such networks are commonly used in chemistry. For example, this reaction network gives a very simplified picture of what is happening in a glass of water:
$$\mathrm{H}^+ + \mathrm{OH}^- \leftrightarrow \mathrm{H_2O}, \qquad \mathrm{H_2O} + \mathrm{H}^+ \leftrightarrow \mathrm{H_3O}^+.$$
The reactions here are all reversible, but this is not required by the reaction network formalism. The relevant particles here are not atoms, but rather “species” including the water molecule $\mathrm{H_2O}$, the proton $\mathrm{H}^+$, the hydroxyl ion $\mathrm{OH}^-$, and the hydronium ion $\mathrm{H_3O}^+$.

While the dynamics of the reactions are fundamentally quantum-mechanical, reaction networks give a simplified description of what is going on. First, given a “rate constant” for each reaction, one can write down a differential equation called the *rate equation*, which describes the time evolution of the concentration of each species in a deterministic way. This is especially useful in the limit where there are many particles of each species. At a more fine-grained level, one can describe reactions stochastically, using the *master equation*. In the rate equation the concentration of each species is treated as a continuous variable, but, in the master equation, the number of particles of each species is an integer.

In this paper, we explain how techniques from quantum physics and in particular second quantization can be used to study reaction networks. This work is part of a broader program emphasizing the links between quantum mechanics and a subject we call “stochastic mechanics”, where probabilities replace amplitudes [1–3].

Using second quantization methods to model classical stochastic systems goes back at least to 1976, with the work of Doi [4]. It has been further developed by Grassberger and Scheunert [5] and many others. The big surprise one meets at the very beginning of this subject is that the canonical commutation relations between creation and annihilation operators, long viewed as a hallmark of quantum theory, are also perfectly suited to systems of identical *classical* particles interacting in a probabilistic way.

In quantum theory, the simplest example of a Fock space is used to describe the states of a quantum harmonic oscillator. When such an oscillator is in its $n$th energy level, its state is a vector that we can formally write as $z^n$ for some formal variable $z$, or better, $\frac{z^n}{\sqrt{n!}}$, where the normalization factor comes in handy later. A general quantum state is a linear combination of such vectors, so we can write it as a power series
$$\Psi = \sum_{n \ge 0} \psi_n \frac{z^n}{\sqrt{n!}},$$
where the coefficient $\psi_n$ is the amplitude for the oscillator to be in its $n$th energy level. We can increase the energy level using the “raising operator” $a^\dagger$, which multiplies any power series by $z$:
$$a^\dagger \Psi = z \Psi.$$
We can decrease the energy level using the “lowering operator” $a$, which differentiates any power series:
$$a \Psi = \frac{d}{dz} \Psi.$$
If we choose an inner product for which the vectors $\frac{z^n}{\sqrt{n!}}$ are orthonormal, these operators are indeed adjoint to each other:
$$\langle \Phi, a^\dagger \Psi \rangle = \langle a \Phi, \Psi \rangle.$$
In checking this one sees why the normalization factor $\frac{1}{\sqrt{n!}}$ is required.

We can think of $\frac{z^n}{\sqrt{n!}}$ as a state containing $n$ quanta of energy. In applications to quantum physics, we think of these quanta as actual particles, for example, photons. If we have $k$ different kinds of particles, or a single kind of particle with $k$ different states, we describe the quantum state of a collection of particles using a power series in $k$ variables. For example, if $k = 2$ there is a Fock space consisting of power series in two variables $z_1$ and $z_2$:
$$\Psi = \sum_{m, n \ge 0} \psi_{m,n} \frac{z_1^m z_2^n}{\sqrt{m! \, n!}},$$
where $\psi_{m,n}$ is the amplitude for having $m$ particles of the first kind and $n$ particles of the second kind. We have operators that create and annihilate particles of each kind:
$$a_i^\dagger \Psi = z_i \Psi, \qquad a_i \Psi = \frac{\partial}{\partial z_i} \Psi.$$
Each creation operator $a_i^\dagger$ is adjoint to the corresponding annihilation operator $a_i$, and these operators obey the “canonical commutation relations”:
$$[a_i, a_j^\dagger] = \delta_{ij}, \qquad [a_i, a_j] = [a_i^\dagger, a_j^\dagger] = 0.$$
These rules are the basis of many practical calculations in particle physics, where products of creation and annihilation operators are used to model processes in which particles interact and turn into other particles.
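The creation and annihilation operators above can be sketched concretely. The following is a minimal illustration (the names `create`, `annihilate`, and the dictionary encoding are ours, not the paper's): a formal power series in one variable $z$ is stored as a map from exponents to coefficients, and the commutation relation $[a, a^\dagger] = 1$ can be checked directly.

```python
# Hypothetical sketch: a formal power series in one variable z,
# stored as {exponent: coefficient}.
def create(psi):
    """Creation operator a-dagger: multiply by z, sending z^n to z^(n+1)."""
    return {n + 1: c for n, c in psi.items()}

def annihilate(psi):
    """Annihilation operator a: differentiate, sending z^n to n z^(n-1)."""
    return {n - 1: n * c for n, c in psi.items() if n > 0}

def subtract(p, q):
    """Difference of two power series, coefficient by coefficient."""
    return {n: p.get(n, 0) - q.get(n, 0) for n in set(p) | set(q)}

# The canonical commutation relation [a, a-dagger] = 1: applying
# (a a-dagger - a-dagger a) to any series returns the series itself.
psi = {0: 1, 1: 2, 3: 5}          # the series 1 + 2z + 5z^3
commutator = subtract(annihilate(create(psi)), create(annihilate(psi)))
print(commutator == psi)          # True
```

The check works for any series: multiplying by $z$ and then differentiating picks up one extra copy of the series relative to the reverse order, which is exactly the product rule.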

Doi noticed that all this has a stochastic analogue where we work with probabilities rather than amplitudes. For example, there is a stochastic Fock space whose elements are power series
$$\Psi = \sum_{m, n \ge 0} \psi_{m,n} \, z_1^m z_2^n.$$
Here, $\psi_{m,n}$ is the *probability* of having $m$ particles of the first kind and $n$ particles of the second kind. The formalism is somewhat simpler than in the quantum case; for example, the normalization factors $\frac{1}{\sqrt{m! \, n!}}$ are no longer required. For $\Psi$ to describe a probability distribution, we require that all the coefficients $\psi_{m,n}$ be nonnegative and sum to 1. We call $\Psi$ with these properties a “mixed state”. A special case is a monomial $z_1^m z_2^n$. This is called a “pure state”: it describes a completely definite situation where there are $m$ particles of the first kind and $n$ of the second kind.

This method of describing a probability distribution using a power series is far from new. It was introduced by de Moivre [6] and perfected by Laplace [7] in a paper published in 1782. He dubbed these power series “generating functions”, and they have been used in probability theory ever since. Doi’s new realization was that spaces of generating functions are analogous to the Fock spaces in quantum theory. In particular, products of creation and annihilation operators can be used to describe stochastic processes in which objects interact and turn into other objects.

A natural example is chemistry, where reactions are often described stochastically using the master equation or deterministically using the rate equation. There is a long line of work on both of these equations, initiated by Horn and Jackson [8] along with Feinberg [9] in the late 1970s and continuing to this day. This line of work does not use quantum techniques, but it might profit from them and also contribute to their development. To illustrate how, here we use these techniques to study how the master equation reduces to the rate equation in the large-number limit. This is formally similar to the “classical limit” in which a quantum field theory reduces to a classical field theory.

In Theorem 8 we start with the master equation and derive an equation for the rate of change of the expected number of particles of each species. This equation is similar to the rate equation and reduces to it in a certain approximation. However, in Theorem 9 we show that this equation gives the rate equation *exactly* when the numbers of particles of each species are described by independent Poisson distributions. This relies on a nice fact: the generating function of a product of independent Poisson distributions is an eigenvector of all the annihilation operators.

In quantum mechanics, eigenvectors of the annihilation operators are called “coherent states”. Many facts about classical mechanics have simple generalizations to quantum mechanics if we restrict attention to coherent states [10]. Here we are seeing a similar phenomenon in stochastic mechanics: the deterministic dynamics given by the rate equation match the stochastic dynamics given by the master equation in the special case of coherent states. We explore this further in a companion paper [3].

The application of quantum techniques to reaction networks is just one of many interesting connections between quantum physics and network theory. For example, the evolution of networks displays phase transitions analogous to Bose–Einstein condensation [11, 12]. For more, see the review by Biamonte, Faccin, and De Domenico [13].

#### 2. Reaction Networks

A reaction network, first defined by Aris [14], consists of a set of different kinds of particles—each kind being called a “species”—together with a set of processes, called “reactions”, that turn collections of particles of various species into collections of particles of other species. As the name suggests, it is convenient to draw a reaction network as a graph.

While reaction networks first arose in chemistry, their uses are not limited to this context. Here is an example that arose in work on HIV, the human immunodeficiency virus [15]. Here we have three species: (i) $H$: healthy white blood cells; (ii) $I$: infected white blood cells; (iii) $V$: virions (that is, individual virus particles).

We also have six reactions:
(i) $\alpha$: the birth of one healthy cell, which has no input and one $H$ as output;
(ii) $\beta$: the death of a healthy cell, which has one $H$ as input and no output;
(iii) $\gamma$: the infection of a healthy cell, which has one $H$ and one $V$ as input, and one $I$ as output;
(iv) $\delta$: “production”, or the reproduction of the virus in an infected cell, which has one $I$ as input and one $I$ and one $V$ as output;
(v) $\epsilon$: the death of an infected cell, which has one $I$ as input and no output;
(vi) $\zeta$: the death of a virion, which has one $V$ as input and no output.

So, we have a finite set of species:
$$S = \{H, I, V\},$$
but reactions go between “complexes”, which are finite linear combinations of species with natural number coefficients, such as $H$ or $H + V$ or $I + V$. (In the literature on reaction networks the complex $0$, corresponding to nothing at all, is often denoted $\varnothing$.) We can think of complexes as elements of $\mathbb{N}^S$, the set of functions from $S$ to the natural numbers. The reaction network is a graph with certain complexes as vertices and reactions as edges.

We can formalize this as follows. There are various kinds of graphs; the kind we want is sometimes called a “directed multigraph” or “quiver”, but, to keep our terminology simple, we make the following definition.

*Definition 1.* A **graph** consists of a set $E$ of **edges**, a set $V$ of **vertices**, and **source** and **target** maps $s, t \colon E \to V$ saying where each edge starts and ends.

Then we may say the following.

*Definition 2.* A **reaction network** consists of (i) a finite set $S$ of **species**; (ii) a finite set $K \subseteq \mathbb{N}^S$ of **complexes**; (iii) a graph with the complexes in $K$ as vertices and some finite set $T$ of **reactions** as edges.

In the chemistry literature this sort of thing is called a “chemical reaction network”, but we want to emphasize that reaction networks are useful more generally, so we drop the adjective. Petri nets are an alternative formalism that is entirely equivalent to reaction networks, often used in subjects other than chemistry [16–18]. In the Petri net literature the species are called “places” and the reactions are called “transitions”. For a more complete translation guide between reaction networks and Petri nets, see [3].

For convenience we shall often write $k$ for the number of species present in a reaction network and identify the set $S$ with the set $\{1, \dots, k\}$. This lets us write any complex as a $k$-tuple of natural numbers. In particular, we write the source and target of any reaction $\tau \in T$ as
$$s(\tau), \, t(\tau) \in \mathbb{N}^k.$$

#### 3. The Rate Equation

The amount of each species is represented by a “state” of the reaction network. There are various options here. A **pure state** is a vector $n \in \mathbb{N}^k$ with $i$th entry specifying the number of instances of the $i$th species. This generalizes in two ways. First, we define a **classical state** to be any vector $x \in [0, \infty)^k$ of nonnegative real numbers and think of such a state as specifying the expected number or concentration of the species present. Second, we define a **mixed state** to be a probability distribution over the pure states. This assigns a probability $\psi_n$ to each $n \in \mathbb{N}^k$. The rate equation describes the time evolution of a classical state, while the master equation describes the time evolution of a mixed state.

To write down either of these equations, we must first specify the system’s dynamics. In this paper we consider systems that evolve by the law of mass action [14]. Very roughly, this law states that the rate at which a reaction occurs is proportional to the product of the concentrations of all its input species. We call the constant of proportionality for each reaction its **rate constant**.

*Definition 3.* A **stochastic reaction network** consists of a reaction network together with a map $r \colon T \to (0, \infty)$ assigning a **rate constant** $r(\tau)$ to each reaction $\tau \in T$.

This concept is equivalent to another concept in the literature, namely, that of a stochastic Petri net [3].

Given a stochastic reaction network, the rate equation says that for each reaction $\tau$, the time derivative of a classical state $x$ picks up a term that is the product of the following: (i) the vector $t(\tau) - s(\tau)$, whose $i$th component is the change in the number of the $i$th species due to the reaction $\tau$; (ii) the concentration of each input species of $\tau$ raised to the power given by the number of times it appears as an input, namely, $x^{s(\tau)}$; (iii) the rate constant $r(\tau)$ of $\tau$.

*Definition 4.* The **rate equation** for a stochastic reaction network is
$$\frac{dx}{dt} = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, x^{s(\tau)},$$
where $x(t) \in [0, \infty)^k$ and we have used multi-index notation to define
$$x^{s(\tau)} = x_1^{s_1(\tau)} \cdots x_k^{s_k(\tau)}.$$

For the HIV reaction network in (10), if we also use the Greek letter names $\alpha, \dots, \zeta$ of the reactions as names for their rate constants, we get this rate equation:
$$\frac{dH}{dt} = \alpha - \beta H - \gamma H V, \qquad \frac{dI}{dt} = \gamma H V - \epsilon I, \qquad \frac{dV}{dt} = \delta I - \gamma H V - \zeta V.$$
This example involves only complexes where the coefficient of each species is at most 1. It is worthwhile comparing an example with larger coefficients, for example, the dissociation of a diatomic gas into atoms and the recombination of these atoms into a molecule:
$$\mathrm{Cl}_2 \overset{\alpha}{\longrightarrow} 2\,\mathrm{Cl}, \qquad 2\,\mathrm{Cl} \overset{\beta}{\longrightarrow} \mathrm{Cl}_2.$$
Writing $x_1$ for the concentration of $\mathrm{Cl}_2$ and $x_2$ for the concentration of $\mathrm{Cl}$, this reaction network gives the following rate equation:
$$\frac{dx_1}{dt} = -\alpha x_1 + \beta x_2^2, \qquad \frac{dx_2}{dt} = 2 \alpha x_1 - 2 \beta x_2^2.$$
The recombination of two atoms of chlorine into a diatomic molecule occurs at a rate proportional to the square of the concentration of atomic chlorine, so both terms arising from reaction $\beta$ are proportional to $x_2^2$. The reaction $\alpha$ produces two atoms of $\mathrm{Cl}$ while the reaction $\beta$ uses up two atoms of $\mathrm{Cl}$, so both terms in the second equation have a coefficient of 2. All this follows from (13).
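The rate equation for the dissociation example can be integrated numerically. The sketch below (with made-up rate constants `alpha` for dissociation and `beta` for recombination) uses simple Euler steps; a useful sanity check is that the total number of chlorine atoms, twice the molecule concentration plus the atom concentration, is conserved by the dynamics.

```python
# Numerical sketch of the dissociation/recombination rate equation:
#   d[Cl2]/dt = -alpha*[Cl2] + beta*[Cl]^2
#   d[Cl]/dt  = 2*alpha*[Cl2] - 2*beta*[Cl]^2
# alpha and beta are made-up values chosen for the example.
alpha, beta = 2.0, 1.0
x_mol, x_atom = 1.0, 0.0          # initial concentrations of Cl2 and Cl
dt = 1e-4
for _ in range(100_000):          # Euler steps out to t = 10
    d_mol = -alpha * x_mol + beta * x_atom ** 2
    # the atom concentration changes by exactly -2 times the molecule change
    x_mol, x_atom = x_mol + dt * d_mol, x_atom - 2 * dt * d_mol
total = 2 * x_mol + x_atom        # total chlorine atoms, conserved
```

With these constants the system relaxes toward the equilibrium where $\alpha x_1 = \beta x_2^2$, and `total` stays at its initial value 2 throughout, reflecting conservation of atoms by both reactions.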

#### 4. The Master Equation

The master equation describes the time evolution of mixed states. For the rest of this paper we fix a stochastic reaction network. Recall that, for each $n \in \mathbb{N}^k$, a “mixed state” gives the probability $\psi_n$ that, for each $i$, exactly $n_i$ instances of the $i$th species are present. The **master equation** says that
$$\frac{d}{dt} \psi_n(t) = \sum_{m \in \mathbb{N}^k} H_{nm} \, \psi_m(t)$$
for some matrix of numbers $H_{nm}$ determined by the stochastic reaction network. The matrix element $H_{nm}$ is the probability per time for the pure state $m$ to evolve to the pure state $n$. This probability is a sum over reactions:
$$H_{nm} = \sum_{\tau \in T} H(\tau)_{nm}.$$

To describe the matrix $H$, we need “falling powers”. For any natural numbers $n$ and $p$, we define the $p$th **falling power** of $n$ by
$$n^{\underline{p}} = n (n - 1) \cdots (n - p + 1).$$
This is the number of ways of choosing an ordered $p$-tuple of distinct elements from a set with $n$ elements. Note that $n^{\underline{p}} = 0$ if $p > n$. More generally, for any multi-indices $n$ and $m$, we define
$$n^{\underline{m}} = n_1^{\underline{m_1}} \cdots n_k^{\underline{m_k}}.$$
This is the number of ways to choose, for each species $i$, an ordered list of $m_i$ distinct things from a collection of $n_i$ things of that species. Thus, we expect a factor of this sort to appear in the master equation.
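Falling powers and their multi-index version are straightforward to compute; the function names below are our own. For instance, the falling power of 5 with exponent 2 is $5 \cdot 4 = 20$, the number of ordered pairs of distinct items drawn from 5, and any falling power with exponent larger than the base vanishes.

```python
# Falling powers: n (n-1) ... (n-p+1), and the componentwise
# multi-index version used in the master equation.
def falling(n, p):
    """The p-th falling power of n; equals 0 when p > n."""
    result = 1
    for i in range(p):
        result *= n - i
    return result

def falling_multi(n, m):
    """Product of componentwise falling powers of multi-indices n, m."""
    prod = 1
    for n_i, m_i in zip(n, m):
        prod *= falling(n_i, m_i)
    return prod

print(falling(5, 2))                  # 20
print(falling(2, 3))                  # 0
print(falling_multi((3, 2), (2, 1)))  # (3*2) * 2 = 12
```

The vanishing of $n^{\underline{m}}$ when any $m_i > n_i$ is what makes the master equation automatically ignore reactions that lack sufficient inputs.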

Using this notation, we define the matrix $H(\tau)$ as follows:
$$H(\tau)_{nm} = r(\tau) \, m^{\underline{s(\tau)}} \left( \delta_{n, \, m + t(\tau) - s(\tau)} - \delta_{n, m} \right).$$
Here $\delta$ is the Kronecker delta, which equals 1 if its subscripts are equal and 0 otherwise. The first term describes the rate for the complex $m$ to become the complex $n$ via the reaction $\tau$: it equals the rate constant $r(\tau)$ for this reaction multiplied by the number of ways $m^{\underline{s(\tau)}}$ this reaction can occur. The second term describes the rate at which the probability of being in the state $m$ goes away due to this reaction. Thus, the size of the second term is equal to that of the first, but its sign is opposite.

Putting together (19) and (22), we obtain this formula for the Hamiltonian:
$$H_{nm} = \sum_{\tau \in T} r(\tau) \, m^{\underline{s(\tau)}} \left( \delta_{n, \, m + t(\tau) - s(\tau)} - \delta_{n, m} \right).$$
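The structure of the Hamiltonian, a positive off-diagonal inflow term balanced by a negative diagonal outflow term, can be exhibited on a small example. The sketch below (our own construction, not from the paper) builds $H$ for the single decay reaction with one species as input and nothing as output, on the finite state space $\{0, \dots, N\}$; truncation is harmless here because decay only moves probability downward. Every column sums to zero, which is exactly why the master equation preserves total probability.

```python
# Hamiltonian matrix for the single reaction A -> 0 with rate constant r,
# truncated to particle numbers 0..N (an exact truncation for pure decay).
N, r = 10, 0.7
H = [[0.0] * (N + 1) for _ in range(N + 1)]
for m in range(N + 1):
    if m > 0:
        H[m - 1][m] += r * m   # inflow: m^(falling 1) = m ways to decay
    H[m][m] -= r * m           # outflow from state m, same size, opposite sign

# Infinitesimal stochastic property: each column of H sums to zero.
col_sums = [sum(H[n][m] for n in range(N + 1)) for m in range(N + 1)]

# Euler-step the master equation d(psi)/dt = H psi from a pure state.
psi = [0.0] * (N + 1)
psi[N] = 1.0
dt = 0.001
for _ in range(5000):
    psi = [p + dt * sum(H[n][m] * psi[m] for m in range(N + 1))
           for n, p in enumerate(psi)]
total = sum(psi)               # stays 1: probability is conserved
```

Because the two terms in each column cancel exactly, any time-stepping scheme that is linear in $\psi$ conserves the total probability to machine precision.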

#### 5. The Stochastic Fock Space

We can understand the Hamiltonian for the master equation at a deeper level using techniques borrowed from quantum physics. The first step is to introduce a stochastic version of Fock space. To do this, we write any mixed state as a formal power series in some variables $z_1, \dots, z_k$, with the coefficient of
$$z^n = z_1^{n_1} \cdots z_k^{n_k}$$
being the probability $\psi_n$. That is, we write any mixed state as
$$\Psi = \sum_{n \in \mathbb{N}^k} \psi_n z^n.$$
Because a mixed state is a probability distribution, the coefficients must be nonnegative and sum to 1. Indeed, in this formalism **mixed states** are precisely the formal power series with
$$\psi_n \ge 0, \qquad \sum_{n \in \mathbb{N}^k} \psi_n = 1.$$
The simplest mixed states are the monomials $z^n$; these are the **pure states**, where there is a definite number of things of each species.

In quantum mechanics we use a similar sort of Fock space, but with power series having complex rather than real coefficients. To obtain a Hilbert space, we often restrict attention to power series for which a certain norm is finite. However, power series not obeying this condition are also useful, for example, to ensure that operators of interest are everywhere defined, instead of merely densely defined.

Thus, for the purposes of studying the master equation, we define the **stochastic Fock space** to be the vector space $\mathbb{R}[[z_1, \dots, z_k]]$ of all real formal power series in the variables $z_1, \dots, z_k$. We treat the Hamiltonian for the master equation as the operator $H$ on stochastic Fock space with
$$H z^m = \sum_{n \in \mathbb{N}^k} H_{nm} z^n$$
for all $m \in \mathbb{N}^k$, where the matrix entries $H_{nm}$ are given by (23).

This allows us to express the Hamiltonian in terms of certain special operators on the stochastic Fock space: the creation and annihilation operators. Our notation here follows that used in quantum physics, where the creation and annihilation operators are adjoints of each other. Let $1 \le i \le k$. The **creation operator** $a_i^\dagger$ is given by
$$a_i^\dagger \Psi = z_i \Psi.$$
This takes any pure state to the pure state with one additional instance of the $i$th species:
$$a_i^\dagger z^n = z^{n + e_i},$$
where $e_i$ is the multi-index with $i$th entry 1 and all other entries 0. The corresponding **annihilation operator** $a_i$ is given by formal differentiation:
$$a_i \Psi = \frac{\partial}{\partial z_i} \Psi.$$
This takes any pure state to the pure state with one fewer instance of the $i$th species, but multiplied by a coefficient $n_i$:
$$a_i z^n = n_i z^{n - e_i}.$$
This represents the fact that there are $n_i$ ways to annihilate one of the $n_i$ instances of the $i$th species present.

In what follows we use multi-index notation to define, for any $m \in \mathbb{N}^k$,
$$a^m = a_1^{m_1} \cdots a_k^{m_k}$$
and
$${a^\dagger}^m = (a_1^\dagger)^{m_1} \cdots (a_k^\dagger)^{m_k}.$$
These expressions are well-defined because all the annihilation operators commute with each other and similarly for the creation operators.

Theorem 5. *For any stochastic reaction network, the Hamiltonian is given by*
$$H = \sum_{\tau \in T} r(\tau) \left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)}.$$

*Proof*. We start with a basic result on annihilation and creation operators.

Lemma 6. *For any multi-indices $m, n \in \mathbb{N}^k$ we have*
$${a^\dagger}^m z^n = z^{m + n}, \qquad a^m z^n = n^{\underline{m}} \, z^{n - m}.$$

*Proof.* The first equation follows inductively from the definition of the creation operators, which implies
$$a_i^\dagger z^n = z^{n + e_i}.$$
The second follows inductively from the definition of the annihilation operators, which implies
$$a_i z^n = n_i z^{n - e_i}.$$
Note that $a^m z^n = 0$ if $m_i > n_i$ for any $i$, and in this case we also have $n^{\underline{m}} = 0$, so the equation holds in this case because both sides vanish.

We now prove the theorem. For any multi-index $m$, (23) gives
$$H z^m = \sum_{\tau \in T} r(\tau) \, m^{\underline{s(\tau)}} \left( z^{m + t(\tau) - s(\tau)} - z^m \right).$$
Using Lemma 6 we obtain
$$\left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)} z^m = m^{\underline{s(\tau)}} \left( z^{m + t(\tau) - s(\tau)} - z^m \right)$$
and thus
$$H = \sum_{\tau \in T} r(\tau) \left( {a^\dagger}^{t(\tau)} - {a^\dagger}^{s(\tau)} \right) a^{s(\tau)}.$$
The last step requires a little justification. Technically speaking, the monomials $z^n$ are not a basis of the stochastic Fock space, because not every formal power series is a *finite* linear combination of monomials. However, these monomials form a “topological basis” of this space, in the sense that every element of this vector space can be expressed as a *convergent infinite* linear combination of these monomials, with respect to a certain topology. (In this topology, a sequence of formal power series converges if each coefficient converges.) The annihilation and creation operators, and indeed all the operators discussed in this paper, are continuous in this topology. So, to check equations between these operators, it suffices to check them on the monomials $z^n$.

Next we investigate what the master equation says about the time evolution of expected values. For this, we start by defining
$$\langle \Psi \rangle = \sum_{n \in \mathbb{N}^k} \psi_n$$
for any formal power series
$$\Psi = \sum_{n \in \mathbb{N}^k} \psi_n z^n.$$
In general the sum defining $\langle \Psi \rangle$ may not converge, but it converges and equals 1 when $\Psi$ is a mixed state. Suppose $O$ is an operator on the space of formal power series in the variables $z_1, \dots, z_k$. We define the **expected value** of $O$ in the mixed state $\Psi$ to be $\langle O \Psi \rangle$, assuming this converges. This generalizes the usual definition of the expected value of a random variable, in a way that emphasizes the analogy between stochastic mechanics and quantum mechanics. While in quantum mechanics we compute expected values of observables using expressions of the form $\langle \Psi, O \Psi \rangle$, where the brackets denote an inner product, in stochastic mechanics we use $\langle O \Psi \rangle$. For more details see [1, 2].

We are especially interested in the expected value of the **number operators**
$$N_i = a_i^\dagger a_i.$$
Note that $N_i z^n = n_i z^n$, so
$$\langle N_i \Psi \rangle = \sum_{n \in \mathbb{N}^k} n_i \psi_n.$$
Since $\psi_n$ is the probability of being in a pure state where there are $n_i$ things of the $i$th species, the above formula indeed gives the expected value of the number of things of this species.
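The expected value of a number operator can be read off directly from the coefficients of the generating function. A small sketch (the encoding and function name are ours): a mixed state in two variables is a map from exponent pairs to probabilities, and the expected number of the $i$th species is the probability-weighted sum of the $i$th exponents.

```python
# A mixed state in two variables, {(n1, n2): probability}; the
# probabilities must be nonnegative and sum to 1.
psi = {(0, 0): 0.25, (1, 0): 0.25, (1, 2): 0.5}

def expected_number(psi, i):
    """<N_i psi>: the sum of n_i * psi_n over all multi-indices n."""
    return sum(n[i] * p for n, p in psi.items())

print(expected_number(psi, 0))   # 0.25*1 + 0.5*1 = 0.75
print(expected_number(psi, 1))   # 0.5*2 = 1.0
```

This is the functional $\langle N_i \Psi \rangle$ from the text, computed coefficient by coefficient rather than by formal differentiation.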

Our goal is to compute
$$\frac{d}{dt} \langle N_i \Psi(t) \rangle,$$
assuming that $\Psi(t)$ obeys the master equation. For this, we need to define the falling powers of an operator $O$ as follows:
$$O^{\underline{p}} = O (O - 1) \cdots (O - p + 1)$$
for any $p \in \mathbb{N}$. If $m$ is a multi-index we define
$$N^{\underline{m}} = N_1^{\underline{m_1}} \cdots N_k^{\underline{m_k}}.$$
This is well-defined because the number operators commute.

Lemma 7. *For any multi-index $m$ we have*
$${a^\dagger}^m a^m = N^{\underline{m}}.$$

*Proof.* By Lemma 6 and the definition of the number operators we have
$${a^\dagger}^m a^m z^n = n^{\underline{m}} z^n = N^{\underline{m}} z^n$$
for all $n \in \mathbb{N}^k$. The lemma follows directly from this.

There is a way to extract a classical state from a mixed state, which lets us connect the rate equation to the master equation. To do this, we write $N$ for the vector, or list, of number operators $(N_1, \dots, N_k)$. Then, given a mixed state $\Psi$, we set
$$x = \langle N \Psi \rangle,$$
meaning that $x_i = \langle N_i \Psi \rangle$ for each $i$. If the expected values here are well-defined, $x$ is a classical state. It is a simplified description of the mixed state $\Psi$, which only records the expected number of instances of each species.

The next result says how this sort of classical state evolves in time, assuming that the mixed state it comes from obeys the master equation.

Theorem 8. *For any stochastic reaction network and any mixed state $\Psi(t)$ evolving in time according to the master equation, we have*
$$\frac{d}{dt} \langle N \Psi(t) \rangle = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, \langle N^{\underline{s(\tau)}} \Psi(t) \rangle,$$
*assuming the expected values and their derivatives exist.*

*Proof.* First note using Lemmas 6 and 7 that for any multi-indices $m, n$ and any natural number $i$ we have
$$\langle N_i \, {a^\dagger}^m a^n \Psi \rangle = \langle (N_i + m_i - n_i) \, N^{\underline{n}} \Psi \rangle.$$
Thus if at any given time
$$x = \langle N \Psi \rangle$$
and $\Psi$ evolves in time according to the master equation, we have
$$\frac{dx_i}{dt} = \langle N_i H \Psi \rangle = \sum_{\tau \in T} r(\tau) \, (t_i(\tau) - s_i(\tau)) \, \langle N^{\underline{s(\tau)}} \Psi \rangle.$$
Recall that $N$ is our notation for the vector of number operators $(N_1, \dots, N_k)$. Thus, the above equation is equivalent to
$$\frac{dx}{dt} = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, \langle N^{\underline{s(\tau)}} \Psi \rangle.$$
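Theorem 8 can be checked numerically in a case where it closes exactly. For the single decay reaction with one particle of one species as input, the source complex is 1, the falling power of $N$ is just $N$ itself, and the theorem says the expected number decays exponentially. The sketch below (our own check, with made-up parameters) Euler-steps the master equation and compares the expected number against the exponential.

```python
import math

# Master equation for A -> 0 with rate r: Theorem 8 gives exactly
# d<N>/dt = -r <N>, since s(tau) = 1 and the falling power of N
# with exponent 1 is N itself. So <N>(t) = <N>(0) exp(-r t).
r, N_max, dt, steps = 0.5, 30, 1e-3, 2000
psi = [0.0] * (N_max + 1)
psi[20] = 1.0                     # start in the pure state with 20 particles
for _ in range(steps):
    new = psi[:]
    for n in range(N_max + 1):
        new[n] -= dt * r * n * psi[n]                 # outflow from state n
        if n + 1 <= N_max:
            new[n] += dt * r * (n + 1) * psi[n + 1]   # inflow from state n+1
    psi = new
expected = sum(n * p for n, p in enumerate(psi))
exact = 20 * math.exp(-r * dt * steps)                # 20 e^{-rt} at t = 2
```

For this linear reaction no approximation is involved beyond the Euler discretization, so `expected` tracks `exact` to within the step-size error.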

The above theorem makes it clear that the rate equation is closely related to the master equation, since the former says
$$\frac{dx}{dt} = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, x^{s(\tau)},$$
while the latter implies
$$\frac{d}{dt} \langle N \Psi \rangle = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, \langle N^{\underline{s(\tau)}} \Psi \rangle.$$
Suppose we let $x$ be the classical state coming from the mixed state $\Psi$:
$$x = \langle N \Psi \rangle.$$
The rate equation for $x$ would then follow from the master equation for $\Psi$ if we had
$$\langle N^{\underline{m}} \Psi \rangle = \langle N \Psi \rangle^m$$
for every multi-index $m$. This equation is not true in general. However, it should hold *approximately* in a suitable limit of large numbers. And surprisingly, it holds *exactly* for coherent states.

#### 6. Coherent States

For any $c \in [0, \infty)^k$, we define the **coherent state** with expected value $c$ to be the mixed state
$$\Psi_c = \frac{e^{c \cdot z}}{e^{c \cdot 1}},$$
where $c \cdot z = c_1 z_1 + \cdots + c_k z_k$ is the dot product of the vectors $c$ and $z = (z_1, \dots, z_k)$, and we set $c \cdot 1 = c_1 + \cdots + c_k$. Equivalently,
$$\Psi_c = e^{-(c_1 + \cdots + c_k)} \sum_{n \in \mathbb{N}^k} \frac{c^n z^n}{n!},$$
where $c^n$ and $z^n$ are defined as products in our usual way, and $n! = n_1! \cdots n_k!$. The name “coherent state” comes from quantum mechanics [10], where we think of the coherent state $\Psi_c$ as the quantum state that best approximates the classical state $c$. In the state $\Psi_c$, the probability of having $n_i$ things of the $i$th species is equal to
$$e^{-c_i} \, \frac{c_i^{n_i}}{n_i!}.$$
This is precisely the definition of a Poisson distribution with mean equal to $c_i$. The state $\Psi_c$ is thus a product of independent Poisson distributions.
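Both defining properties of a coherent state can be verified numerically on a truncated series. The sketch below (our own check, in one variable) builds the Poisson coefficients and confirms that applying the annihilation operator, formal differentiation, just rescales the state by its mean.

```python
import math

# A coherent state in one variable, truncated at a high order:
# its coefficients are the Poisson probabilities exp(-c) c^n / n!.
c, CUTOFF = 1.5, 60
psi = {n: math.exp(-c) * c ** n / math.factorial(n) for n in range(CUTOFF)}

# It is an eigenvector of the annihilation operator d/dz with
# eigenvalue c: coefficient-wise, (n+1) psi[n+1] == c psi[n].
a_psi = {n - 1: n * p for n, p in psi.items() if n > 0}
max_err = max(abs(a_psi[n] - c * psi[n]) for n in range(CUTOFF - 1))

# And it is a genuine mixed state: coefficients sum to (essentially) 1.
total = sum(psi.values())
```

The eigenvector property holds coefficient by coefficient because $(n+1)\,e^{-c} c^{n+1}/(n+1)! = c \, e^{-c} c^n/n!$, so `max_err` is zero up to floating-point rounding.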

Theorem 9. *Given any stochastic reaction network, let $\Psi(t)$ be a mixed state evolving in time according to the master equation. If $\Psi(t)$ is a coherent state when $t = t_0$, then*
$$x(t) = \langle N \Psi(t) \rangle$$
*obeys the rate equation when $t = t_0$.*

*Proof*. We prove this using a series of lemmas.

Lemma 10. *For any multi-index $m$ and any coherent state $\Psi_c$ we have*
$$a^m \Psi_c = c^m \Psi_c$$
*and*
$$\langle \Psi_c \rangle = 1.$$

*Proof.* The second equation is immediate from the definition, while the first follows from
$$a_i \, e^{c \cdot z} = \frac{\partial}{\partial z_i} e^{c \cdot z} = c_i \, e^{c \cdot z}.$$

Lemma 11. *For any multi-index $m$ we have*
$$\langle N^{\underline{m}} \Psi \rangle = \langle a^m \Psi \rangle.$$

*Proof.* For any multi-index $\ell$, Lemmas 6 and 7 imply
$$\langle N^{\underline{m}} z^\ell \rangle = \ell^{\underline{m}} = \langle a^m z^\ell \rangle.$$
Since the states $z^\ell$ form a topological basis of the stochastic Fock space and all the operators in question are continuous, the lemma follows.

Lemma 12. *For any multi-index and any we have*

*Proof.* We have

Lemma 13. *For any multi-index $m$ and any coherent state $\Psi_c$ we have*
$$\langle N^{\underline{m}} \Psi_c \rangle = \langle N \Psi_c \rangle^m.$$

*Proof.* By Lemmas 11 and 10 we have
$$\langle N^{\underline{m}} \Psi_c \rangle = \langle a^m \Psi_c \rangle = c^m \langle \Psi_c \rangle,$$
so by Lemma 12
$$\langle N^{\underline{m}} \Psi_c \rangle = c^m.$$
As a special case we have $\langle N_i \Psi_c \rangle = c_i$ for each $i$, so we also have
$$\langle N \Psi_c \rangle = c,$$
and it follows that $\langle N^{\underline{m}} \Psi_c \rangle = c^m = \langle N \Psi_c \rangle^m$.

To prove Theorem 9, we start by setting
$$x(t) = \langle N \Psi(t) \rangle,$$
where $\Psi(t)$ is a solution of the master equation. Theorem 8 implies that the time derivative of $x(t)$ equals
$$\sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, \langle N^{\underline{s(\tau)}} \Psi(t) \rangle.$$
If $\Psi(t)$ is a coherent state at time $t_0$, Lemma 13 implies that
$$\langle N^{\underline{s(\tau)}} \Psi(t_0) \rangle = \langle N \Psi(t_0) \rangle^{s(\tau)} = x(t_0)^{s(\tau)},$$
so we have
$$\left. \frac{dx}{dt} \right|_{t = t_0} = \sum_{\tau \in T} r(\tau) \, (t(\tau) - s(\tau)) \, x(t_0)^{s(\tau)},$$
which is the rate equation at time $t_0$, as desired.
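The key fact behind Theorem 9, that the expected falling powers of a Poisson distribution equal plain powers of its mean, can be checked numerically. The sketch below (our own check, not from the paper) sums the factorial moments of a truncated Poisson distribution, using the recurrence between consecutive Poisson probabilities to avoid overflowing factorials.

```python
import math

# For a Poisson distribution with mean c, the expected falling power
# E[n (n-1) ... (n-m+1)] equals c^m exactly. This is why coherent
# states make the master equation reproduce the rate equation on the nose.
def poisson_falling_moment(c, m, cutoff=200):
    total, p = 0.0, math.exp(-c)       # p = P(n), starting at n = 0
    for n in range(cutoff):
        falling = 1
        for i in range(m):
            falling *= n - i
        total += falling * p
        p *= c / (n + 1)               # recurrence: P(n+1) = P(n) c / (n+1)
    return total

print(round(poisson_falling_moment(2.0, 3), 6))  # prints 8.0, which is c^3
```

By contrast, the ordinary moments of a Poisson distribution are messier polynomials in the mean; it is precisely the falling powers appearing in the master equation that make the cancellation exact.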

In general, the above result applies at only one moment in time. The reason is that if a solution of the master equation is a coherent state at some time, it need not be a coherent state at later (or earlier) times. Still, Theorem 8 implies that, as long as the expected values $\langle N^{\underline{m}} \Psi \rangle$ are *close* to the powers $\langle N \Psi \rangle^m$, at least when $m$ is the source of some reaction, the expected values will *approximately* obey the rate equation. This could be studied in detail using either the techniques introduced here or those discussed by Anderson and Kurtz [19].

Furthermore, in some cases when $\Psi$ is initially a coherent state it continues to be so for all times. In this case, $x(t) = \langle N \Psi(t) \rangle$ continues to exactly obey the rate equation. For example, this is true when all the complexes in the reaction network consist of a single species. It is also true for a large class of equilibrium solutions of the master equation, as shown by Anderson, Craciun, and Kurtz [20]. For more on how quantum techniques apply to this situation, see the companion to this paper [3].

#### Data Availability

No data were used to support this study.

#### Conflicts of Interest

The author declares that there are no conflicts of interest regarding the publication of this paper.

#### Acknowledgments

The author thanks Jacob Biamonte, Brendan Fong, and the students at U. C. Riverside who helped come up with a preliminary version of the proof of Theorem 8, notably Daniel Estrada, Reeve Garrett, Michael Knap, Tu Pham, Blake Pollard, and Franciscus Rebro, and thanks John Rowlands and Blake Stacey for catching typos and other mistakes. This work was done at the Centre for Quantum Technologies in Singapore and the Mathematics Department of U. C. Riverside.