Volume 2012 (2012), Article ID 528737, 5 pages
Thermodynamic Derivation of the Fluctuation Theorem and Jarzynski Equality
Maarten H. P. Ambaum
Department of Meteorology, University of Reading, Reading RG6 6BB, UK
Received 14 February 2012; Accepted 15 March 2012
Academic Editors: M. Appell, C. Pierleoni, and Z. Slanina
Copyright © 2012 Maarten H. P. Ambaum. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
A thermodynamic expression for the analog of the canonical ensemble for nonequilibrium systems is described based on a purely information theoretical interpretation of entropy. It is shown that this nonequilibrium canonical distribution implies some important results from nonequilibrium thermodynamics, specifically, the fluctuation theorem and the Jarzynski equality. Those results are therefore expected to be more widely applicable, for example, to macroscopic systems.
1. Introduction
The derivations of the fluctuation theorem [1, 2] and the Jarzynski equality [3] appear to depend on the underlying microscopic Hamiltonian dynamics. From this it would follow that these theorems are only relevant to microscopic systems, with their associated definitions of entropy and temperature. In contrast, a statistical mechanical description of macroscopic systems often depends on more general forms of entropy, primarily information entropy [4–6]. Two notable examples from fluid dynamics are the statistical mechanics of point vortices [7] and the statistical mechanics of two-dimensional incompressible flows [8]. In such cases, temperature is defined in terms of the change of entropy with the energy of the system [9] or, equivalently, in terms of the Lagrange multiplier for the energy under the maximization of entropy at a given expectation value of the energy [10].
The question is whether for such macroscopic systems we can derive a fluctuation theorem or Jarzynski equality. This is of particular importance for climate science as there are strong indications that the global state of the climate system and, more generally, other components of the Earth system may be governed by thermodynamic constraints on entropy production [11–15]. The theoretical underpinning of those thermodynamic constraints is still lacking. The presence of a fluctuation theorem for such systems would be of great importance.
Here we demonstrate that the information-theoretical definition of entropy implies the fluctuation theorem and the Jarzynski equality. It is shown that these results are due to the counting properties of entropy rather than the dynamics of the underlying system. As such, both these results are applicable to a much wider class of problems, specifically, macroscopic systems for which we can define an entropy and which are thermostated in some general sense.
The central tenet is that for two states $A$ and $B$ of a system, defined by two sets of macroscopic parameters, the ratio of the probabilities for the system to be in either state is

$$ \frac{p_A}{p_B} = \exp\!\left(\frac{S_A - S_B}{k_B}\right), \quad (1) $$

with $S_A - S_B$ being the difference in entropy between the states $A$ and $B$. This is essentially the Boltzmann definition of entropy: entropy is a counting property of the system. The theoretical background can be found in [10], where it is shown that this information theoretical interpretation reproduces the statistical mechanics based on Gibbs entropy and furthermore gives a justification of the Gibbs formulation as a statistical inference problem under limited knowledge of the system. Of note is that the entropy only has meaning in relation to the macroscopic constraints on the system (indicated by the subscripts $A$ and $B$), constraints which can be arbitrarily complex and prescriptive, as may be needed for systems far from equilibrium. In an information-theoretical setting the above definition of entropy is equivalent to the principle of indifference: the absence of any distinguishing information between microscopic states within either of the macroscopic states $A$ or $B$ is equivalent to equal prior (prior to obtaining additional macroscopic constraints) probabilities for the microscopic states [16]. Note also that we do not need to specify precisely at this point how the states are counted, or how an invariant measure can be defined on the phase space confined by $A$ or $B$. The principle of indifference does not imply that all states are assumed equally probable; it is a statement that we cannot a priori assume a certain structure in phase space (such as a precisely defined invariant measure) in the absence of further information. The principle of indifference is not a statement about the structure of phase space; it is a principle of statistical inference, and it is the only admissible starting point from an information theoretical point of view.
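This counting interpretation can be sanity-checked numerically. The following toy system (our own construction, not from the paper; $k_B = 1$) takes four two-state units, defines macrostates by the number of excited units, and verifies that the probability ratio of two macrostates equals the exponential of their entropy difference:

```python
import math
from itertools import product

# Toy illustration (not from the paper): count microstates of four
# two-state units and check p_A / p_B = exp((S_A - S_B)/k_B), k_B = 1.
microstates = list(product([0, 1], repeat=4))

def W(excited):
    """Number of microstates in the macrostate with `excited` units on."""
    return sum(1 for s in microstates if sum(s) == excited)

def S(excited):
    """Boltzmann entropy of the macrostate, k_B = 1."""
    return math.log(W(excited))

# Principle of indifference: macrostate probability ratio = state-count ratio.
ratio_counts = W(2) / W(0)
ratio_entropy = math.exp(S(2) - S(0))
assert abs(ratio_counts - ratio_entropy) < 1e-12
print(ratio_counts)  # 6.0
```

With sixteen equiprobable microstates, the macrostate with two excited units (six microstates) is six times as likely as the fully deexcited one, exactly as the entropy difference predicts.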
2. A General Form for the Canonical Ensemble
Following Boltzmann, we define the entropy as the logarithm of the number of states accessible to a system under given macroscopic constraints $X$. For an isolated system, the entropy is related to the size $W$ of the accessible phase space:

$$ S = k_B \ln W. \quad (2) $$

For a classical gas system, $X$ is defined by the energy $E$, volume $V$, and molecule number $N$; the phase space size $W$ is the hyperarea of the energy shell, and it defines the usual microcanonical ensemble. For more complicated systems, where $X$ may include several macroscopic order parameters, the energy shell becomes more confined; in the following we will still refer to the accessible phase space under constraints $X$ as the energy shell. The hyperarea is nondimensionalised such that $W(E)\,\mathrm{d}E$ is proportional to the number of states between energies $E$ and $E + \mathrm{d}E$. We will not consider other multiplicative factors which make the argument of the logarithm nondimensional; these contribute an additive entropy constant which will not be of interest to us here. Note also that the microcanonical ensemble does not include a notion of equilibrium: the system is assumed to be insulated, so it cannot equilibrate with an external system. It just moves around on the energy shell (defined by $X$), and the principle of indifference implies that all states, however improbable from a macroscopic point of view, are members of the ensemble. Of course, for macroscopic systems the number of unusual states (say, with nonuniform macroscopic properties not defined by $X$) is much lower than the number of regular states (say, with uniform macroscopic density). Only for small systems does the distinction become important, but it does not invalidate the formal definition of entropy above. This definition of entropy also ensures that entropy is an extensive property, such that for two independent systems considered together the total entropy is the sum of the individual entropies, $S = S_1 + S_2$.
The Boltzmann constant ensures dimensional compatibility with the classical thermodynamic entropy when the usual equilibrium assumptions are made [10, 17].
The hyperarea of the energy shell, and thus the entropy, can be a function of several variables which are set as external constraints, such as the total energy $E$, system volume $V$, or particle number $N$ for a simple gas system. For the canonical ensemble we consider a system that can exchange energy with some reservoir. We consider here only a theoretical canonical ensemble, in that we take the coupling between the two systems to be weak, such that the interaction energy vanishes compared to the relevant energy fluctuations in the system.
First, we need to define what a reservoir is. Following equilibrium thermodynamics, we formally define an inverse temperature $\beta$ as

$$ \beta = \frac{1}{k_B}\,\frac{\partial S}{\partial E}. \quad (3) $$

We make no claim about the equality of $\beta$ and the classical equilibrium inverse temperature; $\beta$ is the expansivity of phase space with energy and as such can be defined for any system, whether it is in thermodynamic equilibrium or not. When an isolated system is prepared far from equilibrium (e.g., when it has a local equilibrium temperature which varies over the system), then $\beta$ is still uniquely defined for the system as a nonlocal property of the energy shell that the system resides on. Because both energy and entropy in the weak coupling limit are extensive quantities, $\beta$ must be an intensive quantity.
Now consider a large isolated system with total (internal) energy $E_0$. Let this system receive energy $E$ from the environment. By expanding its entropy in powers of $E$, we can then write the entropy of this large system as

$$ S(E_0 + E) = S(E_0) + k_B \beta E + \tfrac{1}{2} k_B \frac{\partial \beta}{\partial E}\, E^2 + \cdots. \quad (4) $$

We see that for finite $\beta$, $S$ has to be an extensive quantity. But that means that for a very large system $\partial \beta / \partial E = O(1/N)$, where $N$ is a measure of the size of the system (such as particle number). For a classical thermodynamic system $\partial \beta / \partial E = -1/(k_B T^2 C_V)$, with $C_V$ the heat capacity at constant volume. We conclude that for a very large system ($N \to \infty$), the entropy equals

$$ S(E_0 + E) = S(E_0) + k_B \beta E \quad (5) $$

for all relevant, finite energy exchanges $E$. This expression for the entropy defines a reservoir. The size of the energy shell accessible to the reservoir is, for all relevant energy exchanges $E$, exactly proportional to $\exp(\beta E)$, with $\beta$ an intensive and constant property of the reservoir. We do not require the reservoir to be in thermodynamic equilibrium. A change of energy in the reservoir pushes the reservoir to a different energy shell; the functional dependence of the size of the energy shell on energy defines the inverse temperature $\beta$, as in (3). However, it is not assured that a small and fast thermometer would measure an inverse temperature equal to $\beta$ at some point in the reservoir; only if the reservoir is allowed to equilibrate is its inverse temperature everywhere equal to $\beta$. Of course, this is how the temperature of a classical reservoir is determined in practice.
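The reservoir limit can be illustrated with a hypothetical monatomic-ideal-gas entropy $S(E) = \tfrac{3}{2} N \ln E$ (our example, with $k_B = 1$ and additive constants dropped): the error of the linearised reservoir entropy, at fixed exchanged energy, vanishes as the reservoir grows.

```python
import math

# Sketch under assumptions: a reservoir with ideal-gas-like entropy
# S(E) = (3N/2) ln E  (k_B = 1), so beta = dS/dE = 3N/(2 E0).
# For large N the linearisation S(E0 - E) ~ S(E0) - beta*E becomes exact.
def lin_error(N, E_exch):
    """Absolute error of the linearised entropy for a reservoir of size N."""
    E0 = float(N)                      # fix mean energy per particle at 1
    S = lambda E: 1.5 * N * math.log(E)
    beta = 1.5 * N / E0                # inverse temperature, eq.-(3)-style
    exact = S(E0 - E_exch)             # reservoir gives up energy E_exch
    linear = S(E0) - beta * E_exch
    return abs(exact - linear)

# The error shrinks ~1/N as the reservoir grows, at fixed exchange:
errs = [lin_error(N, 1.0) for N in (10**2, 10**4, 10**6)]
assert errs[0] > errs[1] > errs[2]
```

The leading correction is the second-order term of the expansion, which scales as $1/N$, so any finite energy exchange becomes "relevant" in the sense of the text once the reservoir is large enough.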
Now suppose that a system of interest can exchange heat with a reservoir. If the system has energy $E$, the reservoir must have given up energy $E$. We can write the hyperarea of the energy shell of the system as a function of $E$, $W(E)$. The total entropy of the system plus reservoir can then be written as a function of the exchange energy $E$ as

$$ S_{tot}(E) = S(E) + S_{res}(E_0) - k_B \beta E, \quad (6) $$

with $S(E) = k_B \ln W(E)$. The number of states at each level of exchange energy is therefore proportional to

$$ p(E) \propto \exp\!\big( S(E)/k_B - \beta E \big), \quad (7) $$

where we omitted proportionality constants related to the additive entropy constants. Nowhere do we assume that the system is in equilibrium with the reservoir. This means that (7) is the relevant measure to construct an ensemble average for the system, even for far-from-equilibrium systems. Even the reservoir can be locally out of equilibrium, as discussed previously. We have also made no reference to the size of the system of interest, as long as it is much smaller than the reservoir. However, in contrast to systems in thermodynamic equilibrium, there is no guarantee that the extensive macroscopic variables, such as $E$, $V$, or $N$, define the state of the system in any reproducible sense. To fully define an out-of-equilibrium system we need to introduce order parameters that can describe the nonequilibrium aspects of the system.
The density in (7) is an integrated version of the usual canonical distribution. The size of the energy shell of the system of interest, $W(E)$, can be written as an integral over states $x$ such that

$$ W(E) = \int \delta\big(E - H(x)\big)\,\mathrm{d}x, \quad (8) $$

with $H(x)$ being the Hamiltonian of the system of interest. With this definition, the density in (7) reduces to the usual canonical distribution $p(x) \propto \exp(-\beta H(x))$ for states $x$. We will not make further use of this microscopic version of the density.
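The integrated density (7) can be checked on a small discrete system. In this sketch (our choice of system, not from the paper; $k_B = 1$) the system of interest is $n$ independent two-level units with level energies 0 and 1, so the degeneracy of energy shell $E$ is $W(E) = \binom{n}{E}$, and (7) should reduce to a binomial distribution over $E$:

```python
import math

# Hedged sketch: n independent two-level units (level energies 0 and 1).
# Degeneracy W(E) = C(n, E), so p(E) ∝ W(E) exp(-beta E) as in (7).
# This must coincide with a binomial distribution with occupation
# probability q = e^{-beta} / (1 + e^{-beta}).
n, beta = 10, 0.7
weights = [math.comb(n, E) * math.exp(-beta * E) for E in range(n + 1)]
Z = sum(weights)                       # partition function (normalisation)
p = [w / Z for w in weights]           # integrated canonical density

q = math.exp(-beta) / (1.0 + math.exp(-beta))
binom = [math.comb(n, E) * q**E * (1 - q)**(n - E) for E in range(n + 1)]

assert abs(sum(p) - 1.0) < 1e-12
assert all(abs(a - b) < 1e-12 for a, b in zip(p, binom))
```

The agreement is exact because $Z = (1 + e^{-\beta})^n$ factorises over the independent units; the point of the check is that the degeneracy factor $W(E)$, not the microscopic dynamics, carries all the counting information.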
3. Fluctuation Theorems
The canonical density in (7) can be expanded by parametrizing each energy shell with some continuous coordinate $x$, so that every part of phase space has coordinates $(E, x)$. The coordinate $x$ is again a macroscopic coordinate, so that any combination $(E, x)$ can correspond to many microscopic states. At each value of $E$ the differential $w(E, x)\,\mathrm{d}E\,\mathrm{d}x$ is proportional to the number of states between coordinate values $x$ and $x + \mathrm{d}x$, and $E$ and $E + \mathrm{d}E$, and it is normalised such that

$$ \int w(E, x)\,\mathrm{d}x = W(E). \quad (9) $$

The parametrisation is arbitrary at this point and can be chosen so as to divide the phase space into as fine a structure as desired for a given application. We can again define an entropy as the logarithm of the number of available states for the system of interest corresponding to the subset of phase space defined by $(E, x)$:

$$ S(E, x) = k_B \ln w(E, x). \quad (10) $$
Now consider a process that occurs on the energy shell where some variable changes from $A$ to $B$. On the parametrized energy shell this corresponds to a coordinate shift from $x_A$ to $x_B$. The number of corresponding states changes from $w(E, x_A)$ to $w(E, x_B)$. We can use detailed balance to express the ratio of the probability of making this transition to the probability of making the reverse transition as the ratio of the number of states at $x_B$ to the number of states at $x_A$:

$$ \frac{P(A \to B)}{P(B \to A)} = \exp(\Delta S / k_B), \quad (11) $$

where $\Delta S = S(E, x_B) - S(E, x_A)$. If, in addition, during the process the energy of the system of interest changes from $E_A$ to $E_B$ through exchange with the reservoir, then the above ratio of probabilities can still be expressed as (11), but now with

$$ \Delta S = S(E_B, x_B) - S(E_A, x_A) - k_B \beta\,(E_B - E_A). \quad (12) $$

We can always write the entropy change of the system of interest as the sum of the entropy change due to heat exchange with the reservoir and an irreversible entropy change $\Delta S_i$ associated with uncompensated heat [14, 18], namely, $S(E_B, x_B) - S(E_A, x_A) = k_B \beta\,(E_B - E_A) + \Delta S_i$. We thus conclude that $\Delta S = \Delta S_i$; that is, the relevant entropy change in (11) equals the irreversible entropy change of the system of interest. So for processes that occur either on or across energy shells, we have

$$ \frac{P(A \to B)}{P(B \to A)} = \exp(\Delta S_i / k_B), \quad (13) $$

with $\Delta S_i$ being the irreversible entropy change of the system in a process $A \to B$. The right-hand side of this equation depends only on the irreversible entropy change between the two states of the system of interest, so this equation must be true for any pair of states that are related by the same irreversible entropy change. We thus arrive at the fluctuation theorem [1, 2]:

$$ \frac{P(\Delta S_i)}{P(-\Delta S_i)} = \exp(\Delta S_i / k_B), \quad (14) $$

with $P(\Delta S_i)$ being the probability that the system of interest makes a transition with irreversible entropy change $\Delta S_i$ and $P(-\Delta S_i)$ being the probability of the opposite change.
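The detailed-balance step above can be made concrete with a toy Markov chain (our own example, not from the paper; $k_B = 1$): give three macrostates entropies $S_i$, build Metropolis transition probabilities whose stationary weights are the state counts $e^{S_i}$, and verify that the ratio of forward to reverse transition probabilities is $e^{\Delta S}$, the content of the fluctuation theorem (14) for this chain.

```python
import math

# Toy check (k_B = 1): Metropolis chain with stationary weights W_i = e^{S_i}.
# Detailed balance forces P(A -> B) / P(B -> A) = exp(S_B - S_A).
S = [0.0, 1.3, 2.1]                  # entropies of three macrostates
W = [math.exp(s) for s in S]         # corresponding state counts
n = len(S)

def T(i, j):
    """Metropolis transition probability i -> j, uniform proposals."""
    return (1.0 / (n - 1)) * min(1.0, W[j] / W[i])

for a in range(n):
    for b in range(n):
        if a != b:
            assert abs(T(a, b) / T(b, a) - math.exp(S[b] - S[a])) < 1e-12
```

Nothing in the construction refers to an underlying Hamiltonian; the ratio is fixed purely by the counts $W_i$, which is the point of the derivation in the text.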
The fluctuation theorem applies to spontaneous processes that occur in thermostated but otherwise isolated systems. We next consider processes that occur when we modify the system of interest by changing some external macroscopic parameter. The entropy of the energy shell is then also a function of some parameter $\lambda$, namely, $S(E, \lambda)$. Without loss of generality we set $\lambda = 0$ at $A$ and $\lambda = 1$ at $B$. If the system absorbs work $W$ in the process, the heat received from the reservoir is $E_B - E_A - W$, and the irreversible entropy change in (13) becomes

$$ \Delta S_i = S(E_B, 1) - S(E_A, 0) - k_B \beta\,(E_B - E_A - W). \quad (15) $$

Apart from this, there is no change in the considerations leading to the fluctuation theorem. By definition, thermostated systems that receive work $W$ from their environment have an irreversible entropy change equal to

$$ \Delta S_i = k_B \beta\,(W - \Delta F), \quad (16) $$

with $\Delta F$ being the change in free energy going from $A$ to $B$. Recognising that the right-hand side is again only a function of the difference between the two states, we arrive at the Crooks fluctuation theorem [19]:

$$ \frac{P_{0 \to 1}(W)}{P_{1 \to 0}(-W)} = \exp\!\big(\beta\,(W - \Delta F)\big), \quad (17) $$

with $P_{0 \to 1}(W)$ being the probability that the system absorbs work $W$ when $\lambda$ changes from 0 to 1, and $P_{1 \to 0}(-W)$ being the probability that the system performs work $W$ when $\lambda$ changes in reverse from 1 to 0. Because the transition probabilities can be normalised with respect to the exchanged work, it is straightforward to use this equation to show that the expectation value of $\exp(-\beta\,(W - \Delta F))$ equals unity, or equivalently,

$$ \langle \exp(-\beta W) \rangle = \exp(-\beta \Delta F). \quad (18) $$

This is the Jarzynski equality [3].
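A standard exactly solvable check of the Jarzynski equality (18), not specific to this paper, is a Gaussian work distribution: if $W$ is Gaussian with mean $\mu$ and variance $\sigma^2$, then $\langle e^{-\beta W} \rangle = e^{-\beta\mu + \beta^2\sigma^2/2}$, so consistency with (18) requires $\Delta F = \mu - \beta\sigma^2/2 < \langle W \rangle$. The sketch below samples such a work distribution and recovers $\Delta F$ from the exponential average:

```python
import math
import random

# Hedged sketch: Gaussian work distribution, a textbook solvable case.
# Jarzynski average <exp(-beta W)> should give DeltaF = mu - beta*sigma^2/2.
random.seed(0)
beta, mu, sigma = 1.0, 2.0, 0.5
n = 200_000

works = [random.gauss(mu, sigma) for _ in range(n)]
jarzynski_avg = sum(math.exp(-beta * w) for w in works) / n
dF = -math.log(jarzynski_avg) / beta

dF_exact = mu - beta * sigma**2 / 2.0     # = 1.875 for these parameters
assert abs(dF - dF_exact) < 0.02
assert dF < sum(works) / n                # <W> >= DeltaF: second law
```

Note that the mean work (2.0) exceeds the recovered free energy change, and the rare trajectories with $W < \Delta F$ dominate the exponential average, which is the well-known practical difficulty of using (18) numerically.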
The consistency of the previous argument is strengthened by the following independent route to calculate free energy changes. The phase space measure can be normalised with the partition function $Z(\lambda)$:

$$ Z(\lambda) = \int \exp\!\big( S(E, \lambda)/k_B - \beta E \big)\,\mathrm{d}E, \quad (19) $$

where $\exp(S(E, \lambda)/k_B)$ is proportional to the number of accessible states of the isolated system of interest when the external parameter is set to $\lambda$. The equilibrium free energy for the thermostated system is

$$ F(\lambda) = -\beta^{-1} \ln Z(\lambda). \quad (20) $$

Next we consider what happens to the equilibrium free energy of the system when we vary $\lambda$ from 0 to 1. The partition function at $\lambda = 1$ satisfies

$$ \frac{Z(1)}{Z(0)} = \big\langle \exp\!\big( \Delta S_{sys}/k_B - \beta\,\Delta E \big) \big\rangle_0, \quad (21) $$

where $\langle \cdot \rangle_0$ denotes an ensemble average over the initial ensemble, $\Delta S_{sys} = S(E_1, 1) - S(E_0, 0)$, and $\Delta E = E_1 - E_0$. As before, the entropy change can be written as the sum of the entropy change due to heat exchange with the reservoir and the irreversible entropy change due to uncompensated heat. Because the system plus the reservoir is thermally insulated, any heat given to the reservoir must be compensated by work performed by the external parameter change. The entropy change can therefore be written as

$$ \Delta S_{sys} = k_B \beta\,(\Delta E - W) + \Delta S_i, \quad (22) $$

so that we find

$$ \frac{Z(1)}{Z(0)} = \big\langle \exp\!\big( \Delta S_i/k_B - \beta W \big) \big\rangle_0. \quad (23) $$

Because (16) is true for any microscopic realisation of the process, the argument of the average on the right-hand side is the same for every realisation and is equal to $\exp(-\beta \Delta F)$. This is consistent with the equilibrium expression for the free energy, (20), from which it follows that $Z(1)/Z(0) = \exp(-\beta \Delta F)$. This result is only apparently in contradiction with the Jarzynski equality, (18). To arrive at the Jarzynski equality we recognise that (16) implies that $\langle \exp(-\beta W) \rangle = \exp(-\beta \Delta F)\,\langle \exp(-\Delta S_i/k_B) \rangle = \exp(-\beta \Delta F)$, where the last equality follows from integrating the fluctuation theorem over all values of $\Delta S_i$.
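The partition-function route can be checked exactly in a special case (our example, not from the paper): for an instantaneous switch of $\lambda$ from 0 to 1, the work on each realisation is just the energy jump at fixed state, and the exponential work average equals $Z(1)/Z(0)$ identically, consistent with the free energy (20).

```python
import math

# Sketch under assumptions: a three-state system whose energies depend on
# lambda.  For an instantaneous switch 0 -> 1 the work is
# W = E(x, 1) - E(x, 0) with x drawn from the initial canonical ensemble,
# and <exp(-beta W)>_0 = Z(1)/Z(0) = exp(-beta * DeltaF) holds exactly.
beta = 0.8
E0 = [0.0, 1.0, 2.5]          # energies at lambda = 0 (illustrative values)
E1 = [0.3, 1.8, 2.0]          # energies at lambda = 1

Z0 = sum(math.exp(-beta * e) for e in E0)
Z1 = sum(math.exp(-beta * e) for e in E1)
dF = -math.log(Z1 / Z0) / beta            # free energy change, (20)-style

p0 = [math.exp(-beta * e) / Z0 for e in E0]   # initial canonical ensemble
avg = sum(p * math.exp(-beta * (e1 - e0))
          for p, e0, e1 in zip(p0, E0, E1))   # Jarzynski average

assert abs(avg - math.exp(-beta * dF)) < 1e-12
```

The identity is algebraic here (each term $e^{-\beta E_0}e^{-\beta(E_1 - E_0)} = e^{-\beta E_1}$), which makes the instantaneous quench a convenient unit test before attempting finite-rate protocols.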
4. Discussion
We have shown that the fluctuation theorem (14) and the Jarzynski equality (18) follow from general counting properties of entropy and not from the underlying dynamics. As such, we expect both results to be widely applicable to systems that are in some sense thermostated, that is, systems that are able to settle on a given expectation value for the total energy by interaction with a reservoir.
The climate system is potentially a nontrivial example of such a system: the incoming short-wave radiation from the Sun is balanced by long-wave (thermal infrared) radiation from the Earth to space. The corresponding equilibrium temperature is the bolometric radiation temperature of the planet, about 255 K in the case of the Earth [14]. (The bolometric radiation temperature of the Earth is substantially lower than the observed average surface temperature of about 288 K because of the greenhouse effect of the atmosphere.) It is not obvious how to apply the fluctuation theorems to the climate system, or how the entropy production in the climate system is related to the actual climate on Earth. For example, most of the entropy production in the climate system is due to the degradation of radiation (e.g., [20]): short-wavelength visible sunlight is thermalized by molecular absorption into thermal energy that is reemitted as long-wavelength infrared radiation. But because this entropy production resides only in the photon field, its relation to, for example, kinetic energy dissipation in the atmosphere is not clear. It appears, then, that we need to select the relevant forms of entropy production before we can use entropy production to make inferences about the climate system.
It remains to be seen whether the fluctuation theorems can be usefully applied to complex systems such as the climate, but we believe that the derivation presented here can pave the way for attempts in that direction.
- D. J. Evans, E. G. D. Cohen, and G. P. Morriss, “Probability of second law violations in shearing steady states,” Physical Review Letters, vol. 71, no. 15, pp. 2401–2404, 1993.
- D. J. Evans and D. J. Searles, “The fluctuation theorem,” Advances in Physics, vol. 51, no. 7, pp. 1529–1585, 2002.
- C. Jarzynski, “Nonequilibrium equality for free energy differences,” Physical Review Letters, vol. 78, no. 14, pp. 2690–2693, 1997.
- C. E. Shannon and W. Weaver, A Mathematical Theory of Communication, The University of Illinois Press, Champaign, Ill, USA, 1963.
- R. T. Cox, The Algebra of Probable Inference, The Johns Hopkins University Press, Baltimore, Md, USA, 1961.
- E. T. Jaynes, Probability Theory: The Logic of Science, Cambridge University Press, 2003.
- L. Onsager, “Statistical hydrodynamics,” Nuovo Cimento, Supplemento, vol. 6, pp. 279–287, 1949.
- R. Robert and J. Sommeria, “Statistical equilibrium states for two-dimensional flows,” Journal of Fluid Mechanics, vol. 229, pp. 291–310, 1991.
- R. Baierlein, Thermal Physics, Cambridge University Press, 1999.
- E. T. Jaynes, “Information theory and statistical mechanics,” The Physical Review, vol. 106, no. 4, pp. 620–630, 1957.
- G. W. Paltridge, “Global dynamics and climate—a system of minimum entropy exchange,” Quarterly Journal of the Royal Meteorological Society, vol. 101, no. 429, pp. 475–484, 1975.
- H. Ozawa, A. Ohmura, R. D. Lorenz, and T. Pujol, “The second law of thermodynamics and the global climate system: a review of the maximum entropy production principle,” Reviews of Geophysics, vol. 41, no. 4, pp. 1–24, 2003.
- A. Kleidon and R. D. Lorenz, Non-Equilibrium Thermodynamics and the Production of Entropy: Life, Earth, and Beyond, Understanding Complex Systems, Springer, Berlin, Germany, 2005.
- M. H. P. Ambaum, Thermal Physics of the Atmosphere, Wiley-Blackwell, Chichester, UK, 2010.
- S. Pascale, J. M. Gregory, M. H. P. Ambaum, R. Tailleux, and V. Lucarini, “Vertical and horizontal processes in the global atmosphere and the maximum entropy production conjecture,” Earth System Dynamics, vol. 3, no. 1, pp. 19–32, 2012.
- E. T. Jaynes, “Prior probabilities,” IEEE Transactions on Systems Science and Cybernetics, vol. 4, no. 3, pp. 227–241, 1968.
- E. T. Jaynes, “Gibbs vs Boltzmann entropies,” American Journal of Physics, vol. 33, no. 5, pp. 391–398, 1965.
- D. Kondepudi and I. Prigogine, Modern Thermodynamics, John Wiley & Sons, Chichester, UK, 1998.
- G. E. Crooks, “Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences,” Physical Review E, vol. 60, no. 3, pp. 2721–2726, 1999.
- S. Pascale, J. Gregory, M. Ambaum, and R. Tailleux, “Climate entropy budget of the HadCM3 atmosphere-ocean general circulation model and of FAMOUS, its low-resolution version,” Climate Dynamics, vol. 36, pp. 1189–1206, 2011.