Journal of Thermodynamics

Volume 2016, Article ID 9137926, 13 pages

http://dx.doi.org/10.1155/2016/9137926

## Historical Prospective: Boltzmann’s versus Planck’s State Counting—Why Boltzmann Did Not Arrive at Planck’s Distribution Law

Peter Enders^{1,2}

^{1}Technical University of Applied Sciences Wildau, Hochschulring 1, 15745 Wildau, Germany
^{2}Kazakh National Pedagogical Abai University, Dostyk avenue 13, Almaty 050010, Kazakhstan

Received 7 August 2015; Accepted 11 November 2015

Academic Editor: Shripad T. Revankar

Copyright © 2016 Peter Enders. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

Why does Planck (1900), referring to Boltzmann’s 1877 probabilistic treatment, obtain his *quantum* distribution function while Boltzmann did not? To answer this question, both treatments are compared on the basis of Boltzmann’s 1868 three-level scheme (configuration—occupation—occupancy). Some calculations by Planck (1900, 1901, and 1913) and Einstein (1907) are also sketched. For obtaining a quantum distribution, it is crucial to stick with a discrete energy spectrum and to make the limit transitions to infinity at the right place. For correct state counting, the concept of interchangeability of particles is superior to that of indistinguishability.

#### 1. Introduction

Very recently, Sharp and Matschinsky have translated and commented on Boltzmann’s famous 1877 paper [1] “On the Relationship between the Second Fundamental Theorem of the Mechanical Theory of Heat and Probability Calculations regarding the Conditions for Thermal Equilibrium” [2]. As a matter of fact, they have done a great service to the scientific community^{1}.

Barely any of Boltzmann’s original scientific work is available in translation. This is remarkable given his central role in the development of both equilibrium and non-equilibrium statistical mechanics, his statistical mechanical explanation of entropy, and our understanding of the Second Law of thermodynamics. What Boltzmann actually wrote on these subjects is rarely quoted directly, his methods are not fully appreciated, and key concepts have been misinterpreted. Yet his work remains relevant today. (*Ibid*., pp. 1971f.)

The paper “exemplifies several of Boltzmann’s most important contributions to modern physics. These include:

(1) The eponymous Boltzmann distribution, relating the energy scaled by the mean kinetic energy (temperature)…

(2) Much of the theoretical apparatus of statistical mechanics is developed with great clarity… His terminology…is incisive, in some ways superior to the two modern terms macro-state and micro-state…

(3) The statistical mechanical formulation of entropy…

(4) …Boltzmann also clearly demonstrates that there are two distinct contributions to entropy, arising from the distribution of heat (kinetic energy) and the distribution in space of atoms or molecules… It is fitting that Boltzmann was the one to discover the third fundamental contribution to entropy, namely, radiation, by deriving the Stefan-Boltzmann Law [3]…” (*ibid*., pp. 1972f.)

(5) Boltzmann’s “permutability measure,” Ω (3/2 of Clausius’ entropy, S), is constructed as an *extensive* quantity. “Thus Boltzmann never encounters the apparent Gibbs paradox for the entropy of mixing of identical gases. Furthermore, with Boltzmann’s Permutabilitätmass method for counting states, there is no need for *a posteriori* division by N! to ‘correct’ the derivation using the ‘somewhat mystical arguments of Gibbs^{2} and Planck [4],’ nor a need to appeal to quantum indistinguishability, which has been implausibly described as the appearance of quantum effects at the macroscopic classical level [5]. Subsequently, at least four distinguished practitioners of statistical mechanics have pointed out that correct counting of states *à la* Boltzmann obviates the need for the spurious indistinguishability/N! term: Ehrenfest and Trkal [6], Van Kampen [7], Jaynes [8], Swendsen [9] (and possibly Pauli [10]^{3}). This has had little impact on textbooks of statistical mechanics.^{4} An exception is the treatise by Gallavotti [11]” (*ibid*., p. 1974).

Indeed, Jaynes [8] argues similarly: “Some important facts about thermodynamics have not been understood by others to this day, nearly as well as Gibbs understood them over 100 years ago [12]… For 80 years it has seemed natural that, to find what Gibbs had to say about this, one should turn to his Statistical Mechanics. For 60 years, textbooks and teachers (including, regrettably, the present writer) have impressed upon students how remarkable it was that Gibbs, already in [13], had been able to hit upon this paradox which foretold—and had its resolution only in—quantum theory with its lore about indistinguishable particles, Bose and Fermi statistics, etc.^{5} It was therefore a shock to discover that…Gibbs [in “Heterogeneous Equilibrium”] displays a full understanding of this problem, and disposes of it without a trace of that confusion over the ‘meaning of entropy’ or ‘operational distinguishability of particles’ on which later writers have stumbled. He goes straight to the heart of the matter as a simple technical detail, easily understood as soon as one has grasped the full meanings of the words ‘state’ and ‘reversible’ as they are used in thermodynamics. In short, quantum theory did not resolve any paradox, because there was no paradox. Today, the universally taught conventional wisdom holds that ‘Classical mechanics failed to yield an entropy function that was extensive, and so statistical mechanics based on classical theory gives qualitatively wrong predictions of vapor pressures and equilibrium constants, which was cleared up only by quantum theory in which the interchange of identical particles is not a real event.’ We argue that, on the contrary, phenomenological thermodynamics, classical statistics, and quantum statistics are all in just the same logical position with regard to extensivity of entropy; they are silent on the issue, neither requiring it nor forbidding it.”

Last but not least, Boltzmann’s statistical definition of entropy is the first one that applies to nonequilibrium states, thus “opening the door to the statistical mechanics of non-equilibrium states and irreversible processes” [2, p. 1974].

In this paper, I will concentrate on Boltzmann’s manner of state counting and its consequences for classical and quantum statistics. Boltzmann accounts for the interchangeability of equal particles; nevertheless, he does *not* obtain Planck’s distribution function, while Planck—starting with similar probabilistic settings [14, 15] or even with the same setting [4]—did.

Moreover, I will sketch some calculations by Einstein [16]. In Einstein’s pioneering paper about the specific heat of solids, Planck’s distribution law emerges from the discreteness of the energy spectrum of Planck’s resonators, when compared with the continuous energy spectrum of classical resonators. From this, Einstein concludes that quantization is a selection problem of states.

#### 2. Boltzmann’s 1868 Probabilistic Scheme

According to Bach [17, Sections 3.2.1, 5.1], Boltzmann [18] invents the scheme of distributing N equal particles over K cells. Since, in Boltzmann’s 1877 paper, the cells are energy levels, I will continue with pairwise different cells.

This scheme involves 3 levels of description:

(1) configurations,
(2) occupation numbers,
(3) occupancy numbers.

##### 2.1. Level 1: Configurations

A configuration is the most detailed description of the distribution of the particles over the cells. For each particle, i (i = 1, …, N), it provides the number, j_i, of the cell in which it is located (1 ≤ j_i ≤ K).

This can be realized by means of a matrix, M = (m_{ij}), where m_{ij} = 1 (0) if particle i is (is not) in cell j. This matrix can be condensed into the configuration vector, **j** = (j_1, j_2, …, j_N), where j_i is the number of the cell in which particle i is located (i = 1, …, N). There are altogether K^N different configurations [17, p. 58].

The configuration is a complete description. However, since the particles are identical, this description is redundant when applied to physical systems like monoatomic gases. The interchange of two atoms does not change the physical properties of the gas. This fact is accounted for in levels 2 and 3.
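To make this counting concrete, here is a small illustrative sketch (not part of Boltzmann’s or Bach’s text; the helper `configurations` and the values of N and K are my own choices) that enumerates all configuration vectors **j** and checks the count K^N:

```python
from itertools import product

def configurations(N, K):
    """All configuration vectors j = (j_1, ..., j_N), where j_i is the cell of particle i."""
    return list(product(range(1, K + 1), repeat=N))

confs = configurations(3, 2)   # N = 3 particles, K = 2 cells
print(len(confs))              # K**N = 2**3 = 8 configurations
```

Interchanging particles 1 and 2 turns, for example, (1, 2, 1) into (2, 1, 1): a different configuration but, physically, the same state of the gas.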

##### 2.2. Level 2: Occupation Numbers

Occupation numbers represent a condensed description of the distribution of the particles over the cells; they remove the permutation redundancy in the configurations just mentioned. Two configuration vectors that contain the same numbers in merely different sequence, such as (1, 2, 2) and (2, 1, 2), are physically equivalent. In other words, what is relevant is not the complete information of which particle is in which cell, but only the numbers of particles in the cells. The latter are recorded in the occupation number, **k** = (k_1, k_2, …, k_K): there are k_j particles in cell j (j = 1, …, K). The occupation numbers are invariant under any permutation of the particles [17, p. 59].

We have the obvious constraint

k_1 + k_2 + ⋯ + k_K = N. (4)

Altogether, there are

(N + K − 1)! / [N! (K − 1)!]

different occupation numbers, with

N! / (k_1! k_2! ⋯ k_K!)

configurations for each occupation number, **k** [17, p. 59].
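Both counting formulas can be checked by brute force for small N and K. This is an illustrative sketch (the helper `occupation_number` and the chosen N, K are mine), grouping all K^N configurations by their occupation number:

```python
from itertools import product
from math import comb, factorial
from collections import Counter

def occupation_number(conf, K):
    """k = (k_1, ..., k_K): number of particles in each cell of a configuration."""
    return tuple(conf.count(j) for j in range(1, K + 1))

N, K = 4, 3
groups = Counter(occupation_number(c, K)
                 for c in product(range(1, K + 1), repeat=N))

# binomial(N + K - 1, N) distinct occupation numbers ...
assert len(groups) == comb(N + K - 1, N)
# ... and N!/(k_1! * k_2! * ... * k_K!) configurations for each of them
for k, count in groups.items():
    expected = factorial(N)
    for kj in k:
        expected //= factorial(kj)
    assert count == expected
print(len(groups))   # 15
```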

##### 2.3. Level 3: Occupancy Numbers

A further condensation of the state description consists in the question: how many cells host 0 particles, 1 particle, …, N particles? It is answered by the occupancy number vector, **z** = (z_0, z_1, …, z_N): there are z_n cells with exactly n particles, where n = 0, 1, …, N [17, eq. (3.55)].

Now, we have two constraints:

z_0 + z_1 + ⋯ + z_N = K, (8a)

0·z_0 + 1·z_1 + 2·z_2 + ⋯ + N·z_N = N. (8b)

We are now prepared to compare Boltzmann’s and Planck’s treatments on an equal footing, in particular, their basic entities, the “complexions.”
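A short sketch of this third level (again illustrative; the helper `occupancy_number` is my naming): the occupancy vector follows from an occupation vector, and both constraints are easy to verify:

```python
def occupancy_number(k, N):
    """z = (z_0, ..., z_N): z_n = number of cells hosting exactly n particles."""
    return tuple(sum(1 for kj in k if kj == n) for n in range(N + 1))

N, K = 4, 3
k = (2, 0, 2)                  # an occupation number: 4 particles in 3 cells
z = occupancy_number(k, N)
print(z)                       # (1, 0, 2, 0, 0)
assert sum(z) == K                                   # (8a): the z_n sum to the number of cells
assert sum(n * zn for n, zn in enumerate(z)) == N    # (8b): the particles are all accounted for
```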

#### 3. Boltzmann’s 1877 Manner of State Counting

##### 3.1. Boltzmann’s Discrete Gas Model

For simplicity, Boltzmann begins (p. 1976) with an ideal gas model in which each of the molecules can assume only “a finite number of velocities, such as

0, 1/q, 2/q, 3/q, …, p/q,

where p and q are arbitrary finite numbers. Upon colliding, two molecules may exchange velocities, but after the collision both molecules still have one of the above velocities…” (p. 1976). Accordingly, the kinetic energy of each molecule can also assume only a finite number of values (proportional to the squares of these velocities). However, when considering solely the distribution of the various values of the kinetic energy over the molecules, it is simpler to assume the arithmetic progression

0, ε, 2ε, 3ε, …, pε.

Upon colliding, two molecules may exchange energy, but after the collision both molecules still have one of the above energies.

The total kinetic energy of the gas is E = λε = const.

##### 3.2. The Kinetic Energy Distribution

There are w_0 molecules having kinetic energy 0, w_1 molecules having kinetic energy 1ε, and so on, up to the w_p molecules having kinetic energy pε. The set (w_0, w_1, …, w_p) is the distribution of the kinetic energy “among” the molecules (*cf.* p. 1977).

(A) One can interpret the set (w_0, w_1, …, w_p) as the distribution of the λ energy portions, ε, over the molecules, the probabilistic scheme being energy portions → molecules (the portions play the role of the particles, the molecules that of the cells). Then, w_n tells the number of cells hosting n particles. This is the nth component of the *occupancy* number, **z**.

(B) In turn, one can reverse the roles of molecules and energy portions and interpret that set as the distribution of the molecules over the energy levels, 0, ε, …, pε, the probabilistic scheme being molecules → energy levels (the molecules play the role of the particles, the levels that of the cells). Then, w_n is the number of particles in cell n. This is the nth component of the *occupation* number, **k**.

The set (w_0, w_1, …, w_p) is subject to the constraints^{6}

w_0 + w_1 + ⋯ + w_p = n,

the number of molecules in the gas, and

0·w_0 + 1·w_1 + 2·w_2 + ⋯ + p·w_p = λ,

the total energy of the gas in units of ε. For case (A), they correspond to the constraints (8a) and (8b). For case (B), the first constraint is identical to (4), while the second one results additionally from the meaning of the cells as energy levels.
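Under both constraints, the admissible sets (w_0, …, w_p) can be enumerated directly. The following sketch (the helper `distributions` and the tiny example of n = 3 molecules sharing λ = 3 energy units are mine, not Boltzmann’s notation) lists them:

```python
from itertools import product

def distributions(n, lam, p):
    """All (w_0, ..., w_p) with w_0 + ... + w_p = n and 1*w_1 + ... + p*w_p = lam."""
    ranges = [range(n + 1)] + [range(min(n, lam // i) + 1) for i in range(1, p + 1)]
    return [w for w in product(*ranges)
            if sum(w) == n and sum(i * wi for i, wi in enumerate(w)) == lam]

for w in distributions(3, 3, 3):
    print(w)
# (0, 3, 0, 0), (1, 1, 1, 0), (2, 0, 0, 1) -- one per partition of 3
```

The three solutions correspond to the three partitions of 3: all molecules at ε, one molecule at each of 0, ε, 2ε, and two molecules at 0 with one at 3ε.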

##### 3.3. Complexions

Boltzmann continues, “As a preliminary, we will use a simpler schematic approach to the problem, instead of the exact case” (p. 1977). The energy levels 0, ε, 2ε, …, pε are distributed in all possible ways among the n molecules such that the total kinetic energy equals λε = const. “Any such distribution, in which the first molecule may have a kinetic energy of, e.g., 2ε, the second may have 6ε, and so on, up to the last molecule, we call a complexion…” (p. 1977). In other words, a complexion is the set (e_1, e_2, …, e_n), where e_i is the energy of molecule i in units of ε (i = 1, …, n).

(A) Literally, this means the distribution of the possible amounts of energy, 0, ε, 2ε, …, pε, over the molecules, the probabilistic scheme being energy portions → molecules (the ε-portions play the role of the particles, the molecules that of the cells). Then, e_i means the number of particles hosted by cell i. This is the ith component of the *occupation* number, **k**.

(B) In turn, one can reverse the roles of molecules and energy levels, again, and understand that set as the distribution of the molecules over the energy levels, 0, ε, …, pε, the probabilistic scheme being molecules → energy levels. Then, e_i means that particle i is in cell e_i. This is the ith component of the *configuration*, **j**. Boltzmann’s formula below supports this interpretation.

##### 3.4. The State Distribution

The state distribution “specifies the number of complexions for that distribution; in other words, it determines the likelihood of that state distribution. Dividing this number by the number of all possible complexions, we get the probability of the state distribution” (p. 1977).

Since a distribution of states does not determine the kinetic energies exactly, the goal is to describe the state distribution by writing as many zeros [0] as there are molecules with zero kinetic energy (w_0), as many ones [1] for those with kinetic energy ε (w_1), etc. All these zeros [0], ones [1], etc. are the elements defining the state distribution. (p. 1977)

As an example, Boltzmann considers the case n = 7, λ = 7, p = 7. “There are then 15 possible state distributions” (p. 1978). They are listed in Table 1.
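Boltzmann’s example is small enough to check by enumeration. In the following sketch the helpers and names are mine; the complexion count n!/(w_0! w_1! ⋯ w_p!) is the standard multinomial coefficient, counting the ways to assign the energies of a given state distribution to the individual molecules:

```python
from itertools import product
from math import factorial

def state_distributions(n, lam, p):
    """All (w_0, ..., w_p) with sum(w) = n molecules and sum(i * w_i) = lam energy units."""
    ranges = [range(n + 1)] + [range(min(n, lam // i) + 1) for i in range(1, p + 1)]
    return [w for w in product(*ranges)
            if sum(w) == n and sum(i * wi for i, wi in enumerate(w)) == lam]

def complexions(w, n):
    """Number of complexions of a state distribution: n! / (w_0! * w_1! * ... * w_p!)."""
    result = factorial(n)
    for wi in w:
        result //= factorial(wi)
    return result

dists = state_distributions(7, 7, 7)
print(len(dists))                              # 15 state distributions
print(sum(complexions(w, 7) for w in dists))   # 1716 complexions altogether
```

The total of 1716 complexions equals the number of ways to write 7 as an ordered sum of 7 nonnegative integers, i.e., binomial(13, 6).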