it respect many of the nuances of the subject; for these, the interested reader is directed to the many fine treatises devoted to it.

Most entropies possess a number of important physical and mathematical properties whose adequate discussion extends beyond the aims of this volume; these include additivity, subadditivity, concavity, invariance, insensitivity, continuity conditions, and monotonicity [24, 31, 39]. Briefly, for a system A composed of two subsystems A1 and A2 such that A = A1 + A2, the entropy is additive if S(A) = S(A1) + S(A2). For two systems A and B, the entropy is subadditive if their composite entropy when joined is never greater than the sum of their individual entropies; i.e., S(A + B) ≤ S(A) + S(B), with equality when A and B are uncorrelated. (Note that for additivity the subsystems (A1, A2) retain their individual identities, while for subadditivity the systems (A, B) lose their individual identities.) For systems A and B, entropy demonstrates concavity if

$$ S(\lambda A + (1-\lambda)B) \geq \lambda S(A) + (1-\lambda)S(B); \qquad 0 \leq \lambda \leq 1. $$

A workingman's summary of standard properties can be extracted from Gyftopoulos and Beretta [19]. Classical entropy must8:

a) be well defined for every system and state;
b) be invariant for any reversible adiabatic process (dS = 0) and increase for any irreversible adiabatic process (dS > 0);
c) be additive and subadditive for all systems, subsystems, and states;
d) be non-negative, and vanish for all states described by classical mechanics;
e) have one and only one state corresponding to the largest value of entropy;
f) be such that graphs of entropy versus energy for stable equilibria are smooth and concave; and
g) reduce to relations that have been established experimentally.

The following are summaries of the most common and salient formulations of entropy, spiced with a few distinctive ones. There are many more.

(1) Clausius [4] The word entropy was coined by Rudolf Clausius (1865) as a thermodynamic complement to energy. The en draws parallels to energy, while tropy derives from the Greek word τροπή, meaning change. Together, en-tropy evokes a quantitative measure for thermodynamic change9.

Entropy is a macroscopic measure of the microscopic state of disorder or chaos in a system. Since heat is a macroscopic measure of microscopic random kinetic energy, it is not surprising that early definitions of entropy involve it. In its original and most utilitarian form, entropy (or, rather, entropy change) is expressed in terms of heat Q and temperature T. For reversible thermodynamic processes, it is

$$ dS = \frac{\delta Q}{T}, \qquad (1.8) $$

while for irreversible processes, it is

$$ dS > \frac{\delta Q}{T}. \qquad (1.9) $$

These presume that T is well defined in the surroundings, thus foreshadowing the zeroth law. To establish fiduciary entropies, the third law is invoked. For systems "far" from equilibrium, neither entropy nor temperature is well defined.

8 Many physical systems in this volume do not abide by these restrictions, most notably additivity.
9 Strictly speaking, Clausius coined entropy to mean "in transformation."
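As a simple worked illustration of (1.8) and (1.9) (a standard textbook case, not drawn from the original text): for n moles of an ideal gas expanding reversibly and isothermally from volume V1 to V2, the internal energy is unchanged, so the heat absorbed equals the work done, Q = nRT ln(V2/V1), and

$$ \Delta S = \int \frac{\delta Q}{T} = \frac{Q}{T} = nR \ln\frac{V_2}{V_1}. $$

Because entropy is a state function, the same ΔS accompanies an irreversible free expansion between the same end states; there δQ = 0, so (1.9) holds as a strict inequality, ΔS > ∫ δQ/T = 0.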
(2) Boltzmann-Gibbs [40, 41] The most famous classical formulation of entropy is due to Boltzmann:

$$ S_{BG,\mu} = S(E,N,V) = k \ln \Omega(E,N,V) \qquad (1.10) $$

Here Ω(E,N,V) is the total number of distinct microstates (complexions) accessible to a system of energy E and particle number N in volume V. The Boltzmann relation provides the first and most important bridge between microscopic physics and equilibrium thermodynamics. It carries with it a minimum number of assumptions and, therefore, is quite general. It applies directly to the microcanonical ensemble (fixed E, N, V), but, with appropriate inclusion of heat and particle reservoirs, also to the canonical and grand canonical ensembles. In principle, it applies to both extensive and nonextensive systems and does not presume the standard thermodynamic limit (i.e., infinite particle number and volume [N → ∞, V → ∞] at finite density [N/V = C < ∞]) [38]; it can be used with boundary conditions, which often handicap other formalisms; and it does not presume temperature. However, ergodicity (or quasi-ergodicity) is presumed, in that the system's phase space trajectory is assumed to visit smoothly and uniformly all neighborhoods of the (6N−1)-dimensional constant-energy manifold consistent with Ω(E,N,V)10.

The Gibbs entropy is similar to Boltzmann's except that it is defined via ensembles: distributions of points in classical phase space consistent with the macroscopic thermodynamic state of the system. Hereafter, it is called the Boltzmann-Gibbs (BG) entropy. Like other standard forms of entropy, S_{BG,µ} applies strictly to equilibrium systems.

Note that Ω is not well defined for classical systems since phase space variables are continuous. To remedy this, phase space can be measured in unit volumes, often in units of ℏ. This motivates coarse-grained entropy. Coarse-graining reduces the information contained in Ω and may be best described as a kind of phase-space averaging procedure for a distribution function. The coarse-grained distribution leads to a proper increase of the corresponding statistical (information) entropy. A perennial problem with this, however, is that the averaging procedure is not unique, so the rate of entropy increase is likewise not unique, in contrast to the presumably uniquely defined increase of the thermodynamic entropy.

10 Alternatively, ergodicity is defined as the condition that the ensemble-averaged and time-averaged thermodynamic properties of a system be the same.
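To make the counting in (1.10) concrete, consider a toy model (illustrative only, not from the original text): N independent two-state spins, of which n_up carry excitation energy ε, so that E = n_up ε and Ω(E,N) is a binomial coefficient. The minimal sketch below computes S = k ln Ω and checks additivity for two such weakly coupled subsystems, for which Ω(A) = Ω(A1)Ω(A2).

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann constant, J/K

def S_boltzmann(N, n_up):
    """S = k ln Omega for N two-state spins with n_up excited.

    Omega(E, N) = C(N, n_up): the number of distinct microstates
    (complexions) sharing the same total energy E = n_up * eps.
    """
    return k * log(comb(N, n_up))  # math.log handles arbitrarily large ints

# Two subsystems A1 and A2 of a composite system A = A1 + A2
S1 = S_boltzmann(1000, 300)
S2 = S_boltzmann(2000, 600)

# Microstate counts of independent subsystems multiply, so entropies add:
omega_A = comb(1000, 300) * comb(2000, 600)
S_A = k * log(omega_A)
assert abs(S_A - (S1 + S2)) < 1e-9 * S_A   # S(A) = S(A1) + S(A2)
print(S1, S2, S_A)
```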
Starting from S_{BG,µ}, the primary intensive parameters (temperature T, pressure P, and chemical potential µ) can be calculated [42-46]:

$$ \left(\frac{\partial S}{\partial E}\right)_{N,V} \equiv \frac{1}{T} \qquad (1.11) $$

$$ \left(\frac{\partial S}{\partial V}\right)_{E,N} \equiv \frac{P}{T} \qquad (1.12) $$

$$ \left(\frac{\partial S}{\partial N}\right)_{V,E} \equiv -\frac{\mu}{T}. \qquad (1.13) $$

If one drops the condition of fixed E and couples the system to a heat reservoir at fixed temperature T, allowing free exchange of energy between system and reservoir so that E may vary over 0 ≤ E ≤ ∞, then one passes from the microcanonical to the canonical ensemble [41-46]. For the canonical ensemble, entropy is defined as

$$ S_{BG,c} \equiv k\ln(Z) + \frac{\bar{E}}{T} = k\,\frac{\partial}{\partial T}\big(T\ln(Z)\big). \qquad (1.14) $$

Here β ≡ 1/kT and Z is the partition function (Zustandssumme, or "sum over states"), upon which most of classical equilibrium thermodynamics can be founded:

$$ Z \equiv \sum_i e^{-\beta E_i}, \qquad (1.15) $$

where the E_i are the fixed individual system energies and Ē is the mean (average) system energy:

$$ \bar{E} \equiv \frac{\sum_i E_i e^{-\beta E_i}}{\sum_i e^{-\beta E_i}} = \sum_i E_i p_i. \qquad (1.16) $$

The probability p_i is the normalized Boltzmann factor e^{−E_i/kT}/Z. One can define entropy through the probability sum

$$ S_{BG} = -k\sum_i p_i \ln p_i, \qquad (1.17) $$

or, in the continuum limit,

$$ S_{BG} = -k\int f \ln f \, dv, \qquad (1.18) $$

where f is a distribution function over a variable v. This latter expression is apropos to particle velocity distributions.
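Equations (1.14)–(1.17) are easy to exercise numerically. The following minimal sketch (illustrative; the four-level spectrum is made up, expressed in units of kT) computes Z, the probabilities p_i, and the mean energy, and verifies that −k Σ_i p_i ln p_i agrees with k ln Z + Ē/T.

```python
import numpy as np

k = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                 # temperature, K
beta = 1.0 / (k * T)

# Hypothetical toy spectrum of system energies E_i, chosen in units of kT
E = np.array([0.0, 1.0, 2.0, 5.0]) * k * T

Z = np.sum(np.exp(-beta * E))          # partition function, eq. (1.15)
p = np.exp(-beta * E) / Z              # normalized Boltzmann factors
E_mean = np.sum(E * p)                 # mean energy, eq. (1.16)

S_sum = -k * np.sum(p * np.log(p))     # eq. (1.17)
S_canonical = k * np.log(Z) + E_mean / T   # eq. (1.14)

assert np.isclose(S_sum, S_canonical, rtol=1e-12, atol=0.0)
print(S_sum, S_canonical)              # identical up to rounding
```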
If, in addition to energy exchange, one allows particle exchange between a system and a heat-particle reservoir, one passes from the canonical ensemble (fixed T, N, V) to the grand canonical ensemble (fixed T, µ, V), for which entropy is defined [41-46]:

$$ S_{BG,gc} \equiv \frac{1}{\beta}\left(\frac{\partial q}{\partial T}\right)_{z,V} - \bar{N} k \ln(z) + kq = k\left[\frac{\partial (T \ln \mathcal{Z})}{\partial T}\right]_{\mu,V}. \qquad (1.19) $$

Here q is the q-potential:

$$ q = q(z,V,T) \equiv \ln[\mathcal{Z}(z,V,T)], \qquad (1.20) $$

defined in terms of the grand partition function:

$$ \mathcal{Z}(z,V,T) \equiv \sum_{i,j} \exp(-\beta E_i - \alpha N_j) = \sum_{N_j=0}^{\infty} z^{N_j} Z_{N_j}(V,T). \qquad (1.21) $$

Here z ≡ e^{βµ} = e^{−α} is the fugacity, Z_{N_j} is the ordinary canonical partition function for fixed particle number N_j, and α = −µ/kT. The sum is over all possible values of particle number and energy, exponentially weighted by temperature. It is remarkable that such a simple rule is able to predict successfully particle-number and energy occupancies and, therefrom, the bulk of equilibrium thermodynamics. This evidences the power of the physical assumptions underlying the theory.

(3) von Neumann [47] In quantum mechanics, entropy is not an observable but a functional of the state, defined through the density matrix ρ:

$$ S_{vN}(\rho) = -k\,\mathrm{Tr}[\rho \ln(\rho)]. \qquad (1.22) $$

(Recall that the expectation value of an observable is ⟨A⟩ = Tr(ρA).) Roughly, S_vN(ρ) is a measure of the quantity of chaos in a quantum mechanical mixed state. The von Neumann entropy has an advantage over the Boltzmann formulation in that, presumably, it is a more basic and faithful description of nature: the number of microstates for a system is well defined in terms of pure states, unlike the case of the classical continuum. On the other hand, unlike the Boltzmann microcanonical entropy, for the von Neumann formulation important properties like ergodicity, mixing, and stability strictly hold only for infinite systems.

The time development of ρ for an isolated system is governed by the Liouville equation

$$ i\frac{d}{dt}\rho(t) = \frac{1}{\hbar}[H, \rho(t)] \equiv \mathcal{L}\rho(t). \qquad (1.23) $$

Here H is the Hamiltonian of the system and L(·) ≡ (1/ℏ)[H, ·] is the Liouville superoperator. It follows that the entropy is constant in time. As noted by Wehrl [39],

... the entropy of a system obeying the Schrödinger equation (with a time-independent Hamiltonian) always remains constant [because the density matrix evolves in time as] ρ(t) = e^{−iHt} ρ e^{iHt}. Since e^{iHt} is a unitary operator, the eigenvalues of ρ(t) are the same as the eigenvalues of ρ. But the expression for the entropy involves only the eigenvalues of the density matrix, hence S(ρ(t)) = S(ρ). (In the classical case, the analogous statement is a consequence of Liouville's theorem.)11

11 This statement holds even if H is a function of time; i.e., ρ(t) = Uρ(0)U†, where $U = \mathcal{T}\exp\left(-\frac{i}{\hbar}\int_0^t H\,dt'\right)$.
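Wehrl's spectral argument is easy to check numerically. The sketch below is illustrative only (a random 4×4 density matrix and Hamiltonian stand in for a physical system, with ℏ = 1 and k = 1): it computes S_vN from the eigenvalues of ρ per (1.22) and confirms that unitary evolution ρ → UρU† leaves it unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def von_neumann_entropy(rho, k=1.0):
    """S_vN = -k Tr[rho ln rho], computed from the eigenvalues of rho."""
    w = np.linalg.eigvalsh(rho)        # real spectrum of a Hermitian matrix
    w = w[w > 1e-12]                   # 0 ln 0 -> 0 by convention
    return -k * np.sum(w * np.log(w))

# A random mixed state: A A† / Tr(A A†) is Hermitian, positive, unit trace
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = A @ A.conj().T
rho /= np.trace(rho).real

# A random Hermitian Hamiltonian; U = exp(-iHt) built from its eigenbasis
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (B + B.conj().T) / 2
t = 0.7                                # arbitrary evolution time
w_H, V = np.linalg.eigh(H)
U = V @ np.diag(np.exp(-1j * w_H * t)) @ V.conj().T

rho_t = U @ rho @ U.conj().T           # rho(t) = U rho U†

# The spectrum of rho is unchanged, hence so is the entropy
S0, St = von_neumann_entropy(rho), von_neumann_entropy(rho_t)
assert np.isclose(S0, St)
print(S0, St)
```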
Since the Schrödinger equation alone is not sufficient to motivate the time evolution of entropy as normally observed in the real world, one usually turns to the Boltzmann equation, the master equation, or other time-asymmetric formalisms to achieve this end [43, 48, 49, 50]. Finally, the von Neumann entropy depends on time iff ρ is coarse-grained; in contrast, the fine-grained entropy is constant. (This, of course, ignores the problematic issues surrounding the non-uniqueness of the coarse-graining process.)

(4) Gyftopoulos et al. [19, 51] A utilitarian approach to entropy is advanced by Gyftopoulos, Hatsopoulos, and Beretta. Entropy S_GHB is taken to be an intrinsic, non-probabilistic property of any system, whether microscopic, macroscopic, equilibrium, or nonequilibrium. Its development is based on weight processes in which a system A interacts with a reservoir R via cyclic machinery to raise or lower a weight (Figure 1.1). Of course, the weight process is only emblematic of any process of pure work.

Figure 1.1: S_GHB is based on weight processes.

S_GHB is defined in terms of the energy E, a constant c_R that depends on the reservoir, and the generalized available energy Ω^R, as:

$$ S_{GHB} = S_0 + \frac{1}{c_R}\left[(E - E_0) - (\Omega^R - \Omega^R_0)\right], \qquad (1.24) $$

for a system A that evolves from state A_1 to state A_0. E_0 and Ω^R_0 are the values for a reference state, and S_0 is a constant, fixed value for the system at all times. Temperature is not ostensibly defined for this system; rather, c_R is a carefully defined reservoir property (which ultimately can be identified with temperature). The available energy Ω^R is the largest amount of energy that can be extracted from the system A-reservoir combination by weight processes. Like S_GHB, it applies to all system sizes and types of equilibria.

At first meeting, S_GHB may seem contrived and circular, but its method of weight processes is similar to and no more contrived than that employed by Planck