24 Challenges to the Second Law

Figure 1.2: One-dimensional velocity distribution functions: (a) non-Maxwellian; (b) Maxwellian.

and complexity. Finally, as it is deduced from thermodynamics, the notion of entropy is critically dependent on the presumed validity of the second law.

Among the many foundational issues thwarting a general definition of physical entropy, none is more urgent than extending entropy into the nonequilibrium regime. After all, changes in the world are primarily irreversible nonequilibrium processes, but even the most basic nonequilibrium properties, like transport coefficients, cannot be reliably predicted in general¹⁵. The prominent classical and quantum entropies strictly apply at equilibrium only. As a simple example, consider the two one-dimensional velocity distributions in Figure 1.2. Distribution f_a is highly nonequilibrium (non-Maxwellian) and does not have a well-defined temperature, while f_b is Maxwellian and does have a well-defined temperature. Let's say we wish to add heat δQ to f_a to transform it into f_b and then calculate the entropy change for this process via ∫_i^f δQ/T = ΔS. This presents a problem in this formalism because T is not properly defined for f_a or any other intermediate distribution on its way to the Maxwellian f_b¹⁶.

While small excursions into near nonequilibrium can be made via the Onsager relations [71] or fluctuation-dissipation theorems [43, 72], in general, far nonequilibrium systems are unpredictable. Only recently has theory begun to make significant headway into these regimes. Excursions are limited to idealized systems and carry with them their own questionable baggage, but results are heartening [73]. Notable past and present exponents of nonequilibrium thermodynamics include Onsager, Prigogine, Meixner, Green, Kubo, Ruelle, Hoover, Evans, Cohen, Gallavotti, Lebowitz, Nicolis, Gaspard, Dorfmann, Maes, Jou, Eu and many others [71-89].
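Returning to the example of Figure 1.2: footnote 16 suggests sidestepping the undefined temperature by computing ΔS directly from the distribution-function entropy S = −k ∫ f ln f dv. A minimal numerical sketch of that comparison follows; the specific distribution shapes and the reduced units (k = m = kT = 1) are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Sketch of the entropy comparison suggested in footnote 16, using
# S = -k * integral f ln f dv. The distributions below are hypothetical
# stand-ins for Figure 1.2's f_a and f_b; reduced units (k = m = kT = 1)
# are an assumption for illustration only.
k = 1.0
v = np.linspace(-10.0, 10.0, 20001)   # velocity grid
dv = v[1] - v[0]

# f_b: one-dimensional Maxwellian (a Gaussian in v)
fb = np.exp(-v**2 / 2.0) / np.sqrt(2.0 * np.pi)

# f_a: two narrow, displaced peaks; normalized, but far from Maxwellian
fa = np.exp(-(v - 2.0)**2 / 0.1) + np.exp(-(v + 2.0)**2 / 0.1)
fa /= fa.sum() * dv

def entropy(f):
    """S = -k * sum f ln f dv, with the convention 0 ln 0 = 0."""
    safe = np.where(f > 0.0, f, 1.0)   # ln 1 = 0 wherever f = 0
    return -k * np.sum(f * np.log(safe)) * dv

Sa, Sb = entropy(fa), entropy(fb)
print(f"S(f_a) = {Sa:.3f}, S(f_b) = {Sb:.3f}, Delta S = {Sb - Sa:.3f}")
```

For distributions of equal normalization the Maxwellian maximizes this entropy at fixed energy, so the sketch yields ΔS > 0 without ever invoking a temperature for the intermediate states.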
Notable recent advances in the microscopic descriptions of nonequilibrium entropy have proceeded largely through study of nonequilibrium steady states (NESS), especially in fluids (gases) [73]. This formalism is apropos to many of the challenges in this volume.

¹⁵Some entropies, like S_GHB and S_Ts, are claimed to apply at nonequilibrium, but they do not have compelling microscopic descriptions.
¹⁶On the other hand, one might aver that, since S = −k ∫ f ln f dv, one could calculate ΔS = −k [∫ f_b ln f_b dv − ∫ f_a ln f_a dv].

Chapter 1: Entropy and the Second Law 25

For NESS, classical phase space volumes (dx = dq dp) are often replaced by more general measures, perhaps the best known of which is the Sinai-Ruelle-Bowen (SRB) measure. It is especially useful in describing chaotic systems whose phase space development is hyperbolic; that is, stretching in some dimensions while contracting in others. Phase space stretching gives rise to the hallmark of chaos: sensitivity to initial conditions. The separation rate of initially proximate phase space trajectories is given by Lyapunov exponents λ, one for each dimension. Negative λ indicates convergence of trajectories, while positive λ indicates exponential separation of nearby trajectories, and chaos.

Although a general definition of entropy in NESS is lacking, entropy production can be expressed as

    Ṡ(ρ) = ∫ (−∇_x · X) ρ(dx),    (1.34)

where the divergence is with respect to the phase space measure coordinate and the nonequilibrium time development of x is determined via

    dx/dt = X(x),    (1.35)

where X(x) is a vector field denoting physical forces. Using SRB measures, the second law demands that Ṡ(t) ≥ 0; for dissipative systems (those producing heat) Ṡ(t) > 0. This is possible because SRB measures break time reversal symmetry, rendering the system non-Hamiltonian, thus allowing ∇_x · X ≠ 0.
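The meaning of the divergence in Eq. (1.34) can be checked on a minimal dissipative flow. The damped harmonic oscillator below is an assumed stand-in (not from the text): its vector field X(q, p) = (p, −q − γp) has constant divergence ∇·X = −γ < 0, so any small phase-space area should contract as exp(−γt); equivalently, the sum of its Lyapunov exponents is −γ.

```python
import numpy as np

# Illustration of the divergence in Eq. (1.34), using a damped harmonic
# oscillator as an assumed example of the flow of Eq. (1.35) (this system
# is not taken from the text). Its vector field
#     X(q, p) = (p, -q - gamma*p)
# has constant phase space divergence div X = -gamma < 0, so any small
# phase space area contracts as exp(-gamma*t).

gamma = 0.5

def X(z):
    q, p = z
    return np.array([p, -q - gamma * p])

def rk4_step(z, dt):
    """One fourth-order Runge-Kutta step of dz/dt = X(z)."""
    k1 = X(z)
    k2 = X(z + 0.5 * dt * k1)
    k3 = X(z + 0.5 * dt * k2)
    k4 = X(z + dt * k3)
    return z + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def triangle_area(a, b, c):
    """Area of the triangle spanned by three phase points."""
    return 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                     - (c[0] - a[0]) * (b[1] - a[1]))

# Three nearby initial conditions spanning a small phase space triangle
pts = [np.array([1.0, 0.0]), np.array([1.01, 0.0]), np.array([1.0, 0.01])]
A0 = triangle_area(*pts)

dt, steps = 0.001, 4000            # integrate to t = 4
for _ in range(steps):
    pts = [rk4_step(z, dt) for z in pts]

t = steps * dt
print(f"A(t)/A(0)       = {triangle_area(*pts) / A0:.6f}")
print(f"exp(-gamma * t) = {np.exp(-gamma * t):.6f}")
```

For a regenerative system in the sense discussed below, the sign of the divergence (and hence of the entropy production) would be reversed.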
The primary difference between this and standard NESS is that, while traditional NESS are dissipative (turn work into heat), second law challenges are regenerative (turn heat into work), thus admitting Ṡ(t) < 0.

Nonequilibrium, irreversibility and dissipation are the triumvirate that rules the natural thermodynamic world. Second law challenges obey the former two, but not the third. As such, much of the formalism already developed for nonequilibrium thermodynamics should be directly applicable to the challenges in this volume, the chief proviso being sign reversal for heat and entropy production. By turning this considerable theoretical machinery on the challenges, they may be either further supported or resolved in favor of the second law.

It is now commonly held that the second law arises as a consequence of the interaction between a quantum system and its thermal environment [90, 91, 92]. While this might be true, it should be noted that system-bath interactions can also take an active role in violations of specific formulations of this law in specific situations, as will be shown in Chapter 3.
1.5 Entropy and the Second Law: Discussion

Entropy and the second law are commonly conflated (the non-decrease of entropy for a closed system is an oft-cited version of the law), but many formulations of the second law do not involve entropy at all; consider, for instance, the Clausius and Kelvin-Planck forms. Entropy is surely handy, but it is not essential to thermodynamics; one could hobble along without it. It is more critical to statistical mechanics, which grapples with underlying dynamics and microstates, but even there its utility must be tempered by its underlying assumptions and limitations, especially when treating chaotic, nonlinear, and nonequilibrium systems (see §2.3.2).

The majority of second law challenges are phrased in terms of heat and work, rather than in terms of entropy. This is largely because entropy per se is difficult to measure experimentally. Heat, temperature, pressure, and work are measured quantities, while entropy is usually inferred. Thus, entropy, the second law, and its challenges are not as intimate as is often assumed. Entropy is a handmaiden of the second law, not its peer.

At the microscopic level an individual molecule doesn't know what entropy is and it couldn't care less about the second law. A classical system of N particles is also oblivious to them insofar as its temporal trajectory in a (6N−1)-dimensional phase space is simply a moving point to which an entropy cannot be ascribed and to which entropy increases are meaningless. (In this context, for ensemble theory, entropy cannot be strictly defined since f is singular.) Entropy is a global property of a system, measurable in terms of the surface area of the constant energy manifold on which the system's phase space point wanders, but this assumes conditions on the motion of the phase space point that, by definition, are either not measured or not measurable and, hence, might not be valid.
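The claim that entropy belongs to a whole energy shell rather than to a single phase point can be made concrete with a toy count. The model below (an illustrative assumption, not from the text) uses N two-state spins: every one of the C(N, n) microstates with n up-spins shares the same Boltzmann entropy S = k ln Ω(n), while no individual microstate carries an entropy of its own.

```python
import math

# Toy illustration that entropy is a property of a macrostate (an energy
# shell), not of a single phase space point: for N two-state spins (an
# assumed model, not from the text), each of the C(N, n) microstates with
# n up-spins lies on the same shell and shares the same Boltzmann entropy
# S = k ln Omega(n). Units with k = 1 are an assumption.
N = 100

def S(n_up):
    """Boltzmann entropy S = ln C(N, n_up) of the macrostate n_up."""
    return math.log(math.comb(N, n_up))

print(f"Omega(50) = {math.comb(N, 50):.3e} microstates on one shell")
print(f"S(50) = {S(50):.2f}   S(10) = {S(10):.2f}   S(0) = {S(0):.2f}")
```

The system's trajectory wanders among the Ω(n) points of the shell, but S is fixed by the count of the shell itself, which is exactly the sense in which entropy is "global" here.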
In its very conception, entropy presumes ignorance of the microscopic details of the system it attempts to describe. In order to close the explanatory gap, one or more far-reaching assumptions about the microscopic behavior or nature of that system must be made. Many of these provisos (e.g., ergodicity, strong mixing, equal a priori probability, extensivity, thermodynamic limit, equilibrium) allow accurate predictions for large and important classes of thermodynamic phenomena; however, every formulation of entropy makes assumptions that limit the parameter space in which it is valid, such that no known formulation applies to all possible thermodynamic regimes¹⁷. It is doubtful that any formulation of entropy can be completely inclusive, since there will probably always be special cases outside the range of validity of any proviso powerful enough to close the explanatory gap. The best one can hope to do is to identify when a particular type of entropy will or will not apply to a particular case, and even the criteria for this hope are not known. Insofar as complex systems (and most realistic thermodynamic systems are complex) can display chaotic and unpredictable behavior (unpredictable to the experimenter and perhaps even to the system itself), it seems unlikely that any single form of entropy will be able to capture all the novelty Nature can produce.

¹⁷Systems are known for which one, many, or all the above provisos fail.
Entropy formulations vary across disciplines, from physics to engineering, from chaos theory to economics, from biology to information theory. Even within a single discipline (physics) there are numerous versions between classical and quantum regimes, between utilitarian and formal approaches. Not all are equivalent, or even compatible. Most become problematic at nonequilibrium, but this is where physics becomes the most interesting. Most entropies blend seamlessly into others, making clear distinctions nearly impossible. One could say the subject of entropy is well-mixed and somewhat disordered. This state of affairs is intellectually unsatisfying and epistemologically unacceptable.

It is the opinion of one of the authors (d.p.s.) that, despite its singular importance to thermodynamics and statistical mechanics, entropy will never have a completely satisfactory and general definition, nor will its sovereign status necessarily endure. Rather, like the calorique, which was useful but not intellectually persuasive enough to survive the 19th century, entropy could well fade into history¹⁸. In the end, each thermodynamic system (particularly nonequilibrium ones) should be considered individually and microscopically with respect to its boundary conditions, constraints, and composition to determine its behavior¹⁹. Considered classically, it is the 6N-dimensional phase space trajectory that truly matters, and the various approximations that currently expedite calculations are too simplistic to capture the true richness of dynamic behaviors. Thus, each system should be considered on a case-by-case basis. If entropy is defined at the microscopic level of detail necessary to make completely accurate predictions about phase space trajectories, however, it loses both its utility and its meaning²⁰.

Entropy remains enigmatic. The more closely one studies it, the less clear it becomes.
Like a pointillisme painting whose meaning dissolves into a collection of meaningless points when observed too closely, so too entropy begins to lose meaning when one contemplates it at a microscopic level. Insofar as our definition of entropy is predicated on what is presumed unknown or unknowable about a system, it is epistemologically unsatisfactory and must ultimately be surpassed. As our understanding of the underlying dynamics of complex systems brightens, so must the utility of entropy dim and, perhaps, entirely disappear. Fortunately, the second law can survive without its handmaiden.

¹⁸In the near term, however, this will surely not be the case.
¹⁹A few simple cases, like the ideal gas, will be predictable due to their thermodynamic simplicity, but realistically complex nonequilibrium systems that possess significant thermodynamic depth, like life, will defy tidy description in terms of entropy, or easy prediction in terms of behavior. In the most interesting cases, chaos rules.
²⁰On the other hand, perhaps if a completely general definition of order and complexity is discovered, this will lead to a general definition of physical entropy.

1.6 Zeroth and Third Laws of Thermodynamics

The first law is the skeleton of thermodynamics; the second law is its flesh. The first gives structure; the second gives life. By comparison, the zeroth and
third laws are mere hat and slippers. Since one should not go about undressed, let us briefly consider the latter two.

Zeroth Law

The zeroth law pertains to the transitivity of equilibrium. It can be stated:

    If system A is in equilibrium with systems B and C, then system B is in equilibrium with system C.

More commonly, it is expressed in terms of temperature, because temperature is the easiest equilibrium property to measure experimentally:

    If the temperature of system A is equal to the temperature of system B, and the temperature of system B is equal to the temperature of system C, then the temperature of system A is equal to the temperature of system C. (If T_A = T_B and T_B = T_C, then T_A = T_C.)

Or, to put it succinctly: Thermometers exist.

The role of this law is far-reaching since it allows one to introduce, within axiomatic thermodynamics, integral intensive characteristics of mutually equilibrium systems, such as temperature, pressure, or chemical potential. It is therefore unsettling that quantum mechanical models exist that predict its violation (§3.6.7).

Third Law²¹

The third law of thermodynamics pertains primarily to establishing fiduciary entropies. Like the second and zeroth, it can be stated in various ways. The first, the Nernst-Planck form, states:

    Nernst-Planck: Any change in condensed matter is, in the limit of zero absolute temperature, performed without change in entropy. (Nernst 1906)

Planck supplemented this in 1912 (in modern form):

    Planck: The entropy of any pure substance at T = 0 is finite and, therefore, can be taken to be zero.

The third law says that any substance that has a unique stable or metastable state as its temperature is reduced toward absolute zero can be taken to have zero entropy at absolute zero [93].
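The fiducial choice S(0) = 0 is an idealization: a substance whose sites each freeze into one of g equally likely configurations as T → 0 retains a residual molar entropy S(0) = R ln g. The back-of-envelope sketch below uses textbook idealizations for g (an assumption); g_eff = 3/2 per molecule is Pauling's classic estimate for ice.

```python
import math

# Back-of-envelope residual (zero-point) entropies: if each site of a
# mole of substance freezes into one of g equally likely configurations
# as T -> 0, a residual molar entropy S(0) = R ln g survives. The g
# values are idealized counting assumptions; for ice, g_eff = 3/2 per
# molecule is Pauling's classic estimate.
R = 8.314  # molar gas constant, J/(mol K)

cases = {
    "two-state crystal defect (g = 2)": 2.0,
    "randomly oriented CO (g = 2)": 2.0,
    "ice, Pauling estimate (g = 3/2)": 1.5,
}

for name, g in cases.items():
    print(f"{name:34s} S(0) = {R * math.log(g):5.2f} J/(mol K)")
```

Entropies of this few-J/(mol K) scale are exactly the sort of residual contributions that, as noted below, "just go along for the ride" when they do not affect the process under study.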
In fact, at T = 0 most substances will have residual zero-point entropies associated with such things as mixed isotopic composition, randomly oriented nuclear spins, minor chemical impurities, or crystal defects, but if these do not affect the thermodynamic process for which entropy is pertinent, they can be safely ignored since "they just go along for the ride." In some sense, the entropy depends on the knowledge or opinion of the observer. Ideally, if the

²¹M.O. Scully maintains, "The third law has all the weight of an Italian traffic advisory