2.5 Multiplicity and Entropy of a Spin System

Figure 2.1 A plot of the fraction of microscopic states, F_N(n), that belong to the macroscopic state "n spins up," plotted as a function of n. The macroscopic state n = ⟨n⟩ = N/2 contains the most microscopic states. As N increases, the ratio σ_N/⟨n⟩ decreases as 1/√N, and the macrostate n = ⟨n⟩ begins to dominate the physical properties of the system.

If all microstates are equally probable, then F_N(n) is the probability of finding the chain of N spin-1/2 particles with n spins "up," and is given by the binomial distribution (see Appendix A). For large N, the binomial distribution can be approximated by a Gaussian distribution (this is derived in Appendix A), so we can write

F_N(n) \approx \frac{1}{\sigma_N \sqrt{2\pi}} \exp\left[ -\frac{(n - \langle n \rangle)^2}{2\sigma_N^2} \right] ,   (2.13)

where ⟨n⟩ = N/2 is the peak of the distribution and σ_N = √N/2 is a measure of its width. Notice that lim_{N→∞} σ_N/⟨n⟩ = 0. Thus, for very large N, to good approximation, the macrostate with n = ⟨n⟩ governs the physical properties of the system.

If we plot the fraction F_N(n) of microscopic states having n spins up (see Figure 2.1), we find that it is sharply peaked about the value n = ⟨n⟩. As the number of degrees of freedom tends to infinity (N → ∞), the physical properties of the system become determined by that one value of the macroscopic variable, n = ⟨n⟩, and this is called the equilibrium state of the system. The tendency of a macrostate to be dominated by a single most-probable value of its parameter, in the limit of a large number of degrees of freedom, is universal to all systems whose interactions have short range. It is a manifestation of the Central Limit Theorem (Appendix A) and is the basis for the universal behavior found in thermodynamic systems.
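The sharpness of this peak is easy to verify numerically. The following minimal sketch (our own illustration; the function names and the sample values of N are ours, not from the text) compares the exact binomial fraction F_N(n) = C(N, n)/2^N at the peak with the Gaussian approximation of Eq. (2.13), and shows that σ_N/⟨n⟩ falls off as 1/√N:

```python
import math

def binomial_fraction(N, n):
    """Exact fraction F_N(n) = C(N, n) / 2**N of microstates with n spins up."""
    return math.comb(N, n) / 2**N

def gaussian_approx(N, n):
    """Gaussian approximation to F_N(n), Eq. (2.13): <n> = N/2, sigma_N = sqrt(N)/2."""
    mean, sigma = N / 2, math.sqrt(N) / 2
    return math.exp(-((n - mean) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

for N in (10, 100, 1000, 10000):
    peak = N // 2
    sigma_over_mean = (math.sqrt(N) / 2) / (N / 2)   # = 1/sqrt(N)
    print(f"N={N:6d}  exact F_N(<n>)={binomial_fraction(N, peak):.4e}  "
          f"Gaussian={gaussian_approx(N, peak):.4e}  sigma_N/<n>={sigma_over_mean:.4f}")
```

Already at N = 100 the Gaussian value at the peak agrees with the exact fraction to about three decimal places, while the relative width σ_N/⟨n⟩ shrinks by a factor of 10 for every factor of 100 in N.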
2.5.2 Entropy of Spin System

The entropy of a spin lattice (with N spin-1/2 particles) that has n spins up is given by Eqs. (2.8) and (2.9) and can be written

S(N, n) = k_B \ln\left[ \frac{N!}{n!(N - n)!} \right] .   (2.14)

For large N (N > 10), we can use Stirling's approximations,

N! \approx \sqrt{2\pi N}\, N^N e^{-N}  and  \ln(N!) \approx N \ln(N) - N ,   (2.15)

to simplify the factorials. The entropy then takes the form

S(N, n) \approx k_B \ln\left[ \frac{N^N}{n^n (N - n)^{N-n}} \right] .   (2.16)
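The quality of Stirling's approximation can be checked directly. A short sketch (ours; log-gamma functions are used so the exact factorials never overflow) compares Eq. (2.14) with Eq. (2.16), in units of k_B:

```python
import math

def entropy_exact(N, n):
    """Eq. (2.14) in units of k_B: ln[N! / (n! (N-n)!)], via log-gamma to avoid overflow."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

def entropy_stirling(N, n):
    """Eq. (2.16) in units of k_B: ln[N^N / (n^n (N-n)^(N-n))]."""
    return N * math.log(N) - n * math.log(n) - (N - n) * math.log(N - n)

for N in (10, 100, 1000, 100000):
    n = N // 2
    print(f"N={N:6d}  exact={entropy_exact(N, n):12.3f}  "
          f"Stirling={entropy_stirling(N, n):12.3f}  N ln 2={N * math.log(2):12.3f}")
```

The two forms agree on the extensive part of the entropy (the part proportional to N); the discrepancy grows only logarithmically with N and so is negligible per spin for large N.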
The form of the entropy in Eq. (2.16) is easier to deal with than Eq. (2.14) because it does not depend on factorials. In the limit N → ∞, the entropy is well approximated by the value

S(N, \langle n \rangle) \approx k_B \ln\left[ \frac{N^N}{\langle n \rangle^{\langle n \rangle} (N - \langle n \rangle)^{N - \langle n \rangle}} \right] ,   (2.17)

which is called the entropy of the equilibrium state of the system. If no external magnetic fields are present, then ⟨n⟩ = N/2, and we find

S(N, \langle n \rangle) \approx N k_B \ln 2 .   (2.18)

In this case, because the spins are independent of one another, the total entropy of the system is just N times the entropy of a single spin. Note that the entropy is additive because the entropy of the whole system is the sum of the entropies of the independent parts of the system.

2.5.2.1 Entropy and Fluctuations About Equilibrium

In the limit N → ∞, the entropy is equal to S(N, ⟨n⟩), which is the equilibrium value of the entropy. However, in the real world we never reach the limit N = ∞. Any given system always has a finite number of particles, and there will be macroscopic states with n ≠ ⟨n⟩. Therefore, there will be fluctuations in the entropy about the equilibrium value S(N, ⟨n⟩). Since the multiplicity of the macroscopic states with n ≠ ⟨n⟩ is always less than that of the state with n = ⟨n⟩, fluctuations away from equilibrium must cause the value of the entropy to decrease. Thus, for systems with fixed energy, the entropy takes its maximum value at equilibrium.

The spin system considered above has zero magnetic energy, so we have suppressed the energy dependence of the entropy. If all microscopic states with the same energy, particle number, and number of spins up are equally probable, then the probability P_N(n) of finding the system in the macrostate (N, n) is simply the fraction of microstates, F_N(n) = 𝒩_N(n)/𝒩, with parameters N, n, where 𝒩_N(n) is the multiplicity of the macrostate and 𝒩 = 2^N is the total number of microstates. Therefore, we can write

P_N(n) = F_N(n) = \frac{\mathcal{N}_N(n)}{\mathcal{N}} = \frac{1}{\mathcal{N}} \exp\left( \frac{S(N, n)}{k_B} \right) .   (2.19)

Thus, the entropy, written as a function of the macroscopic variable n, can be used to determine the probability of fluctuations in the value of n away from the equilibrium state n = ⟨n⟩.
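Equation (2.19) lets us attach numbers to these fluctuation probabilities without ever computing 𝒩, since in the ratio P_N(n)/P_N(⟨n⟩) = exp{[S(N, n) − S(N, ⟨n⟩)]/k_B} the total number of microstates cancels. A minimal sketch (our own illustration; the system sizes and the 1% fluctuation are arbitrary choices):

```python
import math

def entropy_kB(N, n):
    """S(N, n) / k_B from Eq. (2.14), evaluated with log-gamma functions."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

def relative_prob(N, delta):
    """P_N(<n> + delta) / P_N(<n>), from Eq. (2.19); the unknown factor 1/N_total cancels."""
    n_eq = N // 2
    return math.exp(entropy_kB(N, n_eq + delta) - entropy_kB(N, n_eq))

for N in (100, 10_000, 1_000_000):
    delta = N // 100   # shift n by 1% of N away from equilibrium
    print(f"N={N:9d}  P_N(<n>+{delta})/P_N(<n>) = {relative_prob(N, delta):.3e}")
```

A fluctuation that shifts n by 1% of N is barely suppressed for N = 100, but is suppressed by a factor of order 10^−87 for N = 10^6, which is why macroscopic systems sit at n = ⟨n⟩ for all practical purposes.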
2.5.2.2 Entropy and Temperature

In the absence of a magnetic field, the spin lattice has zero magnetic energy. However, if a magnetic flux density B is present and directed upward, then spin-up lattice sites have energy −μB and spin-down lattice sites have energy +μB, where μ is the magnetic moment of the atoms. In the limit of large N, we can make the replacement n → ⟨n⟩, and the energy becomes a thermodynamic energy. Then the total magnetic energy takes the form

\langle E \rangle = -\mu \langle n \rangle B + \mu (N - \langle n \rangle) B = \mu B (N - 2\langle n \rangle) ,   (2.20)

and the magnetization is

\langle M \rangle = \mu (2\langle n \rangle - N) .   (2.21)
The physical properties of the system are determined by the equilibrium value n = ⟨n⟩. Note that, in the presence of a magnetic field, the average number of spins up, ⟨n⟩, will be shifted away from its value for the field-free case but, using Eq. (2.20), it can be written in terms of the magnetic energy:

\langle n \rangle = \frac{N}{2} - \frac{\langle E \rangle}{2\mu B} .   (2.22)

The entropy can be written in terms of the average energy and the number of atoms on the lattice. If we combine Eqs. (2.14) and (2.22) and use Stirling's approximation in Eq. (2.15), the entropy takes the form

S(N, \langle E \rangle, B) \approx k_B N \ln N - k_B \left( \frac{N}{2} - \frac{\langle E \rangle}{2\mu B} \right) \ln\left( \frac{N}{2} - \frac{\langle E \rangle}{2\mu B} \right) - k_B \left( \frac{N}{2} + \frac{\langle E \rangle}{2\mu B} \right) \ln\left( \frac{N}{2} + \frac{\langle E \rangle}{2\mu B} \right) .   (2.23)

Note that both the average energy and the entropy are proportional to the number of degrees of freedom.

Let us now introduce a result from thermodynamics that we will justify in the next chapter. The rate at which the entropy changes as we change the thermodynamic energy is related to the temperature T of the system (in kelvin) by

\left( \frac{\partial S}{\partial \langle E \rangle} \right)_{B,N} = \frac{1}{T} .   (2.24)

At very low temperature, a small change in energy can cause a large change in the entropy of the system. At high temperature, a small change in energy causes a very small change in the entropy.

We can use Eq. (2.24) to determine how the thermodynamic energy of the system varies with temperature. We need to take the derivative of S(N, ⟨E⟩, B) with respect to ⟨E⟩, holding N and B constant. Then, with a bit of algebra, we obtain

\left( \frac{\partial S}{\partial \langle E \rangle} \right)_{B,N} = \frac{k_B}{2\mu B} \ln\left( \frac{N - \langle E \rangle/(\mu B)}{N + \langle E \rangle/(\mu B)} \right) = \frac{1}{T} .   (2.25)

Solving for ⟨E⟩, we finally obtain

\langle E \rangle(N, T, B) = -N \mu B \tanh\left( \frac{\mu B}{k_B T} \right) \approx -\frac{N \mu^2 B^2}{k_B T} ,   (2.26)

where the last expression holds to lowest order in B. We have just demonstrated the power of thermodynamics in allowing us to relate seemingly unrelated physical quantities. However, having entered the realm of thermodynamics, the thermodynamic energy ⟨E⟩(N, T, B) now contains information about thermal properties of the system.
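As a consistency check, Eq. (2.24) can be applied numerically to the entropy of Eq. (2.23): at the energy predicted by Eq. (2.26), a finite-difference derivative of S with respect to ⟨E⟩ should return 1/T. The sketch below is ours, with illustrative values only (μ set to the Bohr magneton, B = 1 T, T = 300 K, N = 10^20 spins):

```python
import math

kB = 1.380649e-23   # Boltzmann constant (J/K)

def entropy(N, E, mu, B):
    """S(N, <E>, B) from Eq. (2.23)."""
    up = N / 2 - E / (2 * mu * B)     # <n>, Eq. (2.22)
    down = N / 2 + E / (2 * mu * B)   # N - <n>
    return kB * (N * math.log(N) - up * math.log(up) - down * math.log(down))

def energy_tanh(N, T, mu, B):
    """Closed form, Eq. (2.26): <E> = -N mu B tanh(mu B / (kB T))."""
    return -N * mu * B * math.tanh(mu * B / (kB * T))

N, mu, B, T = 1.0e20, 9.274e-24, 1.0, 300.0
E = energy_tanh(N, T, mu, B)
h = abs(E) * 1e-3                     # finite-difference step in energy
dS_dE = (entropy(N, E + h, mu, B) - entropy(N, E - h, mu, B)) / (2 * h)
print(f"numerical (dS/d<E>)_B,N = {dS_dE:.6e} 1/K   vs   1/T = {1 / T:.6e} 1/K")
```

The two numbers agree to high precision; the small residual comes from the finite-difference step and floating-point cancellation, not from the thermodynamics.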
We can also obtain the magnetization ⟨M⟩ of this system. We find

\langle M \rangle = \frac{N \mu^2 B}{k_B T} ,   (2.27)

to lowest order in B. Equation (2.27) is the equation of state for the magnetic system. The magnetization can also be found from the entropy, but we will need to develop the full machinery of thermodynamics in order to see how this can be done properly. The equation of state relates the mechanical and thermal properties of a system and generally can be determined from measurements in the laboratory on the system in question. It is one of the most common and important relationships that we can know about most physical systems.

The magnetic equation of state is often written in terms of the number of moles 𝔫 of atoms in the system. The total number of moles is related to the total number of atoms on the lattice via Avogadro's number, N_A = 6.022 × 10^23, which is the number of atoms in one mole, so N = 𝔫N_A. Then the magnetic equation of state takes the form

\langle M \rangle = \frac{\mathfrak{n} D_m B}{T} ,   (2.28)

where D_m = N_A μ²/k_B is a parameter determined by fundamental constants and the magnetic moment of the atoms in the particular system being considered.

2.6 Entropic Tension in a Polymer

A very simple model of a polymer consists of a freely jointed chain (FJC) of N noninteracting directed links, each of length ℓ. The links are numbered from 1 to N, and each link is equally likely to be either left pointing (←) or right pointing (→). The net length X of the polymer chain is defined as the net displacement from the unattached end of link 1 to the unattached end of link N, so X = n_Rℓ − n_Lℓ, where n_L (n_R) is the number of left (right) pointing links, and N = n_R + n_L. This system is mathematically analogous to the chain of spin-1/2 particles in Section 2.5. The multiplicity of microscopic states with n_R links to the right is

\mathcal{N}_N(n_R) = \frac{N!}{n_R!(N - n_R)!} .   (2.29)

The total number of microscopic states is 2^N. Assuming that all microscopic states are equally probable, the probability of finding a polymer that has a total of N links with n_R right-directed links is

P_N(n_R) = \frac{1}{2^N} \frac{N!}{n_R!(N - n_R)!} = \frac{N!}{n_R!(N - n_R)!} \, p^{n_R} q^{N - n_R} ,   (2.30)

where p = q = 1/2. This probability is a binomial distribution (see Appendix A).
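Because the links are independent and unbiased, this model is also easy to simulate directly. The following sketch (ours; the seed, chain length, and number of trials are arbitrary choices) samples random chains and compares the results with ⟨X⟩ = 0 and with the binomial width σ = √N/2 for n_R:

```python
import random
import statistics

def sample_nR(N, rng):
    """Number of right-pointing links in one freely jointed chain of N links (p = 1/2 each)."""
    return sum(rng.randrange(2) for _ in range(N))

rng = random.Random(42)               # fixed seed for reproducibility
N, ell, trials = 1000, 1.0, 5000
nR_samples = [sample_nR(N, rng) for _ in range(trials)]
X_samples = [(2 * nR - N) * ell for nR in nR_samples]   # X = nR*l - nL*l = (2 nR - N) l

print(f"<X>        = {statistics.mean(X_samples):8.3f}   (theory: 0)")
print(f"std of n_R = {statistics.stdev(nR_samples):8.3f}   (theory: sqrt(N)/2 = {N**0.5 / 2:.3f})")
```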
The average number of right-pointing links, n_R, is given by

\langle n_R \rangle = \sum_{n_R=0}^{N} n_R P_N(n_R) = pN = \frac{N}{2} ,   (2.31)

so the average number of left-pointing links is ⟨n_L⟩ = N − ⟨n_R⟩ = N/2 and the average net length of the polymer is ⟨X⟩ = 0. In the limit N → ∞, the probability distribution in Eq. (2.30) approaches a Gaussian narrowly peaked about n_R = ⟨n_R⟩ = N/2. Thus, most of the polymers are tightly and randomly coiled.

The entropy of the collection of polymers with n_R right-pointing links is

S = k_B \ln\left[ \frac{N!}{n_R!(N - n_R)!} \right] \approx k_B \left[ N \ln N - n_R \ln n_R - (N - n_R) \ln(N - n_R) \right] ,   (2.32)

where we have used Stirling's approximation. If we plot the entropy as a function of n_R, the curve has an extremum whose location is given by the condition

\frac{dS}{dn_R} = k_B \ln\left( \frac{N - n_R}{n_R} \right) = 0 .   (2.33)

This has the solution n_R = N/2, so the state of maximum entropy (the peak of the curve) occurs for n_R = N/2 and X = 0. Thus, the collection of the most tightly curled-up polymers has the maximum entropy.

In the absence of interactions, all microscopic states have the same energy. The tension J of the polymer can be related to the displacement X via the thermodynamic relation J = −T(∂S/∂X)_{E,N}. But we can write n_R = X/(2ℓ) + N/2, so J = −T/(2ℓ)(∂S/∂n_R)_{E,N}. We use the expression for the entropy to find the tension J in the chain as a function of X. We obtain

J = -\frac{k_B T}{2\ell} \ln\left( \frac{N - n_R}{n_R} \right) = -\frac{k_B T}{2\ell} \ln\left( \frac{N - X/\ell}{N + X/\ell} \right) \approx \frac{k_B T}{N \ell^2} X + \cdots .   (2.34)

In the last term, we have expanded J in powers of X/(Nℓ), which is only valid if X/(Nℓ) ≪ 1. For the case X/(Nℓ) ≪ 1, we have obtained J ≈ k_B T/(Nℓ²) X + ⋯, which is Hooke's law for the elastic force needed to stretch the polymer. The force constant is k = k_B T/(Nℓ²). The tension J is an entropic force (per unit length). If the chain is stretched to maximum length, it will have very few microscopic states available. On the average, it will contract back to a length where it maximizes the entropy (multiplicity of states).
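The range of validity of the linear (Hooke's law) term in Eq. (2.34) can be inspected numerically. A minimal sketch (ours), in units where k_BT = 1 and ℓ = 1:

```python
import math

def tension_exact(X, N, ell=1.0, kBT=1.0):
    """Entropic tension, Eq. (2.34): J = -(kB T / (2 l)) ln[(N - X/l) / (N + X/l)]."""
    return -(kBT / (2 * ell)) * math.log((N - X / ell) / (N + X / ell))

def tension_hooke(X, N, ell=1.0, kBT=1.0):
    """Linearized (Hooke's law) form: J = kB T X / (N l^2), valid for X/(N l) << 1."""
    return kBT * X / (N * ell**2)

N = 1000
for frac in (0.01, 0.1, 0.5, 0.9):    # extension X as a fraction of the full length N*l
    X = frac * N
    print(f"X/(N l) = {frac:4.2f}   exact J = {tension_exact(X, N):7.4f}   "
          f"Hooke J = {tension_hooke(X, N):7.4f}")
```

For X/(Nℓ) = 0.01 the two expressions are indistinguishable, while near full extension the exact entropic tension grows much faster than the linear law, diverging as X → Nℓ.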
The theory described here is a random walk model for polymer coiling in one space dimension. The results would be different if we considered the random walk in three space dimensions. Nevertheless, this type of one-dimensional entropic elasticity has been observed in polymers. One example is the macromolecule DNA, which is a very long molecule, with lengths on the order of tens of millimeters (although it is generally coiled into a complex structure). There are short