Chapter 2: The Second Law. Start with Combinatorics, Probability and Multiplicity
• Combinatorics and probability
• 2-state paramagnet and Einstein solid
• Multiplicity of a macrostate – concept of entropy
• Directionality of thermal processes (irreversibility) – overwhelmingly probable
Combinatorics is the branch of mathematics studying the enumeration, combination, and permutation of sets of elements and the mathematical relations that characterize their properties.
Combinatorics and probability
Examples: random walk, two-state systems, …
Probability is the branch of mathematics that studies the possible outcomes of given events together with the outcomes' relative likelihoods and distributions. In common usage, the word "probability" is used to mean the chance that a particular event (or set of events) will occur.
Probability
Multiplication rule for independent events: P (i and j) = P (i) x P (j)
Example: What is the probability of the same face appearing on two successive throws of a dice?
The probability of any specific combination, e.g., (1,1): 1/6 × 1/6 = 1/36 (multiplication rule). Hence, by the addition rule, P(same face) = P(1,1) + P(2,2) + ... + P(6,6) = 6 × 1/36 = 1/6.
An event (very loosely defined) – any possible outcome of some measurement. An event is a statistical (random) quantity if the probability of its occurrence, P, in the process of measurement is < 1.
The “sum” of two events: in the process of measurement, we observe either one of the events. Addition rule for independent events: P (i or j) = P (i) + P (j)
The “product” of two events: in the process of measurement, we observe both events.
(independent events – one event does not change the probability for the occurrence of the other).
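The multiplication and addition rules for the dice example can be checked by brute-force enumeration; a minimal sketch (the variable names are ours):

```python
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two successive dice throws.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]

# Multiplication rule for one specific combination: P(1,1) = 1/6 * 1/6 = 1/36.
p_snake_eyes = Fraction(sum(1 for a, b in outcomes if (a, b) == (1, 1)),
                        len(outcomes))

# Addition rule: P(same face) = P(1,1) + P(2,2) + ... + P(6,6) = 6/36 = 1/6.
p_same = Fraction(sum(1 for a, b in outcomes if a == b), len(outcomes))
```

Counting outcomes directly reproduces both rules because all 36 microscopic outcomes are equally probable.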
Expectation value of a macroscopic observable A, averaged over all accessible microstates 1, ..., N:

$$\langle A \rangle = \sum_{i=1}^{N} A_i\, P_i$$
Two model systems with fixed positions of particles and discrete energy levels
- the models are attractive because they can be described in terms of discrete microstates which can be easily counted (for a continuum of microstates, as in the example with a freely moving particle, we still need to learn how to do this). This simplifies the calculation of ⟨A⟩. On the other hand, the results will be applicable to many other, more complicated models. Despite the simplicity of the models, they describe a number of experimental systems in a surprisingly precise manner.
- two-state paramagnet ....
(“limited” energy spectrum)
- the Einstein model of a solid
(“unlimited” energy spectrum)
The Two-State Paramagnet
The energy of a macrostate:

$$U = \mu B\,(N_\downarrow - N_\uparrow) = \mu B\,(N - 2N_\uparrow)$$

N↑ – the number of "up" spins; N↓ – the number of "down" spins (N = N↑ + N↓)
- a system of non-interacting magnetic dipoles in an external magnetic field B; each dipole can have only two possible orientations along the field, either parallel or anti-parallel to this axis (e.g., a particle with spin ½). No "quadratic" degrees of freedom (unlike in an ideal gas, where the kinetic energies of molecules are unlimited); the energy spectrum of the particles is confined within a finite interval of E (just two allowed energy levels).
μ – the magnetic moment of an individual dipole (spin)

The energy of a single dipole in the external magnetic field:

$$E = -\vec{\mu}\cdot\vec{B} = \mp\mu B$$

E₁ = −μB for μ parallel to B, E₂ = +μB for μ anti-parallel to B (the zero of energy is an arbitrary choice).

The total magnetic moment (a macroscopic observable):

$$M = \mu\,(N_\uparrow - N_\downarrow)$$
A particular microstate (e.g., ↑↓↓↑...) is specified if the directions of all spins are specified. A macrostate is specified by the total # of dipoles that point "up", N↑ (the # of dipoles that point "down" is then N↓ = N − N↑).
Example
Consider two spins. There are four possible configurations of microstates:
↑↑  ↑↓  ↓↑  ↓↓
M = 2μ, 0, 0, −2μ

In zero field, all these microstates have the same energy (degeneracy). Note that the two microstates with M = 0 have the same energy even when B ≠ 0: they belong to the same macrostate, which has multiplicity Ω = 2. The macrostates can be classified by their moment M and multiplicity Ω:

M = 2μ, 0, −2μ
Ω = 1, 2, 1

For three spins, the microstates have:

M = 3μ, μ, μ, μ, −μ, −μ, −μ, −3μ

macrostates:
M = 3μ, μ, −μ, −3μ
Ω = 1, 3, 3, 1
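The multiplicities for two and three spins can be generated by enumerating every spin configuration; a small sketch (here `mu = 1` is a hypothetical normalization of the magnetic moment):

```python
from itertools import product
from collections import Counter

mu = 1  # magnetic moment per spin in arbitrary units (hypothetical normalization)

def macrostate_multiplicities(n_spins):
    """Count microstates (spin configurations) per macrostate M = mu*(N_up - N_down)."""
    counts = Counter()
    for config in product((+1, -1), repeat=n_spins):
        counts[mu * sum(config)] += 1
    return dict(counts)

two = macrostate_multiplicities(2)    # multiplicities 1, 2, 1
three = macrostate_multiplicities(3)  # multiplicities 1, 3, 3, 1
```

Each macrostate's count is exactly the number of ways to place the "up" spins, i.e. a binomial coefficient.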
The Multiplicity of Two-State Paramagnet
Each of the microstates is characterized by N numbers; the number of equally probable microstates is 2^N, and the probability of being in a particular microstate is 1/2^N.
n! ≡ n factorial = 1·2·...·n;  0! ≡ 1 (exactly one way to arrange zero objects)
For a two-state paramagnet in zero field, the energy of all macrostates is the same (0). A macrostate is specified by (N, N↑). Its multiplicity is the number of ways of choosing N↑ objects out of N:

$$\Omega(N, N_\uparrow) = \binom{N}{N_\uparrow} = \frac{N!}{N_\uparrow!\,(N-N_\uparrow)!} = \frac{N!}{N_\uparrow!\,N_\downarrow!}$$
$$\Omega(N, 0) = 1, \quad \Omega(N, 1) = N, \quad \Omega(N, 2) = \frac{N(N-1)}{2}, \quad \Omega(N, 3) = \frac{N(N-1)(N-2)}{2\cdot 3}, \;\ldots$$

In general, the number of ways of picking n "up" spins out of N, disregarding the order of picking:

$$\Omega(N, n) = \frac{N(N-1)\cdots(N-n+1)}{1\cdot 2\cdot 3\cdots n} = \frac{N!}{n!\,(N-n)!} \equiv \binom{N}{n}$$

The multiplicity of a macrostate of a two-state paramagnet with (N↑, N↓):

$$\Omega(N, N_\uparrow) = \frac{N!}{N_\uparrow!\,N_\downarrow!}$$
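The binomial multiplicity and the total count 2^N can be verified directly; a sketch using Python's exact integer arithmetic (the function name is ours):

```python
from math import comb, factorial

def omega_paramagnet(N, n_up):
    """Multiplicity of the macrostate with n_up 'up' spins out of N."""
    return factorial(N) // (factorial(n_up) * factorial(N - n_up))

N = 10
# The factorial formula agrees with the binomial coefficient...
assert all(omega_paramagnet(N, n) == comb(N, n) for n in range(N + 1))
# ...and the multiplicities of all macrostates sum to the 2^N microstates.
total = sum(omega_paramagnet(N, n) for n in range(N + 1))
```

Summing over macrostates recovers every one of the 2^N equally probable microstates exactly once.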
Stirling’s Approximation for N! (N>>1)
Multiplicity depends on N!, and we need an approximation for ln(N!):

$$\ln N! \approx N\ln N - N, \qquad\text{or}\qquad N! \approx \left(\frac{N}{e}\right)^{N}$$

More accurately:

$$N! \approx N^N e^{-N}\sqrt{2\pi N} \;\;\Rightarrow\;\; \ln N! \approx N\ln N - N + \frac{1}{2}\ln(2\pi N) \approx N\ln N - N$$

because (1/2) ln(2πN) << N for large N.
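Both forms of Stirling's approximation can be checked against the exact ln(N!); a quick numeric sketch (function names are ours):

```python
from math import lgamma, log, pi

def ln_factorial(n):
    """Exact ln(n!) via the log-gamma function."""
    return lgamma(n + 1)

def stirling_simple(n):       # ln n! ~ n ln n - n
    return n * log(n) - n

def stirling_accurate(n):     # ln n! ~ n ln n - n + (1/2) ln(2 pi n)
    return n * log(n) - n + 0.5 * log(2 * pi * n)

n = 100
exact = ln_factorial(n)
rel_err_simple = abs(stirling_simple(n) - exact) / exact
rel_err_accurate = abs(stirling_accurate(n) - exact) / exact
```

Already at N = 100 the simple form is accurate to better than 1% in ln N!, and the √(2πN) correction improves this by several orders of magnitude.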
The Probability of Macrostates of a Two-State PM (B=0)
(http://stat-www.berkeley.edu/~stark/Java/Html/BinHist.htm)
- as the system becomes larger, the P(N,N) graph becomes more sharply peaked:
N = 1: Ω(1, N↑) = 1, 2^N = 2, P(1, N↑) = 0.5

$$P(N, N_\uparrow) = \frac{\Omega(N, N_\uparrow)}{\#\text{ of all microstates}} = \frac{\Omega(N, N_\uparrow)}{2^N} = \frac{N!}{N_\uparrow!\,N_\downarrow!\;2^N}$$

With Stirling's approximation, N! ≈ (N/e)^N:

$$P(N, N_\uparrow) \approx \frac{(N/e)^N}{(N_\uparrow/e)^{N_\uparrow}\,(N_\downarrow/e)^{N_\downarrow}\;2^N} = \frac{N^N}{N_\uparrow^{N_\uparrow}\,N_\downarrow^{N_\downarrow}\;2^N}$$

[Plots: P(1, N↑) = 0.5 at N↑ = 0, 1; P(15, N↑) and P(10²³, N↑) vs. N↑ – the peak at N↑ = N/2 (for N = 10²³, at N↑ = 0.5·10²³) sharpens as N grows.]

- random orientation of spins in B = 0 is overwhelmingly more probable
2nd law!
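The sharpening of the peak of P(N, N↑) with growing N can be seen numerically: the probability of finding N↑ within 1% of N/2 grows toward 1. A sketch (function names are ours):

```python
from math import comb

def prob(N, n_up):
    """P(N, n_up) = Omega / 2^N for a two-state paramagnet in zero field."""
    return comb(N, n_up) / 2**N

def p_within(N, frac=0.01):
    """Probability that n_up lies within +/- frac of N/2."""
    lo, hi = int(N / 2 * (1 - frac)), int(N / 2 * (1 + frac))
    return sum(prob(N, n) for n in range(lo, hi + 1))

p_small = p_within(100)    # a small fraction of the probability
p_large = p_within(10000)  # most of the probability
```

For N = 100 only a small fraction of the probability sits within 1% of N/2, while for N = 10⁴ it is already the majority; for N ~ 10²³ it is overwhelmingly close to 1.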
Multiplicity (Entropy) and Disorder
In general, we can say that small multiplicity implies "order", while large multiplicity implies "disorder". An arrangement with large Ω can be achieved by a random process with much greater probability than an arrangement with small Ω.
[Figure: an ordered arrangement (small Ω) vs. a disordered one (large Ω).]
The Einstein Model of a Solid
In 1907, Einstein proposed a model that reasonably predicted the thermal behavior of crystalline solids (a 3D bed-spring model):
a crystalline solid containing N atoms behaves as if it contained 3N identical independent quantum harmonic oscillators, each of which can store an integer number ni of energy units ε = ħω.
We can treat a 3D harmonic oscillator as if it were oscillating independently in 1D along each of the three axes:
classical: the energy E of an oscillator can take any value;
quantum: the energy levels are equidistant, E_n = ħω(n + ½), n = 0, 1, 2, ...

All 3N oscillators (1, 2, 3, ..., 3N) are identical, so the energy quanta ε = ħω are the same. The solid's internal energy:

$$U = \frac{3N}{2}\hbar\omega + \hbar\omega\sum_{i=1}^{3N} n_i$$

The first term is the zero-point energy; the effective internal energy (measured from the zero-point level):

$$U = \hbar\omega\sum_{i=1}^{3N} n_i = \varepsilon\, q$$
The Einstein Model of a Solid (cont.)
At high T, kBT >> ħω (the classical limit of large ni), each of the 3N oscillators has two "quadratic" degrees of freedom (kinetic and potential), so

$$U = 3N \times 2 \times \frac{1}{2}k_BT = 3Nk_BT \quad\Rightarrow\quad \frac{dU}{dT} = 3Nk_B \approx 24.9\ \text{J/K·mole}$$
solid dU/dT, J/K·mole
Lead 26.4
Gold 25.4
Silver 25.4
Copper 24.5
Iron 25.0
Aluminum 26.4
To describe a macrostate of an Einstein solid, we have to specify N and U, a microstate – ni for 3N oscillators.
Example: the "macrostates" of an Einstein model with only one atom (three oscillators), for q = 0, 1, 2, 3 energy quanta:
Ω(1 atom, 0) = 1, Ω(1 atom, 1) = 3, Ω(1 atom, 2) = 6, Ω(1 atom, 3) = 10
(The dU/dT ≈ 3NkB result above is the Dulong–Petit rule.)
The Multiplicity of Einstein Solid
The multiplicity of a state of N oscillators (N/3 atoms) with q energy quanta distributed among these oscillators:

$$\Omega(N, q) = \binom{q + N - 1}{q} = \frac{(q + N - 1)!}{q!\,(N - 1)!}$$

Proof: represent the N oscillators schematically as q dots (quanta) separated by N − 1 lines (partitions between oscillators), q + N − 1 symbols in total. For given q and N, the multiplicity is the number of ways of choosing q of the symbols to be dots, q.e.d.

In terms of the total internal energy U = εq:

$$\Omega(N, U) = \frac{(U/\varepsilon + N - 1)!}{(U/\varepsilon)!\,(N - 1)!}$$
Example: The multiplicity of an Einstein solid with three atoms and eight units of energy shared among them
$$\Omega(9, 8) = \frac{(8 + 9 - 1)!}{8!\,(9 - 1)!} = \frac{16!}{8!\,8!} = 12{,}870$$
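The dots-and-lines counting argument can be confirmed numerically; a sketch (the function name is ours):

```python
from math import comb

def omega_einstein(N_osc, q):
    """Multiplicity of an Einstein solid: q quanta among N_osc oscillators."""
    return comb(q + N_osc - 1, q)

# Three atoms -> 9 oscillators, 8 quanta:
assert omega_einstein(9, 8) == 12870

# One atom (3 oscillators): Omega = 1, 3, 6, 10 for q = 0, 1, 2, 3
one_atom = [omega_einstein(3, q) for q in range(4)]
```

The same function reproduces the one-atom "macrostates" listed earlier.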
Multiplicity of a Large Einstein Solid (kBT >> )
$$\Omega(N, q) = \frac{(q + N - 1)!}{q!\,(N - 1)!} \approx \frac{(q + N)!}{q!\,N!} \;\Rightarrow\; \ln\Omega \approx \ln(q + N)! - \ln q! - \ln N!$$

Stirling's approximation, ln n! ≈ n ln n − n:

$$\ln\Omega \approx (q + N)\ln(q + N) - q\ln q - N\ln N$$

High temperatures (q >> N): ln(q + N) = ln q + ln(1 + N/q) ≈ ln q + N/q, so

$$\ln\Omega \approx N\ln\frac{q}{N} + N \qquad\Rightarrow\qquad \Omega(N, q) \approx \left(\frac{e\,q}{N}\right)^{N}$$
q = U/ε – the total # of energy quanta in a solid
q/N = U/(εN) – the average # of quanta (microstates) available for each oscillator; q >> N is the regime of the Dulong–Petit rule.
Multiplicity of a Large Einstein Solid (kBT >> )
q = U/ε – the total # of energy quanta in a solid
q/N = U/(εN) – the average # of quanta (microstates) available for each oscillator

General statement: for any system with N "quadratic" degrees of freedom ("unlimited" spectrum), the multiplicity is proportional to U^{N/2}.

Einstein solid (2N degrees of freedom), high temperatures (kBT >> ε, ⟨ni⟩ >> 1, q >> N):

$$\Omega(N, q) \approx \left(\frac{e\,q}{N}\right)^{N} \propto U^{N}$$
Multiplicity of a Large Einstein Solid (kBT << )
Low temperatures (kBT << ε, ⟨ni⟩ << 1, q << N):

$$\Omega(N, q) \approx \left(\frac{e\,N}{q}\right)^{q} \qquad \text{(Pr. 2.17)}$$
Microstates of a system (e.g. ideal gas)
Microstate: the state of a system specified by describing the quantum state of each molecule in the system. For a classical particle – 6 parameters (xi, yi, zi, pxi, pyi, pzi); for a macro system – 6N parameters.
The evolution of a system can be represented by a trajectory in the multidimensional (configuration, phase) space of micro-parameters. Each point in this space represents a microstate.
During its evolution, the system will only pass through accessible microstates – the ones that do not violate the conservation laws: e.g., for an isolated system, the total internal energy must be conserved.
Statistics: Probabilities of Macrostates
Macrostate: the state of a macro system specified by its macroscopic parameters. Two systems with the same values of macroscopic parameters are thermodynamically indistinguishable. A macrostate tells us nothing about a state of an individual particle.
For a given set of constraints (conservation laws), a system can be in many macrostates.
The statistical approach: to connect the macroscopic observables (averages) to the probability for a certain microstate to appear along the system's trajectory in configuration space, P(μ₁, μ₂, ..., μ_N).
The Phase Space vs. the Space of Macroparameters
[Figure: in the space of macroparameters (P, V, T), a macrostate is a point on the surface defined by an equation of state; in the multi-dimensional configuration (phase) space of microparameters μ₁, μ₂, ..., μ_i, ..., numerous microstates correspond to the same macrostate, etc., etc., etc. ...]
Examples: Two-Dimensional Configuration Space
motion of a particle in a one-dimensional box
"Macrostates" are characterized by a single parameter: the kinetic energy K₀. Each "macrostate" corresponds to a continuum of microstates, which are characterized by specifying the position and momentum.

[Figure: the phase space (x, px) of a particle in a box −L ≤ x ≤ L; the microstates with K = K₀ lie on the two horizontal lines px = ±(2mK₀)^{1/2}.]
Another example: one-dimensional harmonic oscillator
[Figure: for a particle in a harmonic potential U(x), the microstates with K + U = const form an ellipse in the (x, px) plane.]
The Fundamental Assumption of Statistical Mechanics
The ergodic hypothesis: an isolated system in an equilibrium state, evolving in time, will pass through all the accessible microstates at the same recurrence rate, i.e. all accessible microstates are equally probable.
The average over long times will equal the average over the ensemble of all equi-energetic microstates: if we take a snapshot of a system with N microstates, we will find the system in any of these microstates with the same probability.
Probability for a stationary system: many identical measurements on a single system are equivalent to a single measurement on many copies of the system. The ensemble of all equi-energetic states is called a microcanonical ensemble.
[Figure: microstates μ₁, μ₂, ..., μ_i, ... which correspond to the same energy.]
Probability of a Macrostate, Multiplicity
$$\text{Probability of a particular macrostate} = \frac{\#\text{ of microstates that correspond to a given macrostate}}{\#\text{ of all accessible microstates}}$$
The probability of a certain macrostate is determined by how many microstates correspond to this macrostate – the multiplicity of a given macrostate .
This approach will help us to understand why some macrostates are more probable than others, and, eventually, by considering interacting systems, we will understand the irreversibility of processes in macroscopic systems.
$$\text{Probability of a particular microstate of a microcanonical ensemble} = \frac{1}{\#\text{ of all accessible microstates}}$$
Concepts of Statistical Mechanics
1. The macrostate is specified by a sufficient number of macroscopically measurable parameters (for an Einstein solid – N and U).
2. The microstate is specified by the quantum state of each particle in a system (for an Einstein solid – # of the quanta of energy for each of N oscillators)
3. The multiplicity is the number of microstates in a macrostate. For each macrostate, there is an extremely large number of possible microstates that are macroscopically indistinguishable.
4. The Fundamental Assumption: for an isolated system, all accessible microstates are equally likely.
5. The probability of a macrostate is proportional to its multiplicity. This will be sufficient to explain irreversibility.
Entropy and Temperature (Ch. 2 and a bit of 3)
Ideas:
Each accessible microstate of an isolated system is equally probable (the fundamental assumption).
Every macrostate has a countable number of microstates (follows from Q.M.).
The probability of a macrostate is proportional to its multiplicity. When systems get large, multiplicities get outrageously large.
On this basis, we will introduce the concept of entropy and discuss the Second Law of Thermodynamics.
Our plan:
As our point of departure, we’ll use the models of an Einstein solid. We have already discussed one advantage of this model – “discrete” degrees of freedom. Another advantage – by considering two interacting Einstein solids, we can learn about the energy exchange between these two systems, i.e. how to reach thermal equilibrium.
By using our statistical approach, we'll identify the most probable macrostate of a combined system of two interacting Einstein solids after reaching equilibrium;
We’ll introduce the entropy as a measure of the multiplicity of a given macrostate
The Second Law of Thermodynamics
Two Interacting Einstein Solids, Macropartitions
Suppose that we bring two Einstein solids A and B (two sub-systems with NA, UA and NB, UB) into thermal contact, to form a larger isolated system. What happens to these solids (macroscopically) after they have been brought into contact?
NA, UA NB, UB
energy
The combined sys. – N = NA+ NB , U = UA + UB
Question: what would be the most probable macrostate for given NA, NB , and U ?
The macropartition of the combined system is defined by the macroparameter UA.
Macropartition: a given pair of macrostates for sub-systems A and B that are consistent with conservation of the total energy U = UA + UB.
Example: the pair of macrostates where UA= 2 and UB= 4 is one possible macropartition of the combined system with U = 6
As time passes, the system of two solids will randomly shift between different microstates consistent with the constraint that U = const.
Different macropartitions amount to different ways that the energy can be macroscopically divided between the sub-systems.
The Multiplicity of Two Sub-Systems Combined
Example: bring two one-atom "solids" into thermal contact, with the total U = qA + qB = 6 energy quanta.
Macropartition | UA | UB | ΩA | ΩB | ΩAB
0 : 6 0 6 1 28 28
1 : 5 1 5 3 21 63
2 : 4 2 4 6 15 90
3 : 3 3 3 10 10 100
4 : 2 4 2 15 6 90
5 : 1 5 1 21 3 63
6 : 0 6 0 28 1 28
Possible macropartitions for NA= NB = 3, U = qA+qB= 6
Grand total # of microstates:

$$\Omega(N, q) = \frac{(q + N - 1)!}{q!\,(N - 1)!} = \frac{(6 + 6 - 1)!}{6!\,(6 - 1)!} = \frac{11!}{6!\,5!} = 462$$
The probability of a macropartition is proportional to its multiplicity: Ω_AB = Ω_A × Ω_B (macropartition A+B = sub-system A × sub-system B).
Exercise: check the multiplicities of macrostates for NA= NB = 100, U = qA+qB= 200
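The table above can be regenerated (and the exercise attempted by changing NA, NB, U) with exact binomial coefficients; a sketch (function names are ours):

```python
from math import comb

def omega(N, q):
    """Einstein-solid multiplicity: q quanta among N oscillators."""
    return comb(q + N - 1, q)

NA = NB = 3   # one atom each -> 3 oscillators per sub-system
U = 6         # total number of energy quanta

# (qA, qB, Omega_A, Omega_B, Omega_AB) for each macropartition:
table = [(qA, U - qA, omega(NA, qA), omega(NB, U - qA),
          omega(NA, qA) * omega(NB, U - qA)) for qA in range(U + 1)]

total = sum(row[4] for row in table)
# Consistency check: the grand total equals the multiplicity of the combined solid.
assert total == omega(NA + NB, U)
```

Rerunning with `NA = NB = 100, U = 200` handles the exercise as well, since Python integers are exact at any size.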
Recall
The Multiplicity of Two Sub-Systems Combined
Ω_AB = Ω_A × Ω_B (sub-system A, sub-system B); the probability of a macropartition (A+B) is proportional to its multiplicity.

In real systems, N ~ 10²³ and q = U/ε ~ 10²³. How do we count the multiplicity (a spreadsheet fails), and how do we find the maximum of the multiplicity? Answer: an analytic approximation.
Where is the Maximum? The Average Energy per Atom
In general, for two systems in thermal contact, the equilibrium (most probable) macropartition of the combined system is the one where the average energy per atom in each system is the same (the basis for introducing the temperature).
Let's explore how the macropartition multiplicity for two sub-systems A and B (NA, NB, εA = εB = ε) in thermal contact depends on the energy of one of the sub-systems, UA (with UB = U − UA). In the high-T limit (q >> N):

$$\Omega_A(N_A, U_A) = \left(\frac{e\,U_A}{\varepsilon N_A}\right)^{N_A}, \qquad \Omega_B(N_B, U_B) = \left(\frac{e\,U_B}{\varepsilon N_B}\right)^{N_B}$$

$$\Omega_{AB} = \Omega_A\,\Omega_B = \left(\frac{e}{\varepsilon N_A}\right)^{N_A}\left(\frac{e}{\varepsilon N_B}\right)^{N_B} U_A^{N_A}\,(U - U_A)^{N_B}$$
To locate the maximum, set the derivative of ln Ω_AB to zero:

$$\frac{d\ln\Omega_{AB}}{dU_A} = \frac{d}{dU_A}\left[N_A\ln U_A + N_B\ln(U - U_A) + \text{const}\right] = \frac{N_A}{U_A} - \frac{N_B}{U - U_A} = 0$$

$$\Rightarrow\qquad \frac{U_A}{N_A} = \frac{U_B}{N_B}$$
Simpler argument
Take-home exercise: find the position of the maximum of AB(UA) for NA = 200, NB = 100, U = 180
[Figure: Ω_A(U_A) grows, Ω_B(U_A) falls, and their product Ω_AB(U_A) is sharply peaked at U_A = U/2.]
A special case: two identical sub-systems (NA = NB), AB(UA) is peaked at UA= UB= ½ U :
$$\frac{U_A}{N_A} = \frac{U_B}{N_B} \quad\Rightarrow\quad U_A = U_B = \frac{U}{2} \;\text{ at the maximum of }\Omega_{AB}$$
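The take-home exercise (NA = 200, NB = 100, U = 180ε) can be checked with exact multiplicities rather than the high-T approximation; a sketch (function names are ours):

```python
from math import comb

def omega(N, q):
    """Einstein-solid multiplicity: q quanta among N oscillators."""
    return comb(q + N - 1, q)

NA, NB, q_total = 200, 100, 180
omega_AB = [omega(NA, qA) * omega(NB, q_total - qA)
            for qA in range(q_total + 1)]

# The macropartition with the largest multiplicity:
q_star = max(range(q_total + 1), key=omega_AB.__getitem__)
```

The peak falls where the average energy per oscillator is equal, UA/NA = UB/NB, i.e. at qA = 180 × 200/300 = 120.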
Sharpness of the Multiplicity Function
Example: N = 100,000 x = 0.01 (0.9999)100,000 ~ 4.5·10-5 << 1
How sharp is the peak? Let’s consider small deviations from the maximum for two identical sub-systems:
UA= (U/2) (1+x) UB= (U/2) (1-x) (x <<1)
$$\Omega_{AB} = \Omega_A\,\Omega_B \propto \left[U_A U_B\right]^{N} = \left[\frac{U^2}{4}\left(1 - x^2\right)\right]^{N} \approx \Omega_{\max}\,e^{-Nx^2}$$

– a Gaussian function of x. More rigorously (p. 65), the peak width is

$$\Delta U_A \sim \frac{U}{2\sqrt{N}} \qquad \left(N_A = N_B = N, \;\; x = \frac{\Delta U_A}{U/2}\right)$$

When the system becomes large, the probability as a function of UA (macropartition) becomes very sharply peaked, i.e. the "fluctuation" is very small.
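The N = 100,000 example and the Gaussian approximation can be checked directly; a sketch:

```python
from math import exp, sqrt

N = 100_000
x = 0.01
# Relative multiplicity of a 1% deviation from the peak:
#   Omega_AB / Omega_max = (1 - x^2)^N ~ exp(-N x^2)
exact = (1 - x**2) ** N      # (0.9999)^100000, tiny
gauss = exp(-N * x**2)       # e^(-10), the Gaussian estimate
# Fractional peak width: the multiplicity drops by e at x ~ 1/sqrt(N)
width = 1 / sqrt(N)
```

A mere 1% fluctuation is suppressed by a factor ~4.5·10⁻⁵ already at N = 10⁵; the typical fractional width 1/√N is ~0.3% here and utterly negligible at N ~ 10²³.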
Implications? Irreversibility!
When two macroscopic solids are in thermal equilibrium with each other, completely random and reversible microscopic processes (leading to random shuffling between microstates) tend at the macroscopic level to push the solids inevitably toward an equilibrium macropartition (an irreversible macro behavior). Any random fluctuations away from the most likely macropartition are extremely small!
The vast majority of microstates lie in macropartitions close to the most probable one (this is just the "narrowness" of the macropartition probability graph). Thus,
(a) If the system is not in the most probable macropartition, it will rapidly and inevitably move toward that macropartition. The reason for this “directionality” (irreversibility): there are far more microstates in that direction than away. This is why energy flows from “hot” to “cold” and not vice versa.
(b) It will subsequently stay at that macropartition (or very near to it), in spite of the random shuffling of energy back and forth between the two solids.
Problem:
Consider a system consisting of two Einstein solids, P and Q, in thermal equilibrium. Assume that we know the number of atoms in each solid and ε. What do we know if we also know
(a) the quantum state of each atom in each solid?
(b) the total energy of each of the two solids?
(c) the total energy of the combined system?
(a) → the system's microstate
(b) → the system's macropartition
(c) → the system's macrostate (+ fluctuation)
Problem:
Imagine that you discover a strange substance whose multiplicity is always 1, no matter how much energy you put into it. If you put an object made of this substance (sub-system A) into thermal contact with an Einstein solid having the same number of atoms but much more energy (sub-system B), what will happen to the energies of these sub-systems?
• Energy flows from B to A until they have the same energy.
• Energy flows from A to B until A has no energy.
• No energy will flow from B to A at all.
Entropy of a system in a given macrostate (N, U, V, ...): $S \equiv k_B \ln \Omega(N, U, V, \ldots)$
Entropy is just another (more convenient) way of talking about multiplicity.
Convenience:
reduces ridiculously large numbers to manageable numbers
Examples: for N ~ 10²³, Ω ~ 10^(10²³) and ln Ω ~ 10²³; being multiplied by kB ~ 10⁻²³ J/K, it gives
S ~ 1 J/K.
The "inverse" procedure: suppose the entropy of a certain macrostate is 4600 kB.
What is the multiplicity of the macrostate?
• if a system contains two or more interacting sub-systems, each with its own distinct macrostate, the total entropy of the combined system in a given macropartition is the sum of the entropies the sub-systems have in that macropartition:

$$\Omega_{AB} = \Omega_A \times \Omega_B \times \Omega_C \times \ldots \quad\Rightarrow\quad S_{AB} = S_A + S_B + S_C + \ldots$$
The entropy is a state function, i.e. it depends on the macrostate alone and not on the path of the system to this macrostate.
Units: J/K
$$\Omega = e^{S/k_B} = e^{4600} \approx 10^{2000}$$
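The back-and-forth conversions between S and Ω are one-liners; a sketch:

```python
from math import log

kB = 1.38e-23  # J/K

# Forward: ln(Omega) ~ 1e23  ->  S ~ 1 J/K
S = kB * 1e23

# Inverse: S = 4600 kB  ->  Omega = e^4600, i.e. about 10^2000
log10_omega = 4600 / log(10)
```

Working with ln Ω (i.e. with S) keeps every number on a human scale, which is the whole point of the definition.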
Problem:
Imagine that one macropartition of a combined system of two Einstein solids has an entropy of 1 J/K, while another (where the energy is more evenly divided) has an entropy of 1.001 J/K. How many times more likely are you to find the system in the second macropartition compared to the first?
The Second Law of Thermodynamics
An isolated system, being initially in a non-equilibrium state, will evolve from macropartitions with lower multiplicity (lower probability, lower entropy) to macropartitions with higher multiplicity (higher probability, higher entropy). Once the system reaches the macropartition with the highest multiplicity (highest entropy), it will stay there. Thus,
The entropy of an isolated system never decreases.
(one of the formulations of the second law of thermodynamics).
Whatever increases the number of microstates will happen if it is allowed by the fundamental laws of physics and whatever constraints we place on the system.“Whatever” - energy exchange, particles exchange, expansion of a system, etc.
( Is it really true that the entropy of an isolated system never decreases? consider a pair of very small Einstein solids. Why is this statement more accurate for large systems than small systems? )
Entropy and Temperature
Thus, when two solids are in equilibrium, the slope ∂S/∂U is the same for both of them.
On the other hand, when two solids are in equilibrium, they have the same temperature.
[Figure: S_A(U_A) rises, S_B(U_A) falls; S_AB = S_A + S_B is maximal at U_A = U/2 (N_A = N_B).]

For two identical Einstein solids at high T:

$$\Omega_{AB}(U_A) = \Omega_A(U_A)\,\Omega_B(U - U_A) = \left(\frac{e}{\varepsilon N}\right)^{2N}\left[U_A\,(U - U_A)\right]^{N}$$

$$S_{AB} = k_B\ln\Omega_{AB} = k_B\ln\Omega_A + k_B\ln\Omega_B = S_A + S_B$$
Equilibrium:
$$\frac{\partial S_A}{\partial U_A} = \frac{\partial S_B}{\partial U_B}$$

$$\frac{1}{T} \equiv \left(\frac{\partial S}{\partial U}\right)_{V,N}$$

Units: T – K, S – J/K, U – J
UA, VA, NA UB, VB, NB
To establish the relationship between S and T, let's consider two sub-systems, A and B, isolated from the environment. The sub-systems are separated by a rigid membrane with finite thermal conductivity (Ni and Vi are fixed; thermal energy can flow between the sub-systems). Let the sub-systems have "quadratic" degrees of freedom (Ω ~ U^{fN/2}), for example two identical Einstein solids (NA = NB = N) near the equilibrium macropartition (UA = UB = U/2).
The stat. mech. definition of the temperature
$$\frac{\partial S_{AB}}{\partial U_A} = \frac{\partial S_A}{\partial U_A} + \frac{\partial S_B}{\partial U_A} = \frac{\partial S_A}{\partial U_A} - \frac{\partial S_B}{\partial U_B} = 0 \quad\Rightarrow\quad \frac{\partial S_A}{\partial U_A} = \frac{\partial S_B}{\partial U_B}$$

[Figure: S_A, S_B, and S_AB vs. U_A; the maximum of S_AB is at U_A = U/2.]
2. The slope ∂S/∂U is inversely proportional to T:

$$\frac{1}{T} \equiv \left(\frac{\partial S}{\partial U}\right)_{V,N}$$
We have been considering entropy changes in processes where two interacting systems exchanged thermal energy but the volume and the number of particles in these systems were fixed. In general, however, we need more than just one parameter to specify a macrostate:

$$S = k_B\ln\Omega(U, V, N)$$

1. Note that the partial derivative in the definition of T is calculated at V = const and N = const. The physical meaning of the other two partial derivatives of S will be considered in L.7.
- the energy should flow from higher T to lower T; in thermal equilibrium, TA and TB should be the same.
The sub-system with the larger ∂S/∂U (lower T) should receive energy from the sub-system with the smaller ∂S/∂U (higher T), and this process will continue until ∂S_A/∂U_A and ∂S_B/∂U_B become the same.
Problems
Problem: Imagine that you discover a strange substance whose multiplicity is always 1, no matter how much energy you put into it. If you put an object made of this substance (sub-system A) into thermal contact with an Einstein solid having the same number of atoms but much more energy (sub-system B), what will happen to the energies of these sub-systems?
Problem: An object whose multiplicity is always 1, no matter what its thermal energy is, has a temperature that: (a) is always 0; (b) is always fixed; (c) is always infinite.
Problem: If an object has a multiplicity that decreases as its thermal energy increases (e.g., a two-state paramagnet over a certain U range), its temperature would: (a) be always 0; (b) be always fixed; (c) be negative; (d) be positive.
Hint: $\dfrac{1}{T} \equiv \left(\dfrac{\partial S}{\partial U}\right)_{V,N}$
From S(N,U,V) - to U(N,T,V)
Now we can get an (energy) equation of state U = f(T, V, N, ...) for any system for which we have an explicit formula for the multiplicity (entropy)!! Thus, we've bridged the gap between statistical mechanics and thermodynamics. The recipe:

1. Find Ω(U, V, N, ...) – the most challenging step
2. S(U, V, N, ...) = kB ln Ω(U, V, N, ...)
3. Solve $\dfrac{1}{T} = \left(\dfrac{\partial S(U, V, N, \ldots)}{\partial U}\right)_{V,N}$ for U = f(T, V, N, ...)
Measuring Entropy
Even if we cannot calculate S, we can still measure it:
For V = const and N = const:

$$dS = \frac{dU}{T} = \frac{C_V(T)\,dT}{T} = \frac{\delta Q}{T}$$

$$S(T) - S(0) = \int_0^{T} \frac{C_V(T')\,dT'}{T'}$$
By heating a cup of water (200 g, CV = 840 J/K) from 20°C to 100°C, we increase its entropy by

$$\Delta S = \int_{293}^{373}\frac{C_V\,dT}{T} = (840\ \text{J/K})\,\ln\frac{373}{293} \approx 200\ \text{J/K}$$

At the same time, the multiplicity of the system is increased by a factor of $e^{1.5\times 10^{25}}$.
In L.6, we'll see that the equation dS = δQ/T holds for all reversible (quasi-static) processes (even if V is changed in the process). This is the "thermodynamic" definition of entropy, which Clausius introduced in 1854, long before Boltzmann gave his "statistical" definition S ≡ kB ln Ω.
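The cup-of-water numbers can be reproduced in a few lines; a sketch:

```python
from math import log

C_V = 840.0          # J/K, heat capacity of 200 g of water (assumed T-independent)
T_i, T_f = 293.0, 373.0
kB = 1.38e-23        # J/K

# dS = C_V dT / T integrated from T_i to T_f:
dS = C_V * log(T_f / T_i)     # ~200 J/K

# The multiplicity grows by exp(dS / kB):
ln_ratio = dS / kB            # ~1.5e25, so Omega grows by e^(1.5e25)
```

The exponent 1.5·10²⁵ is so large that "the final macrostate is more probable" vastly understates the asymmetry.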
An Einstein Solid: from S(N,U) to U(N,T) at high T
High temperatures (kBT >> ε, q >> N):

$$\Omega(N, q) \approx \left(\frac{e\,q}{N}\right)^{N} = \left(\frac{e\,U}{\varepsilon N}\right)^{N}$$

$$S(N, U) = k_B\ln\Omega = Nk_B\ln\frac{e\,U}{\varepsilon N}$$

$$\frac{1}{T} = \frac{\partial S}{\partial U} = \frac{Nk_B}{U} \quad\Rightarrow\quad U(N, T) = Nk_BT$$
- in agreement with the equipartition theorem: the total energy should be ½kBT times the number of degrees of freedom.
To compare with experiment, we can measure the heat capacity at constant volume:

$$C = \frac{\delta Q}{dT} = \frac{dU + P\,dV}{dT} \qquad\Rightarrow\qquad C_V = \left(\frac{\partial U}{\partial T}\right)_{N,V}$$

For the Einstein solid at high T:

$$C_V = \frac{\partial}{\partial T}\left(Nk_BT\right) = Nk_B$$

– in nice agreement with experiment.
An Einstein Solid: from S(N,U) to U(N,T) at low T
Low temperatures (kBT << ε, q << N):

$$\Omega(N, q) \approx \left(\frac{e\,N}{q}\right)^{q} = \left(\frac{e\,N\varepsilon}{U}\right)^{U/\varepsilon}$$

$$S(N, U) = k_B\ln\Omega = \frac{U}{\varepsilon}\,k_B\ln\frac{e\,N\varepsilon}{U}$$

$$\frac{1}{T} = \frac{\partial S}{\partial U} = \frac{k_B}{\varepsilon}\ln\frac{N\varepsilon}{U} \quad\Rightarrow\quad U(N, T) = N\varepsilon\,e^{-\varepsilon/k_BT}$$
- as T 0, the energy goes to zero as expected (Pr. 3.5).
The low-T heat capacity (a more accurate result will be obtained on the basis of the Debye model of solids):

$$C_V = \frac{\partial U}{\partial T} = N\varepsilon\,e^{-\varepsilon/k_BT}\cdot\frac{\varepsilon}{k_BT^2} = Nk_B\left(\frac{\varepsilon}{k_BT}\right)^{2} e^{-\varepsilon/k_BT}$$
Example (Pr. 3.14, page 97)
For a mole of aluminum, CV = aT + bT³ at T < 50 K (a = 1.35·10⁻³ J/K², b = 2.48·10⁻⁵ J/K⁴). The linear term is due to mobile electrons, the cubic term due to crystal lattice vibrations. Find S(T) and evaluate the entropy at T = 1 K and 10 K.
$$S(T) = \int_0^{T}\frac{C_V(T')\,dT'}{T'} = \int_0^{T}\left(a + bT'^2\right)dT' = aT + \frac{bT^3}{3}$$
T = 1 K: $S(1\,\text{K}) = (1.35\times10^{-3})(1) + \frac{2.48\times10^{-5}}{3}(1)^3 \approx 1.36\times10^{-3}\ \text{J/K}$
– at low T, nearly all the entropy comes from the mobile electrons.

T = 10 K: $S(10\,\text{K}) = (1.35\times10^{-3})(10) + \frac{2.48\times10^{-5}}{3}(10)^3 \approx 2.18\times10^{-2}\ \text{J/K}$
– most of the entropy comes from lattice vibrations.
$$\frac{S(1\,\text{K})}{k_B} = \frac{1.36\times10^{-3}\ \text{J/K}}{1.38\times10^{-23}\ \text{J/K}} \approx 10^{20}, \qquad \frac{S(10\,\text{K})}{k_B} = \frac{2.18\times10^{-2}}{1.38\times10^{-23}} \approx 1.6\times10^{21}$$

– much less than the # of particles: most degrees of freedom are still frozen out.
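The aluminum numbers are easy to verify; a sketch:

```python
a = 1.35e-3    # J/K^2, electronic term coefficient
b = 2.48e-5    # J/K^4, lattice term coefficient
kB = 1.38e-23  # J/K

def S(T):
    """S(T) = a*T + b*T^3/3, from C_V = a*T + b*T^3 (per mole of Al, T < 50 K)."""
    return a * T + b * T**3 / 3

S1, S10 = S(1.0), S(10.0)
# In units of kB, both are far below the ~6e23 particles per mole:
S1_over_kB = S1 / kB
```

At 1 K the lattice contribution bT³/3 is ~0.6% of the total; by 10 K it is already ~38%.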
Residual Entropy
Glasses aren't really in equilibrium: the relaxation time is huge, and they do not have a well-defined T or CV. Glasses have a particularly large entropy at T = 0.
[Figure: S vs. T. On cooling, a liquid either crystallizes (S → 0) or becomes a supercooled liquid and then a glass; the glass branch extrapolates to a nonzero residual entropy at T = 0.]
Debenedetti & Stillinger, Nature (2001)
Entropy of an Ideal Gas
Find (U,V,N,...) – the most challenging step
S (U,V,N,...) = kB ln (U,V,N,...)
Solve for U = f (T,V,N,...)
$$\frac{1}{T} = \left(\frac{\partial S(U, V, N, \ldots)}{\partial U}\right)_{V,N}$$
Now we will derive the equation(s) of state for an ideal gas from the principles of statistical mechanics. We will follow the path prescribed by the ‘microcanonical’ ensemble thinking.
So far we have treated quantum systems whose states in the configuration (phase) space may be enumerated. When dealing with classical systems with translational degrees of freedom, we need to learn how to calculate the multiplicity.

Multiplicity of a single particle: this is more complicated than for an Einstein solid, because it depends on three rather than two macroparameters (e.g., N, U, V).
Example: particle in a one-dimensional “box”
Quantum mechanics (the uncertainty principle) helps us to enumerate all the distinct states in the configuration (phase) space: one microstate occupies a phase-space cell of area δx·δpx ≈ h. For a particle confined to −L ≤ x ≤ L with momentum between −px and +px, the number of microstates is

$$\Omega \approx \frac{\Delta x\,\Delta p_x}{\delta x\,\delta p_x} = \frac{2L\cdot 2p_x}{h}$$

The total number of ways of filling up the cells in phase space is the product of the number of ways the "space" cells can be filled times the number of ways the "momentum" cells can be filled.
Multiplicity of a Monatomic Ideal Gas (simplified)
For a molecule in a three-dimensional box, the state of the molecule is a point in the 6D phase space: its position (x, y, z) and its momentum (px, py, pz). The number of "space" microstates is proportional to the volume V.
There is some momentum distribution of molecules in an ideal gas (Maxwell), with a long “tail” that goes all the way up to p = (2mU)1/2 (U is the total energy of the gas). However, the momentum vector of an “average” molecule is confined within a sphere of radius p ~ (2mU/N)1/2 (U/N is the average energy per molecule). Thus, for a single “average” molecule:
The total number of microstates for N molecules:

$$\Omega \propto \frac{V^{N}\times(\text{accessible volume in momentum space})}{h^{3N}}$$

However, we have over-counted the multiplicity, because we have assumed that the atoms are distinguishable. For indistinguishable quantum particles, the result should be divided by N! (the number of ways of arranging N identical atoms in a given set of "boxes").

The accessible momentum volume for N particles is the "area" of a 3N-dimensional hyper-sphere of radius $p = \sqrt{2mU}$, set by the momentum constraint

$$p_{1x}^2 + p_{1y}^2 + p_{1z}^2 + \cdots + p_{Nz}^2 = 2mU$$

Monatomic ideal gas (3N quadratic degrees of freedom): Ω ∝ U^{3N/2}. In general Ω ∝ U^{fN/2}, where fN is the total # of "quadratic" degrees of freedom.

The reason why m matters: for a given U, a molecule with a larger mass has a larger momentum, thus a larger "volume" accessible in the momentum space.
More Accurate Calculation of Ω_N
For a particle in a box (L)³, the quantum states are labeled by nx, ny, nz ≥ 0 (Appendix A). If δp << p, the total degeneracy (multiplicity) of 1 particle with energy U is obtained by plugging in the "area" of the hyper-sphere in momentum space; the total degeneracy of N indistinguishable particles with energy U follows from the same construction in 3N dimensions, divided by N!.
Entropy of an Ideal Gas
f = 3 (monatomic), 5 (diatomic), 6 (polyatomic)

(Monatomic ideal gas) The Sackur–Tetrode equation:

$$S(N, V, U) = Nk_B\left[\ln\frac{V}{N} + \frac{3}{2}\ln\left(\frac{4\pi m}{3h^2}\,\frac{U}{N}\right) + \frac{5}{2}\right] = Nk_B\left[\ln\left(\frac{V}{N}\left(\frac{4\pi m}{3h^2}\frac{U}{N}\right)^{3/2}\right) + \frac{5}{2}\right]$$

In general, for a gas of polyatomic molecules the numerical constant changes, but the structure S = NkB [ln(V/N) + (f/2) ln(U/N) + const(m, f)] is the same.
V/N – the average volume per molecule
U/N – the average energy per molecule
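The Sackur–Tetrode equation can be evaluated numerically; a sketch for one mole of helium at room conditions (the constants are standard values; the function name is ours):

```python
from math import log, pi

kB = 1.380649e-23   # J/K, Boltzmann constant
h = 6.62607e-34     # J*s, Planck constant
NA = 6.02214e23     # 1/mol, Avogadro number

def sackur_tetrode(N, V, U, m):
    """S = N kB [ ln( (V/N) * (4 pi m U / (3 N h^2))^(3/2) ) + 5/2 ]"""
    return N * kB * (log((V / N) * (4 * pi * m * U / (3 * N * h**2))**1.5) + 2.5)

# One mole of He at T = 300 K, P = 101325 Pa; ideal gas: V = N kB T / P, U = (3/2) N kB T
N = NA
T, P = 300.0, 101325.0
V = N * kB * T / P
U = 1.5 * N * kB * T
m = 4.0026e-3 / NA   # mass of one He atom, kg

S = sackur_tetrode(N, V, U, m)   # roughly 126 J/K
```

The result, about 126 J/(K·mol), matches the measured standard molar entropy of helium, a striking success of counting microstates.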
Problem. Two cylinders (V = 1 liter each) are connected by a valve. One contains hydrogen (H₂) at P = 10⁵ Pa, T = 20°C; the other contains helium (He) at P = 3·10⁵ Pa, T = 100°C. Find the entropy change after mixing and equilibrating.
For each gas (here Vf/Vi = 2):

$$\Delta S(N, V, T) = Nk_B\ln\frac{V_f}{V_i} + \frac{f}{2}\,Nk_B\ln\frac{T_f}{T_i}$$
The temperature after mixing follows from energy conservation, U₁(T₁) + U₂(T₂) = U_total(T_f):

$$\frac{5}{2}N_1k_BT_1 + \frac{3}{2}N_2k_BT_2 = \left(\frac{5}{2}N_1 + \frac{3}{2}N_2\right)k_BT_f, \qquad N_1k_B = \frac{P_1V_1}{T_1},\quad N_2k_B = \frac{P_2V_2}{T_2}$$

H₂ (f = 5): $\displaystyle \Delta S_{H_2} = N_1k_B\ln 2 + \frac{5}{2}N_1k_B\ln\frac{T_f}{T_1}$

He (f = 3): $\displaystyle \Delta S_{He} = N_2k_B\ln 2 + \frac{3}{2}N_2k_B\ln\frac{T_f}{T_2}$

$$\Delta S_{total} = \Delta S_{H_2} + \Delta S_{He} = (N_1 + N_2)k_B\ln 2 + \left(\frac{5}{2}N_1\ln\frac{T_f}{T_1} + \frac{3}{2}N_2\ln\frac{T_f}{T_2}\right)k_B \approx 0.67\ \text{J/K}$$
Entropy of MixingConsider two different ideal gases (N1, N2) kept in two separate volumes (V1,V2) at the same temperature. To calculate the increase of entropy in the mixing process, we can treat each gas as a separate system. In the mixing process, U/N remains the same (T will be the same after mixing). The parameter that changes is V/N:
$$S(N, V, U) = Nk_B\left[\ln\left(\frac{V}{N}\left(\frac{4\pi m}{3h^2}\frac{U}{N}\right)^{3/2}\right) + \frac{5}{2}\right]$$
The total entropy of the system is greater after mixing – thus, mixing is irreversible.
$$\frac{\Delta S_{total}}{k_B} = \frac{\Delta S_A + \Delta S_B}{k_B} = N_1\ln\frac{V}{V_1} + N_2\ln\frac{V}{V_2} \qquad (V = V_1 + V_2)$$
If N₁ = N₂ = N/2 and V₁ = V₂ = V/2:

$$\frac{\Delta S_{total}}{k_B} = \frac{N}{2}\ln\frac{V}{V/2} + \frac{N}{2}\ln\frac{V}{V/2} = N\ln 2$$
Gibbs “Paradox”
If two mixing gases are of the same kind (indistinguishable molecules):
$$\frac{\Delta S_{total}}{k_B} = N\ln\frac{V}{N} - N_1\ln\frac{V_1}{N_1} - N_2\ln\frac{V_2}{N_2} = 0 \quad\text{if}\quad \frac{V_1}{N_1} = \frac{V_2}{N_2}\;\left(=\frac{V}{N}\right)$$
Here the 1/N! in the multiplicity is essential:

$$\Omega_N = \frac{1}{N!}\,\frac{V^{N}}{h^{3N}}\,\frac{\pi^{3N/2}}{(3N/2)!}\,(2mU)^{3N/2}$$

$$S(N, V, U) = Nk_B\left[\ln\left(\frac{V}{N}\left(\frac{4\pi m}{3h^2}\frac{U}{N}\right)^{3/2}\right) + \frac{5}{2}\right]$$
Quantum-mechanical indistinguishability is important! (Even though this equation applies only in the low-density limit, which is "classical" in the sense that the distinction between fermions and bosons disappears.)
ΔS_total = 0 because the U/N and V/N available for each molecule remain the same after mixing.
$$\frac{\Delta S_{total}}{k_B} = \frac{\Delta S_A + \Delta S_B}{k_B} = N_1\ln\frac{V}{V_1} + N_2\ln\frac{V}{V_2}$$

– applies only if the two gases are different!
Problem. Two identical ideal gases with the same pressure P and the same number of particles N, but with different temperatures T₁ and T₂, are confined in two vessels of volume V₁ and V₂, which are then connected. Find the change in entropy after the system has reached equilibrium.
BBBBB kNTkh
m
N
VkNkN
N
U
h
m
N
VkNUVNS
2
5
2
3
3
4ln
2
5
3
4ln),,(
2/3
2
2/3
2
BfBf kNTN
VVkNS 2
2
5
2ln2 2/321
21
22121
2121
221
21
221
21
2
21
221
2/32
2/31
3
21
22
21
4ln
2
5
22
4ln
2
3
4ln
ln2
3
4lnln
2ln
TT
TTTTNkVVP
TT
TT
VV
VV
TT
T
VV
VV
TT
T
VV
N
N
VV
kN
S
B
ff
B
at T1=T2, S=0, as it should be (Gibbs paradox)
BBBBi kNTN
VkNkNT
N
VkNSSS
2
5ln
2
5ln 2/3
222/3
11
21
221 TT
T f
- prove it!
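A numeric sketch of the result (not from the slides), in units where kB = N = P = 1 so that V_i = T_i; it checks that ΔS vanishes at T1 = T2 and is positive otherwise:

```python
from math import log, sqrt

def dS_after_connecting(T1, T2):
    """Entropy change for two identical monatomic gases (same P, same N,
    temperatures T1 and T2) after the vessels are connected.
    Units: kB = N = P = 1, so V_i = T_i and Tf = (T1 + T2)/2."""
    V1, V2 = T1, T2
    Tf = (T1 + T2) / 2
    return (log((V1 + V2) ** 2 / (4 * V1 * V2))
            + 3 * log(Tf / sqrt(T1 * T2)))

dS_equal = dS_after_connecting(300.0, 300.0)   # Gibbs paradox check
dS_mixed = dS_after_connecting(200.0, 400.0)   # irreversible equilibration
```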
An Ideal Gas: from S(N,V,U) - to U(N,V,T)

Ideal gas (fN degrees of freedom): Ω(U,V,N) = f(N,V) U^{fN/2}

S(N,V,U) = (f/2) N kB ln(U/N) + N kB ln(V/N) + g(N,m)

1/T = (∂S/∂U)_{V,N} = (f/2) N kB (1/U)

⇒  U(N,V,T) = (f/2) N kB T  - the "energy" equation of state

The heat capacity for an ideal gas (monatomic: f = 3):

C_V = (∂U/∂T)_{N,V} = (f/2) N kB

- in agreement with the equipartition theorem: the total energy should be (1/2) kB T times the number of degrees of freedom.
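The step from 1/T = (∂S/∂U)_{V,N} to U = (f/2) N kB T can be checked numerically with a finite-difference derivative (a sketch with illustrative values; the U-independent term g(N,m) is dropped, since it cannot affect ∂S/∂U):

```python
from math import log

kB = 1.38e-23
N, V, f = 1e22, 1e-3, 3.0   # monatomic gas, illustrative values

def S(U):
    # S(N,V,U) = (f/2) N kB ln(U/N) + N kB ln(V/N); g(N,m) omitted
    return 0.5 * f * N * kB * log(U / N) + N * kB * log(V / N)

def temperature(U, h=1e-6):
    # 1/T = (dS/dU)_{V,N}, via a central finite difference
    dSdU = (S(U * (1 + h)) - S(U * (1 - h))) / (2 * h * U)
    return 1.0 / dSdU

# Pick T, form U = (f/2) N kB T, and recover T from dS/dU
T = 300.0
U = 0.5 * f * N * kB * T
T_recovered = temperature(U)
```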
Partial Derivatives of the Entropy

We have been considering the entropy changes in processes where two interacting systems exchanged thermal energy but the volume and the number of particles in these systems were fixed. In general, however, we need more than just one parameter to specify a macrostate; e.g., for an ideal gas

S = S(U,V,N) = kB ln Ω(U,V,N)

When all macroscopic quantities U, V, N are allowed to vary:

dS = (∂S/∂U)_{V,N} dU + (∂S/∂V)_{U,N} dV + (∂S/∂N)_{U,V} dN

We are familiar with the physical meaning of only one partial derivative of entropy:

(∂S/∂U)_{V,N} = 1/T

Today we will explore what happens if we let V vary, and analyze the physical meaning of the other two partial derivatives of the entropy, (∂S/∂V)_{U,N} and (∂S/∂N)_{U,V}.
Mechanical Equilibrium and Pressure

Let's fix U_A, N_A and U_B, N_B, but allow V to vary (the membrane is insulating and impermeable to gas molecules, but its position is not fixed). Following the same logic, spontaneous "exchange of volume" between the sub-systems will drive the system towards mechanical equilibrium (the membrane at rest). The equilibrium macropartition should have the largest (by far) multiplicity Ω(U,V) and entropy S(U,V).

[Diagram: sub-systems A (U_A, V_A, N_A) and B (U_B, V_B, N_B) separated by a movable membrane]

At the maximum of S_AB, with V_A + V_B fixed (so dV_B = −dV_A):

∂S_AB/∂V_A = ∂S_A/∂V_A + ∂S_B/∂V_A = ∂S_A/∂V_A − ∂S_B/∂V_B = 0

⇒  (∂S_A/∂V_A)_{U,N} = (∂S_B/∂V_B)_{U,N}

[Plot: S_A, S_B, and S_AB = S_A + S_B vs V_A; the maximum of S_AB is at V_A = V_A(eq)]

In mechanical equilibrium: the volume-per-molecule should be the same for both sub-systems, or, if T is the same, P must be the same on both sides of the membrane.

The stat. phys. definition of pressure:

P = T (∂S/∂V)_{U,N}

e.g. ideal gas: (∂S/∂V)_{U,N} = N kB/V = P/T
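A small numeric illustration (not from the slides): scanning the macropartitions of two ideal-gas sub-systems exchanging volume shows that S_AB is maximized exactly where the volume-per-molecule is equal on both sides. The particle numbers and total volume are arbitrary illustrative values.

```python
from math import log

kB = 1.0  # units of kB

# Two ideal-gas sub-systems exchanging volume at fixed U_A, U_B, N_A, N_B
NA, NB = 1.0, 3.0
V_total = 4.0

def S_A(VA):  # volume-dependent part of the entropy, S = N kB ln V + ...
    return NA * kB * log(VA)

def S_B(VB):
    return NB * kB * log(VB)

# Scan the macropartitions and find the one maximizing S_AB = S_A + S_B
best_VA = max((i / 1000 * V_total for i in range(1, 1000)),
              key=lambda VA: S_A(VA) + S_B(V_total - VA))

# Analytic condition (dS_A/dV_A) = (dS_B/dV_B): N_A/V_A = N_B/V_B,
# i.e. equal volume-per-molecule
VA_eq = V_total * NA / (NA + NB)
```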
The "Pressure" Equation of State for an Ideal Gas

Ideal gas (fN degrees of freedom):

S(N,V,T) = N kB ln(V/N) + (f/2) N kB ln T + g(N,m)

The "energy" equation of state (U ↔ T):

1/T = (∂S/∂U)_{V,N} = (f/2) N kB (1/U)  ⇒  U = (f/2) N kB T

The "pressure" equation of state (P ↔ T):

P = T (∂S/∂V)_{U,N} = N kB T/V  ⇒  PV = N kB T

- we have finally derived the equation of state of an ideal gas from first principles!
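The pressure equation of state can likewise be checked with a finite-difference derivative of S with respect to V at fixed U (a sketch with illustrative values; the V- and U-independent terms drop out of the derivative):

```python
from math import log

kB = 1.38e-23
N, f = 1e22, 3.0   # monatomic gas, illustrative values

def S(V, U):
    # S(N,V,U) = N kB ln(V/N) + (f/2) N kB ln(U/N); g(N,m) omitted
    return N * kB * log(V / N) + 0.5 * f * N * kB * log(U / N)

def pressure(V, T, h=1e-6):
    # P = T (dS/dV)_{U,N}, with U fixed by the energy equation of state
    U = 0.5 * f * N * kB * T
    dSdV = (S(V * (1 + h), U) - S(V * (1 - h), U)) / (2 * h * V)
    return T * dSdV

V, T = 1e-3, 300.0
P = pressure(V, T)
P_ideal = N * kB * T / V   # PV = N kB T
```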
Thermodynamic identity I

Let's assume N is fixed:

dS = (∂S/∂U)_{V,N} dU + (∂S/∂V)_{U,N} dV

thermal equilibrium:  (∂S/∂U)_{V,N} = 1/T

mechanical equilibrium:  (∂S/∂V)_{U,N} = P/T

dS = (1/T) dU + (P/T) dV,  i.e.  dU = T dS − P dV
Quasi-Static Processes

Quasistatic adiabatic (δQ = 0) processes: dS = 0  - isentropic processes

dU = T dS − P dV  (all processes)

dU = δQ + δW  (quasi-static processes with fixed N)

Thus, for quasi-static processes: δQ = T dS, i.e.

dS = δQ/T

dS = δQ/T is an exact differential (S is a state function). Thus, the factor 1/T converts δQ into an exact differential for quasi-static processes.

The quasi-static adiabatic process with an ideal gas: TV^{γ−1} = const, PV^γ = const - we've derived these equations from the 1st Law and PV = N kB T. On the other hand, from the Sackur-Tetrode equation for an isentropic process: S = const ⇒ V T^{f/2} = const (equivalent, since γ = (f+2)/f).

Comment on State Functions:

[P-V diagram: Q depends on the path taken in the P-V plane, while S does not]
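A quick check (not from the slides) that TV^{γ−1} = const and V T^{f/2} = const describe the same isentropic curve, using a diatomic gas (f = 5, γ = 1.4) and illustrative volumes:

```python
from math import log

f = 5.0                  # diatomic ideal gas
gamma = (f + 2) / f      # = 1.4

def S_per_NkB(V, T):
    # S/(N kB) = ln(V T^{f/2}) + const (Sackur-Tetrode, constants dropped)
    return log(V * T ** (f / 2))

# Quasi-static adiabatic compression: T V^{gamma-1} = const
V1, T1 = 2.0e-3, 300.0
V2 = 1.0e-3
T2 = T1 * (V1 / V2) ** (gamma - 1)

dS = S_per_NkB(V2, T2) - S_per_NkB(V1, T1)  # should vanish: isentropic
```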
Problem:

(a) Calculate the entropy increase of an ideal gas in an isothermal process.

(b) Calculate the entropy increase of an ideal gas in an isochoric process.

You should be able to do this using (a) the Sackur-Tetrode equation and (b) dS = δQ/T (all the processes are quasi-static).

Ideal gas: Ω(N,V,U) = f(N) V^N U^{fN/2}, so that S = N kB ln[g(N) V T^{f/2}]

δQ = dU + P dV = (f/2) N kB dT + (N kB T/V) dV

dS = δQ/T = (f/2) N kB dT/T + N kB dV/V

T = const:  ΔS = N kB ln(V_f/V_i)

V = const:  ΔS = (f/2) N kB ln(T_f/T_i)

Let's verify that we get the same result with approaches (a) and (b) (e.g., for T = const). Since ΔU = 0,

Q = W = ∫ (N kB T/V) dV = N kB T ln(V_f/V_i)  ⇒  ΔS = Q/T = N kB ln(V_f/V_i)

(Pr. 2.34)
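The consistency check for T = const can also be done numerically, integrating dS = δQ/T along the isotherm and comparing with the Sackur-Tetrode result (a sketch with illustrative N, T, and volumes):

```python
from math import log

kB = 1.38e-23
N, T = 1e22, 300.0   # illustrative values

# Approach (b): integrate dS = dQ/T = (N kB T / V) dV / T along the isotherm
Vi, Vf, steps = 1e-3, 3e-3, 100000
dV = (Vf - Vi) / steps
dS_numeric = sum(N * kB / (Vi + (i + 0.5) * dV) * dV for i in range(steps))

# Approach (a): Sackur-Tetrode result for T = const
dS_formula = N * kB * log(Vf / Vi)
```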
Problem:

A body of mass M with heat capacity (per unit mass) C, initially at temperature T0 + ΔT, is brought into thermal contact with a heat bath at temperature T0.

(a) Show that if ΔT << T0, the increase ΔS in the entropy of the entire system (body + heat bath) when equilibrium is reached is proportional to (ΔT)².

(b) Find ΔS if the body is a bacterium of mass 10⁻¹⁵ kg with C = 4 kJ/(kg·K), T0 = 300 K, ΔT = 0.03 K.

(c) What is the probability of finding the bacterium at its initial temperature T0 + ΔT for τ = 10⁻¹² s at some point over the lifetime of the Universe (~10¹⁸ s)?

(a) Writing MC for the total heat capacity of the body:

ΔS_body = ∫ MC dT'/T' (from T0+ΔT down to T0) = MC ln[T0/(T0+ΔT)] = −MC ln(1 + ΔT/T0) < 0

ΔS_heat bath = Q/T0 = MC ΔT/T0 > 0  (the bath absorbs Q = MC ΔT at fixed T0)

ΔS_total = ΔS_body + ΔS_heat bath = MC [ΔT/T0 − ln(1 + ΔT/T0)]
= MC [ΔT/T0 − ΔT/T0 + (1/2)(ΔT/T0)² − (1/3)(ΔT/T0)³ + ...] ≈ (MC/2)(ΔT/T0)² > 0

(b)

ΔS_total ≈ (MC/2)(ΔT/T0)² = (4·10³ J/(kg·K) · 10⁻¹⁵ kg / 2) · (0.03/300)² = 2·10⁻²⁰ J/K
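A numeric check of parts (a) and (b): the exact expression agrees with the (ΔT)² approximation to high accuracy and reproduces ΔS_total = 2·10⁻²⁰ J/K.

```python
from math import log

M = 1e-15          # kg
c = 4e3            # J/(kg K), specific heat capacity
T0, dT = 300.0, 0.03

C = M * c          # total heat capacity of the body, J/K

dS_body = C * log(T0 / (T0 + dT))     # < 0: the body cools down to T0
dS_bath = C * dT / T0                 # > 0: the bath absorbs Q = C dT at T0
dS_total = dS_body + dS_bath          # net increase, second law

dS_approx = 0.5 * C * (dT / T0) ** 2  # leading term for dT << T0
```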
Problem (cont.)

(c) The multiplicity Ω of the equilibrium state (T_bacterium = 300 K) is greater than that of the non-equilibrium state (T_bacterium = 300.03 K) by a factor of

exp(ΔS_total/kB) = exp(2·10⁻²⁰ J/K / 1.38·10⁻²³ J/K) ≈ e^{1450} ≈ 10^{630}

The number of "1 ps" trials over the lifetime of the Universe: 10¹⁸ s / 10⁻¹² s = 10³⁰.

Thus, the probability of the event happening in 10³⁰ trials:

(# of events) × (probability of occurrence of an event) ≈ 10³⁰ × 10⁻⁶³⁰ = 10⁻⁶⁰⁰ ≈ 0
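The orders of magnitude in part (c) can be reproduced with logarithms (working in log₁₀ avoids overflow when evaluating exp(1450)):

```python
from math import log

kB = 1.38e-23          # J/K
dS_total = 2e-20       # J/K, from part (b)

# Multiplicity ratio exp(dS_total/kB), expressed as a power of 10
exponent = dS_total / kB / log(10.0)    # ~ 630

# Number of 1-ps observation windows over ~1e18 s
n_trials_log10 = 18 + 12                # log10(1e18 / 1e-12) = 30

# log10 of the probability of ever observing the fluctuation
prob_log10 = n_trials_log10 - exponent  # ~ -600: essentially impossible
```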