
Page 1: Lecture 4: Entropy

Lecture 4: Entropy

Chapter I. Basic Principles of Stat Mechanics

A.G. Petukhov, PHYS 743

September 7, 2017


Page 2: Subsystem in a Thermostat

We divide a macroscopic system A into two subsystems a and b. The probability dw_a that the energy of subsystem a falls into the interval (E_a, E_a + dE_a) is proportional to the phase volume dq_a · dp_a between the surfaces of constant energy E_a and E_a + dE_a:

dw_a(q_a, p_a) = \rho_a(E_a)\, dq_a\, dp_a / h^s = \rho_a(E_a)\, d\Gamma_a

We can calculate ρ_a(E_a) using the microcanonical distribution, integrating over all possible states of subsystem b (the thermostat):

dw_a(q_a, p_a) = C\, (dq_a\, dp_a / h^s) \int \delta(E_0 - E_a - E_b)\, dq_b\, dp_b / h^s = C\, g_b(E_0 - E_a)\, d\Gamma_a = C\, g_b(E_0 - E_a)\, g_a(E_a)\, dE_a \quad (1)

Therefore

\rho_a(E_a) = C\, g_b(E_0 - E_a) \quad (2)


Page 3: Sharpness of the Energy Distribution

Thus we have represented dw_a as:

dw_a = W(E_a)\, dE_a,

where the probability density

W(E_a) = g_a(E_a)\, \rho_a(E_a) \propto g_a(E_a)\, g_b(E_0 - E_a) \quad (3)

We know that g(E) ∝ E^s is an extremely fast, monotonically increasing function of its argument. The second formula in Eq. (3) shows that W(E) must have an extremely sharp maximum at some energy Ē, because it is a product of an increasing function (g_a) and a decreasing function (g_b) of E_a.

Conservation of energy dictates that the energy of the subsystem lies in a very narrow interval ∆E around the most probable value Ē. From now on we drop the index a for simplicity and approximate W(E) by a rectangular function:

W(E) = \begin{cases} g(\bar{E})\,\rho(\bar{E}), & \bar{E} - \Delta E/2 \le E \le \bar{E} + \Delta E/2 \\ 0, & \text{otherwise} \end{cases}

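As a quick numerical illustration (my sketch, not from the slides; modest exponents s_a = s_b = s stand in for s ~ 10^23), the peak of W(E) ∝ E^s (E_0 − E)^s at Ē = E_0/2 sharpens as s grows, with width shrinking like 1/√s:

import numpy as np

# W(E) ~ E^s * (E0 - E)^s with modest exponents standing in for s ~ 1e23.
# Work in log space: W itself would overflow for large s.
E0 = 1.0
E = np.linspace(1e-6, E0 - 1e-6, 200001)

for s in (10, 1000, 100000):
    logW = s * np.log(E) + s * np.log(E0 - E)
    W = np.exp(logW - logW.max())            # normalize the peak to 1
    peak = E[np.argmax(W)]                   # most probable energy, here E0/2
    fwhm = np.ptp(E[W > 0.5])                # full width at half maximum
    print(f"s={s:6d}  peak={peak:.4f}  FWHM={fwhm:.2e}  FWHM*sqrt(s)={fwhm*np.sqrt(s):.3f}")
# FWHM shrinks like 1/sqrt(s); at s ~ 1e23 the distribution is effectively a spike.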


Page 6: Boltzmann’s Definition of the Entropy

Normalization of W(E) (area = 1) yields:

\rho(\bar{E})\, \Delta\Gamma = 1, \quad (4)

where ∆Γ is the multiplicity of the subsystem, i.e. the total number of microstates accessible to the subsystem in the characteristic interval ∆E around the average value Ē. The quantity

S = k_B \log \Delta\Gamma \quad (5)

is called the entropy. This is Boltzmann’s celebrated definition of the entropy. Here k_B = 1.38 · 10⁻²³ J/K is Boltzmann’s constant. In many derivations we will follow L&L and set k_B = 1:

S = \log \Delta\Gamma

We argued previously that

\log \rho(E) = \alpha - \beta E

can be approximated by the linear term of its Taylor expansion.
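A small numerical aside (mine, not from the slides): for N independent two-level spins, the multiplicity of the macrostate with n spins up is ∆Γ = N!/(n!(N−n)!), and Eq. (5) gives an extensive entropy, S/N → log 2 for n = N/2:

import math

# Boltzmann entropy S = log(multiplicity), k_B = 1, for N two-level spins
# with n spins up: Delta_Gamma = N! / (n! (N - n)!), via log-gamma.
def spin_entropy(N, n):
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

for N in (100, 10000, 1000000):
    S = spin_entropy(N, N // 2)
    print(f"N={N:8d}  S={S:12.2f}  S/N={S / N:.5f}")  # S/N -> log 2 = 0.69315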

Page 7: Gibbs Definition of the Entropy


The linear approximation for log ρ(E) is the only meaningful way to represent its energy dependence while preserving additivity. Let us explore this linearity:

\log \rho(\bar{E}) = \log \rho(\langle E \rangle) = \alpha - \beta \langle E \rangle = \langle \alpha - \beta E \rangle = \langle \log \rho \rangle

Using Eqs. (4) and (5) we obtain (k_B = 1):

S = -\langle \log \rho \rangle

or

S = -\int \log \rho(q, p)\, \rho(q, p)\, dq\, dp / h^s \quad (6)

Eq. (6) is the Gibbs definition of entropy, which is valid not only for equilibrium but also for non-equilibrium statistical mechanics. It is also at the core of information theory.

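As an illustration (my sketch, not from the slides), the discrete form of Eq. (6) is easy to evaluate; a uniform distribution maximizes it, while a sharply peaked one nearly kills it:

import numpy as np

def gibbs_entropy(p):
    """S = -sum(p log p) for a discrete distribution, k_B = 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                            # convention: 0 log 0 = 0
    return -np.sum(p * np.log(p))

uniform = np.full(8, 1 / 8)                 # equal weight on 8 states
peaked = np.array([0.93] + [0.01] * 7)      # almost certainly in one state
print(gibbs_entropy(uniform))               # log 8 = 2.079...
print(gibbs_entropy(peaked))                # ~ 0.39, far smaller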

Page 8: Entropy in Quantum Statistics

In Quantum Mechanics (QM) we deal with discrete energy levels E_n. We can then introduce a QM analog of ρ(q, p), denoted ρ_n (≡ w_n in L&L). The quantity ρ_n (sometimes called the population or occupation number) is the probability that a QM system occupies the energy level E_n. The QM analog of Eq. (6) then reads:

S = -\sum_n \rho_n \log \rho_n \quad (7)

The quantities ρ_n describe a QM subsystem of a larger closed system. Such subsystems are ”mixtures” that cannot be described by wave functions but rather by density matrices. It turns out that the density matrices relevant for equilibrium statistical mechanics are diagonal, ρ_nm = ρ_n δ_nm, in the energy representation, i.e. when the indices n and m enumerate eigenstates of the Hamiltonian:

H |n\rangle = E_n |n\rangle

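A short check (my sketch, not from the slides): computing S = −Tr(ρ log ρ) from the eigenvalues of ρ reduces to Eq. (7) for a diagonal ρ and gives S = 0 for a pure state:

import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log rho) from the eigenvalues of rho, k_B = 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]            # drop zeros: 0 log 0 = 0
    return -np.sum(evals * np.log(evals))

rho_diag = np.diag([0.5, 0.3, 0.2])         # diagonal, equilibrium-like mixture
print(von_neumann_entropy(rho_diag))        # = -sum(rho_n log rho_n) = 1.0297...

psi = np.array([1.0, 1.0]) / np.sqrt(2)     # a pure state |psi>
print(von_neumann_entropy(np.outer(psi, psi)))   # 0: pure states carry no entropy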

Page 9: Density Matrix

In the more general, i.e. non-equilibrium, case the density matrix is non-diagonal. Let us introduce the density operator:

\rho = \sum_{m,n} |n\rangle\, \rho_{mn}\, \langle m|

From QM we know that the expectation value of any operator A can be obtained using the density operator ρ:

\langle A \rangle = \sum_{m,n} \rho_{mn} A_{nm} = \sum_m (\rho A)_{mm} = \mathrm{Tr}\, \rho A \quad (8)

Let us consider two extreme cases of density matrices:

Pure state

|\Psi\rangle = \sum_n c_n e^{-i\omega_n t} |n\rangle, \qquad \omega_n = E_n/\hbar

\rho = \sum_{n,m} c_n^* c_m\, e^{i(\omega_n - \omega_m)t}\, |n\rangle \langle m|

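A quick numerical check of Eq. (8) (my sketch; the state and observable are made-up 2×2 examples): a pure-state density matrix satisfies ρ² = ρ and Tr ρ = 1, and Tr(ρA) reproduces ⟨Ψ|A|Ψ⟩:

import numpy as np

psi = np.array([0.6, 0.8j])               # normalized: |0.6|^2 + |0.8|^2 = 1
A = np.array([[1.0, 0.5], [0.5, -1.0]])   # an arbitrary Hermitian observable

rho = np.outer(psi, psi.conj())           # rho = |psi><psi|

print(np.allclose(rho @ rho, rho))        # True: rho^2 = rho for a pure state
print(np.trace(rho).real)                 # 1.0
print(np.trace(rho @ A).real)             # Eq. (8): Tr(rho A) ...
print((psi.conj() @ A @ psi).real)        # ... equals <psi|A|psi> = -0.28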

Page 10: Example of Pure States: Neutrino Oscillations

In general ρ depends on time and satisfies the QM Liouville equation:

\frac{\partial \rho}{\partial t} = \frac{i}{\hbar} [\rho, H]

For pure states the non-diagonal elements of the density matrix (coherences) are of critical importance: they are responsible for coherent quantum oscillations. Example: consider neutrino oscillations between µ and τ neutrinos. By all means neutrinos should be treated as non-interacting particles in pure states (the probability of a single interaction during a neutrino’s travel through the Earth is ~10⁻¹⁵!). These states, however, are not eigenstates of the Hamiltonian (mass matrix), and neutrinos constantly oscillate from one flavor to another. Let us describe this phenomenon using the density matrix formalism. For simplicity we will take into account only the µ and τ flavors. The corresponding states read:

|\mu\rangle = \cos\theta\, |\nu_1\rangle + \sin\theta\, |\nu_2\rangle
|\tau\rangle = -\sin\theta\, |\nu_1\rangle + \cos\theta\, |\nu_2\rangle


Page 11: Neutrino Oscillations cont’d

where |ν_i⟩ are the mass eigenstates and θ is the mixing angle. If the neutrinos are initially in the |µ⟩ state, the density matrix reads:

\rho(t) = \begin{pmatrix} \cos^2\theta & \sin\theta\cos\theta\, e^{-i\omega t} \\ \sin\theta\cos\theta\, e^{i\omega t} & \sin^2\theta \end{pmatrix} \quad (9)

We can also find the density matrix of the pure |τ⟩ state:

\rho_\tau = |\tau\rangle\langle\tau| = \begin{pmatrix} \sin^2\theta & -\sin\theta\cos\theta \\ -\sin\theta\cos\theta & \cos^2\theta \end{pmatrix} \quad (10)

Using Eqs. (8)-(10) we can obtain the probability to find the system in the state τ:

P_\tau(t) = \langle \rho_\tau \rangle = \mathrm{Tr}[\rho_\tau\, \rho(t)] = \sin^2(2\theta)\, \sin^2(\omega t/2) \quad (11)

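As a numerical check of Eq. (11) (my sketch; θ and ω are arbitrary illustrative values), the trace formula and the closed form agree at every time:

import numpy as np

theta, omega = 0.6, 1.0                     # illustrative mixing angle and frequency
c, s = np.cos(theta), np.sin(theta)

def rho_t(t):
    """Eq. (9): density matrix of a state that starts as pure |mu>."""
    off = s * c * np.exp(-1j * omega * t)
    return np.array([[c**2, off], [np.conj(off), s**2]])

rho_tau = np.array([[s**2, -s * c], [-s * c, c**2]])   # Eq. (10)

for t in (0.0, 1.0, np.pi / omega):
    P_trace = np.trace(rho_tau @ rho_t(t)).real        # Tr[rho_tau rho(t)]
    P_formula = np.sin(2 * theta)**2 * np.sin(omega * t / 2)**2
    print(f"t={t:5.2f}  trace={P_trace:.6f}  formula={P_formula:.6f}")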

Page 12: Statistical Density Matrix (≡ Statistical Matrix, Statistical Operator)

[Figure: energy levels 1, 2, …, n within an interval ∆E, occupied with probabilities ρ_1, ρ_2, …, ρ_n]

When a system approaches thermal equilibrium, the oscillating non-diagonal matrix elements of ρ decay to zero as exp(−t/τ) (decoherence), and only the diagonal matrix elements of the density matrix in the energy representation survive:

\rho = \sum_n |n\rangle\, \rho_n\, \langle n| \quad (12)

The statistical operator in Eq. (12) describes a mixture (ensemble) of systems in different energy states, weighted with probabilities ρ_n. Thus ρ_n is the probability to find the system in the microstate |n⟩ with energy E_n. The most common statistical distributions are the microcanonical distribution and the Gibbs, or canonical, distribution:

\rho_n = \frac{\exp(-E_n/T)}{Z} \quad \text{with} \quad Z = \sum_n \exp(-E_n/T) \quad (13)

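A minimal sketch of Eq. (13) (mine, not from the slides; the four levels are made up, and k_B = 1 so T is in energy units): as T grows the populations approach uniformity and the entropy of Eq. (7) rises toward log 4:

import numpy as np

def canonical(E, T):
    """Gibbs/canonical distribution, Eq. (13); k_B = 1."""
    w = np.exp(-(E - E.min()) / T)      # shift by E.min() for numerical stability
    return w / w.sum()

E = np.array([0.0, 1.0, 2.0, 3.0])      # hypothetical energy levels
for T in (0.5, 1.0, 10.0):
    rho = canonical(E, T)
    S = -np.sum(rho * np.log(rho))      # Gibbs entropy of Eq. (7)
    print(f"T={T:5.1f}  rho={np.round(rho, 3)}  S={S:.3f}")   # S -> log 4 = 1.386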

Page 13: Properties of the Entropy


The microcanonical distribution is invaluable for the general justification of statistical mechanics. However, it is not very practical due to the conservation-of-energy constraint. The Gibbs distribution, on the other hand, is very practical and powerful.

In QM the formulas of Stat Mech look simpler because all we need to do is ”counting”. For instance, the number of states and the density of states read:

\Gamma(E) = \sum_{E_n \le E} 1, \qquad g(E)\, dE = \sum_{E \le E_n \le E + dE} 1, \quad \text{i.e.} \quad g(E) = d\Gamma(E)/dE.

The values of the statistical weight (multiplicity) ∆Γ and the entropy are insensitive to the choice of the arbitrary interval ∆E. Indeed, we know that g(E) ∝ E^s where s ~ 10²³. The interval ∆E must be much larger than the spacing δE between the energy levels but still macroscopically small. Let us take ∆E ≃ s · δE. Then:

S = \log \Delta\Gamma = \log[g(E)\, \Delta E] = \log[g(E)] + \log(\Delta E) \simeq s + \log(s) \approx s

Here we used that s ~ 10²³ ≫ log(s) ~ 23. Even if we change ∆E over a wide range (many orders of magnitude), the result will not change! This insensitivity explains the success of statistical mechanics.

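A back-of-the-envelope check (my sketch, working entirely in log space since g(E) ~ e^s itself cannot be represented as a float): varying ∆E over 110 orders of magnitude changes S by a relative amount of order 10⁻²¹:

import math

s = 1e23                    # number of degrees of freedom; log g(E) ~ s
for log10_dE in (-10, 0, 10, 100):
    # S = log g(E) + log(dE); the second term is tiny compared to the first
    delta = log10_dE * math.log(10)
    print(f"dE = 1e{log10_dE:+d}:  relative change of S = {delta / s:.1e}")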

Page 14: Properties of Entropy cont’d

The entropy is additive. For a set of subsystems:

\Delta\Gamma = \prod_a \Delta\Gamma_a \quad \text{and} \quad S = \sum_a \log(\Delta\Gamma_a) = \sum_a S_a

As we have already seen, for two subsystems in contact (i.e. able to exchange energy) the probability to find a subsystem outside of the narrow energy interval in the vicinity of the average energy ⟨E⟩ is negligible. This argument can be generalized to many subsystems in contact. We write:

dw(E_1, E_2, \ldots) = C\, \delta\!\left(E_0 - \sum_a E_a\right) e^{\sum_a S_a} \prod_a dE_a \propto e^{S(E_1, E_2, \ldots, E_N)}

The probability and the entropy reach sharp maxima when the energy of each subsystem equals its average value ⟨E_a⟩. This is the state of thermal equilibrium.

The entropy of a closed system reaches its maximum in the state of thermal equilibrium.


Page 15: The Law of Increase of Entropy

For a closed system which is not in the state of thermodynamic equilibrium, the transition to equilibrium is a chain of more and more probable states. It means that both dw and the entropy S increase steadily until S reaches its maximum possible value, corresponding to the state of complete thermal equilibrium (the second law of thermodynamics). When transitions between the states of the system occur, we can distinguish two types of processes: irreversible, with dS/dt > 0, and reversible, with dS/dt = 0.

Example: free expansion of a gas of N particles from volume V_i into V_f = 2V_i.

[Figure: a gas initially confined to V_i expands to fill V_f = 2V_i]

\Gamma_i \propto V_i^N, \quad \Gamma_f \propto (2V_i)^N, \quad \Gamma_i / \Gamma_f \propto 2^{-N}

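A closing numerical note (my sketch, not from the slides): the ratio Γ_i/Γ_f = 2^(−N) is the probability that the expanded gas is found back in the original half of the volume, and the corresponding entropy increase is ∆S = N log 2:

import math

# Free expansion of N particles from V_i into V_f = 2*V_i (k_B = 1).
for N in (10, 100, 6.022e23):
    delta_S = N * math.log(2)            # entropy increase, Delta_S = N log 2
    log10_P = -N * math.log10(2)         # log10 of the spontaneous-return probability
    print(f"N={N:.3g}:  dS={delta_S:.3g}  P(return)=10^({log10_P:.3g})")
# For a mole of gas, P = 10^(-1.8e23): the process is utterly irreversible.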