Part 1 Overview
TRANSCRIPT (7/29/2019)

Perimeter Institute Lecture Notes on Statistical Physics, Part I: Overview
Version 1.7, 9/13/09, Leo Kadanoff

Fundamentals of Statistical Physics
Leo P. Kadanoff
University of Chicago, USA
Text: Statistical Physics: Statics, Dynamics and Renormalization, Leo Kadanoff.
I also referred often to Wikipedia and found it accurate and helpful.
Course Outline

Fundamentals of Statistical Physics

part  text chapters           title                                   slides  lectures
1     1                       Once over Lightly                       16      1
2     2 & 3                   Basics                                  24      2
3     4 & 13 & 15             Quantum Mechanics and Lattices          28      3
4     5                       Diffusion and Hops                      24      3
5     6 & 8                   Momentum Hops                           27      3
6     9                       Bose and Fermi                          16      1
7     10                      Phase Transitions & Mean Fields         40      2
8     11 & 12 & 13 & 14 & 15  Phase Transitions: Beyond Mean Fields   42      2

total lectures: 17
Part I: Once over lightly
Concepts which specifically belong to statistical physics
Interesting Physical Science Advances Have a Major Statistical Component
Probabilities: One Die
Quantum Stat Mech
Classical Stat Mech
Averages from Derivatives
Thermodynamics
From Quantum to Classical: The Ising Model
Degenerate Distributions
Thermodynamic Phases
Phase Transitions
Random Walk
Brownian Dynamics
Big Words
Where do we come from?
Undergraduate institution: [audience-survey chart]
Concepts which specifically belong to statistical physics: not in quantum mechanics or in classical mechanics

Temperature
Interesting Physical Science Advances Have a Major Statistical Component
Bekenstein-Hawking: entropy of black holes
Fluctuation spectrum of the 3-kelvin background radiation
Bell's theorem: statistics of quantum measurements
Source of complexity in the universe
Probabilities of hearing from civilizations elsewhere in the universe
Why do markets crash?
Time-reversal invariance: the nature of irreversibility
Probabilities of a major Earth-asteroid collision
Probabilistic interpretation of quantum mechanics and of wave functions
Is our universe likely?
Part 2. Start with Probabilities: Dice
fair dice → all probabilities ρ(σ) are equal →
average number on a throw = <σ> = Σ_σ ρ(σ) σ = 3.5

general rule: to calculate the average of any function f(σ), given the probability ρ(σ) that the outcome will be σ, use the formula <f(σ)> = Σ_σ ρ(σ) f(σ)

Do we understand what this formula means? How would we describe a loaded die? An average from a loaded die? If I told you that σ = 2 was twice as likely as all the other values, and these others were all equally likely, what would be the probability ρ(σ)? What would we have for the average throw of the die?

number of times σ turns up = N_σ; total number of events = N
probability of choosing a side with number σ: ρ(σ) = N_σ/N    (i.1)
total probability = 1 → Σ_σ ρ(σ) = 1    (i.2)
for a fair die, ρ(σ) = 1/6 for all values of σ    (i.3)

r_σ = relative probability of event σ, e.g. for fair dice r_σ = const; z = Σ_σ r_σ; ρ_σ = r_σ/z
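The formulas above, including the loaded-die exercise (σ = 2 twice as likely as the other faces), can be checked with a short sketch. The function names here are my own, not from the notes.

```python
# Average of a die throw from the probabilities rho(sigma).
# The loaded die follows the in-text exercise: sigma = 2 is twice
# as likely as each other face, the rest equally likely.

def average(rho, f=lambda s: s):
    """<f(sigma)> = sum_sigma rho(sigma) f(sigma)."""
    return sum(p * f(s) for s, p in rho.items())

# fair die: rho(sigma) = 1/6 for every face      (eq. i.3)
fair = {s: 1 / 6 for s in range(1, 7)}
print(average(fair))  # 3.5

# loaded die: relative probabilities r_sigma, normalized by z = sum r_sigma
r = {s: (2 if s == 2 else 1) for s in range(1, 7)}
z = sum(r.values())
loaded = {s: r_s / z for s, r_s in r.items()}
print(sum(loaded.values()))  # 1.0, total probability (eq. i.2)
print(average(loaded))       # 23/7, about 3.286
```

The loaded-die average comes out below 3.5, as it should: extra weight on σ = 2 pulls the mean down.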
Part 3: Lattices
[Figure: renormalization flow diagram showing a stable fixed point and an unstable fixed point, with the flow plotted against the number of iterations.]
Part 4: Random Walks & Diffusion
[Figure: a random walk; image: http://particlezoo.files.wordpress.com/2008/09/randomwalk.png]
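The defining property of an unbiased random walk, taken up in Part 4, is that the mean-squared displacement grows linearly with the number of steps, <x²> = N. A minimal one-dimensional sketch, with step count and walker count chosen by me:

```python
# Unbiased 1-d random walk: each step is +1 or -1 with equal probability.
# Averaging x^2 over many walkers recovers <x^2> = N (number of steps).
import random

random.seed(0)  # fixed seed so the run is reproducible

def walk(n_steps):
    x = 0
    for _ in range(n_steps):
        x += random.choice((-1, 1))
    return x

n_steps, n_walkers = 100, 20000
msd = sum(walk(n_steps) ** 2 for _ in range(n_walkers)) / n_walkers
print(msd)  # close to n_steps = 100
```

The mean displacement <x> is zero; it is the spread, growing like the square root of the time, that carries the diffusion physics.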
7/29/2019 Part 1 Overview
11/17
Perimeter Institute Lecture Notes on Statistical Physics: part I: OverviewVersion 1.7 9/13/09 LeoKadanof
Part 5: Statistics of Motion

∂_t f(p,r,t) + (p/m)·∇_r f(p,r,t) − [∇_r U(r,t)]·∇_p f(p,r,t) = effects of collisions

[Figure: sketches of particle motion in (p, q) phase space]

Albert Einstein (1905) explained this dancing by many, many collisions with the molecules in the fluid:

dp/dt = … + η(t) − p/τ
p = (p_x, p_y, p_z), η = (η_x, η_y, η_z)

η(t) is a Gaussian random variable resulting from the random kicks produced by collisions. Since the kicks have random directions, <η(t)> = 0. Different collisions are assumed to be statistically independent:

<η_j(t) η_k(s)> = Γ δ(t − s) δ_{j,k}
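The Langevin equation above can be integrated numerically with a simple Euler scheme. The noise strength Γ, the relaxation time τ, and the step size are my own choices; with <η_j(t) η_k(s)> = Γ δ(t − s) δ_{j,k} the stationary momentum variance per component is <p²> = Γτ/2, which the run below approaches.

```python
# Euler discretization of dp/dt = eta(t) - p/tau for one momentum component.
# White noise over a step dt contributes sqrt(Gamma * dt) * N(0, 1),
# so we draw eta with standard deviation sqrt(Gamma / dt) and multiply by dt.
import math
import random

random.seed(1)
tau, gamma, dt = 1.0, 2.0, 0.01

p = 0.0
samples = []
for step in range(200_000):
    eta = random.gauss(0.0, math.sqrt(gamma / dt))  # discretized white noise
    p += (eta - p / tau) * dt
    if step > 20_000:                 # discard the transient, then sample
        samples.append(p * p)

p2 = sum(samples) / len(samples)
print(p2)  # approaches gamma * tau / 2 = 1.0
```

The momentum forgets its initial value on the timescale τ and then fluctuates around zero with a fixed variance, the balance of random kicks against friction.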
Part 6: Bose & Fermi
Particle statistics, i.e. the symmetry properties of the particles' wave functions, have a major role in determining the behavior of many interesting physical systems. This is especially true when the system is degenerate, i.e. there is a sufficiently high density of identical particles so that there could be a substantial overlap of the wave functions involved. Important degenerate systems include:

for fermions: the electrons in atoms
for conserved bosons:
for non-conserved bosons:
Part 7: Phase Transitions and Mean Fields
phases of matter:

[Figure: a snowflake; image: http://azahar.files.wordpress.com/2008/12/snowflake_.jpg]

Which symmetries of nature have been lost in the snowflake?
Are they really lost?
Part 8: After Mean Fields: Big Words
Universality: in appropriate limits, very different systems can have essentially identical properties.

Renormalization: take advantage of scale invariance and universality to produce a theory of phase transitions.

Scale invariance: systems look the same at different spatial scales.
A start:
The Ising system has as its basic variable a spin, σ_z, which takes on the values ±1. We shall use the abbreviation σ for this spin.

The behavior of a physical system is described by its Hamiltonian. If we put this spin in a magnetic field in the z-direction it has a Hamiltonian H = −μBσ_z.

Statistical mechanics is defined by a probability. Here the probability is

ρ(σ) = (1/z) exp[−H/(k_B T)] = (1/z) exp[μBσ_z/(k_B T)]

We describe this by using the abbreviation h for the parameters in the probability:

ρ(σ) = (1/z) exp(hσ), with h = μB/(k_B T)

normalization: total probability = 1 = ρ(1) + ρ(−1) = (1/z) exp(h) + (1/z) exp(−h)
therefore z = exp(h) + exp(−h) = 2 cosh h

average: <X> = Σ_σ ρ(σ) X(σ)
therefore <σ> = ρ(1)·(1) + ρ(−1)·(−1) = [exp(h) − exp(−h)]/(2 cosh h) = (2 sinh h)/(2 cosh h) = tanh h
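The single-spin average can be computed directly from the probability ρ(σ) = exp(hσ)/z and compared with the closed form tanh h. A minimal check, with the function name my own:

```python
# <sigma> for one Ising spin in a field h, from rho(sigma) = exp(h*sigma)/z.
import math

def average_spin(h):
    z = math.exp(h) + math.exp(-h)            # z = 2 cosh h
    return (math.exp(h) - math.exp(-h)) / z   # rho(1)*(1) + rho(-1)*(-1)

for h in (0.0, 0.5, 2.0):
    print(average_spin(h), math.tanh(h))  # the two columns agree
```

At h = 0 the spin is equally likely to point either way and <σ> = 0; for large h the average saturates toward ±1, exactly the tanh shape.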
Averages from Derivatives
z = Σ_σ exp(hσ) = 2 cosh h

d(ln z)/dh = [Σ_σ σ exp(hσ)]/z = <σ> = tanh h

d²(ln z)/dh² = [Σ_σ (σ − <σ>)² exp(hσ)]/z = <(σ − <σ>)²> = 1 − <σ>² = 1 − (tanh h)²

Note how the second derivative gives the mean-squared fluctuations.

All derivatives of the log of the partition function are thermodynamic functions of some kind. As I shall say below, we expect simple behavior from the log of Z but not from Z itself. The derivatives described above are respectively called the magnetization, M = <σ>, and the magnetic susceptibility, χ = dM/dH. The analogous first derivative with respect to β is minus the energy. The next derivative with respect to β is proportional to the specific heat, or heat capacity, another traditional thermodynamic quantity. The derivative of the log of the partition function with respect to volume is proportional to the pressure.

homework: Read Chapters 1 and 2 in the textbook.
Show that −d(ln Z)/dβ = <H> = E and that d²(ln Z)/dβ² = <(H − E)²>, where β = 1/(k_B T).
If ln Z = const + N ln Ω + N ln T, with Ω being the volume, find the average pressure and its fluctuations.
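The derivative identities above are easy to verify numerically: differentiate ln z = ln(2 cosh h) by finite differences and compare with tanh h and 1 − tanh²h. The sample point h and the step size eps are my own choices.

```python
# Numerical check of "averages from derivatives" for z = 2 cosh h:
# first derivative  -> magnetization <sigma> = tanh h
# second derivative -> fluctuation <(sigma - <sigma>)^2> = 1 - tanh(h)^2
import math

def ln_z(h):
    return math.log(2.0 * math.cosh(h))

h, eps = 0.7, 1e-4
first = (ln_z(h + eps) - ln_z(h - eps)) / (2 * eps)          # central difference
second = (ln_z(h + eps) - 2 * ln_z(h) + ln_z(h - eps)) / eps**2

print(first, math.tanh(h))          # magnetization
print(second, 1 - math.tanh(h)**2)  # mean-squared fluctuation
```

The same pattern, with h replaced by −β, gives the energy and the specific heat asked for in the homework.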
End Survey: Start More Intensive/Extensive Discussions
Do you know what intensive and extensive mean in statistical physics?