


[Figure: The adaptive spacing algorithm gives better convergence on various models, including the 2D Ising and 2D Heisenberg models; temperature points become denser near the phase transition. Upper panel: 2D Ising. Lower panel: 2D Heisenberg.]

Parallel Tempering and Adaptive Spacing in Monte Carlo Simulation

Zhao, Yiwei, Kevin (CUHK); Cheung, Siu Wun, Tony (CUHK)

Mentors: Markus Eisenbach (ORNL), Ying Wai Li (ORNL), Kwai L. Wong (UTK)

Sampling from a target distribution can be hard in some physical models because of the complexity of the phase space. The n-vector model in statistical mechanics is a lattice model in which a spin sits at every lattice site, pointing to a point on the surface of an n-dimensional sphere. For example, the 2D Ising model (n = 1) is a square lattice with each spin pointing either up or down. The Boltzmann distribution for a state with energy E at temperature T is given by

P(E, T) = exp(−E / (k_B T)) / Z(T),

where Z(T) is the partition function, an (unknown) normalization constant.

The Metropolis-Hastings algorithm is a Markov chain Monte Carlo (MC) sampling method in which, at each step, a random spin flip is proposed and accepted with a probability chosen according to the target distribution:

accepted → update the system and record the data
rejected → remain in the previous state
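As a concrete illustration, the following is a minimal single-spin-flip Metropolis sketch for the 2D Ising model (a sketch only: the function and parameter names are ours, not the poster's code; energies are in units of the coupling J and beta = 1/(k_B T)).

```python
import numpy as np

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep: propose L*L random single-spin flips.

    spins : 2D array of +1/-1 values with periodic boundaries
    beta  : inverse temperature 1 / (k_B T), with the coupling J = 1
    """
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Energy change of flipping spin (i, j): dE = 2 * s_ij * (sum of 4 neighbors)
        neighbors = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                     spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * neighbors
        # Accept with probability min(1, exp(-beta * dE)); otherwise keep the old state.
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

# Usage: equilibrate first, then sample; data before the equilibration time are discarded.
rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(1000):                      # equilibration sweeps (not recorded)
    metropolis_sweep(spins, beta=0.5, rng=rng)
```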

We do not record the first samples; data are kept only after the system has equilibrated, i.e., after the equilibration time. This method gives asymptotically correct results, but convergence can be slow, especially at low temperatures, where a proposed spin flip is rarely accepted. The physical quantities of interest are calculated using the following formulae:
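The formulae themselves are not legible in this transcript; assuming the standard Monte Carlo estimators for the quantities plotted on the poster (for an L × L lattice with N = L² spins at temperature T), they would read:

```latex
% Standard estimators (assumed) for N = L^2 spins at temperature T
\begin{align*}
  \langle m \rangle &= \frac{1}{N}\left\langle \Big|\sum_i s_i\Big| \right\rangle
      && \text{mean magnetization per spin} \\
  \chi &= \frac{N}{k_B T}\left(\langle m^2\rangle - \langle m\rangle^2\right)
      && \text{magnetic susceptibility} \\
  C &= \frac{1}{k_B T^2}\left(\langle E^2\rangle - \langle E\rangle^2\right)
      && \text{specific heat}
\end{align*}
```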

We first run the serial Metropolis algorithm at a set of temperatures in a given range. Simulations at lower temperatures are easily trapped in local minima, resulting in the poor behavior seen in the following graph.

[Figure: Mean magnetization per spin vs. k_B T (J), serial Metropolis results.]

Allowing configurations to be exchanged with each other after a certain number of MC steps improves the convergence rate significantly, as shown below. Replica exchange among random walkers at different temperatures lowers correlation and thus improves convergence. However, variance-based quantities (e.g., the susceptibility) do not converge well. We suspect that some simulations are not fully equilibrated before MC sampling starts. When we also allow replica exchange during equilibration, the improvement is immediate.
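For reference, here is a minimal sketch of the neighbor-swap step used in parallel tempering, with the standard exchange acceptance criterion (the function and variable names are illustrative, not the poster's):

```python
import numpy as np

def attempt_exchanges(energies, betas, configs, rng):
    """Attempt to swap each pair of neighboring replicas.

    energies : list of current energies E_k, one per replica
    betas    : inverse temperatures 1/(k_B T_k), sorted, one per replica
    configs  : list of spin configurations, one per replica
    Returns the number of accepted swaps, so the exchange acceptance
    ratio per temperature pair can be monitored.
    """
    accepted = 0
    for k in range(len(betas) - 1):
        # Standard criterion: accept with probability
        # min(1, exp((beta_k - beta_{k+1}) * (E_k - E_{k+1}))).
        delta = (betas[k] - betas[k + 1]) * (energies[k] - energies[k + 1])
        if rng.random() < np.exp(min(0.0, delta)):
            configs[k], configs[k + 1] = configs[k + 1], configs[k]
            energies[k], energies[k + 1] = energies[k + 1], energies[k]
            accepted += 1
    return accepted
```

Monitoring the per-pair acceptance counts returned here is also what feeds the adaptive spacing scheme described next.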

[Figure: Mean magnetization per spin vs. k_B T (J) with 250, 10^4, and 10^7 exchanges.]
[Figure: Magnetic susceptibility vs. k_B T (J); legend: S/P 10^4, P/P 10^4.]

Past research has shown that parallel tempering is most efficient when every pair of neighboring replicas exchanges with the same probability. However, we found that it is extremely difficult for replicas near the transition temperature to exchange with their neighbors. This motivates us to perform adaptive spacing after every certain number of equilibration exchanges. Temperatures are adjusted according to a density function estimated from the measured exchange behavior: the i-th temperature point is placed by finding the T at which the cumulative density reaches its i-th prescribed fraction of the total. After a certain amount of adaptive spacing, temperature points are denser in the regions where exchanges were hard before spacing.
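The exact density function is not reproduced above, so the sketch below assumes one common choice: a density proportional to the exchange difficulty (one minus the measured acceptance ratio for each temperature pair), with the new points placed at equal increments of its cumulative integral. All names are illustrative.

```python
import numpy as np

def respace_temperatures(temps, acc_ratios):
    """Place new temperature points according to an empirical density.

    temps      : current temperature points, sorted ascending (length n)
    acc_ratios : measured exchange acceptance ratio between temps[k] and temps[k+1]
                 (length n - 1)

    The density is taken to be large where exchange is difficult (low acceptance),
    so the new grid comes out denser near the transition region.  This is an
    assumed form of the adaptive-spacing rule, not necessarily the poster's.
    """
    temps = np.asarray(temps, dtype=float)
    # Density on each interval: harder exchange -> higher density (kept strictly > 0).
    eta = 1.0 - np.clip(np.asarray(acc_ratios, dtype=float), 0.0, 1.0) + 1e-6
    widths = np.diff(temps)
    # Cumulative "mass" of the density up to each original temperature point.
    cum = np.concatenate([[0.0], np.cumsum(eta * widths)])
    # The i-th new point T_i satisfies: integral of eta up to T_i = i/(n-1) * total mass.
    targets = np.linspace(0.0, cum[-1], len(temps))
    return np.interp(targets, cum, temps)

# Example: acceptance dips near k_B T ~ 2.3 J, so points concentrate there.
temps = np.linspace(1.0, 4.0, 7)
acc = [0.8, 0.5, 0.1, 0.1, 0.5, 0.8]
print(respace_temperatures(temps, acc))
```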

[Figure: Replica exchange difficulty — exchange acceptance ratio vs. k_B T (J).]
[Figure: Temperature spacing density vs. k_B T (J).]

Overview
Experiment 1: Serial Metropolis Algorithm on 2D Ising Model
Experiment 2: Parallel Tempering on 2D Ising Model
Adaptive Temperature Spacing Scheme
Experiment 4: Adaptive Equilibration on various models

[Figure: Exchange acceptance ratio vs. k_B T (J) without adaptive spacing.]
[Figure: Magnetic susceptibility vs. k_B T (J) near the transition, regular vs. adaptive spacing.]
[Figure: Exchange acceptance ratio vs. k_B T (J).]