Motivation – Stata · Penalised quasi-likelihood (PQL) · Taylor series expansion of the likelihood function


Page 1

Motivation

Research question:
• Identify ICUs with unusual performance

Data:
• Hierarchical

Model:
• Generalized Linear Mixed Model (GLMM)
• Binary response of mortality

ML is used to obtain the parameter estimates in GLMMs
• The (profiled) log-likelihood function is approximated to a specified degree

Can we rely on the estimates produced?

Page 2

Motivating data: ANZICS Adult Patient Database

mortality   hosp/icu   patid   APACHEIII   covariates
0           1          1       49          x_{1,1}
1           1          2       88          x_{1,2}
...         ...        ...     ...         ...
0           1          n_1     59          x_{1,n_1}
1           2          1       91          x_{2,1}
0           2          2       45          x_{2,2}
...         ...        ...     ...         ...
0           2          n_2     94          x_{2,n_2}
...         ...        ...     ...         ...
1           m          1       49          x_{m,1}
1           m          2       147         x_{m,2}
...         ...        ...     ...         ...
0           m          n_m     57          x_{m,n_m}

Page 3

2-level GLMM

• ICUs i = 1, . . . , m
• Patients j = 1, . . . , ni within ICU i
• Fixed effects xij (including APACHEIII)
• Random effects (population of ICUs)
  • Random intercept ui0 ∼ N(0, τ0²)
  • Random APACHEIII slope ui1 ∼ N(0, τ1²)
  • cor(ui0, ui1) = ρ
• Mortality yij ∈ {0, 1}
• (yij | xij, ui0, ui1) ∼ Bernoulli(ηij)

logit(ηij) = xijᵀ β + zijᵀ ui

Page 4

Model of Zhang et al. (2011)

The GLMM:

logit (ηij) = β0 + ui0 + xij (β1 + ui1) ,

with i = 1, 2, . . . , 500 and j = 1, 2, 3.

Data were generated using:
• β0 = β1 = 1
• ui0 ∼ N(0, τ0² = 4)
• ui1 ∼ N(0, τ1² = 4)
• cor(ui0, ui1) = ρ = 0.25
• xij ∼ N(0, 1)
• (yij | xij, ui) ∼ Bernoulli(ηij)

The model parameters are:

λ = {θ, u} = {{β0, β1, τ0, τ1, ρ}, {u1, u2, . . . , um}}

Page 5

Profiled density

As the random effects ui are nuisance parameters,¹ the profiled density can be used. The profiled density:

ℓp(θ; y) = ln ∫_U L(θ, u; y) du

         = −m ln(2π τ0 τ1) + Σ_{i=1}^{m} yiᵀ(β0 + xi β1)

           + ln ∫ ⋯ ∫ ( exp{ Σ_{i=1}^{m} [ yiᵀ(ui0 + xi ui1) − ui0²/(2τ0²) − ui1²/(2τ1²) ] } / Π_{i=1}^{m} [1 + exp{β0 + ui0 + xi(β1 + ui1)}]^{ni} ) du1 … dum

¹ ML relies on the assumption that the number of model parameters is invariant to the number of observations. The ui are nuisance parameters because more hospitals/groups mean more parameters. By marginalising these parameters, the ML assumption of a fixed number of parameters (to be optimised) holds.

Page 6

Estimation of coefficients

The optimisation problem (θ̂ = arg max_θ ℓp) using the profiled log-likelihood has two parts:

ℓp(θ; y) = a(θ) + ln ∫ ⋯ ∫ g(θ, u) du ≈ a(θ) + ln b(θ)

• Estimate b(θ), i.e. build an approximating function for the integral term using Laplace or aGHQ
• a(θ) + ln b(θ) can then be optimised (arg max_θ)

Many optimisation algorithms can be employed
• Iterate until a minimum change threshold is met

Page 7

Gauss-Hermite Quadrature (GHQ)

A univariate integral:

∫_{−∞}^{∞} g(x) dx = ∫_{−∞}^{∞} e^{−x²} f(x) dx ≈ Σ_{q=1}^{Q} wq f(xq),   for integrands of the form g(x) = e^{−x²} f(x)

• Q is what is referred to as the number of quadrature points
• the xq and wq are the nodes and weights
• the xq are the roots of the Qth-order Hermite polynomial H_Q(x)
• the wq are determined by the (Q − 1)th-order Hermite polynomial at the xq: wq = 2^{Q−1} Q! √π / (Q² [H_{Q−1}(xq)]²)
• Assumes, amongst other things, that the distribution is centred around zero
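A quick check of the rule (an illustrative example, not from the original slides): with Q = 2, H₂(x) = 4x² − 2, so the nodes are x_{1,2} = ±1/√2 and the weight formula gives w_{1,2} = √π/2. Taking f(x) = x²,

∫_{−∞}^{∞} e^{−x²} x² dx = √π/2 ≈ 0.886,   while   Σ_{q=1}^{2} wq xq² = (√π/2)(1/2) + (√π/2)(1/2) = √π/2,

so the two-point rule is exact here, as expected: Q-point GHQ integrates e^{−x²} times any polynomial of degree up to 2Q − 1 exactly.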

Page 8

Adaptive GHQ (aGHQ)

aGHQ simply means that a standard-normal-variate type transformation takes place, which alters the formula (Liu and Pierce, 1994):

∫_{−∞}^{∞} g(x) dx ≈ √2 σ̂ Σ_{q=1}^{Q} e^{xq²} wq g(μ̂ + √2 σ̂ xq)

where
• μ̂ = arg max_x g(x), and
• σ̂ is obtained from the Fisher information of ln g at μ̂: σ̂ = 1 / √( −∂²/∂x² ln g(x) |_{x=μ̂} )

Page 9

aGHQ example (Q = 7)

[Figure: two panels plotting g(x) = f_{χ²₄}(x) = x e^{−x/2} / (2² Γ(2)) against x (0 to 25). The left panel shows the exact result ∫_{−∞}^{∞} g(x) dx = ∫_{−∞}^{∞} f_{χ²₄}(x) dx = 1. The right panel shows the Q = 7 aGHQ approximation √2 σ̂ Σ_{q=1}^{7} wq* g(μ̂ + √2 σ̂ xq) = 0.93 as rectangles with areas A4 = 0.422, A5 = 0.293, A6 = 0.147, A7 = 0.064 and modified weights (wq* = e^{xq²} wq) w4* = 0.81, w5* = 0.83, w6* = 0.90, w7* = 1.10.]

Page 10

Multidimensional aGHQ grids

1-D (Q = 7):

[Figure: 7 quadrature points along x1(q), spanning roughly −2 to 2.]

2-D (Q = 7):

[Figure: a 7 × 7 grid of quadrature points over (x1(q), x2(q)), each axis spanning roughly −2 to 2.]

Page 11

Multidimensional aGHQ grids

3-D (Q = 7):

[Figure: the 7 × 7 grid repeated for each of the 7 values of the third coordinate, i.e. 7³ = 343 points, shown projected onto the (x1(q), x2(q)) plane.]

p-D: The total number of quadrature points in the p-dimensional approximation is Q^p.

Page 12

Laplace ≡ (Q = 1)

First, note that the Fisher information can be rearranged:

σ̂ = 1 / √( −∂²/∂x² ln g(x) |_{x=μ̂} )   ⇔   ∂²/∂x² ln g(x) |_{x=μ̂} = −1/σ̂²

If we take a 2nd-order Taylor series of g*(x) = ln g(x) around μ̂ (the first-derivative term vanishes because μ̂ maximises g):

ln g(x) = g*(x) ≈ g*(μ̂) + (x − μ̂) g*′(μ̂) + (1/2)(x − μ̂)² g*″(μ̂) ≈ ln g(μ̂) − (x − μ̂)²/(2σ̂²)

∴ ∫_{−∞}^{∞} g(x) dx = ∫_{−∞}^{∞} e^{ln g(x)} dx ≈ g(μ̂) ∫_{−∞}^{∞} e^{−(x−μ̂)²/(2σ̂²)} dx = g(μ̂) √(2π) σ̂

• Equivalent to a Q = 1 aGHQ (x1 = 0 and w1 = √π)
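In xtmelogit terms (a sketch reusing the intpoints() syntax from the command slide near the end; y, x and icu are placeholder variable names), the Laplace fit is simply the one-point case:

. xtmelogit y x || icu: x, intpoints(1) cov(unstructured)   // Laplace approximation (Q = 1)
. xtmelogit y x || icu: x, intpoints(7) cov(unstructured)   // aGHQ with Q = 7, for comparison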

Page 13

Penalised quasi-likelihood (PQL)

• Taylor series expansion of the likelihood function
• Biased, especially with Bernoulli trials and few samples per cluster²
• Avoid using this method³

² W. W. Stroup. Generalized Linear Mixed Models: Modern Concepts, Methods and Applications. Chapman & Hall, 2012.

³ C. E. McCulloch, S. R. Searle, J. M. Neuhaus. Generalized, Linear, and Mixed Models, 2nd Edition. John Wiley & Sons, 2008.

Page 14

Survey of available software

Software/package    Routine/function
Stata               xtmelogit
SAS                 NLMIXED
SAS                 GLIMMIX
ADMB                ADMB-RE
R/lme4              glmer
R/glmmADMB          glmmADMB
S-Plus              nlme
Matlab              fitglme
SPSS                GENLINMIXED

Page 15

Notable absentees

Software    Routine/package       Comment
Julia       GLM or MixedModels    Neither seems to fit GLMMs as of yet
Python      StatsModels           Has linear mixed-effects models and GEE GLMs, but no GLMMs as of yet

Page 16

Optimisers

There are 3 general choices:
• Hessian / second-order partial derivatives (e.g. Newton-Raphson, trust region)
• Gradient / first-order partial derivatives (e.g. quasi-Newton)
• Non-gradient based (e.g. Nelder-Mead simplex)

Second-order methods
• Require more memory
• Non-positive-definite errors

Non-derivative methods
• More iterations required, less computation per iteration
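In Stata, for example, the optimiser can usually be switched through the standard maximize options. The lines below are a sketch; they assume that xtmelogit accepts technique(), as most Stata maximum-likelihood estimators do, and use placeholder variable names:

. xtmelogit y x || icu: x, intpoints(7) technique(nr)     // Newton-Raphson (second order)
. xtmelogit y x || icu: x, intpoints(7) technique(bfgs)   // BFGS quasi-Newton (gradient based)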

Page 17

Survey of available software

Function       Integral estimation   Default optimiser    Optimiser ∂ order
xtmelogit      aGHQ                  Newton-Raphson       2
NLMIXED        aGHQ                  Dual Quasi-Newton    1
GLIMMIX        aGHQ                  Dual Quasi-Newton    1
ADMB-RE        aGHQ                  Quasi-Newton         2
glmer          Laplace†              BOBYQA/Nelder-Mead   0
glmmADMB       Laplace               [ADMB's optimiser]
nlme           Laplace               Newton-type          2
fitglme        Laplace               Quasi-Newton         1
GENLINMIXED    PQL?                  ???

† aGHQ available for random-intercept-only models

Page 18

Finally, results

Firstly, a look at the fixed-slope estimation, β̂1.

Results are shown as 'spine plots':
• The Zhang et al. (2011) datasets were randomly generated 1000 times
• Each dataset's 95% CI is a horizontal line
• The spine is the true value, which should be covered by 95% of the intervals
• Horizontal lines that do not cover the true value are blackened

A table of summary statistics gives the α error (the proportion of intervals not covering the true value) and the average estimate.

[Figure: example spine plot; x-axis "Estimate", y-axis "Randomly generated dataset number".]

Page 19

Fixed effects: Laplace

β1 = 1     glmer   glmmADMB   ADMB    GLIMMIX   xtmelogit   fitglme   GENLINMIXED
α(β̂1)     0.182   0.102      0.102   0.102     0.102       0.254     1.000
mean β̂1   0.982   0.985      0.985   0.985     0.985       0.985     0.454

[Figure: spine plots of β̂1 across the 1000 datasets for the Laplace fits; panel annotations repeat the α(β̂1) and mean β̂1 values from the table.]

Page 20

Fixed effects: Laplace (table repeated from Page 19)

[Figure: further spine-plot panels for the Laplace fits, including fitglme (α(β̂1) = 0.254, mean β̂1 = 0.985).]

Page 21

Fixed effects: PQL?

β1 = 1     glmer   glmmADMB   ADMB    GLIMMIX   xtmelogit   fitglme   GENLINMIXED
α(β̂1)     0.182   0.102      0.102   0.102     0.102       0.254     1.000
mean β̂1   0.982   0.985      0.985   0.985     0.985       0.985     0.454

[Figure: spine plots for GENLINMIXED (α(β̂1) = 1.000, mean β̂1 = 0.454) and for GLIMMIX with METHOD=MMPL, i.e. PQL (α(β̂1) = 1.000, mean β̂1 = 0.397).]

Page 22

Fixed effects: aGHQ (Q = 7)

β1 = 1     ADMB    GLIMMIX   NLMIXED   xtmelogit
α(β̂1)     0.056   0.055     0.060     0.054
mean β̂1   0.998   0.998     0.968     0.998

[Figure: spine plots of β̂1 for the four aGHQ (Q = 7) fits; panel annotations repeat the values in the table.]

Page 23

Stata results: increasing Q

• Effect of increasing Q on the estimation of
  • β1 (= 1)
  • τ0 (= 2)
  • τ1 (= 2)
  • ρ (= 0.25)

Note: xtmelogit calculates the variance components on the scales ln(τ0), ln(τ1), tanh⁻¹(ρ) because the sampling distributions
• cannot be assumed symmetric, and
• the parameters are constrained
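A do-file sketch (not from the slides; it assumes the simulated variables y, x and icu from the earlier simulation sketch) of how the sensitivity of the estimates to Q can be examined for a single dataset:

forvalues q = 1/7 {
    quietly xtmelogit y x || icu: x, intpoints(`q') cov(unstructured)
    estimates store Q`q'                         // keep each fit for later comparison
}
estimates table Q1 Q2 Q3 Q4 Q5 Q6 Q7, b(%9.3f) se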

Page 24

β1 = 1     Q = 1   Q = 2   Q = 3   Q = 4   Q = 5   Q = 6   Q = 7
α(β̂1)     0.102   0.496   0.125   0.057   0.060   0.052   0.055
mean β̂1   0.985   0.771   0.895   0.990   0.968   1.024   0.998

[Figure: spine plots of β̂1 from xtmelogit for selected values of Q; panel annotations repeat the values in the table.]

Page 25

τ0 = 2     Q = 1   Q = 2   Q = 3   Q = 4   Q = 5   Q = 6   Q = 7
α(τ̂0)     0.298   0.993   0.209   0.038   0.047   0.060   0.040
mean τ̂0   1.590   1.201   1.671   1.930   1.885   2.069   1.982

[Figure: spine plots of τ̂0 from xtmelogit for selected values of Q; panel annotations repeat the values in the table.]

Page 26

τ1 = 2     Q = 1   Q = 2   Q = 3   Q = 4   Q = 5   Q = 6   Q = 7
α(τ̂1)     0.373   0.954   0.204   0.035   0.037   0.040   0.033
mean τ̂1   1.671   1.456   1.757   1.949   1.927   2.033   1.990

[Figure: spine plots of τ̂1 from xtmelogit for selected values of Q; panel annotations repeat the values in the table.]

Page 27

ρ = 0.25   Q = 1   Q = 2   Q = 3   Q = 4   Q = 5   Q = 6   Q = 7
α(ρ̂)      0.139   0.028   0.028   0.049   0.036   0.046   0.043
mean ρ̂    0.377   0.248   0.243   0.248   0.259   0.251   0.258

[Figure: spine plots of ρ̂ from xtmelogit for selected values of Q; panel annotations repeat the values in the table.]

Page 28

Computational time (minutes)¹

γ(ηij) = β0 + β1 x1ij  (fixed)  +  Σ_{k=2}^{21} βk xkij  (fixed/noise)  +  ui0 + ui1 x1ij  (random),

with β2 = … = β21 = 0,  i = 1, 2, …, 200,  j = 1, 2, …, ni

Method         Software     ni = 10   ni = 100   ni = 1000
Laplace        xtmelogit    2         5          32
               NLMIXED††    2         21         187
               GLIMMIX†     0         0          3
               ADMB-RE      2         14         N/A
               glmer        2         3          16
               fitglme      0         1          17
aGHQ (Q = 7)   xtmelogit    6         12         90
               NLMIXED††    18        203        4299
               GLIMMIX†     0         1          17
               ADMB-RE      2         17         N/A

¹ Mac Pro (2010): 2 × 2.93 GHz 6-Core Intel Xeon, 32 GB DDR3, SSD
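Wall-clock time for a single fit can be captured with Stata's timer commands (a generic sketch, not the actual benchmarking script behind the table above):

. timer clear 1
. timer on 1
. quietly xtmelogit y x1 || id: x1, intpoints(7) cov(unstructured)
. timer off 1
. timer list 1                                   // elapsed time (seconds) for timer 1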

Page 29

Forward step-wise model selection using AIC

γ(ηij) = β0 + Σ_{k=1}^{5} βk xkij  (fixed/signal)  +  Σ_{k=6}^{25} βk xkij  (fixed/noise)  +  ui0 + ui1 x1ij  (random),

with β6 = … = β25 = 0,  i = 1, 2, …, 20,  j = 1, 2, …, ni

ni = 10 (run 100 times)                    Laplace   Q = 7
Correctly identified covariates (/5)       4.76      4.76
Incorrectly identified covariates (/20)    3.54      3.55
Random slope identified (/1)               0.69      0.67

Page 30

Forward step-wise model selection using AIC

γ(ηij) = β0 + Σ_{k=1}^{5} βk xkij  (fixed/signal)  +  Σ_{k=6}^{25} βk xkij  (fixed/noise)  +  Σ_{k=1}^{24} Σ_{k′=k+1}^{25} βkk′ xkij xk′ij  (fixed/interactions)  +  ui0 + ui1 x1ij  (random),

where β12, β13, β45 ≠ 0 and all other βkk′ = 0

ni = 30 (run 10 times)                       Laplace   Q = 7
Correctly identified covariates (/5)         4.9       4.9
Incorrectly identified covariates (/20)      3.0       3.0
Correctly identified interactions (/3)       2.8       2.8
Incorrectly identified interactions (/297)   2.5       2.5
Random slope identified (/1)                 0.9       0.9

Page 31

Summary

For GLMMs with random effects actually generated from the assumed distribution (Gaussian) AND binary outcome data:

• Don't use Q = 2
• Q = 1 is insufficient to estimate variance components
• Q ≥ 7 gives reasonably accurate results
• SAS was the fastest package for aGHQ
• Model building using AIC is the same irrespective of Q = 1 or Q = 7 (AIC overfits)

Page 32

Acknowledgements

Thank you to The University of Adelaide and my co-author Professor Patty Solomon.

This research was supported under the Australian Research Council's Discovery Projects funding scheme. The views expressed are those of the authors and are not necessarily those of the Australian Research Council.

Much of the computation was made feasible using the command-line parallel computing utility GNU Parallel. Please see http://www.gnu.org/s/parallel or the ;login: The USENIX Magazine article (O. Tange; 2011) for more details.

Page 33

[Flowchart: the forward step-wise selection procedure built around xtmelogit. Most of the box text is not recoverable, but it appears to start from the null model m{0}, add the best candidate covariate x(f+1) to the current model m{f} at each step, and finally test each selected covariate xr as a random slope. The command calls are:]

. xtmelogit y || id: , intpoints(Q)

. xtmelogit y x1* x2* ... xf* x(f+1) || id: , intpoints(Q)

with xr ∈ {x1*, x2*, ..., xf*}:

. xtmelogit y x1* x2* ... xf* || id: xr , intpoints(Q) cov(unstructured)

. xtmelogit y x1* x2* ... xf* || id: , intpoints(Q)

. xtmelogit y x1* x2* ... xf* || id: xr* , intpoints(Q) cov(unstructured)
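One forward step of this procedure might be scripted as below. This is a hedged sketch rather than the author's code: it assumes candidate covariates named x1–x25 (already-selected covariates should be removed from the candidate varlist), a local macro `current' holding the covariates already selected, a local `Q' holding the number of quadrature points, and that estat ic returns AIC in the fifth column of r(S), as in recent Stata releases.

local best_aic = .                               // missing compares as larger than any number
foreach v of varlist x1-x25 {
    quietly xtmelogit y `current' `v' || id: , intpoints(`Q')
    quietly estat ic
    matrix S = r(S)
    if S[1,5] < `best_aic' {
        local best_aic = S[1,5]
        local best_var `v'                       // best candidate so far by AIC
    }
}
display "add `best_var' (AIC = `best_aic')"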

Page 34

β̂1 using ADMB (REML) with 95% CIs (H0: β1 = 1 true)

[Figure: spine plots over the 1000 iterations (common random seed across models) for aGHQ = 1 to 15. The non-coverage rates are:]

aGHQ      1       2       3       4       5       6       7       8       9       10      11      12      13      14      15
α(β̂1)    0.102   0.494   0.124   0.057   0.060   0.052   0.056   0.055   0.056   0.056   0.056   0.056   0.056   0.056   0.056