General Linear Model & Classical Inference


Page 1: General Linear Model & Classical Inference

General Linear Model & Classical Inference

London, SPM-M/EEG course, May 2014

C. Phillips, Cyclotron Research Centre, ULg, Belgium
http://www.cyclotron.ulg.ac.be

Page 2: General Linear Model & Classical Inference

Overview

• Introduction
  – ERP example

• General Linear Model
  – Definition & design matrix
  – Parameter estimation & interpretation
  – Contrast & inference
  – Correlated regressors

• Conclusion

Page 3: General Linear Model & Classical Inference

Overview

• Introduction
  – ERP example

• General Linear Model
  – Definition & design matrix
  – Parameter estimation & interpretation
  – Contrast & inference
  – Correlated regressors

• Conclusion

Page 4: General Linear Model & Classical Inference

Overview of SPM

Raw EEG/MEG data
↓
Pre-processing:
• Converting
• Filtering
• Resampling
• Re-referencing
• Epoching
• Artefact rejection
• Time-frequency transformation
• …
↓
Image conversion
↓
General Linear Model: design matrix → parameter estimates
↓
Contrast: c′ = [-1 1]
↓
Statistical Parametric Map (SPM)
↓
Inference & correction for multiple comparisons

Page 5: General Linear Model & Classical Inference

ERP example

• Random presentation of ‘faces’ and ‘scrambled faces’
• 70 trials of each type
• 128 EEG channels

Question: is there a difference between the ERP of ‘faces’ and ‘scrambled faces’?

Page 6: General Linear Model & Classical Inference

ERP example: channel B9

The two-sample t-statistic compares the size of the effect to its error standard deviation:

$$ t = \frac{\bar{x}_f - \bar{x}_s}{\hat{\sigma}_{sf}\sqrt{\tfrac{1}{n_f} + \tfrac{1}{n_s}}} $$

Focus on the N170 component.
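To make this comparison concrete, here is a minimal sketch (Python/NumPy/SciPy, not part of the original slides): it computes the two-sample t-statistic for simulated ‘faces’ vs. ‘scrambled’ amplitudes at a single channel and time point. All values and variable names (faces, scrambled) are hypothetical.

```python
# Illustrative sketch (not SPM code): two-sample t-test at one channel/time point.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
faces = rng.normal(-4.0, 2.0, size=70)      # 70 'faces' trials (simulated amplitudes)
scrambled = rng.normal(-2.5, 2.0, size=70)  # 70 'scrambled faces' trials (simulated)

n_f, n_s = len(faces), len(scrambled)
diff = faces.mean() - scrambled.mean()       # size of the effect

# Pooled variance of the two samples
s2 = ((n_f - 1) * faces.var(ddof=1) + (n_s - 1) * scrambled.var(ddof=1)) / (n_f + n_s - 2)
t = diff / np.sqrt(s2 * (1.0 / n_f + 1.0 / n_s))  # effect / its error standard deviation

# Same result with SciPy's built-in two-sample t-test
t_check, p = stats.ttest_ind(faces, scrambled)
print(t, t_check, p)
```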

Page 7: General Linear Model & Classical Inference

Overview

• Introduction
  – ERP example

• General Linear Model
  – Definition & design matrix
  – Parameter estimation & interpretation
  – Contrast & inference
  – Correlated regressors

• Conclusion

Page 8: General Linear Model & Classical Inference

Data modelling

Data = β₁ × Faces + β₂ × Scrambled + Error

$$ Y = X_1 \cdot \beta_1 + X_2 \cdot \beta_2 + \varepsilon $$

Page 9: General Linear Model & Classical Inference

Design matrix

Data vector = Design matrix × Parameter vector + Error vector

$$ Y = X \cdot \beta + \varepsilon $$

Page 10: General Linear Model & Classical Inference

General Linear Model

$$ \underbrace{Y}_{N \times 1} = \underbrace{X}_{N \times p}\;\underbrace{\beta}_{p \times 1} + \underbrace{\varepsilon}_{N \times 1} $$

N: # trials, p: # regressors

The GLM is defined by:
• the design matrix X
• the error distribution, e.g. $\varepsilon \sim N(0, \sigma^2 I)$
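As a minimal sketch of how these pieces fit together, the following Python/NumPy snippet (not from the slides; all numbers simulated) builds the N × p indicator design matrix for the faces/scrambled design and generates a data vector from it:

```python
# Minimal sketch of the GLM Y = X beta + epsilon for a two-condition design.
import numpy as np

n_faces, n_scrambled = 70, 70
N = n_faces + n_scrambled                  # N: number of trials
# Design matrix X (N x p): one indicator column per condition (p = 2 regressors)
X = np.zeros((N, 2))
X[:n_faces, 0] = 1                         # 'faces' trials load on beta_1
X[n_faces:, 1] = 1                         # 'scrambled' trials load on beta_2

beta_true = np.array([-4.0, -2.5])         # p x 1 parameter vector (made-up values)
rng = np.random.default_rng(1)
eps = rng.normal(0.0, 2.0, size=N)         # error ~ N(0, sigma^2 I)
y = X @ beta_true + eps                    # N x 1 data vector
print(X.shape, y.shape)                    # (140, 2) (140,)
```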

Page 11: General Linear Model & Classical Inference

General Linear Model

• The design matrix embodies all available knowledge about experimentally controlled factors and potential confounds.

• Applied to all channels & time points (see the sketch after this list).

• Mass-univariate parametric analysis:
  – one-sample t-test
  – two-sample t-test
  – paired t-test
  – Analysis of Variance (ANOVA)
  – factorial designs
  – correlation
  – linear regression
  – multiple regression
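The sketch below illustrates the mass-univariate idea on a simulated trials × channels × time array (hypothetical sizes, not SPM code): the same OLS fit is applied independently at every channel and time point.

```python
# Sketch of the mass-univariate idea: one GLM per channel and time point.
import numpy as np

rng = np.random.default_rng(2)
N, n_chan, n_time = 140, 128, 200
X = np.zeros((N, 2))
X[:70, 0] = 1                              # 'faces' regressor
X[70:, 1] = 1                              # 'scrambled' regressor
Y = rng.normal(size=(N, n_chan, n_time))   # simulated epoched data: trials x channels x time

# Fit the same GLM at every channel/time point in one call
Y2d = Y.reshape(N, -1)                     # (N, n_chan * n_time)
beta, *_ = np.linalg.lstsq(X, Y2d, rcond=None)
beta = beta.reshape(2, n_chan, n_time)     # one parameter-estimate map per regressor
print(beta.shape)                          # (2, 128, 200)
```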

Page 12: General Linear Model & Classical Inference

Parameter estimation

$$ Y = X\beta + \varepsilon $$

Estimate the parameters $\hat{\beta}$ such that $\sum_{i=1}^{N} \varepsilon_i^2$ is minimal.

Residuals: $\hat{\varepsilon} = Y - X\hat{\beta}$

If i.i.d. error is assumed, $\varepsilon \sim N(0, \sigma^2 I)$, the Ordinary Least Squares parameter estimate is

$$ \hat{\beta} = (X^T X)^{-1} X^T Y $$
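A minimal sketch of this OLS estimate in Python/NumPy, with a made-up two-condition design (values are illustrative, not the course data):

```python
import numpy as np

def ols_fit(X, y):
    """OLS estimate beta_hat = (X'X)^+ X'y and residuals eps_hat = y - X beta_hat."""
    beta_hat = np.linalg.pinv(X.T @ X) @ X.T @ y   # equals (X'X)^(-1) X'y when X has full rank
    return beta_hat, y - X @ beta_hat

# Tiny example with a made-up two-condition design (70 + 70 trials)
rng = np.random.default_rng(3)
X = np.kron(np.eye(2), np.ones((70, 1)))           # 140 x 2 indicator design matrix
y = X @ np.array([-4.0, -2.5]) + rng.normal(0, 2, 140)
beta_hat, eps_hat = ols_fit(X, y)                  # beta_hat minimises sum_i eps_i^2
print(beta_hat)                                    # close to [-4.0, -2.5]
```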

Page 13: General Linear Model & Classical Inference

Hypothesis Testing

The Null Hypothesis H₀: typically what we want to disprove (i.e. no effect).

The Alternative Hypothesis H_A: the outcome of interest.

Page 14: General Linear Model & Classical Inference

Contrast & t-test

A contrast specifies a linear combination of the parameter vector: $c^T\beta$.

ERP question: faces < scrambled? i.e. is $-1 \times \hat{\beta}_1 + 1 \times \hat{\beta}_2 > 0$, i.e. $\hat{\beta}_1 < \hat{\beta}_2$?

Test: is $c^T\beta > 0$, with $c^T = [-1 \;\; +1]$?

$$ T = \frac{\text{contrast of estimated parameters}}{\text{variance estimate}} = \frac{c^T\hat{\beta}}{\sqrt{\hat{\sigma}^2\, c^T (X^T X)^{+} c}} $$

($\hat{\sigma}^2$: estimation of $\sigma^2$.) Computed at every channel and time point, this gives an SPM-t over time & space.
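A hedged sketch of this t-contrast computation (Python/NumPy, reusing a simulated two-condition design; the pseudoinverse stands in for $(X^TX)^{+}$):

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.kron(np.eye(2), np.ones((70, 1)))       # made-up 140 x 2 indicator design
y = X @ np.array([-4.0, -2.5]) + rng.normal(0, 2, 140)
beta_hat = np.linalg.pinv(X.T @ X) @ X.T @ y   # OLS parameter estimates
eps_hat = y - X @ beta_hat                     # residuals

c = np.array([-1.0, 1.0])                      # contrast c' = [-1 1]: scrambled minus faces
df = len(y) - np.linalg.matrix_rank(X)         # residual degrees of freedom
sigma2_hat = eps_hat @ eps_hat / df            # error variance estimate
T = (c @ beta_hat) / np.sqrt(sigma2_hat * (c @ np.linalg.pinv(X.T @ X) @ c))
print(T, df)
```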

Page 15: General Linear Model & Classical Inference

Hypothesis Testing

The Null Hypothesis H₀: typically what we want to disprove (i.e. no effect).

The Alternative Hypothesis H_A: the outcome of interest.

The Test Statistic T:
• summarises evidence about H₀
• is (typically) small in magnitude when H₀ is true and large when it is false
• we need to know the distribution of T under the null hypothesis: the null distribution of T

Page 16: General Linear Model & Classical Inference

Hypothesis Testing

[Figure: null distribution of T, with the threshold $u_\alpha$ and the observed statistic t marking the P-value.]

• Significance level α: the acceptable false positive rate. The threshold $u_\alpha$ controls the false positive rate: $p(T > u_\alpha \mid H_0) = \alpha$.

• Observation of the test statistic t, a realisation of T.

• Conclusion about the hypothesis: reject H₀ in favour of H_A if $t > u_\alpha$.

• P-value: summarises evidence against H₀ = the chance of observing a value more extreme than t under H₀: $p(T > t \mid H_0)$.
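To make the threshold and P-value concrete, a small sketch using SciPy's Student-t distribution as the null distribution (the observed statistic and degrees of freedom are made-up numbers):

```python
from scipy import stats

alpha, df, t_obs = 0.05, 138, 4.0        # made-up observed statistic and degrees of freedom
u_alpha = stats.t.ppf(1 - alpha, df)     # threshold: p(T > u_alpha | H0) = alpha
p_value = stats.t.sf(t_obs, df)          # P-value: p(T > t | H0)
reject = t_obs > u_alpha                 # reject H0 in favour of HA if t > u_alpha
print(u_alpha, p_value, reject)
```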

Page 17: General Linear Model & Classical Inference

Contrast & T-test, a few remarks

• Contrasts are simple linear combinations of the betas.

• The T-test is a signal-to-noise measure (ratio of the estimate to the standard deviation of the estimate).

• The T-statistic does not depend on the scaling of the regressors or of the contrast.

• Unilateral (one-sided) test: H₀: $c^T\beta = 0$ vs. H_A: $c^T\beta > 0$.

Page 18: General Linear Model & Classical Inference

Extra-sum-of-squares & F-test

Model comparison: full or reduced model?

Null hypothesis H₀: the true model is X₀ (the reduced model).

Full model, $X = [X_1 \; X_0]$, or reduced model, $X_0$?

Test statistic: ratio of explained to unexplained variability (error):

$$ F = \frac{(RSS_0 - RSS)/\nu_1}{RSS/\nu_2} = \frac{ESS/\nu_1}{RSS/\nu_2} \sim F_{\nu_1, \nu_2} $$

with $\nu_1 = \mathrm{rank}(X) - \mathrm{rank}(X_0)$, $\nu_2 = N - \mathrm{rank}(X)$, $RSS = \sum \hat{\varepsilon}^2_{\text{full}}$ and $RSS_0 = \sum \hat{\varepsilon}^2_{\text{reduced}}$.
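A sketch of this extra-sum-of-squares F-test with simulated data (regressor values and effect sizes are illustrative, not from the slides):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
N = 140
X0 = np.ones((N, 1))                             # reduced model: intercept only
X1 = rng.normal(size=(N, 2))                     # regressors of interest
X = np.hstack([X1, X0])                          # full model [X1 X0]
y = X1 @ np.array([0.5, 0.0]) + 1.0 + rng.normal(0, 1, N)

def rss(Xd, y):
    """Residual sum of squares after an OLS fit of y on Xd."""
    beta = np.linalg.pinv(Xd.T @ Xd) @ Xd.T @ y
    r = y - Xd @ beta
    return r @ r

rss_full, rss_red = rss(X, y), rss(X0, y)
nu1 = np.linalg.matrix_rank(X) - np.linalg.matrix_rank(X0)
nu2 = N - np.linalg.matrix_rank(X)
F = ((rss_red - rss_full) / nu1) / (rss_full / nu2)
p = stats.f.sf(F, nu1, nu2)                      # F ~ F(nu1, nu2) under H0
print(F, p)
```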

Page 19: General Linear Model & Classical Inference

F-test & multidimensional contrasts

Tests multiple linear hypotheses. Null hypothesis H₀: the true model is X₀.

Full model, $X = [X_1 \; X_0]$ (with regressors 3–4 in $X_1$), or reduced model, $X_0$?

H₀: $\beta_3 = \beta_4 = 0$, i.e. test H₀: $c^T\beta = 0$ with

$$ c^T = \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} $$

Page 20: General Linear Model & Classical Inference

Correlated and orthogonal regressors

[Figure: data y in the space spanned by regressors x₁ and x₂, and by x₁ and x₂* (x₂ orthogonalized w.r.t. x₁).]

Correlated regressors: the explained variance is shared between the regressors.

$$ y = x_1\beta_1 + x_2\beta_2 + e $$

With x₂ orthogonalized w.r.t. x₁:

$$ y = x_1\beta_1^{*} + x_2^{*}\beta_2^{*} + e $$

Only the parameter estimate for x₁ changes, not that for x₂.
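A small simulation (hypothetical data, Python/NumPy, not from the slides) showing that orthogonalising x₂ with respect to x₁ changes the estimate for x₁ but leaves the estimate for x₂ unchanged:

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200
x1 = rng.normal(size=N)
x2 = 0.7 * x1 + 0.3 * rng.normal(size=N)       # x2 correlated with x1
y = x1 + x2 + rng.normal(0, 0.5, size=N)       # true beta1 = beta2 = 1 (made-up values)

def fit(X, y):
    return np.linalg.pinv(X.T @ X) @ X.T @ y   # OLS estimates

b_corr = fit(np.column_stack([x1, x2]), y)     # estimates with correlated regressors

# Orthogonalise x2 with respect to x1 (remove its projection onto x1)
x2_star = x2 - (x1 @ x2 / (x1 @ x1)) * x1
b_orth = fit(np.column_stack([x1, x2_star]), y)

print(b_corr)   # [beta1_hat, beta2_hat]
print(b_orth)   # beta1* absorbs the shared variance; beta2* equals b_corr[1]
```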

Page 21: General Linear Model & Classical Inference

Inference & correlated regressors

• A contrast implicitly tests for an additional effect only; be careful if there is correlation between regressors.
  – Orthogonalisation = decorrelation (not generally needed): the parameters and the test on the non-modified regressor change.

• It is always simpler to have orthogonal regressors, and therefore orthogonal designs.

• In case of correlation, use F-tests to assess the overall significance: there is generally no way to decide to which regressor the "common" part of the variance should be attributed.

• The original regressors may not matter: it is the contrast you are testing that should be as decorrelated as possible from the rest of the design matrix.

Page 22: General Linear Model & Classical Inference

Overview

• Introduction
  – ERP example

• General Linear Model
  – Definition & design matrix
  – Parameter estimation & interpretation
  – Contrast & inference
  – Correlated regressors

• Conclusion

Page 23: General Linear Model & Classical Inference

Modelling?

Why? Make inferences about effects of interest.

How?
1. Decompose the data into effects and error.
2. Form a statistic using estimates of the effects (of interest) and of the error.

Model? Use any available knowledge.

[Figure: data and experimental effects enter the model; the model yields an effects estimate and an error estimate, which are combined into a statistic via a contrast, e.g. c = [1 -1].]

Page 24: General Linear Model & Classical Inference

Thank you for your attention!

Any questions?

Thanks to Klaas, Guillaume, Rik, Will, Stefan, Andrew & Karl for the borrowed slides!