Non-orthogonal regressors: concepts and consequences

TRANSCRIPT

Page 1: Non-orthogonal regressors: concepts and consequences

Non-orthogonal regressors: concepts and consequences

Page 2: Non-orthogonal regressors: concepts and consequences

Overview

• Problem of non-orthogonal regressors

• Concepts: orthogonality and uncorrelatedness

• SPM (1st level):
  – covariance matrix
  – detrending
  – how to deal with correlated regressors

• Example

Page 3: Non-orthogonal regressors: concepts and consequences

Design matrix

• Each column in your design matrix represents 1) events of interest or 2) a measure that may confound your results. Column = regressor.

• The optimal linear combination of all these columns attempts to explain as much variance in your dependent variable (the BOLD signal) as possible.

[Figure: design matrix shown as an image; rows = scan number, columns = regressors]

Page 4: Non-orthogonal regressors: concepts and consequences

[Figure: BOLD signal over time decomposed into two weighted regressors x1 and x2 plus error]

y = β1·x1 + β2·x2 + e

Source: SPM course 2010, Stephan. http://www.fil.ion.ucl.ac.uk/spm/course/slides10-zurich/

Page 5: Non-orthogonal regressors: concepts and consequences

The betas are estimated on a voxel-by-voxel basis.

A high beta means the regressor explains much of the BOLD signal's variance (i.e. it strongly covaries with the signal).

y = β1·x1 + β2·x2 + e
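To make the equation concrete, here is a minimal ordinary-least-squares sketch in Python/NumPy. The regressors and noise are simulated toy data, not real SPM output, and SPM's actual estimation is more elaborate (HRF convolution, autocorrelation modelling):

```python
import numpy as np

n_scans = 100
rng = np.random.default_rng(0)

x1 = rng.standard_normal(n_scans)                # regressor of interest
x2 = rng.standard_normal(n_scans)                # e.g. a confound regressor
X = np.column_stack([x1, x2, np.ones(n_scans)])  # design matrix + constant

# Simulated BOLD signal for one voxel: y = 2*x1 + 0.5*x2 + noise
y = 2.0 * x1 + 0.5 * x2 + rng.standard_normal(n_scans)

# Ordinary least squares: the optimal linear combination of the columns
betas, *_ = np.linalg.lstsq(X, y, rcond=None)
print(betas)  # roughly [2.0, 0.5, 0.0]
```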

Page 6: Non-orthogonal regressors: concepts and consequences

Problem of non-orthogonal regressors

[Venn diagram: circle Y = total variance in the BOLD signal]

Page 7: Non-orthogonal regressors: concepts and consequences

Orthogonal regressors

Every regressor explains a unique part of the variance in the BOLD signal.

[Venn diagram: X1 and X2 as non-overlapping circles inside Y, the total variance in the BOLD signal]

Page 8: Non-orthogonal regressors: concepts and consequences

Orthogonal regressors

There is only one optimal linear combination of both regressors that explains as much variance as possible. The assigned betas will be as large as possible, and statistics using these betas will have optimal power.

[Venn diagram: X1 and X2 each explain their own, non-overlapping share of Y]

Page 9: Non-orthogonal regressors: concepts and consequences

Non-orthogonal regressors

Regressors 1 and 2 are not orthogonal. Part of the explained variance can be accounted for by both regressors and is assigned to neither. Therefore, the betas for both regressors will be suboptimal.

[Venn diagram: X1 and X2 overlap inside Y; the overlapping variance is assigned to neither regressor]

Page 10: Non-orthogonal regressors: concepts and consequences

Entirely non-orthogonal

[Venn diagram: regressor 1 and regressor 2 coincide completely inside the total variance of the BOLD signal]

Betas cannot be estimated: the variance cannot be assigned to one regressor or the other.
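The entirely non-orthogonal case can also be seen numerically: when one regressor is a multiple of another, X'X is singular and the betas have no unique solution. A toy sketch with made-up data:

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.standard_normal(50)
x2 = 2.0 * x1                          # entirely non-orthogonal: x2 duplicates x1
X = np.column_stack([x1, x2])

print(np.linalg.matrix_rank(X.T @ X))  # 1, not 2: X'X is singular
try:
    np.linalg.inv(X.T @ X)             # the usual OLS inverse does not exist
except np.linalg.LinAlgError as err:
    print("cannot invert X'X:", err)
```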

Page 11: Non-orthogonal regressors: concepts and consequences

“It is always simpler to have orthogonal regressors and therefore designs.”

(SPM course 2010)

Page 12: Non-orthogonal regressors: concepts and consequences

Orthogonality

Regressors can be seen as vectors in n-dimensional space, where n = number of scans.

Now suppose n = 2:

r1  r2
------
 1   2
 2   1

[Plot: r1 = (1, 2) and r2 = (2, 1) drawn as vectors in the plane]

Page 13: Non-orthogonal regressors: concepts and consequences

Orthogonality

• Two vectors are orthogonal if the raw vectors have
  – inner product == 0
  – angle between vectors == 90°
  – cosine of the angle == 0

Inner product:

r1 • r2 = (1 * 2) + (2 * 1) = 4

θ = acos(4 / (|r1| * |r2|)) = acos(4/5) ≈ 37°

[Plot: r1 and r2 with the ≈ 37° angle between them]
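A quick numeric check of the slide's numbers (a minimal sketch using the vectors above):

```python
import numpy as np

r1 = np.array([1.0, 2.0])
r2 = np.array([2.0, 1.0])

inner = r1 @ r2                   # (1*2) + (2*1) = 4, not 0: not orthogonal
cos_theta = inner / (np.linalg.norm(r1) * np.linalg.norm(r2))  # 4/5
theta = np.degrees(np.arccos(cos_theta))
print(inner, theta)               # 4.0, ~36.9 degrees
```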

Page 14: Non-orthogonal regressors: concepts and consequences

Orthogonality

Orthogonalizing one vector w.r.t. another: it matters which vector you choose! (Gram-Schmidt orthogonalization)

Orthogonalize r1 w.r.t. r2:

u1 = r1 – proj_r2(r1)
u1 = [1 2] – ((r1 • r2) / (r2 • r2)) * r2
u1 = [1 2] – (4/5) * [2 1]
u1 = [-0.6 1.2]

Inner product:

u1 • r2 = (-0.6 * 2) + (1.2 * 1) = 0

[Plot: u1 is orthogonal to r2]
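The same computation in NumPy (a sketch; `orthogonalize` is a hypothetical helper written for this example, not an SPM function):

```python
import numpy as np

def orthogonalize(a, b):
    """Orthogonalize a w.r.t. b: subtract a's projection onto b."""
    return a - (a @ b) / (b @ b) * b

r1 = np.array([1.0, 2.0])
r2 = np.array([2.0, 1.0])

u1 = orthogonalize(r1, r2)
print(u1, u1 @ r2)   # [-0.6  1.2], inner product 0.0

# It matters which vector you choose: the other order gives a different result
u2 = orthogonalize(r2, r1)
print(u2)            # [ 1.2 -0.6]
```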

Page 15: Non-orthogonal regressors: concepts and consequences

Orthogonality & uncorrelatedness: an aside on these two concepts

• Orthogonal is defined as: X'Y = 0 (inner product of two raw vectors = 0)

• Uncorrelated is defined as: (X – mean(X))'(Y – mean(Y)) = 0 (inner product of two detrended vectors = 0)

• Vectors can be orthogonal while being correlated, and vice versa!

Page 16: Non-orthogonal regressors: concepts and consequences

Please read Rodgers et al. (1984), Linearly independent, orthogonal and uncorrelated variables. The American Statistician, 38:133-134. It will be in the FAM folder as well.

Raw vectors:

 X   Y
------
 1   5
-5   1
 3   1
-1   3

Orthogonal because:

Inner product: (1 * 5) + (-5 * 1) + (3 * 1) + (-1 * 3) = 0

Page 17: Non-orthogonal regressors: concepts and consequences

Detrend:

Mean(X) = -0.5
Mean(Y) = 2.5

X_det  Y_det  product
---------------------
 1.5    2.5    3.75
-4.5   -1.5    6.75
 3.5   -1.5   -5.25
-0.5    0.5   -0.25

Mean(X_det) = 0
Mean(Y_det) = 0

Inner product: 3.75 + 6.75 – 5.25 – 0.25 = 5

Orthogonal, but correlated!
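The whole worked example in a few lines of NumPy (a sketch using the slide's vectors):

```python
import numpy as np

x = np.array([1.0, -5.0, 3.0, -1.0])
y = np.array([5.0, 1.0, 1.0, 3.0])

print(x @ y)                    # 0.0: the raw vectors are orthogonal

x_det = x - x.mean()            # detrend: subtract the mean (cf. spm_detrend)
y_det = y - y.mean()
print(x_det @ y_det)            # 5.0: nonzero, so x and y are correlated
print(np.corrcoef(x, y)[0, 1])  # Pearson r ~ 0.25: orthogonal, but correlated!
```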

Page 18: Non-orthogonal regressors: concepts and consequences

Raw vectors (r1 here is the orthogonalized u1 from before):

r1    r2
---------
-0.6   2
 1.2   1

Inner product: 0 (orthogonal)

Detrended:

r1_det  r2_det
--------------
-0.9     0.5
 0.9    -0.5

Inner product: -0.9 (correlated)

[Plot: the same vectors before and after detrending; orthogonal raw vectors need not remain orthogonal once detrended]

Page 19: Non-orthogonal regressors: concepts and consequences

Orthogonality & uncorrelatedness

Q: So should my regressors be uncorrelated or orthogonal?

A: When building your SPM.mat (i.e. running your jobfile), all regressors are detrended (except the grand mean scaling regressor). This is why "orthogonal" and "uncorrelated" are both used when talking about regressors.

Update: it is unclear whether all regressors are detrended when building an SPM.mat. This seems to be the case, but recent SPM mailing list activity suggests detrending might not take place in versions newer than SPM99 (and what about the Donders batch?):

"effectively there has been a change between SPM99 and SPM2 such that regressors were mean-centered in SPM99 but they are not any more (this is regressed out by the constant term anyway)." Link

Page 20: Non-orthogonal regressors: concepts and consequences

Your regressors correlate

Despite scrupulous design, your regressors will likely still correlate to some extent.

This causes beta estimates to be lower than they could be.

You can inspect the correlations via Review → (select your SPM.mat) → Design → Design orthogonality.

Page 21: Non-orthogonal regressors: concepts and consequences

[Screenshot: SPM's design orthogonality display, a gray-scale matrix of cosines between regressors]

Page 22: Non-orthogonal regressors: concepts and consequences

For detrended data, the cosine of the angle between two regressors (black = 1, white = 0) is the same as the correlation r!

orthogonal vectors: cos(90°) = 0, r = 0, r² = 0
correlated vectors: cos(81°) = 0.16, r = 0.16, r² = 0.0256

r² indicates how much variance is common between the two vectors (2.56% in this example). Note: -1 ≤ r ≤ 1 and 0 ≤ r² ≤ 1.
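For detrended vectors the cosine of the angle and Pearson's r are literally the same number; a minimal sketch with made-up regressors:

```python
import numpy as np

rng = np.random.default_rng(2)
a = rng.standard_normal(200)
b = 0.2 * a + rng.standard_normal(200)   # mildly correlated regressors

a_det, b_det = a - a.mean(), b - b.mean()
cos_theta = (a_det @ b_det) / (np.linalg.norm(a_det) * np.linalg.norm(b_det))

print(cos_theta)                 # identical to ...
print(np.corrcoef(a, b)[0, 1])   # ... Pearson's r
print(cos_theta ** 2)            # r^2: fraction of shared variance
```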

Page 23: Non-orthogonal regressors: concepts and consequences

Correlated regressors: variance moves from single regressor to shared

Page 24: Non-orthogonal regressors: concepts and consequences

Correlated regressors: variance moves from single regressor to shared

The t-test uses the beta, which is determined by the amount of variance explained by the single regressor.

Page 25: Non-orthogonal regressors: concepts and consequences

Correlated regressors: variance moves from single regressor to shared

The t-test uses the beta, which is determined by the amount of variance explained by the single regressor.

Large shared variance: low statistical power.

Page 26: Non-orthogonal regressors: concepts and consequences

Correlated regressors: variance moves from single regressor to shared

The t-test uses the beta, which is determined by the amount of variance explained by the single regressor.

Large shared variance: low statistical power.

Not necessarily a problem if you do not intend to test these two regressors!

[Figure: two highly correlated movement regressors (movement regressor 1 and movement regressor 2)]

Page 27: Non-orthogonal regressors: concepts and consequences

How to deal with correlated regressors?

- Strong correlations between regressors are not necessarily a problem. What is relevant is the correlation between contrasts of interest and the rest of the design matrix.
- Example: lights on vs. lights off. If movement regressors correlate with these conditions (contrast of interest not orthogonal to the rest of the design matrix), there is a problem.
- If nuisance regressors only correlate with each other: no problem!
- Grand mean scaling is not centered around 0 (i.e. not detrended), so these correlations are not informative.

Page 28: Non-orthogonal regressors: concepts and consequences
Page 29: Non-orthogonal regressors: concepts and consequences

How to deal with correlations between contrast and rest of design matrix?

• Orthogonalize regressor A w.r.t. regressor B: all shared variance will now be assigned to B.

Page 30: Non-orthogonal regressors: concepts and consequences

Orthogonality

[Plot: the original vectors r1 = (1, 2) and r2 = (2, 1)]

Page 31: Non-orthogonal regressors: concepts and consequences

Orthogonality

[Plot: r1 orthogonalized w.r.t. r2. Venn diagram: the previously shared variance in the BOLD signal is now assigned to one regressor]

Page 32: Non-orthogonal regressors: concepts and consequences

How to deal with correlations between contrast and rest of design matrix?

• Orthogonalize regressor A w.r.t. regressor B: all shared variance will now be assigned to B.

• This is only permissible given an a priori reason to do so, which is hardly ever the case.

Page 33: Non-orthogonal regressors: concepts and consequences

How to deal with correlations between contrast and rest of design matrix?

• Do an F-test to test the overall significance of your model, for example to see whether adding a regressor significantly improves the model. Shared variance is then taken along in determining significance.

• When a number of regressors represent the same manipulation (e.g. switch activity convolved with different HRFs), you can serially orthogonalize the regressors before estimating the betas; see the sketch below.
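A minimal NumPy sketch of serial orthogonalization; `serial_orth` is a hypothetical helper written for this example (in SPM/MATLAB the spm_orth function serves this role):

```python
import numpy as np

def serial_orth(X):
    """Serially orthogonalize the columns of X (repeated Gram-Schmidt):
    column j loses its projection onto every earlier column, so any
    shared variance is assigned to the earlier regressors."""
    X = X.astype(float).copy()
    for j in range(1, X.shape[1]):
        for k in range(j):
            X[:, j] -= (X[:, j] @ X[:, k]) / (X[:, k] @ X[:, k]) * X[:, k]
    return X

# Example: the same events convolved with two similar HRFs (simulated)
rng = np.random.default_rng(3)
r1 = rng.standard_normal(60)
r2 = 0.8 * r1 + 0.2 * rng.standard_normal(60)   # highly correlated with r1

X_orth = serial_orth(np.column_stack([r1, r2]))
print(X_orth[:, 0] @ X_orth[:, 1])               # ~0: columns are now orthogonal
```

Note that the column order encodes a modelling decision: whichever regressor comes first keeps all the shared variance.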

Page 34: Non-orthogonal regressors: concepts and consequences

Example of how not to do it:

• 2 types of trials: gain and loss

Voon et al. (2010) Mechanisms underlying dopamine-mediated reward bias in compulsive behaviors. Neuron

Page 35: Non-orthogonal regressors: concepts and consequences

Example of how not to do it:

• 4 regressors:
  – Gain predicted outcome
  – Positive prediction error (gain trials)
  – Loss predicted outcome
  – Negative prediction error (loss trials)

Within each trial type, the predicted-outcome regressor and the prediction-error regressor are highly correlated!

Voon et al. (2010) Mechanisms underlying dopamine-mediated reward bias in compulsive behaviors. Neuron

Page 36: Non-orthogonal regressors: concepts and consequences

Example of how not to do it:

• Performed 6 separate analyses (GLMs)

• Shared variance is attributed to a single regressor in each GLM

• Amazing! Similar patterns of activation!

Voon et al. (2010) Mechanisms underlying dopamine-mediated reward bias in compulsive behaviors. Neuron

Page 37: Non-orthogonal regressors: concepts and consequences

Take-home messages

• If regressors correlate, the variance they share in explaining your BOLD signal is assigned to neither, which reduces power on t-tests.

• If you orthogonalize regressor A with respect to regressor B, the values of A will change but A's uniquely explained variance stays the same. B, the unchanged regressor, will come to explain all variance shared by A and B. However, don't do this unless you have a valid reason.

• Orthogonality and uncorrelatedness are only the same thing if your data are centered around 0 (detrended, spm_detrend).

• SPM does (NOT?) detrend your regressors the moment you go from job.mat to SPM.mat.

Page 38: Non-orthogonal regressors: concepts and consequences

Interesting reads

http://imaging.mrc-cbu.cam.ac.uk/imaging/DesignEfficiency#head-525685650466f8a27531975efb2196bdc90fc419
Combines the SPM book and Rik Henson's own attempt at explaining design efficiency and the issue of correlated regressors.

Rodgers et al. (1984) Linearly independent, orthogonal and uncorrelated variables. The American Statistician, 38:133-134.
A 15-minute read that describes three basic concepts in statistics/algebra.

Page 39: Non-orthogonal regressors: concepts and consequences

regressors

Page 40: Non-orthogonal regressors: concepts and consequences

Raw vectors:

 x   y
------
 3   6
 6  -3
 9   6

Inner product: 54 → non-orthogonal

But! The same vectors, detrended:

x_det  y_det
------------
-3      3
 0     -6
 3      3

Inner product: 0 → uncorrelated

Non-orthogonal, but uncorrelated!