Elements of Statistical Learning (2nd Ed.), © Hastie, Tibshirani & Friedman 2009. Chapter 14 figures.



FIGURE 14.1. Simplifications for association rules. Here there are two inputs X1 and X2, taking four and six distinct values, respectively. The red squares indicate areas of high density. To simplify the computations, we assume that the derived subset corresponds to either a single value of an input or all values. With this assumption we could find either the middle or right pattern, but not the left one.

[Figure 14.2 panels: Relative Frequency in Data (top) and Relative Frequency in Association Rules (bottom), plotted against Attribute. Attributes include income, sex, marstat, age, educ, occup, yrs-bay, dualinc, perhous, peryoung, housetype, home, ethnic, language.]

FIGURE 14.2. Market basket analysis: relative frequency of each dummy variable (coding an input category) in the data (top), and the association rules found by the Apriori algorithm (bottom).
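
The Apriori algorithm named in this caption mines item sets whose support (the fraction of baskets containing them) exceeds a threshold, growing candidates one item at a time and pruning any candidate with an infrequent subset. A minimal sketch of that idea in Python, using only the standard library; the toy baskets and the 0.6 support threshold are illustrative assumptions, not the market basket data of the figure:

from itertools import combinations

# Toy transactions; each basket is a set of items (assumed example data).
baskets = [
    {"bread", "milk"},
    {"bread", "diapers", "beer", "eggs"},
    {"milk", "diapers", "beer", "cola"},
    {"bread", "milk", "diapers", "beer"},
    {"bread", "milk", "diapers", "cola"},
]
min_support = 0.6  # fraction of baskets that must contain an item set

def support(itemset):
    return sum(itemset <= b for b in baskets) / len(baskets)

# Level 1: frequent single items.
items = sorted({i for b in baskets for i in b})
frequent = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
all_frequent = list(frequent)

# Grow item sets one item at a time, keeping only those with enough support.
k = 2
while frequent:
    candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
    # Apriori pruning: every (k-1)-subset of a candidate must itself be frequent.
    candidates = {c for c in candidates
                  if all(frozenset(s) in set(frequent) for s in combinations(c, k - 1))}
    frequent = [c for c in candidates if support(c) >= min_support]
    all_frequent.extend(frequent)
    k += 1

for s in all_frequent:
    print(sorted(s), round(support(s), 2))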


FIGURE 14.3. Density estimation via classification. (Left panel:) Training set of 200 data points. (Right panel:) Training set plus 200 reference data points, generated uniformly over the rectangle containing the training data. The training sample was labeled as class 1, and the reference sample class 0, and a semiparametric logistic regression model was fit to the data. Some contours for g(x) are shown.
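
The construction in this caption turns density estimation into two-class classification: label the real sample class 1 and a uniform reference sample class 0, fit a class-probability model mu(x), and convert back via g(x) proportional to mu(x)/(1 - mu(x)) times the uniform density. A rough numpy sketch of that recipe follows; the simulated data, the quadratic basis expansion and the plain gradient-ascent logistic fit are illustrative stand-ins for the book's semiparametric logistic regression:

import numpy as np

rng = np.random.default_rng(0)

# "Real" sample (class 1) and a uniform reference sample over its bounding box (class 0).
X1 = rng.normal(loc=[0.0, 0.0], scale=[1.0, 0.5], size=(200, 2))
lo, hi = X1.min(axis=0), X1.max(axis=0)
X0 = rng.uniform(lo, hi, size=(200, 2))
X = np.vstack([X1, X0])
y = np.r_[np.ones(200), np.zeros(200)]

# Simple basis expansion so the logistic fit can bend (an illustrative choice).
def features(Z):
    return np.column_stack([np.ones(len(Z)), Z, Z**2, Z[:, :1] * Z[:, 1:]])

F = features(X)
w = np.zeros(F.shape[1])
for _ in range(2000):                      # gradient ascent on the log-likelihood
    p = 1.0 / (1.0 + np.exp(-F @ w))
    w += 0.01 * F.T @ (y - p) / len(y)

# Density estimate: odds of "real vs uniform" times the uniform reference density.
def density(Z):
    p = 1.0 / (1.0 + np.exp(-features(Z) @ w))
    uniform_density = 1.0 / np.prod(hi - lo)
    return p / (1.0 - p) * uniform_density

print(density(np.array([[0.0, 0.0], [2.0, 1.0]])))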


FIGURE 14.4. Simulated data in the plane, clustered into three classes (represented by orange, blue and green) by the K-means clustering algorithm.
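
K-means, used here and in the next several figures, alternates two steps: assign each point to its nearest centroid, then recompute each centroid as the mean of its assigned points. A minimal numpy sketch (the simulated blobs and K = 3 are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.4, size=(60, 2)) for c in ([0, 0], [2, 2], [0, 2])])

def kmeans(X, K, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), K, replace=False)]      # random initial centroids
    for _ in range(n_iter):
        # assignment step: nearest centroid in Euclidean distance
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # update step: centroid = mean of the points assigned to it
        centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
                            for k in range(K)])
    return centers, labels

centers, labels = kmeans(X, K=3)
print(centers)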


FIGURE 14.5. Simulated data: on the left, K-means clustering (with K=2) has been applied to the raw data. The two colors indicate the cluster memberships. On the right, the features were first standardized before clustering. This is equivalent to using feature weights 1/[2 · var(Xj)]. The standardization has obscured the two well-separated groups. Note that each plot uses the same units in the horizontal and vertical axes.

[Figure 14.6 panels: Initial Centroids, Initial Partition, Iteration Number 2, Iteration Number 20.]

FIGURE 14.6. Successive iterations of the K-means clustering algorithm for the simulated data of Figure 14.4.


FIGURE 14.7. (Left panels:) two Gaussian densities g0(x) and g1(x) (blue and orange) on the real line, and a single data point (green dot) at x = 0.5. The colored squares are plotted at x = −1.0 and x = 1.0, the means of each density. (Right panels:) the relative densities g0(x)/(g0(x) + g1(x)) and g1(x)/(g0(x) + g1(x)), called the "responsibilities" of each cluster, for this data point. In the top panels, the Gaussian standard deviation σ = 1.0; in the bottom panels σ = 0.2. The EM algorithm uses these responsibilities to make a "soft" assignment of each data point to each of the two clusters. When σ is fairly large, the responsibilities can be near 0.5 (they are 0.36 and 0.64 in the top right panel). As σ → 0, the responsibilities → 1 for the cluster center closest to the target point, and 0 for all other clusters. This "hard" assignment is seen in the bottom right panel.
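
The responsibilities shown in the right-hand panels are just the two density values at the observed point, normalized to sum to one. The short sketch below evaluates them for Gaussians with means −1 and 1 at x = 0.5, for σ = 1.0 and σ = 0.2, which reproduces the soft-versus-hard contrast described in the caption:

import numpy as np

def gaussian(x, mean, sigma):
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = 0.5                       # the single data point (green dot)
for sigma in (1.0, 0.2):
    g0 = gaussian(x, -1.0, sigma)
    g1 = gaussian(x, 1.0, sigma)
    r0, r1 = g0 / (g0 + g1), g1 / (g0 + g1)   # responsibilities of the two clusters
    print(f"sigma={sigma}: r0={r0:.3f}, r1={r1:.3f}")
# Large sigma gives a soft split; as sigma shrinks the assignment becomes nearly hard (0/1).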


FIGURE 14.8. Total within-cluster sum of squares for K-means clustering applied to the human tumor microarray data.


FIGURE 14.9. Sir Ronald A. Fisher (1890−1962) was one of the founders of modern day statistics, to whom we owe maximum-likelihood, sufficiency, and many other fundamental concepts. The image on the left is a 1024×1024 grayscale image at 8 bits per pixel. The center image is the result of 2×2 block VQ, using 200 code vectors, with a compression rate of 1.9 bits/pixel. The right image uses only four code vectors, with a compression rate of 0.50 bits/pixel.

[Figure 14.10 panels: Reordered Dissimilarity Matrix; Second MDS Coordinate vs. First MDS Coordinate. Countries: CHI, CUB, USS, YUG, BRA, IND, ZAI, BEL, EGY, FRA, ISR, USA.]

FIGURE 14.10. Survey of country dissimilarities. (Left panel:) dissimilarities reordered and blocked according to 3-medoid clustering. Heat map is coded from most similar (dark red) to least similar (bright red). (Right panel:) two-dimensional multidimensional scaling plot, with 3-medoid clusters indicated by different colors.
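
The right panel is a multidimensional scaling plot. Classical MDS can be computed directly: double-center the squared dissimilarities and take the leading eigenvectors, scaled by the square roots of their eigenvalues. A numpy sketch follows; the small symmetric matrix D is a made-up stand-in for the country dissimilarity survey:

import numpy as np

# Toy symmetric dissimilarity matrix (stand-in for the country survey data).
D = np.array([[0.0, 2.0, 6.0, 6.5],
              [2.0, 0.0, 5.5, 6.0],
              [6.0, 5.5, 0.0, 1.5],
              [6.5, 6.0, 1.5, 0.0]])

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
B = -0.5 * J @ (D ** 2) @ J                # double-centered squared dissimilarities
eigval, eigvec = np.linalg.eigh(B)         # eigenvalues in ascending order
order = np.argsort(eigval)[::-1][:2]       # keep the two largest
coords = eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))
print(coords)                              # one 2-D point per object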


FIGURE 14.11. (Left panel): observed (green) and expected (blue) values of log WK for the simulated data of Figure 14.4. Both curves have been translated to equal zero at one cluster. (Right panel): Gap curve, equal to the difference between the observed and expected values of log WK. The Gap estimate K∗ is the smallest K producing a gap within one standard deviation of the gap at K + 1; here K∗ = 2.
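
The Gap statistic compares log WK on the data with its average over reference data drawn uniformly over the data's bounding box, and selects the smallest K whose gap is within one standard deviation of the gap at K + 1. A compact sketch using a tiny K-means is below; the two-cluster data, the B = 10 reference draws and the plain (not PCA-aligned) uniform reference box are illustrative simplifications:

import numpy as np

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-2, 0.7, size=(50, 2)), rng.normal(2, 0.7, size=(50, 2))])

def kmeans_wk(X, K, n_iter=25, seed=0):
    """Total within-cluster sum of squares W_K after a simple K-means run."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), K, replace=False)]
    for _ in range(n_iter):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == k].mean(0) if np.any(labels == k) else centers[k]
                            for k in range(K)])
    return sum(((X[labels == k] - centers[k]) ** 2).sum() for k in range(K))

B, Ks = 10, range(1, 9)
lo, hi = X.min(0), X.max(0)
log_wk = np.array([np.log(kmeans_wk(X, K)) for K in Ks])
ref = np.array([[np.log(kmeans_wk(rng.uniform(lo, hi, X.shape), K)) for K in Ks]
                for _ in range(B)])
gap = ref.mean(0) - log_wk
s = ref.std(0) * np.sqrt(1 + 1 / B)        # usual correction for simulation error

# Smallest K with Gap(K) >= Gap(K+1) - s(K+1)
for i, K in enumerate(list(Ks)[:-1]):
    if gap[i] >= gap[i + 1] - s[i + 1]:
        print("estimated K* =", K)
        break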

[Figure 14.12 dendrogram leaf labels (tumor types): CNS, RENAL, BREAST, NSCLC, UNKNOWN, OVARIAN, MELANOMA, PROSTATE, LEUKEMIA, K562B-repro, K562A-repro, COLON, MCF7A-repro, MCF7D-repro.]

FIGURE 14.12. Dendrogram from agglomerative hierarchical clustering with average linkage to the human tumor microarray data.
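
Agglomerative clustering with average linkage repeatedly merges the two groups with the smallest average pairwise dissimilarity. The sketch below uses scipy's hierarchy module to build the merge tree and cut it into clusters; the random data, the Euclidean metric and the cut into three clusters are illustrative assumptions rather than the settings used for the microarray data:

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster, dendrogram

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(m, 0.5, size=(20, 4)) for m in (0.0, 3.0, 6.0)])

Z = linkage(X, method="average", metric="euclidean")    # agglomerative merge tree
labels = fcluster(Z, t=3, criterion="maxclust")          # cut into 3 clusters
print(labels)

# dendrogram(Z)  # with matplotlib available, this draws the tree as in the figure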

[Figure 14.13 panels: Average Linkage, Complete Linkage, Single Linkage.]

FIGURE 14.13. Dendrograms from agglomerative hierarchical clustering of human tumor microarray data.


FIGURE 14.14. DNA microarray data: average linkage hierarchical clustering has been applied independently to the rows (genes) and columns (samples), determining the ordering of the rows and columns (see text). The colors range from bright green (negative, under-expressed) to bright red (positive, over-expressed).


FIGURE 14.15. Simulated data in three classes, near the surface of a half-sphere.


FIGURE 14.16. Self-organizing map applied to the half-sphere data example. The left panel is the initial configuration, the right panel the final one. The 5×5 grid of prototypes is indicated by circles, and the points that project to each prototype are plotted randomly within the corresponding circle.


FIGURE 14.17. Wiremesh representation of the fitted SOM model in R^3. The lines represent the horizontal and vertical edges of the topological lattice. The double lines indicate that the surface was folded diagonally back on itself in order to model the red points. The cluster members have been jittered to indicate their color, and the purple points are the node centers.


FIGURE 14.18. Half-sphere data: reconstruction error for the SOM as a function of iteration. Error for k-means clustering is indicated by the horizontal line.


FIGURE 14.19. Heatmap representation of the SOM model fit to a corpus of 12,088 newsgroup comp.ai.neural-nets contributions (courtesy WEBSOM homepage). The lighter areas indicate higher-density areas. Populated nodes are automatically labeled according to typical content.

[Figure 14.20 annotations: data points x_i, direction v_1, projections u_{i1} d_1 v_1.]

FIGURE 14.20. The first linear principal component of a set of data. The line minimizes the total squared distance from each point to its orthogonal projection onto the line.
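
In the SVD notation used in these figures (centered data X = U D V^T), the first principal direction is the first column of V, and each point's projection onto the fitted line is u_{i1} d_1 v_1 plus the mean. A numpy sketch on simulated correlated data (the data are an illustrative assumption):

import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(200, 2)) @ np.array([[1.0, 0.6], [0.0, 0.5]])   # correlated cloud

xbar = X.mean(axis=0)
Xc = X - xbar                               # center the data
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
v1 = Vt[0]                                  # first principal direction
proj = xbar + np.outer(U[:, 0] * d[0], v1)  # orthogonal projections onto the line

# The first direction maximizes projected variance / minimizes squared residuals:
print("residual SS:", ((Xc - np.outer(Xc @ v1, v1)) ** 2).sum())
print("variance along v1:", (d[0] ** 2) / len(X))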


FIGURE 14.21. The best rank-two linear approximation to the half-sphere data. The right panel shows the projected points with coordinates given by U2D2, the first two principal components of the data.


FIGURE 14.22. A sample of 130 handwritten 3's shows a variety of writing styles.


FIGURE 14.23. (Left panel:) the first two principal components of the handwritten threes. The circled points are the closest projected images to the vertices of a grid, defined by the marginal quantiles of the principal components. (Right panel:) The images corresponding to the circled points. These show the nature of the first two principal components.


FIGURE 14.24. The 256 singular values for the digitized threes, compared to those for a randomized version of the data (each column of X was scrambled).
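
The comparison in this figure is straightforward to reproduce in outline: compute the singular values of the centered data matrix, then scramble each column independently (which destroys the correlations between columns while preserving each column's values) and recompute. In the sketch below, random low-rank-plus-noise data stand in for the digitized threes:

import numpy as np

rng = np.random.default_rng(5)
n, p, r = 300, 50, 5
X = rng.normal(size=(n, r)) @ rng.normal(size=(r, p)) + 0.3 * rng.normal(size=(n, p))

Xc = X - X.mean(axis=0)
sv_real = np.linalg.svd(Xc, compute_uv=False)

X_scrambled = np.column_stack([rng.permutation(Xc[:, j]) for j in range(p)])
sv_rand = np.linalg.svd(X_scrambled, compute_uv=False)

print("leading real singular values:     ", np.round(sv_real[:5], 1))
print("leading scrambled singular values:", np.round(sv_rand[:5], 1))
# The real spectrum has a few large values; the scrambled one is comparatively flat.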


FIGURE 14.25. (Left panel:) Two different digitized handwritten S's, each represented by 96 corresponding points in R^2. The green S has been deliberately rotated and translated for visual effect. (Right panel:) A Procrustes transformation applies a translation and rotation to best match up the two sets of points.
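
A Procrustes transformation of one landmark set onto another, as in the right panel, amounts to: center both sets, obtain the rotation from the SVD of the cross-product matrix, and add back the target's mean. A numpy sketch follows; the source shape here is a noisy rotated copy of a random target, and no scaling is applied, matching the caption's translation-plus-rotation:

import numpy as np

rng = np.random.default_rng(6)
Y = rng.normal(size=(96, 2))                              # target shape (96 landmark points)
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
X = (Y + 0.02 * rng.normal(size=Y.shape)) @ R_true.T + np.array([3.0, -1.0])

def procrustes(X, Y):
    """Translate and rotate X to best match Y in least squares."""
    xbar, ybar = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - xbar, Y - ybar
    U, _, Vt = np.linalg.svd(Xc.T @ Yc)
    R = U @ Vt                                            # optimal rotation (may reflect)
    if np.linalg.det(R) < 0:                              # force a proper rotation
        U[:, -1] *= -1
        R = U @ Vt
    return Xc @ R + ybar

X_aligned = procrustes(X, Y)
print("mean squared mismatch:", ((X_aligned - Y) ** 2).mean())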


FIGURE 14.26. The Procrustes average of three versions of the leading S in Suresh's signatures. The left panel shows the preshape average, with each of the shapes X'_l in preshape space superimposed. The right three panels map the preshape M separately to match each of the original S's.

[Figure 14.27 annotation: f(λ) = [f1(λ), f2(λ)].]

FIGURE 14.27. The principal curve of a set of data. Each point on the curve is the average of all data points that project there.


FIGURE 14.28. Principal surface fit to half-sphere data. (Left panel:) fitted two-dimensional surface. (Right panel:) projections of data points onto the surface, resulting in coordinates λ1, λ2.

[Figure 14.29 panels: data (x2 vs. x1); eigenvalues (Eigenvalue vs. Number); the 2nd and 3rd smallest eigenvectors (vs. Index); Spectral Clustering (Third Smallest Eigenvector vs. Second Smallest Eigenvector).]

FIGURE 14.29. Toy example illustrating spectral clustering. Data in top left are 450 points falling in three concentric clusters of 150 points each. The points are uniformly distributed in angle, with radius 1, 2.8 and 5 in the three groups, and Gaussian noise with standard deviation 0.25 added to each point. Using a k = 10 nearest-neighbor similarity graph, the eigenvectors corresponding to the second and third smallest eigenvalues of the graph Laplacian are shown in the lower panels.
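
A bare-bones version of the pipeline sketched in this caption: build a symmetric k-nearest-neighbor similarity graph, form the graph Laplacian L = D − W, take the eigenvectors for its smallest nontrivial eigenvalues, and run K-means in that representation. The concentric-ring data below mimic the toy example; the 0/1 edge weights and the unnormalized Laplacian are illustrative simplifications of the construction described in the text:

import numpy as np

rng = np.random.default_rng(7)

def ring(n, radius, noise=0.25):
    ang = rng.uniform(0, 2 * np.pi, n)
    pts = np.column_stack([radius * np.cos(ang), radius * np.sin(ang)])
    return pts + noise * rng.normal(size=(n, 2))

X = np.vstack([ring(150, 1.0), ring(150, 2.8), ring(150, 5.0)])
n, k = len(X), 10

# Symmetric k-nearest-neighbor graph with 0/1 weights.
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
W = np.zeros((n, n))
nbrs = np.argsort(d2, axis=1)[:, 1:k + 1]            # skip self (distance 0)
for i in range(n):
    W[i, nbrs[i]] = 1.0
W = np.maximum(W, W.T)

L = np.diag(W.sum(axis=1)) - W                        # unnormalized graph Laplacian
eigval, eigvec = np.linalg.eigh(L)
Z = eigvec[:, 1:3]                                    # 2nd and 3rd smallest eigenvectors

def kmeans(Z, K, n_iter=30, seed=0):
    r = np.random.default_rng(seed)
    centers = Z[r.choice(len(Z), K, replace=False)]
    for _ in range(n_iter):
        labels = ((Z[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        centers = np.array([Z[labels == c].mean(0) if np.any(labels == c) else centers[c]
                            for c in range(K)])
    return labels

print(np.bincount(kmeans(Z, K=3)))                    # ideally three groups of ~150 points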


FIGURE 14.30. Kernel principal components applied to the toy example of Figure 14.29, using different kernels. (Top left:) Radial kernel (14.67) with c = 2. (Top right:) Radial kernel with c = 10. (Bottom left:) Nearest-neighbor radial kernel W from spectral clustering. (Bottom right:) Spectral clustering with Laplacian constructed from the radial kernel.
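
Kernel principal components are obtained by double-centering a kernel matrix and taking its leading eigenvectors, scaled by the square roots of the eigenvalues. The sketch below uses a Gaussian radial kernel written as exp(−‖x − x′‖²/c); whether this matches the exact parameterization of (14.67) should be checked against the text, and the ring data are an illustrative stand-in for the toy example:

import numpy as np

rng = np.random.default_rng(8)
ang = rng.uniform(0, 2 * np.pi, size=300)
radius = np.repeat([1.0, 2.8, 5.0], 100)
X = np.column_stack([radius * np.cos(ang), radius * np.sin(ang)]) + 0.25 * rng.normal(size=(300, 2))

c = 2.0
d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
K = np.exp(-d2 / c)                          # radial kernel (parameterization assumed)

n = len(X)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                               # double-centered kernel matrix
eigval, eigvec = np.linalg.eigh(Kc)
order = np.argsort(eigval)[::-1][:2]
components = eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))
print(components[:5])                        # first two kernel principal components per point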

[Figure 14.31 rows: Walking Speed, Verbal Fluency; columns: Principal Components, Sparse Principal Components.]

FIGURE 14.31. Standard and sparse principal components from a study of the corpus callosum variation. The shape variations corresponding to significant principal components (red curves) are overlaid on the mean CC shape (black curves).


FIGURE 14.32. An example of a mid-sagittal brain slice, with the corpus callosum annotated with landmarks.


FIGURE 14.33. Non-negative matrix factorization (NMF), vector quantization (VQ, equivalent to k-means clustering) and principal components analysis (PCA) applied to a database of facial images. Details are given in the text.


FIGURE 14.34. Non-uniqueness of the non-negative matrix factorization. There are 11 data points in two dimensions. Any choice of the basis vectors h1 and h2 in the open space between the coordinate axes and data gives an exact reconstruction of the data.
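
Non-negative matrix factorization X ≈ W H (all entries non-negative) is commonly fit with multiplicative updates, which keep both factors non-negative at every step; the non-uniqueness illustrated here means different initializations can yield different, equally good factorizations. A numpy sketch under a squared-error objective (the random non-negative data and rank 3 are illustrative assumptions, and the update rule is the standard Lee–Seung one rather than anything specific to this figure):

import numpy as np

rng = np.random.default_rng(9)
X = rng.uniform(size=(100, 40))              # non-negative data matrix
r = 3                                        # rank of the factorization
eps = 1e-9

W = rng.uniform(size=(100, r))
H = rng.uniform(size=(r, 40))
for _ in range(500):
    # Multiplicative updates for squared error; factors stay non-negative.
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

print("reconstruction error:", np.linalg.norm(X - W @ H))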

[Figure 14.35 columns: 2 Prototypes, 4 Prototypes, 8 Prototypes.]

FIGURE 14.35. Archetypal analysis (top panels) and K-means clustering (bottom panels) applied to 50 data points drawn from a bivariate Gaussian distribution. The colored points show the positions of the prototypes in each case.


FIGURE 14.36. Archetypal analysis applied to the database of digitized 3's. The rows in the figure show the resulting archetypes from three runs, specifying two, three and four archetypes, respectively.


FIGURE 14.37. Illustration of ICA vs. PCA on artificial time-series data. The upper left panel shows the two source signals, measured at 1000 uniformly spaced time points. The upper right panel shows the observed mixed signals. The lower two panels show the principal components and independent component solutions.


FIGURE 14.38. Mixtures of independent uniform random variables. The upper left panel shows 500 realizations from the two independent uniform sources, the upper right panel their mixed versions. The lower two panels show the PCA and ICA solutions, respectively.
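
The setup of this figure is easy to regenerate: draw two independent uniform sources, mix them with a fixed 2×2 matrix, and compare the PCA rotation of the mixed data with the original sources. The numpy sketch below covers the data generation and the PCA step; recovering the sources themselves requires an ICA fit (for example FastICA), which is not included. The mixing matrix is an illustrative assumption:

import numpy as np

rng = np.random.default_rng(10)
S = rng.uniform(-np.sqrt(3), np.sqrt(3), size=(500, 2))   # two independent uniform sources
A = np.array([[1.0, 1.0],
              [0.5, 2.0]])                                 # assumed mixing matrix
X = S @ A.T                                                # observed mixed data

# PCA solution: rotate the centered data onto its principal directions.
Xc = X - X.mean(axis=0)
U, d, Vt = np.linalg.svd(Xc, full_matrices=False)
pca = Xc @ Vt.T                                            # principal component coordinates

# PCA only decorrelates; the joint distribution of the PCA coordinates is still a
# rotated parallelogram rather than the independent square of the sources S.
print("source corr:\n", np.corrcoef(S.T).round(2))
print("pca corr:\n", np.corrcoef(pca.T).round(2))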

Page 39: FIGURE 14.1. Simplifications for association rules. and ...hastie/ElemStatLearnII_figures/figures14.pdf · Elements of Statistical Learning (2nd Ed.) c Hastie, Tibshirani & Friedman

Elements of Statistical Learning (2nd Ed.) c©Hastie, Tibshirani & Friedman 2009 Chap 14

Component 1

ooooooo

oo

o

o

ooo

o

oo

o

ooooo

oooooooo o

o

o

o oo

oooo

o

o

o

o

o

ooooo

o

oo ooo

ooo

o

oooo

ooo oo

ooooo

oooo

o o

o

oooo

oo

o

o

o

oooooo oo

oo

oo

o

o ooo

o ooooo

ooooo

oo

oo

o

ooo ooooo oo oo

o oooooo

o

oo o

o

ooo oooo

ooo

oo

o

oo

oo oo oo ooo

o

oo

ooo ooo

o

oo o

oo o o

ooo

oo

o oo

oooo

o

ooo

oooo

oo

ooo

o oooo

o

ooo

o

o

ooo

o

o oo o o

ooo

oo o

oo oooooo

ooooooooo

oo

ooo oooooo

ooo

oooo o

oo

oo oo

oo

o

oo oooo

oooo o

oo

oo o

o

o oooo

o ooo

oo o

oo

o

o

oooo

oo oo

o

o

o

ooooo o

ooooo oo ooo

oooo

o

oo

o

oo

oo

ooo

oo

ooo

o

o oooo

oo

oo

o

o

o

oo

oooooo

oo

o

oo ooo

o

ooo oo

o

o

oooo o

oo o

o

o

oo

o

oooo

oo

o ooo oo

oooo

oooo

ooo

o

oo

o

o ooo

ooo

o

oooo

oo

o

ooo o

o oooo

o

ooo

o

o

oo

ooo oo

oo

ooo

o

o

oo

o

ooo

oo oo

o

oooooo ooooooo

ooo

o

oo ooo o

o

ooo

oo o

oo

o

o

oo

oooo

o

o

ooo o

oo

ooooo

oooo

oooo

ooo o o

ooo oo

ooo

ooo

oo

oo

oo

oooooo

ooo

ooo o

oooo

oo oo

o

ooo o

ooooo

o

oo

oo

o

ooo oo o

ooo o

ooo

o oo o

oo

o

oo o

o

ooo o

o

o

oo

o

o

o

o

oo

ooo

ooooooooo

o

o

ooo

o

oo

o

oo oooo

oooo

ooooo

o

o oo

ooo

o

o

o

o

o

o

ooooo

o

oo oooooo

o

o ooo

oo oo

oo o

ooo

oo oooo

o

oooooo

o

o

o

ooooo ooo

oooo

o

oooo

oooooo

ooooo

ooo

oo

ooo oooooo

ooooooo o

oo

o

oo o

o

oo oooooooo

oo

o

oo

o ooooooo o

o

oo

oo oo oo

o

oooooo o

oo

oo

o

ooo

oooo

o

ooo

oo oo

oo

ooooo

ooo

o

ooo

o

o

oo o

o

o oo oo

oo

oooooooooo

o oooooo ooo ooo

oo ooo oo

oooo

o

oooo o

o o

ooo o

oo

o

ooooooooooo

oo

o oo

o

ooooo

o ooo

ooo

o o

o

o

oooo

oo oo

o

o

o

o ooo

oooooooooooooooo

o

oo

o

oo

oo

oooo

oooo

o

oo ooo

oo

oo

o

o

o

ooo o

ooo oo o

o

ooo oo

o

oooo oo

o

ooooo

ooo

o

o

oo

o

o ooo

oo

oooo

ooooo

oooooo

oo

o

oo

o

o ooo

ooo

o

o ooo

oo

o

ooooooo

oo

o

ooo

o

o

oo

ooooo

oo

ooo

o

o

oo

o

ooo

oooo

o

oo

o oooo ooo oooo

oo

o

oooo oo

o

oo

oo oo

oo

o

o

oo

oooo

o

o

ooo o

oo

oooo o

ooo ooo

oo

oo ooooo

o ooo ooo

ooo

ooo

oo

oooo

o oo o

ooooo

ooo

oo

ooo

o

oooo

o oooo

o

oo

oo

o

oo o oooooo

o

ooooo

oo

oo

o

oo o

o

oooo

o

o

oo

o

o

o

o

oooo o

oo ooo oooo

o

o

oo o

o

oo

o

o ooooo

o oo o

ooo oo

o

o oo

o oo

o

o

o

o

o

o

ooo

oo

o

ooo oo

oo o

o

oo oo

oooo

ooo

ooo

ooo ooo

o

oooo

oo

o

o

o

ooooo oooo

oo

o

o

o ooooo o

ooo

oooo oo

ooo

o

o ooo ooooo

oooo oo oo

oo

o

oo o

o

o oo oooo

oo

o

oo

o

oo

o ooooooo o

o

oo

oo o ooo

o

ooo

ooo o

oo

oo

o

o oo

ooo o

o

o oo

ooooo

oooo

o oo oo

o

ooo

o

o

oo o

o

ooooo

oo

oo

o oooo

ooo

o ooooooooooo

o

oooo ooo

o oo o

o

o ooo o

oo

ooo

o

oo

o

oooo oo

ooo oo

oo

ooo

o

ooooo

o o oo

oo o

oo

o

o

oooo

o ooo

o

o

o

oooo

ooo

oo ooo oo oooooo

o

oo

o

oo

oo

ooo

oo

o oo

o

ooooo

oo

oo

o

o

o

oooo

oo oooo

o

ooooo

o

oooooo

o

o ooo o

ooo

o

o

oo

o

oooo

oo

ooooo o

oooo

oooo

ooo

o

oo

o

o oo o

o oo

o

o ooooo

o

oooooo oo

o

o

ooo

o

o

oo

ooo oo

oo

ooo

o

o

oo

o

ooo

oooo

o

oo

oo oo oo oooooo

o o

o

oooooo

o

oo

oo oo

oo

o

o

oo

oooo

o

o

oooo

oo

ooo oo

oooooo

ooo oo o o

oooo ooo

oo

ooo

oo

oo

oooo

oooo oo

ooo oo

ooo

oo oo

o

oooo

oo ooo

o

oo

oo

o

oo ooooooo

o

o oo

o oo o

oo

o

ooo

o

oooo

o

o

oo

o

o

o

o

oo

ooo

oo ooooo

oo

o

o

ooo

o

oo

o

ooooo

ooo

ooo ooo

o

o

ooo

ooo

o

o

o

o

o

o

ooo

oo

o

oooo o

ooo

o

o ooo

oooo

ooo

ooo

oo oooo

o

oo oooo

o

o

o

ooooo oooo

ooo

o

o ooo

oo oooo

oooo oo

oo

oo

oo o ooo

o ooooo

o o oo oo

o

o

ooo

o

oo oo ooooo

o

oo

o

oo

oo oooo o oo

o

oooooo oo

o

ooo

oooo

oo

oo

o

ooo

oooo

o

ooo

oooo

oo

ooo

o oooo

o

ooo

o

o

ooo

o

oooo o

oo

oooo

oooo

ooo o

oooooooooo

o

o o ooooooo

ooo

ooooo

oo

oo oo

oo

o

oo oo oo

ooooo

oo

ooo

o

oooo o

oooo

ooo

oo

o

o

oo oo

ooo o

o

o

o

oooooo

oo o oooo o oo

ooo o

o

oo

o

oo

oo

ooo

oo

ooo

o

o oo oo

oo

oo

o

o

o

oo

oo o oooo o

o

o oo oo

o

oooo o

o

o

o oo oo

oo o

o

o

oo

o

oooo

ooo o

ooo o

o ooo

oo oo

oo o

o

oo

o

o ooo

oo o

o

oooo

oo

o

oo oo

oo ooo

o

o oo

o

o

oo

o oooo

oo

ooo

o

o

oo

o

oo o

ooo o

o

oo

oo oo oooooooooo

o

ooo ooo

o

ooo

ooo

oo

o

o

oo

o ooo

o

o

oo oo

oo

o oo ooo

ooooo

oooo

ooooooo

ooo

oo

ooo

oo

oo

ooo o

ooooo

oo

oooo

ooo

ooo o

o

oooo

ooooo

o

oo

o o

o

oooooo

o ooo

ooo

oooo

oo

o

ooo

o

oooo

o

o

oo

o

o

o

o

ooo o

o

oo

oo

o

oo

oo o

o

o

o

oo

o oo

oo

o

oo

o

o

oo

oo

oo

o

oo

o oo

oo

o

o

oo

o

o

o

ooo o oo

oo

oo

o

ooo

o

o

o

o

oo

o

oo

oo

o

o

oo

o ooo

oo

o

o

o

o oo

o

oo

o

oooo

o

o

o o

o

ooo

o

o

o o

oo

o

ooo

o

o oo

o

o

o

oo

o

oo

o

oo o

oo

o

o

o

oo o

ooo

o

ooo

o

oo

oo

o

o

oo

o oo

o

ooo

oo

o

o

oo

o

o

oo

oo ooo

oo

o

o

oo

oo

oo

o

o o

oo

o

oo

o

oo

ooo

ooo

o

o ooo

o

o

oo o o

oooo

o

o ooo

oo

o

o

oo

o

o

o

oooo

o

oo

o oo

o

o

o

o o

o oo

o

oo

o

ooo oo

o oo o

oo

oo

o oo

o

oo

o

oo o

o oo

oooo

oo

oo

o

o

o

o

o

oo

oo

oo

o

o

o

o o

o

o

oo

o

ooo

o

oooo

oo

oo

oo oo

oo

o

oo

ooooo

oo

ooo

o oo

oo

ooo

o oooo

oo ooo

oo

oo

ooo

oo

ooo o

oo oo

o

o

o

o

oo o

oo

oo

oo

o

o

oo

o oo

o

oo

oo

o

o

o

o

o

o

o

o

oo

o

o

o

oo

oo

o

ooo o

o

ooo

o

o

ooo

oo oo

o

oo

oo

o

o

o

oo

o

o

oo

o

oo o

o

o

o

o

o

o

o

oo

o

o

o

o

oo

o

o

o

oo

o o

oo

o

o

o

o

o oo

o

o

o

oo

oo

oo

o

o

o ooo

o oo

oo

o

o

o

o

ooo

ooo

ooo

ooo

o o ooo

ooo

o

oo

o

o

ooo

o o oo

o

o

o

o

o

oo

o

o

oo

ooo

oo

oo

ooo

oo

o

o

o

o

o

o

o

oo

oo

ooo

o

oo o

oo

ooo

o

oo

o

oo

o

oo o

o

o

oo

o

o

o oo ooo

oooo

o

oo

o

oo

o

oo

o

oo

oo o

ooo

o

oooo

o

o

o

o

o

o

o

o

oo

o o

oo oo

o

o

o

o

ooo

o

Component 2

oooo

ooooo

o o

o

oooo

oooo o

oo

oo

o

oo

ooo

oooo

oo

ooo o

ooooo ooooo ooo

o

oooooo o oo

o

o

o o

o

oo

o

oooo

o

ooo

ooooo ooo

oo

oooo

oo oooo

oo

o

o

oo

oooooooo oo o

oo

ooo o

ooo

ooooo

oooo

oo

ooo ooo

o

oo

o

o

oo o

o

o

ooooooooooo

oo

ooo

oo

oo

o

o

o

o oo oo o o

o

o

o

o

oo

o

ooo

oo

o

o

oo

o

oo

oo

ooo oo o

o

oo oo

oooo

o

ooo ooo

o ooo

o

oo

ooo o

oo

oooo

oooo ooooo

o ooo oo

o

oo o

oo ooo

o oo

o

ooo

ooo o

o

oo

oo

ooo

o

o

ooo

o

ooo

o

o oo ooo

oo

o ooo

oo

o

o

ooo oo

o

ooo

o

ooo

o

o

o

o

o

oo ooo

ooooo

ooooo

oooo o

ooo

ooo o

oooo

o

ooo

ooo o

o

o

ooo ooo

o

oo

ooo

oo oo o

o

ooo ooo o

oo

o oo

o

o

oooo

oo

ooo o

o

o

o oo

ooo

o

ooo

o

o

oo

o

ooooo

ooo

oooooo

oooooo

o ooo oo

o

oo

o

oooo o

oo

o

o

ooo oooo

o

ooo

o

o

ooo

o

o

o

o

o oo oo

oooooo ooo

oooo ooooooo

ooo

o oo

o

ooo

o oo

oooo

oo oo oo

o

oooo

o

oo

o

oooo

ooo ooooo

oo

o

o

o

ooo

o

o

oo

oo

oooo

o

oo o

ooo

oo

oo oo

ooo

ooooooo

oooo o

oo

o oo

ooo o

oooo

oo o

oo

o

oooo

o

o

oo

o

o

o

o

o

ooo

ooo

o

o

oo oooo

o

o

o

o

o

oo

o oo oo

o oooo

oo

o

o ooo

oo

o oo

oo

oo

o

oo

ooo

ooo

o

oo

o oo o

o ooo oooo oo ooo

o

oooo oooo oo

o

oo

o

ooo

oooo

o

oo o

oooooooo

oo

oooo

oo oooooo

o

o

oo

oooo ooo ooo o

oo

oo

ooo

o ooo ooo

oooo

oo

oo oo

oo

o

oo

o

o

ooo

o

o

oooo oo

oooo

ooo

oo

o

oo

o o

o

o

o

o oooooo

o

o

o

o

oo

o

o oo

oo

o

o

oo

o

oo

o o

oooooo

o

oooo

oo oo

o

oooooo

o oo o

o

oo

oo oo

oo

oooooooo ooo

ooooooo

oo

oooo ooo

ooo

oo

o oo

oo oo

o

oo

oo

oo o

o

o

ooo

o

ooo

o

o oooooo

oooooo o

o

o

ooooo

o

ooo

o

o ooo

o

o

o

o

oo o oo

ooo ooo

oo ooooooo

oo

oo

oooo

ooo

o

o oo

ooooo

o

ooo ooo

o

oo

oo

oo oooo

o

ooooo ooooooo

o

o

ooo

oo

oooo o

o

o

ooo

ooo

o

ooo

o

o

oo

o

ooo

o ooo

ooooo

oo

o oo ooo

o ooooo

o

oo

o

oo

o ooo

o

o

o

o oo oo oo

o

ooo

o

o

ooo

o

o

o

o

ooo oo

oo

ooo oo oo

oo ooooo

oo oo

ooo

ooo

o

ooo

o oo

oo

ooo ooo

oo

o

o ooo

o

o o

o

oo oo

oooooooo

o o

o

o

o

oo o

o

o

oo

oo

oo oo

o

oo o

ooo

oo

oo oo

ooo

o ooooo

oooo

o ooo

oo o

ooo o

oooo

oo o

oo

o

oooo

o

o

oo

o

o

o

o

o

oo o

ooo

o

o

ooo oooo

o

o

o

o

oo

o oo oo

o ooo o

oo

o

ooo o

oo

ooo

oo

o o

o

oo

ooo

oooo

oo

ooo o

ooooo ooooooooo

o oooo oo ooo

o

oo

o

ooo

oooo

o

ooo

oooo oooo

oo

oooooo oooo

oo

o

o

oo

oooo oooo oo oo

oo

oo o

ooo

ooooo

ooo

oo

oo

oo ooo

o

oo

o

o

oo o

o

o

ooooo o

oooo

oo

o

oo

o

oo

oo

o

o

o

ooo ooo o

o

o

o

o

oo

o

o oo

oo

o

o

oo

o

oooo

oooo oo

o

oooo

oooo

o

ooo ooo

oooo

o

oo

oooo

ooooo

oo ooo o

ooooooooo

oo

oo o

ooooo

o oo

o

ooooo oo

o

oo

oo

ooo

o

o

ooo

o

ooo

o

o oooooo

ooooo

oo

o

o

oooo o

o

ooo

o

ooo

o

o

o

o

o

oooo

ooo o o

ooo

o ooooo ooo

oo

ooo o

ooo

o

o

ooo

oo

ooo

o

ooo ooo

o

oo

oo

oooo o o

o

o oo oooo

ooo o

o

o

o

ooo

oo

ooo

o oo

o

ooo

oo

o

o

ooo

o

o

o o

o

ooo

oooo

oo

ooo oo

oo oo oo

ooo o oo

o

oo

o

ooo oo

oo

o

o

oooooo o

o

o oo

o

o

ooo

o

o

o

o

ooooo

ooooooo ooo

ooooo

oooo

ooo

o

ooo

o

ooo

ooo

oo

oo

ooo ooo

o

ooo o

o

oo

o

oo oo

o ooooo

oooo

o

o

o

oo o

o

o

oo

oo

oo oo

o

ooooo

o

o o

oooo

o oo

ooooo oo

ooo

o ooo

ooo

ooo o

oo

ooo

ooo

o

o

o ooo

o

o

oo

o

o

o

o

o

oo o

ooo

o

o

ooo o ooo

o

o

o

o

oo

o

oo ooo

o o

oo

o

oooooo

o

o

oo

o

oo

ooo

o

o

oo

o

oo

o

o ooo

o

o

o o

ooooo

ooo

oo

o

ooo

oo

o

o

ooo

o o

oooo

oo o

o

oo

o

o

oo

o

o

oo

ooo

oo

oo

o

o

oo

oo

o

o

oo

o

o

o

o

o

o

o

ooo

o

oo

o

o

oo

o

o

o

oo

o

oo ooo o

o

o

oo

oo

o

o

o

o

o

o

o

o

o

o

o

o oo

oo

o

o

o ooo o

o o

o

o

o

o

o

oo

o

o

oo

o oo

o

oooo

oo

o

o

o

oo

oo

ooo

oo

o

oo

ooo

oo

o

o

o

o

oo

o

o

oo

o

o

o

o

oo

o

oo

o

o

o

o

o

o

ooo oo

oo

oo

o

o

oo

oo

o

o

oo o

oo

o

oo

o

oo

oo

o

o

oo

o

o

o

o

ooo

oo

o ooo

o

oo

o

o

ooo oo

o o

oo

o

o

o

o ooo

o

o

o

oooo

oooo

ooo

o

o

oo

o

o

o

o

o

o

oo o

o

oo

o

o

o

o

o

o

ooo

ooo

oo

o

o

o

oo

o

o

ooo

o

o

o

ooo

oo

o

o

oo

oo

oo

o oooo

o

oo

oo

oo oo

oo

oo

o

o

o

oo

o

oo

o

o

o

o

oo

o

o

oo

o o

o

oo

o

o

o

o

o

oo

oo

oo

o

o

o

o

oo o

o

o o

o

oo

oo

oo

oo

ooo

o

o ooo oo oo oo

o

oo

oo

o

o

oo

ooo

oo o

o

o oo

o

o oo

o

o

o oo

o

ooo

o o

o

o

oo

oo

o o

o

o

o

o

oo

oo

oo

o

oo

o

o oo

oo

oo

oo

o

oo

oo

oo

o

o

o o

oo

oo

oo

oo

o

o

o o

oo

ooo

o

o

o

o

o

o

o oo

oo

o

oo

ooo

o

o

o

o

o

o

o

o oo

o o

o

o

o

o

oo ooo

ooo o

oo

o

oo

o o

o

o

o

oo

oooo

o

o

o

o o

oo oo

o

ooo

o

o

o

oo

o

o

o o

o

oo

oo

oo

ooo

o

o

oo

o

oo

o

o

o

oo

oo

o

oo

o

o

o

oo

o

oo o

ooo oo

oo

oo

o

oo o

ooo

o

o

oo

o

oo

oo o

o

o

o o

o

o o

o

oooo

o

o

o o

oo

oo o

ooo

oo

o

ooo

oo

o

o

ooo

oo

oo o o

ooo

o

oo

o

o

oo

o

o

oo

o oo

oo

oo

o

o

o

o

oo

o

o

oo

o

o

o

o

o

o

o

o oo

o

oo

o

o

oo

o

o

o

o o

o

ooo

ooo

o

o

oo

oo

o

o

o

o

o

o

o

o

o

o

oo o

oo

o

o

o

oooo ooo

o

o

o

o

o

oo

o

o

oo

ooo

o

oo

oo

oo

o

o

o

oo o

o

ooo

oo

o

oo

o oo

oo

o

o

o

o

o o

o

o

oo

o

o

o

o

oo

o

oo

o

o

o

o

o

o

ooo o

o

oo

oo

o

o

oo

oo

o

o

ooo

oo

o

o o

o

oo

oo

o

o

oo

o

o

o

o

oo o

oo

oo oo

o

oo

o

o

o oooo

o o

oo

o

o

o

o oo o

o

o

o

oo oo

oo oo

ooo

o

o

oo

o

o

o

o

o

o

ooo

o

oo

o

o

o

o

o

o

ooo

oo o

oo

o

o

o

oo

o

o

oo o

o

o

o

ooo

oo

o

o

oo

ooo

o

oo ooo

o

oo

oo

oo oo

ooo

o

o

o

o

oo

o

o o

o

o

o

o

oo

o

o

o o

o o

o

oo

o

o

o

o

o

oo

oo

oo

o

o

o

o

o oo

o

o o

o

oo

oo

ooo

o

oo o

o

o oo oo o oooo

o

oo

oo

o

o

oo

oo o

ooo

o

ooo

o

ooo

o

o

ooo

o

o oo

oo

o

o

oo

oo

o o

o

o

o

o

oo

o o

oo

o

oo

o

o oo

oo

oo

ooo

oooo

oo

o

o

oo

oo

ooo o

oo

o

o

oo

oo

o oo

o

o

o

o

o

o

ooo

o o

o

oo

oo o

o

o

o

o

o

o

o

ooo

o o

o

o

o

o

oo oo

oo

o oooo

o

oo

oo

o

o

o

oo

ooo

oo

o

o

oo

oooo

o

ooo

o

o

o

o o

o

o

oo

o

oo

oo

o o

o oo

o

o

oo

o

oo

o

o

o

oo

oo

o

oo

o

o

o

oo

o

oo o

Component 3

ooooo oo

ooo

o

oo o

o

o

ooo

oo

ooo

o o

o

o

o

oo oooo

oo

o ooo

o ooo

o

ooo oo

o

oo

ooooo oo

o

o

oooo

oo oo

ooooo

oo o

o

o

o

ooooo

o

o

ooooo

oooooooo o

o

o oo

oo

o ooo oo

oo

o o

o

oo

o oooo

o oooooooo

o

oo oo oooo

o

o

o

oo

o oo oooo ooo oo

o

oooo

o ooo

o

oo

ooo

o

o

o

o

oooo oo

o

ooo ooo

o

o

o

o

o

o oo

oo

o

ooo

o

oo

o

o

oo oo oo

o

oo

oo

ooo

o

oo

o

ooo

o o

o

o o oooooooo

oooo

oo

o

ooooo

oo

oo ooo

o o

o o

oo

o

o

o

oo

ooo

oo

o

o

o

ooo

ooo

ooo

ooo

ooo

oooo

o

ooo

o o

o

oo o

oo

o

oo

ooo

o

o

ooo

ooo

o

oo

ooooo

ooooo oo

ooooo

oooo

ooo

oo

oo

o

o oo ooo

oooo

oo

o

o

ooooo

o ooo

ooo

o

o

oooo o

o

oo

o

oo

oo ooo oo

oo

oo

o

oo

oooo

ooo

oooo o

ooo

o

ooo oo

o

o

o

oo

oooo

oo ooo

ooo

o

o

o

o

oooo oo o

oooo

oo

oo

ooooo

oooooo

ooo

oo

oo

oo

oooo o

oo

oo

oo

o

ooo

oooo

o oo

o ooooooo

ooooooo oo

o oo

o

oooo o

o

o

oo o

oo

o o

oo

o

o

oooo

oo o

oo

ooo

o

o ooo

oooo o

ooo

o

ooooo

oo

o

oooo

o o

o

ooooo

o

oo o

oooo

o oooooo ooo

o

o

oo ooo

ooo

o

ooo o

o o

oooo o

oo

o oooo

o

o

o

o

o

oo

o

o oo

ooo

ooo ooo o

oo

ooo

o

o

oo ooo

ooo

oo

o

o

o

oooo oo

oo

oooo

ooooo

ooooo

o

ooo

o ooooo

o

o

oo oo

oooo

oooo o

oooo

o

o

ooo

oo

o

o

ooooooo

oooo ooo

o

o oo

ooo oooo

o

oo

o o

o

oo

oo oo o

oooo ooooo

o

o ooooooo

o

o

o

oo

oo ooo ooo oooo

o

oo oo

oo o o

o

ooooo

o

o

o

o

oo oo oo

o

o oo ooo

o

o

o

o

o

oooo

o

o

ooo

o

oo

o

o

oo oooo

o

oo

oo

ooo

o

ooo

oo

ooo

o

oooooo

o oooo

ooooo

o

oooo o

oo

ooo

oooo

oo

oo

o

o

o

oo

ooo

oo

o

o

o

oo o

ooo

ooo

ooo

ooo

oooo

o

o oo

oo

o

ooo

oo

o

oo

o oo

o

o

oo o

ooo

o

oooo

oo oooo

o o oooo

o oooo o

oo

oo

oo

oo

o

ooooo o

o ooo

oo

o

o

o oooo

o o oo

o oo

o

o

oo

ooo

o

oo

o

oo

oo oo ooo

oo

oo

o

oo

oooo

ooo

oo

oo oo oo

o

oo ooo

o

o

o

oo

o ooo

o oooo

ooo

o

o

o

o

oooo oo o

oooo

oo

oo

ooo ooo o

o oo oooo

oo

oo

ooo

oooooo

oo

oo

o

oooooo ooo

oooo ooo oo

ooo

oo oo oo

ooo

o

oo ooo

o

o

ooo

o o

o o

oo

o

o

ooo o

ooo

oo

ooo

o

oo oo

oooo o

o oo

o

ooo ooo

o

o

ooo o

oo

o

ooo oo

o

ooo

oooo

ooooo

ooo ooo

o

oooooooo

o

oooo

oo

oooo o

ooo oooo

o

o

o

o

o

o o

o

ooo

o

oo o

ooo

ooo ooo

o

o

oo

o

o

o

oo

oo

o

oo

o

ooo

oo

o

o

oo

o

oo

oo

o

o

o

o

o

ooo

o

oo

o

ooo

oo

oo

oo

o

o

o

o

o

oo

oooo

o

oo

o

oo

o

ooo

o

o

oo

oo

oo

oo

o

o

o

o o

o o

o

oo

o

ooo

oo

oooo

o

o oo o

o

o oo

o

o

o

ooo

o

oo

o

o

oo

o

o

ooo

o

oo

o

o

o

o

o

o

o

oo

o

o o

oo

o

ooo

o

oo

oo

o

o o

o

o o

o

oo

ooo

ooo ooo oooo oo

o

oo

o

o

o

o

o

o oo

o

o

o

o

o

o

o

o

o

o o

oo

oo

ooo

ooo oo

o

o

ooo

oo

ooo

o

ooo

ooo o

ooo

o

oo

ooo

o

o

oo o o

o

o

oo

o

o

oo

oo oo ooo

oo

o

oo

o

o

o

oooo

o o

oo

oo

oo

oo

o

o

o

o

o

o

oo

ooo

oo

ooo

ooo

ooo

o

o

o

o

o

o

oo

o

oo

o

ooo

oooo

o

o

o

o

o

oo

ooo

oo

o o

oo

ooo

o

oooo

oo

o

o

o

o

o

ooo

ooo

oo

oo

oooo

oo

o

oo

oo o

oooo

o

o

oo

o

o

oo

oo

oo

ooooo

o

oooo

oo

o

ooo

oo

oo

o

o

o

o

o

oo

o oo

o

oo

o

o

ooo

o

o

o

o

o

o

ooo

o o

o

o

o

o

oo

ooo

oo

o

o

o

o o

o

oo

o

oo o

oo

oo

oo

ooo

o

o

o

oo

o

o

oo o oo

oo

ooo

o

o

o

oo

ooo

ooo oo

o

o

oo

o o

o

o

oo

oo

o

o

ooooo

oo

oo

o

oo

oo

o

o

oo

ooo

o oo

oo

o

o

ooo

o

ooo o o

o

oo

oo

o

o

o o

o

o oo oo oo o

o

o

oo o

o

oo

oo o

o oo

o

o

o

o

o

o

o

oo

oo

o

o

o

oo

oo

o

o

o

ooooo

ooo

oo

oo

o

ooo

o oo

o

o

o

o oo

o o

o

o

o

oo

oo

o

o

o

oo o

ooo

oo

oooo o o

o

o

ooo

o

o

oo

oo

o

oo

o

oo o

oo

o

o

oo

o

oo

oo

o

o

o

o

o

ooo

o

oo

o

ooo

oo

oo

oo

o

o

o

o

o

oo

o ooo

o

o o

o

oo

o

ooo

o

o

oo

oo

oo

oo

o

o

o

oo

o o

o

oo

o

o oo

oo

oooo

o

ooo o

o

ooo

o

o

o

o oo

o

oo

o

o

oo

o

o

oo

o

o

oo

o

o

o

o

o

o

o

oo

o

oo

oo

o

oo

o

o

oo

oo

o

oo

o

oo

o

oo

ooo

oo

o oo ooo oooo

o

oo

o

o

o

o

o

ooo

o

o

o

o

o

o

o

o

o

oo

oo

ooo o

ooooo o

o

o

ooo

o o

oo oo

oo

o

ooooo

ooo

oo

oo o

o

o

oooo

o

o

oo

o

o

oo

o ooo oo o

oo

o

oo

o

o

o

oo

o o

oo

oo

oo

oo

o o

o

o

o

o

o

o

ooo

oo

oo

o oo

o ooo

oo

o

o

o

o

o

o

oo

o

oo

o

ooo

oooo

o

o

o

o

o

oo

oo

o

oo

oo

ooo

ooo

oo oo

oo

o

o

o

o

o

oooo

oo

oo

oo

o ooo

oo

o

oo

oo o

ooo

o

o

o

oo

o

o

oo

oo

oo

ooo oo

o

oooo

oo

o

oo

oo

o

oo

o

o

o

o

o

oo

ooo

o

oo

o

o

oo

o

o

o

o

o

o

o

oo

o

oo

o

o

o

o

o o

oooo

oo

o

o

o o

o

oo

o

ooo

oo

oo

oo

oo

oo

o

o

oo

o

o

oo oooo

oo

ooo

o

o

oo

ooo

ooo oo

o

o

oo

oo

o

o

oo

oo

o

o

ooo

oo

oo

ooo

o o

oo

o

o

o o

o oo

ooo

o o

o

o

o oo

o

oo o oo

o

oo

oo

o

o

o o

o

o ooo oo oo

o

o

oo o

o

oo

ooo

o oo

o

o

o

o

o

o

o

oo

oo

o

o

o

oo

oo

o

o

o

o oo o o

ooo

o o

oo

o

oo

o

ooo

o

o

o

o oo

oo

o

o

o

oo

oo

o

o

o

oo o

oooo

oo

oo oo o

o

o

oo

o

o

o

oo

o o

o

oo

o

ooo

ooo

o

oo

o

oo

oo

o

o

o

o

o

ooo

o

oo

o

ooo

oo

oo

oo

o

o

o

o

o

oo

o ooo

o

oo

o

oo

o

ooo

o

o

oo

ooo

o

oo

o

o

o

o o

o o

o

oo

o

o oo

oo

o ooo

o

oo o o

o

o oo

o

o

o

ooo

o

oo

o

o

oo

[Figure 14.39 scatterplot-matrix point data omitted; recoverable labels: "Component 4", "Component 5", "PCA Components" (below diagonal), "ICA Components" (above diagonal).]

FIGURE 14.39. A comparison of the first five ICA components computed using FastICA (above diagonal) with the first five PCA components (below diagonal). Each component is standardized to have unit variance.
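The kind of comparison shown in this figure can be reproduced in outline with standard tools. The following is a minimal sketch, not the authors' code; it assumes scikit-learn and uses a synthetic matrix X as a stand-in for the handwritten-digit data analyzed in the book:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

# Stand-in for the book's digit data: X is (n_samples, n_features).
rng = np.random.RandomState(0)
X = rng.laplace(size=(1000, 64))

# First five PCA and FastICA components of the data.
pca_scores = PCA(n_components=5).fit_transform(X)
ica_scores = FastICA(n_components=5, random_state=0).fit_transform(X)

# Standardize each component to unit variance, as in the figure.
pca_scores /= pca_scores.std(axis=0)
ica_scores /= ica_scores.std(axis=0)

# A scatterplot matrix would then display ica_scores above the diagonal
# and pca_scores below it (e.g., via matplotlib or pandas.plotting).
```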


[Figure 14.40 panel labels: Mean, ICA 1, ICA 2, ICA 3, ICA 4, ICA 5.]

FIGURE 14.40. The highlighted digits from Figure 14.39. By comparing with the mean digits, we see the nature of the ICA component.


FIGURE 14.41. Fifteen seconds of EEG data (of 1917 seconds) at nine (of 100) scalp channels (top panel), as well as nine ICA components (lower panel). While nearby electrodes record nearly identical mixtures of brain and non-brain activity, ICA components are temporally distinct. The colored scalps represent the ICA unmixing coefficients a_j as a heatmap, showing brain or scalp location of the source.
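As a hedged illustration of the unmixing step described in the caption (not the analysis used for the figure), scikit-learn's FastICA can be applied to a multichannel signal matrix; the nine-channel array below is synthetic stand-in data:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for a multichannel recording, shape (n_samples, n_channels).
rng = np.random.RandomState(0)
t = np.linspace(0, 10, 2000)
sources = np.c_[np.sin(2 * np.pi * t), np.mod(t, 1.0)]   # two latent signals
mixing = rng.normal(size=(2, 9))                          # mixed into 9 channels
signals = sources @ mixing + 0.05 * rng.normal(size=(len(t), 9))

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(signals)   # temporally distinct components
unmixing = ica.components_                # one row of unmixing coefficients per component

# In the figure, each row of unmixing coefficients is rendered as a heatmap
# over electrode locations on the scalp.
```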


[Figure 14.42 plot data omitted; left panel: 18 distributions labeled a–r; right panel: "Amari Distance from True A" (log scale, ticks 0.01–0.50) versus "Distribution" (a–r), with legend FastICA, KernelICA, ProdDenICA.]

FIGURE 14.42. The left panel shows 18 distributions used for comparisons. These include the “t”, uniform, exponential, mixtures of exponentials, symmetric and asymmetric Gaussian mixtures. The right panel shows (on the log scale) the average Amari metric for each method and each distribution, based on 30 simulations in R^2 for each distribution.
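The Amari metric used in the right panel measures how far an estimated unmixing matrix is from inverting the true mixing matrix, up to permutation and scaling of the components. Below is a minimal sketch of one common form of this distance; the normalization constant varies across papers and is an assumption here:

```python
import numpy as np

def amari_distance(W, A):
    """One common form of the Amari distance between an estimated
    unmixing matrix W and a true mixing matrix A; it is zero exactly
    when W @ A is a scaled permutation matrix."""
    R = np.abs(W @ A)
    p = R.shape[0]
    row_term = (R.sum(axis=1) / R.max(axis=1) - 1).sum()
    col_term = (R.sum(axis=0) / R.max(axis=0) - 1).sum()
    return (row_term + col_term) / (2 * p)

# Sanity check: recovering the sources up to order and scale gives ~0.
A = np.random.RandomState(0).normal(size=(2, 2))
W = 3.0 * np.linalg.inv(A)[::-1]    # true inverse with rows permuted and rescaled
print(amari_distance(W, A))         # ~0.0
```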


[Figure 14.43 plot data omitted; axes: "First MDS Coordinate" and "Second MDS Coordinate", each ranging from -1.0 to 1.0.]

FIGURE 14.43. First two coordinates for half-sphere data, from classical multidimensional scaling.
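Classical multidimensional scaling can be computed from pairwise Euclidean distances by double-centering the squared distance matrix and taking the leading eigenvectors. A self-contained sketch (with synthetic half-sphere points standing in for the data set used in the figure) is:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical MDS: map an n x n Euclidean distance matrix D to k coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n            # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                    # double-centered squared distances
    eigval, eigvec = np.linalg.eigh(B)
    idx = np.argsort(eigval)[::-1][:k]             # largest eigenvalues first
    return eigvec[:, idx] * np.sqrt(np.maximum(eigval[idx], 0.0))

# Synthetic points on the upper half of the unit sphere (a stand-in).
rng = np.random.RandomState(0)
Z = rng.normal(size=(100, 3))
Z /= np.linalg.norm(Z, axis=1, keepdims=True)
Z[:, 2] = np.abs(Z[:, 2])
D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)

coords = classical_mds(D, k=2)     # first two MDS coordinates, as plotted in the figure
```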


[Figure 14.44 plot data omitted; two panels titled "Classical MDS" and "Local MDS", with x1 ranging over −5 to 5 and x2 over −15 to 0.]

FIGURE 14.44. The orange points show data lying on a parabola, while the blue points show multidimensional scaling representations in one dimension. Classical multidimensional scaling (left panel) does not preserve the ordering of the points along the curve, because it judges points on opposite ends of the curve to be close together. In contrast, local multidimensional scaling (right panel) does a good job of preserving the ordering of the points along the curve.
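The contrast described in the caption can be checked numerically. The sketch below generates hypothetical parabola data; classical MDS in one dimension coincides (for Euclidean distances) with projection onto the first principal component, and a neighborhood-based embedding is used here only as a rough stand-in for local MDS, which is not available in scikit-learn:

```python
import numpy as np
from sklearn.manifold import Isomap

# Hypothetical data on a parabola, similar in spirit to the figure.
x1 = np.linspace(-5, 5, 100)
X = np.c_[x1, -0.5 * x1 ** 2]

# Classical MDS in 1-D = projection onto the first principal component.
# Here that direction is the vertical axis, so the two ends of the
# parabola collapse onto each other and the ordering is lost.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
classical_1d = Xc @ Vt[0]

# Locality-preserving stand-in (Isomap, *not* local MDS) keeps the ordering.
local_like_1d = Isomap(n_neighbors=5, n_components=1).fit_transform(X).ravel()

order = np.argsort(local_like_1d)
print(np.all(np.diff(order) == 1) or np.all(np.diff(order) == -1))  # ordering preserved?
```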


FIGURE 14.45. Images of faces mapped into the embedding space described by the first two coordinates of LLE. Next to the circled points, representative faces are shown in different parts of the space. The images at the bottom of the plot correspond to points along the top right path (linked by solid line), and illustrate one particular mode of variability in pose and expression.
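Mapping a collection of images into two LLE coordinates can be sketched with scikit-learn; the array X below is a hypothetical stand-in for the vectorized face images, not the data set shown in the figure:

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Hypothetical (n_images, n_pixels) matrix of vectorized face images.
rng = np.random.RandomState(0)
X = rng.normal(size=(500, 400))

lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
coords = lle.fit_transform(X)   # first two LLE coordinates, one row per image

# Plotting coords[:, 0] vs. coords[:, 1] and thumbnailing selected images
# at their embedded locations yields a display like the one in the figure.
```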


[Figure 14.46 network diagram omitted; nodes labeled Page 1, Page 2, Page 3, Page 4.]

FIGURE 14.46. PageRank algorithm: example of a small network.
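A minimal power-iteration sketch of PageRank follows. It is not the book's notation; the damping factor of 0.85 and the four-page link structure are illustrative assumptions and need not match the edges drawn in the figure:

```python
import numpy as np

def pagerank(out_links, d=0.85, n_iter=100):
    """Power-iteration PageRank. out_links[j] lists the pages that page j
    links to; d is the damping factor. Returns scores that sum to 1."""
    n = len(out_links)
    p = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        new_p = np.full(n, (1.0 - d) / n)
        for j, targets in enumerate(out_links):
            if targets:                      # share page j's score among its out-links
                for i in targets:
                    new_p[i] += d * p[j] / len(targets)
            else:                            # dangling page: share uniformly
                new_p += d * p[j] / n
        p = new_p
    return p

# Hypothetical four-page network (0-indexed); edges are illustrative only.
out_links = [[1, 2], [2], [0], [0, 2]]
print(pagerank(out_links).round(3))
```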


[Figure 14.47 network diagram omitted; nodes labeled Page 1 through Page 6.]

FIGURE 14.47. Example of a small network.