CS6716 Pattern Recognition: Random Forests
CS7616 Pattern Recognition – A. Bobick
Aaron Bobick, School of Interactive Computing


Page 1: Random Forests (title slide)

Page 2:

Administrivia
• Shray says the problem set is “close to done”.
• Today: Chapter 15 of the Hastie book.
• Very few slides, brought to you by Peter Orbanz with huge thanks to A. Criminisi…

Page 3:

Recall: The Bootstrap Method
• A general tool for assessing statistical accuracy.
• Suppose we have a model to fit the training data:

  Z = {(x_i, y_i), i = 1, ..., N}

• The idea is to draw random samples with replacement of size N from the training data. This process is repeated B times to get B bootstrap datasets.
• Refit the model to each of the bootstrap datasets and examine the behavior of the fits over the B replications.
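The resampling step can be sketched in a few lines of Python (a hypothetical helper, not from the slides):

```python
import random

def bootstrap_datasets(Z, B, rng=None):
    """Draw B bootstrap datasets, each of size N = len(Z),
    sampled with replacement from the training data Z."""
    rng = rng or random.Random(0)
    N = len(Z)
    return [[Z[rng.randrange(N)] for _ in range(N)] for _ in range(B)]

# toy training set Z = {(x_i, y_i), i = 1, ..., N}
Z = [(x, 2 * x) for x in range(10)]
datasets = bootstrap_datasets(Z, B=25)
```

Because sampling is with replacement, each bootstrap dataset typically contains repeats and omits roughly 1/e ≈ 37% of the original points; those omitted points are what makes the out-of-bag estimates later in these slides possible.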

Page 4:

Recall Bootstrap The basic idea: Randomly draw datasets with replacement from the training data, each sample the same size as the original training set

Page 5:

Bootstrapping in regression: 95% confidence bands
• For 200 bootstrap fits, a 95% pointwise band means taking the 5th smallest and 5th biggest values at each point.
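The band construction can be sketched as follows (a hypothetical helper, assuming 200 bootstrap fits evaluated at a single point x):

```python
def pointwise_band_95(values):
    """95% pointwise bootstrap band from 200 fitted values at one x:
    drop the 2.5% extremes on each side, i.e. keep everything between
    the 5th smallest and 5th largest values."""
    assert len(values) == 200
    s = sorted(values)
    return s[4], s[-5]          # 5th smallest, 5th largest (0-indexed)

lo, hi = pointwise_band_95(list(range(200)))   # stand-in fitted values 0..199
```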

Page 6:

Recall: Bagging
• Introduced by Breiman (Machine Learning, 1996).
• An acronym for “Bootstrap aggregation”.
• In regression, it averages the prediction over a collection of bootstrap samples, thus reducing the variance of the prediction.
• For classification, a committee (or ensemble; more on this later) of classifiers each casts a vote for the predicted class.

Page 7:

Recall: Bagging for classification
• Let Ĝ be a classifier for a K-class response. Consider an underlying indicator-vector function

  f̂(x) = (0, ..., 0, 1, 0, ..., 0),

  where the entry in the i-th place is 1 if the prediction for x is the i-th class, such that

  Ĝ(x) = arg max_k f̂_k(x).

• Then the bagged estimate is f̂_bag(x) = (p_1, ..., p_K), where p_k is the proportion of base classifiers predicting class k at x, for k = 1, ..., K.
• Finally:

  Ĝ_bag(x) = arg max_k f̂_bag,k(x).
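In code, the vote proportions p_k and the bagged prediction look like this (a minimal sketch with stand-in base classifiers; names are my own):

```python
from collections import Counter

def bagged_predict(classifiers, x):
    """f_bag(x) = (p_1, ..., p_K): the proportion of base classifiers
    voting for each class; G_bag(x) is the class with the most votes."""
    B = len(classifiers)
    votes = Counter(g(x) for g in classifiers)
    p = {k: count / B for k, count in votes.items()}
    return max(p, key=p.get), p

# three stand-in base classifiers (e.g. trees fit to bootstrap samples)
base = [lambda x: "a", lambda x: "b", lambda x: "b"]
label, p = bagged_predict(base, x=None)   # label == "b", p == {"a": 1/3, "b": 2/3}
```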

Page 8:

Recall: Bagging Trees
Notice that the bootstrap trees are different from the original tree.

Page 9:

Good bagging

Page 10:

The key is independence

Page 11:

“The key is independence…”
• Bias is the difference between the expected value of a regressor/classifier and the actual value. If you build N trees from the same distribution of data, you’d expect the same expected value for all, so the bias is what the bias is: you can’t really do anything about that.
• And for any given tree, the variance (the expected squared difference between the estimate and the truth) is what the variance is.
• But if the trees are independently constructed, then the average of the trees will have the same bias but reduced variance.
• The goal of random forests (according to AFB) is to produce trees that are as independent as possible.

Page 12:

From Bagging and Boosting to Random Forests
• Bagging works particularly well for trees, since trees have high variance but low bias (why?).
• But boosting typically outperforms bagging with trees.
• Why? The main culprit is usually dependence: boosting is better at reducing correlation between the trees than bagging is.
• Random forests: a modification of bagging with trees, designed to reduce correlation.
  • Tree training optimizes each split over all dimensions.
  • Random forests choose a different subset of dimensions at each split.
  • The optimal split is chosen within that subset.
  • The subset is chosen at random out of all dimensions {1, ..., m}.
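The per-node modification above can be sketched as follows (a toy implementation with a simple misclassification impurity; the function name and impurity choice are my own, not from the slides):

```python
import random

def best_random_split(X, y, m, rng=None):
    """One random-forest node split: sample m candidate dimensions at
    random out of all p, then pick the (dimension, threshold) pair with
    the lowest misclassification impurity among those candidates only."""
    rng = rng or random.Random(0)
    p = len(X[0])
    candidates = rng.sample(range(p), m)      # random subset of dimensions
    best = None
    for j in candidates:
        for t in sorted({row[j] for row in X}):
            sides = ([yi for row, yi in zip(X, y) if row[j] <= t],
                     [yi for row, yi in zip(X, y) if row[j] > t])
            # impurity: points not in their side's majority class
            cost = sum(len(s) - max(s.count(c) for c in set(s))
                       for s in sides if s)
            if best is None or cost < best[0]:
                best = (cost, j, t)
    return best   # (impurity, dimension, threshold)

X = [[0, 5], [1, 5], [2, 5], [3, 5]]
y = [0, 0, 1, 1]
cost, dim, thresh = best_random_split(X, y, m=2)   # cleanly split on dimension 0
```

Bagged trees run the same loop over all p dimensions at every node; the only change here is the `rng.sample` line, which is exactly what decorrelates the trees.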

Page 13:

Random Forest Algorithm

Page 14:

RFs vs Bagging

Page 15:

RFs vs Bagging vs Gradient Boosting(?)

Page 16:

Some RF details
• Typically m = √p for classification, but this varies: “In practice the best values for this parameter will depend on the problem.” (Duh)
• “Out-of-bag samples”: to estimate test error, for each x_i average/vote over only the trees that were constructed from bootstrap samples that did not contain x_i.
• If the number of irrelevant variables is high, this can be a problem. But not if there are enough good ones.
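The out-of-bag idea can be sketched as follows (a hypothetical helper; the trees are stand-in callables, and bootstrap_indices records which training indices each tree actually saw):

```python
def oob_error(trees, bootstrap_indices, X, y):
    """Out-of-bag test-error estimate: classify each x_i by majority
    vote over only the trees whose bootstrap sample did not contain i."""
    wrong = counted = 0
    for i, (x, yi) in enumerate(zip(X, y)):
        votes = [tree(x) for tree, idx in zip(trees, bootstrap_indices)
                 if i not in idx]
        if not votes:               # x_i appeared in every bootstrap sample
            continue
        pred = max(set(votes), key=votes.count)
        wrong += (pred != yi)
        counted += 1
    return wrong / counted

# two stand-in trees: the first saw only index 0, the second only index 1
trees = [lambda x: 1, lambda x: 0]
err = oob_error(trees, [{0}, {1}], X=[10, 20], y=[1, 1])   # err == 0.5
```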

Page 17:

Noise variables… but…

Page 18:

Tree size not that big a factor

Page 19:

Analysis of RFs
• Assume each tree has a variance of σ².
• If the trees were i.i.d., then the variance of the average of B trees would be σ²/B.
• But they are not independent. Assume a (positive) pairwise correlation ρ between trees. Then the variance of the average is:

  ρσ² + ((1 − ρ)/B) σ²

• As B → ∞, the variance is determined by ρ.
• By selecting only m variables to consider at each node, we decrease the correlation.
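A quick numeric check of the formula (a hypothetical helper, just evaluating the expression above):

```python
def variance_of_average(sigma2, rho, B):
    """Variance of the average of B identically distributed trees with
    per-tree variance sigma2 and pairwise correlation rho:
        rho * sigma2 + (1 - rho) / B * sigma2
    """
    return rho * sigma2 + (1 - rho) / B * sigma2

iid = variance_of_average(1.0, 0.0, 10)        # i.i.d. case: sigma^2 / B = 0.1
floor = variance_of_average(1.0, 0.3, 10**6)   # large B: approaches rho * sigma^2
```

With ρ = 0 the formula reduces to the familiar σ²/B; with ρ > 0, no amount of averaging can push the variance below ρσ², which is why the forest works to shrink ρ.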

Page 20:

Empirical effect of m

Page 21:

Relation to K-NN
• A deep (maximal) tree finds a “path” down to an individual training element, much like K-NN.

Page 22:

Now for the tutorial…