CS 188 SECTION 12
These slides are on Piazza! Search for “Daylen’s slides”
CORRECTION: LAPLACE SMOOTHING
▪ Laplace’s estimate (extended): pretend you saw every outcome k extra times:
  P_LAP,k(x) = (count(x) + k) / (N + k|X|)
  where |X| is the number of values that X can take on
▪ What’s Laplace with k = 0? (Just the empirical estimate.) k is the strength of the prior
▪ Laplace for conditionals: smooth each condition independently:
  P_LAP,k(x | y) = (count(x, y) + k) / (count(y) + k|X|)
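The smoothed estimate above can be sketched in a few lines of Python; the red/red/blue draw is an illustrative sample, not data from the slides' worked example.

```python
from collections import Counter

def laplace_estimate(samples, domain, k=1):
    """Laplace-smoothed estimate: P(x) = (count(x) + k) / (N + k*|X|)."""
    counts = Counter(samples)
    n = len(samples)
    return {x: (counts[x] + k) / (n + k * len(domain)) for x in domain}

# Three draws: two red, one blue. With k=0 this is the empirical estimate;
# with k=1 we pretend every outcome was seen one extra time.
probs = laplace_estimate(["r", "r", "b"], domain=["r", "b"], k=1)
# P(r) = (2+1)/(3+2) = 0.6, P(b) = (1+1)/(3+2) = 0.4
```

Note that as k grows, the estimate is pulled toward the uniform distribution, which is what "k is the strength of the prior" means.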
CALCULUS REVIEW SECTIONS
➤ Session 1: today; 6-7:30 pm, Soda 405: single variable calculus
➤ Session 2: today; 7:30-9 pm, Soda 405: identical content to session 1
➤ Session 3: tomorrow; 6-7:30 pm, Soda 380: multivariable calculus
➤ Session 4: tomorrow; 7:30-9 pm, Soda 380: identical content to session 3
UPCOMING DEADLINES
➤ Project 5 due today @ 5pm
➤ HW 6 due Wednesday @ 11:59 pm
➤ Project 6 due Sunday @ 5pm
➤ Final Exam next Thursday
BINARY PERCEPTRONS
Linear Classifiers
▪ Inputs are feature values
▪ Each feature has a weight
▪ Sum is the activation:
  activation_w(x) = Σ_i w_i · f_i(x) = w · f(x)
▪ If the activation is:
▪ Positive, output +1
▪ Negative, output -1
(Diagram: features f1, f2, f3 weighted by w1, w2, w3, summed, then tested > 0)
BINARY PERCEPTRONS
Learning: Binary Perceptron
▪ Start with weights = 0
▪ For each training instance:
▪ Classify with current weights
▪ If correct (i.e., y = y*), no change!
▪ If wrong: adjust the weight vector by adding or subtracting the feature vector. Subtract if y* is -1: w = w + y* · f
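The update rule above can be sketched as plain Python; the toy data set (with a constant bias feature appended) is an assumption for illustration.

```python
def perceptron_train(data, epochs=10):
    """Binary perceptron: data is a list of (feature_vector, label), label in {+1, -1}."""
    n = len(data[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for f, y_star in data:
            activation = sum(wi * fi for wi, fi in zip(w, f))
            y = 1 if activation > 0 else -1
            if y != y_star:
                # wrong: add the feature vector if y* is +1, subtract if y* is -1
                w = [wi + y_star * fi for wi, fi in zip(w, f)]
    return w

# Toy separable data; the last feature is a constant bias of 1.
data = [([1, 2, 1], 1), ([2, 3, 1], 1), ([-1, -1, 1], -1), ([-2, -1, 1], -1)]
w = perceptron_train(data)
```

On separable data like this, the loop stops changing w once every example is classified correctly.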
MULTICLASS PERCEPTRONS
Multiclass Decision Rule
▪ If we have multiple classes:
▪ A weight vector for each class: w_y
▪ Score (activation) of a class y: w_y · f(x)
▪ Prediction: highest score wins, y = argmax_y w_y · f(x)
Binary = multiclass where the negative class has weight zero
Learning: Multiclass Perceptron
▪ Start with all weights = 0
▪ Pick up training examples one by one
▪ Predict with current weights
▪ If correct, no change!
▪ If wrong: lower the score of the wrong answer (w_y = w_y − f(x)), raise the score of the right answer (w_y* = w_y* + f(x))
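The multiclass decision rule and update can be sketched together; the three-class toy data (again with a bias feature) is assumed for illustration, and ties in the argmax break toward the first class listed.

```python
def multiclass_perceptron_train(data, classes, epochs=10):
    """Multiclass perceptron: one weight vector per class; highest score wins."""
    n = len(data[0][0])
    w = {c: [0.0] * n for c in classes}

    def predict(f):
        # argmax over classes of the score w_y . f(x)
        return max(classes, key=lambda c: sum(wi * fi for wi, fi in zip(w[c], f)))

    for _ in range(epochs):
        for f, y_star in data:
            y = predict(f)
            if y != y_star:
                # lower the score of the wrong answer, raise the right one
                w[y] = [wi - fi for wi, fi in zip(w[y], f)]
                w[y_star] = [wi + fi for wi, fi in zip(w[y_star], f)]
    return w, predict

# Toy three-class data; the last feature is a constant bias of 1.
data = [([1, 0, 1], "a"), ([0, 1, 1], "b"), ([-1, -1, 1], "c")]
w, predict = multiclass_perceptron_train(data, ["a", "b", "c"])
```

Setting `classes = [+1, -1]` and freezing w[-1] at zero recovers the binary perceptron, which is the "negative class has weight zero" remark above.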
OTHER CLASSIFIERS DISCUSSED
➤ Support Vector Machines
Support Vector Machines
▪ Maximizing the margin: good according to intuition, theory, practice
▪ Only support vectors matter; other training examples are ignorable
▪ Support vector machines (SVMs) find the separator with max margin
▪ Basically, SVMs are MIRA where you optimize over all examples at once
(Diagrams: MIRA vs. SVM separators)
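The margin idea can be illustrated numerically. The separator w and the labeled points below are assumed for illustration (w is fixed, not learned); the sketch just computes each point's geometric margin and finds the minimum, which is what the support vectors achieve.

```python
import math

# A fixed separator w (last entry is a bias term) and labeled points.
w = [1.0, 1.0, 0.0]  # decision boundary: x1 + x2 = 0
points = [([1, 3, 1], 1), ([0.5, 0.5, 1], 1), ([-1, -0.2, 1], -1), ([-3, -2, 1], -1)]

norm = math.sqrt(sum(wi * wi for wi in w))
# Geometric margin of each point: y * (w . f) / ||w||
margins = [y * sum(wi * fi for wi, fi in zip(w, f)) / norm for f, y in points]

# The support vectors are the points achieving the minimum margin;
# only they constrain the max-margin separator.
min_margin = min(margins)
```

An SVM would choose w to make this minimum margin as large as possible over all training examples at once, which is the sense in which it is "MIRA over all examples".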
OTHER CLASSIFIERS DISCUSSED
➤ Nearest Neighbors
Parametric / Non-Parametric
▪ Parametric models:
▪ Fixed set of parameters
▪ More data means better settings
▪ Non-parametric models:
▪ Complexity of the classifier increases with data
▪ Better in the limit, often worse in the non-limit
▪ (K)NN is non-parametric
(Figure: fits to the truth with 2, 10, 100, and 10,000 examples)
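A kNN classifier makes the non-parametric point concrete: the "model" is the stored training data itself, so its complexity grows with the number of examples. The toy data below is assumed for illustration.

```python
from collections import Counter

def knn_predict(train, x, k=3):
    """k-nearest-neighbor classification: majority label among the k closest points."""
    by_dist = sorted(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Non-parametric: adding examples grows the stored "model" directly.
train = [([0, 0], "a"), ([0, 1], "a"), ([1, 0], "a"),
         ([5, 5], "b"), ([5, 6], "b"), ([6, 5], "b")]
```

With only two stored examples the decision boundary is crude; with thousands it tracks the true boundary closely, matching the 2/10/100/10,000-example progression in the slide's figure.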
WORKSHEET