
Page 1: Spring 2012 BioE 2630 (Pitt) : 16-725 (CMU RI) 18-791 (CMU ECE) : 42-735 (CMU BME)

The content of these slides by John Galeotti, © 2012 Carnegie Mellon University (CMU), was made possible in part by NIH NLM contract #HHSN276201000580P, and is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/3.0/ or send a letter to Creative Commons, 171 2nd Street, Suite 300, San Francisco, California, 94105, USA. Permissions beyond the scope of this license may be available either from CMU or by emailing [email protected]. The most recent version of these slides may be accessed online via http://itk.galeotti.net/

Lecture 3: Math & Probability Background

ch. 1-2 of Machine Vision by Wesley E. Snyder & Hairong Qi

Spring 2012 BioE 2630 (Pitt) : 16-725 (CMU RI)
18-791 (CMU ECE) : 42-735 (CMU BME)

Dr. John Galeotti

Page 2: General notes about the book

The book is an overview of many concepts. Top-quality design requires:
- Reading the cited literature
- Reading more literature
- Experimentation & validation

Page 3: Two themes

Consistency
- A conceptual tool implemented in many/most algorithms
- Often must fuse information from local measurements to make global conclusions about the image

Optimization
- A mathematical mechanism
- The "workhorse" of machine vision

Page 4: Image Processing Topics

- Enhancement
- Coding
  - Compression
- Restoration
  - "Fix" an image
  - Requires a model of the image degradation
- Reconstruction

Page 5: Machine Vision Topics

AKA: computer vision, image analysis, image understanding

Pattern recognition:
1. Measurement of features
   - Features characterize the image, or some part of it
2. Pattern classification
   - Requires knowledge about the possible classes

[Slide figure: a pipeline from Original Image to Feature Extraction to Classification & Further Analysis, with the latter two stages marked "Our Focus".]

Page 6: Feature measurement

[Slide figure: a flowchart of the feature-measurement pipeline, from Original Image through Noise removal (Restoration, ch. 6-7), Segmentation (ch. 8), Shape Analysis and Consistency Analysis, and Matching, producing Features; the later stages are labeled ch. 9, ch. 10-11, and ch. 12-16.]

Page 7: Probability

Probability of an event a occurring: Pr(a)

Independence
- Pr(a) does not depend on the outcome of event b, and vice-versa

Joint probability
- Pr(a,b) = prob. of both a and b occurring

Conditional probability
- Pr(a|b) = prob. of a if we already know the outcome of event b
- Read "probability of a given b"
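
Two standard identities tie these definitions together (implied by, though not printed on, the slide):

```latex
% Conditional probability in terms of the joint (when Pr(b) > 0):
\Pr(a \mid b) = \frac{\Pr(a, b)}{\Pr(b)}
% Independence of a and b is equivalent to the joint factoring:
\Pr(a, b) = \Pr(a)\,\Pr(b)
```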

Page 8: Probability for continuously-valued functions

Probability distribution function:
- P(x) = Pr(z < x)

Probability density function:
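
The slide's density equation was an image that did not survive extraction; the standard relations between the distribution P and the density p are:

```latex
% The density is the derivative of the distribution,
% so probabilities are recovered by integration:
p(x) = \frac{dP(x)}{dx}, \qquad
\Pr(a < z \le b) = P(b) - P(a) = \int_a^b p(x)\,dx, \qquad
\int_{-\infty}^{\infty} p(x)\,dx = 1
```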

Page 9: Linear algebra

- Unit vector: |x| = 1
- Orthogonal vectors: x^T y = 0
- Orthonormal: orthogonal unit vectors
- Inner product of continuous functions
  - Orthogonality & orthonormality apply here too
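
The slide's inner-product equation is presumably the standard one for functions; a reconstruction:

```latex
% Inner product of continuous functions f and g on [a, b]
% (the continuous analogue of x^T y):
\langle f, g \rangle = \int_a^b f(x)\,g(x)\,dx
% As with vectors: f and g are orthogonal when <f, g> = 0,
% and orthonormal when additionally <f, f> = <g, g> = 1.
```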

Page 10: Linear independence

- No one vector is a linear combination of the others:
  x_j ≠ Σ_{i≠j} a_i x_i for any choice of the a_i
- Any linearly independent set of d vectors {x_1, …, x_d} is a basis set that spans the space ℝ^d
- Any other vector in ℝ^d may be written as a linear combination of the {x_i}
- Often convenient to use orthonormal basis sets
- Projection: if y = Σ_i a_i x_i for an orthonormal basis, then a_i = y^T x_i (see the sketch below)
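
A runnable sketch of the projection rule a_i = y^T x_i; the orthonormal basis below (a rotation matrix) and the vector y are illustrative data, not from the lecture:

```python
import numpy as np

# For an orthonormal basis {x_i}, the coefficients of y = sum_i a_i x_i
# are recovered by projection: a_i = y^T x_i.
theta = np.pi / 6
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])  # columns: orthonormal basis of R^3

y = np.array([2.0, -1.0, 0.5])

a = Q.T @ y             # projection: a_i = y^T x_i for each column x_i
y_rebuilt = Q @ a       # reconstruction: y = sum_i a_i x_i
assert np.allclose(y, y_rebuilt)
print(a)
```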

Page 11: Linear transforms

- A linear transform = a matrix, denoted e.g. A
- Quadratic form: x^T A x (a scalar)
- Positive definite: applies to A if x^T A x > 0 whenever x ≠ 0
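
A quick numerical check of positive definiteness, using the standard fact that a symmetric A is positive definite exactly when all of its eigenvalues are positive; the matrix is illustrative:

```python
import numpy as np

# For symmetric A: x^T A x > 0 for all x != 0
# <=> every eigenvalue of A is positive.
A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

eigenvalues = np.linalg.eigvalsh(A)           # eigvalsh assumes A is symmetric
print(eigenvalues, np.all(eigenvalues > 0))   # [1. 3.] True

x = np.random.randn(2)                        # spot-check the quadratic form
print(x @ A @ x)                              # positive for any nonzero x
```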

Page 12: More derivatives

Of a scalar function of x:
- Called the gradient
- Really important!

Of a vector function of x:
- Called the Jacobian

Hessian = matrix of 2nd derivatives of a scalar function
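
The equations on this slide were images that did not survive extraction; these are the standard definitions the terms refer to:

```latex
% Gradient of a scalar function f(x), x in R^d:
\nabla f = \left[ \frac{\partial f}{\partial x_1}, \ldots,
                  \frac{\partial f}{\partial x_d} \right]^{\mathsf{T}}
% Jacobian of a vector-valued function f(x):
J_{ij} = \frac{\partial f_i}{\partial x_j}
% Hessian of a scalar function f(x):
H_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}
```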

Page 13: Misc. linear algebra

- Derivative operators
- Eigenvalues & eigenvectors
  - The "most important vectors" of a linear transform (e.g., the matrix A)
  - Characteristic equation: Ax = λx
  - A maps x onto itself with only a change in length
  - λ is an eigenvalue; x is its corresponding eigenvector
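
A runnable check of the characteristic equation Ax = λx with NumPy; the matrix is illustrative:

```python
import numpy as np

# A maps each eigenvector onto itself with only a change in length.
A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
for k, lam in enumerate(eigenvalues):
    x = eigenvectors[:, k]               # k-th eigenvector (a column)
    assert np.allclose(A @ x, lam * x)   # A x = lambda x
print(eigenvalues)                       # [4. 2.]
```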

Page 14: Function minimization

Find the vector x which produces a minimum of some function f(x)
- x is a parameter vector
- f(x) is a scalar function of x
- The "objective function"

The minimum value of f is denoted: f* = min_x f(x)

The minimizing value of x is denoted: x* = argmin_x f(x)

Page 15: Numerical minimization

Gradient descent
- The derivative points away from the minimum
- Take small steps, each one in the "down-hill" direction (see the sketch below)

Local vs. global minima

Combinatorial optimization:
- Use simulated annealing

Image optimization:
- Use mean field annealing
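
A minimal gradient-descent sketch under simple assumptions (known gradient, fixed small step size); the quadratic objective is illustrative data, not from the lecture:

```python
import numpy as np

# Repeatedly step "down-hill", opposite the gradient.
def f(x):
    return x[0] ** 2 + 10.0 * x[1] ** 2            # scalar objective f(x)

def grad_f(x):
    return np.array([2.0 * x[0], 20.0 * x[1]])     # gradient of f

x = np.array([4.0, -3.0])       # initial guess
step = 0.04                     # too large diverges, too small is slow
for _ in range(200):
    x = x - step * grad_f(x)    # the gradient points up-hill, so subtract it
print(x, f(x))                  # approaches the minimizer (0, 0)
# Note: for non-convex f, gradient descent only finds a local minimum.
```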

Page 16: Markov models

For temporal processes:
- The probability of something happening is dependent on a thing that just recently happened.

For spatial processes:
- The probability of something being in a certain state is dependent on the state of something nearby.
- Example: The value of a pixel is dependent on the values of its neighboring pixels.

Page 17: Markov chain

- Simplest Markov model
- Example: symbols transmitted one at a time
  - What is the probability that the next symbol will be w?
- For a Markov chain:
  - "The probability conditioned on all of history is identical to the probability conditioned on the last symbol received."
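
In symbols, with w_t the symbol received at time t (standard notation, not from the slide):

```latex
% The Markov property, matching the quoted statement:
\Pr(w_t \mid w_{t-1}, w_{t-2}, \ldots, w_1) = \Pr(w_t \mid w_{t-1})
```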

Page 18: Hidden Markov models (HMMs)

[Slide figure: two hidden processes ("1st Markov Process" and "2nd Markov Process"), each producing an output f(t); only the output is observed.]

Page 19: HMM switching

Governed by a finite state machine (FSM)

[Slide figure: an FSM whose states switch between "Output 1st Process" and "Output 2nd Process".]

Page 20: The HMM Task

Given only the output f(t), determine:
1. The most likely state sequence of the switching FSM
   - Use the Viterbi algorithm (see the sketch below)
   - Computational complexity = (# state changes) × (# state values)²
   - Much better than brute force, which = (# state values)^(# state changes)
2. The parameters of each hidden Markov model
   - Use the iterative process in the book
   - Better: use someone else's debugged code that they've shared
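
A compact Viterbi sketch in log-space; the model matrices and observations are hypothetical example data, and a real application should prefer a tested library implementation (as the slide itself advises):

```python
import numpy as np

def viterbi(observations, log_init, log_trans, log_emit):
    """Most likely hidden-state sequence.
    log_init[i]:     log Pr(first state = i)
    log_trans[i, j]: log Pr(next state = j | current state = i)
    log_emit[i, k]:  log Pr(observe symbol k | state = i)
    """
    n_states = log_init.shape[0]
    T = len(observations)
    score = np.full((T, n_states), -np.inf)    # best log-prob ending in each state
    back = np.zeros((T, n_states), dtype=int)  # backpointers for path recovery

    score[0] = log_init + log_emit[:, observations[0]]
    for t in range(1, T):
        for j in range(n_states):
            cand = score[t - 1] + log_trans[:, j]
            back[t, j] = np.argmax(cand)
            score[t, j] = cand[back[t, j]] + log_emit[j, observations[t]]

    # Trace back the best path. Total cost is O(T * n_states^2),
    # vs. brute force, which enumerates n_states^T paths.
    path = [int(np.argmax(score[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Example: 2 hidden states, 2 observation symbols.
log_init = np.log([0.6, 0.4])
log_trans = np.log([[0.7, 0.3], [0.4, 0.6]])
log_emit = np.log([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 0, 1, 1, 1], log_init, log_trans, log_emit))
```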