
Name:

CENG740

New Approaches and Applications of Pattern Analysis

Final Exam, June 7th, 2011, 150 Minutes

Q1. Consider the probability density functions that represent two classes:

p_1(x) and p_2(x)   [the density formulas are not legible in this copy; each density depends on two parameters],

where both parameters of p_1 equal 1 and both parameters of p_2 equal 2.

a) Which density has greater entropy? Verify your answer by evaluating the entropies of p_1 and p_2.

b) Find the Kullback-Leibler divergence KL(p_1 || p_2).

c) Do p_1 and p_2 belong to the exponential family? Verify your answer by converting p_1 and p_2 into exponential form and identifying the parameters.
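Since the density formulas themselves are not legible in this copy, the sketch below only illustrates how parts (a) and (b) could be checked numerically; the exponential densities used here (rates 1 and 2) are an assumed stand-in, not the distributions from the exam.

```python
import numpy as np

# Stand-in densities only: the exam's actual formulas are illegible here.
# For illustration, assume p1 and p2 are exponential with rates 1 and 2,
# matching the parameter values that survive in the question text.
lam1, lam2 = 1.0, 2.0

def p1(x):
    return lam1 * np.exp(-lam1 * x)

def p2(x):
    return lam2 * np.exp(-lam2 * x)

# Grid over the (truncated) support for a simple Riemann-sum integration.
x = np.linspace(1e-9, 40.0, 400_000)
dx = x[1] - x[0]

# (a) Differential entropy H(p) = -integral p(x) log p(x) dx.
#     Closed form for Exp(lam): H = 1 - log(lam).
H1 = -np.sum(p1(x) * np.log(p1(x))) * dx
H2 = -np.sum(p2(x) * np.log(p2(x))) * dx

# (b) KL divergence KL(p1 || p2) = integral p1(x) log(p1(x)/p2(x)) dx.
#     Closed form for exponentials: log(lam1/lam2) + lam2/lam1 - 1.
KL = np.sum(p1(x) * np.log(p1(x) / p2(x))) * dx

print(f"H(p1) ~ {H1:.4f}   H(p2) ~ {H2:.4f}")
print(f"KL(p1||p2) ~ {KL:.4f}")

# (c) An exponential density lam*exp(-lam*x) is in the exponential family:
#     p(x) = exp(log(lam) - lam*x), natural parameter -lam, sufficient statistic x.
```

Under these assumed stand-ins the closed forms give H(p_1) = 1, H(p_2) = 1 − ln 2, and KL(p_1 || p_2) = 1 − ln 2, which the numerical integration should reproduce.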

Q2. Consider the following regression model

y(x, w) = w_2 x^2 + w_1 x + w_0

to be fitted to the following data set:

D = {(x_n, t_n) : n = 1, 2, 3} = {(0, 1), (1, 0), (2, 2)},

where p(t | x, w, β) = N(t | y(x, w), β^{-1}).
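For concreteness, the log-likelihood implied by this noise model can be written out directly; the sketch below is only illustrative (the helper name log_likelihood and the sample weight and precision values are not from the exam).

```python
import numpy as np

def y(x, w):
    """Quadratic model y(x, w) = w0 + w1*x + w2*x^2."""
    return w[0] + w[1] * x + w[2] * x**2

def log_likelihood(w, beta, xs, ts):
    """Log of prod_n N(t_n | y(x_n, w), beta^{-1}) over the data set."""
    xs, ts = np.asarray(xs, float), np.asarray(ts, float)
    resid = ts - y(xs, w)
    n = len(xs)
    return 0.5 * n * np.log(beta) - 0.5 * n * np.log(2 * np.pi) - 0.5 * beta * np.sum(resid**2)

# Data set from the question: D = {(0,1), (1,0), (2,2)}; w and beta are arbitrary test values.
print(log_likelihood(w=[1.0, -1.0, 1.0], beta=1.0, xs=[0, 1, 2], ts=[1, 0, 2]))
```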

a. Find the maximum likelihood estimate of w_0, w_1 and w_2.

b. Estimate the precision β.

c. Find the regularized least squares estimate of w_0, w_1 and w_2 for λ = 1.
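A minimal numerical sketch for parts (a)–(c), assuming the usual correspondence between maximum likelihood under Gaussian noise and (regularized) least squares; variable names such as Phi are illustrative only.

```python
import numpy as np

# Data set D = {(0, 1), (1, 0), (2, 2)} from the question.
x = np.array([0.0, 1.0, 2.0])
t = np.array([1.0, 0.0, 2.0])

# Design matrix for the quadratic model y(x, w) = w_0 + w_1*x + w_2*x^2.
Phi = np.stack([np.ones_like(x), x, x**2], axis=1)

# (a) Under the Gaussian noise model, maximum likelihood reduces to ordinary
#     least squares: w_ML = (Phi^T Phi)^{-1} Phi^T t.
w_ml, *_ = np.linalg.lstsq(Phi, t, rcond=None)

# (b) ML precision: 1/beta_ML = (1/N) sum_n (t_n - y(x_n, w_ML))^2.
#     Three points and three free parameters mean the quadratic interpolates
#     the data exactly, so the mean squared residual is 0 and beta_ML diverges.
residuals = t - Phi @ w_ml
mse = np.mean(residuals**2)
beta_ml = np.inf if np.isclose(mse, 0.0) else 1.0 / mse

# (c) Regularized least squares with lambda = 1:
#     w_reg = (lambda*I + Phi^T Phi)^{-1} Phi^T t.
lam = 1.0
w_reg = np.linalg.solve(lam * np.eye(Phi.shape[1]) + Phi.T @ Phi, Phi.T @ t)

print("w_ML    =", w_ml)     # interpolates the three points
print("beta_ML =", beta_ml)
print("w_reg   =", w_reg)
```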

Q3. Consider the following 2-dimensional data for classification:

C_1 = {(1, 0), (1, 0.5), (1, −1)},

C_2 = {(0, 1), (0, 0.5), (0, 2)}.

a. Calculate the Fisher criterion to find the optimal line for projecting the data into 1-d space.

b. Assume that the data is drawn from Normal densities:

i. Estimate the mean and covariance matrix using the Maximum Likelihood method.

ii. Find the logistic regression equation p(C_1 | x) = σ(w^T x + w_0) for p(C_1) = p(C_2).
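A numerical sketch for parts (a) and (b), assuming the standard Fisher-discriminant formula and the usual Gaussian generative model with a shared (pooled) covariance and equal priors; variable names are illustrative only.

```python
import numpy as np

# Class data from the question.
C1 = np.array([[1.0, 0.0], [1.0, 0.5], [1.0, -1.0]])
C2 = np.array([[0.0, 1.0], [0.0, 0.5], [0.0, 2.0]])

# ---- (a) Fisher criterion / optimal projection direction ----
m1, m2 = C1.mean(axis=0), C2.mean(axis=0)
# Within-class scatter S_W = sum_k sum_{x in C_k} (x - m_k)(x - m_k)^T.
Sw = (C1 - m1).T @ (C1 - m1) + (C2 - m2).T @ (C2 - m2)
# For this data S_W is singular (no within-class spread along the first
# coordinate), so a tiny ridge term keeps the numerical solve well defined.
eps = 1e-8
w_fisher = np.linalg.solve(Sw + eps * np.eye(2), m1 - m2)
w_fisher /= np.linalg.norm(w_fisher)
# Fisher criterion J(w) = (w^T (m1 - m2))^2 / (w^T S_W w).
J = (w_fisher @ (m1 - m2)) ** 2 / (w_fisher @ Sw @ w_fisher + eps)

# ---- (b.i) ML estimates of mean and covariance (biased, 1/N, per class) ----
mu1, mu2 = m1, m2
S1 = (C1 - mu1).T @ (C1 - mu1) / len(C1)
S2 = (C2 - mu2).T @ (C2 - mu2) / len(C2)

# ---- (b.ii) Logistic form p(C1|x) = sigma(w^T x + w0) ----
# With Gaussian class-conditionals sharing one covariance S and equal priors,
#   w  = S^{-1} (mu1 - mu2)
#   w0 = -0.5 mu1^T S^{-1} mu1 + 0.5 mu2^T S^{-1} mu2   (the prior term cancels).
S = 0.5 * (S1 + S2)                         # pooled ML covariance (equal class sizes)
S_inv = np.linalg.inv(S + eps * np.eye(2))  # same ridge trick: S is singular here
w = S_inv @ (mu1 - mu2)
w0 = -0.5 * mu1 @ S_inv @ mu1 + 0.5 * mu2 @ S_inv @ mu2
# The first coordinate separates the classes perfectly, so the corresponding
# weight grows without bound as the ridge term shrinks.

print("Fisher direction ~", w_fisher, "  J ~", J)
print("mu1 =", mu1, "  mu2 =", mu2)
print("Sigma1 =\n", S1, "\nSigma2 =\n", S2)
print("logistic weights: w ~", w, "  w0 ~", w0)
```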

 

Q4. Consider the following nonlinearly separable data:

C_1 = {(0, 0), (1, 1)},

C_2 = {(0, 1), (1, 0)}.

a. Define a mapping function φ(x): X → ℱ, where the data in ℱ is linearly separable.

b. Design a kernel k(x, x') = <φ(x), φ(x')> and verify that k(x, x') is a valid kernel.

c. Design a classifier that employs only kernels. What is the class of the unknown sample (0.5, 0.5) according to your classifier?
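One common construction for XOR-type data is sketched below: the feature map φ(x) = (x_1, x_2, x_1·x_2) and the kernel it induces are just one valid answer among many, and the kernel nearest-class-mean rule is only an illustration of a classifier that uses nothing but kernel evaluations.

```python
import numpy as np

# XOR-type data from the question.
C1 = np.array([[0.0, 0.0], [1.0, 1.0]])
C2 = np.array([[0.0, 1.0], [1.0, 0.0]])

# (a) Example feature map phi(x) = (x1, x2, x1*x2); in this space the two
#     classes can be separated by a plane (one possible choice among many).
def phi(x):
    return np.array([x[0], x[1], x[0] * x[1]])

# (b) Kernel induced by the map: k(x, x') = <phi(x), phi(x')>.
#     It is a valid (positive semi-definite) kernel by construction, since it
#     is an explicit inner product of feature vectors.
def k(x, xp):
    return x[0] * xp[0] + x[1] * xp[1] + x[0] * x[1] * xp[0] * xp[1]

# (c) A classifier written purely in terms of kernel evaluations: nearest class
#     mean in feature space, using
#     ||phi(x) - m_c||^2 = k(x,x) - (2/N_c) sum_n k(x, x_n) + (1/N_c^2) sum_{n,m} k(x_n, x_m).
def dist2_to_class_mean(x, C):
    n = len(C)
    return (k(x, x)
            - 2.0 / n * sum(k(x, xn) for xn in C)
            + 1.0 / n**2 * sum(k(xn, xm) for xn in C for xm in C))

query = np.array([0.5, 0.5])
d1 = dist2_to_class_mean(query, C1)
d2 = dist2_to_class_mean(query, C2)
print("squared distance to class-1 mean:", d1)
print("squared distance to class-2 mean:", d2)
# With this particular map the query can land equidistant from both class
# means; the chosen feature map / classifier determines how ties are resolved.
print("predicted class:", 1 if d1 < d2 else 2)
```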