
Bayesian Inference for Medical Image Segmentation via Model-Based Cluster Trees

Visual Information Processing Group

José M. Martín Martínez

Introduction

● Medical Image Segmentation

● Applications

● Some Maths

● Experiments and results

● To do

Objectives

● Automatic Medical Image Segmentation and Quantization

● Segmentation: search for classes

● Quantization: search for colours

● Image properties

● Multimodal: different bands. For example, RGB

● X-ray, MRI, CAT, PET (emission and transmission)

Medical Images

Transmission

Emission

Applications

● First step for other image processing operations

● Feature detection

● Image registration

● Quantization of data values for compression

● Satellite image processing

● Medical purposes, in particular

● For example, presentation and analysis of mammograms

Model-Based Segmentation/Clustering Trees

● Two phases:

● Segmentation, based on Markov neighborhood dependency.

● Quantization (clustering), based on maximum likelihood estimation of finite mixture models.

Segmentation

● Takes the spatial influence into account, using a hidden Markov model (HMM)

● We consider an unobserved pixel state X_i ∈ {1, 2, ..., K} associated with each observed pixel value Y_i, and we use a Markov random field to define spatial structure on X:

$$p(X) \propto \exp\Big( \phi \sum_{i \sim j} I(X_i, X_j) \Big)$$

where I(X_i, X_j) = 1 if X_i = X_j and 0 otherwise, and φ is a spatial homogeneity parameter:

● φ small: randomness

● φ large: uniformity
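The slide's prior can be made concrete with a short sketch. The following Python snippet (helper names are mine, and a 4-connected neighborhood is assumed) evaluates the unnormalized Potts-style prior over a 2-D label grid:

```python
import math

def potts_energy(labels, phi):
    """phi * sum of I(X_i, X_j) over 4-connected neighbor pairs of a 2-D label grid."""
    rows, cols = len(labels), len(labels[0])
    agreements = 0
    for r in range(rows):
        for c in range(cols):
            # count each vertical and horizontal neighbor pair once
            if r + 1 < rows and labels[r][c] == labels[r + 1][c]:
                agreements += 1
            if c + 1 < cols and labels[r][c] == labels[r][c + 1]:
                agreements += 1
    return phi * agreements

def unnormalized_prior(labels, phi):
    """p(X) up to the (intractable) normalizing constant."""
    return math.exp(potts_energy(labels, phi))
```

For φ > 0 a uniform labeling scores higher than a checkerboard one, matching the "φ large: uniformity" remark.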

Segmentation (II)

● We define N(X_i) as the neighborhood of X_i, and U(N(X_i), j) as the number of neighbors in state j:

$$U(N(X_i), j) = \sum_{l \in N(X_i)} I(X_l, j)$$

● From p(X) we obtain the conditional distribution:

$$p(X_i = j \mid N(X_i), \phi) = \frac{\exp\big(\phi\, U(N(X_i), j)\big)}{\sum_{k} \exp\big(\phi\, U(N(X_i), k)\big)}$$
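A minimal sketch of this conditional distribution (function names are mine; states are labeled 1..K):

```python
import math

def neighbor_counts(neighbors, K):
    """U(N(X_i), j) for j = 1..K: how many neighbors of pixel i are in each state."""
    return [sum(1 for x in neighbors if x == j) for j in range(1, K + 1)]

def conditional(neighbors, phi, K):
    """p(X_i = j | N(X_i), phi) for j = 1..K."""
    u = neighbor_counts(neighbors, K)
    weights = [math.exp(phi * u_j) for u_j in u]
    z = sum(weights)
    return [w / z for w in weights]
```

With phi = 0 the distribution is uniform over the states; increasing phi concentrates mass on the majority neighbor state.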

Segmentation (III)

● Now we consider the observed image Y_i and assume P(Y_i | X_i = k) is Gaussian, N(μ_k, σ_k), with mean μ_k and standard deviation σ_k

● θ_k is the set of parameters (μ_k, σ_k) for state k

● The Y_i are conditionally independent given the X_i:

$$P(Y \mid X) = \prod_i P(Y_i \mid X_i) = \prod_i P(Y_i \mid \theta_{X_i})$$

Segmentation (IV)

● Procedure to calculate the hidden parameters:

● Set the number of classes (K)

● Step 0: Initialize X using a marginal segmentation. (We now have K classes)

● Step 1: Update θ using maximum likelihood (means and variances in each of the K classes):

$$\hat{\theta} = \arg\max_{\theta}\, p(Y \mid \hat{X})$$

● Step 2: Update Φ using maximum pseudo-likelihood:

$$\hat{\Phi} = \arg\min_{\Phi} \big( -\log PL(\hat{X} \mid \Phi) \big), \qquad PL(\hat{X} \mid \Phi) = \prod_i p(\hat{X}_i \mid N(\hat{X}_i), \Phi)$$

Segmentation (V)

● Step 3: Update X, using ICM (iterated conditional modes):

$$\hat{X}_i = \arg\max_{j}\, p(Y_i \mid X_i = j)\, p(X_i = j \mid N(\hat{X}_i), \hat{\Phi})$$

● Repeat steps 1 to 3 until there are no changes or the maximum number of iterations is reached

● We take different values of K to evaluate the procedure
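A sketch of one ICM update for a single pixel, combining the Gaussian likelihood with the Markov conditional (names are mine; the normalizer of the conditional is dropped since it does not affect the argmax):

```python
import math

def gauss_pdf(y, mu, sigma):
    """Gaussian density with mean mu and standard deviation sigma, evaluated at y."""
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def icm_update(y_i, neighbors, params, phi):
    """X_i <- argmax_j p(Y_i | X_i = j) * p(X_i = j | N(X_i), phi).

    params: list of (mu_k, sigma_k), one per class; classes are labeled 1..K."""
    K = len(params)
    best_j, best_score = 1, -1.0
    for j in range(1, K + 1):
        mu, sigma = params[j - 1]
        u_j = sum(1 for x in neighbors if x == j)  # U(N(X_i), j)
        score = gauss_pdf(y_i, mu, sigma) * math.exp(phi * u_j)
        if score > best_score:
            best_j, best_score = j, score
    return best_j
```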

Segmentation (VI)

● We have a statistical model for each K; how do we select K?

● This is a problem of statistical model comparison; we resolve it using Bayes factors, based on posterior and marginal model probabilities:

$$p(M_K \mid Y) = \frac{p(Y \mid M_K)\, p(M_K)}{\sum_{L=K_{\min}}^{K_{\max}} p(Y \mid M_L)\, p(M_L)}$$

M_K is the statistical model with K unknown classes, K = K_min, ..., K_max; Y is the observed image
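As a small illustration: once the models are assumed equally likely a priori, the posterior model probabilities reduce to normalizing the marginal likelihoods (function name is mine):

```python
def model_posteriors(marginal_liks):
    """p(M_K | Y) under equal model priors: normalize the marginal likelihoods.

    marginal_liks: dict mapping K -> p(Y | M_K)."""
    z = sum(marginal_liks.values())
    return {k: v / z for k, v in marginal_liks.items()}
```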

Segmentation (VII)

● The denominator is the same for all M_K, so we can remove it

● p(M_K) is the prior probability of model M_K. We assume the models are equally likely a priori:

$$p(M_K) = \frac{1}{K_{\max} - K_{\min} + 1}, \qquad K = K_{\min}, \dots, K_{\max}$$

so we can remove it, too

Segmentation (VIII)

● It remains to compute p(Y|M_K), the integrated likelihood

$$p(Y \mid M_K) = \int p(Y \mid \theta_K, M_K)\, p(\theta_K)\, d\theta_K$$

which is hard to evaluate. A good approximation is

$$2 \log p(Y \mid M_K) \approx 2 \log p(Y \mid \hat{\theta}_K, M_K) - d_K \log(N) = BIC$$

● θ̂_K is the maximum-likelihood estimator of θ_K

● d_K is the number of parameters in model M_K
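The BIC approximation above is a one-liner; a minimal sketch (conventions as on the slide: larger is better):

```python
import math

def bic(loglik, d, n):
    """2 * log p(Y | theta_hat, M) - d * log(N)."""
    return 2.0 * loglik - d * math.log(n)
```

Given two models with the same maximized log-likelihood, the one with fewer parameters gets the larger BIC.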

Segmentation (IX)

● But the likelihood involves a sum over all possible configurations of the hidden states (N pixels and K states: K^N terms), which is intractable:

$$P(Y \mid K) = \sum_{x} P(Y \mid X = x, K)\, p(X = x \mid K)$$

Segmentation (X)

● First, we approximate by assuming all Y_i are independent; then we approximate again by pseudolikelihood, which evaluates only the configuration of the neighborhood of each pixel i instead of all possible configurations:

$$P(Y \mid K) \approx \prod_i P(Y_i \mid K) \approx \prod_i \sum_{j=1}^{K} P(Y_i \mid X_i = j)\, P(X_i = j \mid N(\hat{X}_i), \hat{\Phi})$$

Segmentation (XI)

● Replacing the intractable P(Y|K) by the pseudolikelihood P̂_X̂(Y|K) in the BIC equation gives:

$$PLIC(K) = 2 \log \hat{P}_{\hat{X}}(Y \mid K) - d_K \log(N)$$

● Larger values of PLIC indicate a better fit, so we take the K associated with the model with the greatest PLIC
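Putting the last few slides together, a sketch of PLIC for a fitted segmentation (names are mine; `neighbors_of(i)` should return the fitted labels of pixel i's neighbors):

```python
import math

def plic(y, neighbors_of, params, phi, d):
    """PLIC(K) = 2 * log pseudo-likelihood - d * log(N).

    y: observed pixel values; params: list of (mu_j, sigma_j), one per state."""
    K = len(params)
    loglik = 0.0
    for i, y_i in enumerate(y):
        nb = neighbors_of(i)
        # conditional prior weights exp(phi * U(N(X_i), j)), then normalize
        w = [math.exp(phi * sum(1 for x in nb if x == j)) for j in range(1, K + 1)]
        z = sum(w)
        mix = 0.0
        for j in range(1, K + 1):
            mu, sigma = params[j - 1]
            lik = math.exp(-0.5 * ((y_i - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))
            mix += lik * w[j - 1] / z
        loglik += math.log(mix)
    return 2.0 * loglik - d * math.log(len(y))
```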

Quantization

● What is image quantization?

● Determining clusters among pixel values to map the original color image into an output image with a limited number of colors, without quality loss ("true gray levels")

● We have an image x, and we assume each pixel x_i belongs to one of G groups, each Gaussian distributed

● The g-th group has mean μ_g and variance σ_g²

● We call γ an unobserved n × G cluster assignment:

● γ_ig = 1 if x_i belongs to group g

● γ_ig = 0 otherwise

Quantization (II)

● We want to calculate, for a number of clusters G, the parameters μ_g and σ_g, and to determine the cluster assignment of each pixel

● The probability density for this model is

$$f(x_i \mid \theta, \lambda) = \sum_{g=1}^{G} \lambda_g\, f_g(x_i \mid \theta_g)$$

where θ_g = (μ_g, σ_g²)^T, f_g(·|θ_g) is a Gaussian density with mean μ_g and variance σ_g², θ = (θ_1, ..., θ_G), and λ_g ≥ 0 (g = 1, ..., G) with

$$\sum_{g=1}^{G} \lambda_g = 1$$
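The mixture density itself is easy to write down; a minimal sketch (function name is mine; components are given as (mean, variance) pairs):

```python
import math

def mixture_density(x, lambdas, thetas):
    """f(x | theta, lambda) = sum_g lambda_g * f_g(x | theta_g), f_g Gaussian.

    thetas: list of (mu_g, sigma_g^2) pairs; lambdas must sum to 1."""
    total = 0.0
    for lam, (mu, var) in zip(lambdas, thetas):
        total += lam * math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2.0 * math.pi * var)
    return total
```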

Quantization (III)

● Procedure:

● Set the number of groups G

● To estimate the parameters we use the expectation-maximization (EM) algorithm

● E step: the conditional expectation of γ is computed from the current θ and λ

● M step: update θ and λ given the current γ

● Repeat the E and M steps until there are no changes or the maximum number of iterations is reached
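A compact sketch of the E and M steps for a 1-D Gaussian mixture (my own implementation, not the authors' code; the initialization scheme and the variance floor are assumptions, the floor also being one simple way around the singularity problem noted in the To Do slide):

```python
import math

def em_gmm_1d(x, G, iters=50):
    """EM for a 1-D Gaussian mixture with G groups; returns (lambda, mu, var)."""
    n = len(x)
    # crude initialization: split the sorted data into G equal chunks
    xs = sorted(x)
    chunks = [xs[g * n // G:(g + 1) * n // G] for g in range(G)]
    mu = [sum(c) / len(c) for c in chunks]
    var = [max(sum((v - m) ** 2 for v in c) / len(c), 1e-6) for c, m in zip(chunks, mu)]
    lam = [1.0 / G] * G
    for _ in range(iters):
        # E step: gamma[i][g] = posterior probability that x[i] belongs to group g
        gamma = []
        for xi in x:
            w = [lam[g] * math.exp(-0.5 * (xi - mu[g]) ** 2 / var[g])
                 / math.sqrt(2.0 * math.pi * var[g]) for g in range(G)]
            z = sum(w)
            gamma.append([wg / z for wg in w])
        # M step: update lambda, mu, var given the current gamma
        for g in range(G):
            ng = sum(gamma[i][g] for i in range(n))
            lam[g] = ng / n
            mu[g] = sum(gamma[i][g] * x[i] for i in range(n)) / ng
            var[g] = max(sum(gamma[i][g] * (x[i] - mu[g]) ** 2 for i in range(n)) / ng, 1e-6)
    return lam, mu, var
```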

Quantization (IV)

● Choose the optimal number of clusters

● We consider a range of candidates: G = G_min, ..., G_max

● We use the Bayesian Information Criterion (BIC), as we did in segmentation:

$$2 \log p(x \mid M_G) \approx 2 \log p(x \mid \hat{\upsilon}_G, M_G) - d_G \log(N)$$

$$p(x \mid \hat{\upsilon}_G, M_G) = \prod_{i=1}^{N} \sum_{g=1}^{G} \hat{\lambda}_g\, f_g(x_i \mid \hat{\theta}_g)$$
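A sketch of the mixture BIC (my own code; for a 1-D mixture with G components, d_G = 3G - 1 free parameters is assumed: G means, G variances, and G - 1 independent weights):

```python
import math

def gmm_bic(x, lam, mu, var):
    """2 * log-likelihood of the fitted mixture minus d_G * log(N)."""
    G, n = len(lam), len(x)
    loglik = 0.0
    for xi in x:
        mix = sum(lam[g] * math.exp(-0.5 * (xi - mu[g]) ** 2 / var[g])
                  / math.sqrt(2.0 * math.pi * var[g]) for g in range(G))
        loglik += math.log(mix)
    d = 3 * G - 1
    return 2.0 * loglik - d * math.log(n)
```

On clearly bimodal data, a two-component fit should beat a single Gaussian despite its extra parameters.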

Quantization (V)

● BIC measures the balance between the improvement in the likelihood and the number of model parameters needed

● BIC in isolation is not meaningful; we must compare BICs between two competing candidates (the greater value wins)

Algorithm

1. For the first image band, calculate PLIC for a range of numbers of segments (for example, 1...40). Choose the number of segments with the greatest PLIC and assign each pixel to the segment to which it is most likely to belong.

2. For each segment, calculate BIC using the values in the second band and assign the pixels to clusters in the same way as in step 1.

3. For each remaining band, repeat step 2 over each subdivision, until no further subclustering is possible.
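The three steps can be sketched structurally as a recursive subdivision. Here `segment_band1` and `cluster_segment` are hypothetical callables standing in for the PLIC and BIC model-selection steps, each returning groups of pixel indices:

```python
def cluster_tree(bands, segment_band1, cluster_segment):
    """Step 1: segment band 1; steps 2-3: subcluster each leaf on the remaining bands.

    segment_band1(band) -> dict of segment_id -> list of pixel indices
    cluster_segment(band, pixels) -> list of pixel-index sublists (one per cluster)
    Returns the leaves of the cluster tree as lists of pixel indices."""
    leaves = list(segment_band1(bands[0]).values())
    for band in bands[1:]:
        next_leaves = []
        for leaf in leaves:
            subs = cluster_segment(band, leaf)
            # keep the leaf intact when no further subclustering is possible
            next_leaves.extend(subs if len(subs) > 1 else [leaf])
        leaves = next_leaves
    return leaves
```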

Discussions

● Band ordering

● The goodness of the algorithm is the tightness of the clusters: minimum sum of cluster variances over all bands

● Different bands have different cluster-ordering properties, so we must find the optimal band ordering

● Solution: check all orderings of the bands, similarly to the travelling salesman problem (TSP)

● Number of classes

● Small: high-level vision

● Large: high fidelity to the original

● We can choose the number of classes by pruning the tree

Experiments

● Executed over 3 medical images, obtained by transmission and by emission

● 1-band-only segmentation

Results

Pac1:

Nº Seg    PLIC
1         -90.682
2         -284.368
3         -231.031
4         -220.839
5         -94.645

[Bar chart: PLIC vs. number of segments for Pac1]

Results (II)

Pac2:

Nº Seg    PLIC
1         -61.491
2         -100.160
3         -126.329
4         -127.423

[Bar chart: PLIC vs. number of segments for Pac2]

Results (III)

Pac3:

Nº Seg    PLIC
1         -74.819
2         -109.217
3         -144.064
4         -140.063
5         -111.616
6         -113.208
7         -109.357
8         -116.972
9         -116.179
10        -116.315
11        -99.374
12        -112.539
13        NaN
14        -110.824
15        -111.035
16        -109.209

Results (IV)

[Bar chart: PLIC vs. number of segments for Pac3]

Results (V)

To Do

● Tuning the parameters to work better with medical images

● Resolving singularity problems

● Applying this method to multimodal images
