Description and Simulation of Visual Texture
Carstensen, Jens Michael
Publication date: 1992
Document Version: Publisher's PDF, also known as Version of Record
Citation (APA): Carstensen, J. M. (1992). Description and Simulation of Visual Texture.
DESCRIPTION AND SIMULATION OF VISUAL TEXTURE

By Jens Michael Carstensen

Lyngby 1992
Ph.D. Thesis No. 59
ISSN 0107-525X
© Copyright 1992 by Jens Michael Carstensen
Printed by , Technical University of Denmark
This document was formatted with LaTeX.

HIPS and HIPS-2 are trademarks of Sharp Image Software, New York, and The Turing Institute, Glasgow, UK.
CART is a trademark of California Statistical Software, Inc., Lafayette, California.
SAS is a registered trademark of SAS Institute Inc., Cary, North Carolina.
Connection Machine and C* are registered trademarks of Thinking Machines Corporation. CM-200 is a trademark of Thinking Machines Corporation.
HP-UX and HP Apollo 9000/750 are registered trademarks of Hewlett-Packard Company.
Sun-4 is a registered trademark of Sun Microsystems, Inc.
Some of the work in this thesis has previously been published in:

Carstensen, J. M. & Conradsen, K. (1992) Spin-flip alternatives to spin-exchange Markov random field simulation implemented on a SIMD massively parallel computer, submitted to IEEE Transactions on Pattern Analysis and Machine Intelligence.

Carstensen, J. M., Grunkin, M. & Conradsen, K. (1991) Measurement of enzymatic treatment effect on textile using digital image analysis, IMSOR technical report.
Preface

This thesis has been prepared at the Institute of Mathematical Statistics and Operations Research (IMSOR), Technical University of Denmark, in partial fulfillment of the requirements for the degree of Ph.D. in engineering.

The general framework of the thesis is statistics and digital image analysis. It is implied that the reader has a basic knowledge of these areas. The treatment of the subjects is by no means exhaustive, but is intended to improve the knowledge of texture description and texture simulation by going through selected theory and examples. Hopefully this can lead to an improved texture understanding.

Lyngby, April 1992
Jens Michael Carstensen
Acknowledgements

The author would like to thank Professor Knut Conradsen for guidance and encouragement during the course of this work and for providing excellent research facilities for the image group at IMSOR.

Professor Brian D. Ripley is thanked for inspiring discussions during my visit to the University of Oxford and for providing many useful references.

I wish to thank my colleagues at IMSOR for contributing to a pleasant and inspiring scientific and social environment. Especially I would like to thank my officemate Michael Grunkin for help and moral support and for coping with my frustrations in a pleasant and enjoyable way. Dr. Bjarne Kjær Ersbøll provided useful help and comments, for which I am thankful. Niels Christian Krieger Lassen, Niels Jacob Carstensen and Allan Aasbjerg Nielsen were very helpful in the last critical moments of the preparation of this thesis. I appreciate the help and continuous supply of LaTeX styles from Dr. Carl M. Bilbo. Dr. Niels Kjølstad Poulsen is acknowledged for his expert comments on control theory. I am also very grateful to the members of the image group, not mentioned above, for help and inspiration.
Image data was kindly provided by Novo Nordisk and Imperial Cancer Research Fund. I wish to thank Peter Rosholm of Novo Nordisk, and Drs. Richard Mott and Hans Lehrach of Imperial Cancer Research Fund, London, for their splendid collaboration.

Dr. Peter Frykman of the Geological Survey of Denmark is acknowledged for his collaboration on our attempt to make a reservoir simulation program that proves useful to geologists.

I am grateful to the Viggo Jarl foundation for supporting me financially during my studies.

This research was partially sponsored by the Danish Technical Research Council and the Danish Natural Science Research Council under the MOBS and CAP programs.
Summary

The problem of texture analysis is considered within the framework of digital image analysis. An extensive set of texture statistics is reviewed and explained, and their performance in measuring enzymatic treatment effect on textile and in classification of a more general set of textures is studied. We found that both problems were solved satisfactorily with the set of texture statistics used.

Markov random fields are reviewed and investigated as models of texture. Results from the field of statistical physics are reformulated in a statistical setting. Standard Markov random fields do not have the ability to model morphological properties of textures, and this leads us to formulate an extension in terms of mathematical morphology. The properties of morphological Markov random fields are illustrated. We go through the problem of Markov random field parameter estimation and suggest an extension of the asymptotic maximum likelihood estimator (Pickard, 1987) to the anisotropic first-order model.
Markov random field simulation is described and a new, fast, parallel algorithm for simulation conditional on the first-order statistics is presented. This algorithm and the morphological Markov random fields are then used for the simulation of the geometrical structure of oil reservoirs.

Markov random fields in a Bayesian setting are used successfully to analyze hybridization filters automatically for the human genome project.
Resumé

Texture analysis is considered within the framework of digital image analysis. An extensive set of statistical sample functions for texture description is reviewed and explained, and their ability to measure the effect of enzymatic treatment of textiles and to classify a more general set of textures is investigated. Both problems turned out to be solvable in a satisfactory way with the sample statistics used.

An overview of Markov random fields is given, and their applicability as texture models is explored. Results on these models from statistical physics are reformulated in statistical terminology. Ordinary Markov random fields cannot model the morphological properties of textures, and this leads us to formulate an extension using mathematical morphology. The properties of morphological Markov random fields are further illustrated. The problems associated with parameter estimation are discussed, and an extension of the asymptotic maximum likelihood estimator (Pickard, 1987) to the anisotropic case is proposed.
The theory of Markov random field simulation is reviewed, and a new, fast, parallel algorithm is presented for simulating Markov random fields given the marginal distribution of pixel values. This algorithm and the morphological Markov random fields are then applied to the simulation of the spatial structure of oil reservoirs.

Markov random fields are applied in a Bayesian setting to automate the analysis of hybridization filters within the human genome project.
Contents

Preface
Acknowledgements
Summary
Resumé (in Danish)

1 Introduction
  1.1 Texture
  1.2 Texture analysis
  1.3 Outline of the thesis

2 Texture statistics
  2.1 First-order gray level statistics
    2.1.1 Multi-resolution first-order statistics
    2.1.2 Histogram matching
  2.2 Second-order gray level statistics
    2.2.1 Gray level cooccurrence matrices
    2.2.2 Gray level difference histogram
    2.2.3 Gray level sum histogram
    2.2.4 Haralick features
    2.2.5 GLCM as a contingency table
    2.2.6 Multi-resolution GLCM
    2.2.7 GLCM performance
  2.3 Higher-order gray level statistics
    2.3.1 Gray level run length matrix
    2.3.2 Neighboring gray level dependence matrix
  2.4 Statistics for binary images
  2.5 Fourier features
  2.6 Measurement of enzymatic treatment effect on textile
    2.6.1 Background
    2.6.2 Image acquisition
    2.6.3 Description of visual properties
    2.6.4 Analysis in the Fourier domain
    2.6.5 Spatial domain features
    2.6.6 Conclusion
  2.7 GLCM feature performance
    2.7.1 Image material
    2.7.2 GLCM
    2.7.3 CART classification
    2.7.4 Classification summary
    2.7.5 Conclusion

3 Markov random fields
  3.1 Random fields
    3.1.1 2D grids
  3.2 Gibbs random fields
    3.2.1 Historical perspective
    3.2.2 General properties
  3.3 Markov random fields
  3.4 Binary Markov random fields
    3.4.1 Ising model revisited
    3.4.2 Morphological extension
  3.5 Potts models
    3.5.1 Phase transitions
    3.5.2 Morphological extension
    3.5.3 Other extensions
  3.6 Gaussian Markov random fields
    3.6.1 Alternative gray level distributions

4 Markov random field parameter estimation
  4.1 Introduction
  4.2 Coding estimation
  4.3 Pseudolikelihood estimation
  4.4 Binary MRF
    4.4.1 Maximum pseudolikelihood
    4.4.2 Asymptotic maximum likelihood
    4.4.3 Other estimation methods
  4.5 Potts model
    4.5.1 Maximum pseudolikelihood
  4.6 Gaussian MRF
    4.6.1 Maximum pseudolikelihood
    4.6.2 Maximum likelihood

5 Markov random field simulation
  5.1 Introduction
  5.2 Iterative simulation
    5.2.1 The Metropolis algorithm
    5.2.2 Spin-flip algorithms
    5.2.3 The Metropolis spin-exchange algorithm
    5.2.4 Swendsen-Wang algorithm
  5.3 The α-controlled spin-flip algorithm
    5.3.1 Introduction
    5.3.2 The feedback loop
    5.3.3 Relation to importance sampling
    5.3.4 Parallel implementation
    5.3.5 Results
    5.3.6 Conclusion
  5.4 Simulation of geological structures
    5.4.1 Introduction
    5.4.2 Model types
    5.4.3 A Markov random field reservoir model
    5.4.4 Simulation results
    5.4.5 Conclusion

6 Bayesian paradigm
  6.1 Introduction
  6.2 Prior distribution
  6.3 Observation model
  6.4 Maximum a posteriori (MAP) estimates
    6.4.1 Simulated annealing
    6.4.2 Iterated conditional modes (ICM)
  6.5 Marginal posterior modes (MPM)
  6.6 Hybridization filter analysis
    6.6.1 Background
    6.6.2 Robot dynamics
    6.6.3 Image analysis problem
    6.6.4 Digitization
    6.6.5 Preprocessing
    6.6.6 Spot localization
    6.6.7 Spot classification
    6.6.8 Results
    6.6.9 Conclusion

7 Conclusion
  7.1 Summary
  7.2 A comment

A Developed software
B GLCM for all Brodatz textures
References
Index
Chapter 1
Introduction

1.1 Texture
The term visual texture in the title of this thesis emphasizes that the definition of texture used here is closely related to perception.

A texture is a region in 2D or 3D that can be perceived as being spatially homogeneous in some sense.

This definition is very broad. It includes as texture the totally uniform region, which in daily language is said to have no texture. Indeed the interesting thing about textures is the study of the spatial variations over the textured region, and these variations often become synonymous with
texture. We emphasize that textures only differing in luminance are considered different textures according to our definition. Brodatz (1966) is a photographic album with 112 textures. This album has become a standard reference in texture analysis, and subsequently these textures shall be referred to as the Brodatz textures.
Figure 1.1 shows a strictly random texture. The pixels are uncorrelated. Figure 1.2 shows a strictly deterministic texture (a checkerboard). It is a strictly ordered pattern that is fully determined from the knowledge of a small subpattern. Observable textures are somewhere between these two extremes. Figures 1.3 and 1.4 show examples of a random texture (handmade paper) and a deterministic texture (a brick wall). The word texture comes from the Latin word textura, which means textile fabric, and textile fabric is another example of a deterministic texture. Rao (1990) classifies all of the Brodatz textures into three classes: disordered (random), weakly ordered, and strongly ordered (deterministic).

The question of scale or resolution is fundamental to texture perception. If we zoom in on the brick wall of figure 1.4 we see the texture of the individual bricks. If we zoom out we may see a texture of wall shading or a texture of wall and windows. Thus there may be several levels of completely different textures in the same image but at different scales. A texture with more than one texture level is called a hierarchical texture. To distinguish between different texture levels we can use the terms microtexture and macrotexture. Subsequently the terms scale and resolution will be used interchangeably.

Researchers in texture perception have investigated preattentive (effortless or instantaneous) texture discrimination in the human visual system. The famous iso-second-order conjecture (Julesz, 1975) stated that textures with
Figure 1.1. A strictly random texture.
Figure 1.2. A strictly deterministic texture.
Figure 1.3. Handmade paper (D57 from Brodatz).
Figure 1.4. Brick wall (D95 from Brodatz).
the same second-order statistics (gray level statistics of pairs of pixels) cannot be distinguished even if they have different third- or higher-order statistics. This conjecture has later been disproved (Julesz, 1981) and replaced by a texton theory (Julesz & Bergen, 1983). Textons are small conspicuous features like

- Elongated shapes, such as ellipses, rectangles or line segments.
- Ends of line segments.
- Crossings of line segments.

The texton conjecture argues that preattentive texture discrimination is based on differences in the density of textons.
1.2 Texture analysis
The main goal of texture analysis is to extract useful textural information from an image. Historically there have been two major approaches, a structural and a statistical. The structural approach describes a texture by a subpattern or primitive and the spatial distribution of primitives, the placement rule. The primitives are also called texture elements. If we consider the brick wall, the primitive is a brick and the placement rule specifies the arrangement of bricks in the wall. The statistical approach is more generally applicable, because it does not presume that the texture can be described in terms of primitives and placement rules. It draws on the general set of statistical tools. This thesis is primarily based on the statistical approach.
The extraction of texture features is essential to applications such as texture measurement, texture summarization, texture classification and texture segmentation (texture description denotes all of these areas). The goal of texture measurement is to characterize a texture with one feature, e.g. a feature for textile wear assessment. In texture summarization we give summaries reflecting the visual properties of textures. Texture classification usually serves one of two goals. We may want to assign a class to an entire texture, e.g. accept or reject in industrial quality control. We may also want to assign a texture class to every pixel in an image and thus obtain a partitioning of this image. Texture segmentation corresponds to pixelwise texture classification with no a priori knowledge of the number of texture components or the properties of each texture component.

For general reviews on texture analysis the reader is referred to Haralick (1979), van Gool, Dewaele, & Oosterlinck (1985), Tomita & Tsuji (1990), and Rao (1990).
1.3 Outline of the thesis
Chapter 2 gives an overview of texture statistics used in texture analysis. This overview is followed by two case studies that evaluate the performance of these statistics in accurately measuring textural properties. First we want to measure the textural changes that cotton textiles undergo during cellulase enzymatic treatment. Then we use second-order statistics for the classification of Brodatz textures.
Chapter 3 deals with parametric description of texture based on a class of models called Markov random fields. The theory of Markov random fields is reviewed together with the theory of the associated Gibbs random fields. The theory of Gibbs random fields was founded in statistical physics (Ising, 1925), and some relevant results from this area are presented in a new statistical setting. A variety of Markov random fields is reviewed with an emphasis on discrete models. Further we introduce a set of morphological Markov random fields that extends the standard set of models by using the operators of mathematical morphology (Serra, 1982).

For most practical applications of Markov random fields it is essential that we have accurate and feasible algorithms for parameter estimation. In chapter 4 a selection of estimation methods is reviewed, and some of these methods are applied in chapter 5. An extension of the asymptotic maximum likelihood estimator (Pickard, 1987) to the anisotropic case is proposed in section 4.4.2.

In chapter 5 we review a set of iterative simulation schemes for Markov random field simulation. We then present a fast new parallel algorithm for simulating Markov random fields conditional on given first-order statistics. We investigate the use of this algorithm and a morphological Potts model in the simulation of geological structures.

The Bayesian paradigm is a framework for incorporating stochastic models of visual phenomena into a very general set of tasks from image processing and image analysis. In chapter 6 we give a short review of Bayesian image analysis and present an application that makes successful use of Markov random fields, the Metropolis algorithm and simulated annealing in a Bayesian framework.
Chapter 2
Texture statistics
Texture statistics are frequently classified into first-order, second-order and higher-order statistics. First-order statistics refer to the marginal gray level distribution. Second-order statistics refer to the joint gray level distribution of pairs of pixels, and higher-order statistics refer to the joint gray level distribution of three or more pixels.

This chapter gives an overview of texture statistics used in texture analysis. This overview is followed by two case studies that evaluate the performance of these statistics in accurately measuring textural properties. First we want to measure the textural changes that cotton textiles undergo during cellulase enzymatic treatment. Then we use second-order statistics for the classification of Brodatz textures.
2.1 First-order gray level statistics
The first-order gray level statistics can be derived from the gray level histogram $\{h_i\}$, where $h_i$ is the number of pixels in an image with gray level $i$. If $N$ is the total number of pixels and $G$ is the number of gray levels, then $\sum_{i=0}^{G-1} h_i = N$. The normalized histogram $\{H_i\}$ with $H_i = h_i/N$ is the empirical probability density function for single pixels. Statistics computed from $H_i$ include:
1. The mean gray level
$$\mu = \sum_{i=0}^{G-1} i H_i$$
$\mu$ measures the average intensity in the image.

2. The gray level variance
$$\sigma^2 = \sum_{i=0}^{G-1} (i-\mu)^2 H_i$$
where $\sigma$ is the standard deviation. The variance and the standard deviation measure the global contrast in the image.

3. The coefficient of variation
$$cv = \frac{\sigma}{\mu}$$
The coefficient of variation is invariant under a change of scale, $i' = Ai$; thus if the intensity scale has a natural zero, the $cv$ will be a scale-invariant measure of global contrast.

4. The gray level skewness
$$\gamma_1 = \frac{1}{\sigma^3} \sum_{i=0}^{G-1} (i-\mu)^3 H_i$$
Skewness measures the extent to which outliers favor one side of the distribution. Skewness is invariant under a linear gray scale transformation $i' = Ai + B$.

5. The gray level kurtosis
$$\gamma_2 = \frac{1}{\sigma^4} \sum_{i=0}^{G-1} (i-\mu)^4 H_i - 3$$
Kurtosis measures the peakedness or tail prominence of the distribution. It is 0.0 for the Gaussian distribution. Kurtosis is invariant under a linear gray scale transformation $i' = Ai + B$.

6. The gray level energy
$$e = \sum_{i=0}^{G-1} H_i^2$$
where $G^{-1} \le e \le 1$. Energy measures the nonuniformity of the histogram.

7. The gray level entropy
$$s = -\sum_{i=0}^{G-1} H_i \log H_i$$
where $0 \le s \le \log G$. Entropy measures the uniformity of the histogram. This quantity is widely used in image compression. If the logarithm is of base 2, it is the lower bound on the average length of the binary code words used in error-free compression of independent data samples.
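The seven statistics above can be computed directly from a histogram. The following minimal sketch (the function name and dictionary keys are ours, not notation from the thesis) uses natural logarithms for the entropy:

```python
import math

def first_order_stats(h):
    """First-order gray level statistics from a gray level histogram h_i."""
    N = sum(h)
    H = [hi / N for hi in h]              # normalized histogram H_i = h_i / N
    mu = sum(i * Hi for i, Hi in enumerate(H))
    var = sum((i - mu) ** 2 * Hi for i, Hi in enumerate(H))
    sigma = math.sqrt(var)
    return {
        "mean": mu,
        "variance": var,
        "cv": sigma / mu,                 # coefficient of variation
        "skewness": sum((i - mu) ** 3 * Hi for i, Hi in enumerate(H)) / sigma ** 3,
        "kurtosis": sum((i - mu) ** 4 * Hi for i, Hi in enumerate(H)) / sigma ** 4 - 3.0,
        "energy": sum(Hi ** 2 for Hi in H),
        "entropy": -sum(Hi * math.log(Hi) for Hi in H if Hi > 0),
    }
```

For a uniform histogram over $G$ levels this yields energy $1/G$ and entropy $\log G$, the extremes stated above.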
[Figure 2.1 plot: normalized variance (y axis, 0 to 1) versus scale (x axis, 0 to 5), one curve per texture: D43, D16, D29, D53, D77.]
Figure 2.1. Normalized variance versus scale for five Brodatz textures.
2.1.1 Multi-resolution first-order statistics
First-order statistics computed at several different scales (resolutions) will provide us with information about second- and higher-order statistics. As an example we have taken five Brodatz textures, and successively lowpass filtered and subsampled them five times. We have computed the variance of each image and then divided by the variance of the full resolution image. In figure 2.1 we see that the result is five curves that can be distinguished. Thus multi-resolution first-order statistics contain important textural information.
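The procedure above can be sketched as follows. The 2x2 box average used as the lowpass stage is our simplification; the thesis does not specify the filter used in the experiment:

```python
def box_downsample(img):
    """Average each 2x2 block: a crude lowpass filter plus 2x subsampling."""
    rows, cols = len(img) // 2 * 2, len(img[0]) // 2 * 2
    return [[(img[r][c] + img[r][c + 1] + img[r + 1][c] + img[r + 1][c + 1]) / 4.0
             for c in range(0, cols, 2)] for r in range(0, rows, 2)]

def variance(img):
    vals = [v for row in img for v in row]
    mu = sum(vals) / len(vals)
    return sum((v - mu) ** 2 for v in vals) / len(vals)

def normalized_variance_curve(img, levels=5):
    """Variance at successive scales, divided by the full-resolution variance."""
    v0 = variance(img)
    curve = [1.0]
    for _ in range(levels):
        img = box_downsample(img)
        curve.append(variance(img) / v0)
    return curve
```

For uncorrelated pixels (the strictly random texture of figure 1.1) each 2x2 average divides the variance by 4, so the curve falls roughly as $4^{-k}$; correlated textures fall more slowly, which is what separates the five curves in figure 2.1.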
2.1.2 Histogram matching
The first-order statistics are highly dependent on the lighting conditions. It is therefore common practice to try to eliminate the influence of first-order statistics in texture analysis by making the gray level histogram match a specific distribution. A match to a uniform distribution is called histogram equalization, and this is by far the most used match. A match to a Gaussian distribution is another possibility, and in section 2.7 we see that this is a more gentle match, especially for stochastic textures.
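Histogram equalization can be sketched via the scaled empirical distribution function; this is the standard construction, not an implementation from the thesis:

```python
def histogram_equalize(img, G=256):
    """Map gray levels through the empirical CDF so that the output
    histogram is approximately uniform over G levels."""
    pixels = [v for row in img for v in row]
    hist = [0] * G
    for v in pixels:
        hist[v] += 1
    lut, cum = [0] * G, 0
    for i in range(G):
        cum += hist[i]
        lut[i] = round((G - 1) * cum / len(pixels))   # scaled empirical CDF
    return [[lut[v] for v in row] for row in img]
```

Matching to a Gaussian instead replaces the uniform target by the Gaussian quantile function applied to the same empirical CDF.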
2.2 Second-order gray level statistics
The autocorrelation (or the closely related variogram) is probably the best known second-order gray level statistic. We will, however, consider the second-order gray level statistics in a more general setting: gray level cooccurrence matrices (GLCM). The autocorrelation can be computed from these cooccurrence matrices.
2.2.1 Gray level cooccurrence matrices
The gray level cooccurrence matrices are a full representation of the second-order gray level statistics. A GLCM, $c$, is defined with respect to a given (row, column) displacement $h$, and element $(i,j)$, denoted $c_{ij}$, is the number of times a point having gray level $j$ occurs in position $h$ relative to a point having gray level $i$. Let $N_h$ be the total number of pairs; then $C_{ij} = c_{ij}/N_h$ denotes the elements of the normalized GLCM, $C$.
The meaning of the above definition gets more apparent if we as an example compute $c$ from the 4-color image

    2 1 1 3 3
    3 1 0 2 2
    3 3 1 1 2
    0 2 1 0 1
    0 0 1 0 0

If $h = (0,1)$, i.e. one step in the horizontal direction, then $c$ will be

         j=0  1  2  3
    i=0    2  2  2  0
    i=1    3  2  1  1
    i=2    0  2  1  0
    i=3    0  2  0  2

and $N_h$ will be equal to 20.
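This counting can be reproduced mechanically. A minimal sketch (the helper name `glcm` is ours, not notation from the thesis):

```python
def glcm(img, h, G):
    """Cooccurrence counts c_ij for a (row, column) displacement h,
    ignoring pairs that fall outside the image."""
    dr, dc = h
    rows, cols = len(img), len(img[0])
    c = [[0] * G for _ in range(G)]
    for r in range(rows):
        for s in range(cols):
            r2, s2 = r + dr, s + dc
            if 0 <= r2 < rows and 0 <= s2 < cols:
                c[img[r][s]][img[r2][s2]] += 1
    return c

img = [[2, 1, 1, 3, 3],
       [3, 1, 0, 2, 2],
       [3, 3, 1, 1, 2],
       [0, 2, 1, 0, 1],
       [0, 0, 1, 0, 0]]
c = glcm(img, (0, 1), G=4)
```

Summing all cells of `c` gives $N_h = 20$, and computing the matrix for the opposite displacement yields the transpose, since every pair is counted once in each direction.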
It is easily seen that
$$C(-h) = C^T(h)$$
where $C^T$ is $C$ transposed.

A symmetric GLCM, $c_s(h)$, can be obtained by pooling the frequencies of $c(h)$ and $c(-h)$. Hence
$$c_s(h) = c(h) + c^T(h)$$
and
$$C_s(h) = \frac{1}{2}\left[C(h) + C^T(h)\right]$$

Assuming isotropy (no directionality) we can pool the frequencies of cooccurrence matrices with displacements $h$ of different angles and approximately the same length $h$. This provides us with the isotropic GLCM, $c_i(h)$, where
$$c_i(1) = c_s(0,1) + c_s(1,0) + c_s(1,1) + c_s(-1,1)$$
and
$$C_i(1) = \frac{1}{4}\left[C_s(0,1) + C_s(1,0) + C_s(1,1) + C_s(-1,1)\right]$$
One of the main problems associated with the use of cooccurrence matrices is that they have to be computed for many different values of $h$, thus providing us with an immense amount of data. Data reduction can be accomplished by pooling the matrices as shown above, by reducing the number of gray levels or by computing texture features from each matrix. These features can then be used for description and classification of textures.
Let
$$C_{x_i} = \sum_{j=0}^{G-1} C_{ij} \qquad\text{and}\qquad C_{y_j} = \sum_{i=0}^{G-1} C_{ij}$$
and let $\mu_x$, $\mu_y$, $\sigma_x$ and $\sigma_y$ be the means and standard deviations of $C_{x_i}$ and $C_{y_j}$ over $i$ and $j$. Then a number of features can be computed from the GLCM including:
1. Energy or Angular Second Moment
$$E = \sum_{i=0}^{G-1}\sum_{j=0}^{G-1} C_{ij}^2$$
where $G^{-2} \le E \le 1$. $E$ takes the value $G^{-2}$ for a uniform distribution over $C$, and the value 1 iff only one cell is nonzero.

2. Entropy
$$S = -\sum_{i=0}^{G-1}\sum_{j=0}^{G-1} C_{ij} \log C_{ij}$$
where $0 \le S \le \log G^2$. $S$ takes the value $\log G^2$ for a uniform distribution over $C$, and the value 0 iff only one cell is nonzero.

3. Maximum Probability
$$M = \max_{i,j} C_{ij}$$
where $G^{-2} \le M \le 1$. $M$ takes the value $G^{-2}$ for a uniform distribution over $C$, and the value 1 iff only one cell is nonzero.

4. Correlation (or Autocorrelation)
$$\rho = \sum_{i=0}^{G-1}\sum_{j=0}^{G-1} \frac{(i-\mu_x)(j-\mu_y)\, C_{ij}}{\sigma_x \sigma_y}$$
where $-1 \le \rho \le 1$. $\rho$ takes the value 1 iff only values on the main diagonal of $C$ are nonzero and the value 0 iff the gray values are uncorrelated.

5. Diagonal Moment
$$D = \sum_{i=0}^{G-1}\sum_{j=0}^{G-1} |i-j|\,(i+j-\mu_x-\mu_y)\, C_{ij}$$
The diagonal moment basically measures the difference in correlation for high gray levels and for low gray levels. It is mentioned in Laws (1980), but has otherwise been left out in most studies of GLCM.
6. Informational Coefficient of Correlation
$$r_1 = \sqrt{1 - e^{-2 r_0}}$$
where
$$r_0 = -\sum_{i=0}^{G-1}\sum_{j=0}^{G-1} C_{x_i} C_{y_j} \log(C_{x_i} C_{y_j}) - S$$
is the Logarithmic Index of Correlation (Linfoot, 1957). Since $r_0 \ge 0$ we have $0 \le r_1 < 1$.

7. Maximal Correlation Coefficient. This feature is the square root of the second largest eigenvalue of $Q$ where
$$Q_{ij} = \sum_k \frac{C_{ik} C_{jk}}{C_{x_i} C_{y_k}}$$
Let $R$ and $S$ be equal to $C$ with respectively row sums and column sums normalized to unity, i.e.
$$R_{ij} = \frac{C_{ij}}{C_{x_i}} \qquad S_{ij} = \frac{C_{ij}}{C_{y_j}}$$
then
$$Q = R S^T$$
$R$, $S^T$ and $Q$ are stochastic matrices, i.e. their largest eigenvalue is 1. If they are considered as transition matrices for a Markov chain and if they are irreducible, then the histogram vector $p$ will be the unique invariant distribution for the Markov chain. The rate of convergence to the invariant distribution is determined by the second-largest eigenvalue, $\lambda_2$, where $0 \le \lambda_2 < 1$ (Seneta, 1981). $Q$ is the transition matrix for one jump with displacement vector $h$ and back again. If we start at one pixel with an initial gray level distribution and then make successive jumps back and forth, the gray level distribution will approach the invariant gray level distribution. The memory of the gray level distribution retained in each jump back and forth is determined by the second-largest eigenvalue, $\lambda_2$, of $Q$. If the pixels on each side of a jump are independent, we have $\lambda_2 = 0$.
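The first five features can be sketched directly from their definitions; a minimal version assuming a normalized GLCM $C$ as input (function name and keys are ours):

```python
import math

def glcm_features(C):
    """Energy, entropy, maximum probability, correlation and diagonal
    moment of a normalized GLCM C (rows i, columns j)."""
    G = len(C)
    Cx = [sum(C[i]) for i in range(G)]                          # row marginal
    Cy = [sum(C[i][j] for i in range(G)) for j in range(G)]     # column marginal
    mx = sum(i * Cx[i] for i in range(G))
    my = sum(j * Cy[j] for j in range(G))
    sx = math.sqrt(sum((i - mx) ** 2 * Cx[i] for i in range(G)))
    sy = math.sqrt(sum((j - my) ** 2 * Cy[j] for j in range(G)))
    cells = [(i, j, C[i][j]) for i in range(G) for j in range(G)]
    return {
        "energy": sum(v * v for _, _, v in cells),
        "entropy": -sum(v * math.log(v) for _, _, v in cells if v > 0),
        "max_prob": max(v for _, _, v in cells),
        "correlation": sum((i - mx) * (j - my) * v for i, j, v in cells) / (sx * sy),
        "diag_moment": sum(abs(i - j) * (i + j - mx - my) * v for i, j, v in cells),
    }
```

As a check of the stated extremes, a purely diagonal $C$ gives correlation 1 and diagonal moment 0, while the uniform matrix gives correlation 0, energy $G^{-2}$ and entropy $\log G^2$.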
Energy, entropy and maximum probability are uniformity measures. They all have one extremum for the uniform distribution and another extremum when one probability equals unity. The difference between these measures is demonstrated for two distributions with 4 possible outcomes.

    p1    p2    p3    p4    Energy    Entropy
    0.50  0.50  0.00  0.00  0.50 (1)  0.69 (2)
    0.76  0.08  0.08  0.08  0.60 (2)  0.81 (1)

The uniformity rankings are shown in parentheses. The energy measure assumes the first distribution to be the most uniform of the two, while the entropy measure chooses the second. We see that, when measuring uniformity, energy penalizes single high probabilities, while entropy penalizes zero probabilities. If we increase the zero probabilities of the first distribution a little, then entropy will reverse the ranking and make this the most uniform distribution according to both measures.

    p1    p2    p3    p4    Energy    Entropy
    0.48  0.48  0.02  0.02  0.46 (1)  0.86 (1)
Maximum probability measures uniformity solely on the basis of the highest probability, and the ranking by this measure will often agree with that based on energy.
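The table values can be checked directly with natural logarithms; a two-line sketch (our helper names):

```python
import math

def energy(p):
    """Energy of a discrete distribution: sum of squared probabilities."""
    return sum(pi * pi for pi in p)

def entropy(p):
    """Entropy with natural logarithms, skipping zero probabilities."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

For example, `energy([0.5, 0.5, 0.0, 0.0])` gives 0.50 and `entropy([0.76, 0.08, 0.08, 0.08])` gives about 0.81, matching the tables above.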
The cooccurrence matrix itself can also be used as a feature (Vickers & Modestino, 1982; Parkkinen & Oja, 1986).

The use of GLCM in texture analysis is sometimes referred to as the spatial gray level dependence method (SGLDM).
2.2.2 Gray level difference histogram
The gray level difference histogram (GLDH) is a histogram of the absolute differences of gray levels from pairs of pixels. It is computed from the GLCM by summing the two-dimensional density $C_{ij}$ over constant values of $|i-j|$. The GLDH can be regarded as a histogram of the "distance" to the main diagonal in the GLCM:
$$D_k = \sum_{\substack{i,j \\ |i-j|=k}} C_{ij}, \qquad k = 0,\ldots,G-1$$
The features computed from the GLDH include:

1. Difference Energy
$$DE = \sum_{k=0}^{G-1} D_k^2$$
where $G^{-1} \le DE \le 1$.

2. Difference Entropy
$$DS = -\sum_{k=0}^{G-1} D_k \log D_k$$
where $0 \le DS \le \log G$.

3. Inertia, Contrast or Variogram
$$I = \sum_{k=0}^{G-1} k^2 D_k = 2\sigma^2(1-\rho) \qquad (2.1)$$
where $\sigma^2$ is the gray level variance and $\rho$ is the correlation.

4. Inverse Difference Moment or Local Homogeneity
$$IDM = \sum_{k=0}^{G-1} \frac{D_k}{1+k^2}$$

5. Difference Variance
$$DV = I - \left(\sum_{k=0}^{G-1} k D_k\right)^2$$
GLDH features are a subset of GLCM features, and this relation will subsequently be implicit. The only useful way of comparing the two sets of features is to determine the loss of information when going from GLCM to GLDH. Feature computation from the GLDH is often called the gray level difference method (GLDM). The advantage of GLDM is the lower storage requirements and lower computational complexity.
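The reduction from GLCM to GLDH follows directly from the definition; a minimal sketch (our helper names), together with the inertia feature, which by construction equals $\sum_{ij}(i-j)^2 C_{ij}$:

```python
def gldh(C):
    """Gray level difference histogram D_k: sum of C_ij over |i - j| = k."""
    G = len(C)
    D = [0.0] * G
    for i in range(G):
        for j in range(G):
            D[abs(i - j)] += C[i][j]
    return D

def inertia(D):
    """Inertia / contrast / variogram feature computed from the GLDH."""
    return sum(k * k * Dk for k, Dk in enumerate(D))
```

Since $D_k$ only pools cells of $C$, it sums to 1 for any normalized GLCM, and the inertia computed from the GLDH agrees exactly with the double sum over the GLCM.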
2.2.3 Gray level sum histogram
The gray level sum histogram (GLSH) is a histogram of the sum of pairs of pixels. It is computed from the GLCM by summing the two-dimensional density $C_{ij}$ over constant values of $(i+j)$, i.e.
$$S_k = \sum_{\substack{i,j \\ i+j=k}} C_{ij}, \qquad k = 0,\ldots,2G-2$$
We will use the Sum Average below:
$$SA = \sum_{k=0}^{2G-2} k S_k = \mu_x + \mu_y$$
The features computed from the GLSH include:

1. Sum Energy
$$SE = \sum_{k=0}^{2G-2} S_k^2$$
where $(2G-1)^{-1} \le SE \le 1$.

2. Sum Entropy
$$SS = -\sum_{k=0}^{2G-2} S_k \log S_k$$
where $0 \le SS \le \log(2G-1)$.

3. Sum Variance
$$SV = \sum_{k=0}^{2G-2} (k-SA)^2 S_k = 2\sigma^2(1+\rho)$$
where $\sigma^2$ is the gray level variance and $\rho$ is the correlation.

4. Cluster Shade
$$A = \sum_{k=0}^{2G-2} (k-SA)^3 S_k$$

5. Cluster Prominence
$$B = \sum_{k=0}^{2G-2} (k-SA)^4 S_k$$
GLSH features have not been used as widely as features based on the GLDH. Conners, Trivedi, & Harlow (1984) found that cluster shade and cluster prominence were a useful supplement to the GLCM and GLDH features mentioned above. Like the GLDH, the GLSH has lower storage requirements and lower computational complexity, but no authors have to our knowledge tried the GLSH features by themselves.
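The GLSH and the sum average identity $SA = \mu_x + \mu_y$ can be sketched and verified the same way as the GLDH (our helper names):

```python
def glsh(C):
    """Gray level sum histogram S_k: sum of C_ij over i + j = k,
    k = 0, ..., 2G-2."""
    G = len(C)
    S = [0.0] * (2 * G - 1)
    for i in range(G):
        for j in range(G):
            S[i + j] += C[i][j]
    return S

def sum_average(S):
    """Sum Average SA computed from the GLSH."""
    return sum(k * Sk for k, Sk in enumerate(S))
```

Because $\sum_k k S_k = \sum_{ij} (i+j) C_{ij}$, the sum average equals the sum of the two marginal means exactly, for any normalized GLCM.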
It is obvious that the GLDH and GLSH contain a lot of the information from the cooccurrence matrices. Due to the lower computational complexity it is, as mentioned for the GLDH, relevant to investigate if any significant information is lost when going from the GLCM to the GLDH and GLSH. The structure of the GLCM is diagonal and often at least approximately symmetric. Unser (1986b) approximated the energy and entropy features of the GLCM from the GLDH and the GLSH. This approximation gave only a slight decrease in classification accuracy. Unser (1986b) noted that the sum and difference of pairs of pixels are decorrelated, but this is not generally true for the sum and the absolute difference. The diagonal moment measures this correlation, and it varies from texture to texture (see section 2.7). Unser (1986b) also used the GLDH and GLSH themselves as features.
2.2.4 Haralick features
Most of the features mentioned in this section were introduced into texture analysis in a paper by Haralick, Shanmugam, & Dinstein (1973), where 14 different features (f1–f14) were presented. They are all natural descriptors of two-dimensional distributions, although they seem to have been selected in a rather ad hoc manner. Even though it is recognized that these features do not describe all aspects of the cooccurrence matrices, they have been used very rigorously in many papers. Three of the features were not included in the list of GLCM features.

- Variance (f4). The gray level variance belongs to the first-order statistics.

- Sum Average (f6)
$$f_6 = SA = \sum_{k=0}^{2G-2} k S_k = \mu_x + \mu_y$$
This feature also belongs to the first-order statistics.

- Information Measures of Correlation (f12 and f13).
$$f_{12} = \frac{HXY - HXY1}{\max(HX, HY)}$$
$$f_{13} = \sqrt{1 - \exp(-2(HXY2 - HXY))}$$
where $HXY = S$ and
$$HXY1 = -\sum_{i=0}^{G-1}\sum_{j=0}^{G-1} C_{ij} \log(C_{x_i} C_{y_j})$$
$$HXY2 = -\sum_{i=0}^{G-1}\sum_{j=0}^{G-1} C_{x_i} C_{y_j} \log(C_{x_i} C_{y_j})$$
As mentioned in Linfoot (1957), $HXY1 = HXY2$. $f_{12}$ and $f_{13}$ are thus closely related, and only $f_{13}$, the informational coefficient of correlation, is considered.
2.2.5 GLCM as a contingency table
Zucker & Terzopoulos (1980) interpreted the cooccurrence matrix $C$ as a normalized contingency table (see e.g. Bishop, Fienberg, & Holland (1975)) and used the $\chi^2$ statistic to select matrices suitable for classification.
$$\chi^2 = N_h \sum_{i=0}^{G-1}\sum_{j=0}^{G-1} \frac{(C_{ij} - C_{x_i}C_{y_j})^2}{C_{x_i}C_{y_j}} = N_h\left(\sum_{i=0}^{G-1}\sum_{j=0}^{G-1} \frac{C_{ij}^2}{C_{x_i}C_{y_j}} - 1\right)$$
The $\chi^2$ values and the selected displacements $h$ can be used as features for classification.

The $\chi^2$ statistic measures the association between variables in contingency tables, but does not discriminate among the types of association. Figueiras-Vidal, Paez-Borrallo, & Garcia-Gomez (1987) pointed out that periodicity is indicated in a cooccurrence matrix by a concentration of high counts around the main diagonal. They suggested the inertia measure (2.1) to detect periodicities.
The κ measure of agreement (Cohen, 1960)

\kappa = \frac{\sum_{i=0}^{G-1} C_{ii} - \sum_{i=0}^{G-1} C_{xi} C_{yi}}{1 - \sum_{i=0}^{G-1} C_{xi} C_{yi}}

was suggested by Parkkinen, Selkäinaho, & Oja (1990) to detect periodicities. It directly measures the concentration on the main diagonal, and the
computational complexity is O(G) instead of O(G²) for the χ² statistic. The κ statistic works best with a limited number of gray levels, e.g. from 4 to 32, and, as we will see in section 2.4, it corresponds to the correlation measure for binary textures.
Many other features can be used to select the displacement(s) that give the best classification.
2.2.6 Multi-resolution GLCM
Weszka, Dyer, & Rosenfeld (1976) concluded that large-distance cooccurrence features gave better performance if a spatial averaging was done first. This suggests that cooccurrence matrices at several different scales should be considered.
2.2.7 GLCM performance
GLCM features have an extensive history as a reference for texture feature performance. We shall give a brief summary.
Haralick et al. (1973) used GLCM features to classify photomicrographs of sandstone, panchromatic aerial photographs and multispectral satellite imagery. They found that textural features are a valuable supplement to spectral features.
Weszka et al. (1976) compared the classification performance on aerial photographs and LANDSAT imagery of GLCM features, GLDH features, ring and wedge features in the spatial frequency domain, and gray level run length features. GLCM and GLDH features were found to be the most useful and of almost equal performance.
Conners & Harlow (1980) made a theoretical comparison of the same groups of features, and the results agree very well with those of Weszka et al. (1976).
Many authors have since then introduced new textural features and claimed these to be superior to the GLCM features.
Laws (1980) claimed that his texture energy features performed significantly better than GLCM features in segmentation of a composite of eight Brodatz textures.
Kashyap, Chellappa, & Khotanzad (1982) used the maximum likelihood estimate of the parameters in a Simultaneous Autoregressive Model (SAR) as features for classification. The result is comparable to that of GLCM features.
Vickers & Modestino (1982) used an isotropic cooccurrence matrix to classify subimages of 9 Brodatz textures. For distances of 1, 3 and 5 they obtained between 95% and 98% correctly classified. Parkkinen & Oja (1986) used cooccurrence matrices with a horizontal displacement.
Siew, Hodgson, & Wood (1988) used GLCM, GLDH, gray level run length and neighboring gray level dependence features to measure carpet wear. Their distinction between GLCM and GLDH features is noninformative since 2 features are common to both groups, and they use a standard GLCM and an isotropic GLDH. The result shows that features from all four groups can characterize the appearance of carpets. Their research indicates that
these statistical measures are superior to a trained panel in reliably ranking carpets according to wear.
du Buf, Kardan, & Spann (1990) compared 7 sets of features and found that GLCM, Laws (Laws, 1980) and Unser (Unser, 1986a) features were generally best.
Berry & Goutsias (1991) made a comparison between features based on the neighboring gray level dependence matrix (NGLDM) of Sun & Wee (1983) and GLCM features. On synthetic textures NGLDM features performed better. On natural textures they performed equally well.
2.3 Higher-order gray level statistics
Higher-order gray level statistics were declared unimportant for texture perception by the now disproved iso-second-order conjecture (Julesz, 1975), but they seem to have regained their popularity in the literature. We will review a few approaches.
2.3.1 Gray level run length matrix
A gray level run is a set of consecutive, collinear pixels with the same gray level. The number of pixels in a run is the run length. Galloway (1975) used a gray level run length matrix (GLRLM) to compute texture features. Element (i,j) of the GLRLM, r, is denoted r_{ij}, and this is the number of runs of gray level i having length j. The total number of runs is N_r. The GLRLM can be computed in any direction, but usually only the directions 0°, 45°, 90° and 135° are used. From the image
    2 1 1 3 3
    3 1 0 2 2
    3 3 1 1 2
    0 2 1 0 1
    0 0 1 0 0

we can compute r for the horizontal direction (0°):
    GLRLM   1  2  3  4  5
       0    3  2  0  0  0
       1    4  2  0  0  0
       2    3  1  0  0  0
       3    1  2  0  0  0

where N_r = 18.
The following features, computed from the GLRLM, were suggested:

1. Short Runs Emphasis

   RF_1 = \sum_{i=0}^{G-1} \sum_{j=1}^{L} \frac{R_{ij}}{j^2}

   where G^{-2} \le E \le 1. E takes the value G^{-2} for a uniform distribution of the counts and the value 1 iff only one cell is nonzero.
2. Long Runs Emphasis

   RF_2 = \sum_{i=0}^{G-1} \sum_{j=1}^{L} j^2 R_{ij}
3. Gray Level Nonuniformity

   RF_3 = \sum_{i=0}^{G-1} \left[ \sum_{j=1}^{L} R_{ij} \right]^2

   where 1/G \le RF_3 \le 1. RF_3 takes the value 1/G for a uniform distribution of the counts and the value 1 iff only one cell is nonzero.
4. Run Length Nonuniformity

   RF_4 = \sum_{j=1}^{L} \left[ \sum_{i=0}^{G-1} R_{ij} \right]^2

   where 1/L \le RF_4 \le 1. RF_4 takes the value 1/L for a uniform distribution of the counts and the value 1 iff only one cell is nonzero.
5. Run Percentage

   RF_5 = N_r / N

   where 1/N \le RF_5 \le 1.
The GLRLM features are very sensitive to noise, and this is probably the reason for the reported poor performance (e.g. Weszka et al. (1976)). The performance for discrete (e.g. binary) textures is likely to be better.
2.3.2 Neighboring gray level dependence matrix
The neighboring gray level dependence matrix (NGLDM) was introduced by Sun & Wee (1983). In this approach all neighbors of a pixel are considered at the same time. A neighbor is a pixel within a certain distance d of the central pixel, and S is the number of neighbors. d is usually chosen to be \sqrt{2}, and then S = 8. A pixel and its neighbor are said to have similar gray levels if the absolute gray level difference is less than or equal to a chosen positive number a. Element (k,s) of a NGLDM, q, is denoted q_{ks}, and this is the number of pixels with gray level k having s neighbors with similar gray levels. Let N_d be the total number of counts in q; then Q = q/N_d is the normalized NGLDM. The notation presented here differs from the notation of Sun and Wee. This is to keep the definitions along the lines of the GLCM definition. From the image
    2 1 1 3 3
    3 1 0 2 2
    3 3 1 1 2
    0 2 1 0 1
    0 0 1 0 0

we can compute q for a = 0 and d = \sqrt{2} as
    NGLDM   0  1  2  3  4  5  6  7  8
       0    1  0  1  0  0  0  0  0  0
       1    0  0  0  4  0  0  0  0  0
       2    1  0  1  0  0  0  0  0  0
       3    0  0  1  0  0  0  0  0  0

where N_d = 9.
The features that Sun and Wee suggest computing from the NGLDM are listed below, with the modification that the feature computation here is based on the normalized NGLDM. This means that the features are independent of N_d.
1. Small Number Emphasis

   N_1 = \sum_{k=0}^{G-1} \sum_{s=0}^{S} \frac{Q_{ks}}{1+s^2}

   where G^{-2} \le E \le 1. E takes the value G^{-2} for a uniform distribution of the counts and the value 1 iff only one cell is nonzero.
2. Large Number Emphasis

   N_2 = \sum_{k=0}^{G-1} \sum_{s=0}^{S} s^2 Q_{ks}
3. Number Nonuniformity

   N_3 = \sum_{s=0}^{S} \left[ \sum_{k=0}^{G-1} Q_{ks} \right]^2
4. Second Moment

   N_4 = \sum_{k=0}^{G-1} \sum_{s=0}^{S} Q_{ks}^2
5. Entropy

   N_5 = -\sum_{k=0}^{G-1} \sum_{s=0}^{S} Q_{ks} \log Q_{ks}
2.4 Statistics for binary images
For binary images the number of gray levels, G, equals 2, and the first-order statistics are determined by the fraction of 1-pixels, p_1 = n_1/N. We have

\mu = p_1
\sigma^2 = p_1 (1 - p_1)
A GLCM has the form:

    GLCM    0      1      sum
      0     n_00   n_01   n_0.
      1     n_10   n_11   n_1.
     sum    n_.0   n_.1   N_h
The normalized version is:

    GLCM    0      1      sum
      0     p_00   p_01   p_0.
      1     p_10   p_11   p_1.
     sum    p_.0   p_.1   1
For stationary images we have p_{.0} = p_{0.} = p_0, p_{.1} = p_{1.} = p_1 and

p_{01} = p_{10} = p_1 - p_{11}
p_{00} = p_0 - p_1 + p_{11}

We see that, given p_1, all second-order statistics can be expressed as a function of e.g. p_{11}, i.e. there is only 1 degree of freedom in a GLCM given the first-order statistic p_1.
It is more instructive to see the normalized GLCM expressed in terms of the first-order statistics and the correlation

\rho = \frac{p_{11} - p_1 p_1}{p_0 p_1}
    GLCM    0                           1                           sum
      0     p_0 p_0 (1 + (p_1/p_0)ρ)   p_0 p_1 (1 - ρ)             p_0
      1     p_1 p_0 (1 - ρ)            p_1 p_1 (1 + (p_0/p_1)ρ)    p_1
     sum    p_0                        p_1                         1
Thus all second-order statistics can be expressed in terms of the first-order statistics and the correlation ρ. We shall show this for the χ² measure and the κ measure.
The χ² measure is for binary textures

\chi^2 = N_h \left[ \frac{(p_{00} - p_0 p_0)^2}{p_0 p_0} + 2 \frac{(p_{01} - p_0 p_1)^2}{p_0 p_1} + \frac{(p_{11} - p_1 p_1)^2}{p_1 p_1} \right] = N_h [p_1^2 \rho^2 + 2 p_0 p_1 \rho^2 + p_0^2 \rho^2] = N_h \rho^2 (p_0 + p_1)^2 = N_h \rho^2
and the κ measure is

\kappa = \frac{p_{00} + p_{11} - p_0 p_0 - p_1 p_1}{1 - p_0 p_0 - p_1 p_1} = \frac{2 p_0 p_1 \rho}{2 p_0 p_1} = \rho

Higher-order statistics are elegantly expressed in terms of mathematical morphology as in Serra (1982).
2.5 Fourier features
The discrete Fourier transform (DFT), F, and its inverse, F^{-1}, are defined for the image {f(m,n); m = 0,...,M-1; n = 0,...,N-1} as

F(f) = F(u,v) = \frac{1}{MN} \sum_{m=0}^{M-1} \sum_{n=0}^{N-1} f(m,n) e^{-j 2\pi (\frac{mu}{M} + \frac{nv}{N})}

and

F^{-1}(F) = f(m,n) = \sum_{u=0}^{M-1} \sum_{v=0}^{N-1} F(u,v) e^{j 2\pi (\frac{mu}{M} + \frac{nv}{N})}
The Fourier power spectrum is

|F|^2 = F F^*    (2.2)

where F^* denotes the complex conjugate of F. The power spectrum usually varies over several orders of magnitude, which makes it interesting to consider the log-power spectrum

\log(1 + |F|^2)    (2.3)

Standard library FFT routines usually have the restriction that the height and the width of the image have to be a power of 2.
The power spectrum is the Fourier transform of the autocorrelation, i.e. it only contains information about the second-order statistics. It is recognized that the phase spectrum contains much relevant information, but it is very hard to make it useful.
From the Fourier transform of the image, the power spectrum and the log-power spectrum we can compute a number of features. Averages of the power spectrum over ring-shaped and wedge-shaped regions are common features (see e.g. Weszka et al. (1976)). Liu & Jernigan (1990) extract 28 features from the power and phase spectrum.
2.6 Measurement of enzymatic treatment effect on textile

The effect of cellulase enzymatic treatment on textiles has been investigated using standard texture algorithms. An extensive study in both the Fourier domain and the spatial domain has revealed the nature of the changes and resulted in one single feature that measures these changes in a fast and robust way.
2.6.1 Background

This project started when the R&D group in the detergent enzyme division of Novo Nordisk (a world-leading manufacturer of detergent enzymes) expressed the wish to quantify the effects of enzymatic treatment of textiles using digital image analysis. Until now this quantification has been done qualitatively using microscopic inspection and quantitatively using panel tests and light measurements (Hunter coordinates). There was a need for a new objective, robust, fast and relatively inexpensive method.
2.6.2 Image acquisition

The image acquisition is carried out as follows. The textile is placed in homogeneous and plentiful lighting. A camera is positioned such that its optical axis is perpendicular to the textile plane and the rectangular visual area covers as much of the textile as possible without including non-textile areas. The size of the textiles in this study is 15x10 cm.
We used an RGB high-resolution slow-scan camera. The camera output is digitized by a frame grabber that generates frames of 978 by 768 pixels in the red, green and blue band. These frames are cut to 969 by 711 to eliminate acquisition artifacts. Subsequently we will only show results derived from the green band, since the textiles used in this experiment are black and gray and thus contain very little or no color information.
This study regards the enzymatic treatment effect for a single type of cellulase. We want to assess the effect at different pH values and for different doses. To assess the day-to-day variation the textiles were washed on different days for each pH level. Thus we have three factors that we want to investigate:

- pH: 3 levels, 1 2 3 (for pH values 7.0, 8.0 and 9.0)
- dose: 8 levels, 0 10 25 40 50 75 100 200
- day(pH): 3 levels, 1 2 3

We have two repetitions for each combination; thus we end up with 144 images. In figure 2.2 we see 8 textiles representing the 8 doses for pH 1, day 1 and repetition 1.
Figure 2.2. 8 textiles representing the 8 doses for pH 1, day 1.
2.6.3 Description of visual properties

The object of the digital image analysis is to compute one feature that quantifies a given visual property from the image array. In this case the visual property is the human perception of wear. The feature has to correlate well with panel tests. For cellulase enzymatic treatment with known effects this means that the feature has to show improvement as a function of dose and show best results for pH values close to the pH with highest enzyme activity (between 7.0 and 8.0 in our case).
Obviously many different features can be computed from the image. A simple feature is the average intensity, lightness. This has a strong resemblance to what is measured by the Hunter coordinates. Probably this lightness feature also has a strong influence on a panel test. Figure 2.3 shows the average intensity as a function of dose for pH level 1. We see that lightness only has discriminative capability for small doses. In the context of image analysis, lightness is a non-robust feature in the sense that it depends heavily on lighting conditions and camera sensitivity.
Another aspect of the enzymatic effect on the textiles estimated by the panel test is the distinctness of the regular textile pattern. This distinctness should increase as a result of the cellulase enzymatic treatment. The regular pattern in the investigated textiles resembles a rectangular grid structure. The well-defined period of this grid makes it appropriate to look at the textiles in the Fourier domain. This is done in the next section.
Figure 2.3. Average intensity as a function of dose for pH level 1.
2.6.4 Analysis in the Fourier domain

Frequency based methods.
The classical way of obtaining an estimate of the power spectrum is by equation 2.2. The log-power spectrum is given in equation 2.3.
The periodogram is a non-consistent estimate of the power spectrum. Welch's method is one way to deal with this. The image is split up into a number of non-overlapping subimages. The periodogram is calculated in each subimage, followed by an averaging over the subspectra.
Figure 2.4 shows the full resolution power spectra of the textiles in figure 2.2. The concentric circles are isolines for the spatial frequency. Several high-intensity spots in the power spectrum show the periodicity of the weaves. The spots of lower intensity in the high-frequency areas are higher harmonics. We see that the intensity in the low-frequency areas (near the center of the power spectrum) is fading for higher doses of enzyme. To illustrate this effect we computed the average of the power spectrum in the rings between the concentric circles and plotted it versus the radius of the rings. These averages are computed for each of the power spectra in figure 2.4, and the average corresponding to dose 0 is subtracted from the averages of each of the other doses. The plot is shown in figure 2.5, and it is obvious that the averages in the low-frequency areas are decreasing for higher doses. We also note that all the curves have approximately the same intersection at a frequency corresponding to the frequency of the weaves. Thus, having established that the power spectrum actually contains relevant information about the textile wear, we will try to quantify this in a single Fourier feature.
Figure 2.4. Power spectra of the textiles in figure 2.2.
Figure 2.5. Average of power spectra rings relative to dose 0 for the spectra in figure 2.4.
Spectral texture features.
Texture features derived in the spatial frequency domain have been investigated e.g. in Weszka et al. (1976) and Liu & Jernigan (1990). The features tested in the present context are listed below.

1. Rings
2. Wedges
3. Inertia
4. Entropy
5. Anisotropy

All features have been computed on both the full resolution power spectrum and the power spectrum estimated using Welch's method. The Welch spectral estimate performed significantly better than the full resolution power spectrum.
The features were computed on both the power spectrum and the log-power spectrum. It turned out that the features calculated on the log-power spectrum performed significantly better than the power spectral features. Furthermore we found that inertia and entropy features performed better than the other features. The inertia feature generally performed a little better than the entropy feature, and it seems to be a more natural way to summarize the phenomena observed in figure 2.5.
The inertia feature I and the log-power inertia LI are computed as

I = \sum_{(u,v)} (u^2 + v^2) |F(u,v)|^2

LI = \sum_{(u,v)} (u^2 + v^2) \log(1 + |F(u,v)|^2)

where we are summing over all frequencies. The normalized inertia is the inertia divided by the inertia of the corresponding textile with dose = 0.
In figure 2.6 we show the normalized log-power inertia vs. log(dose) for all three values of pH. Thus the measure is averaged over days and repetitions. It can be seen that there is a clear distinction between the performance of the enzymes at the three pH values. In addition there seems to be an approximately linear relation between the inertia and log(dose).
Discussion of results
The spectral approach has provided us with a useful feature and a lot of insight regarding the nature of this problem. The use of the FFT algorithm, however, introduces some, somewhat technical, limitations regarding the size of the image and computational speed. It is also less flexible in removing textile irregularities from the analysis.
2.6.5 Spatial domain features

The distinctness property and other textural properties can also be measured by textural features in the spatial domain. Siew et al. (1988) used features based on different texture matrices for carpet wear assessment. The
Figure 2.6. Normalized log-power inertia versus log(dose). We see that the measure reflects the expected ranking.
conclusion of the paper was that features based on texture matrices (e.g. GLCM) can be used to characterize the appearance of carpets and the changes they undergo during wear. The problem of carpet wear assessment is similar to measuring effects of enzymatic treatment, and therefore we included GLCM features in our study.
Spatial features
The spatial domain features included in this study were all the first-order statistics of section 2.1 and the following 15 GLCM features:
1. Energy
2. Entropy
3. Maximum probability
4. Correlation
5. Diagonal correlation
6. Kappa
7. Difference energy
8. Difference entropy
9. Inertia
10. Local homogeneity
11. Sum energy
12. Sum entropy
13. Sum variance
14. Cluster shade
15. Cluster prominence
The features were computed for several numbers of gray levels and at several resolutions. Attempts to make the features robust have included correction for inhomogeneous lighting and automatic removal of textile irregularities.
The operational feature
Many of the tested features performed well on subsets of the images, but only a few features gave an overall good and robust measurement.
It was possible to find a relatively simple feature with an overall good and robust performance. This feature is computed as follows. The image is transformed to a resolution where the regular textile pattern has just disappeared (in our case the images were lowpass-filtered and subsampled to 1/16 size). Then the variance of this image is computed. The variances are normalized (divided) by the variance of the corresponding textile with dose = 0. The average over days and repetitions of this feature is shown in figure 2.7 in a log-log plot. It ranks the textiles just as expected, and it seems that a linear fit is appropriate for each pH level. This feature shall be called the coarse-scale normalized variance (csnv) feature. The csnv feature can be compared to the Fourier inertia feature in the Fourier domain. The lowpass filter we used corresponds approximately to a multiplication
Figure 2.7. Plot of (log) coarse-scale normalized variance versus log(dose). We see that the measure reflects the expected ranking.
with a Gaussian weighting function centered at (0,0) in the Fourier domain. For the Fourier inertia feature the weighting function is (u² + v²). Thus the csnv feature measures the energy in the low frequencies and the inertia feature measures the energy in the high frequencies. Since the measures are normalized, they will actually measure similar properties, but as the textile wear seems to be best described in the low frequencies, the inertia feature is not as robust as the csnv feature.
Fitting a general linear model with the SAS GLM procedure:

    proc glm;
      class ph day;
      model logvar = logdose ph day(ph) ph*logdose logdose*day(ph);
      lsmeans ph;
      random day(ph);

gives the results:
    Dependent Variable: LOGVAR

    Source        DF   Sum of Squares   Mean Square   F Value
    Model         17   10.32849194      0.60755835    138.29
    Error        108   0.47448146       0.00439335
    Corr. Total  125   10.80297340

    R-Square   Pr > F   Root MSE   LOGVAR Mean   C.V.
    0.956079   0.0001   0.066282   -.92821182    -7.140862

    Source     DF   Type I SS    Mean Square   F Value   Pr > F
    LD          1   8.54219513   8.54219513    1944.35   0.0001
    PH          2   1.60099025   0.80049512    182.21    0.0001
    D(PH)       6   0.12027099   0.02004517    4.56      0.0004
    LD*PH       2   0.01070209   0.00535104    1.22      0.2999
    LD*D(PH)    6   0.05433348   0.00905558    2.06      0.0637

    Source     DF   Type III SS  Mean Square   F Value   Pr > F
    LD          1   8.54219513   8.54219513    1944.35   0.0001
    PH          2   0.10533370   0.05266685    11.99     0.0001
    D(PH)       6   0.05593527   0.00932254    2.12      0.0565
    LD*PH       2   0.01070209   0.00535104    1.22      0.2999
    LD*D(PH)    6   0.05433348   0.00905558    2.06      0.0637

where LD = LOGDOSE and D(PH) = DAY(PH).
    Least Squares Means

    PH   LSMEAN
    1    -1.04568416
    2    -0.96280088
    3    -0.77615044
It follows that the amount of variability explained by pH and dose is orders of magnitude greater than the remaining effects, including the day-to-day variability. Thus the conclusive model will only include the pH and dose effects. The least squares means for the three pH levels show the expected ranking.
2.6.6 Conclusion

We have obtained a single feature from digital image analysis to describe the effect of cellulase enzymatic treatment of textiles. This feature is also fast to compute and seems to be robust. Other features measuring the variation in the textile that is coarser than the regular textile pattern can possibly describe the same textile properties, but the coarse-scale normalized variance seems to be the feature that has the overall best performance of the features considered. The feature may also be useful in e.g. carpet wear assessment.
2.7 GLCM feature performance

The performance of 15 GLCM features is tested in CART classification of 15 Brodatz textures. We thereby investigate how much textural information is contained in the simultaneous distribution of (horizontal) neighbor pixels. The cooccurrence matrices are computed on the raw textures, on the textures after a histogram equalization, and on the textures after a Gaussian histogram match.
2.7.1 Image material
15 Brodatz textures were selected on the basis that they should have a fine-grained and homogeneous texture. A part of each of these textures is shown raw in figure 2.8, after a histogram equalization in figure 2.9, and after a Gaussian histogram match in figure 2.10. The names of the selected textures are shown in figure 2.11. The textures D16, D21, D53, D77 and D84 will subsequently be called deterministic due to their relatively strict ordering. The rest will be called stochastic. This grouping will be helpful in the interpretation of the classification results.
The textures were scanned from the paper with an 8-bit, 300 dpi scanner. The output from the scanner is a 2400x1800 image, which is then reduced by two steps in a Gaussian pyramid (Burt, 1981). The approximately Gaussian operator is a separable, symmetric filter with values

0.05, 0.25, 0.40, 0.25, 0.05

The result is a 600x450 floating point image, where almost no pixels have identical values. Three byte versions of each image are now generated.
Figure 2.8. 15 Brodatz textures (no histogram match).
Figure 2.9. 15 Brodatz textures after a histogram equalization.
Figure 2.10. 15 Brodatz textures after a Gaussian histogram match.
    Pressed cork (D4)           Grass lawn (D9)        Woolen cloth (D19)
    Herringbone weave (D16)     French canvas (D21)    Calf leather (D24)
    Beach sand (D29)            Pressed cork (D32)     Wood grain (D68)
    Oriental straw cloth (D53)  Handmade paper (D57)   Pigskin (D92)
    Cotton canvas (D77)         Raffia (D84)           Calf fur (D93)

Figure 2.11. Names of the 15 Brodatz textures in figure 2.10.
- The floating point image scaled linearly.
- A histogram equalized version.
- A Gaussian matched version (mean = 127.5, sdev = 40.0).

The histogram equalization and Gaussian match are performed by sorting all pixels, while the image is in floating point format, and then assigning byte values according to the desired histogram. Thus we obtain a perfect histogram match. Histogram equalization has been used frequently (e.g. Haralick et al. (1973) and Laws (1980)) by researchers studying the performance of texture features. The equalization has in these cases been made using a less accurate byte to byte match.
For the selected, fine-grained textures we corrected for background variations by subtracting a 25x25 median filtered version of each texture from itself.
2.7.2 GLCM
We computed the right-neighbor GLCM (h = (0,1)) for the three versions of all 15 textures. In figures 2.12, 2.13 and 2.14 are shown plots of the cooccurrence matrices for respectively the raw versions, the histogram equalized versions and the Gaussian matched versions.
Each image was partitioned into 108 disjoint 50x50 subimages. From the right-neighbor GLCM of these subimages we computed the following GLCM features:
2.7
GLCM
featureperformance
59
Figure 2.12. Cooccurrence matrices of raw textures.
Figure 2.13. Cooccurrence matrices of histogram equalized textures.
Figure 2.14. Cooccurrence matrices of Gaussian matched textures.
1. Energy (Enrg)
2. Entropy (Entr)
3. Maximum probability (Maxp)
4. Correlation (Corr)
5. Diagonal moment (Diag)
6. Kappa (Kapp)
7. Difference energy (Derg)
8. Difference entropy (Dent)
9. Inertia (Iner)
10. Inverse difference moment (IDM)
11. Sum energy (Serg)
12. Sum entropy (Sent)
13. Sum variance (Svar)
14. Cluster shade (Shad)
15. Cluster prominence (Prom)
2.7.3 CART classification

Classification and regression trees is a nonparametric alternative to classical discriminant analysis. A binary decision tree is constructed, and a classification is made by running down the tree and choosing the class corresponding to the terminal node. The CART program from California Statistical Software, Inc. was used. The reader is referred to Breiman, Friedman, Olshen, & Stone (1984) and the CART documentation for detailed information about CART.
Only splits based on single features were allowed. 10-fold cross-validation was used for estimating the probability of correct classification.
We made a CART classification on seven subsets of the 15 textures in all three versions. The seven subsets are:

1. The five textures in the left column.
2. The five textures in the middle column.
3. The five textures in the right column.
4. The five deterministic textures: D16, D21, D53, D77 and D84.
5. The five stochastic textures: D4, D9, D29, D32 and D57.
6. The ten textures in the left and middle columns.
7. All 15 textures.
Linearly scaled versions
In figure 2.15 we see the classification tree suggested by CART for set 2. A texture is classified by starting at the top node and then running down the tree until a terminal node is reached. Every terminal node is associated with a texture class, and this class is assigned to the texture that we wish to
[Tree with splits on Enrg, Shad, Diag and Iner; terminal nodes D9, D32, D21, D84, D57.]
Figure 2.15. Classification tree for set 2 with no histogram match.
classify. At every nonterminal node a decision is made based on the value of one feature. If the value of the feature is lower than the split value for that node we go left in the tree, otherwise we go right. This classification is a partitioning of feature space into boxes. The cross-validation estimate was one misclassified texture out of 540. CART also showed that many alternative trees would have a similar performance. Figure 2.16 shows a scatter plot of the inertia versus the diagonal moment for the textures in set 2. We see that the texture classes are easily discriminated.
When no histogram match is performed, the first-order statistics will influence the cooccurrence features. As the first-order statistics of the Brodatz textures differ significantly, the set of 15 GLCM features will be able to discriminate between any subset of the 15 textures (actually even any subset of all the 112 Brodatz textures) with close to 0% error rate. Hence we shall concentrate on the histogram matched versions.
Figure 2.16. Scatter plot of the diagonal moment versus the inertia for the textures in set 2. 1=D9, 2=D21, 3=D32, 4=D57, 5=D84.
Histogram equalized versions
A summary of the classification results for the histogram equalized textures is listed in the following table.
    Set no.   Classes   Terminal nodes   % correctly classified   Most important feature
       1         5           10                  95.9                    Iner
       2         5           10                  96.5                    Derg
       3         5            6                  80.9                    Svar
       4         5            6                  98.5                    Corr
       5         5           13                  81.3                    Corr
       6        10           27                  89.3                    Corr
       7        15           54                  74.3                    Derg
The results show that the sets with several deterministic textures have a higher percentage of correctly classified textures, i.e. the deterministic textures in this study are relatively easy to discriminate. The correlation feature and uniformity features based on energy and entropy are important for the classification.
We shall now study the classification results of set 2 in more detail. The classification tree is shown in figure 2.17. The first split is based on the correlation feature, and it discriminates the textures D57 and D84 from the other three. This is a good split (high discriminatory power) and so are the two splits on the second level. However, on the right branch of the split based on the diagonal moment we see a relatively complex subtree trying to discriminate between the textures D9 and D32. The features used for this purpose are difference energy, energy and entropy. Figure 2.18 shows a
[Tree with a first split on Corr, further splits on Diag, Derg, Enrg and Entr; terminal nodes D9, D21, D32, D57, D84.]
Figure 2.17. Classification tree for set 2 after histogram equalization.
Chapter 2. Texture statistics
[Figure 2.18. Scatter plot of the diagonal moment (Diag, horizontal axis, roughly -2000 to 1000) versus the correlation (Corr, vertical axis, roughly 0.60 to 0.90) for the textures in set 2 after histogram equalization. 1=D9, 2=D21, 3=D32, 4=D57, 5=D84.]
scatter plot of the diagonal moment versus the correlation. We see that the major discriminatory deficiency in these two features is the mixture of the classes D9 and D32. Figure 2.19 shows a scatter plot of the energy versus the difference energy. It is obvious that there is no easy way out of the discriminatory problem.

Gaussian matched versions

A summary of the classification results for the Gaussian matched textures is listed in the following table.
[Figure 2.19. Scatter plot of the energy (vertical axis, roughly 0.010 to 0.025) versus the difference energy (horizontal axis, roughly 0.002 to 0.008) for the textures in set 2 after histogram equalization. 1=D9, 2=D21, 3=D32, 4=D57, 5=D84.]
Set no.   Classes   Terminal nodes   % correctly classified   Most important feature
   1         5            7                  93.5                     Corr
   2         5            6                  97.6                     Diag
   3         5            8                  85.9                     Diag
   4         5            5                  97.2                     Corr
   5         5            6                  84.3                     Corr
   6        10           18                  89.7                     Corr
   7        15           40                  80.9                     Corr
Again we see that the deterministic textures are relatively easy to discriminate. All sets except the sets with a majority of deterministic textures (set 1 and set 4) were classified more correctly with these features than with the features based on histogram equalization. Generally the diagonal moment was an important feature, and for two sets even the most important. It can also be seen that in general the trees have fewer terminal nodes than trees based on the histogram equalization; thus we get simpler trees. The energy and the entropy features were found to be highly correlated for all 15 textures, as were the difference entropy and the inertia.

The classification tree for set 2 is shown in figure 2.20. The tree is simpler than the tree based on the histogram equalization. Only the three features correlation, diagonal moment and inverse difference moment are used. Figure 2.21 shows a scatter plot of the diagonal moment versus the correlation. There is hardly any confusion between the classes D9 and D32.
[Figure 2.20. Classification tree for set 2 after Gaussian histogram match. The splits are on the features Corr, Diag and IDM; the terminal nodes are the classes D9, D21, D32, D57 and D84.]
2.7.4 Classification summary

The results of the classifications are summarized as follows:

- It is easy to discriminate the Brodatz textures if no histogram match is performed.

- Features based on a Gaussian match performed better than features based on histogram equalization for the stochastic textures.

- Features based on histogram equalization performed a little better than features based on a Gaussian match for the deterministic textures.
[Figure 2.21. Scatter plot of the diagonal moment (Diag, horizontal axis, roughly -600 to 400) versus the correlation (Corr, vertical axis, roughly 0.60 to 0.90) for the textures in set 2 after Gaussian histogram match. 1=D9, 2=D21, 3=D32, 4=D57, 5=D84.]
- The deterministic textures were easier to discriminate than the stochastic textures.

- Features based on histogram equalization generally produce trees with more nodes than features based on a Gaussian match.

- Generally correlation was the most important feature.

- The diagonal moment was a very important feature. Many splits were based on the diagonal moment.

- The uniformity features energy, entropy, difference energy, difference entropy, sum energy and sum entropy seem to be more important for histogram equalized textures.

- The maximum probability feature was generally unimportant.

- The energy and the entropy features were highly correlated, as were the difference entropy and the inertia.
2.7.5 Conclusion

The performance of 15 right-neighbor GLCM features in CART classification of 15 Brodatz textures has been investigated.

This study has shown that histogram matching of textures has a significant effect on the discriminatory performance of GLCM features computed from the textures. In particular, it seems that histogram equalization is too crude for stochastic textures. For such textures a Gaussian match will give better performance and a simpler and more interpretable classifier. The Brodatz textures are easily discriminated if no histogram match is made.
The diagonal moment is an important feature. As this feature cannot be computed from the gray level difference histogram (GLDH) and the gray level sum histogram (GLSH), there is a loss of relevant information when replacing the GLCM with these two histograms.

Generalization of the conclusions of this study should be done with great caution. The selection of 15 textures that we used represents an insignificant fraction of real-world textures, and only the horizontal neighbor relation has been investigated.
Chapter 3

Markov random fields

This chapter deals with parametric description of texture based on a class of models called Markov random fields. The theory of Markov random fields is reviewed together with the theory of the associated Gibbs random fields. The theory of Gibbs random fields was founded in statistical physics (Ising, 1925), and some relevant results from this area are presented in a new statistical setting. A variety of Markov random fields is reviewed with an emphasis on discrete models. Further, we introduce a set of morphological Markov random fields that extend the standard set of models by using the operators of mathematical morphology (Serra, 1982).
[Figure 3.1. Regular 2D tessellations: rectangular, triangular and hexagonal.]
3.1 Random fields

One of the main tasks in statistical image processing is to construct stochastic models for observed images and especially for textures. The pixel values $\{x_i, i = 0, 1, \ldots, n-1\}$ are represented as realizations of random variables $\{X_i, i = 0, 1, \ldots, n-1\}$, and the probability measure representing the joint distribution of all pixel values on an image grid is called a random field. $P(x)$ is the probability of a particular image or configuration $x \in \Omega$, where $\Omega$ is the set of all possible configurations on the given grid.

3.1.1 2D grids

There exist three ways of partitioning the two-dimensional plane into disjoint, regular polygons of equal size. Such a partitioning is called a regular tessellation. The three regular tessellations are the regular square tessellation, the regular triangular tessellation and the regular hexagonal tessellation, as shown in figure 3.1.
[Figure 3.2. 2D pixel grids: rectangular, honeycomb and hexagonal.]
Let the polygons of a tessellation correspond to pixels; then the graph corresponding to the pixel grid will be dual to the graph of polygon borders, i.e.

- A square tessellation corresponds to a square pixel grid

- A triangular tessellation corresponds to a honeycomb pixel grid

- A hexagonal tessellation corresponds to a triangular (hexagonal) pixel grid

The honeycomb grid is used in statistical mechanics but very rarely (if at all) used in image analysis. Since the neighborhood of a pixel in a triangular grid is hexagonal, and the pixels are hexagonal, this grid is often called the hexagonal grid, even though this term fits just as well for the honeycomb grid. Here we will follow the common practice in image analysis, i.e. subsequently a hexagonal grid has hexagonal pixels. The hexagonal grid is quite popular in mathematical morphology (Serra, 1982) due to the attractive neighborhood structure. The square grid is used in the vast majority of situations, and where nothing else is mentioned it will be synonymous with grid. The grids corresponding to the tessellations of figure 3.1 are shown in figure 3.2. Pixels are located at the line intersections.
3.2 Gibbs random fields

3.2.1 Historical perspective

In 1877 Boltzmann investigated the distribution of energy states in molecules of an ideal gas. According to the Boltzmann distribution the probability of a molecule being in a state with energy $\varepsilon$ is

$$P(\varepsilon) = \frac{1}{z} e^{-\frac{1}{kT}\varepsilon}$$

where $z$ is a normalization constant that makes the probabilities sum to one, $T$ is the absolute temperature, and $k$, Boltzmann's constant, is a constant of nature that relates temperature to energy. In all subsequent formulas the temperature will be assumed measured in energy units; hence $kT$ will be replaced by $T$.

Gibbs used a similar distribution in 1901 to express the probability of a whole system with many degrees of freedom being in a state with a certain energy. Let $x$ denote a state in state space $\Omega$ and $U : \Omega \to \mathbb{R}$ be the energy function. Then

$$P(X = x) = \frac{1}{Z} e^{-\frac{1}{T}U(x)} \qquad (3.1)$$

where

$$Z = \sum_{x \in \Omega} e^{-\frac{1}{T}U(x)}.$$

$Z$ is called the partition function. $T$ controls the degree of peaking in the probability density function. As $T \to \infty$ the distribution will tend to a uniform distribution among all possible states. As $T \to 0$ the distribution will tend to a uniform distribution among the minimum energy states. The distribution (3.1) is called the Gibbs distribution or canonical distribution. Subsequently the former term will be used exclusively.
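The limiting behavior in $T$ is easy to verify numerically on a toy state space. The following sketch (Python, used here purely for illustration; the thesis itself contains no code) computes the Gibbs distribution (3.1) for a given list of state energies:

```python
import math

def gibbs(energies, T):
    """Gibbs distribution (3.1) over a finite state space, given the
    energy U(x) of each state; Z is the partition function."""
    weights = [math.exp(-u / T) for u in energies]
    Z = sum(weights)              # partition function
    return [w / Z for w in weights]
```

For large $T$ the returned probabilities are nearly uniform over all states; for small $T$ almost all mass sits on the minimum-energy states, exactly as described above.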
Ising (1925) used the Gibbs distribution to describe the behavior of ferromagnetic materials. Any site or pixel in such a material is thought of as a small dipole, which can be in state "spin up" or "spin down", corresponding to values 1 and -1.

The Ising model on a square grid is defined through the energy function

$$U(x) = -J\sum_{i \sim j} x_i x_j - mH\sum_i x_i$$

where $i \sim j$ means that pixel $i$ and pixel $j$ are either horizontal or vertical nearest neighbors. $J$ is a property of the material that determines the interaction between neighboring spins. If $J > 0$ neighboring spins tend to be equal. If $J < 0$ neighboring spins tend to be opposite. $J = 0$ means no interaction. The constant $m > 0$ is a property of the material that determines the sensitivity of the spins to an external magnetic field of intensity $H$. $H > 0$ will favor spin up, whereas $H < 0$ will favor spin down. The Ising model has been successful in explaining ferromagnetic phenomena, but has also fostered an interest in the more general Gibbs random fields.

Brush (1967) reviews the history of the Ising model.
3.2.2 General properties

Gibbs random fields are random fields defined through equation (3.1). This means that for every energy function on $\Omega$ there exists a corresponding Gibbs random field. Not all of these Gibbs random fields are useful for our purposes, and in the next section we shall limit our attention to a very interesting subclass.
The Gibbs measure has an interesting property with respect to entropy. The entropy $S$ is frequently used as a uniformity measure of a random field $P$, and is defined as

$$S(P) = -\sum_{x \in \Omega} P(x)\log P(x).$$

Of all probability measures defined through an energy function, the Gibbs measure (3.1) is the measure which maximizes entropy among all measures with the same expected energy (Jaynes, 1957).
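For a finite configuration space the entropy is a direct computation; a small helper (again an illustrative Python sketch, with the usual convention $0\log 0 = 0$):

```python
import math

def entropy(p):
    """S(P) = -sum_x P(x) log P(x) for a probability vector,
    using the convention 0 log 0 = 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)
```

The uniform distribution attains the maximum value $\log|\Omega|$, while a degenerate (deterministic) distribution has entropy zero.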
3.3 Markov random fields

Hassner & Sklansky (1980) introduced Markov random fields to image analysis, and through the last decade Markov random fields have been used extensively as representations of visual phenomena. In this thesis a strong emphasis is put on Markov random fields with discrete pixel values, i.e. discrete Markov random fields, but most of the results are easily extended to continuous Markov random fields. For more thorough expositions on Markov random fields the reader is referred to Geman (1990), Dubes & Jain (1989), and Ripley (1988).

In the rest of this section we shall restate some definitions regarding Markov random fields and a theorem that shows an equivalence between Markov random fields and Gibbs random fields.

Definition 1. Let $S = \{s_0, s_1, \ldots, s_{n-1}\}$ be a set of sites. A neighborhood system $\mathcal{N} = \{N_s, s \in S\}$ is a collection of subsets of $S$ for which
1. $s \notin N_s$

2. $r \in N_s \Leftrightarrow s \in N_r$

$N_s$ are the neighbors of $s$. When sites $i$ and $j$ are neighbors we write $i \sim j$. The set of all possible configurations on $S$ is called $\Omega$.

Definition 2. A clique $C$ is a subset of $S$ for which every pair of sites are neighbors.

Single pixels are also considered cliques. The set of all cliques on a grid is called $\mathcal{C}$.
Definition 3. A random field $X$ is a Markov random field (MRF) with respect to the neighborhood system $\mathcal{N} = \{N_s, s \in S\}$ iff

1. $P(X = x) > 0$ for all $x \in \Omega$

2. $P(X_s = x_s \mid X_r = x_r, r \neq s) = P(X_s = x_s \mid X_r = x_r, r \in N_s)$ for all $s \in S$ and $x \in \Omega$

The structure of the neighborhood system determines the order of the MRF. For a first order MRF the neighborhood of a pixel consists of its four nearest neighbors. In a second order MRF the neighborhood consists of the eight nearest neighbors. The clique structures are illustrated in figure 3.3 and
[Figure 3.3. Cliques for a first-order neighborhood.]
[Figure 3.4. Additional cliques for a second-order neighborhood.]
5 4 3 4 5
4 2 1 2 4
3 1 . 1 3
4 2 1 2 4
5 4 3 4 5

Figure 3.5. Order coding of neighborhood structure. The n-order neighborhood of the center pixel (.) contains the pixels with numbers less than or equal to n.
figure 3.4 for a first-order MRF and a second-order MRF. The order coding of the neighborhood up to order five is shown in figure 3.5.
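The order coding of figure 3.5 simply ranks the neighbor offsets of the center pixel by their squared Euclidean distance, with offsets at equal distance sharing an order. A sketch (Python, for illustration only) that reproduces the 5x5 coding:

```python
def order_coding(radius=2):
    """Rank the neighbor offsets of a center pixel by squared Euclidean
    distance; offsets at equal distance share an order (cf. figure 3.5)."""
    d2 = sorted({di * di + dj * dj
                 for di in range(-radius, radius + 1)
                 for dj in range(-radius, radius + 1)} - {0})
    order = {v: k + 1 for k, v in enumerate(d2)}   # distance -> order number
    rows = []
    for di in range(-radius, radius + 1):
        row = ['.' if di == dj == 0 else str(order[di * di + dj * dj])
               for dj in range(-radius, radius + 1)]
        rows.append(' '.join(row))
    return rows
```

Calling `order_coding()` yields the rows of the figure, e.g. `'5 4 3 4 5'` for the top row and `'3 1 . 1 3'` for the center row.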
Definition 4. $X$ is a Gibbs random field (GRF) with respect to the neighborhood system $\mathcal{N} = \{N_s, s \in S\}$ iff

$$P(X = x) = \frac{1}{Z}\exp(-U(x)/T)$$

where $Z$ is a normalizing constant called the partition function, $T$ is a control parameter called temperature and $U$ is the energy function of the form

$$U(x) = \sum_{C \in \mathcal{C}} V_C(x)$$

where $V_C$ is called a potential and is a function depending only on $x_s, s \in C$.

Theorem 1 (Hammersley-Clifford). A random field $X$ is a Gibbs random field with respect to the neighborhood system $\mathcal{N}$ iff $X$ is a Markov random field with respect to $\mathcal{N}$.

A simple proof may be found in Geman (1990). Using this equivalence we have both a local and a global description of the distribution. In the present context we use the term Markov random field to emphasize the Markov property.
3.4 Binary Markov random fields

3.4.1 Ising model revisited

The best known and most investigated Markov random field is the Ising model. This model has been studied in statistical physics since its introduction in Ising (1925), whereas statisticians joined the efforts in the 1960's. We shall give a thorough description of the Ising model using statistical terminology. Thus, with the notation introduced in the previous section, we will talk about the first-order binary Markov random field. The reader is referred to Kinderman & Snell (1980) for background material on this issue.

In our notation every site can take the values 0 or 1. The neighborhood of a pixel is the four nearest neighbors. The corresponding three cliques are single pixels, horizontal neighbors and vertical neighbors. Single pixels with value one have the potential $-\alpha$. Horizontal neighbor cliques have the potential $-\beta_1$ if both pixels are one. The corresponding vertical neighbor clique potential is $-\beta_2$. If any pixel in a clique is 0, the clique potential is 0. This gives us the energy function

$$U(x) = -\alpha\sum_i x_i - \beta_1\sum_{i \leftrightarrow j} x_i x_j - \beta_2\sum_{i \updownarrow j} x_i x_j \qquad (3.2)$$

and the joint distribution

$$P(X = x) = \frac{1}{Z(\alpha, \beta_1, \beta_2)}\exp\Big(\alpha\sum_i x_i + \beta_1\sum_{i \leftrightarrow j} x_i x_j + \beta_2\sum_{i \updownarrow j} x_i x_j\Big) \qquad (3.3)$$

where $i \leftrightarrow j$ means that $i$ and $j$ are horizontal neighbors, and $i \updownarrow j$ means that $i$ and $j$ are vertical neighbors. If $\beta_1 = \beta_2$ the configurations will show
no directionality and we call this an isotropic model. The more general formulation in (3.3) represents the anisotropic model. The joint distribution for the isotropic model is

$$P(X = x) = \frac{1}{Z(\alpha, \beta)}\exp\Big(\alpha\sum_i x_i + \beta\sum_{i \sim j} x_i x_j\Big)$$

where $i \sim j$ means that $i$ and $j$ are neighbors.
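To make the energy function (3.2) concrete, here is a direct computation for a small binary image (a Python sketch, used purely for illustration; free boundaries are assumed, i.e. no wrap-around):

```python
def energy(x, alpha, b1, b2):
    """Energy (3.2) of a binary image x (list of 0/1 rows) under the
    anisotropic first-order model; lower energy means higher probability."""
    rows, cols = len(x), len(x[0])
    singles = sum(x[i][j] for i in range(rows) for j in range(cols))
    horiz = sum(x[i][j] * x[i][j + 1]
                for i in range(rows) for j in range(cols - 1))
    vert = sum(x[i][j] * x[i + 1][j]
               for i in range(rows - 1) for j in range(cols))
    return -alpha * singles - b1 * horiz - b2 * vert
```

With positive parameters, images with many 1-pixels and many 1-1 neighbor pairs get lower energy and hence higher probability under (3.3).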
The expected mean and variance can be expressed as

$$E\Big(\sum_i X_i\Big) = \sum_\Omega \Big[\sum_i x_i\Big] P(x) = \frac{1}{Z}\frac{\partial}{\partial\alpha}Z = \frac{\partial}{\partial\alpha}\log Z$$

$$V\Big(\sum_i X_i\Big) = \sum_\Omega \Big[\sum_i x_i\Big]^2 P(x) - \frac{1}{Z^2}\Big(\frac{\partial}{\partial\alpha}Z\Big)^2 = \frac{1}{Z}\frac{\partial^2}{\partial\alpha^2}Z - \frac{1}{Z^2}\Big(\frac{\partial}{\partial\alpha}Z\Big)^2 = \frac{\partial^2}{\partial\alpha^2}\log Z.$$

This result is valid for both the isotropic and anisotropic models.
In the horizontal direction we get

$$E\Big(\sum_{i \leftrightarrow j} X_i X_j\Big) = \sum_\Omega \Big[\sum_{i \leftrightarrow j} x_i x_j\Big] P(x) = \frac{1}{Z}\frac{\partial}{\partial\beta_1}Z = \frac{\partial}{\partial\beta_1}\log Z$$

$$V\Big(\sum_{i \leftrightarrow j} X_i X_j\Big) = \sum_\Omega \Big[\sum_{i \leftrightarrow j} x_i x_j\Big]^2 P(x) - \frac{1}{Z^2}\Big(\frac{\partial}{\partial\beta_1}Z\Big)^2 = \frac{1}{Z}\frac{\partial^2}{\partial\beta_1^2}Z - \frac{1}{Z^2}\Big(\frac{\partial}{\partial\beta_1}Z\Big)^2 = \frac{\partial^2}{\partial\beta_1^2}\log Z.$$

For the vertical direction and for the isotropic case the results are analogous. As can be seen from the equations above, the partition function is a main key to understanding and describing the behaviour of this model.
Many attempts have been made to make evaluation of the partition function possible. The only exact result was found by Onsager (1944) for the zero-field Ising model in the large grid limit. Zero-field means that the marginal probabilities of 0-pixels and 1-pixels are equal, i.e. $\alpha = -(\beta_1 + \beta_2)$. Let $N$ be the number of pixels in the grid. Onsager found that in the limit $N \to \infty$ we can write $\frac{1}{N}\log Z$ as

$$\log 2 - \frac{\beta_1 + \beta_2}{2} + \frac{1}{2\pi^2}\int_0^\pi\!\!\int_0^\pi \log\Big(\cosh\frac{\beta_1}{2}\cosh\frac{\beta_2}{2} - \sinh\frac{\beta_1}{2}\cos\omega_1 - \sinh\frac{\beta_2}{2}\cos\omega_2\Big)\,d\omega_1\,d\omega_2.$$
Using this expression we can find the correlation between horizontal neighbors in the limit $N \to \infty$ as

$$\rho_1(\beta_1, \beta_2) = E\Big(\frac{4}{N}\sum_{i \leftrightarrow j} X_i X_j - 1\Big) = \frac{1}{2\pi^2}\int_0^\pi\!\!\int_0^\pi \frac{2\sinh\frac{\beta_1}{2}\cosh\frac{\beta_2}{2} - 2\cosh\frac{\beta_1}{2}\cos\omega_1}{\cosh\frac{\beta_1}{2}\cosh\frac{\beta_2}{2} - \sinh\frac{\beta_1}{2}\cos\omega_1 - \sinh\frac{\beta_2}{2}\cos\omega_2}\,d\omega_1\,d\omega_2. \qquad (3.4)$$

An analogous expression is obtained for the vertical neighbor correlation, $\rho_2(\beta_1, \beta_2)$. In the isotropic case we get the nearest neighbor correlation in the limit $N \to \infty$ as

$$\rho(\beta) = E\Big(\frac{2}{N}\sum_{i \sim j} X_i X_j - 1\Big) = \frac{1}{2\pi^2}\int_0^\pi\!\!\int_0^\pi \frac{2\sinh\frac{\beta}{2}\cosh\frac{\beta}{2} - \cosh\frac{\beta}{2}(\cos\omega_1 + \cos\omega_2)}{\cosh^2\frac{\beta}{2} - \sinh\frac{\beta}{2}(\cos\omega_1 + \cos\omega_2)}\,d\omega_1\,d\omega_2. \qquad (3.5)$$

The integrals can be computed by numerical integration, e.g. using Gaussian quadratures (Press, Flannery, Teukolsky, & Vetterling, 1988). Figures 3.6 and 3.7 show plots of $\rho_1(\beta_1, \beta_2)$ and $\rho(\beta)$.
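Equation (3.5) is straightforward to evaluate numerically. The sketch below (Python; a plain midpoint rule is used instead of the Gaussian quadratures mentioned above, purely to keep the example self-contained):

```python
import math

def rho(beta, n=200):
    """Nearest neighbor correlation (3.5) of the isotropic zero-field
    Ising model in the large grid limit, via an n-by-n midpoint rule."""
    s, c = math.sinh(beta / 2), math.cosh(beta / 2)
    h = math.pi / n
    total = 0.0
    for i in range(n):
        w1 = (i + 0.5) * h
        for j in range(n):
            w2 = (j + 0.5) * h
            num = 2 * s * c - c * (math.cos(w1) + math.cos(w2))
            den = c * c - s * (math.cos(w1) + math.cos(w2))
            total += num / den
    return total * h * h / (2 * math.pi ** 2)
```

The result is 0 at $\beta = 0$ and increases toward 1 with increasing $\beta$, in agreement with figure 3.7.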
[Figure 3.6. Nearest horizontal neighbor correlation $\rho_1$ versus $\beta_1$ and $\beta_2$ (each 0 to 4) for the anisotropic model in the large grid limit. The lines in the ($\beta_1$, $\beta_2$) plane are isolines for the correlation.]
[Figure 3.7. Nearest neighbor correlation $\rho$ (vertical axis, 0 to 1) versus $\beta$ (horizontal axis, 0 to 4) for the isotropic model in the large grid limit (Pickard, 1987).]
The local properties of the model are determined by the conditional probabilities

$$P(X_i = x_i \mid x_w, x_e, x_n, x_s) = \frac{\exp(x_i[\alpha + \beta_1(x_w + x_e) + \beta_2(x_n + x_s)])}{1 + \exp(\alpha + \beta_1(x_w + x_e) + \beta_2(x_n + x_s))}$$

where $x_n$, $x_s$, $x_w$ and $x_e$ are the north, south, west and east neighbors of $x_i$. The parameters are easily interpreted in that $\alpha$ controls the number of 1-pixels, $\beta_1$ controls the number of horizontal 1-1-neighbors and $\beta_2$ controls the number of vertical 1-1-neighbors.
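These local characteristics immediately give a single-site sampler. A minimal sketch (Python, for illustration; free boundaries with missing neighbors treated as 0 is an assumption of this sketch, not a prescription from the thesis):

```python
import math, random

def p_one(alpha, b1, b2, xw, xe, xn, xs):
    """Conditional probability that a pixel is 1 given its four
    neighbors, from the local characteristic above."""
    t = alpha + b1 * (xw + xe) + b2 * (xn + xs)
    return math.exp(t) / (1.0 + math.exp(t))

def gibbs_sweep(x, alpha, b1, b2, rng):
    """One raster-scan Gibbs sampling sweep over a binary grid."""
    rows, cols = len(x), len(x[0])
    for i in range(rows):
        for j in range(cols):
            xw = x[i][j - 1] if j > 0 else 0
            xe = x[i][j + 1] if j < cols - 1 else 0
            xn = x[i - 1][j] if i > 0 else 0
            xs = x[i + 1][j] if i < rows - 1 else 0
            p = p_one(alpha, b1, b2, xw, xe, xn, xs)
            x[i][j] = 1 if rng.random() < p else 0
    return x
```

With all parameters zero the conditional probability is exactly 1/2; positive $\beta$'s pull a pixel toward the value of its 1-neighbors.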
Phase transitions

A phase transition (Kinderman & Snell, 1980; Pickard, 1987) occurs in a MRF when the locally specified interactions are high enough to develop into long-range correlations.

Onsager (1944) showed that the Ising model has a phase transition for

$$\sinh\frac{\beta_1}{2}\sinh\frac{\beta_2}{2} = 1.$$

Figure 3.8 shows the critical parameters in parameter space. For the isotropic model the critical value is $\beta_c = 2\sinh^{-1}(1) = 1.7627$. We talk about supercritical parameters if $\sinh\frac{\beta_1}{2}\sinh\frac{\beta_2}{2} > 1$ and subcritical parameters if $\sinh\frac{\beta_1}{2}\sinh\frac{\beta_2}{2} < 1$. From the figure we see that 1D Ising models do not have a phase transition. If we go to the supercritical limit in each of the four quadrants we get the deterministic patterns shown in figure 3.9. There are two such deterministic patterns in each quadrant, one being the pixelwise negation of the other. For the first quadrant we have a black configuration and a white configuration. In the third quadrant we have checkerboard
[Figure 3.8. Phase transition borders for an anisotropic zero-field Ising model, shown in the ($\beta_1$, $\beta_2$) plane with both axes running from -4 to 4.]
[Figure 3.9. Deterministic patterns for each of the four quadrants in the supercritical limit.]
[Figure 3.10. Nondeterministic pattern represented in each of the four quadrants. The four patterns can be generated from each other in a very simple way.]
and negated checkerboard. We can use this knowledge of the deterministic patterns to understand the relation between nondeterministic patterns in different quadrants. The value of every pixel in a nondeterministic pattern will correspond to the value of the same pixel in one of the two deterministic patterns, i.e. we can partition the image based on deterministic pattern membership. If we then replace pixels belonging to each deterministic pattern with the values of the corresponding deterministic pattern in another quadrant, the result is a transformation of the nondeterministic pattern to the other quadrant. Figure 3.10 shows a nondeterministic pattern represented in all of the four quadrants. The visual symmetry thus obtained elegantly matches the algebraic symmetry of parameter space.
[Figure 3.11. The expected fraction of 1-pixels (vertical axis, 0 to 1) as a function of $\beta$ (horizontal axis, 0 to 5) for an isotropic zero-field Ising model. The bifurcation point occurs for $\beta = 2\sinh^{-1}(1) = 1.7627$.]
An exact expression for the expected fraction of 1-pixels, $\gamma$, has been obtained for the zero-field isotropic Ising model in the large grid limit:

$$\gamma(\beta) = \begin{cases} \frac{1}{2} + \frac{1}{2}\Big(1 - \frac{1}{(\sinh\frac{\beta}{2})^4}\Big)^{1/8} & \text{if } \beta > \beta_c, \text{ white configurations} \\ \frac{1}{2} - \frac{1}{2}\Big(1 - \frac{1}{(\sinh\frac{\beta}{2})^4}\Big)^{1/8} & \text{if } \beta > \beta_c, \text{ black configurations} \\ \frac{1}{2} & \text{if } \beta \le \beta_c \end{cases}$$

This result originated in the work of Onsager (1944) and Yang (1952). In figure 3.11 we see $\gamma$ plotted versus $\beta$. The bifurcation occurring at $\beta_c$ means that zero-field configurations do not have 50% 1-pixels; rather, 50% of the configurations have almost 100% 1-pixels and the other 50% have almost 100% 0-pixels. The area between the two branches for supercritical $\beta$ represents configurations with very low probability for all values of $\beta$. In figure 3.12 is
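The piecewise expression for the expected fraction of 1-pixels is easy to evaluate; a small Python sketch for the white-configuration branch (illustration only):

```python
import math

BETA_C = 2 * math.asinh(1.0)   # critical value, about 1.7627

def gamma(beta):
    """Expected fraction of 1-pixels for the zero-field isotropic
    Ising model in the large grid limit (white-configuration branch)."""
    if beta <= BETA_C:
        return 0.5
    m = (1.0 - math.sinh(beta / 2) ** -4) ** 0.125
    return 0.5 + 0.5 * m
```

Below the critical value the fraction stays at 1/2; above it the branch rises steeply toward 1, reproducing the bifurcation of figure 3.11.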
[Figure 3.12. Simulations of isotropic first-order Ising models for $\beta$-values 0.00, 0.50, 1.00, 1.50, 1.70, 1.76, 1.80, 2.00 and 3.00.]

shown simulations of isotropic first-order Ising models for increasing $\beta$. The simulations are conditional on 50% of each phase. They are performed using 10000 iterations of the Metropolis spin-flip algorithm described in section 5.3. We see that long-range correlations occur around the critical $\beta$.
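As a rough illustration of the kind of sampler behind figure 3.12, here is an unconditional single-site Metropolis spin-flip step in Python (the variant used in the thesis, described in section 5.3, additionally conditions on 50% of each phase, which is not reproduced here; free boundaries and the zero-field constraint $\alpha = -2\beta$ are assumptions of this sketch):

```python
import math, random

def metropolis_flip(x, beta, i, j, rng):
    """One Metropolis spin-flip proposal at pixel (i, j) for the
    isotropic zero-field model (alpha = -2*beta), free boundaries."""
    rows, cols = len(x), len(x[0])
    nb = sum(x[i + di][j + dj]
             for di, dj in ((0, 1), (0, -1), (1, 0), (-1, 0))
             if 0 <= i + di < rows and 0 <= j + dj < cols)
    alpha = -2 * beta
    # energy change of flipping x[i][j] -> 1 - x[i][j]
    dU = -(alpha + beta * nb) * (1 - 2 * x[i][j])
    if dU <= 0 or rng.random() < math.exp(-dU):
        x[i][j] = 1 - x[i][j]
    return x
```

Proposals that lower the energy are always accepted; energy-raising proposals are accepted with probability $\exp(-\Delta U)$, with $T = 1$ absorbed into the parameters.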
3.4.2 Morphological extension

In section 3.4.1 only cliques with one or two pixels were considered. Markov random fields with this restriction are called pairwise interaction models or auto-models (Besag, 1974). The parameters of a pairwise interaction model will be able to control two very important sets of descriptive features: first-order statistics and second-order statistics. However, these features do not describe all the relevant aspects of a texture. For binary pairwise interaction models we know that we will always have the same structure for the black phase and the white phase, and this does not seem like a natural assumption for many practical purposes. Differences between the two phases can only be controlled using cliques with an odd number of pixels. Ripley (1988) summarized binary images through morphological operations. The study showed that a series of openings and closings made it possible to discriminate between images where the autocorrelation had little discriminatory power. General surveys on morphological operations can be found in Serra (1982, 1988) and Haralick, Sternberg, & Zhuang (1987). We adopt the notation of Haralick et al. (1987) in subsequent morphological expressions.
If we consider the anisotropic first-order model with energy function (3.2), and if we let $C_1(x)$ be the circumference of one of the phases measured by the total number of 0-1-transitions in the image, then (with $\beta_1 = \beta_2 = \beta$) we can express the energy function as

$$U(x) = -(\alpha + 2\beta)A(x) + \tfrac{1}{2}\beta C_1(x),$$

where $A(x) = \sum_{i=0}^{n-1} x_i$. Thus, for a fixed area $A(x)$, the energy is proportional to the circumference. This shows that a reformulation of a model can provide new insight. The energy function (3.2) can also be expressed using the morphological operator erosion ($\ominus$), as

$$U(x) = -\alpha A(x) - \beta_1 A(x \ominus B_1) - \beta_2 A(x \ominus B_2)$$

where $B_1$ and $B_2$ are the horizontal and vertical two-pixel structuring elements, and $x \ominus B$ means erosion of the 1-phase of $x$ with structuring element $B$.
We will now reformulate the binary Markov random fields on the basis of mathematical morphology. In general the energy function will have the form

$$U(x) = -\alpha A(x) - \sum_{i=1}^{f} \beta_i A(x \ominus B_i) \qquad (3.6)$$

where the structuring elements $\{B_i;\ i = 1, \ldots, f\}$ can be chosen arbitrarily. We shall then turn to the formulation of the conditional probabilities. Let $x_{i,k} = (x_0, x_1, \ldots, x_{i-1}, k, x_{i+1}, \ldots, x_{n-1})$; then

$$\frac{P(X_i = 1 \mid \text{rest})}{P(X_i = 0 \mid \text{rest})} = \frac{P(x_{i,1})}{P(x_{i,0})} = \exp(-U(x_{i,1}) + U(x_{i,0})) = \exp\Big(\alpha + \sum_{j=1}^{f} \beta_j \big[A(x_{i,1} \ominus B_j) - A(x_{i,0} \ominus B_j)\big]\Big) = \exp\Big(\alpha + \sum_{j=1}^{f} \beta_j\, n_{i,1}(B_j)\Big).$$

Figure 3.13. Isotropic and anisotropic structuring elements.
Thus when computing the conditional probabilities we consider the pixels overlapped by $B_j$ placed at pixel $i$. $n_{i,1}(B_j)$ is defined as the number of these pixels that are members of $x_{i,1} \ominus B_j$. The computation of the conditional probabilities is local, and this is a very important property for the model to be computationally feasible.

Two interesting structuring elements are shown in figure 3.13. The isotropic element can be used to model isotropic differences between the two phases, and the anisotropic element can be used to model anisotropic differences between the two phases. If we let $C_2(x)$ be the circumference of the 1-phase measured by the total number of 1-pixels with a neighboring 0-pixel, then the MRF defined through this measure is equivalent to a model with the isotropic structuring element.
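The erosion-based energy (3.6) is simple to evaluate on a toroidal grid. The following is an illustrative sketch, not the thesis implementation; structuring elements are represented as lists of (row, column) offsets:

```python
import numpy as np

def erode(x, offsets):
    """Binary erosion of the 1-phase of x by a structuring element
    given as (row, col) offsets; the grid is treated as toroidal."""
    out = np.ones_like(x)
    for dr, dc in offsets:
        # pixel (i, j) survives only if x is 1 at every offset position
        out &= np.roll(np.roll(x, -dr, axis=0), -dc, axis=1)
    return out

def energy(x, alpha, betas, elements):
    """Morphological energy (3.6):
    U(x) = -alpha*A(x) - sum_i beta_i * A(x erode B_i)."""
    u = -alpha * x.sum()
    for b, B in zip(betas, elements):
        u -= b * erode(x, B).sum()
    return u
```

With the two-pixel horizontal element B1 = [(0,0), (0,1)], the eroded area A(x ⊖ B1) counts the horizontal 1-1 neighbor pairs, which recovers the pairwise interaction term.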
Figures 3.14 to 3.27 show some examples of simulated samples from MRFs with the structuring elements of figure 3.13. The simulations were run on a 128×128 toroidal grid. All samples have approximately 50% black and 50% white pixels. The parameters $\beta_1$ and $\beta_2$ correspond to the structuring elements of figure 3.13. All the parameter sets, except the one used in figure 3.27, are supercritical. The supercritical samples shown are thus intermediate steps towards some relatively uninteresting steady-state pattern. 50 iterations (full sweeps) of the algorithm were used to create these figures. In all the examples we see a structural difference between the two phases. We have white dots in the black phase but no black dots in the white phase. The structural difference is also reflected in the larger structures. In some of the images there is visually no doubt that white objects are enclosed in a black phase. Such a difference between the two phases is simply not possible with binary pairwise interaction models.

Figure 3.14. Morphological MRF. $\beta_1 = 2.0$, $\beta_2 = 0.0$.
Figure 3.15. Morphological MRF. $\beta_1 = 4.0$, $\beta_2 = 0.0$.
Figure 3.16. Morphological MRF. $\beta_1 = 0.0$, $\beta_2 = 2.0$.
Figure 3.17. Morphological MRF. $\beta_1 = 2.0$, $\beta_2 = 2.0$.
Figure 3.18. Morphological MRF. $\beta_1 = 0.5$, $\beta_2 = 2.0$.
Figure 3.19. Morphological MRF. $\beta_1 = 2.0$, $\beta_2 = 0.5$.
Figure 3.20. Morphological MRF. $\beta_1 = 2.0$, $\beta_2 = 4.0$.
Figure 3.21. Morphological MRF. $\beta_1 = 4.0$, $\beta_2 = 2.0$.
Figure 3.22. Morphological MRF. $\beta_1 = 4.0$, $\beta_2 = 4.0$.
Figure 3.23. Morphological MRF. $\beta_1 = 2.0$, $\beta_2 = -1.0$.
Figure 3.24. Morphological MRF. $\beta_1 = 0.3$, $\beta_2 = 3.0$.
Figure 3.25. Morphological MRF. $\beta_1 = 4.0$, $\beta_2 = -1.0$.
Figure 3.26. Morphological MRF. $\beta_1 = 0.5$, $\beta_2 = 6.0$.
Figure 3.27. Morphological MRF. $\beta_1 = 0.5$, $\beta_2 = 0.5$.
Why formulate morphological MRFs?

The energy function (3.6) is only a reformulation of the energy function defined through cliques. To every structuring element there is a corresponding clique with the same shape. We propose this reformulation because it brings coherence between the statistical models and descriptive image analysis. It makes it more obvious when to use multi-spin cliques and which they should be. It provides us with the ever increasing toolbox of morphological image analysis as modelling tools. Other energy functions than (3.6), with a more intricate relation between structuring elements and cliques, may be formulated in simple morphological terms.
3.5 Potts models

The Potts model is a generalization of the Ising model to more than two unordered states (phases). It has been studied in statistical physics since it was introduced in Potts (1952). A tutorial review of the results of this research can be found in Wu (1982). A review of the Potts models in a statistical setting can be found in Besag (1986). We shall now review three examples of Potts models, of increasing complexity.
Let $q$ be the number of states and $\{1, 2, \ldots, q\}$ the corresponding pixel values. Further let

$$\delta(x_1, x_2, \ldots, x_k) = \begin{cases} 1 & \text{if } x_1 = x_2 = \cdots = x_k \\ 0 & \text{otherwise} \end{cases}$$

Then the standard nearest-neighbor Potts model is characterized by the joint distribution

$$P(x) = \frac{1}{Z} \exp\Big(\beta \sum_{i \sim j} \delta(x_i, x_j)\Big) \qquad (3.7)$$

where

$$Z = \sum_{x} \exp\Big(\beta \sum_{i \sim j} \delta(x_i, x_j)\Big).$$

This corresponds to nearest-neighbor cliques having the potential $-\beta$ if the two pixels belong to the same state and zero otherwise. For the conditional probabilities we get

$$P(X_i = k \mid x_j, j \in N_i) = \frac{\exp(\beta u_i(k))}{\sum_l \exp(\beta u_i(l))}$$

where $u_i(k)$ is the number of neighbors of pixel $i$ with value $k$. If this Potts model has two states it is equivalent to an isotropic zero-field Ising model, when $\beta$ from the Potts model is multiplied by two.
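The conditional probabilities above depend only on the four neighbor counts, which makes them cheap to compute. A minimal sketch (not from the thesis; toroidal grid and NumPy assumed):

```python
import numpy as np

def potts_conditional(x, i, j, q, beta):
    """P(X_ij = k | neighbors) for k = 1..q under the standard
    nearest-neighbor Potts model (3.7), on a toroidal grid."""
    rows, cols = x.shape
    nbrs = [x[(i - 1) % rows, j], x[(i + 1) % rows, j],
            x[i, (j - 1) % cols], x[i, (j + 1) % cols]]
    u = np.array([nbrs.count(k) for k in range(1, q + 1)])  # u_i(k)
    w = np.exp(beta * u)
    return w / w.sum()
```

For $\beta = 0$ the distribution is uniform over the $q$ states; for large $\beta$ it concentrates on the majority color among the neighbors.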
The Potts model above can be extended by allowing each state to have a specific structure and frequency of occurrence. This is easily done by introducing state-dependent parameters for neighbor-pairs, $\{\beta_k;\ k = 1, \ldots, q\}$, and for single pixels, $\{\alpha_k;\ k = 1, \ldots, q\}$, thus obtaining the conditional probabilities

$$P(X_i = k \mid x_j, j \in N_i) = \frac{\exp(\alpha_k + \beta_k u_i(k))}{\sum_m \exp(\alpha_m + \beta_m u_i(m))}. \qquad (3.8)$$

Some ordering between the states can be obtained by letting the parameters be specific for the colors of both neighbors, giving

$$P(X_i = k \mid x_j, j \in N_i) = \frac{\exp\big(\alpha_k - \sum_{l \neq k} \beta_{kl} u_i(l)\big)}{\sum_m \exp\big(\alpha_m - \sum_{l \neq m} \beta_{ml} u_i(l)\big)}. \qquad (3.9)$$
3.5.1 Phase transitions

For the $q$-state Potts model we have phase transitions like those described for the Ising model (Potts, 1952). The critical value of $\beta$, $\beta_c$, for the model (3.7) is

$$\beta_c = \log(1 + \sqrt{q})$$

and for the 2-state Potts model this gives

$$\beta_c = \log(1 + \sqrt{2}), \qquad \sinh \beta_c = 1.$$

Thus $\beta_c = 0.8814$.
3.5.2 Morphological extension

It is possible to include multi-spin cliques to incorporate morphological properties in the models. We generalize the notation from the last section by first defining a series of binary images, $\{x(k);\ k = 1, \ldots, q\}$, from the $q$-state image, $x$, i.e.

$$x(k)_i = \begin{cases} 1 & \text{if } x_i = k \\ 0 & \text{otherwise} \end{cases}$$

We can now introduce a morphological Potts model as

$$U(x) = \sum_{k=1}^{q} \Big[ -\alpha_k A(x(k)) - \sum_{i=1}^{f} \beta_{ik} A(x(k) \ominus B_{ik}) \Big]. \qquad (3.10)$$

Examples of this model and its application will be shown in section 5.4.
3.5.3 Other extensions

The literature of statistical physics (Wu, 1982) provides us with some other extensions of the Potts model.

• Site-diluted Potts model. This model includes vacancies on the grid. These vacancies can be chosen at random or in a deterministic way. Examples of a site-diluted Potts model and its application will be shown in section 5.4.

• Bond-diluted Potts model. In this model we allow neighbors with no interaction (or bond). The missing bonds can be chosen at random or in a deterministic way.

• Random-bond Potts model. In this model the potential of each bond is chosen independently from some probability distribution.

• "Spin-glass" Potts model. An extension of the binary spin-glass model. The potentials of the bonds are another random field (usually Gaussian).
3.6 Gaussian Markov random fields

The Gaussian Markov random field model is frequently used to describe continuous phenomena. The conditional density is given by the expression

$$p(x_i \mid x_j, j \in N_i) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Big\{-\frac{1}{2\sigma^2}\Big[x_i - \mu - \sum_{j \in N_i} \theta_j (x_j - \mu)\Big]^2\Big\}. \qquad (3.11)$$

This model is also called a conditional autoregressive (CAR) model. More detailed descriptions of this model can be found in Besag (1974), Ripley (1981) and Chellappa (1985).

To specify the joint distribution of the CAR model let $B$ be an $n \times n$ matrix with unit diagonal entries and off-diagonal elements $\{-\theta_{ij};\ i \neq j\}$, where $\theta_{ij} = 0$ unless $i$ and $j$ are neighbors. When $i$ and $j$ are neighbors, $\theta_{ij}$ equals the $\theta$ that corresponds to the relative positions of these two pixels. Thus if the model is defined on a toroidal grid, then $B$ will be block circulant with circulant blocks; see e.g. Chellappa (1985) or Dubes & Jain (1989). Obviously $B$ is symmetric. The joint distribution is then multivariate normal with mean vector $\mu$, dispersion matrix $\sigma^2 B^{-1}$ and density

$$f(x) = \frac{\sqrt{|B|}}{\sqrt{(2\pi\sigma^2)^n}} \exp\Big\{-\frac{1}{2\sigma^2}(x-\mu)^T B (x-\mu)\Big\}. \qquad (3.12)$$
For this model to be valid we have to require that $B$ is positive definite. The CAR models are related to the simultaneous autoregressive (SAR) models (Besag, 1974; Ripley, 1981; Kashyap & Chellappa, 1983). SAR models are extensions of the autoregressive models of time series analysis to two dimensions.
3.6.1 Alternative gray level distributions

The joint density in equation 3.12 corresponds to the energy function

$$U(x) = -\sum_{i \sim j} \frac{\theta_{ij}(x_i - \mu)(x_j - \mu)}{\sigma^2}. \qquad (3.13)$$

Besag (1989) presents an alternative class of joint distributions, where the energy function involves pairwise differences only. They are defined by

$$U(x) = \sum_{i \sim j} \phi(x_i - x_j) \qquad (3.14)$$

where $\phi$ is a function that satisfies

$$\phi(z) = \phi(-z), \qquad \phi(z) \text{ increasing with } |z|.$$

Joint distributions defined by equation 3.14 are improper in that they cannot be normalized (Besag, 1989). They do however have a perfectly proper conditional density

$$p(x_i \mid x_j, j \neq i) \propto \exp\Big\{-\sum_{j \in N_i} \phi(x_i - x_j)\Big\}.$$
It is possible to formulate morphological alternatives to the energy function (3.13) using the operators of gray level morphology (Sternberg, 1986; Haralick et al., 1987). Such models may turn out to be feasible and useful.
Chapter 4
Markov random field parameter estimation

For most practical applications of Markov random fields it is essential that we have accurate and feasible algorithms for parameter estimation. This chapter reviews a selection of estimation methods. Some of these methods are applied in chapter 5. An extension of the asymptotic maximum likelihood estimator (Pickard, 1987) to the anisotropic case is proposed in section 4.4.2.
4.1 Introduction

Maximum likelihood estimation of the MRF parameters is in general computationally intractable due to the likewise intractable partition function in the joint probability density. There are however, as we shall see, exceptions to this rule. But first we will describe some alternatives to ML-estimation.
4.2 Coding estimation

Besag (1974) introduced coding estimation as an alternative to ML-estimation. The grid is partitioned into a number of disjoint sets of pixels, called coding patterns. The codings are chosen such that the distributions of the pixel values within one coding pattern, conditional on the pixel values of the other coding patterns, are independent. This simply means that a pixel and its neighbor cannot be members of the same coding pattern. The number of coding patterns is kept as low as possible to obtain the most efficient estimator. Thus we get two coding patterns for a first-order MRF and four coding patterns for a second-order MRF. These coding patterns are shown in figure 4.1 and figure 4.2 respectively. Since the variables associated with pixels from one coding pattern are conditionally independent, given the observed values of all other pixels, we can express the conditional likelihood as

$$L_k = \prod_{i \in C_k} P(x_i \mid x_j, j \in N_i)$$

where $C_k$ is the set of pixels belonging to coding pattern $k$. We get one set of estimates for each coding pattern by maximizing the corresponding
1 2 1 2 1 2
2 1 2 1 2 1
1 2 1 2 1 2
2 1 2 1 2 1
1 2 1 2 1 2
2 1 2 1 2 1

Figure 4.1. Coding patterns for a first-order MRF. Pixels with the same number belong to the same coding pattern.
1 2 1 2 1 2
3 4 3 4 3 4
1 2 1 2 1 2
3 4 3 4 3 4
1 2 1 2 1 2
3 4 3 4 3 4

Figure 4.2. Coding patterns for a second-order MRF. Pixels with the same number belong to the same coding pattern.
likelihood function. These sets may then be combined appropriately, e.g. by computing the arithmetic or harmonic mean.
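The coding patterns of figures 4.1 and 4.2 can be generated directly from the pixel coordinates. A minimal sketch (not from the thesis):

```python
import numpy as np

def coding_masks(shape, order=1):
    """Boolean masks for the coding patterns of a first-order
    (two-pattern checkerboard, figure 4.1) or second-order
    (four-pattern 2x2 tiling, figure 4.2) MRF."""
    r, c = np.indices(shape)
    if order == 1:
        code = (r + c) % 2            # checkerboard: patterns 0 and 1
        n = 2
    else:
        code = 2 * (r % 2) + (c % 2)  # 2x2 tiling: patterns 0..3
        n = 4
    return [code == k for k in range(n)]
```

Within each mask no pixel neighbors another pixel of the same mask (for the given neighborhood order), which is exactly the conditional-independence property the coding method relies on.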
4.3 Pseudolikelihood estimation

Besag (1975) suggested using the product of conditional probabilities for all pixels as a pseudolikelihood function, i.e. parameter estimates are found by maximizing

$$PL = \prod_i P(x_i \mid x_j, j \in N_i).$$

This is obviously not a real likelihood function because the conditional probabilities are not independent. Geman & Graffigne (1987) showed however that this method produces consistent estimates in the large graph limit under mild conditions. The reason for using maximum pseudolikelihood estimation instead of coding estimation is to increase the efficiency. Maximum pseudolikelihood estimates compare favorably to coding estimates in Besag (1977), where Gaussian MRFs are considered. Besag (1977) also noted that for the first-order Gaussian Markov random field on a square grid the maximum pseudolikelihood estimator is equivalent to the harmonic mean of the two alternative coding estimators. In the subsequent technical description of estimators for specific models the coding method will give results similar to the pseudolikelihood method.
4.4 Binary MRF

4.4.1 Maximum pseudolikelihood

In this section we present the results for the binomial MRF because these are immediate extensions of the results for the binary MRF.

Let $\theta$ be the vector of MRF parameters and $s_i$ be the vector of the corresponding neighbor sums for pixel $i$, i.e. for an anisotropic first-order MRF we have

$$\theta = \begin{pmatrix} \alpha \\ \beta_1 \\ \beta_2 \end{pmatrix}, \qquad s_i = \begin{pmatrix} 1 \\ x_W + x_E \\ x_N + x_S \end{pmatrix}.$$

Further let

$$T_i = \theta^T s_i \qquad \text{and} \qquad p_i = \frac{\exp(T_i)}{1 + \exp(T_i)}.$$

Then we can express the conditional distributions of a binomial MRF as

$$(x_i \mid x_j, j \neq i) \in B(n, p_i).$$

For a binary MRF $n$ will be equal to one. For the binomial MRF the conditional probability of an observed pixel value given the rest of the observed image is

$$P(x_i \mid x_j, j \in N_i) = \binom{n}{x_i} p_i^{x_i} (1 - p_i)^{n - x_i}.$$
The resulting pseudolikelihood is

$$PL = \prod_i \binom{n}{x_i} p_i^{x_i} (1 - p_i)^{n - x_i}.$$

Thus we maximize

$$\log PL = \sum_i \Big[\log \binom{n}{x_i} + x_i T_i - n \log(1 + \exp(T_i))\Big].$$

The binomial coefficient does not depend on $\theta$; thus the maximum pseudolikelihood estimate of $\theta$ is found by maximizing

$$f(\theta) = \sum_i \big[x_i T_i - n \log(1 + \exp(T_i))\big] \qquad (4.1)$$

with respect to $\theta$. For this function we can find the gradient vector $\nabla f(\theta)$ and Hessian matrix $\nabla^2 f(\theta)$ as

$$\nabla f(\theta) = \sum_i [x_i - n p_i] s_i$$

$$\nabla^2 f(\theta) = -n \sum_i \frac{\exp(T_i)}{(1 + \exp(T_i))^2} s_i s_i^T.$$

The Hessian matrix is negative semi-definite, and the maximization problem is now easily solved by standard optimization procedures.

Dubes & Jain (1989) express the concern that when maximizing the function $f$ in (4.1) we may run into a local maximum. This would require that the optimization be repeated for several initial guesses. However, we have experienced that we obtain the same solution from several initial guesses, and that for simulated textures this solution corresponds to the parameters used in the simulation. The function $f$ seems to be well-behaved even for real textures. In figure 4.3 we show $f(\alpha, \beta)$ for an isotropic first-order MRF estimated on a binary grass lawn texture (Brodatz texture D9).
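Maximizing (4.1) for the binary case (n = 1) is a logistic-regression-type problem. The thesis uses standard optimization procedures; the sketch below uses plain gradient ascent for brevity, on a toroidal grid:

```python
import numpy as np

def neighbor_sums(x):
    """Stack s_i = (1, xW+xE, xN+xS) for every pixel (toroidal grid)."""
    horiz = np.roll(x, 1, axis=1) + np.roll(x, -1, axis=1)
    vert = np.roll(x, 1, axis=0) + np.roll(x, -1, axis=0)
    return np.stack([np.ones_like(x, dtype=float), horiz, vert], axis=-1)

def mpl_estimate(x, steps=200, lr=0.05):
    """Maximum pseudolikelihood for the anisotropic first-order binary
    MRF: gradient ascent on f(theta) of (4.1) with n = 1."""
    s = neighbor_sums(x).reshape(-1, 3)
    xv = x.reshape(-1).astype(float)
    theta = np.zeros(3)
    for _ in range(steps):
        t = s @ theta
        p = 1.0 / (1.0 + np.exp(-t))      # logistic p_i
        grad = s.T @ (xv - p)             # sum_i (x_i - p_i) s_i
        theta += lr * grad / len(xv)
    return theta                          # (alpha-hat, beta1-hat, beta2-hat)
```

Since the Hessian is negative semi-definite, a Newton step with $-\sum_i p_i(1-p_i) s_i s_i^T$ would converge in far fewer iterations; the surface is concave, so the starting point is not critical.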
Figure 4.3. Pseudolikelihood surface for binary grass. Maximum is reached for $\alpha = -1.27$ and $\beta = 0.64$. The lines in the $(\alpha, \beta)$ plane are isolines for $f$.
4.4.2 Asymptotic maximum likelihood

In the case of a zero-field first-order binary MRF we can use the results of Onsager (1944) for the large-grid limit to estimate $\beta$. The method was introduced by Pickard (1987) for the isotropic case. He used equation 3.5 and applied it to a finite grid. Thus in our notation he got the equation

$$\mathrm{Corr}(x_i, x_j \mid i \sim j) = \frac{2}{N} \sum_{i \sim j} x_i x_j - 1 = \gamma(\beta).$$

The equation can be solved numerically using e.g. Brent's method (see e.g. Press et al. (1988)). For grids larger than 100×100 Pickard showed that the finite-grid gamma-functions are nearly identical. The results can be extended to the anisotropic case, using equation 3.4 and the vertical analogue. We get the equations

$$\mathrm{Corr}(x_i, x_j \mid i \leftrightarrow j) = \frac{4}{N} \sum_{i \leftrightarrow j} x_i x_j - 1 = \gamma_1(\beta_1, \beta_2)$$

$$\mathrm{Corr}(x_i, x_j \mid i \updownarrow j) = \frac{4}{N} \sum_{i \updownarrow j} x_i x_j - 1 = \gamma_2(\beta_1, \beta_2)$$

Solving these two equations will provide us with estimates of $\beta_1$ and $\beta_2$. The solution can be found using a Newton-Raphson method (see e.g. Press et al. (1988)).
4.4.3 Other estimation methods

Derin & Elliot (1987) introduced an alternative estimation method that involves the solution of an overdetermined system of linear equations. This and other ad hoc methods are reviewed in Dubes & Jain (1989).
4.5 Potts model

In this section we consider the $q$-state Potts model with conditional probability defined in (3.8).

4.5.1 Maximum pseudolikelihood

The conditional probabilities are given by

$$p_i(k) = \frac{\exp(\alpha_k + \beta_k u_i(k))}{\sum_m \exp(\alpha_m + \beta_m u_i(m))}$$

where, as before, $u_i(k)$ is the number of neighbors of pixel $i$ with value $k$. Let $\theta$ be the vector of Markov parameters and $s_i(k)$ be the vector of the corresponding neighbor functions for pixel $i$ and color $k$, i.e.

$$\theta = \begin{pmatrix} \alpha_1 \\ \beta_1 \\ \vdots \\ \alpha_q \\ \beta_q \end{pmatrix}, \qquad s_i(k) = \begin{pmatrix} 1 \\ u_i(k) \end{pmatrix}.$$

The pseudolikelihood function

$$PL = \prod_i p_i(x_i)$$

is then maximized by maximizing

$$f(\theta) = \sum_i \Big[\alpha_{x_i} + \beta_{x_i} u_i(x_i) - \log \sum_m \exp(\alpha_m + \beta_m u_i(m))\Big].$$
The gradient vector is easily obtained as

$$\nabla f(\theta) = \begin{pmatrix} \sum_i [1_{x_i = 1} - p_i(1)]\, s_i(1) \\ \sum_i [1_{x_i = 2} - p_i(2)]\, s_i(2) \\ \vdots \\ \sum_i [1_{x_i = q} - p_i(q)]\, s_i(q) \end{pmatrix}$$

However, if a constant is added to every $\alpha_k$ we get exactly the same model. Thus one $\alpha_k$ can be chosen arbitrarily, and we then remove the corresponding equation above.
4.6 Gaussian MRF

This section describes two ways of estimating parameters of the Gaussian Markov random field model defined in section 3.6.

4.6.1 Maximum pseudolikelihood

From the conditional distribution given by equation (3.11) we find that the pseudolikelihood function is given by

$$PL = \prod_i \frac{1}{\sqrt{2\pi\sigma^2}} \exp\Big\{-\frac{1}{2\sigma^2}\Big(x_i - \mu - \sum_{j \in N_i} \theta_j x_j\Big)^2\Big\}.$$
Let $\theta$ be the vector of parameters and $s_i$ be the vector of the corresponding neighbor sums for pixel $i$, i.e.

$$\theta = \begin{pmatrix} \mu \\ \theta_1 \\ \theta_2 \\ \vdots \\ \theta_r \end{pmatrix}, \qquad s_i = \begin{pmatrix} 1 \\ x_W + x_E \\ x_N + x_S \\ \vdots \\ x_U + x_V \end{pmatrix}.$$

By setting the partial derivatives of the log-likelihood equal to zero we obtain

$$\hat{\theta} = \Big[\sum_i s_i s_i^T\Big]^{-1} \sum_i s_i x_i$$

and

$$\hat{\sigma}^2 = \frac{1}{N} \sum_i (x_i - \hat{\theta}^T s_i)^2 = \frac{1}{N}\Big(\sum_i x_i^2 - \hat{\theta}^T \sum_i s_i x_i\Big).$$

Thus the solution of the estimation problem is given in closed form.
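The closed-form estimator above is an ordinary least-squares fit of each pixel on its neighbor sums. A minimal sketch for the first-order case (not from the thesis; toroidal grid assumed):

```python
import numpy as np

def gmrf_mpl(x):
    """Closed-form maximum pseudolikelihood for a first-order Gaussian
    MRF: theta-hat = (sum s s^T)^{-1} sum s x, sigma2-hat = mean squared
    residual, with neighbor sums s_i = (1, xW+xE, xN+xS)."""
    s = np.stack([
        np.ones_like(x),
        np.roll(x, 1, axis=1) + np.roll(x, -1, axis=1),  # x_W + x_E
        np.roll(x, 1, axis=0) + np.roll(x, -1, axis=0),  # x_N + x_S
    ], axis=-1).reshape(-1, 3)
    xv = x.reshape(-1)
    theta = np.linalg.solve(s.T @ s, s.T @ xv)   # normal equations
    sigma2 = np.mean((xv - s @ theta) ** 2)
    return theta, sigma2
```

For a pure white-noise field the fitted neighbor coefficients are close to zero and the residual variance close to the noise variance, as expected.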
4.6.2 Maximum likelihood

The joint distribution given by equation (3.12) provides us with the likelihood function

$$L = \frac{\sqrt{|B|}}{\sqrt{(2\pi\sigma^2)^n}} \exp\Big\{-\frac{1}{2\sigma^2}(x-\mu)^T B (x-\mu)\Big\}. \qquad (4.2)$$

Let us assume (Besag, 1974) that $\mu = 0$ and that we have an estimate of $B$, $\hat{B}$. Then the ML-estimate of $\sigma^2$ will be

$$\hat{\sigma}^2 = \frac{1}{n} x^T \hat{B} x$$

and substituting this into equation 4.2 and taking the logarithm leads us to finding the ML-estimate by maximizing

$$\log|B| - n \log x^T B x.$$

We are now left with the numerical problem of evaluating this function and especially the determinant $|B|$. This has been tried in e.g. Besag & Moran (1975) and Kashyap & Chellappa (1983).
Chapter 5
Markov random field simulation

In this chapter we review a set of iterative simulation schemes for Markov random field simulation. We then present a fast new parallel algorithm for simulating Markov random fields conditional on given first-order statistics. Finally we investigate the use of this algorithm and a morphological Potts model in the simulation of geological structures.
5.1 Introduction

The problem of generating samples from a MRF distribution is important for a number of reasons. Obviously in image analysis we are concerned with the visual properties of the samples. In statistical physics it is more interesting to use the samples for computing expected values of thermodynamic quantities.

If we disregard the spatial nature of image data and consider the pixel values as identically and independently distributed, then the pixel value histogram will be a sufficient statistic for our random field. Simulating an image from the first-order statistics would only require sampling from a univariate distribution, which is relatively easy but rather uninteresting. Simulating a more general random field corresponds to sampling a multivariate distribution of very high dimension, and a selection of iterative simulation schemes has been developed (see e.g. Dubes & Jain (1989)).
5.2 Iterative simulation

The iterative process of MRF simulation has fruitfully been thought of as a discrete, finite-state Markov chain. The state-space $\Omega$ of this Markov chain is the set of all possible configurations, and the limiting distribution we want is the MRF distribution. From the theory of discrete, finite-state Markov chains we get the following definitions and results. Let $P = \{p_{ij};\ i, j \in \Omega\}$ be the matrix of transition probabilities, where $p_{ij}(t)$ denotes the probability of a transition from state $i$ to state $j$ in $t$ steps.

Definition 5. A Markov chain is irreducible or non-decomposable iff

$$\forall i, j \in \Omega\ \exists t: p_{ij}(t) > 0$$

Definition 6. A Markov chain is aperiodic iff

$$\exists t_0\ \forall t > t_0\ \forall i, j \in \Omega: p_{ij}(t) > 0$$

Lemma 1. An irreducible Markov chain is aperiodic if

$$\exists i \in \Omega: p_{ii} > 0$$

Proof. See Aarts & Korst (1989).

Definition 7. A probability distribution $\pi$ is invariant or stationary for a Markov chain with transition probabilities $\{p_{ij}\}$ iff the global balance equations are satisfied, that is

$$\forall j \in \Omega: \pi_j = \sum_i \pi_i p_{ij}$$

Theorem 2. For an irreducible and aperiodic Markov chain there exists a unique invariant distribution.

Proof. See e.g. Feller (1968).

Definition 8. A Markov chain is reversible or self-adjoint iff the detailed balance equations are satisfied, that is

$$\forall i, j \in \Omega: \pi_i p_{ij} = \pi_j p_{ji}$$

Lemma 2. For an irreducible and aperiodic Markov chain $\pi$ is the unique invariant distribution if it satisfies the detailed balance equations.

Proof. See e.g. Aarts & Korst (1989).
5.2.1 The Metropolis algorithm

Metropolis, Rosenbluth, Rosenbluth, Teller, & Teller (1953) described an algorithm for computer simulation of Gibbs distributed systems. This algorithm is now known as the Metropolis algorithm.

Algorithm 1. Metropolis algorithm. Let $Q$ be a symmetric irreducible transition matrix with state space $\Omega$.

1. Start with configuration $x \in \Omega$
2. Choose a new configuration $y$ from the distribution in the row corresponding to $x$ in $Q$
3. Replace $x$ by $y$ with probability $p = \min(1, P(X = y)/P(X = x))$
4. If not stop then go to 2

Notice that while the Metropolis algorithm will always make a change to a new configuration with higher probability, it will also with some probability make a change to a new configuration with lower probability.

It is trivial to show that the Metropolis algorithm defines an irreducible and aperiodic Markov chain. The detailed balance equations give for $i \neq j$

$$q_{ij}\, p_i \min(1, p_j/p_i) = q_{ji}\, p_j \min(1, p_i/p_j)$$

which for both $p_i \ge p_j$ and $p_i < p_j$ leads to

$$q_{ij} = q_{ji}.$$

This explains the symmetry condition on $Q$.
5.2.2 Spin-flip algorithms

In spin-flip algorithms single pixels are visited successively and their values are changed according to some criteria. Kirkland (1989) considered flipping 2x2 and 3x3 blocks of pixels, but the results were not encouraging. The two most popular flipping criteria provide the following algorithms.

Algorithm 2. Metropolis spin-flip algorithm. Let $Q$ be a symmetric irreducible transition matrix with state space $\{0, \ldots, G-1\}$, where $G$ is the number of gray levels.

1. Start with configuration $x$
2. Choose a pixel $s$ and a pixel value $g$ from the distribution in the row corresponding to $x_s$ in $Q$
3. Set configuration $y$ equal to $x$ with pixel $s$ set to $g$
4. Replace $x$ by $y$ with probability $p = \min(1, P(X = y)/P(X = x))$
5. If not stop then go to 2

Algorithm 3. Gibbs sampler or heat bath algorithm.

1. Start with configuration $x$
2. Choose a pixel $s$
3. Replace $x_s$ by a value sampled from the conditional distribution of $X_s$ given the values of the neighbors of $s$
4. If not stop then go to 2

In both algorithms we have to choose (visit) a pixel for each iteration. One way could be to choose a random pixel every time, but making a systematic sweep over the image is more efficient both in terms of rate of convergence and in terms of time per sweep. If we make sure that we continue to visit every pixel, then the order in which we sweep through the image does not matter. Using a simple raster sweep does ensure convergence but imposes an artificial anisotropy on the intermediate results, as seen (unintentionally?) in the isotropic simulations of figure 4 in Derin & Elliot (1987). To avoid the artificial anisotropy and to enable simultaneous updating of many pixels we divide the image into coding patterns as described in section 4.2. Pixels from each coding pattern do not interact with other pixels from the same coding pattern. We then sweep through the coding patterns, one at a time.
5.2.3 The Metropolis spin-exchange algorithm

The spin-exchange algorithm was introduced in image analysis by Cross & Jain (1983).

Algorithm 4. Metropolis spin-exchange algorithm.

1. Start with configuration $x$
2. Choose two pixels $r$ and $s$ at random
3. If $x_r = x_s$ then go to 2
4. Set $y$ equal to $x$ with pixels $r$ and $s$ switched
5. Replace $x$ by $y$ with probability $p = \min(1, P(X = y)/P(X = x))$
6. If not stop then go to 2

Instead of flipping single pixels this algorithm exchanges the values of two randomly chosen pixels. The step will maintain the pixel value histogram and thus the first-order statistics. As a stop-criterion Cross and Jain checked if the number of successful switchings dropped below 1% or the estimated parameters matched the input parameters within 5%. This resulted in a variety of interesting textures.

To elaborate on this algorithm for the isotropic Ising model we consider the case $x_r = 1$ and $x_s = 0$, and thus $y_r = 0$ and $y_s = 1$. The configurations $x$ and $y$ are identical except at pixels $r$ and $s$. The ratio $R$ is then computed as

$$R = \frac{P(Y = y)}{P(X = x)} = \frac{\frac{1}{Z(\alpha,\beta)} \exp\big(\alpha \sum_i y_i + \beta \sum_{i \sim j} y_i y_j\big)}{\frac{1}{Z(\alpha,\beta)} \exp\big(\alpha \sum_i x_i + \beta \sum_{i \sim j} x_i x_j\big)} = \exp\Big(\beta \sum_{i \sim j} [y_i y_j - x_i x_j]\Big) = \exp\big(\beta [W_s(y) - W_r(x)]\big)$$

where $W_k(z)$ is the number of 1-neighbors of $k$ in configuration $z$. Ripley (1987) discusses a problem in the exposition of Cross and Jain for the case where $r$ and $s$ are neighbors. This problem does not occur using the present exposition.

The fact that $\sum_i x_i = \sum_i y_i$ means that $\alpha$ is a redundant parameter. This seems quite natural since $\alpha$ is the parameter that controls the relative number of 1-pixels, and this number is kept constant by the spin-exchange algorithm.
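A single spin-exchange step following the exposition above can be sketched as follows (illustrative only; 0/1 pixels on a torus, and both phases are assumed present so the pixel search terminates):

```python
import math
import numpy as np

def ones_around(x, p):
    """Number of 1-valued 4-neighbors of pixel p on a torus."""
    i, j = p
    rows, cols = x.shape
    return int(x[(i - 1) % rows, j] + x[(i + 1) % rows, j]
               + x[i, (j - 1) % cols] + x[i, (j + 1) % cols])

def spin_exchange_step(x, beta, rng):
    """One Metropolis spin-exchange step for the isotropic Ising model,
    accepted with probability min(1, exp(beta*(W_s(y) - W_r(x))))."""
    rows, cols = x.shape
    while True:
        r = (int(rng.integers(rows)), int(rng.integers(cols)))
        s = (int(rng.integers(rows)), int(rng.integers(cols)))
        if x[r] != x[s]:
            break
    if x[s] == 1:                 # relabel so that x_r = 1 and x_s = 0
        r, s = s, r
    w_r = ones_around(x, r)       # W_r(x), computed in x
    x[r], x[s] = 0, 1             # tentative exchange -> configuration y
    w_s = ones_around(x, s)       # W_s(y), computed in y
    if rng.random() >= min(1.0, math.exp(beta * (w_s - w_r))):
        x[r], x[s] = 1, 0         # reject: restore x
    return x
```

Note that $W_s$ is evaluated in $y$, where $r$ has already been set to 0; this is what keeps the neighbor case consistent, as discussed above.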
5.2.4 Swendsen-Wang algorithm

A relatively new type of simulation algorithm involves flipping clusters. A cluster is a connected set of pixels with identical values. Swendsen & Wang (1987) described a cluster algorithm for the basic Potts model (3.7).

Algorithm 5. Swendsen-Wang algorithm.

1. Start with pixel configuration $x$
2. Create a bond configuration by introducing a bond between neighboring pixels with the same color with probability $p = 1 - \exp(-\beta)$
3. Find the clusters joined by bonds
4. Independently assign a random color to each cluster
5. If not stop then go to 2

Cluster algorithms are an active research area, and extended and new versions have appeared (e.g. Wolff (1989)).

Figure 5.1 shows 24 iterations of the Swendsen-Wang algorithm on a 5-state Potts model with $\beta = 2.0$. Convergence seems to be very fast. Figure 5.2 is a plot of the maximum pseudolikelihood estimate $\hat{\beta}$ as a function of the iteration number.
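One Swendsen-Wang iteration can be sketched with a union-find structure for step 3. This is an illustrative implementation, not the one used for the figures; toroidal grid assumed:

```python
import numpy as np

def swendsen_wang_step(x, q, beta, rng):
    """One Swendsen-Wang iteration for the q-state Potts model (3.7)."""
    rows, cols = x.shape
    n = rows * cols
    parent = np.arange(n)

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    p_bond = 1.0 - np.exp(-beta)
    flat = x.reshape(-1)
    for i in range(rows):                   # step 2: lay bonds
        for j in range(cols):
            a = i * cols + j
            for b in (i * cols + (j + 1) % cols, ((i + 1) % rows) * cols + j):
                if flat[a] == flat[b] and rng.random() < p_bond:
                    union(a, b)
    roots = np.array([find(a) for a in range(n)])        # step 3: clusters
    colors = rng.integers(1, q + 1, size=n)              # step 4: recolor
    return colors[roots].reshape(rows, cols)
```

Because whole clusters change color at once, the chain mixes much faster than single-pixel algorithms near the critical $\beta$, which is consistent with the fast convergence seen in figures 5.1 and 5.2.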
Figure 5.1. 24 iterations of the Swendsen-Wang algorithm on a 5-state Potts model with $\beta = 2.0$.
Figure 5.2. Maximum pseudolikelihood estimate $\hat{\beta}$ as a function of the iteration number. $\beta = 2.0$ for the simulation.
5.3 The α-controlled spin-flip algorithm

The Metropolis spin-exchange algorithm is the most widely used algorithm for simulating Markov random fields conditional on the first-order statistics. In this section we propose two spin-flip alternatives, based on the Gibbs sampler and the Metropolis algorithm, which include as a new feature a feedback loop to achieve the conditioning. The rate of convergence for large attraction parameters $\beta$ is compared to the rate of convergence of the Metropolis spin-exchange algorithm. The spin-flip algorithms turn out to be faster not only in time per sweep but also in rate of convergence. Further, the spin-flip algorithms are easy to parallelize, and this is done using a SIMD massively parallel computer.

5.3.1 Introduction

Simulating Markov random fields conditional on their first-order statistics has been very popular, since this can provide interesting textures for large $\beta$, seemingly avoiding the phase transition. The spin-exchange algorithm is however very slow for large $\beta$, and this is partly due to the fact that the intermediate configurations have fixed first-order statistics, so that the number of possible paths between two configurations is very limited.

In the spin-flip alternatives presented here we do not strictly maintain the first-order statistics, but stabilize these around a preset value through a feedback loop.
5.3.2 The feedback loop

The idea is to construct a spin-flip algorithm with almost constant first-order statistics. For an Ising model the first-order statistics are fully described by the mean $\mu$. Suppose we want $\mu$ to have the value $\mu_0$. This may be accomplished by a feedback loop such that $\alpha$ is adjusted after each iteration to keep $\mu(t)$ near $\mu_0$. In control theory (e.g. Åström & Wittenmark (1984)) the standard textbook PID-controller can be written as

$$\alpha(t) = K_p\, e(t) + K_i \frac{1}{1 - q^{-1}} e(t) + K_d (1 - q^{-1}) e(t)$$

where $e(t) = \mu_0 - \mu(t)$ is the error function and $q^{-1}$ is the backshift-operator, i.e. $q^{-1} e(t) = e(t-1)$. If we multiply with $(1 - q^{-1})$ on both sides we get

$$(1 - q^{-1})\alpha(t) = K_p (1 - q^{-1}) e(t) + K_i\, e(t) + K_d (1 - q^{-1})^2 e(t) \qquad (5.1)$$

and this is the form actually used here. The name, PID-controller, comes from the three controlling actions in expression 5.1.

• P - proportional action. The basic idea is to have a control action proportional to the error.

• I - integral action. This action is used to eliminate a stationary error in the mean value.

• D - derivative action. This action is used to increase the speed of the control system.

The P-, I- and D-actions are adjusted through $K_p$, $K_i$ and $K_d$ respectively. The joint PID-action and the dynamics of the spin-flip system determine if the control system is stable.
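The incremental form (5.1) can be sketched directly in code (an illustrative sketch, not the thesis implementation):

```python
class PIDController:
    """Incremental PID-controller in the form (5.1):
    alpha(t) = alpha(t-1) + Kp*(e(t) - e(t-1)) + Ki*e(t)
             + Kd*(e(t) - 2*e(t-1) + e(t-2))."""

    def __init__(self, kp, ki=0.0, kd=0.0, alpha0=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.alpha = alpha0
        self.e1 = self.e2 = 0.0          # e(t-1) and e(t-2)

    def update(self, mu_target, mu):
        """Measure the error e(t) = mu_0 - mu(t) and return alpha(t)."""
        e = mu_target - mu
        self.alpha += (self.kp * (e - self.e1) + self.ki * e
                       + self.kd * (e - 2.0 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, e
        return self.alpha
```

With $K_i = K_d = 0$ this reduces to the pure P-controller of figure 5.3, where $\alpha$ is directly proportional to the current error.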
Figure 5.3. P-controller. Multiplication is performed in the boxes labelled $K_p(1 - q^{-1})$ and $-1$.
A simple P-controller, with $K_i = K_d = 0$, is shown in figure 5.3, and this controller was investigated in the present work. Here $\alpha$ is directly proportional to the error function, $e(t)$. Extension to a PID-controller with integral and derivative actions is possible. This can be done without much computational effort, but one has to be more careful in choosing the right constants. To include knowledge about $x(t-1)$ in computing $\alpha(t)$ would be computationally harder and, as we shall see, completely unnecessary. The time step used in the control loop is selected as a trade-off between computational cost and performance. The time step used here corresponds to a full sweep of the spin-flip algorithm.

This control loop approach is generally applicable to iterative simulation schemes and is not confined to control of the mean value. Other properties of the configuration $x$, e.g. second-order statistics, can be measured and used to control the simulation parameters.
5.3.3 Relation to importance sampling

Green (1986) suggested that global properties (e.g. first-order statistics) could be integrated in a Markov random field $P(x)$ by considering the modified field

$$P^*(x) \propto e^{-\lambda D(x)} P(x)$$

where $D(x)$ is a non-negative random variable measuring the deviation from the ideal property, and $\lambda$ is a positive parameter. This modified field focuses $P(x)$ on realizations with the desired property. The parameter $\lambda$ determines the strength of the focusing. For the Ising model we might choose

$$D(x) = (n(x) - n_d)^2$$

where $n(x)$ is the actual number of 1-pixels in $x$, and $n_d$ is the desired number of 1-pixels. The conditional distribution of pixel $i$ in the modified field is then given by replacing $\alpha$ with $\alpha - 2\lambda(n(x_{x_i=0}) - n_d + \tfrac{1}{2})$. This result corresponds to a local P-controller with $K_p = 2\lambda$. We shall adopt the term importance sampling from Ripley (1992) for the sampling from $P^*(x)$.

There are three advantages in using the PID-controller instead of importance sampling. The $\alpha$-adjustment is done once per sweep instead of once per pixel. Importance sampling cannot be parallelized because the conditional distribution is based on global properties. Finally, we have to know the value of $\lambda$ when doing importance sampling, whereas the PID-controller will (hopefully) converge to the correct value from any starting guess.
5.3.4 Parallel implementation

Parallel implementations of MRF simulation schemes have been suggested and implemented several times in the past. Geman & Geman (1984) discussed a parallel implementation of the Gibbs sampler and suggested an asynchronous updating scheme based on an MIMD (multiple instruction multiple data) computer. Murray, Kashko, & Buxton (1986) implemented a parallel version of the Metropolis spin-flip algorithm using synchronous updating for every coding on an SIMD (single instruction multiple data) computer.

An approximation of the Metropolis spin-exchange algorithm was implemented on an SIMD computer by Margalit (1989) through a slave-master handshaking between the two chosen pixels. This procedure was run both with and without using a coding scheme. Besides the handshaking overhead, only a 40% degree of parallelism is accomplished.

The α-controlled spin-flip algorithm described here was implemented on a Connection Machine CM-200 from Thinking Machines using the parallel C compiler, C*. The basic shape of the parallel variables corresponds to a grid of coding elements. A coding element is a group of neighboring pixels, one from each coding. The coding element for a first-order MRF could be a pixel from coding 1 and its neighbor to the right. For a second-order MRF the coding element could be a 2x2 square with a pixel from coding 1 in its upper left corner. We associate a virtual processor with each coding element, and this processor performs the spin-flip operation successively on all the pixels in that coding element. If the number of coding elements is equal to a multiple of the number of physical processors, this scheme provides 100% use of the parallel computer.
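The coding idea can be illustrated in serial code (a minimal numpy sketch of a two-coding, i.e. checkerboard, sweep for a first-order Ising MRF using the Gibbs sampler; the thesis implementation is in C* on the CM-200 and uses coding elements, which are not reproduced here):

```python
import numpy as np

def coded_gibbs_sweep(x, alpha, beta, rng):
    """One sweep of a Gibbs sampler for a first-order (4-neighbour) Ising
    MRF on a toroidal grid. All pixels within one coding (checkerboard
    colour) are conditionally independent given the other coding, so each
    coding can be updated synchronously -- the property exploited by an
    SIMD implementation."""
    ii, jj = np.indices(x.shape)
    for parity in (0, 1):
        mask = (ii + jj) % 2 == parity
        nbr_sum = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
                   np.roll(x, 1, 1) + np.roll(x, -1, 1))
        p1 = 1.0 / (1.0 + np.exp(-(alpha + beta * nbr_sum)))
        draw = (rng.random(x.shape) < p1).astype(x.dtype)
        x[mask] = draw[mask]
    return x
```

The two codings here are the minimum for a first-order neighborhood; a second-order model would need four codings.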
5.3 The α-controlled spin-flip algorithm
5.3.5 Results

The simulations were performed on a toroidal 128 by 128 grid. One iteration corresponds to a full sweep through the image.

Mean-convergence

In order to make the mean value converge to the desired μ0 we have to choose an appropriate constant Kp. We started simulations with all black pixels and μ0 set to 0.5. Different values of Kp were chosen for both the Metropolis algorithm and the Gibbs sampler, and the results are shown in figure 5.4 and figure 5.5. β was 3.0 for all the curves. We can see that Kp = 6.0 seems like an appropriate choice for both algorithms. We also notice that ringing effects are more apparent for the Metropolis algorithm. This is due to the more frequent flipping.
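The per-sweep feedback behind this mean-convergence can be sketched as a P-controller nudging the external field; here the MRF sweep is replaced by a toy logistic response, so the constants and names are illustrative only:

```python
import math

def p_control_alpha(alpha, current_mean, mu0, Kp):
    """Per-sweep P-control update of the external field alpha:
    raise alpha when the fraction of 1-pixels is below the target mu0."""
    return alpha + Kp * (mu0 - current_mean)

# Toy closed loop: pretend the image mean responds instantly as a
# logistic function of alpha (a stand-in for a full MRF sweep).
alpha, mu0, Kp = 3.0, 0.5, 2.0
for _ in range(500):
    mean_now = 1.0 / (1.0 + math.exp(-alpha))
    alpha = p_control_alpha(alpha, mean_now, mu0, Kp)
```

In this toy loop alpha converges to 0, the value at which the logistic response equals mu0 = 0.5; too large a gain Kp produces the ringing noted above.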
β-convergence

The rate of convergence of the pseudo-likelihood β-estimate is shown in figure 5.6. The value of β is 3.0, and we can see that the spin-exchange algorithm converges more slowly than its spin-flip alternatives. To illustrate the visual convergence of the three algorithms we show typical configurations after 50, 100, 200, 400, 800, 2000, 4000 and 8000 iterations for β = 3.0. In figure 5.7 we see that the Metropolis spin-flip has converged to a stable pattern already after 200 iterations. The convergence of the Gibbs sampler is shown in figure 5.8, and a stable pattern is reached after 2000 iterations.
Figure 5.4. Mean-convergence for Metropolis spin-flip. Percentage of 1-pixels versus the number of iterations for β = 3.0. Curves are shown for Kp = 1.0, 2.0 and 6.0.
Figure 5.5. Mean-convergence for Gibbs sampler. Percentage of 1-pixels versus the number of iterations for β = 3.0. Curves are shown for Kp = 1.0, 2.0, 3.0, 4.0, 6.0 and 9.0.
Figure 5.6. Convergence in pseudo-likelihood β estimates for the Gibbs sampler, Metropolis spin-flip and Metropolis spin-exchange algorithms. (β̂ versus the number of iterations.)
For the spin-exchange, shown in figure 5.9, we have to wait 30000 iterations before the pattern stabilizes.

Figure 5.7. Convergence of Metropolis spin-flip algorithm. Configurations after 50, 100, 200, 400, 800, 2000, 4000 and 8000 iterations for β = 3.0.

Figure 5.8. Convergence of Gibbs sampler. Configurations after 50, 100, 200, 400, 800, 2000, 4000 and 8000 iterations for β = 3.0.

Figure 5.9. Convergence of Metropolis spin-exchange algorithm. Configurations after 50, 100, 200, 400, 800, 2000, 4000 and 8000 iterations for β = 3.0.

During these simulations we noticed that there were three different types of steady-state patterns. These are shown in figure 5.10. The first corresponds to the semi-steady state reported in Ripley & Kirkland (1990) for the unconditional simulation. The second pattern consists of one phase encapsulated in the other phase. The third pattern shows diagonal striping, and this relies on the toroidal structure of the grid. Simulating 300 samples using 20000 iterations of the Metropolis spin-flip resulted in 56% of the first pattern, 41% of the second pattern and 6% of the third pattern. We then tried to simulate the same model with free boundary conditions, i.e. the pixels on the boundaries simply have fewer neighbors than nonboundary pixels. This resulted in the two types of steady-state patterns shown in figure 5.11. At up to 100000 iterations the distribution was maintained at 80% of the left pattern and 20% of the right pattern.

Figure 5.10. Nature of steady-state patterns simulated on a toroidal grid.

Figure 5.11. Nature of steady-state patterns simulated with free boundary conditions.

These results also show that Cross & Jain (1983) never simulated to steady-state for supercritical β, and that the stop-criteria were more important in determining the visual properties of the simulated textures than the model parameters.
Timing

The parallel implementation of this algorithm on the CM-200 is on average 40 times faster than a sequential implementation on an HP Apollo 9000/750, which is marketed as the fastest workstation in the world at the moment. The time on the CM-200 was measured on an 8k-processor system with exclusive access.
5.3.6 Conclusion

The iterative simulation of Markov random fields conditional on the first-order statistics has been studied. Until now such simulations have been done using the Metropolis spin-exchange algorithm, which was made popular by Cross & Jain (1983). Presented here are two spin-flip alternatives that have several advantages. They are faster per sweep. The rate of convergence is higher, by several orders of magnitude for supercritical β, and they are easy to parallelize. The essential part for the conditioning is a simple feedback loop. It is straightforward to extend the use of such a feedback loop in iterative simulation schemes to conditioning on other image features.

Using implementations of these algorithms on an SIMD massively parallel computer we have shown that Cross and Jain did not simulate to steady-state for large β and that their realizations for large β depend heavily on the stop-criteria used. Statistics have been compiled on the nature of steady-state configurations both for simulations on a toroidal grid and for simulations with free boundary conditions.
5.4 Simulation of geological structures

We use the morphological Potts models defined in equation (3.10) and apply the α-controlled spin-flip algorithm from section 5.3 for the simulation of geological structures in an oil field.
5.4.1 Introduction

In the business of petroleum exploration and production it is of great importance to assess the properties of an oil field. Computer simulation studies are a powerful tool in this assessment. They are performed by simulating flow in simulated stochastic reservoirs. Thus the word simulation is used in two senses in the field of petroleum technology. It is used for the stochastic simulation of the spatial distribution of sedimentary facies and petrophysical properties as well as for the numerical simulation of flow in a medium. This case study is concerned with simulations in the first sense. When flow simulations are meant, this will be stated explicitly.

The simulation is based on a reservoir model. In this model we have to incorporate geological knowledge from similar structures as well as the geological knowledge obtained from well data. The information used in the design of a reservoir model is often referred to as soft data. When simulating the reservoir model the well data shall be fixed at the corresponding locations, thus honoring what is called hard data. A source of information that seems somewhat harder to incorporate is data from seismic studies.
When modelling the distribution of rock types we use a discrete coding of the lithology, whereas models for petrophysical properties like porosity and permeability may be more naturally based on continuous variables.

For literature on the subject of this case study the reader is referred to Ripley (1992), Dubrule (1989) and Haldorsen, Brand, & Macdonald (1988).
5.4.2 Model types

The two main groups of stochastic models used in reservoir simulation are object models (or Boolean models) and voxel models (or block models). We are primarily concerned with voxel models.

Voxel models

These models are based on a regular grid, and the distribution of voxel values is chosen to satisfy e.g. a certain variogram (correlogram) or a conditional probability distribution. The variogram and the conditional probability distribution may be inferred from hard data.

Three recent publications with different approaches are:

- Adler, Jacquin, & Quiblier (1990) simulated porous media based on the measured porosity and variogram.

- Farmer (1989) generated gray level numerical rocks by first computing the histogram, cooccurrence matrices and autocorrelation of a rock sample. Then a pattern with the same histogram is generated, and this pattern is used as a starting configuration for a spin-exchange simulated annealing procedure. Deviation from the sample cooccurrence matrices and autocorrelation is used as penalty in the energy function.

- Ripley (1992) simulates the distribution of rock types using a 3D Potts model conditional on the hard data points.

The approach in this study is similar to the approach in Ripley (1992). We include morphological properties in the model by using the morphological Potts models defined in equation (3.10), and we apply the α-controlled spin-flip algorithm from section 5.3 for the simulation.

Object models

An object model describes the distribution of rock bodies of random shape at random locations. The theory of point processes and random set models can be a very useful tool for specifying and simulating object models. From the viewpoint of texture analysis, object models correspond to the structural approach with primitives and placement rules.
5.4.3 A Markov random field reservoir model

We shall try to apply the morphological Potts model (3.10) for reservoir simulation on the gigascopic scale (Haldorsen et al., 1988). The model is intended for the description of the distribution of both rock types and discretized petrophysical properties. The goal is that the reservoir simulation scheme shall incorporate:

- Fixed first-order statistics.
- Anisotropy in the different facies.
- Spatial trends (instationarity).
- Hard data.
- Planar discontinuities (faults) as hard data.

The simulation is performed using the feedback loop described in section 5.3 to keep the first-order statistics fixed. The constants for the control actions have to be selected for each state. A PI-controller was used in the simulations below.

Anisotropy in the different facies is implemented through the use of the morphological Potts models.

Instationarity can be implemented by letting the model parameters vary across the field.

Hard data points are honored by simply not visiting them during the simulation, i.e. they will never change their value.

Discontinuities (faults) are introduced as hard data. This is simply done by considering discontinuities as a new phase, the discontinuity phase (the discontinuity phase can also be considered as vacancies in a site-diluted Potts model). The discontinuity phase is not considered in the flipping process. To be 100% effective the discontinuity phase has to be as wide as the longest distance between two neighbors in the MRF. To avoid the wrap-around that is due to the toroidal grid we can apply the discontinuity phase to the sides of the grid. Horizontal wrap-around may be desirable in many cases, whereas this is rarely true for vertical wrap-around. We would in this case apply the discontinuity phase to the top and/or bottom lines of the grid. An alternative to the discontinuity phase is to use a bond-diluted Potts model. In such a model we have no bonds (no interactions) across the discontinuity zone.
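Honoring hard data by not visiting fixed sites can be sketched for a plain q-state Potts model (our simplified sketch: a sequential Gibbs sweep without the morphological terms, the feedback loop, or the discontinuity phase):

```python
import numpy as np

def potts_sweep_with_hard_data(x, hard_mask, q, beta, rng):
    """One Gibbs sweep of a q-state Potts model on a toroidal grid.
    Sites where hard_mask is True are never visited, so hard data
    points keep their values throughout the simulation."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            if hard_mask[i, j]:
                continue  # hard data point: value stays fixed
            nbrs = (x[(i - 1) % h, j], x[(i + 1) % h, j],
                    x[i, (j - 1) % w], x[i, (j + 1) % w])
            # P(x_ij = s) ∝ exp(beta * number of neighbours in state s)
            ws = np.exp(beta * np.array([sum(n == s for n in nbrs)
                                         for s in range(q)], dtype=float))
            x[i, j] = rng.choice(q, p=ws / ws.sum())
    return x
```

A fault could be represented in the same way: mark the discontinuity sites in `hard_mask` and give them a state outside the flipping process.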
5.4.4 Simulation results

A reservoir simulation program, rocksample, has been implemented (see appendix A). We shall now show a few examples of simulations in 2D based on these models. The simulations were made on a 128x128 grid, where the pixels are rectangles with height 1 and length 4. The model used is a four-state morphological Potts model with the two structuring elements shown in figure 3.13. In figures 5.12 to 5.15 we show four simulation examples. 50 iterations of the Metropolis spin-flip algorithm were used. Simulations like these confirmed that first-order statistics and anisotropy of the different facies can be controlled.

Figures 5.16 and 5.17 illustrate the conditioning on hard data and faults as hard data. The simulations are conditional on hard data in vertical columns on both sides of the fault. Figure 5.16 shows two independent simulations, one with a fault and one without. In figure 5.17 the fault is introduced in the result of the simulation without the fault, and then simulation is done again. In this case the simulations look more similar.
Figure 5.12. Simulation result of four-state morphological Potts model.

Figure 5.13. Simulation result of four-state morphological Potts model.

Figure 5.14. Simulation result of four-state morphological Potts model.
Figure 5.15. Simulation result of four-state morphological Potts model.
Instationarity has not been implemented in the simulation program.

All the parameters used in the simulations were supercritical, i.e. the steady-state result would have only one color if we did not condition on the first-order statistics. Thus for these simulations to be useful in practice we have to find a suitable stop-criterion. The stop-criterion could be a global structural statistic, e.g. average cluster size or length.
5.4.5 Conclusion

The usefulness of Markov random fields in reservoir simulation depends on an efficient implementation of the simulation scheme. This is particularly true when simulating 3D structures. The α-controlled spin-flip algorithm implemented on a massively parallel computer has proved very efficient and would be an appropriate choice. The examples presented in this section have been simulated on a serial workstation in 2D, but they are easily extended to 3D.
Figure 5.16. Result of two simulations with identical parameters. The lower image has a fault as hard data. The simulations are conditional on hard data in vertical columns on both sides of the fault.
Figure 5.17. Result of two simulations with identical parameters. The lower image is made by introducing a fault in the upper image and then simulating again. The simulations are conditional on hard data in vertical columns on both sides of the fault.
Simulations based on the morphological Potts models suggest that it is possible to satisfy a set of criteria that are relevant to reservoir geologists. The number of simulations that we have computed so far is very limited, and much more research is needed to evaluate the potential of morphological Markov random fields (and Markov random fields in general) in reservoir simulation.
Chapter 6

Bayesian paradigm

The Bayesian paradigm is a framework for incorporating stochastic models of visual phenomena into a very general set of tasks from image processing and image analysis. Since the seminal paper of Geman & Geman (1984) there has been an increasing interest in this subject (Besag, 1986; Marroquin, Mitter, & Poggio, 1987; Geman & McClure, 1987; Ripley, 1988; Besag, 1989; Geman, Geman, Graffigne, & Dong, 1990). We give a short review of Bayesian image analysis and present an application that makes successful use of Markov random fields, the Metropolis algorithm and simulated annealing in a Bayesian framework.
6.1 Introduction

The Bayesian paradigm in image analysis can be described as follows:

1. We construct a prior probability distribution P(x) for the visual phenomena X that we want to make inferences about.

2. We then formulate an observation model P(y|x). This is the distribution of observed images Y given any particular realization x of the prior distribution.

3. The prior distribution and the observation model are combined into the posterior distribution P(x|y) by Bayes' theorem

   P(x|y) ∝ P(y|x) P(x).

   P(x|y) is the distribution of the visual phenomena X given the image y that we have observed.

4. Finally we make inferences about the visual phenomena based on the posterior distribution P(x|y).
6.2 Prior distribution

The generality of Bayesian image analysis lies in the variety of visual phenomena that we can model.

In image restoration we want to make inferences about the true undegraded image, represented by X, from a noisy observed image y. A Gaussian MRF could then be an appropriate prior distribution for X. Priors that model the joint gray level distribution are called pixel priors.

The goal of image classification is to assign a class or label to each pixel in an image y. E.g. in remote sensing we can assign land-use classes like forest, lake, road etc. to pixels in satellite images. The joint assignment of labels to all pixels is a labelling x. Priors P(x) for the labelling could be discrete Markov random fields such as binary MRFs and Potts models. Priors that model a labelling are called label priors.

If we want to make inferences about geometrical shapes, represented by X, in an image y, we are in the area of template matching. Template priors are models of geometrical relations in objects or between objects in an image. The application of an MRF template prior is illustrated in the case study of section 6.6.

For the Bayesian approach to be successful it is important that the prior density reflects our knowledge of the visual phenomena behind the observed images.
6.3 Observation model

The observation model P(y|x) is the distribution of observed images Y given any particular realization x of the prior distribution, i.e. it tells us how the visual phenomena that we want to make inferences about are actually observed. In image restoration x is typically considered observed after convolution with a blurring function h and addition of a noise image ε, as

Y = h * x + ε.

In image classification the observation model could be a texture and/or noise model for each class, e.g. a forest texture on forest labels, a lake texture on lake labels etc.

After having specified the prior model and the observation model we are ready to extract information from the posterior distribution.
6.4 Maximum a posteriori (MAP) estimates

The MAP estimate x̂ of x given an observed image y is defined by

x̂ = argmax_x P(x|y).

Thus MAP estimation involves maximization of a high-dimensional joint distribution, and this is usually connected with a considerable computational cost.
6.4.1 Simulated annealing

A simulated annealing scheme is a successive sampling from the density

P_T(x|y) ∝ [P(y|x) P(x)]^(1/T)    (6.1)

where the temperature T starts at an initial value T0 > 0 and then falls towards 0. If the temperature is lowered slowly enough, then (6.1) will assign unit probability to the MAP image in the limit (Geman & Geman, 1984). The sampling algorithm for simulated annealing can be e.g. the Gibbs sampler or the Metropolis spin-flip algorithm. A simulated annealing scheme will be used in section 6.6. The reader is referred to Aarts & Korst (1989) for details on the simulated annealing algorithm.
6.4.2 Iterated conditional modes (ICM)

The ICM algorithm consists of a number of sweeps over the image, where each pixel is visited and set to the mode of the conditional probability, i.e.

x̂_i = argmax_t P(y_i | X_i = t) P(X_i = t | x_j, j ≠ i).

The ICM algorithm usually converges in the order of 10 sweeps, which is generally much less than would be required for a simulated annealing scheme. An ICM scheme is on the other hand more likely to get trapped in a local maximum of the posterior density.
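A minimal ICM sweep for a Potts label prior could look like this (our sketch with hypothetical names; `logp_obs` holds the per-pixel log-likelihood of each label, and free boundaries are assumed):

```python
import numpy as np

def icm_sweep(labels, logp_obs, beta):
    """One ICM sweep for a Potts label prior: each pixel is set to the
    label t maximizing log P(y_i | X_i = t) + beta * (number of
    4-neighbours currently carrying label t). logp_obs has shape
    (h, w, q) with the per-pixel log-likelihood of each of the q labels."""
    h, w, q = logp_obs.shape
    for i in range(h):
        for j in range(w):
            best_t, best_v = labels[i, j], -np.inf
            for t in range(q):
                v = logp_obs[i, j, t]
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and labels[ni, nj] == t:
                        v += beta
                if v > best_v:
                    best_v, best_t = v, t
            labels[i, j] = best_t
    return labels
```

Because every update is a deterministic maximization, a handful of sweeps suffice, but the result depends on the initial labelling.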
6.5 Marginal posterior modes (MPM)

Marroquin et al. (1987) generated a series of samples from the (discrete) posterior distribution and, for each pixel, chose the mode of the marginal posterior distribution, i.e.

x*_i = argmax_{x_i} P(x_i | y).

The sampling algorithm for MPM can be e.g. the Gibbs sampler or the Metropolis spin-flip algorithm. An MPM scheme used with a label prior will minimize the expected misclassification error under the posterior distribution.
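Estimating the marginal modes from posterior samples can be sketched as (our helper name; the sampler producing the stack is not shown):

```python
import numpy as np

def mpm_estimate(samples, q):
    """Approximate the marginal posterior mode per pixel from a stack of
    posterior samples (shape (n, h, w), integer labels 0..q-1): choose,
    for each pixel, the label occurring most often across the samples."""
    counts = np.stack([(samples == t).sum(axis=0) for t in range(q)])
    return counts.argmax(axis=0)
```

The per-pixel frequency of a label across samples estimates P(x_i = t | y), so the most frequent label approximates argmax P(x_i | y).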
6.6 Hybridization filter analysis

An algorithm for automatic localization and classification of spots on a hybridization filter has been developed and implemented. The algorithm represents a successful application of a Markov template prior, the Metropolis algorithm and a simulated annealing scheme.
6.6.1 Background

The genome analysis lab at the Imperial Cancer Research Fund (ICRF) in London is working on the human genome project. This project involves a massive amount of hybridization experiments. The intention of the work presented here is to analyze hybridization filters automatically for the mapping of the human genome.

The filter is a square sheet of nylon with a side length of 23.2 cm. A robot places a 96x96 grid of spots on this filter, where each spot is a specific cosmid clone. A cosmid clone is a stretch of DNA, about 40000 bases long. When a radioactive DNA probe is applied to the filter, the probe will only bind (hybridize) to those cosmid clones that contain the same DNA sequence as the probe itself. The unbound probes are washed off, and spots containing clones hybridized to the probe appear darker than the other spots when an autoradiograph is taken of the filter. When a phosphor image is taken, the spots containing hybridized clones will appear lighter than the other spots.
6.6.2 Robot dynamics

The cosmid clones are placed on the filter by a robot. They are kept on microtiter dishes with an 8x12 grid of wells, thus the robot arm consists of an array of 8x12 pins. When the robot arm is dipped into a microtiter dish, a small quantity of each cosmid clone adheres to its corresponding pin. The arm is then moved to the filter where it applies the cosmid clones as an array of spots. After this the robot sterilizes the pins and moves on to the next dish. This is done 96 times for each filter, producing the 96x96 grid of spots. This grid is made up of 6 almost independent 32x48 subgrids as shown in figure 6.1. Each subgrid contains 4x4 interleaved 8x12 grids corresponding to the microtiter dish grid. The spacing between the wells of the microtiter dishes is 8 mm, thus the spacing between the spots is 2 mm.
6.6.3 Image analysis problem

The problem to be solved through the use of image analysis is to automatically detect which cosmid clones hybridize to the probe. This involves the correct assignment of each spot on the filter to a corresponding region in the image and classification of each spot as to the degree to which hybridization has occurred. Several circumstances complicate the solution of these problems. For the spot localization problem we have that

- The robot movements are imprecise.
- The membrane may physically warp.
- Some pins of the robot arm may be bent.
Figure 6.1. Arrangement of the 6 subgrids (each 32x48 spots). The full grid is a 96x96 spot array.
- Some spots are missing.
- Some spots may have merged.

For the spot classification problem we have that

- The background radiation level varies across the filter.
- Some spots may have merged.
- Some spots may have been misplaced.

We attempt to provide an effective set of image analysis tools that are robust under these circumstances. The spot localization problem is considered to be the most difficult and will be our main concern.
6.6.4 Digitization

The autoradiograph is digitized by a camera and a frame grabber. For the setup used now the result is an 8-bit 512x512 image. The gray-scale of these images is inverted to get white spots.

The phosphor imager scans with 88 μm per pixel. It is capable of scanning an area of up to 35 x 43 cm with 16-bit gray-scale resolution. The filter and its immediate surroundings are scanned, and the resulting image is subsampled to an 8-bit 1024x1024 image. This image is the starting point of the processing. The spacing between the spots is about 8.5 pixels, and this seems reasonable for our purpose. The examples shown in this thesis are phosphor images. In figure 6.2 we show an example of a raw image.
Figure 6.2. Raw image. This is a good quality phosphor image showing the full 96x96 spot array.
6.6.5 Preprocessing

The preprocessing serves four purposes:

1. Correction for rotation.
2. Finding the rectangular outline of the spot array.
3. Correction for background variations.
4. Spot equalization.

The success of the subsequent spot localization and spot classification depends highly on a successful implementation of these preprocessing steps. To illustrate the preprocessing, the spot localization, and the spot classification we will show the effect of each step on the image in figure 6.3. This is a phosphor image of a 32x48 subgrid.
Correction for rotation

The spot array is normally very well aligned with the pixel array in phosphor images, but autoradiographs will in general be rotated slightly. The rotation angle can be found by using the Hough transform (see e.g. Duda & Hart (1972)) and searching for the angle between e.g. -5 and +5 degrees with the highest variance over the profile in Hough space. The image can then be rotated back into alignment. In figure 6.4 we see the image from figure 6.3 after alignment.
Figure 6.3. Phosphor image of a 32x48 subgrid.
Figure 6.4. Aligned version of the image in figure 6.3. The rectangular outline is shown.
In particularly hard cases the four corners of the spot array can be pointed out manually. In this way we can align the image and obtain the rectangular outline of the spot array.
Finding the rectangular outline of the spot array

We can now assume strictly horizontal and vertical borders on the spot array. These borders are found by first computing the sum of each row (column), {s_i; i = 0,…,1023}. Then we compute the difference of lag 8, obtaining {d_i; i = 8,…,1023}, where d_i = s_i − s_{i−8}. Lag 8 is chosen because it is close to the distance between spot rows (columns). Finally we find the starting row (column) and ending row (column) as argmax_i d_i − 4 and argmin_i d_i + 4, where argmax_i d_i is the row (column) number with the maximum difference, and argmin_i d_i is the row (column) number with the minimum difference. Figure 6.4 shows the rectangular outline, where three of the sides were found by this method. The ending row had to be repositioned.
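The procedure for one direction can be sketched as (our function name; the manual repositioning fallback is not included):

```python
import numpy as np

def outline_1d(image, axis=0, lag=8, offset=4):
    """Start and end row (axis=0) or column (axis=1) of the spot array:
    profile sums s_i, lag differences d_i = s_i - s_{i-lag}, then
    start = argmax_i d_i - offset and end = argmin_i d_i + offset."""
    s = image.sum(axis=1 - axis)
    idx = np.arange(lag, len(s))
    d = s[lag:] - s[:-lag]
    return idx[np.argmax(d)] - offset, idx[np.argmin(d)] + offset
```

The lag difference peaks where the profile rises from background to spot rows, and is most negative where it falls back to background.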
Correction for background variations

The background varies over the images, and this will cause problems in the localization and classification process. The standard way of correcting for varying background is to subtract a lowpass filtered image from the original. As a lowpass filter we will choose a gray-scale opening. Using a flat 9x9 gray-scale opening will remove all the spots and leave the background, which is then subtracted from the image. This operation can be written as

R = I − O(I)

where I is the input image, O(I) is the opening of the input image, and R is the resulting image.

Spot equalization

To make the localization process easier we equalize the intensity of the spots, thus weighting the spots equally. This is done using a morphological equalization,

R = I / (D(I) − E(I))

where I is the input image, D(I) is the dilated input image, E(I) is the eroded input image, and R is the resulting image. Again a flat 9x9 structuring element is used. Basically the morphological equalization makes the local gray level range constant over the image. Figure 6.5 shows the effect of background correction and spot equalization on the image in figure 6.4.
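Both operations can be sketched with a brute-force flat min/max filter (numpy-only stand-ins for the gray-scale morphology used above; `eps` is our guard against division by a zero local range):

```python
import numpy as np

def flat_minmax_filter(img, size, mode):
    """Flat (size x size) gray-scale erosion ('min') or dilation ('max'),
    computed by brute force over shifted copies with edge replication."""
    r = size // 2
    p = np.pad(img, r, mode='edge')
    h, w = img.shape
    stack = np.stack([p[i:i + h, j:j + w]
                      for i in range(size) for j in range(size)])
    return stack.min(axis=0) if mode == 'min' else stack.max(axis=0)

def background_correct(img, size=9):
    """R = I - O(I): subtract the flat gray-scale opening (erosion
    followed by dilation), which removes the spots and leaves the
    background."""
    opened = flat_minmax_filter(flat_minmax_filter(img, size, 'min'),
                                size, 'max')
    return img - opened

def spot_equalize(img, size=9, eps=1e-9):
    """R = I / (D(I) - E(I)): divide by the local gray level range so
    that all spots get roughly equal weight."""
    d = flat_minmax_filter(img, size, 'max')
    e = flat_minmax_filter(img, size, 'min')
    return img / (d - e + eps)
```

In practice a library implementation of the morphology (e.g. gray-scale opening from an image processing toolbox) would replace the brute-force filter.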
6.6.6 Spot localization

The spot localization involves matching a 96x96 grid on the spots in the image. This grid should adapt globally to provide the absolute spot coordinates and locally to take into account all the small distortions in the grid.
Figure 6.5. Background corrected and spot equalized version of the image in figure 6.4.
Initial assignment

If we can obtain a good initial guess on the spot locations, then the subsequent processing will be faster. As the initial guess we cover the outline rectangle with a regular 96x96 grid.
Simulated annealing scheme

As a basis for improving the spot locations we use a Markov random field as a template prior for the regular grid structure of the spot array. The variables in this model do not represent a pixel value but the (x, y) image position of a spot. This variable is not defined on the pixel grid but on the spot grid. The prior is defined as

P(g) ∝ exp(−β0 Σ_{i∼j} d0²(i,j) − β1 Σ_{i∼j} (d1(i,j) − D)²)

where i and j represent spots and g = {(x_i, y_i); i = 1,…,n_s} contains the locations (x, y) of all the spots. n_s is the number of spots in the spot array. The neighborhood is the four nearest neighbors. d0(i,j) is the deviation in alignment of the spots i and j, and d1(i,j) − D is the deviation from the fixed grid distance, D, between neighbors. Figure 6.6 illustrates the meaning of d0 and d1.

Given the spot locations g we then specify an observation model for the observed image y as

P(y|g) ∝ exp(α Σ_i φ(i))

where the summation is over all spots i, and φ(i) is the sum of the gray levels in a 5x5 neighborhood around spot i.
Figure 6.6. Definition of the distance measures d0 and d1 for a horizontal neighbor-pair. D is the distance between neighbors on a perfect grid.
We can regard this setup as a structural texture model. The prior model represents the placement rules and the observation model represents the primitives.

The posterior distribution

P(g|y) ∝ P(y|g) P(g)

is obviously a new Markov random field, and reflects a trade-off between the regularity of the grid and the trust in the image data. The energy function of the posterior distribution is given as

U = β0 Σ_{i∼j} d0²(i,j) + β1 Σ_{i∼j} (d1(i,j) − D)² − α Σ_i φ(i).

In this energy function we can control the properties of the fitted grid. The faith in the data is controlled by α, since this parameter is the weight of the intensity of the spots. The regularity of the grid is controlled by β0 and β1. β0 determines the degree of linearity of the grid and β1 controls deviations from the fixed grid distance between neighboring spots.
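The energy can be sketched directly (our array layout and names; d1 is approximated here by the along-axis neighbour offset rather than the Euclidean distance, and the 5x5 windows are clipped at the image border):

```python
import numpy as np

def grid_energy(gx, gy, image, D, beta0, beta1, alpha):
    """Posterior energy U of a spot grid. gx, gy hold the (m, n) spot
    coordinates (column and row). For each horizontal/vertical neighbour
    pair, d0 is the off-axis deviation and d1 the along-axis offset;
    phi(i) is the summed gray level in a 5x5 window around spot i."""
    U = 0.0
    dx, dy = gx[:, 1:] - gx[:, :-1], gy[:, 1:] - gy[:, :-1]  # horizontal pairs
    U += beta0 * (dy ** 2).sum() + beta1 * ((dx - D) ** 2).sum()
    dx, dy = gx[1:, :] - gx[:-1, :], gy[1:, :] - gy[:-1, :]  # vertical pairs
    U += beta0 * (dx ** 2).sum() + beta1 * ((dy - D) ** 2).sum()
    for x, y in zip(gx.ravel().astype(int), gy.ravel().astype(int)):
        win = image[max(y - 2, 0):y + 3, max(x - 2, 0):x + 3]
        U -= alpha * win.sum()                               # data term
    return U
```

A perfectly regular grid on a zero image has U = 0; any misalignment raises the prior terms, while bright pixels under the spots lower the data term.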
This model has a problem for spots on the edges. If nothing is done, the three or four outer rows and columns will be dragged towards the center of the spot array because of the lack of spots pulling the other way. To eliminate this effect we define artificial spots around the edges of the spot array. The artificial spots are initially positioned just outside the outline of the spot array, and the only restriction on their movements is that they cannot cross this outline.

We can now apply a simulated annealing scheme and the Metropolis algorithm using this Markov random field. Every spot is visited and an attempt is made to change its position to a randomly selected nearest neighbor.

In figure 6.7 the location of every spot in figure 6.5 is marked by a dot. Figure 6.8 shows the spot locations in a close-up of the lower right corner of figure 6.7.
Robot grid control

The same 8x12 grid of robot pins is used 96 times on each filter. This grid can be regarded as fixed, and we can use that information to detect and correct misplaced spots. We first compute the mean of the relative positions of neighbors in the 8x12 grid. Then for each spot we compute the deviations from this mean for all four neighbors. The trimmed mean (min and max trimmed off) of these four deviations will give a grid deviation number for each spot. If the grid deviation number exceeds a specific value, the spot will be considered as misplaced. The misplaced spots can then be relocated using the relative position to the neighbor in the 8x12 grid with the lowest grid deviation.
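The grid deviation number can be sketched as (our names; neighbours off the grid edge are simply absent, and with fewer than three deviations no trimming is done):

```python
import numpy as np

def grid_deviation(gx, gy):
    """Grid deviation number per spot: mean relative neighbour offsets
    over the grid, per-spot deviations from that mean for the (up to)
    four neighbours, and the min/max-trimmed mean of those deviations."""
    m, n = gx.shape
    pos = np.stack([gx, gy], axis=-1).astype(float)
    right = (pos[:, 1:] - pos[:, :-1]).reshape(-1, 2).mean(axis=0)
    down = (pos[1:, :] - pos[:-1, :]).reshape(-1, 2).mean(axis=0)
    dev = np.empty((m, n))
    for i in range(m):
        for j in range(n):
            ds = []
            if j + 1 < n: ds.append(np.linalg.norm(pos[i, j+1] - pos[i, j] - right))
            if j > 0:     ds.append(np.linalg.norm(pos[i, j] - pos[i, j-1] - right))
            if i + 1 < m: ds.append(np.linalg.norm(pos[i+1, j] - pos[i, j] - down))
            if i > 0:     ds.append(np.linalg.norm(pos[i, j] - pos[i-1, j] - down))
            ds = sorted(ds)
            if len(ds) > 2:
                ds = ds[1:-1]  # trim min and max
            dev[i, j] = np.mean(ds)
    return dev
```

The trimming matters: a well-placed neighbour of a misplaced spot has only one bad pair, which the max-trim removes, so only the misplaced spot itself gets a large deviation number.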
Figure 6.7. Spot locations. The located spots of the image in figure 6.5 are marked with a dot.
Figure 6.8. Spot locations in a close-up of the lower right corner of the image in figure 6.7.
There are no misplaced spots in figure 6.7, but an example of the robot grid control procedure will be shown in section 6.6.8.
6.6.7 Spot classification

The spot classification is based on the mean gray level in the neighborhood of the spot location from the background corrected image. Thresholds are selected to classify each spot into one of three classes: positive (+), negative (-) or missing (x). Spot classifications of the located spots in figure 6.8 are shown in figure 6.9.

If there is any doubt whether a spot has been correctly located, it will be classified as missing.
6.6.8 Results

Figures 6.10 and 6.11 show close-ups of the localization result of the image in figure 6.2. They illustrate the robustness of the algorithm. In figure 6.10 there is a vertical gap down the middle of the image. This gap does not cause any problems in the localizations. In the center of figure 6.11 we see that two rows of spots merge and split up again. This is also interpreted correctly by the algorithm. In both figures we see that missing spots are located in a satisfactory way. To obtain these results we used the parameters:

- Prior model: β0 = β1 = 5.0
- Observation model: α = 0.2
- Starting temperature: T0 = 4.0
- Temperature scheme: T_n = T_{n−1} log(n+2)/log(n+3)
- Number of iterations: 100

Figure 6.9. Spot classifications of the located spots in figure 6.8. The classes are: positive (+), negative (-) or missing (x).

Figure 6.10. Close-ups of the localization result of the image in figure 6.2. There is a vertical gap down the middle of the image.
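The temperature scheme telescopes: assuming the recursion is first applied at n = 1, T_n = T0 · log 3 / log(n+3), a very slow logarithmic decay. A quick sketch:

```python
import math

def temperatures(T0, n_iter):
    """Annealing temperatures T_n = T_{n-1} * log(n+2)/log(n+3) for
    n = 1..n_iter. The product telescopes to T_n = T0 * log(3)/log(n+3)."""
    T, out = T0, []
    for n in range(1, n_iter + 1):
        T *= math.log(n + 2) / math.log(n + 3)
        out.append(T)
    return out
```

With T0 = 4.0 and 100 iterations the final temperature is still close to 1, so the scheme cools gently rather than freezing the configuration early.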
Untilnowwehaveonlyshowngoodqualityphosphorimages.In�gure6.12
weseeanoisyphosphorimage,wheretheregularspotpatternishardly
noticeableinlargeareasofthespotarray.Inthesimulatedannealingscheme
weusedthesameparametersasbeforeexceptthatweset�=0:1toputless
trustinthedataandmoretrustinthegridstructure.Thelocationsfound
onthefullgridaremarkedwithdotsandshownin�gure6.13.Aclose-up
ofthis�gureisshownin�gure6.14.Wecanseefromthe�gures,thatthe
182
Chapter6.Bayesianparadigm
Figure6.11.Close-upsofthelocalizationresultoftheimagein�gure6.2.
Inthecenterweseethattworowsofspotsmergeandsplitupagain.
6.6
Hybridization�lteranalysis
183
Figure6.12.Noisyphosphorimage.Theregularspotpatternishardly
noticeableinlargeareasofthespotarray.
184
Chapter6.Bayesianparadigm
Figure6.13.Spotlocationsfromtheimagein�gure6.12markedwithdots.
clearlyvisiblespotsarelocatedcorrectly.Evenforareaswherespotsare
hardlynoticeableweseethatalgorithmmakesareasonablechoice.
Figures 6.15 and 6.16 show the effect of the robot grid control. A group of 4x3 spots has been shifted to the left in figure 6.15. In this case the shift was due to a fast cooling. In figure 6.16 we see the misplaced spots pointed out by the robot grid control algorithm. We can now relocate the misplaced spots and run the localization algorithm again.

Figure 6.14. Close-up of the spot locations shown in figure 6.13.
Figure 6.15. Errors in the localization. A group of 4x3 spots has been shifted to the left.

Figure 6.16. Misplaced spots pointed out by the robot grid control algorithm.
6.6.9 Conclusion
We have presented an algorithm for automatic localization and classification of spots on a hybridization filter. The algorithm is based on a Markov template prior for the spot array, and the localization is obtained as a trade-off between this model and the observed data. The computation is based on a simulated annealing scheme. A set of operations was used to preprocess the images. These preprocessing steps helped significantly in making the simulated annealing scheme successful and computationally feasible. A postprocessing step that checks the localization has also been implemented.

The algorithm has been successfully applied to many hybridization filter images. It seems to be both effective and robust compared to previously tested automatic methods (unpublished). It seems to be able to classify spots much faster and in many cases more accurately than a manual operator.
Chapter 7. Conclusion
Texture is an important characteristic of visual phenomena, and many attempts have been made to capture the relevant textural properties in a set of texture features or as a texture model. We have contributed to these attempts by going through selected theory and practical applications.
7.1 Summary
For texture description we have based our studies on the first- and second-order statistics. We have shown that first-order statistics can provide valuable textural information if they are computed at several scales (resolutions). We found that a coarse-scale first-order statistic robustly measured enzymatic treatment effects on textile. This shows that it may be fruitful to consider the simplest features first when solving a texture description problem.
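The multiscale idea above can be sketched in a few lines of Python (this is not the thesis's histinfo program; the 2x2 block averaging used as the coarsening step is an illustrative assumption, standing in for whatever resolution pyramid one prefers):

```python
import numpy as np

def multiscale_first_order(img, n_scales=3):
    """Mean and standard deviation of an image at several scales.
    Coarsening is done by 2x2 block averaging."""
    stats = []
    x = np.asarray(img, dtype=float)
    for _ in range(n_scales):
        stats.append((x.mean(), x.std()))
        # crop to even dimensions, then average disjoint 2x2 blocks
        h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
        x = x[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return stats
```

Note that block averaging preserves the mean while damping fine-scale variation, so it is the scale-dependent statistics (such as the standard deviation) that carry the textural information.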
We have surveyed features based on gray level cooccurrence matrices. The effect of matching the gray level histogram to a specific distribution before computing the cooccurrence features has been studied. Classification results suggest that the frequently used histogram equalization reduces the discriminatory power of the features significantly for stochastic textures. A relatively neglected feature, the diagonal moment, turned out to be very important for discriminating textures after a Gaussian histogram match. This suggests that in general we lose important information when replacing the cooccurrence matrix with the gray level difference and gray level sum histograms. The combination of Gaussian matched textures and CART classification resulted in simple, easily interpretable and relatively accurate classifiers.
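To make the ingredients concrete, here is a small Python sketch of a right-neighbor gray level cooccurrence matrix and a diagonal moment computed from it. The |i − j| moment below is only one common variant of the diagonal moment, used here as an illustrative stand-in, not necessarily the exact definition used in the thesis:

```python
import numpy as np

def glcm_right(img, levels):
    """Normalized gray level cooccurrence matrix for the
    right-neighbor displacement (0, 1)."""
    img = np.asarray(img)
    m = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        m[a, b] += 1
    return m / m.sum()

def diagonal_moment(p):
    """First absolute moment about the matrix diagonal,
    sum_{i,j} |i - j| p(i, j)."""
    i, j = np.indices(p.shape)
    return float((np.abs(i - j) * p).sum())
```

Such a moment measures how much probability mass lies away from the diagonal, i.e. how often neighboring pixels differ, which is exactly the kind of information lost when only difference and sum histograms are kept.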
Markov random fields have been surveyed as texture models. Many important results about these models from the field of statistical physics are still fairly unknown in the field of image analysis. We have restated some of the results in a statistical setting. These results led us to an extension of the asymptotic maximum likelihood estimator of Pickard (1987).
Standard Markov random fields are based on pairwise interaction between pixels, thus failing to incorporate morphological properties. We suggest a reformulation of the discrete models, in which the operators of mathematical morphology replace the concept of cliques. The advantages of morphological Markov random fields are that morphological properties become more apparent and that we obtain a coherence between texture description and texture models. Illustrative simulations of morphological Markov random fields show that interesting visual phenomena can be created.
We have given a review of Markov random field parameter estimation and Markov random field simulation. A new, fast, parallel algorithm for simulation conditional on the first-order statistics has been developed and implemented on a massively parallel computer. The conditioning is maintained by a standard PID-controller. Long runs of this algorithm have given us information about steady-state patterns for the conditional models. The algorithm has also been used for simulations of the geometrical structure of oil reservoirs based on a morphological Markov random field model.
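The thesis's parallel C* implementation is not reproduced here, but the control idea can be sketched serially in Python: a Metropolis spin-flip sampler for a binary MRF in which a discrete PID controller adjusts an external field each sweep so that the fraction of 1-pixels (the first-order statistic) tracks a target. All names, gains and model details below are illustrative assumptions, not the thesis's actual parameters:

```python
import numpy as np

def pid_conditional_metropolis(n=24, beta=0.4, target=0.3, sweeps=120,
                               kp=1.0, ki=0.3, kd=0.0, seed=0):
    """Metropolis spin-flip sampling of a binary MRF with energy
    E(x) = -beta * sum_<ij> x_i x_j - h * sum_i x_i   (x_i in {0, 1}),
    where the external field h is set each sweep by a PID controller
    so that the mean of x stays close to `target`."""
    rng = np.random.default_rng(seed)
    x = (rng.random((n, n)) < target).astype(int)
    h, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(sweeps):
        for _ in range(n * n):
            i, j = int(rng.integers(n)), int(rng.integers(n))
            s = (x[(i + 1) % n, j] + x[(i - 1) % n, j]
                 + x[i, (j + 1) % n] + x[i, (j - 1) % n])
            d = 1 - 2 * x[i, j]            # change in x[i, j] if flipped
            dE = -(beta * s + h) * d       # energy difference of the flip
            if dE <= 0 or rng.random() < np.exp(-dE):
                x[i, j] += d
        err = target - x.mean()            # error in the controlled statistic
        integral += err
        h = kp * err + ki * integral + kd * (err - prev_err)
        prev_err = err
    return x
```

With the integral term active, the steady-state error in the controlled statistic is driven towards zero, which mirrors the role of the standard PID-controller described above; an exchange (spin-exchange) dynamics would instead conserve the statistic exactly.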
Markov random fields have been used successfully in a Bayesian setting to analyze hybridization filters automatically for the human genome project. A first-order Markov random field is used to model the geometrical structure of a spot array, and this model is then used as prior knowledge for the accurate localization of the single spots. The localization is done using a simulated annealing scheme.

An extensive collection of software has been developed during the course of this work. The main software developments are listed in appendix A.
7.2 A comment
Texture analysis has been studied extensively by many researchers over the last two decades. The standard reference for most of these studies has been the Brodatz textures. Although these textures can continue to provide insight about texture features, there are two points of criticism to such an approach. The Brodatz textures only represent an infinitesimal fraction of real world textures, and the Brodatz textures are very different. Even though the literature on texture analysis, based on Brodatz textures, is full of successes, there are still plenty of challenges for texture researchers in fields like industrial inspection, biological and medical imaging, remote sensing, geology etc.
Appendix A. Developed software
An extensive selection of software has been developed during the course of this work. The serial (nonparallel) programs were developed in C on HP workstations running HP-UX. The parallel programs were developed in C* on a Connection Machine CM-200 with a Sun-4 front end. Serial programs have the suffix .c and parallel programs have the suffix .cs.

Standard numerical algorithms were taken from Press et al. (1988). On the Connection Machine we used the supplied CMSSL library. For random number generation under HP-UX we used the well-known linear congruential algorithm with 48-bit integer arithmetic (drand48). On the Connection Machine we used a lagged-Fibonacci algorithm (Knuth, 1973) implemented in the CMSSL library.

Both serial and parallel programs have been made to work as modules of the pipe-oriented HIPS and HIPS-2 image processing software.
The following list contains the main software developments.
• Texture statistics
  1. histinfo
     Histinfo computes first-order statistics from the input image.
  2. glcm
     Glcm computes the gray level cooccurrence matrix and 15 features from this matrix.
  3. fhist
     Fhist takes a floating point input image, sorts all the pixels, and outputs a byte image with a specified histogram. The histogram can be uniform (equalization), Gaussian, or a beta-function.
• Markov random field estimation
  1. binest
     Binest computes coding estimates and maximum pseudolikelihood estimates from a binary input image. Models up to order five can be estimated. Isotropy/anisotropy can be controlled for each neighbor-distance. The χ² test statistic and the log-likelihood are computed for each coding.
  2. binomest
     Binomest computes the maximum pseudolikelihood estimates of a binomial Markov random field from the input image.
  3. asympest
     Asympest computes the asymptotic maximum likelihood estimate of a first-order binary Markov random field from the input image.
  4. pottsest
     Pottsest computes the maximum pseudolikelihood estimate of a Potts model from the input image.
  5. gaussest
     Gaussest computes the maximum pseudolikelihood estimate of a Gaussian Markov random field from the input image.
• Markov random field simulation
  1. binsamp
     Binsamp simulates binary Markov random fields using the Gibbs sampler or the Metropolis algorithm.
  2. pottssamp
     Pottssamp simulates Potts models using the Gibbs sampler or the Metropolis algorithm.
  3. morphsamp
     Morphsamp simulates morphological binary Markov random fields using the Metropolis algorithm.
  4. swendsen
     Swendsen simulates Potts models using the Swendsen-Wang algorithm.
  5. rocksamp
     Rocksamp simulates geological samples using a morphological Potts model and the μ-controlled spin-flip algorithm. Model parameters for each phase can be specified. Rocksamp is the program used in section 5.4.
  6. icrf
     Icrf is the package of hybridization analysis software used in section 6.6.
  7. bingen
     Bingen is a program in C* for Connection Machines. It simulates binary Markov random fields in 2D and 3D. The μ-controlled algorithm is implemented. The results can be monitored in "real time" in an X window.
• Other
  1. xshow
     Xshow is a program that displays HIPS images under X-windows and lets the user interact using HIPS programs.
  2. frarithmetic
     Frarithmetic is a program that can be executed with many names (all starting with "fr"). It does many kinds of arithmetic operations on a set of images.
  3. A HIPS implementation of the basic gray level morphological operations:
     – Erosion
     – Dilation
     – Opening
     – Closing
     – Morphological gradient
     – White tophat
     – Black tophat
     – Morphological equalization
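As an illustration of what these gray level operations compute (a Python sketch with a flat square structuring element and edge padding, not the thesis's HIPS modules):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def _windows(img, size):
    """All size x size neighborhoods of the image, with edge padding."""
    p = size // 2
    padded = np.pad(np.asarray(img, dtype=float), p, mode="edge")
    return sliding_window_view(padded, (size, size))

def erode(img, size=3):   return _windows(img, size).min(axis=(2, 3))
def dilate(img, size=3):  return _windows(img, size).max(axis=(2, 3))
def opening(img, size=3): return dilate(erode(img, size), size)
def closing(img, size=3): return erode(dilate(img, size), size)
def gradient(img, size=3): return dilate(img, size) - erode(img, size)
def white_tophat(img, size=3): return np.asarray(img, float) - opening(img, size)
def black_tophat(img, size=3): return closing(img, size) - np.asarray(img, float)
```

For flat structuring elements, gray level erosion and dilation reduce to moving minimum and maximum filters, which is what the sketch exploits; the compound operations are then the usual compositions.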
Appendix B. GLCM for all Brodatz textures
This appendix contains the right-neighbor GLCM for all the Brodatz textures (no histogram match). By observing the kind of structures these matrices can have, we may get a better idea of which features give the best summary.
Figure B.1. Right-neighbor GLCM for Brodatz textures D1 to D10 (by row).
Figure B.2. Right-neighbor GLCM for Brodatz textures D11 to D20 (by row).
Figure B.3. Right-neighbor GLCM for Brodatz textures D21 to D30 (by row).
Figure B.4. Right-neighbor GLCM for Brodatz textures D31 to D40 (by row).
Figure B.5. Right-neighbor GLCM for Brodatz textures D41 to D50 (by row).
Figure B.6. Right-neighbor GLCM for Brodatz textures D51 to D60 (by row).
Figure B.7. Right-neighbor GLCM for Brodatz textures D61 to D70 (by row).
Figure B.8. Right-neighbor GLCM for Brodatz textures D71 to D80 (by row).
Figure B.9. Right-neighbor GLCM for Brodatz textures D81 to D90 (by row).
Figure B.10. Right-neighbor GLCM for Brodatz textures D91 to D100 (by row).
Figure B.11. Right-neighbor GLCM for Brodatz textures D101 to D110 (by row).
Figure B.12. Right-neighbor GLCM for Brodatz textures D111 and D112 (by row).
References
Aarts, E. & Korst, J. (1989). Simulated Annealing and Boltzmann Machines. J. Wiley & Sons, New York. 272 pp.

Adler, P. M., Jacquin, C. G., & Quiblier, J. A. (1990). Flow in simulated porous media. Int. J. Multiphase Flow, 16(4), 691–712.

Åström, K. J. & Wittenmark, B. (1984). Computer controlled systems: Theory and design. Prentice-Hall International. 430 pp.

Berry, J. R. & Goutsias, J. (1991). A comparative study of matrix measures for maximum likelihood texture classification. IEEE Transactions on Systems, Man, and Cybernetics, 21(1), 252–261.

Besag, J. & Moran, P. A. P. (1975). On the estimation and testing of spatial interaction in gaussian lattice processes. Biometrika, 62(3), 555–562.

Besag, J. (1974). Spatial interaction and the statistical analysis of lattice systems. Journal of the Royal Statistical Society, Series B, 36, 192–236.

Besag, J. (1975). Statistical analysis of non-lattice data. The Statistician, 24, 179–195.

Besag, J. (1977). Efficiency of pseudolikelihood estimation for simple gaussian fields. Biometrika, 64(3), 616–618.

Besag, J. (1986). On the statistical analysis of dirty pictures. Journal of the Royal Statistical Society, Series B, 48(3), 259–302.

Besag, J. (1989). Towards Bayesian image analysis. Journal of Applied Statistics, 16(3), 395–407.

Bishop, Y. M. M., Fienberg, S. E., & Holland, P. W. (1975). Discrete Multivariate Analysis - Theory and Practice. MIT Press, Cambridge, Massachusetts. 557 pp.

Breiman, L., Friedman, J. H., Olshen, R. A., & Stone, C. J. (1984). Classification and regression trees. Wadsworth & Brooks/Cole advanced books & software, Monterey, California. 358 pp.

Brodatz, P. (1966). Textures - A Photographic Album for Artists and Designers. Dover, New York.

Brush, S. G. (1967). History of the Lenz-Ising model. Reviews of Modern Physics, 39(4), 883–893.

Burt, P. J. (1981). Fast filter transforms for image processing. Computer Graphics and Image Processing, 16, 20–51.

Chellappa, R. (1985). Two-dimensional discrete gaussian Markov random field models for image processing. In Kanal, L. & Rosenfeld, A. (Eds.), Progress in Pattern Recognition 2, pp. 79–112. North-Holland, Amsterdam.

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psych. Meas., 20, 37–46.

Conners, R. W. & Harlow, C. A. (1980). A theoretical comparison of texture algorithms. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2(3), 204–222.

Conners, R. W., Trivedi, M. M., & Harlow, C. A. (1984). Segmentation of a high-resolution urban scene using texture operators. Computer Vision, Graphics and Image Processing, 25, 273–310.

Cross, G. R. & Jain, A. K. (1983). Markov random field texture models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 5, 25–39.

Derin, H. & Elliot, H. (1987). Modelling and segmentation of noisy and textured images using Gibbs random fields. IEEE Transactions on Pattern Analysis and Machine Intelligence, 9, 39–55.

du Buf, J. M. H., Kardan, M., & Spann, M. (1990). Texture feature performance for image segmentation. Pattern Recognition, 23(3/4), 291–309.

Dubes, R. C. & Jain, A. K. (1989). Random field models in image analysis. Journal of Applied Statistics, 16(2), 131–164.

Dubrule, O. (1989). A review of stochastic models for petroleum reservoirs. In Armstrong, M. (Ed.), Geostatistics, Vol. 2, pp. 493–506. Kluwer Academic Publishers.

Duda, R. O. & Hart, P. E. (1972). Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM, 15(1), 11–15.

Farmer, C. L. (1989). The mathematical generation of reservoir geology. Paper presented at the Joint IMA/SPE European Conference on The Mathematics of Oil Recovery. Robinson College, Cambridge University, 25th-27th July.

Feller, W. (1968). An introduction to probability theory and its applications, Vol. 1. J. Wiley & Sons. 509 pp.

Figueiras-Vidal, A. R., Paez-Borrallo, J. M., & Garcia-Gomez, R. (1987). On using cooccurrence matrices to detect periodicities. IEEE Transactions on Acoustics, Speech, and Signal Processing, 35, 114–116.

Galloway, M. M. (1975). Texture analysis using gray level run lengths. Computer Graphics and Image Processing, 4, 172–179.

Geman, S. & Geman, D. (1984). Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6, 721–741.

Geman, S. & Graffigne, C. (1987). Markov random fields and their applications to computer vision. In Gleason, A. (Ed.), Proceedings of the International Congress of Mathematicians, pp. 1496–1517, Berkeley, California.

Geman, S. & McClure, D. E. (1987). Statistical methods for tomographic image reconstruction. Bulletin of the International Statistical Institute, 52, 5–21.

Geman, D., Geman, S., Graffigne, C., & Dong, P. (1990). Boundary detection by constrained optimization. IEEE Transactions on Pattern Analysis and Machine Intelligence, 12(7), 609–628.

Geman, D. (1990). Random fields and inverse problems in imaging. In Saint-Flour lectures 1988, Lecture Notes in Mathematics, pp. 113–193. Springer-Verlag.

Green, P. J. (1986). Contribution to the discussion of the paper by J. Besag. Journal of the Royal Statistical Society, Series B, 48(3), 284–285.

Haldorsen, H. H., Brand, P. J., & Macdonald, C. J. (1988). Review of the stochastic nature of reservoirs. In Edwards, S. & King, P. (Eds.), Mathematics in oil production, No. 18 in The Institute of Mathematics & its Applications conference series, pp. 109–209. Clarendon Press, Oxford.

Haralick, R. M., Shanmugam, K., & Dinstein, I. (1973). Textural features for image classification. IEEE Transactions on Systems, Man, and Cybernetics, 3(6), 610–621.

Haralick, R. M., Sternberg, S. R., & Zhuang, X. (1987). Image analysis using mathematical morphology. IEEE Transactions on Pattern Analysis and Machine Intelligence, 9(4), 532–550.

Haralick, R. M. (1979). Statistical and structural approaches to texture. Proceedings of the IEEE, 67(5), 786–804.

Hassner, M. & Sklansky, J. (1980). The use of Markov random fields as models of texture. Computer Graphics and Image Processing, 12, 357–370.

Ising, E. (1925). Beitrag zur Theorie des Ferromagnetismus. Zeitschrift für Physik, 31, 253–258.

Jaynes, E. T. (1957). Information theory and statistical mechanics. Physical Review, 106, 620–630.

Julesz, B. & Bergen, J. R. (1983). Textons, the fundamental elements in preattentive vision and perception of textures. Bell Syst. Tech. J., 62(6), 1619–1645.

Julesz, B. (1975). Experiments in the visual perception of texture. Scientific American, 232(4), 34–43.

Julesz, B. (1981). Textons, the elements of texture perception, and their interactions. Nature, 290, 91–97.

Kashyap, R. L. & Chellappa, R. (1983). Estimation and choice of neighbors in spatial interaction models of images. IEEE Transactions on Information Theory, 29(1), 60–72.

Kashyap, R. L., Chellappa, R., & Khotanzad, A. (1982). Texture classification using features derived from random field models. Pattern Recognition Letters, 1(1), 43–50.

Kinderman, R. & Snell, J. L. (1980). Markov Random Fields and their Applications. American Mathematical Society, Providence, Rhode Island. 142 pp.

Kirkland, M. (1989). Simulation methods for Markov random fields. Ph.D. thesis, University of Strathclyde, Glasgow. 189 pp.

Knuth, D. E. (1973). The Art of Computer Programming, Vol. 2: Seminumerical Algorithms. Addison-Wesley.

Laws, K. I. (1980). Textured Image Segmentation. USCIPI report #940, Signal and Image Processing Institute, University of Southern California, Los Angeles. 178 pp.

Linfoot, E. H. (1957). An informational measure of correlation. Information and Control, 1, 85–89.

Liu, S. & Jernigan, M. E. (1990). Texture analysis and discrimination in additive noise. Computer Vision, Graphics and Image Processing, 49, 52–67.

Margalit, A. (1989). A parallel algorithm to generate a Markov random field image on a SIMD hypercube machine. Pattern Recognition Letters, 9(4), 263–278.

Marroquin, J., Mitter, S., & Poggio, T. (1987). Probabilistic solution of ill-posed problems in computational vision. Journal of the American Statistical Association, 82, 76–89.

Metropolis, N., Rosenbluth, A. W., Rosenbluth, M. N., Teller, A. H., & Teller, E. (1953). Equation of state calculations by fast computing machines. J. Chem. Phys., 21, 1087–1092.

Murray, D. W., Kashko, A., & Buxton, H. (1986). A parallel approach to the picture restoration algorithm of Geman and Geman on an SIMD machine. Image and Vision Computing, 4(3), 133–142.

Onsager, L. (1944). Crystal statistics. I. A two-dimensional model with an order-disorder transition. Physical Review, 65, 117–149.

Parkkinen, J. & Oja, E. (1986). Cooccurrence matrices and subspace methods in texture analysis. In Proc. 8th ICPR, pp. 405–408, Paris, France.

Parkkinen, J., Selkäinaho, K., & Oja, E. (1990). Detecting texture periodicity from the cooccurrence matrix. Pattern Recognition Letters, 11(1), 43–50.

Pickard, D. K. (1987). Inference for discrete Markov fields: the simplest nontrivial case. Journal of the American Statistical Association, 82(397), 90–96.

Potts, R. B. (1952). Some generalized order-disorder transformations. Proc. Camb. Phil., 48, 106–109.

Press, W. H., Flannery, B. P., Teukolsky, S. A., & Vetterling, W. T. (1988). Numerical Recipes in C. Cambridge University Press. 735 pp.

Rao, A. R. (1990). A Taxonomy for Texture Description and Identification. Springer-Verlag. 197 pp.

Ripley, B. D. & Kirkland, M. D. (1990). Iterative simulation methods. J. Comp. Appl. Math., 31, 165–172.

Ripley, B. D. (1981). Spatial Statistics. J. Wiley & Sons. 252 pp.

Ripley, B. D. (1987). Stochastic simulation. J. Wiley & Sons. 237 pp.

Ripley, B. D. (1988). Statistical inference for spatial processes. Cambridge University Press. 148 pp.

Ripley, B. D. (1992). Stochastic models for the distribution of rock types in petroleum reservoirs. In Walden, A. & Guttorp, P. (Eds.), Statistics in the Environmental and Earth Sciences. Griffin.

Seneta, E. (1981). Non-negative Matrices and Markov Chains. Springer-Verlag, New York. 279 pp.

Serra, J. (1982). Image Analysis and Mathematical Morphology, Vol. 1. Academic Press, New York. 610 pp.

Serra, J. (Ed.). (1988). Image Analysis and Mathematical Morphology, Vol. 2. Academic Press, New York. 411 pp.

Siew, L. H., Hodgson, R. M., & Wood, E. J. (1988). Texture measures for carpet wear assessment. IEEE Transactions on Pattern Analysis and Machine Intelligence, 10(1), 92–105.

Sternberg, S. R. (1986). Grayscale morphology. Computer Vision, Graphics and Image Processing, 35, 333–355.

Sun, C. & Wee, W. G. (1983). Neighboring gray level dependence matrix for texture classification. Computer Vision, Graphics and Image Processing, 23, 341–352.

Swendsen, R. H. & Wang, J. (1987). Nonuniversal critical dynamics in Monte Carlo simulations. Physical Review Letters, 58(2), 86–88.

Tomita, F. & Tsuji, S. (1990). Computer Analysis of Visual Texture. Kluwer Academic Publishers. 173 pp.

Unser, M. (1986a). Local linear transforms for texture measurements. Signal Processing, 11, 61–79.

Unser, M. (1986b). Sum and difference histograms for texture classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 8(1), 118–125.

van Gool, L., Dewaele, P., & Oosterlinck, A. (1985). Texture analysis anno 1983. Computer Vision, Graphics and Image Processing, 29, 336–357.

Vickers, A. L. & Modestino, J. W. (1982). A maximum likelihood approach to texture classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 4(1), 61–68.

Weszka, J. S., Dyer, C. R., & Rosenfeld, A. (1976). A comparative study of texture measures for terrain classification. IEEE Transactions on Systems, Man, and Cybernetics, 6(4), 269–285.

Wolff, U. (1989). Collective Monte Carlo updating for spin systems. Physical Review Letters, 62, 361–364.

Wu, F. Y. (1982). The Potts model. Reviews of Modern Physics, 54(1), 235–268.

Yang, C. N. (1952). The spontaneous magnetization of the two-dimensional Ising model. Physical Review, 85, 808–816.

Zucker, S. W. & Terzopoulos, D. (1980). Finding structure in co-occurrence matrices for texture analysis. Computer Graphics and Image Processing, 12, 286–308.
Index

μ-controlled spin-flip algorithm, 134
asymptotic maximum likelihood, 118
autocorrelation, 13
background correction, 170
Bayesian paradigm, 157
CAR model, 108
CART, 61
classification, 61, 178
clique, 81
coding element, 138
coding estimation, 112
Connection Machine, 138
contingency table, 24
cross-validation, 62
diagonal moment, 16
distinctness, 38
entropy, 80
enzymatic treatment, 35
equalization
  histogram, 13
  morphological, 171
  spot, 171
erosion, 95
faults, 150
Fourier features, 33
Gaussian Markov random fields, 108
Gaussian match, 52
Gibbs random fields, 6, 75
GLCM, 13, 46, 52, 197
GLDH, 19
GLRLM, 27
GLSH, 20
gray level cooccurrence matrices, 13
gray level histogram, 10
grid, 76
Hammersley-Clifford theorem, 83
Haralick features, 23
histogram equalization, 13, 52
Hough transform, 167
Hunter coordinates, 35
hybridization filter, 162
ICM, 161
ICRF, 162
importance sampling, 137
inertia, 20
inverse difference moment, 20
Ising model, 83
iso-second-order conjecture, 2
iterative simulation, 124
label prior, 159
log-power spectrum, 34, 40
macrotexture, 2
MAP estimate, 160
Markov chain, 124
Markov random fields, 6, 75
Metropolis algorithm, 126
microtexture, 2
morphological Markov random field, 94, 106
MPM estimate, 161
multi-resolution, 12
neighborhood system, 80
NGLDM, 29
Novo Nordisk, 35
object models, 149
observation model, 159
phase transition, 89, 106
PID-controller, 135
pixel prior, 158
placement rule, 5
Potts models, 104
power spectrum, 34, 40
primitive, 5
pseudolikelihood estimation, 114
random number generation, 193
reservoir simulation, 147
rotation correction, 167
SAR model, 108
simulated annealing, 160, 162, 173
spin-exchange, 129
spin-flip, 127
structuring element, 95
Swendsen-Wang algorithm, 130
template prior, 159, 162
tessellation, 76
texton, 5
texture, 1
  Brodatz, 2
  deterministic, 2
  hierarchical, 2
  random, 2
texture analysis, 5
  statistical, 5
  structural, 5
texture elements, 5
transition matrix, 125
voxel models, 148