Image Classification: Redux
Lecture 7
Prepared by R. Lathrop 11/99
Updated 11/02
Readings:
ERDAS Field Guide 5th Ed. Ch 6:234-260
Supervised vs. Unsupervised Approaches
• Supervised - image analyst "supervises" the selection of spectral classes that represent patterns or land cover features that the analyst can recognize (a prior decision)
• Unsupervised - statistical "clustering" algorithms used to select spectral classes inherent to the data; more computer-automated (a posterior decision)
Supervised vs. Unsupervised Workflows

• Supervised: select training fields --> edit/evaluate signatures --> classify image --> evaluate classification
• Unsupervised: run clustering algorithm --> edit/evaluate signatures --> identify classes --> evaluate classification
Supervised vs. Unsupervised

[Figure: feature-space plot, Red vs. NIR reflectance axes]

• Supervised (prior decision): from information classes in the image to spectral classes in feature space
• Unsupervised (posterior decision): from spectral classes in feature space to information classes in the image
Training

• Training: the process of defining criteria by which spectral patterns are recognized
• Spectral signature: result of training that defines a training sample or cluster
  - parametric: based on statistical parameters that assume a normal distribution (e.g., mean, covariance matrix)
  - nonparametric: not based on statistics but on discrete objects (polygons) in feature space
Supervised Training Set Selection
• Objective - selecting a homogeneous (unimodal) area for each apparent spectral class
• Digitize polygons - high degree of user control; often results in overestimate of spectral class variability
• Seed pixel - region-growing technique to reduce within-class variability; the analyst sets a threshold of acceptable variance, total # of pixels, and adjacency criteria (horiz/vert, diagonal)
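A minimal sketch of the seed-pixel idea, assuming a simplified acceptance test (each band within a fixed deviation of the running region mean, standing in for a full variance threshold); the function name and interface are illustrative, not from any particular package:

```python
from collections import deque
import numpy as np

def seed_grow(image, seed, max_dev, diagonal=False):
    """Grow a training region from a seed pixel.

    A pixel joins the region if every band value lies within
    `max_dev` of the running region mean. `diagonal` toggles
    8- vs. 4-neighbor adjacency (the horiz/vert vs. diagonal
    criterion mentioned above).
    """
    rows, cols, _ = image.shape
    region = {seed}
    total = image[seed].astype(float)      # running per-band sum
    frontier = deque([seed])
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if diagonal:
        steps += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in steps:
            nb = (r + dr, c + dc)
            if not (0 <= nb[0] < rows and 0 <= nb[1] < cols) or nb in region:
                continue
            mean = total / len(region)
            if np.all(np.abs(image[nb] - mean) <= max_dev):
                region.add(nb)
                total = total + image[nb]
                frontier.append(nb)
    return region
```

Growing from a seed inside a homogeneous patch stops at the patch boundary, which is exactly how the technique limits within-class variability.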
Supervised Training Set Selection
Whether using the digitized polygon or seed pixel technique, the analyst should select multiple training sites to identify the many possible spectral classes in each information class of interest
Training Stage
• Training set ---> training vector
• Training vector for each spectral class represents a sample in n-dimensional measurement space, where n = # of bands

For a given spectral class j:

    Xj = [ X1 ]    X1 = mean DN band 1
         [ X2 ]    X2 = mean DN band 2
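In code the training vector is just the per-band mean of the training pixels; the DN values below are made up for illustration:

```python
import numpy as np

# Hypothetical 3-pixel training sample for one spectral class
# (rows = pixels, columns = bands).
samples = np.array([[52, 110],
                    [48, 114],
                    [50, 112]])

# Training vector Xj: mean DN in each band.
Xj = samples.mean(axis=0)   # per-band means: 50.0 and 112.0
```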
Classification Training Aids

• Goal: evaluate spectral class separability
• 1) Graphical plots of training data
  - histograms
  - coincident spectral plots
  - scatter plots
• 2) Statistical measures of separability
  - divergence
  - Mahalanobis distance
• 3) Training area classification
• 4) Quick alarm classification
  - parallelepiped
Training Aids
• Graphical portrayals of training data
  - histogram (check for normality)
  - ranges (coincident spectral plots)
  - scatter plots (2D or 3D)
• Statistical measures of separability: expressions of statistical distance that are sensitive to both mean and variance
  - divergence
  - Mahalanobis distance
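A minimal sketch of the Mahalanobis distance, the second separability measure listed above; unlike plain Euclidean distance it weights each direction by the class's variance and covariance:

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Statistical distance from pixel vector x to a class signature
    described by its mean vector and covariance matrix."""
    d = np.asarray(x, float) - np.asarray(mean, float)
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))
```

With an identity covariance matrix it reduces to the Euclidean distance; a class with large variance along a band "pulls in" pixels that are far away in that band.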
Training Aids
• Scatter plots: each training set sample constitutes an ellipse in feature space
• Provides 3 pieces of information
  - location of ellipse: mean vector
  - shape of ellipse: covariance
  - orientation of ellipse: slope & sign of covariance
• Need training vector and covariance matrix
Spectral Feature Space

[Figure: Red vs. NIR reflectance scatter plot with signature ellipses for water, grass, trees (conifer, broadleaf), a grass/trees mix, and impervious surface & bare soil]

Examine ellipses for gaps and overlaps. Overlapping ellipses are OK within information classes; the goal is to limit overlap between information classes.
Training Aids
• Training/test area classification: look for misclassification between information classes; training areas can be biased, so it is better to use independent test areas
• Quick alarm classification: on-screen evaluation of all pixels that fall within the training decision region (e.g., parallelepiped)
Classification Decision Process
• Decision Rule: mathematical algorithm that, using data contained in the signature, performs the actual sorting of pixels into discrete classes
• Parametric vs. nonparametric rules
Parallelepiped or box classifier
• Decision region defined by the rectangular area bounded by the highest and lowest DNs in each band; specified by range (min/max) or standard deviations
• Pro: takes variance into account; Con: lacks sensitivity to covariance
• Pro: computationally efficient, useful as a first pass
• Pro: nonparametric
• Con: decision regions may overlap; some pixels may remain unclassified
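A minimal sketch of the box decision rule, with made-up class names and band limits; returning every matching class makes the two cons above visible directly:

```python
import numpy as np

def box_classify(pixel, boxes):
    """Return every class whose per-band [lo, hi] box contains the pixel.

    boxes maps class name -> (lo, hi), each a per-band vector.
    An empty list means unclassified; more than one entry means
    the pixel fell in an overlap region.
    """
    pixel = np.asarray(pixel)
    hits = []
    for name, (lo, hi) in boxes.items():
        if np.all(pixel >= lo) and np.all(pixel <= hi):
            hits.append(name)
    return hits

# Hypothetical 2-band boxes (band order: red, NIR).
boxes = {"water": ([10, 5], [40, 30]),
         "soil":  ([35, 20], [90, 60])}
```

`box_classify([20, 20], boxes)` matches only "water"; `[38, 25]` lands in the overlap of both boxes; `[200, 200]` matches nothing and stays unclassified.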
Parallelepiped or Box Classifier

[Figure: Red vs. NIR reflectance feature space with rectangular class boxes]

Upper and lower limits of each box are set by either range (min/max) or # of standard deviations. Note the overlap in the Red but not the NIR band.
Parallelepipeds have "corners"

[Figure: Red vs. NIR reflectance plot of a signature ellipse centered at (μred, μnir) inside its parallelepiped boundary; a candidate pixel falls within a corner of the box but outside the ellipse. Adapted from ERDAS Field Guide]
Parallelepiped or Box Classifier: problems

[Figure: Red vs. NIR reflectance feature space with boxes for Soil 1-3, Water 1-2, and Veg 1-3, showing an overlap region, a misclassified pixel, and unclassified pixels. Adapted from Lillesand & Kiefer, 1994]
Minimum distance to means
• Compute the mean of each desired class, then classify unknown pixels into the class with the closest mean using simple Euclidean distance
• Con: insensitive to variance & covariance
• Pro: computationally efficient
• Pro: all pixels classified, can use thresholding to eliminate pixels far from means
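A minimal sketch of the minimum-distance rule, including the optional distance threshold mentioned above; class names and means are illustrative:

```python
import numpy as np

def min_distance_classify(pixel, means, threshold=None):
    """Assign pixel to the class with the nearest mean (Euclidean).

    With `threshold` set, pixels farther than that from every class
    mean are left unclassified (None), implementing the thresholding
    noted above.
    """
    best, best_d = None, np.inf
    for name, mean in means.items():
        d = np.linalg.norm(np.asarray(pixel, float) - np.asarray(mean, float))
        if d < best_d:
            best, best_d = name, d
    if threshold is not None and best_d > threshold:
        return None
    return best

# Hypothetical 2-band class means.
means = {"water": [10, 10], "veg": [100, 100]}
```

`min_distance_classify([20, 15], means)` returns "water"; with `threshold=50`, a distant pixel like `[200, 200]` stays unclassified.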
Minimum Distance to Means Classifier

[Figure: Red vs. NIR reflectance feature space with class means for Soil 1-3, Water 1-2, and Veg 1-3 and the resulting decision boundaries. Adapted from Lillesand & Kiefer, 1994]
Minimum Distance to Means Classifier: Euclidean Spectral Distance

[Figure: two points in (X, Y) spectral space at (92, 153) and (180, 85)]

Xd = 180 - 92 = 88
Yd = 85 - 153 = -68
Distance = sqrt(Xd^2 + Yd^2) = 111.2
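The slide's arithmetic, checked in a couple of lines:

```python
import math

xd = 180 - 92        # difference in band X:  88
yd = 85 - 153        # difference in band Y: -68
dist = math.hypot(xd, yd)   # sqrt(88**2 + (-68)**2)
print(round(dist, 1))       # 111.2
```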
Statistically-based classifiers
• Defines a probability density (statistical) surface
• Each pixel is evaluated for its statistical probability of belonging in each category, assigned to class with maximum probability
• The probability density function for each spectral class can be completely described by the mean vector and covariance matrix
Parametric Assumption: each spectral class exhibits a unimodal normal distribution

[Figure: histogram (# of pixels vs. digital number, 0-255) showing two overlapping normal curves, Class 1 and Class 2; a bimodal histogram indicates a mix of Class 1 & 2]
Spectral Classes as Probability Surfaces

[Figure: Red vs. NIR reflectance feature space] Ellipses defined by each class's mean and covariance create likelihood contours around each spectral class.
[Figure: Red vs. NIR reflectance feature space] Some classes may have large variance and greatly overlap other spectral classes; the approach is sensitive to large covariance values.
Maximum likelihood classifier
• Pro: potentially the most accurate classifier as it incorporates the most information (mean vector and COV matrix)
• Con: Parametric procedure that assumes the spectral classes are normally distributed
• Con: sensitive to large values in the covariance matrix
• Con: computationally intensive
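A minimal sketch of the maximum likelihood rule under the parametric assumption above, assuming equal prior probabilities for all classes; signatures and names are illustrative:

```python
import numpy as np

def max_likelihood_classify(pixel, signatures):
    """Assign pixel to the class with the highest Gaussian log-likelihood.

    Each signature is (mean vector, covariance matrix); the constant
    term of the normal density is dropped since it is the same for
    every class.
    """
    x = np.asarray(pixel, float)
    best, best_ll = None, -np.inf
    for name, (mean, cov) in signatures.items():
        cov = np.asarray(cov, float)
        d = x - np.asarray(mean, float)
        ll = -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.inv(cov) @ d)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Hypothetical signatures: (mean vector, covariance matrix) per class.
sigs = {"water": ([10, 10], np.eye(2)),
        "veg":   ([100, 100], np.eye(2))}
```

The `log(det(cov))` term is what makes the rule sensitive to large covariance values: a class with a sprawling ellipse is penalized for its spread but can still capture distant pixels.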
Hybrid Classification

• Can easily mix various classification algorithms in a multi-step process
• First pass: some nonparametric rule (feature space or parallelepiped) to handle the most obvious cases; pixels remaining unclassified or in overlap regions fall to the second pass
• Second pass: some parametric rule to handle the difficult cases; the training data can be derived from unsupervised or supervised techniques
GIS Rule-Based Approaches

• Unsupervised or supervised techniques to define spectral classes
• Use additional geospatial data sets to pre-stratify the image data set, to include as additional band data in the classification algorithm, or for post-processing
• Develop a set of Boolean rules or conditional statements, for example:

    if spectral class = conifer and soil = sand, then pitch pine
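The slide's example rule, written as a conditional; the class and soil names are just the illustration from the slide:

```python
def label_pixel(spectral_class, soil):
    """Post-classification Boolean rule combining the spectral class
    with an ancillary GIS soil layer."""
    if spectral_class == "conifer" and soil == "sand":
        return "pitch pine"
    return spectral_class
```

In practice a whole rule set like this refines spectral classes into information classes using layers (soils, elevation, zoning) the sensor cannot distinguish on its own.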
Region-based classification approaches
• As an alternative to “per-pixel” classification approaches, region-based approaches attempt to include the local spatial context
• Textural channels approach: inclusion of texture (local variance) as an additional channel in classification; con: tends to blur edges
• Region growing: image segmented into spectrally homogeneous, spatially contiguous regions first, then these regions are classified using a spectral classification approach; conceptually very promising, but robust operational algorithms are scarce
Object-oriented classification: eCognition example
To download free trial version, go to:
http://www.definiens-imaging.com/down/index.htm
From the eCognition website: "Image analysis with eCognition is based upon contiguous, homogeneous image regions which are generated by an initial image segmentation. Connecting all the regions, the image content is represented as a network of image objects. These image objects act as the building blocks for the subsequent image analysis. In comparison to pixels, image objects carry much more useful information. Thus, they can be characterised by far more properties than pure spectral or spectral-derivative information, such as their form, texture, neighbourhood or context."
Post-classification “smoothing”
• Most classifications have a problem with "salt and pepper", i.e., single or small groups of misclassified pixels, because they are "point" operations that treat each pixel independently of its neighbors
• Majority filtering: replaces central pixel with the majority class in a specified neighborhood (3 x 3 window); con: alters edges
• Eliminate: clumps “like” pixels and replaces clumps under size threshold with majority class in local neighborhood; pro: doesn’t alter edges
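A minimal sketch of majority filtering over a 3 x 3 window; edge handling by border replication is an assumption, as the slide does not specify it:

```python
import numpy as np
from collections import Counter

def majority_filter(classed, size=3):
    """Replace each pixel with the majority class in its size x size
    neighborhood (edge pixels use replicated borders)."""
    r = size // 2
    padded = np.pad(classed, r, mode="edge")
    out = classed.copy()
    rows, cols = classed.shape
    for i in range(rows):
        for j in range(cols):
            window = padded[i:i + size, j:j + size].ravel()
            out[i, j] = Counter(window).most_common(1)[0][0]
    return out
```

A lone misclassified pixel surrounded by another class is voted away, which removes salt-and-pepper noise; the con noted above is that thin features and edges get voted away too.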
Accuracy Assessment
• Various techniques assess the "accuracy" of the classified output by comparing the "true" identity of land cover derived from reference data (observed) vs. the classified (predicted) for a random sample of pixels
• Contingency table: m x m matrix, where m = # of land cover classes
  - Columns: usually represent the reference data
  - Rows: usually represent the remote sensing classification results
Accuracy Assessment Contingency Matrix (rows: classified data; columns: reference data)

| Class. Data | 1.10 | 1.20 | 1.40 | 1.60 | 2.00 | 2.10 | 2.40 | 2.50 | Row Total |
|-------------|------|------|------|------|------|------|------|------|-----------|
| 1.10        | 109  | 11   | 4    | 0    | 0    | 0    | 1    | 1    | 126       |
| 1.20        | 2    | 82   | 2    | 3    | 0    | 0    | 0    | 0    | 89        |
| 1.40        | 3    | 4    | 123  | 0    | 0    | 0    | 0    | 0    | 130       |
| 1.60        | 2    | 1    | 0    | 22   | 0    | 0    | 1    | 1    | 27        |
| 2.00        | 0    | 0    | 0    | 0    | 0    | 0    | 0    | 0    | 0         |
| 2.10        | 0    | 0    | 0    | 0    | 0    | 9    | 0    | 0    | 9         |
| 2.40        | 0    | 2    | 1    | 0    | 0    | 0    | 74   | 0    | 77        |
| 2.50        | 0    | 1    | 0    | 0    | 0    | 0    | 0    | 41   | 42        |
| Col Total   | 116  | 101  | 130  | 25   | 0    | 9    | 76   | 43   | 500       |
Accuracy Assessment

• Sampling approaches (to reduce analyst bias):
  - simple random sampling: every pixel has an equal chance
  - stratified random sampling: # of points stratified to the distribution of thematic layer classes (larger classes get more points)
  - equalized random sampling: each class gets an equal number of random points
• Sample size: at least 30 samples per land cover class
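A minimal sketch of stratified random sampling over a classified raster; the proportional-allocation rounding and the fixed random seed are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)   # fixed seed for a repeatable draw

def stratified_sample(classed, n_total):
    """Pick accuracy-assessment pixels per class in proportion to
    class area; returns {class: array of (row, col) locations}."""
    picks = {}
    for c, count in zip(*np.unique(classed, return_counts=True)):
        # proportional allocation, at least 1 sample per class
        k = max(1, int(round(n_total * count / classed.size)))
        locs = np.argwhere(classed == c)
        chosen = rng.choice(len(locs), size=min(k, len(locs)), replace=False)
        picks[int(c)] = locs[chosen]
    return picks
```

Swapping the allocation line for a constant per class would turn this into equalized random sampling; dropping the stratification entirely gives simple random sampling.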
Accuracy Assessment Issues
• What constitutes reference data?
  - higher spatial resolution imagery (with visual interpretation)
  - "ground truth": GPSed field plots
  - existing GIS maps
• Problem with "mixed" pixels: sampling only homogeneous regions (e.g., 3x3 windows) is possible but introduces a subtle bias
Errors of Omission vs. Commission
• Error of omission: pixels in class 1 erroneously assigned to class 2; from the class 1 perspective, these pixels should have been classified as class 1 but were omitted
• Error of commission: pixels in class 2 erroneously assigned to class 1; from the class 1 perspective, these pixels should not have been classified as class 1 but were included
Errors of Omission vs. Commission: from a Class 2 perspective

[Figure: histogram (# of pixels vs. digital number, 0-255) of overlapping Class 1 and Class 2 distributions. Commission error: pixels in Class 1 erroneously assigned to Class 2. Omission error: pixels in Class 2 erroneously assigned to Class 1]
Accuracy Assessment Measures

• Overall accuracy: total correct (sum of the major diagonal) divided by the total number of sampled pixels; can be misleading, so judge individual categories also
• Producer's accuracy: measure of omission error; the number correct in a category divided by the total # in that category as derived from the reference data
• User's accuracy: measure of commission error; the number correct in a category divided by the total # classified into that category
Accuracy Assessment Measures

• Kappa coefficient: measures the difference between the observed agreement of two maps and the agreement contributed by chance alone
• A Kappa coefficient of 0.90 may be interpreted as a classification 90% better than would be expected from random assignment of classes
• Allows statistical comparison between matrices (Z statistic); useful for comparing different classification approaches to objectively decide which gives the best results
Kappa coefficient
Khat = (n * SUM Xii - SUM (Xi+ * X+i)) / (n^2 - SUM (Xi+ * X+i))

where SUM = sum across all classes in the matrix
  Xii = diagonal element (correctly classified) for class i
  Xi+ = marginal row total (row i)
  X+i = marginal column total (column i)
  n = # of observations

Takes into account the off-diagonal elements of the contingency matrix (errors of omission and commission).
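The formula above translates directly into a few lines; the 2-class matrix in the usage note is invented for hand-checking:

```python
import numpy as np

def kappa(cm):
    """Khat from an m x m contingency matrix cm."""
    cm = np.asarray(cm, float)
    n = cm.sum()
    chance = (cm.sum(axis=1) * cm.sum(axis=0)).sum()  # SUM (Xi+ * X+i)
    return (n * np.trace(cm) - chance) / (n**2 - chance)
```

For `[[40, 10], [5, 45]]`: n = 100, diagonal sum = 85, chance term = 50*45 + 50*55 = 5000, so Khat = (8500 - 5000) / (10000 - 5000) = 0.7, well below the 0.85 overall accuracy because chance agreement is discounted.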
Accuracy Assessment Measures
| Code | Land Cover Description                | Number Correct | Producer's Accuracy | User's Accuracy | Kappa  |
|------|---------------------------------------|----------------|---------------------|-----------------|--------|
| 1.10 | Developed                             | 109            | 94.0                | 86.5            | 0.8243 |
| 1.20 | Cultivated/Grassland                  | 82             | 81.2                | 92.1            | 0.9014 |
| 1.40 | Forest/Scrub/Shrub                    | 123            | 94.6                | 94.6            | 0.9272 |
| 1.60 | Barren                                | 22             | 88.0                | 81.5            | 0.8051 |
| 2.00 | Unconsolidated Shore                  | ---            | ---                 | ---             | ---    |
| 2.10 | Estuarine Emergent Wetland            | 9              | 100.0               | 100.0           | **     |
| 2.40 | Palustrine Wetland: Emergent/Forested | 74             | 97.4                | 96.1            | 0.9541 |
| 2.50 | Water                                 | 41             | 95.4                | 97.6            | 0.9740 |
|      | Totals                                | 460            |                     |                 | 0.9005 |

** Sample size for this land cover class too small (< 25) for a valid Kappa measure.

Overall classification accuracy = 92.0%