page 1 of 34
Class Presentation of CSC481- Artificial Neural Networks
Tiegeng Ren, Dept. of Computer Science
04/19/2004
page 2 of 34
Outline
• Problems in classification systems
• Introduction to neural networks
• How to use neural networks
• Applications
• Summary
page 3 of 34
Classification Systems
(Diagram: an observation space, with feature values in the range 0~255, is mapped into a solution space of Category 1 … Category N. Two mapping methods are shown: a statistical classifier and an ANN-based classifier.)
page 4 of 34
Classification Process for Remote Sensing Image
(Diagram: the observation space is a Landsat TM image with seven bands, Band 1 … Band 7, each with pixel values 0~255; the solution space contains Category 1 … Category N, e.g. Water, Wetland, Forest, Agriculture, Urban, Residential. A sample seven-band pixel vector maps to the category Forest, its pattern. Two mapping methods: statistical classifier and ANN-based classifier.)
page 5 of 34
Supervised Remote Sensing Image Classification
(Scatter plot: training pixels plotted by Band 1 value (0 ~ 255) vs. Band 2 value (0 ~ 255), forming three clusters labeled Vegetation, Water, and Soil.)
page 6 of 34
Statistical Methods
- Require a Gaussian (normal) distribution on the input data, as assumed by the Bayesian classifier.
- Impose restrictions on the format of the input data.
(Scatter plot: the same Band 1 (0 ~ 255) vs. Band 2 (0 ~ 255) feature space, with decision boundaries drawn around the Vegetation, Water, and Soil clusters.)
page 7 of 34
Statistical Methods:
How do we find boundaries for the following patterns?
• Deciduous forest
• Conifer forest
• Forest wetland
• Mixed forest
• Brush land
• Non-forest wetland
page 8 of 34
Artificial Neural Network Approach
• No need for normal distribution on input data
• Flexibility on input data format
• Improved classification accuracy
• Robustness and reliability
page 9 of 34
Introduction to Neural Networks -- An Artificial Neural Network Is Defined by ...
• Processing elements
• Organized topological structure
• Training/learning algorithms
page 10 of 34
Processing Element (PE)
Artificial counterparts of neurons in a brain.
(Diagram: a processing element PE receives inputs through weighted connections Wj1 … Wj5, applies an activation function f(x), and produces an output.)
page 11 of 34
Function of Processing Elements
(Diagram: unit j receives outputs o1, o2, o3 from the previous layer through weights wj1, wj2, wj3 and produces its own output oj.)
• Receive the outputs of the PEs located in the previous layer.
• Compute the output with a sigmoid activation function: oj = f(Σi wji · oi)
• Transfer the output to all the PEs in the next layer.
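The bullet points above can be sketched in a few lines of code. A minimal sketch (Python; the function names are illustrative, not from the slides) of one processing element computing f(Σi wji · oi) with a sigmoid activation:

```python
import math

def sigmoid(s):
    # Sigmoid activation squashes any real number into (0, 1)
    return 1.0 / (1.0 + math.exp(-s))

def pe_output(inputs, weights):
    # Weighted sum of the outputs received from the previous layer
    s = sum(o * w for o, w in zip(inputs, weights))
    return sigmoid(s)

# A PE with three incoming connections
print(pe_output([0.5, 0.2, 0.9], [0.1, -0.4, 0.3]))
```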
page 12 of 34
Organized topological structures
page 13 of 34
ANN Structure - A Multi-layer Feedforward NN
(Diagram: input layer, hidden layer, output layer; the input vector (x1, x2, … xn) enters the input layer and the output vector (o1, o2, … om) leaves the output layer.)
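A multi-layer feedforward pass is just the PE computation applied layer by layer. A minimal sketch (Python; the layer sizes and weight values are illustrative assumptions):

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def layer_forward(inputs, weight_rows):
    # One output per PE: sigmoid of the weighted sum of all inputs
    return [sigmoid(sum(x * w for x, w in zip(inputs, row)))
            for row in weight_rows]

def feedforward(x, layers):
    # Push the input vector through each layer in turn
    for weight_rows in layers:
        x = layer_forward(x, weight_rows)
    return x

# A tiny 2-input, 3-hidden, 2-output network with illustrative weights
hidden = [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]]
output = [[0.2, -0.5, 0.7], [0.6, 0.1, -0.4]]
o = feedforward([0.9, 0.1], [hidden, output])
print(o)  # the output vector (o1, o2), each component in (0, 1)
```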
page 14 of 34
Training/Learning Algorithm - Backpropagation (BP)
(Diagram: the same input layer, hidden layer, output layer network; information is fed forward from the input vector (x1, x2, … xn) to the output vector (o1, o2, … om), and the analysis result is propagated backward.)
page 15 of 34
Training/Learning Algorithm - Back-propagation Mechanism
• Compute the total error over all training patterns p and output units n:
  E = (1/2) Σp Σn (tpn - opn)²
• Compute the partial derivatives ∂E/∂wij
• Update the weights and go to the next epoch:
  W(t+1) = W(t) + ΔW, with ΔW proportional to -∂E/∂wij
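The three steps above can be sketched for a single sigmoid unit (Python; the learning rate, epoch count, and toy data are illustrative assumptions). For E = ½ Σ (t - o)², the derivative with respect to each weight works out to -(t - o)·o·(1 - o)·xi, so each update moves the weight downhill:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def train_step(w, samples, eta=0.5):
    # One epoch of gradient descent on E = 1/2 * sum((t - o)^2)
    # for a single sigmoid unit with weights w.
    for x, t in samples:
        o = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        # dE/dw_i = -(t - o) * o * (1 - o) * x_i, so dw_i = -eta * dE/dw_i
        delta = eta * (t - o) * o * (1 - o)
        w = [wi + delta * xi for wi, xi in zip(w, x)]
    return w

def total_error(w, samples):
    return 0.5 * sum(
        (t - sigmoid(sum(wi * xi for wi, xi in zip(w, x)))) ** 2
        for x, t in samples)

# Toy targets on two inputs plus a bias input fixed at 1.0
data = [([0, 0, 1.0], 0), ([0, 1, 1.0], 1), ([1, 0, 1.0], 1), ([1, 1, 1.0], 1)]
w = [0.1, -0.1, 0.0]
for epoch in range(5000):
    w = train_step(w, data)
print(total_error(w, data))  # the error shrinks toward 0 as epochs pass
```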
page 16 of 34
Back-propagation Mechanism
(Plot: the error as a function of a weight Wij; each update Δw moves the weight downhill toward the global minimum.)
page 17 of 34
BP ANN Error Space and Weight Adaptation
(Plots: the total error decreasing over training steps, and the error surface over two weights Wij and Wkl.)
page 18 of 34
How to Use a Neural Network
• Analyze the problem domain
• ANN design
– What structure of ANN to choose
– What algorithm to use
– Input and output
• Training
• Applying the well-trained neural network to your problem
page 19 of 34
Pattern Recognition - Understand the Problem
(Scatter plot: Band 1 (0 ~ 255) vs. Band 2 (0 ~ 255), with the Vegetation, Water, and Soil clusters.)
Vegetation:
(10, 89) ------> (1,0,0)
(11, 70) ------> (1,0,0)
… … … (1,0,0)
Water:
(10, 21) ------> (0,1,0)
(15, 32) ------> (0,1,0)
… … … (0,1,0)
Soil:
(50, 40) ------> (0,0,1)
(52, 40) ------> (0,0,1)
… … … (0,0,1)
What ANN structure to choose?
– Multi-layer feed-forward
What ANN training algorithm?
– Back-propagation
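The pairs above map a two-band pixel to a one-hot class vector. As data, using the sample pixels from this slide (Python; the 0~1 scaling is an assumption, since the slides do not state the scaling used):

```python
# Each training pair: (band1, band2) pixel -> one-hot land cover target
# Targets: (1,0,0) = Vegetation, (0,1,0) = Water, (0,0,1) = Soil
training_pairs = [
    ((10, 89), (1, 0, 0)),  # vegetation
    ((11, 70), (1, 0, 0)),
    ((10, 21), (0, 1, 0)),  # water
    ((15, 32), (0, 1, 0)),
    ((50, 40), (0, 0, 1)),  # soil
    ((52, 40), (0, 0, 1)),
]

# Band values 0~255 scaled to 0~1 before training (an assumption)
scaled = [((b1 / 255.0, b2 / 255.0), t) for (b1, b2), t in training_pairs]
print(scaled[0])
```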
page 20 of 34
ANN Design
How many PEs do we need? Basic rules in designing an ANN:
• Input layer PEs - set by the dimension of the input vector
• Output layer PEs - set by the total number of patterns (classes)
(Diagram: input layer, hidden layer, output layer. A pattern of band values, Band 1 (0 ~ 255) and Band 2 (0 ~ 255), is fed forward; the three outputs, each in 0 ~ 1, correspond to Vegetation, Water, and Soil; the error is back-propagated.)
page 21 of 34
ANN Training - From Pattern to Land Cover Category
(Diagram: a training pattern is fed forward through the network and the error is back-propagated. For example, a vegetation pixel (10, 89) ------> (1,0,0): the inputs 10 and 89 should produce the outputs 1 (Vegetation), 0 (Water), 0 (Soil).)
The pattern-to-target pairs from the previous pattern-recognition slide (Vegetation ------> (1,0,0), Water ------> (0,1,0), Soil ------> (0,0,1)) are presented repeatedly until the network maps each pattern to its land cover category.
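A minimal end-to-end sketch of this training loop (Python; the 2-4-3 structure, random initialization, learning rate, epoch count, and 0~1 input scaling are illustrative assumptions, not the presenter's exact setup). It trains a one-hidden-layer network with plain back-propagation on the six sample pixels:

```python
import math, random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

DATA = [  # (band1, band2) scaled to 0~1 -> one-hot (Vegetation, Water, Soil)
    ((10/255, 89/255), (1, 0, 0)), ((11/255, 70/255), (1, 0, 0)),
    ((10/255, 21/255), (0, 1, 0)), ((15/255, 32/255), (0, 1, 0)),
    ((50/255, 40/255), (0, 0, 1)), ((52/255, 40/255), (0, 0, 1)),
]

random.seed(1)
N_IN, N_HID, N_OUT = 2, 4, 3
# weight_rows[j][i] connects input i (plus a trailing bias) to unit j
w_hid = [[random.uniform(-0.5, 0.5) for _ in range(N_IN + 1)] for _ in range(N_HID)]
w_out = [[random.uniform(-0.5, 0.5) for _ in range(N_HID + 1)] for _ in range(N_OUT)]

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(row, list(x) + [1.0]))) for row in w_hid]
    o = [sigmoid(sum(w * v for w, v in zip(row, h + [1.0]))) for row in w_out]
    return h, o

def train_epoch(eta=1.0):
    for x, t in DATA:
        h, o = forward(x)
        # Output deltas: (t - o) * f'(net), with f'(net) = o * (1 - o)
        d_out = [(tk - ok) * ok * (1 - ok) for tk, ok in zip(t, o)]
        # Hidden deltas: back-propagate the output deltas through w_out
        d_hid = [hj * (1 - hj) * sum(d_out[k] * w_out[k][j] for k in range(N_OUT))
                 for j, hj in enumerate(h)]
        # Weight updates: w += eta * delta * incoming value
        for k in range(N_OUT):
            for j, v in enumerate(h + [1.0]):
                w_out[k][j] += eta * d_out[k] * v
        for j in range(N_HID):
            for i, v in enumerate(list(x) + [1.0]):
                w_hid[j][i] += eta * d_hid[j] * v

def total_error():
    return 0.5 * sum((tk - ok) ** 2
                     for x, t in DATA
                     for tk, ok in zip(t, forward(x)[1]))

e0 = total_error()
for _ in range(5000):
    train_epoch()
print(e0, "->", total_error())  # the total error falls epoch by epoch
```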
page 22 of 34
After Training
(Diagram: a new pixel (x, y), with x from band 1 and y from band 2, is fed into the well-trained neural network, which outputs for example 1 (Vegetation), 0 (Water), 0 (Soil).)
page 23 of 34
Real-world Applications
• Pattern recognition - Remote sensing image classification
• Banking – credit evaluation
• Stock market data analysis and prediction
page 24 of 34
Remote Sensing Image Classification
Rhode Island 1999 Landsat-7 Enhanced Thematic Mapper Plus (ETM+) Image
(Images: the scene shown with bands 4, 3, 2 in RGB and with bands 5, 4, 3 in RGB.)
page 25 of 34
ANN Design
What ANN structure to choose? – Multi-layer feed-forward
What ANN training algorithm? – Back-propagation & RPROP
PEs in each layer? – 6 – X – 10
page 26 of 34
Training Sample and Testing Sample

Class Name           Training Sample Size (pixels)   Testing Sample Size (pixels)
Agriculture                   116                          153
Barren Land                   146                          125
Conifer Forest                173                          146
Deciduous Forest              343                          217
Mixed Forest                  265                          155
Brush Land                     75                           75
Urban Area                    287                          108
Water                         238                          133
Non-forest Wetland            248                          163
Forest Wetland                 52                           57
Total Pixels                 1943                         1332

Training and Testing Patterns
page 27 of 34
Classification Result
Deciduous forest
Turf / Grass
Barren land
Conifer forest
Wetland 2
Mixed forest
Brush Land
Wetland 1
Water
Urban area
Some stats on the classification:
Training: 1943 pixels; ANN structure: 48-350-11; training time: 5 hours; final accuracy: 92.7% (error = 100% - 92.7%).
Classification: over 9 million pixels; takes 6 hours to produce the land-cover map.
page 28 of 34
Classification Result- A Close Look
Rhode Island 1999 ETM+ image and Rhode Island 1999 land-use and land-cover map
(Side-by-side comparison; map legend as on the previous slide: Deciduous forest, Turf / Grass, Barren land, Conifer forest, Wetland 2, Mixed forest, Brush Land, Wetland 1, Water, Urban area.)
page 29 of 34
Banking – credit evaluation
• 10 ~ 20 attributes as input
– yearly income, marital status, credit history, residence, children, etc.
• Expert chooses a typical training data set
• Choose NN structure and training algorithm
– a dynamic NN structure applied
– self-growing algorithm …
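Encoding such mixed attributes into a numeric input vector might look like this (Python; the attribute names, value ranges, and encoding scheme are illustrative assumptions, not from the slides):

```python
def encode_applicant(yearly_income, marital_status, num_children):
    # Scale income into 0~1 against an assumed 200k ceiling
    income = min(yearly_income / 200_000.0, 1.0)
    # One-hot encode the categorical attribute (assumed categories)
    statuses = ("single", "married", "divorced")
    status_vec = [1.0 if marital_status == s else 0.0 for s in statuses]
    # Scale the count against an assumed maximum of 10
    children = min(num_children / 10.0, 1.0)
    return [income] + status_vec + [children]

print(encode_applicant(85_000, "married", 2))  # [0.425, 0.0, 1.0, 0.0, 0.2]
```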
page 30 of 34
Stock Market Data Analysis and Prediction – A World Full of Patterns
• 10 ~ 20 attributes
– P/E, weekly volume, book value, etc.
• Expert chooses the training data set
– typically based on a certain pattern/theory
• NN structure – this time it is different: “time” plays an important role.
page 31 of 34
NN Structure and Training Algorithm in the Stock Market
• Recurrent network, generalized neural network
• Normal BP, Conjugate Gradient method (CG), Quick-Prop
Hopfield Network (Recurrent network)
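A Hopfield recurrent network can be sketched in a few lines (Python; the stored pattern is illustrative): the weights store a pattern via the Hebbian outer-product rule, and repeated updates let a corrupted input settle back to the stored pattern.

```python
def hopfield_weights(patterns):
    # Hebbian outer-product rule over +/-1 patterns, zero diagonal
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w

def recall(w, state, steps=5):
    # Synchronous updates: each unit takes the sign of its weighted input
    n = len(state)
    for _ in range(steps):
        state = [1 if sum(w[i][j] * state[j] for j in range(n)) >= 0 else -1
                 for i in range(n)]
    return state

stored = [1, 1, -1, -1, 1, -1]
w = hopfield_weights([stored])
noisy = [1, -1, -1, -1, 1, -1]  # one bit flipped
print(recall(w, noisy))         # settles back to the stored pattern
```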
page 32 of 34
Summary
• A neural network consists of:
– Processing elements (PEs)
– Topological structure
  • multi-layer feed-forward
– Training/learning algorithm
  • back-propagation
– Number of PEs in each layer
page 33 of 34
Summary – cont.
Basic Back-propagation has the following shortcomings:
• Time-consuming
• Black box - uncontrollable training
• Training result unpredictable
page 34 of 34
Some References
• A good starting point – Timothy Masters' “Practical Neural Network Recipes in C++”
Other sites:
• IEEE Neural Networks community
page 35 of 34
Thank You!