
Page 1: Artificial Neural Networks - IASBS

Artificial Neural Networks

Part 9

Page 2: Artificial Neural Networks - IASBS

Practical Aspects: Training and Test Subsets

The samples carrying the maximum and the minimum value of each input variable should belong to the training subset.

Otherwise the MLP ignores the function's behavior at the left and right ends of the definition domain, because it would have to extrapolate there.
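A minimal MATLAB sketch of this idea (the variable names and the 70% split are illustrative assumptions, not from the slides):

% P is an R x Q matrix with one column per sample
P = rand(2,20);                        % example: 20 random 2-dimensional samples
[~, iMin] = min(P, [], 2);             % sample index holding each variable's minimum
[~, iMax] = max(P, [], 2);             % sample index holding each variable's maximum
mustTrain = unique([iMin; iMax])';     % these samples must go to the training subset
rest      = setdiff(1:size(P,2), mustTrain);
rest      = rest(randperm(numel(rest)));
nTrain    = round(0.7*size(P,2));      % e.g. 70% of the samples for training
trainInd  = [mustTrain, rest(1:max(0, nTrain-numel(mustTrain)))];
testInd   = setdiff(1:size(P,2), trainInd);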

Page 3: Artificial Neural Networks - IASBS

Practical Aspects: Over-fitting / Under-fitting

Indiscriminately increasing the number of neurons usually drives the MLP to over-fit.

The squared error on the training set tends to be very low; however, the squared error on the test set tends to be very high.
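One way to observe this effect is to train networks of increasing size and compare the errors on the two subsets. The data, split and layer sizes below are illustrative assumptions, not from the slides:

p = -1:0.05:1;
t = sin(2*pi*p) + 0.1*randn(size(p));
trainInd = 1:2:length(p);             % odd samples for training
testInd  = 2:2:length(p);             % even samples for testing
for S = [2 5 20 50]                   % increasing hidden-layer sizes
    net = newff(p(trainInd), t(trainInd), S);
    net.trainParam.epochs = 300;
    net = train(net, p(trainInd), t(trainInd));
    eTrain = mse(t(trainInd) - sim(net, p(trainInd)));
    eTest  = mse(t(testInd)  - sim(net, p(testInd)));
    fprintf('S = %2d   train MSE = %.4f   test MSE = %.4f\n', S, eTrain, eTest);
end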

Page 4: Artificial Neural Networks - IASBS

Neural Network Toolbox

Perceptron networks can be created with the function newp. These networks can be initialized, simulated and trained with the init, sim and train functions.

net = newp(PR,S,TF,LF)

PR - R x 2 matrix of min and max values for the R input elements
S  - Number of neurons
TF - Transfer function (default = 'hardlim')
LF - Learning function (default = 'learnp')

P = [0 0 1 1;          % four 2-dimensional input samples (columns)
     0 1 0 1];
T = [0 1 1 1];         % targets: logical OR of the two inputs
net = newp([0 1;0 1],1,'hardlim','learnp');
Y = sim(net,P)         % output of the untrained perceptron
net.trainParam.epochs = 20;
net = train(net,P,T);  % training with the perceptron rule (learnp)
Y = sim(net,P)         % output after training

The learning function learnp implements the perceptron rule.
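After training, the learned parameters can be inspected directly; a short sketch following the code above (the field names follow the standard network object):

W = net.IW{1,1}   % 1 x 2 weight vector learned by the perceptron rule
b = net.b{1}      % bias
Y = sim(net,P)    % should now reproduce T = [0 1 1 1]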

Page 5: Artificial Neural Networks - IASBS

Neural Network Toolbox

Create a feed-forward back-propagation network (an MLP trained with the gradient descent with momentum method, 'traingdm'):

p = [-1 -1 2 2;0 5 0 5];
t = [-1 -1 1 1];
net = newff(p,t,[3,2],{'tansig','tansig'},'traingdm');
net.trainParam.show = 50;     % show the performance every 50 epochs
net.trainParam.lr = 0.05;     % learning rate
net.trainParam.mc = 0.9;      % momentum coefficient
net.trainParam.epochs = 300;  % maximum number of epochs
net.trainParam.goal = 1e-5;   % error threshold (stop criterion)
[net,tr]=train(net,p,t);
a = sim(net,p)
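The training record tr returned by train can be used to inspect the learning curve; a sketch following the code above (tr.epoch and tr.perf are fields of the standard training record):

plot(tr.epoch, tr.perf)       % mean squared error versus epoch
xlabel('Epoch'); ylabel('MSE');
tr.perf(end)                  % final training error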

Page 6: Artificial Neural Networks - IASBS

Neural Network Toolbox: Functional Approximation

p = -1:0.01:1;                          % 201 input samples
t = sin(2*pi*p)+0.1*randn(size(p));     % noisy samples of the target function
trainInd = 1:3:201;                     % every 3rd sample for training
valInd   = 2:3:201;                     % validation subset
testInd  = 3:3:201;                     % test subset
[trainP,val.P,test.P] = divideind(p,trainInd,valInd,testInd);
[trainT,val.T,test.T] = divideind(t,trainInd,valInd,testInd);

The division can also be made at random, either with [trainP,val.P,test.P,trainInd,valInd,testInd] = dividerand(p); or, once the network exists, by setting its division ratios:

net.divideParam.trainRatio = 50/100;
net.divideParam.valRatio   = 25/100;
net.divideParam.testRatio  = 25/100;
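To check the interleaved division visually, the three subsets can be plotted together (an illustrative addition, not from the slides):

plot(p(trainInd), t(trainInd), 'b.', ...
     p(valInd),   t(valInd),   'g+', ...
     p(testInd),  t(testInd),  'ro');
legend('training','validation','test');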

Page 7: Artificial Neural Networks - IASBS

Neural Network Toolbox: Functional Approximation

We now create a 1-20-1 network:

net = newff(trainP,trainT,[20],{'tansig'},'traingdx');
net.trainParam.show = 25;
net.trainParam.epochs = 300;
net.trainParam.max_fail = 50;   % allowed validation failures before early stopping
net = init(net);
net = train(net,trainP,trainT,[],[],val,test);  % training with validation and test sets

net = train(net,p,t);           % or: training on the whole data set
Y = sim(net,p);


Page 8: Artificial Neural Networks - IASBS

Neural Network Toolbox: Functional Approximation

There are no major problems with the training. The error profiles for validation and test are very similar. If the validation curve had increased significantly, some over-fitting might have occurred.
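These error profiles can be reproduced from the training record tr returned by train; a sketch, assuming the validation and test structures were supplied so that tr.perf, tr.vperf and tr.tperf hold the training, validation and test errors:

semilogy(tr.epoch, tr.perf,  'b', ...
         tr.epoch, tr.vperf, 'g', ...
         tr.epoch, tr.tperf, 'r');
legend('training','validation','test');
xlabel('Epoch'); ylabel('Squared error');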

Page 9: Artificial Neural Networks - IASBS

Neural Network Toolbox: Functional Approximation

The scatter plot is a regression plot, which shows the relationship between the outputs of the network and the targets.

If the training were perfect, the network outputs and the targets would be exactly equal, but the relationship is rarely perfect in practice.

The R value is an indication of how strongly the outputs and the targets are related.
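In this generation of the toolbox the regression plot and the R value can be obtained with postreg; a sketch following the trained network above:

Y = sim(net,p);
[m,b,r] = postreg(Y,t);   % slope, intercept and correlation coefficient R; also draws the plot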

Page 10: Artificial Neural Networks - IASBS

Neural Network Toolbox: Functional Approximation

Page 11: Artificial Neural Networks - IASBS

Neural Network Toolbox

Write a MATLAB m-file and use the Neural Network Toolbox to approximate the following function:

f(x) = sin(x) + x·cos(3x)

- Assume x ranges between -6 and 6.
- Follow the same steps as in the example and run the network with different numbers of neurons in the hidden layer (a starter sketch is given below).

Observe how well the network generalizes as the hidden-layer size changes.
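A possible starting point for the exercise (a sketch only; the hidden-layer size, training function and epoch count are illustrative assumptions):

x = -6:0.05:6;                                   % input range of the exercise
f = sin(x) + x.*cos(3*x);                        % target function
net = newff(x, f, 10, {'tansig'}, 'traingdx');   % try different hidden-layer sizes
net.trainParam.epochs = 500;
net = train(net, x, f);
y = sim(net, x);
plot(x, f, 'b', x, y, 'r--');
legend('target','network output');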