
Upload: ashish-kushwaha

Post on 13-Dec-2015

Page 1: Mi Assnmnt 3

Assignment -3

ANN_LecNotes_Mat 1

Problem 1.12 – A architectural graph fully connected feed forward network consisting of 10 source nodes, two hidden layers, one with 4 nodes and other with 3 nodes and single output node is shown below-

Problem 1.13(a) –

Ashish Kushwaha, PhD Scholar, Dept. of Electrical Engineering, SOOE, SNU

For the given feedforward network, let the inputs be x1 and x2. For the input layer,

u1 = 5*x1 + x2

u2 = 2*x1 - 3*x2

and

v1 = 1/(1 + e^(-a*u1))

v2 = 1/(1 + e^(-a*u2))

where v1 and v2 are the inputs to hidden layer 1.

For hidden layer 1,

u11 = 3*v1 - v2

u12 = 4*v1 + 6*v2

and

v11 = 1/(1 + e^(-a*u11))

v12 = 1/(1 + e^(-a*u12))

where v11 and v12 are the inputs to hidden layer 2.

For hidden layer 2,

yp = -2*v11 + v12

where yp is the input to the output neuron.

So the output y is

y = 1/(1 + e^(-a*yp))

(b) Since the output neuron operates in its linear region, the output is simply its linear sum, so

y = yp
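The forward pass of parts (a) and (b) can be sketched directly in Python. This is only an illustrative sketch: the slope parameter a (and its default value 1.0) is an assumption, since the problem leaves it symbolic.

```python
import math

def sigmoid(u, a=1.0):
    # Logistic activation with slope parameter a (value assumed)
    return 1.0 / (1.0 + math.exp(-a * u))

def forward(x1, x2, a=1.0, linear_output=False):
    # Input layer
    u1 = 5 * x1 + x2
    u2 = 2 * x1 - 3 * x2
    v1, v2 = sigmoid(u1, a), sigmoid(u2, a)
    # Hidden layer 1
    u11 = 3 * v1 - v2
    u12 = 4 * v1 + 6 * v2
    v11, v12 = sigmoid(u11, a), sigmoid(u12, a)
    # Hidden layer 2: linear sum feeding the output neuron
    yp = -2 * v11 + v12
    # Output neuron: sigmoid in part (a), yp itself in the linear region (part b)
    return yp if linear_output else sigmoid(yp, a)
```

With `linear_output=True` the function returns yp unchanged, matching the part (b) case y = yp.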

Problem 1.14 - For the same network as in the previous question, the biases are -1 and +1 for the top and bottom neurons of the first hidden layer, and +1 and -2 for the top and bottom neurons of the second hidden layer. Then:

For the input layer,

u1 = 5*x1 + x2

u2 = 2*x1 - 3*x2

and

v1 = 1/(1 + e^(-a*u1))

v2 = 1/(1 + e^(-a*u2))

where v1 and v2 are the inputs to hidden layer 1.

For hidden layer 1,

u11 = 3*v1 - v2 - 1

u12 = 4*v1 + 6*v2 + 1

and

v11 = 1/(1 + e^(-a*u11))

v12 = 1/(1 + e^(-a*u12))

where v11 and v12 are the inputs to hidden layer 2.

For hidden layer 2,

yp = -2*v11 + v12 + 1 - 2 = -2*v11 + v12 - 1

where yp is the input to the output neuron.

So the output y is

y = 1/(1 + e^(-a*yp))

Problem 1.15 - Let there be a fully connected network with n inputs and n neurons in the input layer. The output of the j-th neuron, before the activation function, is

uj = Σ_(i=0 to n) wji*xi

However, as the neuron is operating in its linear region,

vj = uj

where vj is the input to the next hidden layer coming from the j-th neuron.

Now let the next hidden layer have m neurons. The output of the k-th neuron, before the activation function, is

sk = Σ_(z=0 to n) pkz*vz

Substituting vz from the previous equation, sk becomes

sk = Σ_(z=0 to n) pkz * Σ_(i=0 to n) wzi*xi

sk = Σ_(i=0 to n) ( Σ_(z=0 to n) pkz*wzi ) * xi

As this neuron is also operating in its linear region, its output, fed as input to any further layer, is just sk. The equation shows that the output of any layer can be written directly as a weighted sum of the inputs of the input layer, with effective weights qki = Σ_(z=0 to n) pkz*wzi. Hence, in a multilayer feedforward network in which every neuron operates in its linear region, the network is equivalent to a single-layer feedforward network.
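A small numerical check of this collapse can be done in Python. The layer sizes and the random weights below are arbitrary illustrative choices, not values from the problem.

```python
import random

random.seed(0)
n, m = 3, 2
# Arbitrary weights for the two linear layers (illustrative values only)
W = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(n)]   # w_ji
P = [[random.uniform(-1, 1) for _ in range(n)] for _ in range(m)]   # p_kz

def two_layer(x):
    # v_j = sum_i w_ji * x_i ; s_k = sum_z p_kz * v_z (all neurons linear)
    v = [sum(W[j][i] * x[i] for i in range(n)) for j in range(n)]
    return [sum(P[k][z] * v[z] for z in range(n)) for k in range(m)]

def single_layer(x):
    # Equivalent single layer with effective weights q_ki = sum_z p_kz * w_zi
    Q = [[sum(P[k][z] * W[z][i] for z in range(n)) for i in range(n)]
         for k in range(m)]
    return [sum(Q[k][i] * x[i] for i in range(n)) for k in range(m)]

x = [1.0, -2.0, 0.5]
assert all(abs(a - b) < 1e-9 for a, b in zip(two_layer(x), single_layer(x)))
```

The assertion confirms that the two-layer linear network and the single layer with combined weights produce the same outputs for the same input.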

Problem 1.16 - A fully recurrent network with 5 neurons and no self-feedback is shown below.

Problem 1.17-

For the given network, x1(n) and x2(n) denote the outputs of the upper and lower neurons respectively.

So,

x1(n) = x2(n-1)

x2(n) = x1(n-1)

Substituting the second equation into the first,

x1(n) = x1(n-2)

This is the nonlinear difference equation that defines the given network. The order of the difference equation is 2.
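Iterating the two update equations directly (treating each neuron as a pure unit-delay element, as the equations above do) shows the period-2 behaviour, i.e. each neuron repeats its own value every two steps.

```python
def simulate(x1_0, x2_0, steps=6):
    # x1(n) = x2(n-1), x2(n) = x1(n-1): the two neurons feed each other
    states = [(x1_0, x2_0)]
    for _ in range(steps):
        x1, x2 = states[-1]
        states.append((x2, x1))  # the pair simply swaps each step
    return states

states = simulate(1.0, 0.0)
# Second-order behaviour: the state at n equals the state at n-2
assert states[4] == states[2] == states[0]
```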

Problem 1.18-

The two first-order difference equations that describe the operation of the given system are the following:

x1(n) = x1(n-1) + x2(n-1)

x2(n) = x1(n-1) + x2(n-1)

Problem 1.19-

The architectural graph of a recurrent network with 3 source nodes, 2 hidden neurons, and 4 output neurons is shown below. The network is taken to be fully connected.

Problem 1.20 -

The autoregressive (AR) model is described by the difference equation

y(n) = w1*y(n-1) + w2*y(n-2) + ... + wM*y(n-M) + v(n)

where y(n) is the model output and v(n) is a sample drawn from a white-noise process.

The diagram of the model is shown below.

It can be seen from the diagram that there is a 1:1 relation between weights and signals, so each weight affects only one signal. Hence, even if the output signal at some time instant is out of scale, it can be normalised by changing the corresponding weight; in this way invariance to scaling is achieved.

The model is also invariant to time translation because it uses the signals from instant (n-1) back to instant (n-M). Shifting the time origin shifts all of these delayed signals together, so the output is retained under a translation of time. Hence the system is invariant to time translation.
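The AR difference equation above can be simulated in a few lines. The weight values, noise distribution (standard Gaussian), and zero initial conditions below are illustrative assumptions, not part of the problem.

```python
import random

random.seed(1)

def ar_model(w, n_samples):
    # y(n) = w1*y(n-1) + ... + wM*y(n-M) + v(n), with v(n) white noise
    M = len(w)
    y = [0.0] * M  # zero initial conditions (an assumption)
    for _ in range(n_samples):
        v = random.gauss(0.0, 1.0)          # white-noise sample
        y.append(sum(w[i] * y[-1 - i] for i in range(M)) + v)
    return y[M:]

samples = ar_model([0.5, -0.2], 200)  # an M = 2 model with illustrative weights
```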

ANN_LecNotes_Mat 2

Problem 1.1(a)-

Since, at each time instant, the signal is transferred from one layer of neurons to the next layer,

So,

N3(t)=1*N1(t-1)+1*N2(t-1)

N4(t)=2*N1(t-1)-1*N2(t-1)

Similarly,

N5(t)=2*N3(t-1)+2*N4(t-1)

The value of N5 in terms of N1 and N2 will be

N5(t)=2*N3(t-1)+2*N4(t-1)

N5(t)=2*(1*N1(t-2)+1*N2(t-2))+2*(2*N1(t-2)-1*N2(t-2))

N5(t)=6*N1(t-2)

(b) Since N1 = 1 and N2 = 0 at t = 0, the weighted sums from the above equations are

N3 = 1 and N4 = 2

As the threshold of the neurons is 2, the output of neuron N3 will be 0, while N4 reaches threshold and fires with output 1.

So, N5 = 2*0 + 2*1 = 2.
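Part (b) can be verified with a short sketch of the network. The only assumption beyond the text is the standard McCulloch-Pitts convention that a neuron fires when its weighted sum reaches the threshold.

```python
THRESHOLD = 2

def fires(weighted_sum):
    # McCulloch-Pitts neuron: output 1 iff the weighted sum reaches threshold
    return 1 if weighted_sum >= THRESHOLD else 0

def n5_sum(n1, n2):
    # First-layer weighted sums at t = 1
    s3 = 1 * n1 + 1 * n2   # into N3
    s4 = 2 * n1 - 1 * n2   # into N4
    # N5's weighted sum at t = 2 uses the thresholded outputs of N3 and N4
    return 2 * fires(s3) + 2 * fires(s4)

assert n5_sum(1, 0) == 2   # part (b): N3 stays at 0, N4 fires, so N5 = 2
```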

Problem 1.2-

The XOR gate can be written as

Y=A’B +AB’

Y= ((A+B’) (A’+B))’

Truth table for y=A+B’ will be

A B Y0 0 10 1 01 0 11 1 1
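The Boolean identity and the truth table above can be checked exhaustively. The function names below are illustrative; the expressions are exactly the ones given in the text.

```python
def xor_direct(a, b):
    # Y = A'B + AB'
    return (1 - a) * b | a * (1 - b)

def xor_factored(a, b):
    # Y = ((A + B')(A' + B))'  -- the complemented product form used above
    return 1 - ((a | (1 - b)) & ((1 - a) | b))

def ornot(a, b):
    # y = A + B'
    return a | (1 - b)

# Both XOR forms agree on all four input pairs
for a in (0, 1):
    for b in (0, 1):
        assert xor_direct(a, b) == xor_factored(a, b) == (a ^ b)

# Truth table for y = A + B': rows (A, B, Y)
assert [(a, b, ornot(a, b)) for a in (0, 1) for b in (0, 1)] == \
       [(0, 0, 1), (0, 1, 0), (1, 0, 1), (1, 1, 1)]
```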

So the neural network for the ORNOT function y = A + B' will be as shown, where the threshold value of the neuron y is 0.

For a single-input neuron, the NOT gate can be represented by the following network, where the threshold value of the output neuron is 0.

So the full network for the XOR gate will be as shown below.

The threshold values of the x1, x2 and y neurons are 0, and the threshold value of the y1 neuron is 2.

Problem 1.5 -

The McCulloch-Pitts model for the given pattern-identification problem is shown below. Each neuron has a threshold level of 2. The outputs for the up-scale and down-scale patterns are represented by

Y1 = x1(t-3) AND x2(t-2) AND x3(t-1)

Y2 = x1(t-1) AND x2(t-2) AND x3(t-3)
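The two detector equations can be simulated directly, assuming x1, x2, x3 are binary input streams indexed by time step (the list representation below is an illustrative choice).

```python
def detect(x1, x2, x3, t):
    # Up-scale: x1 fires, then x2, then x3 on consecutive steps
    y1 = x1[t - 3] and x2[t - 2] and x3[t - 1]
    # Down-scale: x3 fires, then x2, then x1
    y2 = x1[t - 1] and x2[t - 2] and x3[t - 3]
    return int(y1), int(y2)

# x1 fires at t=0, x2 at t=1, x3 at t=2: an up-scale pattern
x1 = [1, 0, 0, 0]
x2 = [0, 1, 0, 0]
x3 = [0, 0, 1, 0]
assert detect(x1, x2, x3, 3) == (1, 0)   # Y1 fires, Y2 does not
```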

Problem 1.4-

For the given network for the perception of heat and cold, the outputs are given by

Y1=x1(t-1) OR (x2(t-3) ANDNOT x2(t-2))

Y2= x2(t-2) AND x2(t-1)

For a given input at t, the feeling at later time steps follows from these equations.

If the input is (1,0), i.e. heat is touched for one time step at t, then Y1(t+1) = x1(t) = 1, so the output feeling at (t+1) will be hot.

If the input is (0,1), i.e. cold is touched for only one time step at t, then Y2(t+1) = x2(t-1) AND x2(t) = 0, so cold is not felt; instead, Y1(t+3) = x2(t) ANDNOT x2(t+1) = 1, so the feeling will be hot. Cold is felt only when cold is applied for two consecutive time steps, which makes Y2 = 1.
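Simulating the two output equations directly confirms this behaviour (the list representation of the binary input streams is an illustrative choice):

```python
def feelings(x1, x2, t):
    # Hot: heat at t-1, or a brief cold pulse (cold at t-3, gone by t-2)
    y1 = x1[t - 1] or (x2[t - 3] and not x2[t - 2])
    # Cold: cold held for two consecutive steps
    y2 = x2[t - 2] and x2[t - 1]
    return int(bool(y1)), int(bool(y2))

# Heat touched once at t=2: hot is felt one step later
x1 = [0, 0, 1, 0, 0, 0]; x2 = [0] * 6
assert feelings(x1, x2, 3) == (1, 0)

# Cold touched once at t=2: hot (not cold) is felt at t=5
x1 = [0] * 6; x2 = [0, 0, 1, 0, 0, 0]
assert feelings(x1, x2, 5) == (1, 0)

# Cold held at t=2 and t=3: cold is felt at t=4
x2 = [0, 0, 1, 1, 0, 0]
assert feelings([0] * 6, x2, 4) == (0, 1)
```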
