
Assignment-3

ANN_LecNotes_Mat 1

Problem 1.12- The architectural graph of a fully connected feedforward network consisting of
10 source nodes, two hidden layers (the first with 4 neurons and the second with 3 neurons),
and a single output node is shown below-

Problem 1.13(a)

Ashish Kushwaha, PhD Scholar, Dept. of Electrical Engineering, SOOE, SNU

For the given feedforward network, let the inputs be x1 and x2.
For the input layer,

u1 = 5x1 + x2
u2 = 2x1 - 3x2

and

v1 = 1/(1 + e^(-a*u1))
v2 = 1/(1 + e^(-a*u2))

where v1 and v2 are the inputs to hidden layer 1.


For hidden layer 1,

u11 = 3v1 - v2
u12 = 4v1 + 6v2

and

v11 = 1/(1 + e^(-a*u11))
v12 = 1/(1 + e^(-a*u12))

where v11 and v12 are the inputs to hidden layer 2.


For hidden layer 2,

yp = 2v11 + v12

where yp is the input to the output neuron. So the output y is

y = 1/(1 + e^(-a*yp))

(b) Since the output neuron operates in its linear region, its activation reduces to the
identity, so the output equals the weighted sum of its inputs:

y = yp
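The forward pass described above can be checked numerically. Below is a minimal sketch in Python, assuming a logistic activation 1/(1 + e^(-a*u)); the slope a and the test inputs are chosen arbitrarily for illustration:

```python
import math

def sigmoid(u, a=1.0):
    # Logistic activation: 1 / (1 + e^(-a*u))
    return 1.0 / (1.0 + math.exp(-a * u))

def forward(x1, x2, a=1.0, linear_output=False):
    # Input layer
    u1 = 5 * x1 + x2
    u2 = 2 * x1 - 3 * x2
    v1, v2 = sigmoid(u1, a), sigmoid(u2, a)
    # Hidden layer 1
    u11 = 3 * v1 - v2
    u12 = 4 * v1 + 6 * v2
    v11, v12 = sigmoid(u11, a), sigmoid(u12, a)
    # Hidden layer 2: weighted sum feeding the output neuron
    yp = 2 * v11 + v12
    # Part (b): a linear output neuron simply passes yp through
    return yp if linear_output else sigmoid(yp, a)

print(forward(1.0, 0.5))
print(forward(1.0, 0.5, linear_output=True))
```

Passing linear_output=True reproduces part (b), where the output neuron operates in its linear region and y = yp.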

Problem 1.14- For the same network as in the previous question, let the biases be -1 and +1
for the top and bottom neurons of the first hidden layer, and +1 and -2 for the top and bottom
neurons of the second hidden layer. Then

For the input layer,

u1 = 5x1 + x2
u2 = 2x1 - 3x2

and

v1 = 1/(1 + e^(-a*u1))
v2 = 1/(1 + e^(-a*u2))

where v1 and v2 are the inputs to hidden layer 1.


For hidden layer 1,

u11 = 3v1 - v2 - 1
u12 = 4v1 + 6v2 + 1

and

v11 = 1/(1 + e^(-a*u11))
v12 = 1/(1 + e^(-a*u12))

where v11 and v12 are the inputs to hidden layer 2.


For hidden layer 2,

yp = 2v11 + v12 + 1 - 2 = 2v11 + v12 - 1

where yp is the input to the output neuron. So the output y is

y = 1/(1 + e^(-a*yp))

Problem 1.15- Let there be a fully connected network with n inputs and n neurons in the
input layer. The output of the jth neuron, before the activation function, is

uj = Σ (i = 0 to n) wji xi

However, as the neuron is operating in its linear region,

vj = uj

where vj is the input to the next hidden layer coming from the jth neuron.
Now let the next hidden layer have m neurons. The output of its kth neuron, before the
activation function, is

sk = Σ (z = 0 to n) pkz vz

Substituting vz from the previous equation, sk becomes

sk = Σ (z = 0 to n) pkz Σ (i = 0 to n) wzi xi
sk = Σ (i = 0 to n) [ Σ (z = 0 to n) pkz wzi ] xi

As this neuron is also operating in its linear region, its output to the next layer of neurons
is simply sk. The equation shows that the output of any layer can be written directly in terms
of the inputs of the input layer, so in a multilayer feedforward network in which all neurons
operate in their linear region, the network is equivalent to a single-layer feedforward
network.
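This collapse can be verified numerically: feeding an input through two linear layers gives the same result as one layer whose weight matrix is the product of the two. A small sketch in pure Python, with arbitrarily chosen weights:

```python
def matvec(W, x):
    # Multiply matrix W (a list of rows) by vector x
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def matmul(P, W):
    # Entry (i, j) of the product P @ W
    return [[sum(P[i][k] * W[k][j] for k in range(len(W)))
             for j in range(len(W[0]))] for i in range(len(P))]

W = [[1.0, -2.0], [0.5, 3.0]]   # first-layer weights w_ji (arbitrary)
P = [[2.0, 1.0], [-1.0, 4.0]]   # second-layer weights p_kz (arbitrary)
x = [0.7, -1.3]

two_layer = matvec(P, matvec(W, x))   # layer by layer: s_k = sum_z p_kz v_z
one_layer = matvec(matmul(P, W), x)   # single equivalent layer: (P W) x
print(two_layer, one_layer)
```

The two vectors printed are identical, which is exactly the single-layer equivalence derived above.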

Problem 1.16- A fully recurrent network with 5 neurons and no self-feedback is shown
below-

Problem 1.17-

For the given network, x1(n) and x2(n) denote the outputs of the upper and lower neurons
respectively. So,

x1(n) = x2(n-1)
x2(n) = x1(n-1)

Substituting the second equation into the first gives

x1(n) = x1(n-2)

This is the difference equation that describes the given network; its order is 2.
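The order-2 behaviour of the two cross-coupled unit delays can be checked by simulating the pair of update equations directly (any neuron nonlinearity is ignored here for illustration, and the initial values are arbitrary):

```python
def simulate(x1_0, x2_0, steps):
    # Cross-coupled unit delays: x1(n) = x2(n-1), x2(n) = x1(n-1)
    x1, x2 = x1_0, x2_0
    history = [(x1, x2)]
    for _ in range(steps):
        x1, x2 = x2, x1   # both updates use the previous instant's values
        history.append((x1, x2))
    return history

h = simulate(0.3, -0.8, 6)
print(h)   # every state repeats with period 2: x1(n) = x1(n-2)
```

Two successive swaps restore the original pair, which is the simulation counterpart of the second-order relation derived above.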

Problem 1.18-

The two first-order difference equations that describe the operation of the given system are
as follows:

x1(n) = x1(n-1) + x2(n-1)
x2(n) = x1(n-1) + x2(n-1)

Problem 1.19- The architectural graph of a recurrent network having 3 source nodes, 2
hidden neurons and 4 output neurons is shown below. The network is considered to be fully
connected.

Problem 1.20- The autoregressive (AR) model is described by the difference equation

y(n) = w1*y(n-1) + w2*y(n-2) + ... + wM*y(n-M) + v(n)

where y(n) is the model output and v(n) is a sample drawn from a white-noise process.
The diagram of the model is shown below.


It can be seen from the diagram that there is a 1:1 relation between weights and signals, so
each weight affects only one signal. Even if the output signal at one time instant is out of
scale, it can therefore be normalised by adjusting the corresponding weight; in this way
invariance in scale is achieved.
The model is also invariant to time translation, because it operates on the signal samples
from instant n-1 to instant n-M in the past; shifting the time origin leaves the input-output
relation unchanged. So the system is invariant under time translation.
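The AR difference equation can be simulated directly; the weights, noise seed, and sample count below are arbitrary illustration values:

```python
import random

def simulate_ar(weights, n_samples, seed=0):
    # y(n) = w1*y(n-1) + ... + wM*y(n-M) + v(n), with v(n) white noise
    rng = random.Random(seed)
    M = len(weights)
    history = [0.0] * M   # delay line: y(n-M) ... y(n-1), initialised to zero
    out = []
    for _ in range(n_samples):
        v = rng.gauss(0.0, 1.0)   # white-noise sample v(n)
        # reversed(history) pairs w1 with y(n-1), w2 with y(n-2), ...
        y = sum(w * p for w, p in zip(weights, reversed(history))) + v
        out.append(y)
        history = history[1:] + [y]   # shift the delay line
    return out

ys = simulate_ar([0.5, -0.3], 10)
print(ys)
```

Rescaling one weight rescales only the corresponding delayed-signal contribution, which is the 1:1 weight-signal relation noted above.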
ANN_LecNotes_Mat 2

Problem 1.1(a)-

At each time instant, the signal is transferred from one layer of neurons to the next layer of
neurons.
So,
N3(t)=1*N1(t-1)+1*N2(t-1)
N4(t)=2*N1(t-1)-1*N2(t-1)
Similarly,
N5(t)=2*N3(t-1)+2*N4(t-1)

The value of N5 in terms of N1 and N2 will be


N5(t)=2*N3(t-1)+2*N4(t-1)
N5(t)=2*(1*N1(t-2)+1*N2(t-2))+2*(2*N1(t-2)-1*N2(t-2))
N5(t)=6*N1(t-2)
(b) Since N1 = 1 and N2 = 0 at t = 0, the weighted sums from the above equations are
N3 = 1 and N4 = 2. As the threshold of every neuron is 2, neuron N3 does not fire (output 0)
while N4 fires (output 1). The weighted sum arriving at N5 is therefore 2*0 + 2*1 = 2, so N5
fires.
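The result can be replayed with a small McCulloch-Pitts simulation, using a threshold of 2 for every neuron and the weights from the equations above:

```python
def mp_neuron(weighted_sum, threshold=2):
    # McCulloch-Pitts unit: fires (1) iff the weighted sum reaches the threshold
    return 1 if weighted_sum >= threshold else 0

def step(n1, n2):
    # One synaptic delay per layer; every threshold is 2
    n3 = mp_neuron(1 * n1 + 1 * n2)
    n4 = mp_neuron(2 * n1 - 1 * n2)
    n5 = mp_neuron(2 * n3 + 2 * n4)
    return n3, n4, n5

# Part (b): N1 = 1, N2 = 0 at t = 0
print(step(1, 0))   # N3 stays silent, N4 fires, so N5 fires
```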

Problem 1.2- The XOR gate can be written as

Y = A B' + A' B
Y = ((A + B')(A' + B))'

The truth table for y = A + B' will be

A  B  y
0  0  1
0  1  0
1  0  1
1  1  1

So the neural network for the ORNOT function y = A + B' will be

However, the threshold value of neuron y is 0.


For the single-input neuron, the NOT gate can be represented by the following network,
where the threshold value of the output neuron is 0.

So the full network for the XOR gate will be

The threshold values of the x1, x2 and y neurons are 0; the threshold value of the y1 neuron
is 2.
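One way to sanity-check the construction is with threshold logic units. The weights below are one possible realisation consistent with the stated thresholds (+1/-1 into the ORNOT and NOT neurons with threshold 0, unit weights into the AND neuron with threshold 2); since the figures with the actual weights are not reproduced here, treat them as an assumption:

```python
def tlu(inputs, weights, threshold):
    # Threshold logic unit: fires iff the weighted sum reaches the threshold
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

def xor(a, b):
    x1 = tlu([a, b], [1, -1], 0)    # ORNOT: A + B'
    x2 = tlu([a, b], [-1, 1], 0)    # ORNOT: A' + B
    y1 = tlu([x1, x2], [1, 1], 2)   # AND, threshold 2
    return tlu([y1], [-1], 0)       # NOT, threshold 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```

This realises Y = ((A + B')(A' + B))', which equals A XOR B for all four input pairs.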

Problem 1.5- The McCulloch-Pitts model for the given pattern-identification problem is
shown below. Each neuron has a threshold level of 2. The outputs for the up-scale and
down-scale patterns are represented by

Y1 = x1(t-3) AND x2(t-2) AND x3(t-1)
Y2 = x1(t-1) AND x2(t-2) AND x3(t-3)


Problem 1.4-

For the given network for the perception of heat and cold, the outputs are given by

Y1 = x1(t-1) OR (x2(t-3) ANDNOT x2(t-2))
Y2 = x2(t-2) AND x2(t-1)

For a given input at time t, the resulting feeling follows from these equations:

If the input is (1,0), i.e. heat is touched for one time step at t, the output feeling is hot at
(t+1), through the x1(t-1) term of Y1.
If the input is (0,1), i.e. cold is touched for only one time step at t, the ANDNOT term of Y1
makes the output feeling hot at (t+3); a cold feeling is produced by Y2 only when cold is
applied for two consecutive time steps.
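The two output equations can be simulated directly; the receptor sequences below are illustrative choices:

```python
def feelings(x1_seq, x2_seq, t):
    # x1: heat receptor, x2: cold receptor (0/1 sequences indexed by time;
    # instants before t = 0 are taken as 0)
    def x1(n): return x1_seq[n] if 0 <= n < len(x1_seq) else 0
    def x2(n): return x2_seq[n] if 0 <= n < len(x2_seq) else 0
    hot = x1(t - 1) or (x2(t - 3) and not x2(t - 2))
    cold = x2(t - 2) and x2(t - 1)
    return int(hot), int(cold)

# Heat touched for one step at t = 0: hot is felt at t = 1
print(feelings([1, 0, 0, 0], [0, 0, 0, 0], 1))
# Cold touched for only one step at t = 0: hot is felt at t = 3
print(feelings([0, 0, 0, 0], [1, 0, 0, 0], 3))
# Cold held for consecutive steps: cold is felt
print(feelings([0, 0, 0, 0], [1, 1, 1, 1], 2))
```

Note the second case: with these equations a single-step cold touch is reported as hot, while a cold feeling requires two consecutive cold inputs.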

