ANN_LecNotes_Mat 1
Problem 1.12- The architectural graph of a fully connected feed-forward network consisting of
10 source nodes, two hidden layers (one with 4 nodes, the other with 3 nodes) and a single
output node is shown below-
Problem 1.13(a)
For the given feed-forward network, let the inputs be x1 and x2.
For the first layer of neurons,
u1 = 5x1 + x2
u2 = 2x1 - 3x2
and
v1 = 1 / (1 + e^(-a u1))
v2 = 1 / (1 + e^(-a u2))
For the second hidden layer, whose induced local fields are u11 and u12,
v11 = 1 / (1 + e^(-a u11))
v12 = 1 / (1 + e^(-a u12))
yp = 2v11 + v12
where yp is the input to the output neuron.
So the output y is
y = 1 / (1 + e^(-a yp))
(b) Since the output neuron operates in its linear region, its output equals the linear sum of
its inputs, so
y = yp
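The forward pass above can be sketched in code. The first-layer weights (5, 1, 2, -3) and the output-neuron weights (2, 1) come from the problem; the second-hidden-layer weights used for u11 and u12 below are HYPOTHETICAL placeholders, since the actual values are given only in the figure, which is not reproduced in the text.

```python
import math

def sigmoid(u, a=1.0):
    # Logistic activation 1 / (1 + e^(-a*u)) used throughout these notes
    return 1.0 / (1.0 + math.exp(-a * u))

def forward(x1, x2, a=1.0):
    # First hidden layer (weights from the problem statement)
    u1 = 5 * x1 + x2
    u2 = 2 * x1 - 3 * x2
    v1, v2 = sigmoid(u1, a), sigmoid(u2, a)
    # Second hidden layer: these weights are assumed placeholders,
    # not taken from the (missing) figure
    u11 = 1.0 * v1 + 1.0 * v2
    u12 = 1.0 * v1 - 1.0 * v2
    v11, v12 = sigmoid(u11, a), sigmoid(u12, a)
    # Input to the output neuron, and the sigmoidal output
    yp = 2 * v11 + v12
    return sigmoid(yp, a), yp

y, yp = forward(0.5, -0.5)
```

If the output neuron is instead operated in its linear region, as in part (b), the returned value would simply be yp.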
Problem 1.14- For the same network as in the previous question, let the biases be -1 and +1 for
the top and bottom neurons of the first hidden layer, and +1 and -2 for the top and bottom
neurons of the second hidden layer. Then
u1 = 5x1 + x2 - 1
u2 = 2x1 - 3x2 + 1
and
v1 = 1 / (1 + e^(-a u1))
v2 = 1 / (1 + e^(-a u2))
With the biases +1 and -2 added to the induced local fields u11 and u12 of the second hidden
layer,
v11 = 1 / (1 + e^(-a u11))
v12 = 1 / (1 + e^(-a u12))
and with yp = 2v11 + v12 as before, the output is
y = 1 / (1 + e^(-a yp))
Problem 1.15- Let there be a fully connected network with n inputs and n neurons in the input
layer. The output of the jth neuron, without the activation function, is

u_j = Σ_{i=0..n} w_ji x_i

In the linear region v_j = u_j, so for a neuron k of the next layer,

s_k = Σ_{z=0..n} p_kz v_z

Substituting the value of v_z from the previous equation, s_k becomes

s_k = Σ_{z=0..n} p_kz Σ_{i=0..n} w_zi x_i = Σ_{i=0..n} ( Σ_{z=0..n} p_kz w_zi ) x_i
Since the neuron operates in its linear region, its output, fed as input to another layer of
neurons, is simply s_k. The equation shows that the output of any layer of neurons can be
written directly in terms of the inputs of the input layer. Hence, in a multilayer feed-forward
network in which all neurons operate in their linear region, the network is equivalent to a
single-layer feed-forward network.
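The collapse of two linear layers into one can be checked numerically. The weight values below (W for the first layer, P for the second) are arbitrary examples chosen for illustration; the point is only that applying the layers one after the other gives the same result as applying the single product matrix P·W.

```python
def matvec(M, v):
    # Multiply matrix M (list of rows) by vector v
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def matmul(A, B):
    # Product A @ B for small list-of-lists matrices
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Example (arbitrary) weights: W feeds the hidden layer, P the output layer
W = [[1.0, 2.0],
     [0.5, -1.0]]
P = [[3.0, 1.0]]

x = [2.0, -1.0]

# Layer-by-layer pass through the two linear layers ...
v = matvec(W, x)
s_layered = matvec(P, v)

# ... equals a single pass through the equivalent one-layer network P @ W
s_single = matvec(matmul(P, W), x)

assert s_layered == s_single
```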
Problem 1.16- A fully recurrent network of 5 neurons with no self-feedback is shown
below-
Problem 1.17-
For the given network, x1(n) and x2(n) denote the outputs of the upper and lower neurons respectively.
So,
x1(n) = x2(n-1)
x2(n) = x1(n-1)
Substituting the second equation into the first,
x1(n) = x1(n-2)
This is the nonlinear difference equation that defines the given network. The order of the
difference equation is 2.
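A quick simulation of the two cross-coupled unit-delay neurons confirms the second-order behavior: substituting x2(n) = x1(n-1) into x1(n) = x2(n-1) gives x1(n) = x1(n-2), i.e. each state repeats with period 2.

```python
def simulate(x1_0, x2_0, steps=6):
    # Two cross-coupled delay neurons: x1(n) = x2(n-1), x2(n) = x1(n-1)
    x1, x2 = x1_0, x2_0
    history = [(x1, x2)]
    for _ in range(steps):
        x1, x2 = x2, x1   # simultaneous update with one unit delay
        history.append((x1, x2))
    return history

h = simulate(1.0, 0.0)
# x1 repeats every two steps: x1(n) = x1(n-2)
assert all(h[n][0] == h[n - 2][0] for n in range(2, len(h)))
```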
Problem 1.18-
The two first-order difference equations that describe the operation of the given system are
as follows:
x1(n) = x1(n-1) + x2(n-1)
x2(n) = x1(n-1) + x2(n-1)
Problem 1.19- The architectural graph of a recurrent network with 3 source nodes, 2 hidden
neurons and 4 output neurons is shown below. The network is assumed to be fully connected.
It can be seen from the diagram that there is a 1:1 relation between weights and signals,
so each weight affects only one signal. Hence, even if the output signal at one time instant
is out of scale, it can be normalised by changing the corresponding weight; in this way,
invariance in scale is achieved.
The network is also invariant under time translation, because it holds the signals from
instant 1 up to the m-th instant in the past. The output can therefore be retained under a
translation of time, so the system is invariant in time translation.
ANN_LecNotes_Mat 2
Problem 1.1(a)-
At each time instant, the signal is transferred from one layer of neurons to the next layer of
neurons.
So,
N3(t)=1*N1(t-1)+1*N2(t-1)
N4(t)=2*N1(t-1)-1*N2(t-1)
Similarly,
N5(t)=2*N3(t-1)+2*N4(t-1)
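These update rules can be simulated with a synchronous one-step delay per layer, so N5 at time t sees the values N3 and N4 held at t-1. The constant source values N1 = 1 and N2 = 2 below are hypothetical inputs chosen only for illustration.

```python
def step(state):
    # One synchronous update: each neuron reads the previous instant's values
    N1, N2, N3, N4, N5 = state
    # N1 and N2 are source nodes and are assumed to hold their values
    new_N3 = 1 * N1 + 1 * N2
    new_N4 = 2 * N1 - 1 * N2
    new_N5 = 2 * N3 + 2 * N4
    return (N1, N2, new_N3, new_N4, new_N5)

# Hypothetical constant sources N1 = 1, N2 = 2; internal nodes start at 0
state = (1, 2, 0, 0, 0)
state = step(state)   # t = 1: N3 = 3, N4 = 0 become valid
state = step(state)   # t = 2: N5 sees the t-1 values of N3 and N4
```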
Ashish Kushwaha, PhD Scholar, Dept. of Electrical Engineering, SOOE, SNU
B   Y
0   1
1   0
0   1
1   1
The threshold values of the x1, x2 and y neurons are 0; the threshold value of the y1 neuron is 2.
Problem 1.5- The McCulloch-Pitts model for the given problem statement is shown below for the
pattern identification. Each neuron has a threshold level of 2. The outputs for the up-scale and
down-scale patterns are given by
Y1 = x1(t-3) AND x2(t-2) AND x3(t-1)
Y2 = x1(t-1) AND x2(t-2) AND x3(t-3)
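The up-scale detector Y1 = x1(t-3) AND x2(t-2) AND x3(t-1) can be sketched directly as a delayed AND over the three binary input series; in the McCulloch-Pitts network this AND is realised by the layered threshold neurons of the figure, but the input-output behavior is the same.

```python
def upscale_detector(x1, x2, x3):
    # x1, x2, x3 are binary time series (lists indexed by t)
    # Y1(t) = x1(t-3) AND x2(t-2) AND x3(t-1)
    y = []
    for t in range(len(x1)):
        if t < 3:
            y.append(0)  # not enough history yet
        else:
            y.append(x1[t - 3] & x2[t - 2] & x3[t - 1])
    return y

# Up-scale pattern: x1 fires at t=0, x2 at t=1, x3 at t=2 -> Y1 fires at t=3
up = upscale_detector([1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0])
```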
Problem 1.4-
For the given network for the perception of heat and cold, the outputs are given by
Y1 = x1(t-1) OR (x2(t-3) AND NOT x2(t-2))
Y2 = x2(t-2) AND x2(t-1)
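The two output expressions can be simulated over binary time series, with x1 as the heat receptor and x2 as the cold receptor: a brief cold touch (x2 firing once and then stopping) is reported as heat (Y1), while sustained cold (x2 firing on consecutive instants) is reported as cold (Y2).

```python
def heat_cold(x1, x2):
    # x1: heat receptor, x2: cold receptor (binary time series)
    # Y1(t) = x1(t-1) OR (x2(t-3) AND NOT x2(t-2));  Y2(t) = x2(t-2) AND x2(t-1)
    heat, cold = [], []
    for t in range(len(x1)):
        h = (x1[t - 1] if t >= 1 else 0) or (
            (x2[t - 3] if t >= 3 else 0) and not (x2[t - 2] if t >= 2 else 0))
        c = (x2[t - 2] if t >= 2 else 0) and (x2[t - 1] if t >= 1 else 0)
        heat.append(int(bool(h)))
        cold.append(int(bool(c)))
    return heat, cold

# Brief cold touch: x2 fires once -> perceived as heat three steps later
heat, cold = heat_cold([0, 0, 0, 0, 0], [1, 0, 0, 0, 0])
```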