A perceptron in Matlab | Matlab Geeks
http://matlabgeeks.com/tips-tutorials/neural-networks-a-perceptron-in-matlab/
4/11/2012

If you want to verify this yourself, run the following code in Matlab; you can then modify it to fit your own needs. We first initialize our variables of interest, including the input, desired output, bias, learning coefficient, and weights.
input = [0 0; 0 1; 1 0; 1 1];
numIn = 4;
desired_out = [0; 1; 1; 1];
bias = -1;
coeff = 0.7;
rand('state', sum(100*clock));
weights = -1*2.*rand(3,1);

The input and desired_out are self-explanatory, with the bias initialized to a constant (here -1; any non-zero value between -1 and 1 can be used). The coeff represents the learning rate, which specifies how large an adjustment is made to the network weights after each iteration: the closer the coefficient is to 1, the larger the weight adjustments, while smaller values modify the weights more conservatively. Finally, the weights are randomly assigned. A perceptron is defined by the equation:

out = f(w1*x1 + w2*x2 + b)

where f is the activation function. Therefore, in our example, we have w1*x1 + w2*x2 + b = out. We will assume that weights(1,1) is for the bias and weights(2,1) and weights(3,1) are for x1 and x2, respectively. One more variable we will set is iterations, specifying how many times to train, i.e. to go through the inputs and modify the weights.
iterations = 10;

Now the feed forward perceptron code.


for i = 1:iterations
   out = zeros(4,1);
   for j = 1:numIn
      y = bias*weights(1,1) + input(j,1)*weights(2,1) + input(j,2)*weights(3,1);
      out(j) = 1/(1+exp(-y));
      delta = desired_out(j) - out(j);
      weights(1,1) = weights(1,1) + coeff*bias*delta;
      weights(2,1) = weights(2,1) + coeff*input(j,1)*delta;
      weights(3,1) = weights(3,1) + coeff*input(j,2)*delta;
   end
end

A little explanation of the code. First, out is computed from the equation above and then run through a sigmoid function to squash the values into the [0, 1] range. The weights are then modified iteratively based on the delta rule. When running the perceptron over 10 iterations, the outputs begin to converge, but are still not precisely as expected:
out =
    0.3756
    0.8596
    0.9244
    0.9952

weights =
    0.6166
    3.2359
    2.7409
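One way to watch this convergence pass by pass, rather than only inspecting the final numbers, is to record the mean squared error after each iteration and plot it. This is our own addition, not part of the original post; it is a sketch assuming the variables initialized above (input, numIn, desired_out, bias, coeff, weights, iterations):

```matlab
err = zeros(iterations,1);   % one error value per pass through the data
for i = 1:iterations
   out = zeros(4,1);
   for j = 1:numIn
      y = bias*weights(1,1) + input(j,1)*weights(2,1) + input(j,2)*weights(3,1);
      out(j) = 1/(1+exp(-y));
      delta = desired_out(j) - out(j);
      weights(1,1) = weights(1,1) + coeff*bias*delta;
      weights(2,1) = weights(2,1) + coeff*input(j,1)*delta;
      weights(3,1) = weights(3,1) + coeff*input(j,2)*delta;
   end
   err(i) = mean((desired_out - out).^2);   % mean squared error this pass
end
plot(err), xlabel('Iteration'), ylabel('Mean squared error')
```

With the OR data the curve should decrease steadily toward zero as the weights converge.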

As the iterations approach 1000, the output converges towards the desired output.
out =
    0.0043
    0.9984
    0.9987
    1.0000

weights =
    5.4423
   12.1084
   11.8823
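Once the weights have converged, the same feed-forward equation can be used to classify a new input. The following minimal sketch is our own addition, not part of the original post; it assumes the trained bias and weights from above and thresholds the sigmoid output at 0.5:

```matlab
% Classify a new input pair x = [x1 x2] with the trained weights.
x = [1 0];
y = bias*weights(1,1) + x(1)*weights(2,1) + x(2)*weights(3,1);
out = 1/(1+exp(-y));   % sigmoid output in [0, 1]
label = out > 0.5;     % threshold at 0.5 to get a 0/1 class
```

Any cutoff could be used in place of 0.5; it is simply the midpoint of the sigmoid's range.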

As the OR logic condition is linearly separable, a solution will be reached after a finite number of loops. Convergence time can also change based on the initial weights, the learning rate, the transfer function (sigmoid, linear, etc.), and the learning rule (in this case the delta rule is used, but other algorithms like Levenberg-Marquardt also exist). If you are interested, try running the same code for other logical conditions like AND or NAND to see what you get. While single-layer perceptrons like this one can solve simple linearly separable data, they are not suitable for non-separable data, such as XOR. In order to learn such a data set, you will need to use a multilayer perceptron.
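To try the AND or NAND conditions mentioned above, only the desired output vector needs to change; everything else in the code stays the same. A sketch:

```matlab
% AND truth table: only the input [1 1] should map to 1
desired_out = [0; 0; 0; 1];

% NAND is the complement of AND; uncomment to try it instead:
% desired_out = [1; 1; 1; 0];
```

Both conditions are linearly separable, so the same training loop will converge for either choice.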


This entry was posted in Tips & Tutorials and tagged matlab, neural network, perceptron, tips, tutorial. Bookmark the permalink.

15 Responses to Neural Networks: A perceptron in Matlab

1. suresh says (February 22, 2012 at 1:06 pm):
   How can I solve a differential equation using a neural network scheme in Matlab? Suppose my equation is dy/dx = 3*sin(x) + e^(2x).

2. Pingback: Neural Networks: A perceptron in Matlab | PIYABUTE FUANGKHON

3. nietzsche says (February 21, 2012 at 2:28 pm):
   Hi all, I have a question and I really need help because I've tried everything but in vain. I have a matrix A = [k x 1] where k = 10,000, ranging, say, from a to b, randomly. I need to get a matrix B = [m x 1] from A, where m is from a to c (a < c < b). Basically, what I want to do is "shrink" A and get a smaller matrix B. Thanks everybody. Nietzsche.

   Vipul Lugade says (March 14, 2012 at 1:01 am):
   From your question, I'm assuming something like the following?:

   % Preallocate a random 10000 x 1 matrix A
   A = rand(10000,1);
   a = min(A)
   b = max(A)
   % Set c to a value between a and b. Let's choose 0.5 for this example
   c = 0.5
   % Then create B for values between a and c
   B = A(A<=c)
   % Can also use the following, though the second part is redundant in this case
   B = A(A<=c & A>=a)

4. deepika says (February 16, 2012 at 5:12 pm):
   Hi, thanks for your tutorial. Could you please explain how to solve a differential equation with neural networks? Regards

   Vipul Lugade says (March 14, 2012 at 12:56 am):
   Deepika, we'll try to do something on this in the near future. Thanks, Vipul


5. JJ says (February 16, 2012 at 12:33 am):
   Shouldn't the input entered be:
   input = [0 0; 1 0; 0 1; 1 1];
   instead of:
   input = [0 0; 0 1; 1 0; 1 1];
   I'm sure I'm just confused, but I need to use the following input data (and am uncertain about how to enter it):
   X1 = 0, 0, 1, 1
   X2 = 0, 1, 0, 1
   Would it be input = [0 0; 0 1; 1 0; 1 1] or input = [0 0; 1 0; 0 1; 1 1]? Your help would be much appreciated.

   Vipul Lugade says (March 14, 2012 at 12:54 am):
   You are correct. In our example here for OR, both [1 0] and [0 1] map to an output of 1, though, so it still works. If you have a matrix of inputs = [X1 X2] which are defined as follows:
   X1 = 0, 0, 1, 1
   X2 = 0, 1, 0, 1
   then you would use this: input = [0 0; 0 1; 1 0; 1 1]

6. jamileh Yousefi says (December 31, 2011 at 2:35 am):
   Hello, I've tried this example. I always get the same results:
   out: 0.5 0.5 0.5 0.5
   weights: 0 0 0
   I don't know what is wrong with my code. Please help. Here is my code:

   input = [0 0; 0 1; 1 0; 1 1];
   numIn = 4;
   desired_out = [0;1;1;1];
   bias = -1;
   coeff = 1;
   %rand('state', sum(100*clock));
   weights = -1*2.*rand(3,1);
   iterations = 10000;
   for i = 1:iterations
      out = zeros(4,1);
      for j = 1:numIn
         y = bias*weights(1,1) + input(j,1)*weights(2,1) + input(j,2)*weights(3,1);
         out(j) = 1/(1+exp(-y));
         delta = desired_out(j)-out(j);
         weights(1,1) = weights(1,1)*coeff*bias*delta;
         weights(2,1) = weights(2,1)*coeff*input(j,1)*delta;
         weights(3,1) = weights(3,1)*coeff*input(j,2)*delta;
      end
   end

   jamileh Yousefi says (December 31, 2011 at 2:59 am):
   Solved.

7. sue says (December 21, 2011 at 7:27 am):
   Hi, thanks for the good explanation of the perceptron. I have one question: this program is to train on the input, right? Then how am I going to test an input for classification using the perceptron?

8. Vipul Lugade says (October 27, 2011 at 12:59 am):
   David, I don't know if I follow your question. You could plot the results, residuals, MSE errors, or other variables over each iteration. If you want to do something like this, that would be possible. If this isn't what you were looking for, let me know. Vipul

9. David says (October 16, 2011 at 4:15 pm):
   Sorry for my English. How can I plot this perceptron? Thanks.

10. Vipul says (May 23, 2011 at 11:36 pm):
   Saeed, an implementation of a multilayer perceptron is now available: http://matlabgeeks.com/tips-tutorials/neural-networks-a-multilayer-perceptron-in-matlab/ Take care, Vipul

11. saeed says (May 20, 2011 at 12:16 pm):
   Hi, thank you for this brief and useful tutorial.


   I'd really appreciate it if you could send me a multilayer perceptron implementation using Matlab. Best regards.
