Overview
What are Neural Networks?
Why Neural Networks?
Human and Artificial Neurons - Investigating the Similarities
The Back-propagation Algorithm
Applications
Conclusion
Future Enhancements
An information-processing paradigm.
Composed of a large number of highly interconnected processing elements working in unison to solve specific problems.
Also known as neurocomputers, connectionist computers, parallel distributed processors, etc. Key element: the novel structure of the information-processing system.
Knowledge is acquired by the network from its environment through a learning process. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
Used to extract patterns and detect trends that are too complex to be noticed by humans or other computer techniques. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze. This expert can then be used to provide projections for new situations of interest and to answer "what if" questions.
Other advantages
Disadvantages
Because the network finds out how to solve the problem by itself, its operation can be unpredictable.
Requires long processing times for large neural networks. Different architectures also require different learning algorithms, despite being apparently simple systems.
Dendrites
- A neuron collects signals from other neurons through a host of fine structures called dendrites.
Axon
- The neuron sends out spikes of electrical activity through a long, thin strand known as an axon, which splits into thousands of branches.
Synapse
- At the end of each branch, a synapse converts the activity from the axon into electrical effects that inhibit or excite activity in the connected neurons.
Learning occurs by changing the effectiveness of the synapses, so that the influence of one neuron on another changes. Brain cells (neurons) are five to six orders of magnitude slower than silicon chips: events in a silicon chip happen in the nanosecond range, whereas neural events happen in the millisecond range. The brain makes up for the slow rate of operation of a neuron through massive interconnection between neurons.
We construct these neural networks by first trying to deduce the essential features of neurons and their interconnections. We then typically program a computer to simulate these features. However, because our knowledge of neurons is incomplete and our computing power is limited, our models are necessarily gross idealizations of real networks of neurons.
Inputs Xi:
Weights Wki:
real-valued numbers
Threshold u:
Also referred to as the bias value. The threshold can be regarded as another input/weight pair.
Activation function, f:
Output Yk:
The modification of synaptic weights provides the traditional method for the design of NNs.
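The pieces above (inputs Xi, weights Wki, threshold u, activation function f, output Yk) can be sketched as a single artificial neuron. This is a minimal illustration, not a definitive implementation; the sigmoid activation and the function names are assumptions made for the example.

```python
import math

def neuron_output(x, w, u):
    """Single artificial neuron: Yk = f(sum_i Wki * Xi - u),
    where f is a sigmoid activation and u is the threshold (bias)."""
    s = sum(wi * xi for wi, xi in zip(w, x)) - u
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid activation f

def neuron_output_bias_as_weight(x, w, u):
    """The threshold regarded as another input/weight pair:
    append a constant input of -1.0 whose weight is u."""
    return neuron_output(x + [-1.0], w + [u], 0.0)
```

Both functions produce the same output for the same inputs, which is why the threshold can be treated as just another weight during learning.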
Interconnection layers
Multilayer perceptron
Requires the desired output in order to learn, hence called a supervised network. Goal: to create a model that correctly maps the input to the output using historical data, so that the model can then be used to produce the output when the desired output is unknown.
To train an MLP and many other neural networks, the input data is repeatedly presented to the NN.
With each presentation, the error between the network output and the desired output is computed and fed back to the neural network. The neural network uses this error to adjust its weights so that the error decreases. In order to train a neural network to perform some task, we must adjust the weights of each unit in such a way that the error between the desired output and the actual output is reduced. This process requires that the NN compute the error derivative of the weights.
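The error-feedback loop described above can be sketched as a tiny MLP trained with back-propagation. This is a minimal sketch under stated assumptions: the 2-2-1 network size, sigmoid activations, learning rate, epoch count, and the XOR task are all illustrative choices, not part of the original text.

```python
import math
import random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def train_xor(epochs=2000, lr=0.5, seed=0):
    """Train a 2-2-1 MLP on XOR with back-propagation.
    Returns the total squared error before and after training."""
    rng = random.Random(seed)
    # Hidden layer: 2 neurons, each with 2 input weights + 1 bias weight.
    wh = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
    # Output neuron: 2 hidden weights + 1 bias weight.
    wo = [rng.uniform(-1, 1) for _ in range(3)]
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    def forward(x):
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in wh]
        y = sigmoid(wo[0] * h[0] + wo[1] * h[1] + wo[2])
        return h, y

    def total_error():
        return sum((t - forward(x)[1]) ** 2 for x, t in data)

    err_before = total_error()
    for _ in range(epochs):
        for x, t in data:
            h, y = forward(x)
            # Error derivative at the output (sigmoid derivative is y*(1-y)).
            dy = (y - t) * y * (1 - y)
            # Back-propagate the error derivative to the hidden layer.
            dh = [dy * wo[j] * h[j] * (1 - h[j]) for j in range(2)]
            # Gradient-descent weight updates: move against the derivative.
            for j in range(2):
                wo[j] -= lr * dy * h[j]
            wo[2] -= lr * dy
            for j in range(2):
                wh[j][0] -= lr * dh[j] * x[0]
                wh[j][1] -= lr * dh[j] * x[1]
                wh[j][2] -= lr * dh[j]
    return err_before, total_error()
```

Each weight is adjusted in the direction that reduces the error, which is exactly the "compute the error, feed it back, adjust the weights" cycle described above.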
Applications
Neural networks have broad applicability to real-world problems. They have been successfully applied to a broad spectrum of data-intensive applications, such as: voice recognition, target recognition, medical diagnosis, process modeling and control, credit rating, targeted marketing, and financial forecasting.
Conclusion
The ability of NNs to learn makes them very flexible and powerful. There is no need to devise an algorithm in order to perform a specific task (no need to understand its internal mechanisms). They are well suited to real-time systems because of their fast response and computation times, which are due to their parallel architecture. Even though NNs have huge potential, we will only get the best from them when they are integrated with conventional computing.
Future Enhancements
Robots that can see, feel, and predict the world around them.
Improved stock prediction.
Common usage of self-driving cars.
Handwritten documents automatically transformed into formatted word-processing documents.
Composition of music.
Self-diagnosis of medical problems using NNs.
And much more.