Neural Networks
Master’s Thesis
by
Prateeti Raj
Thesis Supervisor
Prof. Ashay Dharwadker
15th April 2004
Abstract
This thesis is a survey of the field of Neural Networks. Various types
of neural networks and learning algorithms are explained and demonstrated.
In particular, the Perceptron, Hamming and Hopfield networks and their
applications are described, and a detailed historical background is provided.
The mathematical model of a Threshold Logic Unit and the problem of finding
a linearly separable function that agrees with the desired training set
are explored. A calculus argument in the weight space then allows
us to derive the famous Widrow-Hoff delta rule using the zero-one step
function and the Werbos generalized delta rule using the sigmoid function.
We have implemented all of these neural networks and learning algorithms
in C++ and provide the source code of the software under the GNU General
Public License for noncommercial use on the accompanying CD.
Contents
1. The Human Brain
2. History of Neural Networks
3. What is a Neural Network?
4. Learning in a Neural Network
5. A Simple Example of a Neural Network
6. Problem Statement
7. Perceptron
8. Pattern Recognition Problem
9. Programs
10. Hamming Network
11. Program on Hamming Network
12. Hopfield Networks
13. Widrow-Hoff Algorithm
14. Program using Widrow-Hoff Algorithm
15. Werbos Algorithm
16. Summary of Presentations Given
17. Applications of Neural Networks
18. References