NEURAL NETWORKS
by
Hervé Abdi,
Dominique Valentin,
and
Betty Edelman
Abdi, H., Valentin, D., & Edelman, B. (1999).
Neural Networks.
Thousand Oaks, CA: Sage. (100 pp.)
Quantitative Applications in the Social Sciences Series #124.
ISBN 0-7619-1440-4. Price: $13.95.
Contents:
- Series Editor's Introduction v
- Notations vi
- Introduction 1
  - What are neural networks? 1
  - Overview of this book 2
- The Perceptron 3
  - Overview 3
  - The McCulloch and Pitts neuron 4
  - Architecture of a perceptron 8
  - Widrow-Hoff learning rule 9
  - Learning with +1/-1 cells 13
  - Performance evaluation 17
  - Perceptron and discriminant analysis 19
  - Learning and testing sets: The validation problem 20
- Linear Autoassociative Memories 21
  - Overview 21
  - The building block: the basic linear unit 22
  - Architecture of an autoassociative memory 23
  - Hebbian learning rule 25
  - Retrieval of a learned pattern 29
  - Limitations of Hebbian learning 31
  - Generalization to new stimuli 35
  - Types of errors 36
  - The Widrow-Hoff learning rule 37
  - Singular value decomposition: PCA models 41
  - The Widrow-Hoff rule and gradient descent 45
- Linear Heteroassociative Memories 46
  - Overview 46
  - Architecture of a heteroassociative memory 47
  - Hebbian learning rule 48
  - Widrow-Hoff learning rule 54
  - Widrow-Hoff learning and pseudo-inverse 58
  - Widrow-Hoff rule and gradient descent 60
  - Discriminant analysis and perceptron revisited 60
  - Radial basis function networks 61
- Error Backpropagation 68
  - Overview 68
  - Architecture and notation 69
  - The building block: nonlinear units 70
  - The backpropagation algorithm 73
  - Performance analysis 83
  - Backpropagation and gradient descent 85
  - Backpropagation and logistic regression 87
- Useful References 87
- References 88
- About the Authors 90