Providing detailed examples of simple applications, this book introduces the use of neural networks. It covers simple neural nets for pattern classification, pattern association, neural networks based on competition, adaptive resonance theory, and more. For professionals working with neural networks.
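To give a flavor of the "simple neural nets for pattern classification" treated in Chapter 2, here is a minimal sketch (not code from the book) of the Hebb learning rule of Section 2.2 applied to the bipolar AND function; the use of NumPy, the variable names, and the choice of training example are illustrative assumptions.

    import numpy as np

    # Bipolar training pairs for the AND function: inputs X, targets T.
    X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
    T = np.array([1, -1, -1, -1])

    w = np.zeros(2)  # weights, initialized to zero
    b = 0.0          # bias, initialized to zero

    # Hebb rule (one pass over the training set):
    #   w <- w + x * t,  b <- b + t  for each training pair (x, t).
    for x, t in zip(X, T):
        w = w + x * t
        b = b + t

    # Classify with a bipolar step function on the net input x.w + b.
    for x, t in zip(X, T):
        y = 1 if x @ w + b >= 0 else -1
        print(x, "->", y, "(target:", t, ")")

After this single pass the weights are (2, 2) with bias -2, and the net classifies all four bipolar AND patterns correctly.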
Author(s): Laurene V. Fausett
Edition: Paperback
Publisher: Pearson
Year: 1993
Language: English
Pages: 480
Preface 7
Acknowledgments 9
Chapter 1 Introduction 11
1.1 Why Neural Networks, and Why Now? 11
1.2 What Is a Neural Net? 13
1.2.1 Artificial Neural Networks 13
1.2.2 Biological Neural Networks 15
1.3 Where Are Neural Nets Being Used? 17
1.3.1 Signal Processing 17
1.3.2 Control 18
1.3.3 Pattern Recognition 18
1.3.4 Medicine 19
1.3.5 Speech Production 19
1.3.6 Speech Recognition 19
1.3.7 Business 19
1.4 How Are Neural Networks Used? 21
1.4.1 Typical Architectures 22
1.4.2 Setting the Weights 25
1.4.3 Common Activation Functions 27
1.4.4 Summary of Notation 30
1.5 Who Is Developing Neural Networks? 32
1.5.1 The 1940s: The Beginning of Neural Nets 32
1.5.2 The 1950s and 1960s: The First Golden Age of Neural Networks 32
1.5.3 The 1970s: The Quiet Years 34
1.5.4 The 1980s: Renewed Enthusiasm 35
1.6 When Neural Nets Began: The McCulloch-Pitts Neuron 36
1.6.1 Architecture 37
1.6.2 Algorithm 38
1.6.3 Applications 40
1.7 Suggestions for Further Study 45
1.7.1 Readings 45
1.7.2 Exercises 47
Chapter 2 Simple Neural Nets for Pattern Classification 49
2.1 General Discussion 49
2.1.1 Architecture 50
2.1.2 Biases and Thresholds 51
2.1.3 Linear Separability 53
2.1.4 Data Representation 58
2.2 Hebb Net 58
2.2.1 Algorithm 59
2.2.2 Application 60
2.3 Perceptron 69
2.3.1 Architecture 70
2.3.2 Algorithm 71
2.3.3 Application 72
2.3.4 Perceptron Learning Rule Convergence Theorem 86
2.4 Adaline 90
2.4.1 Architecture 91
2.4.2 Algorithm 91
2.4.3 Applications 92
2.4.4 Derivations 96
2.4.5 Madaline 98
2.5 Suggestions for Further Study 106
2.5.1 Readings 106
2.5.2 Exercises 107
2.5.3 Projects 110
Chapter 3 Pattern Association 111
3.1 Training Algorithms for Pattern Association 113
3.1.1 Hebb Rule for Pattern Association 113
3.1.2 Delta Rule for Pattern Association 116
3.2 Heteroassociative Memory Neural Network 118
3.2.1 Architecture 118
3.2.2 Application 118
3.3 Autoassociative Net 131
3.3.1 Architecture 131
3.3.2 Algorithm 132
3.3.3 Application 132
3.3.4 Storage Capacity 135
3.4 Iterative Autoassociative Net 139
3.4.1 Recurrent Linear Autoassociator 140
3.4.2 Brain-State-in-a-Box 141
3.4.3 Autoassociator with Threshold Function 142
3.4.4 Discrete Hopfield Net 145
3.5 Bidirectional Associative Memory (BAM) 150
3.5.1 Architecture 151
3.5.2 Algorithm 151
3.5.3 Application 154
3.5.4 Analysis 158
3.6 Suggestions for Further Study 159
3.6.1 Readings 159
3.6.2 Exercises 160
3.6.3 Projects 162
Chapter 4 Neural Networks Based on Competition 166
4.1 Fixed-Weight Competitive Nets 168
4.1.1 Maxnet 168
4.1.2 Mexican Hat 170
4.1.3 Hamming Net 174
4.2 Kohonen Self-Organizing Maps 179
4.2.1 Architecture 179
4.2.2 Algorithm 180
4.2.3 Application 182
4.3 Learning Vector Quantization 197
4.3.1 Architecture 197
4.3.2 Algorithm 198
4.3.3 Application 199
4.3.4 Variations 202
4.4 Counterpropagation 205
4.4.1 Full Counterpropagation 206
4.4.2 Forward-Only Counterpropagation 216
4.5 Suggestions for Further Study 221
4.5.1 Readings 221
4.5.2 Exercises 221
4.5.3 Projects 224
Chapter 5 Adaptive Resonance Theory 228
5.1 Introduction 228
5.1.1 Motivation 228
5.1.2 Basic Architecture 229
5.1.3 Basic Operation 230
5.2 ART1 232
5.2.1 Architecture 232
5.2.2 Algorithm 235
5.2.3 Applications 239
5.2.4 Analysis 253
5.3 ART2 256
5.3.1 Architecture 257
5.3.2 Algorithm 260
5.3.3 Applications 267
5.3.4 Analysis 286
5.4 Suggestions for Further Study 293
5.4.1 Readings 293
5.4.2 Exercises 294
5.4.3 Projects 297
Chapter 6 Backpropagation Neural Net 299
6.1 Standard Backpropagation 299
6.1.1 Architecture 300
6.1.2 Algorithm 300
6.1.3 Applications 310
6.2 Variations 315
6.2.1 Alternative Weight Update Procedures 315
6.2.2 Alternative Activation Functions 319
6.2.3 Strictly Local Backpropagation 326
6.2.4 Number of Hidden Layers 330
6.3 Theoretical Results 334
6.3.1 Derivation of Learning Rules 334
6.3.2 Multilayer Neural Nets as Universal Approximators 338
6.4 Suggestions for Further Study 340
6.4.1 Readings 340
6.4.2 Exercises 340
6.4.3 Projects 342
Chapter 7 A Sampler of Other Neural Nets 345
7.1 Fixed-Weight Nets for Constrained Optimization 346
7.1.1 Boltzmann Machine 349
7.1.2 Continuous Hopfield Net 359
7.1.3 Gaussian Machine 368
7.1.4 Cauchy Machine 370
7.2 A Few More Nets that Learn 373
7.2.1 Modified Hebbian Learning 373
7.2.2 Boltzmann Machine with Learning 378
7.2.3 Simple Recurrent Net 383
7.2.4 Backpropagation in Time 388
7.2.5 Backpropagation Training for Fully Recurrent Nets 395
7.3 Adaptive Architectures 396
7.3.1 Probabilistic Neural Net 396
7.3.2 Cascade Correlation 401
7.4 Neocognitron 409
7.4.1 Architecture 410
7.4.2 Algorithm 418
7.5 Suggestions for Further Study 429
7.5.1 Readings 429
7.5.2 Exercises 429
7.5.3 Project 431
Glossary 433
References 449
Index 461