Author(s): Madan M Gupta; Liang Jin; Noriyasu Homma
Publisher: Wiley
Year: 2003
Language: English
Pages: 751
City: New York
Tags: Computer Science; Artificial Intelligence; Neural Networks
Cover......Page 1
Contents......Page 8
Preface......Page 24
Acknowledgments......Page 28
Part I FOUNDATIONS OF NEURAL NETWORKS......Page 30
1 Neural Systems: An Introduction......Page 32
1.1 BASICS OF NEURONAL MORPHOLOGY......Page 33
1.2 THE NEURON......Page 37
1.3 NEUROCOMPUTATIONAL SYSTEMS: SOME PERSPECTIVES......Page 38
1.4 NEURONAL LEARNING......Page 41
1.5 THEORY OF NEURONAL APPROXIMATIONS......Page 42
1.6 FUZZY NEURAL SYSTEMS......Page 43
1.7.1 Neurovision Systems......Page 44
1.7.3 Neural Hardware Implementations......Page 45
1.8 AN OVERVIEW OF THE BOOK......Page 46
2 Biological Foundations of Neuronal Morphology......Page 50
2.1.1 Basic Neuronal Structure......Page 51
2.1.2 Neural Electrical Signals......Page 54
2.2 NEURAL INFORMATION PROCESSING......Page 56
2.2.1 Neural Mathematical Operations......Page 57
2.2.2 Sensorimotor Feedback Structure......Page 59
2.2.3 Dynamic Characteristics......Page 60
2.3.1 Types of Human Memory......Page 61
2.3.2 Features of Short-Term and Long-Term Memories......Page 63
2.3.3 Content-Addressable and Associative Memory......Page 64
2.4.1 Types of Human Learning......Page 65
2.5 CONCLUDING REMARKS......Page 67
2.6 SOME BIOLOGICAL KEYWORDS......Page 68
3 Neural Units: Concepts, Models, and Learning......Page 72
3.1 NEURONS AND THRESHOLD LOGIC: SOME BASIC CONCEPTS......Page 73
3.1.1 Some Basic Binary Logical Operations......Page 74
3.1.2 Neural Models for Threshold Logics......Page 76
3.2.1 Realization of Switching Function......Page 80
3.3.1 Concept of Parameter Adaptation......Page 91
3.3.2 The Perceptron Rule of Adaptation......Page 94
3.3.3 Mays Rule of Adaptation......Page 97
3.4 ADAPTIVE LINEAR ELEMENT (ADALINE)......Page 99
3.4.1 α-LMS (Least Mean Square) Algorithm......Page 100
3.4.2 Mean Square Error Method......Page 104
3.5.1 Nonlinear Sigmoidal Functions......Page 109
3.5.2 Backpropagation for the Sigmoid Adaline......Page 111
3.6 NETWORKS WITH MULTIPLE NEURONS......Page 113
3.6.1 A Simple Network with Three Neurons......Page 114
3.6.2 Error Backpropagation Learning......Page 117
3.7 CONCLUDING REMARKS......Page 123
Part II STATIC NEURAL NETWORKS......Page 132
4 Multilayered Feedforward Neural Networks (MFNNs) and Backpropagation Learning Algorithms......Page 134
4.1.1 Structure and Operation Equations......Page 136
4.1.2 Generalized Delta Rule......Page 141
4.1.3 Network with Linear Output Units......Page 147
4.2.1 Network Model......Page 150
4.2.2 Simulation Results......Page 152
4.2.3 Geometric Explanation......Page 156
4.3 BACKPROPAGATION (BP) ALGORITHMS FOR MFNN......Page 158
4.3.1 General Neural Structure for MFNNs......Page 159
4.3.2 Extension of the Generalized Delta Rule to General MFNN Structures......Page 164
4.4.1 Optimality Conditions......Page 169
4.4.2 Weight Updating......Page 171
4.4.3 Transforming the Parameter Space......Page 172
4.5.1 Modified Increment Formulation......Page 173
4.5.2 Effect of Momentum Term......Page 175
4.6.1 Updating Procedure......Page 178
4.6.2 Signal Propagation in MFNN Architecture......Page 180
4.7.1 Initial Values of Weights and Learning Rate......Page 184
4.7.2 Number of Hidden Layers and Neurons......Page 187
4.7.3 Local Minimum Problem......Page 191
4.8 CONCLUDING REMARKS......Page 192
5 Advanced Methods for Learning and Adaptation in MFNNs......Page 200
5.1 DIFFERENT ERROR MEASURE CRITERIA......Page 201
5.1.1 Error Distributions and Lp Norms......Page 202
5.1.2 The Case of Generic Lp Norm......Page 204
5.2 COMPLEXITIES IN REGULARIZATION......Page 206
5.2.1 Weight Decay Approach......Page 208
5.2.2 Weight Elimination Approach......Page 209
5.2.3 Chauvin's Penalty Approach......Page 210
5.3.1 First-Order Pruning Procedures......Page 212
5.3.2 Second-Order Pruning Procedures......Page 215
5.4 EVALUATION OF THE HESSIAN MATRIX......Page 220
5.4.1 Diagonal Second-Order Derivatives......Page 221
5.4.2 General Second-Order Derivative Formulations......Page 225
5.5 SECOND-ORDER OPTIMIZATION LEARNING ALGORITHMS......Page 227
5.5.1 Quasi-Newton Methods......Page 228
5.5.2 Conjugate Gradient (CG) Methods for Learning......Page 229
5.6.1 Linearized Least Squares Learning (LLSL)......Page 231
5.6.2 Decomposed Extended Kalman Filter (DEKF) Learning......Page 233
5.7 TAPPED DELAY LINE NEURAL NETWORKS (TDLNNs)......Page 237
5.8 APPLICATIONS OF TDLNNs FOR ADAPTIVE CONTROL SYSTEMS......Page 240
5.9 CONCLUDING REMARKS......Page 244
6 Radial Basis Function Neural Networks......Page 252
6.1.1 Basic Radial Basis Function Network Models......Page 253
6.1.2 RBFNs and Interpolation Problem......Page 256
6.1.3 Solving Overdetermined Equations......Page 261
6.2.1 Gaussian RBF Network Model......Page 264
6.2.2 Gaussian RBF Networks as Universal Approximator......Page 268
6.3.1 K-Means Clustering-Based Learning Procedures in Gaussian RBF Neural Network......Page 271
6.3.2 Supervised (Gradient Descent) Parameter Learning in Gaussian Networks......Page 274
6.4 CONCLUDING REMARKS......Page 275
7 Function Approximation Using Feedforward Neural Networks......Page 282
7.1 STONE-WEIERSTRASS THEOREM AND ITS FEEDFORWARD NETWORKS......Page 283
7.1.1 Basic Definitions......Page 284
7.1.2 Stone-Weierstrass Theorem and Approximation......Page 285
7.1.3 Implications for Neural Networks......Page 287
7.2 TRIGONOMETRIC FUNCTION NEURAL NETWORKS......Page 289
7.3 MFNNs AS UNIVERSAL APPROXIMATORS......Page 295
7.3.1 Sketch Proof for Two-Layered Networks......Page 296
7.3.2 Approximation Using General MFNNs......Page 300
7.4 KOLMOGOROV'S THEOREM AND FEEDFORWARD NETWORKS......Page 303
7.5 HIGHER-ORDER NEURAL NETWORKS (HONNs)......Page 308
7.6.1 Sigma-Pi Neural Networks (S-PNNs)......Page 316
7.6.2 Ridge Polynomial Neural Networks (RPNNs)......Page 317
7.7 CONCLUDING REMARKS......Page 320
Part III DYNAMIC NEURAL NETWORKS......Page 324
8 Dynamic Neural Units (DNUs): Nonlinear Models and Dynamics......Page 326
8.1.1 A Generalized DNU Model......Page 327
8.1.2 Some Typical DNU Structures......Page 330
8.2.1 An Isolated DNU......Page 336
8.2.2 DNU Models: Some Extensions and Their Properties......Page 337
8.3.1 A General Model......Page 346
8.3.2 Positive-Negative (PN) Neural Structure......Page 349
8.3.3 Further Extension to the PN Neural Model......Page 351
8.4 NEURON WITH MULTIPLE NONLINEAR FEEDBACK......Page 353
8.5 DYNAMIC TEMPORAL BEHAVIOR OF DNN......Page 356
8.6.1 Equilibrium Points of a DNU......Page 360
8.6.2 Stability of the DNU......Page 362
8.6.3 Pitchfork Bifurcation in the DNU......Page 363
8.7 CONCLUDING REMARKS......Page 367
9 Continuous-Time Dynamic Neural Networks......Page 374
9.1 DYNAMIC NEURAL NETWORK STRUCTURES: AN INTRODUCTION......Page 375
9.2.1 State Space Model of the Hopfield DNN......Page 380
9.2.2 Output Variable Model of the Hopfield DNN......Page 383
9.2.3 State Stability of Hopfield DNN......Page 386
9.2.4 A General Form of Hopfield DNN......Page 390
9.3 HOPFIELD DYNAMIC NEURAL NETWORKS (DNNs) AS GRADIENT-LIKE SYSTEMS......Page 392
9.4.1 Hopfield Dynamic Neural Networks with Triangular Weighting Matrix......Page 398
9.4.2 Hopfield Dynamic Neural Network with Infinite Gain (Hard Threshold Switch)......Page 401
9.4.3 Some Restrictions on the Internal Neural States of the Hopfield DNN......Page 402
9.4.4 Dynamic Neural Network with Saturation (DNN-S)......Page 403
9.4.5 Dynamic Neural Network with Integrators......Page 407
9.5.1 The Pineda Model of Dynamic Neural Networks......Page 409
9.5.2 Cohen-Grossberg Model of Dynamic Neural Network......Page 411
9.6.1 Conditions for Equilibrium Points of DNN-1......Page 413
9.6.2 Conditions for Equilibrium Points of DNN-2......Page 415
9.7 CONCLUDING REMARKS......Page 416
10 Learning and Adaptation in Dynamic Neural Networks......Page 422
10.1 SOME OBSERVATIONS ON DYNAMIC NEURAL FILTER BEHAVIORS......Page 424
10.2 TEMPORAL LEARNING PROCESS I: DYNAMIC BACKPROPAGATION (DBP)......Page 427
10.2.1 Dynamic Backpropagation for CT-DNU......Page 428
10.2.2 Dynamic Backpropagation for DT-DNU......Page 432
10.2.3 Comparison between Continuous- and Discrete-Time Dynamic Backpropagation Approaches......Page 436
10.3.1 Continuous-Time Dynamic Forward Propagation (CT-DFP)......Page 440
10.3.2 Discrete-Time Dynamic Forward Propagation (DT-DFP)......Page 443
10.4.1 General Representation of Network Models......Page 450
10.4.2 DBF Learning Algorithms......Page 453
10.5 CONCLUDING REMARKS......Page 460
11 Stability of Continuous-Time Dynamic Neural Networks......Page 464
11.1 LOCAL ASYMPTOTIC STABILITY......Page 465
11.1.1 Lyapunov's First Method......Page 466
11.1.2 Determination of Eigenvalue Position......Page 469
11.1.3 Local Asymptotic Stability Conditions......Page 472
11.2.1 Lyapunov Function Method......Page 473
11.2.2 Diagonal Lyapunov Function for DNNs......Page 474
11.2.3 DNNs with Synapse-Dependent Functions......Page 477
11.2.4 Some Examples......Page 479
11.3.1 Lyapunov Function Method for Exponential Stability......Page 481
11.3.2 Local Exponential Stability Conditions for DNNs......Page 482
11.4 GLOBAL EXPONENTIAL STABILITY OF DNNs......Page 490
11.5 CONCLUDING REMARKS......Page 493
12 Discrete-Time Dynamic Neural Networks and Their Stability......Page 498
12.1 GENERAL CLASS OF DISCRETE-TIME DYNAMIC NEURAL NETWORKS (DT-DNNs)......Page 499
12.2.1 Lyapunov's Second Method of Stability......Page 503
12.2.2 Lyapunov's First Method......Page 504
12.3 STABILITY CONDITIONS FOR DISCRETE-TIME DNNs......Page 507
12.3.1 Global State Convergence for Symmetric Weight Matrix......Page 508
12.3.3 Diagonal Lyapunov Function Method......Page 510
12.3.4 Examples......Page 515
12.4 MORE GENERAL RESULTS ON GLOBAL ASYMPTOTIC STABILITY......Page 517
12.4.1 Main Stability Results......Page 519
12.4.2 Examples......Page 525
12.5 CONCLUDING REMARKS......Page 529
Part IV SOME ADVANCED TOPICS IN NEURAL NETWORKS......Page 536
13 Binary Neural Networks......Page 538
13.1.1 Basic Definitions......Page 539
13.1.2 Lyapunov Function Method......Page 548
13.2.1 State Operating Equations......Page 550
13.2.2 State Convergence of Hopfield Neural Network with Zero-Diagonal Elements......Page 553
13.2.3 State Convergence of Dynamic Neural Network with Nonnegative Diagonal Elements......Page 559
13.2.4 Estimation of Transient Time......Page 563
13.3.1 Binary State Updating......Page 568
13.3.2 Formulations for Transient Time in Asynchronous Mode......Page 572
13.4.1 Neural Network with Symmetric Weight Matrix......Page 576
13.4.2 Neural Network with Skew-Symmetric Weight Matrix......Page 585
13.4.3 Estimation of Transient Time......Page 589
13.5.1 State Updating with Ordered Partition......Page 590
13.5.2 Guaranteed Convergence Results for Block Sequential Operation......Page 593
13.6 CONCLUDING REMARKS......Page 600
14 Associative Memories......Page 608
14.1.1 Basis of Hebb's Learning Rule......Page 609
14.1.2 Hebb's Learning Formulations......Page 611
14.1.3 Convergence Considerations......Page 613
14.2.1 The Hamming Distance (HD)......Page 620
14.2.2 Self- Recall of Stored Patterns......Page 621
14.2.3 Attractivity in Synchronous Mode......Page 626
14.3.1 Convergence for Nonorthogonal Patterns......Page 637
14.3.2 Storage of Nonorthogonal Patterns......Page 642
14.4.1 The Projection Learning Rule......Page 647
14.4.2 A Generalized Learning Rule......Page 649
14.5 INFORMATION CAPACITY OF BINARY HOPFIELD NEURAL NETWORK......Page 653
14.6 CONCLUDING REMARKS......Page 655
15 Fuzzy Sets and Fuzzy Neural Networks......Page 662
15.1.1 Some Preliminaries......Page 665
15.1.2 Fuzzy Membership Functions (FMFs)......Page 668
15.1.3 Fuzzy Systems......Page 670
15.2 BUILDING FUZZY NEURONS (FNs) USING FUZZY ARITHMETIC AND FUZZY LOGIC OPERATIONS......Page 673
15.2.1 Definition of Fuzzy Neurons......Page 674
15.2.2 Utilization of T and S Operators......Page 676
15.3.1 Updating Formulation......Page 681
15.3.2 Calculations of Partial Derivatives......Page 683
15.4 REGULAR FUZZY NEURAL NETWORKS (RFNNs)......Page 684
15.4.1 Regular Fuzzy Neural Network (RFNN) Structures......Page 685
15.4.2 Fuzzy Backpropagation (FBP) Learning......Page 686
15.4.3 Some Limitations of Regular Fuzzy Neural Networks (RFNNs)......Page 687
15.5.1 Difference-Measure-Based Two-Layered HFNNs......Page 691
15.5.2 Fuzzy Neurons and Hybrid Fuzzy Neural Networks (HFNNs)......Page 694
15.5.3 Derivation of Backpropagation Algorithm for Hybrid Fuzzy Neural Networks......Page 696
15.5.4 Summary of Fuzzy Backpropagation (FBP) Algorithm......Page 699
15.6 FUZZY BASIS FUNCTION NETWORKS (FBFNs)......Page 700
15.6.1 Gaussian Networks versus Fuzzy Systems......Page 701
15.6.2 Fuzzy Basis Function Networks (FBFNs) Are Universal Approximators......Page 706
15.7 CONCLUDING REMARKS......Page 708
References and Bibliography......Page 716
Appendix A Current Bibliographic Sources on Neural Networks......Page 740
Index......Page 744