The subject of adaptive filters constitutes an important part of statistical signal processing. Whenever there is a requirement to process signals that result from operation in an environment of unknown statistics, or in one that is inherently nonstationary, an adaptive filter offers a highly attractive solution: it provides a significant improvement in performance over a fixed filter designed by conventional methods. Furthermore, adaptive filters provide new signal-processing capabilities that would not be possible otherwise. Adaptive filters have thus been successfully applied in fields as diverse as communications, control, radar, sonar, seismology, and biomedical engineering.
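The advantage over a fixed design can be illustrated with a minimal sketch (not from the book) of the least-mean-square (LMS) algorithm, the workhorse adaptive filter the text develops in Chapters 5 and 6. Here the filter identifies an unknown FIR system from input/output data alone; the tap count, step size, and the "unknown" system below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unknown system the adaptive filter must discover (illustrative values).
h_true = np.array([0.5, -0.3, 0.1])

n_taps = 3          # adaptive filter length (assumed to match the system)
mu = 0.05           # step size; must be small enough for stability
w = np.zeros(n_taps)

x = rng.standard_normal(5000)            # white input signal
d = np.convolve(x, h_true)[:len(x)]      # desired response (system output)

for n in range(n_taps, len(x)):
    u = x[n:n - n_taps:-1]               # most recent n_taps input samples
    e = d[n] - w @ u                     # a priori estimation error
    w = w + mu * e * u                   # LMS weight update

print(np.round(w, 3))   # converges close to h_true
```

Because the desired response here is noise-free, the weight vector settles essentially onto `h_true`; with observation noise, the weights instead hover around it with a misadjustment governed by the step size `mu`.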
Author(s): Simon Haykin
Edition: 5th, international
Publisher: Pearson
Year: 2014
Language: English
Pages: 907
City: Harlow
Tags: adaptive, filter, haykin
Cover......Page 1
Title......Page 2
Contents......Page 5
Preface......Page 11
Acknowledgments......Page 17
1. The Filtering Problem......Page 20
3. Adaptive Filters......Page 23
4. Linear Filter Structures......Page 25
5. Approaches to the Development of Linear Adaptive Filters......Page 31
6. Adaptive Beamforming......Page 32
7. Four Classes of Applications......Page 36
8. Historical Notes......Page 39
1.1 Partial Characterization of a Discrete-Time Stochastic Process......Page 49
1.2 Mean Ergodic Theorem......Page 51
1.3 Correlation Matrix......Page 53
1.4 Correlation Matrix of Sine Wave Plus Noise......Page 58
1.5 Stochastic Models......Page 59
1.6 Wold Decomposition......Page 65
1.7 Asymptotic Stationarity of an Autoregressive Process......Page 68
1.8 Yule–Walker Equations......Page 70
1.9 Computer Experiment: Autoregressive Process of Order Two......Page 71
1.10 Selecting the Model Order......Page 79
1.11 Complex Gaussian Processes......Page 82
1.12 Power Spectral Density......Page 84
1.13 Properties of Power Spectral Density......Page 86
1.14 Transmission of a Stationary Process Through a Linear Filter......Page 88
1.15 Cramér Spectral Representation for a Stationary Process......Page 91
1.16 Power Spectrum Estimation......Page 93
1.17 Other Statistical Characteristics of a Stochastic Process......Page 96
1.18 Polyspectra......Page 97
1.19 Spectral-Correlation Density......Page 100
1.20 Summary and Discussion......Page 103
Problems......Page 104
2.1 Linear Optimum Filtering: Statement of the Problem......Page 109
2.2 Principle of Orthogonality......Page 111
2.3 Minimum Mean-Square Error......Page 115
2.4 Wiener–Hopf Equations......Page 117
2.5 Error-Performance Surface......Page 119
2.6 Multiple Linear Regression Model......Page 123
2.7 Example......Page 125
2.8 Linearly Constrained Minimum-Variance Filter......Page 130
2.9 Generalized Sidelobe Cancellers......Page 135
2.10 Summary and Discussion......Page 141
Problems......Page 143
3.1 Forward Linear Prediction......Page 151
3.2 Backward Linear Prediction......Page 158
3.3 Levinson–Durbin Algorithm......Page 163
3.4 Properties of Prediction-Error Filters......Page 172
3.5 Schur–Cohn Test......Page 181
3.6 Autoregressive Modeling of a Stationary Stochastic Process......Page 183
3.7 Cholesky Factorization......Page 186
3.8 Lattice Predictors......Page 189
3.9 All-Pole, All-Pass Lattice Filter......Page 194
3.10 Joint-Process Estimation......Page 196
3.11 Predictive Modeling of Speech......Page 200
3.12 Summary and Discussion......Page 207
Problems......Page 208
4.1 Basic Idea of the Steepest-Descent Algorithm......Page 218
4.2 The Steepest-Descent Algorithm Applied to the Wiener Filter......Page 219
4.3 Stability of the Steepest-Descent Algorithm......Page 223
4.4 Example......Page 228
4.5 The Steepest-Descent Algorithm Viewed as a Deterministic Search Method......Page 240
4.6 Virtue and Limitation of the Steepest-Descent Algorithm......Page 241
4.7 Summary and Discussion......Page 242
Problems......Page 243
5.1 Principles of Stochastic Gradient Descent......Page 247
5.2 Application 1: Least-Mean-Square (LMS) Algorithm......Page 249
5.3 Application 2: Gradient-Adaptive Lattice Filtering Algorithm......Page 256
5.4 Other Applications of Stochastic Gradient Descent......Page 263
5.5 Summary and Discussion......Page 264
Problems......Page 265
6.1 Signal-Flow Graph......Page 267
6.2 Optimality Considerations......Page 269
6.3 Applications......Page 271
6.4 Statistical Learning Theory......Page 291
6.5 Transient Behavior and Convergence Considerations......Page 302
6.6 Efficiency......Page 305
6.7 Computer Experiment on Adaptive Prediction......Page 307
6.8 Computer Experiment on Adaptive Equalization......Page 312
6.9 Computer Experiment on a Minimum-Variance Distortionless-Response Beamformer......Page 321
6.10 Summary and Discussion......Page 325
Problems......Page 327
7.1 Normalized LMS Algorithm: The Solution to a Constrained Optimization Problem......Page 334
7.2 Stability of the Normalized LMS Algorithm......Page 338
7.3 Step-Size Control for Acoustic Echo Cancellation......Page 341
7.4 Geometric Considerations Pertaining to the Convergence Process for Real-Valued Data......Page 346
7.5 Affine Projection Adaptive Filters......Page 349
7.6 Summary and Discussion......Page 353
Problems......Page 354
Chapter 8 Block-Adaptive Filters......Page 358
8.1 Block-Adaptive Filters: Basic Ideas......Page 359
8.2 Fast Block LMS Algorithm......Page 363
8.3 Unconstrained Frequency-Domain Adaptive Filters......Page 369
8.4 Self-Orthogonalizing Adaptive Filters......Page 370
8.5 Computer Experiment on Adaptive Equalization......Page 380
8.6 Subband Adaptive Filters......Page 386
8.7 Summary and Discussion......Page 394
Problems......Page 395
9.1 Statement of the Linear Least-Squares Estimation Problem......Page 399
9.2 Data Windowing......Page 402
9.3 Principle of Orthogonality Revisited......Page 403
9.4 Minimum Sum of Error Squares......Page 406
9.5 Normal Equations and Linear Least-Squares Filters......Page 407
9.6 Time-Average Correlation Matrix Φ......Page 410
9.7 Reformulation of the Normal Equations in Terms of Data Matrices......Page 412
9.8 Properties of Least-Squares Estimates......Page 416
9.9 Minimum-Variance Distortionless Response (MVDR) Spectrum Estimation......Page 420
9.10 Regularized MVDR Beamforming......Page 423
9.11 Singular-Value Decomposition......Page 428
9.12 Pseudoinverse......Page 435
9.13 Interpretation of Singular Values and Singular Vectors......Page 437
9.14 Minimum-Norm Solution to the Linear Least-Squares Problem......Page 438
9.15 Normalized LMS Algorithm Viewed as the Minimum-Norm Solution to an Underdetermined Least-Squares Estimation Problem......Page 441
9.16 Summary and Discussion......Page 443
Problems......Page 444
10.1 Some Preliminaries......Page 450
10.2 The Matrix Inversion Lemma......Page 454
10.3 The Exponentially Weighted RLS Algorithm......Page 455
10.4 Selection of the Regularization Parameter......Page 458
10.5 Updated Recursion for the Sum of Weighted Error Squares......Page 460
10.6 Example: Single-Weight Adaptive Noise Canceller......Page 462
10.7 Statistical Learning Theory......Page 463
10.8 Efficiency......Page 468
10.9 Computer Experiment on Adaptive Equalization......Page 469
10.10 Summary and Discussion......Page 472
Problems......Page 473
11.1 Robustness, Adaptation, and Disturbances......Page 475
11.2 Robustness: Preliminary Considerations Rooted in H∞ Optimization......Page 476
11.3 Robustness of the LMS Algorithm......Page 479
11.4 Robustness of the RLS Algorithm......Page 484
11.6 Risk-Sensitive Optimality......Page 489
11.7 Trade-Offs Between Robustness and Efficiency......Page 491
Problems......Page 493
Chapter 12 Finite-Precision Effects......Page 498
12.1 Quantization Errors......Page 499
12.2 Least-Mean-Square (LMS) Algorithm......Page 501
12.3 Recursive Least-Squares (RLS) Algorithm......Page 510
12.4 Summary and Discussion......Page 516
Problems......Page 517
13.1 Causes and Consequences of Nonstationarity......Page 519
13.2 The System Identification Problem......Page 520
13.3 Degree of Nonstationarity......Page 523
13.4 Criteria for Tracking Assessment......Page 524
13.5 Tracking Performance of the LMS Algorithm......Page 526
13.6 Tracking Performance of the RLS Algorithm......Page 529
13.7 Comparison of the Tracking Performance of LMS and RLS Algorithms......Page 533
13.8 Tuning of Adaptation Parameters......Page 537
13.9 Incremental Delta-Bar-Delta (IDBD) Algorithm......Page 539
13.10 Autostep Method......Page 545
13.11 Computer Experiment: Mixture of Stationary and Nonstationary Environmental Data......Page 549
13.12 Summary and Discussion......Page 553
Problems......Page 554
Chapter 14 Kalman Filters......Page 559
14.1 Recursive Minimum Mean-Square Estimation for Scalar Random Variables......Page 560
14.2 Statement of the Kalman Filtering Problem......Page 563
14.3 The Innovations Process......Page 566
14.4 Estimation of the State Using the Innovations Process......Page 568
14.5 Filtering......Page 574
14.6 Initial Conditions......Page 576
14.7 Summary of the Kalman Filter......Page 577
14.8 Optimality Criteria for Kalman Filtering......Page 578
14.9 Kalman Filter as the Unifying Basis for RLS Algorithms......Page 580
14.10 Covariance Filtering Algorithm......Page 585
14.11 Information Filtering Algorithm......Page 587
14.12 Summary and Discussion......Page 590
Problems......Page 591
15.1 Square-Root Kalman Filters......Page 595
15.2 Building Square-Root Adaptive Filters on the Two Kalman Filter Variants......Page 601
15.3 QRD-RLS Algorithm......Page 602
15.4 Adaptive Beamforming......Page 610
15.5 Inverse QRD-RLS Algorithm......Page 617
15.6 Finite-Precision Effects......Page 620
15.7 Summary and Discussion......Page 621
Problems......Page 622
Chapter 16 Order-Recursive Adaptive Filtering Algorithm......Page 626
16.1 Order-Recursive Adaptive Filters Using Least-Squares Estimation: An Overview......Page 627
16.2 Adaptive Forward Linear Prediction......Page 628
16.3 Adaptive Backward Linear Prediction......Page 631
16.4 Conversion Factor......Page 634
16.5 Least-Squares Lattice (LSL) Predictor......Page 637
16.6 Angle-Normalized Estimation Errors......Page 647
16.7 First-Order State-Space Models for Lattice Filtering......Page 651
16.8 QR-Decomposition-Based Least-Squares Lattice (QRD-LSL) Filters......Page 656
16.9 Fundamental Properties of the QRD-LSL Filter......Page 663
16.10 Computer Experiment on Adaptive Equalization......Page 668
16.11 Recursive LSL Filters Using A Posteriori Estimation Errors......Page 673
16.12 Recursive LSL Filters Using A Priori Estimation Errors with Error Feedback......Page 676
16.13 Relation Between Recursive LSL and RLS Algorithms......Page 681
16.14 Finite-Precision Effects......Page 684
16.15 Summary and Discussion......Page 686
Problems......Page 688
17.1 Overview of Blind Deconvolution......Page 695
17.2 Channel Identifiability Using Cyclostationary Statistics......Page 700
17.3 Subspace Decomposition for Fractionally Spaced Blind Identification......Page 701
17.4 Bussgang Algorithm for Blind Equalization......Page 715
17.5 Extension of the Bussgang Algorithm to Complex Baseband Channels......Page 732
17.6 Special Cases of the Bussgang Algorithm......Page 733
17.7 Fractionally Spaced Bussgang Equalizers......Page 737
17.8 Estimation of Unknown Probability Distribution Function of Signal Source......Page 742
17.9 Summary and Discussion......Page 746
Problems......Page 747
1. Robustness, Efficiency, and Complexity......Page 751
2. Kernel-Based Nonlinear Adaptive Filtering......Page 754
A.1 Cauchy–Riemann Equations......Page 771
A.2 Cauchy’s Integral Formula......Page 773
A.3 Laurent’s Series......Page 775
A.4 Singularities and Residues......Page 777
A.5 Cauchy’s Residue Theorem......Page 778
A.6 Principle of the Argument......Page 779
A.7 Inversion Integral for the z-Transform......Page 782
A.8 Parseval’s Theorem......Page 784
B.1 Wirtinger Calculus: Scalar Gradients......Page 786
B.2 Generalized Wirtinger Calculus: Gradient Vectors......Page 789
B.3 Another Approach to Compute Gradient Vectors......Page 791
B.4 Expressions for the Partial Derivatives......Page 792
C.1 Optimization Involving a Single Equality Constraint......Page 793
C.2 Optimization Involving Multiple Equality Constraints......Page 794
C.3 Optimum Beamformer......Page 795
D.1 Likelihood Function......Page 796
D.2 Cramér–Rao Inequality......Page 797
D.3 Properties of Maximum-Likelihood Estimators......Page 798
D.4 Conditional Mean Estimator......Page 799
E.1 The Eigenvalue Problem......Page 801
E.2 Properties of Eigenvalues and Eigenvectors......Page 803
E.3 Low-Rank Modeling......Page 817
E.4 Eigenfilters......Page 821
E.5 Eigenvalue Computations......Page 823
F.2 Langevin Equation......Page 826
G.1 Plane Rotations......Page 828
G.2 Two-Sided Jacobi Algorithm......Page 830
G.3 Cyclic Jacobi Algorithm......Page 836
G.4 Householder Transformation......Page 839
G.5 The QR Algorithm......Page 842
H.1 Definition......Page 849
H.2 The Chi-Square Distribution as a Special Case......Page 850
H.3 Properties of the Complex Wishart Distribution......Page 851
H.4 Expectation of the Inverse Correlation Matrix Φ⁻¹(n)......Page 852
Text Conventions......Page 853
Abbreviations......Page 856
Principal Symbols......Page 859
Bibliography......Page 865
Suggested Reading......Page 880
Index......Page 898