Adaptive Filters

Author(s): Ali H. Sayed
Publisher: Wiley
Year: 2008

Language: English
Pages: 820
Tags: Instrumentation; Signal processing

Adaptive Filters
Contents
Preface
Notation
Acknowledgments
BACKGROUND MATERIAL
A.1 Variance of a Random Variable
A.2 Dependent Random Variables
A.3 Complex-Valued Random Variables
A.4 Vector-Valued Random Variables
A.5 Gaussian Random Vectors
B.1 Hermitian and Positive-Definite Matrices
B.2 Range Spaces and Nullspaces of Matrices
B.3 Schur Complements
B.4 Cholesky Factorization
B.5 QR Decomposition
B.6 Singular Value Decomposition
B.7 Kronecker Products
C.1 Cauchy-Riemann Conditions
C.3 Vector Arguments
PART I: OPTIMAL ESTIMATION
1.1 Estimation Without Observations
1.2 Estimation Given Dependent Observations
1.3 Orthogonality Principle
1.4 Gaussian Random Variables
2.1 Optimal Estimator in the Vector Case
2.2 Spherically Invariant Gaussian Variables
2.3 Equivalent Optimization Criterion
Summary and Notes
Problems and Computer Projects
PART II: LINEAR ESTIMATION
3 Normal Equations
3.1 Mean-Square Error Criterion
3.3 Minimization by Completion-of-Squares
3.4 Minimization of the Error Covariance Matrix
3.5 Optimal Linear Estimator
4.1 Design Examples
4.2 Orthogonality Condition
4.3 Existence of Solutions
4.4 Nonzero-Mean Variables
5.1 Estimation Using Linear Relations
5.2 Application: Channel Estimation
5.3 Application: Block Data Estimation
5.4 Application: Linear Channel Equalization
5.5 Application: Multiple-Antenna Receivers
6 Constrained Estimation
6.1 Minimum-Variance Unbiased Estimation
6.2 Example: Mean Estimation
6.3 Application: Channel and Noise Estimation
6.4 Application: Decision Feedback Equalization
6.5 Application: Antenna Beamforming
7.1 Innovations Process
7.2 State-Space Model
7.3 Recursion for the State Estimator
7.4 Computing the Gain Matrix
7.6 Covariance Form
7.7 Measurement and Time-Update Form
Summary and Notes
Problems and Computer Projects
PART III: STOCHASTIC GRADIENT ALGORITHMS
8 Steepest-Descent Technique
8.1 Linear Estimation Problem
8.2 Steepest-Descent Method
8.3 More General Cost Functions
9.1 Modes of Convergence
9.2 Optimal Step-Size
9.3 Weight-Error Vector Convergence
9.4 Time Constants
9.5 Learning Curve
9.6 Contour Curves of the Error Surface
9.7 Iteration-Dependent Step-Sizes
9.8 Newton’s Method
10.1 Motivation
10.2 Instantaneous Approximation
10.3 Computational Cost
10.4 Least-Perturbation Property
10.5 Application: Adaptive Channel Estimation
10.6 Application: Adaptive Channel Equalization
10.7 Application: Decision-Feedback Equalization
10.8 Ensemble-Average Learning Curves
11.1 Instantaneous Approximation
11.2 Computational Cost
11.3 Power Normalization
11.4 Least-Perturbation Property
12.1 Non-Blind Algorithms
12.2 Blind Algorithms
12.3 Some Properties
13.1 Instantaneous Approximation
13.3 Least-Perturbation Property
13.4 Affine Projection Interpretation
14.1 Instantaneous Approximation
14.2 Computational Cost
Summary and Notes
Problems and Computer Projects
PART IV: MEAN-SQUARE PERFORMANCE
15.1 Performance Measure
15.2 Stationary Data Model
15.3 Energy Conservation Relation
15.4 Variance Relation
15.A Interpretations of the Energy Relation
16.1 Variance Relation
16.3 Separation Principle
16.4 White Gaussian Input
16.5 Statement of Results
16.6 Simulation Results
17.1 Separation Principle
17.A Relating NLMS to LMS
18.1 Real-Valued Data
18.2 Complex-Valued Data
18.3 Simulation Results
19.1 Performance of RLS
19.2 Performance of Other Filters
19.3 Performance Table for Small Step-Sizes
20.1 Motivation
20.2 Nonstationary Data Model
20.3 Energy Conservation Relation
20.4 Variance Relation
21.1 Performance of LMS
21.2 Performance of NLMS
21.3 Performance of Sign-Error LMS
21.4 Performance of RLS
21.5 Comparison of Tracking Performance
21.6 Comparing RLS and LMS
21.7 Performance of Other Filters
21.8 Performance Table for Small Step-Sizes
Summary and Notes
Problems and Computer Projects
PART V: TRANSIENT PERFORMANCE
22.1 Data Model
22.3 Weighted Energy Conservation Relation
22.4 Weighted Variance Relation
23.1 Mean and Variance Relations
23.3 Mean-Square Behavior
23.4 Mean-Square Stability
23.5 Steady-State Performance
23.A Convergence Time
24.1 Mean and Variance Relations
24.2 Mean-Square Stability and Performance
24.3 Small Step-Size Approximations
24.A Independence and Averaging Analysis
25.1 NLMS Filter
25.2 Data-Normalized Filters
25.A Stability Bound
25.B Stability of NLMS
Summary and Notes
Problems and Computer Projects
PART VI: BLOCK ADAPTIVE FILTERS
26.1 Transform-Domain Filters
26.2 DFT-Domain LMS
26.3 DCT-Domain LMS
26.A DCT-Transformed Regressors
27.1 Motivation
27.2 Block Data Formulation
27.3 Block Convolution
28.1 DFT Block Adaptive Filters
28.2 Subband Adaptive Filters
28.A Another Constrained DFT Block Filter
28.B Overlap-Add Block Adaptive Filters
Summary and Notes
Problems and Computer Projects
PART VII: LEAST-SQUARES METHODS
29 Least-Squares Criterion
29.1 Least-Squares Problem
29.2 Geometric Argument
29.3 Algebraic Arguments
29.4 Properties of Least-Squares Solution
29.5 Projection Matrices
29.6 Weighted Least-Squares
29.7 Regularized Least-Squares
29.8 Weighted Regularized Least-Squares
30.1 Motivation
30.2 RLS Algorithm
30.3 Regularization
30.4 Conversion Factor
30.5 Time-Update of the Minimum Cost
30.6 Exponentially-Weighted RLS Algorithm
31.1 Equivalence in Linear Estimation
31.2 Kalman Filtering and Recursive Least-Squares
31.A Extended RLS Algorithms
32.1 Backward Order-Update Relations
32.2 Forward Order-Update Relations
32.3 Time-Update Relation
Summary and Notes
Problems and Computer Projects
PART VIII: ARRAY ALGORITHMS
33.1 Some Difficulties
33.2 Square-Root Factors
33.3 Preservation Properties
33.4 Motivation for Array Methods
34.1 Givens Rotations
34.2 Householder Transformations
35 QR and Inverse QR Algorithms
35.1 Inverse QR Algorithm
35.2 QR Algorithm
35.3 Extended QR Algorithm
35.A Array Algorithms for Kalman Filtering
Summary and Notes
Problems and Computer Projects
PART IX: FAST RLS ALGORITHMS
36.1 Hyperbolic Givens Rotations
36.2 Hyperbolic Householder Transformations
36.3 Hyperbolic Basis Rotations
37 Fast Array Algorithm
37.1 Time-Update of the Gain Vector
37.2 Time-Update of the Conversion Factor
37.3 Initial Conditions
37.4 Array Algorithm
37.A Chandrasekhar Filter
38 Regularized Prediction Problems
38.1 Regularized Backward Prediction
38.2 Regularized Forward Prediction
38.3 Low-Rank Factorization
39.1 Fast Transversal Filter
39.2 FAEST Filter
39.3 Fast Kalman Filter
39.4 Stability Issues
Summary and Notes
Problems and Computer Projects
PART X: LATTICE FILTERS
40 Three Basic Estimation Problems
40.1 Motivation for Lattice Filters
40.2 Joint Process Estimation
40.3 Backward Estimation Problem
40.4 Forward Estimation Problem
40.5 Time and Order-Update Relations
41.1 Significance of Data Structure
41.2 A Posteriori-Based Lattice Filter
41.3 A Priori-Based Lattice Filter
42.1 A Priori Error-Feedback Lattice Filter
42.2 A Posteriori Error-Feedback Lattice Filter
42.3 Normalized Lattice Filter
43 Array Lattice Filters
43.1 Order-Update of Output Estimation Errors
43.2 Order-Update of Backward Estimation Errors
43.3 Order-Update of Forward Estimation Errors
43.4 Significance of Data Structure
Summary and Notes
Problems and Computer Projects
PART XI: ROBUST FILTERS
44.1 Indefinite Least-Squares Formulation
44.2 Recursive Minimization Algorithm
44.3 Time-Update of the Minimum Cost
44.4 Singular Weighting Matrices
44.B Inertia Conditions
45.1 A Posteriori-Based Robust Filters
45.2 ε-NLMS Algorithm
45.3 A Priori-Based Robust Filters
45.4 LMS Algorithm
45.A H∞ Filters
46.1 Robustness of LMS
46.2 Robustness of ε-NLMS
46.3 Robustness of RLS
Summary and Notes
Problems and Computer Projects
REFERENCES AND INDICES
References
Author Index
Subject Index