A complete, self-contained introduction to matrix analysis theory and practice
Matrix methods have evolved from a tool for expressing statistical problems to an indispensable part of the development, understanding, and use of various types of complex statistical analyses. As such, they have become a vital part of any statistical education. Unfortunately, matrix methods are usually treated piecemeal in courses on everything from regression analysis to stochastic processes. Matrix Analysis for Statistics offers a unique view of matrix analysis theory and methods as a whole.
Professor James R. Schott provides in-depth, step-by-step coverage of the most common matrix methods now used in statistical applications, including eigenvalues and eigenvectors, the Moore-Penrose inverse, matrix differentiation, the distribution of quadratic forms, and more. The subject matter is presented in a theorem/proof format, and every effort has been made to ease the transition from one topic to another. Proofs are easy to follow, and the author carefully justifies every step. Accessible even to readers with only a cursory background in statistics, the text uses examples that are familiar and easy to understand. Other key features that make this the ideal introduction to matrix analysis theory and practice include:
- Self-contained chapters for flexibility in topic choice.
- Extensive examples and chapter-end practice exercises.
- Optional sections for mathematically advanced readers.
Author(s): James R. Schott
Series: Wiley series in probability and statistics. Applied probability and statistics
Edition: 1
Publisher: Wiley
Year: 1997
Language: English
Pages: 445
City: New York
Contents......Page 7
Preface......Page 11
2. DEFINITIONS AND NOTATION......Page 13
3. MATRIX ADDITION AND MULTIPLICATION......Page 14
4. THE TRANSPOSE......Page 15
5. THE TRACE......Page 16
6. THE DETERMINANT......Page 17
7. THE INVERSE......Page 20
8. PARTITIONED MATRICES......Page 23
9. THE RANK OF A MATRIX......Page 25
10. ORTHOGONAL MATRICES......Page 26
11. QUADRATIC FORMS......Page 27
12. COMPLEX MATRICES......Page 28
13. RANDOM VECTORS AND SOME RELATED STATISTICAL CONCEPTS......Page 30
PROBLEMS......Page 40
2. DEFINITIONS......Page 44
3. LINEAR INDEPENDENCE AND DEPENDENCE......Page 50
4. BASES AND DIMENSION......Page 53
5. MATRIX RANK AND LINEAR INDEPENDENCE......Page 55
6. ORTHONORMAL BASES AND PROJECTIONS......Page 60
7. PROJECTION MATRICES......Page 64
8. LINEAR TRANSFORMATIONS AND SYSTEMS OF LINEAR EQUATIONS......Page 72
9. THE INTERSECTION AND SUM OF VECTOR SPACES......Page 79
10. CONVEX SETS......Page 82
PROBLEMS......Page 86
2. EIGENVALUES, EIGENVECTORS, AND EIGENSPACES......Page 96
3. SOME BASIC PROPERTIES OF EIGENVALUES AND EIGENVECTORS......Page 100
5. CONTINUITY OF EIGENVALUES AND EIGENPROJECTIONS......Page 114
6. EXTREMAL PROPERTIES OF EIGENVALUES......Page 116
7. SOME ADDITIONAL RESULTS CONCERNING EIGENVALUES......Page 123
2. THE SINGULAR VALUE DECOMPOSITION......Page 143
3. THE SPECTRAL DECOMPOSITION AND SQUARE ROOT MATRICES OF A SYMMETRIC MATRIX......Page 150
4. THE DIAGONALIZATION OF A SQUARE MATRIX......Page 156
5. THE JORDAN DECOMPOSITION......Page 159
6. THE SCHUR DECOMPOSITION......Page 161
7. THE SIMULTANEOUS DIAGONALIZATION OF TWO SYMMETRIC MATRICES......Page 166
8. MATRIX NORMS......Page 169
PROBLEMS......Page 174
1. INTRODUCTION......Page 182
2. THE MOORE-PENROSE GENERALIZED INVERSE......Page 183
3. SOME BASIC PROPERTIES OF THE MOORE-PENROSE INVERSE......Page 186
4. THE MOORE-PENROSE INVERSE OF A MATRIX PRODUCT......Page 192
5. THE MOORE-PENROSE INVERSE OF PARTITIONED MATRICES......Page 197
6. THE MOORE-PENROSE INVERSE OF A SUM......Page 198
7. THE CONTINUITY OF THE MOORE-PENROSE INVERSE......Page 200
8. SOME OTHER GENERALIZED INVERSES......Page 202
9. COMPUTING GENERALIZED INVERSES......Page 209
PROBLEMS......Page 216
2. CONSISTENCY OF A SYSTEM OF EQUATIONS......Page 222
3. SOLUTIONS TO A CONSISTENT SYSTEM OF EQUATIONS......Page 225
4. HOMOGENEOUS SYSTEMS OF EQUATIONS......Page 231
5. LEAST SQUARES SOLUTIONS TO A SYSTEM OF LINEAR EQUATIONS......Page 234
6. LEAST SQUARES ESTIMATION FOR LESS THAN FULL RANK MODELS......Page 240
7. SYSTEMS OF LINEAR EQUATIONS AND THE SINGULAR VALUE DECOMPOSITION......Page 245
8. SPARSE LINEAR SYSTEMS OF EQUATIONS......Page 247
PROBLEMS......Page 253
2. PARTITIONED MATRICES......Page 259
3. THE KRONECKER PRODUCT......Page 265
4. THE DIRECT SUM......Page 272
5. THE VEC OPERATOR......Page 273
6. THE HADAMARD PRODUCT......Page 278
7. THE COMMUTATION MATRIX......Page 288
8. SOME OTHER MATRICES ASSOCIATED WITH THE VEC OPERATOR......Page 295
9. NONNEGATIVE MATRICES......Page 300
10. CIRCULANT AND TOEPLITZ MATRICES......Page 312
11. HADAMARD AND VANDERMONDE MATRICES......Page 317
PROBLEMS......Page 321
2. MULTIVARIABLE DIFFERENTIAL CALCULUS......Page 335
3. VECTOR AND MATRIX FUNCTIONS......Page 338
4. SOME USEFUL MATRIX DERIVATIVES......Page 344
5. DERIVATIVES OF FUNCTIONS OF PATTERNED MATRICES......Page 347
6. THE PERTURBATION METHOD......Page 349
7. MAXIMA AND MINIMA......Page 356
8. CONVEX AND CONCAVE FUNCTIONS......Page 361
9. THE METHOD OF LAGRANGE MULTIPLIERS......Page 365
PROBLEMS......Page 372
2. SOME RESULTS ON IDEMPOTENT MATRICES......Page 382
3. COCHRAN'S THEOREM......Page 386
4. DISTRIBUTION OF QUADRATIC FORMS IN NORMAL VARIATES......Page 390
5. INDEPENDENCE OF QUADRATIC FORMS......Page 396
6. EXPECTED VALUES OF QUADRATIC FORMS......Page 402
7. THE WISHART DISTRIBUTION......Page 410
PROBLEMS......Page 421
References......Page 428
Index......Page 433