This textbook develops the essential tools of linear algebra, with the goal of imparting technique alongside contextual understanding. Applications go hand in hand with theory, each reinforcing and explaining the other. This approach encourages students to develop not only the technical proficiency needed to go on to further study, but also an appreciation for when, why, and how the tools of linear algebra can be used across modern applied mathematics.

Providing an extensive treatment of essential topics such as Gaussian elimination, inner products and norms, and eigenvalues and singular values, this text can be used for an in-depth first course or an application-driven second course in linear algebra. In this second edition, the applications have been updated and expanded to include numerical methods, dynamical systems, data analysis, and signal processing, while the pedagogical flow of the core material has been improved. Throughout, the text emphasizes the conceptual connections between each application and the underlying linear algebraic techniques, enabling students not only to learn how to apply the mathematical tools in routine contexts, but also to understand what is required to adapt to unusual or emerging problems.

No previous knowledge of linear algebra is needed to approach this text; single-variable calculus is the only formal prerequisite. The reader will, however, need to draw upon some mathematical maturity to engage with the increasing abstraction inherent to the subject. Once equipped with the main tools and concepts from this book, students will be prepared for further study in differential equations, numerical analysis, data science and statistics, and a broad range of applications. The first author’s text, Introduction to Partial Differential Equations, is an ideal companion volume, forming a natural extension of the linear mathematical methods developed here.
Authors: Peter J. Olver, Chehrzad Shakiban
Publisher: Springer
Year: 2018
Language: English
Pages: 679
Preface
Syllabi and Prerequisites
Survey of Topics
Course Outlines
Comments on Individual Chapters
Changes from the First Edition
Exercises and Software
Conventions and Notations
History and Biography
Some Final Remarks
Acknowledgments
Table of Contents
Chapter 1: Linear Algebraic Systems
1.1 Solution of Linear Systems
1.2 Matrices and Vectors
Matrix Arithmetic
1.3 Gaussian Elimination—Regular Case
Elementary Matrices
The LU Factorization
Forward and Back Substitution
1.4 Pivoting and Permutations
Permutations and Permutation Matrices
The Permuted LU Factorization
1.5 Matrix Inverses
Gauss–Jordan Elimination
Solving Linear Systems with the Inverse
The LDV Factorization
1.6 Transposes and Symmetric Matrices
Factorization of Symmetric Matrices
1.7 Practical Linear Algebra
Tridiagonal Matrices
Pivoting Strategies
1.8 General Linear Systems
Homogeneous Systems
1.9 Determinants
Chapter 2: Vector Spaces and Bases
2.1 Real Vector Spaces
2.2 Subspaces
2.3 Span and Linear Independence
Linear Independence and Dependence
2.4 Basis and Dimension
2.5 The Fundamental Matrix Subspaces
Kernel and Image
The Superposition Principle
Adjoint Systems, Cokernel, and Coimage
The Fundamental Theorem of Linear Algebra
2.6 Graphs and Digraphs
Chapter 3: Inner Products and Norms
3.1 Inner Products
Inner Products on Function Spaces
3.2 Inequalities
The Cauchy–Schwarz Inequality
Orthogonal Vectors
The Triangle Inequality
3.3 Norms
Unit Vectors
Equivalence of Norms
Matrix Norms
3.4 Positive Definite Matrices
Gram Matrices
3.5 Completing the Square
The Cholesky Factorization
3.6 Complex Vector Spaces
Complex Numbers
Complex Vector Spaces and Inner Products
Chapter 4: Orthogonality
4.1 Orthogonal and Orthonormal Bases
Computations in Orthogonal Bases
4.2 The Gram–Schmidt Process
Modifications of the Gram–Schmidt Process
4.3 Orthogonal Matrices
The QR Factorization
Ill-Conditioned Systems and Householder’s Method
4.4 Orthogonal Projections and Orthogonal Subspaces
Orthogonal Projection
Orthogonal Subspaces
Orthogonality of the Fundamental Matrix Subspaces and the Fredholm Alternative
4.5 Orthogonal Polynomials
The Legendre Polynomials
Other Systems of Orthogonal Polynomials
Chapter 5: Minimization and Least Squares
5.1 Minimization Problems
Equilibrium Mechanics
Solution of Equations
The Closest Point
5.2 Minimization of Quadratic Functions
5.3 The Closest Point
5.4 Least Squares
5.5 Data Fitting and Interpolation
Polynomial Approximation and Interpolation
Approximation and Interpolation by General Functions
Least Squares Approximation in Function Spaces
Orthogonal Polynomials and Least Squares
Splines
5.6 Discrete Fourier Analysis and the Fast Fourier Transform
Compression and Denoising
The Fast Fourier Transform
Chapter 6: Equilibrium
6.1 Springs and Masses
Positive Definiteness and the Minimization Principle
6.2 Electrical Networks
Batteries, Power, and the Electrical–Mechanical Correspondence
6.3 Structures
Chapter 7: Linearity
7.1 Linear Functions
Linear Operators
The Space of Linear Functions
Dual Spaces
Composition
Inverses
7.2 Linear Transformations
Change of Basis
7.3 Affine Transformations and Isometries
Isometry
7.4 Linear Systems
The Superposition Principle
Inhomogeneous Systems
Superposition Principles for Inhomogeneous Systems
Complex Solutions to Real Systems
7.5 Adjoints, Positive Definite Operators, and Minimization Principles
Self-Adjoint and Positive Definite Linear Functions
Minimization
Chapter 8: Eigenvalues and Singular Values
8.1 Linear Dynamical Systems
Scalar Ordinary Differential Equations
First Order Dynamical Systems
8.2 Eigenvalues and Eigenvectors
Basic Properties of Eigenvalues
The Gershgorin Circle Theorem
8.3 Eigenvector Bases
Diagonalization
8.4 Invariant Subspaces
8.5 Eigenvalues of Symmetric Matrices
The Spectral Theorem
Optimization Principles for Eigenvalues of Symmetric Matrices
8.6 Incomplete Matrices
The Schur Decomposition
The Jordan Canonical Form
8.7 Singular Values
The Pseudoinverse
The Euclidean Matrix Norm
Condition Number and Rank
Spectral Graph Theory
8.8 Principal Component Analysis
Variance and Covariance
The Principal Components
Chapter 9: Iteration
9.1 Linear Iterative Systems
Scalar Systems
Powers of Matrices
Diagonalization and Iteration
9.2 Stability
Spectral Radius
Fixed Points
Matrix Norms and Convergence
9.3 Markov Processes
9.4 Iterative Solution of Linear Algebraic Systems
The Jacobi Method
The Gauss–Seidel Method
Successive Over-Relaxation
9.5 Numerical Computation of Eigenvalues
The Power Method
The QR Algorithm
Tridiagonalization
9.6 Krylov Subspace Methods
Krylov Subspaces
Arnoldi Iteration
The Full Orthogonalization Method
The Conjugate Gradient Method
The Generalized Minimal Residual Method
9.7 Wavelets
The Haar Wavelets
Modern Wavelets
Solving the Dilation Equation
Chapter 10: Dynamics
10.1 Basic Solution Techniques
The Phase Plane
Existence and Uniqueness
Complete Systems
The General Case
10.2 Stability of Linear Systems
10.3 Two-Dimensional Systems
Distinct Real Eigenvalues
Complex Conjugate Eigenvalues
Incomplete Double Real Eigenvalue
Complete Double Real Eigenvalue
10.4 Matrix Exponentials
Applications in Geometry
Invariant Subspaces and Linear Dynamical Systems
Inhomogeneous Linear Systems
10.5 Dynamics of Structures
Stable Structures
Unstable Structures
Systems with Differing Masses
Friction and Damping
10.6 Forcing and Resonance
Electrical Circuits
Forcing and Resonance in Systems
References
Symbol Index
Subject Index