Linear Algebra: Theory, Intuition, Code


Linear algebra is perhaps the most important branch of mathematics for the computational sciences, including machine learning, AI, data science, statistics, simulations, computer graphics, multivariate analyses, matrix decompositions, and signal processing. The way linear algebra is presented in traditional textbooks differs from how professionals use it on computers to solve real-world problems in machine learning, data science, statistics, and signal processing. For example, the "determinant" of a matrix is important in linear algebra theory, but should you actually use the determinant in practical applications? The answer may surprise you!

If you are interested in learning the mathematical concepts of linear algebra and matrix analysis, but also want to apply those concepts to data analyses on computers (e.g., statistics or signal processing), then this book is for you. You'll see all the math concepts implemented in MATLAB and in Python.

Unique aspects of this book:

- Clear and comprehensible explanations of concepts and theories in linear algebra.
- Several distinct explanations of the same ideas, which is a proven technique for learning.
- Visualization using graphs, which strengthens the geometric intuition of linear algebra.
- Implementations in MATLAB and Python. C'mon, in the real world you never solve math problems by hand! You need to know how to implement math in software!
- Beginner to intermediate topics, including vectors, matrix multiplications, least-squares projections, eigendecomposition, and singular-value decomposition.
- Strong focus on modern, applications-oriented aspects of linear algebra and matrix analysis.
- Code (MATLAB and Python) is provided to help you understand and apply linear algebra concepts on computers.
- Intuitive visual explanations of diagonalization, eigenvalues and eigenvectors, and singular value decomposition.
- A combination of hand-solved exercises and more advanced code challenges.
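The determinant teaser above has a well-known numerical angle. As a taste of the "theory vs. practice" theme (this sketch is an illustration, not taken from the book), the determinant of a perfectly well-behaved matrix can be vanishingly small simply because of the matrix's size and scale, which is why practitioners typically check the rank or condition number instead:

```python
import numpy as np

# A 100x100 scaled identity matrix: about as well-behaved as a matrix gets.
A = 0.1 * np.eye(100)

# The determinant scales as 0.1**100 = 1e-100, so it *looks* numerically
# singular even though nothing is wrong with the matrix.
print(np.linalg.det(A))          # ~1e-100

# Rank and condition number tell the true story.
print(np.linalg.matrix_rank(A))  # 100 (full rank)
print(np.linalg.cond(A))         # 1.0 (perfectly conditioned)
```

The point: det(cA) = cⁿ det(A) for an n×n matrix, so the raw determinant conflates scale with singularity; rank and condition number do not.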
Math is not a spectator sport!

Author(s): Mike X Cohen
Publisher: Sincxpress BV
Year: 2021

Language: English
Pages: 582

Cover
0 Front Matter
0.1 Front matter
0.2 Dedication
0.3 Foreword
1 Introduction
1.1 What is linear algebra and why learn it?
1.2 About this book
1.3 Prerequisites
1.4 Exercises and code challenges
1.5 Online and other resources
2 Vectors
2.1 Scalars
2.2 Vectors: geometry and algebra
2.3 Transpose operation
2.4 Vector addition and subtraction
2.5 Vector-scalar multiplication
2.6 Exercises
2.7 Answers
2.8 Code challenges
2.9 Code solutions
3 Vector multiplication
3.1 Vector dot product: Algebra
3.2 Dot product properties
3.3 Vector dot product: Geometry
3.4 Algebra and geometry
3.5 Linear weighted combination
3.6 The outer product
3.7 Hadamard multiplication
3.8 Cross product
3.9 Unit vectors
3.10 Exercises
3.11 Answers
3.12 Code challenges
3.13 Code solutions
4 Vector spaces
4.1 Dimensions and fields
4.2 Vector spaces
4.3 Subspaces and ambient spaces
4.4 Subsets
4.5 Span
4.6 Linear independence
4.7 Basis
4.8 Exercises
4.9 Answers
5 Matrices
5.1 Interpretations and uses of matrices
5.2 Matrix terms and notation
5.3 Matrix dimensionalities
5.4 The transpose operation
5.5 Matrix zoology
5.6 Matrix addition and subtraction
5.7 Scalar-matrix mult.
5.8 "Shifting" a matrix
5.9 Diagonal and trace
5.10 Exercises
5.11 Answers
5.12 Code challenges
5.13 Code solutions
6 Matrix multiplication
6.1 "Standard" multiplication
6.2 Multiplication and eqns.
6.3 Multiplication with diagonals
6.4 LIVE EVIL
6.5 Matrix-vector multiplication
6.6 Creating symmetric matrices
6.7 Multiply symmetric matrices
6.8 Hadamard multiplication
6.9 Frobenius dot product
6.10 Matrix norms
6.11 What about matrix division?
6.12 Exercises
6.13 Answers
6.14 Code challenges
6.15 Code solutions
7 Rank
7.1 Six things about matrix rank
7.2 Interpretations of matrix rank
7.3 Computing matrix rank
7.4 Rank and scalar multiplication
7.5 Rank of added matrices
7.6 Rank of multiplied matrices
7.7 Rank of A, Aᵀ, AᵀA, and AAᵀ
7.8 Rank of random matrices
7.9 Boosting rank by "shifting"
7.10 Rank difficulties
7.11 Rank and span
7.12 Exercises
7.13 Answers
7.14 Code challenges
7.15 Code solutions
8 Matrix spaces
8.1 Column space of a matrix
8.2 Column space: A and AAᵀ
8.3 Determining whether v ∈ C(A)
8.4 Row space of a matrix
8.5 Row spaces of A and AᵀA
8.6 Null space of a matrix
8.7 Geometry of the null space
8.8 Orthogonal subspaces
8.9 Matrix space orthogonalities
8.10 Dimensionalities of matrix spaces
8.11 More on Ax = b and Ay = 0
8.12 Exercises
8.13 Answers
8.14 Code challenges
8.15 Code solutions
9 Complex numbers
9.1 Complex numbers and ℂ
9.2 What are complex numbers?
9.3 The complex conjugate
9.4 Complex arithmetic
9.5 Complex dot product
9.6 Special complex matrices
9.7 Exercises
9.8 Answers
9.9 Code challenges
9.10 Code solutions
10 Systems of equations
10.1 Algebra and geometry of eqns.
10.2 From systems to matrices
10.3 Row reduction
10.4 Gaussian elimination
10.5 Row-reduced echelon form
10.6 Gauss-Jordan elimination
10.7 Possibilities for solutions
10.8 Matrix spaces, row reduction
10.9 Exercises
10.10 Answers
10.11 Coding challenges
10.12 Code solutions
11 Determinant
11.1 Features of determinants
11.2 Determinant of a 2×2 matrix
11.3 The characteristic polynomial
11.4 3×3 matrix determinant
11.5 The full procedure
11.6 ∆ of triangles
11.7 Determinant and row reduction
11.8 ∆ and scalar multiplication
11.9 Theory vs practice
11.10 Exercises
11.11 Answers
11.12 Code challenges
11.13 Code solutions
12 Matrix inverse
12.1 Concepts and applications
12.2 Inverse of a diagonal matrix
12.3 Inverse of a 2×2 matrix
12.4 The MCA algorithm
12.5 Inverse via row reduction
12.6 Left inverse
12.7 Right inverse
12.8 The pseudoinverse, part 1
12.9 Exercises
12.10 Answers
12.11 Code challenges
12.12 Code solutions
13 Projections
13.1 Projections in ℝ²
13.2 Projections in ℝᴺ
13.3 Orth and par vect comps
13.4 Orthogonal matrices
13.5 Orthogonalization via GS
13.6 QR decomposition
13.7 Inverse via QR
13.8 Exercises
13.9 Answers
13.10 Code challenges
13.11 Code solutions
14 Least-squares
14.1 Introduction
14.2 5 steps of model-fitting
14.3 Terminology
14.4 Least-squares via left inverse
14.5 Least-squares via projection
14.6 Least-squares via row-reduction
14.7 Predictions and residuals
14.8 Least-squares example
14.9 Code challenges
14.10 Code solutions
15 Eigendecomposition
15.1 Eigenwhatnow?
15.2 Finding eigenvalues
15.3 Finding eigenvectors
15.4 Diagonalization
15.5 Conditions for diagonalization
15.6 Distinct, repeated eigenvalues
15.7 Complex solutions
15.8 Symmetric matrices
15.9 Eigenvalues of singular matrices
15.10 Eigenlayers of a matrix
15.11 Matrix powers and inverse
15.12 Generalized eigendecomposition
15.13 Exercises
15.14 Answers
15.15 Code challenges
15.16 Code solutions
16 The SVD
16.1 Singular value decomposition
16.2 Computing the SVD
16.3 Singular values and eigenvalues
16.4 SVD of a symmetric matrix
16.5 SVD and the four subspaces
16.6 SVD and matrix rank
16.7 SVD spectral theory
16.8 Low-rank approximations
16.9 Normalizing singular values
16.10 Condition number of a matrix
16.11 SVD and the matrix inverse
16.12 MP Pseudoinverse, part 2
16.13 Code challenges
16.14 Code solutions
17 Quadratic form
17.1 Algebraic perspective
17.2 Geometric perspective
17.3 The normalized quadratic form
17.4 Evecs and the qf surface
17.5 Matrix definiteness
17.6 The definiteness of AᵀA
17.7 λ and definiteness
17.8 Code challenges
17.9 Code solutions
18 Covariance matrices
18.1 Correlation
18.2 Variance and standard deviation
18.3 Covariance
18.4 Correlation coefficient
18.5 Covariance matrices
18.6 Correlation to covariance
18.7 Code challenges
18.8 Code solutions
19 PCA
19.1 PCA: interps and apps
19.2 How to perform a PCA
19.3 The algebra of PCA
19.4 Regularization
19.5 Is PCA always the best?
19.6 Code challenges
19.7 Code solutions
20 The end.
20.1 The end... of the beginning!
20.2 Thanks!
Index