Determinant: det(A) = Σ_j (−1)^(i+j) a_ij det(M_ij) (cofactor expansion along row i, where M_ij is A with row i and column j deleted); for a 2×2 matrix, det(A) = ad − bc.
Systems of Linear Equations
Row Echelon Form (REF): Any all-zero rows are at the bottom, and the leading entry of each nonzero row occurs strictly to the right of the leading entry of the row above it.
Reduced Row Echelon Form (RREF): The matrix is in REF, the leading entry of each nonzero row is 1, and each leading 1 is the only nonzero entry in its column.
Gauss-Jordan Elimination: Process to reduce a matrix to RREF.
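As an illustration, Gauss-Jordan elimination can be sketched in a few lines of Python (the function name `rref` and the use of exact `Fraction` arithmetic are choices for this sketch, not part of the notes):

```python
from fractions import Fraction

def rref(matrix):
    """Reduce a matrix (given as a list of rows) to reduced row echelon
    form by Gauss-Jordan elimination, using exact rational arithmetic."""
    A = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(A), len(A[0])
    pivot_row = 0
    for col in range(cols):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        piv = next((r for r in range(pivot_row, rows) if A[r][col] != 0), None)
        if piv is None:
            continue
        A[pivot_row], A[piv] = A[piv], A[pivot_row]  # swap pivot into place
        # Scale the pivot row so its leading entry becomes 1.
        lead = A[pivot_row][col]
        A[pivot_row] = [x / lead for x in A[pivot_row]]
        # Eliminate this column from every other row.
        for r in range(rows):
            if r != pivot_row and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[pivot_row])]
        pivot_row += 1
        if pivot_row == rows:
            break
    return A
```

For example, `rref([[1, 2, -1, -4], [2, 3, -1, -11], [-2, 0, -3, 22]])` reduces the augmented matrix of a 3×3 system to [[1, 0, 0, −8], [0, 1, 0, 1], [0, 0, 1, −2]], i.e. x = −8, y = 1, z = −2.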
Linear Independence and Span
Linear Independence: A set of vectors {v_1, v_2, …, v_n} is linearly independent if no vector in the set can be written as a linear combination of the others. This is equivalent to the equation c_1v_1 + c_2v_2 + … + c_nv_n = 0 having only the trivial solution (c_1 = c_2 = … = c_n = 0).
Span: The span of a set of vectors is the set of all possible linear combinations of those vectors. It represents a vector space or subspace that those vectors can generate.
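For two vectors in R^2, the trivial-solution test comes down to a 2×2 determinant: the vectors are independent exactly when the matrix with those vectors as columns has nonzero determinant. A minimal sketch (the helper name `independent_2d` is illustrative):

```python
def independent_2d(v, w):
    """Two vectors in R^2 are linearly independent iff the 2x2
    determinant ad - bc of the matrix [v w] is nonzero, i.e. iff
    c1*v + c2*w = 0 has only the trivial solution."""
    a, c = v  # first column
    b, d = w  # second column
    return a * d - b * c != 0
```

Here `independent_2d((1, 0), (0, 1))` is True, while `independent_2d((1, 2), (2, 4))` is False, since (2, 4) = 2·(1, 2).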
Vector Spaces
Subspaces: Must contain the zero vector, be closed under addition and scalar multiplication.
Basis: A linearly independent set of vectors that spans the space.
Dimension: Number of vectors in a basis for the space.
Symmetric and Triangular Matrices
Symmetric Matrix: A is symmetric if A = A^T. Eigenvectors of a symmetric matrix corresponding to distinct eigenvalues are orthogonal.
Triangular Matrix: A matrix is upper triangular if all the entries below the main diagonal are zero. Similarly, it’s lower triangular if all entries above the main diagonal are zero. The determinant of a triangular matrix is the product of its diagonal entries.
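The diagonal-product rule is easy to state in code; a small sketch (assuming a square triangular matrix given as a list of rows):

```python
def triangular_det(T):
    """Determinant of a triangular (upper or lower) matrix:
    the product of its diagonal entries."""
    d = 1
    for i in range(len(T)):
        d *= T[i][i]
    return d
```

For the upper triangular matrix [[2, 7, 1], [0, 5, −3], [0, 0, 7]] this gives 2·5·7 = 70, matching a full cofactor expansion.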
Cofactor Expansion
Cofactor: The cofactor C_ij of an entry a_ij in a matrix A is C_ij = (−1)^(i+j) det(M_ij), where M_ij is the submatrix obtained by deleting the i-th row and j-th column of A.
Determinant via Cofactor Expansion: det(A) = a_i1 C_i1 + a_i2 C_i2 + … + a_in C_in for any row i, or similarly down any column.
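The cofactor expansion translates directly into a recursive determinant; a sketch expanding along the first row (fine for small matrices, though this approach costs O(n!) in general):

```python
def minor(A, i, j):
    """Submatrix M_ij: A with row i and column j deleted."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant by cofactor expansion along the first row:
    det(A) = sum_j a_0j * C_0j, with C_0j = (-1)^(0+j) * det(M_0j)."""
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(n))
```

On a 2×2 input this recovers ad − bc, e.g. `det([[3, 8], [4, 6]])` is 18 − 32 = −14.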
Properties of Inverses and Determinants
Inverse Properties: The inverse of a product of matrices is the product of their inverses in reverse order: (AB)^(−1) = B^(−1)A^(−1). The determinant of an inverse matrix is the reciprocal of the determinant of the matrix: det(A^(−1)) = 1/det(A).
Determinant Properties: The determinant of a product of matrices equals the product of their determinants: det(AB)=det(A)det(B).
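Both properties can be checked concretely for 2×2 matrices; a sketch using the adjugate formula for the 2×2 inverse and exact `Fraction` arithmetic (helper names are illustrative):

```python
from fractions import Fraction

def det2(M):
    """2x2 determinant ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def inv2(M):
    """2x2 inverse via the adjugate: (1/det) * [[d, -b], [-c, a]].
    Requires det(M) != 0 and integer entries for exact Fractions."""
    dt = det2(M)
    (a, b), (c, d) = M
    return [[Fraction(d, dt), Fraction(-b, dt)],
            [Fraction(-c, dt), Fraction(a, dt)]]

def matmul2(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]
```

With A = [[2, 1], [1, 1]] and B = [[1, 2], [3, 5]], one can check that det2(matmul2(A, B)) equals det2(A)·det2(B) and that inv2(matmul2(A, B)) equals matmul2(inv2(B), inv2(A)), i.e. the reverse-order rule.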
Determinants and Volume
Geometric Interpretation: The determinant of a matrix represents the scaling factor of the linear transformation described by the matrix and the orientation (positive for same orientation as the original space, negative for opposite).
Volume Interpretation: The absolute value of the determinant of a matrix describing a transformation gives the volume of the parallelepiped spanned by the column vectors of the matrix.
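In R^3 this is the scalar triple product: the determinant of the matrix with columns a, b, c equals a · (b × c), whose absolute value is the volume of the parallelepiped spanned by the three vectors. A sketch:

```python
def cross(u, v):
    """Cross product of two vectors in R^3."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def det3_from_columns(a, b, c):
    """det of the 3x3 matrix with columns a, b, c, computed as the
    scalar triple product a . (b x c); |result| is the volume."""
    return dot(a, cross(b, c))
```

For the sheared cube with columns (1, 0, 0), (1, 1, 0), (1, 1, 1) the determinant is 1: shearing moves the parallelepiped but preserves its volume.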
Eigenvalues and Eigenvectors
For a square matrix A, a nonzero vector v, and a scalar λ: if Av = λv, then λ is an eigenvalue of A and v is a corresponding eigenvector.
Characteristic polynomial: p(λ) = det(A − λI); the eigenvalues of A are the roots of p.
Diagonalisation: A = PDP^(−1), where D is diagonal with the eigenvalues on its diagonal and the columns of P are the corresponding eigenvectors.
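For a 2×2 matrix the characteristic polynomial is λ² − tr(A)·λ + det(A), so the eigenvalues follow from the quadratic formula; a sketch assuming real eigenvalues (the name `eig2` is illustrative):

```python
import math

def eig2(M):
    """Eigenvalues of a 2x2 matrix from its characteristic polynomial
    p(lam) = lam^2 - trace*lam + det. Assumes real eigenvalues
    (nonnegative discriminant)."""
    (a, b), (c, d) = M
    tr, dt = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * dt)
    return (tr + disc) / 2, (tr - disc) / 2
```

For the symmetric matrix [[2, 1], [1, 2]] this gives eigenvalues 3 and 1; e.g. v = (1, 1) satisfies Av = (3, 3) = 3v.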
Fundamental Theorem of Invertible Matrices and Rank Theorem
Fundamental Theorem of Invertible Matrices: For a square matrix A, the following are equivalent: A is row equivalent to the identity matrix; det(A) ≠ 0; A has full rank; and there exists a matrix B with AB = BA = I (i.e. A is invertible).
Rank Theorem: For any matrix A, the rank of A plus the nullity of A equals the number of columns of A (rank(A)+nullity(A)=n). This highlights the relationship between the dimensions of the column space and the null space.
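The rank can be computed by counting pivots after Gaussian elimination, and the nullity then falls out as n − rank; a sketch with exact arithmetic (the name `rank` is illustrative):

```python
from fractions import Fraction

def rank(matrix):
    """Rank = number of pivots found during forward elimination,
    using exact rational arithmetic to avoid rounding issues."""
    A = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(A), len(A[0])
    r = 0  # index of the next pivot row
    for col in range(cols):
        piv = next((i for i in range(r, rows) if A[i][col] != 0), None)
        if piv is None:
            continue
        A[r], A[piv] = A[piv], A[r]
        for i in range(r + 1, rows):
            f = A[i][col] / A[r][col]
            A[i] = [x - f * y for x, y in zip(A[i], A[r])]
        r += 1
    return r
```

For A = [[1, 2, 3], [2, 4, 6], [1, 1, 1]] (the second row is twice the first), rank(A) = 2, so by the Rank Theorem nullity(A) = 3 − 2 = 1.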
Special Topics
Determinant Properties:
Swapping two rows reverses the sign of the determinant.
Multiplying a row by k multiplies the determinant by k (equivalently, a common factor of k can be pulled out of a row).
Det of identity is 1.
Cramer’s Rule: If det(A) ≠ 0, the solution of Ax = b is x_i = det(A_i)/det(A), where A_i is A with column i replaced by b.
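For a 2×2 system, Cramer’s rule is one determinant ratio per unknown; a sketch (requires det(A) ≠ 0, name `cramer2` is illustrative):

```python
def cramer2(A, b):
    """Solve the 2x2 system Ax = b by Cramer's rule:
    x_i = det(A_i) / det(A), where A_i is A with column i
    replaced by b. Requires det(A) != 0."""
    def det2x2(M):
        return M[0][0] * M[1][1] - M[0][1] * M[1][0]
    d = det2x2(A)
    A1 = [[b[0], A[0][1]], [b[1], A[1][1]]]  # column 0 replaced by b
    A2 = [[A[0][0], b[0]], [A[1][0], b[1]]]  # column 1 replaced by b
    return det2x2(A1) / d, det2x2(A2) / d
```

Solving 2x + y = 5, x + 3y = 10 with A = [[2, 1], [1, 3]] and b = [5, 10]: det(A) = 5, det(A_1) = 5, det(A_2) = 15, giving (x, y) = (1, 3).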