- The set of all $m \times n$ matrices over a field $\mathbb{F}$ is denoted by $M_{m \times n}(\mathbb{F})$ (or $M_{m \times n}$ if the field is understood, or $M_n(\mathbb{F})$ if $m = n$). $M_{m \times n}(\mathbb{F})$ is a vector space over $\mathbb{F}$, where
- Vector space operations and axioms of $M_{m \times n}(\mathbb{F})$
- Addition: $(A + B)_{ij} = A_{ij} + B_{ij}$ (matrix addition)
- Commutative: $A + B = B + A$
- Associative: $(A + B) + C = A + (B + C)$
- Identity: $A + 0 = A$ (where $0$ is the zero matrix)
- Inverse: $A + (-A) = 0$
- Scalar Multiplication: $(cA)_{ij} = c A_{ij}$ (scalar multiplication of a matrix)
- Distributive (vector (matrix) addition): $c(A + B) = cA + cB$
- Distributive (field addition): $(c + d)A = cA + dA$
- Compatible with field mul.: $c(dA) = (cd)A$
- Identity: $1 \cdot A = A$
- Vector space properties of
- or
- and more…
- Matrix Multiplication operation between two matrices (represents a composition of linear transformations)
- Multiplication of two matrices $A$ and $B$ is defined if and only if the number of columns of $A$ is equal to the number of rows of $B$.
- $(AB)_{ij}$ is the dot product of the $i$th row of $A$ and the $j$th column of $B$
- (3.4.3) and
- (3.4.4) and
- Matrix–Vector Product: $A\mathbf{x}$ (it’s actually performing a linear transformation on a vector)
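A minimal sketch of these definitions in pure Python (the example matrices are made up; no libraries assumed):

```python
def mat_mul(A, B):
    # Defined only when the number of columns of A equals the number of rows of B.
    assert len(A[0]) == len(B), "inner dimensions must match"
    # (AB)_ij is the dot product of the i-th row of A and the j-th column of B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

# Matrix product
C = mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]])

# Matrix-vector product: applying a linear transformation to a (column) vector
v = mat_mul([[1, 2], [3, 4]], [[1], [1]])
```
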
Matrix
In this section:
- $A$ is an $m \times n$ matrix
- $S$ is a set of vectors which are the rows of $A$ (equivalently, the columns of $A^T$)
Row Echelon form (REF)
- (1.11.1) $A$ and its row echelon form $R$ are row equivalent
- (8.5.1) The non-zero rows of $R$ are a basis of the row space of $A$
Reduced Row Echelon form (RREF)
- Uniqueness: Each matrix is row equivalent to one and only one reduced echelon matrix.
Elementary Row Operations
- $E$ is an $m$-ordered elementary matrix by which $A$ is multiplied from the left (left-multiplication is a row operation)
- $EA$ is the matrix obtained by applying to $A$ one of the elementary row operations represented by $E$
- Every invertible matrix can be written as a product of elementary matrices
Row Operation | Notation | Elementary Matrix
---|---|---
Row Switching | $R_i \leftrightarrow R_j$ | The matrix obtained by switching rows $i$ and $j$ of $I$
Row Scaling | $R_i \to kR_i$ (where $k \neq 0$) | The matrix obtained by multiplying row $i$ of $I$ by $k$
Row Addition | $R_i \to R_i + kR_j$ | The matrix obtained by adding $k$ times row $j$ to row $i$ of $I$
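Left-multiplication by an elementary matrix performs the corresponding row operation; a small pure-Python sanity check (the matrices are made-up examples):

```python
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [0, 1], [4, 5]]

# Row addition R2 -> R2 + 3*R1: apply the operation to I to obtain E
E = identity(3)
E[1][0] = 3
assert mat_mul(E, A) == [[1, 2], [3, 7], [4, 5]]

# Row switching R1 <-> R3
S = identity(3)
S[0], S[2] = S[2], S[0]
assert mat_mul(S, A) == [[4, 5], [0, 1], [1, 2]]
```
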
Row equivalence
- The following statements are equivalent:
- $A$ and $B$ are row equivalent
- (1.11.3)
- There exists an invertible matrix $P$ such that $B = PA$
- It is possible to transform $A$ into $B$ by a sequence of elementary row operations
- (q7.5.12)
- $A$ and $B$ are the same linear transformation with respect to different bases of the codomain
- Row equivalence is an equivalence relation on the set $M_{m \times n}(\mathbb{F})$
- If $A$ and $B$ are row equivalent matrices, then:
- A given set of column vectors of $A$ is linearly independent if and only if the corresponding column vectors of $B$ are linearly independent.
- A given set of column vectors of $A$ forms a basis for the column space of $A$ if and only if the corresponding column vectors of $B$ form a basis for the column space of $B$.
- (4.2.2) (for square matrices)
Fundamental Spaces
Row space
Column space
The following statements are equivalent:
- The system $A\mathbf{x} = \mathbf{b}$ is consistent
Null space
- $\operatorname{null}(A) = \{\mathbf{x} \in \mathbb{F}^n : A\mathbf{x} = \mathbf{0}\}$. (in the book it’s denoted by (!!!))
The following statements are equivalent:
- $\mathbf{x}$ is orthogonal to $\operatorname{Row}(A)$ (the rows of $A$)
Left null space
Theorems
-
-
-
-
-
-
-
(9.8.7a)
-
(9.8.7b)
-
-
Bases for the Fundamental Spaces
Subspace | Dimension | Bases |
---|---|---|
- todo Let is linearly independent, then forms a basis for .
Rank
- (d8.5.4) The following are equal:
- (or )
- (notation used in the course)
- The number of linearly independent rows of $A$
- The number of linearly independent columns of $A$
- The number of the non-zero rows of the REF of $A$
- The number of pivots in the REF of $A$
- (q8.5.4)
Nullity
- (8.6.1) $\operatorname{nullity}(A) = \dim \operatorname{null}(A)$
Theorems
- (8.6.1) Rank–nullity theorem: $\operatorname{rank}(A) + \operatorname{nullity}(A) = n$ (where $n$ is the number of columns of $A$)
- (q8.5.6)
-
- (i.e. does not have full row rank)
- (i.e. does not have full column rank)
- see also [[#Square Matrices#Rank|rank of square matrix]] and the rank of an [[#Invertibility#Properties|invertible matrix]]
- Row equivalent matrices have the same rank
- todo
- (8.3.4a+8.6.1)
- is linearly dependent
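The rank–nullity theorem can be checked numerically; a sketch using SymPy (the matrix is an arbitrary example chosen so that its third row is the sum of the first two):

```python
from sympy import Matrix

# arbitrary 3x4 example: row 3 = row 1 + row 2, so the rank is 2
A = Matrix([[1, 2, 0, 1],
            [0, 0, 1, 1],
            [1, 2, 1, 2]])

rank = A.rank()                  # dimension of the row/column space
nullity = len(A.nullspace())     # dimension of the null space
assert rank + nullity == A.cols  # rank-nullity: rank(A) + nullity(A) = n
```
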
Full Rank
- (q8.5.5) The following statements are equivalent:
- $A$ has full rank
- $\operatorname{rank}(A) = m$ or $\operatorname{rank}(A) = n$ (i.e. $\operatorname{rank}(A) = \min(m, n)$)
- $A$ is not rank deficient
- $A$ has either full column rank or full row rank
Full Column Rank
- The following statements are equivalent:
- $A$ has full column rank ($\operatorname{rank}(A) = n$)
- The columns of $A$ are linearly independent
- $T_A$ is injective (one-to-one, monomorphism)
- $\operatorname{null}(A) = \{\mathbf{0}\}$
- The matrix $A^T A$ is invertible
- For every $\mathbf{b}$, the system $A\mathbf{x} = \mathbf{b}$ has at most one solution
- $A$ is left-invertible (There exists a matrix $B$ such that $BA = I_n$)
- $A$ is left-cancellable (i.e. $AB = AC \implies B = C$)
- $A^T$ has full row rank
Full Row Rank
- The following statements are equivalent:
- $A$ has full row rank ($\operatorname{rank}(A) = m$)
- The set of rows of $A$ (that is, $A$’s rows) is linearly independent
- $T_A$ is surjective (onto, epimorphism)
- $A^T$ has full column rank
- For every $\mathbf{b} \in \mathbb{F}^m$, the system $A\mathbf{x} = \mathbf{b}$ is consistent
- Every $\mathbf{b}$ in $\mathbb{F}^m$ is a linear combination of the columns of $A$
- $\operatorname{Col}(A) = \mathbb{F}^m$ (i.e. $A$’s columns span $\mathbb{F}^m$)
- $A$ has a pivot position in every row
- The matrix $A A^T$ is invertible
- $A$ is right-invertible (There exists a matrix $B$ such that $AB = I_m$)
- $A$ is right-cancellable (i.e. $BA = CA \implies B = C$)
- (8.4.4) $S$ is linearly independent (where the vectors of $S$ are the coordinate vectors of any set of vectors )
Theorems
- If $S_1, S_2 \subseteq \mathbb{F}^n$, and $S_1 \subseteq S_2$, and $S_1$ is linearly dep., then $S_2$ is also linearly dep. (by 7.5.1, 8.3.4)
Full Row-and-Column Rank
- The following statements are equivalent:
- $A$ is an invertible square matrix
- $S$ is a basis of $\mathbb{F}^n$
- $S$ is a maximal linearly independent set
- $S$ is a minimal spanning set of $\mathbb{F}^n$
- $S$ is linearly independent and spans $\mathbb{F}^n$
- $A$ has both a full row rank and a full column rank
- $m = n$ and $A$ has a full row rank
- $m = n$ and $A$ has a full column rank
- (8.4.5) and the transition matrix from some basis to is invertible
- (8.2.5) (in other words, every element of $\mathbb{F}^n$ can be written in a unique way as a finite linear combination of elements of $S$)
- (8.4.5) and the transition matrix from some basis to is invertible
Rank Deficiency
- The following statements are equivalent:
- $A$ is rank deficient ($\operatorname{rank}(A) < \min(m, n)$)
- The columns and rows of $A$ are linearly dependent
- $T_A$ is neither injective nor surjective
- $A$ is not invertible
- There exists a $\mathbf{b}$ such that the system $A\mathbf{x} = \mathbf{b}$ has more than one solution
- $A$ is neither left-invertible nor right-invertible
- $A$ has neither full row rank nor full column rank
Zero Rank
- The following statements are equivalent:
- $A$ has zero rank ($\operatorname{rank}(A) = 0$)
- $A$ is the zero (null) matrix (of order $m \times n$)
- $T_A$ is the zero transformation
Transformation matrix
Transpose
- Notation: ,
- (3.2.4)
- (3.4.5)
Equivalence
- Two matrices $A$ and $B$ are equivalent if there exist invertible matrices $P$ and $Q$ such that $B = PAQ$
- Two matrices $A$ and $B$ are equivalent if and only if they have the same rank
- Matrix equivalence is an equivalence relation on $M_{m \times n}(\mathbb{F})$
- If $A$ and $B$ are row equivalent, then they are equivalent
- todo Matrix equivalent matrices represent the same map, with respect to appropriate pairs of bases.
Theorems
- todo If $A$ is an $m \times n$ matrix of rank $r$, then there exist invertible matrices $P$ and $Q$ such that $PAQ$ has the first $r$ diagonal entries equal to $1$ and the remaining entries equal to $0$
Square Matrices
In this section:
- $A$ is a square matrix of order $n$ over a field $\mathbb{F}$
- $B$ is a basis of $\mathbb{F}^n$
- $T$ is a linear transformation
Theorems
- (by 9.3.7, 12.3.1, 12.3.2a, e2023a85q1a)
- (e2024a83q1)
- (see Nilpotent matrix)
- (see Exercises)
Invertibility
- (3.10.6) The following statements are equivalent:
- $A$ is an invertible matrix
- $A$ can be expressed as a finite product of elementary matrices.
- There exists a $B$ such that $AB = I$
- There exists a $B$ such that $BA = I$
- There exists a $B$ such that $AB = BA = I$ (in such case $B$ is unique, and $B = A^{-1}$)
- $A$ is row-equivalent to $I_n$.
- $A$ is column-equivalent to $I_n$.
- The columns of $A$ are linearly independent.
- The rows of $A$ are linearly independent.
- The columns of $A$ span $\mathbb{F}^n$
- The rows of $A$ span $\mathbb{F}^n$
- The columns of $A$ form a basis of $\mathbb{F}^n$
- The rows of $A$ form a basis of $\mathbb{F}^n$
- $A^T$ is an invertible matrix
- (4.4.1, q10.7.7 for l.t.) The determinant of $A$ is non-zero: $\det(A) \neq 0$
- (4.4.1, and q11.3.1) The number $0$ is not an eigenvalue of $A$.
- (q8.5.8b) $A$ has a full rank: $\operatorname{rank}(A) = n$
- (10.5.1, and 9.6.2)
- The linear transformation mapping $\mathbf{x}$ to $A\mathbf{x}$ is surjective; that is, the equation $A\mathbf{x} = \mathbf{b}$ has at least one solution for each $\mathbf{b}$ in $\mathbb{F}^n$.
- The linear transformation mapping $\mathbf{x}$ to $A\mathbf{x}$ is injective; that is, the equation $A\mathbf{x} = \mathbf{b}$ has at most one solution for each $\mathbf{b}$ in $\mathbb{F}^n$.
- The linear transformation mapping $\mathbf{x}$ to $A\mathbf{x}$ is bijective; that is, the equation $A\mathbf{x} = \mathbf{b}$ has exactly one solution for each $\mathbf{b}$ in $\mathbb{F}^n$.
- The general linear group of order $n$ over $\mathbb{F}$, denoted by $GL_n(\mathbb{F})$, is the set of all invertible $n \times n$ matrices over a field $\mathbb{F}$, together with the operation of matrix multiplication.
- $GL_n(\mathbb{F})$ is a group under matrix multiplication
- The special linear group of order $n$ over $\mathbb{F}$, denoted by $SL_n(\mathbb{F})$, is the subset of $GL_n(\mathbb{F})$ consisting of all matrices with determinant $1$
Properties
- for invertible matrix
- (3.8.3)
- $AB = AC \implies B = C$ (left-cancellable)
- $BA = CA \implies B = C$ (right-cancellable)
- (3.8.4b)
- (3.8.4d) if $A$ is invertible, then $A^T$ is also invertible. (in such case $(A^T)^{-1} = (A^{-1})^T$)
- (q8.5.7a) for any matrix
- (q8.5.7b) for any matrix
- (3.8.3)
- if $A$ and $B$ are invertible, $(AB)^{-1} = B^{-1}A^{-1}$ (note the reversed order)
- and are row equivalent
- (3.8.4c) $A^{-1}$ is also invertible and $(A^{-1})^{-1} = A$
Theorems
- (4.5.2) $A, B$ are square matrices, and $AB = I$, then:
- $A$ and $B$ are both invertible
- (q3.10.2) $A$ and $B$ are invertible, if and only if, $AB$ is invertible
- $A, B$ are square matrices
- if $A$ is invertible and $B$ is singular, then $AB$ is singular
Computing the Inverse of a Matrix (if it exists)
- 2x2 matrix: $\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}$ (provided $ad - bc \neq 0$)
- n×n matrix:
- Form the augmented matrix $[\,A \mid I\,]$ and put it into RREF.
- If the RREF has the form $[\,I \mid B\,]$, then $A$ is invertible and $A^{-1} = B$.
- Otherwise, $A$ is singular.
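A sketch of the $[\,A \mid I\,] \to [\,I \mid A^{-1}\,]$ procedure with SymPy (the matrix is an arbitrary invertible example):

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])                 # det = 1, so A is invertible

aug = A.row_join(eye(2))             # form the augmented matrix [A | I]
R, _ = aug.rref()                    # reduce to RREF
left, right = R[:, :2], R[:, 2:]

assert left == eye(2)                # RREF has the form [I | B] => A invertible
assert A * right == eye(2)           # and B = A^{-1}
```
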
Elementary matrix
- $E$ is called an elementary matrix if it can be obtained from an identity matrix by performing a single elementary row operation.
- Every elementary matrix is invertible, and the inverse is also an elementary matrix.
Rank
- rank of square matrix: let $A, B$ be square matrices of order $n$, then:
- (q8.5.8a)
- (q10.5.3) Sylvester’s inequality: $\operatorname{rank}(A) + \operatorname{rank}(B) - n \le \operatorname{rank}(AB)$
- If $AB = 0$, then $\operatorname{rank}(A) + \operatorname{rank}(B) \le n$. (from Sylvester’s inequality and Rank–nullity theorem)
- for invertible matrices see [[#Invertibility#Properties]]
Determinant
- Notation: $\det(A)$, $|A|$
- (4.3.1)
- (4.5.1) $\det(AB) = \det(A)\det(B)$ (Multiplicativity)
- (q4.3.3b) $\det(kA) = k^n \det(A)$ (Homogeneity)
- (by 4.5.1)
- (4.5.3)
- (4.5.4)
- (4.3.8) If $A$ is triangular, then $\det(A) = a_{11} a_{22} \cdots a_{nn}$ (the product of its diagonal entries)
- Row Operations
- (4.3.6) If a multiple of one row of $A$ is added to another row to produce a matrix $B$, then $\det(B) = \det(A)$
- (4.3.2) If two rows of $A$ are interchanged to produce $B$, then $\det(B) = -\det(A)$
- (4.3.3) if one row of $A$ is multiplied by $k$ to produce $B$, then $\det(B) = k \det(A)$
- The following statements are equivalent:
- $A$ is invertible
- (4.4.1) $\det(A) \neq 0$
- The following statements are equivalent:
- $A$ is singular
- (4.4.1) $\det(A) = 0$
- (q11.3.1) $0$ is an eigenvalue of $A$
- Zero determinant cases:
- (4.2.2) if $A$ has a zero row/column, then $\det(A) = 0$
- (4.3.5) if $A$ has two equal rows (or columns), then $\det(A) = 0$
- (q4.4.4) If the sum of each row of $A$ is $0$, then $\det(A) = 0$
- (q4.3.10) the determinant of an odd-dimension anti-symmetric matrix is zero
- (4.3.4) Let $A$, $B$, $C$ be matrices that differ only in the $i$th row, where the $i$th row of $A$ is the sum of $B$’s and $C$’s $i$th rows; then $\det(A) = \det(B) + \det(C)$ (similar result for columns)
- (10.7.3) Similar matrices have the same determinant
Computing the Determinant
- 2x2 matrix: $\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$
- 3x3 matrix: $\det(A) = a_{11}(a_{22}a_{33} - a_{23}a_{32}) - a_{12}(a_{21}a_{33} - a_{23}a_{31}) + a_{13}(a_{21}a_{32} - a_{22}a_{31})$
- See also the Sarrus rule
- matrix:
- (Laplace) Cofactor Expansion: $\det(A) = \sum_{j=1}^{n} a_{ij} C_{ij}$
- here $i$ is a constant, and this is called expansion along the $i$th row, (similarly, we can expand along the $j$th column, like $\det(A) = \sum_{i=1}^{n} a_{ij} C_{ij}$)
- $a_{ij}$ is the entry of the $i$th row and $j$th column of $A$
- $A_{ij}$ is the submatrix obtained by removing the $i$th row and the $j$th column of $A$
- $M_{ij} = \det(A_{ij})$ is the minor of $a_{ij}$
- $C_{ij} = (-1)^{i+j} M_{ij}$ is the cofactor of entry $a_{ij}$
- Triangular matrix:
- Gaussian elimination:
- Transform $A$ into an upper triangular matrix by a sequence of elementary row operations, where:
- Each row swap changes the sign of the determinant
- Each row multiplication by $k$ multiplies the determinant by $k$
- Adding a multiple of one row to another row leaves the determinant unchanged
- Eigenvalues: (see q11.4.7)
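The Gaussian-elimination method above, sketched in pure Python with exact rational arithmetic (the implementation choices — pivot search, `Fraction` entries — are one possible way to do it):

```python
from fractions import Fraction

def det(matrix):
    A = [[Fraction(x) for x in row] for row in matrix]
    n = len(A)
    sign = 1
    for col in range(n):
        # find a pivot at or below the diagonal
        pivot = next((r for r in range(col, n) if A[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)            # no pivot in this column: singular
        if pivot != col:
            A[col], A[pivot] = A[pivot], A[col]
            sign = -sign                  # each row swap changes the sign
        for r in range(col + 1, n):
            factor = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= factor * A[col][c]   # row addition: det unchanged
    result = Fraction(sign)
    for i in range(n):
        result *= A[i][i]                 # triangular: product of the diagonal
    return result

assert det([[1, 2], [3, 4]]) == -2
assert det([[1, 2], [2, 4]]) == 0         # proportional rows => singular
```
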
Trace
Characteristic polynomial
Properties:
- The characteristic polynomial is a monic polynomial of degree $n$
- (q11.5.4) The coefficient of is
- (q11.4.6) The coefficient of equals
- (q11.4.7) The free coefficient equals
- The characteristic polynomial of is
Eigenvalues
Equivalent definitions of eigenvalue.
- $\lambda$ is an eigenvalue of $A$
- (d11.3.1) There exists a non-zero vector $\mathbf{v}$ such that $A\mathbf{v} = \lambda\mathbf{v}$.
- (in such case, $\mathbf{v}$ is called an eigenvector of $A$ that is related to the eigenvalue $\lambda$)
- $A - \lambda I$ is singular
- $(A - \lambda I)\mathbf{x} = \mathbf{0}$ has nontrivial solutions, i.e. $\operatorname{null}(A - \lambda I) \neq \{\mathbf{0}\}$
- (11.4.1) The characteristic equation $\det(A - \lambda I) = 0$ holds
- $\lambda$ is a root of the characteristic equation
- $\lambda$ is an eigenvalue of $A^T$
Theorems:
- Similar matrices have the same eigenvalues (11.3.3), the same characteristic polynomial (11.4.3), and the same algebraic multiplicities of eigenvalues (todo )
- The sum of the eigenvalues of $A$ equals $\operatorname{tr}(A)$ todo
- The product of the eigenvalues of $A$ equals $\det(A)$ todo
- (q11.3.2a) if $\lambda$ is an eigenvalue of $A$, then for each scalar $k$, $k\lambda$ is an eigenvalue of $kA$
- (q11.3.2b) if $\lambda$ is an eigenvalue of $A$, then $\lambda^k$ is an eigenvalue of $A^k$. (for each natural $k$)
- The eigenvalues of a triangular matrix equal the values on its diagonal.
- (q11.3.5b) The eigenvalues of a diagonal matrix $D$ are its diagonal entries
- (q11.3.5b, q11.3.6) The eigenvalues of a diagonalizable matrix $A$ (that is similar to the diagonal matrix $D$) are the diagonal entries of $D$
- if $A^k = 0$ for some natural $k$, then $A$ has at most the eigenvalue $0$ (todo by q11.2.4)
- (11.2.6) $A$ has at most $n$ distinct eigenvalues
- (4.4.1+q11.3.1+left-multiple with A) if $A$ is invertible, then $\lambda$ is an eigenvalue of $A$, if and only if, $\lambda^{-1}$ is an eigenvalue of $A^{-1}$. (with the same eigenvectors)
- (11.2.4) Eigenvectors corresponding to distinct eigenvalues are linearly independent
Eigenvectors
Definitions of eigenvector. The following statements are equivalent:
- $\mathbf{v}$ is an eigenvector of $A$ that is related to $\lambda$
- (d11.3.1) $\mathbf{v}$ is a non-zero vector in $\mathbb{F}^n$ such that $A\mathbf{v} = \lambda\mathbf{v}$
- (11.3.2) is an eigenvector of that related to
Eigenbasis
- An eigenbasis of $A$ is a basis of $\mathbb{F}^n$ consisting of eigenvectors of $A$.
- if $\lambda_1, \ldots, \lambda_k$ are distinct eigenvalues of a matrix $A$ with corresponding eigenspaces spanned by bases $B_1, \ldots, B_k$ respectively, then the union $B_1 \cup \cdots \cup B_k$ is a linearly independent set of eigenvectors of $A$. Thereby, if the size of the union is $n$, then it is also an eigenbasis of $A$
Eigenspace
Definitions of the eigenspace of $A$ associated with its eigenvalue $\lambda$:
- $V_\lambda = \operatorname{null}(A - \lambda I) = \{\mathbf{v} \in \mathbb{F}^n : A\mathbf{v} = \lambda\mathbf{v}\}$
Algebraic & geometric multiplicity
- $\lambda$ is an eigenvalue of $A$
- (d11.5.2) The algebraic multiplicity of $\lambda$ is:
- the multiplicity of $\lambda$ as a root of the characteristic equation
- the highest power of $(x - \lambda)$ that divides the characteristic polynomial of $A$
- (q11.5.2) The geometric multiplicity of $\lambda$ is:
- the dimension of the eigenspace corresponding to $\lambda$, $\dim V_\lambda$
- todo if $A$ is diagonalizable, then the geometric and algebraic multiplicity of $\lambda$ is the number of times that $\lambda$ appears in the diagonalization of $A$
- (11.5.3, q11.5.3) the geometric multiplicity $\le$ the algebraic multiplicity
- todo finding the algebraic multiplicity of an eigenvalue
- todo finding the geometric multiplicity of an eigenvalue
Procedure: Finding Eigenvalues and Eigenvectors
- First, find the eigenvalues $\lambda_i$ of $A$ by solving the characteristic equation $\det(A - \lambda I) = 0$.
- For each $\lambda_i$, find the basic eigenvectors by finding the basic solutions to $(A - \lambda_i I)\mathbf{x} = \mathbf{0}$.
To verify your work, make sure that $A\mathbf{v} = \lambda\mathbf{v}$ for each eigenvalue $\lambda$ and associated eigenvector $\mathbf{v}$.
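The procedure, sketched with NumPy (the matrix is an arbitrary example; `numpy.linalg.eig` solves the characteristic equation numerically rather than symbolically):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])   # characteristic polynomial: x^2 - 7x + 10 = (x-2)(x-5)

eigenvalues, eigenvectors = np.linalg.eig(A)

# verification step: A v = lambda v for each eigenpair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

assert np.allclose(sorted(eigenvalues.real), [2.0, 5.0])
```
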
Similarity
Similarity is an equivalence relation on the space of square matrices.
$A$ and $B$ are square matrices
Definitions of similarity. The following statements are equivalent:
- $A$ and $B$ are similar
- (d10.7.1) There exists an invertible matrix $P$ such that $B = P^{-1}AP$
- $P$ being the change of basis matrix
- (10.7.2) $A$ and $B$ represent the same linear transformation (possibly with respect to different bases)
Theorems:
- (q10.7.8) the zero matrix is similar only to itself; the identity matrix is similar only to itself.
- todo to show that two matrices are similar, show that they are similar to the same diagonal matrix
- todo if $A$ and $B$ are diagonalizable, and they both have the same eigenvalues (with the same multiplicities), then they’re similar (because similarity is transitive)
- todo if $A$ and $B$ are diagonalizable, and they both have the same characteristic polynomial, then they’re similar (because similarity is transitive)
Properties:
- If the matrices and are similar, then
Triangular matrix
Properties:
- if $A$ is a triangular matrix, then:
- (4.3.8) $\det(A) = a_{11} a_{22} \cdots a_{nn}$
- the eigenvalues of $A$ are its diagonal entries $a_{11}, \ldots, a_{nn}$
- each eigenvalue $\lambda$ occurs exactly $k$ times on the diagonal, where $k$ is its algebraic multiplicity
- the characteristic polynomial of $A$ is $(\lambda - a_{11})(\lambda - a_{22}) \cdots (\lambda - a_{nn})$
Diagonal matrix
Diagonal equivalent definitions.
- $A$ is a diagonal matrix
- $A$ is both upper- and lower-triangular
- see q11.1.1
Properties:
- Addition:
- Multiplication
- Powers of a matrix
- . in such case
- A diagonal matrix is symmetric.
- the rank of a diagonal matrix is simply the number of nonzero entries (the eigenvalues)
Diagonalizable
- Diagonalizable definition. The following statements are equivalent.
- $A$ is a diagonalizable matrix
- (d11.3.4) There exists an invertible matrix $P$, such that $P^{-1}AP$ is a diagonal matrix
- (d11.3.4) $A$ is similar to a diagonal matrix
- (11.3.7) $A$ has $n$ linearly independent eigenvectors. (they are $P$’s columns $\mathbf{v}_1, \ldots, \mathbf{v}_n$, where $\mathbf{v}_i$ is an eigenvector of $A$ that’s related to the eigenvalue $\lambda_i$, and $P^{-1}AP = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$)
- (q11.3.7) $\mathbb{F}^n$ has a basis that consists of eigenvectors of $A$
- (11.3.5) is diagonalizable
- (q11.4.10) is diagonalizable
- (11.5.4’) both of the following hold:
- (i) the characteristic polynomial of $A$ factors completely into linear factors, and
- (ii) the geometric multiplicity of every eigenvalue is equal to the algebraic multiplicity
- todo The sum of the dimensions of the eigenspaces equals $n$
- (11.3.6) if $A$ has $n$ distinct eigenvalues, then $A$ is diagonalizable
- todo $A^k = P D^k P^{-1}$ (where $D = P^{-1}AP$ is a diagonal matrix)
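Diagonalization sketched with SymPy (the matrix is an arbitrary example with distinct eigenvalues, so by 11.3.6 it is guaranteed to be diagonalizable):

```python
from sympy import Matrix

A = Matrix([[4, 1],
            [2, 3]])          # distinct eigenvalues 2 and 5

P, D = A.diagonalize()        # columns of P are eigenvectors, D is diagonal
assert D.is_diagonal()
assert P * D * P.inv() == A   # A = P D P^{-1}, i.e. P^{-1} A P = D
```
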
Symmetric matrix
- (d3.2.6) $A$ is symmetric if $A^T = A$
- (q3.2.3)
- (q3.2.4)
- (q3.2.4) a sum of symmetric matrices is a symmetric matrix
- (q3.4.6) if $A$ and $B$ are symmetric matrices, then $AB$ is symmetric if and only if $AB = BA$
Antisymmetric matrix
- (q4.3.10) if $A$ is anti-symmetric ($A^T = -A$) and $n$ is odd, then $\det(A) = 0$
Transition Matrix
- (c8.4, 10.6.1) Let $B$ and $C$ be bases of a vector space $V$, and $T$ be a linear transformation.
- The unique matrix $P$ such that $[\mathbf{v}]_C = P\,[\mathbf{v}]_B$ for all $\mathbf{v} \in V$ is called the transition matrix from $B$ to $C$, and is equal to $\big[\,[\mathbf{b}_1]_C \;\cdots\; [\mathbf{b}_n]_C\,\big]$
- $P$ is invertible, and its inverse is the transition matrix from $C$ to $B$.
Also known as change-of-basis matrix, or change-of-coordinate matrix
Some authors (such as Lay and Anton) call $P$ the transition matrix from $B$ to $C$, and vice versa for $P^{-1}$
Finding the transition matrix from an old basis $B$ to a new basis $C$
- Form the partitioned matrix $[\,C \mid B\,]$ in which the basis vectors (or coordinate vectors) are in column form.
- Use elementary row operations to reduce the matrix in Step 1 to RREF.
- The resulting matrix will be $[\,I \mid P\,]$ where $I$ is an identity matrix.
- Extract the matrix $P$ on the right side of the matrix obtained in Step 3.
!!! todo check it - transition matrix from to is , therefore
Transition matrix from a basis B to the standard basis
- if $B = \{\mathbf{b}_1, \ldots, \mathbf{b}_n\}$, then $[\,\mathbf{b}_1 \cdots \mathbf{b}_n\,]$ (the matrix whose columns are the basis vectors) is the transition matrix from $B$ to the standard basis
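The $[\,\text{new} \mid \text{old}\,] \to [\,I \mid P\,]$ procedure above, sketched with SymPy (both bases are made-up examples in $\mathbb{R}^2$):

```python
from sympy import Matrix

old = Matrix([[1, 1], [1, -1]])   # old basis vectors as columns
new = Matrix([[2, 0], [0, 3]])    # new basis vectors as columns

aug = new.row_join(old)           # partitioned matrix [new | old]
R, _ = aug.rref()                 # reduce to RREF: [I | P]
P = R[:, 2:]                      # transition matrix from the old to the new basis

# column j of P holds the coordinates of old basis vector j
# relative to the new basis, hence new * P == old
assert new * P == old
```
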
Orthogonality
- Orthogonal matrix todo not-in-course
Commuting
- (d3.6.2) $A$ and $B$ are said to commute if $AB = BA$
- (3.6.3)
- $A$ and $B$ share the same independent eigenvectors if and only if $AB = BA$.
Nilpotent matrix
- $A$ is called a nilpotent matrix if $A^k = 0$ for some natural $k$. The smallest such $k$ is called the index of nilpotency of $A$.
Scalar matrix
- $A$ is called a scalar matrix if $A = cI$ for some scalar $c$.
Projection matrix
- A square matrix $P$ is called a projection matrix if $P^2 = P$ (see Idempotent and Projection)