Matrix

Arithmetic

Matrix addition & multiplication

Definitions:

  • Matrix addition - $(A+B)_{ij}=a_{ij}+b_{ij}$, where $A$ and $B$ are both $m\times n$ matrices
  • Matrix multiplication - $(AB)_{ij}=\sum_{k=1}^{n}a_{ik}b_{kj}$, where $A$ is $m\times n$, $B$ is $n\times p$, and $(AB)_{ij}$ is the dot product of the $i$th row of $A$ and the $j$th column of $B$
    • (3.4.3) $\operatorname{row}_i(AB)=\operatorname{row}_i(A)\,B$
    • (3.4.4) $\operatorname{col}_j(AB)=A\,\operatorname{col}_j(B)$
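The two definitions above can be sketched in pure Python (an illustrative sketch, not from the course text), with matrices as nested lists:

```python
# Entrywise addition and the row-by-column product rule.

def mat_add(A, B):
    """(A + B)_ij = a_ij + b_ij; A and B must share dimensions."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    """(AB)_ij = sum_k a_ik * b_kj: dot product of row i of A and column j of B."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))  # [[6, 8], [10, 12]]
print(mat_mul(A, B))  # [[19, 22], [43, 50]]
```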

Properties:

  • Commutative (addition): $A+B=B+A$
  • Commutative (multiplication): in general $AB\neq BA$, see Commuting
  • Associative (addition): $(A+B)+C=A+(B+C)$
  • Associative (multiplication): $(AB)C=A(BC)$
  • Distributive (left): $A(B+C)=AB+AC$
  • Distributive (right): $(A+B)C=AC+BC$
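A quick pure-Python check of the properties above on concrete $2\times 2$ matrices (an illustrative sketch, not from the course text): multiplication is generally NOT commutative, but distributivity holds.

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [0, 3]]

print(mul(A, B) != mul(B, A))                          # True: AB != BA in general
print(mul(A, add(B, C)) == add(mul(A, B), mul(A, C)))  # True: left distributivity
```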

Matrix–Vector Product

Row Echelon form (REF)

  • (1.11.1) Every matrix $A$ is row equivalent to a matrix in row echelon form
  • (8.5.1) The non-zero rows of a row echelon form of $A$ are a basis for the row space of $A$.

Reduced Row Echelon form (RREF)

  • Uniqueness: Each matrix is row equivalent to one and only one reduced echelon matrix.

Row equivalence

Row equivalence is an equivalence relation

Equivalent Definitions:

  • $A$ and $B$ are row equivalent
  • It is possible to transform $A$ into $B$ by a sequence of elementary row operations
  • (q7.5.12) $A$ and $B$ have the same row space
  • There exists an invertible matrix $P$ such that $B=PA$.

Properties: If $A$ and $B$ are row equivalent matrices, then:

  • A given set of column vectors of $A$ is linearly independent if and only if the corresponding column vectors of $B$ are linearly independent.
  • A given set of column vectors of $A$ forms a basis for the column space of $A$ if and only if the corresponding column vectors of $B$ form a basis for the column space of $B$.
  • $A$ and $B$ have the same rank
  • (4.2.2)

Fundamental Spaces

Row space

Definitions:

Theorems:

Column space

Theorems:

  • (9.8.7a)
  • (9.8.7c)

Null space

Definitions:

  • $\operatorname{Null}(A)=\{\,\mathbf{x}:A\mathbf{x}=\mathbf{0}\,\}$. (in the book it’s notated as (!!) )

Theorems:

  • (9.8.7b)

Rank

Notation

Definitions:

  • (d8.5.4) $\operatorname{rank}(A)=\dim(\operatorname{Row}(A))$
  • $\operatorname{rank}(A)$ is the number of linearly independent rows of $A$
  • $\operatorname{rank}(A)$ is the number of linearly independent columns of $A$
  • $\operatorname{rank}(A)$ is the number of the non-zero rows of a row echelon form of $A$
  • $\operatorname{rank}(A)$ is the number of pivots in a row echelon form of $A$

Theorems:

  • (q8.5.4)

  • (q8.5.5)

    • if $\operatorname{rank}(A)=\min(m,n)$ then $A$ has full rank.
      • if $\operatorname{rank}(A)=n$ then $A$ has full column rank. ($\mathbf{x}\mapsto A\mathbf{x}$ is injective)
      • if $\operatorname{rank}(A)=m$ then $A$ has full row rank. ($\mathbf{x}\mapsto A\mathbf{x}$ is surjective) ($A$’s rows are linearly independent) todo
    • Otherwise, if $\operatorname{rank}(A)<\min(m,n)$, then $A$ is rank-deficient.
  • (q8.5.6)

  • if $A$ is an invertible matrix, then

    • (q8.5.7a) $\operatorname{rank}(AB)=\operatorname{rank}(B)$ for any matrix $B$
    • (q8.5.7b) $\operatorname{rank}(BA)=\operatorname{rank}(B)$ for any matrix $B$
  • (8.6.1) Rank–nullity theorem: $\operatorname{rank}(A)+\operatorname{nullity}(A)=n$ (where $n$ is the number of columns of $A$)

  • Row equivalent matrices have the same rank todo

  • todo

  • (8.3.4a+8.6.1)

  • rank of square matrices: let $A,B$ be square matrices of order $n$, then:

    • (q8.5.8a)
    • (q10.5.3) Sylvester’s inequality: $\operatorname{rank}(A)+\operatorname{rank}(B)-n\le\operatorname{rank}(AB)$
    • if $AB=0$, then $\operatorname{rank}(A)+\operatorname{rank}(B)\le n$. (from Sylvester’s inequality and Rank–nullity theorem)
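The rank facts above can be checked numerically; a minimal pure-Python sketch (not from the course text) that counts pivots after Gaussian elimination, using exact `Fraction` arithmetic, and illustrates rank-nullity on a sample $3\times 4$ matrix:

```python
from fractions import Fraction

def rank(M):
    """rank(M) = number of pivots after forward elimination."""
    A = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(A), len(A[0])
    r = 0  # current pivot row
    for c in range(cols):
        piv = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        A[r], A[piv] = A[piv], A[r]
        for i in range(r + 1, rows):
            f = A[i][c] / A[r][c]
            A[i] = [x - f * y for x, y in zip(A[i], A[r])]
        r += 1
    return r

A = [[1, 2, 3, 4], [2, 4, 6, 8], [1, 0, 1, 0]]  # row 2 = 2 * row 1
print(rank(A))  # 2
# nullity = n - rank = 4 - 2 = 2, consistent with the rank-nullity theorem
```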

Nullity

  • (8.6.1) $\operatorname{nullity}(A)=\dim(\operatorname{Null}(A))=n-\operatorname{rank}(A)$

Transformation matrix

Matrix Representations of Linear Transformation

  • (d10.1.1) $[T]_{C}^{B}$ is the matrix whose $i$th column is $[T(b_{i})]_{C}$, where $T:V\to W$ is a linear transformation and $B=\{b_{1},\dots,b_{n}\}$ and $C$ are bases of $V$ and $W$. (respectively)

Transpose

  • Notation: $A^{T}$, $(A^{T})_{ij}=A_{ji}$
  • (3.2.4) $(A^{T})^{T}=A$
  • (3.4.5) $(AB)^{T}=B^{T}A^{T}$
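A small pure-Python sketch (illustrative, not from the course text) verifying the reversed order in $(AB)^{T}=B^{T}A^{T}$ on a concrete non-square pair:

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def T(A):
    """Transpose: rows become columns."""
    return [list(col) for col in zip(*A)]

A = [[1, 2, 3], [4, 5, 6]]    # 2x3
B = [[1, 0], [0, 1], [1, 1]]  # 3x2
print(T(mul(A, B)) == mul(T(B), T(A)))  # True
print(T(T(A)) == A)                     # True: transposing twice gives A back
```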

Equivalence

This term is beyond the scope of the course.

Matrix equivalence is an equivalence relation on the space of rectangular matrices.

Two matrices $A$ and $B$ are called equivalent if $B=Q^{-1}AP$ for some invertible $n\times n$ matrix $P$ and invertible $m\times m$ matrix $Q$.

Equivalent matrices represent the same linear transformation $V\to W$ under two different choices of a pair of bases of $V$ and $W$, with $P$ and $Q$ being the change of basis matrices in $V$ and $W$ respectively.

  • $B,C$ are old bases of $V,W$ (respectively)
  • $B',C'$ are new bases of $V,W$ (respectively)
  • $P$ is the change-of-basis invertible matrix of $V$ from $B'$ to $B$
  • $Q$ is the change-of-basis invertible matrix of $W$ from $C'$ to $C$
  • $A=[T]_{C}^{B}$ is the transformation matrix by the old bases
  • $A'=Q^{-1}AP=[T]_{C'}^{B'}$ is the transformation matrix by the new bases

Theorems

  • The following statements are equivalent:

    • For each $\mathbf{b}$ in $\mathbb{R}^{m}$, the equation $A\mathbf{x}=\mathbf{b}$ has a solution
    • Each $\mathbf{b}$ in $\mathbb{R}^{m}$ is a linear combination of the columns of $A$
    • The columns of $A$ span $\mathbb{R}^{m}$
    • $A$ has a pivot position in every row
  • The following statements are equivalent:

    • The equation $A\mathbf{x}=\mathbf{b}$ has a unique least-squares solution for each $\mathbf{b}$ in $\mathbb{R}^{m}$
    • The columns of $A$ are linearly independent
    • The matrix $A^{T}A$ is invertible

Square Matrices

  • Let $A$ be an $n\times n$ square matrix.
  • $A=[T_{A}]_{B}$ where $B$ is a basis of $\mathbb{F}^{n}$

Theorems:

  • (by 9.3.7, 12.3.1, 12.3.2a, e2023a85q1a)

Invertibility

  • Theorem 3.10.6: Let $A$ be an $n$-ordered square matrix over a field $\mathbb{F}$. The following statements are equivalent:
    • $A$ is an invertible matrix
    • $A$ can be expressed as a finite product of elementary matrices.
    • There exists a $B$ such that $BA=I$
    • There exists a $B$ such that $AB=I$
    • There exists a $B$ such that $AB=BA=I$, (in such case $B$ is unique, and $B=A^{-1}$)
    • $A$ is row-equivalent to $I_{n}$.
    • $A$ is column-equivalent to $I_{n}$.
    • The columns of $A$ are linearly independent.
    • The rows of $A$ are linearly independent.
    • The columns of $A$ span $\mathbb{F}^{n}$
    • The rows of $A$ span $\mathbb{F}^{n}$
    • The columns of $A$ are a basis of $\mathbb{F}^{n}$
    • The rows of $A$ are a basis of $\mathbb{F}^{n}$
    • $A^{T}$ is an invertible matrix (in such case $(A^{T})^{-1}=(A^{-1})^{T}$) (3.8.4b)
    • (4.4.1, q10.7.7 for l.t.) The determinant of $A$ is non-zero: $\det(A)\neq 0$
    • (4.4.1, and q11.3.1) The number $0$ is not an eigenvalue of $A$.
    • (q8.5.8b) $A$ has a full rank: $\operatorname{rank}(A)=n$
    • (10.5.1, and 9.6.2)
    • The linear transformation mapping $\mathbf{x}$ to $A\mathbf{x}$ is surjective; that is, the equation $A\mathbf{x}=\mathbf{b}$ has at least one solution for each $\mathbf{b}$ in $\mathbb{F}^{n}$.
    • The linear transformation mapping $\mathbf{x}$ to $A\mathbf{x}$ is injective; that is, the equation $A\mathbf{x}=\mathbf{b}$ has at most one solution for each $\mathbf{b}$ in $\mathbb{F}^{n}$.
    • The linear transformation mapping $\mathbf{x}$ to $A\mathbf{x}$ is bijective; that is, the equation $A\mathbf{x}=\mathbf{b}$ has exactly one solution for each $\mathbf{b}$ in $\mathbb{F}^{n}$.

Properties:

  • for invertible matrix $A$
    • (3.8.3)
      • (left-cancellable) $AB=AC\implies B=C$
      • (right-cancellable) $BA=CA\implies B=C$
    • (3.8.4d) if $k\neq 0$, then $kA$ is also invertible. (in such case $(kA)^{-1}=\frac{1}{k}A^{-1}$)
    • (q8.5.7a) $\operatorname{rank}(AB)=\operatorname{rank}(B)$ for any matrix $B$
    • (q8.5.7b) $\operatorname{rank}(BA)=\operatorname{rank}(B)$ for any matrix $B$
  • if $A$ and $B$ are invertible matrices (of order $n$)
    • $A$ and $B$ are row equivalent
    • (3.8.4c) $AB$ is also invertible, and $(AB)^{-1}=B^{-1}A^{-1}$

Theorems:

  • (4.5.2) if $A,B$ are square matrices, and $AB=I$, then:
    • $A$ and $B$ are both invertible ($B=A^{-1}$)
  • (q3.10.2) $A$ and $B$ are invertible, if and only if, $AB$ is invertible
  • $A,B$ are square matrices
    • if $A$ is invertible and $B$ is singular, then $AB$ is singular

Procedure: determine whether a square matrix $A$ is invertible and, if so, find $A^{-1}$:

  • Form the augmented matrix $[\,A\mid I\,]$ and put it into RREF.
  • If the RREF has the form $[\,I\mid B\,]$, then $A$ is invertible and $B=A^{-1}$.
  • else, if the matrix in the left half of the RREF is not $I$, then $A$ is singular.
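The procedure above can be sketched in pure Python (an illustrative sketch, not from the course text), using exact `Fraction` arithmetic so the RREF test is exact:

```python
from fractions import Fraction

def inverse(M):
    """Gauss-Jordan on [A | I]; returns A^{-1}, or None if A is singular."""
    n = len(M)
    # form the augmented matrix [A | I]
    A = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(M)]
    for c in range(n):
        piv = next((r for r in range(c, n) if A[r][c] != 0), None)
        if piv is None:
            return None  # left half cannot reach I: A is singular
        A[c], A[piv] = A[piv], A[c]
        A[c] = [x / A[c][c] for x in A[c]]  # scale pivot row to 1
        for r in range(n):
            if r != c and A[r][c] != 0:     # clear the rest of the column
                A[r] = [x - A[r][c] * y for x, y in zip(A[r], A[c])]
    return [row[n:] for row in A]           # right half is A^{-1}

print(inverse([[2, 1], [1, 1]]) == [[1, -1], [-1, 2]])  # True
print(inverse([[1, 2], [2, 4]]))                        # None (singular)
```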

WolframAlpha inverse [matrix]

Elementary matrix

  • $E$ is called an elementary matrix if it can be obtained from an identity matrix by performing a single elementary row operation.
  • Every elementary matrix is invertible, and the inverse is also an elementary matrix.

Determinant

  • Notation: $\det(A)$, $|A|$

  • Cofactor Expansion: $\det(A)=\sum_{j=1}^{n}a_{ij}C_{ij}$

    • $a_{ij}$ is the entry of the $i$th row and $j$th column of $A$
    • $A_{ij}$ is the submatrix obtained by removing the $i$th row and the $j$th column of $A$
    • Here $j$ is variable and $i$ is some constant, but the opposite is also possible.
    • $M_{ij}=\det(A_{ij})$ is the minor of $a_{ij}$.
    • $C_{ij}=(-1)^{i+j}M_{ij}$ is the cofactor of entry $a_{ij}$
  • Theorems:

    • (4.3.1)
    • (4.2.2) if $A$ has a zero row/column, then $\det(A)=0$
    • (4.3.5) if $A$ has two equal rows (or columns), then $\det(A)=0$
    • (4.5.1) Multiplicative Property: $\det(AB)=\det(A)\det(B)$
    • (by 4.5.1)
    • (4.5.3)
    • (4.5.4)
    • (4.4.1) $A$ is invertible, if and only if, $\det(A)\neq 0$
    • (q4.4.4) if the sum of each of $A$’s rows is zero, then $\det(A)=0$
    • (4.3.8) if $A$ is a triangular matrix, then $\det(A)$ is the product of the entries on its main diagonal
    • (10.7.3) similar matrices have the same determinant
    • (q11.3.1) $\det(A-\lambda I)=0$, if and only if, $\lambda$ is an eigenvalue of $A$
    • Row Operations
      • (4.3.6) If a multiple of one row of $A$ is added to another row to produce a matrix $B$, then $\det(B)=\det(A)$
      • (4.3.2) If two rows of $A$ are interchanged to produce $B$, then $\det(B)=-\det(A)$
      • (4.3.3) if one row of $A$ is multiplied by $k$ to produce $B$, then $\det(B)=k\det(A)$
    • (q4.3.3b) Homogeneity: $\det(kA)=k^{n}\det(A)$
    • $\det(A)$ is equal to the product of its eigenvalues (see q11.4.7)
    • (q4.3.10) the determinant of an odd-dimension anti-symmetric matrix is zero
    • (4.3.4) Let $A$, $B$, and $C$ be $n\times n$ matrices that differ only in a single row, say the $r$th, and assume that the $r$th row of $C$ can be obtained by adding corresponding entries in the $r$th rows of $A$ and $B$. Then $\det(C)=\det(A)+\det(B)$. The same result holds for columns.

Procedure: computing the determinant

  1. Convert $A$ into an upper triangular matrix via the row operations of switching rows, and adding a multiple of another row. (but NOT scaling of a row by a scalar)
  • Note: if during the row operations we find a zero row, then $\det(A)=0$
  2. Let $s$ be the number of times two rows were switched; then $\det(A)=(-1)^{s}\cdot(\text{product of the diagonal entries})$
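A pure-Python sketch of this procedure (illustrative, not from the course text): reduce to upper triangular form using only swaps and row additions, count the swaps, and multiply the diagonal, with exact `Fraction` arithmetic.

```python
from fractions import Fraction

def det(M):
    n = len(M)
    A = [[Fraction(x) for x in row] for row in M]
    sign = 1
    for c in range(n):
        piv = next((r for r in range(c, n) if A[r][c] != 0), None)
        if piv is None:
            return Fraction(0)  # no pivot in this column => det = 0
        if piv != c:
            A[c], A[piv] = A[piv], A[c]
            sign = -sign        # each swap flips the sign (4.3.2)
        for r in range(c + 1, n):
            f = A[r][c] / A[c][c]
            A[r] = [x - f * y for x, y in zip(A[r], A[c])]  # det unchanged (4.3.6)
    p = Fraction(sign)
    for i in range(n):
        p *= A[i][i]            # product of the diagonal, times (-1)^s
    return p

print(det([[0, 2], [3, 4]]))  # -6 (one swap)
print(det([[1, 2], [2, 4]]))  # 0  (rows are dependent)
```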

Trace

  • $\operatorname{tr}(A)=\sum_{i=1}^{n}a_{ii}$ is the sum of its eigenvalues todo
  • (10.7.6) $\operatorname{tr}(AB)=\operatorname{tr}(BA)$
  • (10.7.5) similar matrices have the same trace
  • todo
  • todo

Characteristic polynomial

Properties:

  • The characteristic polynomial $p_{A}(\lambda)=\det(\lambda I-A)$ is a monic polynomial of degree $n$
  • (q11.5.4) The coefficient of $\lambda^{n}$ is $1$
  • (q11.4.6) The coefficient of $\lambda^{n-1}$ equals $-\operatorname{tr}(A)$
  • (q11.4.7) The free coefficient equals $(-1)^{n}\det(A)$
  • The characteristic polynomial of a $2\times 2$ matrix $A$ is $\lambda^{2}-\operatorname{tr}(A)\lambda+\det(A)$
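The coefficient facts above can be checked directly for a $2\times 2$ matrix; a small sketch (illustrative, not from the course text) comparing $\det(\lambda I-A)$ against $\lambda^{2}-\operatorname{tr}(A)\lambda+\det(A)$ at several points:

```python
A = [[2, 1], [1, 2]]
tr = A[0][0] + A[1][1]                     # trace = 4
d = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant = 3

def char_poly(lam):
    """det(lam*I - A), expanded directly for the 2x2 case."""
    return (lam - A[0][0]) * (lam - A[1][1]) - A[0][1] * A[1][0]

# the two expressions agree everywhere, so the coefficients match
for lam in range(-3, 4):
    assert char_poly(lam) == lam**2 - tr * lam + d
print("p(lambda) = lambda^2 -", tr, "* lambda +", d)
```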

Eigenvalues

Equivalent definitions of eigenvalue.

  • $\lambda$ is an eigenvalue of $A$
  • (d11.3.1) There exists a non-zero vector $\mathbf{v}$ such that $A\mathbf{v}=\lambda\mathbf{v}$.
    • (in such case, $\mathbf{v}$ is called an eigenvector of $A$ that is related to the eigenvalue $\lambda$)
  • $A-\lambda I$ is singular
  • $(A-\lambda I)\mathbf{x}=\mathbf{0}$ has nontrivial solutions, i.e. $\operatorname{Null}(A-\lambda I)\neq\{\mathbf{0}\}$
  • (11.4.1) The characteristic equation $\det(A-\lambda I)=0$ holds
  • $\lambda$ is a root of the characteristic equation
  • $\lambda$ is an eigenvalue of $A^{T}$

Theorems:

  • Similar matrices have the same eigenvalues (11.3.3), the same characteristic polynomial (11.4.3), and the same algebraic multiplicities of eigenvalues (todo)
  • The sum of the eigenvalues of $A$ equals $\operatorname{tr}(A)$ todo
  • The product of the eigenvalues of $A$ equals $\det(A)$ todo
  • (q11.3.2a) if $\lambda$ is an eigenvalue of $A$, then for each scalar $k$, $k\lambda$ is an eigenvalue of $kA$
  • (q11.3.2b) if $\lambda$ is an eigenvalue of $A$, then $\lambda^{k}$ is an eigenvalue of $A^{k}$. (for each natural $k$)
  • The eigenvalues of a triangular matrix equal the values on its diagonal.
  • (q11.3.5b) The eigenvalues of a diagonal matrix $D$ are its diagonal entries
  • (q11.3.5b, q11.3.6) The eigenvalues of a diagonalizable matrix (that is similar to a diagonal $D$) are the diagonal entries of $D$
  • if $A^{k}=0$ for some natural $k$, then $0$ is the only possible eigenvalue of $A$ (todo by q11.2.4)
  • (11.2.6) $A$ has at most $n$ distinct eigenvalues
  • (4.4.1+q11.3.1+left-multiple with A) if $A$ is invertible, then $\lambda$ is an eigenvalue of $A$, if and only if, $\lambda^{-1}$ is an eigenvalue of $A^{-1}$. (with the same eigenvectors)

Eigenvectors

Definitions of eigenvector. The following statements are equivalent:

  • $\mathbf{v}$ is an eigenvector of $A$ that is related to $\lambda$
  • (d11.3.1) $\mathbf{v}$ is a non-zero vector in $\mathbb{F}^{n}$ such that $A\mathbf{v}=\lambda\mathbf{v}$
  • (11.3.2) $\mathbf{v}$ is a non-zero solution of $(A-\lambda I)\mathbf{x}=\mathbf{0}$

Eigenbasis

  • An eigenbasis of $A$ is a basis of $\mathbb{F}^{n}$ consisting of eigenvectors of $A$.
  • if $\lambda_{1},\dots,\lambda_{k}$ are distinct eigenvalues of a matrix $A$ with corresponding eigenspaces, spanned by bases $S_{1},\dots,S_{k}$ respectively, then the union $S_{1}\cup\dots\cup S_{k}$ is a linearly independent set of eigenvectors of $A$. thereby if the size of the union is $n$, then it is also an eigenbasis of $A$

Eigenspace

Definitions of the eigenspace of $A$ associated with its eigenvalue $\lambda$:

  • $E_{\lambda}=\operatorname{Null}(A-\lambda I)=\{\mathbf{v}:A\mathbf{v}=\lambda\mathbf{v}\}$

Algebraic & geometric multiplicity

  • $\lambda$ is an eigenvalue of $A$

    • (d11.5.2) The algebraic multiplicity of $\lambda$ is:
      • the multiplicity of $\lambda$ as a root of the characteristic equation
      • the highest power of $(x-\lambda)$ that divides the characteristic polynomial of $A$
    • (q11.5.2) The geometric multiplicity of $\lambda$, is:
      • the dimension of the eigenspace corresponding to $\lambda$, $\dim(\operatorname{Null}(A-\lambda I))$
    • todo if $A$ is diagonalizable, then the geometric and algebraic multiplicity of $\lambda$ is the number of times that $\lambda$ appears in the diagonalization of $A$
    • (11.5.3, q11.5.3) the geometric multiplicity $\le$ the algebraic multiplicity
  • finding the algebraic multiplicity of an eigenvalue todo

  • finding the geometric multiplicity of an eigenvalue todo

Procedure: Finding Eigenvalues and Eigenvectors

  1. First, find the eigenvalues $\lambda$ of $A$ by solving the characteristic equation $\det(A-\lambda I)=0$.
  2. For each $\lambda$, find the basic eigenvectors by finding the basic solutions to $(A-\lambda I)\mathbf{x}=\mathbf{0}$.

To verify your work, make sure that $A\mathbf{v}=\lambda\mathbf{v}$ for each eigenvalue $\lambda$ and associated eigenvector $\mathbf{v}$.
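A pure-Python sketch of this procedure for a $2\times 2$ matrix (illustrative, not from the course text; it assumes real, distinct eigenvalues and a non-zero top-right entry):

```python
import math

A = [[4, 1], [2, 3]]
tr = A[0][0] + A[1][1]
d = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Step 1: roots of lambda^2 - tr*lambda + det = 0 (assumes a positive discriminant)
disc = math.sqrt(tr * tr - 4 * d)
eigenvalues = [(tr + disc) / 2, (tr - disc) / 2]
print(sorted(eigenvalues))  # [2.0, 5.0]

# Step 2 + verification: from the first row (a-lam, b) of A - lam*I,
# v = (b, lam - a) solves it (assumes b = A[0][1] != 0); check A v = lam v.
for lam in eigenvalues:
    v = (A[0][1], lam - A[0][0])
    Av = (A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1])
    assert math.isclose(Av[0], lam * v[0]) and math.isclose(Av[1], lam * v[1])
print("A v = lambda v verified for both eigenvalues")
```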

Similarity

Similarity is an equivalence relation on the space of square matrices.

$A$ and $B$ are square matrices

Definitions of similarity. The following statements are equivalent:

  • $A$ and $B$ are similar
  • (d10.7.1) There exists an invertible matrix $P$ such that $B=P^{-1}AP$
    • $P$ being the change of basis matrix
  • (10.7.2) $A$ and $B$ represent the same linear transformation. (possibly with respect to different bases)

Theorems:

  • (q10.7.8) the zero matrix is similar only to itself. the identity matrix is similar only to itself.
  • todo to show that two matrices are similar, show that they are similar to the same diagonal matrix
  • todo let $A$ and $B$ be diagonalizable matrices with the same eigenvalues and the same algebraic multiplicities; then they’re similar (because similarity is transitive)
  • todo let $A$ and $B$ be diagonalizable matrices with the same characteristic polynomial; then they’re similar (because similarity is transitive)

Properties:

  • If the matrices $A$ and $B$ are similar, then
    • todo
    • todo
    • $A$ is invertible, if and only if, $B$ is also invertible todo
    • (10.7.3) $\det(A)=\det(B)$
    • (10.7.5) $\operatorname{tr}(A)=\operatorname{tr}(B)$
    • $A$ and $B$ have the same eigenvalues (11.3.3)
    • $A$ and $B$ have the same algebraic multiplicities of eigenvalues todo
    • $A$ and $B$ have the same characteristic polynomial (11.4.3)

Triangular matrix

Properties:

  • if $A$ is a triangular matrix, then
    • (4.3.8) $\det(A)=a_{11}a_{22}\cdots a_{nn}$
    • the eigenvalues of $A$ are the entries on its main diagonal
      • each eigenvalue occurs exactly $k$ times on the diagonal, where $k$ is its algebraic multiplicity
    • the characteristic polynomial of $A$ is $(\lambda-a_{11})(\lambda-a_{22})\cdots(\lambda-a_{nn})$

Diagonal matrix

Diagonal equivalent definitions.

  • $D$ is a diagonal matrix
  • $D$ is both upper- and lower-triangular
  • see q11.1.1

Properties:

  • Addition: the sum of diagonal matrices is diagonal, and the diagonal entries add entrywise
  • Multiplication: the product of diagonal matrices is diagonal, and the diagonal entries multiply entrywise (so diagonal matrices commute)
  • Powers of a matrix: $D^{k}=\operatorname{diag}(d_{1}^{k},\dots,d_{n}^{k})$
  • $D$ is invertible, if and only if, all its diagonal entries are non-zero. in such case $D^{-1}=\operatorname{diag}(d_{1}^{-1},\dots,d_{n}^{-1})$
  • A diagonal matrix is symmetric.
  • the rank of a diagonal matrix is simply the number of its nonzero diagonal entries (the nonzero eigenvalues)

Diagonalizable

  • Diagonalizable definition. The following statements are equivalent.

    • $A$ is a diagonalizable matrix
    • (d11.3.4) There exists an invertible matrix $P$, such that $P^{-1}AP$ is a diagonal matrix
    • (d11.3.4) $A$ is similar to a diagonal matrix
    • (11.3.7) $A$ has $n$ linearly independent eigenvectors. (they are $P$’s columns, that are $\mathbf{v}_{1},\dots,\mathbf{v}_{n}$, and $\mathbf{v}_{i}$ is an eigenvector of $A$ that’s related to the eigenvalue $\lambda_{i}$, and $P^{-1}AP=\operatorname{diag}(\lambda_{1},\dots,\lambda_{n})$)
    • (q11.3.7) $\mathbb{F}^{n}$ has a basis that consists of eigenvectors of $A$
    • (11.3.5) is diagonalizable
    • (q11.4.10) is diagonalizable
    • (11.5.4’)
      • (i) the characteristic polynomial of $A$ factors completely into linear factors. and
      • (ii) the geometric multiplicity of every eigenvalue is equal to the algebraic multiplicity
    • todo The sum of the dimensions of the eigenspaces equals $n$
  • (11.3.6) if $A$ has $n$ distinct eigenvalues, then $A$ is diagonalizable

  • todo $A=PDP^{-1}$, ($D$ is a diagonal matrix)

Symmetric matrix

  • (d3.2.6) $A$ is symmetric if $A^{T}=A$
  • (q3.2.3)
  • (q3.2.4)
  • (q3.2.4) the sum of symmetric matrices is a symmetric matrix
  • (q3.4.6) if $A$ and $B$ are symmetric matrices, then $AB$ is symmetric, if and only if, $AB=BA$
  • (q4.3.10)
    • if $A$ is anti-symmetric ($A^{T}=-A$), and $n$ is odd, then $\det(A)=0$

Change of Basis matrix (Transition matrix)

also change-of-coordinates matrix

  • (d8.4.6) Let $B=\{u_{1},\dots,u_{n}\}$ and $C=\{v_{1},\dots,v_{n}\}$ be bases of $V$. if \begin{align} u_{1} &= a_{11}v_{1}+\dots+a_{n1}v_{n} \\ \vdots \notag \\ u_{n} &= a_{1n}v_{1}+\dots+a_{nn}v_{n} \\ \end{align}then $P=(a_{ij})$ is the transition matrix from basis $B$ to basis $C$.
  • the transition matrix from basis $B$ to basis $C$ is the matrix whose columns are the coordinate vectors of the vectors of $B$ by $C$.
  • (8.4.9) $P^{-1}$ is the transition matrix from $C$ to $B$
  • The transition matrix is a square matrix of $n$-order, where $n=\dim V$
  • (by 8.4.5) The transition matrix is an invertible matrix

Theorems:

  • (8.4.8) if $P$ is a square matrix, and $[\mathbf{v}]_{C}=P[\mathbf{v}]_{B}$ for each $\mathbf{v}\in V$, then $P$ is the transition matrix from $B$ to $C$.
  • (10.6.1) $T:V\to V$ and $B$ and $C$ are bases of $V$. if $P$ is the transition matrix from $B$ to $C$, then $[T]_{C}=P[T]_{B}P^{-1}$. (or symmetrically $[T]_{B}=P^{-1}[T]_{C}P$)
    • the transition matrix from $C$ to $B$ is $P^{-1}$, therefore the two bases may swap roles

Finding the transition matrix from an old basis to a new basis

  1. Form the partitioned matrix $[\,\text{new}\mid\text{old}\,]$, in which the basis vectors (or coordinate vectors) are in column form.
  2. Use elementary row operations to reduce the matrix in Step 1 to RREF.
  3. The resulting matrix will be $[\,I\mid P\,]$ where $I$ is an identity matrix.
  4. Extract the matrix $P$ on the right side of the matrix obtained in Step 3.
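The steps above can be sketched on a small example in pure Python (illustrative, not from the course text): find the transition matrix from the standard basis (old) to the new basis $\{(1,1),(1,-1)\}$ of $\mathbb{R}^{2}$, using exact `Fraction` arithmetic.

```python
from fractions import Fraction

new = [[1, 1], [1, -1]]   # matrix whose columns are the new basis vectors
old = [[1, 0], [0, 1]]    # matrix whose columns are the old (standard) basis vectors

# Step 1: the partitioned matrix [ new | old ]
M = [[Fraction(x) for x in nr + orow] for nr, orow in zip(new, old)]

# Step 2: Gauss-Jordan to RREF
n = len(M)
for c in range(n):
    piv = next(r for r in range(c, n) if M[r][c] != 0)
    M[c], M[piv] = M[piv], M[c]
    M[c] = [x / M[c][c] for x in M[c]]
    for r in range(n):
        if r != c and M[r][c] != 0:
            M[r] = [x - M[r][c] * y for x, y in zip(M[r], M[c])]

# Steps 3-4: the left half is now I, and the right half is the transition matrix P
P = [row[n:] for row in M]
# check: (1,0) = 1/2*(1,1) + 1/2*(1,-1), matching P applied to [1, 0]
assert P == [[Fraction(1, 2), Fraction(1, 2)], [Fraction(1, 2), Fraction(-1, 2)]]
print("P =", P)
```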

Transition matrix from a basis B to the standard basis

  • if $B=\{v_{1},\dots,v_{n}\}$, then $[\,v_{1}\mid\cdots\mid v_{n}\,]$ (the matrix whose columns are the vectors of $B$) is the transition matrix from $B$ to the standard basis

Orthogonality

todo Orthogonal matrix - This is not taught in the course.

Commuting

  • (d3.6.2) $A$ and $B$ are said to commute if $AB=BA$
  • (3.6.3)
  • $A$ and $B$ share the same independent eigenvectors if and only if $AB=BA$.
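A small pure-Python sketch of d3.6.2 on concrete matrices (illustrative, not from the course text): two diagonal matrices commute, while a generic pair does not.

```python
def mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

D1 = [[2, 0], [0, 3]]
D2 = [[5, 0], [0, 7]]
A = [[1, 2], [3, 4]]

print(mul(D1, D2) == mul(D2, D1))  # True: diagonal matrices always commute
print(mul(A, D1) == mul(D1, A))    # False: this A and D1 do not commute
```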