Notes on Strang Lecture - 3

Lecture link

Matrix multiplication

For matrices $A$ and $B$, the product $C = AB$ can be computed and interpreted in multiple ways.

Element-wise computation

The element $C_{ij}$ equals the dot product of the $i$-th row of $A$ and the $j$-th column of $B$:

$$C_{ij} = \sum_{k=1}^{n} A_{ik} B_{kj}$$
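The formula above translates directly into a triple loop. A minimal sketch (the function name and the example matrices are illustrative, not from the lecture):

```python
import numpy as np

def matmul_elementwise(A, B):
    """Compute C = AB one entry at a time: C[i, j] = sum_k A[i, k] * B[k, j]."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "inner dimensions must match"
    C = np.zeros((m, p))
    for i in range(m):
        for j in range(p):
            for k in range(n):  # dot product of row i of A with column j of B
                C[i, j] += A[i, k] * B[k, j]
    return C

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])
print(matmul_elementwise(A, B))  # agrees with A @ B
```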

Column and row perspectives

  1. Column perspective: Matrix $C$ consists of linear combinations of the columns of $A$, where the coefficients come from the columns of $B$.

  2. Row perspective: Matrix $C$ consists of linear combinations of the rows of $B$, where the coefficients come from the rows of $A$.
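Both perspectives can be checked numerically; this sketch (example matrices are arbitrary) builds $C$ column by column and row by row and compares against the direct product:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])

# Column perspective: the j-th column of C is A times the j-th column of B,
# i.e. a combination of A's columns with coefficients from B[:, j].
C_cols = np.column_stack([A @ B[:, j] for j in range(B.shape[1])])

# Row perspective: the i-th row of C is the i-th row of A times B,
# i.e. a combination of B's rows with coefficients from A[i, :].
C_rows = np.vstack([A[i, :] @ B for i in range(A.shape[0])])

assert np.allclose(C_cols, A @ B)
assert np.allclose(C_rows, A @ B)
```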

Sum of rank-one matrices

Another important interpretation: $AB$ is the sum of rank-one matrices, each formed by the outer product of the $i$-th column of $A$ and the $i$-th row of $B$:

$$AB = \sum_{i=1}^{n} \vec{a}_i \vec{b}_i^T$$

where $\vec{a}_i$ denotes the $i$-th column of $A$ and $\vec{b}_i^T$ denotes the $i$-th row of $B$.
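The rank-one decomposition can be verified in a couple of lines (example matrices are illustrative):

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])

# AB as a sum of outer products: (i-th column of A) times (i-th row of B).
# Each term np.outer(A[:, i], B[i, :]) is a rank-one matrix.
C = sum(np.outer(A[:, i], B[i, :]) for i in range(A.shape[1]))

assert np.allclose(C, A @ B)
```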

Inverse matrices

For a square matrix $A$, the inverse matrix $A^{-1}$ satisfies:

$$A^{-1}A = I, \quad \text{if } \exists A^{-1}$$

and equivalently:

$$AA^{-1} = I$$
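A quick numerical check that the left and right inverses coincide for a square invertible matrix (the matrix here is an arbitrary example):

```python
import numpy as np

A = np.array([[2., 1.], [5., 3.]])
A_inv = np.linalg.inv(A)

# For a square invertible matrix, both orderings give the identity.
assert np.allclose(A_inv @ A, np.eye(2))
assert np.allclose(A @ A_inv, np.eye(2))
```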

Existence conditions

The inverse $A^{-1}$ exists if and only if there is no non-zero vector $\vec{x}$ such that $A\vec{x} = \vec{0}$. To see this, note that if such an $\vec{x}$ existed and $A^{-1}$ also existed, we would have:

$$\vec{x} = A^{-1}A\vec{x} = A^{-1}\vec{0} = \vec{0}$$

which contradicts the assumption that $\vec{x} \neq \vec{0}$.

Similarly, $A^{-1}$ exists if and only if there is no non-zero row vector $\vec{x}^T$ such that $\vec{x}^T A = \vec{0}^T$.
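A concrete singular example, assuming the classic case where one column is a multiple of another (the specific matrix is illustrative):

```python
import numpy as np

# Singular matrix: the second column is twice the first.
A = np.array([[1., 2.], [3., 6.]])

# x = (2, -1) is a non-zero vector with Ax = 0, so A cannot have an inverse.
x = np.array([2., -1.])
assert np.allclose(A @ x, 0)

# Consistently, the determinant is zero.
assert np.isclose(np.linalg.det(A), 0)
```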

Computing the inverse

Suppose $A$ is invertible. One method to find $A^{-1}$ is to perform Gauss-Jordan elimination on the augmented matrix $[A \mid I]$, applying both forward and backward elimination steps to transform it into $[I \mid A^{-1}]$.
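The Gauss-Jordan procedure on $[A \mid I]$ can be sketched as follows. This is a minimal implementation with partial pivoting for numerical stability (the function name and example matrix are illustrative):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Invert A by row-reducing the augmented matrix [A | I] to [I | A^{-1}]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # build [A | I]
    for col in range(n):
        # Partial pivoting: bring the largest remaining entry into pivot position.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if np.isclose(M[pivot, col], 0):
            raise ValueError("matrix is singular")
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]          # scale the pivot row so the pivot equals 1
        for row in range(n):           # eliminate both below (forward) and above (backward)
            if row != col:
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                    # right half is now A^{-1}

A = np.array([[2., 1.], [5., 3.]])
print(gauss_jordan_inverse(A))
```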