In this lecture, we (1) review Gaussian elimination (GE) for solving linear equations $\mathbf{A} \mathbf{x} = \mathbf{b}$, and then (2) show that GE leads to a useful decomposition $$ \mathbf{A} = \mathbf{L} \mathbf{U}. $$
Let's review (from any undergraduate linear algebra textbook) how to solve a linear system $$ \begin{pmatrix} 2 & 1 & -1 \\ -3 & -1 & 2 \\ -2 & 1 & 2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 8 \\ -11 \\ -3 \end{pmatrix}. $$
Stage 1: Let's eliminate variable $x_1$ from equations (2) and (3).
Multiplying equation (1) by $\ell_{21} = 3/2$ and adding it to equation (2) leads to $$ \begin{pmatrix} 2 & 1 & -1 \\ 0 & 1/2 & 1/2 \\ -2 & 1 & 2 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 8 \\ 1 \\ -3 \end{pmatrix}. $$
Multiplying equation (1) by $\ell_{31} = 1$ and adding it to equation (3) leads to $$ \begin{pmatrix} 2 & 1 & -1 \\ 0 & 1/2 & 1/2 \\ 0 & 2 & 1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 8 \\ 1 \\ 5 \end{pmatrix}. $$
Stage 2: Let's eliminate variable $x_2$ from equation (3).
Multiplying equation (2) by $\ell_{32} = -4$ and adding it to equation (3) leads to $$ \begin{pmatrix} 2 & 1 & -1 \\ 0 & 1/2 & 1/2 \\ 0 & 0 & -1 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix} = \begin{pmatrix} 8 \\ 1 \\ 1 \end{pmatrix}. $$
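The two elimination stages above follow a generic pattern: at stage $k$, eliminate $x_k$ from rows $k+1, \ldots, n$. A minimal sketch (the function name `gaussian_elim` is ours; no pivoting, so it assumes all pivots are nonzero):

```julia
using LinearAlgebra

# Minimal Gaussian elimination without pivoting (assumes nonzero pivots).
# Returns a unit lower triangular L (holding the negated multipliers
# -ℓ_{ik}) and an upper triangular U such that L * U == A.
function gaussian_elim(A::Matrix{Float64})
    n = size(A, 1)
    U = copy(A)
    L = Matrix{Float64}(I, n, n)
    for k in 1:n-1                    # stage k eliminates x_k from rows k+1:n
        for i in k+1:n
            m = U[i, k] / U[k, k]     # the text adds ℓ_{ik} = -m times row k
            L[i, k] = m
            U[i, :] .-= m .* U[k, :]  # row i ← row i + ℓ_{ik} × row k
        end
    end
    return L, U
end

A = [2.0 1.0 -1.0; -3.0 -1.0 2.0; -2.0 1.0 2.0]
L, U = gaussian_elim(A)
```

Running this on the example reproduces the upper triangular system obtained above.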
Stage 3: Now we recover the solution by back-substitution (back-solve).
From equation (3), $-x_3 = 1$, so $x_3 = -1$.
Substituting $x_3 = -1$ into equation (2) and solving gives $x_2 = 3$.
Substituting $x_2 = 3$ and $x_3 = -1$ into equation (1) and solving gives $x_1 = 2$.
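The back-substitution steps above generalize to any upper triangular system: solve the last equation first, then work upward. A minimal sketch (the function name `backsolve` is ours):

```julia
using LinearAlgebra

# Back-substitution: solve U * x = y for an upper triangular U,
# working from the last equation up to the first.
function backsolve(U::Matrix{Float64}, y::Vector{Float64})
    n = length(y)
    x = zeros(n)
    for i in n:-1:1
        # subtract the already-known terms, then divide by the diagonal
        x[i] = (y[i] - dot(U[i, i+1:n], x[i+1:n])) / U[i, i]
    end
    return x
end

U = [2.0 1.0 -1.0; 0.0 0.5 0.5; 0.0 0.0 -1.0]
y = [8.0, 1.0, 1.0]
backsolve(U, y)   # ≈ [2.0, 3.0, -1.0]
```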
This is essentially how a computer solves a linear system:
using LinearAlgebra
A = [2.0 1.0 -1.0; -3.0 -1.0 2.0; -2.0 1.0 2.0]
b = [8.0, -11.0, -3.0]
# Julia way to solve linear equation
# equivalent to `solve(A, b)` in R
A \ b
Let's collect the negated multipliers $-\ell_{ij}$ into a unit lower triangular matrix $\mathbf{L}$
L = [1.0 0.0 0.0; -1.5 1.0 0.0; -1.0 4.0 1.0]
and save the resultant upper triangular matrix after GE into $\mathbf{U}$
U = [2.0 1.0 -1.0; 0.0 0.5 0.5; 0.0 0.0 -1.0]
Surprisingly, we find that $\mathbf{A} = \mathbf{L} \mathbf{U}$!
A ≈ L * U
Theorem: For any $\mathbf{A} \in \mathbb{R}^{n \times n}$ whose leading principal minors are all nonzero, there exist a unique unit lower triangular matrix $\mathbf{L}$ and a unique upper triangular matrix $\mathbf{U}$ such that $$ \mathbf{A} = \mathbf{L} \mathbf{U}. $$ (Nonsingularity alone is not enough: e.g., $\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$ is nonsingular but admits no such factorization, since its first pivot is zero.)
Given LU decomposition $\mathbf{A} = \mathbf{L} \mathbf{U}$, for a new right hand side $\mathbf{b}$, the solution to $\mathbf{A} \mathbf{x} = \mathbf{b}$ is readily given by two triangular solves: \begin{eqnarray*} \mathbf{L} \mathbf{y} &=& \mathbf{b} \\ \mathbf{U} \mathbf{x} &=& \mathbf{y}. \end{eqnarray*}
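In Julia, the two triangular solves can be carried out with the `LowerTriangular` and `UpperTriangular` wrappers from `LinearAlgebra`, which make `\` dispatch to specialized triangular solvers:

```julia
using LinearAlgebra

L = [1.0 0.0 0.0; -1.5 1.0 0.0; -1.0 4.0 1.0]
U = [2.0 1.0 -1.0; 0.0 0.5 0.5; 0.0 0.0 -1.0]
b = [8.0, -11.0, -3.0]

y = LowerTriangular(L) \ b   # forward solve: L y = b
x = UpperTriangular(U) \ y   # back solve:    U x = y
x   # ≈ [2.0, 3.0, -1.0]
```

Each triangular solve costs $O(n^2)$ flops, so once the $O(n^3)$ factorization is done, additional right-hand sides are cheap.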
In practice, the computer performs GE/LU on a row-permuted version of $\mathbf{A}$ (partial pivoting) for numerical stability. That is, $$ \mathbf{P} \mathbf{A} = \mathbf{L} \mathbf{U}, $$ where $\mathbf{P}$ is a permutation matrix. With partial pivoting, all multipliers $\ell_{ij}$ in $\mathbf{L}$ have magnitude $\le 1$.
A
Alu = lu(A)
Alu.p
A[Alu.p, :] ≈ Alu.L * Alu.U
Given LU decomposition $\mathbf{A} = \mathbf{L} \mathbf{U}$, $$ \text{det}(\mathbf{A}) = \text{det}(\mathbf{L}) \text{det}(\mathbf{U}) = \prod_{i=1}^n u_{ii}. $$
det(Alu)
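With pivoting, $\det(\mathbf{P}) \det(\mathbf{A}) = \det(\mathbf{L}) \det(\mathbf{U})$, so $\det(\mathbf{A})$ equals $\prod_{i=1}^n u_{ii}$ up to the sign of the permutation. A quick check (ignoring the sign):

```julia
using LinearAlgebra

A = [2.0 1.0 -1.0; -3.0 -1.0 2.0; -2.0 1.0 2.0]
F = lu(A)                             # P A = L U with partial pivoting
abs(det(A)) ≈ abs(prod(diag(F.U)))    # |det A| = |∏ u_ii|
```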