
1  Eigenvalues and Eigenvectors (BR Chapter 11)

  • Let $A \in \mathbb{R}^{n \times n}$ and $Ax = \lambda x$ with $x \neq 0$. Then $\lambda$ is an eigenvalue of $A$ with corresponding eigenvector $x$.

1.1  Compute eigenvalues (by hand)

  • From the eigen-equation $Ax = \lambda x$, we have $(A - \lambda I)x = 0$. That is, the matrix $A - \lambda I$ must be singular and $\det(A - \lambda I) = 0$.

  • The degree-$n$ polynomial $p_A(\lambda) = \det(\lambda I - A)$ is called the characteristic polynomial. The eigenvalues are the roots of $p_A(\lambda)$.

  • Example: For $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$, the characteristic polynomial is $p_A(\lambda) = \det \begin{pmatrix} \lambda - 2 & -1 \\ -1 & \lambda - 2 \end{pmatrix} = \lambda^2 - 4\lambda + 3 = (\lambda - 1)(\lambda - 3)$. Therefore $A$'s eigenvalues are $\lambda_1 = 1$ and $\lambda_2 = 3$. Solving the linear system $(\lambda_i I - A) x = 0$ now gives the corresponding eigenvectors $x_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$, $x_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$. We observe that (1) $\operatorname{tr}(A) = \lambda_1 + \lambda_2$, (2) $\det(A) = \lambda_1 \lambda_2$, and (3) the two eigenvectors are orthogonal to each other.

In [2]:
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]
Out[2]:
2×2 Array{Float64,2}:
 2.0  1.0
 1.0  2.0
In [2]:
eigen(A)
Out[2]:
Eigen{Float64,Float64,Array{Float64,2},Array{Float64,1}}
eigenvalues:
2-element Array{Float64,1}:
 1.0
 3.0
eigenvectors:
2×2 Array{Float64,2}:
 -0.707107  0.707107
  0.707107  0.707107
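As a numerical check of observations (1)-(3) above (a quick sketch reusing the matrix `A` already defined):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]
λ, X = eigen(A)                  # eigenvalues 1.0 and 3.0, eigenvectors in the columns of X

@assert tr(A) ≈ sum(λ)           # (1) trace = sum of eigenvalues
@assert det(A) ≈ prod(λ)         # (2) determinant = product of eigenvalues
@assert abs(dot(X[:, 1], X[:, 2])) < 1e-10   # (3) eigenvectors are orthogonal
```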
  • Example: For the rotation matrix $Q = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$, the same procedure gives the eigen-pairs $Q \begin{pmatrix} 1 \\ -i \end{pmatrix} = i \begin{pmatrix} 1 \\ -i \end{pmatrix}$ and $Q \begin{pmatrix} 1 \\ i \end{pmatrix} = (-i) \begin{pmatrix} 1 \\ i \end{pmatrix}$. The three properties (1)-(3) still hold, with orthogonality taken in the complex inner product.
In [5]:
Q = [0.0 -1.0; 1.0 0.0]
Out[5]:
2×2 Array{Float64,2}:
 0.0  -1.0
 1.0   0.0
In [6]:
eigen(Q)
Out[6]:
Eigen{Complex{Float64},Complex{Float64},Array{Complex{Float64},2},Array{Complex{Float64},1}}
eigenvalues:
2-element Array{Complex{Float64},1}:
 0.0 - 1.0im
 0.0 + 1.0im
eigenvectors:
2×2 Array{Complex{Float64},2}:
 0.707107-0.0im       0.707107+0.0im     
      0.0+0.707107im       0.0-0.707107im
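The complex eigen-pairs can be verified directly (a quick sketch; `x` and `y` are the eigenvectors claimed above, up to scaling):

```julia
using LinearAlgebra

Q = [0.0 -1.0; 1.0 0.0]
x = [1.0, -1.0im]                # eigenvector for eigenvalue  i
y = [1.0,  1.0im]                # eigenvector for eigenvalue -i

@assert Q * x ≈ im * x           # Qx = i x
@assert Q * y ≈ -im * y          # Qy = -i y
@assert abs(dot(x, y)) < 1e-10   # orthogonal in the complex inner product
```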

1.2  Similar matrices

  • If $Ax = \lambda x$, then $(BAB^{-1})(Bx) = BAx = \lambda(Bx)$. That is, $Bx$ is an eigenvector of the matrix $BAB^{-1}$ with eigenvalue $\lambda$.

    We say the matrix $BAB^{-1}$ is similar to $A$. Similar matrices share the same eigenvalues.
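A small Julia sketch of this fact, reusing the matrix `A` from Section 1.1 and an arbitrary (hypothetical) invertible `B`:

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]
B = [1.0 2.0; 0.0 1.0]               # any invertible matrix
S = B * A * inv(B)                   # similar to A

# similar matrices share eigenvalues
@assert sort(eigvals(S)) ≈ sort(eigvals(A))

# and Bx is an eigenvector of BAB⁻¹ when x is an eigenvector of A
λ, X = eigen(A)
x = X[:, 1]
@assert S * (B * x) ≈ λ[1] * (B * x)
```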

1.3  Diagonalizing a matrix

  • Collecting the $n$ eigen-equations $Ax_i = \lambda_i x_i$, $i = 1, \ldots, n$, into matrix multiplication format gives $AX = X\Lambda$, where $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$. If the $n$ eigenvectors are linearly independent (true for most matrices, but not all), then $X^{-1} A X = \Lambda$ (diagonalizing a matrix), or equivalently $A = X \Lambda X^{-1}$ (eigen-decomposition).
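The factors returned by `eigen` reproduce both identities; a quick sketch for the symmetric example from Section 1.1:

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]
λ, X = eigen(A)
Λ = Diagonal(λ)

@assert inv(X) * A * X ≈ Λ       # X⁻¹AX = Λ (diagonalization)
@assert X * Λ * inv(X) ≈ A       # A = XΛX⁻¹ (eigen-decomposition)
```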

1.4  Non-diagonalizable matrices

  • Geometric multiplicity (GM) of an eigenvalue $\lambda$: the number of independent eigenvectors for $\lambda$. Look at the null space of $A - \lambda I$.

  • Algebraic multiplicity (AM) of an eigenvalue $\lambda$: the number of times $\lambda$ is repeated as a root of the characteristic polynomial $\det(\lambda I - A)$.

  • Always GM $\leq$ AM.

  • The shortage of eigenvectors when GM < AM means that $A$ is not diagonalizable: there is no invertible matrix $X$ such that $A = X \Lambda X^{-1}$.

  • Classical example: $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$. AM = 2, GM = 1. The eigenvalue 0 is repeated twice, but there is only one eigenvector $(1, 0)^\top$.

  • More examples: all three matrices $\begin{pmatrix} 5 & 1 \\ 0 & 5 \end{pmatrix}$, $\begin{pmatrix} 6 & -1 \\ 1 & 4 \end{pmatrix}$, $\begin{pmatrix} 7 & 2 \\ -2 & 3 \end{pmatrix}$ have AM = 2 and GM = 1.

In [3]:
eigen([0 1; 0 0])
Out[3]:
Eigen{Float64,Float64,Array{Float64,2},Array{Float64,1}}
eigenvalues:
2-element Array{Float64,1}:
 0.0
 0.0
eigenvectors:
2×2 Array{Float64,2}:
 1.0  -1.0         
 0.0   2.00417e-292
In [4]:
eigen([5 1; 0 5])
Out[4]:
Eigen{Float64,Float64,Array{Float64,2},Array{Float64,1}}
eigenvalues:
2-element Array{Float64,1}:
 5.0
 5.0
eigenvectors:
2×2 Array{Float64,2}:
 1.0  -1.0        
 0.0   1.11022e-15
In [5]:
eigen([6 -1; 1 4])
Out[5]:
Eigen{Float64,Float64,Array{Float64,2},Array{Float64,1}}
eigenvalues:
2-element Array{Float64,1}:
 5.0
 5.0
eigenvectors:
2×2 Array{Float64,2}:
 0.707107  0.707107
 0.707107  0.707107
In [6]:
eigen([7 2; -2 3])
Out[6]:
Eigen{Complex{Float64},Complex{Float64},Array{Complex{Float64},2},Array{Complex{Float64},1}}
eigenvalues:
2-element Array{Complex{Float64},1}:
 5.0 - 4.2146848510894035e-8im
 5.0 + 4.2146848510894035e-8im
eigenvectors:
2×2 Array{Complex{Float64},2}:
  0.707107-0.0im          0.707107+0.0im       
 -0.707107-1.49012e-8im  -0.707107+1.49012e-8im
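The numerical eigenvectors above are unreliable precisely because these matrices are defective. A more robust way to see GM = 1 is to check the rank of $A - 5I$ (its nullity equals the GM); a quick sketch:

```julia
using LinearAlgebra

# each matrix has eigenvalue 5 with algebraic multiplicity 2,
# but the null space of A - 5I is one-dimensional, so GM = 1
for A in ([5.0 1.0; 0.0 5.0], [6.0 -1.0; 1.0 4.0], [7.0 2.0; -2.0 3.0])
    @assert rank(A - 5.0I) == 1      # nullity = 2 - 1 = 1 = GM < AM = 2
end
```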

1.5  Some properties

  • Multiplying both sides of the eigen-equation $Ax = \lambda x$ by $A$ gives $A^2 x = \lambda A x = \lambda^2 x$, showing that $\lambda^2$ is an eigenvalue of $A^2$ with eigenvector $x$.

    Similarly, $\lambda^k$ is an eigenvalue of $A^k$ with eigenvector $x$.

  • For a diagonalizable matrix $A = X \Lambda X^{-1}$, we have $A^k = X \Lambda^k X^{-1}$.

  • Shifting $A$ shifts all eigenvalues: $(A + sI)x = \lambda x + s x = (\lambda + s) x$.
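Both the power and the shift properties are easy to confirm numerically (a sketch with the running example `A`; the shift `s` is arbitrary):

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]           # eigenvalues 1 and 3
λ, X = eigen(A)

# λᵏ is an eigenvalue of Aᵏ, and Aᵏ = XΛᵏX⁻¹
@assert A^3 ≈ X * Diagonal(λ .^ 3) * inv(X)
@assert sort(eigvals(A^3)) ≈ sort(λ .^ 3)    # 1 and 27

# shifting by sI shifts every eigenvalue by s
s = 2.5
@assert sort(eigvals(A + s * I)) ≈ sort(λ .+ s)
```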

  • A is singular if and only if it has at least one 0 eigenvalue.

  • Eigenvectors associated with distinct eigenvalues are linearly independent.

    Proof: Let $Ax_1 = \lambda_1 x_1$, $Ax_2 = \lambda_2 x_2$, and $\lambda_1 \neq \lambda_2$. Suppose $x_1$ and $x_2$ are linearly dependent. Then there is $\alpha \neq 0$ such that $x_2 = \alpha x_1$. Hence $\alpha \lambda_1 x_1 = \alpha A x_1 = A x_2 = \lambda_2 x_2 = \alpha \lambda_2 x_1$, or $\alpha (\lambda_1 - \lambda_2) x_1 = 0$. Since $\alpha \neq 0$ and $\lambda_1 \neq \lambda_2$, this forces $x_1 = 0$, a contradiction.

  • The eigenvalues of a triangular matrix are its diagonal entries.

    Proof: $p_A(\lambda) = (\lambda - a_{11}) \cdots (\lambda - a_{nn})$.
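A quick numerical illustration with an arbitrary upper-triangular matrix:

```julia
using LinearAlgebra

U = [4.0 1.0 7.0;
     0.0 -2.0 3.0;
     0.0 0.0 9.0]                        # upper triangular
@assert sort(eigvals(U)) ≈ sort(diag(U)) # eigenvalues are the diagonal entries
```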

  • Eigenvalues of an idempotent matrix are either 0 or 1.

    Proof: $\lambda x = Ax = AAx = \lambda^2 x$. So $\lambda = \lambda^2$, i.e. $\lambda = 0$ or $\lambda = 1$.
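For instance, an orthogonal-projection matrix $P = X(X^\top X)^{-1}X^\top$ is idempotent; a sketch with an arbitrary full-column-rank `X`:

```julia
using LinearAlgebra

X = [1.0 0.0; 1.0 1.0; 1.0 2.0]
P = X * inv(X' * X) * X'         # projection onto the column space of X
@assert P * P ≈ P                # idempotent

# its eigenvalues are all (approximately) 0 or 1
@assert all(λ -> isapprox(λ, 0, atol=1e-10) || isapprox(λ, 1, atol=1e-10),
            eigvals(Symmetric(P)))
```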

  • Eigenvalues of an orthogonal matrix have complex modulus 1.

    Proof: Since $A^\top A = I$, $x^* x = x^* A^\top A x = \bar{\lambda} \lambda \, x^* x$. Since $x^* x \neq 0$, we have $\bar{\lambda} \lambda = |\lambda|^2 = 1$, i.e. $|\lambda| = 1$.
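A rotation matrix gives a quick check (a sketch; the angle `θ` is arbitrary):

```julia
using LinearAlgebra

θ = 0.7
Q = [cos(θ) -sin(θ); sin(θ) cos(θ)]      # a rotation, hence orthogonal
@assert Q' * Q ≈ Matrix(1.0I, 2, 2)      # QᵀQ = I
@assert all(abs.(eigvals(Q)) .≈ 1)       # complex eigenvalues cos θ ± i sin θ, modulus 1
```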

  • Let $A \in \mathbb{R}^{n \times n}$ (not required to be diagonalizable); then $\operatorname{tr}(A) = \sum_i \lambda_i$ and $\det(A) = \prod_i \lambda_i$. The general version can be proved by the Jordan canonical form, a generalization of the eigen-decomposition.

1.6  Spectral decomposition for symmetric matrices

  • For a symmetric matrix $A \in \mathbb{R}^{n \times n}$,

    1. all eigenvalues of A are real numbers, and
    2. the eigenvectors corresponding to distinct eigenvalues are orthogonal to each other.

      Proof of 1: Pre-multiplying the eigen-equation $Ax = \lambda x$ by $x^*$ (conjugate transpose) gives $x^* A x = \lambda x^* x$. Now both $x^* x = \bar{x}_1 x_1 + \cdots + \bar{x}_n x_n$ and $x^* A x = \sum_{i,j} a_{ij} \bar{x}_i x_j = \sum_i a_{ii} \bar{x}_i x_i + \sum_{i < j} a_{ij} (\bar{x}_i x_j + x_i \bar{x}_j)$ are real numbers. Therefore $\lambda$ is a real number.

      Proof of 2: Suppose $Ax_1 = \lambda_1 x_1$, $Ax_2 = \lambda_2 x_2$, and $\lambda_1 \neq \lambda_2$. Then $(A - \lambda_2 I) x_1 = (\lambda_1 - \lambda_2) x_1$ and $(A - \lambda_2 I) x_2 = 0$. Thus $x_1 \in \mathcal{C}(A - \lambda_2 I)$ and $x_2 \in \mathcal{N}(A - \lambda_2 I)$. Since $A - \lambda_2 I$ is symmetric, the fundamental theorem of linear algebra gives $x_1 \perp x_2$.

  • For an eigenvalue with multiplicity, we can choose its eigenvectors to be orthogonal to each other. We also normalize each eigenvector to have unit $\ell_2$ norm. Thus we obtain the extremely useful spectral decomposition $A = Q \Lambda Q^\top = \begin{pmatrix} q_1 & \cdots & q_n \end{pmatrix} \begin{pmatrix} \lambda_1 & & \\ & \ddots & \\ & & \lambda_n \end{pmatrix} \begin{pmatrix} q_1^\top \\ \vdots \\ q_n^\top \end{pmatrix} = \sum_{i=1}^n \lambda_i q_i q_i^\top$, where $Q$ is orthogonal (columns are eigenvectors) and $\Lambda = \operatorname{diag}(\lambda_1, \ldots, \lambda_n)$ (diagonal entries are eigenvalues).
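The spectral decomposition is exactly what `eigen` returns for a `Symmetric` input; a quick sketch with the matrix `A` from Section 1.1:

```julia
using LinearAlgebra

A = [2.0 1.0; 1.0 2.0]                   # symmetric
λ, Q = eigen(Symmetric(A))

@assert Q' * Q ≈ Matrix(1.0I, 2, 2)      # Q is orthogonal
@assert Q * Diagonal(λ) * Q' ≈ A         # A = QΛQᵀ

# equivalently, A is a sum of rank-one pieces λᵢ qᵢ qᵢᵀ
@assert sum(λ[i] * Q[:, i] * Q[:, i]' for i in 1:2) ≈ A
```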