It is useful when we want to know the rough location of eigenvalues of a matrix.
Gershgorin’s Theorem says that all the eigenvalues of an n×n matrix A lie within the union of n disks. The k-th of these n disks is centered at a_kk and its radius is ∑_{j≠k} |a_kj|, where a_ij is the (i,j) element of A.
To see why, suppose that x is an eigenvector of A with eigenvalue λ, normalized so that ∥x∥∞ = 1. Then there exists an index k with |x_k| = 1. Comparing the k-th components of Ax = λx gives λx_k = ∑_j a_kj x_j, so (λ − a_kk)x_k = ∑_{j≠k} a_kj x_j; taking absolute values and using |x_j| ≤ 1 yields |λ − a_kk| ≤ ∑_{j≠k} |a_kj|, which places λ in the k-th disk.
A1 and A2 have the same diagonal elements and the same sum of absolute values of the off-diagonal elements in each row, so their Gershgorin disks are identical. In the figure above, the blue circles are the eigenvalues of A1 and the red circles are those of A2.
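As a sketch, the disks are easy to compute with NumPy. The 3×3 matrix below is an arbitrary example (A1 and A2 appear only in the figure, so their entries are not available here):

```python
import numpy as np

def gershgorin_disks(A):
    """Centers and radii of the Gershgorin disks of a square matrix A."""
    A = np.asarray(A, dtype=float)
    centers = np.diag(A)                              # a_kk
    radii = np.abs(A).sum(axis=1) - np.abs(centers)   # sum over j != k of |a_kj|
    return centers, radii

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
centers, radii = gershgorin_disks(A)

# Every eigenvalue of A must lie in the union of the disks.
for lam in np.linalg.eigvals(A):
    assert np.any(np.abs(lam - centers) <= radii + 1e-12)
```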
It is about the sensitivity of the eigenvectors and eigenvalues to small changes in an n×n matrix A.
Let X = (x(1), ⋯, x(n)) be the n×n matrix whose i-th column x(i) is an eigenvector of A, and let D = diag(λ1, ⋯, λn) be the n×n diagonal matrix where λi is the eigenvalue corresponding to x(i). Then AX = XD.
When A is nondefective, meaning all its eigenvectors are linearly independent, X is invertible and X⁻¹AX = D. Now suppose A is perturbed into A + E, where E denotes the error. Then X⁻¹(A+E)X = X⁻¹AX + X⁻¹EX = D + F, where F = X⁻¹EX. Since A + E and D + F are similar, they have the same eigenvalues. Let μ be an eigenvalue of A + E, and let v be an eigenvector of D + F corresponding to μ. Then

$$(D+F)v = \mu v \implies (\mu I - D)v = Fv \implies v = (\mu I - D)^{-1} F v$$

For the last step, (μI − D) must be nonsingular, which holds exactly when no diagonal element of D equals μ. If μ equals some λi, the bound below holds trivially since the distance from μ to the nearest eigenvalue of D is zero; so assume μ ≠ λi for all i. Taking norms,

$$\lVert v \rVert_2 \le \lVert (\mu I - D)^{-1} \rVert_2 \, \lVert F \rVert_2 \, \lVert v \rVert_2 \implies \lVert (\mu I - D)^{-1} \rVert_2^{-1} \le \lVert F \rVert_2$$

By definition, ∥(μI − D)⁻¹∥₂ is the largest singular value of (μI − D)⁻¹, and since μI − D is diagonal this equals 1/|μ − λk|, where λk is the eigenvalue of D closest to μ. Therefore

$$|\mu - \lambda_k| = \lVert (\mu I - D)^{-1} \rVert_2^{-1} \le \lVert F \rVert_2 = \lVert X^{-1} E X \rVert_2 \le \lVert X^{-1} \rVert_2 \lVert E \rVert_2 \lVert X \rVert_2 = \mathrm{cond}(X) \, \lVert E \rVert_2$$
So, the effect of a small change in A on its eigenvalues depends on the condition number of X, not on the condition number of A.
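This bound (the Bauer–Fike theorem for diagonalizable matrices) can be checked numerically. A minimal NumPy sketch, where the matrix A, the random seed, and the perturbation size are arbitrary choices:

```python
import numpy as np

# A nonsymmetric but diagonalizable matrix (arbitrary example).
A = np.array([[3.0, 10.0],
              [0.0,  1.0]])
lam, X = np.linalg.eig(A)         # columns of X are unit-norm right eigenvectors
cond_X = np.linalg.cond(X, 2)     # ||X||_2 * ||X^{-1}||_2

rng = np.random.default_rng(0)
E = 1e-6 * rng.standard_normal((2, 2))   # small perturbation
bound = cond_X * np.linalg.norm(E, 2)

# Every eigenvalue mu of A + E lies within cond(X)*||E||_2 of some eigenvalue of A.
for mu in np.linalg.eigvals(A + E):
    assert np.min(np.abs(mu - lam)) <= bound
```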
Another way to check the sensitivity of eigenvalues and eigenvectors, which works even when A is defective, is to use right and left eigenvectors together. Let x and y be right and left eigenvectors, so that Ax = λx and yᵗA = μyᵗ for some λ and μ. It follows that yᵗAx = λyᵗx = μyᵗx, so λ = μ or yᵗx = 0. Assume that yᵗx ≠ 0, so λ = μ. If A is changed by an error E, and x and λ change accordingly, then

$$(A+E)(x+\Delta x) = (\lambda + \Delta\lambda)(x + \Delta x) \implies A\Delta x + Ex \approx \lambda \Delta x + \Delta\lambda\, x$$

because the second-order terms EΔx and ΔλΔx are small enough to be negligible. Multiplying by yᵗ on the left and using yᵗA = λyᵗ,

$$y^t E x + y^t A \Delta x \approx \Delta\lambda\, y^t x + \lambda y^t \Delta x \implies y^t E x \approx \Delta\lambda\, y^t x \implies \Delta\lambda \approx \frac{y^t E x}{y^t x}$$

$$|\Delta\lambda| \lessapprox \frac{\lVert y \rVert_2 \lVert x \rVert_2}{|y^t x|}\, \lVert E \rVert_2 = \frac{1}{\cos\theta}\, \lVert E \rVert_2$$

where θ is the angle between x and y. So the eigenvalue becomes more sensitive as θ grows toward 90°.
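To estimate 1/cos θ numerically, note that a left eigenvector of A is an eigenvector of Aᵗ (for real A, yᵗA = λyᵗ is the same as Aᵗy = λy). A sketch using a hypothetical upper-triangular example whose large off-diagonal entry makes the eigenvalues ill conditioned:

```python
import numpy as np

# Hypothetical example: eigenvalues 1 and 2 are well separated, but the
# large off-diagonal entry makes them sensitive to perturbations.
A = np.array([[1.0, 100.0],
              [0.0,   2.0]])

lam, X = np.linalg.eig(A)      # right eigenvectors: A x = lam x
lam_l, Y = np.linalg.eig(A.T)  # left eigenvectors:  y^t A = lam y^t  <=>  A^t y = lam y

for i, lam_i in enumerate(lam):
    x = X[:, i]
    y = Y[:, np.argmin(np.abs(lam_l - lam_i))]  # left eigenvector for the same eigenvalue
    sens = np.linalg.norm(y) * np.linalg.norm(x) / abs(y @ x)  # 1 / cos(theta)
    print(f"lambda = {lam_i.real:.0f}, 1/cos(theta) = {sens:.1f}")
```

Here both eigenvalues turn out to have a sensitivity factor of about 100, so an error of size ε in A can move each eigenvalue by roughly 100ε.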
$$A = \begin{pmatrix} -149 & -50 & -154 \\ 537 & 180 & 546 \\ -27 & -9 & -25 \end{pmatrix}, \qquad \text{eigenvalues: } \lambda_1 = 1,\ \lambda_2 = 2,\ \lambda_3 = 3$$

normalized right eigenvectors:
$$X = (x^{(1)}\ x^{(2)}\ x^{(3)}) = \begin{pmatrix} 0.316 & 0.404 & 0.139 \\ -0.949 & -0.909 & -0.974 \\ 0.000 & -0.101 & 0.179 \end{pmatrix}$$

normalized left eigenvectors:
$$Y = (y^{(1)}\ y^{(2)}\ y^{(3)}) = \begin{pmatrix} 0.681 & -0.676 & -0.688 \\ 0.225 & -0.225 & -0.229 \\ 0.697 & -0.701 & -0.688 \end{pmatrix}$$
First, the condition number of X is 1289, so the eigenvalues of A are sensitive. Second, y(1)ᵗx(1) = 0.0017, y(2)ᵗx(2) = 0.0025, and y(3)ᵗx(3) = 0.0046, so the angles between x(i) and y(i) are all close to 90°. Therefore, the eigenvalues of A are expected to be sensitive to small changes. The following eigenvalues result from tiny changes to A:

$$A + E = \begin{pmatrix} -149 & -50 & -154 \\ 537 & 180.01 & 546 \\ -27 & -9 & -25 \end{pmatrix} \implies \begin{cases} \lambda_1 = 0.207 \\ \lambda_2 = 2.301 \\ \lambda_3 = 3.502 \end{cases}$$

$$A + E = \begin{pmatrix} -149 & -50 & -154 \\ 537 & 179.99 & 546 \\ -27 & -9 & -25 \end{pmatrix} \implies \begin{cases} \lambda_1 = 1.664 + 1.054i \\ \lambda_2 = 1.664 - 1.054i \\ \lambda_3 = 2.662 \end{cases}$$
This shows that small changes in A can produce large changes in its eigenvalues.
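The example can be reproduced with NumPy; the matrix and the perturbed entry are from the text, while the assertion tolerances are arbitrary:

```python
import numpy as np

# The example matrix from the text, with exact eigenvalues 1, 2, 3.
A = np.array([[-149.0,  -50.0, -154.0],
              [ 537.0,  180.0,  546.0],
              [ -27.0,   -9.0,  -25.0]])

lam, X = np.linalg.eig(A)
print(np.sort(lam.real))       # approximately [1, 2, 3]
print(np.linalg.cond(X, 2))    # large (the text reports 1289 for unit-norm eigenvectors)

# Perturb a single entry by 0.01, as in the text, and recompute.
Ap = A.copy()
Ap[1, 1] += 0.01               # 180 -> 180.01
print(np.sort(np.linalg.eigvals(Ap).real))  # eigenvalues move far more than 0.01
```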
Suppose that x is an eigenvector of an n×n matrix A and λ is the eigenvalue corresponding to x.
A has a zero eigenvalue
⟺ there exists x ≠ 0 such that Ax = 0
⟺ A is singular
For α ≠ 0 ∈ ℝ, αx is also an eigenvector of A.
x is the eigenvector of αA for α∈R, and its eigenvalue is αλ.
x is the eigenvector of A+αI for α∈R, and its eigenvalue is λ+α.
For k∈N, x is the eigenvector of Ak, and its eigenvalue is λk.
If A is invertible, x is the eigenvector of A⁻¹, and its eigenvalue is 1/λ.
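These facts are easy to verify numerically; a quick NumPy sketch on an arbitrary invertible 2×2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # arbitrary invertible example; eigenvalues 3 and 1
lam, X = np.linalg.eig(A)
x, l = X[:, 0], lam[0]       # one eigenpair of A

alpha, k = 5.0, 3
assert np.allclose((alpha * A) @ x, (alpha * l) * x)               # alpha*A:   eigenvalue alpha*lambda
assert np.allclose((A + alpha * np.eye(2)) @ x, (l + alpha) * x)   # A+alpha*I: eigenvalue lambda+alpha
assert np.allclose(np.linalg.matrix_power(A, k) @ x, l**k * x)     # A^k:       eigenvalue lambda^k
assert np.allclose(np.linalg.inv(A) @ x, (1.0 / l) * x)            # A^{-1}:    eigenvalue 1/lambda
```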
For a diagonal matrix D=diag(a1,⋯,an), its eigenvalues are a1,⋯,an and its eigenvectors are e1,⋯,en where all components of a standard basis vector ei∈Rn are 0 except the i-th element which is 1.
For upper or lower triangular matrices, the eigenvalues are their diagonal elements.
For an n×n nonsingular matrix S, S⁻¹x is an eigenvector of S⁻¹AS and the eigenvalue corresponding to S⁻¹x is λ. That is, eigenvalues are unchanged by similarity transformations.
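A small sketch of this property; the matrices A and S are arbitrary examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
S = np.array([[1.0, 2.0],
              [0.0, 1.0]])   # any nonsingular S

B = np.linalg.inv(S) @ A @ S   # similar to A
assert np.allclose(np.sort(np.linalg.eigvals(B)), np.sort(np.linalg.eigvals(A)))

# Eigenvectors transform accordingly: if A x = lam x, then B (S^{-1} x) = lam (S^{-1} x).
lam, X = np.linalg.eig(A)
x = X[:, 0]
v = np.linalg.solve(S, x)      # S^{-1} x
assert np.allclose(B @ v, lam[0] * v)
```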
For k ≤ n, if the eigenvalues λ1, ⋯, λk are distinct, the eigenvectors x(1), ⋯, x(k) corresponding to them are linearly independent.
If all the eigenvalues are distinct, all the eigenvectors are linearly independent, so X = (x(1), ⋯, x(n)) is nonsingular and A is diagonalizable as X⁻¹AX = diag(λ1, ⋯, λn). However, the converse does not hold: a diagonalizable A need not have distinct eigenvalues, as with A = I. For example, diag(1, 1, 2) has the repeated eigenvalue 1 but is already diagonal, hence diagonalizable.
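A sketch of both cases, using arbitrary example matrices:

```python
import numpy as np

# Distinct eigenvalues: the eigenvector matrix X is nonsingular and diagonalizes A.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])     # eigenvalues 5 and 2 (distinct)
lam, X = np.linalg.eig(A)
D = np.linalg.inv(X) @ A @ X
assert np.allclose(D, np.diag(lam))

# Distinctness is sufficient but not necessary: diag(1, 1, 2) has a repeated
# eigenvalue yet still has a full set of independent eigenvectors.
B = np.diag([1.0, 1.0, 2.0])
lam_B, X_B = np.linalg.eig(B)
assert np.linalg.matrix_rank(X_B) == 3   # 3 independent eigenvectors despite repetition
```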