Eigenanalysis

8.1 Eigenvalues, eigenvectors, and eigenspaces

Define eigenvalues through the equation Av=λv, then recast the same idea as a null-space and determinant question so the structure becomes computable.


MATH1030: Linear algebra I

Rigorous linear algebra notes on systems, matrices, structure, and proof, with interaction used only where it clarifies the mathematics.


Determinants tell you whether a square matrix is invertible. Eigenvalues ask a different question: which vectors keep their direction when the matrix acts on them?

Most vectors are bent into new directions by matrix multiplication. An eigenvector is exceptional. It is only stretched, shrunk, or reversed by a scalar factor. That is why eigenvalues reveal the internal geometry of a matrix rather than only its solvability properties.

Why this section matters

When you solve Ax=b, you focus on the whole system. When you study eigenvalues, you focus on the special vectors v for which the matrix action collapses to the simpler rule

Av=\lambda v.

If you can recognize those vectors, then later you can simplify powers of matrices, understand diagonalization, and describe invariant directions.

Definition

Eigenvalue and eigenvector

Let A be an n\times n square matrix. Let \lambda be a scalar, and let v be a nonzero column vector in \mathbb{R}^n.

We say that v is an eigenvector of A with eigenvalue \lambda if

Av=\lambda v.

The vector v must be nonzero. The zero vector satisfies A0=\lambda 0 for every scalar \lambda, so it would destroy the meaning of the definition if we allowed it.

The point is that the matrix action and scalar multiplication agree on that vector. The direction of v is preserved, although the magnitude may change and the sign may flip.
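The defining equation is easy to test numerically. Here is a minimal sketch in plain Python (the helper names `matvec` and `is_eigenpair` are ours, not part of the notes):

```python
def matvec(A, v):
    # Multiply a matrix (list of rows) by a column vector (list).
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def is_eigenpair(A, v, lam, tol=1e-12):
    # v must be nonzero, and Av must equal lam * v entry by entry.
    if all(abs(x) < tol for x in v):
        return False
    Av = matvec(A, v)
    return all(abs(a - lam * x) <= tol for a, x in zip(Av, v))

# For the diagonal matrix [[2, 0], [0, 3]], the standard basis vectors
# are eigenvectors; a generic vector is bent into a new direction.
assert is_eigenpair([[2, 0], [0, 3]], [1, 0], 2)
assert is_eigenpair([[2, 0], [0, 3]], [0, 1], 3)
assert not is_eigenpair([[2, 0], [0, 3]], [1, 1], 2)  # direction changes
assert not is_eigenpair([[2, 0], [0, 3]], [0, 0], 2)  # zero vector excluded
```

Note how the last assertion enforces the nonzero requirement from the definition.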

Immediate consequences of the definition

Theorem

A fixed eigenvector has only one eigenvalue

Let A be a square matrix, and let v be a nonzero vector. If

Av=\lambda v \qquad\text{and}\qquad Av=\mu v,

then \lambda=\mu.

The reason is simple. Subtract the two equations:

(\lambda-\mu)v=0.

Since v\neq0, the scalar factor \lambda-\mu must be 0, so \lambda=\mu.

Theorem

Nonzero scalar multiples stay in the same eigen-direction

If v is an eigenvector of A with eigenvalue \lambda, then every nonzero scalar multiple cv is also an eigenvector of A with eigenvalue \lambda.

That is why an eigenvector never comes alone. It naturally represents a whole line through the origin.

Theorem

Linear combinations with a common eigenvalue

Suppose u_1,\dots,u_k are eigenvectors of A, all with the same eigenvalue \lambda. Then every nonzero linear combination

\alpha_1u_1+\cdots+\alpha_ku_k

is also an eigenvector of A with eigenvalue \lambda.

This is the first hint that eigenvectors belonging to the same eigenvalue form a subspace once the zero vector is added back in.

First examples

Worked example

A 2×2 matrix with two distinct eigenvalues

Let

A=
\begin{bmatrix}
13&30\\
-6&-14
\end{bmatrix}.

Check the vector

u_1=
\begin{bmatrix}
5\\
-2
\end{bmatrix}.

Then

Au_1=
\begin{bmatrix}
13&30\\
-6&-14
\end{bmatrix}
\begin{bmatrix}
5\\
-2
\end{bmatrix}
=
\begin{bmatrix}
5\\
-2
\end{bmatrix}
=1\cdot u_1.

So u_1 is an eigenvector with eigenvalue 1.

Now try

u_2=
\begin{bmatrix}
2\\
-1
\end{bmatrix}.

Then

Au_2=
\begin{bmatrix}
-4\\
2
\end{bmatrix}
=-2
\begin{bmatrix}
2\\
-1
\end{bmatrix}
=-2u_2.

So u_2 is an eigenvector with eigenvalue -2.
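Both checks can be reproduced in a few lines of plain Python (a sketch; `matvec` is our own small helper, not part of the notes):

```python
def matvec(A, v):
    # Row-by-row matrix-vector product.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[13, 30], [-6, -14]]
u1 = [5, -2]
u2 = [2, -1]

assert matvec(A, u1) == [1 * x for x in u1]    # eigenvalue 1
assert matvec(A, u2) == [-2 * x for x in u2]   # eigenvalue -2
```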

Worked example

One eigenvalue can have more than one direction

Let

B=
\begin{bmatrix}
2&1&1\\
1&2&1\\
1&1&2
\end{bmatrix}.

The vector

u_1=
\begin{bmatrix}
1\\1\\1
\end{bmatrix}

satisfies Bu_1=4u_1, so 4 is one eigenvalue.

Now consider

u_2=
\begin{bmatrix}
1\\-1\\0
\end{bmatrix},
\qquad
u_3=
\begin{bmatrix}
1\\0\\-1
\end{bmatrix}.

Both satisfy

Bu_2=u_2, \qquad Bu_3=u_3.

So the same eigenvalue 1 has at least two linearly independent eigenvectors. That does not contradict the uniqueness theorem above. The theorem says one fixed nonzero vector cannot correspond to two different eigenvalues. It does not say one eigenvalue can have only one eigenvector direction.
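Both claims, and the linear-combination theorem from earlier, can be verified in plain Python (a sketch; `matvec` is our own helper):

```python
def matvec(A, v):
    # Row-by-row matrix-vector product.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

B = [[2, 1, 1], [1, 2, 1], [1, 1, 2]]
u2 = [1, -1, 0]
u3 = [1, 0, -1]

assert matvec(B, u2) == u2                 # eigenvalue 1
assert matvec(B, u3) == u3                 # eigenvalue 1
assert matvec(B, [1, 1, 1]) == [4, 4, 4]   # eigenvalue 4

# By the linear-combination theorem, u2 + u3 is again an eigenvector for 1.
w = [x + y for x, y in zip(u2, u3)]
assert matvec(B, w) == w
```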

Eigenvalues are a null-space question

The equation Av=\lambda v becomes much easier to analyze when you move all terms to one side:

Av-\lambda v=0.

Since \lambda v=\lambda I_nv, this is equivalent to

(A-\lambda I_n)v=0.

Theorem

Eigenvalue criterion via a homogeneous system

Let A be an n\times n matrix, \lambda a scalar, and v\neq0.

Then the following are equivalent:

  1. v is an eigenvector of A with eigenvalue \lambda.
  2. v is a nontrivial solution of the homogeneous system
(A-\lambda I_n)x=0.

This recasts the eigenvalue problem as an ordinary linear-system problem. The vector v must live in the null space of A-\lambda I.

Definition

Eigenspace

Let A be an n\times n matrix, and let \lambda be an eigenvalue of A. The eigenspace of A corresponding to \lambda is

E_A(\lambda)=N(A-\lambda I_n).

So the eigenspace contains all eigenvectors for \lambda together with the zero vector.

Because it is a null space, E_A(\lambda) is automatically a subspace.
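In null-space terms, a vector v belongs to E_A(\lambda) exactly when (A-\lambda I)v=0. A quick plain-Python check, reusing the matrix B from the example above (`matvec` and `shift` are our own helper names):

```python
def matvec(A, v):
    # Row-by-row matrix-vector product.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def shift(A, lam):
    # Form A - lam * I entry by entry.
    n = len(A)
    return [[A[i][j] - (lam if i == j else 0) for j in range(n)] for i in range(n)]

B = [[2, 1, 1], [1, 2, 1], [1, 1, 2]]

# [1, -1, 0] lies in E_B(1): (B - 1*I) v = 0.
assert matvec(shift(B, 1), [1, -1, 0]) == [0, 0, 0]
# [1, 1, 1] does not; it belongs to E_B(4) instead.
assert matvec(shift(B, 1), [1, 1, 1]) != [0, 0, 0]
assert matvec(shift(B, 4), [1, 1, 1]) == [0, 0, 0]
```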

Equivalent formulations for an eigenvalue

Once the null-space formulation is in place, the invertibility dictionary immediately gives several equivalent tests.

Theorem

Equivalent tests for λ to be an eigenvalue

Let A be an n\times n matrix and \lambda a scalar. The following statements are equivalent:

  1. \lambda is an eigenvalue of A.
  2. (A-\lambda I_n)x=0 has a nontrivial solution.
  3. N(A-\lambda I_n)\neq\{0\}.
  4. A-\lambda I_n is not invertible.
  5. \det(A-\lambda I_n)=0.

This theorem is the bridge from eigenvalues back to determinants. It is also the theorem that later produces the characteristic polynomial.

Worked example

Find eigenvalues and eigenspaces by row reduction

Let

C=
\begin{bmatrix}
3&2\\
3&-2
\end{bmatrix}.

To find the eigenvalues, solve

\det(C-\lambda I)=0.

We get

\det
\begin{bmatrix}
3-\lambda&2\\
3&-2-\lambda
\end{bmatrix}
=(3-\lambda)(-2-\lambda)-6
=\lambda^2-\lambda-12.

So the eigenvalues are the roots:

\lambda=4, \qquad \lambda=-3.

For \lambda=4,

C-4I=
\begin{bmatrix}
-1&2\\
3&-6
\end{bmatrix}
\sim
\begin{bmatrix}
1&-2\\
0&0
\end{bmatrix}.

Thus x_1=2x_2, so one basis vector is

\begin{bmatrix}
2\\1
\end{bmatrix}.

For \lambda=-3,

C+3I=
\begin{bmatrix}
6&2\\
3&1
\end{bmatrix}
\sim
\begin{bmatrix}
3&1\\
0&0
\end{bmatrix}.

Thus 3x_1+x_2=0, so one basis vector is

\begin{bmatrix}
1\\-3
\end{bmatrix}.

Therefore

E_C(4)=\operatorname{span}\left\{
\begin{bmatrix}
2\\1
\end{bmatrix}
\right\},
\qquad
E_C(-3)=\operatorname{span}\left\{
\begin{bmatrix}
1\\-3
\end{bmatrix}
\right\}.
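Every step of this example can be double-checked numerically. The sketch below (plain Python; `det2`, `shift2`, and `matvec` are our own helper names) confirms that \det(C-\lambda I) vanishes exactly at 4 and -3 and that the basis vectors found by row reduction really are eigenvectors:

```python
def det2(M):
    # Determinant of a 2x2 matrix.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def shift2(M, lam):
    # M - lam * I for a 2x2 matrix.
    return [[M[0][0] - lam, M[0][1]], [M[1][0], M[1][1] - lam]]

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

C = [[3, 2], [3, -2]]

# det(C - lam*I) is zero at the eigenvalues and nonzero elsewhere.
assert det2(shift2(C, 4)) == 0
assert det2(shift2(C, -3)) == 0
assert det2(shift2(C, 1)) != 0   # 1 is not an eigenvalue of C

# The basis vectors of the two eigenspaces behave as claimed.
assert matvec(C, [2, 1]) == [4 * x for x in [2, 1]]
assert matvec(C, [1, -3]) == [-3 * x for x in [1, -3]]
```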

Important properties

Theorem

Zero as an eigenvalue detects noninvertibility

For a square matrix A, the following are equivalent:

  1. 0 is an eigenvalue of A.
  2. A is not invertible.

Equivalently, A is invertible if and only if 0 is not an eigenvalue.

This is just the previous equivalence theorem with \lambda=0.
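A concrete instance, checked in plain Python (the matrix S and the helper `matvec` are ours): a matrix with proportional rows is not invertible, and any nonzero null-space vector is an eigenvector for the eigenvalue 0.

```python
def matvec(A, v):
    # Row-by-row matrix-vector product.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

S = [[1, 2], [2, 4]]   # second row is twice the first, so S is not invertible

# det(S) = 1*4 - 2*2 = 0, and the nonzero vector [2, -1] is sent to zero:
assert S[0][0] * S[1][1] - S[0][1] * S[1][0] == 0
assert matvec(S, [2, -1]) == [0, 0]   # S v = 0 = 0 * v, so 0 is an eigenvalue
```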

Theorem

Eigenvalues under simple matrix operations

If \lambda is an eigenvalue of A, then:

  1. k\lambda is an eigenvalue of kA;
  2. \lambda^m is an eigenvalue of A^m for every nonnegative integer m;
  3. \lambda is an eigenvalue of A^T;
  4. if A is invertible, then \lambda^{-1} is an eigenvalue of A^{-1}.

These statements are not mysterious. They all come from applying the relevant matrix operation to the defining equation Av=\lambda v.
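Using the matrix A and the eigenvector u_2 from the first worked example, properties 1 and 2 can be checked directly (plain Python sketch; `matvec` is our helper):

```python
def matvec(A, v):
    # Row-by-row matrix-vector product.
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[13, 30], [-6, -14]]
u = [2, -1]   # eigenvector of A with eigenvalue -2

# Property 2: applying A twice multiplies u by (-2)^2 = 4.
assert matvec(A, matvec(A, u)) == [4 * x for x in u]

# Property 1: 3*A has eigenvalue 3*(-2) = -6 on the same vector.
kA = [[3 * a for a in row] for row in A]
assert matvec(kA, u) == [-6 * x for x in u]
```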

Common mistake


The zero vector is never an eigenvector

Students often notice that A0=\lambda 0 for every scalar and conclude that the zero vector is an eigenvector for every eigenvalue. That is exactly why the definition excludes 0. An eigenvector must point in a genuine direction, and the zero vector has no direction to preserve.

Quick check


If v is an eigenvector of A with eigenvalue \lambda, is 3v also an eigenvector of A?

Assume v\neq0.


Quick check

Why does 0 being an eigenvalue force A to be noninvertible?

Use the system (A-0I)x=0.


Quick check

What is the eigenspace E_A(\lambda) in null-space language?

Use the definition introduced above.


Exercises

Quick check

Show that \begin{bmatrix}1\\1\end{bmatrix} is an eigenvector of \begin{bmatrix}2&1\\1&2\end{bmatrix} and find its eigenvalue.

Multiply first, then compare the output with the original vector.


Quick check

Find the eigenspace of A=\begin{bmatrix}1&0\\0&4\end{bmatrix} for the eigenvalue 4.

Solve (A-4I)x=0.


Quick check

If A is invertible, can 0 appear as an eigenvalue of A^T?

Use the transpose property and the invertibility criterion together.


Keep 7.2 Row operations, products, and invertibility nearby, because the test \det(A-\lambda I)=0 depends directly on determinant and invertibility reasoning.

Continue with 8.2 Diagonalization and similarity to see how a full basis of eigenvectors changes the whole shape of a matrix.

The subspace viewpoint here also leans on 6.5 Basis and dimension.
