Determinants tell you whether a square matrix is invertible. Eigenvalues ask a different question: which vectors keep their direction when the matrix acts on them?
Most vectors are bent into new directions by matrix multiplication. An eigenvector is exceptional. It is only stretched, shrunk, or reversed by a scalar factor. That is why eigenvalues reveal the internal geometry of a matrix rather than only its solvability properties.
Why this section matters
When you solve $Ax = b$, you focus on the whole system. When you study
eigenvalues, you focus on the special vectors $v$ for which the matrix action
collapses to the simpler rule $Av = \lambda v$.
If you can recognize those vectors, then later you can simplify powers of matrices, understand diagonalization, and describe invariant directions.
Definition
Eigenvalue and eigenvector
Let $A$ be an $n \times n$ square matrix. Let $\lambda$ be a scalar, and let
$v$ be a nonzero column vector in $\mathbb{R}^n$.
We say that $v$ is an eigenvector of $A$ with eigenvalue $\lambda$ if
$$Av = \lambda v.$$
The vector $v$ must be nonzero. The zero vector satisfies $A\mathbf{0} = \lambda\mathbf{0}$ for
every scalar, so it would destroy the meaning of the definition if we allowed
it.
The point is that the matrix action and scalar multiplication agree on that
vector. The direction of $v$ is preserved, although the magnitude may change
and the sign may flip.
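As a minimal numerical sketch of the definition (the matrix and vector below are chosen purely for illustration), NumPy can confirm that $Av$ is a scalar multiple of $v$:

```python
import numpy as np

# Illustrative matrix and candidate eigenvector (chosen for this sketch).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = np.array([1.0, 1.0])

Av = A @ v
lam = Av[0] / v[0]               # read the candidate eigenvalue off the first entry
print(lam)                        # 3.0
print(np.allclose(Av, lam * v))   # True: v is an eigenvector with eigenvalue 3
```

Most vectors fail this test: for instance, $v = (1, 0)$ is sent to $(2, 1)$, which points in a new direction.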
Immediate consequences of the definition
Theorem
A fixed eigenvector has only one eigenvalue
Let $A$ be a square matrix, and let $v$ be a nonzero vector. If
$$Av = \lambda_1 v \quad\text{and}\quad Av = \lambda_2 v,$$
then $\lambda_1 = \lambda_2$.
The reason is simple. Subtract the two equations:
$$(\lambda_1 - \lambda_2)v = 0.$$
Since $v \neq 0$, the scalar factor must be 0, so $\lambda_1 = \lambda_2$.
Theorem
Nonzero scalar multiples stay in the same eigen-direction
If $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, then every nonzero
scalar multiple $cv$ is also an eigenvector of $A$ with eigenvalue $\lambda$, since
$A(cv) = cAv = c\lambda v = \lambda(cv)$.
That is why an eigenvector never comes alone. It naturally represents a whole line through the origin.
Theorem
Linear combinations with a common eigenvalue
Suppose $v_1, \dots, v_k$ are eigenvectors of $A$, all with the same eigenvalue $\lambda$. Then every nonzero linear combination
$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k$$
is also an eigenvector of $A$ with eigenvalue $\lambda$.
This is the first hint that eigenvectors belonging to the same eigenvalue form a subspace once the zero vector is added back in.
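A quick numerical check of this theorem (using an illustrative projection matrix, not one from the text, whose standard basis vectors $e_1$ and $e_2$ share the eigenvalue 1):

```python
import numpy as np

# Projection onto the xy-plane: e1 and e2 are both eigenvectors for lambda = 1.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0]])
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
w = 3 * v1 + 7 * v2               # a nonzero linear combination

print(np.allclose(A @ w, 1.0 * w))  # True: w is again an eigenvector for lambda = 1
```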
First examples
Worked example
A 2×2 matrix with two distinct eigenvalues
Let
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}.$$
Check the vector
$$v_1 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$
Then
$$Av_1 = \begin{pmatrix} 2 - 1 \\ 1 - 2 \end{pmatrix} = \begin{pmatrix} 1 \\ -1 \end{pmatrix} = 1 \cdot v_1.$$
So $v_1$ is an eigenvector with eigenvalue 1.
Now try
$$v_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
Then
$$Av_2 = \begin{pmatrix} 2 + 1 \\ 1 + 2 \end{pmatrix} = \begin{pmatrix} 3 \\ 3 \end{pmatrix} = 3 v_2.$$
So $v_2$ is an eigenvector with eigenvalue 3.
Worked example
One eigenvalue can have more than one direction
Let
$$A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}.$$
The vector
$$u = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$$
satisfies $Au = 4u$, so 4 is one eigenvalue.
Now consider
$$w_1 = \begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}, \qquad w_2 = \begin{pmatrix} 1 \\ 0 \\ -1 \end{pmatrix}.$$
Both satisfy
$$Aw_1 = w_1, \qquad Aw_2 = w_2.$$
So the same eigenvalue 1 has at least two linearly independent eigenvectors.
That does not contradict the uniqueness theorem above. The theorem says one
fixed nonzero vector cannot correspond to two different eigenvalues. It does not
say one eigenvalue can have only one eigenvector direction.
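This multiplicity is visible numerically. Assuming, for illustration, the symmetric matrix with 2s on the diagonal and 1s elsewhere (which has eigenvalue 4 once and eigenvalue 1 twice), `np.linalg.eig` reports the repeated eigenvalue, and the rank of $A - 1\cdot I$ shows its eigenspace is two-dimensional:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 2.0]])
vals, vecs = np.linalg.eig(A)
print(np.sort(vals))                          # eigenvalue 1 appears twice, 4 once
print(np.linalg.matrix_rank(A - np.eye(3)))   # rank 1, so the nullity is 2:
                                              # the eigenspace for lambda = 1 is a plane
```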
Eigenvalues are a null-space question
The equation $Av = \lambda v$ becomes much easier to analyze when you move all terms to one side:
$$Av - \lambda v = 0.$$
Since $\lambda v = \lambda I v$, this is equivalent to
$$(A - \lambda I)v = 0.$$
Theorem
Eigenvalue criterion via a homogeneous system
Let $A$ be an $n \times n$ matrix, $\lambda$ a scalar, and $v \neq 0$.
Then the following are equivalent:
- $v$ is an eigenvector of $A$ with eigenvalue $\lambda$.
- $v$ is a nontrivial solution of the homogeneous system $(A - \lambda I)v = 0$.
This recasts the eigenvalue problem as an ordinary linear-system problem. The
vector $v$ must live in the null space of $A - \lambda I$.
Definition
Eigenspace
Let $A$ be an $n \times n$ matrix, and let $\lambda$ be an eigenvalue of $A$. The eigenspace of $A$ corresponding to $\lambda$ is
$$E_\lambda = \{\, v \in \mathbb{R}^n : Av = \lambda v \,\} = \operatorname{Null}(A - \lambda I).$$
So the eigenspace contains all eigenvectors for $\lambda$ together with the zero vector.
Because it is a null space, $E_\lambda$ is automatically a subspace.
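Because the eigenspace is a null space, a numerical basis for it can be extracted with standard null-space tools. A sketch using the SVD (the helper `null_space_basis` and the matrix are illustrative, not from the text; `scipy.linalg.null_space` does the same job):

```python
import numpy as np

def null_space_basis(M, tol=1e-10):
    """Columns of the returned array span the null space of M (computed via SVD)."""
    _, s, vt = np.linalg.svd(M)
    # Rows of vt paired with (numerically) zero singular values span Null(M).
    rank = int(np.sum(s > tol))
    return vt[rank:].T

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam = 3.0
basis = null_space_basis(A - lam * np.eye(2))
print(basis.shape[1])                        # 1: the eigenspace for lambda = 3 is a line
print(np.allclose(A @ basis, lam * basis))   # True: each basis column is an eigenvector
```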
Equivalent formulations for an eigenvalue
Once the null-space formulation is in place, the invertibility dictionary immediately gives several equivalent tests.
Theorem
Equivalent tests for λ to be an eigenvalue
Let $A$ be an $n \times n$ matrix and $\lambda$ a scalar. The following statements are equivalent:
- $\lambda$ is an eigenvalue of $A$.
- $(A - \lambda I)v = 0$ has a nontrivial solution.
- $\operatorname{Null}(A - \lambda I) \neq \{0\}$.
- $A - \lambda I$ is not invertible.
- $\det(A - \lambda I) = 0$.
This theorem is the bridge from eigenvalues back to determinants. It is also the theorem that later produces the characteristic polynomial.
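The determinant test is easy to probe numerically: sweep candidate values of $\lambda$ and record where $\det(A - \lambda I)$ vanishes. A sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
eigen_candidates = []
for lam in [-2.0, -1.0, 0.0, 1.0, 2.0, 3.0]:
    d = np.linalg.det(A - lam * np.eye(2))
    if abs(d) < 1e-9:                 # det(A - lam I) = 0  <=>  lam is an eigenvalue
        eigen_candidates.append(lam)
    print(f"det(A - {lam:+.0f} I) = {d:+.4f}")
print(eigen_candidates)               # [-1.0, 3.0]
```

In practice one solves $\det(A - \lambda I) = 0$ exactly rather than sweeping, but the sweep makes the equivalence visible.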
Worked example
Find eigenvalues and eigenspaces by row reduction
Let
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}.$$
To find the eigenvalues, solve
$$\det(A - \lambda I) = 0.$$
We get
$$\det\begin{pmatrix} 1-\lambda & 2 \\ 2 & 1-\lambda \end{pmatrix} = (1-\lambda)^2 - 4 = \lambda^2 - 2\lambda - 3 = (\lambda - 3)(\lambda + 1).$$
So the eigenvalues are the roots:
$$\lambda_1 = 3, \qquad \lambda_2 = -1.$$
For $\lambda_1 = 3$,
$$A - 3I = \begin{pmatrix} -2 & 2 \\ 2 & -2 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix}.$$
Thus $x_1 = x_2$, so one basis vector is
$$\begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
For $\lambda_2 = -1$,
$$A + I = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix} \longrightarrow \begin{pmatrix} 1 & 1 \\ 0 & 0 \end{pmatrix}.$$
Thus $x_1 = -x_2$, so one basis vector is
$$\begin{pmatrix} 1 \\ -1 \end{pmatrix}.$$
Therefore
$$E_3 = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 1 \end{pmatrix} \right\}, \qquad E_{-1} = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right\}.$$
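For a $2 \times 2$ matrix the characteristic polynomial is $\lambda^2 - (\operatorname{tr} A)\lambda + \det A$, so the eigenvalues can be cross-checked as its roots. A sketch with an illustrative symmetric matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]   # lambda^2 - (tr A) lambda + det A
roots = np.sort(np.roots(coeffs))
print(roots)                                     # [-1.  3.]
```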
Important properties
Theorem
Zero as an eigenvalue detects noninvertibility
For a square matrix $A$, the following are equivalent:
- 0 is an eigenvalue of $A$.
- $A$ is not invertible.
Equivalently, $A$ is invertible if and only if 0 is not an eigenvalue.
This is just the previous equivalence theorem with $\lambda = 0$, since $\det(A - 0I) = \det A$.
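A numerical illustration of this equivalence, using an obviously singular matrix (chosen for this sketch):

```python
import numpy as np

# A singular illustrative matrix: the second row is twice the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(A))               # 0 (up to rounding): A is not invertible
vals = np.linalg.eigvals(A)
print(np.min(np.abs(vals)))           # 0 (up to rounding): 0 is an eigenvalue
```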
Theorem
Eigenvalues under simple matrix operations
If $\lambda$ is an eigenvalue of $A$, then:
- $k\lambda$ is an eigenvalue of $kA$ for every scalar $k$;
- $\lambda^m$ is an eigenvalue of $A^m$ for every nonnegative integer $m$;
- $\lambda$ is an eigenvalue of $A^T$;
- if $A$ is invertible, then $\lambda^{-1}$ is an eigenvalue of $A^{-1}$.
These statements are not mysterious. They all come from applying the relevant matrix operation to the defining equation $Av = \lambda v$.
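All four properties can be checked numerically at once. The triangular matrix below is an assumption for this sketch; its diagonal entries 2 and 3 are its eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])                    # triangular: eigenvalues 2 and 3
eig = lambda M: np.sort(np.linalg.eigvals(M))

print(eig(5 * A))                    # [10. 15.] : k*lambda for kA
print(eig(A @ A @ A))                # [ 8. 27.] : lambda^m for A^m
print(eig(A.T))                      # [ 2.  3.] : same lambda for the transpose
print(eig(np.linalg.inv(A)))         # 1/lambda for the inverse
```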
Common mistake
The zero vector is never an eigenvector
Students often notice that $A\mathbf{0} = \lambda\mathbf{0}$ for every scalar $\lambda$ and conclude that the
zero vector is an eigenvector for every eigenvalue. That is exactly why the
definition excludes $\mathbf{0}$. An eigenvector must point in a genuine direction, and
the zero vector has no direction to preserve.
Quick check
If $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, is $3v$ also an eigenvector of $A$?
Assume $v \neq 0$.
Answer
Yes: $A(3v) = 3(Av) = 3\lambda v = \lambda(3v)$, and $3v \neq 0$, so $3v$ is an eigenvector of $A$ with the same eigenvalue $\lambda$.
Quick check
Why does 0 being an eigenvalue force $A$ to be noninvertible?
Use the system $Av = 0$.
Answer
If 0 is an eigenvalue, some nonzero $v$ satisfies $Av = 0 \cdot v = 0$. Then the homogeneous system $Ax = 0$ has a nontrivial solution, so $A$ cannot be invertible.
Quick check
What is the eigenspace $E_\lambda$ in null-space language?
Use the definition introduced above.
Answer
$E_\lambda = \operatorname{Null}(A - \lambda I)$: the set of all solutions $v$ of $(A - \lambda I)v = 0$.
Exercises
Quick check
Show that $v = \begin{pmatrix} 2 \\ 1 \end{pmatrix}$ is an eigenvector of $A = \begin{pmatrix} 4 & -2 \\ 1 & 1 \end{pmatrix}$ and find its eigenvalue.
Multiply first, then compare the output with the original vector.
Guided solution
$Av = \begin{pmatrix} 4 \cdot 2 - 2 \cdot 1 \\ 2 + 1 \end{pmatrix} = \begin{pmatrix} 6 \\ 3 \end{pmatrix} = 3v$, so $v$ is an eigenvector with eigenvalue 3.
Quick check
Find the eigenspace of $A = \begin{pmatrix} 2 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 2 \end{pmatrix}$ for the eigenvalue 4.
Solve $(A - 4I)v = 0$.
Guided solution
Row reducing $A - 4I = \begin{pmatrix} -2 & 1 & 1 \\ 1 & -2 & 1 \\ 1 & 1 & -2 \end{pmatrix}$ forces $x_1 = x_2 = x_3$, so
$$E_4 = \operatorname{span}\left\{ \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \right\}.$$
Quick check
If $A$ is invertible, can 0 appear as an eigenvalue of $A^T$?
Use the transpose property and the invertibility criterion together.
Guided solution
No. $A$ and $A^T$ have the same eigenvalues. Since $A$ is invertible, 0 is not an eigenvalue of $A$, so 0 is not an eigenvalue of $A^T$ either.
Related notes
Keep 7.2 Row operations, products, and invertibility nearby, because the test $\det(A - \lambda I) = 0$ depends directly on determinant and invertibility reasoning.
Continue with 8.2 Diagonalization and similarity to see how a full basis of eigenvectors changes the whole shape of a matrix.
The subspace viewpoint here also leans on 6.5 Basis and dimension.