Eigenvalues and Diagonalization
Eigenvalues identify directions that a linear transformation stretches without turning. If $A v = \lambda v$ for a nonzero vector $v$, the vector keeps its direction and the scalar $\lambda$ gives the stretch factor. This idea organizes vibration modes, stability of ODE systems, Markov chains, principal axes, and many numerical algorithms.
Diagonalization is powerful because it replaces a coupled linear map with independent scalar actions along eigenvector directions. When a matrix can be diagonalized, powers, exponentials, and recurrence formulas become much easier. When it cannot, generalized eigenvectors and Jordan or Schur forms explain what replaces the ideal picture.
Definitions
An eigenvalue of a square matrix $A$ is a scalar $\lambda$ such that
$$A v = \lambda v$$
for some nonzero vector $v$. The vector $v$ is an eigenvector. Equivalently,
$$(A - \lambda I) v = 0,$$
so $\lambda$ must satisfy
$$\det(A - \lambda I) = 0.$$
The polynomial
$$p(\lambda) = \det(A - \lambda I)$$
is the characteristic polynomial. The algebraic multiplicity of an eigenvalue is its multiplicity as a root of $p$. The geometric multiplicity is the dimension of its eigenspace.
A matrix $A$ is diagonalizable if there is an invertible matrix $P$ and a diagonal matrix $D$ such that
$$A = P D P^{-1}.$$
The columns of $P$ are eigenvectors, and the diagonal entries of $D$ are the corresponding eigenvalues.
Key results
An $n \times n$ matrix is diagonalizable if and only if it has $n$ linearly independent eigenvectors. Eigenvectors belonging to distinct eigenvalues are always linearly independent, so a matrix with $n$ distinct eigenvalues is diagonalizable. Repeated eigenvalues require checking geometric multiplicity.
If $A = P D P^{-1}$, then
$$A^{k} = P D^{k} P^{-1}$$
for nonnegative integers $k$. Also,
$$e^{A t} = P e^{D t} P^{-1},$$
where $e^{D t}$ is diagonal with entries $e^{\lambda_i t}$. This is why diagonalization is so useful for linear ODE systems.
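The identities above can be checked numerically. This is a minimal NumPy sketch, assuming a hypothetical diagonalizable $2 \times 2$ matrix chosen for illustration:

```python
import numpy as np

# Hypothetical diagonalizable example matrix (chosen for illustration)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

w, P = np.linalg.eig(A)              # eigenvalues in w, eigenvector columns in P
Pinv = np.linalg.inv(P)

# A^5 via P D^5 P^{-1}: only the diagonal entries are raised to the power
A5 = P @ np.diag(w**5) @ Pinv
assert np.allclose(A5, np.linalg.matrix_power(A, 5))

# e^{At} via P e^{Dt} P^{-1} at t = 0.1: only the diagonal is exponentiated
t = 0.1
expAt = P @ np.diag(np.exp(w * t)) @ Pinv
```

The point of the sketch is that the hard work happens on scalars: powers and exponentials act entry by entry on the diagonal, and the eigenvector basis translates the result back.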
Trace and determinant are linked to eigenvalues:
$$\operatorname{tr}(A) = \sum_{i=1}^{n} \lambda_i, \qquad \det(A) = \prod_{i=1}^{n} \lambda_i,$$
counting algebraic multiplicity over the complex numbers. For $2 \times 2$ systems, these two quantities help classify phase portraits quickly.
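These two identities are easy to verify numerically, even when some eigenvalues are complex. A short sketch, using a random matrix as a stand-in example:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))      # generic real matrix; eigenvalues may be complex
w = np.linalg.eigvals(A)

# Imaginary parts cancel in conjugate pairs, so the sum and product are real
assert np.isclose(np.trace(A), w.sum().real)
assert np.isclose(np.linalg.det(A), np.prod(w).real)
```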
Symmetric real matrices have especially good behavior. The spectral theorem says that a real symmetric matrix has real eigenvalues and an orthonormal basis of eigenvectors. Thus it can be orthogonally diagonalized:
$$A = Q \Lambda Q^{T}, \qquad Q^{T} Q = I.$$
This is more stable numerically than a general diagonalization because orthogonal matrices preserve lengths.
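A minimal sketch of orthogonal diagonalization, assuming a hypothetical symmetric example matrix and using `np.linalg.eigh`, which is specialized for symmetric/Hermitian input:

```python
import numpy as np

# Hypothetical real symmetric matrix (chosen for illustration)
S = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

w, Q = np.linalg.eigh(S)                     # real eigenvalues, orthonormal columns

assert np.allclose(Q.T @ Q, np.eye(3))       # Q is orthogonal: Q^T Q = I
assert np.allclose(Q @ np.diag(w) @ Q.T, S)  # S = Q Λ Q^T
```

Because $Q^{-1} = Q^{T}$, no matrix inversion is needed, which is one reason this factorization behaves so well in floating point.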
Defective matrices lack enough eigenvectors. A repeated eigenvalue with geometric multiplicity smaller than algebraic multiplicity prevents diagonalization. Such matrices can still be analyzed, but powers and exponentials include polynomial factors. In ODE systems this creates terms such as $t e^{\lambda t}$.
Eigenvalues are basis-independent but matrices are basis-dependent. Changing coordinates by an invertible matrix $S$ replaces $A$ with $S^{-1} A S$, a similar matrix. Similar matrices have the same characteristic polynomial, trace, determinant, and eigenvalues. Diagonalization is therefore a change to an eigenvector basis when such a basis exists.
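Similarity invariance can be spot-checked numerically. A sketch with a random matrix and a random (almost surely invertible) change of basis:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))      # random change of basis, almost surely invertible

B = np.linalg.inv(S) @ A @ S         # similar matrix S^{-1} A S

# Similar matrices share trace, determinant, and eigenvalues
assert np.isclose(np.trace(A), np.trace(B))
assert np.isclose(np.linalg.det(A), np.linalg.det(B))
wa = np.sort_complex(np.linalg.eigvals(A))
wb = np.sort_complex(np.linalg.eigvals(B))
assert np.allclose(wa, wb)
```

Sorting the (possibly complex) eigenvalues before comparing sidesteps the fact that the solver may return them in different orders for $A$ and $B$.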
Numerical eigenvalue problems are sensitive to conditioning and structure. Symmetric eigenvalue problems are comparatively stable and common in mechanics because stiffness and mass matrices often lead to symmetric forms. General nonsymmetric matrices can have ill-conditioned eigenvectors, so small perturbations may move eigenvalues or eigenvectors noticeably.
The geometric meaning of an eigenvector is easiest to see in two dimensions. Most vectors change both length and direction when multiplied by $A$. Eigenvectors are the exceptional directions that stay on their own lines. If two independent eigenvector directions exist in the plane, every vector can be decomposed into those directions, and the matrix acts by independently scaling the two components. Diagonalization is exactly this decomposition written in matrix form.
Eigenvalues of powers and inverses follow simple rules when the operations exist. If $A v = \lambda v$, then $A^{k} v = \lambda^{k} v$. If $A$ is invertible, then $\lambda \neq 0$ and $A^{-1} v = \lambda^{-1} v$. These facts are useful for discrete dynamical systems, where repeated multiplication by a matrix models time steps. The eigenvalue with largest magnitude usually dominates long-term behavior.
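Both rules can be verified on a concrete eigenpair. A sketch, assuming a hypothetical invertible example matrix:

```python
import numpy as np

# Hypothetical invertible example matrix
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
w, P = np.linalg.eig(A)
v = P[:, 0]                          # eigenvector paired with eigenvalue w[0]

# A^3 v = λ^3 v and A^{-1} v = λ^{-1} v
assert np.allclose(np.linalg.matrix_power(A, 3) @ v, w[0]**3 * v)
assert np.allclose(np.linalg.inv(A) @ v, v / w[0])
```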
For differential equations, the sign of the real part matters more than the magnitude alone. In $\dot{x} = A x$, a real eigenvalue $\lambda < 0$ gives decay, while $\lambda > 0$ gives growth. A complex pair $\alpha \pm i \beta$ gives rotation with exponential envelope $e^{\alpha t}$. Thus eigenvalues are a bridge between linear algebra and phase-plane classification.
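This classification reduces to inspecting real parts, which is easy to automate. A rough sketch (the function name and category strings are illustrative, not standard API):

```python
import numpy as np

def classify_origin(A):
    """Rough classification of the origin for x' = A x from eigenvalue real parts."""
    re = np.linalg.eigvals(A).real
    if np.all(re < 0):
        return "asymptotically stable (all modes decay)"
    if np.any(re > 0):
        return "unstable (some mode grows)"
    return "marginal (no decay, no growth at linear order)"

# Damped rotation: eigenvalues -0.5 ± 2i, so every trajectory spirals inward
A = np.array([[-0.5, -2.0],
              [ 2.0, -0.5]])
assert classify_origin(A) == "asymptotically stable (all modes decay)"
```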
Diagonalization is not always numerically wise even when it is algebraically possible. If eigenvectors are nearly dependent, the eigenvector matrix $P$ is ill-conditioned. Then forming $P D P^{-1}$ can amplify roundoff. Algorithms for powers, exponentials, and eigenvalue computations often use Schur decompositions or specialized symmetric methods instead of explicitly forming a poorly conditioned eigenbasis.
The spectral theorem has strong consequences for quadratic forms. If $A$ is real symmetric, then after an orthogonal change of variables $x = Q y$,
$$x^{T} A x = \lambda_1 y_1^{2} + \cdots + \lambda_n y_n^{2}.$$
The signs of the eigenvalues determine whether the quadratic form is positive definite, negative definite, indefinite, or semidefinite. This matters in optimization, energy methods, and stability analysis.
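The sign test can be written directly as code. A sketch (the function name and the tolerance are illustrative choices, not a standard API):

```python
import numpy as np

def definiteness(S, tol=1e-12):
    """Classify a real symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(S)            # real eigenvalues of a symmetric matrix
    if np.all(w > tol):
        return "positive definite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

assert definiteness(np.array([[2.0, 0.0], [0.0, 3.0]])) == "positive definite"
assert definiteness(np.array([[1.0, 0.0], [0.0, -1.0]])) == "indefinite"
```

The tolerance guards against eigenvalues that are exactly zero in theory but come out as tiny nonzero numbers in floating point.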
In vibration problems, eigenvectors are mode shapes and eigenvalues are related to natural frequencies. Orthogonality of modes means a complicated displacement can be decomposed into independent modal coordinates. In Markov chains, the eigenvalue $\lambda = 1$ corresponds to a steady state under suitable hypotheses, and the other eigenvalues determine the rate of convergence toward equilibrium. The same algebra supports very different interpretations.
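The Markov interpretation can be sketched in a few lines. This assumes a hypothetical two-state chain written as a column-stochastic matrix (columns sum to 1), so the stationary distribution is the eigenvector for eigenvalue 1:

```python
import numpy as np

# Hypothetical column-stochastic transition matrix for a 2-state chain
T = np.array([[0.9, 0.3],
              [0.1, 0.7]])

w, V = np.linalg.eig(T)
i = np.argmax(w.real)                # index of the eigenvalue 1
pi = V[:, i].real
pi = pi / pi.sum()                   # rescale the eigenvector into a probability vector

assert np.isclose(w[i].real, 1.0)
assert np.allclose(T @ pi, pi)       # pi is unchanged by one step of the chain
```

The second-largest eigenvalue magnitude (here 0.6) controls how fast an arbitrary starting distribution relaxes toward `pi`.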
Repeated eigenvalues require a calm check. A repeated root of the characteristic polynomial is not automatically bad. The identity matrix has one repeated eigenvalue but every nonzero vector is an eigenvector, so it is diagonalizable. A Jordan block has the same repeated eigenvalue but too few eigenvectors. The difference is geometric multiplicity.
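The geometric multiplicity check can be automated: the eigenspace dimension is $n$ minus the rank of $A - \lambda I$. A sketch contrasting the identity with a Jordan block (the helper function is illustrative, not a library routine):

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """Eigenspace dimension: n minus the rank of A - λI."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

I2 = np.eye(2)                       # repeated eigenvalue 1, full eigenspace
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])           # Jordan block: repeated eigenvalue 1, thin eigenspace

assert geometric_multiplicity(I2, 1.0) == 2   # diagonalizable
assert geometric_multiplicity(J, 1.0) == 1    # defective
```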
Visual
| Matrix type | Eigenvalue behavior | Diagonalization result |
|---|---|---|
| Distinct eigenvalues | Independent eigenvectors | Diagonalizable |
| Real symmetric | Real eigenvalues, orthogonal eigenvectors | Orthogonally diagonalizable |
| Repeated eigenvalue | Must check eigenspace dimension | May be defective |
| Triangular | Eigenvalues on diagonal | Diagonalizable only if enough eigenvectors |
Worked example 1: Diagonalizing a two-by-two matrix
Problem. Diagonalize
$$A = \begin{pmatrix} 4 & 1 \\ 2 & 3 \end{pmatrix}.$$
Method.
- Compute the characteristic polynomial: $\det(A - \lambda I) = (4 - \lambda)(3 - \lambda) - 2$.
- Expand: $\lambda^{2} - 7\lambda + 10$.
- Factor: $(\lambda - 5)(\lambda - 2) = 0$, so $\lambda_1 = 5$ and $\lambda_2 = 2$.
- For $\lambda_1 = 5$,
$$(A - 5I) v = \begin{pmatrix} -1 & 1 \\ 2 & -2 \end{pmatrix} v = 0,$$
so $v_2 = v_1$, and an eigenvector is
$$v^{(1)} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}.$$
- For $\lambda_2 = 2$,
$$(A - 2I) v = \begin{pmatrix} 2 & 1 \\ 2 & 1 \end{pmatrix} v = 0,$$
so $v_2 = -2 v_1$, and an eigenvector is
$$v^{(2)} = \begin{pmatrix} 1 \\ -2 \end{pmatrix}.$$
- Form
$$P = \begin{pmatrix} 1 & 1 \\ 1 & -2 \end{pmatrix}, \qquad D = \begin{pmatrix} 5 & 0 \\ 0 & 2 \end{pmatrix}.$$
Answer.
$$A = P D P^{-1} = \begin{pmatrix} 1 & 1 \\ 1 & -2 \end{pmatrix} \begin{pmatrix} 5 & 0 \\ 0 & 2 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ 1 & -2 \end{pmatrix}^{-1}.$$
Check. Multiplying $A P$ gives columns $5 (1, 1)^{T}$ and $2 (1, -2)^{T}$, which equals $P D$.
The diagonalization also makes powers transparent. For example, $A^{k} = P D^{k} P^{-1}$, and $D^{k}$ has diagonal entries $5^{k}$ and $2^{k}$. The mode associated with $\lambda = 5$ dominates repeated multiplication unless its initial coefficient is zero. This is the same modal dominance idea used in linear systems of ODEs.
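Modal dominance is exactly why power iteration works: repeated multiplication plus normalization converges to the dominant eigenvector. A sketch, assuming a hypothetical $2 \times 2$ matrix with eigenvalues 5 and 2:

```python
import numpy as np

# Hypothetical matrix with eigenvalues 5 and 2
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

x = np.array([1.0, 0.0])             # any start with a component along the λ = 5 mode
for _ in range(50):
    x = A @ x
    x = x / np.linalg.norm(x)        # normalize each step to avoid overflow

rayleigh = x @ A @ x                 # Rayleigh quotient estimates the dominant eigenvalue
assert np.isclose(rayleigh, 5.0)
```

The error shrinks like $(2/5)^{k}$, the ratio of the subdominant to the dominant eigenvalue magnitude.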
Worked example 2: Detecting a defective matrix
Problem. Determine whether
$$A = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$$
is diagonalizable.
Method.
- Since $A$ is triangular, its eigenvalues are the diagonal entries: $\lambda = 2$
with algebraic multiplicity $2$.
- Compute the eigenspace:
$$A - 2I = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}.$$
- Solve
$$(A - 2I) v = 0.$$
This gives $v_2 = 0$
while $v_1$ is free.
- The eigenspace is
$$\operatorname{span}\left\{ \begin{pmatrix} 1 \\ 0 \end{pmatrix} \right\}.$$
- Its dimension is $1$, not $2$.
Answer. The matrix is not diagonalizable.
Check. A $2 \times 2$ matrix needs two independent eigenvectors for diagonalization. This matrix has only one.
This matrix is the simplest Jordan block. Its powers contain terms that grow like $k \lambda^{k-1}$ as well as $\lambda^{k}$, and its exponential contains a term proportional to $t e^{\lambda t}$. Those polynomial factors are the practical consequence of missing eigenvectors.
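The polynomial growth is visible directly in the entries of the powers. A sketch, using a Jordan block with eigenvalue 2 as the illustration:

```python
import numpy as np

# Jordan block with eigenvalue 2 (illustrative example)
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

k = 6
Jk = np.linalg.matrix_power(J, k)

# J^k = [[2^k, k*2^(k-1)], [0, 2^k]]: the superdiagonal carries the polynomial factor k
assert np.isclose(Jk[0, 0], 2.0**k)
assert np.isclose(Jk[0, 1], k * 2.0**(k - 1))
```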
Code
```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
w, P = np.linalg.eig(A)
D = np.diag(w)
print(w)
print(np.linalg.norm(A @ P - P @ D))
print(np.linalg.cond(P))
```
The residual `A @ P - P @ D` checks the computed eigenpairs. The condition number of `P` gives a warning about how sensitive the eigenvector basis may be. For symmetric matrices, `np.linalg.eigh` is usually preferred because it uses algorithms specialized for Hermitian structure.
Common pitfalls
- Treating algebraic multiplicity as if it automatically equals the number of eigenvectors.
- Forgetting that eigenvectors cannot be the zero vector.
- Mixing the order of columns in with the order of eigenvalues in .
- Assuming every real matrix has real eigenvalues.
- Using diagonalization for a defective matrix without checking eigenspace dimensions.
- Computing powers of a matrix by repeated multiplication when a reliable diagonalization or decomposition is available.
- Ignoring eigenvector conditioning in numerical diagonalization.
- Forgetting that symmetric matrices should use orthogonal diagonalization.
- Assuming that similar matrices look numerically similar. A poorly conditioned change of basis can make equivalent algebra unstable in floating-point arithmetic.
- Sorting eigenvalues without sorting the corresponding eigenvectors in the same order.
- Forgetting that complex eigenvectors of a real matrix usually combine into real sine-cosine behavior in ODE applications.
- Reporting approximate eigenvalues without checking residuals $\lVert A v - \lambda v \rVert$.
- Using determinant expansion by hand for large matrices when numerical algorithms are more appropriate.
- Ignoring units in modal interpretations.