Linear Algebra 53 | Eigenvalues and Eigenvectors
Based on a video by The Bright Side of Mathematics on YouTube. If you like this content, support the original creator by watching, liking, and subscribing.
Briefing
Eigenvalues and eigenvectors identify the special directions that a linear transformation preserves, up to scaling, when a matrix acts on space. For a square matrix A representing a linear map f_A, an eigenvector x is any nonzero vector that, after applying the map, lands on the same line as before: f_A(x)=Ax=λx. The scalar λ is the eigenvalue, and it matters because it describes how that direction changes: λ>0 keeps the direction, λ<0 flips it, and λ=0 collapses the vector to the origin. This turns a geometric question about how a transformation reshapes space into an algebraic one about solving a matrix equation.
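The three cases for λ can be checked numerically. A minimal NumPy sketch, using a diagonal matrix chosen here purely for illustration (it is not from the lecture; diagonal matrices are convenient because the standard basis vectors are their eigenvectors):

```python
import numpy as np

# Illustrative diagonal matrix (chosen here, not from the lecture):
# each standard basis vector is an eigenvector.
A = np.diag([2.0, -1.0, 0.0])

e1, e2, e3 = np.eye(3)

assert np.allclose(A @ e1,  2.0 * e1)  # λ = 2:  direction kept, stretched
assert np.allclose(A @ e2, -1.0 * e2)  # λ = -1: direction flipped
assert np.allclose(A @ e3,  0.0 * e3)  # λ = 0:  collapsed to the origin
```

Each product Ax stays on the line spanned by x; only the scale (and possibly the sign) changes.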
The core algebraic move is rewriting Ax=λx into (A−λI)x=0, where I is the identity matrix. That equation means x lies in the kernel (null space) of the matrix (A−λI). Since eigenvectors must be nonzero, the focus becomes finding values of λ for which (A−λI) has a nontrivial kernel. Once such an eigenvector exists, there are infinitely many eigenvectors for the same λ because any nonzero vector in that kernel can be scaled to produce another valid eigenvector. Geometrically, this is why eigenvectors correspond to “characteristic directions” of the transformation.
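The kernel reformulation and the scaling argument can both be verified directly. A small sketch, with a diagonal matrix and eigenvalue picked here as an assumption for illustration:

```python
import numpy as np

A   = np.array([[2.0, 0.0],
                [0.0, 3.0]])   # illustrative matrix, not from the lecture
lam = 2.0

M = A - lam * np.eye(2)        # the matrix (A - λI)
x = np.array([1.0, 0.0])       # a nonzero vector in ker(A - λI)

assert np.allclose(M @ x, 0.0)           # x solves (A - λI)x = 0, i.e. Ax = λx

# Any nonzero scalar multiple of x is again an eigenvector for the same λ.
for c in (0.5, -3.0, 10.0):
    assert np.allclose(A @ (c * x), lam * (c * x))
```

This is the "infinitely many eigenvectors" point in miniature: the kernel is closed under scaling, so one nonzero solution generates a whole line of them.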
A worked 2×2 example illustrates the mechanics. Starting with a simple matrix (with entries chosen as 1s and 0s), the eigenvalue equation Ax=λx produces two component equations. The system is not linear in the unknowns because λ multiplies the vector components, so standard linear-system methods don’t apply directly. By analyzing the second equation first, one sees it is satisfied when λ=1 or when the relevant component x2=0. Checking each case against the first equation eliminates the options that lead only to the zero vector, leaving λ=1 as the only valid eigenvalue. Substituting λ=1 back into the equations forces x2=0 while leaving x1 free, yielding eigenvectors of the form (x1,0) with x1≠0. The example shows how eigenvalues can be determined by consistency conditions that prevent the solution from collapsing to the zero vector.
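The lecture's exact matrix is not reproduced in these notes; the shear matrix below (entries 1 and 0) is an assumption that matches the described outcome: λ=1 is the only eigenvalue, and its eigenvectors are (x1, 0) with x1 ≠ 0.

```python
import numpy as np

# Assumed matrix matching the description in the notes:
# Ax = (x1 + x2, x2), so the second equation reads x2 = λ·x2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigenvalues = np.linalg.eigvals(A)
assert np.allclose(eigenvalues, [1.0, 1.0])   # λ = 1 is the only eigenvalue

# Substituting λ = 1 forces x2 = 0; any (x1, 0) with x1 ≠ 0 works.
x = np.array([5.0, 0.0])
assert np.allclose(A @ x, 1.0 * x)
```

Note that a vector with x2 ≠ 0, say (0, 1), is sheared to (1, 1) and leaves its original line, so it is not an eigenvector.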
The formal definitions tie everything together. A scalar λ is an eigenvalue of A if there exists a nonzero vector x such that Ax=λx; that vector x is then an eigenvector. All eigenvectors associated with a fixed eigenvalue λ form an eigenspace, defined as the kernel of (A−λI). Unlike individual eigenvectors, the eigenspace contains the zero vector; it is included by convention so that the eigenspace is an ordinary subspace. Finally, the collection of all eigenvalues of A is called the spectrum of A. These definitions set up the systematic computation methods that come next.
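Both definitions can be computed symbolically. A sketch using SymPy, again with the shear-type matrix assumed above for illustration:

```python
from sympy import Matrix, eye

A = Matrix([[1, 1],
            [0, 1]])           # assumed illustrative matrix, not from the lecture

# Spectrum: the set of all eigenvalues of A.
spectrum = set(A.eigenvals().keys())   # eigenvals() maps eigenvalue -> multiplicity
assert spectrum == {1}

# Eigenspace for λ = 1: ker(A - 1·I), a subspace that contains 0.
basis = (A - 1 * eye(2)).nullspace()
assert len(basis) == 1
assert basis[0] == Matrix([1, 0])      # spanned by (1, 0)
```

Working symbolically avoids the floating-point issues of numerical kernels and returns an explicit basis for each eigenspace.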
Cornell Notes
Eigenvalues and eigenvectors describe the directions a linear transformation preserves under a matrix A. A nonzero vector x is an eigenvector if Ax=λx for some scalar λ, meaning the output stays on the same line as x and is only scaled (possibly flipped if λ<0, or collapsed if λ=0). Rearranging gives (A−λI)x=0, so eigenvectors correspond to nontrivial vectors in the kernel of (A−λI). For a fixed λ, all eigenvectors form the eigenspace ker(A−λI), which is a subspace that includes the zero vector. The set of all eigenvalues of A is the spectrum of A, a central concept for understanding how A acts.
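The kernel criterion in the summary gives a direct numerical test for whether a candidate λ is an eigenvalue. A sketch, where the helper name and the example matrix are assumptions made here:

```python
import numpy as np

def is_eigenvalue(A, lam, tol=1e-9):
    """λ is an eigenvalue iff ker(A - λI) is nontrivial,
    i.e. (A - λI) is rank-deficient (equivalently, det(A - λI) = 0)."""
    n = A.shape[0]
    M = A - lam * np.eye(n)
    return np.linalg.matrix_rank(M, tol=tol) < n

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])     # illustrative matrix
assert is_eigenvalue(A, 1.0)       # ker(A - I) is spanned by (1, 0)
assert not is_eigenvalue(A, 2.0)   # (A - 2I) is invertible, kernel is {0}
```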
Why does Ax=λx capture “preserved direction” under a linear map?
How does the equation Ax=λx turn into a kernel problem?
Why must eigenvectors exclude the zero vector, while eigenspaces include it?
What does it mean that one eigenvector implies infinitely many eigenvectors for the same eigenvalue?
In the 2×2 example, why does the analysis lead to λ=1 as the only eigenvalue?
Review Questions
- Given a matrix A and scalar λ, how do you test whether λ is an eigenvalue using (A−λI)?
- Explain the relationship between eigenvectors, the kernel ker(A−λI), and the eigenspace.
- If λ is negative, what geometric change should an eigenvector undergo under the transformation Ax?
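For the last question, a quick numerical check, using a matrix with a negative eigenvalue chosen here as an assumption:

```python
import numpy as np

# Illustrative matrix with eigenvalue λ = -2 along the first axis:
# on that eigenvector, the transformation reverses direction.
A = np.array([[-2.0, 0.0],
              [ 0.0, 3.0]])
x = np.array([1.0, 0.0])       # eigenvector with λ = -2

y = A @ x
assert np.allclose(y, -2.0 * x)   # same line, opposite direction
assert float(np.dot(x, y)) < 0    # x and Ax point opposite ways
```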
Key Points
1. A vector x is an eigenvector of A if Ax=λx for some scalar λ, with x required to be nonzero.
2. The eigenvalue λ describes how the transformation scales an eigenvector: positive preserves direction, negative flips it, and zero collapses it to the origin.
3. Rewriting Ax=λx as (A−λI)x=0 converts the eigenvalue problem into finding values of λ that make ker(A−λI) nontrivial.
4. All eigenvectors for a fixed eigenvalue λ form the eigenspace ker(A−λI), which is a subspace including the zero vector.
5. Once an eigenvector exists for λ, infinitely many eigenvectors follow by scaling within the eigenspace.
6. The spectrum of A is the set of all eigenvalues of A, obtained by collecting every λ that yields a nontrivial kernel for (A−λI).