Linear Algebra 57 | Spectrum of Triangular Matrices
Based on the YouTube video by The Bright Side of Mathematics. If you like this content, support the original creators by watching, liking, and subscribing to their content.
Diagonal entries of a diagonal matrix are exactly its eigenvalues, with standard unit vectors as eigenvectors.
Briefing
Eigenvalues of triangular and certain block matrices can be read off directly—often without computing determinants or solving characteristic polynomials—because the matrix structure forces the spectrum to match the diagonal blocks. For diagonal matrices, the eigenvalues are exactly the diagonal entries: each standard basis (unit) vector points in an eigenvector direction, and the corresponding diagonal entry gives the scaling factor. The same “diagonal determines eigenvalues” principle extends to triangular matrices: as long as the matrix is triangular, the eigenvalues are precisely the entries on the diagonal, with algebraic multiplicities visible from how often each diagonal value repeats.
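The "read the diagonal" rule can be checked numerically. The following sketch (matrix entries invented for illustration) compares the eigenvalues NumPy computes for an upper triangular matrix against its diagonal:

```python
import numpy as np

# Hypothetical 3x3 upper triangular matrix (entries chosen for illustration).
T = np.array([[4.0, 1.0, 7.0],
              [0.0, 2.0, 3.0],
              [0.0, 0.0, 5.0]])

# For a triangular matrix, the eigenvalues are exactly the diagonal entries.
eigs = np.sort(np.linalg.eigvals(T).real)
print(eigs)                 # same multiset as np.sort(np.diag(T))
```

The off-diagonal entries never enter the result: the characteristic polynomial of a triangular matrix factors as the product of (diagonal entry − λ) terms.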
This structural shortcut also generalizes to block matrices arranged in a triangular block form. When a square matrix is written in upper block-triangular form M = [A C; 0 B], where both A and B are square, the spectrum of M is the union of the spectra of A and B. The key requirement is the block-triangular zero pattern: the off-diagonal block in the lower-left corner must be zero so that the determinant/characteristic polynomial behavior matches the triangular case. The same conclusion carries over to lower triangular matrices by using the fact that transposition preserves eigenvalues (equivalently, the characteristic polynomial is unchanged under transpose).
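A small NumPy sketch (block values invented for illustration) demonstrates both claims: the spectrum of a block upper triangular matrix is the union of the spectra of its diagonal blocks, and transposing does not change the spectrum:

```python
import numpy as np

# Assumed illustration: M is block upper triangular with square diagonal
# blocks A and B; the lower-left block is zero, the upper-right is arbitrary.
A = np.array([[3.0, 1.0],
              [0.0, 3.0]])
B = np.array([[1.0, 2.0],
              [0.0, 4.0]])
C = np.array([[5.0, 6.0],
              [7.0, 8.0]])   # the off-diagonal block C may be anything

M = np.block([[A, C],
              [np.zeros((2, 2)), B]])

# spec(M) = spec(A) ∪ spec(B), counted with multiplicity.
spec_M = np.sort(np.linalg.eigvals(M).real)
spec_AB = np.sort(np.concatenate([np.linalg.eigvals(A),
                                  np.linalg.eigvals(B)]).real)
print(spec_M)
print(spec_AB)

# Transposition preserves eigenvalues, so the lower triangular case follows.
assert np.allclose(np.sort(np.linalg.eigvals(M.T).real), spec_M)
```

Note that C plays no role in the spectrum; only the zero block in the lower-left corner matters.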
A 4×4 example illustrates the “read the diagonal” rule. Because the matrix is triangular, the eigenvalues are its diagonal entries, with the value 2 appearing twice. The discussion emphasizes an important limitation: this method determines algebraic multiplicity (how many times an eigenvalue repeats as a root of the characteristic polynomial), but it does not automatically reveal geometric multiplicity (the dimension of the eigenspace), which can be harder to compute.
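The gap between the two multiplicities can be made concrete. In this sketch (matrix invented for illustration, not the video's 4×4 example), the value 2 repeats on the diagonal, so its algebraic multiplicity is 2, yet the eigenspace is only one-dimensional:

```python
import numpy as np

# Hypothetical triangular matrix where 2 appears twice on the diagonal.
J = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Algebraic multiplicity: count how often 2 repeats on the diagonal.
alg_mult = int(np.sum(np.diag(J) == 2.0))

# Geometric multiplicity: dim ker(J - 2I) = n - rank(J - 2I).
geo_mult = J.shape[0] - np.linalg.matrix_rank(J - 2.0 * np.eye(3))

print(alg_mult, geo_mult)   # 2 1: the off-diagonal 1 collapses the eigenspace
```

Reading the diagonal gives `alg_mult` for free; computing `geo_mult` requires a rank (or null-space) computation, which is exactly the extra work the shortcut does not save.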
A larger 6×6 example shows how block decomposition can replace a full eigenvalue calculation when the matrix is not triangular as a whole. By spotting a block-triangular layout—zeros in one corner—the matrix is split into two square diagonal blocks. The spectrum then becomes the union of the spectra of those blocks. One diagonal block is triangular, so its eigenvalues are read off immediately; the other block is not triangular outright, but it again has a block-triangular structure, allowing another split until the eigenvalues can be assembled. The final result includes the eigenvalue 1 with algebraic multiplicity 2, again without addressing geometric multiplicity.
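The recursive splitting can be sketched in NumPy. All values below are invented (this is not the video's 6×6 matrix), chosen so that, as in the example, the eigenvalue 1 ends up with algebraic multiplicity 2:

```python
import numpy as np

# Assumed 6x6 illustration: the full matrix is not triangular, but a zero
# corner splits it into a triangular block A and a block B that splits again.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])                 # triangular: eigenvalues 1, 3
B = np.block([[np.array([[3.0, 2.0],
                         [1.0, 2.0]]),     # not triangular: eigenvalues 1, 4
               np.ones((2, 2))],
              [np.zeros((2, 2)),
               np.array([[6.0, 7.0],
                         [0.0, 5.0]])]])   # triangular: eigenvalues 6, 5

M = np.block([[A, np.ones((2, 4))],
              [np.zeros((4, 2)), B]])

# spec(M) = spec(A) ∪ spec(B); B is itself block-triangular, so it splits
# once more and no eigenvalue computation on the full 6x6 matrix is needed.
spec = np.sort(np.linalg.eigvals(M).real)
print(spec)   # [1. 1. 3. 4. 5. 6.]: eigenvalue 1 has algebraic multiplicity 2
```

Each split replaces one eigenvalue problem by two smaller ones, and the recursion stops once every remaining block is triangular.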
Overall, the central takeaway is practical: recognizing triangular structure or triangular block structure turns eigenvalue computation into a bookkeeping task—diagonal entries and diagonal blocks—while eigenvectors remain a separate, more demanding problem reserved for later discussion.
Cornell Notes
Triangular matrices have eigenvalues equal to their diagonal entries, so the spectrum can be read directly from the diagonal. The same idea extends to block matrices with a triangular block layout: for M = [A C; 0 B] (with A and B square), the spectrum of M is the union of the spectra of A and B. Transposition preserves eigenvalues, so lower triangular cases follow from upper triangular ones. Examples show how algebraic multiplicities come from repeated diagonal values, while geometric multiplicities (eigenspace dimensions) are not determined by these shortcuts. This makes eigenvalue calculations fast when the matrix can be recognized as triangular or block-triangular.
Why do diagonal matrices have eigenvalues equal to their diagonal entries?
What structural condition lets eigenvalues of a block matrix be obtained as a union of spectra?
How does transposition help extend results from upper triangular to lower triangular matrices?
What can be determined immediately from the diagonal in triangular examples, and what cannot?
How does the 6×6 example avoid computing eigenvalues for the full matrix?
Review Questions
- In a triangular matrix, how do you determine both the eigenvalues and their algebraic multiplicities without solving a characteristic polynomial?
- For a block matrix M = [A C; 0 B], what requirements on A, B, and the lower-left block are necessary for the spectrum of M to equal the union of the spectra of A and B?
- Why does knowing algebraic multiplicity from the diagonal not automatically tell you geometric multiplicity?
Key Points
1. Diagonal entries of a diagonal matrix are exactly its eigenvalues, with standard unit vectors as eigenvectors.
2. For triangular matrices, the spectrum equals the set of diagonal entries, and repeated diagonal values reveal algebraic multiplicities.
3. Eigenvalues of a block-triangular matrix M = [A C; 0 B] (with A and B square) are the union of the spectra of A and B.
4. Transposition preserves eigenvalues, so results for upper triangular matrices extend to lower triangular matrices.
5. These structural methods determine algebraic multiplicity but do not automatically determine geometric multiplicity (eigenspace dimension).
6. Recognizing block-triangular structure can reduce eigenvalue computation for large matrices by splitting them into smaller triangular or nearly triangular blocks.