
Linear Algebra 57 | Spectrum of Triangular Matrices

4 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their channel.

TL;DR

Diagonal entries of a diagonal matrix are exactly its eigenvalues, with standard unit vectors as eigenvectors.

Briefing

Eigenvalues of triangular and certain block matrices can be read off directly—often without computing determinants or solving characteristic polynomials—because the matrix structure forces the spectrum to match the diagonal blocks. For diagonal matrices, the eigenvalues are exactly the diagonal entries: each standard basis (unit) vector points in an eigenvector direction, and the corresponding diagonal entry gives the scaling factor. The same “diagonal determines eigenvalues” principle extends to triangular matrices: as long as the matrix is triangular, the eigenvalues are precisely the entries on the diagonal, with algebraic multiplicities visible from how often each diagonal value repeats.
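The "read the diagonal" rule is easy to verify numerically. A minimal NumPy sketch (the matrix here is an arbitrary example, not one from the video):

```python
import numpy as np

# Upper triangular matrix: the spectrum should be exactly the diagonal.
T = np.array([[3.0, 1.0, 4.0],
              [0.0, 5.0, 2.0],
              [0.0, 0.0, 7.0]])

eigvals = np.linalg.eigvals(T)
print(np.sort(eigvals.real))  # the diagonal entries 3, 5, 7
```

No characteristic polynomial needs to be solved by hand: the structure already is the answer.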

This structural shortcut also generalizes to block matrices arranged in a triangular block form. When a square matrix is written in upper block-triangular form M = [[A, B], [0, C]], where both diagonal blocks A and C are square, the spectrum of the whole matrix is the union of the spectra of A and C. The key requirement is the block-triangular zero pattern: the off-diagonal block in the lower-left corner must be zero, so that the determinant and characteristic polynomial behave as in the triangular case. The same conclusion carries over to lower triangular matrices because transposition preserves eigenvalues (equivalently, the characteristic polynomial is unchanged under transpose).
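The block rule can be checked directly in NumPy. The blocks below are arbitrary illustrative choices (not from the video); the point is that the upper-right block has no effect on the spectrum:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
C = np.array([[1.0, 4.0],
              [1.0, 1.0]])
B = np.array([[5.0, 6.0],   # the upper-right block is arbitrary:
              [7.0, 8.0]])  # it does not influence the eigenvalues

# Block upper triangular matrix: zero block in the lower-left corner.
M = np.block([[A, B],
              [np.zeros((2, 2)), C]])

spec_M = np.sort(np.linalg.eigvals(M))
spec_blocks = np.sort(np.concatenate([np.linalg.eigvals(A),
                                      np.linalg.eigvals(C)]))
print(np.allclose(spec_M, spec_blocks))  # True
```

Here the spectrum of M is {2, 3} from A together with {3, -1} from C, counted with multiplicity.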

A 4×4 example illustrates the "read the diagonal" rule. Because the matrix is triangular, the eigenvalues are simply the diagonal entries, with the value 2 appearing twice. The discussion emphasizes an important limitation: this method determines algebraic multiplicity (how many times an eigenvalue repeats as a root of the characteristic polynomial), but it does not automatically reveal geometric multiplicity (the dimension of the eigenspace), which can be harder to compute.
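The algebraic/geometric distinction can be made concrete with a small NumPy sketch (an illustrative 3×3 matrix, not the video's 4×4 example): the diagonal shows eigenvalue 2 twice, but its eigenspace turns out to be only one-dimensional.

```python
import numpy as np

T = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# Algebraic multiplicity of 2: how often it appears on the diagonal.
alg_mult = int(np.count_nonzero(np.diag(T) == 2.0))

# Geometric multiplicity of 2: dimension of the null space of T - 2I.
geo_mult = 3 - np.linalg.matrix_rank(T - 2.0 * np.eye(3))

print(alg_mult, geo_mult)  # 2 1
```

So the diagonal alone settles the algebraic multiplicity, while the geometric multiplicity required a rank computation on T - 2I.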

A larger 6×6 example shows how block decomposition can replace a full eigenvalue calculation when the matrix is not triangular as a whole. By spotting a block-triangular layout—zeros in one corner—the matrix is split into two square diagonal blocks. The spectrum then becomes the union of the spectra of those blocks. One diagonal block is triangular, so its eigenvalues are read off immediately; the other block is not triangular outright, but it again has a block-triangular structure, allowing another split until the eigenvalues can be assembled. The final result includes the eigenvalue 1 with algebraic multiplicity 2, again without addressing geometric multiplicity.

Overall, the central takeaway is practical: recognizing triangular structure or triangular block structure turns eigenvalue computation into a bookkeeping task—diagonal entries and diagonal blocks—while eigenvectors remain a separate, more demanding problem reserved for later discussion.

Cornell Notes

Triangular matrices have eigenvalues equal to their diagonal entries, so the spectrum can be read directly from the diagonal. The same idea extends to block matrices with a triangular block layout: for M = [[A, B], [0, C]] (with A and C square), the spectrum is the union of the spectra of A and C. Transposition preserves eigenvalues, so lower triangular cases follow from upper triangular ones. Examples show how algebraic multiplicities come from repeated diagonal values, while geometric multiplicities (eigenspace dimensions) are not determined by these shortcuts. This makes eigenvalue calculations fast when the matrix can be recognized as triangular or block-triangular.

Why do diagonal matrices have eigenvalues equal to their diagonal entries?

For a diagonal matrix, each standard unit vector e_i is an eigenvector. Multiplying the matrix by e_i scales that vector by the i-th diagonal entry, so the scaling factors (eigenvalues) are exactly the diagonal entries. Equivalently, the characteristic polynomial's roots match those diagonal values.
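A direct NumPy verification of this fact (the diagonal entries are arbitrary):

```python
import numpy as np

D = np.diag([4.0, -1.0, 7.0])

# Each standard unit vector e_i satisfies D e_i = d_i e_i, where d_i is
# the i-th diagonal entry, so e_i is an eigenvector with eigenvalue d_i.
for i in range(3):
    e_i = np.eye(3)[:, i]
    print(np.allclose(D @ e_i, D[i, i] * e_i))  # True for every i
```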

What structural condition lets eigenvalues of a block matrix be obtained as a union of spectra?

The matrix must be block-triangular with a zero block in the lower-left corner, like M = [[A, B], [0, C]]. When A and C are square, the characteristic polynomial factors as det(A − λI)·det(C − λI), which makes the spectrum of the whole matrix equal to the union of the spectra of A and C. The zero block is what preserves this determinant/characteristic-polynomial factorization.

How does transposition help extend results from upper triangular to lower triangular matrices?

Eigenvalues are unchanged under transpose because the characteristic polynomial is unchanged. Since upper triangular matrices have eigenvalues on the diagonal, the same must hold for lower triangular matrices as well. This lets the “diagonal entries give eigenvalues” rule apply regardless of whether the zeros are above or below the diagonal.
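A quick NumPy illustration with an arbitrary lower triangular example:

```python
import numpy as np

# Lower triangular L: its transpose is upper triangular with the same
# diagonal, and transposition leaves the spectrum unchanged.
L = np.array([[4.0, 0.0, 0.0],
              [1.0, 6.0, 0.0],
              [2.0, 3.0, 9.0]])

spec_L = np.sort(np.linalg.eigvals(L))
spec_Lt = np.sort(np.linalg.eigvals(L.T))

print(np.allclose(spec_L, spec_Lt))          # True: same spectrum
print(np.allclose(spec_L, [4.0, 6.0, 9.0]))  # True: the diagonal entries
```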

What can be determined immediately from the diagonal in triangular examples, and what cannot?

The diagonal gives eigenvalues and their algebraic multiplicities (how many times each eigenvalue appears as a root of the characteristic polynomial). But geometric multiplicity—the dimension of the eigenspace—does not follow automatically and may require additional work.

How does the 6×6 example avoid computing eigenvalues for the full matrix?

It uses block decomposition. Zeros in a corner allow the matrix to be rewritten in block-triangular form, splitting it into square diagonal blocks. One block is triangular (so its eigenvalues are read off), and the other block is further decomposed using another block-triangular pattern. The final spectrum is assembled as unions of the spectra of these triangular blocks.
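The recursive splitting can be sketched in NumPy. The 6×6 matrix below is made up for illustration (it is not the matrix from the video), but it has the same kind of nested zero pattern and also yields eigenvalue 1 with algebraic multiplicity 2:

```python
import numpy as np

M = np.array([[1., 2., 3., 4., 1., 0.],
              [0., 1., 5., 6., 0., 1.],
              [0., 0., 2., 7., 2., 2.],
              [0., 0., 1., 2., 3., 3.],
              [0., 0., 0., 0., 3., 0.],
              [0., 0., 0., 0., 4., 5.]])

# First split: the lower-left 2x4 corner is zero, so
# spectrum(M) = spectrum(top-left 4x4) ∪ spectrum(bottom-right 2x2).
top = M[:4, :4]
bottom = M[4:, 4:]      # lower triangular: eigenvalues 3 and 5

# Second split: the 4x4 block is itself block-triangular.
upper2 = top[:2, :2]    # triangular: eigenvalue 1, twice
lower2 = top[2:, 2:]    # 2x2 block that needs an actual computation

spec = np.concatenate([np.diag(upper2),
                       np.linalg.eigvals(lower2),
                       np.diag(bottom)])
print(np.allclose(np.sort(spec), np.sort(np.linalg.eigvals(M))))  # True
```

Only the small 2×2 block `lower2` requires a genuine eigenvalue computation; everything else is read off the diagonals of triangular pieces.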

Review Questions

  1. In a triangular matrix, how do you determine both the eigenvalues and their algebraic multiplicities without solving a characteristic polynomial?
  2. For a block matrix M = [[A, B], [0, C]], what requirements on A and C are necessary for the spectrum of M to equal the union of the spectra of A and C?
  3. Why does knowing algebraic multiplicity from the diagonal not automatically tell you geometric multiplicity?

Key Points

  1. Diagonal entries of a diagonal matrix are exactly its eigenvalues, with standard unit vectors as eigenvectors.

  2. For triangular matrices, the spectrum equals the set of diagonal entries, and repeated diagonal values reveal algebraic multiplicities.

  3. Eigenvalues of a block-triangular matrix [[A, B], [0, C]] (with A and C square) are the union of the spectra of A and C.

  4. Transposition preserves eigenvalues, so results for upper triangular matrices extend to lower triangular matrices.

  5. These structural methods determine algebraic multiplicity but do not automatically determine geometric multiplicity (eigenspace dimension).

  6. Recognizing block-triangular structure can reduce eigenvalue computation for large matrices by splitting them into smaller triangular or nearly triangular blocks.

Highlights

Triangular structure turns eigenvalue computation into a diagonal-reading exercise: eigenvalues sit on the diagonal.
Block-triangular matrices let eigenvalues be assembled as a union of the spectra of the diagonal blocks.
Algebraic multiplicity is easy to read from the diagonal/block repetition, while geometric multiplicity requires more than structure alone.