
Functional Analysis 11 | Orthogonality [dark version]

4 min read

Based on the YouTube video by The Bright Side of Mathematics. If you like this content, support the original creators by watching, liking, and subscribing to their channel.

TL;DR

Two vectors x and y in an inner product space are orthogonal exactly when their inner product is zero: ⟨x, y⟩ = 0.

Briefing

Orthogonality in an inner product space is defined entirely through the inner product: two vectors are orthogonal exactly when their inner product is zero. That same idea extends from individual vectors to sets. If U and V are subsets of a vector space X, then U is orthogonal to V when every vector in U is orthogonal to every vector in V—meaning ⟨x, y⟩ = 0 for all x ∈ U and y ∈ V. This set-based viewpoint leads naturally to the orthogonal complement: for a given subset U ⊆ X, the orthogonal complement U⊥ is the collection of all vectors x ∈ X that are orthogonal to every vector in U, i.e., ⟨x, u⟩ = 0 for all u ∈ U. A key takeaway is that U⊥ is always a subspace of X even if U itself is not a subspace; only the definition of orthogonality is needed to guarantee the subspace properties.
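To make the complement concrete in finite dimensions, here is a small numerical sketch (using NumPy; the helper name `orth_complement` is illustrative, not from the video) that computes a basis for U⊥ from a finite set of spanning vectors via the SVD:

```python
import numpy as np

def orth_complement(vectors):
    """Orthonormal basis (as rows) for the orthogonal complement of
    span(vectors) in R^n, found via the SVD: right singular vectors
    belonging to (near-)zero singular values span the complement."""
    A = np.atleast_2d(np.asarray(vectors, dtype=float))
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))   # numerical rank of the span
    return Vt[rank:]

# U here is just a set of two vectors in R^3 (not itself a subspace),
# yet its complement is the 1D subspace spanned by (0, 0, 1).
U = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
basis = orth_complement(U)
print(basis.shape)  # (1, 3): a one-dimensional complement
```

Note that the output is always a subspace (it is the span of the returned rows), mirroring the general fact that U⊥ is a subspace regardless of U.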

Several boundary cases clarify how U⊥ behaves. The orthogonal complement of the zero vector is the entire space: every vector is orthogonal to 0 because ⟨x, 0⟩ = 0. Conversely, the orthogonal complement of the whole space is just the zero vector, since the only vector orthogonal to every vector in X must be 0. There’s also a useful inclusion reversal when comparing subsets: if U ⊆ V, then V⊥ ⊆ U⊥. The reason is direct from the definition—being orthogonal to all vectors in V automatically implies being orthogonal to all vectors in U.

Beyond geometry, orthogonality connects to a computational identity. When two vectors x and y are orthogonal, the norm of their sum satisfies a Pythagorean-type formula: ‖x + y‖² = ‖x‖² + ‖y‖². Here the norm is the one induced by the inner product, so the identity is not just a special case from Euclidean space; it holds abstractly in any inner product space. The proof is a direct expansion: ‖x + y‖² = ⟨x + y, x + y⟩ = ‖x‖² + 2 Re⟨x, y⟩ + ‖y‖², and the cross term vanishes exactly when ⟨x, y⟩ = 0.
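As a quick numerical check (a sketch with NumPy; the random vectors are illustrative), the identity can be verified with the standard Euclidean inner product after making y orthogonal to x with one Gram-Schmidt step:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)
v = rng.normal(size=5)

# One Gram-Schmidt step: remove from v its component along x,
# so the remainder y satisfies <x, y> = 0 (up to rounding).
y = v - (x @ v) / (x @ x) * x

lhs = np.linalg.norm(x + y) ** 2
rhs = np.linalg.norm(x) ** 2 + np.linalg.norm(y) ** 2
print(abs(x @ y) < 1e-10)    # True: x and y are orthogonal
print(abs(lhs - rhs) < 1e-8) # True: Pythagorean identity holds
```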

For intuition, the discussion uses a geometric picture: in three-dimensional space, a 2D plane has an orthogonal complement that is a 1D line, and the only point shared by the plane and its orthogonal complement is the zero vector. In higher (or infinite) dimensions—typical in functional analysis—these relationships still exist, but the geometry becomes harder to visualize. The next step is flagged as a topological refinement: in functional analysis, the orthogonal complement U⊥ is always a closed subspace, a fact that requires more than linear algebra and will be addressed later.
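The 3D picture can be reproduced directly: for a plane spanned by two vectors, the cross product yields a normal vector spanning the complement line (a small NumPy sketch with illustrative vectors, not from the video itself):

```python
import numpy as np

# Plane in R^3 spanned by a and b
a = np.array([1.0, 2.0, 0.0])
b = np.array([0.0, 1.0, 1.0])
n = np.cross(a, b)   # spans the orthogonal complement, a 1D line

# n is orthogonal to every vector in the plane span{a, b}.
# Any vector t*n on the line has <t*n, n> = t*||n||^2, which is zero
# only for t = 0 — so the plane and the line meet only at 0.
print(n @ a, n @ b)  # 0.0 0.0
```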

Cornell Notes

Orthogonality in an inner product space is defined by the inner product: vectors x and y are orthogonal when ⟨x, y⟩ = 0. For a subset U of a vector space X, the orthogonal complement U⊥ consists of all vectors in X that are orthogonal to every vector in U. U⊥ is always a subspace of X even when U is not a subspace. The complement behaves predictably in extreme cases: {0}⊥ = X and X⊥ = {0}, and subset inclusion reverses (if U ⊆ V, then V⊥ ⊆ U⊥). Finally, orthogonality yields a Pythagorean identity: if x ⟂ y, then ‖x + y‖² = ‖x‖² + ‖y‖², using the norm induced by the inner product.

How does the definition of orthogonality for vectors extend to subsets of a vector space?

For vectors, orthogonality means ⟨x, y⟩ = 0. For subsets U and V, the condition becomes uniform: every x ∈ U must be orthogonal to every y ∈ V, so ⟨x, y⟩ = 0 for all x ∈ U and y ∈ V.
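The set-level condition is mechanical to verify: check every pair. A minimal sketch (the function name `sets_orthogonal` is hypothetical):

```python
import numpy as np

def sets_orthogonal(U, V, tol=1e-12):
    """True when every u in U is orthogonal to every v in V."""
    return all(abs(np.dot(u, v)) < tol for u in U for v in V)

U = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
V = [np.array([0.0, 0.0, 2.0])]
print(sets_orthogonal(U, V))  # True: every pairing has inner product 0
```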

What exactly is the orthogonal complement U⊥ of a subset U, and what is guaranteed about it?

U⊥ = {x ∈ X : ⟨x, u⟩ = 0 for all u ∈ U}. Regardless of whether U is a subspace, U⊥ is always a subspace of X: linearity of the inner product guarantees that sums and scalar multiples of vectors orthogonal to U stay orthogonal to U, with no assumption on U itself.
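The closure argument can be seen in a tiny example (vectors chosen for illustration): linearity of the inner product is exactly what keeps sums and scalar multiples inside U⊥.

```python
import numpy as np

# U is the single-vector set {u} — not a subspace of R^3.
u = np.array([1.0, 1.0, 0.0])
x = np.array([1.0, -1.0, 3.0])   # x @ u == 0, so x ∈ U⊥
y = np.array([2.0, -2.0, -5.0])  # y @ u == 0, so y ∈ U⊥

# Closure under addition and scalar multiplication — the subspace axioms:
print((x + y) @ u == 0)     # True: <x + y, u> = <x, u> + <y, u>
print((7.0 * x) @ u == 0)   # True: <7x, u> = 7 <x, u>
```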

What do the orthogonal complements of the zero vector and of the whole space look like?

Because ⟨x, 0⟩ = 0 for every x, the orthogonal complement of the zero vector is the whole space: {0}⊥ = X. On the other hand, if a vector x is orthogonal to every vector in X, then x must be 0, so X⊥ = {0} (the set containing only the zero vector).
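Both extreme cases can be spot-checked numerically in R³ (a trivial sketch; the vectors are illustrative):

```python
import numpy as np

# {0}⊥ = X: every vector is orthogonal to the zero vector.
x = np.array([3.0, -1.0, 2.0])
print(x @ np.zeros(3) == 0)   # True, and this holds for any x

# X⊥ = {0}: if <y, e_i> = 0 for every standard basis vector e_i,
# then each coordinate y_i = <y, e_i> vanishes, forcing y = 0.
y = np.zeros(3)
print(all(y @ e == 0 for e in np.eye(3)))  # True
```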

Why does inclusion reverse for orthogonal complements (if U ⊆ V, then V⊥ ⊆ U⊥)?

Take x ∈ V⊥. By definition, ⟨x, v⟩ = 0 for all v ∈ V. Since U ⊆ V, every u ∈ U is also in V, so ⟨x, u⟩ = 0 for all u ∈ U. That means x ∈ U⊥, giving V⊥ ⊆ U⊥.
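The inclusion reversal can be illustrated numerically in R³ (sketch with illustrative vectors; the SVD-based helper `complement_basis` is not from the video):

```python
import numpy as np

def complement_basis(vectors):
    """Orthonormal basis (rows) for the orthogonal complement of span(vectors)."""
    A = np.atleast_2d(np.asarray(vectors, dtype=float))
    _, s, Vt = np.linalg.svd(A)
    return Vt[int(np.sum(s > 1e-10)):]

U = [[1.0, 0.0, 0.0]]                      # U ⊆ V
V = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]

U_perp = complement_basis(U)               # 2-dimensional
V_perp = complement_basis(V)               # 1-dimensional

# Every vector in V⊥ is orthogonal to all of U, hence lies in U⊥:
print(np.allclose(V_perp @ np.array(U).T, 0))  # True
print(U_perp.shape[0] >= V_perp.shape[0])      # True: dim V⊥ ≤ dim U⊥
```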

How does orthogonality lead to a Pythagorean formula in inner product spaces?

When x ⟂ y, the norm induced by the inner product satisfies ‖x + y‖² = ‖x‖² + ‖y‖². The transcript frames this as the abstract Pythagorean theorem, valid in any inner product space (not just Euclidean space), because the norm is tied to the inner product.

Review Questions

  1. If U is a subset of X but not a subspace, what can still be said about U⊥? Why?
  2. Given U ⊆ V, what is the correct relationship between U⊥ and V⊥, and how does it follow from the definition?
  3. State the orthogonality condition and the resulting norm identity when two vectors are orthogonal.

Key Points

  1. Two vectors x and y in an inner product space are orthogonal exactly when their inner product is zero: ⟨x, y⟩ = 0.

  2. For subsets U and V, orthogonality means every x ∈ U is orthogonal to every y ∈ V.

  3. The orthogonal complement U⊥ is defined as all x ∈ X such that ⟨x, u⟩ = 0 for every u ∈ U.

  4. U⊥ is always a subspace of X, even if U is not a subspace.

  5. Extreme cases follow directly from the definition: {0}⊥ = X and X⊥ = {0}.

  6. Orthogonal complements reverse inclusion: if U ⊆ V, then V⊥ ⊆ U⊥.

  7. If x ⟂ y, then the induced norm satisfies the Pythagorean identity ‖x + y‖² = ‖x‖² + ‖y‖².

Highlights

Orthogonal complements are built from the inner product: U⊥ = {x : ⟨x, u⟩ = 0 for all u ∈ U}.
Even without U being a subspace, U⊥ is guaranteed to be a subspace.
Inclusion flips direction: U ⊆ V implies V⊥ ⊆ U⊥.
Orthogonality produces an abstract Pythagorean theorem: ‖x + y‖² = ‖x‖² + ‖y‖² when x ⟂ y.
Geometric intuition in finite dimensions: a plane’s orthogonal complement is a line, intersecting only at the zero vector.