Functional Analysis 11 | Orthogonality [dark version]
Based on the video by The Bright Side of Mathematics on YouTube. If you like this content, support the original creator by watching, liking, and subscribing to their channel.
Two vectors x and y in an inner product space are orthogonal exactly when their inner product is zero: ⟨x, y⟩ = 0.
Briefing
Orthogonality in an inner product space is defined entirely through the inner product: two vectors are orthogonal exactly when their inner product is zero. That same idea extends from individual vectors to sets. If U and V are subsets of a vector space X, then U is orthogonal to V when every vector in U is orthogonal to every vector in V—meaning ⟨x, y⟩ = 0 for all x ∈ U and y ∈ V. This set-based viewpoint leads naturally to the orthogonal complement: for a given subset U ⊆ X, the orthogonal complement U⊥ is the collection of all vectors x ∈ X that are orthogonal to every vector in U, i.e., ⟨x, u⟩ = 0 for all u ∈ U. A key takeaway is that U⊥ is always a subspace of X even if U itself is not a subspace; the linearity of the inner product is all that is needed to guarantee the subspace properties.
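The subspace claim can be verified in a few lines using linearity of the inner product in the first argument; a sketch:

```latex
% Let x, y \in U^\perp and let \lambda be a scalar. For every u \in U:
\langle x + y,\, u \rangle = \langle x, u \rangle + \langle y, u \rangle = 0 + 0 = 0,
\qquad
\langle \lambda x,\, u \rangle = \lambda \langle x, u \rangle = \lambda \cdot 0 = 0.
% Moreover \langle 0, u \rangle = 0 for all u \in U, so 0 \in U^\perp.
% Hence U^\perp is closed under addition and scalar multiplication
% and contains 0: it is a subspace of X.
```

Note that nothing about U itself is used here, which is why U may be an arbitrary subset.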
Several boundary cases clarify how U⊥ behaves. The orthogonal complement of the zero vector is the entire space: every vector is orthogonal to 0 because ⟨x, 0⟩ = 0. Conversely, the orthogonal complement of the whole space is just the zero vector, since the only vector orthogonal to every vector in X must be 0. There’s also a useful inclusion reversal when comparing subsets: if U ⊆ V, then V⊥ ⊆ U⊥. The reason is direct from the definition—being orthogonal to all vectors in V automatically implies being orthogonal to all vectors in U.
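The claim X⊥ = {0} deserves one extra step, since it uses definiteness of the inner product rather than linearity; a short verification:

```latex
% If x \in X^\perp, then x is orthogonal to every vector of X,
% in particular to itself:
\langle x, x \rangle = 0
\;\Longrightarrow\; \|x\|^2 = 0
\;\Longrightarrow\; x = 0.
```

The inclusion reversal needs no computation at all: a vector orthogonal to everything in V is in particular orthogonal to everything in the smaller set U.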
Beyond geometry, orthogonality connects to a computational identity. When two vectors x and y are orthogonal, the norm of their sum satisfies a Pythagorean-type formula: ‖x + y‖² = ‖x‖² + ‖y‖². Here the norm is the one associated with the inner product, so the identity is not just a special case from Euclidean space; it holds abstractly in any inner product space. The transcript frames this as a general theorem that can be verified by a straightforward proof.
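The straightforward proof mentioned above amounts to expanding the norm via the inner product; a sketch (the middle terms are stated for the complex case, where ⟨y, x⟩ is the conjugate of ⟨x, y⟩):

```latex
\|x + y\|^2
  = \langle x + y,\, x + y \rangle
  = \|x\|^2 + \langle x, y \rangle + \langle y, x \rangle + \|y\|^2
  = \|x\|^2 + \|y\|^2
\quad \text{when } \langle x, y \rangle = 0,
% since \langle y, x \rangle = \overline{\langle x, y \rangle} = 0 as well.
```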
For intuition, the discussion uses a geometric picture: in three-dimensional space, a 2D plane has an orthogonal complement that is a 1D line, and the only point shared by the plane and its orthogonal complement is the zero vector. In higher (or infinite) dimensions—typical in functional analysis—these relationships still exist, but the geometry becomes harder to visualize. The next step is flagged as a topological refinement: in functional analysis, the orthogonal complement U⊥ is always a closed subspace, a fact that requires more than linear algebra and will be addressed later.
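The 3D picture can be checked numerically. The sketch below (using NumPy; the vectors u, v and the plane they span are illustrative choices, not from the source) builds a plane in R³ and finds its orthogonal complement as the line spanned by the cross product of the spanning vectors:

```python
import numpy as np

# Hypothetical plane in R^3 spanned by two linearly independent vectors.
u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 1.0])

# In R^3, the orthogonal complement of span{u, v} is the 1D line
# spanned by the cross product n = u x v.
n = np.cross(u, v)

# n is orthogonal to every vector in the plane: it suffices to check u
# and v, and any combination a*u + b*v then follows by linearity.
w = 2.5 * u - 1.3 * v
print(np.dot(n, u), np.dot(n, v), np.dot(n, w))  # all (numerically) zero
```

Any vector lying in both the plane and the line spanned by n must be orthogonal to itself, so by definiteness it is 0, matching the picture from the lecture.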
Cornell Notes
Orthogonality in an inner product space is defined by the inner product: vectors x and y are orthogonal when ⟨x, y⟩ = 0. For a subset U of a vector space X, the orthogonal complement U⊥ consists of all vectors in X that are orthogonal to every vector in U. U⊥ is always a subspace of X even when U is not a subspace. The complement behaves predictably in extreme cases: {0}⊥ = X and X⊥ = {0}, and subset inclusion reverses (if U ⊆ V, then V⊥ ⊆ U⊥). Finally, orthogonality yields a Pythagorean identity: if x ⟂ y, then ‖x + y‖² = ‖x‖² + ‖y‖², using the norm induced by the inner product.
How does the definition of orthogonality for vectors extend to subsets of a vector space?
What exactly is the orthogonal complement U⊥ of a subset U, and what is guaranteed about it?
What do the orthogonal complements of the zero vector and of the whole space look like?
Why does inclusion reverse for orthogonal complements (if U ⊆ V, then V⊥ ⊆ U⊥)?
How does orthogonality lead to a Pythagorean formula in inner product spaces?
Review Questions
- If U is a subset of X but not a subspace, what can still be said about U⊥? Why?
- Given U ⊆ V, what is the correct relationship between U⊥ and V⊥, and how does it follow from the definition?
- State the orthogonality condition and the resulting norm identity when two vectors are orthogonal.
Key Points
1. Two vectors x and y in an inner product space are orthogonal exactly when their inner product is zero: ⟨x, y⟩ = 0.
2. For subsets U and V, orthogonality means every x ∈ U is orthogonal to every y ∈ V.
3. The orthogonal complement U⊥ is defined as all x ∈ X such that ⟨x, u⟩ = 0 for every u ∈ U.
4. U⊥ is always a subspace of X, even if U is not a subspace.
5. Extreme cases follow directly from the definition: {0}⊥ = X and X⊥ = {0}.
6. Orthogonal complements reverse inclusion: if U ⊆ V, then V⊥ ⊆ U⊥.
7. If x ⟂ y, then the induced norm satisfies the Pythagorean identity ‖x + y‖² = ‖x‖² + ‖y‖².