
Abstract Linear Algebra 4 | Basis, Linear Independence, Generating Sets

5 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

A linear combination in any vector space uses finitely many vectors, even when the vector space is infinite-dimensional.

Briefing

The core takeaway is that “basis,” “linear independence,” and “dimension” from standard linear algebra extend cleanly to abstract vector spaces—including infinite-dimensional ones—by defining everything in terms of finite linear combinations. That extension matters because it lets mathematicians describe polynomial spaces, function spaces, and other large objects using the same structural language as ℝⁿ and ℂⁿ, even when the number of needed directions is unbounded.

The discussion starts by recalling the polynomial vector spaces Pₙ: all real-valued polynomials on ℝ with degree at most n. As n increases, these spaces grow, and in the limit there is no degree bound, leading to an infinite-dimensional polynomial space. That forces a broader notion of dimension than the usual finite one.

To set up the abstract framework, the transcript defines a general linear combination in a vector space V: given vectors v₁, …, vₖ ∈ V and scalars λ₁, …, λₖ, a linear combination is λ₁v₁ + ⋯ + λₖvₖ. A key rule is emphasized: linear combinations are always finite sums. Even when the eventual vector space is infinite-dimensional, the coefficients are chosen for only finitely many vectors at a time.
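As a small sketch of this definition (using NumPy and vectors in ℝ⁴ as a concrete stand-in; the specific vectors and coefficients are illustrative, not from the video):

```python
import numpy as np

# A linear combination picks finitely many vectors and scalars,
# even if the ambient space is large. Here: 3 vectors in R^4.
v1 = np.array([1.0, 0.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0, 1.0])

coeffs = [2.0, -1.0, 3.0]   # lambda_1, lambda_2, lambda_3
vectors = [v1, v2, v3]

# lambda_1*v1 + lambda_2*v2 + lambda_3*v3 -- always a finite sum
combo = sum(c * v for c, v in zip(coeffs, vectors))
print(combo)  # 2*v1 - v2 + 3*v3 = (2, -1, 3, 3)
```

The same pattern applies in any vector space: however many vectors the surrounding set contains, only finitely many ever enter a single combination.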

Next comes the span. For any subset M ⊆ V, span(M) is the set of all vectors obtainable as finite linear combinations of elements of M. The span is always a subspace of V. When M is empty, span(∅) is defined as the smallest subspace, containing only the zero vector: span(∅) = {0}.
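For a finite set of vectors in ℝⁿ, membership in the span can be tested by solving a linear system. A minimal sketch (the helper name `in_span` is mine; it uses NumPy least squares, which finds exact coefficients when they exist):

```python
import numpy as np

def in_span(vectors, target, tol=1e-9):
    """Check whether `target` is a finite linear combination of `vectors`."""
    A = np.column_stack(vectors)  # columns are the spanning vectors
    coeffs, *_ = np.linalg.lstsq(A, target, rcond=None)
    # If the best least-squares fit reproduces target exactly, it is in the span
    return np.allclose(A @ coeffs, target, atol=tol)

u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0])

print(in_span([u1, u2], np.array([1.0, 2.0, 1.0])))  # True: equals u1 + u2
print(in_span([u1, u2], np.array([0.0, 0.0, 1.0])))  # False: outside the plane
```

For infinite sets the same idea applies, since each candidate vector still only needs finitely many elements of M.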

A subset M is then called a generating set for a subspace U if span(M) = U. This captures the idea that M provides exactly enough information to build every vector in U using linear combinations, no more and no less.

Linear independence is introduced as a uniqueness condition on coefficients. A set M ⊆ V is linearly independent if the only way to represent the zero vector as a linear combination of elements of M is the trivial one where all coefficients are zero. The set M itself may be infinite, but the combinations tested are still finite.
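For finitely many vectors in ℝⁿ, this condition is equivalent to the matrix whose columns are those vectors having full column rank. A hedged sketch (the helper name `linearly_independent` is mine, not from the video):

```python
import numpy as np

def linearly_independent(vectors):
    """A finite list of vectors is independent iff the only combination
    giving zero is the trivial one -- equivalently, the matrix with those
    vectors as columns has rank equal to the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

print(linearly_independent([e1, e2]))            # True
print(linearly_independent([e1, e2, e1 + e2]))   # False: e1 + e2 is redundant
```

An infinite set is declared independent when every finite subset passes this test, which is why the finiteness rule for linear combinations matters.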

Combining generation and independence yields the definition of a basis: a set B is a basis of U if it both generates U and is linearly independent. The dimension dim(U) is the number of vectors in any basis of U; this number is fixed even though many different bases may exist. For infinite-dimensional cases, the transcript notes that one could use cardinality to distinguish different infinities, but this course will mainly distinguish only finite versus infinite, writing dim(U) = ∞ when the dimension is not finite.

Concrete examples anchor the definitions. For P₀, the basis is the constant function 1, so dim(P₀) = 1. For P₂, the monomials 1, x, x² form a basis, giving dim(P₂) = 3. The transcript also highlights that the space of all functions on ℝ is infinite-dimensional. Finally, it points to a finite-dimensional matrix space: the complex-valued 2×3 matrices form a space of dimension 6, and the task is to find a basis (a linearly independent generating set) with six elements, foreshadowing coordinate methods in the next video.
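The P₂ example can be checked numerically. One way (an illustrative sketch, not the video's argument): a polynomial of degree at most 2 is determined by its values at three distinct points, so representing each monomial by those values turns the independence question into a rank computation.

```python
import numpy as np

# Represent 1, x, x^2 by their values at three distinct points;
# a polynomial of degree <= 2 is determined by such values.
xs = np.array([0.0, 1.0, 2.0])
monomials = np.column_stack([xs**0, xs**1, xs**2])  # a Vandermonde matrix

# Full rank => {1, x, x^2} is linearly independent and spans P_2
print(np.linalg.matrix_rank(monomials))  # 3, matching dim(P_2) = 3
```

The resulting matrix is a Vandermonde matrix, which is invertible whenever the sample points are distinct.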

Cornell Notes

The transcript extends basis, linear independence, span, and dimension from finite-dimensional spaces to general (including infinite-dimensional) vector spaces. A linear combination always uses only finitely many vectors, even when the underlying set of vectors is infinite. For any subset M ⊆ V, span(M) is the set of all finite linear combinations of elements of M, and it is always a subspace. A set M generates a subspace U if span(M) = U, and it is linearly independent if the zero vector has only the trivial representation using vectors from M. A basis is both generating and linearly independent, and dim(U) is the (fixed) number of vectors in any basis—finite or infinite.

Why does the definition of linear combination insist on finite sums, and what does that mean for infinite-dimensional spaces?

Even if a set M is infinite, a linear combination is formed using only finitely many vectors v₁, …, vₖ ∈ M. The coefficients λ₁, …, λₖ multiply those finitely many vectors, producing λ₁v₁ + ⋯ + λₖvₖ. This keeps the algebra manageable and ensures span(M) is built from finite combinations, not infinite series.

How do span, generating sets, and bases fit together logically?

For M ⊆ V, span(M) is the subspace of all vectors obtainable from finite linear combinations of elements of M. If span(M) = U, then M is a generating set for U. A basis is a generating set that is also linearly independent, meaning every vector in U can be built from the basis without ambiguity in the coefficients (in particular, the zero vector cannot be built non-trivially).

What exactly does linear independence mean in this abstract setting?

Linear independence is defined through uniqueness of coefficients: the only way to represent the zero vector as a linear combination of vectors from M is the trivial one where all coefficients are zero. The set M itself can be infinite, but the test uses finite linear combinations. So no finite non-trivial combination of vectors in M may equal 0.

How is dimension defined, and why is it not dependent on which basis is chosen?

Once a subspace U has a basis B, the number of elements in that basis is fixed for U. That fixed number is dim(U). So even though different bases can exist, they all have the same cardinality (in the finite case) or the same “infinite vs finite” status in this course’s treatment. The transcript writes dim(U) = ∞ when the dimension is not finite.

How do the examples P₀ and P₂ illustrate the definitions?

For P₀, the space consists of constant functions, and the single function 1 generates all constants by scaling, with no non-trivial way to combine it to get zero—so {1} is a basis and dim(P₀) = 1. For P₂, the monomials 1, x, x² generate every polynomial of degree at most 2 and are linearly independent, so dim(P₂) = 3.

Why is the space of complex 2×3 matrices said to have dimension 6?

A complex 2×3 matrix has 2 · 3 = 6 independent entries. The transcript frames the task as finding a basis: a linearly independent set that generates the entire space. Since the space has dimension 6, any basis must have six elements, matching the number of degrees of freedom.
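The natural candidate (a sketch of the standard construction, which may differ from the basis chosen in the video) is the set of matrices Eᵢⱼ with a 1 in entry (i, j) and zeros elsewhere:

```python
import numpy as np

# Standard basis for 2x3 complex matrices: E_ij has a 1 in entry (i, j).
basis = []
for i in range(2):
    for j in range(3):
        E = np.zeros((2, 3), dtype=complex)
        E[i, j] = 1.0
        basis.append(E)

print(len(basis))  # 6 basis matrices -> dimension 6

# Any 2x3 matrix is a unique combination: the coefficients are its entries.
A = np.array([[1 + 2j, 0, 3], [4, 5j, 6]])
recombined = sum(A[i, j] * basis[3 * i + j] for i in range(2) for j in range(3))
print(np.allclose(recombined, A))  # True
```

Reading off the coefficients of a matrix with respect to this basis is exactly the kind of coordinate method the next video develops.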

Review Questions

  1. What is the relationship between span(M) and the smallest subspace containing M, and how does the empty set case fit in?
  2. Give the abstract definition of a basis B for a subspace U, and state how it leads to the definition of dim(U).
  3. In what sense can a set M be infinite while linear independence still checks only finite combinations?

Key Points

  1. A linear combination in any vector space uses finitely many vectors, even when the vector space is infinite-dimensional.

  2. For any subset M ⊆ V, span(M) is the set of all finite linear combinations of elements of M, and it is always a subspace.

  3. A generating set M for U satisfies span(M) = U, meaning M provides exactly the vectors needed to build U.

  4. Linear independence means the zero vector has only the trivial finite linear combination representation using vectors from M.

  5. A basis for U is a set that both generates U and is linearly independent.

  6. Dimension is the fixed number of vectors in any basis of U; it is finite for finite-dimensional subspaces and written as dim(U) = ∞ when not finite.

  7. Examples: P₀ with basis {1}, and P₂ with basis {1, x, x²}; the complex 2×3 matrices have dimension 6.

Highlights

Linear combinations are always finite sums, which keeps the definitions consistent even when the ambient space is infinite-dimensional.
A basis is exactly the sweet spot: it both generates a subspace and prevents non-trivial ways to combine basis vectors into zero.
P₂ has dimension 3 because every quadratic polynomial is uniquely built from the monomials 1, x, x².
The space of all complex 2×3 matrices has dimension 6, matching the six independent entries.