Abstract Linear Algebra 4 | Basis, Linear Independence, Generating Sets
Based on a video by The Bright Side of Mathematics on YouTube. If you like this content, support the original creator by watching, liking, and subscribing.
A linear combination in any vector space uses finitely many vectors, even when the vector space is infinite-dimensional.
Briefing
The core takeaway is that “basis,” “linear independence,” and “dimension” from standard linear algebra extend cleanly to abstract vector spaces, including infinite-dimensional ones, by defining everything in terms of finite linear combinations. That extension matters because it lets mathematicians describe polynomial spaces, function spaces, and other large objects using the same structural language as $\mathbb{R}^n$ and $\mathbb{C}^n$, even when the number of needed directions is unbounded.
The discussion starts by recalling the polynomial vector spaces $\mathcal{P}_n$: all real-valued polynomials on $\mathbb{R}$ with degree at most $n$. As $n$ increases, these spaces grow, and in the limit there is no degree bound, leading to an infinite-dimensional polynomial space. That forces a broader notion of dimension than the usual finite one.
To set up the abstract framework, the transcript defines a general linear combination in a vector space $V$: given vectors $v_1, \ldots, v_k \in V$ and scalars $\lambda_1, \ldots, \lambda_k$, a linear combination is $\lambda_1 v_1 + \cdots + \lambda_k v_k$. A key rule is emphasized: linear combinations are always finite sums. Even when the eventual vector space is infinite-dimensional, coefficients are chosen for only finitely many vectors at a time.
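As an illustrative sketch in Python with NumPy (the vectors and scalars here are arbitrary choices in $\mathbb{R}^3$, not from the transcript):

```python
import numpy as np

# Finitely many vectors v_1, ..., v_k and scalars lambda_1, ..., lambda_k.
# Even in an infinite-dimensional space, a linear combination only ever
# involves finitely many terms like these.
vectors = [np.array([1.0, 0.0, 0.0]),
           np.array([0.0, 1.0, 0.0]),
           np.array([1.0, 1.0, 1.0])]
scalars = [2.0, -1.0, 3.0]

# The linear combination lambda_1 * v_1 + ... + lambda_k * v_k
combo = sum(lam * v for lam, v in zip(scalars, vectors))
print(combo)  # [5. 2. 3.]
```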
Next comes the span. For any subset $M \subseteq V$, $\operatorname{span}(M)$ is the set of all vectors obtainable as finite linear combinations of elements of $M$. The span is always a subspace of $V$. When $M$ is empty, $\operatorname{span}(\emptyset)$ is defined as the smallest subspace, $\{0\}$, containing only the zero vector.
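For a finite set of vectors, span membership can be tested numerically; a minimal sketch (the helper `in_span` and the example vectors are illustrative, not from the transcript):

```python
import numpy as np

# w lies in span(M) for finite M = {v_1, ..., v_k} exactly when the system
# lambda_1 v_1 + ... + lambda_k v_k = w has a solution; least squares finds
# the best coefficients, and the residual tells us whether it is exact.
def in_span(w, vectors, tol=1e-10):
    A = np.column_stack(vectors)
    coeffs, *_ = np.linalg.lstsq(A, w, rcond=None)
    return bool(np.linalg.norm(A @ coeffs - w) < tol)

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
print(in_span(np.array([2.0, 3.0, 5.0]), [v1, v2]))  # True (= 2*v1 + 3*v2)
print(in_span(np.array([0.0, 0.0, 1.0]), [v1, v2]))  # False
```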
A subset $M \subseteq V$ is then called a generating set for a subspace $U$ if $\operatorname{span}(M) = U$. This captures the idea that $M$ provides exactly enough information to build every vector in $U$ using linear combinations, no more and no less.
Linear independence is introduced as a uniqueness condition on coefficients. A set $M$ is linearly independent if the only way to represent the zero vector as a linear combination of elements of $M$ is the trivial one where all coefficients are zero. The set $M$ itself may be infinite, but the combinations tested are still finite.
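For a finite family this condition reduces to a rank check, which can be sketched as follows (the helper name and example vectors are my own, for illustration):

```python
import numpy as np

# A finite family v_1, ..., v_k is linearly independent iff the only
# solution of lambda_1 v_1 + ... + lambda_k v_k = 0 is the trivial one,
# which holds exactly when the matrix with the v_i as columns has rank k.
def is_linearly_independent(vectors):
    A = np.column_stack(vectors)
    return bool(np.linalg.matrix_rank(A) == len(vectors))

print(is_linearly_independent([np.array([1.0, 0.0]),
                               np.array([0.0, 1.0])]))  # True
print(is_linearly_independent([np.array([1.0, 2.0]),
                               np.array([2.0, 4.0])]))  # False: v2 = 2*v1
```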
Combining generation and independence yields the definition of a basis: a set $B$ is a basis of $U$ if it both generates $U$ and is linearly independent. The dimension $\dim(U)$ is the number of vectors in any basis of $U$; this number is fixed even though many different bases may exist. For infinite-dimensional cases, the transcript notes that one could use cardinality to distinguish different infinities, but for this course it mainly distinguishes only finite versus infinite, writing $\dim(U) = \infty$ when it is not finite.
Concrete examples anchor the definitions. For $\mathcal{P}_0$, the basis is the constant function $1$, so $\dim(\mathcal{P}_0) = 1$. For $\mathcal{P}_n$, the monomials $1, x, x^2, \ldots, x^n$ form a basis, giving $\dim(\mathcal{P}_n) = n + 1$. The transcript also highlights that the space of all functions on $\mathbb{R}$ is infinite-dimensional. Finally, it points to a finite-dimensional matrix space: the complex-valued matrices of a fixed size (with six entries) have dimension 6, and the task is to find a basis (a linearly independent generating set) with six elements, foreshadowing coordinate methods in the next video.
Cornell Notes
The transcript extends basis, linear independence, span, and dimension from finite-dimensional spaces to general (including infinite-dimensional) vector spaces. A linear combination always uses only finitely many vectors, even when the underlying set of vectors is infinite. For any subset $M \subseteq V$, $\operatorname{span}(M)$ is the set of all finite linear combinations of elements of $M$, and it is always a subspace. A set $M$ generates a subspace $U$ if $\operatorname{span}(M) = U$, and it is linearly independent if the zero vector has only the trivial representation using vectors from $M$. A basis is both generating and linearly independent, and $\dim(U)$ is the (fixed) number of vectors in any basis, finite or infinite.
Why does the definition of linear combination insist on finite sums, and what does that mean for infinite-dimensional spaces?
How do span, generating sets, and bases fit together logically?
What exactly does linear independence mean in this abstract setting?
How is dimension defined, and why does it not depend on which basis is chosen?
How do the examples $\mathcal{P}_0$ and $\mathcal{P}_n$ illustrate the definitions?
Why is the matrix space said to have dimension 6?
Review Questions
- What is the relationship between $\operatorname{span}(M)$ and the smallest subspace containing $M$, and how does the empty set case fit in?
- Give the abstract definition of a basis for a subspace $U$, and state how it leads to the definition of $\dim(U)$.
- In what sense can a set $M$ be infinite while linear independence still checks only finite combinations?
Key Points
- 1
A linear combination in any vector space uses finitely many vectors, even when the vector space is infinite-dimensional.
- 2
For any subset $M \subseteq V$, $\operatorname{span}(M)$ is the set of all finite linear combinations of elements of $M$, and it is always a subspace.
- 3
A generating set $M$ for a subspace $U$ satisfies $\operatorname{span}(M) = U$, meaning $M$ provides exactly the vectors needed to build $U$.
- 4
Linear independence means the zero vector has only the trivial representation as a finite linear combination of vectors from $M$.
- 5
A basis for $U$ is a set that both generates $U$ and is linearly independent.
- 6
Dimension $\dim(U)$ is the fixed number of vectors in any basis of $U$; it is finite for finite-dimensional subspaces and written as $\dim(U) = \infty$ when not finite.
- 7
Examples: $\mathcal{P}_0$ with basis $\{1\}$ and $\mathcal{P}_n$ with basis $\{1, x, \ldots, x^n\}$; a fixed-size space of complex matrices has dimension 6.