General Vector Spaces
The abstraction from $\mathbb{R}^n$ to vector spaces is not a change of subject. It is a way to recognize the same algebraic behavior in polynomials, functions, matrices, sequences, and solution sets. Once the vector space axioms hold, ideas such as span, independence, basis, dimension, and linear transformation work in the same way.
This abstraction is useful because many objects that do not look like arrows still add and scale like arrows. Polynomials can be added and multiplied by constants. Continuous functions can be added and scaled. Matrices of the same size can be added and scaled. Linear algebra keeps the operations and lets the objects vary.
Definitions
A real vector space is a set $V$ with vector addition and scalar multiplication satisfying the standard closure, associativity, commutativity, identity, inverse, and distributive axioms. The elements of $V$ are called vectors even when they are functions or matrices.
A subspace of $V$ is a nonempty subset $W$ that is itself a vector space under the operations inherited from $V$. It is enough to check:
- $\mathbf{0} \in W$.
- If $u, v \in W$, then $u + v \in W$.
- If $u \in W$ and $c \in \mathbb{R}$, then $c u \in W$.
The span of $\{v_1, \dots, v_k\}$ is
$$\operatorname{span}\{v_1, \dots, v_k\} = \{\, c_1 v_1 + \cdots + c_k v_k : c_1, \dots, c_k \in \mathbb{R} \,\}.$$
A set $\{v_1, \dots, v_k\}$ is linearly independent if the equation
$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = \mathbf{0}$$
forces every $c_i = 0$. If there is a nontrivial solution, the set is linearly dependent.
A basis is a linearly independent spanning set. In a finite-dimensional vector space, the dimension is the number of vectors in any basis.
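Independence can be tested computationally in a coordinate space. A minimal sketch using sympy (consistent with the Code section below): stack the candidate vectors as columns of a matrix; they are independent exactly when the rank equals the number of vectors. The vectors here are illustrative.

```python
import sympy as sp

# Three candidate vectors in R^3, stacked as matrix columns.
v1, v2, v3 = [1, 0, 1], [0, 1, 1], [1, 1, 2]
M = sp.Matrix([v1, v2, v3]).T

# Independent exactly when rank equals the number of vectors.
print(M.rank())            # 2, since v3 = v1 + v2
print(M.rank() == M.cols)  # False: the set is dependent
```

Dropping the redundant vector $v_3$ leaves an independent set, illustrating how a spanning set is reduced to a basis.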
Important examples include $\mathbb{R}^n$, the polynomial space $P_n$ of polynomials of degree at most $n$, the matrix space $M_{m \times n}$, and spaces of functions such as $C[a, b]$.
Key results
The span of any nonempty set of vectors is a subspace. Closure follows because a sum of linear combinations is another linear combination, and a scalar multiple of a linear combination is another linear combination.
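This closure argument can be replayed symbolically. A sketch in sympy with two illustrative vectors and generic coefficients: the sum of two linear combinations equals a single combination whose coefficients are the sums $a_i + b_i$.

```python
import sympy as sp

a1, a2, b1, b2 = sp.symbols("a1 a2 b1 b2")
v1, v2 = sp.Matrix([1, 2]), sp.Matrix([0, 1])

# Two generic linear combinations of v1 and v2 ...
u = a1 * v1 + a2 * v2
w = b1 * v1 + b2 * v2

# ... whose sum is the combination with coefficients a_i + b_i.
print(sp.expand(u + w - ((a1 + b1) * v1 + (a2 + b2) * v2)))  # zero vector
```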
The solution set of a homogeneous linear system $A x = \mathbf{0}$ is a subspace of $\mathbb{R}^n$. If $A u = \mathbf{0}$ and $A v = \mathbf{0}$, then
$$A(u + v) = A u + A v = \mathbf{0}$$
and
$$A(c u) = c\, A u = \mathbf{0}.$$
In contrast, the solution set of a nonhomogeneous system $A x = b$ with $b \neq \mathbf{0}$ is usually not a subspace because it generally does not contain $\mathbf{0}$.
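This closure property can be checked for a concrete system. A sketch with an illustrative matrix $A$: sympy's `nullspace` returns a basis for the solution set of $A x = \mathbf{0}$, and sums and scalar multiples of solutions are again mapped to zero.

```python
import sympy as sp

# A homogeneous system A x = 0; its solution set is the null space of A.
A = sp.Matrix([[1, 2, -1], [2, 4, -2]])
u, v = A.nullspace()  # rank(A) = 1, so the null space has dimension 2

# Closure: sums and scalar multiples of solutions are again solutions.
print(A * (u + v))  # the zero vector
print(A * (3 * u))  # the zero vector
```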
Every finite spanning set can be reduced to a basis by removing redundant vectors. Every finite independent set can be extended to a basis by adding vectors until it spans. These facts are the structural reason dimension is well-defined.
A subset described by a linear condition is often a subspace when the condition is homogeneous. For example, the set of polynomials with $p(0) = 0$ is a subspace because $(p + q)(0) = p(0) + q(0) = 0$ and $(c p)(0) = c\, p(0) = 0$. The set of polynomials with $p(0) = 1$ is not a subspace because it does not contain the zero polynomial and is not closed under scalar multiplication.
Visual
| Candidate set | Is it a vector space or subspace? | Reason |
|---|---|---|
| $\mathbb{R}^n$ | yes | standard addition and scalar multiplication |
| $P_n$ | yes | degree remains at most $n$ after addition and scaling |
| Solutions of $A x = \mathbf{0}$ | yes | homogeneous equations are closed |
| Solutions of $A x = b$ with $b \neq \mathbf{0}$ | usually no | zero vector usually absent |
| Polynomials with $p(0) = 1$ | no | not closed under scalar multiplication |
Worked example 1: Test a subset of polynomials
Problem: let
$$W = \{\, p \in P_3 : p(1) = 0 \,\}.$$
Show that $W$ is a subspace of $P_3$.
Step 1: check the zero vector. The zero polynomial satisfies
$$0(1) = 0,$$
so $0 \in W$.
Step 2: check closure under addition. Suppose $p, q \in W$. Then $p(1) = 0$ and $q(1) = 0$. Therefore
$$(p + q)(1) = p(1) + q(1) = 0 + 0 = 0.$$
So $p + q \in W$.
Step 3: check closure under scalar multiplication. If $p \in W$ and $c \in \mathbb{R}$, then
$$(c p)(1) = c\, p(1) = c \cdot 0 = 0.$$
So $c p \in W$.
Checked answer: $W$ is a subspace of $P_3$.
We can also describe it. Since $p(1) = 0$, the factor theorem gives
$$p(x) = (x - 1)\, q(x)$$
for some polynomial $q$ of degree at most $2$. Thus $W$ is spanned by
$$\{\, x - 1,\; x(x - 1),\; x^2(x - 1) \,\}.$$
Worked example 2: Show a set is not a subspace
Problem: decide whether
$$S = \{\, (x, y) \in \mathbb{R}^2 : x + y = 1 \,\}$$
is a subspace of $\mathbb{R}^2$.
Step 1: test the zero vector. The zero vector $(0, 0)$ has $0 + 0 = 0$, not $1$. Therefore $(0, 0) \notin S$.
Step 2: since every subspace must contain the zero vector, $S$ is not a subspace.
Step 3: confirm by scalar multiplication. The vector
$$u = (1, 0)$$
belongs to $S$ because $1 + 0 = 1$. But
$$2u = (2, 0)$$
does not belong to $S$ because $2 + 0 = 2 \neq 1$.
Checked answer: $S$ is not a subspace. Geometrically, it is a line that does not pass through the origin.
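The failed checks can be replayed numerically. A minimal sketch, taking $x + y = 1$ as an illustrative line that misses the origin:

```python
# Membership test for the line x + y = 1 (an illustrative choice).
in_S = lambda p: p[0] + p[1] == 1

print(in_S((0, 0)))  # False: the zero vector is absent
print(in_S((1, 0)))  # True: (1, 0) lies on the line
print(in_S((2, 0)))  # False: 2 * (1, 0) leaves the line
```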
Code
```python
import sympy as sp

x = sp.symbols("x")

# Spanning set for W = {p in P3 : p(1) = 0}: each vector vanishes at x = 1.
basis = [x - 1, x * (x - 1), x**2 * (x - 1)]
for p in basis:
    print(sp.expand(p), "p(1) =", sp.expand(p).subs(x, 1))

# A candidate polynomial: factoring exposes the (x - 1) factor.
candidate = 2*x**3 - 3*x**2 + x
print(sp.factor(candidate))
print(candidate.subs(x, 1))
```
The snippet uses symbolic polynomials to check the condition $p(1) = 0$ and to factor a candidate polynomial. It illustrates that vector-space ideas apply to algebraic objects beyond columns of numbers.
Common pitfalls
- Checking only that a subset is nonempty. A subspace must also be closed under addition and scalar multiplication.
- Forgetting that the zero vector depends on the space: zero polynomial, zero matrix, zero function, or zero column vector.
- Treating any line in as a subspace. Only lines through the origin are subspaces.
- Assuming a set described by an equation is a subspace. The equation must interact correctly with addition and scaling, and nonhomogeneous constants usually break closure.
- Confusing span with independence. A set can span and still be dependent if it contains redundancy.
- Believing vectors must be arrows. In a vector space, "vector" means an object that obeys the vector operations.
When testing a subspace, the zero-vector check is the fastest first filter. If the zero vector is missing, the set is not a subspace. If the zero vector is present, closure still has to be checked. Many students stop too early after finding zero, but zero is necessary rather than sufficient. Closure under linear combinations is the compact test:
$$u, v \in W \implies a u + b v \in W$$
for all scalars $a, b$.
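The compact test can be turned into a randomized counterexample search. This is a heuristic sketch only: finding a violation proves a set is not a subspace, while finding none proves nothing. The membership predicates and sample points below are illustrative.

```python
import random

def violates_closure(in_set, samples, trials=100):
    """Search for a counterexample to closure under linear combinations.

    Returns a violating vector a*u + b*v, or None if no violation is found.
    """
    for _ in range(trials):
        u, v = random.choice(samples), random.choice(samples)
        a, b = random.randint(-3, 3), random.randint(-3, 3)
        w = tuple(a * ui + b * vi for ui, vi in zip(u, v))
        if not in_set(w):
            return w
    return None

# The homogeneous plane x + y + z = 0 passes; the shifted plane
# x + y + z = 1 is quickly refuted.
plane0 = [(1, -1, 0), (0, 1, -1), (2, 0, -2)]
print(violates_closure(lambda p: sum(p) == 0, plane0))  # None
print(violates_closure(lambda p: sum(p) == 1, [(1, 0, 0), (0, 1, 0)]))
```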
Homogeneous conditions tend to define subspaces because scaling preserves zero. Conditions such as $p(1) = 0$, $\operatorname{tr}(A) = 0$, or $f(0) = 0$ are compatible with addition and scalar multiplication when the underlying operation is linear. Nonhomogeneous conditions such as $p(1) = 1$ or $A x = b$ with $b \neq \mathbf{0}$ usually define shifted sets, not subspaces.
The abstraction also clarifies why proofs can be reused. Once a theorem is proved from the vector space axioms, it applies to polynomials, matrices, functions, and coordinate vectors without separate proofs. This is the payoff for learning the axioms: they identify the exact assumptions needed for linear arguments to work.
For finite-dimensional spaces, basis and dimension prevent ambiguity. A polynomial in $P_n$ can be represented by its coefficient vector relative to $\{1, x, \dots, x^n\}$, but it is still a polynomial. Coordinates are representations of vectors, not the vectors themselves. This distinction becomes important when changing bases or choosing inner products.
Examples and nonexamples are equally important. The set of all polynomials is a vector space, and the set $P_n$ is a finite-dimensional subspace. The set of polynomials of exactly degree $n$ is not a subspace because adding two degree-$n$ polynomials can cancel leading terms and produce lower degree, and the zero polynomial does not have degree $n$. This kind of closure failure is common.
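The cancellation can be shown directly. A short sympy sketch with two illustrative degree-2 polynomials whose sum has degree 1:

```python
import sympy as sp

x = sp.symbols("x")
p = x**2 + x   # degree 2
q = -x**2 + 3  # degree 2
s = sp.expand(p + q)

print(s)                # x + 3: the leading terms cancel
print(sp.degree(s, x))  # 1, so "exactly degree 2" is not closed under addition
```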
Subspaces can be combined. The intersection of two subspaces is always a subspace because any vector satisfying both sets of closure-compatible conditions remains inside both after addition and scaling. The union of two subspaces is usually not a subspace unless one subspace contains the other. This distinction is a useful test of whether a proposed construction respects linear structure.
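The union failure has a minimal concrete instance, sketched here with the two coordinate axes of $\mathbb{R}^2$:

```python
# Two subspaces of R^2: the x-axis and the y-axis.
on_x_axis = lambda p: p[1] == 0
on_y_axis = lambda p: p[0] == 0

u, v = (1, 0), (0, 1)           # u on the x-axis, v on the y-axis
w = (u[0] + v[0], u[1] + v[1])  # u + v = (1, 1)

# The union contains u and v but not their sum, so it is not a subspace.
print(on_x_axis(w) or on_y_axis(w))  # False
# The intersection is {0}, which is a subspace.
print(on_x_axis((0, 0)) and on_y_axis((0, 0)))  # True
```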
The language of vector spaces prepares for linear maps. Kernels, ranges, eigenspaces, and solution sets of homogeneous systems are all subspaces. Once you recognize a set as a subspace, the next natural questions are: what spans it, what basis is convenient, and what is its dimension?
A vector space is not defined by what its elements look like; it is defined by what operations are allowed and what laws they satisfy. This is why the same theorem can apply to arrows, polynomials, matrices, and functions. The abstraction removes irrelevant surface features and keeps the additive and scalable structure.
When learning a new vector space, identify its zero vector, its addition rule, and its scalar multiplication rule first. Then ask which familiar linear algebra concepts make sense there: span, independence, basis, dimension, linear maps, and inner products if one is supplied.
Those checks keep the abstraction tied to concrete operations.