Bases, Dimension, and Rank
A basis is a coordinate system for a vector space. It is small enough to avoid redundancy and large enough to reach every vector. Rank and nullity measure how a matrix distributes dimensions between outputs it can produce and inputs it sends to zero.
These ideas connect computation with structure. Row reduction finds pivots, pivots identify independent columns, independent columns form bases for column spaces, and free variables describe null spaces. Dimension turns these observations into exact counts rather than vague descriptions.
Definitions
A basis for a vector space $V$ is a set of vectors that spans $V$ and is linearly independent. If $\mathcal{B} = \{\mathbf{b}_1, \dots, \mathbf{b}_n\}$ is a basis and

$$\mathbf{v} = c_1 \mathbf{b}_1 + c_2 \mathbf{b}_2 + \cdots + c_n \mathbf{b}_n,$$

then the coordinate vector of $\mathbf{v}$ relative to $\mathcal{B}$ is

$$[\mathbf{v}]_{\mathcal{B}} = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{pmatrix}.$$
The dimension of a finite-dimensional vector space is the number of vectors in any basis.
For a matrix $A$, the column space $\operatorname{Col} A$ is the span of its columns, the row space $\operatorname{Row} A$ is the span of its rows, and the null space is

$$\operatorname{Nul} A = \{\mathbf{x} : A\mathbf{x} = \mathbf{0}\}.$$
The rank of $A$ is the dimension of its column space. The nullity of $A$ is the dimension of its null space. Pivot columns are the columns of the original matrix corresponding to leading columns in an echelon form.
Key results
All bases of a finite-dimensional vector space have the same number of vectors. This makes dimension well-defined. The proof idea is the exchange principle: an independent set cannot contain more vectors than a spanning set, because each independent vector consumes one necessary direction.
For an $m \times n$ matrix $A$,

$$\operatorname{rank} A + \operatorname{nullity} A = n.$$

This is the rank-nullity theorem for matrices. Row reduction proves it: each of the $n$ variables is either a pivot variable or a free variable. Pivot variables count rank; free variables count nullity.
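The count can be checked directly with SymPy, the library used in the Code section below. Here is a sketch with the same $3 \times 4$ example matrix:

```python
import sympy as sp

# A 3x4 matrix, so n = 4 variables
A = sp.Matrix([[1, 2, 1, 3],
               [2, 4, 0, 4],
               [1, 2, 3, 5]])

n = A.cols                  # total number of variables
r = A.rank()                # number of pivot variables
k = len(A.nullspace())      # number of free variables (null-space basis size)

print(r, k, n)              # 2 2 4
assert r + k == n           # rank-nullity theorem
```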
The pivot columns of the original matrix form a basis for the column space. The nonzero rows of an echelon form form a basis for the row space. Row operations replace rows by linear combinations of rows, so they preserve the row space exactly, but they do not preserve the original column vectors, so column-space bases must be selected from the original columns.
If $A$ is an $m \times n$ matrix, then

$$\operatorname{rank} A \le \min(m, n).$$

The rank is also the dimension of the row space. This equality of row rank and column rank is one of the fundamental structural results of matrix theory.
Visual
| Object | How to find it | Dimension |
|---|---|---|
| Column space | pivot columns of original | rank |
| Row space | nonzero rows of echelon form | rank |
| Null space | solve $A\mathbf{x} = \mathbf{0}$ | nullity |
| Basis for $V$ | independent spanning set | $\dim V$ |
Worked example 1: Find bases and rank
Problem: find bases for the column space, row space, and null space of

$$A = \begin{pmatrix} 1 & 2 & 1 & 3 \\ 2 & 4 & 0 & 4 \\ 1 & 2 & 3 & 5 \end{pmatrix}.$$

Step 1: row reduce. Subtract $2R_1$ from $R_2$ and $R_1$ from $R_3$:

$$\begin{pmatrix} 1 & 2 & 1 & 3 \\ 0 & 0 & -2 & -2 \\ 0 & 0 & 2 & 2 \end{pmatrix}$$

Step 2: add $R_2$ to $R_3$ and scale $R_2$ by $-\tfrac{1}{2}$:

$$\begin{pmatrix} 1 & 2 & 1 & 3 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix}$$

Step 3: eliminate above the second pivot ($R_1 \to R_1 - R_2$):

$$\begin{pmatrix} 1 & 2 & 0 & 2 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix}$$

Pivot columns are columns $1$ and $3$. Therefore a basis for the column space is

$$\left\{ \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 0 \\ 3 \end{pmatrix} \right\}.$$

A basis for the row space is given by the nonzero rows:

$$\{(1, 2, 0, 2),\ (0, 0, 1, 1)\}.$$

The rank is $2$.
Worked example 2: Find a null-space basis and verify rank-nullity
Continue with the same matrix. The reduced equations are

$$x_1 + 2x_2 + 2x_4 = 0, \qquad x_3 + x_4 = 0.$$

Step 1: identify free variables. Columns $2$ and $4$ are non-pivot columns, so set

$$x_2 = s, \qquad x_4 = t.$$

Step 2: solve for pivot variables.

$$x_1 = -2s - 2t, \qquad x_3 = -t.$$

Step 3: write the general null vector.

$$\mathbf{x} = s \begin{pmatrix} -2 \\ 1 \\ 0 \\ 0 \end{pmatrix} + t \begin{pmatrix} -2 \\ 0 \\ -1 \\ 1 \end{pmatrix}.$$

Thus a basis for the null space is

$$\left\{ \begin{pmatrix} -2 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \begin{pmatrix} -2 \\ 0 \\ -1 \\ 1 \end{pmatrix} \right\}.$$

Step 4: verify rank-nullity. The matrix has $n = 4$ columns, rank $2$, and nullity $2$, so

$$2 + 2 = 4.$$

Checked answer: the dimension count is consistent.
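The null-space basis can also be double-checked mechanically: each candidate vector must be mapped to zero, and the pair must be independent. A SymPy sketch using the matrix from the Code section:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 1, 3],
               [2, 4, 0, 4],
               [1, 2, 3, 5]])

# Candidate null-space basis vectors from the worked example
v1 = sp.Matrix([-2, 1, 0, 0])
v2 = sp.Matrix([-2, 0, -1, 1])

# Each vector must be sent to the zero vector...
assert A * v1 == sp.zeros(3, 1)
assert A * v2 == sp.zeros(3, 1)

# ...and the pair must be independent: stack as columns and check rank
assert sp.Matrix.hstack(v1, v2).rank() == 2
print("null-space basis verified")
```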
Code
```python
import sympy as sp

A = sp.Matrix([[1, 2, 1, 3],
               [2, 4, 0, 4],
               [1, 2, 3, 5]])

print(A.rref())
print("rank:", A.rank())
print("columnspace:", A.columnspace())
print("rowspace:", A.rowspace())
print("nullspace:", A.nullspace())
```
SymPy returns exact rational row reduction and exact bases. This is useful for studying rank and nullity because floating-point row reduction can obscure exact dependence.
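To see the contrast, the same matrix can be run through NumPy's floating-point rank computation (assuming NumPy is available). For this well-separated example both agree, but the floating-point result depends on a tolerance for deciding which singular values count as zero:

```python
import numpy as np
import sympy as sp

rows = [[1, 2, 1, 3],
        [2, 4, 0, 4],
        [1, 2, 3, 5]]

exact_rank = sp.Matrix(rows).rank()                        # exact rational arithmetic
float_rank = np.linalg.matrix_rank(np.array(rows, float))  # tolerance-based SVD

print(exact_rank, float_rank)   # 2 2
```

For matrices with nearly dependent columns, the floating-point answer can change with the tolerance while the exact answer cannot.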
Common pitfalls
- Taking pivot columns from the row-reduced matrix as a column-space basis. Use the pivot column positions, but take the columns from the original matrix.
- Confusing rank with the number of rows. Rank counts independent directions, not raw matrix size.
- Forgetting that nullity counts free variables in $A\mathbf{x} = \mathbf{0}$.
- Calling a spanning set a basis before checking independence.
- Assuming the coordinate vector $[\mathbf{v}]_{\mathcal{B}}$ is the same as $\mathbf{v}$. It is the list of coefficients relative to a chosen basis.
- Losing parameters when writing a null-space basis.
A useful basis test has two directions. If a list has exactly $\dim V$ vectors, then it is enough to prove either spanning or independence; the other property follows. If the number of vectors is not known to match the dimension, both properties must be checked. In $\mathbb{R}^3$, for example, three independent vectors automatically form a basis, and three spanning vectors are automatically independent. Two vectors cannot span $\mathbb{R}^3$, and four vectors cannot be independent in $\mathbb{R}^3$.
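The count-matching shortcut reduces to a single rank check. A sketch in SymPy, using a hypothetical triple of vectors in $\mathbb{R}^3$:

```python
import sympy as sp

# Three candidate vectors in R^3 (hypothetical example)
vectors = [sp.Matrix([1, 0, 1]),
           sp.Matrix([0, 1, 1]),
           sp.Matrix([1, 1, 0])]

# Stack as columns; since the count matches dim R^3 = 3, rank 3
# settles both independence and spanning at once
M = sp.Matrix.hstack(*vectors)
is_basis = (M.rank() == 3)
print(is_basis)   # True
```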
Rank should be read as the number of independent output directions a matrix can produce. Nullity is the number of independent input directions that are lost. Rank-nullity says every input degree of freedom either affects the output or disappears into the null space. This interpretation is often more memorable than the formula alone.
When finding a column-space basis, remember that row operations alter the columns. The reduced matrix is excellent for identifying pivot positions, but the actual basis vectors should be selected from the original matrix. In contrast, row operations replace rows by linear combinations of rows, so the nonzero rows of an echelon form can be used as a row-space basis.
Coordinates relative to a basis are not universal. The same vector can have different coordinate columns in different bases. If $\mathcal{B}$ is not the standard basis, the coordinate vector $[\mathbf{v}]_{\mathcal{B}}$ tells how to build $\mathbf{v}$ from the basis vectors in $\mathcal{B}$. This is the foundation for change-of-basis matrices and diagonalization.
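A minimal sketch of computing coordinates relative to a non-standard basis, with a hypothetical basis of $\mathbb{R}^2$: the coordinates solve $P_{\mathcal{B}} \mathbf{c} = \mathbf{v}$, where $P_{\mathcal{B}}$ has the basis vectors as columns.

```python
import sympy as sp

# Hypothetical basis B for R^2 and a target vector v
b1 = sp.Matrix([1, 1])
b2 = sp.Matrix([1, -1])
v = sp.Matrix([3, 1])

# Columns of P are the basis vectors; solve P * coords = v
P = sp.Matrix.hstack(b1, b2)
coords = P.solve(v)

print(coords)            # the coordinate column of v relative to B
assert P * coords == v   # reconstruction: coordinates rebuild v
```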
When a matrix is used as a linear transformation, rank is the dimension of the range. This gives a geometric interpretation to column space: it is the set of all outputs the transformation can produce. Nullity is the dimension of the kernel: it counts independent directions in the input that are sent to zero. The rank-nullity theorem says the domain dimension splits into visible output directions and lost directions.
Basis selection is not unique. A vector space has many possible bases, and a subspace often has many possible basis lists. What is invariant is the number of vectors in any basis. This is why dimension is meaningful even though the basis itself can be chosen for convenience. In computation, one basis may be sparse, another orthogonal, and another adapted to eigenvectors.
A final check on basis answers is reconstruction. If you claim a list spans a space, choose a typical vector in that space and show how it is expressed as a linear combination. If you claim independence, set a general linear combination equal to zero and show every coefficient must vanish. These two tests are the definition, and they remain reliable even when the space consists of polynomials, matrices, or functions.
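The independence direction of this test can be carried out symbolically. A sketch with hypothetical vectors: set a general combination equal to zero and confirm that only the trivial solution exists.

```python
import sympy as sp

c1, c2, c3 = sp.symbols('c1 c2 c3')
v1 = sp.Matrix([1, 0, 1])
v2 = sp.Matrix([0, 1, 1])
v3 = sp.Matrix([1, 1, 0])

# A general linear combination set equal to the zero vector
combo = c1*v1 + c2*v2 + c3*v3
solution = sp.solve(list(combo), [c1, c2, c3])

print(solution)   # only the trivial solution: every coefficient vanishes
```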
In applications, rank often measures effective complexity. A data matrix with low rank has columns or rows that are strongly redundant. A transformation with low rank compresses many inputs into a smaller output set. A system whose coefficient matrix has deficient rank may have either no solution or infinitely many solutions, depending on the augmented vector.
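A tiny illustration of rank as effective complexity, with a matrix chosen for illustration: every column is a multiple of one pattern, so the whole matrix carries only one direction of information.

```python
import sympy as sp

# A "data matrix" whose columns are all multiples of (1, 2, 3)
D = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [3, 6, 9]])

print(D.rank())   # 1: three columns, but only one independent direction
```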
Dimension arguments are also useful for impossibility claims. Four vectors in $\mathbb{R}^3$ cannot be independent. Two vectors cannot span $\mathbb{R}^3$. A linear map from $\mathbb{R}^n$ to $\mathbb{R}^m$ with $n > m$ cannot be one-to-one. These conclusions follow from dimension before any detailed arithmetic is done.
When stuck, count dimensions first. Dimension counts do not solve every problem, but they often tell you what kind of answer is possible.
Then use row reduction or a spanning argument to find the actual vectors that realize that count.