Orthogonality in $\mathbb{R}^n$
Orthogonality turns vector algebra into geometry. Length, angle, distance, projection, and perpendicularity can all be expressed with the dot product. This is why systems, approximation, and decompositions become geometric: the best answer is often the one whose error is perpendicular to the space of allowable answers.
The dot product is the bridge between coordinates and measurement. Without it, vectors can be added and scaled; with it, one can ask how long a vector is, whether two directions are perpendicular, and how much of one vector lies in another direction. Projection is the key operation that later becomes least squares and QR factorization.
Definitions
The dot product of $u, v \in \mathbb{R}^n$ is
$$u \cdot v = u_1 v_1 + u_2 v_2 + \cdots + u_n v_n.$$
The Euclidean norm is
$$\lVert u \rVert = \sqrt{u \cdot u},$$
and the distance between points $u$ and $v$ is
$$d(u, v) = \lVert u - v \rVert.$$
Nonzero vectors $u$ and $v$ form angle $\theta$ satisfying
$$\cos\theta = \frac{u \cdot v}{\lVert u \rVert\,\lVert v \rVert}.$$
Vectors are orthogonal if $u \cdot v = 0$. The projection of $u$ onto a nonzero vector $a$ is
$$\operatorname{proj}_a(u) = \frac{u \cdot a}{a \cdot a}\,a.$$
An orthogonal set is a set whose distinct vectors are mutually orthogonal. An orthonormal set is orthogonal and each vector has norm $1$.
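These definitions can be sketched directly in numpy; the vector values below are illustrative, not taken from the text:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, -2.0, 1.0])

dot = u @ v                        # dot product: sum of u_i * v_i
norm_u = np.sqrt(u @ u)            # Euclidean norm ||u||
dist = np.linalg.norm(u - v)       # distance ||u - v||
cos_theta = dot / (np.linalg.norm(u) * np.linalg.norm(v))  # angle formula
proj = (u @ v) / (v @ v) * v       # projection of u onto v (zero here, since u ⟂ v)

print(dot, norm_u, dist, cos_theta)
```

For this particular pair the dot product is $0$, so the cosine is $0$ and the projection collapses to the zero vector.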
Key results
The Cauchy-Schwarz inequality states
$$|u \cdot v| \le \lVert u \rVert\,\lVert v \rVert.$$
This guarantees that the cosine formula is valid because the quotient $\frac{u \cdot v}{\lVert u \rVert \lVert v \rVert}$ lies between $-1$ and $1$.
The triangle inequality states
$$\lVert u + v \rVert \le \lVert u \rVert + \lVert v \rVert.$$
The Pythagorean theorem generalizes: if $u \cdot v = 0$, then
$$\lVert u + v \rVert^2 = \lVert u \rVert^2 + \lVert v \rVert^2.$$
Projection decomposes a vector into parallel and orthogonal parts:
$$u = \operatorname{proj}_a(u) + \bigl(u - \operatorname{proj}_a(u)\bigr),$$
and the second part is orthogonal to $a$. To verify, compute
$$\bigl(u - \operatorname{proj}_a(u)\bigr) \cdot a = u \cdot a - \frac{u \cdot a}{a \cdot a}\,(a \cdot a) = 0.$$
If nonzero vectors are mutually orthogonal, then they are linearly independent. A linear combination equal to zero can be dotted with each vector in turn, forcing each coefficient to be zero.
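The independence claim can be checked numerically: for mutually orthogonal nonzero vectors, the Gram matrix of pairwise dot products is diagonal, and the matrix with those vectors as columns has full rank. The vectors below are illustrative:

```python
import numpy as np

# Three mutually orthogonal nonzero vectors in R^3.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])
V = np.column_stack([v1, v2, v3])

gram = V.T @ V                     # off-diagonal entries are the pairwise dot products
print(gram)                        # diagonal matrix: all cross dot products vanish
print(np.linalg.matrix_rank(V))    # full rank: the vectors are linearly independent
```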
Orthogonality is also the cleanest language for decomposing information. If a vector is written as a sum of perpendicular pieces, then the squared length of the whole vector is the sum of the squared lengths of the pieces. This is why projection formulas are so useful: they separate "the part explained by a direction" from "the part left over." In a data setting, the explained part might be the component along a trend vector, while the orthogonal error is the part not captured by that trend. In geometry, the same decomposition gives the shortest distance from a point to a line or plane.
For a line $L = \operatorname{span}\{a\}$ through the origin, the projection is the closest point of $L$ to $u$. To see this, write any vector on the line as $t a$. The squared distance from $u$ to $t a$ is
$$\lVert u - t a \rVert^2.$$
Expanding with dot products gives
$$\lVert u - t a \rVert^2 = u \cdot u - 2t\,(u \cdot a) + t^2\,(a \cdot a).$$
This quadratic in $t$ is minimized when
$$t = \frac{u \cdot a}{a \cdot a},$$
which is exactly the projection coefficient. The algebraic minimizer and the geometric perpendicular-error condition are the same fact in two languages.
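The minimizer can be confirmed by comparing the closed-form coefficient with a brute-force sweep over $t$; the vectors here are illustrative:

```python
import numpy as np

u = np.array([3.0, 4.0])
a = np.array([1.0, 2.0])

t_star = (u @ a) / (a @ a)                       # closed-form projection coefficient
ts = np.linspace(-5, 5, 100001)                  # candidate values of t
dists = np.sum((u[None, :] - ts[:, None] * a[None, :]) ** 2, axis=1)  # ||u - t a||^2
t_best = ts[int(np.argmin(dists))]               # numerically best t

print(t_star, t_best)                            # both near 11/5 = 2.2
```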
The dot product also encodes orientation information. A positive dot product means the angle is acute and the vectors point partly in the same direction. A negative dot product means the angle is obtuse and the vectors point partly against each other. A zero dot product means neither vector has a component in the direction of the other. This interpretation is often faster than computing the actual angle.
Orthogonal matrices preserve this entire geometry. If $Q^\top Q = I$, then
$$(Qu) \cdot (Qv) = (Qu)^\top (Qv) = u^\top Q^\top Q\, v = u \cdot v, \qquad \lVert Qu \rVert = \lVert u \rVert.$$
Therefore orthogonal transformations preserve lengths, distances, angles, and projections. Rotations and reflections are the standard examples. This fact becomes essential in QR factorization, the spectral theorem, and the singular value decomposition, where orthogonal changes of coordinates are preferred because they do not distort measurement.
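A rotation matrix gives a quick numerical check of these invariances (angle and vectors illustrative):

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation: an orthogonal matrix

u = np.array([3.0, 4.0])
v = np.array([1.0, 2.0])

print(np.allclose(Q.T @ Q, np.eye(2)))                        # Q^T Q = I
print(np.allclose(np.linalg.norm(Q @ u), np.linalg.norm(u)))  # lengths preserved
print(np.allclose((Q @ u) @ (Q @ v), u @ v))                  # dot products preserved
```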
Visual
ASCII projection picture:

```
              u
              *
             /|
            / |
           /  |  orthogonal error
          /   |
---------*----*----------------> span{a}
         0    proj_a(u)
```
| Quantity | Formula | Meaning |
|---|---|---|
| Dot product | $u \cdot v = \sum_i u_i v_i$ | signed alignment |
| Norm | $\lVert u \rVert = \sqrt{u \cdot u}$ | length |
| Distance | $\lVert u - v \rVert$ | length of displacement |
| Projection onto $a$ | $\frac{u \cdot a}{a \cdot a}\,a$ | closest vector on the line through $a$ |
| Orthogonality | $u \cdot v = 0$ | perpendicular directions |
Worked example 1: Angle, norm, and orthogonality
Problem: given nonzero vectors $u$ and $v$,
compute the dot product, norms, and angle.
Step 1: compute the dot product $u \cdot v$.
Step 2: compute the norms
$\lVert u \rVert = \sqrt{u \cdot u}$ and $\lVert v \rVert = \sqrt{v \cdot v}$.
Step 3: compute the angle from
$$\cos\theta = \frac{u \cdot v}{\lVert u \rVert\,\lVert v \rVert}.$$
Here the dot product is $0$, so $\cos\theta = 0$ and thus $\theta = \pi/2$. Checked answer: the vectors are orthogonal.
Worked example 2: Projection onto a vector
Problem: project
$$u = \begin{pmatrix} 3 \\ 4 \end{pmatrix}$$
onto
$$a = \begin{pmatrix} 1 \\ 2 \end{pmatrix}.$$
Step 1: compute the dot products.
$$u \cdot a = 3 \cdot 1 + 4 \cdot 2 = 11, \qquad a \cdot a = 1^2 + 2^2 = 5.$$
Step 2: apply the projection formula.
$$\operatorname{proj}_a(u) = \frac{11}{5}\begin{pmatrix} 1 \\ 2 \end{pmatrix} = \begin{pmatrix} 11/5 \\ 22/5 \end{pmatrix}.$$
Step 3: compute the error vector.
$$u - \operatorname{proj}_a(u) = \begin{pmatrix} 3 - 11/5 \\ 4 - 22/5 \end{pmatrix} = \begin{pmatrix} 4/5 \\ -2/5 \end{pmatrix}.$$
Step 4: check orthogonality.
$$\begin{pmatrix} 4/5 \\ -2/5 \end{pmatrix} \cdot \begin{pmatrix} 1 \\ 2 \end{pmatrix} = \frac{4}{5} - \frac{4}{5} = 0.$$
Checked answer: the projection is $(11/5, 22/5)$, and the error $(4/5, -2/5)$ is perpendicular to $a$.
Code
```python
import numpy as np

u = np.array([3, 4], dtype=float)
a = np.array([1, 2], dtype=float)

proj = (u @ a) / (a @ a) * a     # projection of u onto a
error = u - proj                 # orthogonal error

print(proj)                      # [2.2 4.4]
print(error)                     # approximately [0.8 -0.4]
print(error @ a)                 # zero up to rounding
print(np.linalg.norm(u) ** 2,
      np.linalg.norm(proj) ** 2 + np.linalg.norm(error) ** 2)
```
The final line checks the Pythagorean decomposition $\lVert u \rVert^2 = \lVert \operatorname{proj}_a(u) \rVert^2 + \lVert u - \operatorname{proj}_a(u) \rVert^2$.
Common pitfalls
- Forgetting that projection onto $a$ requires $a \ne 0$.
- Using $\lVert a \rVert$ instead of $a \cdot a = \lVert a \rVert^2$ in the projection denominator.
- Assuming a zero dot product means one vector must be zero. Nonzero perpendicular vectors have zero dot product.
- Mixing degrees and radians when interpreting angles.
- Treating orthogonal and orthonormal as the same. Orthonormal also requires unit length.
- Believing projection preserves the original vector's length. Projection usually shortens it.
A dependable projection check is to compute the error and dot it with the target direction or subspace basis. If the projection is correct, the error should be orthogonal to every direction in the subspace. This check is often easier and more meaningful than recomputing the projection formula. It also prepares for least squares, where the residual must be orthogonal to every column of the design matrix.
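A sketch of that residual check for least squares, using numpy's solver; the design matrix and target below are illustrative:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                  # design matrix: columns span the subspace
b = np.array([1.0, 2.0, 2.0])

x, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares coefficients
residual = b - A @ x

print(A.T @ residual)                       # near zero: residual orthogonal to every column
```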
When working with angles, avoid computing inverse cosine unless the angle itself is required. The dot product already tells much of the story. Positive means acute, negative means obtuse, and zero means right angle. The magnitude of the dot product relative to the product of norms tells how strongly aligned the vectors are. This interpretation is common in data analysis, where cosine similarity compares directions rather than lengths.
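The sign interpretation and cosine similarity can be sketched in a few lines; the helper function and vectors are illustrative:

```python
import numpy as np

def cosine_similarity(u, v):
    """Normalized dot product: compares directions, ignoring lengths."""
    return (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

u = np.array([3.0, 4.0])
print(np.sign(u @ np.array([1.0, 1.0])))            # positive: acute angle
print(np.sign(u @ np.array([-1.0, -1.0])))          # negative: obtuse angle
print(cosine_similarity(u, np.array([6.0, 8.0])))   # 1.0: same direction, different length
```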
Orthogonality is not the same as independence, but nonzero orthogonal vectors are automatically independent. The converse is false: independent vectors need not be perpendicular. This distinction matters in basis work. Any basis can express vectors, but an orthonormal basis makes coordinates especially easy because coordinates are dot products.
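In an orthonormal basis, "coordinates are dot products" can be verified directly; the basis below is a rotated standard basis of $\mathbb{R}^2$, chosen for illustration:

```python
import numpy as np

# Orthonormal basis of R^2 (rotated standard basis).
q1 = np.array([1.0, 1.0]) / np.sqrt(2)
q2 = np.array([-1.0, 1.0]) / np.sqrt(2)

v = np.array([3.0, 4.0])
c1, c2 = v @ q1, v @ q2                     # coordinates are just dot products
print(np.allclose(c1 * q1 + c2 * q2, v))    # True: v is rebuilt from its coordinates
```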
In numerical work, exact zero dot products are rare when data are measured or computed with floating-point arithmetic. Instead of testing whether $u \cdot v = 0$ exactly, one usually checks whether its absolute value is small relative to the norms involved. The mathematical definition is exact; the computational test must account for rounding and noise.
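One possible relative-tolerance test, with an illustrative tolerance and perturbed vectors:

```python
import numpy as np

def is_orthogonal(u, v, rtol=1e-10):
    """Orthogonality up to rounding: |u.v| small relative to ||u|| ||v||."""
    return abs(u @ v) <= rtol * np.linalg.norm(u) * np.linalg.norm(v)

u = np.array([1.0, 1.0, 1e-8])
v = np.array([1.0, -1.0, 1e-8])    # orthogonal up to tiny numerical noise

print(u @ v == 0.0)                # typically False: the dot product is ~1e-16, not 0
print(is_orthogonal(u, v))         # True under a relative tolerance
```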
Orthogonal complements give another organizing idea. If $W$ is a subspace of $\mathbb{R}^n$, then
$$W^\perp = \{\, v \in \mathbb{R}^n : v \cdot w = 0 \text{ for all } w \in W \,\}$$
is the set of all vectors perpendicular to $W$. Projection decomposes a vector into a part in $W$ and a part in $W^\perp$. This is the geometric form of many algebraic decompositions.
For a matrix $A$, the null space of $A^\top$ is orthogonal to the column space of $A$. This is exactly what least squares uses: the residual lies in the orthogonal complement of the column space. A fact that begins as perpendicular arrows in $\mathbb{R}^n$ becomes the core condition for regression and approximation.
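This decomposition can be sketched with the standard projection matrix $P = A(A^\top A)^{-1}A^\top$ onto the column space; the matrix and vector here are illustrative:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
P = A @ np.linalg.solve(A.T @ A, A.T)      # projection onto col(A)

v = np.array([1.0, 2.0, 6.0])
in_col = P @ v                             # part in the column space
leftover = v - in_col                      # part in the orthogonal complement

print(np.allclose(A.T @ leftover, 0.0))    # True: leftover lies in null(A^T)
print(np.isclose(in_col @ leftover, 0.0))  # True: the two parts are perpendicular
```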
When studying orthogonality, draw both the vector and its projection whenever possible. The picture reinforces which vector is being projected, which direction or subspace receives the projection, and which error is perpendicular. Most formula errors come from mixing up those roles.
The same dot product can represent physical work, statistical similarity, or geometric alignment depending on context. In physics, force dotted with displacement measures work. In data analysis, normalized dot products compare direction patterns. In geometry, dot products measure angles and projections. The formula is the same, but the interpretation comes from the modeled quantities.
Orthogonality also gives a way to simplify calculations. If vectors are orthogonal, many cross terms vanish when norms are squared. This is why orthogonal bases, QR factorization, and spectral decompositions are so valuable: they turn coupled calculations into sums of independent contributions.
This simplification is the recurring payoff of perpendicular coordinates.
Whenever a calculation becomes messy, look for an orthogonal basis or projection that separates the components.