
# Big-O Notation

Big-O notation describes the asymptotic upper bound of a function's growth rate. It's how we compare algorithm efficiency without caring about constants or hardware.

## Formal definition

We say $f(n) = O(g(n))$ if there exist positive constants $c$ and $n_0$ such that:

$$0 \le f(n) \le c \cdot g(n) \quad \text{for all } n \ge n_0$$

In plain English: $f$ grows at most as fast as $g$, up to a constant factor, for sufficiently large $n$.
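As a quick sanity check, here is a minimal sketch verifying the definition for $f(n) = 3n + 5$ and $g(n) = n$. The witness constants $c = 4$ and $n_0 = 5$ are illustrative choices, not part of the definition itself:

```python
# Numerically check f(n) <= c * g(n) for n >= n0, with the
# illustrative witnesses c = 4, n0 = 5 for f(n) = 3n + 5 = O(n).
def f(n):
    return 3 * n + 5

c, n0 = 4, 5
# 3n + 5 <= 4n  is equivalent to  5 <= n, so it holds for all n >= n0.
assert all(f(n) <= c * n for n in range(n0, 10_000))
```

Any larger $c$ (or larger $n_0$) also works; the definition only asks that *some* witness pair exists.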

## Common complexity classes

| Notation | Name | Example |
| --- | --- | --- |
| $O(1)$ | Constant | Array index access |
| $O(\log n)$ | Logarithmic | Binary search |
| $O(n)$ | Linear | Linear search |
| $O(n \log n)$ | Linearithmic | Merge sort, heap sort |
| $O(n^2)$ | Quadratic | Bubble sort, naive matrix multiplication |
| $O(n^3)$ | Cubic | Standard matrix multiplication |
| $O(2^n)$ | Exponential | Brute-force subset enumeration |
| $O(n!)$ | Factorial | Naive traveling salesman |
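To make the table concrete, here is a rough step-count sketch at $n = 1000$ (constant factors ignored, as Big-O intends):

```python
import math

n = 1_000
rows = [
    ("O(1)", 1),
    ("O(log n)", math.log2(n)),        # ~10
    ("O(n)", n),                       # 1,000
    ("O(n log n)", n * math.log2(n)),  # ~10,000
    ("O(n^2)", n ** 2),                # 1,000,000
]
for name, steps in rows:
    print(f"{name:>12}: ~{steps:,.0f} steps")
```

The gap between adjacent rows widens as $n$ grows, which is why the class, not the constant, dominates at scale.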
For example, linear search scans the elements one by one:

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if absent."""
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1
```

The loop runs at most $n$ times, so this is $O(n)$.
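An instrumented variant (the `linear_search_counted` name and comparison counter are illustrative, not from the original) confirms that the worst case, a missing target, performs exactly $n$ comparisons:

```python
# Count element comparisons to observe the O(n) worst case directly.
def linear_search_counted(arr, target):
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

idx, steps = linear_search_counted(list(range(100)), -1)  # miss: worst case
# idx == -1 and steps == 100 == len(arr)
```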

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Each iteration halves the search space, so it takes $O(\log n)$ time.
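A counting sketch (the `binary_search_counted` name and iteration counter are illustrative additions) shows the iteration count staying within $\lfloor \log_2 n \rfloor + 1$ even for a miss, which does maximal work:

```python
import math

# Same algorithm, but count loop iterations to observe the O(log n) bound.
def binary_search_counted(arr, target):
    lo, hi = 0, len(arr) - 1
    iterations = 0
    while lo <= hi:
        iterations += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, iterations
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, iterations

n = 1_000_000
_, iters = binary_search_counted(list(range(n)), -1)  # miss: maximal work
assert iters <= math.floor(math.log2(n)) + 1          # at most 20 for this n
```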

> **Warning:** Big-O gives an asymptotic *upper* bound, Big-Ω (omega) a *lower* bound, and Big-Θ (theta) a *tight* bound. Which case you analyze (worst, best, or average) is a separate question from which bound you state; don't conflate the two.

- Vectors: linear algebra cost analysis often uses these bounds.