Digraphs, Tournaments, and Markov Chains
Directed graphs add orientation to adjacency. A road may be one-way, a web page may link to another without receiving a link back, and a state transition may have a preferred direction. Once directions matter, connectedness splits into several notions and matrices become especially useful.
Tournaments and Markov chains are two important directed examples. A tournament orients every edge of a complete graph, modelling pairwise contests. A Markov chain assigns transition probabilities to directed edges, modelling repeated movement between states.
Definitions
A directed graph or digraph has a vertex set $V$ and a set $A$ of ordered pairs of vertices called arcs. An arc from $u$ to $v$ is written $u \to v$.
The outdegree $\deg^+(v)$ is the number of arcs leaving $v$, and the indegree $\deg^-(v)$ is the number of arcs entering $v$. A directed walk follows arc directions. A digraph is strongly connected if for every ordered pair $(u, v)$ there is a directed path from $u$ to $v$. It is weakly connected if the underlying undirected graph is connected.
A tournament is an orientation of a complete graph: for every pair of distinct vertices $u$ and $v$, exactly one of the arcs $u \to v$ or $v \to u$ is present.
A finite Markov chain has states $1, 2, \dots, n$ and transition probabilities $p_{ij}$, where $p_{ij}$ is the probability of moving from state $i$ to state $j$ in one step. Its transition matrix $P = (p_{ij})$ has nonnegative entries and each row sums to $1$.
Key results
Directed handshaking. In every finite digraph,
$$\sum_{v \in V} \deg^+(v) = \sum_{v \in V} \deg^-(v) = |A|,$$
where $A$ is the arc set. Each arc contributes once to an outdegree and once to an indegree.
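The directed handshaking identity is easy to confirm mechanically on any arc list; a minimal sketch (the example arcs below are arbitrary, chosen only for illustration):

```python
# Verify directed handshaking on a small digraph:
# sum of outdegrees = sum of indegrees = number of arcs.
arcs = [(1, 2), (2, 3), (3, 1), (1, 3)]  # arbitrary example arc list
vertices = {v for arc in arcs for v in arc}

outdeg = {v: 0 for v in vertices}
indeg = {v: 0 for v in vertices}
for u, v in arcs:
    outdeg[u] += 1  # each arc leaves exactly one vertex
    indeg[v] += 1   # and enters exactly one vertex

assert sum(outdeg.values()) == sum(indeg.values()) == len(arcs)
print(sum(outdeg.values()), len(arcs))  # prints: 4 4
```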
Tournament Hamiltonian path theorem. Every tournament has a directed Hamiltonian path.
Proof sketch: insert vertices one at a time into an existing directed path. Given a path $v_1 \to v_2 \to \dots \to v_k$, place a new vertex $x$ before the first $v_i$ such that $x \to v_i$, or at the end if no such $v_i$ exists. The tournament property guarantees the needed orientations.
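The insertion argument is constructive, so it translates directly into code. A minimal sketch, assuming the tournament is supplied as a `beats(u, v)` predicate (a hypothetical interface, not fixed by the text):

```python
def hamiltonian_path(vertices, beats):
    """Insertion construction of a directed Hamiltonian path in a tournament.

    beats(u, v) must return True exactly when the arc u -> v is present.
    """
    path = []
    for x in vertices:
        # Place x before the first vertex it beats, else at the end.
        for i, v in enumerate(path):
            if beats(x, v):
                path.insert(i, x)
                break
        else:
            path.append(x)
    return path

# Example: the transitive tournament where smaller numbers beat larger ones.
path = hamiltonian_path([3, 1, 4, 2], lambda u, v: u < v)
print(path)  # prints: [1, 2, 3, 4]
```

Each consecutive pair in the returned list is an arc of the tournament, so the list is a directed Hamiltonian path.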
Eulerian digraph criterion. A weakly connected finite digraph has a directed Eulerian circuit if and only if
$$\deg^+(v) = \deg^-(v)$$
for every vertex $v$.
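Both halves of the criterion can be tested mechanically; a sketch that checks the degree condition plus weak connectivity, ignoring vertices that touch no arc:

```python
from collections import defaultdict, deque

def has_eulerian_circuit(vertices, arcs):
    """Directed Euler criterion: indegree == outdegree at every vertex,
    and all arc-incident vertices lie in one weak component."""
    indeg = defaultdict(int)
    outdeg = defaultdict(int)
    nbrs = defaultdict(set)  # underlying undirected adjacency
    for u, v in arcs:
        outdeg[u] += 1
        indeg[v] += 1
        nbrs[u].add(v)
        nbrs[v].add(u)
    if any(indeg[v] != outdeg[v] for v in vertices):
        return False
    used = [v for v in vertices if indeg[v] or outdeg[v]]
    if not used:
        return True  # no arcs: trivially Eulerian
    # BFS on the underlying undirected graph from one arc endpoint.
    seen = {used[0]}
    queue = deque([used[0]])
    while queue:
        u = queue.popleft()
        for w in nbrs[u]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return all(v in seen for v in used)

print(has_eulerian_circuit([1, 2, 3], [(1, 2), (2, 3), (3, 1)]))  # True
print(has_eulerian_circuit([1, 2, 3], [(1, 2), (2, 3)]))          # False
```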
Markov evolution. If $\pi_t$ is a row vector giving the state distribution at time $t$, then
$$\pi_{t+1} = \pi_t P.$$
A stationary distribution $\pi$ satisfies $\pi P = \pi$.
Strong components. Strong connectedness partitions a digraph into maximal strongly connected subgraphs called strong components. If each strong component is shrunk to one vertex, the resulting condensation digraph is acyclic. This is a major structural difference from undirected components: directed reachability can have one-way movement between components, so the component graph carries order information.
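Strong components can be computed in linear time; one standard method (Kosaraju's algorithm, chosen here for brevity, not prescribed by the text) runs a DFS on the digraph and a second pass on its reverse:

```python
def strong_components(vertices, arcs):
    """Kosaraju's algorithm: DFS finish order on the digraph, then DFS on
    the reverse digraph peels off one strong component at a time."""
    fwd = {v: [] for v in vertices}
    rev = {v: [] for v in vertices}
    for u, v in arcs:
        fwd[u].append(v)
        rev[v].append(u)

    order, seen = [], set()
    def dfs1(u):
        seen.add(u)
        for w in fwd[u]:
            if w not in seen:
                dfs1(w)
        order.append(u)  # record u when its subtree is finished
    for v in vertices:
        if v not in seen:
            dfs1(v)

    comps, assigned = [], set()
    for v in reversed(order):  # latest finisher first
        if v in assigned:
            continue
        comp, stack = set(), [v]
        while stack:
            u = stack.pop()
            if u in assigned:
                continue
            assigned.add(u)
            comp.add(u)
            stack.extend(w for w in rev[u] if w not in assigned)
        comps.append(comp)
    return comps

# A 3-cycle feeding into one extra vertex: two strong components.
print(strong_components([1, 2, 3, 4], [(1, 2), (2, 3), (3, 1), (3, 4)]))
```

Shrinking each returned component to a single vertex yields the acyclic condensation described above.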
Tournaments and scores. The score of a vertex in a tournament is its outdegree. Scores summarize how many pairwise contests each player wins, but they do not determine the tournament uniquely in general. In a tournament on $n$ vertices, a vertex with score $n - 1$ beats every other vertex and is a source; a vertex with score $0$ loses to every other vertex and is a sink. Strong tournaments have directed paths both ways between every pair of vertices and, in fact, contain directed cycles.
Markov-chain interpretation of digraphs. A transition matrix defines a weighted digraph: draw when . Graph structure then controls probabilistic behavior. If the transition digraph is not strongly connected, probability may flow into a closed class and never return. If it is strongly connected and aperiodic, repeated multiplication often converges to a unique stationary distribution.
Adjacency matrices for digraphs. The adjacency matrix $A$ of a digraph has $a_{ij} = 1$ when $v_i \to v_j$ and $a_{ij} = 0$ otherwise. It is usually not symmetric. Powers still count directed walks: the $(i, j)$ entry of $A^k$ is the number of directed walks of length $k$ from $v_i$ to $v_j$. This is the directed analogue of the walk-counting theorem for undirected graphs.
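The walk-counting claim is easy to check by hand-rolled matrix multiplication; a small sketch on the directed 3-cycle $0 \to 1 \to 2 \to 0$:

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Adjacency matrix of the directed 3-cycle 0 -> 1 -> 2 -> 0.
A = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]

A2 = mat_mult(A, A)
A3 = mat_mult(A2, A)
print(A2[0][2])  # 1: the unique length-2 walk 0 -> 1 -> 2
print(A3[0][0])  # 1: the unique length-3 walk 0 -> 1 -> 2 -> 0
```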
Acyclic digraphs. A directed acyclic graph, or DAG, has no directed cycle. DAGs admit a topological ordering of their vertices, meaning every arc points forward in the order. This is the graph-theoretic model behind prerequisites, task scheduling, and dependency resolution. Tournaments are DAGs exactly when they are transitive tournaments.
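A topological ordering can be produced with Kahn's algorithm, one standard choice; the course names in the example are hypothetical:

```python
from collections import deque

def topological_order(vertices, arcs):
    """Kahn's algorithm: repeatedly remove a vertex of indegree 0.
    Returns None if the digraph contains a directed cycle."""
    indeg = {v: 0 for v in vertices}
    succ = {v: [] for v in vertices}
    for u, v in arcs:
        succ[u].append(v)
        indeg[v] += 1
    queue = deque(v for v in vertices if indeg[v] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for w in succ[u]:
            indeg[w] -= 1
            if indeg[w] == 0:
                queue.append(w)
    # If some vertex was never freed, a directed cycle blocks the order.
    return order if len(order) == len(vertices) else None

# Hypothetical prerequisites: "intro" before "graphs" and "probability",
# and both of those before "markov".
courses = ["intro", "graphs", "probability", "markov"]
prereqs = [("intro", "graphs"), ("intro", "probability"),
           ("graphs", "markov"), ("probability", "markov")]
print(topological_order(courses, prereqs))
# prints: ['intro', 'graphs', 'probability', 'markov']
```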
Stationary distributions as eigenvectors. With the row-vector convention, a stationary distribution satisfies $\pi P = \pi$, so it is a left eigenvector of $P$ with eigenvalue $1$. The entries must be nonnegative and sum to $1$, which makes the eigenvector probabilistic rather than merely algebraic.
Visual
| Concept | Undirected version | Directed version |
|---|---|---|
| Degree | a single degree $\deg(v)$ | indegree $\deg^-(v)$ and outdegree $\deg^+(v)$ |
| Connected | paths ignore direction | weak or strong connectedness |
| Euler condition | all degrees even | indegree equals outdegree |
| Matrix | symmetric adjacency for simple graph | generally nonsymmetric adjacency or transition matrix |
Worked example 1: Find a Hamiltonian path in a tournament
Problem. Consider the tournament on vertices $a, b, c, d$ with arcs
$$a \to b,\quad b \to c,\quad c \to a,\quad a \to d,\quad c \to d,\quad d \to b.$$
Find a directed Hamiltonian path.
Method.
- Start with a path containing $a$: simply $a$.
- Insert $b$. Since $a \to b$, the path is $a \to b$.
- Insert $c$. Check against $a$: $c \to a$, so place $c$ before $a$. The path becomes $c \to a \to b$.
- Insert $d$. Compare with $c$: the arc is $c \to d$, so $d$ cannot go before $c$.
- Compare with $a$: the arc is $a \to d$, so $d$ cannot go before $a$.
- Compare with $b$: the arc is $d \to b$, so place $d$ before $b$.
The path is
$$c \to a \to d \to b.$$
Check. The arcs $c \to a$, $a \to d$, and $d \to b$ are all in the tournament, and every vertex appears exactly once.
Worked example 2: Iterate a Markov chain
Problem. A weather model has states Sunny and Rainy, in that order, with transition matrix
$$P = \begin{pmatrix} 0.8 & 0.2 \\ 0.4 & 0.6 \end{pmatrix}.$$
If today is certainly sunny, find the distribution after two days.
Method.
The initial distribution is
$$\pi_0 = (1, 0).$$
After one day:
$$\pi_1 = \pi_0 P = (0.8, 0.2).$$
After two days:
$$\pi_2 = \pi_1 P = (0.8 \cdot 0.8 + 0.2 \cdot 0.4,\; 0.8 \cdot 0.2 + 0.2 \cdot 0.6) = (0.72, 0.28).$$
Checked answer. After two days the model gives probability $0.72$ of sunny and $0.28$ of rainy.
As a row-sum check, $0.72 + 0.28 = 1$. Every probability distribution produced by multiplying by a row-stochastic matrix should still sum to $1$; if it does not, either the matrix rows were not normalized or the multiplication convention was mixed up.
Code
```python
def step(dist, P):
    """One Markov step with the row-vector convention: dist -> dist P."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [
    [0.8, 0.2],
    [0.4, 0.6],
]
dist = [1.0, 0.0]  # certainly sunny on day 0
for day in range(1, 6):
    dist = step(dist, P)
    print(day, dist)
```
```python
def indegree_outdegree(vertices, arcs):
    """Tally indegree and outdegree of every vertex from an arc list."""
    indeg = {v: 0 for v in vertices}
    outdeg = {v: 0 for v in vertices}
    for u, v in arcs:
        outdeg[u] += 1
        indeg[v] += 1
    return indeg, outdeg
```
For Markov chains with many states, repeated multiplication is easy to implement but not always the fastest way to find a stationary distribution. Solving the linear system $\pi P = \pi$ together with $\sum_i \pi_i = 1$ is often more direct. Graph structure still matters because reducible chains may have multiple stationary distributions.
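The linear-system route can be sketched with plain Gaussian elimination: set up the equations saying the distribution is unchanged by one step, then replace one of them with the normalization that the entries sum to 1. A minimal, unoptimized implementation:

```python
def stationary_distribution(P):
    """Solve pi P = pi with sum(pi) = 1 by Gaussian elimination.

    Builds the homogeneous system from (P - I) transposed, replaces the
    last equation with the normalization constraint, then solves with
    partial pivoting.
    """
    n = len(P)
    # A[i][j] = coefficient of pi_j in equation i of pi (P - I) = 0.
    A = [[P[j][i] - (1.0 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    b = [0.0] * n
    A[n - 1] = [1.0] * n  # replace last equation with sum(pi) = 1
    b[n - 1] = 1.0
    # Forward elimination with partial pivoting.
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        b[col], b[pivot] = b[pivot], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution.
    pi = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = b[r] - sum(A[r][c] * pi[c] for c in range(r + 1, n))
        pi[r] = s / A[r][r]
    return pi

print(stationary_distribution([[0.8, 0.2], [0.4, 0.6]]))
# approximately [0.667, 0.333]
```

For the two-state weather chain this returns roughly $(2/3, 1/3)$, which matches the limit of the repeated-multiplication loop above.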
For tournament problems, separate existence from construction. The insertion proof gives an explicit Hamiltonian path, but it does not usually give a Hamiltonian cycle. Stronger hypotheses, such as strong connectivity in tournaments, are needed for cycle conclusions.
For Markov-chain calculations, keep the state order fixed from the matrix to every vector. Swapping Sunny and Rainy halfway through a computation produces numbers that may still sum to one but describe the wrong states.
For directed examples, write arrows in every path you claim. A sequence of adjacent vertices in the underlying undirected graph is not necessarily a directed path. For Markov chains, check row sums or column sums according to the convention before doing any long calculation. A valid stochastic matrix should preserve total probability after one step.
Common pitfalls
- Treating a directed edge $u \to v$ as if it also allows travel from $v$ to $u$.
- Confusing weak connectedness with strong connectedness.
- Forgetting that tournaments have exactly one directed arc between every pair of distinct vertices.
- Using undirected Euler degree conditions on a digraph. Directed Euler circuits require indegree equal to outdegree.
- Multiplying Markov distributions on the wrong side. Row-vector notation uses $\pi_{t+1} = \pi_t P$; column-vector notation uses $x_{t+1} = P^{\top} x_t$ or a column-stochastic convention.
- Accepting a transition matrix whose rows do not sum to $1$.