Probability Theory 23 | Stochastic Processes [dark version]
Based on the YouTube video by The Bright Side of Mathematics. If you like this content, support the original creators by watching, liking, and subscribing to their content.
A stochastic process is a time-indexed family of random variables (X_t)_{t∈T} defined on a shared sample space Ω.
Briefing
Stochastic processes are framed as a clean way to model randomness that evolves over time: they are essentially random variables arranged across time points, all sharing the same underlying sample space. That simple structure matters because many real-world systems don’t just produce one random outcome—they change step by step, such as a growing bacteria population or a board game where a piece moves after each die roll. In this view, time can be discrete (indexed by natural numbers) or continuous (indexed by real numbers), depending on what the model needs.
A concrete coin-toss game makes the idea tangible. Consider repeatedly tossing a coin until two successive heads occur. The game can end quickly, take many tosses, or in principle never end, so the random variables must be defined on a space that accommodates every possibility. The sample space Ω is therefore the set of all infinite sequences whose entries are heads or tails.
The random variable Xₙ is then defined with three outcomes—0, 1, and 2—capturing the game’s progress after n tosses. Outcome 2 means the process has already achieved two consecutive heads within the first n tosses; once that happens, the game is effectively “absorbed,” and the value stays at 2 forever. Outcome 0 means no pair of successive heads has occurred yet and the nth toss is tails. Outcome 1 is the middle state: no consecutive heads have occurred yet, but the nth toss is heads—so the next toss could complete the target.
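The three-state definition of Xₙ can be made concrete with a small Python sketch. The helper name `state_after` is illustrative, not from the source; it reads a finite prefix of tosses as a string of 'H'/'T' characters and returns the corresponding state:

```python
def state_after(tosses):
    """Return the value of X_n for a finite prefix of tosses ('H'/'T')."""
    # First check whether two successive heads appear anywhere in the prefix.
    for i in range(1, len(tosses)):
        if tosses[i - 1] == "H" and tosses[i] == "H":
            return 2  # absorbed: two heads in a row already occurred
    if tosses and tosses[-1] == "H":
        return 1  # no HH yet, but the nth toss was heads
    return 0      # no HH yet and the nth toss was tails (or no tosses yet)
```

For example, `state_after("HTH")` is 1 (the last toss is heads, so the next toss could finish the game), while `state_after("THH")` is 2.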
As n increases, the process “jumps” between these states with specific probabilities. From state 0, the next toss is heads with probability 1/2, moving the process to state 1; it is tails with probability 1/2, leaving the process in state 0. From state 1, a heads outcome with probability 1/2 advances to state 2 (two heads in a row), while a tails outcome with probability 1/2 resets back to state 0. From state 2, the probability of leaving is zero: the process remains at 2 with probability 1.
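The transition probabilities above can be collected into a 3×3 matrix and used to push a distribution over states forward one step at a time, a minimal sketch assuming the fair-coin probabilities from the text:

```python
# Transition matrix: P[i][j] = probability of moving from state i to state j.
P = [
    [0.5, 0.5, 0.0],  # from state 0: tails stays at 0, heads moves to 1
    [0.5, 0.0, 0.5],  # from state 1: tails resets to 0, heads advances to 2
    [0.0, 0.0, 1.0],  # from state 2: absorbing, stays at 2 with probability 1
]

def step(dist):
    """One step of the chain: new distribution over the three states."""
    return [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

dist = [1.0, 0.0, 0.0]  # before any tosses, the process is in state 0
for _ in range(3):
    dist = step(dist)
# dist is now the law of X_3: [0.375, 0.25, 0.375]
```

The value `dist[2] = 0.375` matches direct counting: of the 8 equally likely length-3 sequences, exactly HHH, HHT, and THH contain two successive heads.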
Before returning to the coin example, a formal definition is given. A stochastic process is built by choosing a time index set T (often T = ℕ or ℤ for discrete time, or T = ℝ for continuous time), fixing a sample space Ω, and then defining a random variable X_t for each time point t in T. Collecting these random variables into a single object—often written as (X_t)_{t∈T}—produces a “path” through state space as randomness unfolds over time. The coin game then becomes a simple illustration of how such time-indexed randomness can be represented as state transitions with clear probabilities, setting up the next step for deeper study in subsequent material.
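To connect the formal definition back to the coin game, one can compute the exact law of X_n by brute force: enumerate every length-n head/tail prefix and replay the state rules from the text. The function name `exact_law` is a hypothetical choice for this sketch:

```python
from itertools import product

def exact_law(n):
    """Exact distribution of X_n by enumerating all 2**n toss prefixes."""
    counts = [0, 0, 0]
    for seq in product("HT", repeat=n):
        state = 0
        for toss in seq:
            if state == 2:
                break  # state 2 is absorbing; later tosses don't matter
            if toss == "H":
                state = 2 if state == 1 else 1  # heads: advance toward HH
            else:
                state = 0                        # tails: reset
        counts[state] += 1
    return [c / 2**n for c in counts]
```

This enumeration only works for discrete time and small n, but it makes the "path through state space" picture explicit; `exact_law(3)` returns `[0.375, 0.25, 0.375]`.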
Cornell Notes
A stochastic process is a family of random variables indexed by time: for each time point t in a set T, there is a random variable X_t defined on the same sample space Ω. This lets randomness evolve step-by-step (discrete time like T = ℕ or ℤ) or continuously (T = ℝ). The coin-toss “two heads in a row” game models this by defining X_n after n tosses, using a sample space of all infinite head/tail sequences. The process has three states: 0 (no consecutive heads yet and nth toss is tails), 1 (no consecutive heads yet but nth toss is heads), and 2 (two consecutive heads already occurred). Transition probabilities: from state 0, the process stays at 0 or moves to 1, each with probability 1/2; from state 1, it returns to 0 or advances to 2, each with probability 1/2; state 2 is absorbing, so the process remains there forever.
What makes a stochastic process different from a single random variable?
How is the sample space Ω constructed for the “two successive heads” coin game?
Why does the coin game’s random variable X_n have three possible values (0, 1, 2)?
What are the transition probabilities between states 0, 1, and 2?
How does the formal definition of a stochastic process relate to the coin example?
Review Questions
- In the coin game, what does X_n = 1 guarantee about the first n tosses?
- Why is state 2 an absorbing state, and what probability governs transitions out of it?
- How does choosing T = ℕ (or ℤ) versus T = ℝ change the interpretation of a stochastic process?
Key Points
1. A stochastic process is a time-indexed family of random variables (X_t)_{t∈T} defined on a shared sample space Ω.
2. Time can be modeled as discrete (e.g., T = ℕ or ℤ) or continuous (e.g., T = ℝ), depending on the system.
3. For the “two successive heads” game, Ω consists of all infinite head/tail sequences because the game can continue indefinitely.
4. The random variable X_n is defined using three states: 0 (no consecutive heads yet, nth toss tails), 1 (no consecutive heads yet, nth toss heads), and 2 (consecutive heads already occurred).
5. State 2 is absorbing: once two consecutive heads occur, X_n remains 2 for all larger n with probability 1.
6. From state 0, the process moves to state 1 with probability 1/2 and stays at 0 with probability 1/2.
7. From state 1, the process moves to state 2 with probability 1/2 and returns to state 0 with probability 1/2.
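The key points above can be illustrated by simulating a single path of the process. This is a sketch, assuming a fair coin and the state rules described earlier; the function name `sample_path` is chosen here for illustration:

```python
import random

def sample_path(n, rng):
    """Simulate one path (X_1, ..., X_n) of the two-heads coin game."""
    path, state = [], 0
    for _ in range(n):
        if state != 2:  # state 2 is absorbing: no more transitions
            heads = rng.random() < 0.5  # fair coin toss
            if heads:
                state = 2 if state == 1 else 1
            else:
                state = 0
        path.append(state)
    return path

rng = random.Random(0)  # fixed seed for reproducibility
path = sample_path(20, rng)
```

Printing `path` shows the defining behavior of an absorbing state: the values bounce between 0 and 1 until two heads land in a row, and from that index onward every entry is 2.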