
Probability Theory 23 | Stochastic Processes [dark version]

5 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

A stochastic process is a time-indexed family of random variables (X_t)_{t∈T} defined on a shared sample space Ω.

Briefing

Stochastic processes are framed as a clean way to model randomness that evolves over time: they are essentially random variables arranged across time points, all sharing the same underlying sample space. That simple structure matters because many real-world systems don’t just produce one random outcome—they change step by step, such as a growing bacteria population or a board game where a piece moves after each die roll. In this view, time can be discrete (indexed by natural numbers) or continuous (indexed by real numbers), depending on what the model needs.

A concrete coin-toss game makes the idea tangible. Consider repeatedly tossing a coin until two successive heads occur. The game can end early, later, or potentially never, so the random variable must be defined on the space of all possible infinite sequences of heads and tails. The sample space Ω is therefore the set of all infinite sequences whose entries are heads or tails.

The random variable Xₙ is then defined with three outcomes—0, 1, and 2—capturing the game’s progress after n tosses. Outcome 2 means the process has already achieved two consecutive heads within the first n tosses; once that happens, the game is effectively “absorbed,” and the value stays at 2 forever. Outcome 0 means no pair of successive heads has occurred yet and the nth toss is tails. Outcome 1 is the middle state: no consecutive heads have occurred yet, but the nth toss is heads—so the next toss could complete the target.
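The three-state encoding above can be sketched as a small function that reads a finite prefix of tosses and reports the state. This is an illustrative sketch, not code from the original video; the `'H'`/`'T'` string encoding is an assumption made here for concreteness.

```python
def state_after(tosses):
    """Return X_n for a finite prefix of tosses, encoded as 'H'/'T' characters.

    2: two consecutive heads have occurred within the prefix (absorbing),
    1: no 'HH' yet and the last toss is heads,
    0: no 'HH' yet and the last toss is tails (or no tosses at all).
    """
    state = 0
    for t in tosses:
        if state == 2:
            continue  # absorbed: once at 2, the value stays at 2 forever
        if t == 'H':
            state = 2 if state == 1 else 1
        else:
            state = 0
    return state

print(state_after("THTH"))  # 1: last toss heads, no HH yet
print(state_after("THHT"))  # 2: HH at tosses 2-3, absorbed
print(state_after("TTTT"))  # 0: no heads at all
```

Note that `state_after` only needs the current state and the next toss at each step, which is exactly why three states suffice to track the whole history.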

As n increases, the process “jumps” between these states with specific probabilities. From state 0, the next toss is heads with probability 1/2, moving the process to state 1; it is tails with probability 1/2, leaving the process in state 0. From state 1, a heads outcome with probability 1/2 advances to state 2 (two heads in a row), while a tails outcome with probability 1/2 resets back to state 0. From state 2, the probability of leaving is zero: the process remains at 2 with probability 1.
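These jump probabilities can be written as a transition matrix and simulated. The sketch below (not from the original video) estimates the average number of tosses until state 2 is reached; for a fair coin the expected waiting time for two heads in a row is known to be 6, so the Monte Carlo estimate should land close to that.

```python
import random

# P[i][j] = P(X_{n+1} = j | X_n = i) for a fair coin
P = [
    [0.5, 0.5, 0.0],  # from 0: tails stays at 0, heads moves to 1
    [0.5, 0.0, 0.5],  # from 1: tails resets to 0, heads moves to 2
    [0.0, 0.0, 1.0],  # from 2: absorbing, stays at 2 with probability 1
]

def first_hh_time(rng, max_steps=10_000):
    """Number of tosses until state 2 (two heads in a row) is first reached."""
    state, n = 0, 0
    while state != 2 and n < max_steps:
        state = rng.choices([0, 1, 2], weights=P[state])[0]
        n += 1
    return n

rng = random.Random(0)
times = [first_hh_time(rng) for _ in range(100_000)]
print(sum(times) / len(times))  # ≈ 6, the known expected waiting time for HH
```

Each row of `P` sums to 1, reflecting that the process must jump somewhere at every step.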

Before returning to the coin example, a formal definition is given. A stochastic process is built by choosing a time index set T (often T = ℕ or ℤ for discrete time, or T = ℝ for continuous time), fixing a sample space Ω, and then defining a random variable X_t for each time point t in T. Collecting these random variables into a single object—often written as (X_t)_{t∈T}—produces a “path” through state space as randomness unfolds over time. The coin game then becomes a simple illustration of how such time-indexed randomness can be represented as state transitions with clear probabilities, setting up the next step for deeper study in subsequent material.
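The formal definition can be mirrored directly in code: fixing one outcome ω (here, a freshly sampled toss sequence) produces one path (X_1, …, X_n) through the state space. A minimal sketch under the same fair-coin assumptions:

```python
import random

def sample_path(n_steps, rng):
    """Sample one toss sequence and return the path (X_1, ..., X_n) of states."""
    state, path = 0, []
    for _ in range(n_steps):
        toss = rng.choice("HT")
        if state != 2:  # state 2 is absorbing
            state = (2 if state == 1 else 1) if toss == "H" else 0
        path.append(state)
    return path

rng = random.Random(1)
print(sample_path(10, rng))  # one realization of the time-indexed family
```

Running this several times gives different paths, one per sampled ω, which is exactly what the notation (X_t)_{t∈T} packages into a single object.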

Cornell Notes

A stochastic process is a family of random variables indexed by time: for each time point t in a set T, there is a random variable X_t defined on the same sample space Ω. This lets randomness evolve step-by-step (discrete time like T = ℕ or ℤ) or continuously (T = ℝ). The coin-toss “two heads in a row” game models this by defining X_n after n tosses, using a sample space of all infinite head/tail sequences. The process has three states: 0 (no consecutive heads yet and nth toss is tails), 1 (no consecutive heads yet but nth toss is heads), and 2 (two consecutive heads already occurred). Transition probabilities: from state 0, heads (probability 1/2) moves to 1 and tails keeps it at 0; from state 1, heads (probability 1/2) moves to 2 and tails resets to 0; state 2 is absorbing forever.

What makes a stochastic process different from a single random variable?

A stochastic process is a collection of random variables ordered by time. Instead of one outcome, it assigns a value X_t to every time point t in an index set T (discrete or continuous). All X_t share the same sample space Ω, so the randomness is consistent across time, forming a time-evolving “path” through values.

How is the sample space Ω constructed for the “two successive heads” coin game?

Because the game depends on an entire sequence of coin tosses and may continue indefinitely, Ω is the set of all infinite sequences of heads and tails. Each element of Ω is a possible full history of the coin tosses, not just the first few outcomes.

Why does the coin game’s random variable X_n have three possible values (0, 1, 2)?

Those values encode progress after n tosses. X_n = 2 means two consecutive heads have already occurred within the first n tosses. X_n = 0 means no such pair has occurred yet and the nth toss is tails. X_n = 1 is the middle situation: no consecutive heads yet, but the nth toss is heads—so the next toss could complete the pair.

What are the transition probabilities between states 0, 1, and 2?

From state 0: the next toss is heads with probability 1/2, moving to state 1; tails with probability 1/2, staying at 0. From state 1: heads with probability 1/2 moves to state 2; tails with probability 1/2 resets to state 0. From state 2: the process stays at 2 with probability 1 (absorbing state).

How does the formal definition of a stochastic process relate to the coin example?

The formal definition picks a time index set T and defines a random variable X_t for each t on the same Ω. In the coin example, T corresponds to discrete time steps n, Ω is the space of infinite head/tail sequences, and X_n is the state (0, 1, or 2) describing the game’s status after n tosses.

Review Questions

  1. In the coin game, what does X_n = 1 guarantee about the first n tosses?
  2. Why is state 2 an absorbing state, and what probability governs transitions out of it?
  3. How does choosing T = ℕ (or ℤ) versus T = ℝ change the interpretation of a stochastic process?

Key Points

  1. A stochastic process is a time-indexed family of random variables (X_t)_{t∈T} defined on a shared sample space Ω.

  2. Time can be modeled as discrete (e.g., T = ℕ or ℤ) or continuous (e.g., T = ℝ), depending on the system.

  3. For the “two successive heads” game, Ω consists of all infinite head/tail sequences because the game can continue indefinitely.

  4. The random variable X_n is defined using three states: 0 (no consecutive heads yet, nth toss tails), 1 (no consecutive heads yet, nth toss heads), and 2 (consecutive heads already occurred).

  5. State 2 is absorbing: once two consecutive heads occur, X_n remains 2 for all larger n with probability 1.

  6. From state 0, the process moves to state 1 with probability 1/2 and stays at 0 with probability 1/2.

  7. From state 1, the process moves to state 2 with probability 1/2 and returns to state 0 with probability 1/2.

Highlights

Stochastic processes turn randomness into a time-evolving object by assigning a random variable value to every time point t in T.
The coin game’s state space has only three values (0, 1, 2), yet it captures all the history needed to know whether two consecutive heads have occurred.
The transition structure is probabilistic and Markov-like: the next state depends only on the current state (0, 1, or 2).
Reaching state 2 ends the “progress” aspect of the game—after that, the process is locked in with probability 1.
