
Linear Algebra 17 | Properties of the Matrix Product [dark version]

4 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Each entry of AB is computed by summing products of a row of A with a column of B: (AB)_{ij}=Σ_{ℓ} a_{iℓ}b_{ℓj}.

Briefing

Matrix multiplication is built from a simple rule: each entry of the product comes from an inner product between a row of the first matrix and a column of the second. If A is an m×n matrix and B is an n×p matrix, then AB is m×p, and the (i,j) entry is computed by summing over the “middle” index: (AB)_{ij} = Σ_{ℓ=1..n} a_{iℓ} b_{ℓj}. This “sum over the middle” is the mechanism behind every algebraic property that follows, because it reduces matrix identities to familiar arithmetic with real numbers.
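The summation formula can be sketched directly in plain Python. The helper `matmul` below is a hypothetical illustration (not from the video), written as a literal triple sum so each piece of the formula is visible:

```python
def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B, entry by entry."""
    m, n, p = len(A), len(B), len(B[0])
    assert all(len(row) == n for row in A), "columns of A must match rows of B"
    # (AB)_{ij} = sum over the middle index l of a_{il} * b_{lj}
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]           # 3 x 2

print(matmul(A, B))      # → [[58, 64], [139, 154]], a 2 x 2 matrix
```

Note how the result's shape is m×p: the outer loops range over rows of A and columns of B, while the inner sum runs over the shared middle dimension n.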

From that definition, several calculation rules carry over from ordinary arithmetic, notably distributivity, scalar compatibility, and associativity. For distributivity, matrix multiplication distributes over addition when the addition happens on the same side: (A + B)C = AC + BC and A(B + C) = AB + AC, with the order of factors preserved. Scalar multiplication behaves consistently too. For any real scalar λ, multiplying a matrix product by λ matches multiplying either factor by λ first: λ(AB) = (λA)B = A(λB). Associativity also holds: (AB)C = A(BC), provided the matrix sizes match so each product is defined.
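These three laws can be spot-checked numerically. The sketch below (hypothetical helpers `matmul`, `madd`, and `smul`, not from the video) verifies each identity for one concrete choice of 2×2 matrices; a single example is not a proof, but it illustrates what each law asserts:

```python
def matmul(A, B):
    """Entry-wise matrix product via the summation formula."""
    n = len(B)
    return [[sum(A[i][l] * B[l][j] for l in range(n))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    """Entry-wise matrix addition."""
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def smul(lam, A):
    """Multiply every entry of A by the scalar lam."""
    return [[lam * a for a in row] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, -1], [0, 3]]
lam = 5

# Distributivity: addition stays on the same side of the product.
assert matmul(madd(A, B), C) == madd(matmul(A, C), matmul(B, C))
assert matmul(A, madd(B, C)) == madd(matmul(A, B), matmul(A, C))
# Scalar compatibility: the scalar can be absorbed into either factor.
assert smul(lam, matmul(A, B)) == matmul(smul(lam, A), B) == matmul(A, smul(lam, B))
# Associativity: parentheses move, factor order does not.
assert matmul(matmul(A, B), C) == matmul(A, matmul(B, C))
print("all identities hold for this example")
```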

The key reason these laws work is that the product’s definition turns each matrix entry into sums and products of real numbers. That lets standard real-number identities—like distributive rearrangements and regrouping of sums—transfer directly to the matrix setting. A component-wise proof of associativity tracks a fixed (i,j) entry on both sides, expands both products using the summation formula, and then rearranges sums and factors until the same expression appears for A(BC) and (AB)C.
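The component-wise proof of associativity described above can be written out as one chain of equalities. For A of size m×n, B of size n×p, and C of size p×q, fix an entry (i, j) and expand both sides; swapping the order of the two finite sums is the only step needed:

```latex
\begin{aligned}
\bigl((AB)C\bigr)_{ij}
  &= \sum_{k=1}^{p} (AB)_{ik}\, c_{kj}
   = \sum_{k=1}^{p} \Bigl(\sum_{\ell=1}^{n} a_{i\ell}\, b_{\ell k}\Bigr) c_{kj} \\
  &= \sum_{\ell=1}^{n} a_{i\ell} \Bigl(\sum_{k=1}^{p} b_{\ell k}\, c_{kj}\Bigr)
   = \sum_{\ell=1}^{n} a_{i\ell}\, (BC)_{\ell j}
   = \bigl(A(BC)\bigr)_{ij}
\end{aligned}
```

Since the (i, j) entry was arbitrary, every entry of (AB)C equals the corresponding entry of A(BC), which is exactly the associative law.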

One major arithmetic property does *not* carry over: commutativity. In general, AB ≠ BA because swapping the order changes which rows and columns get paired in the inner-product computation. A concrete 2×2 example makes the point: multiplying the matrix [[0,1],[-1,0]] by [[1,1],[1,1]] yields a different result than reversing the order. Associativity remains intact—parentheses can move without changing the outcome—but the order of the matrices themselves cannot be exchanged in general. That distinction—associative but not commutative—is the practical takeaway for doing matrix calculations correctly.
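The 2×2 counterexample from the text can be checked directly. The sketch below reuses a small hypothetical `matmul` helper to compute both orders of the product:

```python
def matmul(A, B):
    """Entry-wise matrix product via the summation formula."""
    n = len(B)
    return [[sum(A[i][l] * B[l][j] for l in range(n))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[0, 1], [-1, 0]]   # the rotation-like matrix from the text
B = [[1, 1], [1, 1]]    # the all-ones matrix

print(matmul(A, B))     # → [[1, 1], [-1, -1]]
print(matmul(B, A))     # → [[-1, 1], [-1, 1]]
```

Both products are defined and both are 2×2, yet the results differ, so AB ≠ BA here even though sizes pose no obstacle.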

Cornell Notes

Matrix multiplication is defined entry-by-entry using a row–column inner product. If A is m×n and B is n×p, then AB is m×p and (AB)_{ij} = Σ_{ℓ=1..n} a_{iℓ} b_{ℓj}, summing over the shared “middle” dimension. Using this summation formula, distributive laws hold (e.g., (A+B)C = AC+BC and A(B+C)=AB+AC), scalar multiplication is compatible (λ(AB)=(λA)B=A(λB)), and multiplication is associative: (AB)C = A(BC) when sizes match. What fails in general is commutativity: AB usually does not equal BA, since swapping the order changes the row/column pairing that determines each entry.

How exactly is the (i,j) entry of a matrix product AB computed?

For A (m×n) and B (n×p), the product AB is m×p. The entry in row i and column j is (AB)_{ij} = Σ_{ℓ=1..n} a_{iℓ} b_{ℓj}. Here, ℓ indexes the shared dimension: you take row i of A and column j of B, multiply corresponding entries, and sum.
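A single entry can be computed by hand following this recipe. The snippet below (an illustrative sketch, with 0-based indices as in Python) picks out one entry of the product without forming the whole matrix:

```python
A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]           # 3 x 2

i, j = 1, 0  # row 2, column 1 of AB, in 0-based indexing
# Dot row i of A with column j of B over the shared middle dimension.
entry = sum(A[i][l] * B[l][j] for l in range(len(B)))
print(entry)  # 4*7 + 5*9 + 6*11 = 139
```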

Why do distributive laws work for matrix multiplication, and what must stay in the same order?

Distributivity holds because each entry of a product is a sum of real-number products. For example, (A+B)C = AC + BC keeps C on the right in every term, so the same row/column pairing occurs. Similarly, A(B+C) = AB + AC keeps A on the left. Because the order of factors matters, applying distributivity never moves a factor from one side of the product to the other.

How does scalar multiplication interact with matrix products?

For any real scalar λ, λ(AB) equals (λA)B and also equals A(λB). In other words, scaling either factor by λ before multiplying gives the same result as scaling the entire product afterward. This follows directly from the summation definition: λ multiplies each term a_{iℓ} b_{ℓj} in the sum.

What does associativity mean for matrices, and when is it valid?

Associativity means (AB)C = A(BC), as long as the matrix sizes make both products defined. Parentheses can change, but the order of matrices stays A then B then C. A component-wise proof fixes an (i,j) entry, expands both sides using Σ_{ℓ}, and uses real-number algebra to show the same expression results.

Why is matrix multiplication not commutative?

Commutativity would require AB = BA for all matrices, but the definition pairs rows of the first matrix with columns of the second. Swapping the order changes which row/column combinations are used, so the (i,j) entries generally differ. A 2×2 example with [[0,1],[-1,0]] and a matrix of ones shows AB ≠ BA even though both products are defined.

Review Questions

  1. Given A is m×n and B is n×p, what are the dimensions of AB, and what is the summation index in (AB)_{ij}?
  2. State the distributive laws for matrix multiplication and indicate which side the addition occurs on.
  3. Provide a reason matrix multiplication is not commutative, even though it is associative.

Key Points

  1. Each entry of AB is computed by summing products of a row of A with a column of B: (AB)_{ij}=Σ_{ℓ} a_{iℓ}b_{ℓj}.

  2. Matrix multiplication is defined only when the number of columns of the left matrix matches the number of rows of the right matrix.

  3. Distributive laws hold, but factor order must be preserved: (A+B)C=AC+BC and A(B+C)=AB+AC.

  4. Scalar multiplication is compatible with matrix products: λ(AB)=(λA)B=A(λB).

  5. Matrix multiplication is associative when sizes match: (AB)C=A(BC).

  6. Matrix multiplication is not commutative in general: AB generally differs from BA.

Highlights

The (i,j) entry of AB comes from a “middle-index” sum: (AB)_{ij} = Σ_{ℓ=1..n} a_{iℓ} b_{ℓj}.
Distributivity and scalar compatibility work because each matrix entry reduces to real-number sums and products.
Associativity holds without changing the order of factors: (AB)C = A(BC).
Commutativity fails in general because swapping factors changes which rows and columns get paired.