Linear Algebra 17 | Properties of the Matrix Product [dark version]
Based on a video by The Bright Side of Mathematics on YouTube. If you like this content, support the original creator by watching, liking, and subscribing.
Briefing
Matrix multiplication is built from a simple rule: each entry of the product comes from an inner product between a row of the first matrix and a column of the second. If A is an m×n matrix and B is an n×p matrix, then AB is m×p, and the (i,j) entry is computed by summing over the “middle” index: (AB)_{ij} = Σ_{ℓ=1..n} a_{iℓ} b_{ℓj}. This “sum over the middle” is the mechanism behind every algebraic property that follows, because it reduces matrix identities to familiar arithmetic with real numbers.
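The "sum over the middle" rule can be sketched directly as a triple loop. The sketch below (function name and example matrices are illustrative, not from the source) computes each entry via (AB)_{ij} = Σ_ℓ a_{iℓ} b_{ℓj} and checks the result against NumPy's built-in product:

```python
import numpy as np

def matmul_entrywise(A, B):
    """Compute AB entry by entry: (AB)_ij = sum over l of a_il * b_lj."""
    m, n = A.shape
    n2, p = B.shape
    assert n == n2, "columns of A must match rows of B"
    C = np.zeros((m, p))
    for i in range(m):          # row of A
        for j in range(p):      # column of B
            # inner product of row i of A with column j of B
            C[i, j] = sum(A[i, l] * B[l, j] for l in range(n))
    return C

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])        # 2x3
B = np.array([[7.0, 8.0],
              [9.0, 10.0],
              [11.0, 12.0]])           # 3x2

# AB is 2x2; e.g. (AB)_{11} = 1*7 + 2*9 + 3*11 = 58
assert np.allclose(matmul_entrywise(A, B), A @ B)
```

Note that the result of a 2×3 times a 3×2 product is 2×2: the outer dimensions survive and the shared middle dimension is summed away.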
From that definition, several calculation rules carry over from ordinary arithmetic, notably distributivity, compatibility with scalar multiplication, and associativity. Matrix multiplication distributes over addition when the addition happens on the same side: (A + B)C = AC + BC and A(B + C) = AB + AC, with the order of factors preserved. Scalar multiplication behaves consistently too: for any real scalar λ, scaling the product matches scaling either factor first, λ(AB) = (λA)B = A(λB). Associativity also holds: (AB)C = A(BC), provided the matrix sizes match so each product is defined.
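These three laws are easy to check numerically. A minimal sketch (random matrices and shapes are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((3, 4))   # same shape as A, so A + B is defined
C = rng.standard_normal((4, 2))
D = rng.standard_normal((2, 5))
lam = 2.5

# Distributivity: addition on one side, factor order preserved
assert np.allclose((A + B) @ C, A @ C + B @ C)

# Scalar compatibility: the scalar can attach to either factor
assert np.allclose(lam * (A @ C), (lam * A) @ C)
assert np.allclose(lam * (A @ C), A @ (lam * C))

# Associativity: parentheses move freely when sizes match
assert np.allclose((A @ C) @ D, A @ (C @ D))
```

Using `np.allclose` rather than exact equality sidesteps floating-point rounding, which can make the two sides differ in the last few bits.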
The key reason these laws work is that the product’s definition turns each matrix entry into sums and products of real numbers. That lets standard real-number identities—like distributive rearrangements and regrouping of sums—transfer directly to the matrix setting. A component-wise proof of associativity tracks a fixed (i,j) entry on both sides, expands both products using the summation formula, and then rearranges sums and factors until the same expression appears for A(BC) and (AB)C.
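The component-wise argument for associativity described above can be written out in one chain of equalities (standard derivation; notation matches the summation formula given earlier):

```latex
\bigl((AB)C\bigr)_{ij}
  = \sum_{k} (AB)_{ik}\, c_{kj}
  = \sum_{k} \Bigl( \sum_{\ell} a_{i\ell}\, b_{\ell k} \Bigr) c_{kj}
  = \sum_{\ell} a_{i\ell} \Bigl( \sum_{k} b_{\ell k}\, c_{kj} \Bigr)
  = \sum_{\ell} a_{i\ell}\, (BC)_{\ell j}
  = \bigl(A(BC)\bigr)_{ij}.
```

The only step beyond the definitions is swapping the two finite sums and regrouping the real-number factors, which is exactly the transfer of real-number arithmetic the paragraph describes.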
One major arithmetic property does *not* carry over: commutativity. In general, AB ≠ BA, because swapping the order changes which rows and columns get paired in the inner-product computation. A concrete 2×2 example makes the point: with A = [[0,1],[-1,0]] and B = [[1,1],[1,1]], we get AB = [[1,1],[-1,-1]] but BA = [[-1,1],[-1,1]]. Associativity remains intact (parentheses can move without changing the outcome), but the order of the matrices themselves cannot be exchanged in general. That distinction, associative but not commutative, is the practical takeaway for doing matrix calculations correctly.
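The 2×2 counterexample from the text can be checked directly; the extra matrix C below is an arbitrary choice added to contrast the failure of commutativity with the survival of associativity:

```python
import numpy as np

A = np.array([[0, 1],
              [-1, 0]])
B = np.array([[1, 1],
              [1, 1]])

AB = A @ B   # rows of A paired with columns of B
BA = B @ A   # rows of B paired with columns of A
assert not np.array_equal(AB, BA)   # order matters: AB != BA

# Associativity still holds for any size-compatible third factor:
C = np.array([[2, 0],
              [0, 3]])
assert np.array_equal((A @ B) @ C, A @ (B @ C))
```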
Cornell Notes
Matrix multiplication is defined entry-by-entry using a row–column inner product. If A is m×n and B is n×p, then AB is m×p and (AB)_{ij} = Σ_{ℓ=1..n} a_{iℓ} b_{ℓj}, summing over the shared “middle” dimension. Using this summation formula, distributive laws hold (e.g., (A+B)C = AC+BC and A(B+C)=AB+AC), scalar multiplication is compatible (λ(AB)=(λA)B=A(λB)), and multiplication is associative: (AB)C = A(BC) when sizes match. What fails in general is commutativity: AB usually does not equal BA, since swapping the order changes the row/column pairing that determines each entry.
How exactly is the (i,j) entry of a matrix product AB computed?
Why do distributive laws work for matrix multiplication, and what must stay in the same order?
How does scalar multiplication interact with matrix products?
What does associativity mean for matrices, and when is it valid?
Why is matrix multiplication not commutative?
Review Questions
- Given A is m×n and B is n×p, what are the dimensions of AB, and what is the summation index in (AB)_{ij}?
- State the distributive laws for matrix multiplication and indicate which side the addition occurs on.
- Provide a reason matrix multiplication is not commutative, even though it is associative.
Key Points
1. Each entry of AB is computed by summing products of a row of A with a column of B: (AB)_{ij}=Σ_{ℓ} a_{iℓ}b_{ℓj}.
2. Matrix multiplication is defined only when the number of columns of the left matrix matches the number of rows of the right matrix.
3. Distributive laws hold, but factor order must be preserved: (A+B)C=AC+BC and A(B+C)=AB+AC.
4. Scalar multiplication is compatible with matrix products: λ(AB)=(λA)B=A(λB).
5. Matrix multiplication is associative when sizes match: (AB)C=A(BC).
6. Matrix multiplication is not commutative in general: AB generally differs from BA.