Linear Algebra 17 | Properties of the Matrix Product
Based on the YouTube video by The Bright Side of Mathematics. If you like this content, support the original creators by watching, liking, and subscribing to their content.
Briefing
Matrix multiplication is built from a “row-by-column” inner product: the (i,j) entry of the product AB is obtained by fixing row i of A and column j of B and summing the products of matching entries. Concretely, if A is m × n and B is n × p, then AB is m × p, and each entry is (AB)_{ij}= \sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}. This “sum over the middle index” is the mechanism behind every algebraic rule that follows, because it reduces matrix identities to familiar real-number arithmetic.
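As a concrete illustration, here is a minimal sketch in plain Python (the helper name matmul and the sample matrices are only for this example) that makes the “sum over the middle index” explicit:

```python
def matmul(A, B):
    """Row-by-column product: (AB)_{ij} = sum over l of A[i][l] * B[l][j]."""
    m, n = len(A), len(A[0])       # A is m x n
    n2, p = len(B), len(B[0])      # B is n x p
    assert n == n2, "columns of A must match rows of B"
    # Each entry sums over the shared middle index l.
    return [[sum(A[i][l] * B[l][j] for l in range(n)) for j in range(p)]
            for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]          # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]           # 3 x 2
print(matmul(A, B))      # 2 x 2 result: [[58, 64], [139, 154]]
```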
From that definition, several calculation rules carry over from ordinary arithmetic. Matrix multiplication distributes over addition, but the side matters: multiplying on the right gives (A+B)C = AC + BC, while multiplying on the left gives D(A+B) = DA + DB. Scalar multiplication also behaves compatibly with matrix products. For any real number \lambda, \lambda(AB) equals (\lambda A)B and also equals A(\lambda B), so scalars can be moved to either factor without changing the result.
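A quick numerical spot-check (a sketch using NumPy's @ matrix-product operator; the specific matrices A, B, C, D and the scalar lam are arbitrary illustrative choices) confirms both distributive laws and the scalar rule on concrete inputs:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])        # 2 x 3
B = np.array([[0, 1, 0],
              [2, 0, 2]])        # 2 x 3
C = np.array([[1, 0],
              [0, 1],
              [1, 1]])           # 3 x 2
D = np.array([[2, 1],
              [0, 3]])           # 2 x 2
lam = 3

# Distributivity over addition, on the right and on the left.
print(np.array_equal((A + B) @ C, A @ C + B @ C))    # right-distributivity: True
print(np.array_equal(D @ (A + B), D @ A + D @ B))    # left-distributivity: True

# Scalar compatibility: the scalar can attach to either factor.
print(np.array_equal(lam * (A @ C), (lam * A) @ C))  # True
print(np.array_equal(lam * (A @ C), A @ (lam * C)))  # True
```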
Associativity is another key property: whenever the shapes match, (AB)C = A(BC). The proof works component-by-component. Looking at a fixed entry (i,j) of (AB)C, the definition introduces a sum over the “middle” index, and applying the definition again expands the nested product into sums of real-number products. Using distributive laws and the ability to rearrange sums, the expression collapses into exactly the (i,j) entry of A(BC). Since this holds for every i and j, the matrices are equal.
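Written out for a fixed entry (a worked version of the step described above, with k and \ell running over the shared middle dimensions), the chain of equalities is ((AB)C)_{ij} = \sum_{k} (AB)_{ik} c_{kj} = \sum_{k} \sum_{\ell} a_{i\ell} b_{\ell k} c_{kj} = \sum_{\ell} a_{i\ell} \sum_{k} b_{\ell k} c_{kj} = \sum_{\ell} a_{i\ell} (BC)_{\ell j} = (A(BC))_{ij}, where the middle step uses distributivity of real numbers and the freedom to exchange the two finite sums.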
One property does *not* carry over: commutativity. In general, AB ≠ BA, because the order of factors changes which rows of the first matrix interact with which columns of the second. A simple 2 × 2 example makes the point: multiplying by the matrix \begin{pmatrix}0&1\\-1&0\end{pmatrix} in one order produces a different result than multiplying in the reverse order. Associativity still holds, so parentheses can be changed, but swapping the two matrices is not allowed in general.
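Concretely (a sketch using NumPy; the companion matrix M is an arbitrary illustrative choice paired with the matrix from the example), multiplying in the two orders gives visibly different results:

```python
import numpy as np

J = np.array([[0, 1],
              [-1, 0]])          # the matrix from the 2 x 2 example
M = np.array([[1, 2],
              [3, 4]])           # an arbitrary companion matrix

print(J @ M)                     # [[ 3,  4], [-1, -2]]
print(M @ J)                     # [[-2,  1], [-4,  3]]
print(np.array_equal(J @ M, M @ J))   # False: order matters
```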
These rules matter because they determine what manipulations are safe during computations. Distributivity and associativity let algebra be reorganized for efficiency and clarity, while the lack of commutativity forces careful attention to factor order—an issue that becomes even more important when matrix products represent transformations in later topics.
Cornell Notes
Matrix multiplication is defined so that each entry of AB comes from a row-by-column inner product: (AB)_{ij} = \sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}. Using this “sum over the middle index,” several familiar algebra rules extend to matrices. Right-distributivity and left-distributivity both hold with addition, and scalar multiplication commutes with matrix multiplication in the sense that \lambda(AB) = (\lambda A)B = A(\lambda B). Matrix multiplication is associative: (AB)C = A(BC) whenever dimensions match. Commutativity fails in general, so AB usually cannot be swapped to BA.
How exactly is the (i,j) entry of AB computed?
What distributive laws work for matrix multiplication, and why does the side matter?
How does scalar multiplication interact with matrix products?
Why is matrix multiplication associative, and what does the proof rely on?
Which familiar property fails for matrix multiplication, even though beginners often expect it: why is commutativity not guaranteed?
Review Questions
- Given A is m × n and B is n × p, what are the dimensions of AB, and what index runs in the sum for (AB)_{ij}?
- State both distributive laws involving matrix multiplication and addition, including which side the common factor multiplies.
- Provide a reason commutativity fails for matrix multiplication, and describe what associativity still guarantees.
Key Points
1. Each entry of AB is computed by summing products of a fixed row of A with a fixed column of B: (AB)_{ij}=\sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}.
2. Matrix multiplication is defined only when the number of columns of A matches the number of rows of B.
3. Right-distributivity holds: (A+B)C = AC + BC, and left-distributivity holds: D(A+B) = DA + DB.
4. Scalar multiplication is compatible with products: \lambda(AB) = (\lambda A)B = A(\lambda B).
5. Matrix multiplication is associative when dimensions match: (AB)C = A(BC).
6. Commutativity fails in general: AB is usually not equal to BA, so factor order cannot be swapped.