
Linear Algebra 17 | Properties of the Matrix Product

4 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Each entry of AB is computed by summing products of a fixed row of A with a fixed column of B: (AB)_{ij}=\sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}.

Briefing

Matrix multiplication is built from a “row-by-column” inner product: the (i,j) entry of the product AB is obtained by fixing row i of A and column j of B and summing the products of matching entries. Concretely, if A is m × n and B is n × p, then AB is m × p, and each entry is (AB)_{ij} = \sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}. This “sum over the middle index” is the mechanism behind every algebraic rule that follows, because it reduces matrix identities to familiar real-number arithmetic.
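As an illustrative sketch (not from the video), the entry formula translates directly into code: fix a row of A and a column of B, multiply matching entries, and sum over the shared dimension. The function names here are hypothetical.

```python
# Sketch of the definition: (AB)_{ij} = sum over the middle index l
# of a_{il} * b_{lj}, using plain Python lists of lists.

def matmul_entry(A, B, i, j):
    """One entry of AB: sum products of row i of A with column j of B."""
    n = len(B)  # shared dimension: columns of A must equal rows of B
    return sum(A[i][l] * B[l][j] for l in range(n))

def matmul(A, B):
    """Full product: A is m x n, B is n x p, result is m x p."""
    m, p = len(A), len(B[0])
    assert len(A[0]) == len(B), "columns of A must match rows of B"
    return [[matmul_entry(A, B, i, j) for j in range(p)] for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]         # 3 x 2
print(matmul(A, B))    # 2 x 2 result: [[58, 64], [139, 154]]
```

Note how the middle index `l` is summed out: the result has the outer dimensions m × p, while the shared dimension n disappears.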

From that definition, several calculation rules carry over from ordinary arithmetic. Matrix multiplication distributes over addition, but the side matters: multiplying on the right gives (A+B)C = AC + BC, while multiplying on the left gives D(A+B) = DA + DB. Scalar multiplication also behaves compatibly with matrix products. For any real number \lambda, \lambda(AB) equals (\lambda A)B and also equals A(\lambda B), so scalars can be moved to either factor without changing the result.
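Both distributive laws and the scalar rule can be checked numerically. This is a quick sanity check on random integer matrices (a sketch using NumPy, not part of the original lecture); the variable names are chosen to match the identities above.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, (2, 3))
B = rng.integers(-5, 5, (2, 3))   # same shape as A so A + B is defined
C = rng.integers(-5, 5, (3, 4))
D = rng.integers(-5, 5, (4, 2))

# Right-distributivity: (A + B)C = AC + BC
assert np.array_equal((A + B) @ C, A @ C + B @ C)

# Left-distributivity: D(A + B) = DA + DB
assert np.array_equal(D @ (A + B), D @ A + D @ B)

# Scalars move to either factor: lambda(AB) = (lambda A)B = A(lambda B)
lam = 3
assert np.array_equal(lam * (A @ C), (lam * A) @ C)
assert np.array_equal(lam * (A @ C), A @ (lam * C))
print("distributivity and scalar compatibility verified")
```

Integer matrices make the checks exact; with floats one would compare via `np.allclose` instead.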

Associativity is another key property: whenever the shapes match, (AB)C = A(BC). The proof works component-by-component. Looking at a fixed entry (i,j) of (AB)C, the definition introduces a sum over the “middle” index, and applying the definition again expands the nested product into sums of real-number products. Using distributive laws and the ability to rearrange sums, the expression collapses into exactly the (i,j) entry of A(BC). Since this holds for every i and j, the matrices are equal.
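The same kind of numerical check works for associativity. The comparison below (a sketch, not from the source) confirms that either parenthesization of a triple product gives the same matrix, as the entry-by-entry proof guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 3, (2, 3))
B = rng.integers(-3, 3, (3, 4))
C = rng.integers(-3, 3, (4, 2))

# (AB)C and A(BC): both expand, entrywise, into the same double sum
# over the two middle indices, so the results agree exactly.
left = (A @ B) @ C
right = A @ (B @ C)
assert np.array_equal(left, right)
print("associativity verified for one random instance")
```

A numerical check on one instance is of course not a proof; the proof is the sum-rearrangement argument above, which holds for all entries and all shapes that match.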

One property does *not* carry over: commutativity. In general, AB ≠ BA, because the order of factors changes which rows of the first matrix interact with which columns of the second. A simple 2 × 2 example makes the point: multiplying by the matrix \begin{pmatrix}0&1\\-1&0\end{pmatrix} in one order produces a different result than multiplying in the reverse order. Associativity still holds, so parentheses can be changed, but swapping the two matrices is not allowed in general.
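The 2 × 2 example can be computed explicitly. Multiplying a generic matrix by \begin{pmatrix}0&1\\-1&0\end{pmatrix} on the right rearranges its columns, while multiplying on the left rearranges its rows, so the two orders disagree (the choice of the second matrix here is an illustrative one):

```python
import numpy as np

J = np.array([[0, 1],
              [-1, 0]])
A = np.array([[1, 2],
              [3, 4]])

# Right multiplication mixes columns of A; left multiplication mixes rows.
print(A @ J)  # [[-2  1]
              #  [-4  3]]
print(J @ A)  # [[ 3  4]
              #  [-1 -2]]
assert not np.array_equal(A @ J, J @ A)
```

So AB ≠ BA here, even though both products are defined and have the same shape.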

These rules matter because they determine what manipulations are safe during computations. Distributivity and associativity let algebra be reorganized for efficiency and clarity, while the lack of commutativity forces careful attention to factor order—an issue that becomes even more important when matrix products represent transformations in later topics.

Cornell Notes

Matrix multiplication is defined so that each entry of AB comes from a row-by-column inner product: (AB)_{ij} = \sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}. Using this “sum over the middle index,” several familiar algebra rules extend to matrices. Right-distributivity and left-distributivity both hold with addition, and scalar multiplication commutes with matrix multiplication in the sense that \lambda(AB) = (\lambda A)B = A(\lambda B). Matrix multiplication is associative: (AB)C = A(BC) whenever dimensions match. Commutativity fails in general, so AB usually cannot be swapped to BA.

How exactly is the (i,j) entry of AB computed?

Fix row i of A and column j of B. Multiply corresponding entries and sum over the shared dimension n: (AB)_{ij} = \sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}. The “middle index” \ell runs from 1 to n, reflecting the requirement that the number of columns of A equals the number of rows of B.

What distributive laws work for matrix multiplication, and why does the side matter?

Distributivity holds, but multiplication must stay on the same side. For right multiplication: (A+B)C = AC + BC. For left multiplication: D(A+B) = DA + DB. The side matters because matrix multiplication is not commutative, so a common factor cannot be moved from one side of the sum to the other.

How does scalar multiplication interact with matrix products?

For any real scalar \lambda, scalar multiplication can be applied before or after forming the product: \lambda(AB) = (\lambda A)B = A(\lambda B). This follows from the entrywise definition, where multiplying by \lambda scales each term in the sum.

Why is matrix multiplication associative, and what does the proof rely on?

Associativity says (AB)C = A(BC) when dimensions match. The proof checks a single entry (i,j). Expanding with the definition introduces sums over intermediate indices; then distributive laws for real numbers and rearranging sums show the two expanded expressions match entry-by-entry.

Which familiar property fails, contrary to what beginners often expect?

Matrix multiplication is not commutative in general: AB ≠ BA. Swapping factors changes which rows of the first matrix pair with which columns of the second, producing different entries. A 2 × 2 example with \begin{pmatrix}0&1\\-1&0\end{pmatrix} demonstrates that AB and BA yield unequal matrices.

Review Questions

  1. Given A is m × n and B is n × p, what are the dimensions of AB, and what index runs in the sum for (AB)_{ij}?
  2. State both distributive laws involving matrix multiplication and addition, including which side the common factor multiplies.
  3. Provide a reason commutativity fails for matrix multiplication, and describe what associativity still guarantees.

Key Points

  1. Each entry of AB is computed by summing products of a fixed row of A with a fixed column of B: (AB)_{ij}=\sum_{\ell=1}^{n} a_{i\ell} b_{\ell j}.

  2. Matrix multiplication is defined only when the number of columns of A matches the number of rows of B.

  3. Right-distributivity holds: (A+B)C = AC + BC, and left-distributivity holds: D(A+B) = DA + DB.

  4. Scalar multiplication is compatible with products: \lambda(AB) = (\lambda A)B = A(\lambda B).

  5. Matrix multiplication is associative when dimensions match: (AB)C = A(BC).

  6. Commutativity fails in general: AB is usually not equal to BA, so factor order cannot be swapped.

Highlights

The (i,j) entry of AB is a sum over the shared dimension: the “middle index” \ell is what gets summed out.
Distributivity works on both sides, but the common factor must stay on the same side of the addition.
Associativity can be proven entry-by-entry by expanding sums and using real-number distributive laws.
Matrix multiplication is not commutative; swapping factors changes the result.