
Real Analysis 38 | Examples of Derivatives and Power Series [dark version]

5 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Linear, quadratic, and cubic derivatives are computed efficiently using the constant derivative rule and the product rule, not difference quotients.

Briefing

Derivatives of polynomials and power series can be computed term-by-term, provided uniform convergence is brought in to justify swapping limits and differentiation. The session starts with concrete examples: linear, quadratic, and cubic functions are differentiated using the standard rules (constant derivative, product rule, and the power rule). For instance, differentiating x² uses the product rule on x·x to get 2x, and differentiating x³ is handled by rewriting it as x²·x and applying the product rule again, yielding 3x² without any difference quotients.
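As a quick sanity check (my own illustration, not part of the video), the rule-based results can be compared against symmetric difference quotients numerically:

```python
# Numerical sanity check (not the video's method): compare the rule-based
# derivatives of x^2 and x^3 against symmetric difference quotients.

def central_diff(f, x, h=1e-6):
    """Approximate f'(x) with a symmetric difference quotient."""
    return (f(x + h) - f(x - h)) / (2 * h)

for x in [-2.0, 0.5, 3.0]:
    assert abs(central_diff(lambda t: t**2, x) - 2 * x) < 1e-6      # (x^2)' = 2x
    assert abs(central_diff(lambda t: t**3, x) - 3 * x**2) < 1e-5   # (x^3)' = 3x^2
```

The difference quotient appears here only for verification; the point of the examples is that the rules make it unnecessary.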

From these examples comes a general pattern. For any natural number n, the derivative of x^n is n·x^(n−1). The same logic extends to polynomials: if f(x)=a₀+a₁x+…+a_n x^n, then f′(x)=a₁+2a₂x+…+n a_n x^(n−1). Coefficients stay in place except that each term’s power drops by one and the coefficient is multiplied by the old exponent; the constant term disappears because its derivative is zero.
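The polynomial formula is easy to sketch in code. The following snippet (an illustration, with coefficient lists as the assumed representation) maps [a₀, …, a_n] to the derivative's coefficients:

```python
# Sketch of the polynomial rule: coefficients [a0, a1, ..., an] map to
# [1*a1, 2*a2, ..., n*an]; the constant term a0 drops out.

def poly_derivative(coeffs):
    """Given f(x) = sum a_k x^k, return the coefficient list of f'."""
    return [k * a for k, a in enumerate(coeffs)][1:]

# f(x) = 3 + 2x + 5x^2 + x^3  ->  f'(x) = 2 + 10x + 3x^2
assert poly_derivative([3, 2, 5, 1]) == [2, 10, 3]
```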

The key leap is extending this “term-by-term differentiation” to power series, which behave like infinite polynomials. The natural guess is that differentiating ∑_{k=0}^∞ a_k x^k should produce ∑_{k=1}^∞ k a_k x^(k−1). But the infinite nature creates a problem: sum rules for finite sums don’t automatically apply when limits are involved. The resolution depends on uniform convergence on compact subintervals inside the radius of convergence. On any interval [−C, C] with C<R, the partial sums of the power series converge uniformly to the function, and the differentiated partial sums also converge uniformly. That uniform convergence is what allows differentiation to commute with taking the limit of partial sums.
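A concrete illustration (my own example, not from the video) uses the geometric series ∑ x^k = 1/(1−x), whose term-by-term derivative should converge to 1/(1−x)² for |x| < 1:

```python
# Inside the radius of convergence R = 1, the term-by-term derivative of
# sum_{k>=0} x^k = 1/(1-x) converges to 1/(1-x)^2.

def diff_partial_sum(x, n_terms):
    """Partial sum of the differentiated series: sum_{k=1}^{N} k x^(k-1)."""
    return sum(k * x**(k - 1) for k in range(1, n_terms + 1))

x = 0.5                      # well inside (-1, 1)
exact = 1 / (1 - x)**2       # derivative of 1/(1-x)
assert abs(diff_partial_sum(x, 60) - exact) < 1e-12
```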

A proof sketch is given for uniform convergence of the partial sums to the function: the tail of the series is bounded by a geometric series using a majorant argument (a standard comparison technique). Once the function’s partial sums converge uniformly, the same style of estimate applies to the derivative series, with the radius of convergence unchanged. With both uniform convergence of the functions and their derivatives in hand, the differentiability of the limit function follows, and the derivative equals the term-by-term differentiated power series.

Finally, the method is demonstrated on classic examples. For e^x, the power series ∑_{k=0}^∞ x^k/k! differentiates to the same series again, confirming that (e^x)′=e^x. For the sine function (as defined earlier using only odd powers), differentiating its odd-power series produces a power series that matches cosine x; equivalently, the derivative of sin x is cos x. The takeaway is practical: once uniform convergence inside the radius of convergence is secured, derivatives of power series follow the same algebraic rules as polynomials, term by term.

Cornell Notes

The derivative rules for polynomials extend to power series, but only after uniform convergence justifies exchanging “differentiate” with “take the limit of partial sums.” Starting from examples (x, x², x³), the pattern (x^n)′ = n x^(n−1) leads to the polynomial derivative formula: each term a_k x^k becomes k a_k x^(k−1), and the constant term vanishes. For a power series f(x)=∑_{k=0}^∞ a_k x^k with radius of convergence R, the term-by-term derivative is ∑_{k=1}^∞ k a_k x^(k−1). The crucial condition is that on any compact interval [−C, C] with C<R, the partial sums converge uniformly, and the differentiated partial sums also converge uniformly, allowing the derivative to pass through the limit. This yields differentiability and the expected derivative series.

Why does differentiating x² and x³ avoid difference quotients, and what rules make it work?

For x², rewrite it as x·x and apply the product rule: (uv)′=u′v+uv′. With u=x and v=x, both derivatives are 1, giving 1·x + x·1 = 2x. For x³, rewrite as x²·x and apply the product rule again. Since (x²)′=2x and (x)′=1, the result is (x²)′·x + x²·(x)′ = (2x)·x + x²·1 = 3x².

What general formula emerges for (x^n)′, and how is it consistent with the product-rule examples?

The observed pattern is (x^n)′ = n x^(n−1). The reasoning can be formalized by induction using the product rule: x^n can be seen as x^(n−1)·x, so (x^n)′ = (x^(n−1))′·x + x^(n−1)·(x)′. If (x^(n−1))′ = (n−1)x^(n−2), then the product rule yields (n−1)x^(n−2)·x + x^(n−1)·1 = (n−1)x^(n−1) + x^(n−1) = n x^(n−1).
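The induction step can be mirrored in a tiny recursion (an illustrative sketch): writing x^n = x^(n−1)·x, the product rule forces the coefficient c_n in (x^n)′ = c_n x^(n−1) to satisfy c_n = c_{n−1} + 1 with c_1 = 1:

```python
# The induction step in code: (x^n)' = (x^(n-1))' * x + x^(n-1) * 1 gives
# c_n x^(n-1) = c_{n-1} x^(n-1) + x^(n-1), i.e. c_n = c_{n-1} + 1.

def power_rule_coeff(n):
    """Coefficient c_n in (x^n)' = c_n * x^(n-1), built via the product rule."""
    if n == 1:
        return 1                       # base case: (x)' = 1
    return power_rule_coeff(n - 1) + 1 # induction step from the product rule

assert [power_rule_coeff(n) for n in range(1, 6)] == [1, 2, 3, 4, 5]
```

The recursion recovers exactly c_n = n, matching the observed pattern.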

How does the derivative formula for polynomials follow from the power rule?

A polynomial is a finite sum of terms a_k x^k. Using the sum rule for finite sums, differentiate each term separately. The coefficient a_k stays, the exponent drops by one, and the exponent multiplies the coefficient: (a_k x^k)′ = a_k·k x^(k−1). The constant term a_0 corresponds to k=0 and disappears because its derivative is 0.

Why can’t the same term-by-term differentiation be assumed automatically for power series?

Power series are infinite sums, so differentiating term-by-term would require exchanging differentiation with a limit (the limit of partial sums). The sum rule holds for finite sums, but it does not automatically survive this limit process: the algebraic manipulations can break unless the convergence is strong enough.

What convergence condition makes term-by-term differentiation of power series valid?

Uniform convergence on compact intervals inside the radius of convergence. For f(x)=∑_{k=0}^∞ a_k x^k with radius R, restrict to [−C, C] where C<R. On such intervals, the partial sums converge uniformly to f, and the differentiated partial sums also converge uniformly. With uniform convergence of both the functions and their derivatives, differentiation commutes with the limit, so f is differentiable and f′ is given by the term-by-term differentiated series.
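The uniform-convergence claim can be probed numerically (a rough sketch on a grid, not a substitute for the majorant proof): for the exponential series on [−2, 2], the worst-case error of the partial sums shrinks as more terms are taken:

```python
import math

# Numerical sketch of uniform convergence on a compact interval [-C, C]:
# the sup-norm error of the N-th partial sum of the exp series, sampled
# on a grid, decreases as N grows.

def partial_sum_exp(x, n_terms):
    return sum(x**k / math.factorial(k) for k in range(n_terms))

def sup_error(n_terms, C=2.0, grid=401):
    xs = [-C + 2 * C * i / (grid - 1) for i in range(grid)]
    return max(abs(partial_sum_exp(x, n_terms) - math.exp(x)) for x in xs)

assert sup_error(5) > sup_error(10) > sup_error(20)
assert sup_error(20) < 1e-10
```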

How do the examples confirm the general power-series differentiation rule?

For e^x, the series is ∑_{k=0}^∞ x^k/k!. Differentiating term-by-term gives ∑_{k=1}^∞ k x^(k−1)/k! = ∑_{k=0}^∞ x^k/k! after an index shift, so (e^x)′=e^x. For sin x (built from odd powers), differentiating its odd-power series yields a power series matching cos x (even powers), so (sin x)′=cos x.
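On the level of coefficients, the index shift is a one-line operation. The snippet below (my own check, using the standard Taylor coefficients of exp, sin, and cos) confirms that exp's coefficients reproduce themselves and sin's become cos's:

```python
import math

# If f has series coefficients a_k, the derivative has coefficients
# b_k = (k+1) * a_{k+1} after the index shift. For exp the coefficients
# reproduce themselves; for sin they become those of cos.

def series_derivative(coeffs):
    """Map [a_0, a_1, ...] to the derivative's coefficients [(k+1)*a_{k+1}]."""
    return [(k + 1) * a for k, a in enumerate(coeffs[1:])]

N = 10
exp_coeffs = [1 / math.factorial(k) for k in range(N + 1)]
sin_coeffs = [0 if k % 2 == 0 else (-1)**((k - 1) // 2) / math.factorial(k)
              for k in range(N + 1)]
cos_coeffs = [(-1)**(k // 2) / math.factorial(k) if k % 2 == 0 else 0
              for k in range(N + 1)]

# (e^x)' = e^x and (sin x)' = cos x, coefficient by coefficient
assert all(math.isclose(b, a) for a, b in zip(exp_coeffs, series_derivative(exp_coeffs)))
assert all(math.isclose(b, c) for b, c in zip(series_derivative(sin_coeffs), cos_coeffs))
```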

Review Questions

  1. What is the exact term-by-term derivative formula for a power series ∑_{k=0}^∞ a_k x^k, and what happens to the k=0 term?
  2. Why does uniform convergence on [−C, C] with C<R matter for differentiating power series?
  3. How does an index shift help show that differentiating the power series for e^x reproduces the same series?

Key Points

  1. Linear, quadratic, and cubic derivatives are computed efficiently using the constant derivative rule and the product rule, not difference quotients.

  2. The power rule follows a consistent pattern: (x^n)′ = n·x^(n−1), which can be justified by induction using the product rule.

  3. Polynomial derivatives come from differentiating each term separately: a_k x^k becomes k a_k x^(k−1), and the constant term drops out.

  4. Term-by-term differentiation of power series requires justification because infinite sums involve limits of partial sums.

  5. Uniform convergence on every compact interval [−C, C] inside the radius of convergence (C<R) ensures the derivative can be taken term-by-term.

  6. The derivative series of a power series has the same radius of convergence as the original series.

  7. Applying the rule to e^x reproduces the same power series, confirming (e^x)′=e^x, and applying it to sin x yields cos x.

Highlights

Differentiating x³ becomes straightforward by rewriting it as x²·x and applying the product rule, giving 3x² immediately.
The general derivative of x^n is n·x^(n−1), and the polynomial rule is just that power rule applied term-by-term.
Term-by-term differentiation of power series is valid on [−C, C] only because uniform convergence (not just pointwise convergence) is established.
A geometric-series majorant argument bounds the tails of the power series to prove uniform convergence.
Index shifting turns the differentiated e^x series back into the original ∑ x^k/k! form, proving (e^x)′=e^x.
