Real Analysis 38 | Examples of Derivatives and Power Series [dark version]
Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Briefing
Derivatives of polynomials and power series can be computed term by term, provided uniform convergence is brought in to justify swapping limits and differentiation. The session starts with concrete examples: linear, quadratic, and cubic functions are differentiated using the standard rules (the derivative of a constant, the derivative of x, and the product rule) rather than difference quotients. For instance, differentiating x² means applying the product rule to x·x, which gives 1·x + x·1 = 2x, and x³ is handled by rewriting it as x²·x and applying the product rule again, yielding 3x².
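A worked version of those two product-rule computations, spelled out here from the rules named above:

```latex
% Product rule: (fg)' = f'g + fg', together with x' = 1 and c' = 0.
\begin{aligned}
  (x^2)' &= (x \cdot x)'   = x' \cdot x + x \cdot x'       = 1 \cdot x + x \cdot 1 = 2x, \\
  (x^3)' &= (x^2 \cdot x)' = (x^2)' \cdot x + x^2 \cdot x' = 2x \cdot x + x^2      = 3x^2.
\end{aligned}
```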
From these examples a general pattern emerges: for every natural number n ≥ 1, the derivative of x^n is n·x^(n−1). The same logic extends to polynomials: if f(x)=a₀+a₁x+…+a_n x^n, then f′(x)=a₁+2a₂x+…+n a_n x^(n−1). Each term keeps its coefficient, multiplied by the old exponent, while the power drops by one; the constant term disappears because its derivative is zero.
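A minimal sketch of this coefficient rule in code; the function name `poly_derivative` and the list-of-coefficients representation are illustrative choices, not from the source:

```python
def poly_derivative(coeffs):
    """Differentiate a polynomial given as coefficients [a_0, a_1, ..., a_n].

    Each term a_k * x**k becomes k * a_k * x**(k-1), so the new
    coefficient list is [1*a_1, 2*a_2, ..., n*a_n]; the constant
    term a_0 drops out.
    """
    return [k * a_k for k, a_k in enumerate(coeffs)][1:]

# f(x) = 1 + 2x + 3x^2 + 4x^3  ->  f'(x) = 2 + 6x + 12x^2
assert poly_derivative([1, 2, 3, 4]) == [2, 6, 12]
```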
The key leap is extending this term-by-term differentiation to power series, which behave like infinite polynomials. The natural guess is that differentiating ∑_{k=0}^∞ a_k x^k should produce ∑_{k=1}^∞ k a_k x^(k−1). But the infinite sum creates a problem: the sum rule for finitely many terms does not automatically carry over to a limit of partial sums. The resolution relies on uniform convergence on compact subintervals inside the radius of convergence: on any interval [−C, C] with C<R, the partial sums of the power series converge uniformly to the function, and the differentiated partial sums converge uniformly as well. That uniform convergence is exactly what allows differentiation to commute with taking the limit of partial sums.
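Stated compactly, the claim the section builds toward is the following (a standard formulation, written here to match the notation above):

```latex
\textbf{Claim.} Let $f(x)=\sum_{k=0}^{\infty} a_k x^k$ have radius of
convergence $R>0$. Then $f$ is differentiable on $(-R,R)$ and
\[
  f'(x)=\sum_{k=1}^{\infty} k\,a_k\,x^{k-1}, \qquad |x|<R,
\]
since on every interval $[-C,C]$ with $0<C<R$ the partial sums
$s_n(x)=\sum_{k=0}^{n} a_k x^k$ and their derivatives $s_n'(x)$
both converge uniformly.
```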
A proof sketch is given for the uniform convergence of the partial sums to the function: the tail of the series is bounded by a convergent geometric-type series via a majorant argument (the Weierstrass M-test, a standard comparison technique). Once the function's partial sums converge uniformly, the same style of estimate applies to the derivative series, whose radius of convergence is the same as that of the original series. With uniform convergence of both the functions and their derivatives in hand, the limit function is differentiable, and its derivative equals the term-by-term differentiated power series.
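A sketch of the majorant estimate for the derivative series; the auxiliary radius r between C and R is the standard device, with details beyond the video's sketch filled in here:

```latex
% Fix r with C < r < R and set q = C/r < 1. Since \sum_k a_k r^k
% converges, its terms are bounded: |a_k| r^k \le M for all k.
% Then for all |x| \le C:
\[
  \sup_{|x|\le C}\bigl|k\,a_k\,x^{k-1}\bigr|
  \;\le\; k\,|a_k|\,C^{k-1}
  \;=\; \frac{k}{r}\,|a_k|\,r^{k}\,q^{\,k-1}
  \;\le\; \frac{M}{r}\,k\,q^{\,k-1},
\]
% and \sum_{k \ge 1} k\,q^{\,k-1} = 1/(1-q)^2 < \infty, so the
% Weierstrass M-test gives uniform convergence on [-C, C].
```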
Finally, the method is demonstrated on classic examples. For e^x, the power series ∑_{k=0}^∞ x^k/k! differentiates to the same series again (after an index shift), confirming that (e^x)′=e^x. For the sine function (as defined earlier by its series of odd powers), differentiating term by term produces exactly the power series of cos x; that is, the derivative of sin x is cos x. The takeaway is practical: once uniform convergence inside the radius of convergence is secured, derivatives of power series follow the same algebraic rules as polynomials, term by term.
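The index shift for the exponential series, written out (with the sine computation noted alongside):

```latex
\[
  \frac{d}{dx}\sum_{k=0}^{\infty}\frac{x^{k}}{k!}
  = \sum_{k=1}^{\infty}\frac{k\,x^{k-1}}{k!}
  = \sum_{k=1}^{\infty}\frac{x^{k-1}}{(k-1)!}
  \overset{j=k-1}{=} \sum_{j=0}^{\infty}\frac{x^{j}}{j!},
\]
% Likewise, differentiating \sin x = \sum_{k=0}^{\infty}
% (-1)^k x^{2k+1}/(2k+1)! term by term gives
% \sum_{k=0}^{\infty} (-1)^k x^{2k}/(2k)! = \cos x.
```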
Cornell Notes
The derivative rules for polynomials extend to power series, but only after uniform convergence justifies exchanging “differentiate” with “take the limit of partial sums.” Starting from examples (x, x², x³), the pattern (x^n)′ = n x^(n−1) leads to the polynomial derivative formula: each term a_k x^k becomes k a_k x^(k−1), and the constant term vanishes. For a power series f(x)=∑_{k=0}^∞ a_k x^k with radius of convergence R, the term-by-term derivative is ∑_{k=1}^∞ k a_k x^(k−1). The crucial condition is that on any compact interval [−C, C] with C<R, the partial sums converge uniformly, and the differentiated partial sums also converge uniformly, allowing the derivative to pass through the limit. This yields differentiability and the expected derivative series.
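A quick numerical sanity check of this on the exponential series; numpy, the interval C = 2, and the truncation level N = 20 are illustrative choices, not part of the source:

```python
import math
import numpy as np

N = 20                       # truncation level of the partial sum
C = 2.0                      # compact interval [-C, C]; for e^x, R is infinite
x = np.linspace(-C, C, 401)

# Partial sum s_N(x) = sum_{k=0}^{N} x^k / k! and its term-by-term
# derivative s_N'(x) = sum_{k=1}^{N} k * x^(k-1) / k!.
s_n = sum(x**k / math.factorial(k) for k in range(N + 1))
ds_n = sum(k * x**(k - 1) / math.factorial(k) for k in range(1, N + 1))

# Both converge uniformly to exp on [-C, C], so the max errors are tiny.
print(np.max(np.abs(s_n - np.exp(x))))
print(np.max(np.abs(ds_n - np.exp(x))))
```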
- Why does differentiating x² and x³ avoid difference quotients, and what rules make it work?
- What general formula emerges for (x^n)′, and how is it consistent with the product-rule examples?
- How does the derivative formula for polynomials follow from the power rule?
- Why can’t the same term-by-term differentiation be assumed automatically for power series?
- What convergence condition makes term-by-term differentiation of power series valid?
- How do the examples confirm the general power-series differentiation rule?
Review Questions
- What is the exact term-by-term derivative formula for a power series ∑_{k=0}^∞ a_k x^k, and what happens to the k=0 term?
- Why does uniform convergence on [−C, C] with C<R matter for differentiating power series?
- How does an index shift help show that differentiating the power series for e^x reproduces the same series?
Key Points
1. Linear, quadratic, and cubic derivatives are computed efficiently using the constant derivative rule and the product rule, not difference quotients.
2. The power rule follows a consistent pattern: (x^n)′ = n·x^(n−1), which can be justified by induction using the product rule.
3. Polynomial derivatives come from differentiating each term separately: a_k x^k becomes k a_k x^(k−1), and the constant term drops out.
4. Term-by-term differentiation of power series requires justification because infinite sums involve limits of partial sums.
5. Uniform convergence on every compact interval [−C, C] inside the radius of convergence (C<R) ensures the derivative can be taken term by term.
6. The derivative series of a power series has the same radius of convergence as the original series.
7. Applying the rule to e^x reproduces the same power series, confirming (e^x)′=e^x, and applying it to sin x yields cos x.