Multivariable Calculus 12 | Second Order Partial Derivatives
Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Higher order partial derivatives are needed to extend local extremum tests from one variable to multivariable functions.
Briefing
Higher order partial derivatives matter because they extend the familiar “first derivative test” for local extrema into multivariable settings—where the gradient replaces the single-variable derivative, and second derivatives become the missing ingredient for a sufficient criterion. In one dimension, a differentiable function has a necessary condition for a local extremum: the first derivative must vanish. But that condition alone isn’t enough; adding the second derivative yields a decision rule: a negative second derivative at the point signals a local maximum, while a positive second derivative signals a local minimum. Multivariable calculus follows the same logic in spirit, but the second-derivative piece is harder because there are many second derivatives, not just one.
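The one-variable decision rule described above can be sketched numerically. This is a minimal illustration (not from the video): it estimates the second derivative of a sample function, cos(x), at its critical point x = 0 with a central finite difference, then applies the sign test.

```python
import math

def d2(f, x, h=1e-4):
    """Central finite-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

# f(x) = cos(x) has a critical point at x = 0 (f'(0) = -sin(0) = 0).
second = d2(math.cos, 0.0)

# f''(0) = -cos(0) = -1 < 0, so x = 0 is a local maximum.
print("f''(0) ≈", second)
print("local maximum" if second < 0 else "local minimum")
```

The helper name `d2` is just for this sketch; the point is that the sign of the second derivative at a critical point decides between maximum and minimum in one variable.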
For functions f: R^n → R, the first-order condition translates cleanly: the gradient must be zero at a candidate point for a local extremum. The second-order condition, however, requires defining and working with second partial derivatives—derivatives of partial derivatives. The transcript builds this definition directly from limits: for example, ∂^2f/∂x2∂x1 is obtained by first taking the partial derivative of f with respect to x2, producing a new function, and then differentiating that new function with respect to x1. In general, there are n^2 second order partial derivatives, including “pure” ones like ∂^2f/∂x1^2 and “mixed” ones like ∂^2f/∂x2∂x1.
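The "derivative of a derivative" construction can be mirrored in code. The sketch below (my own illustration, with a simple test function not taken from the video) builds a finite-difference partial-derivative operator and iterates it in the order the text describes for ∂²f/∂x2∂x1: first with respect to x2, then with respect to x1. Indices are 0-based, so coordinate x1 is index 0 and x2 is index 1.

```python
def partial(f, i, h=1e-4):
    """Return a finite-difference approximation of the partial
    derivative of f with respect to coordinate i, as a new function."""
    def df(x):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        return (f(xp) - f(xm)) / (2 * h)
    return df

# Test function (an assumption for this sketch): f(x1, x2) = x1^2 * x2.
f = lambda x: x[0] ** 2 * x[1]

# ∂²f/∂x2∂x1 in the text's order: differentiate w.r.t. x2 first, then x1.
d2f = partial(partial(f, 1), 0)

# Exact value: ∂f/∂x2 = x1^2, then ∂/∂x1 gives 2*x1 = 2 at (1, 2).
print(d2f([1.0, 2.0]))
```

Each application of `partial` produces a new function, which is exactly the iterated structure in the definition.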
A key complication is that the order of differentiation can matter. Swapping the order of mixed partial derivatives—going from ∂^2f/∂x2∂x1 to ∂^2f/∂x1∂x2—may produce different results in general. The transcript then narrows to a concrete example to show what happens when order does not change. Take f(x1, x2) = sin(x1·x2). The first partial derivatives are computed using the one-variable chain rule: differentiating with respect to x1 brings down a factor of x2 and a cosine term, while differentiating with respect to x2 brings down a factor of x1 and a cosine term.
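The two first partial derivatives of f(x1, x2) = sin(x1·x2) can be written out and sanity-checked against a finite difference. The formulas follow the chain-rule computation in the text; the check point (0.7, 1.3) is an arbitrary choice for this sketch.

```python
import math

def f(x1, x2):
    return math.sin(x1 * x2)

def df_dx1(x1, x2):
    return x2 * math.cos(x1 * x2)   # chain rule brings down a factor x2

def df_dx2(x1, x2):
    return x1 * math.cos(x1 * x2)   # chain rule brings down a factor x1

h = 1e-6
x1, x2 = 0.7, 1.3
approx = (f(x1 + h, x2) - f(x1 - h, x2)) / (2 * h)
print(abs(approx - df_dx1(x1, x2)))  # should be near zero
```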
From there, the four second order partial derivatives are formed. The pure second derivatives (with respect to x1 twice, or x2 twice) simplify to expressions involving −x2^2·sin(x1·x2) and −x1^2·sin(x1·x2). The mixed derivatives require the product rule because the first partial derivatives are products of factors (like x2 or x1) with trigonometric functions. After computing both mixed derivatives—∂^2f/∂x2∂x1 and ∂^2f/∂x1∂x2—the results match exactly for this example. That equality is presented as a consequence of a famous theorem (named later), which gives conditions under which mixed partial derivatives commute. The takeaway is practical: second-order information is essential for multivariable extremum tests, but whether mixed derivatives agree depends on the function’s regularity—something the next video is set to formalize.
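The four second order partial derivatives described above can be written out explicitly for f(x1, x2) = sin(x1·x2). The mixed derivative uses the product rule on x2·cos(x1·x2); since both orders of mixed differentiation give the same formula for this example, one function covers both. The evaluation point is arbitrary.

```python
import math

def d2_x1x1(x1, x2):
    # ∂²f/∂x1² = -x2^2 * sin(x1*x2)
    return -x2 ** 2 * math.sin(x1 * x2)

def d2_x2x2(x1, x2):
    # ∂²f/∂x2² = -x1^2 * sin(x1*x2)
    return -x1 ** 2 * math.sin(x1 * x2)

def d2_mixed(x1, x2):
    # Product rule on ∂f/∂x1 = x2*cos(x1*x2):
    # ∂/∂x2 [x2*cos(x1*x2)] = cos(x1*x2) - x1*x2*sin(x1*x2).
    # The same formula results from differentiating in the other order.
    return math.cos(x1 * x2) - x1 * x2 * math.sin(x1 * x2)

print(d2_mixed(0.5, 2.0))
```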
Cornell Notes
Second order partial derivatives extend the one-variable “second derivative test” idea to multivariable functions by providing the missing second-derivative information needed for sufficient conditions on local extrema. For f: R^n → R, second partial derivatives are defined as derivatives of partial derivatives, such as ∂^2f/∂x2∂x1 obtained by differentiating ∂f/∂x2 again with respect to x1. In general, the order of mixed partial derivatives can matter, meaning ∂^2f/∂x2∂x1 may differ from ∂^2f/∂x1∂x2. A worked example with f(x1, x2)=sin(x1·x2) shows both mixed derivatives match, illustrating when order does not matter. A later theorem will specify the regularity conditions that guarantee this commutation.
How does the necessary condition for local extrema change when moving from one variable to many variables?
What exactly is a second order partial derivative like ∂^2f/∂x2∂x1?
Why can the order of mixed partial derivatives matter?
For f(x1,x2)=sin(x1·x2), what are the mixed second partial derivatives and do they match?
How many second order partial derivatives exist for a function of two variables?
Review Questions
- What is the multivariable analogue of the one-variable necessary condition f'(x̃)=0 for local extrema?
- Define ∂^2f/∂x2∂x1 in terms of iterated partial differentiation and explain what the order in the notation means.
- Using f(x1,x2)=sin(x1·x2), compute ∂f/∂x1 and ∂f/∂x2, then determine whether the mixed second partial derivatives agree.
Key Points
1. Higher order partial derivatives are needed to extend local extremum tests from one variable to multivariable functions.
2. For multivariable functions, the necessary condition for a local extremum is that the gradient vanishes at the candidate point.
3. Second order partial derivatives are defined as derivatives of partial derivatives, using limits just like first partial derivatives.
4. There are n^2 second order partial derivatives for f: R^n → R, including both pure and mixed derivatives.
5. In general, mixed partial derivatives may differ when the differentiation order is swapped.
6. A worked example with f(x1,x2)=sin(x1·x2) shows mixed second partial derivatives can match exactly.
7. A later theorem will provide the conditions under which mixed partial derivatives commute.