Multivariable Calculus 12 | Second Order Partial Derivatives

4 min read

Based on The Bright Side of Mathematics's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Higher order partial derivatives are needed to extend local extremum tests from one variable to multivariable functions.

Briefing

Higher order partial derivatives matter because they extend the familiar “first derivative test” for local extrema into multivariable settings—where the gradient replaces the single-variable derivative, and second derivatives become the missing ingredient for a sufficient criterion. In one dimension, a differentiable function has a necessary condition for a local extremum: the first derivative must vanish. But that condition alone isn’t enough; adding the second derivative yields a decision rule: a negative second derivative at the point signals a local maximum, while a positive second derivative signals a local minimum. Multivariable calculus follows the same logic in spirit, but the second-derivative piece is harder because there are many second derivatives, not just one.
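
As a minimal sketch of this one-dimensional decision rule (the cubic g below is a hypothetical example, not taken from the video), one can find the critical points symbolically and classify them by the sign of the second derivative:

```python
import sympy as sp

x = sp.symbols('x')
g = x**3 - 3*x          # hypothetical example function

g1 = sp.diff(g, x)      # first derivative: 3*x**2 - 3
g2 = sp.diff(g, x, 2)   # second derivative: 6*x

for point in sp.solve(sp.Eq(g1, 0), x):   # critical points: x = -1, x = 1
    curvature = g2.subs(x, point)
    kind = ('local max' if curvature < 0
            else 'local min' if curvature > 0
            else 'inconclusive')
    print(point, kind)   # -1 local max, 1 local min
```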

For functions f: R^n → R, the first-order condition translates cleanly: the gradient must be zero at a candidate point for a local extremum. The second-order condition, however, requires defining and working with second partial derivatives—derivatives of partial derivatives. The transcript builds this definition directly from limits: for example, ∂^2f/∂x2∂x1 is obtained by first taking the partial derivative of f with respect to x2, producing a new function, and then differentiating that new function with respect to x1. In general, there are n^2 second order partial derivatives, including “pure” ones like ∂^2f/∂x1^2 and “mixed” ones like ∂^2f/∂x2∂x1.
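
To make the limit construction explicit, the mixed derivative from the example unpacks as the following iterated limit (a sketch written in the transcript's convention, where the notation lists the variables in the order they are applied):

```latex
\frac{\partial^2 f}{\partial x_2 \, \partial x_1}(x)
  = \lim_{h \to 0}
    \frac{\frac{\partial f}{\partial x_2}(x + h\,e_1) - \frac{\partial f}{\partial x_2}(x)}{h},
\qquad e_1 = (1, 0, \dots, 0)^T .
```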

A key complication is that the order of differentiation can matter. Swapping the order of mixed partial derivatives—going from ∂^2f/∂x2∂x1 to ∂^2f/∂x1∂x2—may produce different results in general. The transcript then narrows to a concrete example to show what happens when order does not change. Take f(x1, x2) = sin(x1·x2). The first partial derivatives are computed using the one-variable chain rule: differentiating with respect to x1 brings down a factor of x2 and a cosine term, while differentiating with respect to x2 brings down a factor of x1 and a cosine term.
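
Written out, the first partial derivatives described above are:

```latex
f(x_1, x_2) = \sin(x_1 x_2), \qquad
\frac{\partial f}{\partial x_1} = x_2 \cos(x_1 x_2), \qquad
\frac{\partial f}{\partial x_2} = x_1 \cos(x_1 x_2).
```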

From there, the four second order partial derivatives are formed. The pure second derivatives (with respect to x1 twice, or x2 twice) simplify to expressions involving −x2^2·sin(x1·x2) and −x1^2·sin(x1·x2). The mixed derivatives require the product rule because the first partial derivatives are products of factors (like x2 or x1) with trigonometric functions. After computing both mixed derivatives—∂^2f/∂x2∂x1 and ∂^2f/∂x1∂x2—the results match exactly for this example. That equality is presented as a consequence of a famous theorem (named later), which gives conditions under which mixed partial derivatives commute. The takeaway is practical: second-order information is essential for multivariable extremum tests, but whether mixed derivatives agree depends on the function’s regularity—something the next video is set to formalize.
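
These computations are easy to verify symbolically; the following sympy sketch (not part of the original video) reproduces all four second order partial derivatives and confirms that the mixed ones agree:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = sp.sin(x1 * x2)

# all four second order partial derivatives
f_11 = sp.diff(f, x1, x1)   # pure: -x2**2*sin(x1*x2)
f_22 = sp.diff(f, x2, x2)   # pure: -x1**2*sin(x1*x2)
f_12 = sp.diff(f, x1, x2)   # mixed: x1 first, then x2 (∂^2f/∂x1∂x2 in the notes' convention)
f_21 = sp.diff(f, x2, x1)   # mixed: x2 first, then x1 (∂^2f/∂x2∂x1)

print(sp.simplify(f_12 - f_21) == 0)   # True: the mixed derivatives agree
print(f_12)                            # -x1*x2*sin(x1*x2) + cos(x1*x2)
```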

Cornell Notes

Second order partial derivatives extend the one-variable “second derivative test” idea to multivariable functions by providing the missing second-derivative information needed for sufficient conditions on local extrema. For f: R^n → R, second partial derivatives are defined as derivatives of partial derivatives, such as ∂^2f/∂x2∂x1 obtained by differentiating ∂f/∂x2 again with respect to x1. In general, the order of mixed partial derivatives can matter, meaning ∂^2f/∂x2∂x1 may differ from ∂^2f/∂x1∂x2. A worked example with f(x1, x2)=sin(x1·x2) shows both mixed derivatives match, illustrating when order does not matter. A later theorem will specify the regularity conditions that guarantee this commutation.

How does the necessary condition for local extrema change when moving from one variable to many variables?

In one dimension, a differentiable function must have f'(x̃)=0 at a local maximum or minimum. In multiple variables, the analogous necessary condition becomes ∇f(x̃)=0, meaning every first partial derivative vanishes at the candidate point x̃.
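
As a small illustration (the quadratic f below is a hypothetical example, not from the video), finding candidate points means solving the system of first partial derivatives:

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x2**2 - 2*x1      # hypothetical example; its minimum sits at (1, 0)

gradient = [sp.diff(f, v) for v in (x1, x2)]   # [2*x1 - 2, 2*x2]
critical = sp.solve(gradient, [x1, x2], dict=True)
print(critical)   # [{x1: 1, x2: 0}]: the only candidate point
```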

What exactly is a second order partial derivative like ∂^2f/∂x2∂x1?

It is defined by taking a partial derivative first, then differentiating again. Concretely, compute g(x1,x2)=∂f/∂x2, then form ∂g/∂x1. If the relevant limit exists, that value is ∂^2f/∂x2∂x1. The order in the notation records which variable is differentiated first.
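
The same iterated construction can be mimicked numerically with nested difference quotients; the sketch below approximates ∂^2f/∂x2∂x1 for a sample function (the step sizes and helper names are choices made here, not from the transcript):

```python
import math

def partial_x2(f, x1, x2, h=1e-6):
    """Central difference-quotient approximation of ∂f/∂x2 at (x1, x2)."""
    return (f(x1, x2 + h) - f(x1, x2 - h)) / (2 * h)

def mixed_x2_x1(f, x1, x2, h=1e-4):
    """Approximate ∂^2f/∂x2∂x1: differentiate w.r.t. x2 first, then x1."""
    return (partial_x2(f, x1 + h, x2) - partial_x2(f, x1 - h, x2)) / (2 * h)

f = lambda x1, x2: math.sin(x1 * x2)
print(mixed_x2_x1(f, 1.0, 2.0))              # numerical approximation
print(math.cos(2.0) - 2.0 * math.sin(2.0))   # exact: cos(x1*x2) - x1*x2*sin(x1*x2)
```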

Why can the order of mixed partial derivatives matter?

Mixed partial derivatives involve differentiating twice with respect to different variables. Without additional assumptions about smoothness, the two iterated limits that define ∂^2f/∂x2∂x1 and ∂^2f/∂x1∂x2 can fail to agree, so swapping the order may produce different results.
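
The transcript does not work out a counterexample, but a classic one (a standard textbook fact, not from the video) is f(x, y) = x·y·(x^2 - y^2)/(x^2 + y^2) with f(0, 0) = 0: its mixed partial derivatives at the origin differ. A sympy sketch of the iterated limits:

```python
import sympy as sp

x, y, h = sp.symbols('x y h')
f = x*y*(x**2 - y**2)/(x**2 + y**2)   # defined to be 0 at the origin

# f_x along the y-axis from the limit definition: f(0, y) = 0, so
# f_x(0, y) = lim_{h -> 0} f(h, y) / h
fx_axis = sp.limit(f.subs(x, h) / h, h, 0)   # -> -y
# f_y along the x-axis: f(x, 0) = 0, so f_y(x, 0) = lim_{h -> 0} f(x, h) / h
fy_axis = sp.limit(f.subs(y, h) / h, h, 0)   # -> x

# differentiate once more at the origin, again via limits
fxy = sp.limit((fx_axis - fx_axis.subs(y, 0)) / y, y, 0)   # -> -1
fyx = sp.limit((fy_axis - fy_axis.subs(x, 0)) / x, x, 0)   # -> 1
print(fxy, fyx)   # -1 1: the order of differentiation matters here
```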

For f(x1,x2)=sin(x1·x2), what are the mixed second partial derivatives and do they match?

Compute first partial derivatives using the chain rule: ∂f/∂x1 = x2 cos(x1·x2) and ∂f/∂x2 = x1 cos(x1·x2). Differentiating ∂f/∂x1 with respect to x2 (product rule) and differentiating ∂f/∂x2 with respect to x1 (also product rule) both yield the same expression for the mixed derivatives. In this example, ∂^2f/∂x2∂x1 = ∂^2f/∂x1∂x2.
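
Spelled out with the product rule (reconstructed from the description above, using the convention that the first-listed variable is differentiated first):

```latex
\frac{\partial^2 f}{\partial x_2 \, \partial x_1}
  = \frac{\partial}{\partial x_1}\bigl(x_1 \cos(x_1 x_2)\bigr)
  = \cos(x_1 x_2) - x_1 x_2 \sin(x_1 x_2),
\qquad
\frac{\partial^2 f}{\partial x_1 \, \partial x_2}
  = \frac{\partial}{\partial x_2}\bigl(x_2 \cos(x_1 x_2)\bigr)
  = \cos(x_1 x_2) - x_1 x_2 \sin(x_1 x_2).
```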

How many second order partial derivatives exist for a function of two variables?

For f(x1,x2), there are 2^2=4 second order partial derivatives: two pure ones (∂^2f/∂x1^2 and ∂^2f/∂x2^2) and two mixed ones (∂^2f/∂x2∂x1 and ∂^2f/∂x1∂x2).

Review Questions

  1. What is the multivariable analogue of the one-variable necessary condition f'(x̃)=0 for local extrema?
  2. Define ∂^2f/∂x2∂x1 in terms of iterated partial differentiation and explain what the order in the notation means.
  3. Using f(x1,x2)=sin(x1·x2), compute ∂f/∂x1 and ∂f/∂x2, then determine whether the mixed second partial derivatives agree.

Key Points

  1. Higher order partial derivatives are needed to extend local extremum tests from one variable to multivariable functions.

  2. For multivariable functions, the necessary condition for a local extremum is that the gradient vanishes at the candidate point.

  3. Second order partial derivatives are defined as derivatives of partial derivatives, using limits just like first partial derivatives.

  4. There are n^2 second order partial derivatives for f: R^n → R, including both pure and mixed derivatives.

  5. In general, mixed partial derivatives may differ when the differentiation order is swapped.

  6. A worked example with f(x1,x2)=sin(x1·x2) shows mixed second partial derivatives can match exactly.

  7. A later theorem will provide the conditions under which mixed partial derivatives commute.

Highlights

The gradient condition ∇f(x̃)=0 replaces the single-variable requirement f'(x̃)=0 for candidate extrema.
Second order partial derivatives like ∂^2f/∂x2∂x1 are built by differentiating ∂f/∂x2 again with respect to x1.
Mixed partial derivatives can disagree in general, but the example f(x1,x2)=sin(x1·x2) produces equality.
The equality of mixed partial derivatives is tied to a famous theorem introduced for the next step.
