Multivariable Calculus 30 | Example for Lagrange Multipliers
Based on a video by The Bright Side of Mathematics on YouTube. If you like this content, support the original creators by watching, liking, and subscribing to their channel.
Briefing
Lagrange multipliers pin down exactly where a linear function reaches its highest and lowest values on a curved constraint set: the cylinder–plane intersection. The setup uses a function on R^3, f(x1,x2,x3)=2x1+3x2+2x3, and a constraint defined by two equations: x1^2+x2^2=2 (a cylinder of radius √2 around the x3-axis) and x1+x3=1 (a plane). Their intersection is a closed, bounded curve (an ellipse); the goal is to find candidate points for extrema of f along that curve.
The method requires two ingredients: regularity of the constraint and a “gradient balance” condition. The two constraints are packaged into a vector-valued map G:R^3→R^2 with components (x1^2+x2^2−2, x1+x3−1). Checking the Jacobian of G, the rank is 2 everywhere on the constraint set. The only potential rank drop would occur when x1=x2=0, but that cannot happen on G=0 because the first constraint would then force −2=0, a contradiction. With regularity secured, the necessary condition for constrained extrema becomes
∇f(x)=λ1∇G1(x)+λ2∇G2(x).
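The regularity check above is easy to reproduce symbolically. A minimal sketch (the use of sympy is my choice, not part of the source):

```python
# Symbolic check that the Jacobian of G has rank 2 everywhere on G = 0.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', real=True)
G = sp.Matrix([x1**2 + x2**2 - 2, x1 + x3 - 1])
J = G.jacobian([x1, x2, x3])
print(J)  # Matrix([[2*x1, 2*x2, 0], [1, 0, 1]])

# The rank drops below 2 only when the first row vanishes, i.e. x1 = x2 = 0;
# but then the first constraint would read -2 = 0, so no such point lies on G = 0.
print(J.subs({x1: 0, x2: 0}).rank())  # rank 1, at a point NOT on the constraint set
```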
Because f is linear, ∇f is constant: (2,3,2). Meanwhile, ∇G1=(2x1,2x2,0) and ∇G2=(1,0,1). Solving the resulting system yields three gradient component equations plus the two constraint equations. The third gradient component immediately gives 2=λ2, letting λ2 be eliminated. Subtracting 2 from the first component equation forces 2λ1x1=0, so either x1=0 or λ1=0. The alternative λ1=0 fails because it would make the second component equation 3=0, impossible. Therefore x1 must equal 0.
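The full system — three gradient-component equations plus the two constraints — can also be handed to a solver directly. A sketch, again assuming sympy:

```python
# Solve grad f = l1*grad G1 + l2*grad G2 together with G = 0.
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols('x1 x2 x3 l1 l2', real=True)
system = [
    sp.Eq(2, 2*l1*x1 + l2),   # first gradient component
    sp.Eq(3, 2*l1*x2),        # second gradient component
    sp.Eq(2, l2),             # third gradient component, giving l2 = 2
    sp.Eq(x1**2 + x2**2, 2),  # cylinder constraint
    sp.Eq(x1 + x3, 1),        # plane constraint
]
for sol in sp.solve(system, [x1, x2, x3, l1, l2], dict=True):
    print(sol)
# Exactly two solutions, both with x1 = 0, x3 = 1, l2 = 2 and x2 = ±sqrt(2).
```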
With x1=0, the constraints simplify: x1^2+x2^2=2 becomes x2^2=2, giving x2=±√2, and x1+x3=1 becomes x3=1. That produces exactly two candidate points on the constraint curve: (0,√2,1) and (0,−√2,1). Evaluating f at these points shows which is which: f(0,√2,1)=3√2+2 is the maximum, while f(0,−√2,1)=−3√2+2 is the minimum.
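Evaluating f at the two candidates is a one-line check; a quick numeric confirmation of the values above:

```python
import math

def f(x1, x2, x3):
    return 2*x1 + 3*x2 + 2*x3

p_max = (0.0,  math.sqrt(2), 1.0)  # candidate (0, sqrt(2), 1)
p_min = (0.0, -math.sqrt(2), 1.0)  # candidate (0, -sqrt(2), 1)
print(f(*p_max))  # 3*sqrt(2) + 2, about 6.2426
print(f(*p_min))  # -3*sqrt(2) + 2, about -2.2426
```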
Finally, the argument is completed using compactness: the constraint set {x : G(x)=0} is an ellipse, hence closed and bounded, so the continuous function f attains both a maximum and a minimum on it. Since only two candidates satisfy the Lagrange condition, the lower value must be the minimum and the higher value the maximum. The result is a clean demonstration of how Lagrange multipliers convert a geometric optimization problem into an algebraic system whose solutions identify the extremal points.
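The compactness argument can be cross-checked numerically by parametrizing the ellipse with the cylinder angle, x1 = √2 cos t, x2 = √2 sin t, x3 = 1 − x1 (this parametrization is my addition, not in the source):

```python
import math

def f(x1, x2, x3):
    return 2*x1 + 3*x2 + 2*x3

# Sample f along the intersection curve. On the curve, f collapses to
# 2 + 3*sqrt(2)*sin(t), so the sampled extremes should approach the
# Lagrange values 3*sqrt(2) + 2 and 2 - 3*sqrt(2).
s = math.sqrt(2)
vals = [f(s*math.cos(t), s*math.sin(t), 1 - s*math.cos(t))
        for t in (2*math.pi*k/100000 for k in range(100000))]
print(max(vals), min(vals))
```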
Cornell Notes
A linear objective f(x1,x2,x3)=2x1+3x2+2x3 is optimized on the intersection of a cylinder and a plane: x1^2+x2^2=2 and x1+x3=1. The constraints are combined into G(x)=(x1^2+x2^2−2, x1+x3−1), and the Jacobian rank is checked to be 2 everywhere on the constraint set, so Lagrange multipliers apply. The gradient condition ∇f=λ1∇G1+λ2∇G2 leads to a solvable system; the equations force x1=0, then x2=±√2 and x3=1. Evaluating f at the two candidates shows (0,−√2,1) gives the minimum and (0,√2,1) gives the maximum. Compactness ensures these candidates are the true extrema.
How are the two constraints turned into a single Lagrange-multiplier setup?
Why does the Jacobian rank check matter, and what rank is required here?
What does the gradient equation look like for this specific problem?
How does the system force x1=0?
Once x1=0, how are the two candidate points determined and how are they classified?
Review Questions
- What regularity (Jacobian rank) condition is needed for Lagrange multipliers when there are two constraints in R^3, and how is it verified here?
- Write the Lagrange multiplier gradient equation for this problem and identify which component equations lead to λ2=2 and x1=0.
- After solving the constraints with x1=0, what are the two candidate points and which one gives the maximum value of f?
Key Points
- 1
Combine multiple constraints into a vector function G(x) so the constraint set is G(x)=0.
- 2
Check the Jacobian rank of G on the constraint set; here the required maximal rank is 2 and it holds everywhere on G=0.
- 3
Use the necessary condition ∇f=λ1∇G1+λ2∇G2 to turn constrained optimization into a system of equations.
- 4
For this linear f, ∇f is constant, simplifying the gradient-balance equations.
- 5
Component-wise matching of ∇f=λ1∇G1+λ2∇G2 can quickly eliminate multipliers (here λ2=2).
- 6
The algebra forces x1=0, then the constraints yield x2=±√2 and x3=1, producing exactly two candidates.
- 7
Evaluating f at the candidates identifies the minimum and maximum values, with compactness ensuring no other extrema exist.