Extrema and Lagrange Multipliers
Multivariable extrema extend optimization to surfaces and constraints. Without constraints, critical points occur where the gradient is zero or undefined. With constraints, the gradient of the objective must align with the gradient of the constraint at any optimum that lies on a smooth piece of the constraint set.
The geometry is more important than the algebra. At an unconstrained local maximum or minimum, every first-order direction has zero change. On a constraint curve or surface, only directions tangent to the constraint are allowed, so the objective gradient must be perpendicular to those tangent directions.
Definitions
For $f(x, y)$, a critical point is a point $(a, b)$ where
$$f_x(a, b) = 0 \quad \text{and} \quad f_y(a, b) = 0,$$
or where one of the first partial derivatives fails to exist.
A local maximum occurs at $(a, b)$ if
$$f(x, y) \le f(a, b)$$
for all $(x, y)$ near $(a, b)$. A local minimum is defined with $\ge$.
The second derivative discriminant for $f$ is
$$D = f_{xx} f_{yy} - (f_{xy})^2.$$
For constrained optimization with constraint
$$g(x, y) = 0,$$
the Lagrange multiplier equations are
$$\nabla f = \lambda \nabla g, \qquad g(x, y) = 0.$$
For more than one constraint, use one multiplier for each constraint:
$$\nabla f = \lambda \nabla g + \mu \nabla h, \qquad g = 0, \qquad h = 0.$$
Key results
The second derivative test for a critical point $(a, b)$ of $f$ says:
If $D(a, b) > 0$ and $f_{xx}(a, b) > 0$, then $(a, b)$ is a local minimum. If $D(a, b) > 0$ and $f_{xx}(a, b) < 0$, then it is a local maximum. If $D(a, b) < 0$, then it is a saddle point. If $D(a, b) = 0$, the test is inconclusive.
This test comes from the quadratic part of the Taylor expansion:
$$f(a + h, b + k) \approx f(a, b) + \tfrac{1}{2}\left(f_{xx} h^2 + 2 f_{xy} hk + f_{yy} k^2\right).$$
The sign of this quadratic form determines whether the surface bends up in all directions, down in all directions, or in mixed directions.
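This expansion can be checked numerically. A minimal sketch, assuming the sample function $f(x, y) = \cos x + y^2$ (chosen only for illustration), which has a critical point at the origin with $f_{xx} = -1$, $f_{yy} = 2$, $f_{xy} = 0$:

```python
import math

def f(x, y):
    # Sample function with a critical point at the origin.
    return math.cos(x) + y**2

# Second partials of f at (0, 0), computed by hand.
fxx, fxy, fyy = -1.0, 0.0, 2.0

h, k = 1e-3, 2e-3
actual = f(h, k) - f(0.0, 0.0)
quadratic = 0.5 * (fxx * h**2 + 2 * fxy * h * k + fyy * k**2)

# For small h and k, the quadratic term captures almost all of the change.
print(actual, quadratic, abs(actual - quadratic))
```

The residual shrinks like the fourth power of the step size, which is why the sign of the quadratic form controls the local behavior.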
For constrained extrema, suppose $f$ has an extremum on the smooth constraint curve $g(x, y) = 0$, and $\nabla g \ne \mathbf{0}$. Tangent vectors $\mathbf{T}$ to the constraint satisfy
$$\nabla g \cdot \mathbf{T} = 0.$$
At a constrained extremum, the directional derivative of $f$ in every allowed tangent direction is zero:
$$\nabla f \cdot \mathbf{T} = 0.$$
Thus $\nabla f$ and $\nabla g$ are both normal to the same tangent line, so they are parallel:
$$\nabla f = \lambda \nabla g.$$
Absolute extrema on closed and bounded regions require checking all candidates: interior critical points, boundary points, corners, and constraint candidates. Lagrange multipliers handle smooth boundary pieces but not automatically corners or nonsmooth boundaries.
The multiplier $\lambda$ often has a sensitivity interpretation. In many applied problems, it measures how the optimal value changes as the constraint level changes, although that interpretation depends on regularity and the specific formulation.
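A numeric sketch of this interpretation, using the hypothetical problem of maximizing $f(x, y) = xy$ on the circle $x^2 + y^2 = c$: the maximum value is $c/2$, and the multiplier works out to $\lambda = \tfrac{1}{2}$, so the slope of the optimal value with respect to $c$ should be about $\tfrac{1}{2}$.

```python
import math

def max_xy_on_circle(c, steps=100000):
    # Brute-force maximum of f(x, y) = x*y over the circle x^2 + y^2 = c.
    r = math.sqrt(c)
    best = -float("inf")
    for i in range(steps):
        t = 2 * math.pi * i / steps
        best = max(best, (r * math.cos(t)) * (r * math.sin(t)))
    return best

# Finite-difference slope of the optimal value with respect to the
# constraint level c; it should approximate the multiplier lambda = 1/2.
c, dc = 1.0, 1e-3
slope = (max_xy_on_circle(c + dc) - max_xy_on_circle(c)) / dc
print(slope)
```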
The Hessian matrix organizes the second derivative test:
$$H = \begin{pmatrix} f_{xx} & f_{xy} \\ f_{xy} & f_{yy} \end{pmatrix}.$$
At a critical point, the quadratic approximation is
$$f(a + h, b + k) \approx f(a, b) + \tfrac{1}{2} \begin{pmatrix} h & k \end{pmatrix} H \begin{pmatrix} h \\ k \end{pmatrix}.$$
If this quadratic form is positive in every nonzero direction, the point is a local minimum. If it is negative in every nonzero direction, the point is a local maximum. If it takes both signs, the point is a saddle.
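The classification rules above translate directly into a small helper. A sketch, assuming the second partials at the critical point are already known:

```python
def classify(fxx, fxy, fyy):
    # Apply the second derivative test to a critical point whose
    # second partials are fxx, fxy, fyy.
    d = fxx * fyy - fxy**2
    if d > 0:
        # D > 0 forces fxx != 0, so its sign decides min versus max.
        return "local minimum" if fxx > 0 else "local maximum"
    if d < 0:
        return "saddle point"
    return "inconclusive"

print(classify(2, 1, 2))    # local minimum
print(classify(-2, 0, -1))  # local maximum
print(classify(2, 0, -2))   # saddle point
```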
For absolute extrema on a closed disk, rectangle, triangle, or other bounded region, a complete search separates the region into interior and boundary. Interior points use $\nabla f = \mathbf{0}$. Boundary curves may be parametrized, handled with single-variable calculus, or handled with Lagrange multipliers. Corners must be checked directly.
For Lagrange multipliers, the condition $\nabla f = \lambda \nabla g$ is necessary under smoothness assumptions, not automatically sufficient. After solving the system, compare objective values at all feasible candidates. If the feasible set is compact and $f$ is continuous, absolute maximum and minimum values exist.
In three variables with one constraint, the same idea applies. Optimizing $f(x, y, z)$ subject to $g(x, y, z) = 0$ gives
$$\nabla f = \lambda \nabla g, \qquad g(x, y, z) = 0.$$
With two constraints $g = 0$ and $h = 0$, the feasible set is often a curve, and the objective gradient lies in the span of both constraint gradients: $\nabla f = \lambda \nabla g + \mu \nabla h$.
Saddle points are common in multivariable calculus. The surface
$$z = x^2 - y^2$$
has $f_x(0, 0) = f_y(0, 0) = 0$, but along the $x$-axis the function is positive and along the $y$-axis it is negative. Therefore every neighborhood of the origin contains both higher and lower values. The second derivative discriminant detects this because
$$D = (2)(-2) - 0^2 = -4 < 0.$$
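A quick numeric check of this sign pattern on the saddle surface:

```python
def f(x, y):
    # The saddle surface z = x**2 - y**2.
    return x**2 - y**2

# In every neighborhood of the origin, points on the x-axis give
# positive values and points on the y-axis give negative values.
for eps in (1.0, 1e-3, 1e-6):
    print(f(eps, 0.0) > 0, f(0.0, eps) < 0)  # True True at every scale
```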
Boundary analysis can reduce a multivariable problem to one-variable calculus. On the rectangle $0 \le x \le a$, $0 \le y \le b$, each edge fixes one variable. For example, on the edge $y = 0$, the function becomes the one-variable function $u(x) = f(x, 0)$ with $0 \le x \le a$. Critical points on that edge come from differentiating with respect to the remaining variable, and the corner values are checked separately.
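A complete candidate search of this kind can be scripted. A sketch with a hypothetical objective $f(x, y) = x^2 + y^2 - 2x - y$ on the rectangle $0 \le x \le 2$, $0 \le y \le 1$, where the edge and interior critical points are found by hand:

```python
def f(x, y):
    # Hypothetical objective, chosen for illustration.
    return x**2 + y**2 - 2*x - y

a, b = 2.0, 1.0
candidates = []

# Interior critical point: f_x = 2x - 2 = 0 and f_y = 2y - 1 = 0.
candidates.append((1.0, 0.5))

# Edge critical points from one-variable calculus on each fixed edge.
candidates += [(1.0, 0.0), (1.0, b), (0.0, 0.5), (a, 0.5)]

# Corners must be checked directly.
candidates += [(0.0, 0.0), (a, 0.0), (0.0, b), (a, b)]

values = [(f(x, y), (x, y)) for (x, y) in candidates]
print(min(values))  # (-1.25, (1.0, 0.5))
print(max(values))  # maximum value 0.0, attained at the corners
```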
Lagrange multipliers have a clear level-curve picture. At a constrained extremum of on , the level curve of just touches the constraint curve. If the two curves crossed transversely, moving along the constraint would pass to a higher or lower level of , contradicting extremality. Tangency means their normals, the gradients, are parallel.
In applied problems, constraints often encode limited resources. A maximum volume subject to fixed surface area, a minimum cost subject to production requirements, or a shortest distance subject to lying on a curve all fit this pattern. The multiplier equations are only useful after the objective and constraint correctly represent the situation.
Visual
| Case | Candidate condition | Classification step |
|---|---|---|
| Unconstrained interior | $\nabla f = \mathbf{0}$ | Hessian or second derivative test |
| Closed rectangle | critical points and boundary | compare all values |
| Smooth constraint | $\nabla f = \lambda \nabla g$, $g = 0$ | compare feasible candidates |
| Nonsmooth boundary | separate pieces and corners | direct comparison |
| Saddle point | $D < 0$ | neither max nor min |
Worked example 1: classify unconstrained critical points
Problem. Find and classify the critical point of
$$f(x, y) = x^2 + xy + y^2 - 3x.$$
Method.
- Compute first partial derivatives:
$$f_x = 2x + y - 3, \qquad f_y = x + 2y.$$
- Set them equal to zero:
$$2x + y - 3 = 0, \qquad x + 2y = 0.$$
- Solve the system. From the first equation,
$$y = 3 - 2x.$$
- Substitute into the second:
$$x + 2(3 - 2x) = 0.$$
- Simplify:
$$6 - 3x = 0, \quad \text{so} \quad x = 2.$$
- Then
$$y = 3 - 2(2) = -1.$$
- Compute second partials:
$$f_{xx} = 2, \qquad f_{yy} = 2, \qquad f_{xy} = 1.$$
- Compute the discriminant:
$$D = (2)(2) - 1^2 = 3.$$
Since $D > 0$ and $f_{xx} > 0$, the point is a local minimum.
Checked answer. The function has a local minimum at $(2, -1)$. The value is $f(2, -1) = -3$.
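A classification like this can be sanity-checked with central differences. A sketch, assuming a sample function $f(x, y) = x^2 + xy + y^2 - 3x$ with candidate critical point $(2, -1)$:

```python
def f(x, y):
    # Sample function with a critical point at (2, -1) (for illustration).
    return x**2 + x*y + y**2 - 3*x

a, b, h = 2.0, -1.0, 1e-4

# Central differences approximate the first partials; both should
# vanish at a critical point.
fx = (f(a + h, b) - f(a - h, b)) / (2 * h)
fy = (f(a, b + h) - f(a, b - h)) / (2 * h)

# Second partials feed the discriminant D = fxx*fyy - fxy**2.
fxx = (f(a + h, b) - 2 * f(a, b) + f(a - h, b)) / h**2
fyy = (f(a, b + h) - 2 * f(a, b) + f(a, b - h)) / h**2
fxy = (f(a + h, b + h) - f(a + h, b - h)
       - f(a - h, b + h) + f(a - h, b - h)) / (4 * h**2)
D = fxx * fyy - fxy**2

print(fx, fy)   # both near 0
print(fxx, D)   # near 2 and 3, so D > 0 and fxx > 0: local minimum
```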
Worked example 2: constrained optimization with Lagrange multipliers
Problem. Maximize and minimize
$$f(x, y) = xy$$
subject to
$$x^2 + y^2 = 1.$$
Method.
- Let
$$g(x, y) = x^2 + y^2 - 1.$$
The constraint is $g(x, y) = 0$.
- Compute gradients:
$$\nabla f = \langle y, x \rangle, \qquad \nabla g = \langle 2x, 2y \rangle.$$
- Set $\nabla f = \lambda \nabla g$:
$$y = 2\lambda x, \qquad x = 2\lambda y.$$
- If $\lambda = 0$, then the second equation gives $x = 0$. The constraint forces $y = \pm 1$, and then the first equation gives $y = 0$, impossible. So $\lambda \ne 0$. Similarly $x \ne 0$ and $y \ne 0$.
- Multiply the first equation by $y$ and the second by $x$:
$$y^2 = 2\lambda xy, \qquad x^2 = 2\lambda xy.$$
Thus
$$y^2 = x^2.$$
- So $y = x$ or $y = -x$.
- If $y = x$, then
$$2x^2 = 1, \quad \text{so} \quad x = \pm \tfrac{1}{\sqrt{2}}.$$
The points are $\left(\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right)$ and $\left(-\tfrac{1}{\sqrt{2}}, -\tfrac{1}{\sqrt{2}}\right)$, with $f = \tfrac{1}{2}$.
- If $y = -x$, the points are $\left(\tfrac{1}{\sqrt{2}}, -\tfrac{1}{\sqrt{2}}\right)$ and $\left(-\tfrac{1}{\sqrt{2}}, \tfrac{1}{\sqrt{2}}\right)$, with $f = -\tfrac{1}{2}$.
Checked answer. The maximum value is $\tfrac{1}{2}$, and the minimum value is $-\tfrac{1}{2}$ on the unit circle.
The results also follow from the identity
$$2xy = (x + y)^2 - (x^2 + y^2) = (x + y)^2 - 1 \quad \text{on the circle}.$$
Since $(x + y)^2 \ge 0$, this gives $xy \ge -\tfrac{1}{2}$. Similarly, $2xy = 1 - (x - y)^2 \le 1$ gives $xy \le \tfrac{1}{2}$. These inequalities confirm the Lagrange multiplier values and show they are absolute extrema.
The feasible set is the unit circle, which is closed and bounded, and $f$ is continuous. Therefore absolute maximum and minimum values must exist. This justifies comparing the finite list of Lagrange candidates.
If the constraint were not closed or bounded, a finite candidate list might not tell the whole story. Always match the existence argument to the feasible set.
Code
```python
import math

def f(x, y):
    return x * y

# Sample f on the unit circle x^2 + y^2 = 1 in one-degree steps.
samples = []
for i in range(361):
    theta = math.radians(i)
    x = math.cos(theta)
    y = math.sin(theta)
    samples.append((f(x, y), x, y))

# Each sample is (value, x, y), so max and min compare by value first.
print(max(samples))
print(min(samples))
```
Common pitfalls
- Forgetting points where partial derivatives do not exist.
- Using the second derivative test when $D = 0$ as if it classified the point.
- Checking only interior critical points on a closed region and missing boundary extrema.
- Applying Lagrange multipliers when $\nabla g = \mathbf{0}$ at a candidate without separate analysis.
- Solving $\nabla f = \lambda \nabla g$ but forgetting the constraint equation.
- Reporting candidates instead of objective values when asked for maxima and minima.
Connections
- Partial Derivatives and the Gradient: gradients and Hessian entries drive multivariable optimization.
- Applications of Derivatives: second derivative testing generalizes from one variable.
- Multiple Integrals: extrema help bound and interpret multivariable integrals.
- Vector Calculus: constrained geometry reappears in surfaces and flux orientation.