Chapter 4: Partial Differentiation

Section 4.8: Unconstrained Optimization

Essentials


${}$
• Critical points for a function of several variables are points where the function or at least one of its first partial derivatives is undefined, or where the first partial derivatives vanish simultaneously.

• Points where the first partial derivatives of $f$ vanish simultaneously are found as solutions of the equations contained in $\nabla f = \mathbf{0}$.

• Table 4.8.1 contains a statement of the Second-Derivative test for a critical point P that is a solution of $\nabla f(x, y) = \mathbf{0}$.

$T(\mathrm{P}) = \left( f_{xx}\, f_{yy} - f_{xy}^{2} \right) \Big|_{\mathrm{P}}$

$T(\mathrm{P}) > 0$ and $\begin{cases} f_{xx}(\mathrm{P}) > 0 \;\Rightarrow\; f(\mathrm{P}) \text{ is a local minimum} \\ f_{xx}(\mathrm{P}) < 0 \;\Rightarrow\; f(\mathrm{P}) \text{ is a local maximum} \end{cases}$

$T(\mathrm{P}) < 0 \Rightarrow$ P is a saddle point

$T(\mathrm{P}) = 0 \Rightarrow$ test fails and no conclusion can be drawn

Table 4.8.1 Second-Derivative test for $f(x, y)$
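The test in Table 4.8.1 can be sketched in code. The following Python/sympy snippet is my own illustration, not part of the guide; the sample function $x^2 - y^2$ is likewise an assumption chosen only to exercise the saddle-point branch.

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

def second_derivative_test(f):
    """Classify each stationary point of f(x, y) by the test of Table 4.8.1."""
    fx, fy = sp.diff(f, x), sp.diff(f, y)
    # T = f_xx * f_yy - f_xy**2, evaluated at each stationary point below.
    T = sp.diff(f, x, 2) * sp.diff(f, y, 2) - sp.diff(f, x, y) ** 2
    results = {}
    for pt in sp.solve([fx, fy], [x, y], dict=True):
        Tp, fxx = T.subs(pt), sp.diff(f, x, 2).subs(pt)
        if Tp > 0:
            kind = 'local minimum' if fxx > 0 else 'local maximum'
        elif Tp < 0:
            kind = 'saddle point'
        else:
            kind = 'test fails'
        results[(pt[x], pt[y])] = kind
    return results

# Sample (not one of the guide's examples): T = -4 < 0 at the origin, a saddle.
print(second_derivative_test(x**2 - y**2))
```

The function returns a dictionary mapping each stationary point to its classification, mirroring the four rows of the table.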



• A point P is a saddle point if every neighborhood of P contains points at which $f > f(\mathrm{P})$ and points at which $f < f(\mathrm{P})$. In other words, the tangent plane at P intersects the surface at P. Such a point is a stationary point, but not an extreme point.

• The Second-Derivative test stated in Table 4.8.1 is a special case of a more general test that extends to functions of more than two variables. This generalization, stated in Table 4.8.2, is based on the Routh-Hurwitz criterion for quadratic forms. (Some authors, including this one, see the test as based on Sylvester's Law of Inertia.)

$H(\mathrm{P})$ is the Hessian for $f(x_1, \dots, x_n)$ evaluated at P

$S$ is the sequence $1, Q_1, \dots, Q_n$, where $Q_k \ne 0$ is the $k$th principal minor of $H(\mathrm{P})$

Signs of $S$ strictly alternate $\Rightarrow$ P is a local maximum

Signs of $S$ are all the same $\Rightarrow$ P is a local minimum

Signs of $S$ neither alternate nor are all the same $\Rightarrow$ P is stationary, but not extreme

At least one $Q_k = 0$ $\Rightarrow$ test fails and no conclusion can be drawn

Table 4.8.2 Generalized Second-Derivative test



• The principal minor $Q_k$ is the determinant of the $k \times k$ submatrix of $H(\mathrm{P})$ whose main diagonal coincides with the main diagonal of $H(\mathrm{P})$, which starts with the $(1,1)$-element of $H(\mathrm{P})$, and whose last row and column are the $k$th row and column of $H(\mathrm{P})$.
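The procedure of Table 4.8.2 can be sketched directly from this definition. The Python/sympy function below is my own illustration (not the guide's code): it builds the Hessian, takes determinants of the nested upper-left submatrices, and reads off the signs of $1, Q_1, \dots, Q_n$.

```python
import sympy as sp

def generalized_test(f, variables, point):
    """Apply the generalized Second-Derivative test (Table 4.8.2) at a stationary point."""
    H = sp.hessian(f, variables).subs(dict(zip(variables, point)))
    # Q_k = determinant of the upper-left k x k submatrix of H(P).
    Q = [H[:k, :k].det() for k in range(1, len(variables) + 1)]
    if any(q == 0 for q in Q):
        return 'test fails'
    if all(q > 0 for q in Q):
        return 'local minimum'            # signs of 1, Q_1, ..., Q_n all the same
    if all(sp.sign(q) == (-1) ** k for k, q in enumerate(Q, start=1)):
        return 'local maximum'            # signs strictly alternate: +, -, +, -, ...
    return 'stationary, but not extreme'

x, y, z = sp.symbols('x y z', real=True)
# Sample function (an assumption, not one of the guide's examples):
print(generalized_test(x**2 + y**2 + z**2, (x, y, z), (0, 0, 0)))
```

For the sample the minors are $2, 4, 8$, all positive, so the origin is a local minimum.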

• Table 4.8.3 provides some insight into the generalized Second-Derivative test stated in Table 4.8.2.

• In any form of the Second-Derivative test, the tested function is expanded in a Taylor series about the point P. Since the first-derivative terms vanish at the stationary point P, the function near P is approximately $f(\mathrm{P}) + \frac{1}{2} \mathbf{X}^{\mathrm{T}} H(\mathrm{P}) \mathbf{X}$, where $\mathbf{X} = \mathbf{R} - \mathbf{P}$, $\mathbf{R}$ is the position vector to the general point, and $\mathbf{X}^{\mathrm{T}}$ is a row vector, the transpose of $\mathbf{X}$.

• The values of $f$ near P then depend on whether the quadratic form comprising the second-derivative terms is always positive, always negative, or sometimes positive and sometimes negative for $\mathbf{R}$ near P. If the symmetric matrix $H(\mathrm{P})$ is positive definite, the quadratic form is always positive near P; if it is negative definite, the form is always negative near P. If the form takes both positive and negative values near P, it is indefinite. Thus, the Routh-Hurwitz and Sylvester criteria essentially determine whether the quadratic form associated with the Hessian is positive definite, negative definite, or indefinite.

• If the quadratic form is positive definite, then it adds to the value of $f(\mathrm{P})$, so $f(\mathrm{P})$ is a local minimum.

• If the quadratic form is negative definite, then it subtracts from the value of $f(\mathrm{P})$, so $f(\mathrm{P})$ is a local maximum.

• If the quadratic form is indefinite, then it both adds to and subtracts from the value of $f(\mathrm{P})$, so P is a stationary, but not an extreme, point.


Table 4.8.3 Remarks on the theoretical framework for the Second-Derivative test
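These remarks can be checked numerically. The numpy sketch below is my own illustration (the function $x^2 + 3y^2$ is an assumption, not one of the guide's examples): it compares $f$ with the quadratic model $f(\mathrm{P}) + \frac{1}{2}\mathbf{X}^{\mathrm{T}} H(\mathrm{P}) \mathbf{X}$ near a stationary point, and reads the definiteness of $H(\mathrm{P})$ from its eigenvalues, an equivalent alternative to the principal-minor test for symmetric matrices.

```python
import numpy as np

def f(x, y):
    return x**2 + 3*y**2            # sample function, stationary at the origin

P = np.zeros(2)                      # stationary point P = (0, 0)
H = np.array([[2.0, 0.0],            # Hessian of the sample f (constant here)
              [0.0, 6.0]])
X = np.array([0.01, -0.02])          # small displacement X = R - P

quadratic_model = f(*P) + 0.5 * X @ H @ X
exact = f(*(P + X))                  # the model matches f (exactly, since f is quadratic)

eigenvalues = np.linalg.eigvalsh(H)  # all positive: H positive definite, a local minimum
```

Positive eigenvalues make the quadratic form add to $f(\mathrm{P})$ for every direction $\mathbf{X}$, which is exactly the first remark in the table.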





Examples


Example 4.8.1

Find and classify the critical (i.e., stationary) points for $f(x, y) = 7 - 3x^2 - 5y^2 + 6x - 9y + 4$.
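A sympy sketch of this computation (my own, not the guide's worked solution), taking $f = 7 - 3x^2 - 5y^2 + 6x - 9y + 4$; the restored minus signs are my reading of the garbled source:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = 7 - 3*x**2 - 5*y**2 + 6*x - 9*y + 4   # reconstructed signs: an assumption

fx, fy = sp.diff(f, x), sp.diff(f, y)
crit = sp.solve([fx, fy], [x, y], dict=True)   # one stationary point: (1, -9/10)
T = sp.diff(f, x, 2)*sp.diff(f, y, 2) - sp.diff(f, x, y)**2
fxx = sp.diff(f, x, 2)
# T = 60 > 0 and f_xx = -6 < 0, so by Table 4.8.1 the point is a local maximum.
print(crit, T, fxx)
```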

Example 4.8.2

Find and classify the critical (i.e., stationary) points for $f(x, y) = xy - x^2 - y^2 - 2x - 2y + 4$.

Example 4.8.3

Find and classify the critical (i.e., stationary) points for $f(x, y) = xy + 2x - 3y + 1$.

Example 4.8.4

Find and classify the critical (i.e., stationary) points for $f(x, y) = x^3 - y^3 - 3xy + 4$.
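A sympy sketch for this example, taking $f = x^3 - y^3 - 3xy + 4$ (the restored signs are my reading of the garbled source). With this reading, the test finds a saddle at the origin and a local maximum at $(-1, 1)$:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**3 - y**3 - 3*x*y + 4        # reconstructed signs: an assumption

fx, fy = sp.diff(f, x), sp.diff(f, y)
# Keep only the real stationary points of the cubic system.
crit = [p for p in sp.solve([fx, fy], [x, y], dict=True)
        if all(v.is_real for v in p.values())]
T = sp.diff(f, x, 2)*sp.diff(f, y, 2) - sp.diff(f, x, y)**2   # T = -36*x*y - 9
for p in crit:
    print(p, T.subs(p), sp.diff(f, x, 2).subs(p))
```

At $(0,0)$, $T = -9 < 0$ (saddle); at $(-1,1)$, $T = 27 > 0$ with $f_{xx} = -6 < 0$ (local maximum).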

Example 4.8.5

Find and classify the critical (i.e., stationary) points for $f(x, y) = 2xy - 3x^4 - 5y^4 + 2y + 7$.

Example 4.8.6

Find and classify the critical (i.e., stationary) points for $x^2 - 2x + 8 + y^2 - 4y + z^2 + 2z$.
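Since this function has three variables, the generalized test of Table 4.8.2 applies. A sympy sketch (my own; the restored signs in $f$ are an assumption):

```python
import sympy as sp

x, y, z = sp.symbols('x y z', real=True)
f = x**2 - 2*x + 8 + y**2 - 4*y + z**2 + 2*z   # reconstructed signs: an assumption

pt = sp.solve([sp.diff(f, v) for v in (x, y, z)], [x, y, z])   # {x: 1, y: 2, z: -1}
H = sp.hessian(f, (x, y, z))
minors = [H[:k, :k].det() for k in (1, 2, 3)]   # principal minors 2, 4, 8
# All minors positive: the stationary point is a local minimum.
print(pt, minors, f.subs(pt))
```

Completing the square confirms this reading: $f = (x-1)^2 + (y-2)^2 + (z+1)^2 + 2$, minimized at $(1, 2, -1)$ with value $2$.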

Example 4.8.7

Line $L_1$ passes through the point $(1, 2, 3)$ and has direction $\mathbf{V}_1 = 2\mathbf{i} - 5\mathbf{j} + 4\mathbf{k}$. Line $L_2$ passes through the point $(3, 1, 2)$ and has direction $\mathbf{V}_2 = 3\mathbf{i} + 7\mathbf{j} - 6\mathbf{k}$. Show that the lines are skew, and find the minimum distance between them. Hint: Parametrize each line with a different parameter and minimize the square of the distance between an arbitrary point on each line.
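The hint can be carried out in sympy. The sketch below (my own) uses the reconstructed data $\mathbf{V}_1 = 2\mathbf{i} - 5\mathbf{j} + 4\mathbf{k}$ and $\mathbf{V}_2 = 3\mathbf{i} + 7\mathbf{j} - 6\mathbf{k}$, where the restored signs are an assumption, and cross-checks the minimized distance against the standard skew-line formula:

```python
import sympy as sp

s, t = sp.symbols('s t', real=True)
P1, V1 = sp.Matrix([1, 2, 3]), sp.Matrix([2, -5, 4])    # reconstructed signs: an assumption
P2, V2 = sp.Matrix([3, 1, 2]), sp.Matrix([3, 7, -6])

gap = (P1 + s*V1) - (P2 + t*V2)
D2 = gap.dot(gap)                                        # squared distance between line points
sol = sp.solve([sp.diff(D2, s), sp.diff(D2, t)], [s, t])
dmin = sp.sqrt(D2.subs(sol))

# Cross-check: for skew lines, distance = |(P2 - P1) . (V1 x V2)| / |V1 x V2|.
n = V1.cross(V2)
formula = sp.Abs((P2 - P1).dot(n)) / n.norm()
# n nonzero (directions not parallel) and dmin > 0 together show the lines are skew.
```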

Example 4.8.8

Find the minimum distance between the point $(1, 2, 3)$ and the plane $5x + 3y + 2z = 7$.
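One way to set this up as an unconstrained problem (a sketch of my own, not the guide's worked solution) is to eliminate $z$ using the plane equation and minimize the squared distance as a function of $x$ and $y$:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
z = (7 - 5*x - 3*y) / 2                          # z on the plane 5x + 3y + 2z = 7
D2 = (x - 1)**2 + (y - 2)**2 + (z - 3)**2        # squared distance from (1, 2, 3)
sol = sp.solve([sp.diff(D2, x), sp.diff(D2, y)], [x, y])
dmin = sp.sqrt(D2.subs(sol))
# Agrees with the point-plane formula |5 + 6 + 6 - 7| / sqrt(25 + 9 + 4) = 10/sqrt(38).
```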

Example 4.8.9

If both $\mathbf{U} = 3\mathbf{i} - 5\mathbf{j} + 7\mathbf{k}$ and $\mathbf{V} = 6\mathbf{i} + \mathbf{j} - 13\mathbf{k}$ are bound to the origin, project $\mathbf{U}$ onto $\mathbf{V}$. Hint: Find the minimum distance from the tip of $\mathbf{U}$ to the line along $\mathbf{V}$.

Example 4.8.10

Choose $a$ and $b$ so that the line $y = ax + b$ minimizes $S = \sum_{k=1}^{n} \left( a x_k + b - y_k \right)^2$, the sum of squares of the deviations from the points $(1, 5.3), (3, 8.8), (5, 12.5), (6, 15.4)$ to the line. Such a line is called the "least-squares" line.
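A sympy sketch for these four data points (my own illustration): set both partial derivatives of $S$ to zero and solve the resulting linear system for $a$ and $b$, exactly the unconstrained-minimization recipe of this section.

```python
import sympy as sp

a, b = sp.symbols('a b', real=True)
pts = [(1, sp.Rational('5.3')), (3, sp.Rational('8.8')),
       (5, sp.Rational('12.5')), (6, sp.Rational('15.4'))]
S = sum((a*xk + b - yk)**2 for xk, yk in pts)    # sum of squared deviations
sol = sp.solve([sp.diff(S, a), sp.diff(S, b)], [a, b])
print(sol)      # slope a = 582/295, intercept b = 183/59 (about 1.973 and 3.102)
```

The Hessian of $S$ in $(a, b)$ is positive definite whenever the $x_k$ are not all equal, so this stationary point is indeed the minimum.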

Example 4.8.11

Obtain formulas for $a$ and $b$ so that the line $y = ax + b$ minimizes $S = \sum_{k=1}^{n} \left( a x_k + b - y_k \right)^2$, the sum of squares of the deviations from the points $(x_k, y_k)$, $k = 1, \dots, n$, to the line.

Example 4.8.12

By minimizing the sum of squares of deviations $S = \sum_{k=1}^{5} \left( f(x_k) - y_k \right)^2$ between $f(x) = ax^2 + bx + c$ and the points $(1, 3), (2, 8), (3, 15), (4, 30), (5, 38)$, obtain the best least-squares quadratic fit to the data.
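The three normal equations $\partial S/\partial a = \partial S/\partial b = \partial S/\partial c = 0$ are linear in $a$, $b$, $c$, so sympy solves them exactly. A sketch of my own:

```python
import sympy as sp

a, b, c = sp.symbols('a b c', real=True)
data = [(1, 3), (2, 8), (3, 15), (4, 30), (5, 38)]
S = sum((a*xk**2 + b*xk + c - yk)**2 for xk, yk in data)   # sum of squared deviations
sol = sp.solve([sp.diff(S, v) for v in (a, b, c)], [a, b, c])
print(sol)      # a = 1, b = 16/5, c = -9/5: best fit x**2 + 3.2*x - 1.8
```

As a check, the residuals of this fit sum to zero against $1$, $x_k$, and $x_k^2$, which is precisely what the three normal equations demand.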



© Maplesoft, a division of Waterloo Maple Inc., 2024. All rights reserved. This product is protected by copyright and distributed under licenses restricting its use, copying, distribution, and decompilation.
For more information on Maplesoft products and services, visit www.maplesoft.com
