In many optimization problems, simple methods are not sufficient to find the best solution. They find only a local optimum, usually the one closest to the starting point of the search, which is often supplied by the user. For example, consider the expression $\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right)$.
>

$\mathrm{plot}\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right),x=1..50\right)$

You could likely approximate the global minimum in the given domain, and find that minimum easily, by simply looking at the plot. But if you were unable to properly approximate it, or if you approximated incorrectly, you would not find the global minimum by using the usual optimization techniques.
>

$\mathrm{Optimization}\left[\mathrm{Minimize}\right]\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right),x=1..50\right)$

$\left[{-\mathrm{3.39618690740209}}{,}\left[{x}{=}{29.8549920107437}\right]\right]$
 (1.3.1) 
According to the Minimize command, the minimum is at approximately x = 30. However, you can see in the plot above that this is not the global minimum.
By using the global optimization equivalent of this command, you can be assured that you have found the global minimum in the specified interval.
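The difference between a local search and a global one can also be sketched outside Maple. The following Python sketch (an illustration of the general idea, not Maple's algorithm) minimizes $\mathrm{ln}(x)\cdot\mathrm{sin}(x)$ on $[1, 50]$ with a simple derivative-free local descent, first from a single starting point and then from many starting points; only the multistart run reaches the global minimum near $x \approx 48.7$.

```python
import math

def f(x):
    # Objective from the guide: ln(x)*sin(x) on the interval [1, 50].
    return math.log(x) * math.sin(x)

def local_min(f, x0, lo=1.0, hi=50.0, step=1.0, tol=1e-9):
    """Derivative-free local descent: move while a step improves f,
    halving the step when neither direction helps."""
    x = min(max(x0, lo), hi)
    while step > tol:
        for cand in (x + step, x - step):
            if lo <= cand <= hi and f(cand) < f(x):
                x = cand
                break
        else:
            step /= 2
    return x

# A single start near the left end of the interval finds only a local minimum.
x_local = local_min(f, 2.0)

# Multistart: run the same local search from many points and keep the best.
starts = [1 + 49 * i / 99 for i in range(100)]
x_global = min((local_min(f, s) for s in starts), key=f)

print(round(x_local, 3), round(f(x_local), 4))    # stuck near x ~ 4.8
print(round(x_global, 3), round(f(x_global), 4))  # global: x ~ 48.7, f ~ -3.8856
```

Multistart is only one way to globalize a local search; the Global Optimization Toolbox uses more sophisticated adaptive strategies, but the contrast between the two runs is the same.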
>

$\mathrm{with}\left(\mathrm{GlobalOptimization}\right):$

>

$\mathrm{GlobalSolve}\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right),x=1..50\right)$

$\left[{-\mathrm{3.88562416790700960}}{,}\left[{x}{=}{48.6999273727307}\right]\right]$
 (1.3.2) 
To view the solving methods that were used to determine the global minimum, change the infolevel variable and re-execute the command. At infolevel = 3, the command displays the input model size and type, the solver's operational mode and parameters, and detailed runtime information.
>

$\mathrm{infolevel}\left[\mathrm{GlobalOptimization}\right]≔3:$

>

$\mathrm{GlobalSolve}\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right),x=1..50\right)$

GlobalSolve: calling NLP solver
 
GlobalSolve: calling global optimization solver
 
GlobalSolve: number of problem variables 1
 
GlobalSolve: number of nonlinear inequality constraints 0
 
GlobalSolve: number of nonlinear equality constraints 0
 
GlobalSolve: method OptimusDEVOL
 
GlobalSolve: maximum iterations 80
 
GlobalSolve: population size 50
 
GlobalSolve: average stopping stepwidth 0.1e-3
 
GlobalSolve: time limit 100
 
GlobalSolve: trying evalhf mode
 
GlobalSolve: performing local refinement
 
$\left[{-\mathrm{3.88562416790700960}}{,}\left[{x}{=}{48.6999273727307}\right]\right]$
 (1.3.3) 
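The method name reported in the log, OptimusDEVOL, refers to a population-based solver in the differential-evolution family. A minimal differential evolution loop (a sketch of the general technique, not Maple's implementation) looks like this in Python, reusing the objective above together with the population size 50 and iteration limit 80 shown in the log:

```python
import math
import random

def f(x):
    # Same objective as above: ln(x)*sin(x) on [1, 50].
    return math.log(x) * math.sin(x)

def diff_evolution(f, lo, hi, pop_size=50, iters=80, F=0.7, seed=1):
    """Minimal differential evolution in one dimension: each member is
    perturbed by a scaled difference of two other random members, and is
    replaced only if the trial point is better (greedy selection)."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            trial = pop[a] + F * (pop[b] - pop[c])   # mutation step
            trial = min(max(trial, lo), hi)          # keep inside the bounds
            if f(trial) < f(pop[i]):                 # greedy selection
                pop[i] = trial
    return min(pop, key=f)

best = diff_evolution(f, 1.0, 50.0)
# With this budget the population typically concentrates in the deepest
# basin near x = 48.7, matching the GlobalSolve result above.
print(round(best, 3), round(f(best), 4))
```

Note that the log also reports a local refinement phase: like GlobalSolve, practical global solvers usually polish the best population member with a fast local method afterwards.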
Typically, the infolevel is set to 0, the default.
>

$\mathrm{infolevel}\left[\mathrm{GlobalOptimization}\right]≔0:$

The following is an example of a situation in which you cannot approximate the global minimum by using local methods; however, the global solver finds the best solution.

Example


Consider a nonlinear system of equations.
>

$\mathrm{eq1}≔\sqrt{{x}^{2}+{y}^{4}}+{\mathrm{e}}^{x\,{y}^{2}}+5\,\mathrm{sin}\left(2\,x-4\,x\,y\right)-12\,\mathrm{cos}\left(x\,y\right):$

>

$\mathrm{eq2}≔5\,\mathrm{ln}\left(1+{x}^{2}\right)+{\mathrm{e}}^{y+x}+5\,\mathrm{sin}\left(6\,x\,y\right):$

The induced least-squares error function is highly multiextremal, which makes the error of the least-squares approximation difficult to minimize.
>

$\mathrm{plot3d}\left({\mathrm{eq1}}^{2}+{\mathrm{eq2}}^{2},x=-2..2,y=-1..3,\mathrm{grid}=\left[30,30\right],\mathrm{lightmodel}=\mathrm{light4},\mathrm{axes}=\mathrm{boxed}\right)$

To determine the global minimum of the least-squares error, define the objective function and constraints to be optimized.
>

$\mathrm{objf}≔{\mathrm{eq1}}^{2}+{\mathrm{eq2}}^{2}:$

>

$\mathrm{cons}≔\left\{\mathrm{eq1},\mathrm{eq2}\right\}:$

First, try to find a local solution by using Maple's built-in Optimization package. This system is sufficiently complex that local optimization solvers cannot find a feasible solution.
>

$\mathrm{localsoln}≔\mathrm{Optimization}\left[\mathrm{Minimize}\right]\left(\mathrm{objf},\left\{\mathrm{eq1}=0,\mathrm{eq2}=0\right\},x=-1..2,y=-2..1\right)$

However, global optimization techniques can be used to find a global minimum.
>

$\mathrm{sol}≔\mathrm{GlobalSolve}\left(\mathrm{objf},\left\{\mathrm{eq1}=0,\mathrm{eq2}=0\right\},x=-1..2,y=-2..1\right):\mathrm{sol}\left[1\right];\mathrm{sol}\left[2\right]$

${6.31088724176809444}{\times}{{10}}^{{-30}}$
 
$\left[{x}{=}{\mathrm{0.558972497939247}}{,}{y}{=}{-\mathrm{1.58234523718254}}\right]$
 (1.3.1.1) 
Substituting the solution into the constraints shows that the least-squares approximation is very precise. That is, the error of the least-squares approximation is negligible at the minimum point.
>

$\mathrm{eval}\left(\mathrm{cons},\mathrm{sol}\left[2\right]\right)$

$\left\{{1.77635683940025}{\times}{{10}}^{{-15}}\right\}$
 (1.3.1.2) 
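The same reformulate-and-verify pattern (square the residuals, minimize the sum, then substitute the minimizer back into the constraints) can be sketched in Python on a simpler stand-in system. The two equations below are illustrative only, chosen so that a solution exists; they are not eq1 and eq2 above, and the solver is a plain multistart Newton iteration rather than GlobalSolve.

```python
import random

def g1(x, y):
    # Stand-in constraint, chosen for illustration: x^2 + y - 3 = 0.
    return x * x + y - 3

def g2(x, y):
    # Stand-in constraint: x + y^2 - 5 = 0.
    return x + y * y - 5

def objf(x, y):
    # Least-squares error of the system: zero exactly at a solution.
    return g1(x, y) ** 2 + g2(x, y) ** 2

def newton_step(x, y):
    """One Newton step for the 2x2 system (also the Gauss-Newton step
    for minimizing objf): solve J * d = -g by Cramer's rule."""
    j11, j12 = 2 * x, 1.0   # d(g1)/dx, d(g1)/dy
    j21, j22 = 1.0, 2 * y   # d(g2)/dx, d(g2)/dy
    det = j11 * j22 - j12 * j21
    if abs(det) < 1e-12:
        return x, y          # singular Jacobian: skip this step
    r1, r2 = -g1(x, y), -g2(x, y)
    dx = (r1 * j22 - j12 * r2) / det
    dy = (j11 * r2 - r1 * j21) / det
    return x + dx, y + dy

rng = random.Random(0)
best = None
for _ in range(20):                     # multistart over the box [-3, 3]^2
    x, y = rng.uniform(-3, 3), rng.uniform(-3, 3)
    for _ in range(50):
        x, y = newton_step(x, y)
    if best is None or objf(x, y) < objf(*best):
        best = (x, y)

x, y = best
print((round(x, 6), round(y, 6)), objf(x, y))  # residual near machine precision
```

As in the worksheet, the final substitution check is the important step: a tiny least-squares error at the reported minimizer confirms that the point actually solves the original system.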

With the examples demonstrated here, you are now ready to use the Global Optimization Toolbox to solve many complex mathematical problems. See the Maple help system for more information about the commands used in this guide, or for more ways in which the Global Optimization Toolbox can help you.