In many optimization problems, simple methods are not sufficient to find the best solution. They find only a local optimum, usually the one closest to the starting point of the search, which is often supplied by the user. For example, consider the expression $\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right)$.
>

$\mathrm{plot}\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right), x=1..50\right)$

Simply by looking at the plot, you could likely approximate the global minimum on the given domain and then find it easily. But if you were unable to approximate it properly, or if you approximated it incorrectly, the usual local optimization techniques would not find the global minimum.
>

$\mathrm{Optimization}\left[\mathrm{Minimize}\right]\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right), x=1..50\right)$

$\left[-3.39618690740209, \left[x=29.8549920107437\right]\right]$
 (1.3.1) 
According to the Minimize command, there is a minimum of about −3.396 at approximately x = 30. However, you can see in the plot above that this is only a local minimum, not the global one.
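The behavior just described can be sketched outside Maple as well. The following illustrative Python snippet (it does not use Maple or the toolbox; the bracket [27.5, 32.5] and the 0.01 scan step are arbitrary choices for this sketch) contrasts a local search bracketed near x = 30 with a brute-force scan of the whole interval:

```python
import math

def f(x):
    # The objective from the text: ln(x) * sin(x)
    return math.log(x) * math.sin(x)

def golden_section_min(f, a, b, tol=1e-8):
    """Minimize a unimodal function on [a, b] by golden-section search."""
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# A local search bracketed near x = 30 stays at the nearby local minimum ...
x_local = golden_section_min(f, 27.5, 32.5)

# ... while a brute-force scan of [1, 50] reveals a deeper minimum near x = 48.7.
xs = [1 + i * 0.01 for i in range(4901)]
x_global = min(xs, key=f)

print(x_local, f(x_local))    # ~29.855, ~-3.3962
print(x_global, f(x_global))  # ~48.70, ~-3.8856
```

A local method is perfectly accurate inside its bracket; the problem is that nothing tells it a deeper basin exists elsewhere in the interval.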
By using the global optimization equivalent of this command, you can be assured that you have found the global minimum in the specified interval.
>

$\mathrm{with}\left(\mathrm{GlobalOptimization}\right):$

>

$\mathrm{GlobalSolve}\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right), x=1..50\right)$

$\left[-3.88562417153593120, \left[x=48.6999705692544\right]\right]$
 (1.3.2) 
To view the solving methods that were used to determine the global minimum, change the infolevel variable and re-execute the command. At infolevel = 3, the command displays the input model size and type, the solver's operational mode and parameters, and detailed runtime information.
>

$\mathrm{infolevel}\left[\mathrm{GlobalOptimization}\right]≔3:$

>

$\mathrm{GlobalSolve}\left(\mathrm{ln}\left(x\right)\cdot \mathrm{sin}\left(x\right), x=1..50\right)$

GlobalSolve: calling NLP solver
SolveGeneral: calling global optimization solver
SolveGeneral: number of problem variables 1
SolveGeneral: number of nonlinear inequality constraints 0
SolveGeneral: number of nonlinear equality constraints 0
SolveGeneral: method multistart
SolveGeneral: merit function evaluation limit 1000
SolveGeneral: nonimproving merit function evaluation limit 200
SolveGeneral: constraint penalty multiplier 100.0
SolveGeneral: target merit function value -0.10e11
SolveGeneral: local search target objective function value -0.10e11
SolveGeneral: local search feasibility tolerance 0.10e-5
SolveGeneral: local search optimality tolerance 0.10e-5
SolveGeneral: time limit in seconds 100
SolveGeneral: trying evalhf mode
SolveGeneral: total number of function evaluations 1120
SolveGeneral: runtime in external solver 0.
SolveGeneral: maximum constraint infeasibility 0.
SolveGeneral: cycling or stall detected in solver
 
$\left[-3.88562417153593120, \left[x=48.6999705692544\right]\right]$
 (1.3.3) 
Typically, infolevel is left at its default value of 0.
>

$\mathrm{infolevel}\left[\mathrm{GlobalOptimization}\right]≔0:$

The following is an example of a situation in which you cannot approximate the global minimum by using local methods; however, the global solver finds the best solution.

Example


Consider a nonlinear system of equations.
>

$\mathrm{eq1}≔\sqrt{{x}^{2}+{y}^{4}}+{ⅇ}^{x-{y}^{2}}+5\,\mathrm{sin}\left(2\,x-4\,x\,y\right)-12\,\mathrm{cos}\left(x\,y\right):$

>

$\mathrm{eq2}≔5\,\mathrm{ln}\left(1+{x}^{2}\right)+{ⅇ}^{y+x}+5\,\mathrm{sin}\left(6\,x\,y\right):$

The induced least-squares error function is highly multiextremal, which makes minimizing the least-squares error difficult.
>

$\mathrm{plot3d}\left({\mathrm{eq1}}^{2}+{\mathrm{eq2}}^{2}, x=-2..2, y=-1..3, \mathrm{grid}=\left[30,30\right], \mathrm{lightmodel}=\mathrm{light4}, \mathrm{axes}=\mathrm{boxed}\right)$

To determine the global minimum of the leastsquares error, define the objective function and constraints to be optimized.
>

$\mathrm{objf}≔{\mathrm{eq1}}^{2}+{\mathrm{eq2}}^{2}:$

>

$\mathrm{cons}≔\left\{\mathrm{eq1}, \mathrm{eq2}\right\}:$

First, try to find a local solution by using Maple's built-in Optimization package. This system is sufficiently complex that local optimization solvers cannot find a feasible solution.
>

$\mathrm{localsoln}≔\mathrm{Optimization}\left[\mathrm{Minimize}\right]\left(\mathrm{objf}, \left\{\mathrm{eq1}=0, \mathrm{eq2}=0\right\}, x=-1..2, y=-2..1\right)$

However, global optimization techniques can be used to find a global minimum.
>

$\mathrm{sol}≔\mathrm{GlobalSolve}\left(\mathrm{objf}, \left\{\mathrm{eq1}=0, \mathrm{eq2}=0\right\}, x=-1..2, y=-2..1\right):$ $\mathrm{sol}\left[1\right]; \mathrm{sol}\left[2\right]$

${5.24626533431424086}\times{10}^{-23}$
 
$\left[x=-0.558972497937727, y=-1.58234523718628\right]$
 (1.3.1.1) 
Substituting the solution into the constraints shows that the least-squares approximation is very precise; that is, the error of the least-squares approximation is very small at the minimum point.
>

$\mathrm{eval}\left(\mathrm{cons}, \mathrm{sol}\left[2\right]\right)$

$\left\{2.91588975187551\times{10}^{-12}, 6.63025190306143\times{10}^{-12}\right\}$
 (1.3.1.2) 
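The workflow above — square the residuals, minimize their sum, then verify by substitution — can be sketched in plain Python on a simpler stand-in system. The system below (the circle x² + y² = 4 intersected with the line y = x) is chosen only because its roots are easy to verify; it is not the system from this section, and the coordinate-descent multistart is an arbitrary toy minimizer, not the toolbox's algorithm.

```python
# Stand-in system (not the one from the text): x^2 + y^2 = 4 and x = y.
def residuals(x, y):
    return (x * x + y * y - 4, x - y)

def objf(p):
    # Least-squares error function: sum of squared residuals.
    r1, r2 = residuals(*p)
    return r1 * r1 + r2 * r2

def descend(obj, p, step=0.5, tol=1e-9):
    """Crude coordinate descent: try axis moves, halve the step when stuck."""
    fp = obj(p)
    while step > tol:
        for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            cand = (p[0] + dx, p[1] + dy)
            if obj(cand) < fp:
                p, fp = cand, obj(cand)
                break
        else:
            step /= 2
    return p, fp

# Multistart over a small deterministic grid of starting points.
starts = [(sx, sy) for sx in (-2, 0, 2) for sy in (-2, 0, 2)]
best_p, best_f = min((descend(objf, s) for s in starts), key=lambda r: r[1])
print(best_p, best_f)  # a root near (sqrt(2), sqrt(2)) or its negative; error ~ 0
```

A near-zero minimum of the error function certifies a root of the system, exactly as the tiny residuals in (1.3.1.2) do; a clearly positive minimum would instead indicate that the system has no solution in the box searched.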

With the examples demonstrated here, you are now ready to use the Global Optimization Toolbox to solve many complex mathematical problems. See the Maple help system for more information about the commands used in this guide and for other ways in which the Global Optimization Toolbox can help you.