Methods Used by the Optimization Package
•

This help page provides an overview of the methods used by the Optimization package. The exports in the Optimization package select the methods according to criteria described below. Some exports allow the specification of a particular method using the method option. When using an Optimization export, set infolevel[Optimization] to 1 or higher to display the name of the method being used.
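For example, a minimal sketch (the objective and constraints here are arbitrary, illustrative values): setting infolevel before a call makes the solver report which method it selected.

```maple
with(Optimization):
infolevel[Optimization] := 1:
# Solve a small linear program; the chosen method is printed as userinfo output.
LPSolve(-4*x - 5*y, {x + 2*y <= 6, 5*x + 4*y <= 10}, assume = nonnegative);
```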

•

In the following descriptions, bounded signifies having explicit bounds, while constrained signifies having general linear or nonlinear constraints. The Optimization commands treat constraints and simple bounds differently. For more efficient performance, specify bounds separately rather than including them in the constraint set. The final objective function value should be the same in either case, but constrained and simply bounded nonlinear programs are solved using different methods.
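For instance, in algebraic form, simple bounds can be passed as ranges instead of being folded into the constraint set. A sketch with an illustrative problem:

```maple
with(Optimization):
# Preferred: bounds given separately as ranges.
NLPSolve((x - 1)^2 + (y - 2)^2, {x + y <= 2}, x = 0..3, y = 0..3);
# Equivalent result, but the bounds are treated as general constraints.
NLPSolve((x - 1)^2 + (y - 2)^2, {x + y <= 2, x >= 0, x <= 3, y >= 0, y <= 3});
```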

•

For more information about the various options mentioned below, see the Optimization/Options help page.


Linear and Quadratic Programs


•

Linear and quadratic programs are solved by the LPSolve and QPSolve commands respectively. The method option is available for continuous linear programs, but not for quadratic programs or for integer linear programs.

•

Continuous programs are solved using one of two methods. The activeset method, featuring a feasibility phase followed by an optimality phase, is available for both linear and quadratic continuous programs.

•

For linear continuous problems in particular, a sparse interior point method is also available. The method may be specified using either the method=activeset or method=interiorpoint option. If no method is specified, a heuristic will be used to choose the method. In general, the interior point method will be more efficient for large, sparse problems. If the input is given in Matrix form and the constraint matrices have the storage=sparse option set, the interior point method will be used. Otherwise, the heuristic is based on the number of variables, constraints, and the density of the constraint coefficient matrices.
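A sketch of requesting each method explicitly (the problem data is illustrative):

```maple
with(Optimization):
cnsts := {x + 2*y <= 6, 5*x + 4*y <= 10}:
# Same linear program solved with each of the two methods.
LPSolve(-4*x - 5*y, cnsts, assume = nonnegative, method = activeset);
LPSolve(-4*x - 5*y, cnsts, assume = nonnegative, method = interiorpoint);
```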

•

Integer programs, accepted by the LPSolve command, are solved with a branch-and-bound algorithm. The continuous subproblems are solved with the activeset method. Integer programs are specified using the assume, binaryvariables, or integervariables options.
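A minimal sketch of an integer linear program (coefficients illustrative):

```maple
with(Optimization):
# All variables integer and nonnegative via assume.
LPSolve(2*x + 5*y, {3*x - y <= 12, x + y <= 10}, assume = nonnegint);
# Alternatively, restrict only a subset of the variables to integers.
LPSolve(2*x + 5*y, {3*x - y <= 12, x + y <= 10},
        assume = nonnegative, integervariables = {x});
```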



Nonlinear Programs


•

Nonlinear programs are solved by the NLPSolve command. This command accepts the method option. If the optimization problem does not satisfy the conditions of the requested method, an error is issued.

•

The methods used by the NLPSolve command are described below. They are listed in order of increasing generality. The methods listed first apply to the most restricted problem classes.


Quadratic Interpolation - This method is available for univariate problems with no general constraints but with finite bounds. To select it, use the method=quadratic option. The method assumes the objective function has a continuous first derivative but does not require that derivatives be explicitly provided. Any initial point provided through the initialpoint option is ignored.
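A sketch of a univariate, finitely bounded problem (objective illustrative):

```maple
with(Optimization):
# Univariate objective with finite bounds and no general constraints.
NLPSolve(x*sin(x), x = 0..10, method = quadratic);
```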


Branch-and-Bound (Global Search) - This method is available for univariate problems with no general constraints but with finite bounds. To select it, use the method=branchandbound option. This method differs from others in the Optimization package in that a global search is performed. It initially uses a branch-and-bound algorithm, which assumes Lipschitz continuity. The solution is further refined with a local search.
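A sketch with an objective that has several local minima on the interval (illustrative):

```maple
with(Optimization):
# Global search over the whole interval, followed by local refinement.
NLPSolve(x*sin(x), x = -10..10, method = branchandbound);
```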


Modified Newton - The modified Newton method is available for unconstrained or finitely bounded problems. To select it, use the method=modifiednewton option. The method uses the gradient of the objective function, which is computed automatically with algebraic or operator forms but must be provided explicitly with Matrix form.
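A sketch in algebraic form, where the gradient is computed automatically (problem illustrative):

```maple
with(Optimization):
# Unconstrained smooth problem; the gradient is derived automatically.
NLPSolve((x - 1)^2 + (y - 2)^4, method = modifiednewton,
         initialpoint = {x = 0, y = 0});
```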


Nonlinear Simplex - The nonlinear simplex, or Nelder-Mead, method is available for problems having no bounds or general constraints. To select it, use the method=nonlinearsimplex option. This method, which does not require derivatives, tends to be slow but robust, so it is appropriate for objective functions that are prone to inaccuracies. For best performance, the optimization problem should be scaled so that the solution values are of order unity.
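A sketch on an unconstrained test function (the Rosenbrock function, chosen here only for illustration):

```maple
with(Optimization):
# Derivative-free search; no bounds or general constraints are allowed.
NLPSolve((x - 1)^2 + 100*(y - x^2)^2, method = nonlinearsimplex,
         initialpoint = {x = -1, y = 1});
```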


Preconditioned Conjugate Gradient (PCG) - The preconditioned limited-memory quasi-Newton conjugate gradient method is available for problems having no bounds or general constraints. To select it, use the method=pcg option. The method uses the gradient of the objective function, which is computed automatically with algebraic or operator forms but must be provided explicitly with Matrix form.
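A sketch of an unconstrained smooth problem (objective illustrative):

```maple
with(Optimization):
# Unconstrained problem; the gradient is computed automatically
# from the algebraic form.
NLPSolve(x^2 + 3*y^2 + (z - 1)^2, method = pcg);
```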


Sequential Quadratic Programming (SQP) - The SQP method is available for arbitrary unconstrained or constrained nonlinear programs. To select it, use the method=sqp option. The method uses derivatives automatically computed by Maple or explicitly provided through the options. If these are not available, numerical derivatives are computed.
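A sketch of a nonlinearly constrained problem (data illustrative):

```maple
with(Optimization):
# General nonlinear constraints are handled by SQP.
NLPSolve(x^2 + y^2, {x*y >= 1, x + y <= 3}, method = sqp,
         initialpoint = {x = 1, y = 1});
```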

•

If the method option is not provided, the method is chosen by NLPSolve according to the following rules. If the optimization problem is univariate and unconstrained except for finite bounds, quadratic interpolation is used. If the problem is unconstrained and the gradient of the objective function is available, the PCG method is used. Otherwise, the SQP method is used.



Least-Squares Problems


•

Least-squares problems are solved by the LSSolve command. When the residuals in the objective function and the constraints are all linear, an activeset method is used. When the problem is nonlinear, the method option can be used to specify a particular method. If the optimization problem does not satisfy the conditions of the requested method, an error is issued.

•

The methods used by the LSSolve command for nonlinear problems are described below.


Gauss-Newton and Modified Newton - This method, based on a combined Gauss-Newton and modified Newton algorithm, is available for problems having no bounds or general constraints. It does not require derivatives. To select it, use the method=modifiednewton option.
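A sketch: LSSolve takes a list of residuals, here for fitting a*exp(b*t) to three data points (the data values are illustrative):

```maple
with(Optimization):
# Nonlinear residuals of an exponential fit; no bounds or constraints.
LSSolve([a*exp(b) - 2, a*exp(2*b) - 5, a*exp(3*b) - 11],
        method = modifiednewton, initialpoint = {a = 1, b = 1});
```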


Sequential Quadratic Programming (SQP) - The SQP method is available for arbitrary unconstrained or constrained nonlinear least-squares problems. To select it, use the method=sqp option. The method uses derivatives automatically computed by Maple or explicitly provided through the options. If these are not available, numerical derivatives are computed.

•

If the method option is not provided, the method is chosen by LSSolve according to the following rules. If the optimization problem is unconstrained and unbounded, then the modified Newton method is used. Otherwise, the SQP method is used.


