Global And Non-Smooth Optimization (GANSO) Maple toolbox
Centre for Informatics and Applied Optimization, University of Ballarat, Australia. Copyright 2006.
j.ugon@ballarat.edu.au, gbeliako@gmail.com
Installation instructions
To use this worksheet you need to download the GANSO libraries from
http://www.ganso.com.au/libs/gansomapledlls.zip and extract the contents of this archive (using a utility like WinZip) into Maple's bin directory
(typically c:\Program Files\Maple 9\bin) or any directory on the PATH. This way Maple can locate the GANSO libraries and use them properly.
Introduction
This worksheet illustrates how the GANSO package can be used with Maple (under MS Windows).
GANSO is a programming library which implements several methods of global and nonsmooth, nonlinear optimization. The GANSO library is specifically designed to solve optimization problems with very complex, nondifferentiable objective functions. It is not designed for solving linear, quadratic or integer programming problems, for which many specialised methods are available. However, when the objective function is nonsmooth, or has many local extrema, such specialised methods are inadequate. GANSO provides a computationally efficient way to tackle these complicated problems. Please see the user manual supplied with this worksheet.
This worksheet is provided as a template for the user's own objective functions and constraints. It shows how to declare the objective function and call the appropriate optimization methods.
It should be used in conjunction with the supplied dlls (gansodll.dll, mwrap_ganso.dll), and the user manual (gansomaple.pdf).
The typical calling sequence is as follows:
1. execute the define_external declarations in the next subsection
2. declare your objective function (and also set the boundaries and constraints if required)
3. call the relevant optimization routines with correct arguments
Please change the GANSOPATH line below to the location of this package on your computer!
Initialization: Declarations of GANSO optimization procedures found in the dll
Execute the commands below to declare the external subroutines from GANSO library.
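The exact entry-point names and argument conventions of the GANSO routines are documented in gansomaple.pdf. The sketch below only illustrates the define_external pattern; the entry names minEcam, minDfbm and minDso are assumptions, not the confirmed exports of mwrap_ganso.dll.

```maple
# Hedged sketch of the declarations. The entry names below are ASSUMPTIONS;
# take the real names and options from gansomaple.pdf.
GANSOPATH := "C:\\Program Files\\Maple 9\\bin":   # adjust to your installation!

# A wrapper DLL built for Maple would typically be declared with the MAPLE
# calling convention, so that no per-argument type declarations are needed:
ECAM := define_external('minEcam', 'MAPLE', LIB = "mwrap_ganso.dll"):
DFBM := define_external('minDfbm', 'MAPLE', LIB = "mwrap_ganso.dll"):
DSO  := define_external('minDso',  'MAPLE', LIB = "mwrap_ganso.dll"):
```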
Section 1: Declaration of the objective function and constraints: general but slow
We solve the problem minimize f(x), subject to a system of linear constraints Ax <= b and also box constraints lo_i <= x_i <= up_i.
Declaration of the objective function f(x)
Declare f as a procedure. The objective function f takes two arguments: n (the dimension) and x (a vector of size n). It returns a real value. (In this example f does not depend on x3.)
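As a concrete illustration (this particular f is an example, not the one from the manual), a simple nonsmooth objective following the (n, x) convention above could be declared as:

```maple
# Objective f(x) = (x1 - 1)^2 + |x2| : nonsmooth, and independent of x3.
f := proc(n::integer, x)
    evalf( (x[1] - 1.0)^2 + abs(x[2]) );
end proc:

f(3, [0.0, 0.5, 7.0]);   # returns 1.5
```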
Declaration of box constraints and auxiliary vectors
Declare the constraints and bounds, the dimension of x, the vector x0, and the other auxiliary vectors.
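A declaration along these lines could be used; the numerical values here are purely illustrative.

```maple
n  := 3:                                                   # dimension of x
lo := Vector(n, [-2.0, -2.0, -2.0], datatype = float[8]):  # lower bounds
up := Vector(n, [ 2.0,  2.0,  2.0], datatype = float[8]):  # upper bounds
x0 := Vector(n, datatype = float[8]):                      # will receive the solution
# One illustrative linear constraint  x1 + x2 + x3 <= 3,  i.e.  A x <= b:
A := Matrix(1, n, [[1.0, 1.0, 1.0]], datatype = float[8]):
b := Vector(1, [3.0], datatype = float[8]):
```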
Executing GANSO methods
Execute the global optimization method ECAM with the given box constraints.
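A call might then look as follows. ECAM stands for whatever name the define_external declaration bound; the argument order here is an assumption that must be checked against gansomaple.pdf.

```maple
iter := 1000:        # number of iterations of the global method
LC   := 100.0:       # estimate of the Lipschitz constant
val  := ECAM(n, f, x0, lo, up, iter, LC):   # HYPOTHETICAL argument order
x0, val;             # x0 now holds the best point found, val its objective value
```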
Section 2: Example -- both general and fast methods
Declaration of another objective function. We use two declarations: one for the general methods and one for the fast methods.
Rastrigin's test function
Plot the graph of this function below to see that it has many local minimizers.
Both declarations are given here.
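Rastrigin's function itself is standard; a plain Maple declaration (the "general" form) is shown below, together with a plot of the two-dimensional case. The calling convention required by the fast methods differs and is specified in gansomaple.pdf.

```maple
# Rastrigin's test function: global minimum 0 at the origin,
# surrounded by a large number of local minimizers.
rastrigin := proc(n::integer, x)
    local i;
    evalf( 10*n + add( x[i]^2 - 10*cos(2*Pi*x[i]), i = 1 .. n ) );
end proc:

rastrigin(2, [0.0, 0.0]);                        # returns 0.
plot3d( rastrigin(2, [u, v]), u = -2 .. 2, v = -2 .. 2 );
```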
Declaration of bounds
We now declare the upper and lower bounds, the desired number of iterations, the Lipschitz constant (an estimate), and the vector x0 which will receive the solution.
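For example (the bounds, iteration count and Lipschitz estimate below are illustrative values, not the manual's):

```maple
n    := 2:
lo   := Vector(n, [-2.0, -2.0], datatype = float[8]):   # lower bounds
up   := Vector(n, [ 2.0,  2.0], datatype = float[8]):   # upper bounds
iter := 1000:        # desired number of iterations
LC   := 50.0:        # rough estimate of the Lipschitz constant
x0   := Vector(n, datatype = float[8]):                 # will receive the solution
```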
Execution of GANSO methods
We first execute the ECAM global optimization method.
Setting iter to a negative number of iterations has the effect of skipping the local optimization method at the final stage.
We call DFBM 20 times from random starting points.
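Assuming DFBM was bound by define_external and (hypothetically) takes the arguments (n, f, x0, lo, up), the 20 restarts can be sketched as a plain multi-start loop:

```maple
# Multi-start: call the local method DFBM from 20 random points in the box.
r := rand(0 .. 10^6):        # integer generator; evalf(r()/10^6) is approx. U(0,1)
best := infinity:
for k from 1 to 20 do
    for i from 1 to n do     # random starting point inside [lo, up]
        x0[i] := lo[i] + (up[i] - lo[i]) * evalf(r()/10^6):
    end do:
    val := DFBM(n, rastrigin, x0, lo, up):   # HYPOTHETICAL argument order
    if val < best then best := val: xbest := copy(x0): end if:
end do:
best, xbest;   # best value and point over the 20 restarts
```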
Another global optimization method: DSO.
Section 3: Example -- using fast method
Objective function, constraints and auxiliary vectors
This example illustrates the use of constraints. Refer to the GANSO manual for a description of this optimization problem.
Now define the matrices and vectors of constraints
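Per the text below, the problem has 10 variables and 3 linear equality constraints. The entries of A and b here are illustrative placeholders; the actual data for the manual's example is given in gansomaple.pdf.

```maple
n := 10:  m := 3:    # 10 variables, 3 linear equality constraints A x = b
# Illustrative data only -- not the matrices from the manual.
A  := Matrix(m, n, (i, j) -> `if`(irem(j, m) + 1 = i, 1.0, 0.0),
             datatype = float[8]):
b  := Vector(m, [1.0, 1.0, 1.0], datatype = float[8]):
x0 := Vector(n, [seq(2.0, i = 1 .. n)], datatype = float[8]):  # infeasible start
```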
We start with a DFBM local search from an initial point x0 outside the feasible domain.
Here we manually set the desired basic variables (10 - 3 = 7 of them), chosen so that they are linearly independent.
Now a modified problem: add one inequality constraint, x_3 + x_7 <= 0.2.
References
The methods implemented in GANSO are based on the theory of Abstract Convexity, outlined in
Rubinov, A. Abstract Convexity and Global Optimization, Kluwer, Dordrecht, 2000.
Specific references are:
Bagirov, A. (1999). Derivative-free methods for unconstrained nonsmooth optimization and its numerical analysis, Investigacao Operacional, 19, 75.
Bagirov, A. (2002). A method for minimization of quasidifferentiable functions, Optimization Methods and Software, 17, 31.
Bagirov, A.; Rubinov, A. (2001). Modified versions of the cutting angle method, in Convex Analysis and Global Optimization; N. Hadjisavvas and P. M. Pardalos, eds.; Kluwer: Dordrecht; Vol. 54; p. 245.
Bagirov, A.; Rubinov, A. (2000). Global minimization of increasing positively homogeneous functions over the unit simplex, Annals of Operations Research, 98, 171.
Bagirov, A.; Rubinov, A. (2003). Cutting angle method and a local search, Journal of Global Optimization, 27, 193.
Bagirov, A. (2000). Numerical methods for minimizing quasidifferentiable functions: a survey and comparison, in Quasidifferentiability and Related Topics; V. F. Demyanov and A. M. Rubinov, eds.; Kluwer Academic Publishers; p. 33.
Beliakov, G. (2003). Geometry and combinatorics of the cutting angle method, Optimization, 52, 379.
Beliakov, G. (2004). The Cutting Angle Method - a tool for constrained global optimization, Optimization Methods and Software, 19, 137.
Beliakov, G. (2005). Extended cutting angle method of constrained global optimization, in Optimisation in Industry; L. Caccetta, ed.; Springer: New York; 2007, in press.
Beliakov, G. (2005). A review of applications of the cutting angle methods, in Continuous Optimization; A. Rubinov and V. Jeyakumar, eds.; Springer: New York; p. 209.
Beliakov, G.; Bagirov, A.; Monsalve, J. E. (2003). Parallelization of the discrete gradient method of non-smooth optimization and its applications, in Proceedings of the 3rd International Conference on Computational Science; Springer-Verlag: Heidelberg; Vol. 3; p. 592.
Beliakov, G.; Ting, K. M.; Murshed, M.; Rubinov, A.; Bertoli, M. (2003). Efficient serial and parallel implementations of the cutting angle method, in High Performance Algorithms and Software for Nonlinear Optimization; G. Di Pillo, ed.; Kluwer Academic Publishers; p. 57.
Mammadov, M. A. (2004). A new global optimization algorithm based on a dynamical systems approach, in Proceedings of the Sixth International Conference on Optimization: Techniques and Applications (ICOTA6); A. Rubinov and M. Sniedovich, eds.; University of Ballarat, Ballarat, December 2004; article index number 198 (94th article), 11 pp.; also Research Report 04/04, University of Ballarat, 2004, http://www.ballarat.edu.au/ard/itms/publications/researchPapers.shtml
Mammadov, M. A.; Orsi, R. (2005). H_infinity synthesis via a nonsmooth, nonconvex optimization approach, Pacific Journal of Optimization, 1(2), 405-420.
Mammadov, M. A.; Rubinov, A.; Yearwood, J. (2005). Dynamical systems described by relational elasticities with applications to global optimization, in Continuous Optimisation: Current Trends and Modern Applications; V. Jeyakumar and A. Rubinov, eds.; Springer; pp. 365-385.
Copyright by the Centre for Informatics and Applied Optimization, University of Ballarat, 2006.
Please consult http://www.ganso.com.au for updates.
We appreciate your comments about your experiences using GANSO and this Maple toolbox. Please send your feedback to
j.ugon@ballarat.edu.au
Legal Notice: The copyright for this application is owned by the author(s). Neither Maplesoft nor the author are responsible for any errors contained within and are not liable for any damages resulting from the use of this material. This application is intended for non-commercial, non-profit use only. Contact the author for permission if you wish to use this application in for-profit activities.