Optimization: New Applications
http://www.maplesoft.com/applications/category.aspx?cid=1600
Language: en-us | Copyright: 2014 Maplesoft, A Division of Waterloo Maple Inc. | Generator: Maplesoft Document System
Published: Thu, 17 Apr 2014 07:08:09 GMT | Last built: Thu, 17 Apr 2014 07:08:09 GMT
Description: New applications in the Optimization category
Image: http://www.mapleprimes.com/images/mapleapps.gif
Classroom Tips and Techniques: Bivariate Limits - Then and Now
http://www.maplesoft.com/applications/view.aspx?SID=145979&ref=Feed
An introductory overview of the functionality in Maple's GraphTheory package.<img src="/view.aspx?si=145979/thumb.jpg" alt="Classroom Tips and Techniques: Bivariate Limits - Then and Now" align="left"/>145979 | Wed, 17 Apr 2013 04:00:00 Z | Dr. Robert Lopez
Classroom Tips and Techniques: Introduction to Maple's GraphTheory Package
http://www.maplesoft.com/applications/view.aspx?SID=142357&ref=Feed
An introductory overview of the functionality in Maple's GraphTheory package.<img src="/view.aspx?si=142357/thumb.jpg" alt="Classroom Tips and Techniques: Introduction to Maple's GraphTheory Package" align="left"/>142357 | Thu, 17 Jan 2013 05:00:00 Z | Prof. Michael Monagan
Classroom Tips and Techniques: Least-Squares Fits
http://www.maplesoft.com/applications/view.aspx?SID=140942&ref=Feed
<p>The least-squares fitting of functions to data can be done in Maple with eleven different commands from four different packages. The <em>CurveFitting</em> and LinearAlgebra packages each have a LeastSquares command; the Optimization package has the LSSolve and NLPSolve commands; and the Statistics package has the seven commands Fit, LinearFit, PolynomialFit, ExponentialFit, LogarithmicFit, PowerFit, and NonlinearFit, which can return some measure of regression analysis.</p><img src="/view.aspx?si=140942/image.jpg" alt="Classroom Tips and Techniques: Least-Squares Fits" align="left"/>140942 | Wed, 28 Nov 2012 05:00:00 Z | Dr. Robert Lopez
Classroom Tips and Techniques: Best Taylor-Polynomial Approximations
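Outside Maple, the least-squares fit described in the Least-Squares Fits entry can be sketched with NumPy; the data points below are invented for illustration, and this is only an analogue of the Maple commands listed, not a reproduction of them.

```python
import numpy as np

# Hypothetical data: noisy samples of roughly y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix for the linear model y = a*x + b
A = np.column_stack([x, np.ones_like(x)])

# Solve the least-squares problem min ||A [a, b] - y||_2
(a, b), residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)

print(f"fit: y = {a:.3f}*x + {b:.3f}")
```

The same normal-equations idea underlies all eleven Maple commands named above; they differ mainly in the model classes they accept and the diagnostics they return.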
http://www.maplesoft.com/applications/view.aspx?SID=136471&ref=Feed
In the early 90s, Joe Ecker (Rensselaer Polytechnic Institute) provided a Maple solution to the problem of determining, for a given function, which expansion point in a specified interval yielded the best quadratic Taylor-polynomial approximation, where "best" was measured by the L<sub>2</sub>-norm. This article applies Ecker's approach to the function <em>f(x)</em> = sinh<em>(x)</em> – <em>x e<sup>-3x</sup></em>, -1 ≤ <em>x</em> ≤ 3, then goes on to find other approximating quadratic polynomials.<img src="/view.aspx?si=136471/image.jpg" alt="Classroom Tips and Techniques: Best Taylor-Polynomial Approximations" align="left"/>136471 | Tue, 14 Aug 2012 04:00:00 Z | Dr. Robert Lopez
Classroom Tips and Techniques: An Inequality-Constrained Optimization Problem
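Ecker's "best expansion point" idea from the Taylor-polynomial entry can be sketched numerically in Python (SciPy rather than Maple; the function and interval are the ones quoted in the entry, but the implementation details here are mine, not Ecker's):

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

# f(x) = sinh(x) - x*exp(-3x) and its first two derivatives
f   = lambda x: np.sinh(x) - x * np.exp(-3 * x)
fp  = lambda x: np.cosh(x) - np.exp(-3 * x) + 3 * x * np.exp(-3 * x)
fpp = lambda x: np.sinh(x) + 6 * np.exp(-3 * x) - 9 * x * np.exp(-3 * x)

def l2_error(a, lo=-1.0, hi=3.0):
    """Squared L2 distance on [lo, hi] between f and its quadratic
    Taylor polynomial expanded at a."""
    T = lambda x: f(a) + fp(a) * (x - a) + 0.5 * fpp(a) * (x - a) ** 2
    val, _ = quad(lambda x: (f(x) - T(x)) ** 2, lo, hi)
    return val

res = minimize_scalar(l2_error, bounds=(-1, 3), method="bounded")
print(f"best expansion point ≈ {res.x:.4f}, squared L2 error ≈ {res.fun:.4f}")
```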
http://www.maplesoft.com/applications/view.aspx?SID=135904&ref=Feed
<p>This article shows how to work both analytically and numerically to find the global maximum of</p>
<p><em>w</em> = ƒ(<em>x, y, z</em>) ≡ <em>x</em><sup>2</sup>(1 + <em>x</em>) + <em>y</em><sup>2</sup>(1 + <em>y</em>) + z<sup>2</sup>(1 + <em>z</em>)</p>
<p>in that part of the first octant on, or below, the plane <em>x</em> + <em>y</em> + <em>z</em> = 6.</p><img src="/view.aspx?si=135904/thumb.jpg" alt="Classroom Tips and Techniques: An Inequality-Constrained Optimization Problem" align="left"/>135904 | Mon, 16 Jul 2012 04:00:00 Z | Dr. Robert Lopez
Street-fighting Math
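For the inequality-constrained problem above, the objective is a sum of terms convex for nonnegative arguments, so its maximum over the polytope is attained at a vertex: f(6, 0, 0) = 36 · 7 = 252. A multistart SciPy sketch (my own check, not the article's Maple solution) confirms this numerically:

```python
import numpy as np
from scipy.optimize import minimize

f = lambda p: sum(t**2 * (1 + t) for t in p)

# Feasible region: first octant, on or below the plane x + y + z = 6
cons = [{"type": "ineq", "fun": lambda p: 6 - sum(p)}]
bounds = [(0, 6)] * 3

# Multistart local maximization (minimize -f); since the objective is convex
# on the nonnegative orthant, the true maximum sits at a vertex of the region.
starts = [np.zeros(3), np.array([5.0, 0.5, 0.5]), np.array([0.5, 5.0, 0.5]),
          np.array([0.5, 0.5, 5.0]), np.array([2.0, 2.0, 2.0])]
best = None
for s in starts:
    r = minimize(lambda p: -f(p), s, bounds=bounds, constraints=cons,
                 method="SLSQP")
    if best is None or r.fun < best.fun:
        best = r

print("maximizer ≈", np.round(best.x, 3), " max ≈", round(-best.fun, 3))
```

Note that the symmetric point (2, 2, 2) is a stationary point of the constrained problem with value 36, which is why a single local search can miss the global maximum.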
http://www.maplesoft.com/applications/view.aspx?SID=129226&ref=Feed
This interactive Maple document contains a simple street-fighting game and performs a mathematical analysis of it, involving probability and game theory. The document is suitable for presentation in an undergraduate course on operations research, probability or linear programming. No knowledge of Maple is required.<img src="/view.aspx?si=129226/fighter_sm.jpg" alt="Street-fighting Math" align="left"/>129226 | Thu, 29 Dec 2011 05:00:00 Z | Dr. Robert Israel
Great Expectations
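The Street-fighting Math worksheet itself is not reproduced here, but the kind of game-theoretic analysis it describes can be sketched: solving a two-player zero-sum game by linear programming. The payoff matrix below is hypothetical, not the game from the document.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical payoff matrix for the row player: rows are the row player's
# moves, columns the column player's moves.
A = np.array([[ 3.0, -1.0],
              [-2.0,  4.0]])

n = A.shape[0]
# Variables: p (row player's mixed strategy) and v (game value).
# Maximize v subject to (A^T p)_j >= v for all j, sum(p) = 1, p >= 0.
c = np.concatenate([np.zeros(n), [-1.0]])            # minimize -v
A_ub = np.hstack([-A.T, np.ones((A.shape[1], 1))])   # v - (A^T p)_j <= 0
b_ub = np.zeros(A.shape[1])
A_eq = np.array([[1.0] * n + [0.0]])                 # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * n + [(None, None)]            # v is free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
p, v = res.x[:n], res.x[n]
print("row strategy ≈", np.round(p, 3), " game value ≈", round(v, 3))
```

For this matrix the indifference condition 5p - 2 = 4 - 5p gives p = (0.6, 0.4) and value 1, which the LP recovers.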
http://www.maplesoft.com/applications/view.aspx?SID=127116&ref=Feed
<p>An investor is offered what appears to be a great investment opportunity. Unfortunately it doesn't turn out to be so great in the long run. This interactive Maple document explores the situation using simulation and analysis, and suggests a new strategy that would produce better results.</p>
<p>This is an example suitable for presentation in an undergraduate course on probability. No knowledge of Maple is required.</p><img src="/view.aspx?si=127116/expectation_thum.png" alt="Great Expectations" align="left"/>127116 | Thu, 27 Oct 2011 04:00:00 Z
Classroom Tips and Techniques: Steepest-Ascent Curves
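The investment in the Great Expectations entry is not spelled out, so the sketch below uses a hypothetical bet with the same flavor: a positive expected one-round return, yet negative expected log-growth, so almost every long path loses money.

```python
import math
import random

# Hypothetical bet: each round multiplies wealth by 1.5 or by 0.6 with equal
# probability.  Expected one-round growth factor is 1.05 > 1, but the
# expected log-growth is negative, so long-run wealth decays on most paths.
exp_growth = 0.5 * 1.5 + 0.5 * 0.6                   # 1.05
log_growth = 0.5 * (math.log(1.5) + math.log(0.6))   # about -0.0527

random.seed(1)

def final_wealth(rounds=1000):
    w = 1.0
    for _ in range(rounds):
        w *= 1.5 if random.random() < 0.5 else 0.6
    return w

below_start = sum(final_wealth() < 1.0 for _ in range(200))
print(f"E[growth] = {exp_growth:.2f}, E[log growth] = {log_growth:.4f}")
print(f"{below_start}/200 simulated paths ended below their starting wealth")
```

The gap between arithmetic and geometric mean growth is exactly the kind of "great in expectation, poor in the long run" effect the entry describes.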
http://www.maplesoft.com/applications/view.aspx?SID=123985&ref=Feed
Steepest-ascent curves are obtained for surfaces defined analytically and digitally.<img src="/view.aspx?si=123985/thumb.jpg" alt="Classroom Tips and Techniques: Steepest-Ascent Curves" align="left"/>123985 | Tue, 19 Jul 2011 04:00:00 Z | Dr. Robert Lopez
Classroom Tips and Techniques: Nonlinear Fit, Optimization, and the DirectSearch Package
http://www.maplesoft.com/applications/view.aspx?SID=122760&ref=Feed
In this month's article, I revisit a nonlinear curve-fitting problem that appears in my Advanced Engineering Mathematics ebook, examine the role of Maple's Optimization package in that problem, and then explore the DirectSearch package from Dr. Sergey N. Moiseev.<img src="/view.aspx?si=122760/thumb.jpg" alt="Classroom Tips and Techniques: Nonlinear Fit, Optimization, and the DirectSearch Package" align="left"/>122760 | Wed, 15 Jun 2011 04:00:00 Z | Dr. Robert Lopez
DirectSearch optimization package, version 2
http://www.maplesoft.com/applications/view.aspx?SID=101333&ref=Feed
<p>The DirectSearch package is a collection of commands for numerically computing local and global minima (maxima) of nonlinear multivariate functions, with or without constraints. The package's optimization methods are universal derivative-free direct-search methods; that is, they do not require the objective function or the constraints to be differentiable or continuous. The methods have quadratic convergence for quadratic functions.<br /><br />The package also contains commands for multiobjective optimization, for solving systems of equations, and for fitting nonlinear functions to data.<br /><br />The following is a summary of the major improvements in DirectSearch v.2:<br /><br />-- Three new derivative-free optimization methods have been added.<br />-- The new global optimization command GlobalOptima has been added.<br />-- Commands for multiobjective optimization, solving systems of equations, and fitting nonlinear functions to data have been added.<br />-- Mixed integer-discrete-continuous optimization is now supported.<br />-- Inequality constraints can now be specified as arbitrary Boolean expressions.<br />-- Bound constraints x>=a, x<=b can now be set as x=a..b.<br />-- The assume and assuming commands are supported for inequality constraints.<br />-- Problem variables can now be specified as a Vector.<br />-- High-dimensional optimization problems are now solved much faster.<br />-- Search along a space-curve direction has been added to all algorithms.<br />-- A penalty-function method has been added for optimization with inequality constraints.<br />-- The improved optimization algorithm for equality constraints is faster and more reliable.<br />-- The search for a feasible initial point has been improved.<br />-- The package is now compatible with Maple 12 and above.<br />-- A detailed description of the CDOS method in PDF format has been added.<br />-- A Russian version of the package is now available.</p><img src="/view.aspx?si=101333/maple_icon.jpg" alt="DirectSearch optimization package, version 2" align="left"/>101333 | Tue, 01 Feb 2011 05:00:00 Z | Dr. Sergey Moiseev
Portfolio Simulation and Quadratic Programming
http://www.maplesoft.com/applications/view.aspx?SID=100604&ref=Feed
<p>In this Maple worksheet we explore portfolio theory and quadratic optimization.<br />We start by simulating data for 50 stocks and then optimize the portfolio.<br />We also use empirical data to backtest our portfolio strategy.</p><img src="/view.aspx?si=100604/maple_icon.jpg" alt="Portfolio Simulation and Quadratic Programming" align="left"/>100604 | Mon, 03 Jan 2011 05:00:00 Z | Marcus Davidsson
Classroom Tips and Techniques: Partial Derivatives by Subscripting
http://www.maplesoft.com/applications/view.aspx?SID=100266&ref=Feed
As output, Maple can display the partial derivative ∂/∂<em>x f</em>(<em>x,y</em>) as <em>f</em><sub>x</sub>; that is, subscript notation can be used to display partial derivatives, and it can be done with two completely different mechanisms. This article describes these two techniques, and then investigates the extent to which partial derivatives can be calculated by subscript notation.<img src="/view.aspx?si=100266/thumb.jpg" alt="Classroom Tips and Techniques: Partial Derivatives by Subscripting" align="left"/>100266 | Wed, 15 Dec 2010 05:00:00 Z | Dr. Robert Lopez
Classroom Tips and Techniques: Fitting Circles in Space to 3-D Data
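Maple's subscript display mechanisms have no direct Python analogue, but the partial derivatives themselves can be cross-checked with SymPy; the function below is chosen arbitrarily for illustration.

```python
import sympy as sp

x, y = sp.symbols("x y")
f = sp.sin(x * y) + x**2 * y

# Partial derivatives; Maple could display these as f_x and f_y.
f_x = sp.diff(f, x)   # y*cos(x*y) + 2*x*y
f_y = sp.diff(f, y)   # x*cos(x*y) + x**2

print(f_x, "|", f_y)
```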
http://www.maplesoft.com/applications/view.aspx?SID=1644&ref=Feed
<p>In "A Project on Circles in Space," Carl Cowen provided an algebraic solution for the problem of fitting a circle to a set of points in space. His technique used the singular value decomposition from linear algebra, and was recast as a project in the volume ATLAST: Computer Exercises for Linear Algebra. Both versions of the problem used MATLAB® for the calculations. In this worksheet, we implement the algebraic calculations in Maple, then add noise to the data to test the robustness of the algebraic method. Next, we solve the problem with an analytic approach that incorporates least squares, and appears to be more robust in the face of noisy data. Finally, the analytic approach leads to explicit formulas for the fitting circle, so we end with graphs of the data, fitting circle, and plane lying closest to the data in the least-squares sense.</p>
<p><em><sub>MATLAB is a registered trademark of The MathWorks, Inc.</sub></em></p><img src="/view.aspx?si=1644/thumb3.jpg" alt="Classroom Tips and Techniques: Fitting Circles in Space to 3-D Data" align="left"/>1644 | Mon, 17 May 2010 04:00:00 Z | Dr. Robert Lopez
Direct search optimization package
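The SVD-plus-least-squares pipeline from the circle-fitting entry can be sketched in Python with NumPy. The data are synthetic, and the Kåsa linear circle fit stands in for the worksheet's exact formulas, so treat this as an illustration of the idea only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: points near a circle of radius 2 centered at (1, 2, 3)
# in a tilted plane, plus a little noise.
t = np.linspace(0, 2 * np.pi, 40, endpoint=False)
u = np.array([1.0, 0.0, 1.0]) / np.sqrt(2)
v = np.array([0.0, 1.0, 0.0])
center_true = np.array([1.0, 2.0, 3.0])
pts = center_true + 2.0 * (np.outer(np.cos(t), u) + np.outer(np.sin(t), v))
pts += 0.01 * rng.standard_normal(pts.shape)

# 1. Best-fit plane through the centroid via the SVD: the right singular
#    vector for the smallest singular value is the plane normal.
centroid = pts.mean(axis=0)
_, _, Vt = np.linalg.svd(pts - centroid)
e1, e2, normal = Vt                       # e1, e2 span the fitted plane

# 2. Project into plane coordinates and fit a circle (Kasa fit: the circle
#    equation 2c.p + (r^2 - |c|^2) = |p|^2 is linear in the unknowns).
P = (pts - centroid) @ np.column_stack([e1, e2])
A = np.column_stack([2 * P, np.ones(len(P))])
b = (P ** 2).sum(axis=1)
(cx, cy, d), *_ = np.linalg.lstsq(A, b, rcond=None)
radius = np.sqrt(d + cx**2 + cy**2)
center = centroid + cx * e1 + cy * e2

print("center ≈", np.round(center, 3), " radius ≈", round(radius, 3))
```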
http://www.maplesoft.com/applications/view.aspx?SID=87637&ref=Feed
<p>The DirectSearch package is a collection of commands for numerically computing local and global minima (maxima) of nonlinear multivariate functions, with or without constraints. The package's optimization methods are direct-search methods; that is, they do not require the objective function to be differentiable or continuous.</p><img src="/view.aspx?si=87637/Fig2.jpg" alt="Direct search optimization package" align="left"/>87637 | Wed, 12 May 2010 04:00:00 Z | Dr. Sergey Moiseev
Phénomène de Runge - subdivision de Chebychev
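DirectSearch is Maple-only, but the derivative-free direct-search idea can be illustrated with SciPy's Nelder-Mead simplex method on a nonsmooth objective (my example, not the package's algorithm):

```python
import numpy as np
from scipy.optimize import minimize

# A nonsmooth objective (the gradient is undefined along x = 1 and y = -2):
# direct-search methods such as Nelder-Mead need no derivatives at all.
f = lambda p: abs(p[0] - 1.0) + abs(p[1] + 2.0) + 0.1 * (p[0]**2 + p[1]**2)

res = minimize(f, x0=[5.0, 5.0], method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})
print("minimizer ≈", np.round(res.x, 4), " min ≈", round(res.fun, 4))
```

The minimum sits at the kink (1, -2) with value 0.5, where any gradient-based method would need special handling.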
http://www.maplesoft.com/applications/view.aspx?SID=35301&ref=Feed
<p>We first observe the divergence of the Lagrange polynomial interpolating the probability density function of the Cauchy distribution when the <strong>subdivision is equally spaced</strong> on [-1;1]. This is the <u>Runge phenomenon</u>.<br />
<br />
We then observe that choosing a <strong>Chebyshev subdivision</strong> makes the divergence near the endpoints disappear.<br />
<br />
This activity was prepared for the agrégation interne de mathématiques preparation course in Rennes on 10 March 2010.<br />
The new syllabus of the examination encourages exercises that use computer tools. It seems difficult to give a convincing proof of the Runge phenomenon in an oral examination, which justifies restricting the discussion to observation alone.</p><img src="/view.aspx?si=35301/thumb.jpg" alt="Phénomène de Runge - subdivision de Chebychev" align="left"/>35301 | Fri, 26 Mar 2010 04:00:00 Z | KERNIVINEN Sebastien
Diet Optimization
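The observation in the Runge-phenomenon entry reproduces readily outside Maple; the sketch below uses the classical Runge function 1/(1 + 25x²), which shares the Cauchy density's pole structure, with equally spaced versus Chebyshev nodes.

```python
import numpy as np

# Interpolate 1/(1 + 25 x^2) on [-1, 1] with a degree-20 polynomial through
# (a) equally spaced nodes and (b) Chebyshev nodes, and compare max errors.
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)
n = 21

x_eq = np.linspace(-1, 1, n)
x_ch = np.cos((2 * np.arange(n) + 1) / (2 * n) * np.pi)   # Chebyshev nodes

xx = np.linspace(-1, 1, 1001)

def interp_error(nodes):
    coef = np.polyfit(nodes, f(nodes), n - 1)
    return np.max(np.abs(np.polyval(coef, xx) - f(xx)))

err_eq = interp_error(x_eq)
err_ch = interp_error(x_ch)
print(f"max error, equispaced: {err_eq:.3f}")
print(f"max error, Chebyshev:  {err_ch:.5f}")
```

With equispaced nodes the error blows up near the endpoints; with Chebyshev nodes it is small across the whole interval, exactly as the entry observes.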
http://www.maplesoft.com/applications/view.aspx?SID=35156&ref=Feed
<p>This application finds the least-cost diet that fulfills a specific set of nutritional requirements. It has a default basket of foods (with an associated set of nutritional data), but foods can be added or removed, with changes remembered from prior saved sessions. The worksheet has a graphical user interface implemented with Maple’s embedded components.</p>
<p>The linear programming techniques implemented in this application are now widely used to create practical diet plans from accepted nutritional guidelines.</p><img src="/view.aspx?si=35156/279802\test.jpg" alt="Diet Optimization" align="left"/>35156 | Mon, 08 Feb 2010 05:00:00 Z | Maplesoft
Classroom Tips and Techniques: Geodesics on a Surface
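The Diet Optimization worksheet's food data are not reproduced here, but the least-cost diet model it implements is a standard linear program; here is a toy sketch with invented foods and nutrient minimums.

```python
from scipy.optimize import linprog

# Toy diet problem (all numbers hypothetical): choose nonnegative servings
# minimizing cost subject to nutrient minimums.
#               cost  calories  protein  calcium
foods = {
    "oatmeal": (0.30,   110,      4,      2),
    "chicken": (2.40,   205,     32,     12),
    "milk":    (0.90,   160,      8,    285),
}
costs = [v[0] for v in foods.values()]

# Require at least 2000 calories, 55 g protein, 800 mg calcium.
# linprog wants A_ub x <= b_ub, so the >= rows are negated.
A_ub = [[-v[k] for v in foods.values()] for k in (1, 2, 3)]
b_ub = [-2000, -55, -800]

res = linprog(costs, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
servings = dict(zip(foods, res.x))
print({k: round(x, 2) for k, x in servings.items()}, "cost:", round(res.fun, 2))
```

The worksheet's version adds an editable food basket and a GUI on top of the same LP core.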
http://www.maplesoft.com/applications/view.aspx?SID=34940&ref=Feed
<p>Several months ago we provided the article Tensor Calculus with the Differential Geometry Package in which we found geodesics in the plane when the plane was referred to polar coordinates. In this month's article we find geodesics on a surface embedded in R<sup>3</sup>. We illustrate three approaches: numeric approximation, the calculus of variations, and differential geometry.</p><img src="/view.aspx?si=34940/thumb.jpg" alt="Classroom Tips and Techniques: Geodesics on a Surface" align="left"/>34940 | Tue, 08 Dec 2009 05:00:00 Z | Dr. Robert Lopez
Portfolio Optimization under Nonconvex Transaction Costs with the Global Optimization Toolbox
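Of the three approaches named in the geodesics entry, the numeric one can be sketched directly: represent a path on the unit sphere by waypoints and minimize the discrete energy, whose minimizers are evenly spaced points along a geodesic. This is my own minimal construction, not the article's Maple code; the result should approach the great-circle distance.

```python
import numpy as np
from scipy.optimize import minimize

def to_xyz(th, ph):
    """Spherical angles to a point on the unit sphere."""
    return np.array([np.sin(th) * np.cos(ph),
                     np.sin(th) * np.sin(ph),
                     np.cos(th)])

A = to_xyz(np.pi / 2, 0.0)           # (1, 0, 0), on the equator
B = to_xyz(np.pi / 2, np.pi / 2)     # (0, 1, 0), on the equator
m = 18                               # interior waypoints

def path(angles):
    th, ph = angles[:m], angles[m:]
    return [A] + [to_xyz(t, p) for t, p in zip(th, ph)] + [B]

def energy(angles):
    # Discrete Dirichlet energy: sum of squared segment lengths.  Unlike the
    # raw length, it has no degenerate minimizers with clustered waypoints.
    pts = path(angles)
    return sum(np.sum((pts[i + 1] - pts[i]) ** 2) for i in range(len(pts) - 1))

def length(angles):
    pts = path(angles)
    return sum(np.linalg.norm(pts[i + 1] - pts[i]) for i in range(len(pts) - 1))

# Initial guess: a slightly perturbed straight line in angle space
rng = np.random.default_rng(0)
th0 = np.full(m, np.pi / 2) + 0.1 * rng.standard_normal(m)
ph0 = np.linspace(0, np.pi / 2, m + 2)[1:-1]
res = minimize(energy, np.concatenate([th0, ph0]), method="BFGS")

great_circle = np.arccos(A @ B)      # exactly pi/2 for these endpoints
print(f"discrete length ≈ {length(res.x):.5f}, great circle = {great_circle:.5f}")
```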
http://www.maplesoft.com/applications/view.aspx?SID=1401&ref=Feed
<p>The two competing goals of investment are long-term growth of capital and low risk. The Markowitz model, first formulated in 1952, is a quadratic programming optimization model for balancing these two goals. The decision variables are the amounts invested in each asset. The objective is to minimize the variance of a portfolio's total return, subject to the constraints that (1) the expected growth of the portfolio is at least some target level, and (2) that we don't invest more capital than we have. In its original form, the Markowitz model assumes no transaction costs, and thus, it is easily solved with quadratic programming. Researchers have studied numerous convex generalizations of the Markowitz model, for example, by adding convex (usually linear) transaction costs, or other linear constraints. In this Maple application demonstration, we assume the investor pays nonconvex transaction costs on asset purchases; in particular, these costs are computed by a concave, piecewise-linear function of the amounts invested. This cost structure arises in practice whenever a broker offers volume discounts on commission rates. Under these conditions, the Markowitz model becomes a nonconvex optimization problem that is not easily solved using local-search optimization algorithms. However, we solve it easily using the Maple Global Optimization Toolbox.</p><img src="/view.aspx?si=1401/1407.jpg" alt="Portfolio Optimization under Nonconvex Transaction Costs with the Global Optimization Toolbox" align="left"/>1401 | Mon, 14 Sep 2009 04:00:00 Z
Applied Portfolio Optimization
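The nonconvex transaction-cost model in the entry above needs a global optimizer; the sketch below shows only the classical convex Markowitz core (with invented data), which is the starting point the entry generalizes.

```python
import numpy as np
from scipy.optimize import minimize

# Classical Markowitz core (no transaction costs), a convex QP: minimize the
# portfolio variance w' S w subject to an expected-return floor and full
# investment sum(w) = 1.  Data for three assets are hypothetical.
mu = np.array([0.08, 0.12, 0.15])            # expected returns
S = np.array([[0.10, 0.02, 0.01],            # covariance matrix
              [0.02, 0.12, 0.03],
              [0.01, 0.03, 0.20]])
target = 0.11                                # required expected return

cons = [{"type": "eq",   "fun": lambda w: w.sum() - 1.0},
        {"type": "ineq", "fun": lambda w: mu @ w - target}]
res = minimize(lambda w: w @ S @ w, np.full(3, 1 / 3),
               bounds=[(0, 1)] * 3, constraints=cons, method="SLSQP")
w = res.x
print("weights ≈", np.round(w, 3), " variance ≈", round(res.fun, 4))
```

Adding a concave piecewise-linear purchase cost to the objective destroys convexity, which is exactly why the entry turns to a global method.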
http://www.maplesoft.com/applications/view.aspx?SID=7262&ref=Feed
Applied Portfolio Optimization<img src="/view.aspx?si=7262/1.jpg" alt="Applied Portfolio Optimization" align="left"/>7262 | Wed, 18 Feb 2009 00:00:00 Z | Marcus Davidsson
A Computational Approach to Essential and Nonessential Objective Functions in Linear Multicriteria Optimization
http://www.maplesoft.com/applications/view.aspx?SID=7061&ref=Feed
<strong>Authors</strong>: Prof. Agnieszka B. Malinowska and Prof. Delfim F. M. Torres
The question of obtaining well-defined criteria for multiple-criteria decision-making problems is well known. One of the approaches to this question is the concept of a nonessential objective function: an objective function is called nonessential if the set of efficient solutions is the same with or without it. In this work we put together two methods for determining nonessential objective functions. A computational implementation is done using the computer algebra system Maple.<img src="/view.aspx?si=7061/1.jpg" alt="A Computational Approach to Essential and Nonessential Objective Functions in Linear Multicriteria Optimization" align="left"/>7061 | Tue, 23 Dec 2008 00:00:00 Z | Prof. Delfim Torres