Operations Research : New Applications
http://www.maplesoft.com/applications/category.aspx?cid=148
en-us | 2015 Maplesoft, A Division of Waterloo Maple Inc. | Maplesoft Document System | Tue, 31 Mar 2015 16:54:20 GMT
New applications in the Operations Research category
http://www.mapleprimes.com/images/mapleapps.gif
Circle Packing in a Square
http://www.maplesoft.com/applications/view.aspx?SID=153599&ref=Feed
<p>This application optimizes the packing of circles (of varying radii) in a square, such that the side-length of the square is minimized. One solution for 20 circles (with integer radii of 1 to 20) is visualized here.</p>
<p>This is a difficult global optimization problem and demands strong solvers. This application uses Maple's <a href="/products/toolboxes/globaloptimization/">Global Optimization Toolbox</a>.</p>
<p>Circle packing (and packing optimization in general) is characterized by a large optimization space and many constraints; for this application, 20 circles generate 230 constraint equations.</p>
<p>The number of circles can be increased to create an increasingly complex problem; Maple automatically generates the symbolic constraint equations.</p>
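The constraint count quoted above can be reproduced: 20 circles give C(20, 2) = 190 pairwise non-overlap conditions plus two containment ranges per circle, for a total of 230. A minimal Python sketch of one plausible formulation (the exact symbolic form Maple generates is not shown in this feed; the variable names and range syntax here are illustrative):

```python
from itertools import combinations

def packing_constraints(radii):
    """Constraint strings for packing circles in a square of side L:
    pairwise non-overlap plus per-circle containment ranges."""
    cons = []
    # Non-overlap: squared center distance >= (r_i + r_j)^2
    for i, j in combinations(range(len(radii)), 2):
        cons.append(f"(x{i}-x{j})^2 + (y{i}-y{j})^2 >= {(radii[i] + radii[j]) ** 2}")
    # Containment: each center stays radius-deep inside the square
    for i, r in enumerate(radii):
        cons.append(f"x{i} = {r} .. L-{r}")
        cons.append(f"y{i} = {r} .. L-{r}")
    return cons

cons = packing_constraints(list(range(1, 21)))   # integer radii 1 to 20
print(len(cons))   # 190 pairwise + 40 containment = 230
```

Increasing the number of circles makes the pairwise term grow quadratically, which is why the problem hardens so quickly.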
<p>Applications like this are used to stress-test global optimizers.</p><img src="/view.aspx?si=153599/071f7b81258c5cad651a5030370d824f.gif" alt="Circle Packing in a Square" align="left"/>
153599 | Wed, 04 Jun 2014 04:00:00 Z | Samir Khan
Packing Disks into a Circle
http://www.maplesoft.com/applications/view.aspx?SID=153600&ref=Feed
<p>This application finds the best packing of unequal non-overlapping disks in a circular container, such that the radius of the container is minimized. This is a tough global optimization problem that demands strong solvers; this application uses Maple's <a href="/products/toolboxes/globaloptimization/">Global Optimization Toolbox</a>. You must have the Global Optimization Toolbox installed to use this application.</p>
<p>One solution for the packing of 50 disks with the integer radii 1 to 50 (as found by this application) is visualized here.</p>
<p>Other solutions for similar packing problems are documented at <a href="http://www.packomania.com">http://www.packomania.com</a>.</p>
<p>Packing optimization is industrially important, with applications in pallet loading, the arrangement of fiber optic cables in a tube, and the placement of components on a circuit board.</p><img src="/view.aspx?si=153600/32183b61c1bca332d0c71924ae09f73a.gif" alt="Packing Disks into a Circle" align="left"/>
153600 | Wed, 04 Jun 2014 04:00:00 Z | Samir Khan
Classroom Tips and Techniques: Bivariate Limits - Then and Now
http://www.maplesoft.com/applications/view.aspx?SID=145979&ref=Feed
An introductory overview of the functionality in Maple's GraphTheory package.<img src="/view.aspx?si=145979/thumb.jpg" alt="Classroom Tips and Techniques: Bivariate Limits - Then and Now" align="left"/>
145979 | Wed, 17 Apr 2013 04:00:00 Z | Dr. Robert Lopez
Classroom Tips and Techniques: Introduction to Maple's GraphTheory Package
http://www.maplesoft.com/applications/view.aspx?SID=142357&ref=Feed
An introductory overview of the functionality in Maple's GraphTheory package.<img src="/view.aspx?si=142357/thumb.jpg" alt="Classroom Tips and Techniques: Introduction to Maple's GraphTheory Package" align="left"/>
142357 | Thu, 17 Jan 2013 05:00:00 Z | Prof. Michael Monagan
Classroom Tips and Techniques: Best Taylor-Polynomial Approximations
http://www.maplesoft.com/applications/view.aspx?SID=136471&ref=Feed
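As a rough illustration of the search this entry describes (this is not Ecker's Maple code), the best expansion point for the quadratic Taylor approximation of f(x) = sinh(x) − x·e^(−3x) on [−1, 3] can be found by brute force, with derivatives taken numerically:

```python
import math

def f(x):
    return math.sinh(x) - x * math.exp(-3 * x)

def taylor2(a, h=1e-5):
    """Quadratic Taylor polynomial of f about x = a; derivatives via
    central differences, accurate enough for this demo."""
    d1 = (f(a + h) - f(a - h)) / (2 * h)
    d2 = (f(a + h) - 2 * f(a) + f(a - h)) / h ** 2
    fa = f(a)
    return lambda x: fa + d1 * (x - a) + 0.5 * d2 * (x - a) ** 2

def l2_error(a, lo=-1.0, hi=3.0, n=400):
    """L2 norm of f - T_a over [lo, hi], composite trapezoid rule."""
    T = taylor2(a)
    xs = [lo + (hi - lo) * k / n for k in range(n + 1)]
    ys = [(f(x) - T(x)) ** 2 for x in xs]
    return math.sqrt((hi - lo) / n * (sum(ys) - 0.5 * (ys[0] + ys[-1])))

# Scan candidate expansion points and keep the best one
candidates = [-1.0 + 4.0 * k / 200 for k in range(201)]
best_a = min(candidates, key=l2_error)
print(round(best_a, 2), round(l2_error(best_a), 4))
```

Ecker's approach, as described in the article, refines this idea symbolically; the grid scan above merely shows that "best expansion point" is a well-posed one-dimensional minimization.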
In the early 90s, Joe Ecker (Rensselaer Polytechnic Institute) provided a Maple solution to the problem of determining, for a given function, which expansion point in a specified interval yielded the best quadratic Taylor polynomial approximation, where "best" was measured by the L<sub>2</sub>-norm. This article applies Ecker's approach to the function <em>f(x)</em> = sinh<em>(x)</em> – <em>x e<sup>-3x</sup>,</em> -1 ≤ <em>x</em> ≤ 3, then goes on to find other approximating quadratic polynomials.<img src="/view.aspx?si=136471/image.jpg" alt="Classroom Tips and Techniques: Best Taylor-Polynomial Approximations" align="left"/>
136471 | Tue, 14 Aug 2012 04:00:00 Z | Dr. Robert Lopez
Street-fighting Math
http://www.maplesoft.com/applications/view.aspx?SID=129226&ref=Feed
This interactive Maple document contains a simple street-fighting game and performs a mathematical analysis of it, involving probability and game theory. The document is suitable for presentation in an undergraduate course on operations research, probability or linear programming. No knowledge of Maple is required.<img src="/view.aspx?si=129226/fighter_sm.jpg" alt="Street-fighting Math" align="left"/>
129226 | Thu, 29 Dec 2011 05:00:00 Z | Dr. Robert Israel
Great Expectations
http://www.maplesoft.com/applications/view.aspx?SID=127116&ref=Feed
<p>An investor is offered what appears to be a great investment opportunity. Unfortunately it doesn't turn out to be so great in the long run. This interactive Maple document explores the situation using simulation and analysis, and suggests a new strategy that would produce better results.</p>
<p>This is an example suitable for presentation in an undergraduate course on probability. No knowledge of Maple is required.</p><img src="/view.aspx?si=127116/expectation_thum.png" alt="Great Expectations" align="left"/>
127116 | Thu, 27 Oct 2011 04:00:00 Z
DirectSearch optimization package, version 2
http://www.maplesoft.com/applications/view.aspx?SID=101333&ref=Feed
<p>The DirectSearch package is a collection of commands to numerically compute local and global minima (or maxima) of nonlinear multivariate functions, with or without constraints. The package's optimization methods are universal derivative-free direct search methods; that is, they do not require the objective function or constraints to be differentiable or continuous.<br />The package's optimization methods have quadratic convergence for quadratic functions.<br /><br />The package also contains commands for multiobjective optimization, solving systems of equations, and fitting nonlinear functions to data.<br /><br />The following is a summary of the major improvements in DirectSearch v.2.<br /><br />-- Three new derivative-free optimization methods are added.<br />-- The new global optimization command GlobalOptima is added.<br />-- Commands for multiobjective optimization, solving systems of equations, and fitting nonlinear functions to data are added.<br />-- Mixed integer-discrete-continuous optimization is now supported.<br />-- You can now specify inequality constraints as any Boolean expressions.<br />-- You can now set bound inequality constraints x>=a, x<=b as x=a..b.<br />-- Assume and assumption commands are supported for inequality constraints.<br />-- You can now specify problem variables as a Vector.<br />-- High-dimensional optimization problems are now solved much faster.<br />-- Search in a space-curve direction is added to all algorithms.<br />-- A penalty function method is added for optimization with inequality constraints.<br />-- The improved optimization algorithm for equality constraints is faster and more reliable.<br />-- The feasible initial point search is improved.<br />-- The package is now compatible with Maple 12 and above.<br />-- A detailed description of the CDOS method in PDF format is added.<br />-- A Russian version of the package is now available.</p><img src="/view.aspx?si=101333/maple_icon.jpg" alt="DirectSearch optimization package, version 2" align="left"/>
101333 | Tue, 01 Feb 2011 05:00:00 Z | Dr. Sergey Moiseev
Portfolio Optimization under Nonconvex Transaction Costs with the Global Optimization Toolbox
http://www.maplesoft.com/applications/view.aspx?SID=1401&ref=Feed
<p>The two competing goals of investment are long-term growth of capital and low risk. The Markowitz model, first formulated in 1952, is a quadratic programming model for balancing these two goals. The decision variables are the amounts invested in each asset. The objective is to minimize the variance of the portfolio's total return, subject to the constraints that (1) the expected growth of the portfolio is at least some target level, and (2) we don't invest more capital than we have. In its original form, the Markowitz model assumes no transaction costs and thus is easily solved with quadratic programming. Researchers have studied numerous convex generalizations of the Markowitz model, for example, adding convex (usually linear) transaction costs or other linear constraints. In this Maple application demonstration, we assume the investor pays nonconvex transaction costs on asset purchases; in particular, these costs are computed by a concave, piecewise-linear function of the amounts invested. This cost structure arises in practice whenever a broker offers volume discounts on commission rates. Under these conditions, the Markowitz model becomes a nonconvex optimization problem that is not easily solved using local-search optimization algorithms. However, we solve it easily using the Maple Global Optimization Toolbox.</p><img src="/view.aspx?si=1401/1407.jpg" alt="Portfolio Optimization under Nonconvex Transaction Costs with the Global Optimization Toolbox" align="left"/>
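For contrast with the nonconvex version described in this entry, the frictionless two-asset case has a closed-form minimum-variance solution. A Python sketch (the volatilities and covariance below are hypothetical, and the return-target constraint is omitted for simplicity):

```python
def variance(w, s1, s2, cov):
    """Variance of a two-asset portfolio with weight w on asset 1
    and 1 - w on asset 2."""
    return w * w * s1 * s1 + (1 - w) * (1 - w) * s2 * s2 + 2 * w * (1 - w) * cov

def min_variance_weight(s1, s2, cov):
    """Closed-form minimizer of the variance, weights summing to 1:
    set d(variance)/dw = 0 and solve for w."""
    return (s2 ** 2 - cov) / (s1 ** 2 + s2 ** 2 - 2 * cov)

s1, s2, cov = 0.20, 0.10, 0.005        # hypothetical volatilities and covariance
w = min_variance_weight(s1, s2, cov)
# Sanity check: no weight on a fine grid does better
w_grid = min((k / 1000 for k in range(1001)), key=lambda x: variance(x, s1, s2, cov))
print(round(w, 3), round(w_grid, 3))   # 0.125 0.125
```

Once the concave, piecewise-linear purchase costs enter the objective, this smooth structure is lost, which is exactly why the application reaches for a global solver.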
1401 | Mon, 14 Sep 2009 04:00:00 Z
Optimal Portfolio Allocation and Economic Utility
http://www.maplesoft.com/applications/view.aspx?SID=4860&ref=Feed
<p>Show how the risk and reward set of a portfolio of two or three securities whose returns are jointly normally distributed is calculated from standard theorems about the mean and variance of linear combinations of the random variables. Use ideas from multivariable calculus to show how the feasible set is derived. Visualize the efficient frontier of the risk and reward set. Visualize the economic utility of the portfolio. Show how the concept of economic utility selects a unique optimal portfolio on the efficient frontier.</p><img src="/view.aspx?si=4860/thumb2.jpg" alt="Optimal Portfolio Allocation and Economic Utility" align="left"/>
4860 | Mon, 14 Sep 2009 04:00:00 Z | Shengjie Guo
A Computational Approach to Essential and Nonessential Objective Functions in Linear Multicriteria Optimization
http://www.maplesoft.com/applications/view.aspx?SID=7061&ref=Feed
<strong>Authors</strong>: Prof. Agnieszka B. Malinowska and Prof. Delfim F. M. Torres
The question of obtaining well-defined criteria for multiple criteria decision making problems is well known. One approach to this question is the concept of a nonessential objective function: an objective function is called nonessential if the set of efficient solutions is the same with or without it. In this work, we put together two methods for determining nonessential objective functions. A computational implementation is done using the computer algebra system Maple.<img src="/view.aspx?si=7061/1.jpg" alt="A Computational Approach to Essential and Nonessential Objective Functions in Linear Multicriteria Optimization" align="left"/>
7061 | Tue, 23 Dec 2008 00:00:00 Z | Prof. Delfim Torres
Traveling Salesman Problem
http://www.maplesoft.com/applications/view.aspx?SID=6873&ref=Feed
The Traveling Salesman Problem (TSP) is a fascinating optimization problem in which a salesman wishes to visit each of N cities exactly once and return to the city of departure, attempting to minimize the overall distance traveled. For the symmetric problem, where the distance (cost) from city A to city B is the same as from B to A, the number of possible paths to consider is given by (N-1)!/2. The exhaustive search for the shortest tour very quickly becomes impossible to conduct. Why? Because, assuming that your computer can evaluate the length of a billion tours per second, calculations would last 40 years in the case of twenty cities and would jump to 800 years if you added one city to the tour [1]. These numbers give meaning to the expression "combinatorial explosion". Consequently, we must settle for approximate solutions, provided we can compute them efficiently. In this worksheet, we will compare two approximation algorithms, a simple-minded one (nearest neighbor) and one of the best (Lin-Kernighan 2-opt).<img src="/view.aspx?si=6873/TSgif.gif" alt="Traveling Salesman Problem" align="left"/>
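The two heuristics compared in the worksheet can be sketched in a few lines of Python (this uses plain 2-opt rather than the full Lin-Kernighan procedure, and the random instance is illustrative):

```python
import math
import random

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(8)]   # random cities

def dist(a, b):
    return math.dist(pts[a], pts[b])

def tour_length(tour):
    return sum(dist(tour[i], tour[(i + 1) % len(tour)]) for i in range(len(tour)))

def nearest_neighbor(start=0):
    """Greedy construction: always hop to the closest unvisited city."""
    left, tour = set(range(len(pts))) - {start}, [start]
    while left:
        nxt = min(left, key=lambda c: dist(tour[-1], c))
        left.remove(nxt)
        tour.append(nxt)
    return tour

def two_opt(tour):
    """Improvement: reverse a segment whenever that shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                if tour_length(cand) < tour_length(tour) - 1e-12:
                    tour, improved = cand, True
    return tour

print(math.factorial(19) // 2)   # distinct tours for N = 20: 60822550204416000
nn = nearest_neighbor()
opt = two_opt(nn)
print(tour_length(opt) <= tour_length(nn))   # True: 2-opt never makes it worse
```

Nearest neighbor builds a tour in O(N^2) time; 2-opt then removes crossings, which is where most of the improvement comes from on Euclidean instances.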
6873 | Mon, 10 Nov 2008 00:00:00 Z | Bruno Guerrieri
Dijkstra's Shortest Path Algorithm
http://www.maplesoft.com/applications/view.aspx?SID=4969&ref=Feed
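This entry packages Dijkstra's algorithm for Maple; for reference, the classic priority-queue formulation fits in a few lines of Python (the example graph is made up):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest-path distances for a weighted digraph
    given as {node: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2), ("D", 6)], "C": [("D", 3)]}
print(dijkstra(g, "A"))   # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

Unreachable nodes are simply absent from the result, which is often more convenient than returning infinities.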
An implementation of Dijkstra's shortest path algorithm as a Maple package.<img src="/view.aspx?si=4969/dijkstra_v2_maple11_4.jpg" alt="Dijkstras Shortest Path Algorithm" align="left"/>
4969 | Tue, 29 May 2007 00:00:00 Z | Jay Pedersen
LibLip - multivariate scattered data interpolation and smoothing
http://www.maplesoft.com/applications/view.aspx?SID=4854&ref=Feed
LibLip is a Maple toolbox that provides many methods to interpolate scattered data (with or without preprocessing) by using only the data itself and one additional parameter, the Lipschitz constant (which is basically an upper bound on the slope of the function). The Lipschitz constant can be automatically estimated from the data.
LibLip also provides approximation methods using locally Lipschitz functions.
If the data contains noise, it can be smoothed using special techniques that rely on linear programming. The Lipschitz constant can also be estimated from noisy data by using sample splitting and cross-validation.
In addition, LibLip accommodates monotonicity and range constraints. It is useful for approximating functions that are known to be monotone with respect to all or a subset of variables, as well as monotone only on parts of the domain. Range constraints accommodate non-constant bounds on the values of the data and the interpolant.<img src="/view.aspx?si=4854/image.jpg" alt="LibLip - multivariate scattered data interpolation and smoothing" align="left"/>
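LibLip's API is not reproduced in this feed, but the textbook construction behind Lipschitz interpolation is short: the interpolant is the average of the lower and upper Lipschitz envelopes of the data. A one-dimensional Python sketch with made-up data (LibLip itself handles the multivariate case):

```python
def lipschitz_interpolant(xs, fs, M):
    """Interpolant under Lipschitz constant M: the midpoint of the
    lower and upper Lipschitz envelopes of the data (xs, fs)."""
    def fhat(x):
        upper = min(f + M * abs(x - xi) for xi, f in zip(xs, fs))
        lower = max(f - M * abs(x - xi) for xi, f in zip(xs, fs))
        return 0.5 * (upper + lower)
    return fhat

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
fs = [0.0, 0.25, 1.0, 2.0, 4.5]              # made-up sample values
fhat = lipschitz_interpolant(xs, fs, M=6.0)  # M must bound the data slopes
print(all(abs(fhat(x) - y) < 1e-12 for x, y in zip(xs, fs)))   # True
```

As long as M is at least the steepest slope between data points, the envelopes coincide at the samples, so the interpolant reproduces the data exactly and is itself Lipschitz with constant M.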
4854 | Fri, 29 Dec 2006 00:00:00 Z | Dr. Gleb Beliakov
Global and nonsmooth optimization toolbox
http://www.maplesoft.com/applications/view.aspx?SID=4840&ref=Feed
GANSO is a programming library for multivariate global and non-smooth nonlinear optimization. It implements the Extended Cutting Angle method, the Derivative Free Bundle method, a dynamical-systems-based heuristic, and multistart local search. Unlike most nonlinear optimization tools, GANSO algorithms are not trapped in shallow local minima. This Maple toolbox contains the user manual and a sample Maple worksheet. The programming library should be downloaded separately from
http://www.ganso.com.au/libs/gansomapledlls.zip
and its contents extracted into a directory on the path. The GANSO homepage is www.ganso.com.au.<img src="/view.aspx?si=4840/ganso.jpg" alt="Global and nonsmooth optimization toolbox" align="left"/>
4840 | Fri, 03 Nov 2006 00:00:00 Z | Centre for Informatics and Applied Optimization CIAO
Programmation linéaire
http://www.maplesoft.com/applications/view.aspx?SID=4827&ref=Feed
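As a companion to this linear-programming entry, the vertex-enumeration idea it uses for two-variable problems can be sketched in Python: intersect the constraint boundary lines pairwise and keep the feasible intersections (the example LP and tolerances are illustrative):

```python
from itertools import combinations

def lp_vertices(A, b):
    """Vertices of {x : A x <= b} in the plane, found by intersecting
    constraint boundary lines pairwise and keeping feasible points."""
    verts = []
    for (a1, b1), (a2, b2) in combinations(zip(A, b), 2):
        det = a1[0] * a2[1] - a1[1] * a2[0]
        if abs(det) < 1e-12:
            continue                        # parallel boundaries, no vertex
        x = (b1 * a2[1] - b2 * a1[1]) / det   # Cramer's rule
        y = (a1[0] * b2 - a2[0] * b1) / det
        if all(ax * x + ay * y <= bb + 1e-9 for (ax, ay), bb in zip(A, b)):
            verts.append((x, y))
    return verts

# maximize 3x + 2y  subject to  x + y <= 4, x <= 3, x >= 0, y >= 0
A = [(1, 1), (1, 0), (-1, 0), (0, -1)]
b = [4, 3, 0, 0]
vs = lp_vertices(A, b)
best = max(vs, key=lambda v: 3 * v[0] + 2 * v[1])
print(best)   # (3.0, 1.0)
```

Since a linear objective attains its optimum at a vertex of the feasible polygon, checking the vertices suffices in two or three dimensions; the simplex method generalizes this without enumerating everything.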
This application optimizes a function subject to linear constraints in two or more variables. When the problem has two or three variables, the application locates, in the Cartesian plane or in three-dimensional space, the convex set of feasible solutions of the problem as well as all of its vertices.<img src="/view.aspx?si=4827/prog_lin.jpg" alt="Programmation linéaire" align="left"/>
4827 | Mon, 16 Oct 2006 00:00:00 Z | Andre Levesque
Classroom Tips and Techniques: The Lagrange Multiplier Method
http://www.maplesoft.com/applications/view.aspx?SID=4811&ref=Feed
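As a companion to this entry, a small worked instance of the Lagrange multiplier method (not taken from the article): maximize f(x, y) = xy subject to x + y = 1. The stationarity conditions give x = y = 1/2, which the script checks numerically:

```python
# f(x, y) = x*y subject to g(x, y) = x + y - 1 = 0.
# Stationarity (grad f = lam * grad g):  y = lam,  x = lam,  x + y = 1,
# so x = y = lam = 1/2 and f = 1/4.

def f(x, y):
    return x * y

x = y = 0.5

# Check the candidate against a fine scan along the constraint x + y = 1
scan = max(f(t, 1 - t) for t in (k / 10000 for k in range(10001)))
print(f(x, y), scan)   # 0.25 0.25
```

The scan along the constraint confirms that the stationary point found by the multiplier conditions is indeed the constrained maximum.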
Maple has a number of graphical and analytical tools for studying and implementing the method of Lagrange multipliers. In this article, we demonstrate a number of these tools, indicating how they might be used pedagogically.<img src="/view.aspx?si=4811/lagrange.gif" alt="Classroom Tips and Techniques: The Lagrange Multiplier Method" align="left"/>
4811 | Mon, 28 Aug 2006 00:00:00 Z | Dr. Robert Lopez
Aspherical Lens Surface Identification - Non-Linear Fitting with the Global Optimization Toolbox
http://www.maplesoft.com/applications/view.aspx?SID=4806&ref=Feed
<p>In this application demonstration, we investigate aspherical lenses and apply non-linear fitting to obtain an accurate representation of the given data in the form of a function, using the Global Optimization Toolbox for Maple.</p><img src="/view.aspx?si=4806/asphlens.jpg" alt="Aspherical Lens Surface Identification - Non-Linear Fitting with the Global Optimization Toolbox" align="left"/>
4806 | Mon, 31 Jul 2006 04:00:00 Z | Maplesoft
Circuit Design Problem
http://www.maplesoft.com/applications/view.aspx?SID=1678&ref=Feed
Based on the classical study of Ebers and Moll (1954), a bipolar transistor is modeled by an electrical circuit (see also, e.g., Granvilliers and Benhamou, 2001). The corresponding model leads to a square system of highly nonlinear equations in nine (9) variables that has been studied by numerous researchers in attempts to solve it and then prove the correctness of the suggested solution.<img src="/view.aspx?si=1678/EMOLL.JPG" alt="Circuit Design Problem" align="left"/>
1678 | Wed, 26 Oct 2005 00:00:00 Z | Dr. Janos Pinter
Alkylation Process Model
http://www.maplesoft.com/applications/view.aspx?SID=1675&ref=Feed
In this example, we describe a model for the optimization of a typical process operation in the petrochemical industry. Our objective is to determine the optimal set of operating conditions for an alkylation process that combines olefin with isobutane, in the presence of a catalyst, to form alkylate.
Many chemical processes are characterized by nonlinear equilibrium (material and energy balance) constraints. In addition, the processes are typically constrained by restrictions on the operating ranges of the decision variables such as amounts and rates of the components used, temperature, pressure, and so on.
This worksheet requires that the Global Optimization Toolbox has been added to Maple.<img src="/view.aspx?si=1675/alkyl.JPG" alt="Alkylation Process Model" align="left"/>
1675 | Mon, 10 Oct 2005 04:00:00 Z | Dr. Janos Pinter