Regression: New Applications
http://www.maplesoft.com/applications/category.aspx?cid=237
en-us | 2015 Maplesoft, A Division of Waterloo Maple Inc. | Maplesoft Document System
Sat, 10 Oct 2015 16:12:53 GMT
New applications in the Regression category
Fitting 3D Data to a Bivariate Polynomial Surface
http://www.maplesoft.com/applications/view.aspx?SID=129500&ref=Feed
<p>This application:</p>
<ul>
<li>generates 3D data as a proxy for experimental data (i.e. a table of X, Y, Z points with random noise added to Z),</li>
<li>generates a bivariate polynomial (the order can be changed by the user),</li>
<li>fits the polynomial to the data with a least-squares fit,</li>
</ul>
<p>and plots the original data against the best-fit polynomial surface.</p>
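The application itself is a Maple worksheet; purely as an illustration of the same three steps (noisy data, a bivariate polynomial, a least-squares fit), here is a sketch in Python with NumPy. The surface, noise level, and polynomial order are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Generate X, Y, Z data as a proxy for experimental data
#    (random noise added to Z).
x = rng.uniform(-1.0, 1.0, 200)
y = rng.uniform(-1.0, 1.0, 200)
z = 1.0 + 2.0 * x - 3.0 * y + 0.5 * x * y + rng.normal(0.0, 0.05, x.size)

# 2. Build the design matrix for a bivariate polynomial of total order 2;
#    the columns are the monomials 1, x, y, x^2, x*y, y^2.
A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])

# 3. Fit the polynomial coefficients by least squares.
coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
print(np.round(coeffs, 2))
```

The fitted surface can then be plotted against the data points, e.g. with matplotlib's 3D `plot_surface` and `scatter`.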
129500 | Mon, 09 Jan 2012 05:00:00 Z | Maplesoft

Regression in Maple
http://www.maplesoft.com/applications/view.aspx?SID=129021&ref=Feed
I have always thought that regression has been too complicated in Maple. The Fit command is too fiddly, i.e. you have to specify too many things and it is easy to get it wrong, and the statistical output is far from mainstream: you don't get t-values, p-values, R, R^2, adjusted R^2, and so on.
I have therefore designed a new procedure called Reg() which needs only one input: a data matrix.
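Reg() itself is a Maple procedure and is not reproduced here; purely as an illustration of the kind of mainstream output it aims for, R^2 and adjusted R^2 can be computed from a data matrix (first column the response, remaining columns the regressors) in a few lines of Python. This is a sketch, not the author's code:

```python
import numpy as np

def ols_summary(data):
    """Fit y on the remaining columns of a data matrix [y | X] by least
    squares and return (coefficients, R^2, adjusted R^2)."""
    y, X = data[:, 0], data[:, 1:]
    A = np.column_stack([np.ones(len(y)), X])    # add an intercept column
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    r2 = 1.0 - ss_res / ss_tot
    n, k = A.shape                               # observations, parameters
    adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - k)
    return beta, r2, adj_r2

# Invented example data: y = 1 + 2*x1 - x2 + noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + rng.normal(0.0, 0.1, 100)
beta, r2, adj_r2 = ols_summary(np.column_stack([y, X]))
print(np.round(beta, 2), round(r2, 3))
```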
129021 | Thu, 22 Dec 2011 05:00:00 Z | Marcus Davidsson

DirectSearch optimization package, version 2
http://www.maplesoft.com/applications/view.aspx?SID=101333&ref=Feed
<p>The DirectSearch package is a collection of commands for numerically computing local and global minima (maxima) of nonlinear multivariate functions, with or without constraints. The package's optimization methods are universal derivative-free direct-search methods, i.e. they do not require the objective function or constraints to be differentiable or continuous. The methods have quadratic convergence for quadratic functions.<br /><br />The package also contains commands for multiobjective optimization, solving systems of equations, and fitting nonlinear functions to data.<br /><br />The following is a summary of the major improvements in DirectSearch v.2:<br /><br />-- Three new derivative-free optimization methods have been added.<br />-- The new global optimization command GlobalOptima has been added.<br />-- Commands for multiobjective optimization, solving systems of equations, and fitting nonlinear functions to data have been added.<br />-- Mixed integer-discrete-continuous optimization is now supported.<br />-- Inequality constraints can now be specified as any Boolean expressions.<br />-- Bound constraints x>=a, x<=b can now be set as x=a..b.<br />-- Assume and assumption commands are supported for inequality constraints.<br />-- Problem variables can now be specified as a Vector.<br />-- High-dimensional optimization problems are now solved much faster.<br />-- Search in a space-curve direction has been added to all algorithms.<br />-- A penalty-function method has been added for optimization with inequality constraints.<br />-- The improved optimization algorithm for equality constraints is faster and more reliable.<br />-- The feasible-initial-point search has been improved.<br />-- The package is now compatible with Maple 12 and above.<br />-- A detailed description of the CDOS method in PDF format has been added.<br />-- A Russian version of the package is now available.</p>
101333 | Tue, 01 Feb 2011 05:00:00 Z | Dr. Sergey Moiseev

Comparison of Multivariate Optimization Methods
http://www.maplesoft.com/applications/view.aspx?SID=1718&ref=Feed
<p>The worksheet demonstrates the use of Maple to compare methods of unconstrained nonlinear minimization of a multivariable function. Seven methods of nonlinear minimization of the n-variable objective function f(x1,x2,...,xn) are analyzed:</p>
<p>1) minimum search by coordinate and conjugate-directions descent; 2) Powell's method; 3) the modified Hooke-Jeeves method; 4) the Nelder-Mead simplex method; 5) a quasi-gradient method; 6) random-directions search; 7) simulated annealing. All methods are direct-search methods, i.e. they do not require the objective function f(x1,x2,...,xn) to be differentiable and continuous. The efficiency of Maple's Optimization package is compared with these programs. The optimization methods are compared on a set of 21 test functions.</p>
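The seven methods are implemented in the Maple worksheet itself; to give the flavor of a direct-search method, here is a minimal coordinate (compass) search in Python. This is an invented sketch, not one of the implementations being compared:

```python
def compass_search(f, x0, step=0.5, shrink=0.5, tol=1e-8, max_iter=10000):
    """Minimize f by probing +/- step along each coordinate axis and
    shrinking the step when no axis move improves f.  No derivatives
    (or even continuity) of f are required."""
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            if step < tol:
                break
            step *= shrink          # stuck: refine the probe step
    return x, fx

# A simple quadratic test function with minimum at (1, -0.5).
f = lambda v: (v[0] - 1.0) ** 2 + 2.0 * (v[1] + 0.5) ** 2
x, fx = compass_search(f, [0.0, 0.0])
print([round(c, 6) for c in x], fx)
```

Real direct-search implementations (Hooke-Jeeves, Nelder-Mead, and the others listed above) add pattern moves or simplex reflections on top of this basic probe-and-shrink loop.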
<p>1) minimum search by coordinate and conjugate directions descent; 2) Powell's method; 3) the modified Hooke-Jeeves method; 4) simplex Nelder-Meed method; 5) quasi-gradient method; 6) random directions search; 7) simulated annealing. All methods are direct searching methods, i.e. they do not require the objective function f(x1,x2,.,xn) to be differentiable and continuous. Maple's Optimization package efficiency is compared with these programs. Optimization methods have been compared on the set of 21 test functions.</p>1718Tue, 15 Sep 2009 04:00:00 ZLibLip - multivariate scattered data interpolation and smoothing
http://www.maplesoft.com/applications/view.aspx?SID=4854&ref=Feed
LibLip is a Maple toolbox that provides many methods for interpolating scattered data (with or without preprocessing) using only the data itself and one additional parameter, the Lipschitz constant (essentially an upper bound on the slope of the function). The Lipschitz constant can be estimated automatically from the data.
LibLip also provides approximation methods using locally Lipschitz functions.
If the data contain noise, they can be smoothed using special techniques that rely on linear programming. The Lipschitz constant can also be estimated from noisy data by using sample splitting and cross-validation.
In addition, LibLip accommodates monotonicity and range constraints. It is useful for approximating functions that are known to be monotone with respect to all or a subset of the variables, as well as monotone only on parts of the domain. Range constraints accommodate non-constant bounds on the values of the data and the interpolant.
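LibLip's own commands are not shown in this summary, but the central idea can be sketched in one dimension: with Lipschitz constant M, the tightest upper and lower bounds consistent with the data are cones through the data points, and their average interpolates the data. A Python illustration (not LibLip's code; the data and M are invented):

```python
def lipschitz_interpolant(xs, fs, M):
    """Return the central Lipschitz interpolant of the 1-D data (xs, fs)
    for a function with Lipschitz constant M: the average of the tightest
    upper and lower Lipschitz-consistent bounds through the data points."""
    def g(x):
        upper = min(f + M * abs(x - xi) for xi, f in zip(xs, fs))
        lower = max(f - M * abs(x - xi) for xi, f in zip(xs, fs))
        return 0.5 * (upper + lower)
    return g

# M must be at least the steepest slope in the data (here 1.2).
xs = [0.0, 0.5, 1.0]
fs = [0.0, 0.4, 1.0]
g = lipschitz_interpolant(xs, fs, M=2.0)
print(g(0.5), g(0.75))
```

At the data points the upper and lower bounds coincide, so g reproduces the data exactly; between them it returns the midpoint of the band of all M-Lipschitz functions through the data.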
4854 | Fri, 29 Dec 2006 00:00:00 Z | Dr. Gleb Beliakov

The Population of Mexican United States. Part I
http://www.maplesoft.com/applications/view.aspx?SID=4850&ref=Feed
The principal goal of this document is to find mathematical models for the growth of the Mexican population from 1921 to 1995, using the Maple 10 Statistics and DEtools packages, and to predict the population for the years 2000 and 2005.
4850 | Thu, 14 Dec 2006 00:00:00 Z | Prof. David Macias Ferrer

RanLip - black-box non-uniform random variate generator
http://www.maplesoft.com/applications/view.aspx?SID=4849&ref=Feed
RanLip is a toolbox for generating non-uniform random variates from arbitrary Lipschitz-continuous distributions in the Maple environment. It uses the acceptance/rejection approach, which is based on approximating the probability density function from above with a "hat" function. RanLip provides very fast preprocessing and generation times and yields a small rejection constant. It exhibits good performance for up to five variables, and provides the user with a black-box non-uniform random variate generator for a large class of distributions, in particular multimodal distributions.
4849 | Wed, 13 Dec 2006 00:00:00 Z | Dr. Gleb Beliakov

Polynomial Regression through Least Square Method
http://www.maplesoft.com/applications/view.aspx?SID=4845&ref=Feed
The goal of this document is to show the approximation of a point dispersion by quadratic regression polynomials, using the least-squares method and Maple 10 tools.
4845 | Tue, 21 Nov 2006 00:00:00 Z | Prof. David Macias Ferrer

Aspherical Lens Surface Identification - Non-Linear Fitting with the Global Optimization Toolbox
http://www.maplesoft.com/applications/view.aspx?SID=4806&ref=Feed
<p>In this application demonstration, we investigate aspherical lenses and apply non-linear fitting to obtain an accurate representation of the given data in the form of a function, using the Global Optimization Toolbox for Maple.</p>
4806 | Mon, 31 Jul 2006 04:00:00 Z | Maplesoft

An Algorithm For Fitting Nonlinear Piecewise Functions, with an Application to Animal Growth
http://www.maplesoft.com/applications/view.aspx?SID=4294&ref=Feed
This worksheet shows how to find the parameters of a piecewise nonlinear function fitting a given data set. As an example, we fit a model for the growth of a bull to actual growth data. We show an improvement in the fit of our method over the classical Gompertz function.
4294 | Wed, 14 Aug 2002 11:53:55 Z | Stanislav Barton

Finding Least-Squares Fit Curve and Plotting in 3D
http://www.maplesoft.com/applications/view.aspx?SID=4201&ref=Feed
Given data in [x, y, z] form, find and plot the best linear least-squares fit to it.
4201 | Tue, 15 Jan 2002 14:35:44 Z | Yufang Hao

Fitting a Strength Function by the Levenberg-Marquardt Method
http://www.maplesoft.com/applications/view.aspx?SID=4160&ref=Feed
The procedure mnlfit, based on the Levenberg-Marquardt method, is used to fit a function relating adhesive bond strength to three variables. The maximum strength, and the values of the variables that produce it, are estimated.
4160 | Mon, 05 Nov 2001 10:40:45 Z | J. M. Redwood

Taylor-Newton-Gauss method for nonlinear regression
http://www.maplesoft.com/applications/view.aspx?SID=3663&ref=Feed
A program to find the unknown parameters of a nonlinear or linear function from a series of observations {X, Y}.
3663 | Tue, 19 Jun 2001 00:00:00 Z | Wayne Allen

Fitting a circle to data using a linear model
http://www.maplesoft.com/applications/view.aspx?SID=3776&ref=Feed
This worksheet demonstrates how Maple can be used to fit a circle to a set of data points. If a set of data points is known to lie on a circle, the average squared distance of the points from the circle should be minimized. This problem can be written so that it is linear in the parameters to be determined.
3776 | Tue, 19 Jun 2001 00:00:00 Z | Thomas Schramm
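The worksheet is in Maple; the linearization it describes can be sketched in Python: writing (x-a)^2 + (y-b)^2 = r^2 as x^2 + y^2 = 2a*x + 2b*y + c with c = r^2 - a^2 - b^2 makes the problem linear in (2a, 2b, c). The data below are invented for the example:

```python
import numpy as np

def fit_circle(x, y):
    """Fit a circle to the points (x, y) by linear least squares on the
    algebraic form x^2 + y^2 = 2a*x + 2b*y + c, then recover (a, b, r)."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    (two_a, two_b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    a, b = two_a / 2.0, two_b / 2.0
    r = np.sqrt(c + a**2 + b**2)
    return a, b, r

# Noisy points on a circle of radius 2 centred at (1, -1).
rng = np.random.default_rng(2)
t = rng.uniform(0.0, 2.0 * np.pi, 100)
x = 1.0 + 2.0 * np.cos(t) + rng.normal(0.0, 0.01, t.size)
y = -1.0 + 2.0 * np.sin(t) + rng.normal(0.0, 0.01, t.size)
a, b, r = fit_circle(x, y)
print(round(a, 2), round(b, 2), round(r, 2))
```

This algebraic formulation minimizes a linearized residual rather than the exact geometric distance, but for points that nearly lie on a circle the two agree closely.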