Steepest ascent method for multivariate optimization - Maple Application Center

Steepest ascent method for multivariate optimization

Author: William Richardson
This worksheet solves nonlinear optimization problems by the method of steepest ascent. Given a function f(x,y) and a current point (x0,y0), the search direction is taken to be the gradient of f(x,y) at (x0,y0). The step length is then chosen by a line search, i.e., as the value that maximizes f(x,y) along the gradient direction.
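The procedure described above can be sketched outside Maple as well. The following Python code is a minimal illustration, not the worksheet's actual implementation: it approximates the gradient by central differences, performs the line search by golden-section search, and assumes a fixed step bracket [0, 1] (a real implementation would bracket the maximum adaptively). All function and parameter names are chosen here for illustration.

```python
import math

def gradient(f, p, h=1e-6):
    """Central-difference approximation of the gradient of f at point p."""
    g = []
    for i in range(len(p)):
        q_plus = list(p); q_plus[i] += h
        q_minus = list(p); q_minus[i] -= h
        g.append((f(q_plus) - f(q_minus)) / (2 * h))
    return g

def golden_line_search(phi, a=0.0, b=1.0, tol=1e-8):
    """Golden-section search for the maximizer of phi on the interval [a, b]."""
    r = (math.sqrt(5) - 1) / 2
    c, d = b - r * (b - a), a + r * (b - a)
    fc, fd = phi(c), phi(d)
    while b - a > tol:
        if fc < fd:                      # maximum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + r * (b - a)
            fd = phi(d)
        else:                            # maximum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - r * (b - a)
            fc = phi(c)
    return (a + b) / 2

def steepest_ascent(f, p0, max_iter=100, tol=1e-6):
    """Maximize f from starting point p0 by gradient steps with line search."""
    p = list(p0)
    for _ in range(max_iter):
        g = gradient(f, p)
        if math.sqrt(sum(gi * gi for gi in g)) < tol:
            break                        # gradient is (nearly) zero: stop
        # choose the step length t that maximizes f along the gradient direction
        t = golden_line_search(lambda t: f([pi + t * gi for pi, gi in zip(p, g)]))
        p = [pi + t * gi for pi, gi in zip(p, g)]
    return p
```

For example, maximizing f(x,y) = 5 - (x-1)^2 - (y-2)^2 from (0,0), the gradient there is (2,4) and the line search along that direction lands on the maximizer (1,2) in a single step; for less symmetric functions the iteration produces the familiar zig-zag path toward the optimum.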

Application Details

Publish Date: January 02, 2002
Created In: Maple 7
Language: English

More Like This

Classroom Tips and Techniques: Bivariate Limits - Then and Now
Classroom Tips and Techniques: Introduction to Maple's GraphTheory Package
Classroom Tips and Techniques: Steepest-Ascent Curves
Classroom Tips and Techniques: Partial Derivatives by Subscripting
DirectSearch optimization package, version 2
The Nelder-Mead Method for Optimization in Two Dimensions
Classroom Tips and Techniques: An Inequality-Constrained Optimization Problem
Fibonacci search method for unimodal optimization