Prof. William Fox: New Applications
https://www.maplesoft.com/applications/author.aspx?mid=84921
© 2021 Maplesoft, A Division of Waterloo Maple Inc. (Maplesoft Document System)
Tue, 11 May 2021 03:12:22 GMT
New applications published by Prof. William Fox
Game Theory in Maple
https://www.maplesoft.com/applications/view.aspx?SID=154131&ref=Feed
This worksheet template lets the user enter a total-conflict or partial-conflict game and have it solved. The template also solves for the prudential strategies/security levels and for the Nash arbitration solution.
Published: Thu, 07 Jul 2016. Author: Prof. William Fox.
Steepest Ascent Method
https://www.maplesoft.com/applications/view.aspx?SID=154132&ref=Feed
This worksheet applies the gradient search method to multi-variable maximization problems; it is an update of our earlier version in the Application Center, and it solves nonlinear optimization problems by the method of steepest ascent.
Given a function f(x,y) and a current point (x0, y0), the search direction is taken to be the gradient of f(x,y) at (x0, y0). The step length is computed by a line search, i.e. as the step length that maximizes f(x,y) along the gradient direction.
Published: Thu, 07 Jul 2016. Authors: Dr. William Hank Richardson and Dr. Henk Koppelaar.
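The loop described here can be sketched in Python (the original is a Maple worksheet). The quadratic objective and the starting point below are illustrative assumptions; the line search uses SciPy's scalar minimizer on -f along the gradient direction:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def f(p):
    # Illustrative objective (an assumption, not the worksheet's example);
    # its unique maximum is at (1, -2).
    x, y = p
    return -(x - 1)**2 - (y + 2)**2

def grad_f(p, h=1e-6):
    # Central-difference approximation of the gradient of f at p.
    g = np.zeros(len(p))
    for i in range(len(p)):
        e = np.zeros(len(p))
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

def steepest_ascent(p0, tol=1e-6, max_iter=100):
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(p)
        if np.linalg.norm(g) < tol:
            break
        # Line search: pick the step t that maximizes f along the gradient
        # direction (minimize -f, since minimize_scalar minimizes).
        t = minimize_scalar(lambda t: -f(p + t * g)).x
        p = p + t * g
    return p

print(steepest_ascent([0.0, 0.0]))  # approaches the maximum at (1, -2)
```

For this concave quadratic the exact line search lands on the maximizer in a single step; on general functions the loop runs until the gradient norm falls below the tolerance.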
Newton's Method for optimization in two variables
https://www.maplesoft.com/applications/view.aspx?SID=4199&ref=Feed
This application uses Newton's root-finding procedure to find a critical point of a two-variable function from an initial point.
Published: Mon, 14 Jan 2002. Author: William Richardson.
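The idea can be sketched in Python (rather than Maple): apply Newton's root-finding iteration to the system grad f = 0, using the Hessian as the Jacobian. The function f(x, y) = x^4 + y^4 - 4xy and the starting point are illustrative assumptions, not the application's built-in example:

```python
import numpy as np

# For f(x, y) = x**4 + y**4 - 4*x*y, the gradient and Hessian are known
# in closed form; (1, 1) is a critical point.
def grad_f(p):
    x, y = p
    return np.array([4*x**3 - 4*y, 4*y**3 - 4*x])

def hess_f(p):
    x, y = p
    return np.array([[12*x**2, -4.0],
                     [-4.0, 12*y**2]])

def newton_critical_point(p0, tol=1e-10, max_iter=50):
    # Newton's method for the root-finding problem grad f = 0:
    # solve H(p) s = -grad f(p), then step to p + s.
    p = np.asarray(p0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(p)
        if np.linalg.norm(g) < tol:
            break
        p = p + np.linalg.solve(hess_f(p), -g)
    return p

print(newton_critical_point([1.2, 1.2]))  # converges to the critical point (1, 1)
```

Note that Newton's method finds any critical point (maximum, minimum, or saddle) near the initial guess; classifying it requires checking the Hessian's definiteness afterwards.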
Steepest ascent method for multivariate optimization
https://www.maplesoft.com/applications/view.aspx?SID=4194&ref=Feed
This worksheet solves nonlinear optimization problems by the method of steepest ascent. Given a function f(x,y) and a current point (x0, y0), the search direction is taken to be the gradient of f(x,y) at (x0, y0). The step length is computed by a line search, i.e. as the step length that maximizes f(x,y) along the gradient direction.
Published: Wed, 02 Jan 2002. Author: William Richardson.
Fibonacci search method for unimodal optimization
https://www.maplesoft.com/applications/view.aspx?SID=4193&ref=Feed
This program performs the Fibonacci line search algorithm to find the maximum of a unimodal function f(x) over an interval a <= x <= b. The program calculates the number of iterations required to ensure that the final interval is within the user-specified tolerance.
Published: Wed, 02 Jan 2002. Author: Prof. William Fox.
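A minimal Python sketch of the method (the original is a Maple program). As described, the Fibonacci sequence is grown until it exceeds (b - a)/tol, which determines the iteration count in advance; the objective and tolerance below are illustrative assumptions:

```python
def fibonacci_search(f, a, b, tol):
    # Build Fibonacci numbers until F_n > (b - a)/tol; this fixes, in
    # advance, the number of iterations needed so the final interval
    # is shorter than the tolerance (final length = (b - a)/F_n).
    fib = [1.0, 1.0]
    while fib[-1] <= (b - a) / tol:
        fib.append(fib[-1] + fib[-2])
    n = len(fib) - 1
    # Two interior points placed at Fibonacci ratios of the interval.
    x1 = a + (fib[n - 2] / fib[n]) * (b - a)
    x2 = a + (fib[n - 1] / fib[n]) * (b - a)
    f1, f2 = f(x1), f(x2)
    for _ in range(n - 2):
        if f1 < f2:
            # f is unimodal, so the maximum lies in [x1, b].
            a = x1
            x1, f1 = x2, f2        # reuse the surviving evaluation
            x2 = b - (x1 - a)      # symmetric partner of x1
            f2 = f(x2)
        else:
            # The maximum lies in [a, x2].
            b = x2
            x2, f2 = x1, f1
            x1 = a + (b - x2)      # symmetric partner of x2
            f1 = f(x1)
    return (a + b) / 2

# Hypothetical example: maximize f(x) = -(x - 2)^2 on [0, 5].
print(fibonacci_search(lambda x: -(x - 2)**2, 0.0, 5.0, 1e-4))
```

Because the two interior points are placed symmetrically, each iteration discards one end of the interval and needs only one new function evaluation.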
Golden Section search technique for unimodal optimization
https://www.maplesoft.com/applications/view.aspx?SID=3669&ref=Feed
The Golden Section search technique for unimodal optimization.
Published: Mon, 18 Jun 2001. Author: Margie Witherspoon.
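A minimal Python sketch of the technique (the original is a Maple worksheet). Interior points are placed at the inverse golden ratio of the interval, so one of them can be reused at every step; the example function is an illustrative assumption:

```python
import math

def golden_section_max(f, a, b, tol=1e-6):
    # 1/phi = (sqrt(5) - 1)/2: each step keeps about 61.8% of the interval.
    invphi = (math.sqrt(5) - 1) / 2
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:
            # Maximum lies in [x1, b]; the old x2 becomes the new x1,
            # so only one new function evaluation is needed per step.
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
        else:
            # Maximum lies in [a, x2]; the old x1 becomes the new x2.
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
    return (a + b) / 2

# Hypothetical example: maximize sin(x) on [0, 3]; the maximum is at pi/2.
print(golden_section_max(math.sin, 0.0, 3.0))
```

Unlike the Fibonacci search above, the reduction ratio is constant, so no iteration count needs to be fixed in advance; the loop simply runs until the bracket is shorter than the tolerance.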