Optimization Subroutines
   Conjugate Gradient Optimization Method
   Double Dogleg Optimization Method
   Nelder-Mead Simplex Optimization Method
   Newton-Raphson Optimization Method
   Newton-Raphson Ridge Optimization Method
   (Dual) Quasi-Newton Optimization Method
   Quadratic Optimization Method
   Trust-Region Optimization Method

Least-Squares Subroutines
   Hybrid Quasi-Newton Least-Squares Methods
   Levenberg-Marquardt Least-Squares Method

Supplementary Subroutines
   Approximate Derivatives by Finite Differences
   Feasible Point Subject to Constraints
Note: The names of the optional arguments can be used as keywords. For example, the following two statements are equivalent:

   call nlpnrr(rc,xr,"fun",x0,,,ter,,,"grad");
   call nlpnrr(rc,xr,"fun",x0) tc=ter grd="grad";
All of the optimization subroutines require at least two input arguments, and optional arguments can be specified either positionally or with the keyword=argument syntax shown above. All of the optimization subroutines return the following results:

   - the scalar return code rc, which indicates the reason for termination: a positive value indicates successful termination, and a negative value indicates unsuccessful termination
   - the row vector xr, which contains the optimal point when termination is successful (rc > 0)
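As a concrete illustration, the following is a minimal sketch of a complete Newton-Raphson ridge (NLPNRR) optimization. The Rosenbrock objective function, starting point, and option settings are assumptions chosen for illustration; they do not appear in the original text:

   proc iml;
      /* Objective function: the two-dimensional Rosenbrock test
         function, an illustrative assumption */
      start fun(x);
         y = 100*(x[2] - x[1]##2)##2 + (1 - x[1])##2;
         return(y);
      finish fun;

      x0  = {-1.2 1};     /* required input: starting point */
      opt = {0 2};        /* opt[1]=0: minimize; opt[2]=2: print level */

      /* optional arguments passed with keyword=argument syntax */
      call nlpnrr(rc, xr, "fun", x0) opt=opt;

      if rc > 0 then print "Successful termination:" xr;
      else print "Unsuccessful termination, rc =" rc;
   quit;

After the call, rc can be inspected to verify successful termination before xr is used as the optimal point.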
Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.