Language Reference

NLPNRA Call

nonlinear optimization by Newton-Raphson method

CALL NLPNRA( rc, xr, "fun", x0 <,opt, blc, tc, par, "ptit", "grd", "hes">);

See "Nonlinear Optimization and Related Subroutines" for a listing of all NLP subroutines. See Chapter 11, "Nonlinear Optimization Examples," for a description of the inputs to and outputs of all NLP subroutines.

The NLPNRA algorithm uses a pure Newton step at each iteration when both the Hessian is positive definite and the Newton step successfully reduces the value of the objective function. Otherwise, it performs a combination of ridging and line-search to compute successful steps. If the Hessian is not positive definite, a multiple of the identity matrix is added to the Hessian matrix to make it positive definite (refer to Eskow & Schnabel 1991).
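
The following statements give a minimal sketch of the ridging idea only: a multiple of the identity matrix is added to the Hessian until the modified matrix is positive definite, and the resulting linear system is solved for the step. The actual NLPNRA implementation uses the modified Cholesky factorization of Eskow and Schnabel (1991); the module name RIDGED_NEWTON_STEP and the starting ridge value used here are illustrative assumptions, not part of the NLPNRA interface.

   start ridged_newton_step(g, G);
      /* g: gradient (row vector), G: Hessian matrix               */
      n = nrow(G);
      lambda = 0;
      /* increase the ridge until G + lambda*I is positive definite */
      do while (min(eigval(G + lambda*i(n))) <= 0);
         lambda = max(1e-4, 10*lambda);
      end;
      /* solve (G + lambda*I) * s = -g for the Newton-type step s   */
      s = -solve(G + lambda*i(n), g`);
      return(s);
   finish;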

The subroutine uses the gradient g^{(k)} = \nabla f(x^{(k)}) and the Hessian matrix G^{(k)} = \nabla^2 f(x^{(k)}), and it requires continuous first- and second-order derivatives of the objective function inside the feasible region. If second-order derivatives are computed efficiently and precisely, the NLPNRA method does not need many function, gradient, and Hessian calls, and it may perform well for medium to large problems.

Note that using only function calls to compute finite-difference approximations for second-order derivatives can be computationally very expensive and can introduce significant rounding errors. If you use the "grd" input argument to specify a module that computes first-order derivatives analytically, you can drastically reduce the computation time for numerical second-order derivatives. The computation of the finite-difference approximation for the Hessian matrix generally uses only n calls of the module that specifies the gradient.
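
For example, the following module (a sketch based on the Betts function used in the example later in this section; the name G_BETTS is an illustrative choice) returns the analytic gradient as a row vector. Passing its name in the "grd" argument lets the subroutine build the finite-difference Hessian from about n gradient calls instead of many more function calls.

   start G_BETTS(x);
      g = j(1, 2, 0.);
      g[1] = .02 * x[1];   /* df/dx1 of f = .01*x1*x1 + x2*x2 - 100 */
      g[2] = 2. * x[2];    /* df/dx2                                */
      return(g);
   finish G_BETTS;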

In each iteration, a line search is done along the search direction to find an approximate optimum of the objective function. The default line-search method uses quadratic interpolation and cubic extrapolation. You can specify other line-search algorithms with the fifth element of the opt argument. See "Options Vector" for details.
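
For example, the following options vector (a sketch; the value 3 is only an illustration of selecting an alternative method, and the missing values leave the intervening options at their defaults) requests minimization with print level 2 and sets the fifth element to choose the line-search algorithm:

   optn = {0 2 . . 3};   /* opt[5] selects the line-search algorithm */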

In unconstrained and boundary-constrained cases, the NLPNRA algorithm can take advantage of diagonal or sparse Hessian matrices that are specified by the input argument "hes". To use sparse Hessian storage, the ninth element of the opt argument must specify the number of nonzero Hessian elements returned by the Hessian module. See "Objective Function and Derivatives" for more details; a sketch of a dense analytic Hessian module is given after the following list.

In addition to the standard iteration history, the NLPNRA subroutine prints the following information:

- the step size computed by the line-search algorithm
- the slope of the search direction at the current iterate x^{(k)} (the inner product of the gradient and the search direction), which should be significantly negative for minimization
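
As a sketch of supplying analytic second-order derivatives through the "hes" argument, the following module returns the constant, dense 2x2 Hessian of the Betts function from the example below. The module names G_BETTS and H_BETTS are illustrative choices, and the consecutive commas in the call stand for the omitted optional arguments tc, par, and "ptit". For sparse Hessian storage you would additionally set the ninth element of the opt argument as described above.

   start H_BETTS(x);
      h = {0.02 0.,
           0.   2.};        /* constant Hessian of the Betts function */
      return(h);
   finish H_BETTS;

   call nlpnra(rc,xres,"F_BETTS",x,optn,con,,,,"G_BETTS","H_BETTS");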

The following statements invoke the NLPNRA subroutine to solve the constrained Betts optimization problem (see "Constrained Betts Function" ). The iteration history is shown in Figure 17.7.

proc iml;
   /* Betts objective function: f(x) = 0.01*x1*x1 + x2*x2 - 100 */
   start F_BETTS(x);
      f = .01 * x[1] * x[1] + x[2] * x[2] - 100.;
      return(f);
   finish F_BETTS;

   /* rows 1-2: bounds 2 <= x1 <= 50 and -50 <= x2 <= 50;
      row 3:    linear constraint 10*x1 - x2 >= 10         */
   con = {  2. -50.  .   .,
           50.  50.  .   .,
           10.  -1. 1. 10.};
   x = {-1. -1.};       /* starting point                   */
   optn = {0 2};        /* minimization, print level 2      */
   call nlpnra(rc,xres,"F_BETTS",x,optn,con);
   quit;
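
On return, rc is a scalar return code (a positive value indicates successful termination) and xres contains the final parameter estimates. A minimal check, placed before the QUIT statement, might look like the following sketch:

   if rc > 0 then print xres;
   else print "NLPNRA did not converge, rc =" rc;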

 
                         Optimization Start
                         Parameter Estimates

                            Gradient    Lower        Upper
                            Objective   Bound        Bound
  N Parameter    Estimate   Function    Constraint   Constraint
  1 X1           6.800000   0.136000    2.000000     50.000000
  2 X2          -1.000000  -2.000000   -50.000000    50.000000

                Value of Objective Function = -98.5376


                          Linear Constraints

  1   59.00000 :      10.0000  <=   +   10.0000 * X1        -    1.0000 * X2

           Newton-Raphson Optimization with Line Search

                    Without Parameter Scaling
              Gradient Computed by Finite Differences
         CRP Jacobian Computed by Finite Differences
                  Parameter Estimates               2
                  Lower Bounds                      2
                  Upper Bounds                      2
                  Linear Constraints                1

                   Optimization Start

   Active Constraints        0  Objective Function   -98.5376
   Max Abs Gradient Element  2

                                                             
                    Function        Active       Objective   
 Iter    Restarts      Calls   Constraints        Function   

    1           0          2             0       -98.81551     
    2*          0          3             0       -99.40840      
    3*          0          4             1       -99.87504    
    4           0          5             1       -99.96000     
    5           0          6             1       -99.96000  


        Objective    Max Abs              Slope of
         Function   Gradient      Step      Search
 Iter      Change    Element      Size   Direction

    1      0.2779     1.8000     0.100      -2.925
    2*     0.5929     1.2713     0.294      -2.365
    3*     0.4666     0.5829     0.542      -1.181
    4      0.0850   0.000039     1.000      -0.170
    5     3.9E-10   9.537E-7     1.000     -76E-11

                     Optimization Results

   Iterations                           5  Function Calls            7
   Hessian Calls                        6  Active Constraints        1
   Objective Function              -99.96  Max Abs Gradient Element  0
   Slope of Search Direction -7.64376E-10  Ridge                     0

   GCONV convergence criterion satisfied.


                      Optimization Results
                      Parameter Estimates

                                      Gradient    Active
                                      Objective    Bound
  N Parameter         Estimate        Function    Constraint

  1 X1                2.000000        0.040000     Lower BC
  2 X2            -0.000000196               0

              Value of Objective Function = -99.96



        Linear Constraints Evaluated at Solution

 1     10.00000  =  -10.0000 + 10.0000 * X1 - 1.0000 * X2

Figure 17.7: Iteration History for the NLPNRA Subroutine


Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.