Introduction to Optimization

PROC NLP

The NLP procedure (NonLinear Programming) offers a set of optimization techniques for minimizing or maximizing a continuous nonlinear function f(x) of n decision variables x = (x1, ... ,xn)T, subject to lower and upper bound constraints and to linear and nonlinear equality and inequality constraints. This can be expressed as solving

\[
\begin{array}{lll}
\min_{x \in \mathcal{R}^n} & f(x) & \\
\text{subject to} & c_i(x) = 0, & i = 1, \dots, m_e \\
& c_i(x) \ge 0, & i = m_e + 1, \dots, m \\
& u_i \ge x_i \ge l_i, & i = 1, \dots, n
\end{array}
\]
where f is the objective function, the ci's are the constraint functions, and the ui's and li's are the upper and lower bounds. Problems of this type arise in many settings, ranging from optimal control to maximum likelihood estimation.
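For instance, a small two-variable instance of this general form (the functions and bounds here are chosen purely for illustration) is

\[
\begin{array}{ll}
\min_{x \in \mathcal{R}^2} & (x_1 - 1)^2 + (x_2 - 2)^2 \\
\text{subject to} & x_1 + x_2 - 1 = 0 \\
& 0 \le x_1 \le 1, \quad 0 \le x_2 \le 1
\end{array}
\]

with one equality constraint (me = m = 1) and bounds l = (0,0)T and u = (1,1)T.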

The NLP procedure provides a number of algorithms for solving this problem that take advantage of special structure on the objective function or constraints, or both. One example is the quadratic programming problem:

\[
f(x) = \frac{1}{2} \, x^T G x + g^T x + b
\quad \text{subject to} \quad c_i(x) = 0, \quad i = 1, \dots, m_e
\]
where the ci(x)'s are linear functions, g = (g1, ... ,gn)T is a vector, b is a scalar constant, and G is an n × n symmetric matrix.
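For illustration, such a quadratic program can be posed directly with the same programming statements used later in this section; the particular G = diag(4, 2), g = (3, -2)T, and constraint in this minimal sketch are hypothetical:

proc nlp;
   min f;
   decvar x1 x2;
   lincon x1 + x2 = 1;   /* linear equality constraint c1(x) = 0 */
   /* (1/2) x'Gx + g'x with G = diag(4,2) and g = (3,-2)' */
   f = .5 * (4 * x1 * x1 + 2 * x2 * x2) + 3 * x1 - 2 * x2;
run;

Alternatively, the arrays defining a quadratic programming problem can be supplied in the INQUAD= data set, described below.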

Another example is the least-squares problem:

\[
f(x) = \frac{1}{2} \left\{ f_1^2(x) + \cdots + f_l^2(x) \right\}
\quad \text{subject to} \quad c_i(x) = 0, \quad i = 1, \dots, m_e
\]
where the ci(x)'s are linear functions, and f1(x), ... ,fl(x) are nonlinear functions of x.

The following optimization techniques are supported in PROC NLP:

  - Quadratic Active Set Technique
  - Trust-Region Method
  - Newton-Raphson Method with Line Search
  - Newton-Raphson Method with Ridging
  - Quasi-Newton Methods
  - Double-Dogleg Method
  - Conjugate Gradient Methods
  - Nelder-Mead Simplex Method
  - Levenberg-Marquardt Method
  - Hybrid Quasi-Newton Methods

These optimization techniques require a continuous objective function f, and all but one (NMSIMP) require continuous first-order derivatives of the objective function f. Some of the techniques also require continuous second-order derivatives. There are three ways to compute derivatives in PROC NLP:

  1. analytically, using a special derivative compiler (the default method)
  2. via finite-difference approximations
  3. via user-supplied exact or approximate numerical functions (illustrated in the sketch following this list)
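As a sketch of the third option, the GRADIENT statement names program variables that hold the analytic first-order partial derivatives with respect to the decision variables; the objective below is purely illustrative:

proc nlp;
   min f;
   decvar x1 x2;
   gradient gx1 gx2;            /* gx1 = df/dx1, gx2 = df/dx2 */
   f   = x1 * x1 + 3 * x2 * x2;
   gx1 = 2 * x1;                /* user-supplied exact derivatives */
   gx2 = 6 * x2;
run;

Second-order and Jacobian information can be supplied with analogous statements (HESSIAN, JACOBIAN, CRPJAC), and finite-difference approximations are requested with the FD= option in the PROC NLP statement.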

Nonlinear programs can be input into the procedure in various ways. The objective, constraint, and derivative fucntions are specified using the programming statements of PROC NLP. In addition, information in SAS data sets can be used to define the structure of objectives and constraints as well as specify constants used in objectives, constraints, and derivatives.

PROC NLP uses data sets to input various pieces of information: the DATA= data set provides data shared by the functions of a least-squares problem, the INQUAD= data set contains the arrays that define a quadratic programming problem, the INEST= data set specifies initial values for the decision variables along with simple boundary and general linear constraints, and the MODEL= data set supplies a model (functions, constraints, derivatives) saved from a previous PROC NLP execution.

PROC NLP uses data sets to output various results: the OUT= data set contains variables generated in the program statements, the OUTEST= data set saves the values of the decision variables and derivatives at the solution, and the OUTMODEL= data set saves the programming statements so that they can be reused through the MODEL= data set.
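As a minimal sketch (the data set name SOLOUT and the objective are illustrative), the solution can be captured with the OUTEST= option and then printed:

proc nlp outest=solout;      /* save decision variables and solution status */
   min f;
   decvar x1 x2;
   f = x1 * x1 + x2 * x2;    /* illustrative objective */
run;

proc print data=solout;      /* inspect the saved results */
run;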



Figure 1.5: Input and Output Data Sets in PROC NLP

As an alternative to supplying data in SAS data sets, some or all data for the model can be specified using SAS programming statements. These are similar to those used in the SAS DATA step.

Consider the simple example of minimizing the Rosenbrock function (Rosenbrock, 1960).

\[
f(x) = \frac{1}{2} \left\{ 100 \, (x_2 - x_1^2)^2 + (1 - x_1)^2 \right\}
     = \frac{1}{2} \left\{ f_1^2(x) + f_2^2(x) \right\},
\qquad x = (x_1, x_2)
\]
The minimum function value is f(x*) = 0 at x* = (1,1). This problem does not have any constraints.

The following PROC NLP run can be used to solve this problem:

proc nlp;
   min f;
   decvar x1 x2;
   f1 = 10 * (x2 - x1 * x1);
   f2 = 1 - x1;
   f  = .5 * (f1 * f1 + f2 * f2);
run;

The MIN statement identifies the symbol f that characterizes the objective function in terms of f1 and f2, and the DECVAR statement names the decision variables X1 and X2. Because no optimization technique is specified with the TECH= option, PROC NLP uses the Newton-Raphson method with ridging, the default algorithm when there are no constraints.

A better way to solve this problem is to take advantage of the fact that f is a sum of squares of f1 and f2 and to treat it as a least-squares problem. Using the LSQ statement instead of the MIN statement tells the procedure that this is a least-squares problem, which results in the use of one of the specialized algorithms for solving least-squares problems (for example, Levenberg-Marquardt).

proc nlp;
   lsq f1 f2;
   decvar x1 x2;
   f1 = 10 * (x2 - x1 * x1);
   f2 = 1 - x1;
run;

The LSQ statement causes PROC NLP to minimize half the sum of squares of the functions listed in the statement, in this case (1/2){f1^2(x) + f2^2(x)}.

The least-squares specification is preferred because it enables the procedure to exploit the structure of the problem for numerical stability and performance.

Several other PROC NLP statements specify additional parts of the model, such as bounds on the decision variables and linear and nonlinear constraints. The following is an example of a problem with bounds and with linear and nonlinear constraints:

proc nlp tech=QUANEW;
   min f;
   decvar x1 x2;
   bounds x1 - x2 <= .5;
   lincon x1 + x2 <= .6;
   nlincon c1 >= 0;

   c1 = x1 * x1 - 2 * x2;
 
   f1 = 10 * (x2 - x1 * x1);
   f2 = 1 - x1;

   f = .5 * (f1 * f1 + f2 * f2);
run;
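In this example, the BOUNDS statement imposes the upper bound 0.5 on each variable in the range list X1-X2, that is, on both X1 and X2 (the BOUNDS statement accepts only simple bounds on variables, not linear combinations, which belong in the LINCON statement). The LINCON statement adds the linear constraint x1 + x2 <= 0.6, and the NLINCON statement declares the nonlinear constraint c1(x) = x1^2 - 2 x2 >= 0, whose value is computed in the program statements. The TECH=QUANEW option selects a quasi-Newton technique, which is appropriate for problems with nonlinear constraints.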

