Nonlinear Optimization Examples

Kuhn-Tucker Conditions

The nonlinear programming (NLP) problem with one objective function f and m constraint functions c_i, which are continuously differentiable, is defined as follows:
\min f(x), \quad x \in {\cal R}^n, \quad \mbox{subject to}
c_i(x) = 0, \quad i = 1, \ldots, m_e
c_i(x) \ge 0, \quad i = m_e+1, \ldots, m
In the preceding notation, n is the dimension of the argument x of the function f(x), and m_e is the number of equality constraints. The linear combination of objective and constraint functions
L(x,\lambda) = f(x) - \sum_{i=1}^m \lambda_i c_i(x)
is the Lagrange function, and the coefficients \lambda_i are the Lagrange multipliers.
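As a concrete illustration of the Lagrange function, the following sketch evaluates L(x, \lambda) numerically for a small hypothetical problem; the objective and constraint are assumptions chosen for the example, not taken from the text:

```python
import numpy as np

# Hypothetical problem (an assumption for illustration):
#   minimize f(x) = (x0 - 2)^2 + x1^2
#   subject to one equality constraint c1(x) = x0 + x1 - 1 = 0  (m = m_e = 1)
def f(x):
    return (x[0] - 2.0) ** 2 + x[1] ** 2

def c(x):
    return np.array([x[0] + x[1] - 1.0])   # vector of constraint values c_i(x)

def lagrangian(x, lam):
    # L(x, lambda) = f(x) - sum_i lambda_i * c_i(x)
    return f(x) - lam @ c(x)

x = np.array([1.5, -0.5])          # feasible point: c(x) = 0
lam = np.array([-1.0])
print(lagrangian(x, lam))          # equals f(x) = 0.5 because c(x) = 0
```

At a feasible point every c_i(x) vanishes, so the Lagrange function reduces to the objective; the multipliers only matter once the constraints are violated or perturbed.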

If the functions f and c_i are twice continuously differentiable, the point x^* is an isolated local minimizer of the NLP problem if there exists a vector \lambda^* = (\lambda_1^*, \ldots, \lambda_m^*) that meets the following conditions:

First-order Kuhn-Tucker conditions:
\nabla_x L(x^*,\lambda^*) = 0
c_i(x^*) = 0, \quad i = 1, \ldots, m_e
c_i(x^*) \ge 0, \quad \lambda_i^* \ge 0, \quad \lambda_i^* c_i(x^*) = 0, \quad i = m_e+1, \ldots, m

Second-order condition: the Hessian of the Lagrange function, \nabla_x^2 L(x^*,\lambda^*), is positive definite on the null space of the gradients of the active constraints.

In practice, you cannot expect the constraint functions c_i(x^*) to vanish within machine precision, so determining the set of active constraints at the solution x^* may not be simple.
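This tolerance issue can be made concrete with a small numerical check of the first-order conditions. The helper below is a sketch, not part of SAS/IML; the candidate point and problem data are assumptions carried over from a hypothetical equality-constrained example:

```python
import numpy as np

def kkt_satisfied(grad_f, jac_c, c_vals, lam, m_e, tol=1e-8):
    """First-order Kuhn-Tucker check within a tolerance `tol`.

    grad_f : gradient of f at the candidate point, shape (n,)
    jac_c  : constraint Jacobian, row i is grad c_i, shape (m, n)
    c_vals : constraint values c_i(x), shape (m,)
    lam    : candidate Lagrange multipliers, shape (m,)
    m_e    : number of equality constraints
    """
    # Stationarity of the Lagrange function: grad f - sum_i lambda_i grad c_i = 0
    stationarity = grad_f - jac_c.T @ lam
    return bool(
        np.linalg.norm(stationarity, np.inf) <= tol          # grad_x L = 0
        and np.all(np.abs(c_vals[:m_e]) <= tol)              # equalities hold
        and np.all(c_vals[m_e:] >= -tol)                     # inequalities feasible
        and np.all(lam[m_e:] >= -tol)                        # multipliers nonnegative
        and np.all(np.abs(lam[m_e:] * c_vals[m_e:]) <= tol)  # complementarity
    )

# Candidate solution of: minimize (x0 - 2)^2 + x1^2 subject to x0 + x1 - 1 = 0.
x = np.array([1.5, -0.5])
lam = np.array([-1.0])              # equality multipliers may take either sign
grad_f = np.array([2 * (x[0] - 2.0), 2 * x[1]])
jac_c = np.array([[1.0, 1.0]])
c_vals = np.array([x[0] + x[1] - 1.0])
print(kkt_satisfied(grad_f, jac_c, c_vals, lam, m_e=1))  # True
```

Comparing |c_i(x)| against a tolerance rather than against exact zero is what makes the active-set decision workable in floating-point arithmetic; a constraint is treated as active when its value falls within that band.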


Copyright © 1999 by SAS Institute Inc., Cary, NC, USA. All rights reserved.