The NLP Procedure

Testing the Gradient Specification

There are three main ways to check the correctness of derivative specifications.

The algorithm of Wolfe (1982) is used to check whether the gradient g(x) specified by a GRADIENT statement (or indirectly by a JACOBIAN statement) is appropriate for the objective function f(x) specified by the program statements.

Using function and gradient evaluations in the neighborhood of the starting point x^{(0)}, second derivatives are approximated by finite difference formulas. Forward differences of gradient values are used to approximate the Hessian element G_{jk},

G_{jk} \approx H_{jk} = {g_j(x + \delta e_k) - g_j(x) \over \delta}
where \delta is a small step size and e_k = (0, \ldots, 0, 1, 0, \ldots, 0)^T is the unit vector along the kth coordinate axis. The test vector s, with
s_j = H_{jj} - {2 \over \delta} \{{f(x + \delta e_j) - f(x) \over \delta} - g_j(x)\}
contains the differences between two sets of finite difference approximations for the diagonal elements of the Hessian matrix
G_{jj} = \partial^2 f(x^{(0)}) / \partial x_j^2 , \quad j=1, \ldots, n
The test matrix \Delta H contains the absolute differences of symmetric elements in the approximate Hessian, |H_{jk} - H_{kj}|, j,k = 1, \ldots, n, generated by forward differences of the gradient elements.
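
To make the construction concrete, the following is a minimal Python/NumPy sketch of the quantities just defined. PROC NLP performs this test internally; the callables f and grad, the NumPy dependency, and the default step size delta here are illustrative assumptions, not part of the procedure's interface.

import numpy as np

def wolfe_gradient_check(f, grad, x0, delta=1e-6):
    # Return the test vector s and test matrix dH = |H - H^T| described
    # above, where H approximates the Hessian by forward differences of
    # the user-supplied gradient at the starting point x0.
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    f0 = f(x0)
    g0 = np.asarray(grad(x0), dtype=float)

    # H[j, k] ~ (g_j(x + delta*e_k) - g_j(x)) / delta
    H = np.empty((n, n))
    for k in range(n):
        e_k = np.zeros(n)
        e_k[k] = 1.0
        H[:, k] = (np.asarray(grad(x0 + delta * e_k), dtype=float) - g0) / delta

    # s_j = H_jj - (2/delta) * ((f(x + delta*e_j) - f(x)) / delta - g_j(x))
    s = np.empty(n)
    for j in range(n):
        e_j = np.zeros(n)
        e_j[j] = 1.0
        s[j] = H[j, j] - (2.0 / delta) * ((f(x0 + delta * e_j) - f0) / delta - g0[j])

    # Absolute differences of symmetric elements of the approximate Hessian
    dH = np.abs(H - H.T)
    return s, dH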

If the specification of the first derivatives is correct, the elements of the test vector and test matrix should be relatively small. The location of large elements in the test matrix points to erroneous coordinates in the gradient specification. For very large optimization problems, this algorithm can be too expensive in terms of computer time and memory.
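
As an illustration of how large elements localize an error, consider a hypothetical objective f(x) = x_1^2 + x_1 x_2 whose second gradient component is given a deliberate sign error (the correct component is x_1), checked with the sketch above:

f = lambda x: x[0]**2 + x[0]*x[1]
bad_grad = lambda x: np.array([2*x[0] + x[1], -x[0]])  # wrong sign in component 2

s, dH = wolfe_gradient_check(f, bad_grad, x0=[1.0, 1.0])
# s[1] is far from zero and dH[0, 1] = |H_12 - H_21| is large, both
# implicating the second gradient component; s[0] stays near zero.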

