From: Erik Olofsen <E.Olofsen@lumc.nl>
Subject: RE: Slow Gradient Method.
Date: Thu, 31 May 2001 10:01:10 +0200 (CEST)
Dear Vladimir,
I came across the same phenomenon a while ago:
http://www.cognigencorp.com/nonmem/nm/99apr042001.html
Suddenly some components of the gradient vector become very large; one or two iterations later the same may happen to other components, and sometimes the problem even disappears after a few iterations. The magnitude of the largest values depends on the number of significant digits, and I have obtained successful convergence and a successful covariance step with, e.g., SIG=3. That is why I got the feeling it has something to do with the precision of the numerical derivatives of the prediction with respect to the thetas.

In PRED, first and second analytical derivatives with respect to the etas need to be computed, the second derivatives only when the NUMERICAL option is not used. How is the SLOW option different from the NUMERICAL option?
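As an aside, here is a small sketch (my own illustration, not NONMEM's internals) of the mechanism I suspect: when the objective or prediction is only available to a few significant digits, a forward-difference derivative can suddenly jump to a very large spurious value whenever the rounded function values straddle a quantization boundary, while with more digits the estimate stays accurate. The function, step size, and digit counts below are arbitrary choices for the demonstration.

```python
import math

def round_sig(x, sig):
    # Round x to `sig` significant digits, mimicking a prediction
    # that is only computed to limited precision.
    if x == 0.0:
        return 0.0
    return round(x, sig - 1 - int(math.floor(math.log10(abs(x)))))

def forward_diff(f, x, h=1e-4):
    # Forward-difference approximation of f'(x).
    return (f(x + h) - f(x)) / h

def worst_gradient_error(sig, h=1e-4):
    # Scan x over [0, 1) for f(x) = exp(x), whose true derivative
    # is exp(x), and report the worst finite-difference error when
    # f is only available to `sig` significant digits.
    f = lambda t: round_sig(math.exp(t), sig)
    return max(abs(forward_diff(f, i * h, h) - math.exp(i * h))
               for i in range(10000))

# With 3 significant digits the rounded values move in steps of 0.01,
# so whenever [f(x), f(x+h)] crosses a rounding boundary the quotient
# is ~0.01/1e-4 = 100, far from the true derivative (at most e ~ 2.72).
print("3 significant digits :", worst_gradient_error(3))
print("12 significant digits:", worst_gradient_error(12))
```

The spikes are intermittent because they only occur when the perturbed point happens to cross a rounding boundary, which would be consistent with large gradient components appearing for some thetas on one iteration and for others on the next.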
Best regards,
Erik Olofsen