Prediction Curve
In NONMEM IV as distributed, the number of data records in a single
individual record must not exceed 400. I wonder if the additional data
records sometimes cause this restriction to be exceeded. NONMEM IV
does not detect violations of this restriction, and might crash with an
I/O error of some sort.
Dr. Harnisch should examine his data set. If there are more than 400
data records in some individual records, he can remove the excess
records. Alternatively, the NONMEM source code can be changed to
allow more records. This is described in Users Guide III, 3.2.9,
"Changing the Size of NONMEM Buffers". Buffers 1 and 2 must be
changed.
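Since NONMEM IV does not detect the violation itself, the count can be
checked outside NONMEM before the run. A minimal sketch in Python,
assuming a comma-delimited data file with one data record per line and
the ID item in the first column (both are assumptions about the actual
file, not something NONMEM requires):

```python
from collections import Counter

def records_per_individual(path, id_col=0, sep=","):
    """Count data records per individual in a NONMEM data file.

    Assumes one data record per line, with the ID data item in
    column `id_col` (0-based) and fields separated by `sep`.
    """
    counts = Counter()
    with open(path) as f:
        for line in f:
            fields = line.strip().split(sep)
            # Skip blank lines; count each record under its ID value.
            if fields and fields[id_col]:
                counts[fields[id_col]] += 1
    return counts

# Flag any individual record that exceeds the 400-data-record limit:
# over = {i: n for i, n in records_per_individual("data.csv").items()
#         if n > 400}
```

Any ID appearing in `over` points to an individual record that would
exceed the buffer size as distributed.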
He says he adds records such as:
EVID=0, PCMT=CMT=2, CONC=DV=0, and MDV=1
or
EVID=2, PCMT=CMT=2, CONC=DV=0, and MDV=1
Ruedi Port was basically correct when he responded:
> (To me, EVID=2 seems to be the more correct version for the added
> lines with MDV=1.)
EVID=2 is more in the spirit of how things should be done, but in fact
the effect on the run itself would be insignificant. (There is a
difference if the $ERROR block specifies CALLFL=0, but few people
use this feature.)
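Incidentally, the extra records need not be typed by hand. A small
Python sketch of generating them, assuming the data set is held as a
list of dicts with the column names shown (the names ID, TIME, EVID,
CMT, PCMT, DV, MDV and the helper `add_prediction_records` are
illustrative assumptions, not part of NONMEM):

```python
def add_prediction_records(rows, subject_id, times):
    """Append one prediction-only record per requested time.

    Each added record uses EVID=2 (other-type event), CMT=PCMT=2,
    DV=0, and MDV=1, as discussed above.
    """
    for t in times:
        rows.append({
            "ID": subject_id, "TIME": t,
            "EVID": 2, "CMT": 2, "PCMT": 2,
            "DV": 0, "MDV": 1,
        })
    return rows

# Usage: start from an existing record and add prediction points.
rows = [{"ID": 1, "TIME": 0.0, "EVID": 1, "CMT": 1, "PCMT": 0,
         "DV": 0, "MDV": 1}]
rows = add_prediction_records(rows, 1, [1.0, 2.0, 4.0])
```

Remember that every record added this way counts toward the 400-record
limit mentioned above.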
Ruedi is also correct in advising Dr. Harnisch to omit the $ESTIM
record when the only purpose of the run is to obtain predictions
at additional data points. This also avoids the problem of the
search being unable to proceed because it is already at the minimum.
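As an illustration, a control stream along these lines produces
predictions only. The model, data file name, and initial estimates
below are placeholders, not Dr. Harnisch's actual run; the point is
simply that no $ESTIM record appears, so PRED is evaluated at the
initial estimates given in the control stream:

```
$PROB  Predictions at additional time points (no estimation)
$INPUT ID TIME AMT EVID CMT PCMT DV MDV
$DATA  harnisch.dat
$SUBROUTINES ADVAN2 TRANS2
$PK
 KA = THETA(1)*EXP(ETA(1))
 CL = THETA(2)*EXP(ETA(2))
 V  = THETA(3)*EXP(ETA(3))
$ERROR
 Y = F*(1+EPS(1))
; No $ESTIM record: the search is skipped and predictions are
; computed at the initial estimates below.
$THETA 0.5 1.0 10.0
$OMEGA 0.04 0.04 0.04
$SIGMA 0.04
$TABLE ID TIME PRED FILE=pred.tbl
```

The initial estimates would normally be the final estimates from the
earlier estimation run.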
The error message from PREDPP is more puzzling:
NUMERICAL DIFFICULTIES WITH INTEGRATION ROUTINE.
MAXIMUM NO. OF EVALUATIONS OF DIFFERENTIAL EQUATIONS, 100000, EXCEEDED.
0PROGRAM TERMINATED BY PRRES
This message might legitimately occur with a very large data set. It is
possible to raise the limit, but first it would be best to make sure
that this is not some side-effect of the 400-record problem.
If it persists, Dr. Harnisch should contact me and I'll tell him what
to do.