From:"Steve Duffull"
Subject: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Wed, 2 Oct 2002 13:08:52 +1000
Hi all
I have come across an error which I can't seem to solve. I think perhaps this
has been discussed before, but I can't find it.
Anyway: after completing a NONMEM run, we constructed a control stream that used
the NONMEM output as initial estimates (we want to look at the profile
likelihood). The following $OMEGA statement (identical to the NONMEM output)
produced this error:
"INITIAL ESTIMATE OF OMEGA HAS A NONZERO BLOCK WHICH IS NUMERICALLY NOT POSITIVE DEFINITE"
This error occurs whether the OMEGA statement is used with $EST or $SIM.
$OMEGA BLOCK(4) ;BSV
0.329
0.205 0.272
0.161 -0.217 0.774
0.00818 0.0048 0.00462 0.000204
I think that this must be very unlikely, since estimated variance-covariance (VC)
matrices should always be positive definite. However, just in case, I checked it,
and it is positive definite: it can be inverted and its determinant can be
computed. The eigenvalues of the VC matrix suggest that it may be
ill-conditioned (one of the values was 4.8E-7 and the others were mostly of
order 1E-1 to 1E-4).
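Steve's check can be reproduced numerically; a small numpy sketch (matrix transcribed from the $OMEGA block above) shows how marginal the printed values are: at 3 significant digits the block sits essentially on the positive-definite boundary, which is what NM-TRAN objects to.

```python
import numpy as np

# OMEGA block exactly as printed (3 significant digits)
omega = np.array([
    [0.329,    0.205,    0.161,    0.00818],
    [0.205,    0.272,   -0.217,    0.0048],
    [0.161,   -0.217,    0.774,    0.00462],
    [0.00818,  0.0048,   0.00462,  0.000204],
])

lam = np.linalg.eigvalsh(omega)  # eigenvalues in ascending order
print(lam)
# The smallest eigenvalue is tiny (and may even come out slightly negative)
# relative to the largest, so the matrix is at best barely positive definite.
```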
Any ideas?
Thanks
Steve
*****************************************
Stephen Duffull
School of Pharmacy
University of Queensland
Brisbane 4072
Australia
Tel +61 7 3365 8808
Fax +61 7 3365 1688
http://www.uq.edu.au/pharmacy/duffull.htm
University Provider Number: 00025B
From:Nick Holford
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date: Wed, 02 Oct 2002 15:41:24 +1200
Steve,
This can be caused by the minor differences that occur when converting between the
text representation of a number (in the NM-TRAN control stream) and the internal
representation of the corresponding double-precision value. If you "round
down" the last digit of the off-diagonal elements of $OMEGA you may find that
NONMEM will then accept it, e.g.
$OMEGA BLOCK(4) ;BSV
0.329
0.204 0.272
0.160 -0.216 0.774
0.00817 0.0047 0.00461 0.000204
I find this problem occurs quite often when I take the OMEGA estimates from a
NONMEM listing and put them into an NM-TRAN control stream; it is usually fixed
with the "round down" kludge.
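A cheap way to apply this systematically is to check candidate initial estimates for positive definiteness before submitting the control stream. A minimal sketch (the helper name is mine; it mimics, rather than reproduces, NM-TRAN's internal check):

```python
import numpy as np

def is_positive_definite(matrix):
    """Return True if the symmetric matrix admits a Cholesky factorization,
    i.e. is numerically positive definite."""
    try:
        np.linalg.cholesky(np.asarray(matrix, dtype=float))
        return True
    except np.linalg.LinAlgError:
        return False

# Rounding an off-diagonal element by one unit in the last reported digit
# can move a near-singular $OMEGA block across the positive-definite boundary,
# so it is worth testing the block before and after the "round down" kludge.
```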
Nick
Nick Holford, Divn Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
email:n.holford@auckland.ac.nz tel:+64(9)373-7599x6730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
From:Pascal Girard
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Wed, 2 Oct 2002 00:25:42 -0700
Hi Steve,
I have often encountered this problem when I want to resample OMEGAs from an MVN
distribution using a non-diagonal final OMEGA. I fixed it definitively by writing
my own INFN subroutine, which writes the final OMEGA to an ASCII file with more
digits than NONMEM's output provides.
By the way, this is really not rocket science. It would be much more efficient to
have an option in NONMEM that outputs all NONMEM estimates, with sufficient
precision (objective functions, THETAs, OMEGAs and SIGMAs, SEs, COV matrix, ...,
indexed by the PROB #), in an ASCII, Excel-compatible file. Software like SAS
statistical procedures or S-PLUS functions offers this functionality. This would
be a kind of sophisticated Model Specification File, but in ASCII rather than
binary coding. (Hope Stuart is listening :-)
All my best,
Pascal Girard
Tel/Fax +33 (0)437 37 80 29
From:"Kowalski, Ken"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Wed, 2 Oct 2002 08:32:00 -0400
Steve,
Your Omega matrix is extremely ill-conditioned (i.e., the ratio of the
largest eigenvalue to the smallest is >10^6). Inspecting the correlations
of the etas, I find that the correlation between eta1 and eta4 is
corr(1,4) = omega(1,4)/sqrt(omega(1,1)*omega(4,4)) = 0.00818/sqrt(0.329*0.000204) = 0.998
This correlation is very close to 1, and I suspect that if you calculated it
with all the digits, rather than the 3 significant digits that NONMEM reports
in the output, it would be 1.0, leading to a singular matrix (and an
eigenvalue of 0). A solution to this problem is to reduce the dimensionality
of Omega by restricting the correlation to be 1.0.
This can be accomplished by sharing the eta between the two parameters
corresponding to THETA1 and THETA4. For example,
P1=THETA(1)*EXP(ETA(1))
P2=THETA(2)*EXP(ETA(2))
P3=THETA(3)*EXP(ETA(3))
P4=THETA(4)*EXP(THETA(5)*ETA(1))
will force the correlation between P1 and P4 to be 1.0 and
var(LOG(P4))=(THETA(5)^2)*var(LOG(P1)). Thus, THETA(5) is the ratio of the
standard deviations and from your BLOCK(4) Omega results the estimate would
be THETA(5) = sqrt(0.000204)/sqrt(0.329) = 0.0249 (i.e., the standard
deviation for eta4 is approx. 2.5% of the standard deviation of eta1). With
this parameterization one would specify a BLOCK(3) for Omega, which has 6
elements, plus THETA(5), for a total of 7 elements related to BSV, whereas your
BLOCK(4) Omega has 10 elements. Note that forcing corr(1,4)=1 means
eta4 = THETA(5)*eta1, so eta4 has the same correlations with the other etas as
eta1 does. Indeed, in your BLOCK(4) results you will find that corr(1,2) ~=
corr(2,4) and corr(1,3) ~= corr(3,4). If you fit this model (i.e., BLOCK(3)
with the addition of THETA(5)) you should find that you will get the exact
same fit (MOF and parameter estimates) as your BLOCK(4) results but the
model will be considerably more stable.
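Ken's arithmetic can be reproduced directly from the $OMEGA values quoted at the top of the thread (a sketch; the inputs are the 3-significant-digit NONMEM output):

```python
import math

omega11, omega14, omega44 = 0.329, 0.00818, 0.000204

# Correlation between eta1 and eta4
corr14 = omega14 / math.sqrt(omega11 * omega44)
print(round(corr14, 3))   # 0.998

# Ratio of the standard deviations, i.e. the initial estimate for THETA(5)
# in the shared-eta parameterization P4 = THETA(4)*EXP(THETA(5)*ETA(1))
theta5 = math.sqrt(omega44) / math.sqrt(omega11)
print(round(theta5, 4))   # 0.0249
```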
Regards,
Ken
From:Nick Holford [mailto:n.holford@auckland.ac.nz]
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Wednesday, October 02, 2002 4:06 PM
Ken,
While I cannot disagree with the technical details of your proposal I would
wonder why one would deliberately impose the assumption that the correlation
between two parameters is 1. This seems very unlikely -- even more unlikely
than the other extreme that the correlation is 0. If the data and estimation
method do not allow one to estimate the correlation then I think it would be
preferable to fix the covariance so that a reasonable correlation is
assumed. I would assume that any stability benefits would also apply with
this more realistic implementation. An analogy would be the well accepted
procedure of fixing KA to a reasonable value when the data do not allow one
to define the absorption process well.
This topic was discussed on nmusers in May 2001
( http://www.cognigencorp.com/nonmem/nm/97may092001.html)
Mats Karlsson proposed this example of how to fix the covariance:
CL = THETA(1)*EXP(THETA(3)*(ETA(1) + THETA(5)*ETA(3)))
V  = THETA(2)*EXP(THETA(4)*(ETA(2) + SQRT(THETA(5)*THETA(5))*ETA(3)))
where the omegas for ETA(1), ETA(2) and ETA(3) are fixed to 1. If THETA(5) is
negative, the correlation is negative, whereas if it is positive the
correlation is positive.
Nick
PS Users of Wings for NONMEM ( http://wfn.sourceforge.net/) will of course
readily diagnose the case when correlations approach 1 because these
correlations are automatically calculated and presented to the user instead
of just offering the less informative off-diagonal elements of OMEGA.
Nick Holford, Divn Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
email:n.holford@auckland.ac.nz tel:+64(9)373-7599x6730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
From:Kowalski, Ken
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Thursday, October 03, 2002 11:38 AM
Hi Nick,
We've had this discussion before. I suspect the correlation is being driven
to 1 because of limitations of the design (i.e., insufficient information to
precisely estimate the correlation but sufficient information to suggest it
is non-zero--otherwise NONMEM would have estimated the covariance to be
zero). I draw the analogy to estimating a variance component for ka when
there is very little information in the absorption phase. With this
analogy, NONMEM might estimate the variance component for ka to be 0. We
typically do not interpret this to mean that there is no BSV in ka just that
the design cannot support the estimation of the BSV in ka. So, what do we
do?...We typically constrain the omega for ka to be 0 even though we know
that it is probably unrealistic.
With regards to the perfect correlation problem, if we fix the covariance in
such a way to restrict the correlation to a more reasonable value less than
1 we will take a hit in the MOF as the maximum likelihood estimates of the
parameters (including elements of Omega) wants to estimate this correlation
as 1...this is the discussion we had before. At that time you changed your
recommendation to a Bayesian solution where you specify a prior on this
correlation. I can't argue against that approach if one has such a prior.
However, I suspect the prior would have to be quite strong (to move the
correlation away from 1) as a flat or non-informative prior is going to run
into the same perfect correlation problem as maximum likelihood estimation.
What if Steve's ill-conditioned Omega just squeaked by NONMEM when he tried
to simulate...perhaps rounding down the off-diagonal elements of Omega as
you recommended in a previous message? He would have proceeded perhaps not
realizing that his model is ill-conditioned/over-parameterized and would
have been simulating with near perfect correlation for P1 and P4. NONMEM
(or any other nonlinear regression algorithm) can act quirky (e.g.,
extremely sensitive to starting values) when the model is ill-conditioned.
Steve's model may provide a good fit, I just contend that I can get that
same fit with 3 fewer elements in Omega. My solution is not altering the
fit that Steve obtained with his BLOCK(4) parameterization unless of course
he truly did not achieve a global minimum which is possible due to the
over-parameterized Omega. If so, my solution could possibly lead to an even
lower MOF. However, I have encountered the problem Steve raises on numerous
occasions and typically the solution I propose leads to the identical fit
without the instability. Steve didn't indicate whether the COV step failed
when he fit his BLOCK(4) model...often it will fail with an
over-parameterized Omega even though the estimation step converges. The
solution I propose removes the ill-conditioning of Omega and can allow the
COV step to run without altering the fit.
Mats' parameterization is not a solution to Steve's ill-conditioned Omega
problem. He merely re-parameterized Omega so that the variances and
covariance are estimated with 3 additional thetas (i.e., theta3, theta4 and
theta5 in his example) in lieu of the 3 elements in a BLOCK(2) Omega. With
Mat's parameterization the correlation between CL and V is
THETA(5)^2/(1+THETA(5)^2).
With this parameterization one can gain control over restricting the
correlation by fixing THETA(5). However, this parameterization expanded to
Steve's BLOCK(4) problem will still have the ill-conditioning problem as
it's fitting the same model with the same number of elements in Omega...just
reparameterized as Thetas. With Mats' parameterization, the perfect
correlation would result in THETA(5) going to infinity. In Steve's BLOCK(4)
results I calculated the correlation as 0.998, which would suggest that
THETA(5)=22.3. If you want to restrict the correlation to some arbitrary
value r, this can be obtained by fixing
THETA(5)=sqrt(r/(1-r)).
Thus, for r=0.8, THETA(5)=2.0. This is considerably smaller than the
THETA(5)=22.3 that I estimate for Steve's problem. However, as I indicated
in my previous message, if you use all the digits rather than the 3 signif
digits that NONMEM reports out I suspect that the correlation is even closer
to 1.
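The relation corr = THETA(5)^2/(1+THETA(5)^2) for Mats' parameterization is easy to confirm by simulation; a sketch with a made-up THETA(5) = 2, which should give a correlation near 0.8:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta5 = 200_000, 2.0

# Independent unit-variance etas, as in Mats' parameterization
eta1, eta2, eta3 = rng.standard_normal((3, n))

x_cl = eta1 + theta5 * eta3          # random effect entering CL
x_v  = eta2 + abs(theta5) * eta3     # random effect entering V

empirical = np.corrcoef(x_cl, x_v)[0, 1]
predicted = theta5**2 / (1 + theta5**2)   # = 0.8 for theta5 = 2
print(empirical, predicted)
```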
Bottom line: We need to get rid of the ill-conditioning by simplifying the
model. Simply fixing the correlation to some arbitrary value less than 1 so
that we don't have a singular Omega (which is what happens when we try to
estimate the correlation as 1) is undesirable because we take a hit on the
fit (higher MOF).
Ken
From:Nick Holford
Subject:Re: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 04 Oct 2002 16:25:10 +1200
Ken,
Thanks for bringing together these comments.
1. I don't agree that the minimum objective function value (MOF) is the ultimate criterion
when it means the model parameters are intrinsically unreasonable. NONMEM can have problems
estimating covariances. I do not recall ever seeing NONMEM estimate a covariance of zero but
have frequently seen correlations close to 1 (or -1). I repeat my assertion that
a correlation of 1 (or -1) is never reasonable because it means that one parameter differs
only by a scale factor from another and thus only one parameter needs to be estimated. If
the deterministic structure of the model is compatible with one parameter then change this
part of the model rather than introducing it via a trick in the random-effects part of
the model.
2. When NONMEM is unable to estimate a reasonable value then I suggest that a reasonable
heuristic is to fix the value e.g. a correlation of 0.5 between CL and V, rather than assume
correlation is zero. The method proposed by Mats can be used to do this. I was not suggesting
that Steve try this reparameterization and still try to estimate the THETA defining
the covariance. I agree that this is unlikely to solve the problem although it should be noted
that some reparameterizations can have unexpected benefits for estimation.
3. Bayesian estimation of the covariance can be approximated using NONMEM with the undocumented,
unsupported and widely discussed PRIOR method in NONMEM V 1.1. This method would work quite
nicely with the way Mats has suggested using a THETA to represent the covariance. I agree that
a strong prior would probably be needed to avoid the difficulties NONMEM has in
estimating covariances when the information in the data is low. The difference between a
fixed parameter and one estimated with a strong prior is not likely to be of any practical
importance so I would pragmatically tend to use the fixed parameter method. As Steve is such
a keen WinBugger I wonder if he can tell us how his problem behaves with a
truly Bayesian estimation method?
Nick
Nick Holford, Divn Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
email:n.holford@auckland.ac.nz tel:+64(9)373-7599x6730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
From:Leonid Gibiansky
Subject: Re: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 04 Oct 2002 09:06:16 -0400
Ken, Nick,
Here are my 2c for this discussion:
As Ken pointed out, with the original parameterization there were 10
parameters responsible for the OMEGA matrix. The NONMEM solution indicates
that those parameters are related (the determinant of the matrix is zero).
Therefore, a correct parameterization should contain 9 parameters.
Restricting this to 7 parameters, as Ken suggested, introduces two
additional restrictions that were not evident from the solution. Therefore
my guess is that the re-parameterized solution will not, in general, be
equivalent to the original one. For practical purposes, the simplest way is
to try it and see what happens.
If one wants to do a rigorous search for the correct correlation, I would
propose the following solution (for the 4 by 4 case)
CORR1=THETA(1)*ETA(1)
CORR2=THETA(2)*ETA(1)+THETA(3)*ETA(2)
CORR3=THETA(4)*ETA(1)+THETA(5)*ETA(2)+THETA(6)*ETA(3)
CORR4=THETA(7)*ETA(1)+THETA(8)*ETA(2)+THETA(9)*ETA(3)+THETA(10)*ETA(4)
CL= *EXP(CORR1)
Q = *EXP(CORR2)
V1= *EXP(CORR3)
V2= *EXP(CORR4)
Here I assume that the OMEGA matrix for the ETAs is fixed to the unit matrix.
This has 10 parameters and should be equivalent to the original problem; at
least it gave the same solution in all the cases that I tested.
The coefficients of the OMEGA matrix are easily expressed via THETA1 - THETA10.
Then one can continue with the regular procedure to exclude parameters one
by one.
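Leonid's parameterization is a Cholesky factorization of OMEGA: the THETAs are the entries of a lower-triangular matrix L with OMEGA = L*L'. A sketch of the correspondence (illustrative values, not from Steve's problem):

```python
import numpy as np

# Lower-triangular matrix holding THETA(1)..THETA(10), row by row,
# exactly as they appear in the CORR1..CORR4 expressions above
L = np.array([
    [0.50,  0.00, 0.00, 0.00],
    [0.20,  0.40, 0.00, 0.00],
    [0.10, -0.30, 0.60, 0.00],
    [0.05,  0.10, 0.20, 0.30],
])

omega = L @ L.T                      # the implied OMEGA block
L_back = np.linalg.cholesky(omega)   # recovers the THETAs from a PD OMEGA
print(np.allclose(L_back, L))        # True
```

Because the Cholesky factor of a positive-definite matrix (with positive diagonal) is unique, the mapping between the 10 THETAs and the 10 elements of a BLOCK(4) OMEGA is one-to-one; setting trailing THETAs to zero then drops parameters one by one, as Leonid describes.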
Leonid
From:"Kowalski, Ken"
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 4 Oct 2002 10:43:55 -0400
Nick,
With regards to your Item 1, I think we are going to have to agree to
disagree. Throwing away the objective function is not appealing to me...the
choice of values for fixing parameters (e.g., elements of Omega) that you
consider unrealistic is completely arbitrary. I suspect the reason NONMEM
never estimates a covariance to be zero is that covariances can be positive
or negative so zero is not on the boundary. But what about my analogy
regarding a variance component (diagonal element of Omega) going to zero
which is on the boundary? Surely you've seen NONMEM estimate a zero
variance component. Isn't a zero variance component estimated for say ka or
V unrealistic? Again, this can happen because of lack of information in the
design/data to estimate this variance component. Isn't it common practice
to then fix this variance component to zero rather than some arbitrary
non-zero value? Going back to Steve Duffull's problem, what if by chance
the Omega reported in the NONMEM output rounded to 3 significant digits
didn't have problems (i.e., just squeaked by and was positive semi-definite)
and let's say for this to happen the correlation was estimated to be 0.99.
Doesn't an estimate of 0.99 for the correlation concern you? If so, how low
does the correlation have to be for you to consider it realistic? Call it a
trick if you like, but my proposed solution is supported by the data and is
simply a more parsimonious form for Omega that will result in the identical
fit that Steve obtained.
Regarding Item 2, fixing the correlation to something less than 1 (say 0.5)
is going to result in a poorer fit since NONMEM is wanting to estimate the
correlation to be 1. As we discussed a year ago, I contend that my model
constraining the correlation to 1 will result in a more realistic simulation
of the data (i.e., a posterior predictive check) than fixing the correlation
to something considerably lower that is not supported by the current data.
Regarding Item 3, I think we are in agreement provided one has a strong
enough prior presumably supported by other data. This I think is a
reasonable alternative to my solution to the ill-conditioned Omega problem.
I make the distinction between a strong prior supported by an independent
set of data (perhaps data-rich healthy volunteer data) and fixing the
correlation arbitrarily.
Ken
From:"Kowalski, Ken"
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 4 Oct 2002 10:57:46 -0400
Leonid,
What happens when you use your proposed parameterization when Omega is
ill-conditioned as in Steve's example? Since NONMEM is wanting to estimate
the correlation between P1 and P4 (in your example CL and V2) to be 1
doesn't that mean that THETA(8)=THETA(9)=THETA(10)=0 (since ETA(1), ETA(2)
and ETA(3) are assumed to be independent)? So, with your parameterization,
I contend your model will reduce to 7 parameters in Omega as well.
Ken
From:Leonid Gibiansky
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 04 Oct 2002 11:14:38 -0400
Ken,
I was thinking about a slightly more general situation where only the
determinant of OMEGA is equal to zero, and missed the additional corr(1,4)=1
information that accounts for the other two parameters. In Steve's
case the approach that you suggested should work perfectly (a minor
difference of the correlation from 1 should not change the result).
Leonid
Correlation matrix:
          X1         X2         X3        X4
1  1.0000000  0.6852854  0.3190490 0.9984821
2  0.6852854  1.0000000 -0.4729386 0.6443795
3  0.3190490 -0.4729386  1.0000000 0.3676685
4  0.9984821  0.6443795  0.3676685 1.0000000
From: "Kowalski, Ken"
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 4 Oct 2002 11:21:26 -0400
Leonid,
If the determinant of Omega is zero doesn't this imply that one of the rows
can be expressed as a linear combination of the other rows? If so, then I
think for an NxN Omega, whenever DET(Omega)=0 I believe there will be N-1
restrictions to remove the ill-conditioning.
Ken
From:Leonid Gibiansky
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 04 Oct 2002 12:34:36 -0400
Ken,
Given a degenerate OMEGA matrix, we can always find the degenerate direction
(the one that corresponds to the zero eigenvalue, such that OMEGA*U = 0) and
hence the relationship that needs to be introduced to remove the
ill-conditioning. In this case
U ~ (sqrt(OMEGA44), 0, 0, -sqrt(OMEGA11))
leading to the conclusion that eta4 ~ eta1. In a more general case, this
will be a relationship of the type
eta4 ~ coeff1*eta1 + coeff2*eta2 + coeff3*eta3
The problem, as I see it, is that you cannot trust the matrix that you
obtain from the computations if it is ill-conditioned. Therefore, you
cannot find this degenerate direction (or at least cannot be sure of this
relationship). The parameterization that I use may provide a useful tool to
look for this direction. Moreover, if the estimation step with the full OMEGA
matrix fails (which may happen with an ill-conditioned problem), this
parameterization allows one to go not only from the full degenerate matrix to
a simpler one, but also from the diagonal matrix to a correlated one,
even if the computation with the full matrix fails. This is similar to the
covariate model-building step, where covariates can be correlated: instead
of starting from the full matrix (full covariate model), one can
increase complexity step by step.
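Leonid's degenerate direction can be illustrated on a synthetic OMEGA built so that eta4 is an exact multiple of eta1 (made-up numbers; in that case U = (sqrt(OMEGA44), 0, 0, -sqrt(OMEGA11)) is annihilated by OMEGA):

```python
import numpy as np

# Positive-definite covariance for eta1..eta3 (illustrative values)
base = np.array([
    [0.30,  0.10,  0.05],
    [0.10,  0.25, -0.08],
    [0.05, -0.08,  0.40],
])
k = 0.025                                    # eta4 = k * eta1 exactly

omega = np.zeros((4, 4))
omega[:3, :3] = base
omega[3, :3] = omega[:3, 3] = k * base[0]    # cov(eta4, .) = k * cov(eta1, .)
omega[3, 3] = k**2 * base[0, 0]              # var(eta4) = k^2 * var(eta1)

u = np.array([np.sqrt(omega[3, 3]), 0.0, 0.0, -np.sqrt(omega[0, 0])])
print(np.allclose(omega @ u, 0))             # True: u spans the null space
```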
Leonid
From: "Serge Guzy"
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 4 Oct 2002 11:02:14 -0700
I do agree with Ken that a large correlation between parameters should not be thrown out
just because it is large. Here are the two scenarios I have already investigated using the
new MC-PEM (Monte Carlo Parametric Expectation Maximization) algorithm.
The first scenario considered a 1-compartment model with only one sample per patient
(50 patients). The true correlation was very high (~0.9).
- The algorithm retrieved the high correlation as well as the right population PK means
and variances.
- A bootstrap algorithm was used to mimic other samples that could arise from the
unknown population.
- Using the bootstrap 10 times, the correlation stayed high for all 10 new datasets,
with a very small standard error and a mean of ~0.9.
- My conclusion was that the correlation was trustworthy and not an artifact
of the sampling design.
The second scenario considered a one-compartment model with oral absorption,
with 3 PK parameters and no correlation between the PK parameters.
- The MC-PEM algorithm was used again for 10 similar datasets.
- The result was that the correlation between Ka and V ranged from almost -1 to almost +1.
- The original dataset had, by accident, a high positive correlation, but the
bootstrap made us think that the correlation could go from -1 to 1 without
significantly changing the other population parameter estimates.
My conclusion was that there is not much information about the correlation between these
two parameters. What to do about that I don't know exactly, but in that case the true
correlation was zero.
I think we enter the field of hypothesis testing and ask the following question:
H0: no correlation
HA: correlation
The bootstrap therefore does not allow us to reject the null, and we stay with no
correlation. It does not mean that there is no correlation, but since there is no
supportive evidence to reject the null, we stay with the null.
I think it is the right thing to do, and I would redo the fit forcing the correlation to
be zero. I would never fix the correlation to, say, 0.5 and use the same other population
PK estimates for prediction purposes.
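Serge's bootstrap check is straightforward to sketch for a pair of individual random-effect estimates (entirely synthetic data here; the variable names and values are mine):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50                                   # subjects

# Synthetic per-subject etas with a genuinely high correlation
eta_cl = rng.standard_normal(n)
eta_v = 0.9 * eta_cl + 0.2 * rng.standard_normal(n)

# Bootstrap the correlation: resample subjects with replacement
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    boot.append(np.corrcoef(eta_cl[idx], eta_v[idx])[0, 1])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(lo, hi)
# A tight interval well away from zero supports a real correlation; an
# interval spanning most of (-1, 1), as in Serge's second scenario, says
# the design carries little information about it.
```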
By the way, I would be happy to get the data that initiated this very
interesting discussion and would try to analyze it using our system.
Serge Guzy, Ph.D.
Head of Pharmacometrics
Xoma
From:"Bonate, Peter"
Subject:[NMusers] OMEGA HAS A NONZERO BLOCK
Date:Fri, 4 Oct 2002 13:13:20 -0500
Dear All,
I have been following this discussion with great interest and thought I would share with the
group the results of some simple simulations that I have done. In the first simulation I
simulated data from 125 individuals with intense serial sampling after single dose administration.
Concentration-time data were simulated using a 1-compartment model with oral
administration. Clearance, volume of distribution, and Ka had typical values of 3.7 L/h,
227 L, and 0.7 per hour, respectively. All parameters were modeled as log-normal. Omega was
3 x 3 with values
CL V Ka
0.1
0.16 0.3
0.00 0.0 0.1
Residual variability was proportional having a variability of 0.003. Thus the correlation
between CL and V was 0.92. The model was then fit using FOCE having the true values as the
initial estimates. The model minimized with no errors and had an OFV of 13409. The final
parameter estimates (standard errors) were 3.65 (0.104) L/h, 220 (11.0) L, and 0.698
(0.0215) per hour for CL, V, and Ka, respectively. Omega was estimated at
CL V Ka
0.0893
0.142 0.264
0.00 0.000 0.101
with a residual variance of 0.00422. The model fitted correlation between CL and V was 0.92.
The largest and smallest eigenvalues of this model were 3.18 and 0.00417, respectively, with a
condition number of 763. Hence, the model was unstable, as expected.
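The between-subject correlation in Peter's first simulation follows directly from the Omega he used: corr(CL,V) = 0.16/sqrt(0.1*0.3), about 0.92. A numpy stand-in for the random-effects step of that simulation (the NONMEM fit itself is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

omega = np.array([[0.10, 0.16, 0.00],
                  [0.16, 0.30, 0.00],
                  [0.00, 0.00, 0.10]])
mu = np.log([3.7, 227.0, 0.7])       # typical CL (L/h), V (L), Ka (1/h)

# Log-normal individual parameters: exponentiate correlated normal etas
log_params = rng.multivariate_normal(mu, omega, size=n)
cl, v, ka = np.exp(log_params).T

corr_cl_v = np.corrcoef(np.log(cl), np.log(v))[0, 1]
print(corr_cl_v)   # close to 0.16/sqrt(0.1*0.3) = 0.924
```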
I then refit the model using the trick we are all talking about. V was modeled as
Theta(2)*exp(eta(1) *theta(4)), where theta(4) is the ratio of the standard deviations. This model
was refit, had no errors, and had an OFV of 14005. Hence, the new model had an increase in OFV of 596!!
The new parameter estimates were 4.32 (0.186) L/h, 295 (22.8) L, and 0.636 (0.0223) per hour for CL, V,
and Ka, respectively. Omega was estimated at 0.109 for CL and 0.112 for Ka with a residual variance
of 0.0123. Theta(4) was 1.72 (0.146). The true ratio of the SDs was 1.73. So, reparameterization resulted in
a better estimate of the variance of both CL and V, with essentially no change in Ka. But, although
theta(4) was accurately estimated, CL, V, and Ka had greater bias and larger standard errors under the
new model. However, this model was more stable having a condition number of 3.33/0.0325 = 102.
The second simulation built on the first simulation where a PD marker was measured. The marker was
simulated having an Emax model with parameters Emax = 100% and EC50 = 25 ng/mL. Between-subject variability
was modeled as a normal distribution. Omega was 2 x 2 with values
Emax EC50
100
17 3
Hence, Emax and EC50 had correlation 0.98. Residual error was modeled as a normal distribution with
variance 10. The PD model was then fit using FOCE having the true values as the initial estimates.
The model minimized with no errors and had an OFV of 6494. The final parameter estimates (standard
errors) were 99.7 (1.06) for Emax and 25.2 (0.531) for EC50.
Omega was estimated at
Emax EC50
95.0
42.3 20.6
with residual variance estimated at 9.75. The condition number of the model was 2.89/0.00273 = 1059,
indicating the model was unstable.
The model was then parameterized as EC50 = THETA(2) + THETA(3)*ETA(1) and refit. Minimization was
successful with an OFV of 6516, an increase of 22. The final parameter estimates were 99.5 (1.10)
for Emax and 25.2 (0.538) for EC50. The variance of Emax was estimated at 87.2 with a residual
variance of 10.2. Theta(3) was estimated at 0.366. The theoretical value was 0.173 (IS THIS
RIGHT, KEN?) This time the new model had no change in the estimates of Emax and EC50, but the
variance components had worse accuracy than the original model.
I then tried another reformulation of the model to EC50 = THETA(1)*THETA(2) + ETA(1) and refit.
This time the OFV was 7313.5, an increase of 820. The estimates were 98.9 (1.30)
for Emax, 0.252 (0.0435) for Theta(2), and hence 24.9 for EC50. The variances
were totally off, however: the variance of Emax was estimated at 35.3 with a
residual variance of 16.3.
As I interpret this, you may get better parameter estimates when two random effects
are correlated, but the estimates are unstable and become data dependent. By using
the shared covariance term ("the trick") there is no guarantee the OFV will be near
that of the current, highly correlated model. You also may get slightly more biased
estimates with greater imprecision, but the model becomes more stable and less data
dependent. The trick seems to work however the model is parameterized, although if
one random effect were log-normal and one normal, I am not sure what the new theta
would represent.
Anyway just some more thoughts on the subject,
pete
Peter L. Bonate, PhD
Director, Pharmacokinetics
ILEX Oncology, Inc
4545 Horizon Hill Blvd
San Antonio, TX 78229
phone: 210-949-8662
fax: 210-949-8487
email: pbonate@ilexonc.com
From:Nick Holford
Subject:Re: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Sat, 05 Oct 2002 08:21:32 +1200
Ken,
"Kowalski, Ken" wrote:
>
> Nick,
>
> With regards to your Item 1, I think we are going to have to agree to
> disagree. Throwing away the objective function is not appealing to me...the
> choice of values for fixing parameters (e.g., elements of Omega) that you
> consider unrealistic is completely arbitrary.
I do not understand why you think that my wish to fix the correlation to
a value such as 0.5 is "completely arbitrary". I tried to explain that
this choice was because I have a strong prior on the value of this
correlation and especially I *know* that the correlation between CL and
V is *not* 1 (your arbitrary choice) and is not likely to be zero.
The NONMEM objective function is not to be fully trusted when the
estimation process is unstable (see more comments on this below).
> I suspect the reason NONMEM
> never estimates a covariance to be zero is that covariances can be positive
> or negative so zero is not on the boundary. But what about my analogy
> regarding a variance component (diagonal element of Omega) going to zero
> which is on the boundary? Surely you've seen NONMEM estimate a zero
> variance component. Isn't a zero variance component estimated for say ka or
> V unrealistic? Again, this can happen because of lack of information in the
> design/data to estimate this variance component. Isn't it common practice
> to then fix this variance component to zero rather than some arbitrary
> non-zero value?
I agree that it is common practice to fix the diagonal element of OMEGA
to zero. This is analogous to fixing the covariance between CL and V to
zero. It is possible from a mechanistic viewpoint, and if the data do
not allow NONMEM to obtain a reliable estimate then it is a reasonable
pragmatic approach. Setting it to a positive value based on prior
experience would seem to be an even more reasonable approach (for a
Bayesian). On the other hand, nobody advocates setting the diagonal
element to infinity, which seems to be analogous to your suggestion of
forcing the correlation to be one, i.e. choosing a completely unrealistic
value.
> Going back to Steve Duffull's problem, what if by chance
> the Omega reported in the NONMEM output rounded to 3 significant digits
> didn't have problems (i.e., just squeaked by and was positive semi-definite)
> and let's say for this to happen the correlation was estimated to be 0.99.
> Doesn't an estimate of 0.99 for the correlation concern you?
It certainly does concern me and when I see this I usually try to change
the model in some way so that a more reasonable estimate (0.9 or less)
is obtained. It seems to happen most commonly for parameters that I have
little prior knowledge so I am happy to accept fixing the covariance to
zero if I cannot get a reasonable non-zero estimate.
>
> Regarding Item 2, fixing the correlation to something less than 1 (say 0.5)
> is going to result in a poorer fit since NONMEM is wanting to estimate the
> correlation to be 1. As we discussed a year ago, I contend that my model
> constraining the correlation to 1 will result in a more realistic simulation
> of the data (i.e., a posterior predictive check) than fixing the correlation
> to something considerably lower that is not supported by the current data.
If NONMEM estimates were trustworthy at these extreme values for a
correlation then I would agree with you. But I don't think they are
reliable and so do not trust them. I agree with Leonid: "The problem, as
I see it, is that you cannot trust the matrix that you received from the
computations, if it is ill-conditioned. Therefore, you cannot find this
degenerate direction (or at least, cannot be sure in this relation)".
> Regarding Item 3, I think we are in agreement provided one has a strong
> enough prior presumably supported by other data. This I think is a
> reasonable alternative to my solution to the ill-conditioned Omega problem.
> I make the distinction between a strong prior supported by an independent
> set of data (perhaps data-rich healthy volunteer data) and fixing the
> correlation arbitrarily.
I think we agree here. When one has prior knowledge then it is
reasonable to use it in constructing a model. This is of course the
Bayesian philosophy which I find very appealing and have struggled to
apply using NONMEM.
It seems we differ because you believe that NONMEM is finding a pointer
to the truth buried in the data when it estimates a correlation close to
1 whereas I think it is a pointer to numerical nonsense.
Finally, thank you Peter Bonate for doing some experiments on this issue
which seem to reveal that there is no consistent, reliable answer to
be obtained simply by re-parameterisation. However, further work on the
parameterisation of random effects may well be fruitful. Stuart Beal
recently suggested a reparameterization which was quite helpful in
working around one particular problem I was having so I encourage
everyone to experiment (via simulation) as Peter has done.
Nick
--
Nick Holford, Divn Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
email:n.holford@auckland.ac.nz tel:+64(9)373-7599x6730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
From: "Lewis B. Sheiner"
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date: Fri, 04 Oct 2002 13:48:04 -0700
I, too, have been following this discussion, and am puzzled by the fact that we are
covering very old ground. This is a solved problem, although we may not much like the
'solution'. The problem we are discussing is called the 'ill-posed inverse problem'.
Peter's and Serge's results confirm the nature of the problem: ill-posed inverse problems
are typified by instability of estimates. Indeed, not only is this well-known, but it is
also well known that if the problem can be solved at all (that is if the model is a priori
identifiable) one can ONLY achieve stability by adding more information;
that contained in the current data is insufficient.
Moreover, there is really no longer any serious argument about where this information should
come from, or even how to add it once we have it! Regarding the former, the principled answer
is 'from science'. Regarding the latter, the only principled way I know of is to adopt a
Bayesian perspective, and to use Bayesian methods.
So, returning to the current interchange, consider the 'how to' alternatives being discussed:
fix parameters to some value, or use full prior distributions on them (in either case, the first
principle says that the specific values chosen must be justified on the basis of science). First,
note that simplifying a model (that is, by adopting a simpler sub-model than the original one) is
formally equivalent to fixing certain parameters (of the full model) to prior values.
These two 'solutions', then, are the same solution, and I refer to them both when I say 'fixing
parameters'. From the Bayesian perspective, if you 'add knowledge' by fixing parameters, this is
equivalent to asserting that you have perfect prior knowledge of certain parameter values. This is
a hyper-subjective position with which any scientist should feel distinctly uncomfortable (except
perhaps in the case that the parameter in question is Planck's constant, or the speed of light, or
other such universal constant). Not only is this practice unrealistic, it is dangerous: Although,
admittedly, you will fix only parameters to which predictions of the current
data are insensitive, making the 'danger' unobvious on those data, there is no
guarantee that predictions of future data, especially extrapolations to designs
considerably different from those used with the current data, will be
similarly insensitive.
In contrast, by putting a formal, informative, but appropriately diffuse (i.e. correctly representing
the current state of scientific uncertainty) prior on (all) the parameters, one is being MORE objective
than one is when fixing parameters:
uncertainty is recognized and correctly factored in.
This, in more words, is one of the points Ken was making. Where he and I part company is that
he thinks (to mangle Wittgenstein) that 'whereof the data do not speak, thereof we should be silent',
and I claim that that is not possible:
choosing 'not to speak' is always equivalent to a sharp prior restriction on some model parameters.
Moreover, I contend, when one is doing science, one is never completely ignorant -- there is ALWAYS
some relevant past data which do speak, however softly, and those data define an informative prior distribution.
In my view, then, Ken and Nick differ only on the sharp prior value to which to fix the pesky
correlation. Ken, the empiricist, says fix it to unity because the data do not contradict that
value, and Nick, the theoretician, complains that a correlation of unity is unscientific (whereas
one of .5 presumably is not). Their common meeting ground, in my view, would be an informative prior
distribution, bounded away from unity (to satisfy Nick), but with enough dispersion
so that if the data did say anything about the value, they would be heard (satisfying Ken's empiricism).
To return to the first point--where does the extra information come from?--the implication of the
appeal to science here is deep: There CANNOT BE an automatic (algorithmic) GENERAL procedure for
making ill-posed inverse problems well-posed (such as setting all near-boundary correlations to
boundary values), as the only principled source of extra information
is prior DOMAIN-SPECIFIC scientific knowledge.
Sermon over.
_/ _/ _/_/ _/_/_/ _/_/_/ Lewis B Sheiner, MD (lewis@c255.ucsf.edu)
_/ _/ _/ _/ _/ Professor: Lab. Med., Biophmct. Sci.
_/ _/ _/ _/_/_/ _/_/ Mail: Box 0626, UCSF, SF,CA,94143
_/ _/ _/ _/ _/ Courier: Rm C255, 521 Parnassus,SF,CA,94122
_/_/ _/_/ _/_/_/ _/ 415-476-1965 (v), 415-476-2796 (fax)
From:Nick Holford
Subject: Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Sat, 05 Oct 2002 09:47:46 +1200
Lewis,
Thanks for sermonizing on the Bayesian perspective. I had been trying to advocate this all
along in my contributions to this thread (based largely on my conclusion from a similar thread
last year). However, you do not seem to recognize this in your comments below and I wonder why
you seem to cast me in the role of being "hyper-subjective"? Perhaps it is because there is another
element of uncertainty here -- how to solve this problem in practice. NONMEM has some limited
ability to do a sort of Bayesian estimation (Stuart asserts the use of PRIOR is not really Bayesian).
To my knowledge nobody has ever applied NONMEM with priors on the covariance although it is clear how
this might be done. Given that the NONMEM prior method is unsupported, undocumented and has almost no
literature support it is not surprising that many people would be reluctant to use it for real problems.
Fixing a parameter to a reasonable "hyper-subjective" prior is a well known pragmatic method for
data description (and recommended by FDA guidance e.g. fixing KA). As a practical solution I think
that this is within the comfort zone of most who have to face this problem and want to use NONMEM to
solve it. I quite agree that one should be more uncomfortable when relying on this fixed value for simulation.
I am still hoping that Steve Duffull, who started this thread rolling, will come back on line
sometime and tell us his experiences of looking at this same problem using WinBugs.
Nick
From:"Steve Duffull"
Subject:RE: FW: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 08:48:06 +1000
Hi all
Just a quick note about my ill-conditioned VC matrix.
As Ken pointed out there was high correlation between parameters 1 and 4.
However knowledge of this does not solve the problem.
Recall the OMEGA BLOCK(4) was
$OMEGA BLOCK(4) ;BSV
0.329
0.205 0.272
0.161 -0.217 0.774
0.00818 0.0048 0.00462 0.000204
When I tried all the different permutations on this matrix, it turned out the bad egg was 0.161.
Using matrix banding:
This works fine:
$OMEGA BLOCK(4) ;BSV
0.329
0.205 0.272
0 -0.217 0.774
0 0 0.00462 0.000204
But this doesn't:
$OMEGA BLOCK(4) ;BSV
0.329
0.205 0.272
0.161 -0.217 0.774
X 0.0048 0.00462 0.000204
where X is either zero or fixed at any particular (legal) value you like.
This is where I'm having problems.
$OMEGA BLOCK(4) ;BSV
0.329
0.205 0.272
X -0.217 0.774
0.0048 0.0048 0.00462 0.000204
I could find no values of X that allowed NONMEM to work with this matrix.
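For anyone who wants to reproduce the complaint outside NONMEM, here is a sketch in Python with numpy (not part of the original thread) that feeds the printed 3-significant-figure values into a positive-definiteness check. The eigenvalue test shows why NM-TRAN balks: the rounded matrix has a slightly negative eigenvalue, even though the full-precision estimates inside NONMEM were presumably positive definite.

```python
import numpy as np

# The $OMEGA BLOCK(4) initial estimates from the control stream,
# expanded from the printed lower triangle to a full symmetric matrix.
omega = np.array([
    [0.329,   0.205,   0.161,   0.00818],
    [0.205,   0.272,  -0.217,   0.0048],
    [0.161,  -0.217,   0.774,   0.00462],
    [0.00818, 0.0048,  0.00462,  0.000204],
])

# A symmetric matrix is positive definite iff all eigenvalues are > 0.
eigvals = np.linalg.eigvalsh(omega)
print("eigenvalues:", eigvals)

# Cholesky factorization is, in effect, the test NM-TRAN applies:
# it fails when the matrix is not numerically positive definite.
try:
    np.linalg.cholesky(omega)
    is_pd = True
except np.linalg.LinAlgError:
    is_pd = False
print("numerically positive definite:", is_pd)
```

The smallest eigenvalue of the rounded values comes out marginally below zero, which is consistent with Nick's earlier observation in this thread that rounding down the last digit of the off-diagonal elements makes NONMEM accept the block.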
Regards
Steve
===================================
Stephen Duffull
School of Pharmacy
University of Queensland
Brisbane 4072
Australia
Tel +61 7 3365 8808
Fax +61 7 3365 1688
http://www.uq.edu.au/pharmacy/duffull.htm
University Provider Number: 00025B
From:"Steve Duffull"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 11:31:22 +1000
Hi All
I, also, have enjoyed all the comments. My last email was just to point to other difficulties in
the matrix - difficulties that do not stem from the "apparent" correlation of 1 {Cov(par1,par4)} and that
are quite apart from the true matrix problem.
I should add a comment that I specifically did not include the parameter names (i.e. attributing
the omegas with CL's etc) since I was wondering why we could not take the output from NONMEM and
use it as an input when we can mathematically show that the matrix is invertible and the determinant
can be computed (both required by NONMEM's objective function).
The practical solution for us was to "pull back" and accept a more parsimonious model. The full model
was structurally identifiable but was probably not deterministically identifiable. What I mean by this
is that in theory all the parameters of the model were globally identifiable - but the data
do not seem to support their estimation as accurately as we would like.
I think the 100% correlation (which was a red herring wrt the problem) is artificial and was induced
by a poor data set. I would not be happy about setting the correlation to 1 (but as Lewis indicates
setting it to zero is tantamount to the same assumption).
This data could very easily be modelled using a different method - such as MCMC. However, where the
data is not "speaking to us" I doubt that we would be able to get around the correlation problem. I
have never tried to add in an informative prior on the covariance terms using BUGS. In fact, given
the specification of the prior for the inverse of the VC matrix (typically a Wishart) I do not know
if it would even be possible to specify a different strength prior for different components of the VC
matrix while preserving the entire VC structure. To be more specific, the only way I know
to add strength to the prior for a Wishart is to increase the degrees of freedom, which would
presumably affect all parameters equally.
Perhaps someone has some ideas on this?
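Steve's point about the single tuning knob can be illustrated numerically. The sketch below (Python with scipy, outside NONMEM; the 2x2 target matrix is just the upper block of the OMEGA above, used purely for illustration) puts an inverse-Wishart prior on a VC matrix, centred on the same target for each choice of degrees of freedom, and shows that raising the degrees of freedom tightens the implied prior on the correlation and on the variances together; there is no knob to sharpen one component alone.

```python
import numpy as np
from scipy.stats import invwishart

# Hypothetical 2x2 covariance matrix to centre the prior on.
target = np.array([[0.329, 0.205],
                   [0.205, 0.272]])
p = target.shape[0]

def prior_spread(df, n=2000, seed=0):
    # Scale chosen so the prior mean equals `target` for this df
    # (inverse-Wishart mean = scale / (df - p - 1)).
    scale = target * (df - p - 1)
    draws = invwishart(df=df, scale=scale).rvs(size=n, random_state=seed)
    corrs = draws[:, 0, 1] / np.sqrt(draws[:, 0, 0] * draws[:, 1, 1])
    # Spread of the implied correlation and of the first SD component.
    return corrs.std(), np.sqrt(draws[:, 0, 0]).std()

for df in (5, 20, 100):
    c_sd, v_sd = prior_spread(df)
    print(f"df={df:3d}  sd(correlation)={c_sd:.3f}  sd(sqrt var11)={v_sd:.3f}")
```

Both spreads shrink together as the degrees of freedom grow, which is exactly the limitation Steve describes.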
Kind regards
Steve
From:"Kowalski, Ken"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 09:59:09 -0400
All,
Wow, what a flurry of emails on a Friday afternoon! I agree that we are
rehashing old ground. I disagree with Lewis that the ONLY way to achieve
stability is by adding more information. That is THE solution when such
additional information exists. So I have no problem with posing a Bayesian
solution. Again, I suspect that the correlation from an independent data
source must be fairly precisely estimated to provide strong enough prior
information to resolve the ill-conditioning problem with the current
data/design. But what if no such data exists or the estimate of the
correlation is imprecise (a weak prior) such that the ill-conditioned
problem can't be resolved with the additional data/information? Fixing the
correlation and assuming it is known perfectly or specifying a strong prior
arbitrarily (i.e., not based on existing independent data) does not sit well
with me (unless you're God; bring me the data).
The alternative approach to achieve stability is to reduce the
dimensionality of the problem (i.e., the current model is
over-parameterized). That is, simplify the model that can adequately
describe the data in hand. In other words, in the absence of additional
data/information, you gotta live with what you've got! I still like my
analogy of the zero variance component estimate. Why is it that some of you
are willing to fix a variance component to zero for say Ka or V given the
limitations of the design/data but are not willing to fix a correlation to 1
given such limitations? Isn't it just as unreasonable to assume that Ka or
V is EXACTLY the same in ALL individuals in the population as it is to
assume that if I know an individual's CL then I know his V because of the
perfect correlation?
My proposed solution to Steve's ill-conditioned Omega was merely proposing a
simpler form of Steve's model to achieve the same fit he obtained. Steve
claims that my solution to his problem is a red-herring but I am not
convinced. I challenge Steve to fit the model I propose and report back on
the MOF for his ill-conditioned model and my proposed solution...I'll be
very much surprised if the MOF's differ by more than what can be explained
by rounding errors. However, I do acknowledge Leonid's point that we can't
necessarily trust the results from an ill-conditioned Omega to find the
direction that can remove the ill-conditioning. Thus, some form of testing
of the individual elements of Omega may have some benefit in finding a more
parsimonious Omega. If this can be obtained by banding and fixing an
element(s) to zero, so be it. Steve, can you report the MOF's for these
other Omega structures as well? If banding with only one element restricted
to zero (i.e., estimating 9 elements in Omega) gets rid of your
ill-conditioning then I suspect that the MOF will be lower than what you
obtained with your full BLOCK(4) ill-conditioned Omega because I claim I can
get rid of the ill-conditioning without loss in MOF with just 7 elements in
Omega.
My approach to building Omega is to fit the fullest Omega that can be
supported by the data. In Pete's simulations with a correlation of 0.92
where this was reliably estimated (supported by the data) I wouldn't propose
fixing it to 1 (of course the MOF will be higher as there is sufficient data
to estimate it different from 1). A condition number of 763 is not that
large and I wouldn't consider the Omega ill-conditioned (a condition number
greater than 10^3 is generally considered moderately ill-conditioned and a
condition number greater than 10^5 is considered severe...Steve's problem
had a condition number >10^6). I only propose fixing the correlation to
1 when NONMEM estimates it on the boundary such that the model is extremely
unstable. Usually when this occurs the COV step will fail.
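As a rough illustration of how fast conditioning deteriorates near the boundary, consider a toy 2x2 correlation matrix (a sketch in Python with numpy, not from Pete's simulations): its eigenvalues are 1+r and 1-r, so the condition number is (1+r)/(1-r), which blows up as r approaches 1.

```python
import numpy as np

# Condition number of a 2x2 correlation matrix: eigenvalues are 1+r
# and 1-r, so kappa = (1+r)/(1-r) explodes as r approaches 1.
for r in (0.5, 0.92, 0.99, 0.999):
    m = np.array([[1.0, r], [r, 1.0]])
    print(f"r = {r:<6} condition number = {np.linalg.cond(m):.4g}")
```

By the rule of thumb above, r = 0.92 (kappa = 24) is unremarkable on its own, while r = 0.999 (kappa near 2000) is already into moderately ill-conditioned territory, and an estimate pinned at r = 1 corresponds to a numerically singular Omega.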
Call me an empiricist if you'd like, but show me the science that says the
correlation is exactly 0.5.
Ken
From:"Lewis B. Sheiner"
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 07 Oct 2002 09:19:23 -0700
All,
My only quibble with Ken's reply, which I tried to make clear in my original
note, is that to 'simplify the model' (from a mechanistically reasonable more
complex one to an 'adequate' less complex one) is FORMALLY IDENTICAL to fixing
certain parameters (of the more complex model) to sharp values (usually on their
boundary), and hence, to adding additional information. You just can't escape
(as Ken says, " you gotta live with what you've got"): To make progress, you've
often got to assert more than your data can support all by themselves. If your
only goal is describing the data at hand, or testing hypotheses to which the
data at hand speak, then by all means, use the simplest model supported by the
data which affords reasonable power. But if your goal is a model useful for
extrapolation (prediction in new circumstances), control, or design, then such a
strict empiricist policy can lead to disaster. For predictive models, the
inadequacy of the data at hand means you must resort to science, and in that
case, the domain experts (or regulators) get to have the last word on what
constitutes valid 'extra information'. I repeat my outrageous claim: You never
know nothing.
LBS.
From:"Serge Guzy"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 09:30:00 -0700
I do agree with Ken that there is no science in the decision of fixing the correlation to,
let's say, 0.5. I will add that fixing a correlation to zero or 1 is more scientific, as it is
related to the notion of independence or linear dependence (often in the log domain) between
parameters. Based on our individual backgrounds, we make the decision of what seems to be more
or less scientific, and Ken has a good point about fixing Ka to a unique value when we know that Ka
should be different for each individual. If you think in terms of simulation, fixing Ka or
forcing 100% correlation between 2 parameters when the true population dispersion is very, very small (Ka)
or the true correlation between the two parameters is very high (~100% correlation) is the same. Of course
I am interested to know to what extent my assumptions will affect my prediction power.
Did somebody try to bootstrap the observed data as well as the intraindividual noise to see if
the "illness" was still present with correlation ~1? In my simulations, when I would get a correlation
near to 1 when in fact the true correlation was zero, the bootstrap approach would give me correlations
ranging from zero to 1, while a true correlation near to 1 would stay near to 1 for all bootstrap samples.
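Serge's resampling diagnostic can be sketched as follows (Python with numpy; the per-subject eta pairs are simulated here purely for illustration, with a hypothetical true correlation of 0.9): resample subjects with replacement and look at the spread of the re-estimated correlation. A distribution hugging 1 suggests a genuinely high correlation; one smeared from 0 to 1 suggests the boundary estimate is an artifact.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-subject random-effect estimates (eta_CL, eta_V),
# simulated with a true correlation of 0.9 purely for illustration.
n, true_r = 60, 0.9
cov = np.array([[1.0, true_r], [true_r, 1.0]])
etas = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Nonparametric bootstrap of the correlation: resample subjects with
# replacement and recompute r each time.
boot_r = np.empty(1000)
for b in range(boot_r.size):
    idx = rng.integers(0, n, size=n)
    boot_r[b] = np.corrcoef(etas[idx, 0], etas[idx, 1])[0, 1]

lo, hi = np.percentile(boot_r, [2.5, 97.5])
print(f"bootstrap 95% interval for r: [{lo:.2f}, {hi:.2f}]")
```

With a genuinely high true correlation the interval stays well away from zero, matching Serge's observation that artifactual boundary estimates instead produce bootstrap correlations ranging over the whole interval.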
From:"Serge Guzy"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 09:30:42 -0700
I would be glad to get the raw data with all the information about the structural model, dosage information,
mixed-effect type, intraindividual variance structure and of course observed data. I would be interested
to try another technique (we call it MC-PEM, for Monte Carlo Parametric Expectation Maximization), which so
far has not had problems with ill-conditioned matrices (we did indeed sometimes get high correlations that
were really real or an artifact, but resampling techniques were able to differentiate between reliable
correlations and unreliable ones). I do not know to what extent what is ill is related to a "poor data
factor" vs. "numerical instability or difficulty". Trying the exact EM algorithm we developed might add
some useful information.
Steve, let me know if there is a way to send me the whole information about the dataset you dealt with.
Serge Guzy, Ph.D.
Head of Pharmacometrics
Xoma
From:"Kowalski, Ken"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date: Mon, 7 Oct 2002 13:05:17 -0400
All,
And my only quibble with Lewis' reply is what does it mean to "resort to
science"? Is there some scientific theory that says the correlation is
0.5?...I doubt it. If previous scientific exploration results in data that
is readily extrapolated to the current problem that supports a strong prior
for the correlation then I'm all for it (just show me the data...I sound
like I'm from Missouri but I'm not). I agree we "never know nothing"...but
what do we know and how well do we know it? Just to say that I know
theoretically that the correlation can't be 1 doesn't give me carte blanche to
fix it or specify a strong informative prior in the absence of any
additional information. I also agree that we are on tenuous grounds in
using models for prediction when we estimate elements of Omega on the
boundary. But we are also on tenuous grounds for prediction if we fix it to
0.5 in the absence of any truly independent information to support this
prior. I have no problem with domain experts providing the 'extra
information' I just want to see the theory or prior data that supports this
information, otherwise, it just sounds like hyper-subjectivity to me.
Ken
From:"Lewis B. Sheiner"
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 07 Oct 2002 11:16:33 -0700
So, we converge: 'Science' is to be used to supply what the current data do not,
but science is to be based on empirical evidence. Unlike Ken, I'm not generally
inclined to demand the data on which the domain expert's opinion rests (I'd not
likely understand its implications fully, not being a domain expert myself),
but I'm glad someone is checking ...
I, too, can't leave this without 2 last quibbles:
1. I was not defending fixing the corr to .5, although I do agree with Nick that
if you have to fix it to something (e.g., because it's too technically demanding
to use a full Bayesian framework), for CL and V that is, I'd bet .5 is closer to
truth than either 0 or 1 (that's me being a domain expert).
2. More importantly, though, it is, in my view, MORE 'hyper-subjective' to fix a
parameter to a sharp value (ANY sharp value, 0, 1, .5, ...other) lacking
empirical support than it is to specify a non-degenerate distribution for it
(also lacking such support).
I'll repeat my main message another way: THERE IS NO WAY OUT OF THE ILL-POSED
INVERSE PROBLEM-- REPEAT, NO WAY OUT -- THAT DOESN'T INVOLVE MAKING ASSUMPTIONS
THAT ARE UNTESTABLE ON THE CURRENT DATA. Which, to reassure Ken, is not to say
that this releases us from an obligation to be careful about those assumptions,
question the data on which they are based, etc., etc.
LBS.
From:"Kowalski, Ken"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date: Mon, 7 Oct 2002 14:57:44 -0400
Come on Lew, you would never question a domain expert's prior? I find that
hard to believe. Moreover, as your message suggests we often wear different
hats and sometimes assume the role of a domain expert anyway. If I was
assuming the domain expert role and was going to specify a prior I would
certainly want to provide the information to support my prior to give
credence to its use...am I the only one that feels this way? In the absence
of any 'credible' prior information, fixing the correlation to 1 is the best
we can do on the basis of the MOF and the current data. I'm assuming that
fixing the correlation to 0.5 is inconsistent with the current data in hand
(i.e., a considerably higher MOF). Nevertheless, if there is credible extra
information that the correlation is better represented by being fixed at
0.5, then that's fine. Serge's simulation results though seem to suggest
that the correlation being estimated on the boundary value of 1 is more
likely to occur when the true correlation is indeed high...presumably higher
than 0.5 (Serge correct me if I'm overstating your results).
Ken
From:Mats Karlsson
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 07 Oct 2002 21:16:57 +0200
Dear all,
Just two things that are on the boundary of this discussion.
First, the use of OFV/MOF to discriminate between models incorporating
covariance terms or not has been suggested. We know that the likelihood ratio
test doesn't work well in many situations. In the most recent issue of JPKPD, we
describe simulations that indicate that covariance terms are even worse than
variance terms (which are worse than fixed effects), when it comes to providing
appropriate tail areas for hypothesis testing using the LR test. The inclusion
of covariance terms often gives large drops in OFV, even if the covariance truly
is zero.
Second, the solution everyone appears to gravitate towards is that additional
knowledge to be gained from the "domain experts" is useful. As I see it there
are two types of domain experts. Those of the particular drug in question and
those who are experienced in assessing interindividual variability components in
PK models in general (i.e. us). If the former can't say much about a particular
parameter, I think that we could still provide a better assessment regarding a
parameter than just fix it to a boundary (I agree with Ken that assuming a
variance of zero in a parameter like Ka or V is as unrealistic as assuming a
correlation of one). Providing some knowledge based on previously analysed drugs
is considerably easier for variance components than for covariances, but why not
turn to the literature or experience for guidance. This could be done very
simplistically (e.g. find an estimate for a similar drug and fix it to this) or
quite sophisticated (taking into account protein binding, route of elimination,
include as a prior, etc), depending on the importance of the particular
component for the purpose of the modelling.
Mats
From:Nick Holford
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 08 Oct 2002 08:28:50 +1300
Ken,
I have several times asserted in this thread that it is never reasonable, as a domain expert in PK,
to assume that CL and V are perfectly correlated.
To paraphrase your own words:
> I just want to see the theory or prior data that supports this
> information, otherwise, it just sounds like hyper-subjectivity to me.
> Is there some scientific theory that says the correlation is
> 1.0?...I doubt it.
Put up or shut up :-) (That includes you too, Serge.)
To support my own hyper-subjective viewpoint that CL and V are more likely to be positively
correlated (e.g. 0.5) I would point out that CL reflects function which increases with size
while V reflects structure which also increases with size. Given the common covariate of size then
I would expect CL and V to be positively correlated and if that was all that was involved then
indeed the correlation would be exactly 1. However, biology is not that simple. For example, in
the neonate CL typically increases rapidly (as liver and kidney function mature) while V often
decreases (as water is lost). In the elderly clearance may decrease somewhat, due to renal and
hepatic function decline, while V may increase (diazepam) or decrease (digoxin) as body composition
changes to more fat and less muscle. Add to these examples, the very obvious random between subject
variability in CL and V and it is necessarily the case that the correlation between CL and V CANNOT be
exactly 1. Size, age, organ function (and other factors) will determine the magnitude and
sign of the correlation.
I do not assert that 0.5 is THE ANSWER. It is a suggestion that is the least of 3 evils if we
choose not to take a fully Bayesian approach (as I have advocated and LBS has supported). The 3 evils
are to fix the correlation to 1, 0 or some other number. I contend that a priori the choice of 1 cannot
reflect the real world. The choice of 0 is possible but unlikely given the biology. For a typical case
(not at the extremes of life) then I expect the correlation to be positive. Based on simulation of a drug
resembling an aminoglycoside (in collaboration with Joga Gobburu and Diane Mould) and including the fixed
and random effects mentioned above the correlation between CL and V happened to be 0.3. Analysis
of an actual adult aminoglycoside data set (with Ivan Mathews and Carl Kirkpatrick) estimated the correlation
of CL and V1 to be 0.56.
As a group (nmusers) I think we have frequently failed to document good estimates of the correlation
of PK parameters so I have to say it is hard to point to a review from the literature that would provide
data to test the theory. If someone can enlighten me with such a review I would be delighted to hear from
you. If you have estimates of the correlation between CL and V and would like to send them to me then I will
compile them and report back.
Nick
From:"Kowalski, Ken"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 15:44:14 -0400
Mats,
I agree with you. Of course, it may be difficult to find an estimate of a
correlation (so many pop PK/PD results don't report this estimate and many
assume a diagonal Omega anyway). Moreover, if one has an estimate I would
also like to look at the precision of this estimate before I fixed it. If
it's not very precisely estimated then I would want to take this imprecision
into account as it may be such a weak prior that it is of little help
anyway. In any event, this is the data I'm looking for to support the
prior.
Ken
From:"Kowalski, Ken"
Subject: RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 7 Oct 2002 16:06:43 -0400
Nick,
We are at an impasse. I can say the same to you...put up or shut up :-).
Put up the evidence to support fixing the correlation to some value less
than 1. In the absence of this evidence, the existing data is what I'm
relying on to fix it to 1 (it has the lowest MOF of any choice for the
correlation). I know you don't like that which is why we are at an impasse.
I'm willing to accept the limitations of the data I have in hand similar to
the zero variance component problem.
Ken
From:Leonid Gibiansky
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Mon, 07 Oct 2002 16:40:42 -0400
Shouldn't we agree that there are no absolute rules that fit all cases?
Suppose that you have an ill-posed problem, with correlation close to one;
when you try to fix the correlation to 0.5, the OF increases and the fit becomes worse.
Profiling of the OF shows that, given the data, correlation = 1
corresponds to the minimum. There is no strong prior information that the
correlation should not be equal to one. The bootstrap shows a distribution of
correlation coefficients centered somewhere near 0.9.
I would put the correlation to 1. Is there any reason not to do it? Prior
information should be very strong and reliable in order to shift the
balance from the data in hand to the prior data: if we were so
confident in the prior information, why would we do a new analysis?
On the other hand, in a similar situation where:
the OF profile is very shallow, with no difference between correlation = 1 and
correlation = 0,
the bootstrap gives a distribution close to uniform, and
no prior information is available,
I would put the correlation to zero.
Instead of bootstrapping and profiling, which require a lot of effort, one can
look at the standard errors of the correlation coefficient. If the SE is small,
then the true parameter value is likely to be equal to the NONMEM
estimate. If the SE is very high, one can fix the parameter to any value (I
would not use 0.5; I prefer 0 or 1, but this can be a matter of taste). For
OMEGA block sizes >3, it is hard to get standard errors for the correlation
coefficients from NONMEM. Then the parameterization that I proposed earlier may
help: it is straightforward to obtain standard errors for THETAs, and those
thetas characterize the correlation of the parameters.
Prior knowledge may shift the balance in either direction in the second situation.
Intermediate situations can be decided on a case-by-case basis, weighing
prior knowledge, new data, time to the final report, etc.
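Leonid's reparameterization can be sketched outside NONMEM (Python with numpy; the theta values below are hypothetical): build OMEGA from a lower-triangular Cholesky factor whose entries are plain THETAs. The correlation then becomes an explicit, smooth function of the thetas, so a standard error for it follows from the theta standard errors (e.g. by the delta method).

```python
import numpy as np

# Build a 2x2 OMEGA from Cholesky-factor thetas.  L @ L.T is positive
# semi-definite by construction, and the implied correlation is
# t21 / sqrt(t21**2 + t22**2), an explicit function of the thetas.
def omega_from_thetas(t11, t21, t22):
    L = np.array([[t11, 0.0],
                  [t21, t22]])
    return L @ L.T

omega = omega_from_thetas(0.574, 0.357, 0.380)  # hypothetical theta values
r = omega[0, 1] / np.sqrt(omega[0, 0] * omega[1, 1])
print("implied correlation:", round(r, 3))
```

A correlation fixed at 1 corresponds to t22 = 0, so the degeneracy that is hidden inside a BLOCK OMEGA becomes explicit in this parameterization.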
Leonid
From:Nick Holford
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 08 Oct 2002 10:20:22 +1300
Leonid,
Leonid Gibiansky wrote:
>
> Shouldn't we agree that there are no absolute rules that fit all the cases?
>
That viewpoint might be called the Bayesian perspective.
> I would put correlation to 1. Is there any reasons not to do it ?
Please read my earlier "put up or shut up" response to Ken and Serge for my reasons
why you should not do this. The priors against a correlation of 1 between CL and V are
so strong that any data that push the estimate onto that boundary should be regarded with extreme
suspicion. If you or anyone else can provide any theory to support the idea that CL and V could be
perfectly correlated in reality then please tell me.
Nick
--
Nick Holford, Divn Pharmacology & Clinical Pharmacology
University of Auckland, 85 Park Rd, Private Bag 92019, Auckland, New Zealand
email:n.holford@auckland.ac.nz tel:+64(9)373-7599x6730 fax:373-7556
http://www.health.auckland.ac.nz/pharmacology/staff/nholford/
From:"Steve Duffull"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 8 Oct 2002 10:31:41 +1000
Hi
I am not aware of any way to weight different components of the VC matrix differently. Therefore, if you made one
component precise or imprecise, all components would be treated the same. This brings us back to the current
discussion of fixing an off-diagonal component to X (0 <= X <= 1).
Steve
===================================
Stephen Duffull
School of Pharmacy
University of Queensland
Brisbane 4072
Australia
Tel +61 7 3365 8808
Fax +61 7 3365 1688
http://www.uq.edu.au/pharmacy/duffull.htm
University Provider Number: 00025B
From:Leonid Gibiansky
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 08 Oct 2002 08:47:02 -0400
Nick,
I can suggest a case where the correlation between CL and V is 1. Suppose
that you have high variability in bioavailability. Then all the parameters
are scaled by the bioavailability. You have a choice of putting an ETA on
bioavailability or using correlated CL, V with correlation equal to 1. I've
seen data where these problems were equivalent (in terms of diagnostic
plots, OF, etc.). In fact, I recovered the solution by looking at the NONMEM
results, which gave me a consistent CL-V correlation that was close to 1. I
ended up using F1=1*EXP(ETA), but this does not disqualify the example. You
may argue that more data would be needed to create a valid model, but
there were no more data available. I hope you would not argue that a
correlation of 0.5 would be appropriate in this case?
Leonid
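[Editor's aside: Leonid's scenario is easy to mimic in a short simulation. A minimal Python/NumPy sketch, with typical values and variances that are arbitrary and chosen only for illustration: when all between-subject variability enters through bioavailability F, the apparent parameters CL/F and V/F are perfectly correlated on the log scale.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical scenario: the ONLY between-subject variability is in
# bioavailability F, which divides both apparent CL/F and V/F
eta_f = rng.normal(0.0, 0.3, n)       # ETA on F (arbitrary variance)
cl_pop, v_pop = 5.0, 50.0             # arbitrary typical values
cl_over_f = cl_pop / np.exp(eta_f)    # apparent clearance CL/F
v_over_f = v_pop / np.exp(eta_f)      # apparent volume V/F

# The log-scale correlation between CL/F and V/F is exactly 1
r = np.corrcoef(np.log(cl_over_f), np.log(v_over_f))[0, 1]
print(r)
```

Equivalently, one can drop the correlated block and put a single ETA on F, which is the F1=1*EXP(ETA) solution Leonid describes.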
From:"Serge Guzy"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 8 Oct 2002 09:07:46 -0700
I think Leonid summarizes very well the potential scenarios (which I also talked about) we could
encounter. I personally simulated the two scenarios he discussed: high correlation with a
small standard error, and a shallow OF profile with a correlation characterized by a high standard error.
I still suggest using the bootstrap rather than the Hessian approach. It may be time consuming
but can prevent wrong standard error estimates.
Serge Guzy
Head of Pharmacometrics
Xoma
From:"Serge Guzy"
Subject:RE: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 8 Oct 2002 09:29:53 -0700
Ken, you stated correctly what I said. The problem is that I create an artifact by simulating a
hypothetical situation and trying to retrieve the characteristics of the population I simulated. The
artifact is that I am using only mathematics and statistics and just a little bit of physiology
(PK parameters must be positive). The results of my simulations/fitting show that the high simulated
correlation occurs for all the bootstrap samples I generated. If I did not know there was correlation,
I would stand by what I got using the scientific tool I used, and that would be it. Here come the
experts in the biological field (I am not an expert) and tell me "Serge, this high correlation is not
realistic". I do not have an answer to that, but I am a little concerned about what to do. What should
I do indeed? Should fitting results always match what we know from physiology? To what extent
is physiology relevant when in fact we use a mathematical/statistical method to find a maximum of a
specific objective function? Should it be enough just to make sure the fitted parameters are positive?
If we fix the correlation to a specific value that corresponds to an objective function that is not as
good as the one I got, then why not change the values of all the parameters I believe are not realistic?
From:"Lewis B. Sheiner"
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date:Tue, 08 Oct 2002 12:11:23 -0700
Ken points out that my numbering was messed up ... Here is a corrected message.
LBS.
==================================
... Which is why you have to be Bayesian and incorporate an honest estimate not
only of what the 'science' says is the best guess, but of how sure that guess
is. And yes, if the science contradicts the data, you may well prefer to act on
the science, not the data, as presumably the science is also based on data, and
apparently enough of it to make the current data appear questionable. This is
why you cannot in principle fix parameters (or models): the strength of the
scientific knowledge (on which such choices are based) is thereby asserted to be
infinite: no amount of data can change your mind about a parameter that is
fixed. To Serge, I ask the following: What is so sacrosanct about the current
data/analysis that you are inclined to accept its point estimates even when they
are very uncertain, and fly in the face of valid past experience?
Now, all this is a bit too theoretical. When data appear to contradict accepted
science, we look for explanations. We do not usually decide we will accept one
or the other without a good reason to do so (the fact that these data are mine,
and those contradictory data are yours is NOT a good reason). Similarly, if we
fix a parameter in our analysis, and then, by examining residuals, etc.,
conclude that the data contradict that choice, we will change it. So we are
not all so far apart as it may seem.
However, let's get back to the real problem we have been discussing. It is the
case that the data are NOT definitive; indeed, not even suggestive about
parameters we consider 'important'. We have had the following suggestions for
what to do (until we can do another experiment that does address those
parameters):
1. Hope the problem is not really there. This is exemplified by Leonid's
remarks, which point out that with further careful investigation, we may find
that the data do indeed have something to say about the problem parameters--that
the fault was not that the problem was ill-posed--but that we were using
inadequate methods of analysis. Unfortunately, many ill-posed problems really
are fundamentally ill-posed and no analysis method will reveal what is not
there.
2. Use the estimate from the data regardless (exemplified by the choice of corr
= 0 or corr = 1 if the estimate is close to that value). This method is clearly
the easiest: It solves the problem without any additional work (such as
consulting experts, or doing additional analyses). However, it has two terrible
problems, as I have discussed: (i) It is well known that for certain simple
cases of extremely ill-posed problems (and presumably what I am about to say
generalizes to more complex cases) the actual value of the parameters estimated
depends exclusively on the realization of the noise in the particular data at
hand and NOT AT ALL on the 'true' parameter values (you can convince yourself of
this by considering the regression y = a.x1 + b.x2 + error, where, unbeknownst
to the analyst x1 = x2 -- And please don't answer me that of course the analyst
would notice that x1=x2; I sacrificed realism to make the concept clear). Not
only may the estimates be nonsense (which is harmful, we recall, not to the
current analysis, which is insensitive to the values of these parameters--which
insensitivity is causing the problem in the first place--but for extrapolation
to new conditions), but by fixing on these meaningless estimates, (ii) we are
asserting that not only are they sensible, they are also known perfectly! It
seems to me these two problems effectively rule out this choice, despite its
attractive simplicity and seeming objectivity. Again, and I stress this, the
method is ruled out only when prediction under new circumstances is the goal; it
is perfectly reasonable if only the current data are to be interpreted (but in
that case, any approach to the under-determined parameters is rational since
they should have no influence on any inferences). I have seen nothing in this
long thread of correspondence that suggests to me that either my analysis of
this choice is wrong, or that there is some advantage to it that I have
overlooked.
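[Editor's aside: Lewis's x1 = x2 regression can be demonstrated in a few lines. A Python/NumPy sketch with arbitrary values: the design matrix is rank-deficient, so only the sum a + b is determined by the data; how a least-squares solver splits that sum between a and b is an artifact of the solver's convention (here, the minimum-norm solution), not of the truth.]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1.copy()                         # unbeknownst to the analyst, x1 == x2
a_true, b_true = 2.0, 3.0              # arbitrary true coefficients
y = a_true * x1 + b_true * x2 + rng.normal(0.0, 0.1, n)

# The design matrix has rank 1, so (a, b) is not identifiable
X = np.column_stack([x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

print(coef)        # the minimum-norm convention splits a + b evenly
print(coef.sum())  # the sum a + b is well determined by the data
```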
3. Eliminate the ill-conditioning by fixing the under-determined parameters to
reasonable values based on external evidence (science). This necessarily
involves consulting the experts, and, indeed, trusting them. This approach
dominates #1, since it eliminates problem (i). Problem (ii) persists, however,
but at least the estimates have some justification, even if we are asserting
them too strongly.
4. Proceed as in #3, but elicit from the experts an estimate of spread
(uncertainty) as well as location, and correctly incorporate this into the
analysis. The end result is the best possible description of the current state
of knowledge: past experience (from the experts) is properly balanced against
current data to yield a rational synthesis. Other than technical difficulties
(which can be formidable) this method is the most satisfying. The major
drawback with it has not yet been mentioned: it requires a statement of the most
complete possible model from the outset. However, science is all about
modifying our view of the model structure--not only of its parameters--as we
learn more. Without incredible contortions, the Bayesian approach cannot do
this, and even when it can be twisted into doing so, the technical difficulties
quickly become insurmountable, except for the simplest of problems.
So, we know what we should do if only it were possible (#4), and we know what we
should never do (#2). The art, as I see it, is a judicious blend of #3 and #4
-- that is, limit oneself to tractable and moderately sized models (equivalent
to fixing the parameters of larger models to boundary values yielding
scientifically reasonable approximate models, suitable to the scale of the
available data and the uses to which the model is to be put), and use
informative but non-degenerate priors for all remaining free parameters of those
models. This can be seen as using #3 'globally' and Bayesian methods (#4)
'locally'.
LBS.
--
_/ _/ _/_/ _/_/_/ _/_/_/ Lewis B Sheiner, MD (lewis@c255.ucsf.edu)
_/ _/ _/ _/ _/ Professor: Lab. Med., Biophmct. Sci.
_/ _/ _/ _/_/_/ _/_/ Mail: Box 0626, UCSF, SF,CA,94143
_/ _/ _/ _/ _/ Courier: Rm C255, 521 Parnassus,SF,CA,94122
_/_/ _/_/ _/_/_/ _/ 415-476-1965 (v), 415-476-2796 (fax)
From:Nick Holford
Subject:Re: [NMusers] OMEGA HAS A NONZERO BLOCK
Date: Wed, 09 Oct 2002 08:39:14 +1300
Leonid,
If you believe that bioavailability is the ONLY source of variability for both CL and V then
you will of course get a correlation of 1. But as I tried to explain previously we know from
biological science that there are several identified fixed effects with different relationships
(e.g. weight, age) for CL and V. Plus the unidentified fixed effects, which we
treat as random effects, guarantee that the correlation cannot be one.
So while it is true that the correlation of the component of ETA attributable to F can be
expected to be 1 for CL and V, this is not the only source of variability in these parameters,
and thus the overall correlation must be less than 1.
Nick
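[Editor's aside: Nick's argument can also be put numerically. A Python/NumPy sketch with hypothetical variance components: as soon as CL and V each carry an independent random component in addition to the shared one, the overall correlation falls below 1.]

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

# Hypothetical decomposition: a shared component (e.g. bioavailability)
# plus independent components specific to CL and to V
eta_shared = rng.normal(0.0, 0.3, n)
eta_cl = eta_shared + rng.normal(0.0, 0.2, n)
eta_v = eta_shared + rng.normal(0.0, 0.2, n)

# Theoretical correlation = 0.09 / (0.09 + 0.04), about 0.69, well below 1
r = np.corrcoef(eta_cl, eta_v)[0, 1]
print(r)
```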
___________________________________