Re: Negative objective functions
From: James <J.G.Wright@ncl.ac.uk>
Subject: Re: Negative objective functions
Date: Sat, 31 Oct 1998 16:23:47 +0000
Dear Professor Sheiner,
My definition of likelihood as a probability is taken from Kendall's Advanced Theory of Statistics, though I appreciate that this distinction may be blurred, or even contradicted, in other texts. I agree that including an additive constant in the objective function makes no difference to maximum likelihood estimation. However, if this constant is data-dependent (i.e. it changes as the number of observations changes), how does that affect the asymptotic properties of NONMEM estimates? Indeed, is it even sensible to talk about asymptotics when the estimator may not be consistent? Is this constant necessary for any reason?
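To make the additive-constant point concrete, here is a minimal sketch (my own illustration, assuming a one-parameter normal mean model with known unit variance and hypothetical data; this is not NONMEM code): adding any constant to -2 log likelihood shifts the curve but not the location of its minimum.

```python
import math

data = [1.2, 0.8, 1.5, 1.1]  # hypothetical observations

def neg2_loglik(mu, constant=0.0):
    # -2 log L for N(mu, 1): residual sum of squares plus n*log(2*pi),
    # plus an arbitrary additive constant.
    n = len(data)
    ss = sum((y - mu) ** 2 for y in data)
    return ss + n * math.log(2 * math.pi) + constant

# Grid search for the minimizer with and without the added constant.
grid = [i / 1000 for i in range(0, 3000)]
mle_plain = min(grid, key=lambda mu: neg2_loglik(mu))
mle_shift = min(grid, key=lambda mu: neg2_loglik(mu, constant=42.0))

assert mle_plain == mle_shift  # same argmin; only the objective value differs
print(mle_plain)               # the sample mean, 1.15
```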
My interest stems from my attempts to produce leverage diagnostics for individuals in populations analysed using NONMEM.
James
At 10:17 AM 10/30/98 -0800, you wrote:
>PROPORTIONAL to prob, not = prob.
>
>James wrote:
>>
>> Dear Professor Sheiner,
>>
>> This agrees with my understanding of the situation. However, my
>> understanding of likelihood is that it is a probability, and therefore must
>> be between 0 and 1.
>>
>> James
>>
>> At 08:43 PM 10/29/98 -0800, you wrote:
>> >The objective function is not a sum of squares; it is
>> >-2 times the log of the likelihood. The likelihood, in
>> >simple normal problems, involves a sum of squares. If the
>> >likelihood is >1 then -2 log likelihood will be negative.
>> >
>> >The likelihood in NONMEM is usually more complicated
>> >than a simple sum of squares, but it may still be
>> >>1 and hence the obj fn be negative. The absolute
>> >value of the obj fn is meaningless, as a likelihood
>> >is only defined up to an arbitrary proportionality
>> >constant. Only differences
>> >between obj functions of nested models are meaningful.
>> >
>> >LBS.
>> >
>> >
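As a concrete illustration of the quoted point (my own sketch with hypothetical numbers, not NONMEM output): a normal density can exceed 1 when the residual standard deviation is small, so -2 log likelihood can be negative for a perfectly ordinary model.

```python
import math

def normal_pdf(y, mu, sigma):
    # Density of N(mu, sigma^2) evaluated at y.
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

lik = normal_pdf(0.0, 0.0, 0.1)  # density at the mean, sigma = 0.1
neg2ll = -2 * math.log(lik)

print(lik)     # about 3.99: a "likelihood" greater than 1
print(neg2ll)  # negative, because lik > 1
```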