Minimum value of objective function..

4 messages 3 people Latest: Oct 05, 2001

Minimum value of objective function..

From: "=?big5?B?YmVhdXR5ZW15?=" <beautyemy@sinamail.com>
Subject: Minimum value of objective function..
Date: Fri, 5 Oct 2001 14:47:05 +0800

Dear NONMEM users:

As a beginner with NONMEM, I have a stupid question. Although minimization was successful, I got a minimum value of the objective function around 1420. Is this normal, or is there still something wrong with my model? Thanks for any suggestions.

Re: Minimum value of objective function..

From: Scott VanWart <scott.vanwart@cognigencorp.com>
Subject: Re: Minimum value of objective function..
Date: Fri, 05 Oct 2001 10:03:50 -0400

Dear Beautyemy,

The minimum value of the objective function can be thought of as the overall error in the prediction of drug concentrations based upon the observed data. It is therefore logical that the more data you have, the larger the objective function value is likely to be. Comparing objective function values usually comes into play when you are evaluating hierarchical models to see which model produced the smallest sum of weighted squared residuals. However, you should pay more attention to other criteria that tell you how representative your model is of the system you are trying to characterize. Some possible goodness-of-fit criteria to look at:

1. Diagnostic plots - predicted vs. observed Cp will give you an overall sense of how well the model is doing. Weighted residuals vs. predicted Cp, residuals vs. time, and weighted residuals vs. time will also help diagnose potential errors related to the model selected and the variance structure.

2. Check the values estimated for your parameters as well as their precision (standard errors). Do the results agree with historical data? If the precisions are poor, your model may be overparameterized.

3. Check the correlation matrix to make sure parameters are not highly correlated, which would indicate identifiability issues with your model.

Hope this helps!

Scott Van Wart
-------------------------------
Scott Van Wart
Population PK/PD Scientist
Cognigen Corporation
395 Youngs Road
Williamsville, NY 14221
Tel. (716) 633-3463 Ext 241
Fax. (716) 633-7404
email: scott.vanwart@cognigencorp.com
Web: www.cognigencorp.com
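Points 2 and 3 above can be sketched numerically. The snippet below is purely illustrative: the covariance matrix and the parameter estimates (CL, V) are made-up values, not output from the thread's model, and it simply shows how standard errors, %RSE, and the estimate correlation are read off a variance-covariance matrix.

```python
import math

# Toy variance-covariance matrix for two parameter estimates (CL, V).
# These numbers are hypothetical, chosen only to demonstrate the checks.
cov = [[0.04, 0.018],
       [0.018, 0.09]]

# Standard errors are the square roots of the diagonal elements.
se = [math.sqrt(cov[i][i]) for i in range(2)]

# Correlation between the two estimates (point 3 in the message above);
# values near +/-1 would suggest identifiability problems.
corr = cov[0][1] / (se[0] * se[1])

# Relative standard error (%RSE) as a precision check (point 2),
# assuming hypothetical point estimates CL = 5.0 and V = 30.0.
est = [5.0, 30.0]
rse = [100 * se[i] / est[i] for i in range(2)]

print(se, corr, rse)
```

Low %RSE and moderate correlations, together with clean diagnostic plots, are what Scott suggests weighing more heavily than the raw objective function value.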

RE: Minimum value of objective function..

From: "Bachman, William" <bachmanw@globomax.com>
Subject: RE: Minimum value of objective function..
Date: Fri, 5 Oct 2001 12:09:37 -0400

In answer to your question, the numerical value of the objective function has no meaning by itself. The value will depend, among other things, on the amount of data you have. The number can be large, small, or even negative. By itself, it says nothing about your model. It is related to the sum of squared errors (more complicated than this in actuality). It is important only in a relative sense, relative to the objective function value of a competing model.

One typically compares two models' objective functions using some selected criterion. For non-nested models, one could use the Akaike Information Criterion (AIC) (the objective function plus two times the number of parameters) to compare the models. The lower AIC is "better". For nested models, you can use the likelihood ratio test. With the likelihood ratio test, you assume the difference in objective functions between the two models is chi-square distributed. You then decide which model is "better" based on a preselected significance level, the degrees of freedom (the difference in the total number of parameters between the models), and the critical chi-square value (for the chosen significance level and degrees of freedom). If a nested model with fewer parameters has an objective function value lower by an amount larger than the critical chi-square value than the "larger" model, then it is the "better" model.

Do not use the difference in objective function value as the sole criterion of goodness-of-fit. It is possible to have a lower objective function value and a worse fit. Also consider the change in magnitude of the variance parameters and the diagnostic plots.

William J. Bachman, Ph.D.
GloboMax LLC
7250 Parkway Dr., Suite 430
Hanover, MD 21076
Voice (410) 782-2212
FAX (410) 712-0737
bachmanw@globomax.com
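As an illustration of the comparisons Bachman describes, here is a short Python sketch. All the numbers are invented for the example (the OFVs, parameter counts, and model labels do not come from the thread); the chi-square critical value 5.991 is the standard tabulated value for alpha = 0.05 with 2 degrees of freedom.

```python
# Hypothetical objective function values (OFVs) from two nested runs.
ofv_base = 1420.0   # e.g. a simpler model with 5 estimated parameters
ofv_full = 1412.0   # e.g. a richer nested model with 7 estimated parameters
p_base, p_full = 5, 7

# Akaike Information Criterion: OFV + 2 * (number of parameters).
# Usable for non-nested models; lower is "better".
aic_base = ofv_base + 2 * p_base   # 1430.0
aic_full = ofv_full + 2 * p_full   # 1426.0

# Likelihood ratio test for nested models: the drop in OFV is compared
# with the chi-square critical value for df = difference in parameters.
delta_ofv = ofv_base - ofv_full    # 8.0
df = p_full - p_base               # 2
chi2_crit_005 = 5.991              # chi-square critical value, alpha = 0.05, df = 2

full_is_better = delta_ofv > chi2_crit_005
print(aic_base, aic_full, full_is_better)
```

Here the richer model "wins" on both criteria, but, as the message stresses, this is never the sole basis for model selection.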
From: "Bachman, William" <bachmanw@globomax.com>
Subject: RE: Minimum value of objective function.. Clarification
Date: Fri, 5 Oct 2001 12:18:01 -0400

Although this is true:

If a nested model with fewer parameters has an objective function value lower by an amount larger than the critical chi-square value than the "larger" model, then it is the "better" model.

the more typical case is:

If, within a set of nested models, a model with MORE parameters has an objective function value lower by an amount larger than the critical chi-square value than the "smaller" model, then it is the "better" model.

William J. Bachman, Ph.D.
GloboMax LLC
7250 Parkway Dr., Suite 430
Hanover, MD 21076
Voice (410) 782-2212
FAX (410) 712-0737
bachmanw@globomax.com