lagged absorption and empirical input function
Hi,
I'm trying to implement an absorption lag-time in a model that is described by
differential equations and uses an empirical input function.
My initial idea was to define MTIME(1) as the lag-time and to use MPAST(1) to
switch the input on at that point, as below.
DADT(1)=(DOSE/V)*INPUT*MPAST(1)-K*A(1)
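For reference, the lag-time itself is set as a model event time in $PK, roughly
as in the sketch below (LAGT, V and K are just placeholder names for my
parameters, and INPUT is the empirical input function computed elsewhere):

$PK
  LAGT     = THETA(1)   ; absorption lag-time
  V        = THETA(2)   ; volume
  K        = THETA(3)   ; elimination rate constant
  ; model event time at the lag-time; MPAST(1) should be
  ; 0 before MTIME(1) and 1 afterwards, switching the input on
  MTIME(1) = LAGT

with the DADT(1) equation above in $DES.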
However, I end up with no input into the system at all, resulting in simulated
DV values of 0 ± residual error. The issue seems to be related to the use of
MPAST with the input function, but I can't see how.
Would anyone be kind enough to point me in the right direction?
Thanks,
Pierre-Olivier