Heuristically at least, you have the law of total variance (the variance-decomposition formula). Letting I be the observed Fisher information for the parameter of interest \beta, and \hat\eta an estimate of the ancillary parameter \eta:
var(I) = E[var(I | \hat\eta)] + var(E[I | \hat\eta])

or, rearranged,

E[var(I | \hat\eta)] = var(I) - var(E[I | \hat\eta])
That is to say, the greater the variance of the conditional Fisher information, var(E[I | \hat\eta]) (the subtracted term on the RHS), the smaller the expected conditional variance E[var(I | \hat\eta)], holding var(I) fixed, which means better second-order properties for your estimator of \beta.
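A quick Monte Carlo sanity check of the decomposition is below. The particular distributions for \hat\eta, and for I given \hat\eta, are made up purely for illustration; the identity itself is what is being checked.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical toy setup: eta_hat plays the role of the estimated
# ancillary parameter, and I that of the observed Fisher information,
# with a distribution that depends on eta_hat. Both laws are invented
# for illustration only.
eta_hat = rng.normal(loc=1.0, scale=0.3, size=n)
I = rng.gamma(shape=5.0, scale=np.exp(eta_hat) / 5.0)  # E[I | eta_hat] = exp(eta_hat)

# Approximate the conditional moments by binning on eta_hat.
edges = np.quantile(eta_hat, np.linspace(0.0, 1.0, 101))
idx = np.clip(np.digitize(eta_hat, edges) - 1, 0, 99)
w = np.array([(idx == b).mean() for b in range(100)])   # bin weights
m = np.array([I[idx == b].mean() for b in range(100)])  # ~ E[I | eta_hat]
v = np.array([I[idx == b].var() for b in range(100)])   # ~ var(I | eta_hat)

e_cond_var = np.sum(w * v)                              # E[var(I | eta_hat)]
var_cond_mean = np.sum(w * m**2) - np.sum(w * m) ** 2   # var(E[I | eta_hat])

print(f"var(I)                    = {I.var():.4f}")
print(f"E[var(I|.)] + var(E[I|.]) = {e_cond_var + var_cond_mean:.4f}")
```

The two printed numbers agree up to binning and Monte Carlo error, and the relative size of the two terms shows how much of var(I) is "explained" by \hat\eta.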
Re: your last paragraph: conditioning on the sufficient statistic would, as we've discussed, make inference impossible; by definition of sufficiency, the conditional distribution of the data given the sufficient statistic is free of the parameter, so there is nothing left to estimate.
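A small numeric illustration of that point, in a made-up Bernoulli example: given the sufficient statistic T = \sum X_i, the probability of any particular sample is the same for every value of p, so the conditional likelihood is flat and carries no information.

```python
from math import comb

# For X_1, ..., X_n iid Bernoulli(p), T = sum(X_i) is sufficient.
# The conditional law of the sample given T = t is uniform over the
# C(n, t) arrangements, independent of p. (n and t are arbitrary here.)
n, t = 5, 2
for p in (0.2, 0.5, 0.8):
    joint = p**t * (1 - p) ** (n - t)  # P(X = x) for a fixed x with t ones
    marginal = comb(n, t) * joint      # P(T = t)
    print(f"p={p}: P(x | T={t}) = {joint / marginal:.4f}")  # 1/C(5,2) = 0.1 every time
```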