conditional inference; why completeness matters

Nov 16, 2012 01:00

Earlier this week, another piece of statistical theory fell into place for me, this time inspired by reading Cox & Hinkley …

Comments 1

random_walker November 16 2012, 22:35:35 UTC
heuristically at least you have the law of total variance (the "partial variance" decomposition). letting I be the observed Fisher information for the parameter of interest (\beta), and letting \eta be the ancillary parameter:

var(I)=E[var(I)||\hat\eta]+var(E[I||\hat\eta])

or E[var(I)||\hat\eta]=var(I)-var(E[I||\hat\eta])

that is to say, the greater the variance of the conditional expectation of the Fisher information (the term subtracted on the rhs), the lower the expected conditional variance of the Fisher information, which means better second-order properties for your estimator of \beta.
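as a sanity check on that decomposition, here's a quick numerical illustration. the setup is a toy one of my own (a discrete conditioning statistic standing in for \hat\eta, and a made-up conditional distribution for I); the point is only that var(I) = E[var(I | \hat\eta)] + var(E[I | \hat\eta]) holds empirically:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy conditioning statistic (stand-in for \hat\eta): a few discrete values.
eta_hat = rng.integers(0, 4, size=n)

# Toy "observed information" I whose conditional mean and conditional
# spread both depend on eta_hat (purely illustrative numbers).
I = rng.normal(loc=10 + 2 * eta_hat, scale=1 + 0.5 * eta_hat)

# Empirical pieces of the law of total variance.
levels = range(4)
weights = np.array([(eta_hat == k).mean() for k in levels])
cond_means = np.array([I[eta_hat == k].mean() for k in levels])
cond_vars = np.array([I[eta_hat == k].var() for k in levels])

expected_cond_var = (weights * cond_vars).sum()          # E[var(I | eta_hat)]
overall_mean = (weights * cond_means).sum()
var_cond_mean = (weights * (cond_means - overall_mean) ** 2).sum()  # var(E[I | eta_hat])

total = I.var()
print(total, expected_cond_var + var_cond_mean)  # the two sides agree
```

with empirical group weights and population-style variances (numpy's default ddof=0), the decomposition is an algebraic identity, so the two printed numbers match to floating-point precision, not just approximately.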

in re your last paragraph: conditioning on the sufficient statistic will, as we've discussed, make inference impossible.
