On improved predictive density estimation with parametric constraints

Dominique Fourdrinier, Éric Marchand, Ali Righi, William E. Strawderman
2011 Electronic Journal of Statistics  
We consider the problem of predictive density estimation for normal models under Kullback-Leibler (KL) loss when the parameter space is constrained to a convex set. More specifically, we assume that X ∼ N_p(μ, v_x I) is observed and that we wish to estimate the density of Y ∼ N_p(μ, v_y I) under KL loss when μ is restricted to a convex set C ⊂ R^p. We show that the best unrestricted invariant predictive density estimator p̂_U is dominated by the Bayes estimator p̂_{π_C} associated with the uniform prior π_C on C. We also study so-called plug-in estimators, giving conditions under which domination of one estimator of the mean vector μ over another under the usual quadratic loss translates into a domination result for the corresponding plug-in density estimators under KL loss. Risk comparisons and domination results are also given between plug-in estimators and Bayes predictive density estimators. Additionally, minimaxity and domination results are given for the cases where: (i) C is a cone, and (ii) C is a ball.
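For readers unfamiliar with the setting, the following is a brief sketch of the quantities named in the abstract. The displayed forms follow the standard predictive density estimation literature (e.g., the Aitchison-type benchmark and Bayes predictive densities) rather than the paper's own text, so notation here is an assumption.

```latex
% Model (as in the abstract): X ~ N_p(mu, v_x I) is observed;
% the density of Y ~ N_p(mu, v_y I) is to be estimated.

% Kullback-Leibler loss of a predictive density \hat{p}(\cdot \mid x):
\[
  L_{\mathrm{KL}}\bigl(\mu, \hat{p}(\cdot \mid x)\bigr)
  = \int_{\mathbb{R}^p} p(y \mid \mu, v_y)\,
    \log \frac{p(y \mid \mu, v_y)}{\hat{p}(y \mid x)} \, dy .
\]

% The best unrestricted invariant predictive density (the benchmark \hat{p}_U)
% is, in the standard formulation, the N_p(x, (v_x + v_y) I) density:
\[
  \hat{p}_U(y \mid x)
  = \bigl\{2\pi (v_x + v_y)\bigr\}^{-p/2}
    \exp\!\left( - \frac{\|y - x\|^2}{2 (v_x + v_y)} \right).
\]

% The Bayes predictive density under a prior \pi (here \pi_C, uniform on C)
% averages the sampling density of Y over the posterior of \mu given x:
\[
  \hat{p}_\pi(y \mid x)
  = \int_{\mathbb{R}^p} p(y \mid \mu, v_y)\, \pi(\mu \mid x) \, d\mu .
\]
```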
doi:10.1214/11-EJS603