Abstract
Let X be an N_p(0, Σ) random vector. Suppose that, in addition to n observations on X, m observations on the first q (q < p) coordinates are available. For this setup, Eaton (1970) gave a minimax estimator of Σ that is better than the MLE. In this paper we obtain a class of constant-risk minimax estimators, of which Eaton's estimator is a member, and hence estimators better than any member of this class. Similar results are also derived for the estimation of Σ⁻¹. The loss functions considered are those of Selliah (1964) and James and Stein (1961) for the estimation of Σ, and an analogue of Stein's loss function for the estimation of Σ⁻¹.
