Abstract
The Akaike information criterion, AIC, and Mallows' Cp statistic have been proposed for selecting a smaller number of regressors in multivariate regression models with a fully unknown covariance matrix. Both of these criteria, however, rest on the implicit assumption that the sample size is substantially larger than the dimension of the covariance matrix: to obtain a stable estimator of the covariance matrix, its dimension must be much smaller than the sample size. When the dimension is close to the sample size, it becomes necessary to use ridge-type estimators for the covariance matrix. In this article, we use a ridge-type estimator for the covariance matrix and derive a modified AIC and a modified Cp statistic under an asymptotic framework in which both the sample size and the dimension tend to infinity. It is numerically shown that these modified procedures perform very well, in the sense of selecting the true model, in high-dimensional cases.
2000 Mathematics Subject Classification:
Acknowledgments
We would like to thank the Associate Editor and three reviewers for their many valuable comments and helpful suggestions, which led to an improved version of this article. The research of the first author was supported in part by Grants-in-Aid for Scientific Research (19200020 and 21540114), Japan. The research of the second author was supported by NSERC.