Abstract
In many regression analyses, interest centers on the mean of the response given the predictors rather than on the full conditional distribution. This paper is concerned with dimension reduction of the predictors for the mean function of the response conditional on the predictors. The authors introduce the notion of a partial dynamic central mean dimension reduction subspace; unlike the central mean dimension reduction subspace, it varies over the domain of the predictors, and its structural dimension may differ from point to point. The authors study the properties of the partial dynamic central mean dimension reduction subspace and develop estimation methods, called dynamic ordinary least squares and dynamic principal Hessian directions, which extend ordinary least squares and principal Hessian directions based on the central mean dimension reduction subspace. Kernel estimators for dynamic ordinary least squares and dynamic principal Hessian directions are employed, and large-sample properties of the estimators are established under regularity conditions. Simulations and a real data analysis demonstrate that the methods are effective.
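The classical estimators that the paper's dynamic versions extend can be illustrated concretely. A minimal NumPy sketch, not the authors' dynamic method: the ordinary least squares direction for the central mean subspace is Σx⁻¹Cov(X, Y), and principal Hessian directions (Li, 1992) are the leading eigenvectors of Σx⁻¹E[(Y − Ȳ)(X − μ)(X − μ)ᵀ]. All function names here are illustrative choices.

```python
import numpy as np

def ols_direction(X, y):
    # Classical OLS estimate of one central-mean-subspace direction:
    # beta = Sigma_x^{-1} Cov(X, y), valid under a linearity condition
    # on the predictor distribution.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    sigma = Xc.T @ Xc / len(y)      # sample covariance of X
    cov_xy = Xc.T @ yc / len(y)     # sample Cov(X, y)
    return np.linalg.solve(sigma, cov_xy)

def phd_directions(X, y, d):
    # Classical principal Hessian directions: eigenvectors of
    # Sigma_x^{-1} E[(y - ybar)(X - mu)(X - mu)^T] with the d
    # largest-magnitude eigenvalues.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    sigma = Xc.T @ Xc / len(y)
    m = (Xc * yc[:, None]).T @ Xc / len(y)   # response-weighted second moment
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(sigma, m))
    order = np.argsort(-np.abs(eigvals))
    return np.real(eigvecs[:, order[:d]])

# Toy check: y depends on X only through b^T X.
rng = np.random.default_rng(0)
b = np.array([1.0, 2.0, 0.0, 0.0, 0.0])

X = rng.standard_normal((2000, 5))
y = X @ b + 0.1 * rng.standard_normal(2000)
beta = ols_direction(X, y)
beta /= np.linalg.norm(beta)        # should align with b / ||b||

X2 = rng.standard_normal((4000, 5))
y2 = (X2 @ b) ** 2                  # symmetric link: OLS fails, PHD works
v = phd_directions(X2, y2, 1)[:, 0] # should also align with b / ||b||
```

The dynamic versions studied in the paper replace these single, global moment estimates with kernel-weighted local ones, so the recovered subspace (and its dimension) may change with the evaluation point.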
Additional information
This research was supported by the Natural Science Foundation of Fujian Province of China under Grant No. 2018J01662, and High-Level Cultivation Project of Fuqing Branch of Fujian Normal University under Grant No. KY2018S02.
This paper was recommended for publication by Editor SUN Liuquan.
Cite this article
Gan, S., Yu, Z. Partial Dynamic Dimension Reduction for Conditional Mean in Regression. J Syst Sci Complex 33, 1585–1601 (2020). https://doi.org/10.1007/s11424-020-8329-3