
Partial Dynamic Dimension Reduction for Conditional Mean in Regression

Published in: Journal of Systems Science and Complexity

Abstract

In many regression analyses, interest centers on the conditional mean of the response given the predictors rather than on the full conditional distribution. This paper concerns dimension reduction of the predictors with respect to the mean function of the response conditional on the predictors. The authors introduce the notion of the partial dynamic central mean dimension reduction subspace; unlike the central mean dimension reduction subspace, it varies over the domain of the predictors, and its structural dimension may differ from point to point. The authors study the properties of the partial dynamic central mean dimension reduction subspace and develop estimation methods called dynamic ordinary least squares and dynamic principal Hessian directions, which extend ordinary least squares and principal Hessian directions based on the central mean dimension reduction subspace. Kernel estimation methods for dynamic ordinary least squares and dynamic principal Hessian directions are employed, and large-sample properties of the estimators are established under regularity conditions. Simulations and a real data analysis demonstrate that the methods are effective.
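The classical, non-dynamic versions of the two base estimators admit a compact sketch. Below is a minimal NumPy illustration, not from the paper, showing ordinary least squares and principal Hessian directions recovering the central mean subspace in a single-index model with Gaussian predictors; the model, sample size, and true direction `b` are assumptions made for the example. Roughly speaking, the dynamic versions proposed here localize such moment calculations with kernel weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-index model: E[y | x] depends on x only through b'x, so the
# central mean dimension reduction subspace is span{b}.  The direction
# b, the link function, and the sample size are illustrative choices.
n, p = 2000, 5
b = np.array([1.0, 2.0, 0.0, 0.0, -1.0])
b /= np.linalg.norm(b)
x = rng.standard_normal((n, p))
y = x @ b + (x @ b) ** 2 + 0.1 * rng.standard_normal(n)

xc = x - x.mean(axis=0)          # centered predictors
yc = y - y.mean()                # centered response
sigma = xc.T @ xc / n            # sample covariance of x

# Ordinary least squares direction: Sigma_x^{-1} Cov(x, y) lies in the
# central mean subspace when the predictors are elliptically symmetric.
b_ols = np.linalg.solve(sigma, xc.T @ yc / n)
b_ols /= np.linalg.norm(b_ols)

# Principal Hessian directions: leading eigenvectors of
# Sigma_x^{-1} E[(y - Ey)(x - Ex)(x - Ex)'] span directions of the
# central mean subspace for Gaussian predictors (via Stein's lemma).
m = (xc * yc[:, None]).T @ xc / n
eigvals, eigvecs = np.linalg.eig(np.linalg.solve(sigma, m))
b_phd = np.real(eigvecs[:, np.argmax(np.abs(eigvals))])
b_phd /= np.linalg.norm(b_phd)

# Both estimates should be nearly collinear with the true direction b.
print(abs(b_ols @ b), abs(b_phd @ b))
```

In this sketch both estimators target a fixed, global subspace; the paper's point is that the relevant subspace, and even its dimension, may change across the predictor domain, which motivates the "dynamic" kernel-weighted extensions.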



Author information

Corresponding author

Correspondence to Zhou Yu.

Additional information

This research was supported by the Natural Science Foundation of Fujian Province of China under Grant No. 2018J01662, and the High-Level Cultivation Project of Fuqing Branch of Fujian Normal University under Grant No. KY2018S02.

This paper was recommended for publication by Editor SUN Liuquan.


About this article


Cite this article

Gan, S., Yu, Z. Partial Dynamic Dimension Reduction for Conditional Mean in Regression. J Syst Sci Complex 33, 1585–1601 (2020). https://doi.org/10.1007/s11424-020-8329-3
