
On a new hybrid estimator for the central mean space

Published in: Journal of Systems Science and Complexity

Abstract

Existing estimators of the central mean space are known to perform unevenly across different types of link functions. By combining the strengths of ordinary least squares (OLS) and principal Hessian directions (pHd), the authors propose a new hybrid estimator that successfully recovers the central mean space for a wide range of link functions. Based on the new hybrid estimator, the authors further study the order determination procedure and the marginal coordinate test. The superior performance of the hybrid estimator over existing methods is demonstrated in extensive simulation studies.
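The paper's exact hybrid construction is behind the paywall and is not reproduced here. As a rough illustration of the two ingredients the abstract names, the following sketch (the function names `ols_direction` and `phd_directions`, and the simulated links, are our own, not the authors') shows why a combination is attractive: the OLS slope recovers the central mean space direction under a monotone link, while pHd recovers it under a symmetric link where OLS degenerates.

```python
import numpy as np

def ols_direction(X, y):
    # OLS ingredient: Sigma^{-1} Cov(X, y) lies in the central mean
    # space under a linearity condition on the predictors.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    Sigma = Xc.T @ Xc / len(y)
    return np.linalg.solve(Sigma, Xc.T @ yc / len(y))

def phd_directions(X, y, d):
    # pHd ingredient: leading eigenvectors of
    # Sigma^{-1} E[(y - ybar)(X - mu)(X - mu)^T].
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    n = len(y)
    Sigma = Xc.T @ Xc / n
    H = (Xc * yc[:, None]).T @ Xc / n  # y-weighted second-moment matrix
    w, V = np.linalg.eig(np.linalg.solve(Sigma, H))
    order = np.argsort(-np.abs(w))    # rank by absolute eigenvalue
    return np.real(V[:, order[:d]])

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.standard_normal((n, p))
beta = np.zeros(p); beta[0] = 1.0     # true direction is e1

# Monotone (linear) link: OLS recovers the direction; pHd would not.
y_lin = X @ beta + 0.1 * rng.standard_normal(n)
b = ols_direction(X, y_lin)
b /= np.linalg.norm(b)

# Symmetric (quadratic) link: pHd recovers the direction; OLS would not.
y_sym = (X @ beta) ** 2 + 0.1 * rng.standard_normal(n)
v = phd_directions(X, y_sym, 1)[:, 0]
v /= np.linalg.norm(v)
```

Each estimator alone fails on one of the two links, which is the "uneven performance" the abstract refers to; a hybrid that pools both candidate matrices can cover both regimes.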



Author information

Correspondence to Qi Xia.

Additional information

This paper was recommended for publication by Editor SHAO Jun.

About this article


Cite this article

Xia, Q., Dong, Y. On a new hybrid estimator for the central mean space. J Syst Sci Complex 30, 111–121 (2017). https://doi.org/10.1007/s11424-017-6227-0
