Abstract
Existing estimators of the central mean space are known to perform unevenly across different types of link functions. By combining the strengths of ordinary least squares and principal Hessian directions, the authors propose a new hybrid estimator that successfully recovers the central mean space for a wide range of link functions. Based on the new hybrid estimator, the authors further study the order determination procedure and the marginal coordinate test. The superior performance of the hybrid estimator over existing methods is demonstrated in extensive simulation studies.
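As a rough illustration of the two building blocks named above (not the authors' actual estimator), the ordinary least squares (OLS) slope recovers one direction of the central mean space for monotone links, while principal Hessian directions (pHd) target directions that OLS misses, such as symmetric U-shaped links. The sketch below combines the two candidate sets by a simple singular value decomposition; the function names and the combination rule are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def ols_direction(X, y):
    # OLS direction: Sigma^{-1} Cov(X, y). Under linear-conditional-mean
    # conditions this lies in the central mean space, but it degenerates
    # for symmetric links where Cov(X, y) is near zero.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    sigma = np.cov(X, rowvar=False)
    return np.linalg.solve(sigma, Xc.T @ yc / len(y))

def phd_directions(X, y, d):
    # pHd candidate matrix: Sigma^{-1} E[(y - ybar)(X - mu)(X - mu)^T].
    # Its leading eigenvectors pick up curvature directions (e.g.
    # quadratic links) that OLS cannot see.
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    sigma = np.cov(X, rowvar=False)
    M = (Xc * yc[:, None]).T @ Xc / len(y)
    vals, vecs = np.linalg.eig(np.linalg.solve(sigma, M))
    order = np.argsort(-np.abs(vals))       # rank by |eigenvalue|
    return np.real(vecs[:, order[:d]])

def hybrid_basis(X, y, d):
    # Naive hybrid: stack the OLS and pHd candidates and keep the d
    # leading left singular vectors as a basis estimate.
    B = np.column_stack([ols_direction(X, y), phd_directions(X, y, d)])
    U, _, _ = np.linalg.svd(B, full_matrices=False)
    return U[:, :d]
```

For example, with a purely quadratic link the OLS direction is essentially noise, yet the hybrid basis still recovers the true direction because the pHd component carries it.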
This paper was recommended for publication by Editor SHAO Jun.
Xia, Q., Dong, Y. On a new hybrid estimator for the central mean space. J Syst Sci Complex 30, 111–121 (2017). https://doi.org/10.1007/s11424-017-6227-0