Abstract
We consider the treatment comparison problem in a general high-dimensional regression setting. In this article, we propose a nonparametric estimation approach based on partial sliced inverse regression (SIR) (Chiaromonte et al. in Ann Stat 30:475–497, 2002) and an extension of partial inverse mean matching (Carroll and Li in Stat Sin 5:667–688, 1995), neither of which requires a prespecified parametric model. A sparse estimation strategy is incorporated into our approach to improve interpretability through variable selection. Several simulation examples compare our method with SIR and principal component analysis. Illustrative applications to two real datasets are also presented.
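For readers unfamiliar with the building block used here, the following is a minimal sketch of basic (non-partial) sliced inverse regression as introduced in Li (1991): standardize the predictors, slice the observations by the response, and extract the leading eigenvectors of the weighted covariance of the within-slice means. The function name and its arguments are illustrative, not the authors' implementation; the partial SIR of Chiaromonte et al. (2002) additionally conditions on the categorical treatment indicator.

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Basic sliced inverse regression (Li, 1991).

    Estimates effective dimension-reduction directions from the
    within-slice means of the standardized predictors.
    """
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) @ Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt
    # Slice observations by the ordered response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the within-slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the X scale
    w, v = np.linalg.eigh(M)
    dirs = Sigma_inv_sqrt @ v[:, ::-1][:, :n_dirs]
    return dirs / np.linalg.norm(dirs, axis=0)
```

On a single-index model with a monotone link, the leading estimated direction should be nearly proportional to the true index vector.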
References
Bura E, Cook RD (2001) Estimating the structural dimension of regressions via parametric inverse regression. J R Stat Soc Ser B 63:393–410
Carroll RJ, Li KC (1995) Binary regressors in dimension reduction model: A new look at treatment comparisons. Stat Sin 5:667–688
Chen CH, Li KC (1998) Can SIR be as popular as multiple linear regression? Stat Sin 8:289–316
Chiaromonte F, Cook RD, Li B (2002) Sufficient dimension reduction in regressions with categorical predictors. Ann Stat 30:475–497
Cook RD (1998) Regression graphics. Wiley, New York
Cook RD, Li B (2002) Dimension reduction for conditional mean in regression. Ann Stat 30:455–474
Cook RD, Ni L (2005) Sufficient dimension reduction via inverse regression: a minimum discrepancy approach. J Am Stat Assoc 100:410–428
Cook RD, Weisberg S (1991) Comment on “Sliced inverse regression for dimension reduction”, by KC Li. J Am Stat Assoc 86:328–332
Eaton ML (1986) A characterization of spherical distributions. J Multivar Anal 20:272–276
Hall P, Li KC (1993) On almost linearity of low dimensional projections from high dimensional data. Ann Stat 21:867–889
Li B, Zha H, Chiaromonte F (2005) Contour regression: a general approach to dimension reduction. Ann Stat 33:1580–1616
Li KC (1991) Sliced inverse regression for dimension reduction (with discussion). J Am Stat Assoc 86:316–342
Li KC (1992) On principal Hessian directions for data visualization and dimension reduction: another application of Stein’s lemma. J Am Stat Assoc 87:1025–1039
Li KC (1997) Nonlinear confounding in high-dimensional regression. Ann Stat 25:577–612
Li L (2007) Sparse sufficient dimension reduction. Biometrika 94:603–613
Li L, Cook RD, Nachtsheim CJ (2005) Model-free variable selection. J R Stat Soc Ser B 67:285–299
Li L, Nachtsheim CJ (2006) Sparse sliced inverse regression. Technometrics 48:503–510
Ni L, Cook RD, Tsai CL (2005) A note on shrinkage sliced inverse regression. Biometrika 92:242–247
Schott J (1994) Determining the dimensionality in sliced inverse regression. J Am Stat Assoc 89:141–148
Velilla S (1998) Assessing the number of linear components in a general regression problem. J Am Stat Assoc 93:1088–1098
Weisberg S (2005) Applied linear regression. Wiley, New York
Xia Y, Tong H, Li WK, Zhu LX (2002) An adaptive estimation of dimension reduction space. J R Stat Soc Ser B 64:363–410
Zhu Y, Zeng P (2006) Fourier methods for estimating the central subspace and the central mean subspace in regression. J Am Stat Assoc 101:1638–1651
Acknowledgments
We are grateful to the reviewers for their helpful comments that have greatly improved the presentation of this paper. Lue’s research was supported in part by a grant from the National Science Council of Taiwan (Grant No. NSC95-2118-M-029-004).
Cite this article
Lue, HH., You, BR. High-dimensional regression analysis with treatment comparisons. Comput Stat 28, 1299–1317 (2013). https://doi.org/10.1007/s00180-012-0357-6