Abstract
The Bayesian horseshoe estimator is known for its robustness in sparse, noisy big-data problems. This paper presents two extensions of the standard Bayesian horseshoe: (i) the grouped Bayesian horseshoe and (ii) the hierarchical Bayesian grouped horseshoe. The proposed methods gain flexibility in handling grouped variables through additional shrinkage parameters at the group and within-group levels. We apply them to the important class of additive models, where group structures arise naturally, and demonstrate that the hierarchical Bayesian grouped horseshoe performs well on both simulated and real data.
References
Alcalá, J., Fernández, A., Luengo, J., Derrac, J., García, S., Sánchez, L., Herrera, F.: KEEL data-mining software tool: data set repository, integration of algorithms and experimental analysis framework. J. Multiple-Valued Logic Soft Comput. 17(2–3), 255–287 (2010)
Breheny, P., Huang, J.: Penalized methods for bi-level variable selection. Stat. Interface 2(3), 369–380 (2009)
Breiman, L.: Better subset regression using the nonnegative garrote. Technometrics 37(4), 373–384 (1995)
Bühlmann, P., van de Geer, S.: Statistics for High-Dimensional Data: Methods, Theory and Applications. Springer Science & Business Media, New York (2011)
Carvalho, C.M., Polson, N.G., Scott, J.G.: Handling sparsity via the horseshoe. In: Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS), JMLR W&CP 5, 73–80 (2009)
Carvalho, C.M., Polson, N.G., Scott, J.G.: The horseshoe estimator for sparse signals. Biometrika 97(2), 465–480 (2010)
Fan, J., Li, R.: Variable selection via nonconcave penalized likelihood and its oracle properties. J. Am. Stat. Assoc. 96(456), 1348–1360 (2001)
Hoerl, A.E., Kennard, R.W.: Ridge regression: biased estimation for nonorthogonal problems. Technometrics 12(1), 55–67 (1970)
Huang, J., Ma, S., Xie, H., Zhang, C.H.: A group bridge approach for variable selection. Biometrika 96(2), 339–355 (2009)
Makalic, E., Schmidt, D.F.: A simple sampler for the horseshoe estimator. IEEE Signal Process. Lett. 23(1), 179–182 (2016)
Park, T., Casella, G.: The Bayesian lasso. J. Am. Stat. Assoc. 103(482), 681–686 (2008)
Yuan, M., Lin, Y.: Model selection and estimation in regression with grouped variables. J. R. Stat. Soc. Ser. B (Stat. Methodol.) 68(1), 49–67 (2006)
Zhao, P., Rocha, G., Yu, B.: The composite absolute penalties family for grouped and hierarchical variable selection. Ann. Stat. 37(6A), 3468–3497 (2009)
Zou, H.: The adaptive lasso and its oracle properties. J. Am. Stat. Assoc. 101(476), 1418–1429 (2006)
Zou, H., Hastie, T.: Regularization and variable selection via the elastic net. J. R. Stat. Soc. Ser. B (Stat. Methodol.) 67(2), 301–320 (2005)
Appendix: Full Conditional Distributions
The hierarchical specification of the complete HBGHS model is given in (7). Using the decomposition in [10], the hierarchical representation becomes:
The full conditional distributions of \(\boldsymbol{\beta }\), \(\sigma ^2\), \(\lambda _1^2,\ldots ,\lambda _G^2\), \(\delta _1^2,\ldots ,\delta _p^2\), and \(\tau \) are:
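Since the full conditional distributions above are all standard forms, the model can be sampled with a plain Gibbs sweep. As a hedged illustration (not the authors' code), the sketch below implements the *standard* (non-grouped) horseshoe using the inverse-gamma auxiliary-variable scheme of Makalic and Schmidt (2016), which the paper cites; the grouped variants add analogous conditionals for the group-level \(\lambda_g^2\) and within-group \(\delta_j^2\) parameters. The function name and interface are hypothetical.

```python
import numpy as np

def horseshoe_gibbs(X, y, n_iter=2000, burn_in=500, seed=0):
    """Gibbs sampler for the standard horseshoe linear model, via the
    inverse-gamma auxiliary-variable scheme of Makalic & Schmidt (2016).

    Model: y = X beta + e,  e ~ N(0, sigma2 * I_n),
           beta_j | lam_j, tau, sigma ~ N(0, sigma2 * tau^2 * lam_j^2),
           lam_j ~ C+(0, 1),  tau ~ C+(0, 1).
    Returns the posterior mean of beta.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y

    sigma2 = tau2 = xi = 1.0
    lam2 = np.ones(p)   # local shrinkage lambda_j^2
    nu = np.ones(p)     # auxiliaries for lambda_j^2
    draws = []

    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),
        # with A = X'X + diag(1 / (tau2 * lam2))
        A = XtX + np.diag(1.0 / (tau2 * lam2))
        A_inv = np.linalg.inv(A)
        A_inv = 0.5 * (A_inv + A_inv.T)  # symmetrize for numerical safety
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)

        # sigma2 | rest ~ InvGamma((n + p)/2, ...);
        # note b / Gamma(a, 1) is an InvGamma(a, b) draw
        resid = y - X @ beta
        b_sig = 0.5 * (resid @ resid + np.sum(beta**2 / (tau2 * lam2)))
        sigma2 = b_sig / rng.gamma(0.5 * (n + p), 1.0)

        # local shrinkage lam_j^2 and auxiliaries nu_j (conditionally InvGamma)
        lam2 = (1.0 / nu + beta**2 / (2.0 * tau2 * sigma2)) \
            / rng.gamma(1.0, 1.0, size=p)
        nu = (1.0 + 1.0 / lam2) / rng.gamma(1.0, 1.0, size=p)

        # global shrinkage tau^2 and its auxiliary xi
        b_tau = 1.0 / xi + np.sum(beta**2 / lam2) / (2.0 * sigma2)
        tau2 = b_tau / rng.gamma(0.5 * (p + 1), 1.0)
        xi = (1.0 + 1.0 / tau2) / rng.gamma(1.0, 1.0)

        if it >= burn_in:
            draws.append(beta)

    return np.mean(draws, axis=0)
```

The auxiliary variables \(\nu_j\) and \(\xi\) turn the half-Cauchy priors into a chain of conjugate inverse-gamma conditionals, which is what makes every step of the sweep a closed-form draw.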
Copyright information
© 2016 Springer International Publishing AG
Cite this paper
Xu, Z., Schmidt, D.F., Makalic, E., Qian, G., Hopper, J.L. (2016). Bayesian Grouped Horseshoe Regression with Application to Additive Models. In: Kang, B.H., Bai, Q. (eds) AI 2016: Advances in Artificial Intelligence. AI 2016. Lecture Notes in Computer Science(), vol 9992. Springer, Cham. https://doi.org/10.1007/978-3-319-50127-7_19
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-50126-0
Online ISBN: 978-3-319-50127-7
eBook Packages: Computer Science, Computer Science (R0)