Abstract
In this paper, we present a new robust estimation procedure for semi-functional linear regression models based on the exponential squared loss. The main advantage of the proposed method is that the resulting estimators are more efficient than the least squares estimators in the presence of outliers or heavy-tailed error distributions. The slope function and the functional predictor are approximated by functional principal component basis functions. Under some regularity conditions, we obtain the optimal convergence rate of the slope function estimator, as well as the asymptotic normality of the parameter vector and the variance estimator. Finally, we investigate the finite sample performance of the proposed method through a simulation study and a real data analysis.
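To fix ideas, the exponential squared loss of Wang et al. (2013) replaces the squared residual with \(1-\exp(-r^2/h)\), so that large residuals receive vanishing weight. For the parametric part alone, maximizing \(\sum_i \exp(-r_i^2/h)\) can be carried out by iteratively reweighted least squares, since the score equation is a weighted normal equation. The sketch below is purely illustrative (the data-generating setup, the tuning constant `h`, and the function names are our own assumptions, not the paper's implementation):

```python
import numpy as np

def exp_squared_irls(X, y, h=1.0, n_iter=50):
    """Maximize sum_i exp(-r_i^2 / h) over beta by iteratively
    reweighted least squares; the fixed point solves the weighted
    normal equations X^T W (y - X beta) = 0 with W = diag(exp(-r_i^2/h))."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS starting value
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.exp(-r**2 / h)                     # downweights large residuals
        XtW = X.T * w                             # X^T W (broadcast over columns)
        beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta0 = np.array([1.0, 2.0])
y = X @ beta0 + 0.1 * rng.normal(size=n)
y[:20] += 10.0                                    # contaminate 10% with outliers

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]   # pulled toward the outliers
beta_rob = exp_squared_irls(X, y, h=1.0)          # nearly ignores the outliers
```

In this contaminated setting the robust estimate stays close to \(\beta_0=(1,2)^T\), while the least squares intercept is biased upward by the outliers, illustrating the efficiency claim of the abstract in its simplest form.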
References
Aneiros-Pérez G, Ling N, Vieu P (2015) Error variance estimation in semi-functional partially linear regression models. J Nonparametr Stat 27(3):316–330
Aneiros-Pérez G, Raña P, Vieu P, Vilar P (2017) Bootstrap in semi-functional partial linear regression under dependence. TEST 1–21
Aneiros-Pérez G, Vieu P (2006) Semi-functional partial linear regression. Stat Probab Lett 76(11):1102–1110
Aneiros-Pérez G, Vieu P (2013) Testing linearity in semi-parametric functional data analysis. Comput Stat 28:413–434
Aneiros-Pérez G, Vieu P (2015) Partial linear modelling with multi-functional covariates. Comput Stat 30(3):647–671
Brunel É, Mas A, Roche A (2016) Non-asymptotic adaptive prediction in functional linear models. J Multivar Anal 143:208–232
Cai T, Hall P (2006) Prediction in functional linear regression. Ann Stat 34(5):2159–2179
Cai T, Yuan M (2012) Minimax and adaptive prediction for functional linear regression. J Am Stat Assoc 107(499):1201–1216
Cardot H, Ferraty F, Sarda P (1999) Functional linear model. Stat Probab Lett 45(1):11–22
Crambes C, Kneip A, Sarda P (2009) Smoothing splines estimators for functional linear regression. Ann Stat 37(1):35–72
Ferraty F, Goia A, Salinelli E, Vieu P (2013) Functional projection pursuit regression. TEST 22(2):293–320
Ferraty F, Vieu P (2006) Nonparametric functional data analysis: theory and practice. Springer, New York
Goia A, Vieu P (2014) Some advances on semi-parametric functional data modelling. In: Contributions in infinite-dimensional statistics and related topics, Esculapio, Bologna
Goia A, Vieu P (2015) A partitioned single functional index model. Comput Stat 30(3):673–692
Hall P, Hooker G (2016) Truncated linear models for functional data. J R Stat Soc Ser B (Stat Methodol) 78(3):637–653
Hall P, Horowitz JL (2007) Methodology and convergence rates for functional linear regression. Ann Stat 35(1):70–91
Horváth L, Kokoszka P (2012) Inference for functional data with applications. Springer, New York
Hsing T, Eubank R (2015) Theoretical foundations of functional data analysis, with an introduction to linear operators. Wiley, New York
Huber P (1981) Robust statistics. Wiley, New York
Imaizumi M, Kato K (2018) PCA-based estimation for functional linear regression with functional responses. J Multivar Anal 163:15–36
Jiang Y, Ji Q, Xie B (2017) Robust estimation for the varying coefficient partially nonlinear models. J Comput Appl Math 326:31–43
Kai B, Li R, Zou H (2011) New efficient estimation and variable selection methods for semiparametric varying-coefficient partially linear models. Ann Stat 39(1):305–332
Koenker R, Bassett G (1978) Regression quantiles. Econometrica 46:33–50
Kokoszka P, Reimherr M (2017) Introduction to functional data analysis. CRC Press, Boca Raton
Kong D, Xue K, Yao F, Zhang H (2016) Partially functional linear regression in high dimensions. Biometrika 103(1):147–159
Lin Z, Cao J, Wang L, Wang H (2017) Locally sparse estimator for functional linear regression models. J Comput Graph Stat 26(2):306–318
Ling N, Aneiros G, Vieu P (2017) kNN estimation in functional partial linear modeling. Stat Pap 1–22
Lovric M (2011) International encyclopedia of statistical science. Springer, New York
Lu Y, Du J, Sun Z (2014) Functional partially linear quantile regression model. Metrika 77(2):317–332
Lv J, Yang H, Guo C (2015) Robust smooth-threshold estimating equations for generalized varying-coefficient partially linear models based on exponential score function. J Comput Appl Math 280:125–140
Müller HG, Stadtmüller U (2005) Generalized functional linear models. Ann Stat 33(2):774–805
Peng QY, Zhou JJ, Tang NS (2016) Varying coefficient partially functional linear regression models. Stat Pap 57(3):827–841
Ramsay JO, Dalzell CJ (1991) Some tools for functional data analysis. J R Stat Soc Ser B (Methodol) 53(3):539–572
Ramsay JO, Silverman BW (2002) Applied functional data analysis: methods and case studies. Springer, New York
Ramsay JO, Silverman BW (2005) Functional data analysis, 2nd edn. Springer, New York
Shin H (2009) Partial functional linear regression. J Stat Plan Inference 139(10):3405–3418
Song Y, Jian L, Lin L (2016) Robust exponential squared loss-based variable selection for high-dimensional single-index varying-coefficient model. J Comput Appl Math 308:330–345
Wang K, Lin L (2016) Robust structure identification and variable selection in partial linear varying coefficient models. J Stat Plan Inference 174:153–168
Wang X, Jiang Y, Huang M, Zhang H (2013) Robust variable selection with exponential squared loss. J Am Stat Assoc 108(502):632–643
Yao F, Müller HG, Wang JL (2005) Functional linear regression analysis for longitudinal data. Ann Stat 33(6):2873–2903
Yu P, Zhang Z, Du J (2016) A test of linearity in partial functional linear regression. Metrika 79(8):953–969
Zhou J, Chen Z, Peng Q (2016) Polynomial spline estimation for partial functional linear regression models. Comput Stat 31(3):1107–1129
Zou H, Yuan M (2008) Composite quantile regression and the oracle model selection theory. Ann Stat 36(3):1108–1126
Zhu’s work is supported by the National Natural Science Foundation of China (Nos. 11671096, 11690013), Zhang’s work is supported by the National Natural Science Foundation of China (No. 11271039).
Appendix A: Technical proofs
Proof of Theorem 1
Let \(\delta _n=n^{-\frac{2b-1}{2(a+2b)}}\), \(\varvec{\alpha }=\varvec{\alpha }_0+\delta _n\varvec{u}_1\), \(\varvec{\gamma }=\varvec{\gamma }_0+\delta _n\varvec{u}_2\), \(\varvec{u}=\big (\varvec{u}_1^T,\varvec{u}_2^T\big )^T\), \(R_i=\int _{0}^{1}\beta _0(t)X_i(t)dt-\varvec{U}_i^T\varvec{\gamma }_0\). We next show that, for any given \(\eta >0\), there exists a sufficiently large constant L such that
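The omitted display (A.1) presumably takes the standard form of the local-maximizer argument; the following is a reconstruction consistent with the sentence that follows, not the paper's exact display:

```latex
P\Big\{\sup_{\Vert \varvec{u}\Vert = L}
Q_n\big(\varvec{\alpha}_0+\delta_n\varvec{u}_1,\,
\varvec{\gamma}_0+\delta_n\varvec{u}_2\big)
< Q_n(\varvec{\alpha}_0,\varvec{\gamma}_0)\Big\}\ \ge\ 1-\eta.
\tag{A.1}
```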
This implies that, with probability at least \(1-\eta \), there exists a local maximizer \((\varvec{\hat{\alpha }},\varvec{\hat{\gamma }})\) in the ball \(\Big \{(\varvec{\alpha }_0^T,\varvec{\gamma }_0^T)^T+\delta _n\varvec{u}:\big \Vert \varvec{u}\big \Vert \le L\Big \}\) such that \(\Vert \varvec{\hat{\alpha }}-\varvec{\alpha }_0\Vert =O_p(\delta _n)\) and \(\Vert \varvec{\hat{\gamma }}-\varvec{\gamma }_0\Vert =O_p(\delta _n)\), which is exactly what we want to show.
Noting that \( \Vert v_j-\hat{v}_j\Vert ^2=O_p(n^{-1}j^2)\) (see, e.g., Shin 2009; Yu et al. 2016), one has
For \(\text {A}_1\), by condition C1 and the Hölder inequality, we obtain
As for \(\text {A}_2\), due to
one has
Taking these together, we have
Invoking a Taylor expansion and a simple calculation, we have
where \(\epsilon '_i\) is between \(\epsilon _i+R_i\) and \(\epsilon _i+R_i-\delta _n(\varvec{z}_i^T\varvec{u}_1+\varvec{U}_i^T\varvec{u}_2)\). Furthermore, we have
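A sketch of the second-order expansion consistent with the stated remainder point \(\epsilon'_i\), assuming the objective is built from a loss \(\varphi_h\) (e.g. \(\varphi_h(t)=\exp(-t^2/h)\)):

```latex
\varphi_h\big(\epsilon_i+R_i-\delta_n(\varvec{z}_i^T\varvec{u}_1+\varvec{U}_i^T\varvec{u}_2)\big)
=\varphi_h(\epsilon_i+R_i)
-\varphi'_h(\epsilon_i+R_i)\,\delta_n\big(\varvec{z}_i^T\varvec{u}_1+\varvec{U}_i^T\varvec{u}_2\big)
+\tfrac{1}{2}\,\varphi''_h(\epsilon'_i)\,\delta_n^2\big(\varvec{z}_i^T\varvec{u}_1+\varvec{U}_i^T\varvec{u}_2\big)^2,
```

where the second-order remainder is evaluated at the intermediate point \(\epsilon'_i\) described above.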
where \(\tilde{\epsilon }_i\) is between \(\epsilon _i\) and \(\epsilon _i+R_i\). Then, by condition C1, Eq. (A.4) and a simple calculation, we can prove that
Similarly, we can obtain
Therefore, by choosing a sufficiently large L, both \(B_1\) and \(B_3\) are dominated by \(B_2\) uniformly in \(\Vert \varvec{u}\Vert =L\). Hence, Eq. (A.1) holds with probability tending to one, and there exists a local maximizer \(\hat{\varvec{\gamma }}\) such that
Observe that
Invoking Eq. (A.7), condition C2, the orthogonality of \(\{\hat{v}_j\}\) and \( \Vert v_j-\hat{v}_j\Vert ^2=O_p(n^{-1}j^2)\), one has
Then, combining Eqs. (A.8)–(A.10) yields Theorem 1. \(\square \)
Proof of Theorem 2
According to Theorem 1, as \(n\rightarrow \infty \), with probability tending to 1, \(Q_n(\varvec{\alpha }, \varvec{\gamma })\) attains its maximum at \((\hat{\varvec{\alpha }}^T,\hat{\varvec{\gamma }}^T)^T\). Then, we have
and
Invoking a Taylor expansion and Eq. (A.11), a simple calculation yields
where \(\epsilon _i^*\) is between \(\epsilon _i\) and \(Y_i-\varvec{z}_i^T \varvec{\hat{\alpha }}-\varvec{U}_i^T\varvec{\hat{\gamma }}\). Similarly, by Eq. (A.12) we can get
where \(\epsilon _i^{**}\) is between \(\epsilon _i\) and \(Y_i-\varvec{z}_i^T \varvec{\hat{\alpha }}-\varvec{U}_i^T\varvec{\hat{\gamma }}\).
Let \(\Phi _n=\frac{1}{n}\sum _{i=1}^{n}\varphi ''_h(\epsilon _i)\varvec{U}_i\varvec{U}^T_i\), \(\Psi _n=\frac{1}{n}\sum _{i=1}^{n}\varphi ''_h(\epsilon _i)\varvec{U}_i\varvec{z}_i^T\), \(\Lambda _n=\frac{1}{n}\sum _{i=1}^{n}\varvec{U}_i(\varphi '_h(\epsilon _i)+\varphi ''_h(\epsilon _i)R_i)\). Then, by Eq. (A.14), we have
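For reference, if \(\varphi_h\) is the exponential squared loss \(\varphi_h(t)=\exp(-t^2/h)\) of Wang et al. (2013), which is a plausible reading of the paper's notation, the derivatives entering \(\Phi_n\), \(\Psi_n\) and \(\Lambda_n\) are

```latex
\varphi'_h(t)=-\frac{2t}{h}\,e^{-t^2/h},
\qquad
\varphi''_h(t)=\Big(\frac{4t^2}{h^2}-\frac{2}{h}\Big)e^{-t^2/h}.
```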
Substituting Eq. (A.15) into Eq. (A.13), we can obtain that
Note that
Then, by Eqs. (A.16)–(A.18), it is easy to show that
where \(\widetilde{\varvec{z}}_i={\varvec{z}}_i-{\Psi }_n^T\Phi _n^{-1}\varvec{U}_i\). Furthermore, invoking Lemma 1 in Yu et al. (2016) and condition C6, a simple calculation yields
Using the central limit theorem, we have
Moreover, by the law of large numbers, we can obtain
Then, invoking Eqs. (A.19)–(A.22) and Slutsky's theorem, we complete the proof of Theorem 2. \(\square \)
Proof of Theorem 3
The proof follows from an argument analogous to that of Theorem 3 in Zhou et al. (2016); we omit the details to save space. \(\square \)
Proof of Corollary 2
By Theorem 2, we have
Note that
By Theorems 1 and 2 and Eq. (A.2), we can get
Moreover, according to Lemma 1 in Yu et al. (2016), we have
Thus, invoking the Slutsky’s theorem and the definition of \(\varvec{\Xi }\) we complete the proof of Corollary 2. \(\square \)
Yu, P., Zhu, Z. & Zhang, Z. Robust exponential squared loss-based estimation in semi-functional linear regression models. Comput Stat 34, 503–525 (2019). https://doi.org/10.1007/s00180-018-0810-2