Abstract
In this article, we consider Hilbertian spatial periodically correlated autoregressive models. Such a spatial model assumes periodicity in its autocorrelation function, which makes it a plausible description of spatial functional data arising from phenomena with periodic structures, such as geological, atmospheric, meteorological and oceanographic data. Our study of these models covers model building, existence, a time-domain moving average representation, least squares parameter estimation, and prediction based on the autoregressive structured past data. We also fit a model of this type to real data consisting of invisible infrared satellite images.
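To make the notion of periodic correlation concrete, here is a minimal one-dimensional sketch (our own illustration, not part of the paper's spatial Hilbertian setting): a scalar AR(1) sequence with \(T\)-periodic coefficients, whose variance recursion converges to a \(T\)-periodic limit, mirroring the periodicity assumed of the autocovariance.

```python
# Illustrative sketch (not from the paper): a scalar periodically
# correlated AR(1) sequence X_t = a_{t mod T} X_{t-1} + eps_t with
# period T = 2 and noise variance sigma2. The variance recursion
# v_t = a_{t mod T}^2 v_{t-1} + sigma2 converges to a T-periodic
# sequence: variances repeat with period T but differ across phases.
a = [0.5, -0.8]          # periodic AR coefficients (assumed values), period T = 2
sigma2 = 1.0
T = len(a)

v = 0.0                  # variance of X_t, iterated forward in t
hist = []
for t in range(200):
    v = a[t % T] ** 2 * v + sigma2
    hist.append(v)

# after burn-in the variances repeat with period T ...
print(abs(hist[-1] - hist[-1 - T]) < 1e-9)  # -> True
# ... while consecutive phases have distinct variances
print(abs(hist[-1] - hist[-2]) > 1e-3)      # -> True
```

The contraction factor over one full period is \((a_0 a_1)^2 < 1\), so the recursion settles quickly onto its periodic fixed point.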




References
Bosq D (2000) Linear processes in function spaces, theory and applications. Lecture notes in statistics, Springer, Berlin
Dunford N, Schwartz JT (1958) Linear operators, Part I: general theory. Wiley-Interscience, Hoboken
Ferraty F, Vieu P (2006) Nonparametric functional data analysis. Springer, New York
Helson H, Lowdenslager D (1958) Prediction theory and Fourier series in several variables. Acta Math 99:165–202
Hurd HL, Miamee A (2007) Periodically correlated random sequences: spectral theory and practice. Wiley, Hoboken
Hurd HL, Kallianpur G, Farshidi J (2004) Correlation and spectral theory for periodically correlated random fields indexed on \({{\mathbb{Z}}}^{2}\). J Multivar Anal 90:359–383
Horváth L, Kokoszka P (2012) Inference for functional data with applications. Springer series in statistics, Springer, New York
Ramsay JO, Silverman BW (2005) Functional data analysis. Springer, New York
Ruiz-Medina MD (2011a) Spatial autoregressive and moving average Hilbertian processes. J Multivar Anal 102:292–305
Ruiz-Medina MD (2011b) Spatial functional prediction from spatial autoregressive Hilbertian processes. Environmetrics 23:119–128
Serpedin E, Panduru F, Sari I, Giannakis GB (2005) Bibliography on cyclostationarity. Signal Process 85:2233–2303
Shishebor Z, Soltani AR, Zamani A (2011) Asymptotic distribution for periodograms of infinite dimensional discrete time periodically correlated processes. J Multivar Anal 101:368–373
Soltani AR (1984) Extrapolation and moving average representation for stationary random fields and Beurling's theorem. Ann Probab 12(1):120–132
Soltani AR, Hashemi M (2011) Periodically correlated autoregressive Hilbertian processes. Stat Inference Stoch Process 14(2):177–188
Soltani AR, Shishebor Z (1998) A spectral representation for weakly periodic sequences of bounded linear transformations. Acta Math Hung 80:265–270
Soltani AR, Shishebor Z (1999) Weakly periodic sequences of bounded linear transformations: a spectral characterization. J Georgian Math 6:91–98
Soltani AR, Shishebor Z, Sajjadnia Z (2012) Hilbertian GARCH models. Ann ISUP 56:61–80
Soltani AR, Shishebor Z, Zamani A (2010) Inference on periodograms of infinite dimensional discrete time periodically correlated processes. J Multivar Anal 101:368–373
Acknowledgments
The authors express their sincere thanks to the editor and referees for providing valuable comments and suggestions.
Appendix: Proofs
Proof of Theorem 3.1
First, we show that the random field \({{\mathbf {X}}}=\{{{\mathbf {X}}}_{\mathbf {t}},\ {\mathbf {t}}\in {{\mathbb {Z}}}^{2}\}\) defined by
is a \({\mathbf{T}}_1\)-HSPC process. Let \({\mathbf{t}},{\mathbf{s}},{\mathbf{n}}\in {\mathbb{Z}}^{2}\). Then the cross-covariance matrix of \(\mathbf{X}\) is \(C_{{\mathbf{X}}_{\mathbf{t}},{\mathbf{X}}_{\mathbf{s}}}=\left[ a_{i,j}(\mathbf{t},\mathbf{s})\right]_{i,j=0}^{T_{2}-1}\), where \(a_{i,j}(\mathbf{t},\mathbf{s})=\mathbb{E}\left( X_{{\mathbf{t}}\odot {\mathbf{T}}_{2}+(0,i)}\otimes X_{{\mathbf{s}}\odot {\mathbf{T}}_{2}+(0,j)}\right)\). Since \({\mathbf{T}}={\mathbf{T}}_{1}\odot {\mathbf{T}}_{2}\),
So \(C_{{\mathbf{X}}_{\mathbf{t}},{\mathbf{X}}_{\mathbf{s}}}=C_{{\mathbf{X}}_{{\mathbf{t}}+{\mathbf{n}}\odot {\mathbf{T}}_{1}},{\mathbf{X}}_{{\mathbf{s}}+{\mathbf{n}}\odot {\mathbf{T}}_{1}}}\), which means that \({\mathbf{X}}\) is a \({\mathbf{T}}_{1}\)-HSPC process. By definition
Since \({\varvec{\epsilon}}=\left\{ {\varvec{\epsilon}}_{\mathbf{t}},\ {\mathbf{t}}\in {\mathbb{Z}}^{2}\right\}\) is defined with the same structure as \(\mathbf{X}\), one concludes that \({\varvec{\epsilon}}\) is also a \({\mathbf{T}}_{1}\)-HSPC process. Moreover, the cross-covariance matrix of \({\varvec{\epsilon}}\) is \(C_{{\varvec{\epsilon}}_{\mathbf{t}},{\varvec{\epsilon}}_{\mathbf{s}}}=\left[ b_{i,j}(\mathbf{t},\mathbf{s})\right]_{i,j=0}^{T_{2}-1}\), where \(b_{i,j}(\mathbf{t},\mathbf{s})=\mathbb{E}\left( {\epsilon}_{{\mathbf{t}}\odot {\mathbf{T}}_{2}+(0,i)}\otimes {\epsilon}_{{\mathbf{s}}\odot {\mathbf{T}}_{2}+(0,j)}\right)\). For \({\mathbf{s}}\ne {\mathbf{t}}\) we have \({\mathbf{t}}\odot {\mathbf{T}}_{2}+(0,i)\ne {\mathbf{s}}\odot {\mathbf{T}}_{2}+(0,j)\) for all \(i,j=0,1,\dots ,T_{2}-1\), and hence \(b_{i,j}(\mathbf{t},\mathbf{s})=0\). Therefore, \({\varvec{\epsilon}}\) is a \({\mathbf{T}}_{1}\)-HSPC-WN process. It also follows that \({\varvec{\varepsilon}}_{i,j}={\mathcal{D}}_{i}{\varvec{\epsilon}}_{i,j}\) is a \({\mathbf{T}}_{1}\)-HSPC-WN process with covariance operator \(C_{{\varvec{\varepsilon}}_{i,j}}={\mathcal{D}}_{i}C_{{\varvec{\epsilon}}_{i,j}}{\mathcal{D}}_{i}^{*}\). To see that \(\mathbf{X}\) satisfies (3.1), using (2.1) recursively we obtain
and for each \(k=0, 1, \dots , T_2-1,\) we get
Writing these equations in matrix form and using definitions (2.6) and (2.7) gives
where \({{\mathcal {A}}}_{i}=\left[ {{a}}_{m,n}^{i}\right] \), \({{{ \mathcal {B}}}}_{i}=\left[ {{b}}_{m,n}^{i}\right] \), \({{ \mathcal {C}}}_{i}=\left[ {{c}}_{m,n}^{i}\right] \) and \({{{ \mathcal {D}}}}_{i}=\left[ {{d}}_{m,n}^{i}\right] \). \(\square \)
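Theorem 3.2 below blocks the periodically correlated field into a stationary vector-valued one. The mechanism can be sketched in one dimension with a hand-made periodic covariance kernel (our own toy example, not the paper's construction): blocking a period-\(T\) kernel into \(T\times T\) covariance matrices yields matrices that depend on the block indices only through their difference.

```python
import numpy as np

T = 3  # assumed period for this toy example

def c(t, s):
    # a hand-made T-periodic covariance-like kernel: c(t+T, s+T) = c(t, s)
    return np.cos(2 * np.pi * t / T) * np.cos(2 * np.pi * s / T) * 0.9 ** abs(s - t)

def block_cov(n, m):
    # T x T covariance matrix between block n and block m of the
    # blocked sequence (X_{nT}, ..., X_{nT+T-1})
    return np.array([[c(n * T + i, m * T + j) for j in range(T)]
                     for i in range(T)])

# the block covariance depends on (n, m) only through m - n: stationarity
print(np.allclose(block_cov(0, 2), block_cov(5, 7)))  # -> True
# while blocks at different lags genuinely differ
print(np.allclose(block_cov(0, 1), block_cov(0, 2)))  # -> False
```

This is exactly the shift-invariance the proof of Theorem 3.2 establishes for the block matrices \(C_{\widetilde{\mathbf{X}}_{\mathbf{t}},\widetilde{\mathbf{X}}_{\mathbf{s}}}\), with operators in place of scalars.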
Proof of Theorem 3.2
By definition
Therefore, for \({\mathbf{s}},{\mathbf{t}}\in {\mathbb{Z}}^{2}\), the cross-covariance matrix \(C_{{\widetilde{\mathbf{X}}}_{\mathbf{t}},{\widetilde{\mathbf{X}}}_{\mathbf{s}}}\) is a \(T_{1}\times T_{1}\) block matrix whose \((i,j)\)-th block is \(C_{{\mathbf{X}}_{{\mathbf{t}}\odot {\mathbf{T}}_{1}+(i,0)},{\mathbf{X}}_{{\mathbf{s}}\odot {\mathbf{T}}_{1}+(j,0)}}\) for \(i,j=0,1,\dots ,T_{1}-1\). Since, by Theorem 3.1, \({\mathbf{X}}\) is a \({\mathbf{T}}_{1}\)-HSPC process,
So the entries of \(C_{{\widetilde{\mathbf{X}}}_{\mathbf{t}},{\widetilde{\mathbf{X}}}_{\mathbf{s}}}\) depend on \({\mathbf{t}}\) and \({\mathbf{s}}\) only through \({\mathbf{s}}-{\mathbf{t}}\), which means that \({\widetilde{\mathbf{X}}}=\{{\widetilde{\mathbf{X}}}_{\mathbf{t}},\ {\mathbf{t}}\in {\mathbb{Z}}^{2}\}\) is an HSS process. The same result holds for \(\widetilde{\varvec{\epsilon}}=\{\widetilde{\varvec{\epsilon}}_{\mathbf{t}},\ {\mathbf{t}}\in {\mathbb{Z}}^{2}\}\), which is defined as
Moreover, by the same technique as in the proof of Theorem 3.1, it can be shown that \(\widetilde{\varvec{\epsilon}}\) is an HSS-WN process. Therefore, \(\widetilde{\varvec{\varepsilon}}_{\mathbf{t}}={\widetilde{\varvec{\mathcal{D}}}}{\varvec{\mathcal{E}}}\widetilde{\varvec{\epsilon}}_{\mathbf{t}}\) is an HSS-WN process with covariance operator \(C_{\widetilde{\varvec{\varepsilon}}}={\widetilde{\varvec{\mathcal{D}}}}{\varvec{\mathcal{E}}}C_{\widetilde{\varvec{\epsilon}}}{\varvec{\mathcal{E}}}^{*}{\widetilde{\varvec{\mathcal{D}}}}^{*}\). On the other hand, by (2.6) and (2.7), \({\mathcal{D}}_{iT_{1}+k}={\mathcal{D}}_{k}\); therefore, \({\varvec{\varepsilon}}_{iT_{1}+k,j}={\mathcal{D}}_{iT_{1}+k}{\varvec{\epsilon}}_{iT_{1}+k,j}={\mathcal{D}}_{k}{\varvec{\epsilon}}_{iT_{1}+k,j}\). Now, using Eq. (3.1) recursively gives
and for each \(k=0, 1, \dots , T_1-1,\) we get
Writing the above equations in matrix form and using definitions (2.8) and (2.9) gives
where \(\varvec{\widetilde{\mathcal {A}}}=\left[ \widetilde{\mathbf {a}}_{m,n}\right] , \varvec{ \widetilde{\mathcal {B}}}=\left[ \widetilde{\mathbf {b}}_{m,n}\right] ,\ \varvec{\widetilde{\mathcal {C}}}=\left[ \widetilde{\mathbf {c }}_{m,n}\right] \) and \(\varvec{\widetilde{\mathcal {D}}}=\left[ \widetilde{\mathbf {d}}_{m,n}\right] \). \(\square \)
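The proof of Theorem 3.4 below rests on the fact that a stationary AR(1) equation driven by a contractive operator has the moving-average solution \(Y_{t}=\sum_{k\ge 0}A^{k}\varepsilon_{t-k}\). A finite-dimensional sketch (a matrix \(A\) on \(\mathbb{R}^{3}\) standing in for the Hilbertian autoregression operator; all values are our own assumptions):

```python
import numpy as np

# Finite-dimensional stand-in (an assumption, not the paper's setting):
# H = R^3 and a matrix A with spectral norm < 1 in place of the
# autoregression operator. The unique stationary solution of
# Y_t = A Y_{t-1} + eps_t is the moving-average series
# Y_t = sum_{k>=0} A^k eps_{t-k}, verified here by truncation.
rng = np.random.default_rng(0)
A = 0.4 * np.array([[1.0, 0.2, 0.0],
                    [0.0, 0.8, 0.3],
                    [0.1, 0.0, 0.9]])
assert np.linalg.norm(A, 2) < 1           # contraction: the series converges

eps = rng.standard_normal((400, 3))       # innovations eps_0, ..., eps_399

# truncated moving-average representation at t = 399
Y_ma = sum(np.linalg.matrix_power(A, k) @ eps[399 - k] for k in range(399))

# the same Y_t obtained by running the AR recursion from a zero start
Y = np.zeros(3)
for t in range(400):
    Y = A @ Y + eps[t]

print(np.allclose(Y, Y_ma))  # -> True
```

The truncation error is of order \(\Vert A\Vert^{399}\), which is negligible here; in the Hilbertian setting the same geometric bound is what Assumptions A guarantee.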
Proof of Theorem 3.4
By Corollary 3.5, \({\mathbf{Y}}\) is an HSS-AR(1) process on \(H^{T_{1}T_{2}}\) satisfying
Now, by the results of Ruiz-Medina (2011a), under Assumptions A, \(\mathbf{Y}\) has a unique stationary solution given by
Since \({{\widetilde{\mathbf {X}}}}_{{i,j}}={\widetilde{\varvec{\mathcal {D}}}} {\varvec{\mathcal {E}}}{{\mathbf {Y}}}_{{i,j}}\),
By definition (2.4)
For \((i,j)\in {\mathbb{Z}}^{2}\), write \(i=i^{*}+\left[ \frac{i}{T_{1}}\right] T_{1}\) and \(j=j^{*}+\left[ \frac{j}{T_{2}}\right] T_{2}\). It follows that \(X_{i,j}\) is the \((i^{*}T_{2}+j^{*})\)-th component of the vector \({\widetilde{\mathbf{X}}}_{\left[ \frac{i}{T_{1}}\right],\left[ \frac{j}{T_{2}}\right]}\), i.e., \({\pi}_{i^{*}T_{2}+j^{*}}{\widetilde{\mathbf{X}}}_{\left[ \frac{i}{T_{1}}\right],\left[ \frac{j}{T_{2}}\right]}=X_{i,j}\), and since \(i^{*}T_{2}+j^{*}=\left( i-T_{1}\left[ \frac{i}{T_{1}}\right] -\left[ \frac{j}{T_{2}}\right] \right) T_{2}+j\), we have
Using Eqs. (8.1) and (8.2) gives
\(\square \)
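The index bookkeeping in the last step of this proof is easy to mechanize. A small sketch (periods \(T_1=3\), \(T_2=4\) are illustrative choices): each site \((i,j)\in\mathbb{Z}^{2}\) lies in the block \(\left(\left[\frac{i}{T_1}\right],\left[\frac{j}{T_2}\right]\right)\) at component \(i^{*}T_{2}+j^{*}\).

```python
# Sketch of the proof's index decomposition (periods are illustrative):
# i = i* + floor(i/T1)*T1 and j = j* + floor(j/T2)*T2, so the site
# (i, j) sits in block (floor(i/T1), floor(j/T2)) at component
# i* T2 + j*. Python's // is floor division, so negative indices
# are handled correctly as well.
T1, T2 = 3, 4

def component_index(i, j):
    i_star = i - (i // T1) * T1   # i* in {0, ..., T1-1}
    j_star = j - (j // T2) * T2   # j* in {0, ..., T2-1}
    return (i // T1, j // T2), i_star * T2 + j_star

print(component_index(0, 0))    # -> ((0, 0), 0)
print(component_index(5, 7))    # -> ((1, 1), 11)
print(component_index(-1, -1))  # -> ((-1, -1), 11)
```

Note that floor division keeps the components in \(\{0,\dots ,T_1T_2-1\}\) even for negative sites, which is what the projection \(\pi_{i^{*}T_{2}+j^{*}}\) requires on all of \(\mathbb{Z}^{2}\).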
Proof of Lemma 4.1
For \(h\in H\), define \({\left\langle \varvec{\theta},h\right\rangle}_{H}=\left[ \left\langle {\theta}_{1},h\right\rangle_{H},\dots ,\left\langle {\theta}_{m},h\right\rangle_{H}\right]^{\prime}\), and let \(\left\{ e_{1},e_{2},\dots \right\}\) be an orthonormal basis for \(H\). Using Parseval's identity in (4.1) gives
But the minimum of \(\sum^{n}_{i=1}{\left| \left\langle Y_{i}-{\mathbf{x}}_{i}\varvec{\theta},e_{k}\right\rangle_{H}\right|}^{2}\) is attained at \(\widehat{\varvec{\theta}}={\left( {\mathbf{\mathcal{X}}}^{\prime}{\mathbf{\mathcal{X}}}\right)}^{-1}{\mathbf{\mathcal{X}}}^{\prime}{\mathbf{Y}}\). The reason is that the corresponding regression equation is
The least square solution of this system is
\(\square \)
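The coordinatewise argument of Lemma 4.1 can be sketched numerically (a finite-dimensional stand-in, with \(H\) replaced by \(\mathbb{R}^{K}\) so that "functions" are coefficient vectors in a fixed orthonormal basis; all dimensions and data are our own assumptions): solving ordinary least squares separately for each basis coefficient reproduces the single matrix solution \(\widehat{\varvec{\theta}}=(\mathcal{X}'\mathcal{X})^{-1}\mathcal{X}'\mathbf{Y}\).

```python
import numpy as np

# Finite-dimensional sketch of the least-squares argument: responses
# are expanded in K basis coefficients, and OLS solved per coefficient
# agrees with the one-shot matrix solution (X'X)^{-1} X'Y.
rng = np.random.default_rng(1)
n, m, K = 50, 2, 5                           # observations, regressors, basis size
X = rng.standard_normal((n, m))
theta = rng.standard_normal((m, K))          # "true" coefficients (assumed)
Y = X @ theta + 0.1 * rng.standard_normal((n, K))

# one-shot solution for all basis coefficients at once
theta_hat = np.linalg.solve(X.T @ X, X.T @ Y)

# coordinatewise: a separate OLS problem for each basis coefficient k
theta_cw = np.column_stack(
    [np.linalg.lstsq(X, Y[:, k], rcond=None)[0] for k in range(K)])

print(np.allclose(theta_hat, theta_cw))  # -> True
```

This is the finite-dimensional shadow of the lemma: Parseval's identity splits the \(H\)-valued criterion into a sum over basis coefficients, each minimized by the same normal-equations solution.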
Haghbin, H., Shishebor, Z. & Soltani, A.R. Hilbertian spatial periodically correlated first order autoregressive models. Adv Data Anal Classif 8, 303–319 (2014). https://doi.org/10.1007/s11634-014-0172-8