DOI: 10.1145/3468791.3468839

Local Gaussian Process Model Inference Classification for Time Series Data

Published: 11 August 2021

Abstract

One of the most prominent types of time series analytics is classification, which entails identifying expressive class-wise features that determine the class labels of time series data. In this paper, we propose a novel approach to time series classification called Local Gaussian Process Model Inference Classification (LOGIC). Our idea is to (i) approximate the latent, class-wise characteristics of given time series data by means of Gaussian processes, (ii) aggregate these characteristics into a feature representation, and (iii) provide a model-agnostic interface to state-of-the-art feature-based classification mechanisms. Using a fully-connected neural network as the classification model, we show that LOGIC is able to compete with state-of-the-art approaches.
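The three steps outlined in the abstract can be illustrated with a minimal, hypothetical sketch: fit a local Gaussian process to each (z-normalized) series and use the inferred kernel hyperparameters as a fixed-length feature vector for any downstream classifier. This is not the paper's actual inference procedure; the RBF kernel, the grid-search values, and the choice of features are all assumptions made for illustration (the paper uses a fully-connected neural network on top of the GP-derived features, where this sketch stops at feature extraction).

```python
import numpy as np

def gp_log_marginal_likelihood(x, y, lengthscale, noise_var):
    """Log marginal likelihood of y under a zero-mean GP with an
    RBF kernel (unit signal variance) plus i.i.d. Gaussian noise."""
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / lengthscale) ** 2) + noise_var * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2.0 * np.pi))

def gp_features(series,
                lengthscales=(1.0, 2.0, 4.0, 8.0),   # illustrative grid
                noise_vars=(0.01, 0.1, 1.0)):        # illustrative grid
    """Fit a local GP to one series by grid search over the kernel
    hyperparameters; return them as a fixed-length feature vector."""
    y = np.asarray(series, dtype=float)
    y = (y - y.mean()) / (y.std() + 1e-9)   # z-normalize the series
    x = np.arange(len(y), dtype=float)
    best_ll, best = -np.inf, None
    for l in lengthscales:
        for n in noise_vars:
            ll = gp_log_marginal_likelihood(x, y, l, n)
            if ll > best_ll:
                best_ll, best = ll, (l, n)
    # Features: inferred length-scale, noise variance, and the
    # per-point log marginal likelihood of the fitted model.
    return np.array([best[0], best[1], best_ll / len(y)])
```

In this sketch, a smooth series (e.g. a sinusoid) yields a small inferred noise variance, while a white-noise series forces the model to explain the data through the noise term; such feature vectors would then be stacked and fed to the classifier of choice.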


Cited By

  • LOGIC: Probabilistic Machine Learning for Time Series Classification. In 2021 IEEE International Conference on Data Mining (ICDM), 1000–1005. DOI: 10.1109/ICDM51629.2021.00113. Online publication date: Dec 2021.

Published In

SSDBM '21: Proceedings of the 33rd International Conference on Scientific and Statistical Database Management
July 2021
275 pages
ISBN:9781450384131
DOI:10.1145/3468791
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Gaussian Processes
  2. Neural Networks
  3. Time Series Classification

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

SSDBM 2021

Acceptance Rates

Overall Acceptance Rate 56 of 146 submissions, 38%
