
Applying Boosting to Similarity Literals for Time Series Classification

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1857)

Abstract

A supervised classification method for time series, even multivariate ones, is presented. It is based on boosting very simple classifiers, each consisting of a single literal. The proposed predicates are based on similarity functions (e.g., Euclidean distance and dynamic time warping) between time series.
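For concreteness, the warping-based similarity can be sketched as the classic dynamic programming recursion. This is a generic textbook DTW on univariate series, not necessarily the exact variant used in the paper:

```python
def dtw(a, b):
    """Dynamic time warping distance between two numeric sequences.

    Classic O(len(a) * len(b)) recursion with absolute-difference cost;
    d[i][j] holds the cost of the best warping path aligning a[:i] and b[:j].
    """
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]
```

Unlike the Euclidean distance, DTW tolerates local stretching: `dtw([1, 2, 2, 3], [1, 2, 3])` is 0, since the repeated `2` can be absorbed by the warping path.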

The method has been validated experimentally on several datasets, some of them obtained from the UCI repositories. The results are very competitive with those reported in previous works. Moreover, the classifiers are more comprehensible than in other approaches with similar results, since they consist of a weighted sequence of literals.
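A minimal sketch of the overall idea, combining AdaBoost with one-literal base classifiers of the form "distance(x, reference) <= threshold". Here the reference series and thresholds are drawn from the training set itself, and Euclidean distance is used as the similarity; these are illustrative assumptions, not the paper's exact literal-selection procedure:

```python
import math

def euclid(a, b):
    """Euclidean distance between two equal-length series."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train_boosted_literals(X, y, rounds=10):
    """AdaBoost over one-literal classifiers (labels y must be +1/-1).

    Each round greedily picks the literal 'euclid(x, X[r]) <= thr' with the
    lowest weighted error, then reweights the training examples.
    """
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        best = None  # (weighted error, reference index, threshold)
        for r in range(n):
            dists = [euclid(X[r], x) for x in X]
            for thr in sorted(set(dists)):
                err = sum(wi for wi, d, yi in zip(w, dists, y)
                          if (1 if d <= thr else -1) != yi)
                if best is None or err < best[0]:
                    best = (err, r, thr)
        err, r, thr = best
        err = max(min(err, 1 - 1e-10), 1e-10)      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # literal's vote weight
        ensemble.append((alpha, X[r], thr))
        # Increase weight of misclassified examples, decrease the rest.
        preds = [1 if euclid(X[r], x) <= thr else -1 for x in X]
        w = [wi * math.exp(-alpha * p * yi) for wi, p, yi in zip(w, preds, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the weighted vote of all literals."""
    score = sum(a * (1 if euclid(ref, x) <= thr else -1)
                for a, ref, thr in ensemble)
    return 1 if score >= 0 else -1
```

The resulting classifier is just a weighted list of literals, which is what makes this family of models comparatively easy to inspect; swapping `euclid` for a DTW distance yields the warping-based variant.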





Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rodríguez Diez, J.J., Alonso González, C.J. (2000). Applying Boosting to Similarity Literals for Time Series Classification. In: Multiple Classifier Systems. MCS 2000. Lecture Notes in Computer Science, vol 1857. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45014-9_20


  • DOI: https://doi.org/10.1007/3-540-45014-9_20


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-67704-8

  • Online ISBN: 978-3-540-45014-6

