The liquid state machine (LSM) is a relatively new recurrent neural network (RNN) architecture for time-series classification problems. The LSM has some attractive properties, such as a fast training speed compared with more traditional RNNs, biological plausibility, and the ability to deal with highly nonlinear dynamics. This paper presents the democratic LSM, an extension of the basic LSM that applies majority voting along two dimensions. First, instead of producing only a single classification at the end of the time-series, classifications made after different time-periods are combined. Second, instead of using a single LSM, the classifications of multiple LSMs in an ensemble are combined. The results show that the democratic LSM significantly outperforms the basic LSM and other methods on two music composer classification tasks, where the goal is to separate Haydn/Mozart and Beethoven/Bach, and on a music instrument classification problem, where the goal is to distinguish between a flute and a bass guitar.
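The two-dimensional voting scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the data layout (`predictions[m][t]` holding the label produced by ensemble member `m` after time-period `t`), and the example labels are all assumptions made for clarity.

```python
from collections import Counter

def democratic_vote(predictions):
    """Majority vote over two dimensions: ensemble members x time-periods.

    predictions[m][t] is the class label output by ensemble member m
    after time-period t (illustrative sketch, not the authors' code).
    The final label is the overall majority over all member/time votes.
    """
    votes = Counter(label for member in predictions for label in member)
    return votes.most_common(1)[0][0]

# Example: 3 ensemble members, each classifying after 4 time-periods.
preds = [
    ["Haydn", "Mozart", "Haydn", "Haydn"],
    ["Mozart", "Haydn", "Haydn", "Haydn"],
    ["Haydn", "Haydn", "Mozart", "Haydn"],
]
print(democratic_vote(preds))  # prints "Haydn" (9 of 12 votes)
```

Pooling all member/time votes into a single count is one simple way to combine the two dimensions; hierarchical schemes (vote within each member over time, then vote across members) are also possible.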
Pape, L., de Gruijl, J., Wiering, M. (2008). Democratic Liquid State Machines for Music Recognition. In: Prasad, B., Prasanna, S.R.M. (eds) Speech, Audio, Image and Biomedical Signal Processing using Neural Networks. Studies in Computational Intelligence, vol 83. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-75398-8_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-75397-1
Online ISBN: 978-3-540-75398-8