Abstract
The wavelet transform is a powerful tool for multiscale analysis and a key subroutine in countless applications, from image processing to astronomy. Recently, its user base has grown to include the ever-expanding machine learning community. To be adopted efficiently in this context, a wavelet library must provide transformations that integrate seamlessly into existing machine learning workflows and neural networks: they must leverage the same libraries and run on the same hardware (e.g., CPU or GPU) as the rest of the machine learning pipeline, without degrading training or evaluation performance. In this paper we present WaveTF, a wavelet library available as a Keras layer, which leverages TensorFlow to exploit GPU parallelism and can enrich existing machine learning workflows. To demonstrate its efficiency, we compare its raw performance against alternative libraries and measure the overhead it adds to the learning process when integrated into an existing Convolutional Neural Network.
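WaveTF's own API is not reproduced in this excerpt, so as a minimal sketch of the operation such a layer performs, the following NumPy code computes one level of the orthonormal 2D Haar transform (the simplest wavelet), splitting an image into the four half-resolution sub-bands (LL, LH, HL, HH) that a wavelet layer would feed to subsequent network layers. The function names `haar_1d` and `haar_2d` are ours, for illustration only:

```python
import numpy as np

def haar_1d(x):
    """One level of the orthonormal 1D Haar transform along the last axis.
    Returns (approximation, detail), each half the input length."""
    even, odd = x[..., ::2], x[..., 1::2]
    s = 1.0 / np.sqrt(2.0)
    return (even + odd) * s, (even - odd) * s

def haar_2d(img):
    """One level of the 2D Haar transform: rows first, then columns.
    Returns the four sub-bands (LL, LH, HL, HH), each half-resolution."""
    lo, hi = haar_1d(img)                      # filter along rows
    swap = lambda a: np.swapaxes(a, -1, -2)
    ll, lh = haar_1d(swap(lo))                 # filter columns of the low band
    hl, hh = haar_1d(swap(hi))                 # filter columns of the high band
    return swap(ll), swap(lh), swap(hl), swap(hh)

# A 4x4 image decomposes into four 2x2 sub-bands; because the
# transform is orthonormal, the total energy is preserved.
img = np.arange(16.0).reshape(4, 4)
ll, lh, hl, hh = haar_2d(img)
```

Because the transform is just strided slicing, addition, and scaling, the same computation maps directly onto TensorFlow tensor ops inside a Keras layer, which is how a library like WaveTF can run it on GPU alongside the rest of the model.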
Acknowledgments
I would like to thank G. Busonera and L. Pireddu for reviewing the draft and S. Leo for his suggestions on structuring the Python code. This work has been funded by the European Commission under the H2020 programme grant DeepHealth (n. 825111).
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Versaci, F. (2021). WaveTF: A Fast 2D Wavelet Transform for Machine Learning in Keras. In: Del Bimbo, A., et al. (eds.) Pattern Recognition. ICPR International Workshops and Challenges. ICPR 2021. Lecture Notes in Computer Science, vol. 12661. Springer, Cham. https://doi.org/10.1007/978-3-030-68763-2_46
Print ISBN: 978-3-030-68762-5
Online ISBN: 978-3-030-68763-2