Abstract
Permutation is a fundamental data augmentation technique. However, it is rarely used in image-based systems with hardware acceleration because it distorts spatial correlation and is costly to generate. This paper proposes the Restricted Permutation Network (RPN), a scalable architecture that automatically generates a restricted subset of local permutations, preserving the features of the dataset while simplifying generation to improve scalability. RPN reduces the spatial complexity from \(O(N\log N)\) to \(O(N)\), making it easily scalable to 64 inputs and beyond, with a 21-times speedup in generation and a significant reduction in data storage and transfer, while maintaining the same level of accuracy as the original dataset for deep-learning training. Experiments show that Convolutional Neural Networks (CNNs) trained on the augmented dataset can be as accurate as those trained on the original one. Combining three to five networks generally improves accuracy by 5%. Network training can be accelerated by training multiple sub-networks in parallel with a reduced training dataset and fewer epochs, yielding up to a 5-times speedup with negligible loss in accuracy. This opens up the opportunity to split a long iterative training process into independent parallelizable processes, facilitating the trade-off between resources and run time.
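The idea of a restricted local permutation — moving data only at a coarse granularity so that spatial structure is partly preserved — can be illustrated in software. The sketch below shuffles non-overlapping patches of an image rather than individual pixels; it is a minimal illustrative analogue, not the authors' RPN hardware generator, and the function name and `patch` parameter are our own.

```python
import numpy as np

def local_patch_permutation(img, patch=4, seed=None):
    """Shuffle non-overlapping patch x patch blocks of a 2-D image.

    A restricted permutation: pixels move only at block granularity,
    so coarse spatial correlation survives. Illustrative sketch only,
    not the paper's RPN design.
    """
    rng = np.random.default_rng(seed)
    h, w = img.shape
    assert h % patch == 0 and w % patch == 0, "sides must be divisible by patch"
    gh, gw = h // patch, w // patch
    # Reinterpret the image as a (gh * gw) list of patch x patch blocks.
    blocks = (img.reshape(gh, patch, gw, patch)
                 .transpose(0, 2, 1, 3)
                 .reshape(gh * gw, patch, patch))
    # Permute the blocks, leaving each block's interior untouched.
    blocks = blocks[rng.permutation(gh * gw)]
    # Reassemble the permuted blocks into an image of the same shape.
    return (blocks.reshape(gh, gw, patch, patch)
                  .transpose(0, 2, 1, 3)
                  .reshape(h, w))
```

Each output image contains exactly the same blocks as the input, only rearranged, so block-level statistics (and hence many learned features) are preserved while the sample itself is new.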
Acknowledgement
The support of the Croucher Foundation, the UK EPSRC (grant numbers EP/V028251/1, EP/L016796/1, EP/S030069/1 and EP/N031768/1) and Xilinx is gratefully acknowledged.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Kwan, B.P.Y., Guo, C., Luk, W., Jiang, P. (2022). Light-Weight Permutation Generator for Efficient Convolutional Neural Network Data Augmentation. In: Gan, L., Wang, Y., Xue, W., Chau, T. (eds) Applied Reconfigurable Computing. Architectures, Tools, and Applications. ARC 2022. Lecture Notes in Computer Science, vol 13569. Springer, Cham. https://doi.org/10.1007/978-3-031-19983-7_11
Print ISBN: 978-3-031-19982-0
Online ISBN: 978-3-031-19983-7