Abstract
It is well known that deep learning models have huge numbers of parameters and are computationally expensive, especially on embedded and mobile devices. Polyphone pronunciation selection is a basic function of Chinese Text-to-Speech (TTS) systems, and recurrent neural networks (RNNs) are a good sequence-labeling solution for it. However, their large parameter counts and heavy computation make compression necessary to alleviate these drawbacks. Moreover, classification over a large label set leads to a more complicated network and higher computation cost. In contrast to existing methods such as quantization with low-precision data formats and projection layers, we propose a novel method based on shared labels, which compresses the fully-connected layer before the Softmax in models with a huge number of labels, as in TTS polyphone selection. The basic idea is to merge the large set of target labels into a few label clusters that share the parameters of the fully-connected layer. We further combine this method with other techniques to compress the polyphone pronunciation selection model even more. Experimental results show that for Bi-LSTM (Bidirectional Long Short-Term Memory) based polyphone selection, the shared-labels model reduces the original model size by about 52% and accelerates prediction by 44% with almost no performance loss. The proposed method can also be applied to other tasks to compress models and accelerate computation.
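To make the shared-labels idea concrete, the following is a minimal NumPy sketch, not the authors' implementation: it assumes a simple greedy assignment in which pronunciations of different polyphonic characters reuse the same output slot, since they never compete in the same softmax. The names `assign_shared_slots` and `candidates`, and the toy label inventory, are all illustrative. With 7 distinct pronunciation labels but at most 3 candidates per character, the fully-connected layer shrinks from `[hidden, 7]` to `[hidden, 3]`.

```python
import numpy as np

def assign_shared_slots(candidates):
    """Map each (character, pronunciation) label to a shared output slot.

    Pronunciations of different polyphonic characters never compete in
    the same softmax, so they may reuse the same slot; only candidates
    of the same character must get distinct slots.
    """
    slot_of = {}
    num_slots = 0
    for char, labels in candidates.items():
        for i, label in enumerate(labels):
            slot_of[(char, label)] = i
        num_slots = max(num_slots, len(labels))
    return slot_of, num_slots

# Toy inventory: 3 polyphonic characters, 7 distinct pronunciation
# labels, but only max(2, 2, 3) = 3 shared slots are needed.
candidates = {
    "行": ["xing2", "hang2"],
    "还": ["hai2", "huan2"],
    "着": ["zhe5", "zhao2", "zhuo2"],
}
slot_of, num_slots = assign_shared_slots(candidates)

hidden_dim = 8
rng = np.random.default_rng(0)
# Shared FC layer: [hidden_dim, 3] instead of [hidden_dim, 7].
W = rng.normal(size=(hidden_dim, num_slots))
b = np.zeros(num_slots)

# Pretend h is the Bi-LSTM output at the position of the character "着".
h = rng.normal(size=hidden_dim)
logits = h @ W + b
logits -= logits.max()                 # numerical stability
probs = np.exp(logits) / np.exp(logits).sum()

# Decode within the current character's candidate set only.
char = "着"
valid = [slot_of[(char, lab)] for lab in candidates[char]]
best = candidates[char][int(np.argmax(probs[valid]))]
print("predicted pronunciation of", char, "->", best)
```

Because decoding is restricted to the current character's candidate slots, slot reuse across characters costs nothing at prediction time, which is why the compression is nearly lossless in the reported experiments.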
Cite this paper
Chen, P., Wang, L., Di, H., Ouchi, K., Wang, L.: Compress Polyphone Pronunciation Prediction Model with Shared Labels. In: Sun, M., Li, S., Zhang, Y., Liu, Y., He, S., Rao, G. (eds.) Chinese Computational Linguistics. CCL 2020. Lecture Notes in Computer Science, vol. 12522. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-63031-7_29