Abstract
In this paper, we apply a Recurrent Neural Network (RNN) language model to Arabic, training and testing it on the freely available "Arab World Books" and "Hindawi" Arabic text datasets. Because the standard RNN architecture is not ideally suited to Arabic, we adapt the model to account for the language's characteristics. Specifically, we propose a gated Long Short-Term Memory (LSTM) model that responds to several criteria of the Arabic language. The originality of the paper lies in demonstrating that our LSTM model generates Arabic text more effectively than the standard LSTM model. Compared with reported results on English and Chinese text generation, our results are promising and reach sufficient accuracy.
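To make the modelling setup concrete, the following is a minimal sketch, assuming a character-level LSTM language model built with Keras (the toolkit cited by the authors). The sequence length, vocabulary size, and layer widths are illustrative assumptions, not the configuration reported in the paper.

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

# Illustrative assumptions, not values taken from the paper.
SEQ_LEN = 40      # length of the character context window
VOCAB_SIZE = 120  # assumed Arabic character inventory (letters, diacritics, punctuation)

# Gated LSTM language model: embed characters, model the sequence, predict the next character.
model = Sequential([
    Embedding(input_dim=VOCAB_SIZE, output_dim=64),
    LSTM(256),
    Dense(VOCAB_SIZE, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")

# Training expects integer-encoded character windows X of shape (n, SEQ_LEN)
# and the following character y of shape (n,):
# model.fit(X, y, batch_size=128, epochs=20)

def sample_next(model, context, temperature=1.0):
    """Sample the next character index from the model's predicted distribution."""
    probs = model.predict(np.array([context]), verbose=0)[0]
    probs = np.log(probs + 1e-9) / temperature
    probs = np.exp(probs) / np.sum(np.exp(probs))
    return int(np.random.choice(len(probs), p=probs))

Generation then proceeds by repeatedly sampling a character and sliding it into the context window.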
Notes
1. Arab World Books is a cultural club and Arabic bookstore that aims to promote Arab thought, provide a public service for writers and intellectuals, and use the reach of the Internet to open a window through which the world can view Arab thought, become acquainted with its creators and thinkers, and foster intellectual exchange between the people of this homeland and those abroad.
2. The Hindawi Foundation is a non-profit organization that seeks to make a significant impact on the world of knowledge. The Foundation is also working to build the largest Arabic library, containing the most important books of the modern Arab heritage, republishing them to preserve them from extinction.
Cite this paper
Souri, A., El Maazouzi, Z., Al Achhab, M., El Mohajir, B.E. (2018). Arabic Text Generation Using Recurrent Neural Networks. In: Tabii, Y., Lazaar, M., Al Achhab, M., Enneya, N. (eds) Big Data, Cloud and Applications. BDCA 2018. Communications in Computer and Information Science, vol 872. Springer, Cham. https://doi.org/10.1007/978-3-319-96292-4_41