Pseudo-random number generation using LSTMs

The Journal of Supercomputing

Abstract

Previous studies have developed pseudo-random number generators, where a pseudo-random number is not perfectly random but is practically useful. In this paper, we propose a new system for pseudo-random number generation. Recurrent neural networks with long short-term memory units are trained to mimic the digit sequence of a given irrational number (e.g., pi) and are then used to generate pseudo-random numbers in an iterative manner. We design algorithms to ensure that the output sequence contains no repetition or pattern. The experimental results demonstrate the potential of the proposed system in terms of its randomness and stability. As this system can be used for parameter approximation in machine learning techniques, we believe that it will contribute to various industrial fields such as traffic management and frameworks for sensor networks.
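
The short sketch below illustrates the general idea described in the abstract, assuming a next-digit prediction setup in PyTorch: an LSTM is trained on windows of pi's decimal digits and is then sampled iteratively, feeding its own outputs back as context. The model name, window size, hyperparameters, and sampling scheme (DigitLSTM, WINDOW, multinomial sampling) are illustrative assumptions, not the authors' exact configuration.

# Minimal sketch: LSTM next-digit model trained on pi digits, then
# sampled iteratively to produce a pseudo-random digit stream.
# Architecture and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn

# First 120 decimal digits of pi as the training sequence
# (a real system would use a much longer expansion).
PI_DIGITS = ("141592653589793238462643383279502884197169399375105820974944"
             "592307816406286208998628034825342117067982148086513282306647")

WINDOW = 10  # context length (assumed)

class DigitLSTM(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(10, 16)   # digits 0-9
        self.lstm = nn.LSTM(16, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 10)

    def forward(self, x):
        h, _ = self.lstm(self.embed(x))
        return self.head(h[:, -1])          # logits for the next digit

# Build (context, next digit) training pairs from the pi digits.
seq = torch.tensor([int(d) for d in PI_DIGITS])
xs = torch.stack([seq[i:i + WINDOW] for i in range(len(seq) - WINDOW)])
ys = seq[WINDOW:]

model = DigitLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()
for _ in range(200):                        # small demo training loop
    opt.zero_grad()
    loss = loss_fn(model(xs), ys)
    loss.backward()
    opt.step()

# Iterative generation: feed the model's own outputs back in as context
# and sample each next digit from the predicted distribution.
ctx = seq[:WINDOW].tolist()
out = []
with torch.no_grad():
    for _ in range(50):
        logits = model(torch.tensor([ctx[-WINDOW:]]))
        digit = torch.multinomial(torch.softmax(logits[0], dim=0), 1).item()
        out.append(digit)
        ctx.append(digit)
print("generated digits:", "".join(map(str, out)))

In the paper's full system, additional algorithms check the generated stream for repetition and patterns; this sketch omits those checks.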

Acknowledgements

This work was supported by the Soonchunhyang University Research Fund. This research was also supported by Korea Electric Power Corporation (Grant number: R18XA05).

Author information

Corresponding author

Correspondence to Ho-Jin Choi.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Jeong, YS., Oh, KJ., Cho, CK. et al. Pseudo-random number generation using LSTMs. J Supercomput 76, 8324–8342 (2020). https://doi.org/10.1007/s11227-020-03229-7
