DOI: 10.1145/3007818.3007832

Medical examination data prediction using simple recurrent network and long short-term memory

Published: 17 October 2016

ABSTRACT

In this work, we use two types of recurrent neural networks (RNNs) to predict a subject's medical examination results from previous measurements. The first is a simple recurrent network (SRN), which models the temporal trajectory of a data sequence to infer the unknown future observation; the second is a long short-term memory (LSTM) network, which can model longer trajectories by exploiting forget gates. The RNNs approximate the non-linear temporal evolution of a subject's medical status, and their predictions of future measurements are more accurate than those of a linear approximation. Performance evaluation experiments on real medical examination data show that the proposed methods outperform linear regression. For subjects whose medical examination results exhibit abnormal behavior, the improvements are much more significant, so the proposed methods are expected to be useful for detecting potential patients and providing earlier diagnosis and proper treatment of their illnesses.
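The SRN recurrence described above can be sketched as follows. This is a minimal, illustrative forward pass only: the weights are randomly initialized rather than trained (the paper's networks would be fitted, e.g. by backpropagation through time), and the feature dimensions and measurement values are hypothetical.

```python
# Minimal sketch of a simple recurrent network (SRN / Elman network)
# forward pass for next-measurement prediction:
#   h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h),   y_t = W_hy h_t + b_y
# Untrained, stdlib-only; dimensions and inputs are hypothetical.
import math
import random

random.seed(0)

def init_matrix(rows, cols):
    """Small random weight matrix (illustration only, not trained)."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)]
            for _ in range(rows)]

def matvec(M, v):
    """Matrix-vector product over plain Python lists."""
    return [sum(m_ij * v_j for m_ij, v_j in zip(row, v)) for row in M]

class SRN:
    def __init__(self, n_in, n_hidden, n_out):
        self.W_xh = init_matrix(n_hidden, n_in)
        self.W_hh = init_matrix(n_hidden, n_hidden)
        self.W_hy = init_matrix(n_out, n_hidden)
        self.b_h = [0.0] * n_hidden
        self.b_y = [0.0] * n_out
        self.n_hidden = n_hidden

    def predict_next(self, sequence):
        """Run the recurrence over the past measurement vectors and
        return the output after the last step, i.e. the prediction
        for the next (unseen) measurement vector."""
        h = [0.0] * self.n_hidden
        y = None
        for x in sequence:
            pre = [a + b + c for a, b, c in
                   zip(matvec(self.W_xh, x), matvec(self.W_hh, h), self.b_h)]
            h = [math.tanh(p) for p in pre]
            y = [a + b for a, b in zip(matvec(self.W_hy, h), self.b_y)]
        return y

# Hypothetical history: four yearly checkups, each with three
# measurements (e.g. blood pressure, glucose, BMI), normalized to [0, 1].
history = [[0.52, 0.41, 0.60],
           [0.55, 0.44, 0.61],
           [0.58, 0.47, 0.62],
           [0.62, 0.51, 0.63]]
net = SRN(n_in=3, n_hidden=8, n_out=3)
prediction = net.predict_next(history)
print(len(prediction))  # → 3, one predicted value per measurement
```

An LSTM would replace the plain `tanh` update with gated cell-state updates (input, forget, and output gates), which is what lets it retain information over the longer trajectories mentioned in the abstract.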


Published in

EDB '16: Proceedings of the Sixth International Conference on Emerging Databases: Technologies, Applications, and Theory
October 2016, 183 pages
ISBN: 9781450347549
DOI: 10.1145/3007818
Copyright © 2016 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
