
Patient’s actions recognition in hospital’s recovery department based on RGB-D dataset

Published in Multimedia Tools and Applications

Abstract

The recovery room is an essential hospital nursing unit that continues the care begun in the operating room. Its goal is to provide high-quality care for patients recovering from surgery, who, under the residual effects of anesthetic drugs, may make sudden hand or leg movements, attempt to stand, or fall from the bed without warning. Because these units face a shortage of nurses, remote patient-monitoring systems are increasingly needed to compensate for limited staff and to help personnel monitor patients more effectively. In this study, a patient's actions are recognized using a combination of geometric features and depth data. Actions that put patients in the recovery room at risk are then identified and reported to the nursing unit before harm occurs, so that necessary measures can be taken. For this purpose, RGB-D data were collected and analyzed. The proposed methodology comprises recording video with Kinect sensors (457 videos at 640 × 480 resolution), extracting features from video frames using a color-separation-based approach, training a Hidden Markov Model to classify the resulting indicator vectors, and finally evaluating and validating the model. Experimental results indicate that the proposed method accurately detects the moments at which a patient's movements in the hospital bed expose them to danger, achieving a recognition rate of 91.36%.
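The classification step described above, scoring a sequence of per-frame indicator vectors against per-action Hidden Markov Models and picking the most likely action, can be sketched as follows. This is a minimal illustration with invented two-state models and made-up "calm"/"agitated" action names, not the paper's trained parameters or feature set; the paper's indicator vectors are reduced here to a single discrete symbol per frame.

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward algorithm to avoid underflow.

    pi: initial state probabilities, A: state-transition matrix,
    B: emission matrix (states x symbols), obs: list of symbol indices.
    """
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    c = sum(alpha)
    log_p = math.log(c)
    alpha = [a / c for a in alpha]
    for t in range(1, len(obs)):
        alpha = [B[s][obs[t]] * sum(alpha[r] * A[r][s] for r in range(n))
                 for s in range(n)]
        c = sum(alpha)          # rescale at every step
        log_p += math.log(c)
        alpha = [a / c for a in alpha]
    return log_p

def classify(models, obs):
    """Return the action whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda name: forward_log_likelihood(*models[name], obs))

# Toy per-action models (illustrative only): symbol 0 = "still" frame,
# symbol 1 = "moving" frame; the "agitated" model emits 1 far more often.
models = {
    "calm":     ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.9, 0.1], [0.8, 0.2]]),
    "agitated": ([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]], [[0.1, 0.9], [0.2, 0.8]]),
}
```

In a monitoring loop, a sliding window of recent frame symbols would be passed to `classify`, and a nursing-unit alert raised whenever an at-risk action label wins.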




Notes

  1. Post Anesthesia Care Unit (PACU)

  2. Time of Flight (ToF)

  3. Histogram of Oriented Gradients (HOG)

  4. Edge Orientation Histograms (EOH)

  5. Hidden Markov Model (HMM)


Author information

Correspondence to Mohammad Mehdi Sepehri.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix 1

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Mollaei, H., Sepehri, M.M. & Khatibi, T. Patient’s actions recognition in hospital’s recovery department based on RGB-D dataset. Multimed Tools Appl 82, 24127–24154 (2023). https://doi.org/10.1007/s11042-022-14200-4

