
Efficient health-related abnormal behavior detection with visual and inertial sensor integration

  • Theoretical Advances
  • Published in Pattern Analysis and Applications

Abstract

A rapidly aging population faces a growing number of healthcare issues arising from unsafe abnormal behaviors such as falling and staggering. These behaviors, often accompanied by abrupt movements, can be life-threatening if they go unnoticed, so real-time, accurate detection is essential for a timely response. However, it is challenging to achieve detection that is both generic and accurate in real time using moderate sensing devices and processing power. This paper presents a system that addresses this challenge. It relies primarily on visual data to detect various types of abnormal behaviors, owing to the accuracy and generality of computer vision techniques. The volume of recorded video, however, is far too large to process entirely in real time. We therefore propose that elders carry a mobile device, either a dedicated design or a smartphone, whose inertial sensor triggers the selection of relevant video data. The system thus operates in a trigger-verify fashion: video data are used selectively, which guarantees both accuracy and efficiency in detection. The system is designed and implemented using inexpensive commercial off-the-shelf sensors and smartphones. Experimental evaluations in real-world settings demonstrate its promise for real-time, accurate detection of abnormal behaviors.
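The trigger-verify pipeline described above can be illustrated with a minimal sketch. It assumes a simple threshold on accelerometer magnitude as the inertial trigger and a placeholder classifier as the vision-based verifier; both are hypothetical stand-ins for illustration, not the authors' exact algorithms or parameters.

```python
from dataclasses import dataclass

G = 9.81                      # gravity, m/s^2
TRIGGER_THRESHOLD = 2.5 * G   # hypothetical impact threshold on acceleration magnitude
WINDOW_SECONDS = 3.0          # how much video around the trigger to verify

@dataclass
class AccelSample:
    t: float                  # timestamp in seconds
    x: float
    y: float
    z: float

    def magnitude(self) -> float:
        return (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5

def inertial_trigger(sample: AccelSample) -> bool:
    """Stage 1: cheap on-device check on the wearable's accelerometer."""
    return sample.magnitude() > TRIGGER_THRESHOLD

def verify_with_video(frames) -> bool:
    """Stage 2: run the expensive vision-based detector only on the frames
    selected by the trigger. Placeholder for a real classifier."""
    return len(frames) > 0

def trigger_verify(accel_stream, video_buffer):
    """Scan the inertial stream; on each trigger, verify using the video
    recorded around the trigger time. Yields timestamps of confirmed events."""
    for sample in accel_stream:
        if inertial_trigger(sample):
            frames = [f for (ft, f) in video_buffer
                      if abs(ft - sample.t) <= WINDOW_SECONDS]
            if verify_with_video(frames):
                yield sample.t

if __name__ == "__main__":
    # Synthetic example: a quiet stream with one abrupt spike at t = 5.0 s.
    accel = [AccelSample(t=i * 0.1, x=0.0, y=0.0, z=G) for i in range(100)]
    accel[50] = AccelSample(t=5.0, x=20.0, y=15.0, z=25.0)
    video = [(i * 0.1, f"frame_{i}") for i in range(100)]  # (timestamp, frame) pairs
    print(list(trigger_verify(accel, video)))              # -> [5.0]
```

The design point is that the inertial stage runs continuously at negligible cost, so the heavyweight video analysis is invoked only on short, trigger-selected windows rather than on the full recording.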


Notes

  1. United Nations World Population Ageing: 1950–2050. http://www.un.org/esa/population/publications/worldageing19502050/.

  2. US Department of Health and Human Services Aging Statistics. http://www.aoa.gov/AgingStatistics/.


Author information

Correspondence to Ying Li or Qiang Zhai.

About this article

Cite this article

Li, Y., Zhai, Q., Ding, S. et al. Efficient health-related abnormal behavior detection with visual and inertial sensor integration. Pattern Anal Applic 22, 601–614 (2019). https://doi.org/10.1007/s10044-017-0660-5
