A hybrid vision transformer and residual neural network model for fall detection using UWB radars

Published in Applied Intelligence

Abstract

Fall detection is a significant challenge for researchers because falls can cause serious injuries such as femoral neck fractures, brain hemorrhages, or burns, which inflict severe pain and, in some cases, worsen over time, leading to end-of-life complications or even fatalities. One way to address this challenge is to alert caregivers, such as nurses, promptly when a fall is detected. In this work, we present a technique for detecting falls in a 40-square-meter apartment using data collected from three ultra-wideband (UWB) radars. The technique combines a vision transformer with a residual neural network to perform fall identification, a binary classification task distinguishing fall from non-fall events. To train and test the technique, we use data covering various fall types simulated by 10 participants at three locations in the apartment. We compare its performance against several baseline models using a leave-one-subject-out strategy, demonstrating that the experimental results generalize to practical scenarios with new subjects. We also report results obtained with cross-validation for selecting a validation set, which highlights the effectiveness of the technique during the training phase and supports the reliability of the results in the testing phase. Across all experiments, the presented technique consistently outperforms the baseline models, reaching nearly 99% accuracy in fall detection and demonstrating promising potential for real-world application.
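The leave-one-subject-out strategy mentioned above can be sketched as follows: each of the 10 participants in turn serves as the held-out test subject, while the remaining subjects form the training set. This is a minimal illustrative sketch, not the authors' code; the `(subject_id, features, label)` sample layout is a hypothetical simplification of the real radar dataset.

```python
from collections import defaultdict

def leave_one_subject_out(samples):
    """Yield (held_out_subject, train, test) splits where the test set
    contains all samples from exactly one subject, so evaluation always
    measures generalization to an unseen person.

    `samples` is a list of (subject_id, features, label) tuples --
    a hypothetical layout; the actual dataset format may differ.
    """
    by_subject = defaultdict(list)
    for sample in samples:
        by_subject[sample[0]].append(sample)
    for held_out in sorted(by_subject):
        test = by_subject[held_out]
        train = [s for sid in sorted(by_subject) if sid != held_out
                 for s in by_subject[sid]]
        yield held_out, train, test

# Toy data: 3 subjects, fall (1) / non-fall (0) events.
data = [(0, [0.1], 1), (0, [0.2], 0),
        (1, [0.3], 1), (2, [0.4], 0)]
for subject, train, test in leave_one_subject_out(data):
    print(subject, len(train), len(test))
# → 0 2 2
#   1 3 1
#   2 3 1
```

With 10 participants this yields 10 train/test splits; reported metrics are typically averaged over them.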

[Figures 1–9 appear in the full article.]


Availability of supporting data

This study introduces a fall detection system that relies solely on three UWB radars positioned within an apartment located at the Laboratoire d'Intelligence Ambiante pour la Reconnaissance d'Activités (LIARA), University of Quebec at Chicoutimi. All data are publicly available at http://www.kevin-bouchard.ca (accessed 11.03.2024). No new datasets were generated during the current study.

Notes

  1. https://github.com/lucidrains/vit-pytorch/tree/main

  2. https://kevinbouchard122764662.wordpress.com/projets-de-recherche


Acknowledgements

The authors wish to acknowledge King Fahd University of Petroleum & Minerals (KFUPM) and SDAIA-KFUPM Joint Research Center for Artificial Intelligence (JRC-AI) for providing the facilities to carry out this research.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Shadi Abudalfa.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Ethical approval

Since an open access dataset was used, ethics committee approval was not required.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Abudalfa, S., Bouchard, K. A hybrid vision transformer and residual neural network model for fall detection using UWB radars. Appl Intell 55, 222 (2025). https://doi.org/10.1007/s10489-024-06156-9
