Abstract
As health-care budgets come under ever-increasing demand, Artificial Intelligence resources such as digital heart twins could save millions of dollars by predicting outcomes and preventing unnecessary surgery. Can we begin to build digital twins of the human body to plan treatment and predict health outcomes for a patient? Digital twins emerged from the Internet of Things (IoT) as a way to build faithful simulation models of real objects. A digital twin, however, is a complicated system, and current practice remains a long way from realizing its full potential: researchers must model every component of an entity or structure, diverse types of data must be collected and merged, and many engineering researchers and practitioners are unsure which technologies and tools to adopt. A 3D digital twin model offers a reference guide for understanding and implementing digital twins. This paper investigates and outlines the recent technologies and tools used in digital twin applications from a 3D digital model perspective, serving as a reference for future digital twin applications.
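Merging 3D data captured from several viewpoints into one body model hinges on rigid point-cloud registration. As a generic illustration (not code from the paper), the sketch below implements the classic ICP loop: match each point to its nearest neighbour in the target cloud, solve for the least-squares rotation and translation with the Kabsch/SVD method, and repeat. All function names and the synthetic "scan" data are hypothetical; only NumPy is assumed.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst,
    given one-to-one correspondences (Kabsch/Procrustes via SVD)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(src, dst, iters=20):
    """Toy ICP: iteratively re-match nearest neighbours, then re-align."""
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]          # brute-force NN matching
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur

# Demo: recover a slightly rotated/translated copy of a synthetic cloud.
rng = np.random.default_rng(0)
dst = rng.normal(size=(60, 3))                   # "reference scan"
a = 0.1                                          # small misalignment angle
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
src = (dst - np.array([0.1, -0.2, 0.05])) @ R_true.T   # misaligned "scan"
aligned = icp(src, dst)
print("max residual after ICP:", np.abs(aligned - dst).max())
```

Real pipelines (e.g. the Point Cloud Library cited in this survey's area) replace the brute-force matching with k-d trees and add outlier rejection, but the alternating match-then-align structure is the same.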
Acknowledgements
The authors gratefully acknowledge the Science and Engineering Research Board (SERB), Department of Science & Technology, India, for the financial support through the Mathematical Research Impact Centric Support (MATRICS) scheme (MTR/2019/000542). The authors also acknowledge SASTRA Deemed University, Thanjavur, for extending infrastructural support to carry out this research work.
Cite this article
Sengan, S., Kumar, K., Subramaniyaswamy, V. et al. Cost-effective and efficient 3D human model creation and re-identification application for human digital twins. Multimed Tools Appl 81, 26839–26856 (2022). https://doi.org/10.1007/s11042-021-10842-y