
Design and Implementation of the Bio-inspired Facial Expressions for Medical Mannequin

Published in: International Journal of Social Robotics

Abstract

A medical mannequin is an intelligent device used in hospitals to give medical students hands-on training without the risk of injury to a human subject. The closer these mannequins come to mimicking the functionality of a human being, the better the quality of training. In this paper, we report the development of a biomimetic mechatronic face with eyes, skin, and neck exhibiting human-like performance. The face was designed with 13 degrees of freedom, incorporating all the action units necessary to reconstruct 7 basic expressions. The eyes were capable of blinking, movement, iris dilation, and object tracking. The neck was a Stewart platform with 6 degrees of freedom, capable of translation and rotation that mimic the human vertebrae. A durable artificial skin was developed from silicone elastomer to replicate the texture and appearance of real human skin.




Acknowledgments

The authors gratefully acknowledge the financial support from the National Science Foundation through the INAMM program and from the Institute for Critical Technology and Applied Science (ICTAS).


Corresponding author

Correspondence to Shashank Priya.

Appendix

Coordinates of the neck attachment points for the bottom (B) and top (T) plates:

Point | X | Y | Z
B1 | \(\frac{\sqrt{3}}{6}(2b+d)\) | \(\frac{d}{2}\) | 0
B2 | \(-\frac{\sqrt{3}}{6}(b-d)\) | \(\frac{b+d}{2}\) | 0
B3 | \(-\frac{\sqrt{3}}{6}(b+2d)\) | \(\frac{b}{2}\) | 0
B4 | \(-\frac{\sqrt{3}}{6}(b+2d)\) | \(-\frac{b}{2}\) | 0
B5 | \(-\frac{\sqrt{3}}{6}(b-d)\) | \(-\frac{b+d}{2}\) | 0
B6 | \(\frac{\sqrt{3}}{6}(2b+d)\) | \(-\frac{d}{2}\) | 0
T1 | \(\frac{\sqrt{3}}{6}(a+2c)\) | \(\frac{a}{2}\) | 0
T2 | \(\frac{\sqrt{3}}{6}(a-c)\) | \(\frac{a+c}{2}\) | 0
T3 | \(-\frac{\sqrt{3}}{6}(2a+c)\) | \(\frac{c}{2}\) | 0
T4 | \(-\frac{\sqrt{3}}{6}(2a+c)\) | \(-\frac{c}{2}\) | 0
T5 | \(\frac{\sqrt{3}}{6}(a-c)\) | \(-\frac{a+c}{2}\) | 0
T6 | \(\frac{\sqrt{3}}{6}(a+2c)\) | \(-\frac{a}{2}\) | 0
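For concreteness, the bottom-plate coordinates in the table can be generated programmatically. The following is a minimal NumPy sketch (the parameter names `b_len` and `d_len` are illustrative stand-ins for the table's \(b\) and \(d\), not names from the paper):

```python
import numpy as np

def bottom_plate_points(b_len, d_len):
    """Local coordinates of joints B1..B6 from the appendix table.

    b_len and d_len correspond to the table's parameters b and d
    (hypothetical names chosen for this sketch).
    """
    k = np.sqrt(3) / 6
    return np.array([
        [ k * (2 * b_len + d_len),   d_len / 2,            0.0],  # B1
        [-k * (b_len - d_len),       (b_len + d_len) / 2,  0.0],  # B2
        [-k * (b_len + 2 * d_len),   b_len / 2,            0.0],  # B3
        [-k * (b_len + 2 * d_len),  -b_len / 2,            0.0],  # B4
        [-k * (b_len - d_len),      -(b_len + d_len) / 2,  0.0],  # B5
        [ k * (2 * b_len + d_len),  -d_len / 2,            0.0],  # B6
    ])
```

With these expressions the joint hexagon is centered at the origin, since \((2b+d)-(b-d)-(b+2d)=0\), and every joint lies at the same radius \(\sqrt{(b^2+bd+d^2)/3}\) from the center.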

Equations for the coordinates of the top plate with respect to the bottom plate of the Stewart platform, where \((x_T, y_T, z_T)\) is the translation of the top-plate center and \(ca, cb, cz\) and \(sa, sb, sz\) abbreviate the cosines and sines of the rotation angles \(a\), \(b\), and \(z\):

$$\begin{aligned} X_{T1}&=x_T -(a*(cb*sz-cz*sa*sb))/2\\&\quad +(\sqrt{3}*(cb*cz+sa*sb*sz)*(a+2*c))/6 \\ Y_{T1}&=y_T +(a*ca*cz)/2\\&\quad +(\sqrt{3}*ca*sz*(a+2*c))/6 \\ Z_{T1}&=z_T +(a*(sb*sz+cb*cz*sa))/2\\&\quad -(\sqrt{3}*(cz*sb-cb*sa*sz)*(a+2*c))/6 \\ X_{T2}&=x_T -(cb*sz-cz*sa*sb)*(a/2+c/2)\\&\quad +(\sqrt{3}*(cb*cz+sa*sb*sz)*(a-c))/6 \\ Y_{T2}&=y_T +ca*cz*(a/2+c/2)\\&\quad +(\sqrt{3}*ca*sz*(a-c))/6 \\ Z_{T2}&=z_T +(sb*sz+cb*cz*sa)*(a/2+c/2)\\&\quad -(\sqrt{3}*(cz*sb-cb*sa*sz)*(a-c))/6 \\ X_{T3}&=x_T -(c*(cb*sz-cz*sa*sb))/2\\&\quad -(\sqrt{3}*(cb*cz+sa*sb*sz)*(2*a+c))/6 \\ Y_{T3}&=y_T +(c*ca*cz)/2\\&\quad -(\sqrt{3}*ca*sz*(2*a+c))/6 \\ Z_{T3}&=z_T +(c*(sb*sz+cb*cz*sa))/2\\&\quad +(\sqrt{3}*(cz*sb-cb*sa*sz)*(2*a+c))/6 \\ X_{T4}&=x_T +(c*(cb*sz-cz*sa*sb))/2\\&\quad -(\sqrt{3}*(cb*cz+sa*sb*sz)*(2*a+c))/6 \\ Y_{T4}&=y_T -(c*ca*cz)/2\\&\quad -(\sqrt{3}*ca*sz*(2*a+c))/6 \\ Z_{T4}&=z_T -(c*(sb*sz+cb*cz*sa))/2\\&\quad +(\sqrt{3}*(cz*sb-cb*sa*sz)*(2*a+c))/6 \\ X_{T5}&=x_T +(cb*sz-cz*sa*sb)*(a/2+c/2)\\&\quad +(\sqrt{3}*(cb*cz+sa*sb*sz)*(a-c))/6 \\ Y_{T5}&=y_T -ca*cz*(a/2+c/2)\\&\quad +(\sqrt{3}*ca*sz*(a-c))/6 \\ Z_{T5}&=z_T -(sb*sz+cb*cz*sa)*(a/2+c/2)\\&\quad -(\sqrt{3}*(cz*sb-cb*sa*sz)*(a-c))/6 \\ X_{T6}&=x_T +(a*(cb*sz-cz*sa*sb))/2\\&\quad +(\sqrt{3}*(cb*cz+sa*sb*sz)*(a+2*c))/6 \\ Y_{T6}&=y_T -(a*ca*cz)/2\\&\quad +(\sqrt{3}*ca*sz*(a+2*c))/6 \\ Z_{T6}&=z_T -(a*(sb*sz+cb*cz*sa))/2\\&\quad -(\sqrt{3}*(cz*sb-cb*sa*sz)*(a+2*c))/6 \\ \end{aligned}$$
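The closed-form coordinates above are consistent with rotating each local top-plate joint by \(R = R_y(b)\,R_x(a)\,R_z(z)\) and translating by \((x_T, y_T, z_T)\). The following NumPy sketch implements that transform (function and variable names are illustrative, not from the paper); once the top-plate joints are expressed in the base frame, the Stewart platform's actuator leg lengths follow as the distances to the paired bottom-plate joints.

```python
import numpy as np

def rotation(a, b, z):
    """R = Ry(b) @ Rx(a) @ Rz(z), the rotation order implied by the
    closed-form coordinate equations (angles in radians)."""
    ca, sa = np.cos(a), np.sin(a)
    cb, sb = np.cos(b), np.sin(b)
    cz, sz = np.cos(z), np.sin(z)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def top_plate_points(a_len, c_len):
    """Local coordinates of joints T1..T6 from the appendix table
    (a_len, c_len stand in for the table's a and c)."""
    k = np.sqrt(3) / 6
    return np.array([
        [ k * (a_len + 2 * c_len),   a_len / 2,           0.0],  # T1
        [ k * (a_len - c_len),       (a_len + c_len) / 2, 0.0],  # T2
        [-k * (2 * a_len + c_len),   c_len / 2,           0.0],  # T3
        [-k * (2 * a_len + c_len),  -c_len / 2,           0.0],  # T4
        [ k * (a_len - c_len),      -(a_len + c_len) / 2, 0.0],  # T5
        [ k * (a_len + 2 * c_len),  -a_len / 2,           0.0],  # T6
    ])

def transform(points, t, a, b, z):
    """Base-frame coordinates of the top-plate joints for pose (t, a, b, z)."""
    return np.asarray(t) + points @ rotation(a, b, z).T
```

The leg lengths for the platform's inverse kinematics are then `np.linalg.norm(transform(T_local, t, a, b, z) - B_global, axis=1)`, pairing each transformed top joint with its bottom-plate counterpart.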

34 Predefined Expressions

The seven basic expressions (column headers) and their variants:

Happy | Angry | Disgusted | Surprised | Suspicious | Sarcastic | Sad
Ecstatic | Upset | Bothered | Astonished | Curious | Crazy | Depressed
Content | Aggravated | Jealous | Fearful | Arrogant | Disrespectful | Mournful
Joyful | Furious | Offended | Confused | Untrusting |  | Embarrassed
Hopeful | Spiteful |  | Bewildered | Interested |  | Shy
 |  |  |  | Disturbed |  | Guilty


About this article


Cite this article

Baldrighi, E., Thayer, N., Stevens, M. et al. Design and Implementation of the Bio-inspired Facial Expressions for Medical Mannequin. Int J of Soc Robotics 6, 555–574 (2014). https://doi.org/10.1007/s12369-014-0240-4
