ABSTRACT
We propose a bottom-up method for analyzing multi-modal dialogue interaction that uses pattern and motif mining to summarize interviews such as those between doctors and patients in medical diagnosis. Our aim is to build a hierarchical model of interviewing behavior from an interaction corpus, consisting of primitives, patterns, motifs, and pattern clusters extracted from the given dialogue session data. We exploit a Jensen-Shannon divergence measure to extract important patterns and motifs. The medical interview is chosen as an important application of such analysis because a doctor's multi-modal interviewing technique is essential for establishing a reliable relationship with the patient and reaching a successful diagnosis.
An interaction corpus of simulated medical interviews was constructed with the proposed method; the interviews were captured by a video camera and microphones. Based on the constructed indices, expressed in terms of the given pattern notations and clusters, the interviews were summarized. A medical doctor evaluated the indices to confirm their plausibility and that of the summary descriptions of the results.
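The Jensen-Shannon divergence mentioned above compares two probability distributions symmetrically by averaging the Kullback-Leibler divergence of each against their midpoint distribution. As a minimal illustration of the measure itself (not of the paper's full mining pipeline), the following sketch computes it for two hypothetical occurrence distributions of interaction primitives; the primitive labels and numbers are illustrative assumptions, not data from the corpus.

```python
import math

def kl_divergence(p, q):
    # Kullback-Leibler divergence D(p || q) in bits; terms with p_i == 0 contribute 0.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon_divergence(p, q):
    # JSD(p, q) = 0.5 * D(p || m) + 0.5 * D(q || m), where m is the midpoint
    # distribution. Symmetric and bounded in [0, 1] when using log base 2.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Hypothetical occurrence frequencies of three interaction primitives
# (e.g. gaze, utterance, gesture) in two dialogue segments.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(jensen_shannon_divergence(p, q))
```

A pattern whose occurrence distribution diverges strongly between segments (a large JSD) is the kind the method would flag as characteristic of one segment.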