DOI: 10.1145/3154862.3154938

Coaching through smart objects

Published: 23 May 2017

Abstract

We explore the ways in which smart objects can be used to cue actions as part of coaching for Activities of Daily Living (ADL) following brain damage or injury, such as might arise from a stroke. Appropriate actions are cued for a given context, where the context is defined by the intention of the user, the state of the objects, and the tasks for which these objects can be used. This requires objects to be instrumented so that they can recognize the actions that users perform. To provide appropriate cues, the objects also need to be able to display information to users, e.g., by changing their physical appearance or by providing auditory output. We discuss the ways in which information can be displayed to cue user action.
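
The cueing logic described in the abstract can be made concrete with a short, hypothetical sketch: the context is treated as the user's intention, the states of the instrumented objects, and the steps of the task, and a coach selects the next object to cue. This is only an illustrative sketch, not the system described in the paper; the class names (SmartObject, CueingCoach), the tea-making steps, and the choice of cue modality are all assumptions introduced for the example.

```python
# Hypothetical sketch of context-driven cueing for an ADL task.
# Not the paper's implementation; names and structure are illustrative.
from dataclasses import dataclass, field


@dataclass
class SmartObject:
    """An instrumented everyday object that can sense use and display cues."""
    name: str
    state: str = "idle"   # e.g. "idle", "in_use", "done"

    def cue(self, modality: str, message: str) -> None:
        # Stand-in for changing appearance (light, shape) or playing a sound.
        print(f"[{self.name}] {modality} cue: {message}")


@dataclass
class CueingCoach:
    """Selects the next action to cue from intention, object states and task."""
    task_steps: list                         # ordered (object_name, action) pairs
    objects: dict = field(default_factory=dict)

    def next_cue(self, user_intention: str) -> None:
        # The context: the user's intention, the object states and the task.
        for obj_name, action in self.task_steps:
            obj = self.objects[obj_name]
            if obj.state != "done":
                modality = "auditory" if obj.state == "idle" else "visual"
                obj.cue(modality, f"For '{user_intention}': {action}")
                return
        print("Task complete, no further cues needed.")


# Example: cueing the steps of a (hypothetical) tea-making task.
kettle, cup = SmartObject("kettle"), SmartObject("cup")
coach = CueingCoach(
    task_steps=[("kettle", "fill and switch on"), ("cup", "add a tea bag")],
    objects={"kettle": kettle, "cup": cup},
)
coach.next_cue("make a cup of tea")   # cues the kettle (auditory, since idle)
kettle.state = "done"                 # stand-in for recognizing the completed action
coach.next_cue("make a cup of tea")   # moves on to cue the cup
```

In a real deployment the print calls would be replaced by the object's own output channels (appearance change or auditory output), and action recognition from the object's sensors would update its state rather than the manual assignment shown here.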



Published In

PervasiveHealth '17: Proceedings of the 11th EAI International Conference on Pervasive Computing Technologies for Healthcare
May 2017
503 pages
ISBN: 9781450363631
DOI: 10.1145/3154862

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 23 May 2017


Author Tags

  1. activity recognition
  2. multimodal cueing
  3. tangible user interface

Qualifiers

  • Research-article

Conference

PervasiveHealth '17

Acceptance Rates

Overall Acceptance Rate 55 of 116 submissions, 47%
