DOI: 10.1145/1240624.1240642
Article

Shoogle: excitatory multimodal interaction on mobile devices

Published: 29 April 2007

Abstract

Shoogle is a novel, intuitive interface for sensing data within a mobile device, such as presence and properties of text messages or remaining resources. It is based around active exploration: devices are shaken, revealing the contents rattling around "inside". Vibrotactile display and realistic impact sonification create a compelling system. Inertial sensing is used for completely eyes-free, single-handed interaction that is entirely natural. Prototypes are described running both on a PDA and on a mobile phone with a wireless sensor pack. Scenarios of use are explored where active sensing is more appropriate than the dominant alert paradigm.
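The excitatory interaction the abstract describes can be pictured as a toy physical model: each message is a ball inside the device, device acceleration excites the balls, and each wall impact would trigger audio or vibrotactile feedback scaled by impact velocity. The sketch below is purely illustrative, assuming a 1-D box with damped balls and a synthetic accelerometer trace; the function name, parameters, and dynamics are assumptions, not the authors' implementation.

```python
# Hedged sketch of a Shoogle-style "excitatory" interaction loop
# (illustrative only, not the paper's actual model).

def simulate_shake(accel_samples, n_balls=3, box=1.0, dt=0.01,
                   damping=0.3, restitution=0.6):
    """Return a list of (time, ball index, |impact velocity|) events."""
    # Balls start spread evenly inside the box, at rest.
    pos = [box * (i + 1) / (n_balls + 1) for i in range(n_balls)]
    vel = [0.0] * n_balls
    impacts = []
    for step, a in enumerate(accel_samples):
        t = step * dt
        for i in range(n_balls):
            # In the device's frame, device acceleration appears as a
            # pseudo-force of opposite sign; damping bleeds off energy.
            vel[i] += (-a - damping * vel[i]) * dt
            pos[i] += vel[i] * dt
            if pos[i] < 0.0 or pos[i] > box:   # ball hit a wall
                pos[i] = min(max(pos[i], 0.0), box)
                # Impact speed would drive sound/vibration amplitude here.
                impacts.append((t, i, abs(vel[i])))
                vel[i] = -restitution * vel[i]  # bounce, losing some energy
    return impacts

# A synthetic shake: one sharp acceleration burst, then stillness.
trace = [40.0] * 10 + [0.0] * 190
events = simulate_shake(trace)
```

A real system would feed `simulate_shake` live accelerometer samples and map each impact event onto a granular or modal impact sound plus a vibrotactile pulse, which is the general pattern the paper's prototypes embody.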

Supplementary Material

  • Slides from the presentation (p121-slides.zip)
  • Supplemental material: audio (1240642.mp3) and video (1240642.mp4)

Published In

cover image ACM Conferences
CHI '07: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2007
1654 pages
ISBN:9781595935939
DOI:10.1145/1240624
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. accelerometer
  2. audio
  3. mobile
  4. multimodal
  5. vibrotactile

Qualifiers

  • Article

Conference

CHI '07: CHI Conference on Human Factors in Computing Systems
April 28 - May 3, 2007
San Jose, California, USA

Acceptance Rates

CHI '07 paper acceptance rate: 182 of 840 submissions, 22%
Overall acceptance rate: 6,199 of 26,314 submissions, 24%

Cited By

  • (2023) MultiViz: Towards User-Centric Visualizations and Interpretations of Multimodal Models. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-21. DOI: 10.1145/3544549.3585604
  • (2022) Movement interaction with a loudspeaker: an index of possibilities. Proceedings of the 14th Conference on Creativity and Cognition, pp. 247-261. DOI: 10.1145/3527927.3533026
  • (2022) Sound experts' perspectives on astronomy sonification projects. Nature Astronomy 6(11), pp. 1249-1255. DOI: 10.1038/s41550-022-01821-w
  • (2022) Haptic Rattle: Multi-modal Rendering of Virtual Objects Inside a Hollow Container. Haptics: Science, Technology, Applications, pp. 189-197. DOI: 10.1007/978-3-031-06249-0_22
  • (2021) Intermittent Control as a Model of Mouse Movements. ACM Transactions on Computer-Human Interaction 28(5), pp. 1-46. DOI: 10.1145/3461836
  • (2020) MMGatorAuth. Proceedings of the 2020 International Conference on Multimodal Interaction, pp. 370-377. DOI: 10.1145/3382507.3418881
  • (2020) Learning personalized ADL recognition models from few raw data. Artificial Intelligence in Medicine 107. DOI: 10.1016/j.artmed.2020.101916
  • (2019) Realistic Haptic Rendering of Collision Effects Using Multimodal Vibrotactile and Impact Feedback. 2019 IEEE World Haptics Conference (WHC), pp. 449-454. DOI: 10.1109/WHC.2019.8816116
  • (2019) Personalized Posture and Fall Classification with Shallow Gated Recurrent Units. 2019 IEEE 32nd International Symposium on Computer-Based Medical Systems (CBMS), pp. 114-119. DOI: 10.1109/CBMS.2019.00034
  • (2018) Investigating Perceptual Congruence between Data and Display Dimensions in Sonification. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-9. DOI: 10.1145/3173574.3174185
