DOI: 10.1145/2047196.2047242

Conté: multimodal input inspired by an artist's crayon

Published: 16 October 2011

Abstract

Conté is a small input device inspired by the way artists manipulate a real Conté crayon. By changing which corner, edge, end, or side is contacting the display, the operator can switch interaction modes using a single hand. Conté's rectangular prism shape enables both precise pen-like input and tangible handle interaction. Conté also has a natural compatibility with multi-touch input: it can be tucked in the palm to interleave same-hand touch input, or used to expand the vocabulary of bimanual touch. Inspired by informal interviews with artists, we catalogue Conté's characteristics, and use these to outline a design space. We describe a prototype device using common materials and simple electronics. With this device, we demonstrate interaction techniques in a test-bed drawing application. Finally, we discuss alternate hardware designs and future human factors research to study this new class of input.
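
The contact-driven mode switching described above can be illustrated with a minimal sketch. This is not the authors' implementation: the contact labels, numeric thresholds, mode names, and the classify_contact helper below are hypothetical, and the sketch assumes a sensor that reports only the contact footprint's area and aspect ratio.

    # Minimal, hypothetical sketch of mapping which part of a rectangular
    # crayon touches the display to an application mode. Thresholds, labels,
    # and mode names are illustrative assumptions, not the paper's method.
    from dataclasses import dataclass

    @dataclass
    class ContactFootprint:
        area_mm2: float      # contact area reported by the (assumed) touch sensor
        aspect_ratio: float  # long axis / short axis of the fitted contact ellipse

    def classify_contact(fp: ContactFootprint) -> str:
        """Guess which part of the crayon is touching: corner, edge, end, or side."""
        if fp.area_mm2 < 4:          # tiny footprint -> a corner
            return "corner"
        if fp.aspect_ratio > 4:      # long, thin footprint -> an edge
            return "edge"
        if fp.area_mm2 < 40:         # small, roundish footprint -> an end
            return "end"
        return "side"                # large, flat footprint -> a side

    # Illustrative mapping from contact type to a drawing-application mode.
    MODE_BY_CONTACT = {
        "corner": "precise ink",
        "edge": "straightedge",
        "end": "broad marker",
        "side": "pan & zoom handle",
    }

    if __name__ == "__main__":
        samples = [ContactFootprint(2.5, 1.2), ContactFootprint(30.0, 6.0),
                   ContactFootprint(25.0, 1.5), ContactFootprint(120.0, 2.0)]
        for fp in samples:
            contact = classify_contact(fp)
            print(f"{contact:>6s} -> {MODE_BY_CONTACT[contact]}")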

Supplementary Material

JPG File (fp262.jpg)
MP4 File (fp262.mp4)





Published In

UIST '11: Proceedings of the 24th annual ACM symposium on User interface software and technology
October 2011
654 pages
ISBN:9781450307161
DOI:10.1145/2047196
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 16 October 2011


Author Tags

  1. gestures
  2. multimodal
  3. pen
  4. tabletop
  5. touch

Qualifiers

  • Research-article

Conference

UIST '11

Acceptance Rates

UIST '11 Paper Acceptance Rate: 67 of 262 submissions, 26%
Overall Acceptance Rate: 561 of 2,567 submissions, 22%



Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months): 24
  • Downloads (Last 6 weeks): 1
Reflects downloads up to 17 Feb 2025


Citations

Cited By

  • (2024) OptiBasePen: Mobile Base+Pen Input on Passive Surfaces by Sensing Relative Base Motion Plus Close-Range Pen Position. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3654777.3676467, 1-9. Online publication date: 13-Oct-2024.
  • (2024) Exploiting Physical Referent Features as Input for Multidimensional Data Selection in Augmented Reality. ACM Transactions on Computer-Human Interaction 31(4), 10.1145/3648613, 1-40. Online publication date: 19-Sep-2024.
  • (2022) Understanding How People with Limited Mobility Use Multi-Modal Input. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 10.1145/3491102.3517458, 1-17. Online publication date: 29-Apr-2022.
  • (2021) PenShaft: Enabling Pen Shaft Detection and Interaction for Touchscreens. 12th Augmented Human International Conference, 10.1145/3460881.3460934, 1-9. Online publication date: 27-May-2021.
  • (2021) FacialPen: Using Facial Detection to Augment Pen-Based Interaction. Proceedings of the Asian CHI Symposium 2021, 10.1145/3429360.3467672, 1-8. Online publication date: 8-May-2021.
  • (2021) Deep Learning-Based Hand Posture Recognition for Pen Interaction Enhancement. Artificial Intelligence for Human Computer Interaction: A Modern Approach, 10.1007/978-3-030-82681-9_7, 193-225. Online publication date: 5-Nov-2021.
  • (2020) A 26-Contact Tangible Pen-Like Input Device for Capacitive Displays. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, 10.1145/3334480.3383157, 1-4. Online publication date: 25-Apr-2020.
  • (2020) Manipulation, Learning, and Recall with Tangible Pen-Like Input. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 10.1145/3313831.3376772, 1-12. Online publication date: 21-Apr-2020.
  • (2019) Experimental Analysis of Single Mode Switching Techniques in Augmented Reality. Proceedings of the 45th Graphics Interface Conference, 10.20380/GI2019.20, 1-8. Online publication date: 1-Jun-2019.
  • (2019) WatchPen. Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, 10.1145/3338286.3340122, 1-8. Online publication date: 1-Oct-2019.
