Abstract
The user interface is one of the most important factors in making the various functions of a mobile device easy to use. Its importance has grown since Apple emphasized emotional UX (User eXperience) by applying multi-touch to the iPhone. Mobile device manufacturers have tried to use embedded sensors as user interfaces for applications: the microphone executes functions in combination with voice recognition, and the GPS sensor locates the user in navigation or game apps. In such services, sensors act as an interface between the user and the device. In this paper, we propose an intuitive user interaction method that uses multiple sensors for NPR (Non-Photorealistic) rendering. The proposed method renders a pencil drawing filter for a given photo image, with the direction and strength of the line strokes applied in vector fields. The method reads three coordinate values in real time from the orientation sensor of the mobile device, changes the stroke direction with the x- and y-coordinate values, and controls the stroke strength with the z-coordinate value. It also adjusts the brightness of the image by transforming the light intensity reported by the light sensor, which updates the intensity of the light source continuously: if the user brings the mobile device closer to an external light source, the sensor value increases and the brightness becomes stronger. In addition, by combining the orientation sensor with the light sensor, the user can create an attenuation effect over a specific region of the photo image and change the attenuated region by moving the hand-held device. This is a user interaction method for emotional image processing using the sensors of mobile devices. We developed an app that applies a pencil drawing filter effect to a photo image, experimented with sensor-based emotional user interaction, and compared various image processing results obtained with the two sensors in combination. Finally, the experimental results demonstrate the efficiency of the approach and evaluate the effectiveness of the proposed method.
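The sensor-to-parameter mapping described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): it assumes orientation readings in degrees, derives a stroke direction from the x- and y-values, normalizes the z-value into a stroke strength, and maps a lux reading from the light sensor to a brightness multiplier. The ranges and scaling constants are assumptions for the sketch.

```python
import math

def stroke_params(x_deg, y_deg, z_deg):
    """Map orientation-sensor readings (degrees) to pencil-stroke parameters.
    Hypothetical sketch: x/y set the stroke direction, z sets the strength."""
    # Stroke direction: angle in the image plane derived from x and y tilt.
    direction = math.degrees(math.atan2(y_deg, x_deg)) % 360.0
    # Stroke strength: normalize z tilt (assumed range -90..90) into [0, 1].
    strength = min(abs(z_deg) / 90.0, 1.0)
    return direction, strength

def brightness_scale(lux, max_lux=1000.0):
    """Map a light-sensor reading (lux) to a brightness multiplier in
    [0.5, 1.5]; bringing the device closer to a light source raises lux
    and therefore the image brightness."""
    level = min(max(lux, 0.0), max_lux) / max_lux
    return 0.5 + level
```

On Android, the two functions would be fed from `SensorEventListener.onSensorChanged` callbacks for the orientation and light sensors, updating the filter parameters in real time as the abstract describes.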
Jung, JJ., Kim, JY., Chung, HS. et al. An intuitive user interaction method using multi-sensors for pencil drawing filter of NPR rendering in mobile devices. Multimed Tools Appl 74, 2371–2389 (2015). https://doi.org/10.1007/s11042-014-2054-7