Abstract:
Smart-phone users can search for information about surrounding facilities or for a route to their destination. However, it is difficult to read or search for such information while walking because of low legibility, so users have to stop walking or enlarge the screen. Our previously proposed smart-phone system switches its information presentation policy in response to the user's context. In this paper, we describe the context recognition mechanism for this system, which estimates the user's context from sensors embedded in the smart-phone. We use a Support Vector Machine for context classification and compare four types of feature values: FFT and three types of Wavelet Transforms. Experimental results show recognition rates of 87.2 % with FFT, 90.9 % with the Gabor Wavelet, 91.8 % with the Haar Wavelet, and 92.1 % with the Mexican Hat Wavelet.
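The abstract outlines a pipeline of windowed sensor data, frequency-domain feature extraction (FFT or wavelet transforms) and SVM classification, but the paper text is not included here. The following is a minimal, hypothetical sketch of that kind of pipeline in Python, assuming NumPy, PyWavelets and scikit-learn; the window length, sensor channels, class labels and SVM settings are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch: accelerometer windows -> frequency-domain features
    # (FFT magnitudes or Haar wavelet energies) -> SVM classification.
    # Window size, sampling, labels and SVM parameters are assumptions.
    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    def fft_features(window):
        """Concatenated magnitude spectra of each accelerometer axis."""
        # window: (n_samples, 3) array of x/y/z acceleration
        return np.concatenate([np.abs(np.fft.rfft(window[:, axis]))
                               for axis in range(window.shape[1])])

    def haar_features(window, level=3):
        """Energy of Haar wavelet coefficients at each decomposition level, per axis."""
        feats = []
        for axis in range(window.shape[1]):
            coeffs = pywt.wavedec(window[:, axis], 'haar', level=level)
            feats.extend(np.sum(c ** 2) for c in coeffs)
        return np.array(feats)

    # Toy usage with synthetic data; real input would come from the phone's sensors.
    rng = np.random.default_rng(0)
    windows = rng.normal(size=(200, 128, 3))   # 200 windows, 128 samples, 3 axes
    labels = rng.integers(0, 2, size=200)      # e.g. 0 = stopped, 1 = walking

    X = np.stack([haar_features(w) for w in windows])
    X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

    clf = SVC(kernel='rbf')                    # the paper's SVM settings are not given
    clf.fit(X_train, y_train)
    print("accuracy on held-out windows:", clf.score(X_test, y_test))

Swapping haar_features for fft_features (or a continuous-wavelet variant) would mirror the feature comparison the abstract reports, although the exact feature construction used in the paper is not specified here.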
Published in: 2012 IEEE Virtual Reality Workshops (VRW)
Date of Conference: 04-08 March 2012
Date Added to IEEE Xplore: 12 April 2012
ISBN Information: