Abstract
Even though an increasing number of products is bought online today, the remote evaluation of product surfaces remains a challenging task because of their multimodal nature. Although modern mobile consumer devices, which would allow for multimodal user interfaces, are commonly used for online shopping, current product presentations mainly focus on the visual and (rarely) auditory modality. However, tactile properties are also significant for a buying decision and should thus not be disregarded. Current multimodal approaches still require specific hardware, especially for the presentation of tactile properties. In this paper, we therefore present an approach for the multimodal inspection of product surfaces using only common mobile consumer hardware such as smartphones and tablets.
Approaches for the adaptation of the visual, auditory, and tactile output based on device features and user inputs are presented that enable an interactive multimodal evaluation of product surfaces. The tactile presentation is realized using only the vibration motor built into modern mobile devices. In order to deal with the restrictions of mobile systems, a four-step tactile signal processing approach is introduced. The realization of these approaches is demonstrated by means of an Android application, which was used to conduct an initial user study that shows promising results but also potential for future work.
Keywords
- Virtual product surface representation
- Multimodal user interface
- Vibrotactile feedback
- Mobile user interface
1 Introduction
Even with the e-commerce market continuing to grow in recent years, the challenge of remotely evaluating product surfaces has not yet been solved. This mainly results from their multimodal nature, i.e. they are defined by multiple visual, auditory, and haptic properties. While the presentation of products currently focuses on the display of visual and sometimes auditory properties, the haptic modality remains vital and should not be disregarded, e.g., when evaluating fashion [30]. Thus, a multimodal presentation of product surfaces, one that incorporates all three modalities, is required and may also help to reduce the high number of product returns in online shopping, which has become an increasing challenge for the e-commerce sector [11].
A subjective inspection of product surfaces is still essential for their comprehensive and genuine evaluation. For this task, a representation of their visual, auditory, and haptic properties has to be displayed on a multimodal user interface. The visual and auditory properties are usually represented by photos and (on rare occasions) by prerecorded audio files using displays and speakers. Few approaches exist for presenting the haptic properties of product surfaces, and those that do mostly use vibration to represent roughness (a tactile property). However, these approaches rely on specific hardware such as arrays of vibrating contactor pins [1, 22] or electrodynamic shakers [2] that is neither mobile nor widely available to the public. This greatly hinders their use for e-commerce applications, as most online shopping today is already done on common mobile devices such as smartphones and tablets [10], which require no additional hardware.
However, mobile devices today already allow for multimodal interactions without the need for additional hardware by offering distinct sensors and actuators: a touch sensitive screen for tactile input as well as visual output, speakers for auditory output, and a vibration motor for tactile feedback. In this paper, we therefore introduce an approach that allows the remote multimodal (auditory, visual, and tactile) inspection of product surfaces using only common mobile consumer hardware like smartphones or tablets. For this purpose, a multimodal user interface is presented that allows the adaptation of prerecorded auditory, visual, and tactile media data of product surfaces according to user interactions.
After a short introduction to the multimodal inspection of product surfaces (especially of textile fabrics) in Sect. 2, the processing of the multimodal signals is described in Sect. 3. The realization of these algorithms in an Android app is introduced in Sect. 4 and evaluated by means of a user study in Sect. 5. Finally, in Sect. 6 we draw conclusions about the applicability of our approach for the multimodal inspection of product surfaces on common mobile devices and discuss future work.
2 Multimodal Properties of Product Surfaces
As stated earlier, product surfaces are defined by multiple properties of different modalities. In the following, some of the most important properties are discussed, subdivided by modality. In order to achieve a realistic virtual representation of product surfaces, as many of these properties as possible should be presented in a multimodal user interface. In this section, textile fabrics are used as the exemplary material, as their flexible nature results in the most diverse properties and hence poses the most challenging task for a virtual multimodal presentation.
2.1 Visual and Auditory Inspection
Color is a primary visual characteristic of product surfaces. In addition to the pure color tone, brightness and saturation play their own roles. These properties, also known as color depth, are determined, e.g., by dye concentration. For the objective evaluation of color depth, color tone metrics and standard type depths have been defined and standardized [15]. However, as color must be evaluated by each person individually based on personal taste, a photo may be used to show the color of the product surface. As colors may look different under certain light conditions, all photos should be taken using a constant light color. The reflection behavior may additionally be captured by taking multiple photos with changing angles of the lighting system. Visual properties are often already presented in today's online shops through multiple photos of the product, e.g., pictures of the clothes worn by models or even close-ups of the textile fabrics. This way, the design of the surface can also be presented.
An auditory description of a product surface may be obtained by mechanical interaction with the surface itself, e.g. by swiping over a textile surface with a finger or with the aid of an object [2]. If audio is recorded during the swiping motion, an audio profile of the fabric can be generated, which depends among others on the selected speed of the object [3]. This recording procedure can also be varied with regard to the contact head (different materials such as organic fabrics or a finger) or the swiping direction in order to increase the significance of the audio recording. The resulting audio record may then give an impression of auditory properties such as the loudness of a fabric or the creaking sound of leather [20].
2.2 Kinesthetic and Tactile Inspection
The haptic properties of a product surface are usually the most diverse of the multimodal properties. In the textile industry, the totality of all sensory impressions associated with touching or manipulating a fabric by hand or finger is called fabric hand [13]. These impressions result from the mechanical properties of the product surface (e.g. its surface texture) and stimulate both the kinesthetic and tactile mechanoreceptors [19]: tactile due to the unavoidable skin contact with the surface, and kinesthetic as a prerequisite for establishing contact and actively exploring the surface by moving the hand and fingers.
The first major haptic property to consider when evaluating product surfaces, especially textile fabrics because of their flexible nature, is stiffness, i.e. the resistance against deformation when force is applied. It is therefore mostly registered by the kinesthetic mechanoreceptors. A standardized test [8] for measuring a product's stiffness has already been established. It enables an objective comparison of stiffness by means of a specific measurement value.
Physically, a product surface is defined among others by its friction. While the coefficient of friction may be used to objectively describe this effect, the roughness of a surface may still be perceived differently by every individual person [2]. The perceived roughness of a surface results from high-frequency vibrations, caused by the surface structure and the softness of the textile fabric when swiping a finger over it, and is registered by the tactile mechanoreceptors (cf. [27]). A standardized measurement method [14] may be used to calculate the roughness depth of a surface by means of a 3D scan.
Other properties like the form, weight, compressibility, and temperature are also important when evaluating a product surface and may be measured.
3 Multimodal Signal Processing
As stated in the last section, the auditory and haptic presentations depend among others on the position, direction, and velocity of the finger movement. In order to provide an interactive multimodal user interface for the inspection of product surfaces, it is hence not only necessary to find an expressive representation of the properties described in Sect. 2 but also to continuously adapt the multimodal data to user inputs, as displayed in Fig. 1. Therefore, this section discusses representative multimodal media data that can be prerecorded, as well as algorithms for the adaptation of this data to presentation devices and user inputs.
3.1 Visual and Auditory Signal Processing
As discussed in Sect. 2.1, the visual properties of a product surface may be interpreted differently by each person and are thus best presented by displaying a photo, allowing an individual evaluation of the surface. An example of such a photo is shown in Fig. 2a. Photos are shot as a top view to offer the best resemblance of a real fabric when shown on a touch screen. However, to allow an accurate evaluation of the surface, the photo must be displayed true to scale, because otherwise information about the design or surface structure may be distorted. For this, additional information about the real dimensions of the presentation device and the photo is required. While the former can be calculated from the pixel density that may be read from the system API (cf. [5]), the latter must be stored in a database as an additional image property. Based on this information, the photo can be zoomed so that it is displayed on the device screen true to scale, enabling an accurate evaluation of the surface.
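The following sketch illustrates how such a true-to-scale factor could be derived on Android from the pixel density (cf. [5]); the function and parameter names are our own, and the real-world photo width is assumed to be available from the product database:

```kotlin
import android.util.DisplayMetrics

// Sketch: compute a zoom factor so that 1 mm of fabric in the photo
// covers 1 mm on the device screen.
fun trueToScaleFactor(
    photoWidthPx: Int,       // width of the photo in pixels
    photoWidthMm: Float,     // real-world width of the photographed area (stored in the database)
    metrics: DisplayMetrics  // obtained via context.resources.displayMetrics
): Float {
    val screenPxPerMm = metrics.xdpi / 25.4f       // device pixels per millimetre
    val photoPxPerMm = photoWidthPx / photoWidthMm // photo pixels per millimetre
    return screenPxPerMm / photoPxPerMm            // scale applied when rendering the photo
}
```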
Currently, there is no animated presentation of the deformation of the surface resulting from the physical interaction with the textile fabric as, e.g., proposed by Magnenat-Thalmann et al. [22]. As this could further enhance the visual evaluation, especially of flexible textile product surfaces, an animation of the surface deformation based on user interaction and information about the fabric stiffness should also be added in future work. Other visual information, such as the measured color depth, may be displayed in a tabular view and could, e.g., be used for finding further fabrics with similar colors.
Auditory properties may be represented by the playback of prerecorded audio files. However, as the audio feedback created by swiping over a product surface changes with the velocity of the motion, the output must be adapted continuously according to the current user interaction. This can be accomplished by altering the playback speed by means of a time stretching mechanism, which allows changing the playback speed without affecting the pitch. To allow for such processing, the audio file must be recorded using a swiping motion at a constant speed, and this original velocity must be stored in a database. In addition, the velocity of the finger movement on the device screen must be tracked and may then be used for the parameterization of the time stretching effect. On top of that, our studies have shown that for a higher acceptance of the virtual presentation, the audio signal must also get louder with increasing finger velocity and vice versa. Hence, the amplitude of the audio signal should also be adapted based on the current finger velocity. To allow for a multimodal user interaction, the adaptation is performed continuously while the user interacts with the photo of the product surface on the touch screen of the mobile device, as shown in Fig. 1.
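As a minimal sketch of this adaptation on Android, the standard MediaPlayer can act as a time stretcher when its playback speed is changed while the pitch is kept constant; the velocity-to-volume mapping below is a simplified assumption, not the exact curve used in our application:

```kotlin
import android.media.MediaPlayer
import android.media.PlaybackParams

// Sketch: adapt a swiping sound to the current finger velocity.
// Assumes the player is already playing; recordedVelocity is the constant
// swiping speed stored with the recording.
fun adaptAudio(player: MediaPlayer, fingerVelocity: Float, recordedVelocity: Float) {
    val speed = (fingerVelocity / recordedVelocity).coerceIn(0.25f, 4.0f)
    // Changing the speed while keeping the pitch at 1.0 acts as time stretching (API 23+).
    player.playbackParams = PlaybackParams().setSpeed(speed).setPitch(1.0f)
    // Louder for faster motions, quieter for slower ones (illustrative linear mapping).
    val volume = (0.5f + 0.25f * speed).coerceIn(0.0f, 1.0f)
    player.setVolume(volume, volume)
}
```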
3.2 Tactile Signal Processing
While the auditory and visual modalities are already commonly supported by mobile user interfaces, tactile feedback is typically only used for alerting the user to new notifications or for button presses, e.g., on the on-screen keyboard. Different predefined vibration patterns may also be used for different kinds of notifications. However, the inspection of product surfaces requires a continuous, adaptive haptic feedback.
In this section, we therefore introduce a four-step algorithm that generates a tactile output using the vibration motor of mobile devices. Unfortunately, a kinesthetic output cannot be provided using common mobile consumer devices, as they lack actuators for kinesthetic stimulation. Consequently, we focus on the representation of the tactile property roughness (cf. Sect. 2.2), as it is currently the only tactile property that may be simulated on a touch display [2].
As stated earlier, the roughness of a surface is defined by its structure, which may be recorded by 3D scanning the surface. The result of such a 3D scan is a three-dimensional point cloud, an example of which is shown in Fig. 2b. However, as a very high resolution is required to record the fine structures of textile surfaces, the resulting point clouds usually become too large for processing on a mobile device. Hence, a data reduction is required in order to make the point clouds usable. Allerkamp et al. [1] propose a grayscale image as a representation of the surface profile, which they calculated from a single photo of a surface. Adapting this approach, we propose transforming the point cloud into a grayscale image representing a physical map of the surface profile. Each pixel of this image represents an area on the x-y-plane of the point cloud, with the brightness value B(x, y) of a pixel representing the average height z of the area. An example of a resulting surface profile image can be seen in Fig. 2c.
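A possible implementation of this reduction is sketched below: the scan is binned into a regular grid on the x-y-plane, and the average height of each cell is rescaled to an 8-bit brightness value. The types and parameter names are our own assumptions:

```kotlin
data class Point3(val x: Float, val y: Float, val z: Float)

// Sketch: reduce a 3D surface scan to a grayscale height map.
// Each output pixel covers cellSize x cellSize on the x-y-plane; its value is
// the average z of all scan points in that cell, rescaled to 0..255.
fun toHeightMap(points: List<Point3>, width: Int, height: Int, cellSize: Float): Array<IntArray> {
    val sums = Array(height) { DoubleArray(width) }
    val counts = Array(height) { IntArray(width) }
    for (p in points) {
        val col = (p.x / cellSize).toInt().coerceIn(0, width - 1)
        val row = (p.y / cellSize).toInt().coerceIn(0, height - 1)
        sums[row][col] += p.z
        counts[row][col]++
    }
    val zMin = points.minOf { it.z.toDouble() }
    val zMax = points.maxOf { it.z.toDouble() }
    val range = (zMax - zMin).coerceAtLeast(1e-9) // avoid division by zero for flat scans
    return Array(height) { r ->
        IntArray(width) { c ->
            if (counts[r][c] == 0) 0 // no scan point fell into this cell
            else (255 * (sums[r][c] / counts[r][c] - zMin) / range).toInt().coerceIn(0, 255)
        }
    }
}
```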
Based on the surface profile image, which can easily be processed by mobile devices due to its reduced amount of data, we propose a cyclic four-step tactile signal processing algorithm that is shown in Fig. 3. The algorithm enables the continuous adaptation of a vibrotactile feedback to the user's swiping motion over the touch screen of a mobile device displaying the photo of a product surface (cf. Sect. 3.1).
Step 1 of the process requires the continuous tracking of the finger position on the touch screen, which corresponds to a specific position on the inspected product surface. As user inputs are usually only registered by discrete touch events, the continuous path of the user's swiping movements \(\vec{x}(t)\) must be reconstructed. At first, the positions associated with the touch events \(\vec{x}(t_i)\) must be converted from pixel values to real-world dimensions to enable a later mapping of the finger position onto the surface profile image. As described in Sect. 3.1, the pixel density may be used for this step. In order to reconstruct a steady, continuous path, the ranges between consecutive sampling points of the touch events are interpolated using a linear function according to Eq. 1.
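For two consecutive touch events at \(t_i\) and \(t_{i+1}\), such a linear interpolation takes the following form (a plausible reconstruction of Eq. 1 from the surrounding definitions):

$$\vec{x}(t) = \vec{x}(t_i) + \frac{t - t_i}{t_{i+1} - t_i}\,\bigl(\vec{x}(t_{i+1}) - \vec{x}(t_i)\bigr), \qquad t_i \le t \le t_{i+1}$$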
Based on the finger position history from step 1 and the surface profile image, step 2 calculates the height variations z(t) that a finger would have passed on the real surface during its trajectory. Using Eq. 2, the height z of the product surface at a specific position \(\vec{x}(t_i)\) can be reconstructed from the brightness values \(B(\vec{x})\) of the surface profile image, based on the resolution of the surface profile image and the scaling factor that maps brightness values to height values, both of which must be stored in a database in addition to the surface profile image.
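A minimal sketch of this lookup is given below; it corresponds to a reconstruction of Eq. 2 of the form \(z(\vec{x}(t_i)) = c_z \cdot B(r\,\vec{x}(t_i))\), where \(r\) is the image resolution and \(c_z\) the brightness-to-height scaling factor (hypothetical symbol and parameter names):

```kotlin
// Sketch of step 2: reconstruct the surface height z at a finger position
// from the surface profile image (cf. toHeightMap above).
fun surfaceHeight(
    heightMap: Array<IntArray>, // grayscale surface profile image, values 0..255
    xMm: Float, yMm: Float,     // finger position in real-world millimetres
    resolution: Float,          // pixels of the profile image per millimetre (stored parameter)
    heightScale: Float          // millimetres of height per brightness step (stored parameter)
): Float {
    val col = (xMm * resolution).toInt().coerceIn(0, heightMap[0].size - 1)
    val row = (yMm * resolution).toInt().coerceIn(0, heightMap.size - 1)
    return heightMap[row][col] * heightScale // z = c_z * B(x, y)
}
```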
Figure 4 shows the resulting reconstructed surface height variation history \(z(t)=z(\vec {x}(t))\) for an exemplary path and surface profile image.
To allow an output to the vibration motor, an output signal \(P(t)=P(z(t))\) is generated in step 3. This is done by analyzing the surface height variation history, as it resembles the surface structure that the tactile mechanoreceptors recognize as the roughness of the product surface. Common mobile systems usually only allow controlling the vibration motor by using predefined vibration patterns or by setting a power level for a defined amount of time (see e.g. [6]). Hence, the output signal must be reduced to a set of discrete vibration power values that represent the roughness feeling of the surface.
Allerkamp et al. [1] proposed decomposing the vibration signal into two base frequencies of \(40\,\mathrm{Hz}\) and \(320\,\mathrm{Hz}\) based on the frequency spectrum created by a Fourier transformation. Comparable to this approach, we propose using a Fast Fourier Transformation [12] of the surface height variation history from step 2 to enable the processing of the frequency spectrum: \(F(\omega ) = \mathcal {F}(z(t))\). To cope with the inertia of vibration motors in mobile systems, \(50\,\mathrm{ms}\) was defined as the window for the Fourier transformation. The sample rate is set to \(1600\,\mathrm{Hz}\), which is twice as high as the upper sensory threshold of the Pacinian corpuscles [18], the tactile mechanoreceptors that mainly register vibration.
Due to the restrictions of mobile systems, our algorithm must reduce the vibration signal to a single vibration power value per execution cycle, with an unknown output frequency, as vibration frequencies may vary between mobile devices. In order to determine this single value, the weighted root mean square of the determined frequency spectrum is calculated as shown in Eq. 3. The root mean square is used to assign a stronger weighting to the more dominant frequencies.
With lower frequencies usually representing static effects, e.g. when pressing the finger against a corner, and higher frequencies being primarily responsible for the perception of roughness (cf. [27]), frequencies below \(10\,\mathrm{Hz}\) and \(40\,\mathrm{Hz}\), respectively, are weighted less by means of the factor \(c_{rec}(\omega )\). Additionally, frequencies higher than \(800\,\mathrm{Hz}\) are cut off, because they are above the human sensory threshold (cf. [18]), i.e. they cannot be recognized by the tactile mechanoreceptors.
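Under these definitions, Eq. 3 plausibly takes the form \(P = \sqrt{\frac{1}{N}\sum_{\omega}\bigl(c_{rec}(\omega)\,|F(\omega)|\bigr)^{2}}\); the sketch below implements this weighted RMS for one 50 ms window (80 samples at 1600 Hz). The concrete weighting values in cRec are illustrative assumptions; only the 800 Hz cut-off and the reduced weighting of low frequencies follow the description above:

```kotlin
import kotlin.math.sqrt

// Sketch of step 3: collapse the magnitude spectrum of one analysis window
// into a single vibration power value via a weighted RMS (cf. Eq. 3).
// magnitudes[i] = |F(omega)| at frequency i * binHz.
fun vibrationPower(magnitudes: DoubleArray, binHz: Double): Double {
    fun cRec(f: Double): Double = when {
        f > 800.0 -> 0.0  // above the tactile sensory threshold: cut off
        f < 10.0  -> 0.25 // static effects: weighted less (illustrative value)
        f < 40.0  -> 0.5  // transition band (illustrative value)
        else      -> 1.0  // roughness-relevant frequencies
    }
    var sumSquares = 0.0
    for (i in magnitudes.indices) {
        val weighted = cRec(i * binHz) * magnitudes[i]
        sumSquares += weighted * weighted
    }
    return sqrt(sumSquares / magnitudes.size)
}
```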
With the vibration power P(t) now determined, it may be output to the vibration motor in step 4 using the device's system API. For this, the value must be rescaled to match the expected value range of the mobile system (e.g., \(0 \le P_v(t) \le 255\) in the case of Android [6]): \(P_v(t) = c_{v} \cdot P(t)\). The scaling factor \(c_v\) may, however, depend on the specific vibration motor or even the user, i.e. it cannot be calculated and was therefore determined in a user study presented in Sect. 5.
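On Android, this final step could be sketched as follows, driving the motor for one 50 ms execution cycle through the Vibrator API [6]; the helper name and the handling of the zero case are our own assumptions:

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator

// Sketch of step 4: rescale the computed power with the hardware-/user-specific
// factor cV and output it for one execution cycle (API level 26+).
fun outputVibration(vibrator: Vibrator, power: Double, cV: Double) {
    val scaled = (cV * power).toInt()
    if (scaled <= 0) {            // nothing to output in this cycle
        vibrator.cancel()
        return
    }
    val amplitude = scaled.coerceAtMost(255) // Android expects amplitudes in 1..255
    vibrator.vibrate(VibrationEffect.createOneShot(50, amplitude))
}
```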
4 Realization of a Mobile Application
In order to demonstrate our approach, an Android application was created that loads prerecorded multimodal media data from the internet and realizes a multimodal user interface by integrating the signal processing approaches described in Sect. 3. The application was tested on a Sony Xperia XZ3 smartphone [25], which offers a high-resolution screen, loudspeakers, and, as part of its Dynamic Vibration System, a linear resonant actuator as vibration motor, which provides a better transient response [29] and thus allows a better tactile output.
At first, multimodal media data and measurements for 26 textile surfaces were recorded and stored in a relational database as described in [20]. A RESTful web service interface (see Note 1) was then created that allows browsing the available products and provides the required multimodal media data as well as the parameters for the multimodal signal processing in JSON syntax. Furthermore, the media data are provided in an efficient way through the support of multiple media codecs for file size reduction, and a caching mechanism avoids the re-rendering of media data with previously used quality parameters (e.g. equivalent resolutions). To retrieve the data, a web service client was implemented in the Android application using the GSON library (see Note 2). The products loaded from the web service are listed in the user interface shown in the left image of Fig. 5.
Upon selection of a product, the multimodal interaction screen is shown, the debug version of which appears in the center image of Fig. 5. At the bottom of the screen, the top-view photo of the product surface is shown true to scale as described in Sect. 3.1. In the non-debug version of this screen, the photo is also displayed instead of the three diagrams, which show the finger position history (top, x(t) and y(t)), the surface height variation history (middle, z(t)), and the resulting vibration signal (bottom, \(P_v(t)\)). When swiping over the photo of the product surface, a vibration signal is generated and output by the continuous execution of the tactile signal processing algorithm described in Sect. 3.2. Furthermore, the prerecorded swiping sound is played and adapted to the finger velocity as described in Sect. 3.1.
An alternate output mode was also implemented that allows audio and haptic output to external audio and haptic devices such as electrodynamic shakers via a stereo audio signal. For this, the left channel is used for audio output while the right channel may be used for pre-rendered haptic signals as, e.g., introduced in [2]. Haptic devices may then be connected to the mobile device using the headphone jack.
Additionally, the measured multimodal properties of the product surface (cf. Sect. 2) that are stored in the database are listed in a tabular view as shown in the right image of Fig. 5. These allow a comparison of products, e.g., based on the coefficient of friction or the fibre-composition.
The multimodal interaction design poses special challenges, such as allowing only certain maximum latencies for haptic (\(36\,\mathrm{ms}\) [17]), auditory (\(30\,\mathrm{ms}\) [23]), or visual (\(25\,\mathrm{ms}\) [21]) feedback to a haptic input if a natural feeling of the user interface is to be maintained. As the continuous tracking of the finger position in Android already introduces certain delays, and the calculation of the corresponding multimodal feedback signals adds further delay because of its 50 ms execution cycle time, an approximation of the future finger trajectory was also added to our application. The extrapolation is based on a regression of the finger position history \(\vec{x}(t)\) from step 1 of the tactile signal processing algorithm. The method of least squares with a quadratic model function was used to calculate the extrapolated trajectory function. However, the error increases significantly for rapid finger movements, as shown in the exemplary extrapolated finger trajectory in Fig. 6, e.g., between 4 s and 4.6 s.
Therefore, the approximated time frame should be as short as possible. According to [9, 26], haptic user inputs exhibit a bandwidth of up to 5 Hz for both arm and finger movements for known or trained patterns. Hence, the extrapolated range should not exceed 100 ms to avoid rapid changes of direction, as a complete back-and-forth movement of the finger should take at least 200 ms. A sketch of the extrapolation is shown below.
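The following compact sketch performs such a quadratic least-squares extrapolation for one coordinate (the same fit is applied to x and y independently); it solves the 3x3 normal equations directly via Cramer's rule, without assuming any external math library:

```kotlin
// Sketch: fit f(t) = a0 + a1*t + a2*t^2 to the recent position history of one
// axis and evaluate it at a future time (up to ~100 ms ahead, see above).
fun extrapolate(times: DoubleArray, values: DoubleArray, tFuture: Double): Double {
    var s1 = 0.0; var s2 = 0.0; var s3 = 0.0; var s4 = 0.0
    var b0 = 0.0; var b1 = 0.0; var b2 = 0.0
    val s0 = times.size.toDouble()
    for (i in times.indices) {
        val t = times[i]; val v = values[i]; val t2 = t * t
        s1 += t; s2 += t2; s3 += t2 * t; s4 += t2 * t2
        b0 += v; b1 += t * v; b2 += t2 * v
    }
    // Determinant of a 3x3 matrix given row-wise.
    fun det3(a: Double, b: Double, c: Double,
             d: Double, e: Double, f: Double,
             g: Double, h: Double, i: Double) =
        a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    val d = det3(s0, s1, s2, s1, s2, s3, s2, s3, s4)
    val a0 = det3(b0, s1, s2, b1, s2, s3, b2, s3, s4) / d
    val a1 = det3(s0, b0, s2, s1, b1, s3, s2, b2, s4) / d
    val a2 = det3(s0, s1, b0, s1, s2, b1, s2, s3, b2) / d
    return a0 + a1 * tFuture + a2 * tFuture * tFuture
}
```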
5 User Study
As already discussed in Sect. 3.2, the scaling factor \(c_v\) required for the vibrotactile output is hardware- and user-specific. In order to determine a value for the scaling factor and to validate our application, we conducted a user study using a modified version of the Android application and the Sony Xperia XZ3 smartphone (cf. Sect. 4). The multimodal inspection screen, shown in the center of Fig. 5, was modified for this user study by replacing the three debug diagrams with a slider that allows changing the scaling factor \(c_v\) within the range \(\left[ 0, 40\right] \). While a value of 0 results in no vibrations, 40 generates a permanent maximum vibration for the roughest textile surface of the experiment. The three textiles shown in Fig. 7 were selected for the experiment as their surface structures range from very smooth (H71) to very rough (H31).
For the user study, the smartphone and textiles were fixated on a table, allowing the subjects to explore both the real and the virtual textile surfaces by swiping their fingertips across the surfaces. The swiping motions were performed at a faster and a slower velocity, which were visually monitored. Noise-canceling headphones were used for audio output to reduce possible outside influences, as the sound of textile surfaces is very quiet and subtle.
Thirteen subjects, 7 men and 6 women, aged between 21 and 34 or above 50 years, were asked to determine the point of subjective equality (PSE) as, e.g., introduced in [4]. That means the subjects were asked to set the scaling factor \(c_v\) to the value at which they felt the vibration of the smartphone gave the most realistic impression of the corresponding real textile surface. The experience level of the subjects in handling textiles ranged from beginner to expert, though most of them were beginners. No subject had previous experience with the tactile simulation of surfaces. The order of the textiles and the swiping velocities were randomized. The subjects were given a training phase of approximately 5 min, with the entire experiment taking approximately 20 min on average.
The resulting PSE values of \(c_v\) and their standard deviations are shown in Fig. 8. The mean PSE values increase with increasing surface roughness. For example, for the slow swiping motion, the PSE values range from 22.2 (H71) to 27.2 (H31). The relative standard deviation ranges from \(23\%\) (B1, fast) and \(29\%\) (B1, slow) to \(34\%\) (H71, fast) and \(41\%\) (H31, slow). PSE values for the fast swiping motion are on average \(10\%\) higher than for the slow velocity.
The results show that an increase in roughness also requires an increased scaling factor \(c_v\). An increase in vibration power is expected, as a bigger amplitude is usually perceived as a rougher surface texture. However, by using the surface height profile, the surface structure is already factored into our tactile signal processing algorithm. Hence, the scaling factor should be approximately constant across all textiles for a given hardware and user. Thus, an adaptation of the algorithm should be considered in future work, e.g. by integrating additional measured roughness values as described in [20, 24].
The high relative standard deviations are caused by user-specific perceptual characteristics. In order to set the user-specific scaling factor, an initialization phase should be provided in the application. The highest relative standard deviation was found for the roughest fabric (H31) at the slow velocity, which indicates an imperfect tactile user experience. This is due to the fact that the surface structure of this fabric is very sharp and is mainly perceived statically, i.e. at the low frequencies described in Sect. 3.2, by the tactile mechanoreceptors, especially at lower velocities. This cannot be reproduced by the smooth glass surface of the smartphone or by the small amplitudes of the vibrations provided by the vibration motors of mobile devices. On the other hand, the lowest relative standard deviation is found for the medium-rough jeans fabric (B1). This, in combination with comments by the subjects, shows that the vibrotactile output of a smartphone is best suited for rougher surfaces without distinct sharp structures.
Furthermore, subjects of the study were also asked to rate the haptic quality of the vibrotactile representation of the real textile surfaces from very unrealistic (1) to very realistic (7). The results depicted in Fig. 9 show no major difference in the assessed quality between the different surface textures or velocities.
As the mean rating shows a below-average result (3.18), future work is still required regarding the tactile feedback. Criticism from the subjects primarily concerned the hard, cold, and smooth glass surface of the smartphone being too different from the soft and flexible nature of textile fabrics, and the vibration intensity being too low for very rough surfaces (H31) even with the scaling factor set to the maximum. However, when asked whether such virtual tactile inspection options would support their decision to buy more clothes online in the future, most subjects thought it probable (\(40\%\)) or were still undecided (\(27\%\)). This underlines the great potential of the virtual tactile inspection of product surfaces.
6 Conclusions and Future Work
In this paper, we have presented an approach for the multimodal inspection of product surfaces using common mobile devices such as smartphones. To create a virtual representation of the multimodal properties of product surfaces, we introduced algorithms for the adaptation of prerecorded visual, auditory, and tactile media files to user interactions and device features. While visual and auditory presentations may already be found, e.g. within e-commerce applications, the vibrotactile presentation of surfaces using only the vibration motor of modern mobile devices is very uncommon. We therefore introduced a four-step tactile signal processing algorithm that generates vibrotactile feedback based on the surface profile and user interactions with the virtual product surface.
We have demonstrated the applicability of this approach by realization of a web service – providing the multimodal media data – and an Android application. The application was tested using a modern smartphone with a linear resonant actuator for vibration output. Based on this application, a user study has been conducted in order to define the user-specific scaling factor \(c_v\). Results of this user study show the potential of multimodal inspection of product surfaces but also the need for further developments, especially in the area of tactile output.
Initial tests with mobile devices that use rotary vibration motors showed that the vibrotactile output produces very poor results due to their higher inertia. Some new technologies have been introduced recently in the field of haptic feedback in mobile consumer devices, such as Apple's Taptic Engine or Nintendo's HD Rumble, which both also use a linear resonant actuator [16, 28]. The algorithm presented in this paper serves as a foundation for future developments and may well be adapted to new technologies.
Another result of the user study concerns the dependence of the vibration intensity on the installation location of the vibration motor. The vibration is perceived more strongly in direct proximity to the installation location than at a greater distance. For example, the vibration intensity decreased towards the top of the smartphone used in our user study, as the actuator is situated at the bottom of the phone. The area of interaction should therefore be limited in the user interface in order to achieve a consistent, realistic tactile presentation. Since the position of the actuator cannot be detected automatically, a device database may be developed in the future, which could also be used to compensate for the device dependency of the scaling factor.
Furthermore, the visual presentation may be enhanced by adding a 3D animation of the fabric deformation when force is applied to the surface (cf. Sect. 3.1). Measured information about the fabric stiffness may be used for this, as well as live data about the pressure the user applies with the fingertip to the touch screen, as provided by Apple's 3D Touch technology [7], for example.
Notes
- 1.
The web service may be accessed at https://agtele.eats.et.tu-dresden.de/multimod.
- 2.
The GSON library is available at https://github.com/google/gson.
References
Allerkamp, D., Böttcher, G., Wolter, F.E., Brady, A.C., Qu, J., Summers, I.R.: A vibrotactile approach to tactile rendering. Visual Comput. 23(2), 97–108 (2007). https://doi.org/10.1007/s00371-006-0031-5
Alma, U.A., Altinsoy, E.: Perceived roughness of band-limited noise, single, and multiple sinusoids compared to recorded vibration. In: 2019 IEEE World Haptics Conference (WHC), Tokyo, Japan, pp. 337–342. IEEE (2019). https://doi.org/10.1109/WHC.2019.8816163
Altınsoy, M.E.: Auditory-tactile interaction in virtual environments. Ph.D. thesis, Ruhr-Universität Bochum, Aachen (2005)
Altinsoy, M.E., Merchel, S.: Electrotactile feedback for handheld devices with touch screen and simulation of roughness. IEEE Trans. Haptics 5(1), 6–13 (2012). https://doi.org/10.1109/TOH.2011.56
Android Developers: DisplayMetrics (2019). https://developer.android.com/reference/android/util/DisplayMetrics
Android Developers: Vibrator (2019). https://developer.android.com/reference/android/os/Vibrator
Apple Inc.: 3D Touch - User Interaction - iOS - Human Interface Guidelines - Apple Developer (2019). https://developer.apple.com/design/human-interface-guidelines/ios/user-interaction/3d-touch/
ASTM D1388–18: Standard test method for stiffness of fabrics. Technical report, ASTM International, West Conshohocken, PA (2018). https://doi.org/10.1520/D1388-18
Brooks, T.L.: Telerobotic response requirements. In: 1990 IEEE International Conference on Systems, Man, and Cybernetics Conference Proceedings, pp. 113–120. IEEE (1990). https://doi.org/10.1109/ICSMC.1990.142071
Clement, J.: Global mobile retail commerce share 2021. https://www.statista.com/statistics/806336/mobile-retail-commerce-share-worldwide/
Dennis, S.: The Ticking Time Bomb of E-commerce Returns, February 2018. https://www.forbes.com/sites/stevendennis/2018/02/14/the-ticking-time-bomb-of-e-commerce-returns/
Douglas, L.J.: Decimation-in-Time (DIT) Radix-2 FFT. OpenStax CNX (2006)
Hatch, K.L.: Textile Science. West Publishing Company, Minneapolis (1993)
ISO International Standard Organization: Geometrical Product Specifications (GPS) - Surface Texture: Profile Method - Terms, Definitions and Surface Texture Parameters. No. ISO 4287 (1997)
ISO International Standard Organization: Textiles - Tests for Colour Fastness - Part A01: General Principles of Testing. No. ISO 105-A01 (2010)
Porter, J.: Meet the minds behind Nintendo Switch’s HD Rumble tech (2017). https://www.techradar.com/news/meet-the-minds-behind-nintendo-switchs-hd-rumble-tech
Kaaresoja, T., Hoggan, E., Anttila, E.: Playing with tactile feedback latency in touchscreen interaction: two approaches. In: Campos, P., Graham, N., Jorge, J., Nunes, N., Palanque, P., Winckler, M. (eds.) INTERACT 2011. LNCS, vol. 6947, pp. 554–571. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-23771-3_42
Kaczmarek, K., Webster, J., Bach-y-Rita, P., Tompkins, W.: Electrotactile and vibrotactile displays for sensory substitution systems. IEEE Trans. Biomed. Eng. 38(1), 1–16 (1991). https://doi.org/10.1109/10.68204
Kern, T.A.: Entwicklung Haptischer Geräte: Ein Einstieg für Ingenieure. Springer, Heidelberg (2009). https://doi.org/10.1007/978-3-540-87644-1
Krzywinski, S., Pietsch, K., Boll, J., Alma, A., Martin, C., Rataj, C.: Advances in the field of digitalization by multimodal inspection of textiles - digital haptics. In: Proceedings of the 19th World Textile Conference-AUTEX 2019 (2019)
MacKenzie, I.S., Ware, C.: Lag as a determinant of human performance in interactive systems. In: Proceedings of the INTERACT 1993 and CHI 1993 Conference on Human Factors in Computing Systems, CHI 1993, pp. 488–493. ACM, Amsterdam (1993). https://doi.org/10.1145/169059.169431
Magnenat-Thalmann, N., et al.: From physics-based simulation to the touching of textiles: the HAPTEX project. Int. J. Virtual Reality 6(3), 35–44 (2007)
Mäki-Patola, T., Hämäläinen, P.: Latency tolerance for gesture controlled continuous sound instrument without tactile feedback. In: Proceedings of International Computer Music Conference (ICMC), Miami, USA, p. 8 (2004)
Mooneghi, S., Saharkhiz, S., Varkiani, S.: Surface roughness evaluation of textile fabrics: a literature review. J. Eng. Fibers Fabr. 9(2), 1–18 (2014). https://doi.org/10.1177/155892501400900201
Sony Mobile: Xperia XZ3 – official website (2019). https://www.sonymobile.com/global-en/products/phones/xperia-xz3/
Srinivasan, M.A., Cheng, J.S.: Human performance in controlling normal forces of contact with rigid objects. In: ASME Winter Annual Meeting: Advances in Robotics, Mechatronics and Haptic Interfaces, New Orleans, vol. 49, pp. 119–125 (1993)
Steinbach, E., et al.: Haptic codecs for the tactile internet. Proc. IEEE 107(2), 447–470 (2019). https://doi.org/10.1109/JPROC.2018.2867835
Dixon, T.: Good Vibrations: How Apple Dominates The Touch Feedback Game (2019). https://de.ifixit.com/News/16768/apple-taptic-engine-haptic-feedback
Texas Instruments: Haptics: Solutions for ERM and LRA Actuators (2013). http://www.ti.com/lit/ml/sszb151/sszb151.pdf
Workman, J.E.: Fashion consumer groups, gender, and need for touch. Clothing Text. Res. J. 28(2), 126–139 (2010). https://doi.org/10.1177/0887302X09356323
Acknowledgments
The authors would like to thank Sebastian Artur Lehmann and Tung Le for their work on the Android application as well as Ugur Alican Alma, Kathrin Pietsch, and Cornelia Rataj for their contributions towards the virtual representation of textile fabrics.
The IGF proposal 19479 BG of the research association “Forschungskuratorium Textil e.V.” is funded via the AiF within the scope of the “Industrial Collective Research” (IGF) by the German Federal Ministry for Economic Affairs and Energy (BMWi) on the basis of a decision by the German Bundestag.
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
Martin, C., Braune, A. (2020). Multimodal Inspection of Product Surfaces Using Mobile Consumer Devices. In: Yamamoto, S., Mori, H. (eds) Human Interface and the Management of Information. Designing Information. HCII 2020. Lecture Notes in Computer Science(), vol 12184. Springer, Cham. https://doi.org/10.1007/978-3-030-50020-7_27
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-50019-1
Online ISBN: 978-3-030-50020-7