1 Introduction

With the emergence of the smart lifestyle, consumer demand for information appliances is growing. A characteristic of this trend is the diversification of products to suit personal taste. To remain competitive in this market, companies need to consider product design in terms of not only functionality but also the user's cognition and emotion.

In product design, the importance of user-centered design has long been emphasized in practice as well as in academia. In particular, the visual, auditory, and tactile senses have a critical impact on the added value of products because they play a crucial role in how people form emotions while communicating with the external environment. Therefore, understanding the sensory information transmitted via these three senses and the emotional data it induces is an essential element of user-centered product design.

However, collecting and applying such information demands considerable time and cost for many companies. Small and mid-sized businesses, which have short product development cycles and a high reliance on a single product, tend to be more susceptible to this problem. This study aims to make sensory emotion data more available in the form of an online database so that designers at small and mid-sized businesses can apply it to actual practice more easily.

We first summarize the product design process from the literature on new product development and user experience design. Second, we define the sensory emotion data based on theories of visual, auditory, and tactile perception and transform them into modules of a database system. Finally, we design the user interface of the sensory emotion data system based on interviews with product designers of information appliances.

2 Background

In the information science field, having recognized the importance of users' emotional factors, a group of researchers began to explore their effects (Beaudry et al. 2010). They built theoretical frameworks for the acceptance of information technology: technology acceptance models, UTAUT (Venkatesh et al. 2003), innovation diffusion theory, and the theory of planned behavior. In recent years, however, as the use environment of information technology has become complex and multidimensional, these theories have shown limitations in capturing the antecedent variables of user behavior. Therefore, this study starts with a comprehensive understanding of the design process that will involve emotional data, limiting the product scope to information appliances. Next, we review the literature on the concept and role of sensory information and on visual, auditory, and tactile sensory emotion data.

2.1 Design Process of Information Appliances

The product design process represents the system linking the activities involved in designing a product. The user-centered design process consists of five stages: an understanding stage that defines the problem, a research stage that examines the interaction between people and objects, a design stage that explores pain points or new values, a prototype stage in which an idea or a demo version is tested, and an evaluation stage that assesses the likelihood of use and performance results (Roozenburg and Eekels 1995). There is also the double-diamond design process used in UX design methodology, which expands and then defines the scope of thinking through diffusion and convergence in order to derive innovative ideas. This process consists of four steps: Discover, Define, Develop, and Deliver (Council 2005).

2.2 Visual, Auditory and Tactile Senses in Information Appliances

Visual perception information: the overall visual perception process proceeds as shown in Fig. 1. Visualization occurs through selection, structuring, simplification, and clustering as the eyes iteratively search and fixate, followed by cognitive judgment. Thinking and judgment arise to solve problems according to human sensory models.

Fig. 1. Visual perception process (Arnheim 1969)

Visual information in information appliances allows users to use products more efficiently. The goal of visual information design is to create user-friendly, natural, and pleasant interactions, and it also includes solving various problems that exist in the human-computer interface (Jacobson 2000). The visual representation of information is composed of elements such as shape, color, texture, layout, typography, and metaphor (Byung Keun and Sung Jung 2008). The visual information module of the system includes the space between letters, the stroke width ratio, and the color of text (Alexander 1986).

Auditory cognitive information: auditory information is processed through various stages in which physical stimulation from the sound source and its medium is transformed into nerve energy. Auditory information in information appliance design concerns the user interface related to hearing. The hearing interface in information appliances helps establish the identity of a product. Auditory information is the sound heard by the user and can be largely defined by three factors: loudness, pitch, and timbre. The auditory module of the system accordingly consists of volume, tone, and pitch.

Tactile cognitive information: unlike the visual and auditory senses described above, the tactile sense is spread throughout the entire body. Sensation is a process of primary information processing in which external stimulus information is converted into signals of the internal nervous system. In information appliance design, the tactile sense can form either a particular emotion from the surface material of the product or a tactile interface for interacting with the product. In this study, we constructed a tactile cognitive module that compares difference threshold data (Matsumoto et al. 2011) with absolute threshold data (Parsons and Griffin 1988; Yonekawa et al. 1999).
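To make the structure of these three cognitive modules concrete, the following Python sketch models them as simple records. The field names follow the variables named above (space between letters, stroke width ratio, text color; volume, tone, pitch; difference and absolute thresholds), while the units and value types are our own assumptions rather than part of the cited sources.

from dataclasses import dataclass

@dataclass
class VisualModule:
    """Visual cognitive variables named in the text (units assumed)."""
    letter_spacing_em: float      # space between letters
    stroke_width_ratio: float     # stroke width relative to character height
    text_color: str               # e.g. a hex code such as "#1A1A1A"

@dataclass
class AuditoryModule:
    """Auditory cognitive variables: volume, tone, and pitch."""
    volume_db: float              # loudness of the sound
    pitch_hz: float               # height of the sound
    tone: str                     # timbre label, e.g. "pure" or "complex"

@dataclass
class TactileModule:
    """Tactile thresholds compared in the tactile cognitive module."""
    absolute_threshold: float     # smallest detectable stimulus intensity
    difference_threshold: float   # just-noticeable difference between stimuli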

2.3 Emotional Data in Information Appliances Design

Emotional data are the result of primary perception and subjective interpretation. In the general process of Kansei engineering, (1) basic adjectives are selected from dictionaries, magazines, surveys, and observation; (2) secondary representative adjectives are selected according to experts' and observers' evaluation, frequency, and suitability; and (3) emotional factors are identified through psychological and physiological evaluation (Seung-Hwa and Myung-Suk 2001).
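As a rough illustration of steps (1) and (2), the Python sketch below filters candidate adjectives by how often they were collected and by an expert suitability score; the scoring scheme and thresholds are illustrative assumptions, not part of the cited method.

from collections import Counter

def select_representative_adjectives(candidates, suitability,
                                     min_count=3, min_score=0.6, top_k=10):
    """Pick representative emotional adjectives from raw candidates.

    candidates  -- adjectives gathered from dictionaries, surveys, observation
    suitability -- dict mapping adjective to an expert suitability score in [0, 1]
    """
    counts = Counter(candidates)
    scored = [
        (word, counts[word] * suitability.get(word, 0.0))
        for word in counts
        if counts[word] >= min_count and suitability.get(word, 0.0) >= min_score
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [word for word, _ in scored[:top_k]]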

Visual emotion data:

This is emotion information generated from a user's visual perception of a product's design elements (Mini et al. 1996). When the user chooses a product based on its design, color and shape are the main selection criteria; color, which has a particularly strong influence on emotion, was selected for the visual emotion module. The IRI emotion scale (IRI Color Research Institute 2011) and the Kobayashi color emotion scale (Kobayashi 1991), which are representative color emotion scales, were gathered.

Auditory emotion data:

This is the emotion generated from a product's auditory design elements. The number of emotions that can be handled is limited, and certain sensibilities should be emphasized according to the characteristics of the product. The auditory sensibility module was formed using Russell's emotional adjectives (Russell 2003).

Tactile emotion data:

These data describe the relationship between a user's emotion and a product's design variables during product usage (Chen et al. 2009). When the user interacts with the product, the material perceived through the tactile sense influences the user's emotional response (preference/non-preference) (Dépeault et al. 2009; Karlsson and Velasco 2007; Kawasegi et al. 2013; Park et al. 2013).
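As a minimal sketch of how one slice of the visual emotion module could be stored, the lookup below maps color labels to emotion adjectives. The entries are placeholders, not actual IRI or Kobayashi scale values; a real module would load the published scales instead.

# Placeholder color-to-emotion lookup; the entries are illustrative only.
COLOR_EMOTION_SCALE = {
    "warm red":   ["dynamic", "passionate"],
    "deep black": ["modern", "formal"],
}

def emotions_for_color(color):
    """Return the emotion adjectives associated with a color label, if any."""
    return COLOR_EMOTION_SCALE.get(color, [])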

3 Designer Interview

3.1 Goal

In this study, in-depth interviews were conducted with designers of information appliances in order to establish users' visual, auditory, and tactile emotion data. Through the interviews, we tried to confirm the need for such data and how the information is utilized. Table 1 summarizes the interviews with six field designers.

Table 1. Interview summaries

4 Result

Two evaluators coded the interview recordings independently and agreed on a single version after a few iterations. The analysis is as follows.

Design process of information appliances:

The information appliance product design process has basic steps such as Planning, Design, Prototype, and Evaluation, but it differs from the user-centered design process described in the literature review above. The user-centered research method focuses on direct observation of users, examining behaviors, patterns, experiences, and latent desires. In contrast, planning-stage research in the information appliance design process relies not on direct user observation but on competitors' models and given functions. This finding can be validated through the interview with Evernet Co.

From planning to release, there is almost no step that tries to satisfy the consumer's needs […] plan the process according to the merchandising schedule and […] also choose the design accordingly.

(Tovis Designer Interview)

If a certain product needs to be planned, we have a meeting with the development team regarding the price range, hardware, and software of a competitive product A, and go through a design phase that reflects the price and specifications of the most popular product in the market.

(Evernet Designer Interview)

From a process perspective, there was a company that executes a sequential process: it clarifies the purpose of a product, decides the functions that fit the purpose, sets the concept, and determines the material within the budget. On the other hand, there was also a nonlinear process in which design and function are decided by a scenario and then adjusted for manufacturability. It appears that the nonlinear process is preferred when the technology is not yet mature or when the new product has little user experience data.

I think the scenario is the most important thing. We first work on the scenario that determines the target, content, and place of the service, and then on the design. During the process, we discuss with the engineers the possibility of implementing certain functions.

(SOC Interview)

Utilizing sensory emotion data:

As designers perform specific sub-design tasks, there was a need for sensory emotion data that helps determine design variables. Due to limitations on cost and stability, designers preferred materials and colors commonly used in the industry and relied on existing voice databases. However, they were also willing to look into sensory emotion data as long as it matches the target product's function and design characteristics. In addition, the interviewees confirmed that data reflecting rapid changes in consumer trends are necessary.

The industry is conservative. We seldom change our practice.

(Neo Café Interview)

Babies have such sensitive hearing that we couldn't find a sound that only adults would wake up to. We are willing to provide auditory functions if the data support such information.

(Miro Interview)

The products are almost the same. The material is fixed.

[…] Previously customers liked red so much they sold a lot of red. Now it’s black. We have to reflect these things.

(Evernet Interview)

5 Sensory Emotion Data Support System

5.1 System Interface

The sensory emotion data support system can be largely divided into two parts: the input of design variables for the design of a particular product and the output of the sensory emotion data related to them. The main structure of the interface is based on these two components (Fig. 2). The input part of the system is the design information entered by designers at each stage of the design process, and the output is composed of the visual, auditory, and tactile data modules constructed from the literature review.

Fig. 2. Interface design input and output basic structure

Figure 3 shows the integrated process framework of the user cognitive/emotional information system. Based on the product design process, detailed design activities were defined for each stage (Planning, Design, Prototype, and Evaluation). The system provides the sensory data modules relevant to the sub-tasks in each product design stage.

Fig. 3. System framework
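A minimal Python sketch of this framework is given below: each design stage and sub-task is mapped to the sensory data modules the system would surface. The stage names follow the text, while the specific task-to-module pairs are illustrative assumptions rather than the system's actual mapping.

# Hypothetical mapping from (design stage, sub-task) to relevant data modules.
MODULES_BY_TASK = {
    ("Design", "display layout"):      ["visual cognition", "visual emotion"],
    ("Design", "alarm sound"):         ["auditory cognition", "auditory emotion"],
    ("Prototype", "surface material"): ["tactile cognition", "tactile emotion"],
}

def modules_for(stage, task):
    """Return the sensory emotion data modules relevant to a design sub-task."""
    return MODULES_BY_TASK.get((stage, task), [])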

5.2 Sensory Emotion Database

The function of the database is to provide users with the relevant sensory emotion data at each stage of design activity. Through the interviews with practitioners, we found that at each step of the design process designers need to determine a certain set of design variables, and that sensory emotion data can support this decision-making. We created a total of six modules from which emotion data are extracted.

Figure 4 shows an example of the sensory data module of visual information (letters, symbols, and figures) for the display design of information appliances. For a design variable such as readable font size, the database system shows sub-variables such as viewing distance, luminance, font type, and number of strokes.

Fig. 4. Visual module example
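The interaction illustrated in Fig. 4 could be sketched as the lookup below, in which querying a design variable returns its sub-variables. The keys mirror the example in the text; the values are left as placeholders rather than measured data.

# Sketch of the visual-module lookup; sub-variable values are placeholders.
VISUAL_MODULE = {
    "readable font size": {
        "viewing distance (m)": None,   # to be filled from threshold studies
        "luminance (cd/m2)": None,
        "font type": None,
        "number of strokes": None,
    },
}

def sub_variables(design_variable):
    """List the sub-variables recorded for a given design variable."""
    return list(VISUAL_MODULE.get(design_variable, {}))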

6 Conclusion

Although the industry recognizes the necessity of utilizing sensory emotion data in product development, it remains challenging to collect and present such information in a structured way. Therefore, we extracted the required information through interviews and literature review and tried to establish an interactive system with corresponding modules. In the process of implementing the system, the following conclusions could be drawn.

First, a platform-based sensory emotion data system built on an open-source concept is more suitable than a static data library. According to the interviews with the designers, even companies manufacturing the same type of product in the same industry sector have different design processes, with different sensory information requirements at each stage, depending on their product development cycles and portfolios. Therefore, an open platform with a flexible interface is preferable to a fixed database built around a single process, so that it can serve different scenarios for different designers. Moreover, emotion tends to change and evolve with social events and trends, and the user context shifts rapidly as numerous information systems affect our lifestyle. An open platform allows the sensory emotion data to be adapted more quickly.

Second, while the contribution of our study lies in collecting and providing sensory emotion data to the industry, a large gap remains in the emotion data itself. Human emotion is affected by the social and physical environment and is hard to quantify or systematize. A certain set of generalized emotion data exists, but it does not serve particular target consumers or products. Therefore, further development of emotion data is needed, particularly data that differentiate between contexts.

We hope that the system helps designers improve their products by providing relevant sensory emotion data. In future research, user evaluation should be conducted to verify usability, which will enhance the reliability of the system.