
1 Relational Artifacts and Users’ Notions of Them

In recent years, technical systems called ‘relational artifacts’ [1] have gained attention in the field of human-computer interaction. They provide individualized assistive, monitoring, or companionship services [2] and are known under terms like ‘artificial companions’ [3], ‘sociable robots’ [4], ‘relational agents’ [5], or ‘Companion-systems’ [6]. They share the vision that “the computer is not a tool but a companion” (the so-called ‘companion metaphor’ [1, p. 150]), which aims at supporting the user and maintaining an emotional, long-term social relationship with him [7]. On the technical side, it is therefore inevitable to implement features that enable systems to provide the required functionality. However, a technical system will only become a ‘companion’ if the individual user himself experiences it as such, including qualities such as empathy and trustworthiness (e.g., [8]).

Usually, users’ notions of technical systems are referred to as ‘mental models’ [9]. These are internal representations of a system focusing on its structure and functionality (e.g., [10]). They entail individual notions about the functioning of a system and its requirements, including the relevant components of the system, their interrelations, and the conditions of their interaction.

Relational aspects aimed at by designers of relational artifacts are not covered by mental models. Thus, a supplementary concept may be beneficial. In [11], the concept of ‘anthropomorphization’ of technical systems is contrasted with mental models. The literature provides many examples of how users think of technical systems as entities with human-like attributes, mental states, and behaviors (e.g., [12, 13]). Of course, such ideas may entail information about users’ relationship-related notions of a system. This is the case, e.g., when Tamagotchi owners ascribe the human feeling of longing to their devices after not having spent time with them for a longer period [3]. However, to the best of our knowledge, a concept that is specifically geared to artifacts designed to be experienced as ‘companions’ by their users, and hence focuses especially on the relational aspects of the interaction, is still missing.

In this paper, we propose a concept termed ‘users’ relational ascriptions’ that is intended to fill this research gap. On the basis of our empirical work presented in [8, 14], we explain how these ascriptions are formed in users, define them, and outline their relevance for user-companion interaction (UCI).

2 Insights from a Qualitative User Study

Beyond theoretical considerations of the relational aspects of users’ experiences in interactions with relational artifacts, their relevance is also supported empirically.

In a user study we conducted [8, 14], participants underwent a Wizard of Oz experiment in which they interacted with a speech-based dialogue system. Among other interaction foci, the system asked for personal information for the purpose of individualization. The system was meant to represent a preliminary step towards future Companion-systems [15]. After the experiment, participants took part in a semi-structured interview focusing on their subjective experiences during the interaction.

Our basic assumption in this study was that anthropomorphization of the simulated system is likely to occur in users’ reports. This assumption is based in particular on the theory of the ‘intentional stance’ [16]. According to this theory, users explain and predict the behavior of a technical system by ascribing mental states to it in order to interact with it effectively, because the system’s construction and functioning are far too complex to serve as the basis for such explanations and predictions.

The interview material contained users’ ideas about the system as well as users’ emotions and reflections upon themselves that occurred during the experiment. Its analysis was guided by the following two research questions: (1) How do users experience, i.e., what do they ascribe to, the simulated Companion-system (system-related experiences)? (2) How do users experience themselves in reaction to their individual experiences of, i.e., their ascriptions to, the system (self-related experiences)?

As described in [14], the analysis of 31 interviews revealed that relational issues are important for users. They tended to think in interpersonal relationship categories and ascribed human-like characteristics and behaviors to the system, e.g., support, honest interest in the user, or nosiness. Findings regarding the system-related experiences are summarized in Table 1, which lists the categories worked out in the analysis; these entail users’ ascriptions towards the system.

Table 1. Categories illustrating users’ ascriptions in individualization-focused UCI

We discussed our findings on the system-related as well as the self-related experiences, first, with regard to users’ attempt to regain safety by ascribing familiar human-like mental states to the system and thus turning it into a predictable counterpart. This is in line with [16] and was explained on the basis of the inherent human need for safety [17]. Second, we discussed our findings regarding users’ efforts to turn the system into a potential relational partner they can get into contact with. In the literature, this phenomenon is connected to the inherent human need to belong (e.g., [18]). In line with this latter motivation, a lot of private and intimate information was disclosed to the system, even when users’ ascriptions to it were negative in quality (e.g., pursuit of the system’s own hidden goals, ability to abuse confidence).

3 The Concept of ‘Users’ Relational Ascriptions’ in UCI

Aspects of relationship and attachment are highly relevant for UCI. They are not only targeted by designers who aim to establish systems as ‘relational partners’ for potential users, but also arise in users themselves during UCI [14]. Hence, we decided to expand our findings by describing a concept we call ‘users’ relational ascriptions’ for the field of UCI.

In the following, we describe why relational ascriptions are formed in users. Afterwards, we work out the characteristics of relational ascriptions based on our interview data and summarize them in a definition of users’ relational ascriptions.

3.1 Formation of Relational Ascriptions in Users

Figure 1 illustrates why relational ascriptions arise in users, from which situation they originate, and which goal is pursued by making use of them. The relationships explained here are derived from our user study [14].

Fig. 1. Formation of users’ relational ascriptions

The interaction situation is experienced as uncertain regarding both the system itself, which represents the interaction counterpart (With whom or what am I interacting here? What can I expect from it? What does it want from me?), and the interaction process (How should I behave in reaction to my counterpart? How will the interaction proceed?). In such an uncertain situation, the necessity to adopt relational ascriptions arises in the user, accompanied by the wish to do so, on the basis of the following two needs inherent in humans.

The need for safety marks ‘the necessity for relational ascriptions’ evolving in the user. Companion-systems are intelligent technical devices providing both an emotional and a relational dimension of interaction. Thereby, machine-like and human-like aspects can come into conflict, and the system may be experienced as an unsettling hybrid [14]. In the sense of the human need for safety [17], the user tries to reduce uncertainty by ascribing to the system human-like characteristics well known from human-human interaction. Hence, he is able to regain safety by turning the counterpart into something predictable and explainable, including ideas on how to interact with it effectively and successfully.

The need to belong marks ‘the wish for relational ascriptions’ that is inherent in the user and fundamentally motivates him in addition to the need for safety. It is defined as humans’ strong desire to establish and maintain relationships [19]. Based on this need, the user himself is motivated to turn the system into a social, human-like counterpart. On this basis, he is able to see a potential relational partner in the system.

3.2 Description of Relational Ascriptions

The interview material gained in our user study revealed a variety of users’ ascriptions towards the simulated system [14]. It became apparent that these ascriptions implied users’ ideas regarding the relationship between the system and themselves. Hence, we decided to call them ‘users’ relational ascriptions’ to emphasize the importance of the relationship and to contrast these ascriptions with users’ internal representations of a system’s structure and functioning as described by mental models. We worked out the characteristics of relational ascriptions as well as, on the one hand, the factors influencing their content and quality and, on the other hand, the factors influenced by their content and quality. All these aspects are presented below with reference to users’ utterances from our interview material.

3.2.1 Defining Characteristics of Relational Ascriptions

It became apparent that even if the ascriptions worked out in the interview material appeared to be similar across groups of users, each user developed individual ones. In our study, e.g., one user ascribed a similarity to human beings to the simulated system (“you do not necessarily expect that it has let’s say human-like habits”, BH), whereas another user clearly experienced the system as a technical entity (“so it was a computer I was sitting opposite to”, FW).

For example, [20] report about robots and interface agents that “the perception of a robot/agent and its assigned role can be very different from the perception and role intended by the developer of the artificial entity” [20, p. 20]. Besides differences between developer and user, ascriptions also vary from one user to another. This implies that there is not ‘one bundle of relational ascriptions’ that every user shares; in fact, each user ‘creates’ his individual bundle. Thus, relational ascriptions arise from the user’s subjectivity rather than representing objective appraisals. Hence, we define relational ascriptions as the user’s subjective interpretations, which concern the appearance, the implemented characteristics, and the resulting behaviors of a Companion-system.

All these individual interpretations have in common that they are significant for the relationship between system and user. This marks the substantial difference between relational ascriptions and mental models, which refer to ideas about the function and structure of the system.

According to our empirical analysis [14], the reference to the relationship is entailed in ascriptions regarding the system’s nature, its performance, the requirements by the system, and its relational offer. Whether the system is experienced as a more human-like or a more machine-like counterpart (‘nature’) with more or less advanced capabilities (‘performance’) influences the user’s expectations of the system in the interaction and in the relationship with it (“it mostly understood what I wanted (…) it reasoned (…) I thought (…) cool (…) how advanced this technology is already”, CT). The way requests by the system are interpreted (‘requirements by the system’) determines which expectations regarding the user’s behavior the user ascribes to the system (“then I thought, oh my god, what does he just want to hear now”, UK). Finally, the relational offer of the system includes notions about how the system positions itself towards the user, how it gets into and stays in contact, and which roles the user thinks the system assigns to him (“what is the point of that now (…) I just felt a bit provoked”, SP).

The content of relational ascriptions consisted mostly of anthropomorphic characteristics the user ascribes to the system [14]. For example, one user ascribed interest to the system by saying “you just felt like someone is really interested in you (…) it was just another kind of experience” (FK).

Functional and structural ascriptions are not excluded in UCI contexts, but they become secondary. The priority of anthropomorphic ascriptions was explained by referring to two needs inherent in every user: the need to belong and the need for safety (cf. Sect. 3.1). Numerous examples in the literature underline the existence of anthropomorphization by users even when systems provide only few social cues [18] (e.g., [21]).

Furthermore, relational ascriptions often appeared to be formed on an implicit level of awareness. For example, the following utterance allows interpreting the user’s implicit ascription of nosiness to the system: “one feels like being picked ones brains a little bit” (EG). This seems to be related to the anthropomorphic content of ascriptions. Users tend to adopt an ‘as-if mode’ (see also [12]) in reasoning about and interacting with Companion-systems, as if the system were a human-like counterpart, without really reflecting upon this attitude.

According to [22], subjective denotations regarding systems can occur on a conscious (explicit) level, but also on a preconscious or unconscious (implicit) level. Explicit knowledge about a system, and thereby conscious ascriptions, therefore has to be differentiated from the largely implicit character of ascriptions.

Moreover, we observed that relational ascriptions have a dynamic character (“when it asked me I thought (…) that it really cobbles something individual-specific and when I see this in retrospective, I don’t know why it needed this”, CT). During an interaction, users can confirm or falsify previous relational ascriptions, but they can also create new ones. Therefore, the interaction history is the most relevant influencing factor: experiences within the interaction and reflections upon them help to verify, falsify, or change relational ascriptions made before.

3.2.2 Factors Influencing Content and Quality of Relational Ascriptions

By ‘quality’ we mean whether a relational ascription is positive or negative. For instance, ascribing to the system a personal interest in the user and a willingness to help him in a certain situation represents a positive ascription. In contrast, ascribing nosiness in the sense of the system pursuing its own interests represents a negative one.

Our interview analysis revealed that there are factors influencing the quality of relational ascriptions. For instance, regarding the simulated individualization-focused interaction in our study, one user said: “for example the question for the shoe size and also for the age and one should give the full name (…) I think I didn’t answer twice or I said I won’t tell that (…) I’ve been suspicious” (SP). This indicates that the user’s internal state influences the ascription evolving in the user.

Besides the user’s internal state, the context of an interaction also seems to influence the quality of relational ascriptions. To clarify this connection, imagine the following example: a computer crashes during the search for a nearby restaurant. If this happens during the summer holidays, while one is relaxing on the veranda of a hotel room, one would probably not be upset or ascribe malice or intentional provocation to the computer. In contrast, such ascriptions would probably occur while sitting in the office, stressed from preparing relevant documents before the arrival of an important business associate who could barely fit the appointment into his tight schedule.

Furthermore, we found that the quality of relational ascriptions is influenced by the user’s former experience with human-machine as well as human-human interaction. Previous contacts with humans and machines lead to preconceptions, expectations, and assumptions that users adopt when, or even before, entering the interaction with the Companion-system for the first time (“it is better than most of the computer voices I heard (…) it filters out more, it knows more, it is more human-like (…) it is not only such a yes-no-principle (…) it rather speaks to you, that was really thrilling”, CT).

Besides experiences with other technical systems (manner of use, positive and negative experiences with them, etc.), interpersonal experiences in particular are assumed to guide the quality of relational ascriptions (“like a child who is taken at the hand without being informed, well, it shall accompany the parents but it isn’t told why”, SP). The system’s behavior is interpreted in terms of the user’s “relational schemas” [23]: “cognitive structures representing regularities in patterns of interpersonal relatedness [that consist of] an interpersonal script for the interaction pattern, a self-schema for how self is experienced in that interpersonal situation, and a schema for the other person in the interaction” [23, p. 461]. Because these schemas go back to the individual’s primary interpersonal experiences with significant others, they differ from one person to another. Based on the “schema for the other person”, the resulting relational ascriptions to the system are made up of individual anthropomorphic attributes.

Besides the schema developed for the system, relational schemas also include a self-schema of the interacting person. It seems that the user reflects upon his self-related experiences during the interaction on the basis of this self-schema. These reflections feed back into the quality of the relational ascriptions to the system that are created. For example, one female user reported about her feeling during the individualization-focused interaction sequence, “I really felt such a refusal” (SP), in which a negative ascription arising in her, e.g., a pressure to surrender, may be read.

3.2.3 Factors Being Influenced by Content and Quality of Relational Ascriptions

Users’ utterances like “I tried all the time to speak slow and accented, because I thought it wouldn’t recognize my speech otherwise” (SB) suggest that the quality of relational ascriptions itself influences the user’s behavioral choices during the interaction.

In [14], it was shown that in an individualization-focused interaction with a Companion-system, the user’s information-disclosing behavior is connected to the quality of his ascriptions to the system. This relationship is not always a linear one. Paradoxical effects were recognized as well and were discussed in terms of the user’s need to belong. For example, some users disclosed even personal and intimate data although negative ascriptions towards the system had appeared. This paradox is illustrated by the following user utterance, in which unpleasant persistence is ascribed to the system: “anyhow it is only a computer and you didn’t know what it will do with your information and then you just said anything for making it shut up” (FW).

Besides influencing the user’s behavior, the quality of relational ascriptions also seems to influence, if not even constitute, the relationship between user and system (“with the human who speaks, with the voice, you built up a bond really at the beginning when you say ‘hello, my name is’”, FK). It determines whether or not a relationship is built up, what kind of relationship it is, and whether it is maintained over a longer period of time.

3.3 Summarizing Definition of Relational Ascriptions

To sum up our explanation of users’ relational ascriptions, a definition of the concept is given below; a brief illustrative sketch follows the definition.

Relational ascriptions are

  • Mainly unconscious, individual interpretations by the user with regard to the appearance, the implemented characteristics, and the resulting behaviors of a Companion-system that are significant for the relationship between system and user.

  • They entail interpretations regarding the system’s nature, its performance, the requirements by the system, and the system’s relational offer.

  • They represent mostly notions of anthropomorphic content, which develop in the user before, during and after the interaction with the system.

  • They are dynamic; hence, they can be verified, falsified or changed through interactional experiences.

Their quality is influenced by

  • The context of the interaction,

  • Users’ former relational experiences from human-machine and human-human interaction,

  • As well as users’ self-related experiences during UCI.

The quality of relational ascriptions influences

  • Users’ behavioral choices during the interaction.

  • As well as whether a relationship between user and system will be established and maintained and what it will look like.
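To illustrate how this definition could be operationalized in a system-side user model, the following minimal Python sketch represents a single relational ascription as a record carrying its content, the aspect it refers to, its quality, and its (mostly implicit) status, together with a small history object capturing the dynamic character described above. All names and fields (RelationalAscription, AscriptionHistory, the aspect labels) are hypothetical illustrations under the assumptions stated in the comments, not part of an existing implementation.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Quality(Enum):
    """Quality of a relational ascription as defined above (hypothetical label set)."""
    POSITIVE = "positive"
    NEGATIVE = "negative"


@dataclass
class RelationalAscription:
    """One user ascription, e.g. 'honest interest in the user' or 'nosiness'."""
    content: str                 # mostly anthropomorphic, e.g. "nosiness"
    aspect: str                  # 'nature', 'performance', 'requirements by the system', 'relational offer'
    quality: Quality
    implicit: bool = True        # ascriptions are mainly formed on an implicit level
    evidence: List[str] = field(default_factory=list)  # supporting interview utterances


@dataclass
class AscriptionHistory:
    """Dynamic character: ascriptions can be verified, falsified, or changed over the interaction."""
    ascriptions: List[RelationalAscription] = field(default_factory=list)

    def update(self, new: RelationalAscription) -> None:
        # A new interactional experience replaces an earlier ascription on the
        # same aspect (falsification/change) or is simply added as a new one.
        self.ascriptions = [a for a in self.ascriptions if a.aspect != new.aspect]
        self.ascriptions.append(new)
```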

4 Practical Implications and Future Research

Based on our empirical investigations, we developed a concept dealing with relational aspects of users’ individual notions regarding Companion-systems. For the design and evaluation of relational artifacts, examining users’ individual experiences while interacting with them is indispensable. Besides researching individual notions about a system’s structure and functioning as represented in mental models, we propose to additionally consider what we call ‘users’ relational ascriptions’.

We suggest understanding relational ascriptions as an ‘interpretation foil’ for users’ experiences of interactions with Companion-systems. By making ascriptions to the system, the user creates his individual view of it, which he experiences as ‘real’ and ‘objective’. This perspective may supplement work on user experience that focuses on investigating relationships between user experience as a summarized overall evaluation and distinct psychological variables [24, 25].

Of course, further investigations are needed to confirm our concept and our findings. In order to examine users’ relational ascriptions, we benefitted from using an open, narration-generating user interview as the data collection method as well as structuring, interpretation-focused qualitative methods for analyzing the interview material. Qualitative methods are based on the assumption that the meaning and purpose of experiences and actions can only be inferred from the subjective meanings the experiencing or acting person ascribes to them [26]. The subjectivity and implicitness of users’ relational ascriptions are strong arguments for adopting an idiographic research approach.

Future research is required in many respects. For instance, changes in users’ relational ascriptions during long-term interactions with Companion-systems should be investigated. This is highly important considering the design goal of relational artifacts to provide long-term companionship to their users.

Moreover, research should face the challenge of making relational ascriptions applicable to the design of Companion-systems. If it is possible to combine these ascriptions with other individual user characteristics, individual user profiles could be built up. It would then be conceivable to derive profile-specific dialog strategies to be implemented in the system, as sketched below. These strategies could be used to foster positive relational ascriptions and reduce negative ones. Thereby, user-companion interactions could be optimized towards comfortable long-term interaction patterns suitable for each individual user.
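As a rough illustration of what such profile-specific dialog strategies could look like, the following Python sketch maps a hypothetical user profile built from relational ascriptions to a dialog strategy. The profile fields, strategy names, and selection rule are assumptions made for illustration only and are not derived from our study.

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    """Hypothetical profile combining relational ascriptions with other user data."""
    user_id: str
    negative_ascriptions: int   # e.g. nosiness, pursuit of hidden goals
    positive_ascriptions: int   # e.g. support, honest interest in the user


def choose_dialog_strategy(profile: UserProfile) -> str:
    """Pick a profile-specific strategy intended to foster positive and
    reduce negative relational ascriptions (illustrative rules only)."""
    if profile.negative_ascriptions > profile.positive_ascriptions:
        # Users with predominantly negative ascriptions (e.g. suspicion of
        # nosiness) receive transparency-oriented prompts that explain why
        # personal information is requested before asking for it.
        return "explain_purpose_before_request"
    # Otherwise keep the individualization-focused dialog as it is.
    return "standard_individualization_dialog"


if __name__ == "__main__":
    profile = UserProfile(user_id="example", negative_ascriptions=3, positive_ascriptions=1)
    print(choose_dialog_strategy(profile))  # -> explain_purpose_before_request
```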