Make it personal: A social explanation system applied to group recommendations

https://doi.org/10.1016/j.eswa.2017.01.045

Highlights

  • We propose personalized social individual explanations for group recommenders.

  • We propose both a textual and a graphical social explanation approach.

  • We study the benefits of including explanations in group recommender systems.

  • We study the benefits of including social components to these explanations.

  • Results show a significant increase in users’ intent to follow our recommendations.

Abstract

Recommender systems help users to identify which items, from a variety of choices, best match their needs and preferences. In this context, explanations act as complementary information that can help users better comprehend the system’s output and can foster goals such as trust, confidence in decision-making or utility. In this paper we propose a Personalized Social Individual Explanation approach (PSIE). Unlike other expert systems, the PSIE proposal is novel in including both explanations about the system’s group recommendation and explanations about the group’s social reality, with the goal of inducing a positive reaction that leads to a better perception of the received group recommendations. Among other challenges, we uncover a special need to focus on “tactful” explanations when addressing users’ personal relationships within a group, and on personalized reassuring explanations that encourage users to accept the presented recommendations. Moreover, the resulting intelligent system significantly increases users’ intent (likelihood) to follow the recommendations, users’ satisfaction, and the system’s efficiency and trustworthiness.

Introduction

Recommender systems (Jameson, Smyth, 2007, Ricci, Rokach, Shapira, 2015) are expert systems that support human decision-making. They commonly use real or inferred preferences to suggest items that their users might like to consume. Depending on the number of users who will employ the product, we can speak of individual recommenders (Ricci et al., 2015) or group recommenders (Jameson & Smyth, 2007). In this paper we focus on the latter and, more specifically, on how to improve users’ acceptance of these systems’ outcome.

The literature, e.g. (Golbeck, 2006, Jamali, Ester, 2009, Massa, Avesani, 2007), has shown that using social network information in addition to feedback data (e.g. ratings) can significantly improve group recommendations’ accuracy. Moreover, there is agreement on the need to adapt group recommendation processes to group composition (Cantador, Castells, 2012, Ricci, Rokach, Shapira, 2015, Salamó, McCarthy, Smyth, 2012). Recent work has focused on modelling users’ social behaviour within a group to enhance the recommendation’s outcome (Mccarthy, Mcginty, Smyth, Salamó, 2006, Quijano-Sánchez, Recio-García, Díaz-Agudo, Jiménez-Díaz, 2013, Salehi-Abari, Boutilier, 2015). However, there is a lack of explanation methods for these social group recommendation results. Although some explanation components have been included in group recommender systems (Boratto, Carta, 2011, Jameson, 2004, McCarthy, Salamó, Coyle, McGinty, Smyth, Nixon, 2006), none of them has focused on using the social reality within a group for explanation generation.

Explanations and recommender systems have frequently been considered as part of the studies developed in the area of knowledge-based systems (Lopez-Suarez & Kamel, 1994), where both can be used to support decision-making processes. It was found that explanations can help increase users’ acceptance of the proposed recommendations, helping them make faster decisions, convincing them to buy the proposed items, or even developing trust in the system as a whole (Herlocker, Konstan, & Riedl, 2000). Moreover, it has been acknowledged that, for users, many recommender systems function as black boxes: they provide no transparency into how the recommendation process works and offer no further information to go along with the recommendations. This situation can lead to the user being startled by a given recommendation, producing the need for an explanation (Herlocker et al., 2000). For instance, explanations are able to provide transparency by presenting the reasoning and data behind a recommendation. There are some individual explanation approaches (Christakis, Fowler, 2009, Guy, Zwerdling, Carmel, Ronen, Uziel, Yogev, Ofek-Koifman, 2009) that provide the names of particular friends who liked the proposed item to induce a better acceptance of that specific item; especially if the chosen names refer to good friends, tapping into the idea that people we like are more likely to persuade us. We can consider this type of explanation to be social, as it induces a positive reaction by recalling social bonds.
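The friend-naming idea above can be sketched in a few lines. This is our own minimal illustration, not the implementation from the cited works: the data structures, names, threshold and the wording of the message are all hypothetical, and stronger ties are ranked first following the persuasion intuition described above.

```python
# Illustrative sketch (not the cited authors' implementation) of a simple
# social explanation that names friends who liked the recommended item,
# listing closest friends first. All names and thresholds are invented.

def social_explanation(item, friend_ratings, tie_strength, user, like_threshold=4):
    """Return a short textual explanation for `user` about `item`.

    friend_ratings: {friend: {item: rating}}  (ratings on an assumed 1-5 scale)
    tie_strength:   {friend: float in [0, 1]} (closeness of `user` to each friend)
    """
    # Friends who rated the item at or above the "liked it" threshold.
    likers = [f for f, ratings in friend_ratings.items()
              if f != user and ratings.get(item, 0) >= like_threshold]
    # Closest friends first: familiar names are assumed to be more persuasive.
    likers.sort(key=lambda f: tie_strength.get(f, 0.0), reverse=True)
    if not likers:
        return f"'{item}' matches your group's overall preferences."
    names = ", ".join(likers[:2])
    return f"Your friends {names} also liked '{item}'."

ratings = {"ana": {"Movie X": 5}, "bea": {"Movie X": 4}, "carl": {"Movie X": 2}}
ties = {"ana": 0.9, "bea": 0.4, "carl": 0.7}
print(social_explanation("Movie X", ratings, ties, user="dan"))
# → "Your friends ana, bea also liked 'Movie X'."
```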

To date, our main line of research has focused on improving current state-of-the-art research on group recommenders through the inclusion of social factors in the generation of recommendations that satisfy a group of users with potentially competing interests. To do so, we have reviewed different ways of combining people’s personal preferences and proposed an approach that takes into account the social reality within a group. Quijano-Sánchez et al. (2013)’s Social Recommendation Model (SRM) defines a set of recommendation methods that include the analysis and use of several social factors, such as the personality of group members, the tie strength between them, and users’ satisfaction with past recommendations.
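The kind of socially weighted combination such methods perform can be sketched as follows. The weighting formula below is our own simplification for illustration only, not the actual SRM equations; the idea it captures is simply that members’ ratings are adjusted by personality and tie strength before being combined.

```python
# Hypothetical sketch of a socially weighted aggregation: members' predicted
# ratings for one item are weighted by personality (assertiveness) and tie
# strength before averaging. Not the actual SRM formula; all values invented.

def social_score(item_ratings, personality, tie_strength):
    """Combine members' ratings for one item into a single group score.

    item_ratings: {member: predicted rating}
    personality:  {member: assertiveness in [0, 1]} (assertive users weigh more)
    tie_strength: {member: average closeness to the rest of the group, in [0, 1]}
    """
    total, weight_sum = 0.0, 0.0
    for member, rating in item_ratings.items():
        # Hypothetical weight: assertive, well-connected members count more.
        w = 1.0 + personality[member] + tie_strength[member]
        total += w * rating
        weight_sum += w
    return total / weight_sum

ratings = {"u1": 5.0, "u2": 2.0}
personality = {"u1": 0.9, "u2": 0.1}   # u1 is the more assertive member
ties = {"u1": 0.5, "u2": 0.5}
print(social_score(ratings, personality, ties))   # → 3.8, pulled toward u1
```

Under a plain average the score would be 3.5; the assertive member’s weight pulls it to 3.8, which is the qualitative effect personality modelling is meant to have.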

Departing from this starting point, this research takes a step forward by translating the previously mentioned social explanations (Christakis, Fowler, 2009, Guy, Zwerdling, Carmel, Ronen, Uziel, Yogev, Ofek-Koifman, 2009) to group recommender systems. This is done by including not only friend-related information, as the previously mentioned works propose, but all the social information that the adopted system (SRM) is able to retrieve, that is: personal ratings, users’ personality, tie strength between users, and previous satisfaction. Hence, this paper’s goal is to provide each group member with a Personalized Social Individual Explanation (PSIE) about the system’s proposed group recommendation and, by doing so, to induce positive reactions that lead to a better perception of the received group recommendation and of the system in general.

Thus, this work aims to improve the performance of Quijano-Sánchez et al. (2013)’s system through the inclusion of PSIE. To do so, two different approaches are proposed: Textual Social Explanations (TSE) (Section 4.1) and Graphical Social Explanations (GSE) (Section 4.2). We then study the effects of including simple non-social explanations in group recommender systems, of adding a single social component to these group explanations, and of incorporating all of SRM’s social information into these explanations, that is, the complete PSIE approach. To address these questions, two experiments have been designed: one for the textual approach (Section 5.2) and one for the graphical approach (Section 5.3). By performing these experiments we also evaluate which of the two presented approaches is preferable (Section 5.3).

Consequently, this research has the following main contributions:

  1. Study of the benefits of including explanations in group recommender systems.

  2. Study of the benefits of including a social component to explanations in group recommender systems.

  3. Proposal of a Personalized Social Individual Explanation approach (PSIE):

     (a) Through a Textual Social Explanation approach (TSE).

     (b) Through a Graphical Social Explanation approach (GSE).

The remainder of this paper is structured as follows: In the next section we introduce some of the state-of-the-art research regarding explanations. In Section 3 we present the main theoretical concepts needed to develop this work. Section 4 presents our PSIE proposal. In Section 5 we present experiments and results. Finally, Section 6 concludes the paper.

Related work

Jannach, Zanker, Felfernig, and Friedrich (2011) affirm that explanations in recommender systems can be basically understood as some sort of communication between a selling agent (i.e. the recommender system) and a buying agent (i.e. the user). Research in explanations started with the premise that users would more likely trust a recommendation when they know the rationale behind it (Herlocker, Konstan, Riedl, 2000, Lamche, Adigüzel, Wörndl, 2014, Symeonidis, Nanopoulos, Manolopoulos, 2009,

Motivation

Group recommenders often operate in leisure domains, where it is common for people to consume items in groups. Choosing a date movie, a family holiday destination, or a restaurant for a celebration meal requires balancing the preferences of multiple consumers. These systems commonly aggregate real or predicted ratings for group members (Baltrunas, Makcinskas, Ricci, 2010, Berkovsky, Freyne, 2010, Jameson, Smyth, 2007, Pessemier, Dooms, Martens, 2013). The aggregation functions
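Two of the aggregation strategies most commonly discussed in the group-recommendation literature, averaging and least misery, can be sketched as follows. The ratings below are invented for illustration; the point is how the two functions can disagree on the best item for the group.

```python
# Minimal sketch of two common rating-aggregation strategies for groups:
# average satisfaction vs. least misery. All ratings below are invented.

def average_agg(group_ratings):
    """Score each item by the mean of its members' ratings."""
    return {item: sum(r.values()) / len(r) for item, r in group_ratings.items()}

def least_misery_agg(group_ratings):
    """Score each item by its most dissatisfied member's rating."""
    return {item: min(r.values()) for item, r in group_ratings.items()}

# item -> {member: real or predicted rating on a 1-5 scale}
group_ratings = {
    "Movie A": {"u1": 5, "u2": 5, "u3": 1},   # loved by two, hated by one
    "Movie B": {"u1": 4, "u2": 3, "u3": 3},   # fine for everyone
}
avg = average_agg(group_ratings)
lm = least_misery_agg(group_ratings)
print(max(avg, key=avg.get))   # → Movie A (highest mean, ignores u3's misery)
print(max(lm, key=lm.get))     # → Movie B (nobody is left strongly unhappy)
```

The disagreement between the two winners is exactly why aggregation-function choice, and hence the need to explain the chosen compromise to each member, matters.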

Personalized social individual explanations

We understand explanation as in the Oxford Dictionary definition: “give a reason or justification for”. In our case, explanations aim to justify the recommendation of an item for the group’s welfare. We appeal to users’ sense of justice and social bonds to help them comprehend why the recommender has presented a specific item as the best option for the group. Our hypotheses aim to demonstrate that, by helping users understand the system’s recommendation through a set of explanations, the

Experimental evaluation

To validate this paper’s posed hypotheses (Section 3), we have performed two randomized experiments (Diez, Barr, & Cetinkaya-Rundel, 2013) that, in general, allow us to study causal connections between providing explanations and improved user reactions and, in particular, allow us to address each of the stated research questions. The first experiment (Section 5.2) uses the TSE approach and the second one (Section 5.3) uses both the TSE and the GSE approaches. The overall experiment procedure
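The kind of between-group comparison such a randomized experiment supports can be illustrated with a standard two-sample statistic. The scores below are invented, and this is not the paper’s actual analysis; it only shows how intent-to-follow ratings from a control condition (no explanations) and a treatment condition (with explanations) might be compared.

```python
# Illustrative only: comparing invented Likert intent-to-follow scores between
# a control group (no explanations) and a treatment group (with explanations)
# using Welch's two-sample t statistic (unequal variances).
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic: (mean(a) - mean(b)) / sqrt(s_a^2/n_a + s_b^2/n_b)."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

control = [3, 2, 4, 3, 3, 2, 4]   # intent scores without explanations (invented)
treated = [4, 5, 4, 5, 3, 5, 4]   # intent scores with explanations (invented)
t = welch_t(treated, control)
print(round(t, 2))   # a large positive t suggests explanations raised intent
```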

Conclusions

In this paper we have introduced a Personalized Social Individual Explanation (PSIE) approach in which we present users of group recommenders with an explanation of why the system assumes that the recommended item is the best option for the group. The presented research illustrates a novel way of reflecting group dynamics, as opposed to other works (Jameson, 2004, McCarthy, Salamó, Coyle, McGinty, Smyth, Nixon, 2006) that limit the explanation of a group recommendation to the trivial process of justifying

References (55)

  • L. Baltrunas et al. Group recommendations with rank aggregation and collaborative filtering. International Conference on Recommender Systems, RecSys ’10 (2010)
  • S. Berkovsky et al. Group-based recipe recommendations: Analysis of data aggregation strategies. International Conference on Recommender Systems, RecSys ’10 (2010)
  • M. Bilgic et al. Explaining recommendations: Satisfaction vs. promotion. Beyond Personalization, the Workshop on the Next Stage of Recommender Systems Research, IUI ’05 (2005)
  • L. Boratto et al. State-of-the-art in group recommendation and new approaches for automatic identification of groups. Information Retrieval and Mining in Distributed Environments (2011)
  • I. Cantador et al. Group recommender systems: New perspectives in the social web. Recommender Systems for the Social Web (2012)
  • N.A. Christakis et al. Connected: The surprising power of our social networks and how they shape our lives (2009)
  • R.B. Cialdini et al. Social influence: Social norms, conformity and compliance. The Handbook of Social Psychology, 2-volume set (1998)
  • D. Cosley et al. Is seeing believing?: How recommender system interfaces affect users’ opinions. Conference on Human Factors in Computing Systems, CHI ’03 (2003)
  • D. Diez et al. OpenIntro Statistics: Second edition (2013)
  • B.J. Fogg et al. What makes web sites credible?: A report on a large quantitative study. Conference on Human Factors in Computing Systems, CHI ’01 (2001)
  • B. Forcher et al. Intuitive justifications of medical semantic search results. Engineering Applications of Artificial Intelligence (2014)
  • I. Garcia et al. On the design of individual and group recommender systems for tourism. Expert Systems with Applications (2011)
  • G. Groh et al. Social recommender systems. Recommender Systems for the Social Web (2012)
  • I. Guy et al. Personalized recommendation of social software items based on social relations. International Conference on Recommender Systems, RecSys ’09 (2009)
  • J.L. Herlocker et al. Explaining collaborative filtering recommendations. ACM Conference on Computer Supported Cooperative Work, CSCW ’00 (2000)
  • M. Hingston et al. User friendly recommender systems (2006)
  • M. Jamali et al. Using a trust network to improve top-n recommendation. International Conference on Recommender Systems, RecSys ’09 (2009)