Make it personal: A social explanation system applied to group recommendations
Introduction
Recommender systems (Jameson & Smyth, 2007; Ricci, Rokach, & Shapira, 2015) are expert systems that support human decision-making. They commonly use real or inferred preferences to suggest to their users items that they might like to consume. Depending on the number of users that will employ the product, we can speak of individual recommenders (Ricci et al., 2015) or group recommenders (Jameson & Smyth, 2007). In this paper we focus on the latter and, more specifically, on how to improve users’ acceptance of these systems’ outcome.
In the literature (e.g., Golbeck, 2006; Jamali & Ester, 2009; Massa & Avesani, 2007) it has been shown that using social network information in addition to feedback data (e.g., ratings) can significantly improve group recommendations’ accuracy. Moreover, there is agreement on the need to adapt group recommendation processes to group composition (Cantador & Castells, 2012; Ricci et al., 2015; Salamó, McCarthy, & Smyth, 2012). Recent work has focused on modelling users’ social behaviour within a group to enhance the recommendation’s outcome (McCarthy, McGinty, Smyth, & Salamó, 2006; Quijano-Sánchez, Recio-García, Díaz-Agudo, & Jiménez-Díaz, 2013; Salehi-Abari & Boutilier, 2015). However, there is a lack of explanation methods for these social group recommendation results. Although some explanation components have been included in group recommender systems (Boratto & Carta, 2011; Jameson, 2004; McCarthy, Salamó, Coyle, McGinty, Smyth, & Nixon, 2006), none of them has focused on using the social reality within a group for explanation generation.
Explanations and recommender systems have frequently been considered as part of the studies developed in the area of knowledge-based systems (Lopez-Suarez & Kamel, 1994), where both can be used to support decision-making processes. It has been found that explanations can increase users’ acceptance of the proposed recommendations, helping them make faster decisions, convincing them to buy the proposed items, or even helping them develop trust in the system as a whole (Herlocker, Konstan, & Riedl, 2000). Moreover, it has been acknowledged that, for users, many recommender systems function as black boxes: they provide no transparency into how the recommendation process works and offer no further information to go along with the recommendations. This situation can lead to the user being startled by a given recommendation, producing the need for an explanation (Herlocker et al., 2000). Explanations are able to provide such transparency by presenting the reasoning and data behind a recommendation. Some individual explanation approaches (Christakis & Fowler, 2009; Guy, Zwerdling, Carmel, Ronen, Uziel, Yogev, & Ofek-Koifman, 2009) provide the names of particular friends who liked the proposed item to induce a better acceptance of that specific item, especially if the chosen names refer to good friends, tapping into the idea that people we like are more likely to persuade us. We can consider this type of explanation to be social, as it induces a positive reaction by recalling social bonds.
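As an illustration of this kind of friend-based social explanation, the following sketch selects the closest friends who rated the recommended item highly and mentions them by name. The data model, function name, and thresholds are our own illustrative assumptions, not taken from the cited systems:

```python
# Illustrative sketch of a friend-based social explanation:
# mention the closest friends who liked the recommended item.
# The rating threshold (>= 4) and data layout are hypothetical.

def social_explanation(user, item, ratings, tie_strength, top_n=2):
    """ratings: (friend, item) -> rating in [1, 5];
    tie_strength: (user, friend) -> strength in [0, 1]."""
    fans = [f for (f, i), r in ratings.items() if i == item and r >= 4]
    # Prefer strong ties: people we like are more likely to persuade us.
    fans.sort(key=lambda f: tie_strength.get((user, f), 0.0), reverse=True)
    chosen = fans[:top_n]
    if not chosen:
        return f"We think you might enjoy {item}."
    return f"Your friends {', '.join(chosen)} also liked {item}."

ratings = {("Ana", "Gravity"): 5, ("Bob", "Gravity"): 4, ("Cat", "Gravity"): 2}
ties = {("you", "Ana"): 0.9, ("you", "Bob"): 0.4}
print(social_explanation("you", "Gravity", ratings, ties))
# -> Your friends Ana, Bob also liked Gravity.
```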
To date, our main line of research has focused on improving state-of-the-art group recommenders through the inclusion of social factors in the generation of recommendations that satisfy a group of users with potentially competing interests. To do so, we have reviewed different ways of combining people’s personal preferences and proposed an approach that takes into account the social reality within a group. The Social Recommendation Model (SRM) of Quijano-Sánchez et al. (2013) defines a set of recommendation methods that include the analysis and use of several social factors, such as the personality of group members, the tie strength between them, and users’ satisfaction with past recommendations.
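To give an intuition of how such social factors can influence a group score, the sketch below biases each member’s predicted rating by personality, average tie strength to the rest of the group, and past satisfaction, then averages the adjusted ratings. The formula, the weight `alpha`, and the [0, 1] factor scales are illustrative assumptions of ours, not SRM’s actual equations:

```python
# Illustrative only (not SRM's actual formula): adjust each member's
# predicted rating by a social bias built from personality, average tie
# strength to the group, and past satisfaction, then average the results.

def group_score(item, members, predicted, personality, ties, satisfaction,
                alpha=0.1):
    """predicted: (user, item) -> rating; personality, satisfaction:
    user -> value in [0, 1]; ties: (u, v) -> strength in [0, 1].
    alpha scales how strongly the social bias shifts the rating."""
    scores = []
    for u in members:
        others = [v for v in members if v != u]
        avg_tie = sum(ties.get((u, v), 0.0) for v in others) / len(others)
        social = personality[u] + avg_tie + satisfaction[u]  # in [0, 3]
        # Centre the bias at the neutral value 1.5 so an "average" member
        # leaves the predicted rating unchanged.
        scores.append(predicted[(u, item)] * (1 + alpha * (social - 1.5)))
    return sum(scores) / len(scores)
```

A strong personality with close ties to the group pulls the group score towards that member’s own predicted rating, which is the kind of group dynamic the social factors are meant to capture.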
Departing from this starting point, this research takes a step forward and translates, for the first time, the previously mentioned social explanations (Christakis & Fowler, 2009; Guy et al., 2009) to group recommender systems. This is done by including not only friend-related information, as those works propose, but all the social information that the adopted system (SRM) is able to retrieve, that is: personal ratings, users’ personality, tie strength between users, and previous satisfaction. Hence, this paper’s goal is to provide each group member with a Personalized Social Individual Explanation (PSIE) of the system’s proposed group recommendation and, by doing so, to induce positive reactions that lead to a better perception of the received group recommendation and of the system in general.
Thus, this work aims to improve the performance of the system of Quijano-Sánchez et al. (2013) through the inclusion of PSIE. To do so, two different approaches are proposed: Textual Social Explanations (TSE) (Section 4.1) and Graphical Social Explanations (GSE) (Section 4.2). We then study three questions: the effect of including simple non-social explanations in group recommender systems, the effect of adding a single social component to these group explanations, and the effect of including all of SRM’s social information in these explanations, that is, the complete PSIE approach. To address these questions, two experiments have been designed, one for the textual approach (Section 5.2) and one for the graphical approach (Section 5.3). These experiments also let us evaluate which of the two presented approaches is preferable (Section 5.3).
Consequently, this research has the following main contributions:
1. Study of the benefits of including explanations in group recommender systems.
2. Study of the benefits of adding a social component to explanations in group recommender systems.
3. Proposal of a Personalized Social Individual Explanation (PSIE) approach:
   (a) through a Textual Social Explanation approach (TSE);
   (b) through a Graphical Social Explanation approach (GSE).
The remainder of this paper is structured as follows: In the next section we introduce some of the state-of-the-art research regarding explanations. In Section 3 we present the main theoretical concepts needed to develop this work. Section 4 presents our PSIE proposal. In Section 5 we present experiments and results. Finally, Section 6 concludes the paper.
Section snippets
Related work
Jannach, Zanker, Felfernig, and Friedrich (2011) affirm that explanations in recommender systems can basically be understood as a form of communication between a selling agent (i.e., the recommender system) and a buying agent (i.e., the user). Research in explanations started with the premise that users are more likely to trust a recommendation when they know the rationale behind it (Herlocker, Konstan, & Riedl, 2000; Lamche, Adigüzel, & Wörndl, 2014; Symeonidis, Nanopoulos, & Manolopoulos, 2009,
Motivation
Group recommenders often operate in leisure domains, where it is common for people to consume items in groups. The choice of a date movie, a family holiday destination, or a restaurant for a celebration meal all require balancing the preferences of multiple consumers. These systems commonly aggregate real or predicted ratings for group members (Baltrunas, Makcinskas, & Ricci, 2010; Berkovsky & Freyne, 2010; Jameson & Smyth, 2007; Pessemier, Dooms, & Martens, 2013). The aggregation functions
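Two of the most widely used aggregation strategies in this literature, average satisfaction and least misery, can be sketched as follows (an illustrative example of the standard strategies, not SRM’s own aggregation):

```python
# Two standard aggregation strategies for group recommendation:
# average satisfaction and least misery (cf. Jameson & Smyth, 2007).

def average_aggregation(group_ratings):
    """group_ratings: item -> list of (real or predicted) member ratings.
    Pick the item with the highest mean rating across the group."""
    return max(group_ratings,
               key=lambda i: sum(group_ratings[i]) / len(group_ratings[i]))

def least_misery_aggregation(group_ratings):
    """Pick the item whose least-satisfied member is happiest."""
    return max(group_ratings, key=lambda i: min(group_ratings[i]))

ratings = {"comedy": [5, 4, 1], "drama": [3, 3, 3]}
print(average_aggregation(ratings))       # -> comedy (mean 3.33 vs 3.0)
print(least_misery_aggregation(ratings))  # -> drama (min 3 vs 1)
```

The example shows why the choice of function matters: averaging favours the comedy even though one member strongly dislikes it, while least misery protects that member by choosing the drama.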
Personalized social individual explanations
We understand explanation as in the Oxford Dictionary definition: “give a reason or justification for”. In our case, explanations aim to justify the recommendation of an item for the group’s welfare. We appeal to users’ sense of justice and social bonds to help them comprehend why the recommender has presented a specific item as the best option for the group. Our hypotheses aim to demonstrate that, by helping users understand the system’s recommendation through a set of explanations, the
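A minimal sketch of how such a personalized justification could be assembled from the social data described above is given below. The wording, fields, and threshold are illustrative assumptions of ours, not the actual TSE templates evaluated later in the paper:

```python
# Illustrative template for a personalized social textual justification.
# The message wording and the fields used are hypothetical examples.

def personalized_explanation(user, item, user_rating, group_avg, close_friend):
    parts = [f"{item} is the best compromise for the whole group "
             f"(average predicted rating {group_avg:.1f})."]
    if user_rating >= group_avg:
        parts.append(f"You are predicted to like it too ({user_rating:.1f}).")
    else:
        # Appeal to social bonds when the item is a sacrifice for this user.
        parts.append(f"It is not your top choice ({user_rating:.1f}), "
                     f"but {close_friend} is really looking forward to it.")
    return " ".join(parts)

print(personalized_explanation("you", "Gravity", 2.5, 3.8, "Ana"))
```

The two branches mirror the appeal to fairness described above: users whose preferences were met are reassured, while users who must compromise are reminded of the benefit to someone they care about.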
Experimental evaluation
To validate this paper’s posed hypotheses (Section 3), we have performed two randomized experiments (Diez, Barr, & Cetinkaya-Rundel, 2013) that, in general, allow us to study causal connections between providing explanations and improving users’ reactions and, in particular, allow us to address each of the stated research questions. The first experiment (Section 5.2) uses the TSE approach and the second one (Section 5.3) uses both the TSE and the GSE approaches. The overall experiment procedure
Conclusions
In this paper we have introduced a Personalized Social Individual Explanation (PSIE) approach in which we present group recommenders’ users with an explanation of why the system assumes that the recommended item is the best option for the group. The presented research illustrates a novel way of reflecting group dynamics, as opposed to other works (Jameson, 2004; McCarthy, Salamó, Coyle, McGinty, Smyth, & Nixon, 2006) that limit the explanation of a group recommendation to the trivial process of justifying
References (55)
- et al. (2016). Flexible knowledge representation and new similarity measure: Application on case-based reasoning for waste treatment. Expert Systems with Applications.
- et al. (2014). How should I explain? A comparison of different explanation types for recommender systems. International Journal of Human-Computer Studies.
- (2006). Generating predictive movie recommendations from trust in social networks. International Conference on Trust Management, iTrust ’06.
- et al. (2007). Trust-aware recommender systems. International Conference on Recommender Systems, RecSys ’07.
- et al. (2013). Comparison of group recommendation algorithms. Multimedia Tools and Applications.
- et al. (2014). Development of a group recommender application in a social network. Knowledge-Based Systems.
- et al. (2013). Social factors in group recommender systems. ACM Transactions on Intelligent Systems and Technology.
- et al. (2008). Providing justifications in recommender systems. IEEE Transactions on Systems, Man, and Cybernetics, Part A.
- et al. (2014). Visualization of explanations in recommender systems. Journal of Advanced Management Science.
- (2009). Group recommendation: Semantics and efficiency. Proceedings of the VLDB Endowment.