Automatic analysis and identification of verbal aggression and abusive behaviors for online social games
Introduction
Online social games provide rich interaction possibilities to their users, and create micro-worlds with social rules that parallel, but do not completely overlap with the real world. Since most transactions and interactions happen over digital media, these platforms present great opportunities to analyze user behavior. In online social games it is possible to record user actions, to create or to filter target interactions, and to obtain contextualized behavior instances. With the help of these data, one can either improve the game experience, by for instance adapting the game to maximize player enjoyment (Asteriadis, Shaker, Karpouzis, & Yannakakis, 2012), or use the game for a better understanding of the players themselves, for instance by inferring personality traits from in-game behavior (van Lankveld, Spronck, van den Herik, & Arntz, 2011).
There is a significant body of work investigating the effects of aggressive and violent content in computer games on players, particularly whether violent games induce aggression in children (Egenfeldt-Nielsen et al., 2013, Griffiths, 1999). However, little research has been done on aggressive behaviors within computer games. We do not deal here with the controversial issue of violent games (Ferguson, 2013). We distinguish avatar aggression, which involves aggression displayed by the virtual characters of a game, from player aggression, which targets the actual player. The latter is a form of cyber-aggression, and is often disruptive to the gaming experience. In this paper, we deal specifically with verbal player aggression via in-game communication channels. Most social online games provide several communication channels, including in-game chat, private messaging, gifting (i.e. sending a virtual gift to another player), message boards, and friendship and alliance requests. Rapid identification and resolution of verbal aggression over these channels is important for the gaming community, and requires that the content of verbal messages be analyzed automatically.
In addition to verbal messages, we explore in this work a number of features that can be used for player profiling in social online games. In particular, we use a supervised machine learning approach to create models of abusive and aggressive verbal behavior from labeled instances of abuse in such an online game, based on actual player complaints. While mechanisms for handling player complaints exist in most social games, game moderators need to spend time and energy analyzing player complaints to resolve each case individually. Consequently, labeled data are costly to obtain. We introduce here a labeled corpus for this purpose. Our study aims to improve the game experience indirectly, by automatically analyzing player complaints and thus helping game moderators respond to aggressive and abusive behaviors in the game. At the same time, our analysis may contribute to a better understanding of the factors that underlie such behaviors.
Our approach is based on the analysis of player complaints, player behavior, and player characteristics, including demographic data, game play statistics, and similar features of player history. The social interactions we analyze include chatting, as well as in-game friendship, offline messaging, and gifting. Our profiling methodology performs with a small number of false positives, and is now being incorporated into an actual game environment.
Our evaluation is based on the performance analysis of the classifiers we build for detecting abusive verbal behaviors automatically; if a classifier performs well, we can conclude that the selected features carry useful signal for this task.
For classification, we have used the Bayes Point Machine formalism in this work (Herbrich, Graepel, & Campbell, 2001). To evaluate our proposed methodology, we have collected the CCSoft Okey Player Abuse (COPA) Database over six months of game play, with 100,000 unique users, and 800,000 individual games. Our labeled complaint data comprises 1066 player complaints, each involving one or more game plays between involved players.
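The Bayes Point Machine approximates the Bayes-optimal decision by averaging over classifiers consistent with the training data. As a rough, illustrative sketch of that idea (not the paper's implementation, and not Herbrich et al.'s kernel formulation), one can average normalized perceptron solutions obtained from different training-order shufflings; all data and names below are toy examples:

```python
import numpy as np

def train_perceptron(X, y, rng, epochs=20):
    """Train a simple perceptron on labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    idx = np.arange(len(X))
    for _ in range(epochs):
        rng.shuffle(idx)  # different visit orders yield different solutions
        for i in idx:
            if y[i] * (X[i] @ w) <= 0:  # misclassified: update
                w += y[i] * X[i]
    return w

def bayes_point(X, y, n_samples=10, seed=0):
    """Approximate the Bayes point by averaging normalized perceptron weights."""
    rng = np.random.default_rng(seed)
    ws = [train_perceptron(X, y, rng) for _ in range(n_samples)]
    ws = [w / (np.linalg.norm(w) + 1e-12) for w in ws]
    return np.mean(ws, axis=0)

# Tiny linearly separable toy set; the constant 1 column folds in the bias.
X = np.array([[1.0, 2.0, 1], [2.0, 3.0, 1], [-1.0, -2.0, 1], [-2.0, -1.0, 1]])
y = np.array([1, 1, -1, -1])
w = bayes_point(X, y)
preds = np.sign(X @ w)
```

Because the set of separating weight vectors is a convex cone, the average of separating solutions still separates the training data; the averaged point tends to generalize better than any single perceptron run.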
The main research questions at the outset of this study were about understanding how social performance and gaming behaviors relate to verbal aggression, and whether there are factors that correlate highly with verbal aggression and abuse, or common features of abusive players. Our hypothesis is that player profiling and analysis of gaming behavior can provide useful cues in assessing cases of verbal aggression. In addition to answering these questions in the context of a particular game, we have sought to create an application of practical value, to help game designers in the moderation of their online game communities.
This paper is organized as follows. First, we give an overview of related work in social game analytics and aggressive behavior detection. We then introduce the game of Okey used in our experiments, and describe its social role in the Turkish culture. Next, we explain our proposed methodology. We present the COPA database, describe its annotation, and report experimental results. We conclude with a summary of findings and limitations.
Related work
A recent survey on human behavior analysis for computer games illustrates that while game designers analyze player behavior intensively when designing their games, real-time behavior analysis is rarely incorporated into the game itself (Schouten, Tieben, van de Ven, & Schouten, 2011). There are companies that adapt their game content to user preferences by means of A/B testing, where one group of users receives one version of the game, while a second group receives a slightly modified version, and the
An online social game: Okey
As in many countries, traditional Turkish games have seen their online counterparts hit the market. A well-known example of such a Turkish game is Okey, probably of Chinese origin, but adopted (and adapted) in Turkey and played socially for more than a hundred years. Okey is part of the “Kıraathane” (i.e. the coffeehouse, but the word literally means “reading house”) culture in Turkey; coffeehouses are social gathering places for men, who spend long hours there drinking tea and
Methods
We propose a system to automatically analyze and rank player complaints. First, a training and benchmarking set is generated, where player complaints are manually labeled as ‘abusive’ or ‘offending’ by human moderators. For this study, a single moderator was assigned to the annotation task. Multiple annotators would certainly increase the quality of annotations, but at the cost of doubling or tripling the annotation expense. The information of the players involved in these complaints is extracted
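The profiling step maps each reported player's history to a fixed-length feature vector that, together with the moderator's label, forms one training instance. A minimal sketch of this mapping follows; all field names (`games_played`, `win_rate`, `chat_msgs`, `gifts_sent`, `friend_count`, `account_age_days`) are hypothetical stand-ins for the demographic, game-play, and social-interaction features the paper describes:

```python
def complaint_to_features(player):
    """Map a reported player's profile dict to a fixed-length numeric vector.

    Missing fields default to zero so that incomplete histories still
    produce a vector of the same length.
    """
    return [
        player.get("games_played", 0),
        player.get("win_rate", 0.0),
        player.get("chat_msgs", 0),
        player.get("gifts_sent", 0),
        player.get("friend_count", 0),
        player.get("account_age_days", 0),
    ]

# Illustrative record for a reported player (values are invented).
reported = {"games_played": 350, "win_rate": 0.41, "chat_msgs": 1200,
            "gifts_sent": 2, "friend_count": 5, "account_age_days": 90}
vec = complaint_to_features(reported)
label = 1  # the moderator's manual annotation: 1 = abusive, 0 = not
```

The classifier is then trained on pairs of such vectors and labels, one per resolved complaint.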
Results and discussion
First, we present the results obtained when all features are fed to BPMs individually, and observe the effect of changing the threshold value used to decide whether a player falls into the abusive-player category (see Fig. 4). The variation of precision, specificity, and sensitivity across threshold values shows that increasing the threshold beyond 60% does not yield any significant gain in precision or specificity. However, sensitivity drops drastically as the threshold
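The threshold sweep described above can be sketched as follows. Given classifier confidence scores and ground-truth labels, precision, sensitivity (recall), and specificity are recomputed at each candidate threshold; the scores and labels below are synthetic stand-ins, not COPA results:

```python
def metrics_at(scores, labels, threshold):
    """Confusion-matrix metrics when players with score >= threshold
    are classified as abusive (label 1)."""
    tp = sum(1 for s, l in zip(scores, labels) if s >= threshold and l == 1)
    fp = sum(1 for s, l in zip(scores, labels) if s >= threshold and l == 0)
    fn = sum(1 for s, l in zip(scores, labels) if s < threshold and l == 1)
    tn = sum(1 for s, l in zip(scores, labels) if s < threshold and l == 0)
    precision = tp / (tp + fp) if tp + fp else 1.0
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return precision, sensitivity, specificity

# Synthetic confidences and labels for seven reported players.
scores = [0.9, 0.8, 0.7, 0.65, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,    0,   0,   1]
sweep = {t: metrics_at(scores, labels, t) for t in (0.5, 0.6, 0.7)}
```

Raising the threshold trades sensitivity for precision and specificity, which is the pattern the paper observes around the 60% mark.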
Conclusions
In this study, we presented our analysis of player complaints data acquired from a real online social game. Our primary aim was to spot offenders by profiling players according to their in-game behavior and performance. We propose a feature vector for player profile and a binary clustering methodology, followed by an adaptive thresholding for confidence to classify players reported in complaints as genuine offenders. Our approach performs well enough to aid human moderators in terms of
Acknowledgments
This work was supported in part by the Scientific and Technological Research Council of Turkey (TUBITAK) under grant number 114E481.
References (41)
- et al. (2013). Informing aggression–prevention efforts by comparing perpetrators of brief vs. extended cyber aggression. Computers in Human Behavior.
- (1999). Violent video games and aggression: A review of the literature. Aggression and Violent Behavior.
- et al. (2013). Facebook bullying: An extension of battles in school. Computers in Human Behavior.
- (2007). New bottle but old wine: A research of cyberbullying in schools. Computers in Human Behavior.
- et al. (2013). Multiparticipant chat analysis: A survey. Artificial Intelligence.
- et al. Analysis of group conversations: Modeling social verticality.
- et al. (2006). Comparison of two aggression inventories. Aggressive Behavior.
- Asteriadis, S., Shaker, N., Karpouzis, K., & Yannakakis, G. N. (2012). Towards player’s affective and behavioral visual...
- Balci, K., & Salah, A. A. (2013). Player profiling and offender classification from player complaints in online social...
- Bean, A., & Groth-Marnat, G. (2014). Video gamers and personality: A five-factor model to understand game playing...