
Computers in Human Behavior

Volume 53, December 2015, Pages 517-526

Full Length Article
Automatic analysis and identification of verbal aggression and abusive behaviors for online social games

https://doi.org/10.1016/j.chb.2014.10.025

Highlights

  • We propose a system to automatically evaluate player complaints in a social game.

  • Our database contains 100,000 players, 1000 complaints and 240 abusive players.

  • We experiment with several pieces of player information and their combinations.

  • Our system can correctly identify abusive players with up to 85% precision.

  • We can isolate and identify severe abuse cases with a higher confidence.

Abstract

Online multiplayer games create new social platforms, with their own etiquette, social rules of conduct and ways of expression. What counts as aggressive and abusive behavior may change depending on the platform, but most online gaming companies need to deal with aggressive and abusive players explicitly. This is usually tied to a reporting mechanism where the offended player reports an offense. In this paper, we develop tools for validating whether a verbal aggression offense report refers to a real offense or not, in the context of a very popular online social game called Okey. Our approach relies on the analysis of player behavior and characteristics of offending players. In the proposed system, chat records and other social activities in the game are taken into account, as well as player history. The methodology is sufficiently generic to be applied to similar gaming platforms, and thus describes a useful tool for game companies. We report our results on data collected over a six-month period, involving 100,000 users and 800,000 game records, and illustrate the viability of such analysis, while providing insights on the factors associated with verbal aggression and abusive behavior in social games.

Introduction

Online social games provide rich interaction possibilities to their users, and create micro-worlds with social rules that parallel, but do not completely overlap with the real world. Since most transactions and interactions happen over digital media, these platforms present great opportunities to analyze user behavior. In online social games it is possible to record user actions, to create or to filter target interactions, and to obtain contextualized behavior instances. With the help of these data, one can either improve the game experience, by for instance adapting the game to maximize player enjoyment (Asteriadis, Shaker, Karpouzis, & Yannakakis, 2012), or use the game for a better understanding of the players themselves, for instance by inferring personality traits from in-game behavior (van Lankveld, Spronck, van den Herik, & Arntz, 2011).

There is a significant body of work that investigates the effects of aggressive and violent content in computer games on the players, particularly whether violent games induce aggression in children or not (Egenfeldt-Nielsen et al., 2013, Griffiths, 1999). However, little research has been done on aggressive behaviors within computer games. We do not deal here with the controversial issues of violent games (Ferguson, 2013). Instead, we distinguish avatar aggression, which involves aggression displayed by the virtual characters of a game, from player aggression, in which the actual player is the target of aggression. The latter is a form of cyber-aggression, and is often disruptive for the gaming experience. In this paper, we deal specifically with verbal player aggression via in-game communication channels. Most social online games provide several communication channels, including in-game chat, private messaging, gifting (i.e. sending a virtual gift to another player), message boards, and friendship and alliance requests. Rapid identification and resolution of verbal aggression over these channels is important for the gaming community. For this purpose, the content of verbal messages should be analyzed automatically.
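For illustration only, the following minimal Python sketch flags a chat message against a small abuse lexicon; the word list, scoring scheme and function name are hypothetical placeholders rather than the analysis actually used in this work.

    # Minimal illustrative sketch: lexicon-based flagging of chat messages.
    # The lexicon and scoring are placeholders, not the paper's actual method.
    import re

    ABUSE_LEXICON = {"idiot", "loser", "stupid"}  # placeholder terms

    def abuse_score(message: str) -> float:
        """Return the fraction of tokens that match the abuse lexicon."""
        tokens = re.findall(r"\w+", message.lower())
        if not tokens:
            return 0.0
        hits = sum(1 for t in tokens if t in ABUSE_LEXICON)
        return hits / len(tokens)

    print(abuse_score("you are such a loser"))  # prints 0.2

In practice, such a naive lexicon score would only be one signal among many, since abusive language is often misspelled, obfuscated or context dependent.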

In addition to verbal messages, we explore in this work a number of features that can be used for player profiling in social online games. In particular, we use a supervised machine learning approach to create models of abusive and aggressive verbal behavior from labeled instances of abuse in such an online game, based on actual player complaints. While mechanisms for handling player complaints exist in most social games, game moderators need to spend time and energy analyzing player complaints to resolve each case individually. Consequently, labeled data are costly to obtain. We introduce here a labeled corpus for this purpose. Our study aims to improve the game experience indirectly, by automatically analyzing player complaints and thus helping game moderators respond to aggressive and abusive behaviors in the game. At the same time, our analysis may contribute to a better understanding of the factors that underlie such behaviors.

Our approach is based on the analysis of player complaints, player behavior, and player characteristics, including demographic data, game play statistics, and similar features of player history. The social interactions we analyze include chatting, as well as in-game friendship, offline messaging, and gifting. Our profiling methodology performs with a small number of false positives, and is now being incorporated into an actual game environment.
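As a purely illustrative example of the kind of profile such an analysis can draw on, the sketch below assembles a numeric feature vector from demographic, game-play and social-interaction attributes; all field names and example values are hypothetical and do not correspond to the paper's actual feature set.

    # Hypothetical player-profile feature vector; fields are illustrative only.
    from dataclasses import dataclass, astuple

    @dataclass
    class PlayerProfile:
        account_age_days: float        # demographic / account features
        declared_age: float
        games_played: int              # game-play statistics
        win_ratio: float
        games_abandoned: int
        chat_messages_per_game: float  # social-interaction features
        friend_count: int
        gifts_sent: int
        gifts_received: int
        prior_complaints_against: int

    def to_feature_vector(profile: PlayerProfile) -> list:
        """Flatten the profile into a numeric vector for a classifier."""
        return [float(x) for x in astuple(profile)]

    example = PlayerProfile(120, 27, 540, 0.48, 12, 3.2, 35, 4, 6, 2)
    print(to_feature_vector(example))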

Our evaluation is based on the performance analysis of the classifiers we build for detecting abusive verbal behaviors automatically; if a classifier performs well, the features we selected are indeed informative for this task.

For classification, we have used the Bayes Point Machine formalism in this work (Herbrich, Graepel, & Campbell, 2001). To evaluate our proposed methodology, we have collected the CCSoft Okey Player Abuse (COPA) Database over six months of game play, with 100,000 unique users, and 800,000 individual games. Our labeled complaint data comprises 1066 player complaints, each involving one or more game plays between involved players.
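For readers unfamiliar with Bayes Point Machines, the sketch below illustrates the underlying idea: approximate the Bayes point by averaging the normalized weight vectors of several linear classifiers trained on permutations of the data. This is a rough Python/scikit-learn approximation under our own assumptions, not the authors' implementation nor the exact algorithm of Herbrich et al. (2001).

    # Rough approximation of a Bayes Point Machine: average perceptrons trained
    # on random permutations of the data (assumes X is an ndarray of features
    # and y a vector of 0/1 labels); illustrative only.
    import numpy as np
    from sklearn.linear_model import Perceptron

    def bayes_point_fit(X, y, n_perm=25, random_state=0):
        rng = np.random.RandomState(random_state)
        weights, biases = [], []
        for _ in range(n_perm):
            order = rng.permutation(len(y))
            clf = Perceptron(max_iter=1000, tol=None, shuffle=False)
            clf.fit(X[order], y[order])
            w = clf.coef_.ravel()
            norm = np.linalg.norm(w) or 1.0
            weights.append(w / norm)
            biases.append(clf.intercept_[0] / norm)
        # The averaged hyperplane approximates the centre of mass of version space.
        return np.mean(weights, axis=0), float(np.mean(biases))

    def bayes_point_score(X, w, b):
        """Signed distance to the averaged hyperplane; larger = more likely abusive."""
        return X @ w + b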

The main research questions at the outset of this study concerned how social performance and gaming behaviors relate to verbal aggression, whether some factors correlate highly with verbal aggression and abuse, and whether abusive players share common features. Our hypothesis is that player profiling and analysis of gaming behavior can provide useful cues in assessing cases of verbal aggression. In addition to answering these questions in the context of a particular game, we have sought to create an application of practical value, to help game designers in the moderation of their online game communities.

This paper is organized as follows. First, we give an overview of related work in social game analytics and aggressive behavior detection. We then introduce the game of Okey used in our experiments, and describe its social role in the Turkish culture. Next, we explain our proposed methodology. We present the COPA database, describe its annotation, and report experimental results. We conclude with a summary of findings and limitations.

Section snippets

Related work

A recent survey on human behavior analysis for computer games shows that while game designers analyze player behavior intensively when designing their games, real-time behavior analysis is rarely incorporated into the game (Schouten, Tieben, van de Ven, & Schouten, 2011). There are companies that adapt their game content to user preferences by means of A–B testing, where one group of users receives one version of the game, while a second group receives a slightly modified version, and the

An online social game: Okey

As in many other countries, traditional Turkish games have seen their online counterparts reach the market. A very well-known example of such a Turkish game is Okey, probably of Chinese origin, but adopted (and adapted) in Turkey and played socially for more than a hundred years. Okey is a part of the “Kıraathane” (i.e. the coffeehouse, but the word literally means “reading house”) culture in Turkey; the coffeehouses are social gathering places for men, who spend long hours there to drink tea and

Methods

We propose a system to automatically analyze and rank player complaints. First, a training and benchmarking set is generated, where player complaints are manually labeled as ‘abusive’ or ‘offending’ by human moderators. For this study, a single moderator was assigned to the annotation task. Multiple annotators would certainly increase the quality of annotations, but at the cost of doubling or tripling the annotation expense. The information of players involved in these complaints is extracted

Results and discussion

First, we present the results obtained when all features are fed to BPMs individually, and observe the effect of changing the threshold value used to decide whether a player falls into the abusive player category or not (see Fig. 4). The variation of precision, specificity and sensitivity across different threshold values shows that increasing the threshold beyond 60% does not generate any significant gain in precision and specificity. However, sensitivity drops drastically as the threshold
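The threshold sweep described above can be reproduced in outline with the following sketch, which computes precision, sensitivity and specificity at several cut-off values over classifier scores; the variable names and threshold grid are ours, not the paper's.

    # Illustrative threshold sweep over classifier scores (e.g. estimated
    # probability of being abusive) against ground-truth 0/1 labels.
    import numpy as np

    def confusion_counts(scores, labels, threshold):
        pred = scores >= threshold
        tp = np.sum(pred & (labels == 1))
        fp = np.sum(pred & (labels == 0))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        return tp, fp, fn, tn

    def sweep(scores, labels, thresholds=np.arange(0.1, 1.0, 0.1)):
        for t in thresholds:
            tp, fp, fn, tn = confusion_counts(scores, labels, t)
            precision   = tp / (tp + fp) if tp + fp else float("nan")
            sensitivity = tp / (tp + fn) if tp + fn else float("nan")  # recall
            specificity = tn / (tn + fp) if tn + fp else float("nan")
            print(f"t={t:.1f}  precision={precision:.2f}  "
                  f"sensitivity={sensitivity:.2f}  specificity={specificity:.2f}")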

Conclusions

In this study, we presented our analysis of player complaint data acquired from a real online social game. Our primary aim was to spot offenders by profiling players according to their in-game behavior and performance. We propose a feature vector for the player profile and a binary clustering methodology, followed by adaptive thresholding on the confidence to classify players reported in complaints as genuine offenders. Our approach performs well enough to aid human moderators in terms of

Acknowledgments

This work was supported in part by the Scientific and Technological Research Council of Turkey (TUBITAK) under grant number 114E481.

References (41)

  • C.M. Bishop

    Pattern recognition and machine learning (information science and statistics)

    (2006)
  • A.H. Buss et al.

    The aggression questionnaire

    Journal of Personality and Social Psychology

    (1992)
  • E. Chang et al.

    CBSA: Content-based soft annotation for multimodal image retrieval using Bayes point machines

    IEEE Transactions on Circuits and Systems for Video Technology

    (2003)
  • J.M. Digman

    Personality structure: Emergence of the five-factor model

    Annual Review of Psychology

    (1990)
  • A. Drachen et al.

    Player modeling using self-organization in Tomb Raider: Underworld

  • R.O. Duda et al.

    Pattern classification

    (2012)
  • S. Egenfeldt-Nielsen et al.

    Understanding video games: The essential introduction

    (2013)
  • M.S. El-Nasr et al.

    Game analytics: Maximizing the value of player data

    (2013)
  • A.E. Elo

    The rating of chessplayers, past and present (Vol. 3). Batsford...

    (1978)
  • C.J. Ferguson

    Violent video games and the Supreme Court: Lessons for the scientific community in the wake of Brown v. Entertainment Merchants Association

    American Psychologist

    (2013)