Computers in Human Behavior

Volume 49, August 2015, Pages 412-426

Understanding the role of social context and user factors in video Quality of Experience

https://doi.org/10.1016/j.chb.2015.02.054

Highlights

  • We examine the impact of social context on compressed video Quality of Experience.

  • We analyze the effect on QoE of user factors such as interest and demographics.

  • Co-viewing videos increases users’ enjoyment and the endurability of the experience.

  • Low bitrate does not decrease video enjoyment, yet lower video quality is perceived.

  • User interest increases QoE; this effect is suppressed by the presence of co-viewers.

Abstract

Quality of Experience (QoE) is a concept that reflects a user’s level of satisfaction with multimedia content, a service or a system. So far, objective (i.e., computational) approaches to measuring QoE have mostly been based on the analysis of the technical properties of the media. However, recent studies have shown that this approach cannot sufficiently estimate user satisfaction, and that QoE depends on multiple factors beyond these technical properties. This paper aims to identify the role of social context and user factors (such as interest and demographics) in determining the quality of the viewing experience. We also investigate the relationships between social context, user factors and media technical properties whose effect on image quality is already known (i.e., bitrate level and video genre). Our results show that the presence of co-viewers increases the user’s level of enjoyment and enhances the endurability of the experience, and so does interest in the video content. Furthermore, although participants can clearly distinguish the various levels of video quality used in our study, these levels do not affect any of the other aspects of QoE. Finally, we report an impact of both gender and cultural background on QoE. Our results provide a first step toward building an accurate model of user QoE appreciation, to be deployed in future multimedia systems to optimize the user experience.

Introduction

Online video services show continuous growth. By 2010, over 71% of internet users had watched videos online, up from 33% in 2006 (Moore, 2011), and these figures are forecast to grow further in the coming years (Cisco, 2012, Moore, 2011). With a constantly increasing volume of streamed video data, maintaining a satisfactory video service for users at all times is challenging for internet and multimedia providers. Due to technological limitations (e.g., bandwidth and storage constraints, network malfunctions), visible artifacts (e.g., blockiness or blur due to compression, freezes or jerkiness due to transmission errors) can be introduced at any stage of the video delivery cycle (Pérez et al., 2011, Wang et al., 2003). This, in turn, can severely degrade user satisfaction, and evidence shows that users are willing to pay less for a service that does not meet their expectations (Naumann et al., 2010, Yamori and Tanaka, 2004). As a consequence, online video providers are eager to find ways to measure and predict users’ satisfaction with videos in order to optimize their video delivery chains.

Quality of Experience (QoE) is a concept commonly used to describe a user’s overall satisfaction (Le Callet, Möller, & Perkis, 2012), reflecting the degree of delight or annoyance of a user with a (multimedia) system, service or application. In the past decades, users’ satisfaction with videos has been estimated mainly from a technical perspective, i.e., based either on information gathered from the network and service conditions or on image and video analysis (Serral-Gracià et al., 2010). From a network management perspective, the concept of Quality of Service (QoS) has often been equated to QoE. Here, network parameters, such as packet loss or delay (Asghar, Le Faucheur, & Hood, 2009), as well as video QoS parameters, e.g., the so-called join time at the start of video playback or the buffering time during the video (Dobrian et al., 2011), were monitored; their compliance with given standards was considered enough to guarantee sufficiently high QoE. The signal processing community has instead relied more often on the analysis of information extracted from the decoded image/video signal to estimate the visibility of artifacts in it (Hemami and Reibman, 2010, Lin and Jay Kuo, 2011). Artifact visibility was considered to be inversely related to perceptual quality, and therefore to user satisfaction (Chikkerur, Sundaram, Reisslein, & Karam, 2011). In both cases, user satisfaction was mainly associated with technical properties of the multimedia signal, service or system.
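To make the signal-based approach concrete, the sketch below (an illustration added here, not part of the original study) estimates quality with the peak signal-to-noise ratio (PSNR) between a reference frame and its decoded counterpart. Full-reference metrics of this kind quantify artifact visibility from the signal alone and, by construction, ignore user and context factors.

import numpy as np

def psnr(reference: np.ndarray, decoded: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between a reference frame and its decoded version.

    Higher values indicate fewer visible coding artifacts; PSNR is a purely
    signal-based proxy for perceived quality.
    """
    mse = np.mean((reference.astype(np.float64) - decoded.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")  # frames are identical
    return 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical usage with two 8-bit grayscale frames of equal size:
# quality_db = psnr(original_frame, compressed_frame)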

Lately, research has shown that this approach has limitations, and that other elements contribute to user satisfaction when watching video (Le Callet et al., 2012, Zhu et al., 2014). For example, recent studies have argued that QoE should also be considered from a user perspective (De Pessemier, De Moor, Joseph, De Marez, & Martens, 2013): evidence has been provided that a user’s interest (Kortum & Sullivan, 2010) and personality (Wechsung, Schulz, Engelbrecht, Niemann, & Möller, 2011) influence QoE too. Such findings reveal the complexity of QoE: it is a combination of many influencing factors, not limited to QoS parameters or artifact visibility.

Influencing factors on QoE are often grouped into three categories, i.e., system, user and context factors (Le Callet et al., 2012). System factors concern the technical aspects of a multimedia system (e.g., network parameters, media genre and media configuration). User factors refer to individual characteristics of the user who is experiencing the video (e.g., demographics, personal interest or personality). Context factors refer to the characteristics of the environment within which the video experience is consumed (e.g., physical features of the environment, economic factors related to video consumption, presence or absence of co-viewers). As mentioned earlier, most research in the field has focused on system factors, leaving the contribution of user and context factors largely unexplored. However, the rise of online video consumption has created a shift from a passive viewing experience to a more active, personalized and shared experience, changing the traditional television market considerably (Tercek, 2011). Compared to traditional TV users, who just watch scheduled programs, internet users are free to choose the content they want, at any time and place they want, through a variety of devices (e.g., tablets, smartphones or computers). Thus, personal characteristics as well as the context of consumption are expected to play an important role in such viewing experiences. Moreover, the rise of social media has led to a new type of social viewing experience, where preferences for video content are explicitly reported on social media platforms (through comments and ratings) and are visible to the rest of the (vast) online community. The social context in which the video is experienced is therefore expected to play a key role in the eventual user satisfaction.

As the optimization of online video watching requires a more in-depth understanding of the impact of user and context factors on QoE, we contribute to this knowledge by considering the impact of social context in particular. Interestingly, very little is known about how social context (1) relates to QoE and (2) combines with system and user factors to determine the final user satisfaction with the viewing experience. We specifically focus on what we define as “direct” social context, that is, the presence or absence of co-viewers in the physical proximity of the user. We report the outcomes of an empirical study into the role played by direct social context in determining QoE when given system factors (i.e., video genre and bitrate) are in place. Furthermore, we analyze the interactions of direct social context with user influencing factors such as demographics, interest in the video genre and immersive tendency. We measure six different aspects of the viewing experience, namely perceived video quality, enjoyment, endurability, satisfaction, involvement and information assimilation. In the longer term, the outcomes should support building an accurate objective model for QoE.

The paper continues by presenting the related work in Section 2, which we reviewed to define the hypotheses for the empirical study as described in Section 3. We then outline our experimental methodology in Section 4, followed by the analysis of the results in Section 5. We discuss our findings in Section 6, leading to the most important conclusions in Section 7.

Related work

In the past decades, the effectiveness of multimedia services has been linked to the notion of Quality of Service (QoS), defined as the “totality of characteristics of a telecommunication service that bears on its ability to satisfy stated and implied needs of the user of the service” (ITU-T, 1994). QoS is mainly operationalized in terms of system and network performance-related measures (e.g., packet loss ratio, jitter or delay). This approach has started showing its limitations, and was found

Research questions and hypotheses

Based on the literature overview given in Section 2, we formulate three research questions:

  1. What is the effect of direct social context on QoE?

  2. How is the impact of system factors on QoE affected by the direct social context?

  3. How is the impact of user factors on QoE affected by the direct social context?

To answer these research questions, QoE is measured along the six attributes mentioned above: perceived visual quality, enjoyment, satisfaction, endurability, involvement and information assimilation.
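Several of these aspects are typically captured with multi-item questionnaire scales, and the reference list includes Cortina (1993) on coefficient alpha. As a minimal sketch (added here for illustration, not taken from the paper), the internal consistency of such a scale could be checked as follows, assuming ratings are stored as a participants-by-items array:

import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Coefficient alpha for a multi-item scale.

    item_scores: 2-D array with one row per participant and one column per
    questionnaire item belonging to the same QoE aspect (e.g., enjoyment).
    """
    item_scores = np.asarray(item_scores, dtype=np.float64)
    k = item_scores.shape[1]                           # number of items
    item_variances = item_scores.var(axis=0, ddof=1)   # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical usage: 5-point ratings of four enjoyment items from six participants.
ratings = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # values above ~0.7 are usually deemed acceptable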

Experimental design

To test our hypotheses, we created two real-life viewing situations with varying direct social context. In the first situation, single users (hereafter indicated with S, shown in Fig. 1a) watched the videos alone (i.e., absence of direct social context). In the second one, a group of three friends (hereafter indicated with G, shown in Fig. 1b) watched the videos together. Participants who were involved in one social situation (e.g., single) were not presented with the other situation (e.g.,

Data preparation

Before discussing our results in more detail, we performed a number of bias checks on the distribution of our participants over the two social contexts, i.e., participation in the single vs. group viewing situation. Note that for some variables such as interest, immersive tendency and some demographic data, values of one participant contributing to the group viewing situation were missing. Thus, where applicable, the results of only 59 instead of 60 participants are reported. In addition, since
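As an illustration of this kind of bias check (the paper's exact procedure is not reproduced here), one could test whether a demographic variable such as gender is distributed similarly over the single and group conditions with a chi-square test of independence. The hypothetical sketch below assumes the participant data are held in a pandas DataFrame with 'context' and 'gender' columns.

import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical participant table: one row per participant,
# 'context' is 'single' or 'group', 'gender' is 'female' or 'male'.
participants = pd.DataFrame({
    "context": ["single"] * 30 + ["group"] * 29,   # 59 usable records
    "gender":  ["female", "male"] * 15 + ["female"] * 15 + ["male"] * 14,
})

# Cross-tabulate gender against viewing context and test for independence.
table = pd.crosstab(participants["context"], participants["gender"])
chi2, p_value, dof, expected = chi2_contingency(table)

# A non-significant p-value suggests the two contexts are balanced on gender.
print(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")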

Discussion

Quality of Experience is a complex concept, and its proper quantification still faces several challenges. Based on existing literature, we proposed to measure various aspects of Quality of Experience, including perceived visual quality (along two separate dimensions: (1) artifact visibility and (2) overall quality), enjoyment, satisfaction, endurability, involvement and information assimilation. Measurement scales for perceived visual quality are well established; conversely, no

Conclusions

In this paper, we investigated a set of influencing factors on users’ QoE with videos. Our results showed that co-viewing videos with friends increased the user’s level of enjoyment and enhanced the endurability of the experience, indicating that social context should be further investigated in relation to QoE and also considered in automated measurements. The presence of co-viewers did not change participants’ ability to detect visual artifacts, yet the presence of visible artifacts did not

Acknowledgements

This work is supported in part by a scholarship from the China Scholarship Council (CSC) under CSC Grant No. 201206090028. It is also partially supported by NWO Veni Grant 639.021.230.

References (70)

  • P. Brooks et al. (2010). User measures of quality of experience: Why being objective and quantitative is important. IEEE Network.
  • S. Chikkerur et al. (2011). Objective video quality assessment methods: A classification, review, and performance comparison. IEEE Transactions on Broadcasting.
  • K. Chorianopoulos et al. (2008). Introduction to social TV: Enhancing the shared experience with interactive TV. International Journal of Human–Computer Interaction.
  • Cisco, I. (2012). Cisco visual networking index: Forecast and methodology, 2011–2016. CISCO White paper, ...
  • J.M. Cortina (1993). What is coefficient alpha? An examination of theory and applications. Journal of Applied Psychology.
  • De Moor, K., Quintero, M. R., Strohmeier, D., & Raake, A. (2013). Evaluating QoE by means of traditional and ...
  • K. De Moor et al. (2014). Chamber QoE: A multi-instrumental approach to explore affective aspects in relation to quality of experience. IS&T/SPIE Electronic Imaging, International Society for Optics and Photonics.
  • T. De Pessemier et al. (2013). Quantifying the influence of rebuffering interruptions on the user’s quality of experience during mobile video watching. IEEE Transactions on Broadcasting.
  • P. Desmet (2005). Measuring emotion: Development and application of an instrument to measure emotional responses to products.
  • F. Dobrian et al. (2011). Understanding the impact of video quality on user engagement. ACM SIGCOMM Computer Communication Review.
  • P.G. Engeldrum (2000). Psychometric scaling: A toolkit for imaging systems development.
  • M. Fiedler et al. (2010). A generic quantitative relationship between quality of experience and quality of service. IEEE Network.
  • Fröhlich, P., Baillie, L., & Schatz, R. (2006). Exploring the Joint iTV Experience. FTW Technical Report, ...
  • B. Gardlo et al. (2012). Microworkers vs. facebook: The impact of crowdsourcing platform choice on experimental results.
  • P. Gastaldo et al. (2013). Supporting visual quality assessment with machine learning. EURASIP Journal on Image and Video Processing.
  • G. Ghinea et al. (2005). Quality of perception: User quality of service in multimedia presentations. IEEE Transactions on Multimedia.
  • S.R. Gulliver et al. (2004). Stars in their eyes: What eye-tracking reveals about multimedia perceptual quality. IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans.
  • S.R. Gulliver et al. (2006). Defining user perception of distributed multimedia quality. ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP).
  • T. Hoßfeld et al. (2014). Best practices for QoE crowdtesting: QoE assessment with crowdsourcing. IEEE Transactions on Multimedia.
  • Q. Huynh-Thu et al. (2008). Temporal aspect of perceived quality in mobile video broadcasting. IEEE Transactions on Broadcasting.
  • M. Hyder et al. Are QoE requirements for multimedia services different for men and women? Analysis of gender differences in forming QoE in virtual acoustic environments.
  • S. Ickin et al. (2012). Factors influencing quality of experience of commonly used mobile applications. IEEE Communications Magazine.
  • ISO/IEC (2006). ISO/IEC 14496-3: 2005. ...
  • ITU-R (2002). BT.500-11, Methodology for the subjective assessment of the quality of television pictures. In ...
  • ITU-T (1994). Recommendation E.800: Terms and definitions related to quality of service and network performance ...