Abstract
Crowdsourced quality of experience (QoE) evaluation for multimedia is more cost-effective and flexible than traditional in-lab evaluation, and it has gradually attracted extensive attention. In this paper, we start from the concept, characteristics, and challenges of crowdsourced QoE evaluation for multimedia, and then summarize current research progress, including key technologies in a crowdsourceable QoE evaluation framework. Finally, we point out the open research problems to be solved and the future trends.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Wang, Z., Tao, D., Liu, P. (2015). Development and Challenges of Crowdsourcing Quality of Experience Evaluation for Multimedia. In: Wang, Y., Xiong, H., Argamon, S., Li, X., Li, J. (eds) Big Data Computing and Communications. BigCom 2015. Lecture Notes in Computer Science, vol 9196. Springer, Cham. https://doi.org/10.1007/978-3-319-22047-5_36
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-22046-8
Online ISBN: 978-3-319-22047-5
eBook Packages: Computer Science (R0)