
Crowdsourced subjective 3D video quality assessment

  • Regular Paper
  • Published in Multimedia Systems

Abstract

This article proposes Crowd3D, a new method for subjective 3D video quality assessment based on crowdsourced workers. The limitations of traditional laboratory-based grade collection procedures are outlined, and it is described how a crowd-based approach addresses them. Several conceptual and technical requirements of crowd-based 3D video quality assessment are identified, and the solutions adopted are described in detail. The resulting system is a web-based platform that supports 3D video monitors and orchestrates the entire process of observer validation, content presentation, and recording of quality, depth, and comfort grades in a remote database. The crowdsourced assessment uses as source content a set of 3D videos and a grade database assembled earlier in a laboratory setting. To evaluate the validity of the crowd-based approach, the grades gathered with the crowdsourced system were analysed and compared to the grades obtained in the laboratory on the same data set. Results show that Pearson's and Spearman's correlations of up to 0.95 for the quality Difference Mean Opinion Score (DMOS) and 0.96 for the quality Mean Opinion Score (MOS) can be reached when comparing with the laboratory grades. Beyond the present study, the proposed 3D video quality assessment platform can be used for further related research, reducing time and cost compared to traditional laboratory-based quality assessments.
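To make the grade comparison concrete, the sketch below shows how per-sequence MOS and DMOS values might be derived from raw observer grades and how laboratory and crowdsourced scores can then be correlated with Pearson's and Spearman's coefficients. This is a minimal illustration under stated assumptions, not the authors' actual pipeline: the grade arrays are invented, a 5-point rating scale is assumed, and DMOS is taken as the reference MOS minus the MOS of each processed version.

```python
# Minimal sketch (not the authors' pipeline): deriving MOS and DMOS from
# raw grades and correlating crowdsourced vs. laboratory results.
# Assumptions: grades on a 5-point scale; DMOS taken as the MOS of the
# hidden reference minus the MOS of each processed version.
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical grades: rows = observers, columns = processed sequences;
# column 0 is the hidden reference version.
lab_grades = np.array([[5, 4, 3, 2],
                       [4, 4, 2, 2],
                       [5, 5, 3, 1]])
crowd_grades = np.array([[4, 4, 3, 2],
                         [5, 4, 2, 1],
                         [4, 5, 3, 2]])

# MOS: mean grade per sequence, averaged across observers.
lab_mos = lab_grades.mean(axis=0)
crowd_mos = crowd_grades.mean(axis=0)

# DMOS: difference against the reference MOS (column 0).
lab_dmos = lab_mos[0] - lab_mos
crowd_dmos = crowd_mos[0] - crowd_mos

# Agreement between the two settings: Pearson (linear) and
# Spearman (rank-order) correlation coefficients.
mos_plcc, _ = pearsonr(lab_mos, crowd_mos)
mos_srocc, _ = spearmanr(lab_mos, crowd_mos)
dmos_plcc, _ = pearsonr(lab_dmos, crowd_dmos)
print(f"MOS  PLCC={mos_plcc:.3f}  SROCC={mos_srocc:.3f}")
print(f"DMOS PLCC={dmos_plcc:.3f}")
```

In a real study, observer screening (for example, the outlier rejection procedure of ITU-R BT.500) would precede the MOS computation, and confidence intervals would accompany each reported score.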




Acknowledgements

The authors would like to thank the European COST Action IC1105, 3DConTourNet, for its active support and cooperation. This work was funded by FCT/MEC through national funds and, when applicable, co-funded by FEDER under the PT2020 partnership agreement, project UID/EEA/50008/2019.

Author information


Corresponding author

Correspondence to Emil Dumic.

Additional information

Communicated by G. Morin.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Dumic, E., Sakic, K. & da Silva Cruz, L.A. Crowdsourced subjective 3D video quality assessment. Multimedia Systems 25, 673–694 (2019). https://doi.org/10.1007/s00530-019-00619-7


