DOI: 10.1145/2713168.2723145

Video BenchLab: an open platform for realistic benchmarking of streaming media workloads

Published: 18 March 2015

ABSTRACT

In this paper, we present an open, flexible and realistic benchmarking platform named Video BenchLab to measure the performance of streaming media workloads. While Video BenchLab can be used with any existing media server, we provide a set of tools for researchers to experiment with their own platforms and protocols. The components include a MediaDrop video server, a suite of tools to bulk-insert videos and generate streaming media workloads, a dataset of freely available videos, and a client runtime to replay videos in the native video players of real Web browsers such as Firefox, Chrome and Internet Explorer. We define simple metrics that capture the quality of video playback and identify issues that can occur during replay. Finally, we provide a Dashboard to manage experiments, collect results and perform analytics to compare performance across experiments.
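The client runtime described above drives the native HTML5 players of real browsers and reports simple playback-quality metrics. As a rough illustration of that approach (a minimal sketch, not the actual BenchLab client), the following uses Selenium's Python bindings to play a video and count stalls; the MediaDrop URL, the two-minute cap and the one-second sampling interval are illustrative assumptions.

```python
# Minimal sketch: replay a video in a real browser's native HTML5 player
# and count playback stalls. Not the actual BenchLab client; the URL and
# sampling parameters below are illustrative assumptions.
import time

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()  # any Selenium-supported browser works
try:
    # Hypothetical MediaDrop page serving an HTML5 <video> element.
    driver.get("http://mediadrop.example.org/media/sample-video")

    video = driver.find_element(By.TAG_NAME, "video")
    driver.execute_script("arguments[0].play();", video)

    stalls = 0
    last_position = -1.0
    deadline = time.time() + 120  # give up after two minutes

    while time.time() < deadline:
        state = driver.execute_script(
            "var v = arguments[0];"
            "return {t: v.currentTime, ended: v.ended};",
            video)
        if state["ended"]:
            break
        if state["t"] <= last_position:
            stalls += 1  # position did not advance between samples: a stall
        last_position = state["t"]
        time.sleep(1)  # sample the player once per second

    print("playback stalls observed:", stalls)
finally:
    driver.quit()
```

Polling currentTime this way is a crude but browser-agnostic stall proxy: whenever the playback position fails to advance between samples, the player was rebuffering (or paused), which is exactly the kind of replay issue such metrics are meant to surface.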

We present a series of experiments with Video BenchLab to illustrate how the video-specific metrics can be used to measure the user-perceived experience in real browsers when streaming videos. We also show Internet-scale experiments by deploying clients in data centers distributed all over the globe. All the software, datasets, workloads and results used in this paper are made freely available on SourceForge for anyone to reuse and extend.


Published in

MMSys '15: Proceedings of the 6th ACM Multimedia Systems Conference
March 2015, 277 pages
ISBN: 9781450333511
DOI: 10.1145/2713168

Copyright © 2015 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States



            Qualifiers

            • research-article

            Acceptance Rates

MMSys '15 paper acceptance rate: 12 of 41 submissions (29%). Overall acceptance rate: 176 of 530 submissions (33%).
