DOI: 10.1145/3530019.3534980

The Lifecycle of Developing Overall Evaluation Criterion in AB Testing for Netflix Messaging

Published: 13 June 2022

Abstract

Many organizations use multiple key metrics to measure the success of online controlled experiments. By combining these metrics, we introduce the Overall Evaluation Criterion (OEC), a single measure that aligns with an organization's long-term success. We cover the definition of the OEC, its critical properties, a case study on Netflix Messaging, and four key steps for designing your own organizational OEC.
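The abstract describes combining multiple key metrics into a single OEC. A minimal sketch of one common approach, a weighted sum of normalized treatment-vs-control metric deltas, is shown below; the metric names, deltas, and weights are hypothetical illustrations, not values from the paper.

```python
# Illustrative sketch (not the paper's method): combining several
# per-experiment metric deltas into one Overall Evaluation Criterion (OEC).

def oec_score(deltas: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of (already normalized) treatment-vs-control deltas."""
    return sum(weights[name] * deltas[name] for name in weights)

# Hypothetical deltas, each normalized to a comparable scale.
deltas = {"retention": 0.012, "engagement": 0.03, "unsubscribes": -0.004}

# Weights encode long-term priorities; unsubscribes count against the OEC.
weights = {"retention": 0.6, "engagement": 0.3, "unsubscribes": -0.1}

print(round(oec_score(deltas, weights), 6))  # a single score to compare variants
```

In practice the hard part is choosing the component metrics and weights so the combined score tracks long-term organizational success, which is what the four steps in the paper address.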



Published In

EASE '22: Proceedings of the 26th International Conference on Evaluation and Assessment in Software Engineering
June 2022
466 pages
ISBN:9781450396134
DOI:10.1145/3530019

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. AB Tests
  2. OEC
  3. Overall Evaluation Criteria

Qualifiers

  • Extended-abstract
  • Research
  • Refereed limited

Conference

EASE 2022

Acceptance Rates

Overall Acceptance Rate 71 of 232 submissions, 31%


Article Metrics

  • 0
    Total Citations
  • 155
    Total Downloads
  • Downloads (last 12 months): 47
  • Downloads (last 6 weeks): 8
Reflects downloads up to 17 Feb 2025
