DOI: 10.1145/3549206.3549234

Minimizing Cold Start Times in Serverless Deployments

Published: 24 October 2022

ABSTRACT

Serverless deployments of cloud applications involve containerizing an application that then remains dormant (cold) until a trigger event, such as a user visiting an endpoint, occurs. The host machine then provisions this dormant container onto a virtual machine that serves the request and then stays idle, waiting for subsequent requests to come in (warm). While the performance of requests made while a container is warm is indistinguishable from that of a fully managed server stack, requests made while a container is cold can take several seconds because of the overhead of VM provisioning. When a container transitions from warm back to cold is decided by the host depending on its existing load and configuration.
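As a rough illustration of the cold/warm gap described above, the following sketch (not from the paper; the endpoint URL is a hypothetical placeholder) times the first request after an idle period against a few immediate follow-ups:

```python
import time
import urllib.request

# Hypothetical Cloud Run-style endpoint; substitute a real deployed service URL.
ENDPOINT = "https://example-service-abc123-uc.a.run.app/"

def timed_request(url: str) -> float:
    """Return the end-to-end latency of one GET request, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    # The first request after an idle period typically lands on a cold
    # container; immediate follow-ups are served by the now-warm instance.
    cold = timed_request(ENDPOINT)
    warm = [timed_request(ENDPOINT) for _ in range(5)]
    print(f"cold: {cold:.3f}s  warm avg: {sum(warm) / len(warm):.3f}s")
```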

This paper aims to devise methods that reduce the frequency and duration of cold starts across different workloads and cloud providers. By changing base images, lazy-loading I/O and database initializations, and modifying CPU capacity, cold start times on GCP Cloud Run were reduced by up to 5% for simple workloads and 10.5% for database-dependent workloads.
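Of the techniques named above, lazy-loading the database initialization is the simplest to illustrate in isolation. The sketch below is an assumption-laden example, not the paper's code: a Flask handler (as commonly used on Cloud Run) defers opening the connection until the first request, and sqlite3 stands in for whatever database the workload actually uses, so the cold-start import path does less work.

```python
import sqlite3
from flask import Flask

app = Flask(__name__)
_db = None  # created lazily instead of at import time (the cold-start path)

def get_db():
    """Open the database connection on first use and reuse it afterwards."""
    global _db
    if _db is None:
        _db = sqlite3.connect(":memory:", check_same_thread=False)
    return _db

@app.route("/")
def handler():
    # The first request pays the connection cost; container startup does not.
    value = get_db().execute("SELECT 1").fetchone()[0]
    return {"result": value}
```

The same pattern can be applied to other heavyweight clients (HTTP sessions, SDK objects): work that is not needed to serve the very first response can usually be deferred out of the cold-start path.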


Published in

IC3-2022: Proceedings of the 2022 Fourteenth International Conference on Contemporary Computing
August 2022, 710 pages
ISBN: 9781450396752
DOI: 10.1145/3549206

Copyright © 2022 ACM

Publication rights licensed to ACM. ACM acknowledges that this contribution was co-authored by an affiliate of the Crown in Right of Canada. As such, the Crown in Right of Canada retains an equal interest in the copyright. Reprint requests should be forwarded to ACM, and reprints must include clear attribution to ACM and Crown in Right of Canada.

Publisher

Association for Computing Machinery, New York, NY, United States



        Qualifiers

        • research-article
        • Research
        • Refereed limited
