ABSTRACT
Serverless deployments of cloud applications involve containerizing an application that then remains dormant ("cold") until a trigger event, such as a user visiting an endpoint, occurs. The host machine then provisions this dormant container onto a virtual machine that serves the request and subsequently stays idle, waiting for further requests ("warm"). While the performance of requests made while a container is warm is indistinguishable from that of a fully managed server stack, requests made while a container is cold can take several seconds because of the overhead of VM provisioning. The point at which a container transitions from warm back to cold is decided by the host depending on existing load and its configuration.
This paper proposes methods to reduce the frequency and duration of cold starts across different workloads and cloud providers. By changing base images, lazily loading I/O and database initializations, and modifying CPU capacity, cold start times on GCP Cloud Run were reduced by up to 5% and 10.5% for simple and database-dependent workloads respectively.
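The lazy-initialization idea mentioned above can be illustrated with a minimal sketch. This is not the paper's actual implementation; it assumes a Python service where a hypothetical `connect()` stands in for an expensive database driver call (e.g., opening a connection pool). The point is that deferring that call out of module import time removes it from the cold-start path: only the first request that actually needs the database pays the cost.

```python
# Lazy database initialization: defer expensive setup out of the
# cold-start (import-time) path and run it on first use instead.

_db = None  # module-level cache; left empty so import stays fast


def connect():
    """Hypothetical stand-in for a real driver call such as
    psycopg2.connect(...); returns a fake connection object."""
    return {"connected": True}


def get_db():
    """Create the connection lazily, only when first requested,
    and reuse the cached instance on every later call."""
    global _db
    if _db is None:
        _db = connect()  # expensive work happens here, not at import
    return _db


def handle_request():
    """Request handler: the first invocation triggers connect();
    warm invocations reuse the cached connection."""
    db = get_db()
    return db["connected"]
```

Because the cached object lives at module level, it survives across requests served by the same warm container, so the deferred cost is paid at most once per container instance.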