Razavi, Kamran; Salmani, Mehran; Mühlhäuser, Max; Koldehofe, Boris; Wang, Lin (2024)
A Tale of Two Scales: Reconciling Horizontal and Vertical Scaling for Inference Serving Systems.
DOI: 10.48550/arXiv.2407.14843
Report, Bibliography
Abstract
Inference serving is of great importance in deploying machine learning models in real-world applications, ensuring efficient processing and quick responses to inference requests. However, managing resources in these systems poses significant challenges, particularly in maintaining performance under varying and unpredictable workloads. Two primary scaling strategies, horizontal and vertical scaling, offer different advantages and limitations. Horizontal scaling adds more instances to handle increased loads but can suffer from cold start issues and increased management complexity. Vertical scaling boosts the capacity of existing instances, allowing for quicker responses but is limited by hardware and model parallelization capabilities. This paper introduces Themis, a system designed to leverage the benefits of both horizontal and vertical scaling in inference serving systems. Themis employs a two-stage autoscaling strategy: initially using in-place vertical scaling to handle workload surges and then switching to horizontal scaling to optimize resource efficiency once the workload stabilizes. The system profiles the processing latency of deep learning models, calculates queuing delays, and employs different dynamic programming algorithms to solve the joint horizontal and vertical scaling problem optimally based on the workload situation. Extensive evaluations with real-world workload traces demonstrate over 10× SLO violation reduction compared to the state-of-the-art horizontal or vertical autoscaling approaches while maintaining resource efficiency when the workload is stable.
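The two-stage policy described in the abstract (in-place vertical scaling to absorb surges, then horizontal scaling for resource efficiency once the load settles) can be illustrated with a short sketch. This is a minimal illustration only, not Themis's algorithm: the function and parameter names (`autoscale`, `capacity_per_cpu`, `stability_window`), the linear capacity model, and the plain scan standing in for the paper's dynamic-programming formulation are all assumptions made for the example.

```python
from dataclasses import dataclass
from math import ceil

@dataclass
class Allocation:
    replicas: int          # number of model instances (horizontal dimension)
    cpus_per_replica: int  # CPU cores per instance (vertical dimension)

def autoscale(current: Allocation,
              arrival_rate: float,        # observed request rate (req/s)
              capacity_per_cpu: float,    # profiled req/s one core sustains within the SLO (assumed linear)
              max_cpus_per_replica: int,  # in-place vertical scaling limit of a node
              stable_seconds: float,      # how long the load has been steady
              stability_window: float = 30.0) -> Allocation:
    """Two-stage policy sketch: absorb surges with in-place vertical scaling,
    then consolidate with horizontal scaling once the workload is stable."""
    capacity = current.replicas * current.cpus_per_replica * capacity_per_cpu

    if arrival_rate > capacity:
        # Stage 1 (surge): grow the existing replicas in place -- no cold start --
        # up to the per-node CPU limit.
        needed = ceil(arrival_rate / (current.replicas * capacity_per_cpu))
        return Allocation(current.replicas,
                          min(max_cpus_per_replica,
                              max(current.cpus_per_replica, needed)))

    if stable_seconds >= stability_window:
        # Stage 2 (stable): pick the cheapest (replicas, cpus) pair that still
        # covers the load. The paper solves this jointly via dynamic programming
        # over profiled latencies and queuing delays; a plain scan stands in here.
        best = current
        for cpus in range(1, max_cpus_per_replica + 1):
            replicas = max(1, ceil(arrival_rate / (cpus * capacity_per_cpu)))
            if replicas * cpus < best.replicas * best.cpus_per_replica:
                best = Allocation(replicas, cpus)
        return best

    return current  # between stages: keep the current allocation
```

The ordering carries the idea: vertical growth reacts immediately because it avoids instance cold starts, while the stable-phase search trades that headroom back for a cheaper long-lived configuration.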
Type of item: | Report |
---|---|
Published: | 2024 |
Author(s): | Razavi, Kamran; Salmani, Mehran; Mühlhäuser, Max; Koldehofe, Boris; Wang, Lin |
Type of entry: | Bibliography |
Title: | A Tale of Two Scales: Reconciling Horizontal and Vertical Scaling for Inference Serving Systems |
Language: | English |
Date of publication: | 24 July 2024 |
Publisher: | arXiv |
Series: | Distributed, Parallel, and Cluster Computing |
Edition: | Version 1 |
DOI: | 10.48550/arXiv.2407.14843 |
Division(s): | 20 Department of Computer Science; 20 Department of Computer Science > Telecooperation |
TU projects: | DFG|SFB1053|SFB1053 TPA01 Mühlhä; DFG|SFB1053|SFB1053 TPB02 Mühlhä |
Date deposited: | 02 Aug 2024 08:32 |
Last modified: | 24 Sep 2024 13:51 |