BENCHMARKING THE PERFORMANCE OF CONTAINERIZED MICROSERVICES USING DISTRIBUTED TRACING AND DYNAMIC LOAD TESTING
Keywords:
Microservices, Distributed Tracing, Load Testing, Performance Benchmarking, Observability, Kubernetes, Service Mesh, Performance Metrics, Container Orchestration, Dynamic Workloads
Synopsis
The rapid proliferation of microservices architectures, especially in cloud-native environments, necessitates robust methods for evaluating their performance under real-world conditions. Traditional performance evaluation tools fall short in capturing the complexities of containerized microservices, particularly in heterogeneous deployment environments. This paper proposes an integrated approach combining distributed tracing with dynamic load testing to benchmark performance more holistically. By observing latency, throughput, and resource consumption across service dependencies, our method offers a deeper understanding of performance bottlenecks and scalability limits. We implement this approach in a Kubernetes environment using open-source observability tools and dynamic stress generators, providing a modular benchmarking framework for DevOps and SRE teams. Empirical results demonstrate how fine-grained tracing and adaptive load generation enable proactive performance tuning and improve service reliability under fluctuating workloads.
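The adaptive load generation described above can be illustrated with a minimal sketch: ramp up client concurrency in steps, measure latency percentiles at each level, and stop at the first level whose p95 breaches a latency SLO. This is an illustrative stand-in, not the paper's actual framework; the function names (`measure_latency`, `adaptive_ramp`) and the step-ramp policy are assumptions for the example.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def measure_latency(call, n_requests, concurrency):
    """Issue n_requests against `call` via a thread pool; return (p50, p95) in ms."""
    def timed(_):
        start = time.perf_counter()
        call()
        return (time.perf_counter() - start) * 1000.0

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        samples = sorted(pool.map(timed, range(n_requests)))
    return samples[len(samples) // 2], samples[int(len(samples) * 0.95)]

def adaptive_ramp(call, slo_p95_ms, start=2, step=2, max_conc=32):
    """Raise concurrency stepwise until observed p95 latency violates the SLO.

    Returns (concurrency, p95_ms) at the first violating level, or the
    final level if the SLO holds throughout the ramp.
    """
    for conc in range(start, max_conc + 1, step):
        p50, p95 = measure_latency(call, n_requests=conc * 10, concurrency=conc)
        if p95 > slo_p95_ms:
            return conc, p95  # scalability limit found at this load level
    return max_conc, p95
```

In a real benchmark, `call` would issue an HTTP request to a service under test and the per-request timings would be correlated with distributed-trace spans to attribute latency to individual downstream dependencies.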
License

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.