Complex multi-tier applications deployed in cloud computing environments can experience rapid changes in their workloads. To ensure market readiness of such applications, adequate resources must be provisioned so that the applications can meet the demands of specified workload levels while also meeting service level agreements. Multi-tier cloud applications can have complex deployment configurations involving load balancers, web servers, application servers and database servers, and complex dependencies may exist between the servers in the various tiers. To support provisioning and capacity planning decisions, performance testing approaches based on synthetic workloads are used. The accuracy of a performance testing approach is determined by how closely the generated synthetic workloads mimic real workloads. Since multi-tier applications can have varied deployment configurations and characteristic workloads, a generic performance testing methodology is needed that allows the performance of such applications to be modeled accurately. We propose a methodology for performance testing of complex multi-tier applications. The workloads of multi-tier cloud applications are captured in two different models: a benchmark application model and a workload model.
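As an illustration only, the following minimal Python sketch shows the kind of session-based workload model this refers to: a first-order Markov chain of page transitions with exponentially distributed think times. The page names, transition probabilities and think-time parameter are assumptions made for the example, not values from the paper.

import random

# Hypothetical page set, first-order Markov transition probabilities and
# mean think time; all names and numbers here are illustrative assumptions.
PAGES = {
    "home":     {"browse": 0.5, "search": 0.4, "checkout": 0.1},
    "browse":   {"item": 0.6, "search": 0.3, "home": 0.1},
    "search":   {"item": 0.7, "browse": 0.2, "home": 0.1},
    "item":     {"browse": 0.4, "checkout": 0.3, "home": 0.3},
    "checkout": {"home": 1.0},
}
MEAN_THINK_TIME_S = 4.0  # assumed mean think time between requests

def generate_session(max_requests=20):
    """Generate one synthetic user session as (page, think_time) pairs."""
    page, session = "home", []
    for _ in range(max_requests):
        think = random.expovariate(1.0 / MEAN_THINK_TIME_S)
        session.append((page, round(think, 2)))
        if page == "checkout":  # treat checkout as the end of the session
            break
        nxt = PAGES[page]
        page = random.choices(list(nxt), weights=list(nxt.values()))[0]
    return session

if __name__ == "__main__":
    for request in generate_session():
        print(request)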
An architecture model captures the deployment configurations of multi-tier applications. We also propose a rapid deployment prototyping methodology that can help in choosing the best and most cost-effective deployments for multi-tier applications that meet the specified performance requirements.
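A minimal sketch of what such deployment prototyping could look like is given below: candidate server counts per tier are enumerated, and the cheapest configuration whose weakest tier still sustains a target request rate is selected. The per-server costs and capacities are illustrative assumptions; in practice they would be estimated from benchmark runs driven by the synthetic workloads.

from itertools import product

# Hypothetical per-server hourly costs and per-server capacities
# (requests/sec); these numbers are illustrative assumptions.
COST_PER_HOUR = {"web": 0.10, "app": 0.20, "db": 0.40}
CAPACITY_RPS  = {"web": 300,  "app": 150,  "db": 500}

def cheapest_deployment(target_rps, max_servers=4):
    """Return the lowest-cost (web, app, db) server counts whose
    weakest tier still sustains the target request rate."""
    best, best_cost = None, float("inf")
    for w, a, d in product(range(1, max_servers + 1), repeat=3):
        sustained = min(w * CAPACITY_RPS["web"],
                        a * CAPACITY_RPS["app"],
                        d * CAPACITY_RPS["db"])
        cost = (w * COST_PER_HOUR["web"] + a * COST_PER_HOUR["app"]
                + d * COST_PER_HOUR["db"])
        if sustained >= target_rps and cost < best_cost:
            best, best_cost = (w, a, d), cost
    return best, best_cost

print(cheapest_deployment(600))  # -> ((2, 4, 2), 1.8) under these numbers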
We also describe a system bottleneck detection approach based on experimental evaluation of multi-tier applications.
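The following sketch illustrates one simple form such bottleneck detection could take: given per-tier utilization measured at increasing workload levels, flag the tier that saturates first. The utilization figures and the 90% threshold are assumptions made for the example.

# Hypothetical per-tier CPU utilization (%) measured at increasing
# workload levels; the numbers are illustrative, not from the paper.
UTILIZATION = {
    "web": [20, 35, 50, 62, 70],
    "app": [30, 55, 78, 93, 97],  # saturates first in this example
    "db":  [15, 25, 38, 50, 58],
}

def detect_bottleneck(samples, threshold=90.0):
    """Return the tier whose utilization first crosses the threshold,
    i.e. the likely system bottleneck, or None if no tier saturates."""
    first_hit = {}
    for tier, series in samples.items():
        for level, util in enumerate(series):
            if util >= threshold:
                first_hit[tier] = level
                break
    return min(first_hit, key=first_hit.get) if first_hit else None

print(detect_bottleneck(UTILIZATION))  # -> "app" with these numbers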