JSEA Vol. 6 No. 4, April 2013
An Approach to Developing a Performance Test Based on the Tradeoffs from SW Architectures
ABSTRACT

In performance testing, the standards for assessing test results are often insufficiently defined because performance testing lacks the well-structured test-development methods available for functionality testing. By extending an established workflow structure, the proposed approach concentrates on the tradeoffs within the T-workflow and develops performance tests based on it. Monitoring points and tuning points are also investigated to assess the validity and performance of the software under test. Finally, a case study shows that a better assessment of software performance can be obtained with the suggested tests developed from the T-workflow and by locating its monitoring points and tuning points.
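
As a purely illustrative aid, not taken from the paper, the sketch below shows what assessing a monitoring point in a performance test can look like: one workflow step is timed, and the measurement is checked against a predefined standard. The names workflow_step and THRESHOLD_MS, the workload, and the 50 ms threshold are hypothetical placeholders; the paper derives its assessment standards from architectural tradeoffs captured in the T-workflow.

    import time

    # Hypothetical response-time standard (ms) for the monitoring point.
    # The paper would derive such a value from architectural tradeoffs;
    # 50 ms is an arbitrary placeholder.
    THRESHOLD_MS = 50.0

    def workflow_step(payload):
        """Stand-in for one performance-critical step of a workflow."""
        return sorted(payload)  # placeholder workload

    def test_monitoring_point():
        payload = list(range(100_000, 0, -1))
        start = time.perf_counter()                           # monitoring point: entry
        workflow_step(payload)
        elapsed_ms = (time.perf_counter() - start) * 1000.0   # monitoring point: exit
        # Assess the measurement against the predefined standard; a failure
        # here flags the step as a candidate tuning point in the workflow.
        assert elapsed_ms <= THRESHOLD_MS, (
            f"monitoring point exceeded {THRESHOLD_MS} ms: {elapsed_ms:.1f} ms"
        )

    if __name__ == "__main__":
        test_monitoring_point()
        print("monitoring point within threshold")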


Cite this paper
B. Choi, M. Yoon and H. Kim, "An Approach to Developing a Performance Test Based on the Tradeoffs from SW Architectures," Journal of Software Engineering and Applications, Vol. 6 No. 4, 2013, pp. 184-195. doi: 10.4236/jsea.2013.64024.
References
[1]   F. Mattiello-Francisco, E. Martins, A. R. Cavalli and E. T. Yano, “InRob: An Approach for Testing Interoperability and Robustness of Real-Time Embedded Software,” Journal of Systems and Software, Vol. 85, No. 1, 2011, pp. 3-15.

[2]   S. Balsamo, P. Inverardi and C. Mangano, “An Approach to Performance Evaluation of Software Architectures,” Proceedings of the 1st International Workshop on Software and Performance, Santa Fe, 12-16 October 1998, pp. 178-190. doi:10.1145/287318.287354

[3]   F. Aquilani, S. Balsamo and P. Inverardi, “Performance Analysis at the Software Architectural Design Level,” Performance Evaluation, Vol. 45, No. 4, 2001, pp. 147-178.

[4]   E. J. Weyuker and F. I. Vokolos, “Experience with Performance Testing of Software Systems: Issues, an Approach, and Case Study,” IEEE Transactions on Software Engineering, Vol. 26, No. 12, 2000, pp. 1147-1156.

[5]   R. Kazman, M. Klein, M. Barbacci, T. Longstaff, H. Lipson and J. Carriere, “The Architecture Tradeoff Analysis Method,” Proceedings of the 4th International Conference on Engineering of Complex Computer Systems (ICECCS ’98), Monterey, 10-14 August 1998, pp. 68-78.

[6]   C. W. Ho and L. Williams, “Deriving Performance Requirements and Test Cases with the Performance Refinement and Evolution Model (PREM),” North Carolina State University, Raleigh, 2006, Technical Report No. TR-2006-30.

[7]   F. I. Vokolos and E. J. Weyuker, “Performance Testing of Software Systems,” Proceedings of the 1st International Workshop on Software and Performance, Santa Fe, 12-16 October 1998, pp. 80-87. doi:10.1145/287318.287337

[8]   D. Draheim, J. Grundy, J. Hosking, C. Lutteroth and G. Weber, “Realistic Load Testing of Web Applications,” Proceedings of the 10th European Conference on Software Maintenance and Reengineering (CSMR ’06), Bari, 22-24 March 2006, pp. 57-70.

[9]   Y. Y. Gu and Y. J. Ge, “Search-Based Performance Testing of Applications with Composite Services,” International Conference on Web Information Systems and Mining, Shanghai, 7-8 November 2009, pp. 320-324. doi:10.1109/WISM.2009.73

[10]   C. D. Grosso, G. Antoniol, M. Di Penta, P. Galinier and E. Merlo, “Improving Network Applications Security: A New Heuristic to Generate Stress Testing Data,” Proceedings of the 2005 Conference on Genetic and Evolutionary Computation (GECCO ’05), H.-G. Beyer, Ed., ACM, New York, 2005, pp. 1037-1043.

[11]   C. U. Smith, “Performance Engineering of Software Systems,” Addison-Wesley, Boston, 1990.

[12]   YAFFS (Yet Another Flash File System). http://www.yaffs.net/

[13]   IEEE Std 610.12-1990, “IEEE Standard Glossary of Software Engineering Terminology,” IEEE, New York, 1990.
