Mathematical Problems in Engineering
Volume 2015, Article ID 690829, 9 pages
http://dx.doi.org/10.1155/2015/690829
Research Article

Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

Junzan Zhou and Shanping Li

College of Computer Science and Technology, Zhejiang University, Hangzhou 310012, China

Received 15 February 2015; Revised 7 May 2015; Accepted 11 May 2015

Academic Editor: Evangelos J. Sapountzakis

Copyright © 2015 Junzan Zhou and Shanping Li. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
