Advances in Software Engineering
Volume 2012 (2012), Article ID 964064, 13 pages
http://dx.doi.org/10.1155/2012/964064
Research Article

Evaluating the Effect of Control Flow on the Unit Testing Effort of Classes: An Empirical Analysis

Mourad Badri and Fadel Toure

Software Engineering Research Laboratory, Department of Mathematics and Computer Science, University of Quebec at Trois-Rivières, Trois-Rivières, QC, Canada G9A 5H7

Received 25 November 2011; Revised 18 March 2012; Accepted 28 March 2012

Academic Editor: Filippo Lanubile

Copyright © 2012 Mourad Badri and Fadel Toure. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
