- About this Journal ·
- Abstracting and Indexing ·
- Advance Access ·
- Aims and Scope ·
- Annual Issues ·
- Article Processing Charges ·
- Articles in Press ·
- Author Guidelines ·
- Bibliographic Information ·
- Citations to this Journal ·
- Contact Information ·
- Editorial Board ·
- Editorial Workflow ·
- Free eTOC Alerts ·
- Publication Ethics ·
- Reviewers Acknowledgment ·
- Submit a Manuscript ·
- Subscription Information ·
- Table of Contents
Mathematical Problems in Engineering
Volume 2013 (2013), Article ID 165029, 11 pages
A Quantitative Assessment Approach to COTS Component Security
1School of Computer Science and Telecommunication Engineering, Jiangsu University, Zhenjiang 212013, China
2School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China
3School of Software and Communication Engineering, Jiangxi University of Finance and Economics, Nanchang 330013, China
Received 28 August 2012; Revised 26 December 2012; Accepted 31 December 2012
Academic Editor: Huaguang Zhang
Copyright © 2013 Jinfu Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
The vulnerability of software components hinders the development of component technology. An effective assessment approach to component security level can promote the development of component technology. Thus, the current paper proposes a quantitative assessment approach to COTS (commercial-off-the-shelf) component security. The steps of interface fault injection and the assessment framework are given based on the internal factors of the tested component. The quantitative assessment algorithm and formula of component security level are also presented. The experiment results show that the approach not only can detect component security vulnerabilities effectively but also quantitatively assess the component security level. The score of component security can be accurately calculated, which represents the security level of the tested component.
In the area of software engineering, component-based software engineering (CBSE) has currently become a research point. The various development technologies of components further improve the development efficiency and performance of components, but the problems on the reliability and security of components have not yet been resolved. Current component testing approaches have focused on component functionality testing, which, to some extent, guarantees the correctness and integrity of component functionality. However, component security testing is rarely researched as a special subject. Component security testing mainly involved in testing the vulnerabilities of component which can be exploited to crack software component. There are some general component vulnerabilities such as memory leakage, stack overflow, access beyond boundary, and access protected page. In addition, the special difficulties for the security testing of COTS component are as follows. Firstly, the source codes are unavailable. Secondly, The general security pattern is difficult to attain by analyzing the general classification model of software vulnerability. Finally, The COTS component is very independent, which results in some problems such as gaining the effective component information and running the independent component. On the other hand, the assessment of component security has not been solved. Research on the quantitative assessment of component security level is rather rare. In the field of software testing, quantitatively assessing the security level of the component system not only guarantees and enhances the system’s reliability and security but also conducts a quantitative assessment level to software developers and facilitates users in selecting software products based on its security to reduce the risk of attack.
Nevertheless, component security vulnerabilities are usually studied, but the quantitative assessment of component security is not common. Most scholars only research on the theoretical description of security assessment, but practical implementation methods are not presented. A basic assessment approach was given in the literature . Based on the knowledge of information security, the main approach is dividing the component security vulnerability into several ranks . The security level of components is assessed using the software testing technology. The relevant experiments were conducted but were not studied in depth.
Enough internal information should be collected before assessing the security level for the software component. But, it is usually very difficult to obtain the internal information for COTS (commercial-off-the-shelf) components, because its source codes are unavailable. In order to obtain more information about COTS component, it is necessary that some data are inputted the tested component to drive the component running. When the tested component is running, some internal information including some security information can be collected by monitoring the running state of the tested component. The data-driven method can obtain internal security information for COTS component. In the current research, we propose a feasible approach to assess the component security level quantitatively through using data-driven method and learning from relevant statistical and probabilistic methods. The exceptional component information monitored by the dynamic monitoring system can be obtained by the testing system of component vulnerability . Relevant security factors are assigned to appropriate security exceptions according to the classification of the vulnerability in the Common Vulnerabilities and Exposures (CVE) vulnerability library. Each of the exceptional vulnerability factor values can be obtained from the triggering causes of software security vulnerabilities, which is a harmful degree combined with the characteristics of the component security vulnerabilities. Finally, six commercial components, in which security vulnerabilities exist, are collected for experimental analysis in the CVE Web site and are assessed by the proposed approach. The number of each interface and methods used for each interface can be obtained by dynamically executing the tested components. The executed probabilities of each interface and methods used for each interface can be determined by statistical method. At the same time, fault injection operators and the logical prediction rules of exceptional database (PRED) are proposed according to the component security exceptions commonly used. This approach not only can assess the component security level but also effectively assess the security level of each interface, thus providing better assessment capabilities.
2. Related Work
At present, commercial-off-the-shelf (COTS) components, such as military, mechanization, financial components, and so on, are used by more and more commercial software including some crucial security software. COTS security takes an important part in component systems because the security of the whole software system is determined by it. Therefore, assessing the security level of the component is used to test the security of the entire software system, which has become an essential part in component testing. To the best of our knowledge, research on assessing the software component security mainly focuses on component security requirements and assessment approaches, vulnerability assessment, and quantitative risk assessment.
Han and Zheng  and Jabeen and Jaffar-Ur Rehman  studied the security requirements and security assessment methods of components but did not give the specific relevant assessment methods. Md. Khan et al. examined the security testing description and component assessment [6–8]. Models of the user data security, component security framework, and security assessment model of software were proposed. Khan and Han developed an assessment scheme for the security properties of software components. The assessment scheme provided a numeric score that indicates the relative strength of the security properties of the component . However, the approach consists of two prerequisites: (1) a security requirement specification of the candidate component and (2) a component-specific security rating. The approach requires the source codes of candidate component to be available. The security property descriptions of the components based on the security assessment standards of information technology were proposed by Khan et al. . The main attributes of resource allocation, user data protection, and communications were included. The model was based on the active interface, which could provide the basis for reasoning and assessing a component’s security to meet certain security requirements of a particular application. Common criteria (CC) provide a schema for evaluating information system and enumerate the specific security requirements for such systems . The functional requirements in CC describe the security behavior or functions expected of an information system to counter threats, but not all the requirements in CC may be applicable directly to software components because of the distributed nature of software components and their compositional complexity.
Alhazmi and Malaiya proposed a time-based model for the total vulnerabilities and an alternative model which is analogous to the software reliability growth models . However, this model cannot quantitatively assess the security level of software. Zhang et al. presented a vulnerability risk assessment method based on the attack tree model and Bayesian network model . The analysis method can assess the risk probability of the software system but not its security level. The security risk assessment was introduced to the classical MDA (Model-Driven Architecture) framework . The extension offers early assessment and early validation of security requirements. However, the improved framework still has not provided the quantitative mechanism of the software security. A systematic methodology was proposed for some software vulnerability assessment and security function verification . A scalable and adaptable automatic test system was implemented to test hundreds of software releases, but the security assessment of software releases was not implemented. The state of the architecture-based approach to reliability assessment of component-based software was presented . The common requirements of the architecture-based models were identified, and the classification was proposed. However, the approach only gave a framework for the reliability assessment of component-based software.
Idongesit proposed a quantitative risk assessment model based on annualized loss expectancy and the exposure factor, which is the percentage of asset loss due to the potential threat . The model can assess the potential threats to a development effort and rank these threats based on the risk metric calculation. The Common Vulnerability Scoring System (CVSS)  provides an open framework for communicating the characteristics and effects of IT vulnerabilities. Its quantitative model ensures repeatable accurate measurement while enabling users to see the underlying vulnerability characteristics used to generate the scores. However, CVSS measures software vulnerability based on the external factors of software system. Occasionally, the result cannot precisely represent the software vulnerability.
In summary, we propose a new quantitative assessment of component security (QACS) approach to represent more accurately the security level of the COTS component based on the internal factors of software component. The internal factors of software component are some important information inside the component instead of the outside enviornment. There are some main internal factors such as the component objects, component interfaces, component methods, and component attributes. These internal factors reflect more accurate security features for the tested component.
3. A Quantitative Assessment Framework for Component Security
With the further development of component-based software engineering, COTS components have become frequently used in software systems. The source codes of the COTS component are unavailable and highly independent. The traditional testing approach of interface mutation is suitable for system integration testing. That is, mutation operators are only effective at the interface between the module and the subsystem. Therefore, the traditional method is unsuitable for testing the COTS component system. Interface mutation testing, which is suitable for the COTS component, can inject faults at the component interface level by extending the mutation operators in the program mutation and increasing the mutation operators at the component interface level. The defects of components can be identified through observation and analysis of the testing result of the system. Figure 1 shows the quantitative assessment framework of the component security level.
The interface fault injection operators of the COTS component include irregular input parameter operators, mutation operators based on program language, mutation operators based on interface specifications , and mutation operators based on interface definition language (IDL) . The fault injection operators, which are commonly used, are as follows: boundary value parameter, parameter beyond the length, parameter attribute replaced, parameter swap, parameter change, parameter assignment, parameter set null, and so on.
The test cases of fault injection can be generated through the selection of a suitable fault injection operator based on parameter types and number of component methods in the interface information. The test scripts can be generated according to the test cases. The component can be run with the test scripts. The running state can be monitored through the dynamic monitoring mechanism. Component vulnerability of the interface methods can be determined by the running state of the component, security prediction mechanism, prediction rules, and vulnerability factor in various methods. The vulnerable score can be obtained under certain running environments in terms of the vulnerability assessment method of component . The vulnerability score, which can assess the security of component , is called the security level of component.
3.1. Fault Injection Operator of the Component Vulnerability
Definition 1. Fault injection operator is the rule that generates a number of fault injection test cases, namely, . denotes a fault injection test case. A class of fault injection test cases can be implemented by one fault injection operator.
Definition 2. Interface fault injection operation is a binary relation , where is a class of interface parameters, . Fault injection test case, which is implemented by fault injection operator , acts on the interface parameter , where can be short for only if it does not cause confusion.
Whether the generated test cases of fault injection are representative is the key to designing the fault injection operator, that is, whether the generated test cases of fault injection can trigger the component’s security vulnerability as greatly as possible. Two errors, namely, environmental errors and interface parameter errors, can trigger component security vulnerabilities in a component system. The first error, which is discussed in the previous paper , does not repeat it again. The focus of the current paper is the interface parameter fault injection. The errors in the interface parameter include three types: IDL description errors, boundary value errors, and irregular input errors. In detail, they are the method errors defined by the IDL description, all boundary value, and irregular input value errors, respectively. Based on the cause of triggering software security vulnerabilities in practical experience, 18 fault injection operators of component vulnerability are specifically designed by comprehensively analyzing the interface mutation operators and the operators of parameter fault injection. These operators are PAS (Parameter Attribute Swap), PSN (Parameter Set Null), IPO (Insert Parameter Operator), PFB (Parameter Flip Bit), PRI (Parameter Reverse-probability Input), IIV (integer Irregular Value), FIV (Float Irregular Value), CIV (Char Irregular Value), BIV (Boolean Irregular Value), RSV (Random String Value), LSV (Long String Value), FSV (Format String Value), DSV (Directory String Value), USV (URL and string value of file directory), CSV (System Command String Value), SSV (SQL string injection), and CSS (Cascading Style Sheet). The new security vulnerabilities found in components can also be added.
In addition, the reverse distribution function of the parameter input domain is generated using the input operator PRI of the parameters of reverse probability distribution. The parameters of reverse probability distribution are then generated. The experimental result shows that component security exception can be effectively triggered through the irregular value of input domain. The input distribution function, which is usually difficult to obtain, is a probability density function. The one dimensional and multidimensional functions are the two commonly used kinds. is assumed the input distribution function, of which the value is in the interval , and the value 0 indicates the probability of selection input field is 0. The algorithm of reverse probability distribution is given in the literature .
A number of models have been designed to facilitate the automatic management of the fault injection operator, such as the automatic learning model of the fault injection operator, adaptive model, topic model, intelligent vulnerable trees, among others. They can automatically increase and manage the operator libraries of fault injection.
3.2. Exceptional Predication Rules and Vulnerability Factors
Definition 3. The component exceptional predication rule is a set of all the possible exceptions that may arise under the exceptional circumstances.
Definition 4. The vulnerability factor of the component represents the degree of the component security vulnerabilities resulting from exceptions. The maximum value is 1, and the minimum value is 0. Table 1 shows the component exceptional predication rules and relative vulnerability factors used to assess the component security level. Each of the factor value of exceptional security vulnerability, which can objectively describe the component security level, is obtained from the triggering causes of software security vulnerabilities, which is a harmful degree combined with the characteristics of component security vulnerabilities based on experience software engineering.
4. Component Security Assessment Algorithm
Suppose is a component, and is a set of interface in . Let represent the execution probability of interface , and is a method properties set of interface . Meanwhile, is the executable probability in method of interface . Suppose , . is the input parameter of the method attribute in . is vulnerable error test cases set and . Δ is all the input set of component . is the reverse probability distribution of . PRED is the logical prediction rules of exceptional database. is the vulnerability factor of exceptional function. Before the vulnerability assessment method of component is given, the following definitions should be introduced first.
Definition 5. Let represent an probability matrix that can be described as , row represents the interface number of component , and column represents the method number of interface . The element states the executable probability for function at interface of component , where .
Definition 6. Interface method coverage (), where (the number of methods executed at least once/the total number methods ) , is used to measure the sufficient degree of the interface method during component testing.
Definition 7. Interface coverage (), where (the number of interfaces executed at least once/the total number interface ) , is used to measure the sufficient degree of the interface test during component testing.
Definition 8. Component test adequacy can be measured by a two-tuple (i.e., and ). Generally, the value is determined by the total score of and , that is, . A greater value of indicates a more comprehensive test; the relationship between and is parallel.
Suppose Sum represents the number of test methods, represents the number of vulnerability methods in component , and represents the vulnerability factors of exceptional method. The value of vulnerability factor is assigned to 0 in case the interface method has no security exceptions. The execution probability of the vulnerability method is labeled . The component vulnerability assessment formula is given as follows:
The value , the security level of the component , can be derived from the Formula (1). The vulnerability assessment formula of the component has the following properties.
Property 9. .
Proof. For any vulnerability method, and can be determined according to the definition of . Therefore, , and are derived, that is, .
Property 9 shows that the value of any component’s vulnerability index is between 0 and 1, with a larger value indicating a higher vulnerability index.
Property 10. .
Proof. For any vulnerability method, and can be determined according to the definition of . Therefore, . In addition, and can be derived from Formula (1). ; therefore, and are derived, that is, . Thus, can proved correct.
Property 10 shows that the vulnerability index of any one component is not less than the peak value of the methods’ vulnerability indexes of the component. All situations of the vulnerability methods of component are synthetically considered by the index of the vulnerability of component.
Based on the component vulnerable assessment method based on the analysis of the interface, the algorithm of the component’s security assessment level is given as follows.
The input probability matrix in Algorithm 1 is not only derived from the component requirement specifications or equal probability but is also statistically calculated after the component is practically executed. The vulnerability factor is assigned to 0 given that the component has no security exceptional interface method. We usually set the value of and from 0 to 1 according to the complexity of tested component and the testing requirement. If the number of interfaces and methods of tested component is no more than 200, we usually let the value of and to be 1 to ensure a complete testing of COTS. In addition, the test cases of interface parameters are generated using the TGSM [20, 21] algorithm. First, the test cases of the interface parameter fault-injection, expressed in matrix form, are generated in terms of fault injection operator. The -factor coverage solution matrix is generated by the TGSM algorithm, which is based on the parameter set of the matrix form. The fault injection test cases are composed of row data of the solution matrix. Test cases generated by the TGSM algorithm not only greatly reduce the scale of the test cases but also have certain coverage. The algorithm can trigger most of the security exceptions with fewer test cases.
is the number of vulnerability methods of the component in the algorithm results. is the vulnerability ratio of the method at the interface of the component . represents the vulnerability level of the component . , the security level of the component , represents the security of the method in the interface of component .
5. Experiment Analysis
To validate the effectiveness of the proposed approach, the approach was implemented in our component security testing system (CSTS) platform . The component security testing and assessing system based on interface fault injection (CSTASboIFT) was implemented using C # language in the Microsoft. NET platform. It is used as a subsystem of the CSTS platform. Figure 2 shows the function flow chart of CSTASboIFT.
In the experiment, six commercial components (Table 2), in which security vulnerabilities exist, were collected for experimental analysis in the CVE Web site. Several group experiments were conducted for these components. A component composed of six interfaces and each interface consisting of eight methods were assumed so that the component could be regarded as a matrix. The first group of experiments tested the condition of assigned to a fixed matrix and divided into interval descending. As changed, the was affected. That is, only the level of vulnerability factors affected the component security level. The second group of experiments tested the condition of assigned to a fixed matrix and divided into interval descending. As changed, was affected. That is, only the executed probability affected the component security level. The last group of experiments considered the effect on , as both vulnerability factor and executed probability simultaneously changed.
5.1. Experiment One
The vulnerability factor value is divided into intervals based on the knowledge of security vulnerability provided by the CVE Web site. represents the executable probability of the method at the interface , represents the vulnerability factors of the method at the interface , and represents the vulnerability level of component . is the security level of component .
The first experiment assumes that is a fixed matrix, which is made up of random values. For every interval, is the same matrix. is divided into 10 intervals: , , , , , , , , , and . The tests were done about ten times with different random values of and , respectively, in the experiment, and then the value was got by calculating the average result. Under the condition that is regarded as a matrix at every interval of , affects the value .
When is equal to , as is a fixed value, the values of and are proportional to the value of . The value of is reversely proportional to . A larger value indicates a lower security level. As the vulnerability number increases, the security level decreases slowly. The results are shown in Figure 3. The testing results show that the proposed assessment Formula (1) is effective.
5.2. Experiment Two
is assumed a fixed matrix made up of random values in the second experiment. Then, is divided into 10 intervals: , , , , , , , , , and . The tests were done about ten times with different random values of and , respectively, in the experiment, and then the value was got by calculating the average result. Under the condition that is regarded as a matrix at every interval of , affects the value of .
When is equal to , as the value of is fixed, the values of and are proportional to the value of . The value of is reversely proportional to . A larger value indicates a lower security level. That is, the component is more insecure. As the number of increases, the decline rate of the security level is initially fast and then becomes slow. The results, as presented in Figure 4, show that the proposed assessment Formula (1) is effective.
5.3. Experiment Three
Both parameters and are assumed to influence the value of in Experiment three. The two parameters are randomly divided into 10 intervals. The effect on the value of when the two parameters changed simultaneously was observed. The results show that when is proportional to , the value of changes significantly as shown in (a) and (b) of Figure 5. When is inversely proportional to , the value of changes indistinctly as shown in (c) and (d) of Figure 5. As the vulnerability number cnt increases, the security level decreases. Results are shown in Figure 5. The values of and are based on the main -axis values (the left axis), and the value of the is based on the secondary -axis values (the right axis). The testing results show that the proposed assessment Formula (1) is suitable for assessing the COTS component.
5.4. Experiment Four
During the system testing process, the test cases of fault injection can be implemented by testing the component information and fault injection operators. The testing state can be recorded through the dynamic monitoring mechanism after running the tested component. The vulnerability type of the component method and vulnerability factor can be derived from comparing the testing state with the component information record in PRED. The execution probability of each method in the component can be determined by running the tested component 100 times, respectively. Then, the component security level can be determined using Formula (1). The testing results are shown in Table 3. Because different components have different method parameters which have different parameter types such as integer, string, and boolean, we select suitable fault injection factors to test the component security based on the above presented 18 fault injection operators of component in Table 3.
A higher level of security indicates better relative security of the component. The component is secure if and only if the component security level is 1. Otherwise, it has security vulnerability. In contrast to the test results in Table 3, the level of security is closer to the vulnerability information published in the CVE Web site. The values of execution probability and vulnerability factors have been proved to affect the component security. The results agree with the previous experimental results, further confirming the feasibility and effectiveness of this approach.
5.5. Experimental Comparison
Table 4 shows the comparison with other assessment approaches. The related assessment approaches have various merits as shown in the analysis of Table 4. Our QACS approach mainly focuses on quantitatively assessing the security level of the COTS component. The source codes of the COTS component are generally unknown. The internal factors of the component are considered in QACS. The QACS approach is implemented in the CSTASboIFT that is integrated into our CSTS  system developed to test the security of the COTS component.
Aside from the above-mentioned approaches, CVSS is a widely used quantitative assessment approach for software component. Khan and Han developed an assessment scheme for the security properties of software components. The assessment scheme provided a numeric score that indicates the relative strength of the security properties of the component . Results of the comparison with the CVSS approach and Khaled approach are shown in Table 5. In addition, the security level of six tested components is also listed according to the reported empirical data of security Web sites in Table 5. The initial CVSS Base Score is calculated by the CVSS formula .
Analyzing Table 5, the QACS approach is found to be close to the empirical data reported on some security Web sites regarding the security of the tested COTS components. The security score of the QACS approach is more accurate than that of others because the internal factors of the tested component are considered. The assessment process of the QACS approach is objective, without human intervention. Nevertheless, the CVSS approach is mainly focused on assessing the vulnerability of the general software component. If an assessor is not familiar with the assessed software, the vulnerability score becomes inaccurate because of the assessor’s subjectivity in the assessment process. On the other hand, the Khaled approach is an approximate assessment approach based on the assumed component environment. In addition, presently there are no more effective assessment approaches for the security level of software component according to our studies.
In component-based software development and maintenance activities, the assessment of the component reliability and security level is essential in the third-party component. Research on an effective quantitative assessment method for component security level is valuable in the CBSE. The current paper proposes the QACS approach for quantitatively assessing component security, which is based on the interface fault injection of the component. The fault injection operators of the component vulnerability and PRED are defined and discussed. The assessment framework and algorithm for component security are proposed as well. The testing effect proves excellent in practical application. QACS has the following main advantages: (1) the security exceptional assessment rules and vulnerability factors of the component are given, providing the basis for security assessment; (2) QACS can assess the security level of the third-party component, and the score of component security can be accurately calculated; (3) QACS has better operability.
We would like to thank Professor T. Y. Chen for his constructive suggestions and valuable discussion in this paper writing. In addition, this work was supported partly by the National Natural Science Foundation of China (NSFC) under Grants no. 61202110 and no. 61063013, Natural Science Foundation of Jiangsu Province under Grant no. BK2012284, and the Research Fund for the Doctoral Program of Higher Education of China under Grant no. 20103227120005.
- J. Chen, Y. Lu, and X. D. Xie, “Component security testing approach by using interface fault injection,” Journal of Chinese Computer Systems, vol. 31, no. 6, pp. 1090–1096, 2010.
- S. Gudder and R. Greechie, “Uniqueness and order in sequential effect algebras,” International Journal of Theoretical Physics, vol. 44, no. 7, pp. 755–770, 2005.
- I. O. Mkpong-Ruffin, Quantitative risk assessment model for software security in the design phase of software development, [Ph.D. thesis], Auburn University, Auburn, Ala, USA, 2009.
- J. Han and Y. Zheng, “Security characterization and integrity assurance for component-based software,” in Proceedings of the International Conference on Software Methods and Tools (SMT '00), pp. 61–66, IEEE Computer Society, Wollongong, Australia, 2000.
- F. Jabeen and M. Jaffar-Ur Rehman, “A framework for object oriented component testing,” in Proceedings of the IEEE International Conference on Emerging Technologies, (ICET '05), pp. 451–460, Islamabad, Pakistan, September 2005.
- K. Md. Khan, J. Han, and Y. Zheng, “Characterizing user data protection of software components,” in Proceedings of the Software Engineering Conference, pp. 3–11, Canberra, Australian, 2000.
- K. M. Khan and J. Han, “A security characterisation framework for trustworthy component based software systems,” in Proceedings of the 27th Annual International Computer Software and Applications Conference (COMPSAC '03), pp. 164–169, November 2003.
- K. M. Khan and J. Han, “Assessing security properties of software components: a software engineer's perspective,” in Proceedings of the Australian Software Engineering Conference (ASWEC '06), pp. 199–208, Sydney, Australia, April 2006.
- K. Khan, J. Han, and Y. Zheng, “A framework for an active interface to characterise compositional security contracts of software components,” in Proceedings of the Australian Software Engineering Conference (ASWEC '01), pp. 117–126, 2001.
- “Common criteria project/ISO. Common criteria for information technology security evaluation,” version 2.1 (ISO/IEC International Standard 15408). NIST, USA and ISO, Switzerland, 1999, http://csrc.nist.gov/cc/.
- O. H. Alhazmi and Y. K. Malaiya, “Quantitative vulnerability assessment of systems software,” in Proceedings of the Annual Reliability and Maintainability Symposium (RAMS '05), pp. 615–620, January 2005.
- Y.-K. Zhang, S.-Y. Jiang, Y.-A. Cui, B. W. Zhang, and H. Xia, “A qualitative and quantitative risk assessment method in software security,” in Proceedings of the 3rd International Conference on Advanced Computer Theory and Engineering, (ICACTE '10), pp. V1534–V1539, Chengdu, China, August 2010.
- X. Tang and B. Shen, “Extending model driven architecture with software security assessment,” in Proceedings of the 3rd IEEE International Conference on Secure Software Integration Reliability Improvement, (SSIRI '09), pp. 436–441, Shanghai, China, July 2009.
- X. Wang, H. Shi, T. Y. W. Huang, and F. C. Lin, “Integrated software vulnerability and security functionality assessment,” in Proceedings of the 18th IEEE International Symposium on Software Reliability Engineering (ISSRE '07), pp. 103–108, Trollhattan, Sweden, November 2007.
- K. Goševa-Popstojanova and K. S. Trivedi, “Architecture-based approach to reliability assessment of software systems,” Performance Evaluation, vol. 45, no. 2-3, pp. 179–204, 2001.
- “NVD Common Vulnerability Scoring System (CVSS),” http://nvd.nist.gov/cvss.cfm.
- Y. Jiang, G. M. Xin, J.-H. Shan, L. Zhang, B. Xie, and F.-Q. Yang, “Method of automated test data generation for web service,” Chinese Journal of Computers, vol. 28, no. 4, pp. 568–577, 2005.
- M. E. Delamaro, J. C. Maldonado, and A. P. Mathur, “Interface mutation: an approach for integration testing,” IEEE Transactions on Software Engineering, vol. 27, no. 3, pp. 228–247, 2001.
- J. M. Voas and K. W. Miller, “Predicting software's minimum-time-to-hazard and mean-time-to-hazard for rare input events,” in Proceedings of the 6th International Symposium on Software Reliability Engineering, pp. 229–238, October 1995.
- J. Chen, Y. Lu, and X. Xie, “A fault injection model of component security testing,” Computer Research and Development, vol. 46, no. 7, pp. 1127–1135, 2009.
- J. Chen, Y. Lu, W. Zhang, and X. D. Xie, “A fault injection model-oriented testing strategy for component security,” Journal of Central South University of Technology, vol. 16, no. 2, pp. 258–264, 2009.
- J. Chen, Y. Lu, X. Xie, et al., “Design and implementation of an automatic testing platform for component security,” Computer Science, vol. 35, no. 12, pp. 229–233, 2008.