Abstract

Biogeography-based optimization (BBO) is a competitive new population-based algorithm inspired by biogeography. It simulates the migration of species in nature to share information among solutions. A new hybrid BBO (HBBO) is presented in this paper for constrained optimization. By reasonably combining the differential evolution (DE) mutation operator with the simulated binary crossover (SBX) of genetic algorithms (GAs), a new mutation operator is proposed to generate promising solutions instead of the random mutation in basic BBO. In addition, DE mutation is also applied to update one half of the population to further steer the evolution towards the global optimum, and chaotic search is introduced to improve the diversity of the population. HBBO is tested on twelve benchmark functions and four engineering optimization problems. Experimental results demonstrate that HBBO is effective and efficient for constrained optimization and that, in contrast with other state-of-the-art evolutionary algorithms (EAs), its performance is better or at least comparable in terms of the quality of the final solutions and computational cost. Furthermore, the influence of the maximum mutation rate is also investigated.

1. Introduction

With the development of science and engineering, related optimization problems become more and more complex. Optimization methods are being confronted with great challenges brought by some undesirable but unavoidable characteristics of optimization problems such as being high-dimensional, nondifferentiable, nonconvex, noncontinuous, and so on. Efficient optimization methods are urgently required by the complicated optimization problems in the real world. Therefore, various evolutionary algorithms have been applied to solve difficult optimization problems in recent decades, which include GAs [1], particle swarm optimization approach (PSO) [2], DE [3, 4], ant colony optimization (ACO) [5], artificial bee colony strategy (ABC) [6, 7], and BBO [8, 9].

Biogeography-based optimization (BBO) is a new population-based algorithm. It simulates the emergence and extinction of species in different habitats based on the mathematical models of biogeography. In the migration operation, decision variables of better solutions tend to be shared; in the mutation operation, decision variables of each solution are probabilistically replaced to improve the diversity of the population. Owing to its good search ability, BBO has been applied to PID parameter tuning [10], parameter estimation of chaotic systems [11], complex system optimization [12], satellite image classification [13], and so forth.

In comparison with other EAs, the exploration ability of BBO is not very efficient, owing to its direct-copying-based migration and random mutation, despite its outstanding exploitation. In other words, BBO can easily be trapped in a local optimum and suffer from premature convergence for lack of exploration to balance its exploitation.

In order to overcome the weakness of BBO, many improved BBO variants have been proposed. Ma [14] presented six different migration mathematical models and made basic improvements on the migration operator. Gong et al. [15] hybridized DE with the migration operator of BBO to balance the exploration and exploitation of BBO. Motivated by the blended crossover operator in GAs, Ma and Simon [16] obtained the decision variables of an offspring by blending the corresponding decision variables of two parent individuals with different weighting constants. Li and Yin [17] proposed a multiparent migration in which three consecutive individuals are chosen to generate three new individuals by basic BBO, and the new individuals are then modified as in multiparent crossover in GAs; besides, the mutation was improved based on a Gaussian operator. Li et al. [18] updated the decision variables not selected in migration by generating a perturbation from the neighborhood solutions, and a Gaussian operator was integrated into mutation. Wang and Xu [11] integrated the DE mutation operator into the migration operator of BBO, and simplex search was introduced to improve the search accuracy. Sayed et al. [10] formed new decision variables of an offspring by combining corresponding decision variables from two different parents with weighting constants related to the rank of their fitness in migration. Boussaïd et al. [19] proposed a new hybrid BBO in which new solutions are first generated by DE mutation and then modified by the migration of the original BBO. Xiong et al. [20] utilized the features of four individuals to construct a new solution in the proposed polyphyletic migration, and orthogonal learning was introduced to further enhance the convergence speed toward the global optimum.

In order to balance the exploration and exploitation of BBO, a new hybrid BBO, called HBBO, is proposed in this paper. The unique points of HBBO are as follows. On one hand, a new hybrid mutation operator combining DE mutation and SBX is presented in HBBO, whereas in most BBO variants the operators of other EAs are hybridized with the migration operator. On the other hand, HBBO provides a new way to extend BBO to constrained problems, for which only a few BBO variants are available in the previous literature. In addition, DE is applied to evolve one half of the population to further improve convergence speed, and chaotic search is introduced to enhance the diversity of the population. Experiments have been conducted on twelve benchmark functions and four engineering optimization problems, and HBBO is compared with many other state-of-the-art algorithms in terms of the quality of the solutions obtained and the computational cost. Furthermore, the influence of the maximum mutation rate on HBBO is studied.

The rest of the paper is organized as follows. Constrained optimization, basic BBO, mutation strategies of DE, and SBX are briefly introduced in Section 2. In Section 3, the HBBO method proposed in the paper is specifically depicted. The comparison with six state-of-the-art algorithms on twelve benchmark functions is presented in Section 4. In Section 5, HBBO is compared with other methods on four well known engineering optimization problems. Section 6 further demonstrates the efficiency of HBBO and presents the investigation on the influence of maximum mutation rate. Finally, the work is concluded in Section 7.

2. Preliminary

2.1. Constrained Optimization

Constrained optimization problems are inevitable in scientific study and engineering design. A general constrained optimization problem can be written as follows:

minimize f(X), X = (x_1, x_2, ..., x_D),
subject to g_j(X) ≤ 0, j = 1, ..., q,
           h_j(X) = 0, j = q + 1, ..., m, (1)

where X represents the solution vector, D is the dimensionality of a solution in this paper, q is the number of inequality constraints, and m − q is the number of equality constraints. In common practice, equality constraints are often transformed into inequality constraints with a given small tolerance δ; for example, the equality constraint h_j(X) = 0 above can be converted to |h_j(X)| − δ ≤ 0. In this paper, the feasibility-based rule by Deb [21] is applied to handle constraints. In this constraint-handling mechanism, the fitness value and the constraint violation are considered separately based on the following criteria: (a) any feasible solution is preferred to any infeasible solution; (b) between two feasible solutions, the one having the smaller objective function value is preferred; (c) between two infeasible solutions, the one having the smaller constraint violation is preferred.
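As an illustration, the following is a minimal Python sketch of this constraint-handling scheme (the helper names total_violation and deb_better are hypothetical), assuming each candidate carries an objective value and an aggregated constraint violation computed with the tolerance-relaxed equality constraints:

# Minimal sketch of Deb's feasibility-based rule (hypothetical helper names).
def total_violation(g_values, h_values, delta=1e-3):
    """Sum of constraint violations; equality constraints are relaxed to |h| - delta <= 0."""
    v = sum(max(0.0, g) for g in g_values)                 # inequality constraints g(X) <= 0
    v += sum(max(0.0, abs(h) - delta) for h in h_values)   # relaxed equality constraints
    return v

def deb_better(f1, v1, f2, v2):
    """Return True if candidate 1 is preferred over candidate 2 under Deb's rule."""
    if v1 == 0.0 and v2 == 0.0:      # (b) both feasible: smaller objective wins
        return f1 <= f2
    if v1 == 0.0 or v2 == 0.0:       # (a) feasible beats infeasible
        return v1 == 0.0
    return v1 <= v2                  # (c) both infeasible: smaller violation wins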

2.2. Biogeography Based Optimization

Biogeography is the study of the distribution of species over the earth's surface and over time. BBO was proposed by Simon in 2008 based on the mathematical models of biogeography [8]. In BBO, every solution is analogous to a habitat; the habitat suitability index (HSI) is used to measure habitats, just like the fitness function in other EAs; the features that characterize habitability are called suitability index variables (SIVs), which correspond to the decision variables in other EAs. A good solution is analogous to a habitat with high HSI, which hosts a large number of species, and vice versa. Species in habitats with high HSI tend to emigrate to habitats with low HSI. That is, habitats with high HSI tend to share their features, while habitats with low HSI are inclined to accept features from good habitats.

In BBO, each individual evolves through the migration and mutation operators. The SIVs of individuals are probabilistically shared in the migration operator as shown in Algorithm 1, where H_i(j) is the jth SIV of the ith individual H_i in the population, λ_i represents the immigration rate of H_i, and μ_k is the emigration rate of H_k; both rates are related to the number of species in the corresponding habitat. NP is the population size in this paper.

H_i ← target individual for migration
For j = 1 to D do
 Select H_i with probability proportional to λ_i
 If H_i is selected
  For k = 1 to NP do
   Select H_k with probability proportional to μ_k
   If H_k is selected
    Replace H_i(j) with H_k(j)
   End if
  End for
 End if
End for

The following mathematical model is applied to calculate the immigration rate and the emigration rate owing to its outstanding performance in [22]:

λ_k = I (1 − k/n),   μ_k = E (k/n), (2)

where k is the number of species in habitat H_k, I and E are, respectively, the maximum values of the immigration and emigration rates, and n is equal to NP.
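For concreteness, the following Python sketch (hypothetical function names) implements the linear migration model of (2) together with one pass of Algorithm 1, assuming the population is ranked so that rank 0 is the best habitat and therefore hosts the largest number of species:

import random

def migration_rates(NP, I=1.0, E=1.0):
    # habitat with rank r hosts k = NP - r species (best habitat -> most species)
    lam = [I * (1.0 - (NP - r) / NP) for r in range(NP)]   # immigration rates
    mu  = [E * ((NP - r) / NP) for r in range(NP)]         # emigration rates
    return lam, mu

def migrate(pop, lam, mu):
    NP, D = len(pop), len(pop[0])
    island = [list(h) for h in pop]                        # offspring population
    for i in range(NP):
        for j in range(D):
            if random.random() < lam[i]:                   # immigrate into habitat i?
                # roulette-wheel selection of the emigrating habitat by mu
                pick, acc, k = random.uniform(0.0, sum(mu)), 0.0, 0
                for k, m in enumerate(mu):
                    acc += m
                    if acc >= pick:
                        break
                island[i][j] = pop[k][j]                   # copy the SIV from habitat k
    return island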

In the mutation operator, it is probabilistically decided, according to the mutation rate, whether or not each SIV in a solution is replaced by a randomly generated SIV. The details of the mutation operator are shown in Algorithm 2 (a minimal code sketch follows it). The mutation rate can be calculated as follows:

m_i = m_max (1 − P_i / P_max), (3)

where P_max = max_i P_i, P_i represents the prior probability of existence for the ith individual, and m_max is a user-defined parameter which represents the maximum mutation rate.

H_i ← target individual for mutation
For j = 1 to D do
 Select H_i(j) with probability m_i
 If H_i(j) is selected
  Replace it with a randomly generated SIV
 End if
End for
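The sketch below (hypothetical names, assuming the prior probabilities P_i are already available, for example from the species-count model of [8]) illustrates the mutation rate of (3) and the random replacement of SIVs in Algorithm 2:

import random

def mutation_rates(P, m_max=0.8):
    # mutation rate is inversely related to the prior probability of existence
    P_max = max(P)
    return [m_max * (1.0 - p / P_max) for p in P]

def mutate(habitat, m_i, lower, upper):
    # each SIV is replaced with probability m_i by a uniformly random value in its bounds
    return [random.uniform(lower[j], upper[j]) if random.random() < m_i else habitat[j]
            for j in range(len(habitat))]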

More details about basic BBO can be found in [8, 23].

2.3. Differential Evolution

The DE algorithm is a population-based stochastic search method proposed by Storn and Price in 1997 [3]. Owing to its simple structure, few parameters, ease of use, and fast convergence, DE has been widely applied in various fields. DE generates new individuals by perturbing a randomly chosen individual with the weighted differences of some pairs of distinct individuals. Only when an offspring outperforms its corresponding parent does it survive as the parent for the next generation. The mutation operator is the most important part of DE. Here, only three widely used mutation strategies are briefly introduced as follows:

rand/1:

V_i = X_r1 + F (X_r2 − X_r3), (4)

best/1:

V_i = X_best + F (X_r1 − X_r2), (5)

rand to best/1:

V_i = X_i + F (X_best − X_i) + F (X_r1 − X_r2), (6)

where X_best is the best individual in the population; r1, r2, and r3 are mutually different indices uniformly distributed in the range [1, NP] and different from i; F is the mutation scaling factor; V_i represents the new individual generated by the mutation operator.
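A compact sketch of the three strategies, assuming the population is stored as an (NP, D) NumPy array, might look as follows:

import numpy as np

def de_mutants(pop, best, i, F=0.5):
    # draw three mutually different indices, all different from i
    NP = len(pop)
    r1, r2, r3 = np.random.choice([k for k in range(NP) if k != i], 3, replace=False)
    rand_1       = pop[r1] + F * (pop[r2] - pop[r3])                    # rand/1, eq. (4)
    best_1       = best    + F * (pop[r1] - pop[r2])                    # best/1, eq. (5)
    rand_to_best = pop[i]  + F * (best - pop[i]) + F * (pop[r1] - pop[r2])  # eq. (6)
    return rand_1, best_1, rand_to_best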

2.4. SBX of GA

Genetic algorithms simulate the evolutionary process in nature to solve optimization problems. In the GA considered here, some good individuals are chosen based on Deb's feasibility-based rule. Different individuals can share information in the crossover operator. SBX is one of the most popular crossover operators; it explores the neighborhood of the parent individuals as follows:

c_1,j = 0.5 [(1 + β_j) p_1,j + (1 − β_j) p_2,j], (7)
c_2,j = 0.5 [(1 − β_j) p_1,j + (1 + β_j) p_2,j], (8)

where c_i,j is the jth decision variable of the ith offspring individual and p_i,j is the jth decision variable of the ith selected parent individual. β_j can be obtained from a random number u_j in [0, 1) based on (9), where η_c is the distribution index for crossover. The details of SBX can be found in [24]. Consider the following:

β_j = (2 u_j)^(1/(η_c+1)),             if u_j ≤ 0.5,
β_j = (1 / (2 (1 − u_j)))^(1/(η_c+1)), otherwise. (9)
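For illustration, a minimal sketch of SBX for one pair of parent vectors, following (7)-(9), could be written as follows:

import numpy as np

def sbx(p1, p2, eta_c=1.0):
    # one random number per dimension; u in [0, 1) so the second branch never divides by zero
    u = np.random.rand(len(p1))
    beta = np.where(u <= 0.5,
                    (2.0 * u) ** (1.0 / (eta_c + 1.0)),
                    (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta_c + 1.0)))
    c1 = 0.5 * ((1.0 + beta) * np.asarray(p1) + (1.0 - beta) * np.asarray(p2))   # eq. (7)
    c2 = 0.5 * ((1.0 - beta) * np.asarray(p1) + (1.0 + beta) * np.asarray(p2))   # eq. (8)
    return c1, c2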

3. Proposed Algorithm HBBO

In the mutation operator of basic BBO, SIVs are probabilistically replaced by new, randomly generated SIVs. Although the mutation of BBO can improve the diversity of the population, the random operation makes the search blind. To remedy this defect, a new hybrid mutation operator is proposed, in which the DE mutation operator and SBX are mixed to generate promising SIVs as shown in Algorithm 3 (a compact code sketch follows it). From Algorithm 3, it can be seen that two candidate SIVs are generated for each SIV to be mutated: one is obtained by DE rand/1 mutation and the other by SBX. One point should be stated specially: the base vector X_r1 in DE rand/1 mutation and the two parents in SBX are all randomly selected from the first half of the parent population, which is sorted based on Deb's feasibility rule (better ones in front); X_r2 and X_r3 in DE rand/1 mutation are randomly selected from the whole population. The core idea of the hybrid mutation is based on the following considerations. First, owing to its well-known ability to locate the region of the global optimum, DE mutation can explore new search space with a clearer direction towards the global optimum than the random mutation of the original BBO. Second, SBX can explore the neighborhood of the parent individuals, so it can be combined with DE to explore the search space efficiently. Third, the combination of DE mutation and SBX can balance the exploitation ability of BBO.

H_i ← target individual for mutation
For j = 1 to D do
 If H_i(j) is selected for mutation as in basic BBO
   Get two candidate SIVs for the offspring:
   (1) Get one temp SIV by DE rand/1 mutation;
   (2) If rand < 0.5 (rand is a random number in [0, 1])
      Get another temp SIV by (7);
     Else
      Get another temp SIV by (8);
     End if
 Else
   The jth SIV of the ith individual in population Island survives as the SIV of the offspring
   (population Island contains the new individuals obtained by the migration operator)
 End if
End for
Two temp offspring individuals are obtained for H_i, and the better one survives as the offspring
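The following sketch is one possible reading of Algorithm 3 (all names are hypothetical): the DE-based SIVs are collected into one candidate offspring and the SBX-based SIVs into the other, and the better candidate would then be kept according to Deb's rule.

import numpy as np

# sorted_half: first half of the parent population sorted by Deb's rule (best first);
# pop: the whole parent population; island_i: the ith individual produced by migration.
def hybrid_mutation(island_i, m_i, sorted_half, pop, F=0.5, eta_c=1.0):
    D = len(island_i)
    cand = [list(island_i), list(island_i)]              # two candidate offspring
    for j in range(D):
        if np.random.rand() < m_i:                       # SIV selected for mutation
            # candidate 1: DE rand/1 mutation for this SIV (base from the sorted first half)
            base = sorted_half[np.random.randint(len(sorted_half))]
            r2, r3 = np.random.choice(len(pop), 2, replace=False)
            cand[0][j] = base[j] + F * (pop[r2][j] - pop[r3][j])
            # candidate 2: SBX on two parents drawn from the sorted first half
            i1, i2 = np.random.choice(len(sorted_half), 2, replace=False)
            p1j, p2j = sorted_half[i1][j], sorted_half[i2][j]
            u = np.random.rand()
            beta = (2*u)**(1/(eta_c+1)) if u <= 0.5 else (1/(2*(1-u)))**(1/(eta_c+1))
            if np.random.rand() < 0.5:                   # eq. (7) or (8) with equal chance
                cand[1][j] = 0.5 * ((1+beta)*p1j + (1-beta)*p2j)
            else:
                cand[1][j] = 0.5 * ((1-beta)*p1j + (1+beta)*p2j)
    return cand                                          # keep the better one by Deb's rule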

In order to speed up convergence, DE is further hybridized with BBO. The first half of the parent population also evolves through two DE mutation strategies (rand/1 and rand to best/1), so two new individuals are generated for each individual in the first half. The best one among these two new individuals and the corresponding parent individual survives to replace the corresponding individual in the second half of the parent population.

For convenience and ease of use, the self-adaptation mechanism for the DE mutation scaling factor proposed in [25] is applied, in which each individual is given an independent mutation scaling factor. The self-adaptation mechanism is written as follows:

F_i^(G+1) = F_l + rand_1 · F_u,  if rand_2 < τ,
F_i^(G+1) = F_i^G,               otherwise, (10)

where F_i^(G+1) represents the mutation scaling factor of the ith individual in the (G+1)th generation, F_i^G represents the mutation scaling factor of the ith individual in the Gth generation, F_l and F_u are the lower boundary and the change range of the mutation scaling factor, respectively, τ is the probability of adjusting the factor, and rand_1 and rand_2 are uniformly distributed numbers in [0, 1].
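A minimal sketch of this self-adaptation rule, assuming the jDE-style constants F_l = 0.1, F_u = 0.9, and an adjustment probability of 0.1 as in [25], is given below:

import random

def update_F(F_i, F_l=0.1, F_u=0.9, tau=0.1):
    if random.random() < tau:
        return F_l + random.random() * F_u   # redraw the factor in [F_l, F_l + F_u]
    return F_i                               # otherwise keep the previous factor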

Owing to its ergodicity, inherent stochastic property, and irregularity, chaotic search can reach every state in a given space, so it can help the search escape from local optima and is often integrated into EAs to enhance global search ability. Hence, chaotic search is applied to the first half of the population. In this paper, the logistic map is used to generate chaotic sequences as follows:

cx_(k+1) = μ · cx_k (1 − cx_k), k = 1, 2, ..., NC, (11)

where cx_1, cx_k, and cx_(k+1) are, respectively, the first, the kth, and the (k+1)th elements of the sequence cx; NC represents the size of the population for chaotic search; cx_1 ∈ (0, 1); μ is the control parameter; the sequence cx is chaotic when μ = 4 and cx_1 ∉ {0.25, 0.5, 0.75}. We can apply the following equation to perform chaotic search for the ith individual X_i in the parent population using the vector cx:

X_i' = X_i + R^G · (2 cx_i − 1), (12)

where X_i' is the new individual generated by chaotic search for X_i; R^G represents the search radius vector in the Gth generation; each dimension of R^G represents the search radius for the variables in the corresponding dimension of the solutions in the parent population.

At the early stage of evolution, a large chaotic search radius helps escape from local optima; a small chaotic search radius can improve search accuracy at the later stage. The search radius for the jth decision variable is adapted as follows:

R_j^G = R_j^0 · (1 − G / G_max), (13)

where R_j^0 is the initial search radius for the jth decision variable and G_max is the maximum number of generations.
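The sketch below combines (11)-(13) under the assumptions stated above (logistic map with μ = 4, the chaos variable mapped from [0, 1] to [-1, 1], and a linearly shrinking radius); the function name is hypothetical:

import numpy as np

def chaotic_search(first_half, R0, G, G_max, cx):
    cx = np.asarray(cx, float)
    cx = 4.0 * cx * (1.0 - cx)                        # next logistic-map value per individual
    R_G = np.asarray(R0, float) * (1.0 - G / G_max)   # shrinking search radius per dimension
    trial = np.asarray(first_half, float) + np.outer(2.0 * cx - 1.0, R_G)
    return trial, cx                                  # trial individuals and updated sequence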

In order to keep solutions within bounds, any new decision variable generated in the evolution process should be repaired if it violates its boundaries. Suppose that x_j is the jth decision variable of a new individual generated during the evolution process. If x_j violates the given boundaries, it is modified as follows:

x_j = l_j + randnum_j · (u_j − l_j), (14)

where l_j and u_j are, respectively, the lower and upper boundary of the jth decision variable, and randnum_j is a uniformly distributed number in [0, 1] drawn in each dimension.
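A one-function sketch of this repair rule is shown below:

import random

def repair(x_j, l_j, u_j):
    # a variable outside its bounds is redrawn uniformly within [l_j, u_j]
    if x_j < l_j or x_j > u_j:
        return l_j + random.random() * (u_j - l_j)
    return x_j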

The whole procedure of HBBO is described in Algorithm 4 in detail. From Algorithm 4, it can be seen that operations are mainly concentrated on the first half of population. It can be explained as follows. First, the convergence speed can be improved by focusing operations on the first half. Second, the information of the second half is also utilized in migration and hybrid mutation operator to generate promising solutions. Third, the risk of trapping into stagnation brought by concentration of operations can be relieved by chaotic search. Consequently, the focus of operations can make the search of HBBO efficient.

Generate the initial population POP and vector of mutation scaling factors;
Evaluate the fitness and constraint violations of each individual in POP;
For each generation do
 Sort the individuals in POP based on Deb’s feasibility-based rule (better in front);
 For each one in POP’s first half
   Get two new individuals by two DE mutation strategies (rand/1, rand to best/1);
   Evaluate the fitness value and constraint violations of these two new individuals;
   Among these two new individuals and the corresponding parent individual, the best one is stored into population Tempbest;
 End for
 Update the vector P of prior probabilities;
 For each one in the first half of POP
   Generate a new individual by Algorithm 1, and store it into population Island;
 End for
 For each one in the first half of POP
   Get one offspring by Algorithm 3 and replace the corresponding individual in population Island with it;
 End for
 Perform chaotic search for the first half of POP and store the new individuals generated in population tempIsland;
 Compare the corresponding individuals in Island, tempIsland, and the first half of POP; the best one survives
 as the corresponding individual in POP for the next generation;
 The population Tempbest replaces the second half of POP as the parents for the next generation;
 Update F and R by (10) and (13), respectively;
End for

4. Simulation Tests on Benchmark Functions

4.1. Parameter Setting and Statistical Results Obtained by HBBO

In order to validate the performance of the proposed HBBO on numerical optimization, twelve benchmark test functions are adopted. The selected benchmark problems pose a good challenge and provide a good measure for constrained optimization techniques. The main characteristics of the selected benchmark functions are shown in detail in Table 1, where D is the dimensionality of a solution for the test function, ρ represents the ratio of the feasible region to the search space, NI is the number of nonlinear inequality constraints, LI is the number of linear inequality constraints, NE is the number of nonlinear equality constraints, and a is the number of constraints active at the optimal solution. The metric ρ can be computed as follows:

ρ = |F| / |S|, (15)

where |S| is the number of randomly generated solutions (|S| = 1,000,000 in this paper) and |F| is the number of feasible solutions found among them. All the selected benchmark functions are described explicitly in Appendix A.
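A small Monte Carlo sketch of this estimate (the predicate is_feasible is a hypothetical placeholder for the tolerance-relaxed constraint check of a given problem) is shown below:

import numpy as np

def feasibility_ratio(is_feasible, lower, upper, S=1_000_000):
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    X = lower + np.random.rand(S, len(lower)) * (upper - lower)   # |S| random solutions
    F = sum(1 for x in X if is_feasible(x))                       # feasible solutions found
    return F / S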

For each test function, we performed 30 independent runs in MATLAB 7.0. The parameters of HBBO for the experiments are set as follows: the maximum immigration rate I and the maximum emigration rate E are chosen as recommended in [8]; the maximum mutation rate m_max is set to 0.8, which is much larger than the corresponding value in basic BBO, because a large m_max raises the mutation probabilities of the individuals in the population and enhances population diversity; F_l and F_u are chosen based on the suggestions for the mutation factor of DE in [3] and numerous experiments; the distribution index η_c is chosen in the light of its effect on the search ability of SBX [24].

Through various tests, an appropriate set of population sizes NP for all the selected functions was found with which HBBO presents desirable performance. In this set, the population size NP for each benchmark function is given as follows: 200 for G02, 150 for G07, and 100 for the remaining benchmark functions. In each run, the maximum number of generations is given as follows: 200 for G01 and G06, 150 for G02 and G04, 600 for G03, 1500 for G05, 334 for G07, 40 for G08, 300 for G09 and G11, 550 for G10, and 50 for G12. For G03 and G05, the tolerance value for equality constraints equals 0.001 as recommended in [19]; the tolerance value for the equality constraint of G11 is set to 0.0001 as suggested in [26].

Table 2 summarizes the statistical features of the results obtained by HBBO for the twelve test functions and the number of fitness function evaluations (FFEs) required. From Table 2, we can see that HBBO obtains the optimal solution in all 30 runs for seven benchmark functions (G04, G06, G07, G08, G09, G10, and G12); for G01, HBBO obtains the optimal solution in some runs; the best results obtained by HBBO are very close to the best known solution for G02; for three benchmark functions (G03, G05, and G11), the results obtained by HBBO are very close to the optimal solutions or the best known solutions.

4.2. Comparison with Other State-of-the-Art Methods

In this part, the proposed approach HBBO is compared with six other state-of-the-art optimization techniques.

The six state-of-the-art optimization techniques are as follows: conventional BBO with DE mutation (CBO-DM) [19], hybrid PSO with a DE strategy (PSO-DE) [27], coevolutionary DE algorithm (CDE) [28], changing range genetic algorithm (CRGA) [26], self-adaptive penalty function based algorithm (SAPF) [29], and simple multimembered evolution strategy (SMES) [30]. The statistical results of these six approaches, taken from the original references, are compared with those of HBBO in Table 3. "NA" in the tables of this paper indicates that the results of the compared algorithm are not available. It should be noted that the best results obtained by the algorithms are marked in boldface in the following tables. As far as computational cost is concerned, CBO-DM, SAPF, CDE, and SMES, respectively, need 350,000, 500,000, 248,000, and 240,000 FFEs for all the test functions; PSO-DE needs 70,100 FFEs for G04, 17,600 FFEs for G12, and 140,100 FFEs for the rest of the test functions; CRGA requires 1,350 to 68,000 FFEs, and its computational cost is given in detail in [26].

With respect to CBO-DM, a variant of BBO, HBBO obtains similar results for five functions (G04, G06, G08, G09, and G12); for two functions (G07 and G10), HBBO performs better in terms of the considered metrics (best, mean, and worst); for G02, HBBO obtains a better best value but with greater variability; the results of HBBO are clearly inferior but still comparable for G01; the results obtained by HBBO are only slightly inferior for three test functions (G03, G05, and G11). In addition, the computational cost of HBBO is far less than that of CBO-DM for all the selected benchmark functions except G05. Consequently, HBBO is a powerful competitor of CBO-DM for constrained optimization.

In contrast with the other five state-of-the-art methods, the performance of HBBO is clearly inferior for function G01; HBBO obtains better or similar solutions for the selected test functions except G01, G03, and G11. For G03 and G11, the results obtained by HBBO are only slightly inferior to those of SMES. Furthermore, the computational cost of HBBO is very competitive with respect to the other methods for all the selected test functions except G05.

5. Simulation Tests on Engineering Optimization Problems

In this part, four well-known engineering optimization problems are used to validate the performance of HBBO in solving real-world optimization problems. The four engineering optimization problems are the welded beam design problem, the tension/compression spring design problem, the speed reducer design problem, and the three-bar truss design problem, which are listed in Appendix B. The parameters of HBBO for these four engineering optimization problems are as follows: the population size and the maximum number of generations are, respectively, 50 and 200 for the welded beam design problem, 50 and 350 for the tension/compression spring design problem, 100 and 100 for the speed reducer design problem, and 50 and 60 for the three-bar truss design problem; the other parameters of HBBO are set in the same way as in Section 4. For each engineering optimization problem, 30 independent runs are performed. Table 4 shows the statistical results for the four engineering optimization problems solved by HBBO. We evaluate the performance of HBBO with respect to the quality of the results and the computational cost.

In order to demonstrate the superiority of HBBO, it is compared with other state-of-the-art algorithms on the four engineering problems. The welded beam and tension/compression spring design problems have also been attempted by PSO-DE [27], CDE [28], coevolutionary particle swarm optimization (CPSO) [31], the (μ + λ)-evolutionary strategy ((μ + λ)-ES) [32], unified particle swarm optimization (UPSO) [33], and ABC [7]. PSO-DE [27], (μ + λ)-ES [32], and ABC [7] have also been applied to the speed reducer design problem. PSO-DE [27] and Ray and Liew [34] have also been applied to the three-bar truss design problem. The comparison of statistical results and computational cost for the four engineering optimization problems between HBBO and the other algorithms is shown in Tables 5, 6, 7, and 8.

From Tables 5, 6, 7, and 8, it can be seen that HBBO outperforms the other compared algorithms on the given engineering optimization problems except the tension/compression spring design problem, for which PSO-DE has the best performance. For the tension/compression spring design problem, HBBO obtains a similar best result, and its mean and worst results are only slightly inferior to those of PSO-DE.

6. Discussions

In this part, HBBO is compared with the original BBO and self-adapting DE (SADE) to further demonstrate its search efficiency. In addition, the influence of the maximum mutation rate m_max on the search efficiency of HBBO is investigated.

6.1. Comparison with the Original BBO and SADE

The details of the original BBO can be found in [8], and the SADE proposed in [25] is also compared. The specific parameter settings of these two algorithms are the same as in the original references, while the parameters related to the test functions, such as population size and constraint tolerance, are in accordance with Section 4. Deb's feasibility rule is applied in BBO and SADE to handle constraints. HBBO is used in the same way as described in Section 4.

Figure 1 illustrates typical evolution processes of the objective function value of the best solution in the population when four benchmark functions (G02, G03, G07, and G09) are solved by HBBO, BBO, and SADE. From Figure 1, it can be seen that HBBO has the fastest convergence speed, while BBO is often trapped in stagnation. It can therefore be concluded that the exploration and exploitation of BBO are well enhanced and balanced by the new mutation operator and the further hybridization with DE and chaotic search.

6.2. Influence of Maximum Mutation Rate on HBBO

The maximum mutation rate m_max is related to the probability that individuals mutate through the new hybrid mutation operator, so it affects how well the exploration and exploitation of HBBO are balanced. In order to investigate the effect of m_max on the search efficiency of HBBO, m_max is set to different values, namely, 0.05, 0.1, 0.4, 0.8, and 1. The investigation experiments are performed on five benchmark functions (G01, G02, G03, G07, and G10).

The other parameters are in accordance with the description in Section 4. For each value of m_max and each test function, we perform 30 independent runs. The statistical features of the results obtained by HBBO with different m_max are summarized in Table 9.

From Table 9, we can see that the value of m_max has a significant influence on the search efficiency of HBBO. The influence can be stated in three respects. First, HBBO with different values of m_max performs differently on the five test functions. Second, HBBO with a too small or too large m_max cannot solve the selected test functions well, whereas HBBO with a medium value of m_max has better overall performance. Third, the fittest m_max for each test function is different. The influence above may be explained as follows: a too small m_max cannot balance the exploitation of BBO well, whereas a too large m_max will destroy the exploitation of BBO; in addition, different test functions have different characteristics and thus different requirements on the exploration and exploitation of the optimization algorithm.

7. Conclusions

This paper proposes a new hybrid biogeography-based optimization (HBBO) for constrained optimization. In the presented algorithm, a new mutation operator is proposed to generate promising solutions by merging DE mutation with SBX, and one half of the population also evolves by two DE mutation strategies. Chaotic search is introduced to escape from stagnation, and Deb's feasibility-based rule is applied to handle constraints. Furthermore, the self-adaptation mechanism for the DE mutation scaling factor is used to avoid the trouble of choosing an appropriate parameter.

Simulation experiments were performed on twelve benchmark test functions and four well-known engineering optimization problems. HBBO obtains better or comparable results in contrast with other state-of-the-art optimization techniques. At the same time, low computational cost is an obvious advantage of HBBO. In short, HBBO is an effective and efficient method for constrained optimization. In addition, the influence of the maximum mutation rate was investigated, and the results demonstrate that HBBO with a medium value of the maximum mutation rate has better overall performance.

The maximum immigration rate I and the maximum emigration rate E affect the migration probability and the prior probability of existence for the individuals in the population. How to select the fittest I and E for HBBO will be one possible focus of our future research. Besides, HBBO will be extended to multiobjective optimization.

Appendices

A. Benchmark Test Functions

A.1. G01

where the bounds are (), (), and . The global optimum is at with

A.2. G02

where and (). The global optimum is unknown; the best objective function value reported is .

A.3. G03

where and   . The optimum solution is with .

A.4. G04

where , , and (). The optimum solution is with .

A.5. G05

where , , , and . The best known solution is , where .

A.6. G06

where and . The optimum solution is with .

A.7. G07

where (). The optimum solution is (2.171996, 2.363683, 8.773926, 5.095984, 0.9906548, 1.430574, 1.321644, 9.828726, 8.280092, 8.375927) with .

A.8. G08

where and . The optimal solution is where .

A.9. G09

where . The optimum solution is = (2.330499, 1.951372, −0.4775414, 4.365726, −0.6244870, 1.038131, 1.594227) with .

A.10. G10

where , (), and 10,000 (). The optimum solution is = (579.3066, 1359.9707, 5109.9707, 182.0177, 295.601, 217.928, 286.165, 395.6012) with .

A.11. G11

where and . The optimum solution is with .

A.12. G12

where () and . A point is feasible if and only if there exist , , such that the above inequality holds. The optimum solution is with .

B. Engineering Design Problems

B.1. Welded Beam Design Problem

A welded beam is designed for minimum cost subject to constraints on shear stress, bending stress in the beam, buckling load on the bar, end deflection of the beam, and side constraints. There are four design variables. where , , , , , , , , ,  lb, in,  psi,  psi,  psi,  psi, in, , , , .

B.2. Tension/Compression Spring Design Problem

In this problem, the objective is to minimize the weight of a tension/compression spring subject to constraints on minimum deflection, shear stress, surge frequency, and limits on outside diameter and on design variables. The design variables are the mean coil diameter ; the wire diameter ; and the number of active coils . where , , and .

B.3. Speed Reducer Design Problem

where , , , , , , and .

B.4. Three-Bar Truss Design Problem

where and ;  cm,  kN/cm2, and  kN/cm2.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research is supported by the Specialized Research Fund for the Doctoral Program of Higher Education of China under Grant no. 20120036130001, the Fundamental Research Funds for the Central Universities of China under Grant no. 2014MS93, and the Independent Research Funds of State Key Laboratory of Alternate Electrical Power System with Renewable Energy Sources of China under Grant no. 201414.