Research Article  Open Access
A Study on Many-Objective Optimization Using the Kriging-Surrogate-Based Evolutionary Algorithm Maximizing Expected Hypervolume Improvement
Abstract
The many-objective optimization performance of the Kriging-surrogate-based evolutionary algorithm (EA), which maximizes expected hypervolume improvement (EHVI) for updating the Kriging model, is investigated and compared with those using the expected improvement (EI) and estimation (EST) updating criteria in this paper. Numerical experiments are conducted on 3- to 15-objective DTLZ1–7 problems. In the experiments, an exact hypervolume calculation algorithm is used for problems with fewer than six objectives, whereas an approximate hypervolume calculation algorithm based on Monte Carlo sampling is adopted for problems with more objectives. The results indicate that, in the nonconstrained case, EHVI is a highly competitive updating criterion for Kriging-model- and EA-based many-objective optimization, especially when the test problem is complex and the number of objectives or design variables is large.
1. Introduction
Evolutionary algorithms (EAs) are a class of metaheuristics inspired by the process of natural evolution, and they have been successfully applied to optimization problems with two or three objectives over the past two decades [1]. Recently, EAs have gradually been extended to deal with many-objective problems that involve four or more objectives. Garza-Fabre et al. [2] combined three novel fitness assignment methods with a generic multiobjective evolutionary algorithm (MOEA) and conducted numerical experiments on 5- to 50-objective DTLZ1, DTLZ3, and DTLZ6 problems [3]. It was found that the three proposed methods were effective in guiding the search in high-dimensional objective spaces. Adra and Fleming [4] investigated two new mechanisms for managing diversity in evolutionary many-objective optimization and tested them on 6- to 20-objective DTLZ2 problems. The results indicated that including one of the mechanisms improved the convergence and diversity of existing MOEAs on many-objective optimization problems. Hadka and Reed [5] proposed the Borg MOEA for many-objective, multimodal optimization using a dominance concept and an adaptive population sizing approach. A comparative study was conducted on 18 test problems from the DTLZ [3], WFG [6], and CEC 2009 [7] test suites, and the Borg MOEA showed significant advantages over competing algorithms on many-objective, multimodal problems. Lopez et al. [8] coupled an achievement function and an indicator based on an alternative preference relation in a MOEA for solving many-objective optimization problems. The experiments were conducted on the DTLZ and WFG test suites, and the results showed that the proposed approach performed well even for a high number of objectives. Yang et al. [9] proposed a grid-based evolutionary algorithm (GrEA) to solve many-objective optimization problems and performed numerical experiments on the DTLZ test suite.
The experimental results indicated that the proposed algorithm was effective and competitive in balancing convergence and diversity compared with six state-of-the-art MOEAs. Deb and Jain [10] proposed a reference-point-based many-objective NSGA-II (called NSGA-III). This algorithm was applied to 3- to 15-objective DTLZ1–4 problems and compared with two recently suggested versions of MOEA/D. It was found that the proposed NSGA-III obtained satisfactory results on all test problems.
However, real-world applications of conventional EA-based optimization approaches incur high CPU cost, because they require a large number of expensive performance analyses, such as three-dimensional computational fluid dynamics for complex geometries. A common strategy to reduce this CPU cost is to combine a surrogate model with the EA. A number of surrogate models, such as the response surface model (RSM) [11], radial basis functions (RBF) [12], the Kriging model [13], and neural networks (NN) [14], have been applied to practical engineering design. Among these surrogate models, the Kriging model can estimate the deviation between the response model and the sample points and automatically adapts to the sample points. Additionally, the Kriging model does not require an assumption about the order of the approximating function, so in this respect it is superior to the general RSM. In this study, the Kriging model is adopted as the surrogate model.
For the Kriging model, a few updating criteria have been proposed to determine the locations in the design space where new sample points should be added to improve the model accuracy. Jones et al. [15] suggested maximizing the expected improvement (EI) of the original objective function to determine the location of a new sample point and named the resulting method the efficient global optimization (EGO) algorithm. Jeong et al. [16] extended EGO to multiobjective optimization problems: the EIs of all objective functions were maximized, and some of the nondominated solutions in the EI space were selected as additional sample points. Emmerich et al. [17] proposed the expected hypervolume improvement (EHVI) as the updating criterion for a Gaussian random field metamodel and evaluated EHVI using Monte Carlo integration. Li et al. [18] suggested a domination-status-based updating criterion to judge whether a new sample point was needed in a Kriging-metamodel-assisted multiobjective genetic algorithm. Shimoyama et al. [19] compared the optimization performance of the EHVI, EI, and estimation (EST) updating criteria in multiobjective optimization. The results indicated that EHVI kept a good balance between an accurate and a wide search for nondominated solutions.
However, no study of using EHVI as the updating criterion for the optimization of many-objective problems that involve four or more objectives has been found in the open literature. The present paper is concerned with the many-objective optimization performance of the Kriging-surrogate-based EA approach with EHVI as the updating criterion. The long-term objective is to reduce the CPU cost of many-objective optimization by decreasing the number of expensive performance analyses. In order to further investigate the performance of the EHVI updating criterion, two other updating criteria, EI and EST, are adopted for comparison. The present comparisons are conducted through numerical experiments on the DTLZ test suite [3], which can be extended to an arbitrary number of objective functions.
The structure of this paper is as follows. In Section 2, the terminology and background related to the current study are reviewed. In Section 3, the Kriging model and EA based approach are outlined. In Section 4, several examples and corresponding results are given. Concluding remarks are presented in Section 5.
2. Background
2.1. Many-Objective Optimization
First, a many-objective problem with m objectives is defined as follows:

  minimize f(x) = (f_1(x), f_2(x), ..., f_m(x)), subject to x ∈ Ω,  (1)

where x = (x_1, x_2, ..., x_n) is a vector of design variables, f(x) is a vector of objective functions, f_i(x) denotes the i-th objective function to be minimized, and Ω is the feasible region delimited by the problem's constraints.
In many-objective problems, the goal is to find the optimal trade-off solutions known as the Pareto optimal set. The definition of the Pareto optimal set is based on the domination concept between points in the design space. The dominance relation between two design vectors x_a and x_b is defined as follows.

Definition 1. A design vector x_a dominates another design vector x_b if f_i(x_a) ≤ f_i(x_b) for all i = 1, ..., m and f_i(x_a) < f_i(x_b) for at least one i. This is denoted by x_a ≺ x_b.

Definition 2. x_a is nondominated if and only if there is no vector x_b in Ω that dominates x_a.
A design vector is Pareto optimal if it is nondominated with respect to the set of all possible design vectors. The set of all Pareto optimal design vectors is called the Pareto optimal set, and its image forms the Pareto front in the objective space. In general, multiobjective optimization algorithms are designed to capture or closely approximate the Pareto front.
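As a concrete illustration of Definitions 1 and 2, the dominance check and the extraction of nondominated points can be sketched as follows (a minimal NumPy sketch for the minimization convention used above; the function names are ours, not from the paper):

```python
import numpy as np

def dominates(f_a, f_b):
    """True if objective vector f_a Pareto-dominates f_b (minimization):
    no worse in every objective and strictly better in at least one."""
    f_a, f_b = np.asarray(f_a), np.asarray(f_b)
    return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

def nondominated(points):
    """Indices of the nondominated points of a set (minimization)."""
    pts = np.asarray(points)
    keep = []
    for i, p in enumerate(pts):
        if not any(dominates(q, p) for j, q in enumerate(pts) if j != i):
            keep.append(i)
    return keep
```

For example, in the set {(1, 3), (2, 2), (3, 1), (2, 3)} the point (2, 3) is dominated by (1, 3), so only the first three points are nondominated.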
2.2. Kriging Model
The Kriging model has its origins in mining and geostatistics, dealing with spatially and temporally correlated data [20]. The Kriging model is a combination of a global model and localized departures:

  y(x) = μ + ε(x),  (2)

where y(x) denotes the unknown function of interest and μ denotes a known global approximation model. ε(x) is a realization of a stochastic process with mean zero and variance σ², and the covariance of ε(x) is given by

  Cov[ε(x_i), ε(x_j)] = σ² R(x_i, x_j).  (3)

In (3), the values R(x_i, x_j) form an n_s × n_s correlation matrix R (n_s being the number of sample points) whose entries are symmetric with respect to the diagonal, and R(x_i, x_j) is the correlation function between any two points x_i and x_j among the sample points. The type of correlation function needs to be determined by the user. This paper employs the following Gaussian correlation function:

  R(x_i, x_j) = exp( −Σ_{k=1}^{n} θ_k |x_i^(k) − x_j^(k)|² ),  (4)

where n denotes the number of design variables and θ_k is the weight of the distance along the k-th design variable.
In the Kriging model, the values of μ, σ², and θ_k are determined by maximizing the likelihood function. First, θ = (θ_1, ..., θ_n) is obtained by maximizing the concentrated log-likelihood function

  Ln(θ) = −(n_s/2) ln(σ̂²) − (1/2) ln |R|.  (5)

The μ and σ² that maximize the likelihood function are represented in closed form as

  μ̂ = (1ᵀ R⁻¹ y) / (1ᵀ R⁻¹ 1),  (6)
  σ̂² = (y − 1μ̂)ᵀ R⁻¹ (y − 1μ̂) / n_s,  (7)

where y = (y(x_1), ..., y(x_{n_s}))ᵀ and 1 is an n_s-dimensional unit vector. After θ is obtained, μ̂ and σ̂² are obtained from (6) and (7), respectively, in which R is calculated by (4). Now, (2) can be written as the Kriging model predictor

  ŷ(x) = μ̂ + rᵀ(x) R⁻¹ (y − 1μ̂),  (8)

where r(x) is an n_s-dimensional vector whose i-th element is R(x, x_i).
The accuracy of the predicted value depends greatly on the distances between the predicted point and the sample points. The mean squared error at a predicted point x using the Kriging model predictor is defined by

  s²(x) = σ̂² [ 1 − rᵀ(x) R⁻¹ r(x) + (1 − 1ᵀ R⁻¹ r(x))² / (1ᵀ R⁻¹ 1) ].  (9)
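The closed-form expressions above translate directly into code. The following sketch implements the Gaussian correlation function, the closed-form estimates μ̂ and σ̂², the Kriging predictor, and its mean squared error for a fixed θ; a full implementation would also maximize the concentrated log-likelihood over θ, which is omitted here. The function names and the small diagonal jitter (for numerical conditioning) are our own choices, not the paper's:

```python
import numpy as np

def corr_matrix(X1, X2, theta):
    """Gaussian correlation: R(x_i, x_j) = exp(-sum_k theta_k (x_ik - x_jk)^2)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2 * theta).sum(axis=2)
    return np.exp(-d2)

def kriging_fit(X, y, theta):
    """Closed-form mu_hat and sigma2_hat of ordinary Kriging for a fixed theta."""
    n = len(y)
    R = corr_matrix(X, X, theta) + 1e-10 * np.eye(n)  # jitter for conditioning
    Rinv = np.linalg.inv(R)
    one = np.ones(n)
    mu = (one @ Rinv @ y) / (one @ Rinv @ one)
    sigma2 = (y - mu) @ Rinv @ (y - mu) / n
    return {"X": X, "y": y, "theta": theta, "Rinv": Rinv, "mu": mu, "sigma2": sigma2}

def kriging_predict(model, x):
    """Predictor y_hat(x) and mean squared error s^2(x) at a new point x."""
    r = corr_matrix(np.atleast_2d(x), model["X"], model["theta"]).ravel()
    Rinv, y, mu = model["Rinv"], model["y"], model["mu"]
    one = np.ones(len(y))
    y_hat = mu + r @ Rinv @ (y - mu * one)
    s2 = model["sigma2"] * (1.0 - r @ Rinv @ r
                            + (1.0 - one @ Rinv @ r) ** 2 / (one @ Rinv @ one))
    return y_hat, s2
```

At a sample point the predictor interpolates the data exactly and the mean squared error collapses to (numerically) zero, which is the behavior the text describes: uncertainty grows with distance from the samples.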
2.3. Expected Hypervolume Improvement
EHVI is based on the theory of the hypervolume indicator [21]. The hypervolume indicator is a popular metric for comparing the performance of different multiobjective optimizers. The hypervolume of a set of solutions measures the size of the portion of the objective space that is dominated by the set collectively. In the field of evolutionary multiobjective optimization (EMO), the hypervolume indicator is the only unary indicator known to be strictly monotonic with regard to Pareto dominance. This characteristic is of high interest and relevance for problems with a large number of objectives.
Hypervolume calculation requires high computational effort. Several algorithms have been proposed for calculating hypervolume exactly. Wu and Azarm [22] proposed the inclusion-exclusion algorithm (IEA) for hypervolume calculation, whose cost grows exponentially with the number of solutions. Fleischer [23] introduced an algorithm based on the Lebesgue measure. While et al. [24] suggested a fast hypervolume-by-slicing-objectives (HSO) algorithm. Based on HSO, Fonseca et al. [25] proposed an improved dimension-sweep algorithm for calculating hypervolume, which achieves O(n^(m−2) log n) worst-case complexity for n solutions and m objectives. The fastest algorithm yet known for exact hypervolume calculation is the Walking Fish Group (WFG) algorithm proposed by While et al. [26]. Its worst-case complexity is exponential in the number of points; in practice, however, even relatively small percentages of dominated points improve its performance dramatically.
On the other hand, several approximate algorithms for calculating hypervolume have also been developed in recent years. Bader and Zitzler [27] proposed an approximate algorithm based on Monte Carlo sampling; in their work, it was adopted to calculate the hypervolume values of problems with more than 5 objectives. Bringmann and Friedrich [28] presented another approximation algorithm, also based on Monte Carlo sampling, which worked extremely fast on all tested practical instances. Ishibuchi et al. [29] proposed an approximation algorithm using achievement scalarizing functions with uniformly distributed weight vectors.
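A Monte Carlo hypervolume estimator of the kind used in [27] can be sketched in a few lines: sample uniformly in a box bounded by the reference point, and return the box volume times the fraction of samples dominated by the solution set. This is a simplified illustration under the minimization convention, not the algorithm of [27] itself:

```python
import numpy as np

def hypervolume_mc(points, ref, n_samples=100_000, seed=0):
    """Monte Carlo hypervolume estimate (minimization): the dominated region
    lies inside the box [min(points), ref], so sample uniformly there and
    scale the dominated fraction by the box volume."""
    pts = np.asarray(points, dtype=float)
    ref = np.asarray(ref, dtype=float)
    lo = pts.min(axis=0)
    rng = np.random.default_rng(seed)
    samples = rng.uniform(lo, ref, size=(n_samples, len(ref)))
    # a sample is dominated if some point is <= it in every objective
    dominated = (pts[None, :, :] <= samples[:, None, :]).all(axis=2).any(axis=1)
    return float(np.prod(ref - lo) * dominated.mean())
```

For the two points (0.2, 0.6) and (0.6, 0.2) with reference point (1, 1), the exact hypervolume is 0.32 + 0.32 − 0.16 = 0.48, and the estimator converges to that value as the sample count grows.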
EHVI is the expected value of the hypervolume improvement in the Kriging model. The hypervolume improvement HVI is defined as the difference in hypervolume between the current sample set and the sample set augmented with a new point, as illustrated in Figure 1, and its expected value is expressed as

  EHVI(x) = ∫ HVI(Y) φ(Y) dY,  (10)

where Y denotes the Gaussian random variable N(ŷ(x), s²(x)), φ(Y) is its probability density function, and a fixed reference value is used for calculating the hypervolume. The maximization of EHVI is considered as the updating criterion to determine the location of an additional sample point; it is a single-objective optimization problem.
In the present optimizations, objective function values are normalized between 0 and 1 over all given sample points, and the reference values are set to 1.1 for all normalized objective functions. When the number of objectives is small (≤5), hypervolume is calculated using the fastest exact algorithm, WFG [26]. When the number of objectives is large (>5), in consideration of the trade-off between EHVI accuracy and overall computing time, hypervolume is calculated using the approximate algorithm described in [27]. The number of samples used by this algorithm to estimate the hypervolume is set to 300 for the 8- and 10-objective problems and 500 for the 12- and 15-objective problems. It is difficult to calculate EHVI by analytical integration; as a simple alternative, we use Monte Carlo integration [30], producing 1,000 random points with a Gaussian distribution around the candidate sample point to estimate the EHVI.
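The Monte Carlo estimation of EHVI described above can be sketched as follows for the two-objective case: draw Gaussian samples around the predicted objectives of a candidate point and average the resulting hypervolume improvements. This is an illustrative sketch with our own exact 2-D hypervolume helper; the paper's experiments use the WFG or Monte Carlo hypervolume algorithms instead, and the function names are ours:

```python
import numpy as np

def hv_2d(points, ref):
    """Exact hypervolume of a 2-objective point set w.r.t. reference point
    `ref` (minimization): sweep in increasing x, accumulating dominated area."""
    pts = sorted((p[0], p[1]) for p in points if p[0] <= ref[0] and p[1] <= ref[1])
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:
        if y < prev_y:
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

def ehvi_mc(mean, std, archive, ref, n_draws=1000, seed=0):
    """Monte Carlo EHVI of a candidate whose objectives are modeled as
    independent Gaussians N(mean_i, std_i^2): average the hypervolume
    improvement over random draws of the candidate's objective vector."""
    rng = np.random.default_rng(seed)
    hv_now = hv_2d(archive, ref)
    draws = rng.normal(mean, std, size=(n_draws, len(mean)))
    gains = [max(0.0, hv_2d(list(archive) + [(d[0], d[1])], ref) - hv_now)
             for d in draws]
    return float(np.mean(gains))
```

A candidate predicted deep inside the dominated region yields an EHVI of zero, while a candidate predicted to improve on the current front yields a large value, which is exactly the behavior the updating criterion exploits.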
3. Kriging-Surrogate-Based EA for Many-Objective Optimization
Figure 2 gives the flowchart of the Kriging-surrogate-based EA for many-objective optimization. The initial sample points are generated using the Latin hypercube sampling (LHS) method [31] and are distributed uniformly in the design space. LHS determines the locations of the points to be sampled taking account of the orthogonality condition, which means that no two samples share a value of any design variable, and LHS can cover the whole variable space even with a smaller sample size than the Monte Carlo method. Using the initial samples, the Kriging model is constructed to approximate the responses of the objective functions to the design variables in the form of simple algebraic functions. These algebraic functions interpolate the initial sample points with their real objective function values; therefore, the Kriging model can promptly estimate function values at other points whose objective function values are unknown. In order to investigate the performance of EHVI, two other updating criteria, EI and EST [19], are adopted for comparison in this study. An EMO algorithm, NSGA-II [32], is adopted to search for the nondominated solutions for EI or EST and for the optimal solution for EHVI on the response surfaces. The parameter values [33] used in NSGA-II are given in Table 1.
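The LHS construction described above (each variable's range divided into as many equal bins as there are samples, with each bin used exactly once per variable) can be sketched as follows; this is a generic LHS sketch, not necessarily the exact variant of [31]:

```python
import numpy as np

def latin_hypercube(n_samples, n_vars, seed=0):
    """Latin hypercube sample in [0, 1]^n_vars: each variable's range is
    divided into n_samples equal bins and every bin is used exactly once."""
    rng = np.random.default_rng(seed)
    # one uniformly placed point inside each of the n_samples bins, per variable
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_vars))) / n_samples
    # shuffle the bin order independently for each variable
    for k in range(n_vars):
        u[:, k] = u[rng.permutation(n_samples), k]
    return u
```

By construction, projecting the sample onto any single variable always yields one point per bin, which is the orthogonality (non-overlap) property mentioned above.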
In the optimization based on EHVI, a single solution with the maximum EHVI is obtained and employed as the additional sample point; that is, sample points are added one by one. On the other hand, a set of multiple nondominated solutions is obtained in the optimization based on EI or EST. For a fair comparison, only the solution closest to the centroid of the obtained nondominated solutions in the design space is chosen as the additional sample point. When the additional sample point determined in the previous step has been added to the set of initial samples, the Kriging model is reconstructed and the optimization is performed again. Updating of the Kriging model terminates when the total number of initial plus additional sample points reaches 70.
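The centroid-based selection used for the EI and EST criteria can be sketched directly (a small helper of our own naming, assuming the nondominated solutions are given as design vectors):

```python
import numpy as np

def closest_to_centroid(designs):
    """Index of the design vector nearest (Euclidean) to the centroid of a
    set of nondominated designs in the design space."""
    D = np.asarray(designs, dtype=float)
    centroid = D.mean(axis=0)
    return int(np.argmin(np.linalg.norm(D - centroid, axis=1)))
```

The chosen design is then evaluated with the expensive analysis and appended to the sample set before the Kriging model is rebuilt.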
4. Numerical Test
4.1. Test Problems
Three- to fifteen-objective DTLZ1–7 problems [3] are considered in the present numerical experiments. All these problems are nonconstrained and scalable to an arbitrary number of objective functions. Table 2 lists these test problems and their key properties.
 
m is the number of objectives, and n is the number of variables.
4.2. Performance Metric
The inverted generational distance (IGD) [34] is considered as the performance metric in the present numerical experiments. IGD provides combined information about the convergence and diversity of the obtained solutions and is defined as follows:

  IGD(P*, Q) = ( Σ_{v ∈ P*} d(v, Q) ) / |P*|,  (11)

where P* denotes the set of points located on the true Pareto optimal front, Q is the nondominated solution set obtained by the optimization, and d(v, Q) is the minimum Euclidean distance between v and the points in Q. A smaller IGD indicates better performance in multiobjective optimization. The solutions in P* are distributed uniformly on the true Pareto optimal front. The size of P* varies with the number of objectives and is shown in Table 3.
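The IGD metric described above can be computed directly: for each point of the reference Pareto front sample, take the distance to the nearest obtained solution and average (a minimal NumPy sketch):

```python
import numpy as np

def igd(pareto_ref, solutions):
    """Inverted generational distance: mean Euclidean distance from each
    reference Pareto-front point to its nearest obtained solution."""
    P = np.asarray(pareto_ref, dtype=float)
    Q = np.asarray(solutions, dtype=float)
    dists = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)
    return float(dists.min(axis=1).mean())
```

Because the distances run from the reference front to the obtained set, IGD penalizes both poor convergence and poor coverage: any region of the true front left unapproximated contributes large nearest-neighbor distances.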

4.3. Results and Discussion
For each test problem, 20 runs are carried out with different initial datasets of ten sample points. The IGD values given in the following are averaged over the 20 runs. Figure 3 gives the IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ1 problems. EST achieves faster reduction of IGD than EHVI and EI except for the case of m ≤ 5, in which EHVI achieves the fastest IGD reduction. In addition, EST always obtains faster reduction of IGD than EI for any number of objectives, indicating that the updating criterion EST, which does not consider estimation errors, works better here than the updating criterion EI, which does. This is because the DTLZ1 problem is simple and easily approximated by the Kriging model.
The IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ2 problems are shown in Figure 4. When m = 3 and 5, the optimization performance of EHVI as the updating criterion is poor. This is because the updating strategy based on EHVI may cause the additional sample points to fall into a locally optimal region. When m = 8, EHVI obtains slightly faster IGD reduction than EI and EST. In addition, it is noted that EI obtains much faster reduction of IGD than EHVI and EST when m is larger than 10.
Figure 5 presents the IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ3 problems. EHVI always achieves better convergence and diversity than EI and EST for any number of objectives. When m = 3, EI obtains faster IGD reduction than EST. When m is larger than 3, the final IGD values obtained by EI and EST are very close, indicating that the problems are well approximated by the Kriging model and the estimation errors are relatively small.
Figure 6 shows the IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ4 problems. For any number of objectives, EST, which does not consider estimation errors, obtains faster IGD reduction than EI, which does. As m increases, the convergence and diversity performance of EHVI as the updating criterion improves, and when m is larger than 8, EHVI outperforms EI and EST. This indicates that the EHVI updating criterion is particularly advantageous for problems with a large number of objectives.
The IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ5 problems are shown in Figure 7. When m is smaller than 12, EST obtains faster reduction of IGD than EI. However, EI outperforms EST when m = 12 and 15, indicating that EI, by considering estimation errors, has a positive impact on convergence when the number of objectives is large. On the other hand, when m = 3 and 5, the optimization performance of EHVI is poor, but it gradually improves as m increases. When m is larger than 8, EHVI obtains the fastest reduction of IGD.
Figure 8 presents the IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ6 problems. EHVI always obtains much faster IGD reduction than EI and EST for any number of objectives; the EHVI updating criterion demonstrates the efficient exploitation capacity of the Kriging-surrogate-based EA for many-objective optimization. EI achieves faster reduction of IGD than EST except in two cases, one of which is m = 10.
The IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ7 problems are shown in Figure 9. EHVI outperforms EI and EST for any number of objectives, and its advantage over EI and EST becomes more obvious when m is larger than 8. On the other hand, EI performs better than EST except for one value of m.
In order to investigate the effect of the number of design variables on the optimization performance of EHVI as the updating criterion, the DTLZ7k10 problem is considered for many-objective optimization. In the original DTLZ7 [3], k, the number of distance-related design variables, is suggested to be 20, whereas k is set to 10 in the DTLZ7k10 problem. Thus, the DTLZ7k10 problem has 10 fewer design variables than the corresponding DTLZ7 problem. The IGD histories during the Kriging model update based on the EHVI, EI, and EST criteria in the 3- to 15-objective DTLZ7k10 problems are shown in Figure 10. EHVI obtains faster reduction of IGD than EI and EST except for one case, in which EST achieves the fastest IGD reduction. Comparing the optimization performance on the DTLZ7 and DTLZ7k10 problems, the advantage of EHVI over EI and EST is more obvious on DTLZ7 than on DTLZ7k10. This indicates that EHVI is more suitable for problems with a large number of design variables.
5. Conclusion
The Kriging-surrogate-based EA with EHVI, EI, and EST as updating criteria was applied to the 3- to 15-objective DTLZ1–7 test problems, and the optimization performance obtained with the different updating criteria was compared in this study. For the hypervolume calculations, the fastest exact algorithm, WFG, and an approximate algorithm based on Monte Carlo sampling were adopted.
In the DTLZ1 problem, EHVI gave faster IGD reduction than EI and EST when m ≤ 5, but EHVI did not keep this advantage when m was larger than 5. In the DTLZ2 and DTLZ5 problems, when m = 3 and 5, the optimization performance of EHVI as the updating criterion was poor; however, it improved when the number of objectives was larger than 5. In the DTLZ4 and DTLZ5 problems, the advantage of EHVI appeared gradually as m increased, and EHVI obtained faster reduction of IGD than EST and EI when m was larger than 8. In the DTLZ3, DTLZ6, and DTLZ7 problems, EHVI always obtained better convergence and diversity than EI and EST for any number of objectives. In the DTLZ7k10 problem, the advantage of EHVI over EI and EST was reduced compared with that in DTLZ7. On the whole, in the nonconstrained case, EHVI is highly competitive for updating the Kriging model compared with EI and EST in many-objective optimization, especially when the test problem is complex and the number of objectives or design variables is large.
On the other hand, EST always obtained faster reduction of IGD than EI for any number of objectives in the DTLZ1 and DTLZ4 problems, because these two problems are easily approximated by the Kriging model. In most of the other many-objective optimizations, the updating criterion EI, which considers estimation errors, performed better than EST, which does not, especially for problems with a large number of objectives.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
References
[1] C. A. C. Coello, G. B. Lamont, and D. A. van Veldhuizen, Evolutionary Algorithms for Solving Multi-Objective Problems, Springer, New York, NY, USA, 2007.
[2] M. Garza-Fabre, G. T. Pulido, and C. A. Coello Coello, "Ranking methods for many-objective optimization," in MICAI 2009: Advances in Artificial Intelligence, vol. 5845 of Lecture Notes in Computer Science, pp. 633–645, Springer, Berlin, Germany, 2009.
[3] K. Deb, L. Thiele, M. Laumanns, and E. Zitzler, "Scalable test problems for evolutionary multiobjective optimization," TIK Technical Report No. 112, 2001.
[4] S. F. Adra and P. J. Fleming, "Diversity management in evolutionary many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 15, no. 2, pp. 183–195, 2011.
[5] D. Hadka and P. Reed, "Borg: an auto-adaptive many-objective evolutionary computing framework," Evolutionary Computation, vol. 21, no. 2, pp. 231–259, 2013.
[6] S. Huband, P. Hingston, L. Barone, and L. While, "A review of multiobjective test problems and a scalable test problem toolkit," IEEE Transactions on Evolutionary Computation, vol. 10, no. 5, pp. 477–506, 2006.
[7] Q. Zhang, A. Zhou, S. Zhao, P. N. Suganthan, W. Liu, and S. Tiwari, "Multiobjective optimization test instances for the CEC 2009 special session and competition," Tech. Rep. CES-487, University of Essex, 2009.
[8] A. Lopez, C. A. Coello Coello, A. Oyama, and K. Fujii, "An alternative preference relation to deal with many-objective optimization problems," in Evolutionary Multi-Criterion Optimization, vol. 7811 of Lecture Notes in Computer Science, pp. 291–306, Springer, Berlin, Germany, 2013.
[9] S. Yang, M. Li, X. Liu, and J. Zheng, "A grid-based evolutionary algorithm for many-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 17, no. 5, pp. 721–736, 2013.
[10] K. Deb and H. Jain, "An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: solving problems with box constraints," IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 577–601, 2014.
[11] Y. Lian and M.-S. Liou, "Multiobjective optimization using coupled response surface model and evolutionary algorithm," AIAA Journal, vol. 43, no. 6, pp. 1316–1325, 2005.
[12] H. Fang, M. Rais-Rohani, and M. F. Horstemeyer, "Multiobjective crashworthiness optimization with radial basis functions," in Proceedings of the 10th AIAA/ISSMO Multidisciplinary Analysis and Optimization Conference, pp. 2095–2104, Albany, NY, USA, September 2004.
[13] K. Shimoyama, S. Yoshimizu, S. Jeong, S. Obayashi, and Y. Yokono, "Multi-objective design optimization for a steam turbine stator blade using LES and GA," Journal of Computational Science and Technology, vol. 5, no. 3, pp. 134–147, 2011.
[14] M. Papadrakakis, N. D. Lagaros, and Y. Tsompanakis, "Optimization of large-scale 3-D trusses using evolution strategies and neural networks," International Journal of Space Structures, vol. 14, no. 3, pp. 211–223, 1999.
[15] D. R. Jones, M. Schonlau, and W. J. Welch, "Efficient global optimization of expensive black-box functions," Journal of Global Optimization, vol. 13, no. 4, pp. 455–492, 1998.
[16] S. Jeong, Y. Minemura, and S. Obayashi, "Optimization of combustion chamber for diesel engine using Kriging model," Journal of Fluid Science and Technology, vol. 1, no. 2, pp. 138–146, 2006.
[17] M. T. M. Emmerich, K. C. Giannakoglou, and B. Naujoks, "Single- and multiobjective evolutionary optimization assisted by Gaussian random field metamodels," IEEE Transactions on Evolutionary Computation, vol. 10, no. 4, pp. 421–439, 2006.
[18] M. Li, G. Li, and S. Azarm, "A Kriging metamodel assisted multi-objective genetic algorithm for design optimization," Journal of Mechanical Design, vol. 130, no. 3, Article ID 031401, 2008.
[19] K. Shimoyama, K. Sato, S. Jeong, and S. Obayashi, "Updating Kriging surrogate models based on the hypervolume indicator in multi-objective optimization," Journal of Mechanical Design, vol. 135, no. 9, Article ID 094503, 2013.
[20] N. A. C. Cressie, Statistics for Spatial Data, revised edition, John Wiley & Sons, New York, NY, USA, 1993.
[21] E. Zitzler and L. Thiele, "Multiobjective optimization using evolutionary algorithms—a comparative case study," in Proceedings of the 5th International Conference on Parallel Problem Solving from Nature (PPSN V), vol. 1498 of Lecture Notes in Computer Science, pp. 292–301, Amsterdam, The Netherlands, 1998.
[22] J. Wu and S. Azarm, "Metrics for quality assessment of a multiobjective design optimization solution set," Journal of Mechanical Design, vol. 123, no. 1, pp. 18–25, 2001.
[23] M. Fleischer, "The measure of Pareto optima: applications to multi-objective metaheuristics," in Evolutionary Multi-Criterion Optimization, vol. 2632 of Lecture Notes in Computer Science, pp. 519–533, Springer, Berlin, Germany, 2003.
[24] L. While, P. Hingston, L. Barone, and S. Huband, "A faster algorithm for calculating hypervolume," IEEE Transactions on Evolutionary Computation, vol. 10, no. 1, pp. 29–38, 2006.
[25] C. M. Fonseca, L. Paquete, and M. López-Ibáñez, "An improved dimension-sweep algorithm for the hypervolume indicator," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '06), pp. 1157–1163, July 2006.
[26] L. While, L. Bradstreet, and L. Barone, "A fast way of calculating exact hypervolumes," IEEE Transactions on Evolutionary Computation, vol. 16, no. 1, pp. 86–95, 2012.
[27] J. Bader and E. Zitzler, "HypE: an algorithm for fast hypervolume-based many-objective optimization," TIK Technical Report 286, 2008.
[28] K. Bringmann and T. Friedrich, "Approximating the least hypervolume contributor: NP-hard in general, but fast in practice," in Evolutionary Multi-Criterion Optimization, vol. 5467 of Lecture Notes in Computer Science, pp. 6–20, Springer, Berlin, Germany, 2009.
[29] H. Ishibuchi, N. Tsukamoto, Y. Sakane, and Y. Nojima, "Hypervolume approximation using achievement scalarizing functions for evolutionary many-objective optimization," in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '09), pp. 530–537, Trondheim, Norway, May 2009.
[30] S. Weinzierl, "Introduction to Monte Carlo methods," Tech. Rep. NIKHEF-00-012, NIKHEF, Theory Group, Amsterdam, The Netherlands, 2000.
[31] M. D. McKay, R. J. Beckman, and W. J. Conover, "A comparison of three methods for selecting values of input variables in the analysis of output from a computer code," Technometrics, vol. 21, no. 2, pp. 239–245, 1979.
[32] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, "A fast and elitist multiobjective genetic algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, vol. 6, no. 2, pp. 182–197, 2002.
[33] K. Deb and R. B. Agrawal, "Simulated binary crossover for continuous search space," Complex Systems, vol. 9, no. 2, pp. 115–148, 1995.
[34] D. A. Van Veldhuizen and G. B. Lamont, "Multiobjective evolutionary algorithm research: a history and analysis," Tech. Rep. TR-98-03, Air Force Institute of Technology, Dayton, Ohio, USA, 1998.
Copyright
Copyright © 2015 Chang Luo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.