Mathematical Problems in Engineering

Volume 2015, Article ID 162712, 15 pages

http://dx.doi.org/10.1155/2015/162712

## A Study on Many-Objective Optimization Using the Kriging-Surrogate-Based Evolutionary Algorithm Maximizing Expected Hypervolume Improvement

Institute of Fluid Science, Tohoku University, Sendai 980-8577, Japan

Received 25 August 2014; Revised 13 January 2015; Accepted 13 January 2015

Academic Editor: Yudong Zhang

Copyright © 2015 Chang Luo et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The many-objective optimization performance of the Kriging-surrogate-based evolutionary algorithm (EA), which maximizes the expected hypervolume improvement (EHVI) for updating the Kriging model, is investigated and compared with those of the expected improvement (EI) and estimation (EST) updating criteria in this paper. Numerical experiments are conducted on 3- to 15-objective DTLZ1–7 problems. In the experiments, an exact hypervolume calculation algorithm is used for problems with fewer than six objectives, whereas an approximate hypervolume calculation algorithm based on Monte Carlo sampling is adopted for problems with more objectives. The results indicate that, in the nonconstrained case, EHVI is a highly competitive updating criterion for Kriging-model- and EA-based many-objective optimization, especially when the test problem is complex and the number of objectives or design variables is large.

#### 1. Introduction

Evolutionary algorithms (EAs) are a class of metaheuristics inspired by the process of natural evolution, and they have been successfully applied to optimization problems with two or three objectives over the past two decades [1]. Recently, EAs have gradually been extended to deal with many-objective problems that involve four or more objectives. Garza-Fabre et al. [2] combined three novel fitness assignment methods with a generic multiobjective evolutionary algorithm (MOEA) and conducted numerical experiments on 5- to 50-objective DTLZ1, DTLZ3, and DTLZ6 problems [3]. The three proposed methods were found to be effective in guiding the search in high-dimensional objective spaces. Adra and Fleming [4] investigated two new management mechanisms for promoting diversity in evolutionary many-objective optimization and tested them on 6- to 20-objective DTLZ2 problems. The results indicated that including one of the mechanisms improved the convergence and diversity performance of existing MOEAs on many-objective optimization problems. Hadka and Reed [5] proposed the Borg MOEA for many-objective, multimodal optimization using the ε-dominance concept and an adaptive population sizing approach. A comparative study was conducted on 18 test problems from the DTLZ [3], WFG [6], and CEC 2009 [7] test suites, and the Borg MOEA showed significant advantages over the competing algorithms on many-objective, multimodal problems. Lopez et al. [8] coupled an achievement function and the ε-indicator in an alternative preference relation for a MOEA to solve many-objective optimization problems. Experiments on the DTLZ and WFG test suites showed that the proposed approach performed well even with a high number of objectives. Yang et al. [9] proposed a grid-based evolutionary algorithm (GrEA) to solve many-objective optimization problems and performed numerical experiments on the DTLZ test suite.
The experimental results indicated that the proposed algorithm was effective and competitive in balancing convergence and diversity compared with six state-of-the-art MOEAs. Deb and Jain [10] proposed a reference-point-based many-objective NSGA-II (called NSGA-III). The algorithm was applied to 3- to 15-objective DTLZ1–4 problems and compared with two recently suggested versions of MOEA/D; NSGA-III obtained satisfactory results on all test problems.

However, real-world applications of conventional EA-based optimization approaches incur high CPU costs because they require a large number of expensive performance analyses, such as three-dimensional computational fluid dynamics simulations of complex geometries. A common strategy to reduce the CPU cost is to combine a surrogate model with the EA. A number of surrogate models, such as the response surface model (RSM) [11], radial basis functions (RBF) [12], the Kriging model [13], and neural networks (NN) [14], have been applied to practical engineering design. Among these surrogate models, the Kriging model can estimate the deviation between the response model and the sample points and automatically adapts to the sample points. In addition, the Kriging model does not require an assumption about the order of the approximating function, which makes it superior to a general RSM. In this study, the Kriging model is therefore adopted as the surrogate model.

For the Kriging model, a few updating criteria have been proposed to determine the locations in the design space where new sample points should be added to improve the model accuracy. Jones et al. [15] suggested maximizing the expected improvement (EI) of the original objective function to determine the location of a new sample point, establishing the efficient global optimization (EGO) algorithm. Jeong et al. [16] extended EGO to multiobjective optimization problems, in which the EIs of all objective functions are maximized and some of the nondominated solutions in the EI space are selected as additional sample points. Emmerich et al. [17] proposed the expected hypervolume improvement (EHVI) as the updating criterion for a Gaussian random field metamodel and evaluated EHVI using Monte Carlo integration. Li et al. [18] suggested a domination-status-based updating criterion to judge whether a new sample point is needed in a Kriging-metamodel-assisted multiobjective genetic algorithm. Shimoyama et al. [19] compared the optimization performance of the EHVI, EI, and estimation (EST) updating criteria in multiobjective optimization; the results indicated that EHVI keeps a good balance between an accurate and a wide search for nondominated solutions.

However, no study using EHVI as the updating criterion for many-objective problems involving four or more objectives has been reported in the open literature. The present paper is concerned with the many-objective optimization performance of the Kriging-surrogate-based EA approach with EHVI as the updating criterion. The long-term objective is to reduce the CPU cost of many-objective optimization by decreasing the number of expensive performance analyses. To further assess the performance of the EHVI updating criterion, two other updating criteria, EI and EST, are adopted for comparison. The comparisons are conducted through numerical experiments on the DTLZ test suite [3], which can be extended to an arbitrary number of objective functions.

The structure of this paper is as follows. In Section 2, the terminology and background related to the current study are reviewed. In Section 3, the Kriging model and EA based approach are outlined. In Section 4, several examples and corresponding results are given. Concluding remarks are presented in Section 5.

#### 2. Background

##### 2.1. Many-Objective Optimization

First, a many-objective problem with $m$ objectives is defined as follows:

$$\text{minimize} \quad \mathbf{f}(\mathbf{x}) = \left(f_1(\mathbf{x}), f_2(\mathbf{x}), \ldots, f_m(\mathbf{x})\right), \quad \mathbf{x} \in S, \tag{1}$$

where $\mathbf{x}$ is a vector of design variables, $\mathbf{f}$ is a vector of objective functions, $f_i(\mathbf{x})$ denotes the $i$th objective function to be minimized, and $S$ is the feasible region delimited by the problem's constraints.

In many-objective problems, the goal is to find the optimal tradeoff solutions known as the Pareto optimal set. The definition of the Pareto optimal set is based on the concept of domination between points in the design space. The dominance relation between two design vectors $\mathbf{x}_1$ and $\mathbf{x}_2$ is defined as follows.

*Definition 1.* A design vector $\mathbf{x}_1$ dominates another design vector $\mathbf{x}_2$ if $f_i(\mathbf{x}_1) \le f_i(\mathbf{x}_2)$ for all $i \in \{1, \ldots, m\}$ and $f_i(\mathbf{x}_1) < f_i(\mathbf{x}_2)$ for at least one $i$. This is denoted by $\mathbf{x}_1 \prec \mathbf{x}_2$.

*Definition 2.* $\mathbf{x}^{*} \in S$ is nondominated if and only if there is no vector $\mathbf{x}$ in $S$ that dominates $\mathbf{x}^{*}$.

A design vector is Pareto optimal if it is nondominated with respect to the set of all possible design vectors. The set of all Pareto optimal design vectors is called the Pareto optimal set, and its image forms the Pareto front in the objective space. In general, multiobjective optimization algorithms are designed to capture or closely approximate the Pareto front.
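Definitions 1 and 2 can be made concrete with a short sketch. The function names below are illustrative, not from the paper; this is a minimal brute-force check for minimization problems.

```python
import numpy as np

def dominates(f1, f2):
    """Definition 1: f1 dominates f2 if it is no worse in every
    objective and strictly better in at least one (minimization)."""
    f1, f2 = np.asarray(f1), np.asarray(f2)
    return bool(np.all(f1 <= f2) and np.any(f1 < f2))

def nondominated(F):
    """Definition 2: indices of the objective vectors in F that no
    other member of F dominates (a brute-force O(n^2) scan)."""
    return [i for i, fi in enumerate(F)
            if not any(dominates(fj, fi) for j, fj in enumerate(F) if j != i)]
```

For example, `nondominated([[1, 2], [2, 1], [2, 2]])` keeps the first two vectors and discards `[2, 2]`, which `[1, 2]` dominates.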

##### 2.2. Kriging Model

The Kriging model has its original applications in the mining and geostatistical fields, dealing with spatially and temporally correlated data [20]. The Kriging model is a combination of a global model and localized departures:

$$y(\mathbf{x}) = \mu + Z(\mathbf{x}), \tag{2}$$

where $y(\mathbf{x})$ denotes an unknown function of interest and $\mu$ denotes a known global approximation model. $Z(\mathbf{x})$ is a realization of a stochastic process with mean zero and variance $\sigma^2$, and the covariance matrix of $Z(\mathbf{x})$ is given by

$$\mathrm{Cov}\left[Z(\mathbf{x}_i), Z(\mathbf{x}_j)\right] = \sigma^2 \mathbf{R}, \quad \mathbf{R} = \left[R(\mathbf{x}_i, \mathbf{x}_j)\right]. \tag{3}$$

In (3), $\mathbf{R}$ is an $n \times n$ correlation matrix whose entries are symmetric with respect to the diagonal, and $R(\mathbf{x}_i, \mathbf{x}_j)$ is the correlation function between any two points $\mathbf{x}_i$ and $\mathbf{x}_j$ among the $n$ sample points. The type of correlation function needs to be determined by the user. This paper employs the following Gaussian correlation function:

$$R(\mathbf{x}_i, \mathbf{x}_j) = \exp\left(-\sum_{k=1}^{d} \theta_k \left|x_{i,k} - x_{j,k}\right|^2\right), \tag{4}$$

where $d$ denotes the number of design variables and $\theta_k$ is the weight of the distance along the $k$th design variable.

In the Kriging model, the values of $\mu$, $\sigma^2$, and $\theta_k$ are determined by maximizing the likelihood function. First, $\theta_k$ is obtained by maximizing the concentrated log-likelihood function

$$\mathrm{Ln}\left(\theta_k\right) = -\frac{n}{2}\ln\left(\hat{\sigma}^2\right) - \frac{1}{2}\ln\left|\mathbf{R}\right|. \tag{5}$$

The $\mu$ and $\sigma^2$ that maximize the likelihood function are represented in closed form as

$$\hat{\mu} = \frac{\mathbf{1}^{T}\mathbf{R}^{-1}\mathbf{y}}{\mathbf{1}^{T}\mathbf{R}^{-1}\mathbf{1}}, \tag{6}$$

$$\hat{\sigma}^2 = \frac{\left(\mathbf{y} - \mathbf{1}\hat{\mu}\right)^{T}\mathbf{R}^{-1}\left(\mathbf{y} - \mathbf{1}\hat{\mu}\right)}{n}, \tag{7}$$

where $\mathbf{y} = \left[y(\mathbf{x}_1), \ldots, y(\mathbf{x}_n)\right]^{T}$ and $\mathbf{1}$ is an $n$-dimensional unit vector. After $\theta_k$ is obtained, $\hat{\mu}$ and $\hat{\sigma}^2$ are obtained by (6) and (7), respectively, in which $\mathbf{R}$ is calculated by (4). Now, (2) can be written as the Kriging model predictor

$$\hat{y}(\mathbf{x}) = \hat{\mu} + \mathbf{r}^{T}\mathbf{R}^{-1}\left(\mathbf{y} - \mathbf{1}\hat{\mu}\right), \tag{8}$$

where $\mathbf{r}$ is an $n$-dimensional vector whose $i$th element is $R(\mathbf{x}, \mathbf{x}_i)$.

The accuracy of the predicted value depends greatly on the distances between the predicted point and the sample points. The mean squared error of the Kriging model predictor at a point $\mathbf{x}$ is defined by

$$s^2(\mathbf{x}) = \hat{\sigma}^2\left[1 - \mathbf{r}^{T}\mathbf{R}^{-1}\mathbf{r} + \frac{\left(1 - \mathbf{1}^{T}\mathbf{R}^{-1}\mathbf{r}\right)^2}{\mathbf{1}^{T}\mathbf{R}^{-1}\mathbf{1}}\right]. \tag{9}$$
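As a concrete illustration of the construction above, the following sketch fits an ordinary Kriging model with the Gaussian correlation function for a fixed $\theta$ (the likelihood maximization step is omitted for brevity). All names are illustrative, and a small diagonal jitter is added purely for numerical conditioning.

```python
import numpy as np

def gauss_corr(X, theta):
    """Correlation matrix R built from the Gaussian correlation function."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2 * theta).sum(axis=-1)
    return np.exp(-d2)

def fit_kriging(X, y, theta):
    """Closed-form mu-hat and sigma^2-hat for a fixed theta."""
    n = len(y)
    R = gauss_corr(X, theta) + 1e-10 * np.eye(n)  # jitter for conditioning
    Ri = np.linalg.inv(R)
    one = np.ones(n)
    mu = (one @ Ri @ y) / (one @ Ri @ one)
    resid = y - mu * one
    sigma2 = (resid @ Ri @ resid) / n
    return {"X": X, "theta": theta, "Ri": Ri, "mu": mu,
            "sigma2": sigma2, "resid": resid}

def predict(model, x):
    """Kriging predictor y-hat and its mean squared error s^2 at x."""
    X, theta, Ri = model["X"], model["theta"], model["Ri"]
    r = np.exp(-(((X - x) ** 2) * theta).sum(axis=1))  # vector of R(x, x_i)
    one = np.ones(len(X))
    yhat = model["mu"] + r @ Ri @ model["resid"]
    s2 = model["sigma2"] * (1.0 - r @ Ri @ r
                            + (1.0 - one @ Ri @ r) ** 2 / (one @ Ri @ one))
    return yhat, max(s2, 0.0)
```

At a sample point the predictor interpolates the data and the mean squared error collapses to (numerically) zero, while it grows away from the samples, which is exactly the behavior the updating criteria of Section 2.3 exploit.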

##### 2.3. Expected Hypervolume Improvement

EHVI is based on the theory of the hypervolume indicator [21], a popular metric for comparing the performance of different multiobjective optimizers. The hypervolume of a set of solutions measures the size of the portion of the objective space that is dominated by the set collectively. In the field of evolutionary multiobjective optimization (EMO), the hypervolume indicator is the only unary indicator known to be strictly monotonic with regard to Pareto dominance. This characteristic makes it highly interesting and relevant for problems with a large number of objectives.

Hypervolume calculation requires high computational effort, and several algorithms have been proposed for calculating the hypervolume exactly. Wu and Azarm [22] proposed the inclusion-exclusion algorithm (IEA), whose cost grows exponentially with the number of solutions. Fleischer [23] introduced an algorithm based on the Lebesgue measure. While et al. [24] suggested a fast hypervolume by slicing objectives (HSO) algorithm, whose worst-case complexity for $n$ solutions and $m$ objectives is $O(n^{m-1})$. Based on HSO, Fonseca et al. [25] proposed an improved dimension-sweep algorithm that achieves $O(n^{m-2} \log n)$ complexity in the worst case. The fastest algorithm yet known for exact hypervolume calculation is the Walking Fish Group (WFG) algorithm proposed by While et al. [26]. Although its worst-case complexity is still exponential, even a relatively small percentage of dominated points improves its practical performance enormously.

On the other hand, some approximate hypervolume calculation algorithms have also been developed in recent years. Bader and Zitzler [27] proposed an approximate algorithm based on Monte Carlo sampling; in their work, it was adopted to calculate the hypervolume values of problems with more than 5 objectives. Bringmann and Friedrich [28] presented another approximation algorithm based on Monte Carlo sampling, which worked extremely fast on all tested practical instances. Ishibuchi et al. [29] proposed an approximation algorithm using achievement scalarizing functions with uniformly distributed weight vectors.
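The Monte Carlo idea behind these approximate algorithms can be sketched as follows: sample points uniformly in the box between the ideal corner of the solution set and the reference point, and scale the fraction of dominated samples by the box volume. This is an illustrative sketch (minimization assumed, names hypothetical), not the specific algorithm of [27] or [28].

```python
import numpy as np

def hypervolume_mc(F, ref, n_samples=100_000, rng=None):
    """Monte Carlo estimate of the hypervolume of a minimization
    point set F with respect to a reference point `ref`."""
    rng = np.random.default_rng(rng)
    F, ref = np.asarray(F, float), np.asarray(ref, float)
    lo = F.min(axis=0)  # lower corner of the sampling box
    U = rng.uniform(lo, ref, size=(n_samples, ref.size))
    # A sample is dominated if some solution is <= it in every objective.
    dominated = (F[None, :, :] <= U[:, None, :]).all(axis=-1).any(axis=1)
    box = np.prod(ref - lo)
    return box * dominated.mean()
```

For the set {(1, 3), (3, 1)} with reference point (4, 4), the exact hypervolume is 5, and the estimate converges to that value at the usual $O(1/\sqrt{N})$ Monte Carlo rate regardless of the number of objectives, which is why such sampling is attractive when exact algorithms become too expensive.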

EHVI is the expected value of the hypervolume improvement under the Kriging model. The hypervolume improvement is defined as the difference in hypervolume between the current sample set and the next sample set, as illustrated in Figure 1, and its expected value is expressed as

$$\mathrm{EHVI}(\mathbf{x}) = \int_{-\infty}^{f_{1}^{\mathrm{ref}}} \cdots \int_{-\infty}^{f_{m}^{\mathrm{ref}}} \mathrm{HVI}\left(\mathbf{F}\right) \phi\left(\mathbf{F}\right)\, d\mathbf{F}, \tag{10}$$

where $\mathbf{F}$ denotes the Gaussian random variable $N\left(\hat{y}(\mathbf{x}), s^2(\mathbf{x})\right)$ given by the Kriging predictor, $\phi$ is its probability density function, and $f^{\mathrm{ref}}$ is the reference value used for calculating the hypervolume. The maximization of EHVI is considered as the updating criterion to determine the location of an additional sample point; this maximization is a single-objective optimization problem.
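A direct, if crude, way to estimate EHVI for a two-objective case is Monte Carlo sampling of the Gaussian predictor, in the spirit of the Monte Carlo integration used by Emmerich et al. [17]: draw candidate objective vectors from $N(\hat{y}, s^2)$ in each objective (assumed independent here) and average the resulting hypervolume gains. All names are illustrative; `mean` and `std` stand for the Kriging predictions $\hat{y}$ and $s$ per objective at a candidate design point.

```python
import numpy as np

def hypervolume_2d(F, ref):
    """Exact 2-objective hypervolume (minimization) by a sorted sweep."""
    pts = sorted(f for f in F if all(f[i] < ref[i] for i in (0, 1)))
    hv, y_prev = 0.0, ref[1]
    for x, y in pts:
        if y < y_prev:  # skip dominated points
            hv += (ref[0] - x) * (y_prev - y)
            y_prev = y
    return hv

def ehvi_mc(mean, std, F, ref, n_samples=5000, rng=None):
    """Monte Carlo estimate of the expected hypervolume improvement:
    sample the Gaussian predictor per objective and average the
    hypervolume gain over the current nondominated set F."""
    rng = np.random.default_rng(rng)
    hv_now = hypervolume_2d(F, ref)
    gains = []
    for _ in range(n_samples):
        f_new = rng.normal(mean, std)
        gains.append(max(hypervolume_2d(F + [tuple(f_new)], ref) - hv_now, 0.0))
    return float(np.mean(gains))
```

A candidate predicted deep inside the dominated region yields an EHVI near zero, while one predicted in an unexplored part of the front yields a large value, which is exactly the balance between accurate and wide search that the EHVI updating criterion provides.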