Computational Intelligence and Neuroscience

Computational Intelligence and Neuroscience / 2020 / Article

Research Article | Open Access

Volume 2020 |Article ID 8835852 | https://doi.org/10.1155/2020/8835852

Takumi Nakane, Xuequan Lu, Chao Zhang, "A Search History-Driven Offspring Generation Method for the Real-Coded Genetic Algorithm", Computational Intelligence and Neuroscience, vol. 2020, Article ID 8835852, 20 pages, 2020. https://doi.org/10.1155/2020/8835852

A Search History-Driven Offspring Generation Method for the Real-Coded Genetic Algorithm

Academic Editor: José Alfredo Hernández-Pérez
Received13 May 2020
Revised01 Sep 2020
Accepted09 Sep 2020
Published27 Sep 2020

Abstract

In evolutionary algorithms, genetic operators iteratively generate new offspring which constitute a potentially valuable set of search history. To boost the performance of offspring generation in the real-coded genetic algorithm (RCGA), in this paper, we propose to exploit the search history cached so far in an online style during the iteration. Specifically, survivor individuals over the past few generations are collected and stored in the archive to form the search history. We introduce a simple yet effective crossover model driven by the search history (abbreviated as SHX). In particular, the search history is clustered, and each cluster is assigned a score for SHX. In essence, the proposed SHX is a data-driven method which exploits the search history to perform offspring selection after the offspring generation. Since no additional fitness evaluations are needed, SHX is favorable for the tasks with limited budget or expensive fitness evaluations. We experimentally verify the effectiveness of SHX over 15 benchmark functions. Quantitative results show that our SHX can significantly enhance the performance of RCGA, in terms of both accuracy and convergence speed. Also, the induced additional runtime is negligible compared to the total processing time.

1. Introduction

Evolutionary algorithms (EAs) have been shown to be generic and effective to search for global optima in the complex search space theoretically [13] and practically [46]. The exploration process of EAs imitates the natural selection process, which is realized by conducting the offspring generation and survivor individual selection alternately and iteratively. The population quality is gradually improved throughout the exploration process, which can be viewed as a stochastic population-based generation-and-test process. Because of the offspring generation, a large number of candidate solutions (i.e., individuals) are sampled, accompanied by corresponding fitness values, genetic information, and genealogy information. Such accumulated search data constitute search history which can be very informative and valuable for boosting the overall performance. For instance, exploiting search history can be useful for improving the search procedure under a limited budget of fitness evaluations (FEs). That is, no additional FEs are allowed for improving the search performance. Also, the computational cost of a single FE can be high when the fitness functions are complicated. To enable a better solution for the population without increasing the number of FEs, the way of exploiting the search history truly matters. Nevertheless, search history has been sparsely exploited and studied in existing methods.

Real-coded genetic algorithm (RCGA) has been widely studied in the past decades [711], and the main efforts for improving the performance of RCGA have been focused on the development of the crossover techniques [12]. Because the crossover operator is to generate new offspring from the current population, the quality of the new solutions straightforwardly affects the evolution direction and convergence speed. Given different mechanisms, crossover methods can differ from (1) parent selection, (2) offspring generation, and (3) offspring selection. Both parent and offspring can be more than two, depending on the design. The abovementioned three aspects associate the exploration ability with exploitation ability, and the degree and balance between both abilities affect the performance largely [13]. Although the self-adaptive feature of RCGA [14] can adjust the relationship to a certain extent, the “best” degrees and balance between exploration and exploitation for achieving a satisfactory solution can differ greatly with respect to different problem settings and can be hardly achieved with the adaptive feature.

With a large amount of search history data up to the current generation in hand, we attempt to introduce a crossover method that effectively exploits the history data in this paper. At first, an archive is defined to collect the survivor individuals over generations as the search history. Then, the stored individuals are clustered by k-means [15], and each cluster is assigned a score depending on the number of belonging individuals. At last, offspring is generated and selected according to the scores. We introduce two different schemes to update the archive. The proposed crossover operator, named search history-driven crossover (SHX), generates offspring by considering the cluster scores. Since SHX enables an offspring selection mechanism, any existing parent selection and offspring generation mechanisms can be easily integrated with it. To our knowledge, this is the first work to design the crossover model by effectively exploiting search history. We present a set of experiments to systemically evaluate the effectiveness of the proposed method using 15 benchmark functions. Three conventional crossover operators are employed, and the results with/without SHX are compared. Apart from the above, two archive update methods are also analyzed.

The main technical contributions of this paper are threefold. First, we propose a novel crossover model by effectively exploiting the search history. Second, we introduce the offspring selection based on the clusters calculated from the search history. Third, we introduce two schemes to update the survivor archive. A preliminary version of this paper appears in GECCO2020 [16].

Crossover is one of the principal operators for generating offspring and deeply relates to the performance of the real-coded genetic algorithm (RCGA). Blend- crossover (BLX-) [17] proposed by Eshelman and Schaffer is one of the most popular operators. Offspring genes are independently and uniformly sampled within an interval between a gene pair of parents. The parameter corresponds to the extension of the sampling interval, which plays a key role in maintaining the diversity of offspring. Eshelman et al. proposed Blend-- crossover (BLX--) [18] which involves two extension parameters. Deb and Agrawal introduced simulated binary crossover (SBX) [19] which simulates the single-point crossover in binary-coded GA for continuous search space. The interval used in SBX is determined by a polynomial probability distribution depending on the distribution index . indirectly adjusts the tendency of offspring generation. The above crossover operators have a common feature that the offspring genes are extracted according to a certain probability distribution from the predefined interval on the parent genes. This feature enables better results than using crossover operators for binary coding in the continuous search space. On the other hand, some crossover operators set more than two individuals as parents, which aim to generate offspring with well-preserved population statistics. In the case of unimodal normal distribution crossover (UNDX) [20], the generation of offspring follows a unimodal normal distribution defined on the line connecting two of the three parents. For simplex crossover (SPX) [21], individuals are taken as parents in the -dimensional search space. SPX uniformly generates offspring within -dimensional simplex constructed by parent individuals and expanded by a parameter .

Search history has also been exploited in some research, but to the best of our knowledge, none of them is for the purpose of improving the crossover model. Since online real systems often provide uncertain evaluation values which lead to unreliable convergence of GA, Sano and Kita proposed memory-based fitness estimation GA (MFEGA) [22]. MFEGA estimates the fitness from neighboring individuals stored in the search history. Leveraging search history allows estimation without requiring additional evaluation. Amor and Rettinger proposed GA using self-organizing maps (GASOM) [23]. SOM (self-organizing maps) can provide a visualized search history, which makes the regions explored intuitive for users. Moreover, individual novelty is introduced by the activation frequency in the search history table and utilized by the reseeding operator to preserve the exploration power. Yuen and Chow presented the continuous nonrevisiting GA (cNrGA) [24]. A binary partitioning tree called a density tree stores all evaluated individuals and divides the search space into nonoverlapped partitions by means of distributions. These subregions are used to check whether a new individual needs to be evaluated or not.

3. Overview

Principles of designing good crossover operators for RCGA are discussed in [25]. Two among them are especially important: (1) the crossover operator should preserve the statistics of the population; (2) the crossover operator should generate offspring with as much diversity as possible under the constraint of (1). By following these suggestions, the key idea of SHX is to cluster the search history and select population members from excessively generated candidate solutions by preserving the statistics represented by the clusters. Figure 1 illustrates the overview of our SHX. The proposed method is performed under the framework of RCGA which mainly involves survivor selection and crossover. Mutation is optional, but we exclude it to clearly investigate the effectiveness of SHX in this work.

The proposed method is described in Algorithm 1. Population is denoted by which comprises individuals, and the population at the -th generation is denoted as . Similarly, parents for SHX, excessively generated candidate solutions during SHX, offspring after SHX, and survivors for the next generation are represented by , , , and , respectively. The size of each set is denoted using with a subscript of the set name (e.g., the size of parents is denoted by ). In addition to , our method manages an archive which preserves survivors throughout the generation alternation. and are initialized by randomly placing individuals in the search space. The archive update process is conducted after the survivor selection. Survivor individuals of the current generation are aggregated into both and of the next generation. SHX can be further divided into parent selection, offspring generation, and offspring selection. Different from conventional RCGA, individuals generated from are regarded as offspring candidates . The main purpose of SHX is to narrow down to individuals denoted by according to the statistics provided by . is calculated from the clustering result of the archive and immediately impacts the offspring selection.

Input: population , population size ,
Archive , archive size ,
Score ,
Generation ID
Output: estimated global optimum
//initialization
(1) individuals in are randomly initialized
//fitness evaluation
(2)eval
//archive update
(3) archiveUpdate ;
(4)While termination criterion is not satisfied do
(5)
//SHX
(6) parentSelection , ;
(7)
(8) offspringGeneration ;
(9) offspringSelection , ;
//fitness evaluation
(10)eval
//survivor selection
(11) survivorSelection ;
(12)
//archive update
(13) archiveUpdate ;
(14)end
(15)Return .

SHX can adopt any existing crossover operators (e.g., BLX- [17] and SPX [21]) for the offspringGeneration function (Algorithm 1, line 8) to generate from . For the parentSelection function (Algorithm 1, line 6) and the survivorSelection function (Algorithm 1, line 11), the just generation gap (JGG) [26, 27] is employed in this work. That is, the parentSelection function randomly extracts individuals from as , and the survivorSelection function selects top- individuals in as according to the fitness value. To show the performance increase brought by SHX, we choose the widely applied BLX-, SPX, and UNDX for the offspring generation and compare the results in Section 6. We explain archiveUpdate (Algorithm 1, lines 3 and 13) and offspringSelection (Algorithm 1, line 9) in detail in Section 4 and Section 5, respectively.

4. Survivor Archive

Since the genetic operations are run alternately and iteratively, collecting and analyzing the history data may be beneficial for boosting performance. Given that SHX is to maintain the historical statistics while producing offspring for the next generation, the archive is designed to store over few past generations and extracts statistics . The calculation of is based on the k-means, which is an off-the-shelf nonsupervised clustering method. The pseudocode of k-means is shown in Algorithm 2. In particular, k-means is employed to cluster the individuals in based on their position in the search space, and is a normalized frequency histogram to show the proportion of each cluster size to . A higher score indicates that the corresponding cluster is more likely to be a promising search region. The statistics can then be maintained by probabilistically assigning newly generated candidates to each cluster according to .

  Input: number of clusters ,
  Data points
  Output: cluster centroids
(1) cluster centroids are randomly initialized
(2)While termination criterion is not satisfied do
(3)For do
(4)assign the nearest cluster centroid ID to
(5)end
(6)For do
(7)update by calculating the mean of data points in the -th cluster
(8)end
(9)end
(10)Return

To keep the computational cost brought by k-means within an acceptable and constant range, the archive size is fixed to . That is, a part of individuals in must be replaced with new survivors during the archive update to incorporate new information. Two types of update methods are considered in this work: (1) randomly selecting individuals in and replacing them with (denoted by ); (2) replacing a part of with in the order in which the individuals of arrived (denoted by ). The performance comparison between these two approaches is discussed in Section 6.

The update of and calculation of are executed in the function archiveUpdate (Algorithm 1, lines 3 and 13) which is summarized in Algorithm 3. At the replacement step (Algorithm 3, line 4), individuals are discarded from based on or approaches, and new are stored to . Initialization is executed when equals 0. The k-meansFit function (Algorithm 3, line 7) updates the centroids of the clusters according to the updated and assigns updated cluster labels to each individual in . After that, the normalized frequency histogram for each cluster is calculated by the hist function (Algorithm 3, line 9) for further usage in offspring selection (Algorithm 4). Note that the initial centroids of the clusters in the current generation are inherited from the previous generation, as most individuals in are the same as .

Function: archive Update .
Input: archive .
 Survivors .
 Size of the archive .
Output: updated archive ,
 Score .
(1) If then.
 //initialization.
(2) individuals in are randomly initialized.
(3) Else.
//archive update.
(4) randomly or sequentially (first in first out) select individuals.
 from to form .
(5).
(6)end.
//score update.
(7)k-meansFit .
(8); Calculating frequency histogram.
(9)Return .
Function: offspringSelection .
Input: candidates ,
 Score
Output: offspring
//labeling based on clustering results estimated in Algorithm 3, line 7
(1)k-meansPredict ;
//roulette construction
(2)For do
(3)If then
(4)
(5)Else
(6)
(7)end
(8)end
//offspring selection
(9)Repeat
(10)select one cluster ID by the roulette selection based on
(11)randomly select one candidate ,
(12)
//exclude selected candidate from clusters
(13)
(14)If then.
(15):
(16)end.
(17) times are run.
(18)Return .

5. Search History-Driven Crossover (SHX)

SHX randomly selects parents by following the strategy of existing crossover operators (e.g., two parents in the case of BLX and parents in the case of SPX) and excessively generates candidate offspring for further offspring selection. because must ensure a sufficient number of individuals that can be assigned to each cluster in . Here, generating individuals excessively can also be considered as a mechanism of diversity preservation. It is worth pointing out that the offspring selection is a different procedure from the survivor selection. Offspring selection belongs to the crossover model and is conducted before fitness evaluation. Survivor selection is conducted after fitness evaluation. Offspring selection narrows down to based on roulette wheel selection [28]. Each proportion of the wheel relates to each possible selection (i.e., clusters), and is used to associate a probability of selection with each cluster in . This can also be viewed as a procedure that SHX preferentially selects individuals in more “promising” regions. This bias selection can encourage the evolution of the population and accelerate the whole convergence. Besides, the statistics of the population (e.g., cluster size) can be maintained between two consecutive generations because the new generation is sampled based on the statistics of the history. Also, the diversity of can be preserved because each newly generated individual from has a probability to be assigned to.

The algorithm of offspring selection is shown in Algorithm 4. Input is excessively generated by existing crossover operators (Algorithm 1, line 8). Each candidate is labeled by the k-means Predict function (Algorithm 4, line 1) based on the current clusters estimated from . Then, the roulette is constructed based on . The roulette selection is called times, yielding selected offspring. Each time of roulette selection produces a cluster ID, and one candidate in that belongs to the corresponding cluster is randomly selected and assigned to . To avoid duplicate selection, a selected candidate will be excluded from . If no more candidates correspond to a certain cluster (this is rarely the case by assuming ), the roulette is reconstructed by eliminating the proportion of the corresponding cluster. Finally, is passed to the survivor selection process which determines using JGG.

6. Experimental Results

The performance of SHX is investigated over 15 benchmark functions, with each function in two different dimension settings. We comprehensively compare the performance of RCGA with/without SHX, and SHX is run with different settings of archive update methods (/) and offspring generation methods (BLX [17]/SPX [21]/UNDX [20]).

6.1. Experimental Setup

Benchmark functions are a useful tool to verify the effectiveness of a method, and it is general to use several functions with different properties, such as in [29, 30]. We selected 15 benchmark functions with different characteristics from the literature [3133] for evaluation. Detailed information of each function is summarized in Table 1. Initialization of the population and the archive is conducted within the range provided by the 4th column in Table 1. It is worth mentioning that the searching space (i.e., range of parameters) during the generation alternation is not constrained. Each function is labeled according to different combination of characteristics (U + S, U + NS, M + S, and M + NS). By involving various characteristics of functions, we can analyze the proposed method more comprehensively and objectively. Furthermore, as all selected functions are adjustable in the setting of dimension, we adopt two different numbers of dimensions ( and ) to control the difficulty degree of the search problem.


IDNameDefinitionRangeLabel

Schwefel 2.21U, S
SphereU, S
Sum squaresU, S
RosenbrockU, NS
Schwefel 1.2U, NS
Schwefel 2.22U, NS
ZakharovU, NS
EasomM, S
RastriginM, S
Schwefel 2.26M, S
Weierstrass1M, S
Ackley 1M, NS
GriewankM, NS
SalomonM, NS
Xin-She Yang 2M, NS

The last column (Label) represents the characteristics that the functions hold: unimodal (U), multimodal (M), separable (S), and nonseparable (NS).

The setting of hyperparameters of the proposed method is listed in Table 2. The proposed method includes hyperparameters of not only RCGA (number of generations, , and ) but also SHX (, , and ). Basically, the search problem defined by each function becomes more hard as the number of dimensions increases, which requires a lot of evaluations. For adaptive adjustment, the number of generations, , and are set proportional to the number of dimensions. The constant values of each parameter are empirically determined because the purpose of the experiments is to validate the effectiveness of having SHX, rather than achieving the best solution for each function.


ParameterValue

Number of generations
Population size,
Number of offspring,
Number of candidates,
Archive size,
Number of clusters,

is the number of dimensions of test functions. All the parameters are fixed throughout the experiments.

All experiments are executed 100 times with different random seeds. In each experiment, the generation alternation completely executed the number of generation times defined in Table 2. For a fair comparison, iterations under the same random seed start using the same population. The runtime and fitness are recorded with Python implementation (without either parallelization or optimization) on a i7-7700 CPU at 3.60 GHz, 12.0 GB RAM desktop computer.

6.2. Comparison in the Final-Generation-Elite

The results of the absolute error between the optimal value and the final-generation-elite fitness with respect to all combinations of functions, dimension, and methods are displayed in Table 3. Table 3 shows the minimum, maximum, median, mean, standard deviation (SD), and value of the Mann–Whitney U test by each combination. The Mann–Whitney U test evaluates the significance of SHX results against results without SHX under the significance level . Before showing the superiority by involving SHX, we first exclude a few results that all the methods are trapped by local optima or cannot reach the global optima. (1) Easom Function. This function has several local minima. It is unimodal, and the global minimum only has a small area corresponding to the search space, which can be hardly arrived at. (2) Schwefel 2.26. Since the setup of this experiment does not restrict the range of parameters during search, an extremely small fitness value (even smaller than the global optimum) can be achieved with this function, which is not suitable for comparisons.


MinimumMaximum
Mean
MinimumMaximumMinimumMaximum
MedianMedianMeanMedianMean
SDSD valueSD value

BLXSH-BLX_randomSH-BLX_sequential
53.37E + 001.58E + 012.07E + 001.51E + 012.45E + 001.36E + 01
8.82E + 008.92E + 007.27E + 007.26E + 007.13E + 007.35E + 00
2.87E + 002.40E + 002.76E − 052.29E + 006.54E − 05
101.03E + 012.58E + 018.60E + 002.28E + 019.20E + 002.68E + 01
1.80E + 011.84E + 011.72E + 011.68E + 011.66E + 011.69E + 01
3.20E + 002.90E + 001.11E − 033.36E + 008.49E − 04
51.07E − 013.40E + 001.19E − 021.69E + 007.60E − 022.14E + 00
7.75E − 019.28E − 015.67E − 015.57E − 014.47E − 015.63E − 01
6.39E − 013.78E − 013.07E − 064.11E − 011.67E − 06
101.02E + 001.59E + 019.51E − 019.47E + 001.29E + 008.85E + 00
5.02E + 005.33E + 004.14E + 004.24E + 003.79E + 004.09E + 00
2.18E + 001.68E + 008.78E − 051.53E + 003.00E − 06
51.61E − 019.02E + 001.51E − 019.28E + 001.58E − 015.74E + 00
2.08E + 002.62E + 001.24E + 001.60E + 001.35E + 001.69E + 00
1.66E + 001.33E + 006.50E − 081.26E + 001.17E − 06
104.57E + 005.66E + 012.41E + 004.86E + 014.04E + 004.56E + 01
2.47E + 012.60E + 011.84E + 012.11E + 012.02E + 012.06E + 01
1.02E + 019.57E + 001.33E − 048.79E + 001.90E − 04
56.96E + 011.94E + 041.21E + 028.83E + 037.78E + 018.08E + 03
1.94E + 032.80E + 038.98E + 021.74E + 038.68E + 021.40E + 03
2.80E + 032.00E + 037.07E − 051.45E + 039.29E − 07
106.89E + 033.50E + 054.44E + 032.12E + 054.89E + 031.56E + 05
6.11E + 047.19E + 044.69E + 045.35E + 043.38E + 044.21E + 04
5.19E + 043.91E + 041.73E − 033.01E + 042.16E − 07
52.45E + 015.58E + 021.81E + 016.65E + 027.61E + 004.54E + 02
1.78E + 021.93E + 021.30E + 021.63E + 021.29E + 021.45E + 02
1.15E + 021.18E + 029.22E − 038.49E + 011.15E − 03
104.00E + 022.80E + 033.94E + 022.29E + 032.98E + 022.32E + 03
1.40E + 031.37E + 031.12E + 031.18E + 031.16E + 031.22E + 03
4.28E + 023.99E + 026.21E − 044.39E + 026.82E − 03
54.00E + 002.12E + 022.06E + 008.97E + 012.70E + 008.71E + 01
2.52E + 013.36E + 011.49E + 012.05E + 011.56E + 011.87E + 01
3.25E + 011.62E + 014.48E − 061.32E + 012.58E − 07
104.03E + 011.17E + 061.63E + 011.57E + 053.60E + 018.26E + 05
3.85E + 033.75E + 049.76E + 029.57E + 031.09E + 031.31E + 04
1.33E + 052.77E + 045.04E − 058.24E + 043.11E − 04
52.84E − 011.22E + 014.40E − 011.09E + 013.23E − 011.02E + 01
3.33E + 003.87E + 002.48E + 002.93E + 002.16E + 002.76E + 00
2.52E + 001.97E + 003.21E − 031.99E + 005.01E − 04
107.23E + 004.17E + 017.08E + 004.04E + 015.92E + 004.00E + 01
2.39E + 012.32E + 011.98E + 012.04E + 011.95E + 011.94E + 01
7.22E + 006.47E + 001.49E − 036.75E + 007.14E − 05
5
108.70E − 011.00E + 004.90E − 011.00E + 009.16E − 011.00E + 00
1.00E + 009.95E − 011.00E + 009.86E − 011.00E + 009.93E − 01
1.87E − 026.23E − 023.65E − 061.41E − 027.74E − 07
54.50E + 002.22E + 013.68E + 002.01E + 014.74E + 002.13E + 01
1.28E + 011.34E + 011.15E + 011.15E + 011.13E + 011.14E + 01
3.52E + 003.40E + 002.35E − 043.46E + 005.41E − 05
102.44E + 015.91E + 012.61E + 015.87E + 012.61E + 015.60E + 01
4.44E + 014.40E + 014.31E + 014.22E + 014.22E + 014.24E + 01
6.68E + 006.01E + 001.93E − 027.00E + 006.44E − 02
56.18E + 019.74E + 02
5.93E + 025.75E + 02
1.78E + 022.82E − 39
105.69E + 022.15E + 035.93E + 022.23E + 037.63E + 022.24E + 03
1.77E + 031.71E + 031.78E + 031.76E + 031.85E + 031.79E + 03
2.62E + 022.45E + 028.94E − 012.35E + 029.91E − 01
54.10E + 014.31E + 014.06E + 014.22E + 014.08E + 014.25E + 01
4.18E + 014.18E + 014.15E + 014.15E + 014.14E + 014.14E + 01
4.15E − 013.47E − 019.67E − 083.11E − 012.23E − 10
101.83E + 021.87E + 021.83E + 021.86E + 021.83E + 021.86E + 02
1.85E + 021.85E + 021.85E + 021.85E + 021.85E + 021.85E + 02
6.42E − 016.27E − 012.70E − 066.55E − 012.38E − 08
53.70E + 009.50E + 002.80E + 008.91E + 002.88E + 009.80E + 00
6.69E + 006.82E + 005.68E + 005.66E + 005.84E + 005.87E + 00
1.46E + 001.30E + 001.19E − 071.23E + 002.67E − 06
105.99E + 001.30E + 016.06E + 001.13E + 015.01E + 001.24E + 01
1.00E + 019.84E + 008.95E + 009.13E + 009.08E + 009.12E + 00
1.23E + 001.12E + 004.53E − 061.39E + 007.36E − 05
52.97E − 019.57E − 011.51E − 018.28E − 011.60E − 017.97E − 01
5.41E − 015.59E − 015.08E − 014.92E − 014.77E − 014.85E − 01
1.27E − 011.44E − 011.16E − 031.36E − 019.77E − 05
108.82E − 011.28E + 007.98E − 011.24E + 006.04E − 011.23E + 00
1.11E + 001.11E + 001.08E + 001.08E + 001.07E + 001.06E + 00
7.04E − 027.46E − 021.14E − 039.55E − 021.91E − 05
53.09E − 012.52E + 004.48E − 012.03E + 003.56E − 012.63E + 00
1.34E + 001.37E + 001.20E + 001.18E + 001.20E + 001.19E + 00
3.93E − 013.52E − 013.88E − 043.59E − 012.14E − 04
101.73E + 004.20E + 001.80E + 004.22E + 001.58E + 003.68E + 00
2.96E + 002.97E + 002.72E + 002.71E + 002.71E + 002.66E + 00
4.74E − 014.77E − 016.94E − 054.66E − 015.25E − 06
55.04E − 021.55E − 014.53E − 021.21E − 014.44E − 021.08E − 01
7.84E − 028.18E − 027.22E − 027.21E − 026.74E − 026.88E − 02
1.99E − 021.59E − 022.67E − 041.44E − 025.03E − 07
101.86E − 031.27E − 021.17E − 031.07E − 021.26E − 031.12E − 02
5.75E − 035.66E − 035.28E − 035.43E − 034.93E − 035.12E − 03
2.04E − 032.03E − 032.58E − 011.95E − 032.84E − 02

SPXSH-SPX_randomSH-SPX_sequential
52.73E − 012.23E + 001.71E − 011.48E + 001.83E − 012.99E + 00
9.30E − 019.76E − 016.46E − 017.10E − 015.77E − 016.90E − 01
4.08E − 012.74E − 013.92E − 074.21E − 015.93E − 09
102.11E − 018.83E − 011.76E − 019.73E − 011.64E − 018.63E − 01
4.73E − 015.03E − 013.55E − 013.78E − 012.81E − 013.07E − 01
1.29E − 011.26E − 014.24E − 131.22E − 013.26E − 21
53.00E − 036.10E − 021.09E − 036.74E − 023.67E − 043.57E − 02
1.54E − 021.97E − 027.82E − 031.11E − 024.51E − 037.39E − 03
1.33E − 029.97E − 032.63E − 096.87E − 035.70E − 17
109.45E − 041.11E − 025.33E − 047.35E − 031.68E − 045.32E − 03
3.70E − 034.22E − 031.64E − 031.89E − 031.08E − 031.30E − 03
2.06E − 031.07E − 031.29E − 208.56E − 042.16E − 27
54.02E − 034.15E − 012.48E − 033.84E − 012.41E − 034.92E − 01
4.10E − 025.50E − 022.21E − 023.68E − 021.86E − 024.22E − 02
5.67E − 025.17E − 026.57E − 067.39E − 027.37E − 07
105.20E − 038.20E − 022.42E − 031.88E − 011.66E − 031.36E − 01
2.07E − 022.31E − 021.31E − 021.93E − 026.25E − 031.38E − 02
1.23E − 022.19E − 021.04E − 052.30E − 024.27E − 17
55.72E + 001.10E + 032.49E + 006.26E + 022.50E + 002.62E + 03
2.12E + 015.79E + 011.80E + 016.39E + 011.27E + 017.01E + 01
1.29E + 021.18E + 021.74E − 012.77E + 021.89E − 05
101.04E + 018.49E + 018.69E + 002.67E + 027.58E + 003.46E + 02
1.64E + 011.88E + 011.19E + 012.13E + 011.06E + 011.95E + 01
9.64E + 003.56E + 016.19E − 113.77E + 013.70E − 14
53.63E − 013.58E + 011.59E − 011.27E + 027.85E − 023.06E + 01
2.12E + 003.44E + 001.92E + 005.01E + 001.58E + 004.20E + 00
4.57E + 001.41E + 019.05E − 026.01E + 002.58E − 02
102.80E − 016.40E + 007.15E − 026.69E + 004.67E − 021.13E + 01
9.01E − 011.20E + 005.28E − 018.65E − 015.03E − 011.19E + 00
1.04E + 009.14E − 016.06E − 071.79E + 009.06E − 06
51.46E + 002.49E + 015.51E − 011.84E + 011.93E − 011.90E + 01
5.66E + 006.28E + 003.13E + 003.72E + 002.71E + 004.45E + 00
3.41E + 002.47E + 001.05E − 154.19E + 002.78E − 10
102.16E + 001.11E + 011.42E + 003.24E + 019.44E − 018.43E + 00
4.13E + 004.37E + 002.65E + 003.26E + 002.03E + 002.42E + 00
1.35E + 003.25E + 001.25E − 171.32E + 001.58E − 21
55.87E − 031.12E + 004.99E − 031.34E + 003.09E − 033.30E + 00
8.86E − 021.65E − 015.66E − 021.38E − 018.61E − 022.95E − 01
1.96E − 012.15E − 013.69E − 035.54E − 011.90E − 01
101.36E − 021.19E + 004.93E − 031.22E + 004.72E − 031.86E + 00
6.32E − 021.24E − 012.42E − 027.25E − 023.80E − 021.35E − 01
1.66E − 011.55E − 017.46E − 092.49E − 012.72E − 03
5
103.58E − 031.92E − 011.20E − 039.98E − 019.28E − 049.98E − 01
1.73E − 022.54E − 026.69E − 035.27E − 028.29E − 036.27E − 02
2.82E − 021.88E − 011.08E − 101.96E − 019.22E − 05
52.42E + 002.32E + 019.43E − 012.03E + 016.68E − 012.03E + 01
1.12E + 011.18E + 011.02E + 011.02E + 018.62E + 008.86E + 00
3.95E + 003.89E + 004.58E − 033.45E + 008.84E − 09
101.44E + 014.54E + 013.53E + 003.65E + 018.75E − 012.79E + 01
3.47E + 013.37E + 011.52E + 011.71E + 018.14E + 009.53E + 00
6.66E + 007.99E + 006.39E − 275.97E + 001.87E − 33
5
109.16E + 002.86E + 03
1.88E + 031.83E + 03
4.94E + 022.82E − 39
54.04E + 014.13E + 014.03E + 014.15E + 014.02E + 014.11E + 01
4.07E + 014.07E + 014.06E + 014.06E + 014.05E + 014.05E + 01
1.90E − 011.60E − 014.35E − 101.65E − 018.70E − 15
101.81E + 021.81E + 021.80E + 021.81E + 021.80E + 021.81E + 02
1.81E + 021.81E + 021.81E + 021.81E + 021.81E + 021.81E + 02
1.29E − 011.08E − 014.36E − 181.28E − 011.14E − 25
54.05E − 013.28E + 002.13E − 013.14E + 001.97E − 012.65E + 00
1.87E + 001.84E + 001.45E + 001.49E + 001.12E + 001.23E + 00
5.88E − 016.29E − 014.46E − 055.57E − 011.65E − 11
102.67E − 011.39E + 001.57E − 012.07E + 008.07E − 025.88E − 01
5.83E − 016.16E − 013.50E − 013.96E − 012.36E − 012.45E − 01
2.25E − 012.25E − 012.84E − 169.53E − 024.47E − 30
51.04E − 016.17E − 014.94E − 025.92E − 019.47E − 025.98E − 01
3.49E − 013.52E − 013.12E − 013.20E − 013.31E − 013.27E − 01
1.11E − 011.12E − 012.86E − 021.09E − 017.84E − 02
106.96E − 024.84E − 011.43E − 023.90E − 011.22E − 022.88E − 01
1.93E − 012.08E − 017.37E − 029.07E − 025.03E − 026.06E − 02
1.00E − 016.20E − 026.30E − 204.44E − 026.33E − 28
52.00E − 017.73E − 011.21E − 017.01E − 012.00E − 017.10E − 01
4.13E − 014.24E − 014.00E − 013.98E − 014.00E − 013.86E − 01
1.39E − 011.30E − 011.16E − 011.22E − 013.43E − 02
102.00E − 017.03E − 012.00E − 015.14E − 012.00E − 016.16E − 01
3.65E − 013.66E − 013.10E − 013.25E − 013.08E − 013.35E − 01
8.08E − 026.94E − 022.20E − 049.77E − 021.57E − 03
54.37E − 021.27E − 014.24E − 021.32E − 014.28E − 028.59E − 02
7.25E − 027.30E − 025.54E − 025.82E − 025.12E − 025.45E − 02
1.63E − 021.36E − 022.32E − 129.05E − 035.28E − 18
101.08E − 039.64E − 036.29E − 045.55E − 036.01E − 043.29E − 03
4.23E − 034.38E − 031.22E − 031.52E − 038.99E − 041.01E − 03
1.89E − 038.58E − 042.60E − 274.11E − 042.82E − 33

UNDXSH-UNDX_randomSH-UNDX_sequential
51.20E + 006.82E + 008.26E − 015.16E + 007.35E − 015.54E + 00
2.77E + 002.83E + 002.18E + 002.30E + 001.96E + 002.13E + 00
1.09E + 008.41E − 011.60E − 048.74E − 013.29E − 07
102.04E + 006.43E + 001.36E + 005.53E + 001.71E + 005.33E + 00
4.25E + 004.25E + 003.65E + 003.66E + 003.54E + 003.50E + 00
1.01E + 007.54E − 019.57E − 067.34E − 012.52E − 08
51.30E − 026.07E − 011.24E − 024.28E − 014.48E − 036.26E − 01
1.30E − 011.60E − 016.85E − 028.27E − 027.20E − 028.83E − 02
1.16E − 016.00E − 027.89E − 098.50E − 021.94E − 08
109.42E − 021.26E + 007.07E − 026.85E − 014.32E − 026.42E − 01
4.10E − 014.74E − 012.98E − 013.03E − 012.49E − 012.68E − 01
2.39E − 011.43E − 013.09E − 081.22E − 012.28E − 12
54.93E − 021.77E + 001.51E − 021.81E + 002.90E − 028.85E − 01
3.45E − 014.71E − 012.17E − 012.96E − 012.03E − 012.71E − 01
3.48E − 012.77E − 012.17E − 062.04E − 016.85E − 07
103.27E − 015.08E + 002.47E − 016.39E + 004.47E − 014.88E + 00
2.30E + 002.30E + 001.47E + 001.72E + 001.49E + 001.69E + 00
9.70E − 019.66E − 016.77E − 078.96E − 017.28E − 07
54.89E + 001.54E + 036.89E + 001.21E + 036.88E + 001.17E + 03
1.51E + 022.41E + 028.81E + 011.93E + 021.00E + 021.60E + 02
2.70E + 022.38E + 022.12E − 021.80E + 028.40E − 03
101.04E + 023.64E + 038.64E + 011.54E + 039.04E + 013.07E + 03
6.38E + 027.67E + 024.29E + 024.78E + 023.65E + 024.71E + 02
5.61E + 022.72E + 025.55E − 064.01E + 026.08E − 08
56.70E − 018.64E + 012.11E + 002.42E + 022.50E + 006.09E + 01
2.07E + 012.40E + 011.58E + 012.33E + 011.49E + 011.80E + 01
1.64E + 012.85E + 014.19E − 021.30E + 013.00E − 03
102.25E + 012.33E + 022.88E + 012.05E + 021.91E + 012.01E + 02
1.02E + 021.04E + 027.97E + 018.55E + 018.47E + 018.99E + 01
4.18E + 013.30E + 012.52E − 043.96E + 014.88E − 03
52.16E + 004.98E + 013.22E + 004.47E + 011.94E + 005.04E + 01
1.44E + 011.62E + 019.27E + 001.16E + 011.22E + 011.28E + 01
9.25E + 006.70E + 003.06E − 057.34E + 002.07E − 03
101.54E + 013.08E + 031.46E + 015.00E + 021.39E + 011.19E + 03
5.27E + 011.50E + 023.49E + 015.30E + 013.08E + 015.44E + 01
3.87E + 027.00E + 011.12E − 051.22E + 024.08E − 09
51.00E − 015.44E + 002.71E − 023.31E + 002.87E − 025.63E + 00
8.04E − 011.16E + 005.35E − 018.39E − 015.39E − 019.49E − 01
1.02E + 007.72E − 013.33E − 039.58E − 011.35E − 02
101.84E + 001.60E + 012.16E + 001.40E + 018.81E − 011.48E + 01
6.18E + 006.60E + 005.08E + 005.83E + 005.21E + 005.65E + 00
2.97E + 002.90E + 001.62E − 023.05E + 005.28E − 03
5
108.00E − 011.00E + 007.95E − 011.00E + 004.05E − 011.00E + 00
1.00E + 009.92E − 019.96E − 019.78E − 019.99E − 019.72E − 01
2.60E − 024.13E − 026.41E − 088.29E − 021.42E − 04
53.55E + 001.95E + 013.51E + 001.96E + 011.81E + 001.89E + 01
1.06E + 011.10E + 011.10E + 011.09E + 011.00E + 011.01E + 01
3.58E + 003.30E + 004.28E − 013.50E + 004.51E − 02
101.74E + 014.84E + 011.95E + 014.38E + 011.70E + 014.69E + 01
3.64E + 013.64E + 013.45E + 013.40E + 013.45E + 013.43E + 01
6.45E + 005.52E + 001.98E − 036.07E + 007.45E − 03
56.48E + 011.16E + 03
7.66E + 027.24E + 02
2.55E + 022.82E − 39
105.25E + 022.48E + 03
1.70E + 031.60E + 03
4.68E + 022.82E − 39
54.06E + 014.22E + 014.06E + 014.19E + 014.05E + 014.19E + 01
4.12E + 014.13E + 014.11E + 014.11E + 014.11E + 014.11E + 01
3.29E − 012.95E − 012.24E − 042.75E − 014.55E − 04
101.82E + 021.84E + 021.82E + 021.84E + 021.82E + 021.84E + 02
1.83E + 021.83E + 021.83E + 021.83E + 021.83E + 021.83E + 02
4.72E − 014.03E − 015.09E − 074.09E − 012.62E − 08
52.04E + 005.83E + 001.69E + 004.95E + 001.27E + 005.92E + 00
3.89E + 003.81E + 003.23E + 003.24E + 003.30E + 003.26E + 00
7.79E − 016.47E − 011.07E − 078.19E − 011.42E − 06
103.04E + 006.14E + 002.47E + 005.22E + 001.85E + 005.06E + 00
4.32E + 004.38E + 003.87E + 003.85E + 003.81E + 003.81E + 00
6.26E − 015.59E − 012.45E − 085.87E − 014.85E − 09
51.54E − 016.60E − 011.07E − 015.25E − 011.55E − 016.38E − 01
3.74E − 013.81E − 013.73E − 013.64E − 013.39E − 013.55E − 01
1.06E − 019.63E − 022.13E − 011.01E − 013.81E − 02
104.11E − 019.76E − 014.01E − 019.11E − 014.77E − 018.96E − 01
8.37E − 018.09E − 017.95E − 017.73E − 017.59E − 017.47E − 01
1.06E − 019.53E − 029.46E − 041.06E − 012.73E − 06
52.01E − 011.13E + 002.15E − 019.63E − 011.46E − 011.03E + 00
7.05E − 017.00E − 015.80E − 015.79E − 015.41E − 015.68E − 01
1.94E − 011.72E − 016.01E − 061.83E − 011.40E − 06
106.09E − 011.69E + 005.27E − 011.40E + 005.00E − 011.31E + 00
1.00E + 001.04E + 009.04E − 019.09E − 018.95E − 018.80E − 01
2.16E − 011.72E − 011.79E − 051.58E − 019.29E − 08
54.42E − 021.08E − 014.62E − 021.09E − 014.67E − 021.13E − 01
7.04E − 027.25E − 026.97E − 027.06E − 026.76E − 026.93E − 02
1.45E − 021.34E − 021.99E − 011.39E − 026.66E − 02
101.44E − 037.63E − 031.89E − 038.04E − 031.35E − 038.53E − 03
4.31E − 034.33E − 034.02E − 034.24E − 034.12E − 034.23E − 03
1.30E − 031.31E − 032.75E − 011.38E − 032.68E − 01

The best results in each row are emphasized in bold. The emphasized values in bold indicate that the Mann–Whitney U test with the significance level shows significance against the result without SHX. “—” represents invalid solutions (trapped by local optimum or out of parameter range

From Table 3, we can observe the clear improvement of performance brought by SHX. The results of the value show that the methods with SHX have recognized the significance at least in 23 settings among all 30 settings. In the other five results (minimum, maximum, median, mean, and SD), the methods without SHX cannot achieve outperformed results for most settings. For instance, focusing on the minimum results, the methods without SHX outperform the methods with SHX only 5, 0, and 4 times by BLX, SPX, and UNDX, respectively. On the other hand, SHX with sequential archive update achieves the best performance. SH-BLX_sequential, SH-SPX_sequential, and SH-UNDX_sequential show the significance in 27, 26, and 27 settings, respectively. In addition, they achieve the best results in most settings with respect to the maximum, median, and mean results. One possible reason for outperforming in most cases is that removes the oldest individual which arrived first, and therefore SHX can select offspring according to the up-to-date search history to reflect the trend of evolution more sensitively. In contrast, uniformly removes individuals in the archive, which may impede the discovery of new solutions since old individuals may be retained for more generations in the archive.

6.2.1. Analysis on BLX vs. SH-BLX

It has been already known that the standard BLX [17] faces difficulties especially when the target function is nonseparable [34] due to the parameter-wise sampling. By observing the results of to and to from Table 3, we can find that involving SHX significantly improves the performance, which indicates that SHX can help BLX to greatly mitigate this drawback. It is easy to understand because offspring selection with clusters embeds distance measure which builds the relationship among parameters.

6.2.2. Analysis on SPX vs. SH-SPX

SPX [21] is a better alternative of BLX, and we can observe from Table 3 that SPX noticeably outperforms BLX. From Table 3, it is also very clear that SHX further boosts the performance of SPX to a large extent. In particular, the results of minimum and median are improved by involving SHX for all settings. As pointed out in [21], SPX has the ability to maintain the mean and covariance of the parent individuals, which is consistent with the design guideline of good crossover operators mentioned in Section 3. Since SHX manages an archive that stores search history over few generations, it can preserve some useful statistics (e.g., centroids of clusters) much longer. That is why SHX is able to enhance SPX.

6.2.3. Analysis on UNDX vs. SH-UNDX

Similar to BLX and SPX, Table 3 shows that the results involving SHX are improved in most settings. UNDX is also designed to generate offspring inheriting the distribution of the parent individuals [35]. Therefore, statistics of the search history provided by SHX are useful for UNDX to enhance search ability.

6.3. Comparison in Convergence Curve

With the aid of search history, SHX not only achieves better results but also improves the convergence speed. In this section, we compare the generation alternation for over all the test functions in the case of . Evaluation values of elite individuals from the 1st generation to the 100th generation are plotted in Figure 2. The mean value of 100 trials is represented by the line, and the range between the minimum and the maximum is represented by the shaded area. Smaller area means more stable search. It should be noted that as the ranges of parameters are not constrained during the search procedure, methods can achieve infinitely small values of fitness, and a lower value does not mean a better result in the case of , as explained in Section 6.2.

For BLX, SPX, and UNDX, exploiting SHX shows faster convergence speed comparing against them without SHX in most cases. The superiority becomes more obvious when the problem setting is more difficult (e.g., multimodal functions vs. unimodal functions ).

6.4. Comparison in Processing Time

In this section, we show the runtime overhead of the processing brought by SHX. Figure 3 shows the comparisons in processing time of an optimization task ( and a single fitness evaluation takes 0.01 second) for BLX and SPX. The parameter setting follows Table 2, and all the results are averaged over 10 trials. It took 93.9 seconds and 94.1 seconds for BLX and SPX to complete the entire process, respectively. SH-BLX_random took additional 1.7 seconds to BLX. SH-BLX_sequential took 1.6 seconds more than BLX. Similarly, the additional runtime for SH-SPX_random and SH-SPX_sequential to SPX were 3.9 seconds and 3.9 seconds, respectively. These numbers demonstrate the additional runtime only occupies a small part of the total processing time. These additional computational costs mainly occur in the clustering with archive data and the label assignment with candidate offspring. The cost can be further reduced by fusing efficient distance measure or parallel computing. For a fixed size of an archive, the runtime grows linearly with the increase in the number of generations. Considering the complexity of the fitness function and the budget, SHX is a practical alternative to other crossover models.

7. Conclusions

In this paper, we have proposed a novel crossover model (SHX) which is simple yet effective and efficient. It can be easily integrated with any existing crossover operators. The key idea is to exploit search history over generations to gain useful information for generating offspring. Experimental results demonstrate that our SHX can significantly boost the performance of existing crossovers, in terms of the final solution and the convergence speed. Also, according to experiments, the induced extra runtime is negligible compared to the total processing time.

SHX still has a few limitations. (1) Additional hyperparameters need to be determined. (2) The induced additional runtime may be unable to sufficiently support applications which require high processing speed. As the future work, we would like to address the above limitations. For instance, hyperparameters can be adaptively set by considering specific contexts, and parallelization can be introduced to speed up SHX.

Data Availability

The test data used to support the findings of this study are included within the article.

Disclosure

A preliminary version of this work appears in GECCO2020 and has also been mentioned in the manuscript which can be viewed at the following link: https://arxiv.org/abs/2003.13508.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was partly supported by JSPS KAKENHI, Grant number (JP18K17823).

References

  1. A. H. Wright, “Genetic algorithms for real parameter optimization,” in Foundations of Genetic Algorithms, vol. 1, pp. 205–218, Elsevier, Amsterdam, Netherlands, 1991. View at: Google Scholar
  2. F. Herrera, M. Lozano, and J. L. Verdegay, “Tackling real-coded genetic algorithms: operators and tools for behavioural analysis,” Artificial Intelligence Review, vol. 12, no. 4, pp. 265–319, 1998. View at: Publisher Site | Google Scholar
  3. D. Whitley, “An overview of evolutionary algorithms: practical issues and common pitfalls,” Information and Software Technology, vol. 43, no. 14, pp. 817–831, 2001. View at: Publisher Site | Google Scholar
  4. D. Sholomon, O. David, and N. S. Netanyahu, “A genetic algorithm-based solver for very large jigsaw puzzles,” in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1767–1774, Portland, OR, USA, June 2013. View at: Publisher Site | Google Scholar
  5. L. Xie and A. Yuille, “Genetic CNN,” in Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), pp. 1379–1388, Venice, Italy, October 2017. View at: Publisher Site | Google Scholar
  6. E. Real, S. Moore, A. Selle et al., “Lare-scale evolution of image classifiers,” in Proceedings of the International Conference on Machine Learning (ICML), pp. 2902–2911, Sydney, Australia, August 2017. View at: Google Scholar
  7. K. Deep and M. Thakur, “A new crossover operator for real coded genetic algorithms,” Applied Mathematics and Computation, vol. 188, no. 1, pp. 895–911, 2007. View at: Publisher Site | Google Scholar
  8. C. García-Martínez, M. Lozano, F. Herrera, D. Molina, and A. M. Sánchez, “Global and local real-coded genetic algorithms based on parent-centric crossover operators,” European Journal of Operational Research, vol. 185, no. 3, pp. 1088–1113, 2008. View at: Publisher Site | Google Scholar
  9. P.-H. Tang and M.-H. Tseng, “Adaptive directed mutation for real-coded genetic algorithms,” Applied Soft Computing, vol. 13, no. 1, pp. 600–614, 2013. View at: Publisher Site | Google Scholar
  10. S. Picek, D. Jakobovic, and M. Golub, “On the recombination operator in the real-coded genetic algorithms,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC), pp. 3103–3110, IEEE, Cancun, Mexico, June 2013. View at: Publisher Site | Google Scholar
  11. Y.-C. Chuang, C.-T. Chen, and C. Hwang, “A simple and efficient real-coded genetic algorithm for constrained optimization,” Applied Soft Computing, vol. 38, pp. 87–105, 2016. View at: Publisher Site | Google Scholar
  12. F. Herrera, M. Lozano, and A. M. Sánchez, “A taxonomy for the crossover operator for real-coded genetic algorithms: an experimental study,” International Journal of Intelligent Systems, vol. 18, no. 3, pp. 309–338, 2003. View at: Publisher Site | Google Scholar
  13. M. Črepinšek, S.-H. Liu, and M. Mernik, “Exploration and exploitation in evolutionary algorithms: a survey,” ACM Computing Surveys, vol. 45, pp. 1–33, 2013. View at: Google Scholar
  14. H.-G. Beyer and K. Deb, “On self-adaptive features in real-parameter evolutionary algorithms,” IEEE Transactions on Evolutionary Computation, vol. 5, no. 3, pp. 250–270, 2001. View at: Publisher Site | Google Scholar
  15. S. Lloyd, “Least squares quantization in pcm,” IEEE Transactions on Information Theory, vol. 28, no. 2, pp. 129–137, 1982. View at: Publisher Site | Google Scholar
  16. T. Nakane, X. Lu, and C. Zhang, “SHX: search history driven crossover for real-coded genetic algorithm,” 2020, https://arxiv.org/abs/2003.13508. View at: Google Scholar
  17. L. J. Eshelman and J. D. Schaffer, “Real-coded genetic algorithms and interval-schemata,” in Foundations of Genetic Algorithms, vol. 2, pp. 187–202, Elsevier, Amsterdam, Netherlands, 1993. View at: Google Scholar
  18. L. J. Eshelman, K. E. Mathias, and J. D. Schaffer, “Crossover operator biases: exploiting the population distribution,” in Proceedings of the 7th International Conference on Genetic Algorithms (ICGA), East Lansing, MI, USA, July 1997. View at: Google Scholar
  19. K. Deb and R. B. Agrawal, “Simulated binary crossover for continuous search space,” Complex Systems, vol. 9, pp. 115–148, 1995. View at: Google Scholar
  20. I. Ono and S. Kobayashi, “A real-coded genetic algorithm for function optimization using unimodal normal distribution crossover,” in Proceedings of the Seventh International Conference on Genetic Algorithms (ICGA), pp. 246–253, East Lansing, MI, USA, 1997. View at: Google Scholar
  21. S. Tsutsui, M. Yamamura, and T. Higuchi, “Multi-parent recombination with simplex crossover in real coded genetic algorithms,” in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), pp. 657–664, Orlando, FL, USA, July 1999. View at: Google Scholar
  22. Y. Sano and H. Kita, “Optimization of noisy fitness functions by means of genetic algorithms using history of search,” in Proceedings of the International Conference on Parallel Problem Solving from Nature (PPSN), pp. 571–580, Springer, Paris, France, September 2000. View at: Google Scholar
  23. H. B. Amor and A. Rettinger, “Intelligent exploration for genetic algorithms: using self-organizing maps in evolutionary computation,” in Proceedings of the 2005 Conference on Genetic and Evolutionary Computation—GECCO’05, pp. 1531–1538, Washington, DC, USA, June 2005. View at: Google Scholar
  24. S. Y. Yuen and C. K. Chow, “Continuous non-revisiting genetic algorithm,” in Proceedings of the 2009 IEEE Congress on Evolutionary Computation, pp. 1896–1903, IEEE, Trondheim, Norway, May 2009. View at: Publisher Site | Google Scholar
  25. H. Kita, “A comparison study of self-adaptation in evolution strategies and real-coded genetic algorithms,” Evolutionary Computation, vol. 9, no. 2, pp. 223–241, 2001. View at: Publisher Site | Google Scholar
  26. Y. Akimoto, R. Hasada, J. Sakuma, I. Ono, and S. Kobayashi, “Generation alternation model for real-coded ga using multi-parent: proposal and evaluation of just generation gap (JGG),” in Proceedings of the 19th SICE Symposium on Decentralized Autonomous Systems, pp. 341–346, Tokyo, Japan, January 2007. View at: Google Scholar
  27. S. Kobayashi, “The frontiers of real-coded genetic algorithms,” Transactions of the Japanese Society for Artificial Intelligence, vol. 24, no. 1, pp. 147–162, 2009. View at: Publisher Site | Google Scholar
  28. D. E. Goldberg, “Genetic algorithms in search, optimization, and machine learning,” 1989. View at: Google Scholar
  29. J. Wang, M. Zhang, O. K. Ersoy, K. Sun, and Y. Bi, “An improved real-coded genetic algorithm using the heuristical normal distribution and direction-based crossover,” Computational Intelligence and Neuroscience, vol. 2019, Article ID 4243853, 17 pages, 2019. View at: Publisher Site | Google Scholar
  30. E.-u. Haq, I. Ahmad, A. Hussain, and I. M. Almanjahie, “A novel selection approach for genetic algorithms for global optimization of multimodal continuous functions,” Computational Intelligence and Neuroscience, vol. 2019, Article ID 8640218, 14 pages, 2019. View at: Publisher Site | Google Scholar
  31. M. Jamil and X. S. Yang, “A literature survey of benchmark functions for global optimisation problems,” International Journal of Mathematical Modelling and Numerical Optimisation, vol. 4, no. 2, pp. 150–194, 2013. View at: Publisher Site | Google Scholar
  32. I. Fister, S. Fong, and J. Brest, “A novel hybrid self-adaptive bat algorithm,” The Scientific World Journal, vol. 2014, Article ID 709738, 2 pages, 2014. View at: Publisher Site | Google Scholar
  33. M. N. Ab Wahab, S. Nefti-Meziani, and A. Atyabi, “A comprehensive review of swarm optimization algorithms,” PLoS One, vol. 10, no. 5, Article ID e0122827, 2015. View at: Publisher Site | Google Scholar
  34. I. Ono, H. Kita, and S. Kobayashi, “A real-coded genetic algorithm using the unimodal normal distribution crossover,” in Advances in Evolutionary Computing, pp. 213–237, Springer, Berlin, Germany, 2003. View at: Google Scholar
  35. H. Kita, I. Ono, and S. Kobayashi, “Theoretical analysis of the unimodal normal distribution crossover for real-coded genetic algorithms,” in Proceedings of the 1998 IEEE International Conference on Evolutionary Computation Proceedings, pp. 529–534, Anchorage, AK, USA, May 1998. View at: Publisher Site | Google Scholar

Copyright © 2020 Takumi Nakane et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


More related articles

 PDF Download Citation Citation
 Download other formatsMore
 Order printed copiesOrder
Views51
Downloads22
Citations

Related articles

We are committed to sharing findings related to COVID-19 as quickly as possible. We will be providing unlimited waivers of publication charges for accepted research articles as well as case reports and case series related to COVID-19. Review articles are excluded from this waiver policy. Sign up here as a reviewer to help fast-track new submissions.