Special Issue: Intelligent Methods for Large Scale System Operation and Management

Research Article | Open Access

Lisheng Wei, Ning Wang, Huacai Lu, "A Novel BBO Algorithm Based on Local Search and Nonuniform Variation for Iris Classification", Complexity, vol. 2021, Article ID 6694695, 17 pages, 2021. https://doi.org/10.1155/2021/6694695

A Novel BBO Algorithm Based on Local Search and Nonuniform Variation for Iris Classification

Academic Editor: Shi Cheng
Received: 06 Nov 2020
Revised: 08 Jan 2021
Accepted: 27 Mar 2021
Published: 15 Apr 2021

Abstract

In order to improve the iris classification rate, a novel biogeography-based optimization algorithm (NBBO) based on local search and nonuniform variation was proposed in this paper. Firstly, the linear migration model was replaced by a hyperbolic cotangent model, which is closer to the natural law. A local search strategy was then added to the migration operation of the traditional BBO algorithm to enhance its global search ability, and nonuniform variation was introduced to strengthen the search in the later iterations. The algorithm could achieve a stronger iris classifier by lifting weaker similarity classifiers during the training stage. On this basis, the convergence condition of NBBO was derived using a Markov chain analysis. Finally, simulation results were given to demonstrate the effectiveness and efficiency of the proposed iris classification method.

1. Introduction

Iris recognition is especially suitable for identification owing to its uniqueness, stability, inviolability, and reliability, and it has recently been one of the hottest research topics in biometric recognition [1]. It has been widely applied in national defence, the financial industry, and entrance guard systems [2, 3]. Alfred Wallace and Charles Darwin developed the theory of biogeography in the 19th century to study the distribution, migration, and extinction of biological species among habitats. Inspired by biogeography, Simon proposed the BBO algorithm in 2008 [4]. BBO is a swarm intelligence optimization algorithm that solves optimization problems by simulating the mathematical model of species migration in biogeography. In general, BBO can find a good initial candidate solution and generate an acceptable optimized solution, and it has been widely used in image processing [5], scheduling optimization [6, 7], parameter estimation [8], power flow calculation [9], and load analysis [10]. However, BBO has some shortcomings, such as weak search ability and a tendency to fall into local optima in the later period of a run. How to use the global search ability of BBO to improve the iris classification rate is therefore of significant theoretical and practical value.

In recent years, many researchers have worked on improving the performance of BBO. For instance, Wang et al. [11] combined a chaotic mapping strategy with the BBO optimal migration model and proposed a biogeography-based optimization algorithm based on an adaptive population migration mechanism. Feng et al. [12] proposed an improved BBO with a random ring topology and Powell's method, in which the local ring topology was used instead of the global topology and an adaptive Powell method was modified to fit the evolutionary process and improve the accuracy of the solution. Li et al. [13] designed a hybrid algorithm combining the artificial bee colony algorithm (ABC) with BBO to obtain ABC's exploration ability and BBO's exploitation ability. Chen et al. [14] combined cuckoo search (CS) and BBO using two search strategies: CS-based exploration and biogeography-based migration. Zhao et al. [15] proposed an optimization method based on two-stage differential biogeography to solve the problem of premature convergence and reduce rotational variance.

In order to improve the iris classification rate, the NBBO based on local search and nonuniform variation is proposed in this paper. Firstly, a hyperbolic cotangent migration model is used to replace the original linear migration model to improve the adaptability of the algorithm. Then, a local search strategy is added to the BBO migration operation to enhance the search ability and improve the convergence speed of the algorithm. Nonuniform variation is used in the mutation operation to increase the search ability of the algorithm and prevent premature convergence to local optima, which allows a stronger iris classifier to be achieved by lifting weaker similarity classifiers during the training stage.

The rest of the paper is organized as follows. Section 2 presents the improved biogeography-based optimization algorithm, including the migration model, migration operation, and nonuniform variation operation. Section 3 presents tests of NBBO on benchmark functions and iris classification samples. Finally, a brief conclusion and future work are given in Section 4.

2. Methods

In nature, biological species live in different areas with obvious boundaries, which are called habitats. The habitat suitability index (HSI) is used to describe how suitable a habitat is for the survival of species. HSI is affected by humidity, temperature, habitat area, and vegetation diversity; these factors are called suitability index variables (SIVs). Habitats with higher HSI can accommodate more biological species, whereas habitats with lower HSI can only accommodate fewer. BBO is a population-based algorithm, which regards each solution as a habitat, its fitness as the HSI, and each component of the solution as an SIV. The optimization problem can be solved by simulating the migration and mutation processes of biogeography. The two main operations of BBO are the migration operation and the mutation operation [16, 17].

Like other intelligent algorithms, BBO is prone to oscillation in the iterative process, resulting in slow or premature convergence, poor local search ability, and inaccurate solutions. This section presents the NBBO algorithm in terms of its migration model and operations. The flow chart of the improved algorithm is shown in Figure 1.

2.1. Migration Model

There are different migration models in biogeography. As shown in formula (1), the original BBO adopts the simplest linear migration model:

λ_k = I(1 − k/n),  μ_k = E·k/n,  (1)

where k is the number of species in a habitat and n is the maximum number of species. λ_k represents the immigration rate, which refers to the probability of species moving into the habitat; μ_k represents the emigration rate, which refers to the probability of species moving out of the habitat; I represents the maximum immigration rate and E the maximum emigration rate; both rates are functions of the number of habitat populations k. When the number of habitat species is 0, the immigration rate is the highest (λ_0 = I) and the emigration rate is the smallest (μ_0 = 0). When the number of species reaches the maximum n that the habitat can accommodate, the immigration rate drops to 0 and the emigration rate reaches its maximum (λ_n = 0, μ_n = E). When the number of species equals the equilibrium value k_0, the immigration and emigration rates of the species are equal (λ_{k_0} = μ_{k_0}), and a dynamic equilibrium state is reached. But the ecosystem is nonlinear in nature; small changes in one part of the system may have complex effects on the whole system, so for actual species the migration process is more complicated than what the linear model describes. Among the related improved algorithms, the cosine migration model proposed by Ma performs best on most of the tested functions [18], and migration models closer to the natural law outperform the simple linear model. In order to better adapt to the nonlinear migration problem, this paper proposes a hyperbolic cotangent nonlinear migration model to improve the performance of the basic BBO algorithm.

In the hyperbolic cotangent migration model, the trend of the migration rates with the number of species is similar to that of the cosine model, but the amplitude changes more moderately.
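The boundary behaviour shared by these models can be checked numerically. The sketch below implements the linear model of formula (1) and Ma's cosine model; since the paper's exact hyperbolic cotangent expression is not reproduced in this text, a tanh-based S-shaped curve stands in for it as an assumption with the same boundary values and a gentler slope than the cosine model:

```python
import math

def linear_rates(k, n, I=1.0, E=1.0):
    """Original BBO linear model (formula (1))."""
    lam = I * (1.0 - k / n)   # immigration falls as the habitat fills
    mu = E * k / n            # emigration rises as the habitat fills
    return lam, mu

def cosine_rates(k, n, I=1.0, E=1.0):
    """Ma's cosine migration model [18]."""
    lam = (I / 2.0) * (math.cos(math.pi * k / n) + 1.0)
    mu = (E / 2.0) * (1.0 - math.cos(math.pi * k / n))
    return lam, mu

def s_shaped_rates(k, n, I=1.0, E=1.0, a=4.0):
    """Hypothetical smooth S-shaped model standing in for the paper's
    hyperbolic cotangent model (an assumption, not the paper's formula)."""
    g = math.tanh(a * (k / n - 0.5)) / math.tanh(a / 2.0)
    lam = (I / 2.0) * (1.0 - g)
    mu = (E / 2.0) * (1.0 + g)
    return lam, mu
```

For I = E = 1, all three models satisfy λ_0 = I, μ_0 = 0, λ_n = 0, μ_n = E, and the two rates cross at the equilibrium point k_0 = n/2.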

2.2. Local Search Migration Operations

In the original BBO, a generalized search is carried out based on the immigration and emigration rates: a habitat with a higher emigration rate is randomly selected, and the feature at the corresponding position (SIV) is moved into a habitat with low HSI. However, for some habitats with high emigration rates, the SIVs at the corresponding positions are not necessarily the best choice for the receiving habitats. If migrated directly, they may decrease the suitability index of the destination habitat.

Therefore, in order to enhance the global search ability of the algorithm and improve its convergence speed, the selected migration feature can be modified using equation (6), which reduces the transfer of poor features from habitats with higher emigration rates into habitats with lower ones. Similarly, if the same habitat with a high emigration rate is selected as the source every time, the population diversity decreases as the number of generations increases, and the convergence speed slows down. In equation (6), one term represents a space allocated to each individual in the population for storing algorithm parameters, another represents the current individual, and another represents the new individual generated by the local search; the remaining coefficients are random numbers.
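Equation (6) itself is not reproduced above, but its idea, modifying the emigrating feature rather than copying it verbatim, can be sketched as a random blend between the receiving and emigrating SIVs. This is a hypothetical form, not the paper's exact operator:

```python
import random

def local_search_siv(siv_in, siv_out, rng=random):
    """Blend the emigrating feature `siv_out` with the receiving habitat's
    current feature `siv_in` instead of overwriting it outright.
    Hypothetical stand-in for formula (6)."""
    alpha = rng.random()  # random blend coefficient in [0, 1]
    return siv_in + alpha * (siv_out - siv_in)
```

Because the result is a convex combination, the new SIV always lies between the two parent features, which damps the transfer of a poor feature out of a high-emigration-rate habitat.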

2.3. Nonuniform Variation Operation

In the early stage of iteration, the population is usually far from the optimal solution and needs many iterations to find a feasible search region when the search radius is too small. At the end of iteration, the population is close to the optimal solution, and a small search range serves as fine-tuning of the solution vector. As shown by equation (7), the traditional mutation operator cannot search for the optimal solution efficiently, so the nonuniform mutation operator is introduced. The whole population is searched over a large range during the early stage; as the number of iterations increases, the search range is gradually narrowed, which effectively balances the algorithm and helps it escape from local minima. The mutation probability is

m(s) = m_max (1 − P_s / P_max),  (7)

where P_s denotes the probability of a habitat containing s species, s denotes the species number, m_max denotes the given maximum mutation rate, and P_max is the maximum of the P_s values.

The specific operation of nonuniform variation is similar to uniform variation, but it focuses on searching small areas near the original individual. Assuming that the j-th component SIV_j of a habitat is mutated in generation t (t is the current number of iterations), the mutated component is

SIV_j' = SIV_j + Δ(t, b_j − SIV_j)  if r ≥ 0.5,
SIV_j' = SIV_j − Δ(t, SIV_j − a_j)  if r < 0.5,  (8)

where b_j and a_j are the upper and lower limits of component SIV_j, respectively, and r is a random number in the range [0, 1]; the step function Δ satisfies the nonuniform distribution:

This feature allows the mutation operator to search the entire space evenly in the initial stage of the algorithm (when t is very small), while the later part of the algorithm focuses on exact searches in several local ranges. Δ(t, y) can be defined as follows:

Δ(t, y) = y·(1 − r^((1 − t/T)^b)),  (9)

where r is a random number uniformly distributed in [0, 1], T is the maximum number of iterations, and b is a system parameter that determines the degree of dependence on the number of iterations.
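Formulas (8) and (9) correspond to Michalewicz-style nonuniform mutation, and can be sketched directly. Since the value of the system parameter b is not given above, b = 5 is an assumed placeholder:

```python
import random

def delta(t, y, T, b=5.0, rng=random):
    """Formula (9): a step in [0, y] that shrinks to 0 as t approaches T."""
    r = rng.random()  # uniform random number in [0, 1]
    return y * (1.0 - r ** ((1.0 - t / T) ** b))

def nonuniform_mutate(x, lo, hi, t, T, b=5.0, rng=random):
    """Formula (8): perturb one component toward a randomly chosen bound.
    Early in the run (small t) steps can span the whole range [lo, hi];
    late in the run they become fine local adjustments."""
    if rng.random() < 0.5:
        return x + delta(t, hi - x, T, b, rng)
    return x - delta(t, x - lo, T, b, rng)
```

Note that at t = T the exponent (1 − t/T)^b is 0, so the step Δ vanishes and the component is left unchanged, which is exactly the fine-tuning behaviour described above.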

Then, the specific steps of the NBBO algorithm are as follows (Algorithm 1):

Begin
 Set the algorithm parameters: population size N = 50, maximum number of
 iterations G = 500, maximum immigration rate I, maximum emigration rate E,
 mutation probability, and number of elites retained.
 /* Initialization */
 Randomly generate a set of initial habitats as the initial population
 Calculate the fitness value of each habitat in the population
 /* End of initialization */
 while g < G do  /* g < G is the condition for the end of the iteration */
  Sort the habitats in descending order according to the fitness value
  Calculate the immigration rate and emigration rate of each habitat according to formula (1)
  /* Migration */
  for each habitat selected according to the immigration probability do
   for each dimension d = 1, ..., D do  /* D is the dimension */
    if rand < immigration rate then
     Select a source habitat according to the emigration probability
     Randomly select a characteristic variable (SIV) from the source habitat
     Replace a random characteristic variable in the current habitat with it
    else
     Perform a local search on the characteristic variable according to
     formula (6) to obtain a new SIV
     Replace a random characteristic variable in the current habitat with it
    end if
   end for
  end for
  /* End of migration */
  /* Mutation */
  Calculate the mutation probability m(s) according to P_s and m_max (formula (7))
  for each unmutated, non-elite habitat selected according to m(s) do
   Replace the chosen characteristic variable with a value randomly
   generated by formulas (8) and (9)
  end for
  /* End of mutation */
  Recalculate the habitat fitness values
 end while
end
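Putting the pieces together, Algorithm 1 can be sketched in Python. This is a simplified illustration rather than the authors' implementation: a sphere function stands in for the fitness (HSI), a tanh-shaped curve stands in for the hyperbolic cotangent migration model, and the local-search blend is a hypothetical form of formula (6).

```python
import math
import random

rng = random.Random(42)

def sphere(x):
    """Stand-in minimization objective (lower value = higher HSI)."""
    return sum(v * v for v in x)

def roulette(weights):
    """Select an index with probability proportional to its weight."""
    pick = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if acc >= pick:
            return i
    return len(weights) - 1

def nonuniform_mutate(x, lo, hi, t, T, b=5.0):
    """Formulas (8)-(9): bound-directed step that shrinks as t -> T."""
    y = (hi - x) if rng.random() < 0.5 else -(x - lo)
    return x + y * (1.0 - rng.random() ** ((1.0 - t / T) ** b))

def nbbo(fitness, dim, lo, hi, pop_size=30, iters=100, p_mut=0.05, elites=2):
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    history = []  # best fitness recorded at each iteration
    for t in range(1, iters + 1):
        pop.sort(key=fitness)             # rank 0 = best habitat (highest HSI)
        lam, mu = [], []
        for i in range(pop_size):
            s = 1.0 - i / (pop_size - 1)  # species fraction: best habitat is fullest
            g = math.tanh(4.0 * (s - 0.5)) / math.tanh(2.0)
            lam.append(0.5 * (1.0 - g))   # immigration: low for good habitats
            mu.append(0.5 * (1.0 + g))    # emigration: high for good habitats
        history.append(fitness(pop[0]))
        new_pop = [list(h) for h in pop]  # elites (ranks < `elites`) kept as-is
        for i in range(elites, pop_size):
            for d in range(dim):
                if rng.random() < lam[i]:
                    j = roulette(mu)      # pick an emigrating habitat
                    if rng.random() < 0.5:
                        new_pop[i][d] = pop[j][d]  # classic migration
                    else:
                        # local-search blend (hypothetical form of formula (6))
                        new_pop[i][d] = pop[i][d] + rng.random() * (pop[j][d] - pop[i][d])
                if rng.random() < p_mut:
                    new_pop[i][d] = nonuniform_mutate(new_pop[i][d], lo, hi, t, iters)
        pop = new_pop
    pop.sort(key=fitness)
    return pop[0], fitness(pop[0]), history
```

Because the elites are copied unchanged into each new population, the recorded best fitness never worsens from one iteration to the next.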
2.4. Convergence Analysis

By using a Markov chain strategy, we can analyze the convergence of the proposed NBBO algorithm [19, 20]. Let X_g denote the population of the improved algorithm at generation g. We divide the state space into subsets based on the fitness of the population; next, we use the Markov chain to analyze the global convergence of the improved BBO algorithm.

The evolution process of the NBBO algorithm mainly consists of selection, migration, and mutation operations, which are independent of the generation number. Considering that the number of individuals in the BBO algorithm is finite, the evolution process of the NBBO algorithm forms a finite homogeneous Markov chain. Each population subset can be regarded as a state of this chain; the probability of being in each state is nonnegative, and the probabilities over all states sum to 1.

The transition probability between states is given by formula (12). Then, the state transition matrix of the Markov chain is given by formula (13), where

It can be found from formulas (12) and (13) that all entries of the transition matrix are nonnegative, and since the sum of the probabilities in each row of a Markov transition matrix is 1, the matrix is stochastic. It can be concluded that the state transition matrix of the Markov chain is a reducible stochastic matrix. According to [21], the definition of a reducible stochastic matrix is as follows:

Let the matrix P of order n be a reducible stochastic matrix, which by identical row and column permutations can be brought to the form

P = [ C  0 ]
    [ R  T ],

where C is a primitive stochastic matrix of order m, and R and T are nonzero matrices.

Then, we have

P^∞ = lim_{k→∞} P^k = [ C^∞  0 ]
                      [ R^∞  0 ].

The above limit matrix is a stable stochastic matrix whose rows are identical and independent of the initial state. Therefore, since the state transition matrix of the NBBO chain is a reducible stochastic matrix, it follows that

In formula (12), since the sum of the probabilities in each row of the Markov transition matrix is 1, the corresponding limit condition must hold; then

Formula (19) shows that when the number of iterations g → ∞, the probability of reaching the optimal state tends to 1; that is, no matter what the initial state is, the algorithm finally converges to the global optimal solution with probability 1. It can be concluded that

lim_{g→∞} P{ f(X_g) = f* } = 1,  (20)

where f(X_g) is the optimal fitness value of population X_g, f* represents the global optimal fitness value, and the left-hand side indicates the probability that the optimal individual of generation g is the global optimal value. This proves that the NBBO algorithm has global convergence.
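The limiting behaviour behind formula (19) can be checked numerically on a toy two-state chain: state 0 is the absorbing "optimal" state (as the elite is never lost), and state 1 leaks into it with positive probability. Repeated self-multiplication drives all probability mass into state 0. This is a minimal sketch, not the actual NBBO transition matrix:

```python
def mat_mul(a, b):
    """Multiply two square matrices given as nested lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Reducible stochastic matrix: the optimal state 0 is absorbing.
P = [[1.0, 0.0],
     [0.3, 0.7]]

Pk = [row[:] for row in P]
for _ in range(200):   # compute P^201
    Pk = mat_mul(Pk, P)

# Regardless of the starting state, the chain ends in state 0:
# Pk is approximately [[1, 0], [1, 0]].
```

Both rows of the limit converge to the same distribution concentrated on the optimal state, mirroring the stable stochastic limit matrix in the proof.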

3. Simulation Experiments

3.1. Test Function Simulation

In order to test the optimization performance of the proposed NBBO algorithm, standard test functions were selected for simulation experiments [22, 23]. Table 1 shows the test functions, the range of values for each independent variable, and the theoretical optimal value. We select the optimal value, average value, and standard deviation of the test results as the evaluation criteria for algorithm performance.


Table 1: Benchmark test functions (expressions as given in [22, 23]).

Function | Range          | Theoretical optimum
f1       | (−1.28, 1.28)  | 0
f2       | (−100, 100)    | 0
f3       | (−4, 5)        | 0
f4       | (−100, 100)    | −1
f5       | (−10, 10)      | 0
f6       | (−600, 600)    | 0
f7       | (−32, 32)      | 0
f8       | (−5, 5)        | 3.075E−04

The test functions of Table 1 are numerically tested using the NBBO algorithm proposed in this paper, with the basic BBO algorithm [4], differential evolution [24] (DE), elephant herding optimization [25] (EHO), the DE/BBO hybrid [26] (DEBBO), and the particle swarm optimization-biogeography-based optimization algorithm [27] (BBOPSO) used for comparison. The experimental parameter settings of the algorithms are shown in Table 2.


Table 2: Experimental parameter settings.

Setting object    | Parameters
Common part       | Population size; maximum number of iterations; elite reserves
BBO/BBOPSO        | Habitat modification probability; maximum immigration rate; maximum emigration rate; mutation probability
DE                | Weighting factor; crossover probability
EHO               | Scale factors; number of clans
DEBBO             | Weighting factor; crossover probability; DE mutation scheme DE/rand/1/bin
The proposed NBBO | Habitat modification probability; maximum immigration rate; maximum emigration rate; mutation probability; system parameters

To study the performance of the algorithms in different dimensions, the multidimensional test functions are evaluated with dimensions D = 10, 30, and 50, while the fixed-dimension functions f4 and f8 use D = 2 and D = 4, respectively. To ensure a fair comparison, the experimental environment of each method is kept consistent: the operating system is Windows 10, the memory is 8 GB, and the programming language is MATLAB 2016. The common parameters of the algorithms are kept consistent, and the parameters of the different algorithms are set according to their original references. Since the algorithms run stochastically, each of the six algorithms runs each test function 30 times independently to avoid unnecessary errors caused by random operation.

Tables 3 and 4 show the recorded optimal function value, average value, and standard deviation obtained by each algorithm. During the iterative process, we record the best fitness of each evaluation, average the best values over the runs, and draw a curve that reflects the convergence trend of the algorithm. The convergence curves of some test functions are shown in Figures 2 and 3.


Table 3: Results on test functions f1–f4 (Best/Ave/Std over 30 runs).

Function | D  | Metric | BBO [4]   | DE [24]   | EHO [25]  | DEBBO [26] | BBOPSO [27] | NBBO (proposed)
f1       | 10 | Best   | 3.40E−03  | 1.70E−03  | 2.10E−03  | 1.81E−04   | 7.92E−05    | 5.78E−05
f1       | 10 | Ave    | 1.37E−02  | 4.10E−03  | 6.60E−03  | 9.45E−04   | 1.20E−03    | 4.51E−04
f1       | 10 | Std    | 9.80E−03  | 1.70E−03  | 3.70E−03  | 6.91E−04   | 8.62E−04    | 3.25E−04
f1       | 30 | Best   | 7.58E−02  | 3.50E−02  | 8.44E−02  | 1.05E−02   | 6.20E−03    | 1.10E−03
f1       | 30 | Ave    | 1.47E−01  | 5.46E−02  | 1.33E−01  | 1.46E−02   | 2.23E−02    | 2.70E−03
f1       | 30 | Std    | 4.41E−02  | 1.03E−02  | 4.13E−02  | 4.20E−03   | 9.10E−03    | 2.00E−03
f1       | 50 | Best   | 2.51E−01  | 1.21E−01  | 1.98E−01  | 3.47E−02   | 4.63E−02    | 3.00E−03
f1       | 50 | Ave    | 5.67E−01  | 2.29E−01  | 5.37E−01  | 5.61E−02   | 7.82E−02    | 8.50E−03
f1       | 50 | Std    | 2.08E−01  | 5.16E−02  | 1.38E−01  | 1.56E−02   | 2.22E−02    | 4.20E−03
f2       | 10 | Best   | 1.94E+00  | 1.18E−20  | 2.35E−04  | 6.72E−32   | 2.02E−28    | 3.66E−45
f2       | 10 | Ave    | 8.38E+00  | 3.52E−20  | 3.74E−04  | 1.69E−29   | 1.37E−27    | 5.79E−43
f2       | 10 | Std    | 6.12E+00  | 2.77E−20  | 6.88E−05  | 2.06E−29   | 1.49E−27    | 9.78E−43
f2       | 30 | Best   | 7.13E+01  | 1.20E−03  | 3.00E−03  | 1.40E−10   | 6.73E−07    | 2.09E−24
f2       | 30 | Ave    | 1.43E+02  | 2.60E−03  | 3.60E−03  | 5.83E−10   | 1.24E−05    | 1.43E−22
f2       | 30 | Std    | 4.64E+01  | 2.20E−03  | 0.31E−04  | 4.90E−10   | 3.35E−05    | 3.91E−22
f2       | 50 | Best   | 5.93E+02  | 1.58E+01  | 7.90E−03  | 2.38E−05   | 4.11E−02    | 4.66E−18
f2       | 50 | Ave    | 7.27E+02  | 2.30E+01  | 9.00E−03  | 1.62E−04   | 1.64E−01    | 6.83E−17
f2       | 50 | Std    | 1.34E+02  | 4.63E+00  | 1.20E−03  | 1.54E−04   | 9.32E−02    | 8.48E−17
f3       | 10 | Best   | 2.47E−02  | 2.04E−05  | 4.64E−06  | 3.89E−04   | 2.63E−04    | 8.88E−13
f3       | 10 | Ave    | 3.01E−01  | 1.52E−04  | 7.51E−06  | 2.40E−03   | 1.45E−03    | 5.16E−07
f3       | 10 | Std    | 2.54E−01  | 1.04E−04  | 2.22E−06  | 1.80E−03   | 1.24E−03    | 7.21E−07
f3       | 30 | Best   | 2.47E+00  | 6.19E+01  | 2.36E−04  | 1.21E−01   | 5.35E−02    | 1.13E−10
f3       | 30 | Ave    | 5.77E+00  | 1.36E+01  | 4.10E−04  | 3.92E−01   | 2.90E−01    | 1.26E−06
f3       | 30 | Std    | 3.75E+00  | 4.70E+01  | 1.50E−04  | 1.61E−01   | 1.25E−01    | 2.10E−06
f3       | 50 | Best   | 1.73E+01  | 1.04E+03  | 2.20E−03  | 2.02E+00   | 1.63E+00    | 4.07E−09
f3       | 50 | Ave    | 5.16E+01  | 1.87E+03  | 3.30E−03  | 4.55E+00   | 2.55E+00    | 4.69E−06
f3       | 50 | Std    | 2.44E+01  | 4.83E+02  | 6.48E−04  | 2.61E+00   | 9.34E−01    | 6.33E−09
f4       | 2  | Best   | −9.13E−01 | −1.00E+00 | −9.95E−01 | −1.00E+00  | −1.00E+00   | −1.00E+00
f4       | 2  | Ave    | −5.00E−01 | −1.00E+00 | −9.60E−01 | −1.00E+00  | −1.00E+00   | −1.00E+00
f4       | 2  | Std    | 4.45E−01  | 0.00E+00  | 3.55E−02  | 0.00E+00   | 0.00E+00    | 0.00E+00


Table 4: Results on test functions f5–f8 (Best/Ave/Std over 30 runs).

Function | D  | Metric | BBO [4]  | DE [24]  | EHO [25] | DEBBO [26] | BBOPSO [27] | NBBO (proposed)
f5       | 10 | Best   | 2.59E−02 | 1.41E−19 | 3.59E−06 | 4.91E−06   | 9.43E−09    | 1.26E−26
f5       | 10 | Ave    | 5.18E−02 | 2.97E−16 | 4.66E−06 | 8.09E−05   | 1.58E−05    | 8.04E−25
f5       | 10 | Std    | 2.26E−02 | 5.21E−16 | 8.50E−07 | 5.62E−05   | 1.77E−05    | 1.73E−24
f5       | 30 | Best   | 3.24E−01 | 7.41E+00 | 5.22E−06 | 4.50E−03   | 4.43E−03    | 9.64E−16
f5       | 30 | Ave    | 7.83E−01 | 1.11E+01 | 7.12E−05 | 6.21E−02   | 9.06E−03    | 1.05E−13
f5       | 30 | Std    | 2.94E−01 | 1.97E+00 | 1.75E−05 | 5.85E−02   | 3.07E−03    | 3.65E−13
f5       | 50 | Best   | 2.32E+00 | 3.12E+01 | 1.68E−04 | 3.33E−01   | 4.26E−02    | 3.55E−13
f5       | 50 | Ave    | 3.26E+00 | 3.62E+01 | 2.29E−04 | 1.15E+00   | 1.58E−01    | 1.03E−10
f5       | 50 | Std    | 5.16E−01 | 2.45E+00 | 5.69E−05 | 5.32E−01   | 1.41E−01    | 2.92E−10
f6       | 10 | Best   | 1.00E+00 | 1.00E+00 | 1.10E+00 | 0.00E+00   | 0.00E+00    | 0.00E+00
f6       | 10 | Ave    | 1.07E+00 | 1.00E+00 | 1.14E+00 | 2.48E−02   | 2.95E−02    | 0.00E+00
f6       | 10 | Std    | 4.12E−02 | 0.00E+00 | 2.33E−02 | 1.57E−02   | 2.54E−02    | 0.00E+00
f6       | 30 | Best   | 1.88E+00 | 1.00E+00 | 2.90E+00 | 6.03E−10   | 3.66E−07    | 0.00E+00
f6       | 30 | Ave    | 2.40E+00 | 1.00E+00 | 3.61E+00 | 1.50E−03   | 1.24E−02    | 0.00E+00
f6       | 30 | Std    | 3.36E−01 | 7.20E−04 | 5.98E−01 | 3.20E−03   | 2.40E−02    | 0.00E+00
f6       | 50 | Best   | 4.86E+00 | 1.12E+00 | 7.72E+00 | 4.22E−05   | 9.11E−02    | 0.00E+00
f6       | 50 | Ave    | 8.00E+00 | 1.18E+00 | 9.82E+00 | 1.90E−03   | 1.78E−01    | 2.22E−17
f6       | 50 | Std    | 2.26E+00 | 3.33E−02 | 1.01E+00 | 5.00E−03   | 6.79E−02    | 7.02E−17
f7       | 10 | Best   | 8.70E−01 | 3.43E−11 | 7.20E−03 | 2.66E−15   | 6.22E−15    | 2.66E−15
f7       | 10 | Ave    | 2.05E+00 | 8.75E−11 | 8.10E−03 | 2.66E−15   | 8.85E−13    | 2.66E−15
f7       | 10 | Std    | 6.17E−01 | 6.33E−11 | 6.43E−04 | 0.00E+00   | 2.54E−12    | 0.00E+00
f7       | 30 | Best   | 3.58E+00 | 1.28E−02 | 1.44E−02 | 5.12E−06   | 1.08E−04    | 1.23E−13
f7       | 30 | Ave    | 4.30E+00 | 1.63E−02 | 1.56E−02 | 6.50E−06   | 2.66E−04    | 1.23E−13
f7       | 30 | Std    | 4.09E−01 | 2.30E−03 | 7.98E−04 | 2.13E−06   | 1.25E−04    | 1.07E−12
f7       | 50 | Best   | 5.43E+00 | 2.14E+00 | 1.71E−02 | 8.84E−04   | 2.74E−02    | 1.75E−10
f7       | 50 | Ave    | 6.18E+00 | 2.57E+00 | 1.88E−02 | 1.70E−03   | 7.18E−02    | 9.85E−10
f7       | 50 | Std    | 5.51E−01 | 2.98E−01 | 9.30E−02 | 1.20E−03   | 6.75E−02    | 9.57E−10
f8       | 4  | Best   | 6.13E−04 | 3.17E−04 | 3.24E−04 | 3.16E−04   | 4.91E−04    | 3.08E−04
f8       | 4  | Ave    | 2.70E−03 | 6.10E−04 | 3.52E−04 | 9.99E−04   | 7.38E−04    | 3.23E−04
f8       | 4  | Std    | 4.30E−03 | 4.01E−04 | 2.27E−05 | 1.20E−03   | 3.14E−04    | 2.26E−05

In order to compare the optimization performance of the algorithms more easily and clearly, the algorithm with the best optimal value, average value, and standard deviation for each test function is marked in bold in Tables 3 and 4. From Tables 3 and 4 and the simulation figures, the convergence speed of the algorithms can be analyzed both qualitatively and quantitatively. For the simple multidimensional unimodal functions in Figure 2, the NBBO algorithm has faster convergence speed and higher solution precision in the later iterations, although its advantage is not obvious in the initial stage. For the more complex multidimensional multimodal functions in Figure 3, the other five algorithms exhibit search stagnation or slow search, i.e., they fall into local optima and find it difficult to jump out. Comparing the optimal values and average optimization results for each test function, we can see that the NBBO algorithm achieves smaller values. Although DE, DEBBO, and BBOPSO also reach the global optimum on some test functions in certain dimensions, the results for the unimodal and multimodal functions show that the NBBO algorithm has a clear advantage in optimization ability. Figures 2 and 3 show that the proposed NBBO algorithm improves not only the convergence of the algorithm but also its evolutionary ability and its ability to jump out of local optima, and it is superior to the other five comparison algorithms in the stability of the optimization results. In summary, when solving unimodal or multimodal function optimization problems, the proposed NBBO can quickly and effectively find the optimal solution within a small number of iterations, and its optimal solution has higher precision and stability.

Since the fixed-dimension test functions have few dimensions to adjust, the accuracy and stability of each algorithm on them are relatively high. For the multidimensional test functions, when the dimension D increases from 10 to 30 or 50, the accuracy and stability of the algorithms decrease, because the objective function becomes more complicated and the algorithm needs to make more adjustments. However, regardless of the dimension, the proposed NBBO algorithm still maintains stable, high-precision optimization performance on high-dimensional complex functions. Reference [28] argues that evaluating an improved evolutionary algorithm only by the average and standard deviation values is not rigorous enough, and that statistical tests should be carried out to show that the improved algorithm has significant advantages over existing algorithms. Therefore, in order to verify the significance of the differences between the NBBO algorithm and the other algorithms, a nonparametric method, the Wilcoxon rank-sum test, is used for statistical analysis. For each test function, the 30 solution results of the NBBO algorithm are tested against those of the BBO, DE, EHO, DEBBO, and BBOPSO algorithms, respectively, with the null hypothesis H0 that the results of the two algorithms come from the same distribution, and the alternative H1 that they come from different distributions. The significance level threshold is set to α = 0.05. When the test result p < α, the null hypothesis is rejected and the experimental results of the two algorithms differ significantly; when p ≥ α, the null hypothesis is accepted and there is no significant difference. The specific test results are shown in Tables 5 and 6.
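The rank-sum comparison can be reproduced with a hand-rolled normal approximation. This is a sketch assuming no tied values and reasonably large samples; in practice a library routine such as scipy.stats.ranksums would normally be used:

```python
import math

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank-sum test via the normal approximation.
    Assumes no ties between the pooled samples."""
    n1, n2 = len(a), len(b)
    pooled = sorted((v, src) for src in (0, 1) for v in (a, b)[src])
    # Sum of 1-based ranks belonging to sample `a`
    w = sum(rank for rank, (_, src) in enumerate(pooled, start=1) if src == 0)
    mean = n1 * (n1 + n2 + 1) / 2.0
    var = n1 * n2 * (n1 + n2 + 1) / 12.0
    z = (w - mean) / math.sqrt(var)
    # Two-sided p-value from the standard normal CDF
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
```

Two clearly separated samples yield a p-value far below 0.05 (reject H0), while two interleaved samples yield a large p-value (accept H0), matching the decision rule above.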


Table 5: Wilcoxon rank-sum test results for f1–f4 (P value, W judgment).

Function | D  | NBBO vs BBO   | NBBO vs DE    | NBBO vs EHO   | NBBO vs DEBBO | NBBO vs BBOPSO
f1       | 10 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 1.40e−03, +   | 1.68e−06, +
f1       | 30 | 2.82e−11, +   | 2.82e−11, +   | 2.82e−11, +   | 2.82e−11, +   | 6.93e−11, +
f1       | 50 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.82e−11, +   | 2.87e−11, +
f2       | 10 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.78e−11, +
f2       | 30 | 2.87e−11, +   | 2.87e−11, +   | 2.82e−11, +   | 2.87e−11, +   | 2.87e−11, +
f2       | 50 | 2.87e−11, +   | 2.87e−11, +   | 2.82e−11, +   | 2.87e−11, +   | 2.87e−11, +
f3       | 10 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +
f3       | 30 | 2.87e−11, +   | 2.87e−11, +   | 2.93e−11, +   | 2.93e−11, +   | 2.93e−11, +
f3       | 50 | 2.87e−11, +   | 2.87e−11, +   | 2.78e−11, +   | 2.87e−11, +   | 2.87e−11, +
f4       | 2  | 1.17e−12, +   | N/A, N/A      | 1.17e−12, +   | N/A, N/A      | N/A, N/A


Table 6: Wilcoxon rank-sum test results for f5–f8 (P value, W judgment).

Function | D  | NBBO vs BBO   | NBBO vs DE    | NBBO vs EHO   | NBBO vs DEBBO | NBBO vs BBOPSO
f5       | 10 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +
f5       | 30 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +
f5       | 50 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +
f6       | 10 | 1.17e−12, +   | 1.69e−14, +   | 1.17e−12, +   | 5.62e−11, +   | 5.62e−11, +
f6       | 30 | 1.17e−12, +   | 1.69e−14, +   | 1.17e−12, +   | 1.17e−12, +   | 1.17e−12, +
f6       | 50 | 3.06e−12, +   | 3.06e−12, +   | 3.06e−12, +   | 3.06e−12, +   | 3.06e−12, +
f7       | 10 | 1.17e−12, +   | 1.17e−12      | 1.03e−13, +   | N/A, N/A      | 1.15e−12, +
f7       | 30 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +
f7       | 50 | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +   | 2.87e−11, +
f8       | 4  | 2.87e−11, +   | 4.27e−07, +   | 1.68e−06, +   | 2.31e−08, +   | 2.87e−11, +

Tables 5 and 6 show the test results for the benchmark functions. P values below the significance threshold of 0.05 are shown in bold. "W" is the significance judgment result: combined with Tables 3 and 4, "+" indicates that NBBO is significantly more accurate than the compared algorithm, "−" indicates that NBBO is significantly less accurate, "=" indicates no significant difference in accuracy, and "N/A" means that no significance judgment can be made. From the test results, most of the P values are less than 0.05 and "W" is "+", so the null hypothesis is rejected. Therefore, the calculation results of NBBO differ significantly from those of BBO, DE, EHO, DEBBO, and BBOPSO, and NBBO is significantly better.

To sum up, the biogeography-based optimization algorithm based on local search and nonuniform variation gives good results on most of the benchmark functions. It effectively alleviates the slow convergence of the original BBO algorithm and its tendency to fall into local optima in the later stage of the search, and it greatly improves the convergence speed and search accuracy of the BBO algorithm without adding too many programming steps.

3.2. Iris Classification Based on NBBO Multilayer Perceptron

In order to test the training performance of the NBBO multilayer perceptron for iris classification, the mean square error (MSE) between the actual output of the multilayer perceptron and the expected output is used as the indicator of its performance [29]. The MSE over all training samples is calculated as follows:

MSE = (1/N) Σ_{k=1}^{N} Σ_{i=1}^{m} (d_i^k − o_i^k)²,

where N is the number of training samples, m is the number of outputs, d_i^k is the expected output of the i-th output unit for the k-th training sample, and o_i^k is the corresponding actual output. The smaller the MSE value, the better the performance of the multilayer perceptron, and vice versa. The HSI of a habitat is calculated from this MSE.
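The MSE defined above is straightforward to compute; a minimal sketch:

```python
def mse(expected, actual):
    """Average over the N samples of the summed squared error
    across the m output units, as defined in the text."""
    n_samples = len(expected)
    total = 0.0
    for d_row, o_row in zip(expected, actual):
        total += sum((d - o) ** 2 for d, o in zip(d_row, o_row))
    return total / n_samples
```

For example, with one-hot iris targets, a perfect output gives MSE 0, and an output that splits probability between two classes gives a positive error.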

Figure 4 shows the schematic diagram of the multilayer perceptron based on the NBBO algorithm. The NBBO algorithm takes the average MSE over all training samples and their expected outputs as the objective function for iterative training, and adjusts the connection weights and biases through continuous iterative evolution, so as to provide the optimal connection weights and biases for the iris classification samples after training.
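In this scheme, each habitat is a flat vector of all MLP weights and biases, and its HSI is derived from the training-set MSE. A sketch of a 4-9-3 perceptron evaluated from such a vector follows; the sigmoid activation is an assumption, since the paper does not state the transfer function:

```python
import math

DIM_IN, DIM_HID, DIM_OUT = 4, 9, 3
# 9 hidden units with 4 weights + 1 bias, 3 output units with 9 weights + 1 bias
N_PARAMS = DIM_HID * (DIM_IN + 1) + DIM_OUT * (DIM_HID + 1)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(params, features):
    """Evaluate the 4-9-3 perceptron encoded in the flat `params` vector."""
    assert len(params) == N_PARAMS and len(features) == DIM_IN
    idx = 0
    hidden = []
    for _ in range(DIM_HID):
        w = params[idx:idx + DIM_IN]
        bias = params[idx + DIM_IN]
        idx += DIM_IN + 1
        hidden.append(sigmoid(sum(wi * xi for wi, xi in zip(w, features)) + bias))
    outputs = []
    for _ in range(DIM_OUT):
        w = params[idx:idx + DIM_HID]
        bias = params[idx + DIM_HID]
        idx += DIM_HID + 1
        outputs.append(sigmoid(sum(wi * hi for wi, hi in zip(w, hidden)) + bias))
    return outputs
```

NBBO then searches this 75-dimensional parameter space, using the MSE of mlp_forward over the training set as the objective function.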

The iris dataset has 150 samples divided into 3 categories (Setosa, Versicolor, and Virginica), each with 4 basic features: sepal length, sepal width, petal length, and petal width. We use a multilayer perceptron with a 4-9-3 structure for the classification problem. Table 7 shows the experimental results of the algorithms on this dataset, and Figure 5 shows the average convergence trend and classification accuracy.


Table 7: MSE results for iris classification.

Algorithm         | Optimal value | Average value | Standard deviation
BBO [4]           | 3.15E−02      | 5.34E−02      | 1.52E−02
DE [24]           | 6.12E−02      | 1.31E−01      | 3.48E−02
EHO [25]          | 2.37E−01      | 3.92E−01      | 7.36E−02
DEBBO [26]        | 2.73E−02      | 4.72E−02      | 1.29E−02
BBOPSO [27]       | 1.51E−01      | 2.56E−01      | 5.34E−02
The proposed NBBO | 1.34E−02      | 1.93E−02      | 3.80E−03

Comparing the six iris classification algorithms in Table 7, we can see that the proposed NBBO achieves a better optimal value, average value, and standard deviation than the other five algorithms. The optimal and average MSE values calculated by NBBO are the best, which indicates that the optimization performance of NBBO is the strongest among the six algorithms. At the same time, the minimum standard deviation obtained by NBBO indicates that the NBBO algorithm inherits the strong robustness of the BBO algorithm and is more robust than the other algorithms.

Figure 5(a) shows the classification convergence curves of the six algorithms for the iris classification problem. The convergence of the DE algorithm is the worst of the six, and the EHO and BBOPSO algorithms fall into local optima in the late iterations. The BBO and DEBBO algorithms have similar convergence precision, but their convergence speed is not ideal. The proposed NBBO algorithm is optimal in both convergence precision and convergence speed. Figure 5(b) shows that DE, EHO, and BBOPSO exhibit significant fluctuations in classification accuracy over 10 independent runs. The NBBO algorithm has the best stability and average performance, and its classification accuracy is the highest among the three stable algorithms BBO, DEBBO, and NBBO.

4. Conclusions and Future Work

This paper proposed a novel biogeography-based optimization algorithm (NBBO) based on a local migration strategy and nonuniform mutation for iris classification. Firstly, a hyperbolic cotangent migration model was used to replace the original linear migration model to enhance the adaptability of the algorithm. Secondly, a local migration strategy was used to improve the global search ability of the algorithm. Finally, a nonuniform mutation operation was used to strengthen the algorithm's exploitation of the optimal solution within a small range in the later iterations and to improve its ability to escape local optima. The simulation results showed that the proposed NBBO achieves higher stability, better convergence accuracy, and faster convergence speed than the other methods for iris classification. How to apply the NBBO algorithm to other multiobjective optimization problems, in comparison with existing multiobjective optimization methods [30–33], and to realize better applications in engineering practice will be the focus of our future work.
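The nonuniform mutation idea summarized above can be sketched with Michalewicz's classic formulation, in which the perturbation range shrinks as the generation counter t approaches the maximum T, so late-stage search refines the incumbent in a small neighbourhood. The paper's exact expression may differ, so this operator is an illustrative assumption.

```python
import numpy as np

# Sketch of a nonuniform mutation operator: early in the run it perturbs a
# solution over a wide range; as t -> T the perturbation shrinks toward zero.
# This is Michalewicz's formulation, used here as a stand-in for the paper's.

RNG = np.random.default_rng(1)

def nonuniform_mutate(x, lower, upper, t, T, b=3.0):
    """Mutate one decision variable of habitat x within [lower, upper].
    b controls how fast the step size decays with generation t."""
    x = x.copy()
    j = RNG.integers(len(x))                  # pick one variable at random
    r = RNG.random()
    shrink = 1.0 - r ** ((1.0 - t / T) ** b)  # -> 0 as t -> T
    if RNG.random() < 0.5:
        x[j] += (upper[j] - x[j]) * shrink    # move toward the upper bound
    else:
        x[j] -= (x[j] - lower[j]) * shrink    # move toward the lower bound
    return x

lo, hi = np.full(5, -1.0), np.full(5, 1.0)
x = np.zeros(5)
early = nonuniform_mutate(x, lo, hi, t=1, T=100)   # wide perturbation likely
late = nonuniform_mutate(x, lo, hi, t=99, T=100)   # tiny perturbation
print(np.abs(early).max(), np.abs(late).max())
```

Because the mutated coordinate moves only a `shrink` fraction of the distance to a bound, the offspring always stays feasible, and the decaying `shrink` gives exactly the late-stage fine search described above.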

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Natural Science Research Programme of Colleges and Universities of Anhui Province under grant KJ2020ZD39, the Foundation for Talented Young People of Anhui Polytechnic University under grant 2016BJRC008, the Open Research Fund of Anhui Key Laboratory of Detection Technology and Energy Saving Devices, and Anhui Polytechnic University under grant DTESD2020A02.

References

  1. S. A. C. Schuckers, N. A. Schmid, A. Abhyankar et al., “On techniques for angle compensation in nonideal iris recognition,” IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), vol. 37, no. 5, pp. 1083–4419, 2017.
  2. S. Gao, X. D. Zhu, and Y. N. Liu, “A quality assessment method of iris image based on support vector machine,” Journal of Fiber Bioengineering & Informatics, vol. 8, no. 2, pp. 293–330, 2015.
  3. W. W. Boles and B. Boashash, “A human identification technique using images of the iris and wavelet transform,” Pattern Recognition, vol. 36, no. 8, pp. 1783–1797, 2003.
  4. D. Simon, “Biogeography-based optimization,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 6, pp. 702–713, 2008.
  5. X. Zhang, D. Wang, and H. Chen, “Improved biogeography-based optimization algorithm and its application to clustering optimization and medical image segmentation,” IEEE Access, vol. 7, pp. 28810–28825, 2019.
  6. F. Zhao, S. Qin, Y. Zhang, W. Ma, C. Zhang, and H. Song, “A hybrid biogeography-based optimization with variable neighborhood search mechanism for no-wait flow shop scheduling problem,” Expert Systems with Applications, vol. 126, no. 6, pp. 321–339, 2019.
  7. M. Kaveh and M. S. Mesgari, “Improved biogeography-based optimization using migration process adjustment: an approach for location-allocation of ambulances,” Computers & Industrial Engineering, vol. 135, pp. 800–813, 2019.
  8. B. X. Li and K. S. Low, “Low sampling rate online parameters monitoring of DC-DC converters for predictive-maintenance using biogeography-based optimization,” IEEE Transactions on Power Electronics, vol. 31, no. 4, pp. 2870–2879, 2016.
  9. Y. B. Bhushan, S. U. Bhimrao, and R. D. Koteswara, “An efficient transient stability-constrained optimal power flow using biogeography-based algorithm,” International Transactions on Electrical Energy Systems, vol. 28, no. 1, pp. 1–15, 2018.
  10. A. Kaveh, A. Dadras, and N. G. Malek, “Buckling load of laminated composite plates using three variants of the biogeography-based optimization algorithm,” Acta Mechanica, vol. 229, no. 4, pp. 1551–1566, 2018.
  11. J. S. Wang and J. D. Song, “Chaotic biogeography-based optimization algorithm,” IAENG International Journal of Computer Science, vol. 22, no. 2, pp. 122–134, 2017.
  12. Q. X. Feng, S. Y. Liu, J. K. Zhang et al., “Improved biogeography-based optimization with random ring topology and Powell’s method,” Applied Mathematical Modelling, vol. 41, no. 6, pp. 630–649, 2017.
  13. H. Li and Y. H. Liu, “Beam pattern synthesis based on improved biogeography-based optimization for reducing sidelobe level,” Computers and Electrical Engineering, vol. 60, no. 5, pp. 161–174, 2017.
  14. X. Chen and K. J. Yu, “Hybridizing cuckoo search algorithm with biogeography-based optimization for estimating photovoltaic model parameters,” Solar Energy, vol. 180, pp. 192–206, 2019.
  15. F. Zhao, S. Qin, Y. Zhang, W. Ma, C. Zhang, and H. Song, “A two-stage differential biogeography-based optimization algorithm and its performance analysis,” Expert Systems with Applications, vol. 115, pp. 329–345, 2019.
  16. W. A. Guo, M. Chen, L. Wang et al., “A survey of biogeography-based optimization,” Neural Computing & Applications, vol. 28, no. 8, pp. 1909–1926, 2017.
  17. A. R. Al-Roomi and M. E. El-Hawary, “Metropolis biogeography-based optimization,” Information Sciences, vol. 360, no. 5, pp. 73–95, 2016.
  18. H. P. Ma, “An analysis of the equilibrium of migration models for biogeography-based optimization,” Information Sciences, vol. 180, no. 18, pp. 3444–3464, 2010.
  19. S. A. Darani and O. Abdelkhalik, “Convergence analysis of hidden genes genetic algorithms in space trajectory optimization,” Journal of Aerospace Information Systems, vol. 15, no. 4, pp. 228–238, 2018.
  20. B. X. Yue, H. B. Liu, and A. Abraham, “Dynamic trajectory and convergence analysis of swarm algorithm,” Computing and Informatics, vol. 31, no. 2, pp. 371–392, 2012.
  21. F. A. Attia, “Resolvent operators of Markov processes and their applications in the control of a finite dam,” Journal of Applied Probability, vol. 26, no. 2, pp. 314–324, 1989.
  22. D. Ebenezer, “Optimum wavelet-based homomorphic medical image fusion using hybrid genetic–grey wolf optimization algorithm,” IEEE Sensors Journal, vol. 18, no. 16, pp. 6804–6811, 2018.
  23. Q.-K. Pan, H.-Y. Sang, J.-H. Duan, and L. Gao, “An improved fruit fly optimization algorithm for continuous function optimization problems,” Knowledge-Based Systems, vol. 62, pp. 69–83, 2014.
  24. R. Storn and K. Price, “Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  25. G. G. Wang, L. Dos Santos Coelho, X. Z. Gao, and S. Deb, “A new metaheuristic optimisation algorithm motivated by elephant herding behaviour,” International Journal of Bio-Inspired Computation, vol. 8, no. 6, pp. 394–409, 2016.
  26. W. Y. Gong, Z. H. Cai, and C. X. Ling, “DE/BBO: a hybrid differential evolution with biogeography-based optimization for global numerical optimization,” Soft Computing, vol. 15, no. 4, pp. 645–665, 2010.
  27. D. Li, W. Guo, L. Wang et al., “Particle swarm optimization-based solution updating strategy for biogeography-based optimization,” in Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), IEEE, Vancouver, Canada, July 2016.
  28. J. Derrac, S. García, D. Molina et al., “A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms,” Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
  29. H. Mannila, “Data mining: machine learning, statistics, and databases,” in Proceedings of the 8th International Conference on Scientific and Statistical Data Base Management, IEEE, Stockholm, Sweden, June 1999.
  30. R. Wang, R. C. Purshouse, and P. J. Fleming, “Preference-inspired coevolutionary algorithms for many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 17, no. 4, pp. 474–494, 2013.
  31. R. Wang, Z. Zhou, H. Ishibuchi, T. Liao, and T. Zhang, “Localized weighted sum method for many-objective optimization,” IEEE Transactions on Evolutionary Computation, vol. 22, no. 1, pp. 3–18, 2018.
  32. R. Wang, Q. Zhang, and T. Zhang, “Decomposition-based algorithms using pareto adaptive scalarizing methods,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 6, pp. 821–837, 2016.
  33. K. Li, T. Zhang, and R. Wang, “Deep reinforcement learning for multiobjective optimization,” IEEE Transactions on Cybernetics, 2020, in press.

Copyright © 2021 Lisheng Wei et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
