Dynamic Inertia Weight Binary Bat Algorithm with Neighborhood Search

Computational Intelligence and Neuroscience, Volume 2017, Article ID 3235720, 15 pages, https://doi.org/10.1155/2017/3235720

Research Article | Open Access

Academic Editor: Michael Schmuker
Received: 27 Feb 2017
Accepted: 20 Apr 2017
Published: 28 May 2017

Abstract

The binary bat algorithm (BBA) is a binary version of the bat algorithm (BA). It has been shown that BBA is competitive with other binary heuristic algorithms. However, since the velocity update process of BBA is the same as in BA, the algorithm also suffers from premature convergence in some cases. This paper proposes an improved binary bat algorithm (IBBA) to address this problem. To evaluate the performance of IBBA, standard benchmark functions and zero-one knapsack problems are employed. The numerical results of the benchmark function experiments show that the proposed approach greatly outperforms the original BBA and binary particle swarm optimization (BPSO). The comparison with several other heuristic algorithms on zero-one knapsack problems also verifies that the proposed algorithm is better able to avoid local minima.

1. Introduction

Many optimization problems have a binary search space, and many of them are high dimensional, so solving them by exhaustive search is infeasible. To optimize such problems, for example, unit commitment [1], feature selection [2, 3], task scheduling [4, 5], and the 0-1 knapsack problem [6, 7], binary algorithms that generate binary solutions have been proposed. For instance, Mirjalili et al. [8] adapted the standard continuous BA to binary search spaces, and BBA combined with the K-Nearest Neighbor (KNN) method was then used to solve the feature selection problem [2]. BBA provides competitive performance but, in some cases, it may get stuck in local minima. To address this issue, an improved binary bat algorithm, named IBBA, is proposed. IBBA carries out a more diversified search process. The main contributions of the paper can be summarized as follows:
(1) An improved high-performance binary bat algorithm is proposed for binary problems. Using a neighbor bat and a dynamic inertia weight strategy, the proposed approach is better able to avoid being trapped in local minima.
(2) To evaluate its performance, the proposed IBBA and several other algorithms are implemented on benchmark functions and zero-one knapsack problems. The results obtained show that IBBA outperforms the other algorithms.

The organization of the paper is as follows: a compact overview of BA and BBA is given in Section 2, together with a literature review on inertia weight strategies. Section 3 presents the improved structure of IBBA. The experimental results on benchmark functions and zero-one knapsack problems are reported in Sections 4 and 5, respectively. Section 6 discusses why IBBA performs better than the other algorithms. Finally, Section 7 concludes the paper.

2. Background

This section provides a brief overview of BA and BBA. A literature review on inertia weight strategies is also presented.

2.1. The Bat Algorithm

Inspired by the echolocation behavior of bats, Yang proposed the bat algorithm [9]. When bats chase prey, they decrease the loudness and increase the rate of their emitted ultrasonic pulses. These characteristics of real bats have been used in developing the BA. The basic steps of BA are mathematically described as follows [9].

In the BA, each bat has three vectors, a frequency vector, a velocity vector, and a position vector, which are updated at time step t according to (1), (2), and (3):

v_i^t = v_i^(t-1) + (x_i^(t-1) - x_*) f_i,  (1)

x_i^t = x_i^(t-1) + v_i^t,  (2)

where x_* represents the best position obtained so far and f_i represents the frequency of the ith bat, which is updated as follows:

f_i = f_min + (f_max - f_min) β,  (3)

where β in the range [0, 1] is a random vector drawn from a uniform distribution. From (1) and (3), it is obvious that different frequencies promote the exploration capability of the bats toward the optimal solution.

These equations, to a certain extent, can guarantee the exploitation capability of the BA. However, to perform the intensification better, a random walk operation has also been employed:

x_new = x_old + ε A^t,  (4)

where x_old is one solution selected randomly among the current best solutions, ε is a random number in the range [-1, 1], and A^t indicates the average loudness of all bats at this time step. The pseudocode of the BA is shown in Algorithm 1. Note that rand is a random number uniformly distributed in the range [0, 1]. To an extent, BA can be considered a balanced combination of global search and intensive local search. The pulse emission rate (r_i) and the loudness (A_i) control the balance between these two search techniques. As r_i increases, artificial bats tend to perform diversification rather than intensification. These two parameters are updated as follows:

A_i^(t+1) = α A_i^t,  (5)

r_i^(t+1) = r_i^0 [1 - exp(-γ t)],  (6)

where α and γ are constants and α plays the same role as the cooling factor in SA [10]. To guarantee that the artificial bats move toward the optimal solutions, both the loudness and the emission rate are updated only when better solutions are found. For any 0 < α < 1 and γ > 0,

A_i^t → 0,  r_i^t → r_i^0,  as t → ∞.  (7)

For simplicity, α = γ = 0.9 can be used.

Initialize the bat population x_i (i = 1, 2, ..., n) and velocities v_i;
Define pulse frequency f_i at x_i;
Initialize pulse rates r_i and the loudness A_i;
while (t < Max number of iterations) do
  Generate new solutions by adjusting frequency, updating
  velocities and positions [Eq. (1) to (3)];
  if (rand > r_i) then
    Select a solution among the best solutions randomly;
    Generate a local solution around the selected best solution;
  end if
  Generate a new solution by flying randomly;
  if (rand < A_i and f(x_i) < f(x_*)) then
    Accept the new solutions;
    Increase r_i and reduce A_i;
  end if
  Rank the bats and find the current best x_*;
end while
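To make the update rules above concrete, the following is a minimal NumPy sketch of one BA iteration for a minimization problem. It is an illustration only, not the authors' implementation; the names (ba_step, objective) and the assumption that objective is vectorized over rows of the population array are ours.

import numpy as np

rng = np.random.default_rng(0)

def ba_step(x, v, loudness, pulse_rate, r0, best, objective,
            f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9, t=1):
    # One BA iteration; `objective` maps an (n, d) array to n fitness values.
    n, d = x.shape
    beta = rng.random((n, 1))
    freq = f_min + (f_max - f_min) * beta              # Eq. (3)
    v = v + (x - best) * freq                          # Eq. (1)
    x_new = x + v                                      # Eq. (2)

    # Local random walk around the best solution, Eq. (4)
    walk = rng.random(n) > pulse_rate
    eps = rng.uniform(-1.0, 1.0, (int(walk.sum()), d))
    x_new[walk] = best + eps * loudness.mean()

    # Greedy acceptance controlled by loudness, then update A and r, Eqs. (5)-(6)
    accept = (rng.random(n) < loudness) & (objective(x_new) < objective(x))
    x[accept] = x_new[accept]
    loudness[accept] *= alpha
    pulse_rate[accept] = r0[accept] * (1.0 - np.exp(-gamma * t))

    best = x[np.argmin(objective(x))].copy()           # rank the bats, keep x_*
    return x, v, loudness, pulse_rate, best

For example, objective can be the sphere function, lambda X: (X ** 2).sum(axis=1).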
2.2. Binary Bat Algorithm

The binary bat algorithm (BBA) was proposed by Mirjalili et al. [8] to solve optimization problems with a binary search space. The structure of BBA is almost the same as that of the original BA, and the velocity and frequency are still defined in continuous space. BBA makes two changes to the original BA:
(i) The position vector is no longer a continuous-valued vector but a bit string.
(ii) The random walk operation of (4) is no longer suitable for a binary search space. Instead, a simpler operation is adopted.

The position update equation for BBA changes to

x_i^k(t + 1) = (x_i^k(t))^(-1) if rand < V(v_i^k(t + 1)), and x_i^k(t) otherwise,  (8)

where

V(v_i^k(t)) = |(2/π) arctan((π/2) v_i^k(t))|,  (9)

and x_i^k(t) and v_i^k(t) indicate the position and velocity of the ith artificial bat at iteration t in the kth dimension, and (x_i^k(t))^(-1) represents the complement of x_i^k(t).

The operation demonstrated by (4) for BBA changes to

x_new = x_old,  (10)

where x_old still denotes a solution selected randomly from the current best solutions.
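As an illustration of the V-shaped transfer function and the resulting bit-flip update of (8)-(9), a small Python sketch is given below; the function names are ours, and the code is not taken from the original BBA implementation.

import numpy as np

rng = np.random.default_rng(1)

def v_shaped(v):
    # V-shaped transfer function, Eq. (9)
    return np.abs((2.0 / np.pi) * np.arctan((np.pi / 2.0) * v))

def bba_position_update(x_bits, v):
    # Flip each bit with probability given by the transfer function, Eq. (8)
    flip = rng.random(x_bits.shape) < v_shaped(v)
    x_new = x_bits.copy()
    x_new[flip] = 1 - x_new[flip]          # take the complement of the bit
    return x_new

x = rng.integers(0, 2, size=10)            # example 10-bit position
v = rng.normal(size=10)                    # example velocity vector
print(bba_position_update(x, v))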

2.3. Inertia Weight Strategies

In some heuristic algorithms, especially PSO [11], the inertia weight strategy plays an important role in balancing the global and local search processes. The inertia weight determines the contribution of a particle's old velocity to its new velocity at the current time step. Shi and Eberhart [12] introduced the concept of inertia weight with a constant inertia weight strategy and demonstrated that a large inertia weight enhances exploration while a small inertia weight enhances exploitation. Since then, various dynamic inertia weight strategies have been proposed to improve the capabilities of PSO; they can be categorized into three classes: constant and random inertia weight strategies, time-varying inertia weight strategies, and adaptive inertia weight strategies. A compact literature review of inertia weight strategies is presented in the following paragraphs.

Eberhart and Shi [13] presented a random inertia weight strategy which was demonstrated to be more suitable for dynamic problems. Khan et al. [14] proposed a modified PSO by introducing a mutation mechanism and using dynamic algorithm parameters. To increase the diversity of the particles, the inertia weight of each particle adopts a random updating formula.

Most PSO variants employ time-varying inertia weight strategies in which the value of the inertia weight is adjusted based on the iteration number. In [15], a linearly decreasing inertia weight was proposed and shown to be effective in enhancing the fine-tuning performance of PSO. Inspired by the idea of decreasing the inertia weight over time, a nonlinearly decreasing inertia weight strategy was proposed [16]. Gao et al. [17] presented a PSO variant which combined a chaos mutation operator with a logarithmically decreasing inertia weight. Also based on the idea of a decreasing inertia weight, Chen et al. [18] proposed two natural exponent inertia weight strategies for continuous optimization problems.

Adaptive inertia weight strategies are another research trend; they monitor the search situation and adjust the inertia weight value according to one or more feedback parameters. Nickabadi et al. [19] proposed a new adaptive inertia weight approach that employs the success rate of the swarm as the feedback parameter to ascertain the particles' situation in the search space. Zhan et al. [20] presented an adaptive particle swarm optimization (APSO) which provides better search efficiency than classical PSO. Yang et al. [21] used a speed factor and an aggregation degree factor to adapt the value of the inertia weight.

3. Improved Binary Bat Algorithm

BA combines many merits of previously proposed heuristic algorithms, such as PSO [11] and SA [10]. It therefore stands to reason that the velocity and position update process in BA has many similarities with PSO. When the velocity update equation (1) is analyzed, it is obvious that it consists of two parts. The first item (v_i^(t-1)) carries the previous velocity of the bat, and the second item ((x_i^(t-1) - x_*) f_i) controls the velocity of the ith bat under the guidance of the global best solution (x_*). The first and second items of the equation drive the algorithm to perform global and local search, respectively. It has been shown that the first item of (1) may reduce the convergence rate rapidly and the second item of (1) may cause premature convergence [22]. To solve this problem, some improved BA algorithms have been proposed recently [22–25]. Because the structure of BBA is effectively the same as that of the original BA, BBA likewise is not good enough at exploration and exploitation.

EBA [22] illustrates that the algorithm can produce better solutions with the guidance of a neighbor bat (the jth solution). For this purpose, inspired by [12], the velocity update equation of the original BBA is modified as follows:

v_i^t = w v_i^(t-1) + (x_* - x_i^(t-1)) f_i λ + (x_j^(t-1) - x_i^(t-1)) f_i (1 - λ),  (11)

where w denotes the inertia weight factor, which balances the local and global search intensity of the ith solution by controlling the value of the old velocity v_i^(t-1); x_j represents one of the best solutions randomly selected from the population (j ≠ i); λ is the self-adaptive learning factor of the global best solution (x_*), ranging from 0 to 1; and 1 - λ, the learning factor of the jth solution, therefore ranges from 1 to 0. Since the information of the jth solution is used to guide the ith solution, the algorithm is better able to avoid local minima. As λ increases, the effect of the global best solution (x_*) becomes greater than that of the jth neighbor solution (x_j), and vice versa. The update equation for λ is as follows:

λ = λ_0 + (1 - λ_0) (iter / Max_iteration)^n,  (12)

where λ_0 denotes the initial impact factor of λ, Max_iteration represents the maximum number of iterations, iter indicates the current iteration number, and n indicates a nonlinear modulation index. As iter increases, λ increases nonlinearly from λ_0 to 1, and 1 - λ decreases from (1 - λ_0) to 0 correspondingly. With a small λ and a large 1 - λ, bats are allowed to fly around the search space instead of flying toward the swarm best. On the other hand, a large λ and a small 1 - λ allow the bats to converge to the global optimum solution in the latter stages of the search process. Therefore, the proposed approach can effectively control the global search and enhance convergence to the global best solution during the latter part of the optimization. Thus, the search can switch from exploration to exploitation. The curves of λ and 1 - λ are illustrated in Figure 1 for λ_0 = 0.6 and n = 2.
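The following short Python sketch shows how (11) and (12) can be computed as described above; it is a reconstruction from the text, with illustrative names, rather than the authors' code.

import numpy as np

def learning_factor(iteration, max_iteration, lam0=0.6, n=2):
    # Eq. (12): lambda grows nonlinearly from lam0 to 1 over the run
    return lam0 + (1.0 - lam0) * (iteration / max_iteration) ** n

def ibba_velocity(v_prev, x, x_best, x_neighbor, freq, w, lam):
    # Eq. (11): inertia term + global-best term + neighbor-bat term
    return (w * v_prev
            + (x_best - x) * freq * lam
            + (x_neighbor - x) * freq * (1.0 - lam))

# Example: one bat in a 10-bit search space at iteration 100 of 500
rng = np.random.default_rng(2)
x = rng.integers(0, 2, 10).astype(float)
x_best = rng.integers(0, 2, 10).astype(float)
x_neighbor = rng.integers(0, 2, 10).astype(float)
lam = learning_factor(100, 500)
v = ibba_velocity(np.zeros(10), x, x_best, x_neighbor, freq=1.3, w=0.9, lam=lam)
print(lam, v)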

Inspired by [26], a dynamic inertia weight strategy is used to control the magnitude of the velocity. Under this strategy, the inertia weight w decreases from its maximal value w_max as the current iteration number iter grows toward the total number of iterations Max_iteration, at a rate governed by a constant u that is larger than 1.

It is crucial to balance large-scale exploration and exploitation. Once the algorithm has located the approximate position of the global optimum, the refined local exploitation search needs to be strengthened to reach the global optimum. To dynamically control the switching point between global search and local search, an adaptive strategy is designed: if the current global best solution has not improved after T iterations, the algorithm switches to intensive exploitation search with a smaller inertia weight w_(k+1); otherwise, it continues the exploration search with the current w_k. Here w_k and w_(k+1) represent the kth and (k+1)th values taken by w, respectively, and T indicates an interval of a definite number of iterations.
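A minimal sketch of this adaptive control is given below. The exact reduction rule for w used in the paper is not reproduced here; the halving step and the floor value are placeholder assumptions that only illustrate the stagnation-triggered switch from exploration to exploitation.

def adaptive_inertia(w_current, stall_count, interval_T=50, shrink=0.5, w_min=0.05):
    # Shrink w only when the global best has stagnated for T iterations
    if stall_count >= interval_T:
        return max(w_current * shrink, w_min), 0   # smaller w -> exploitation
    return w_current, stall_count                  # keep exploring with current w

# Example bookkeeping inside the main loop (illustrative only):
w, stall = 0.9, 0
for iteration in range(500):
    improved = (iteration % 7 == 0)                # stand-in for a real improvement check
    stall = 0 if improved else stall + 1
    w, stall = adaptive_inertia(w, stall, interval_T=50)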

According to the descriptions above, the benefit of the proposed improved binary bat algorithm (IBBA) is that it promotes the dispersion of the solutions across the binary search space. In addition, more accurate results can be obtained.

4. Benchmark Experiment

To verify the performance of the proposed algorithm, thirteen benchmark functions [27] are employed. These test functions fall into two groups: unimodal and multimodal functions. Tables 1 and 2 list the benchmark functions of the two groups, respectively, where Range denotes the search boundary of the function. The dimension of every test function is 5. The global minimum value of every benchmark function used is 0, except for the Schwefel function, whose global minimum is -418.9829 × Dim.


Table 1: Unimodal benchmark functions (function and search range).

Table 2: Multimodal benchmark functions (function and search range).
For this simulation, 15 bits are used to represent each continuous variable in binary. Thus, the dimension of the bit vector generated for each benchmark function is calculated as

D = 15 × Dim,

where D is the dimension of each artificial bat in IBBA and Dim represents the dimension of a particular benchmark function. Therefore, the dimension of each corresponding binary problem is 75 (15 × 5).
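As an example of how such a bit string can be mapped back to real-valued variables, the following sketch decodes each 15-bit chunk linearly onto the search range; the linear mapping and the function names are our assumptions, since the paper only states that 15 bits encode each variable.

import numpy as np

BITS = 15

def decode(bit_vector, lower, upper, dim):
    # Convert a (15 * dim)-bit string into dim real values in [lower, upper]
    bits = np.asarray(bit_vector).reshape(dim, BITS)
    weights = 2 ** np.arange(BITS - 1, -1, -1)          # most significant bit first
    ints = bits @ weights                               # integer in [0, 2^15 - 1]
    return lower + (upper - lower) * ints / (2 ** BITS - 1)

# Example: decode a random 75-bit bat into 5 variables in [-100, 100]
rng = np.random.default_rng(3)
x_bits = rng.integers(0, 2, size=BITS * 5)
print(decode(x_bits, -100.0, 100.0, 5))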

The simulation compares the performance of IBBA, BBA [8], and BPSO [28]. Some basic parameters must be initialized before running these algorithms; the initial parameters are listed in Table 3. Tables 4 and 5 present the experimental results. The results of each algorithm are averaged over 30 independent runs, and the best results are denoted in bold type. The mean (Mean), standard deviation (SD), and median (Med) of the optimal solution in the last iteration are reported.


Table 3: Initial parameters of the algorithms.

Alg.   Parameter                         Value

IBBA   Number of bats                    30
       Minimum frequency f_min           0
       Maximum frequency f_max           2
       Pulse emission rate r             0.25
       Loudness A                        0.5
       α                                 0.9
       γ                                 0.9
       Maximum inertia weight w_max      0.9
       Modulation index n                2
       Constant u                        50
       Iteration interval T              50
       Initial impact factor λ_0         0.6
       Max iteration                     500
       Stopping criterion                Max iteration

BBA    Number of particles               30
       Minimum frequency f_min           0
       Maximum frequency f_max           2
       Pulse emission rate r             0.25
       Loudness A                        0.5
       α                                 0.9
       γ                                 0.9
       Max iteration                     500
       Stopping criterion                Max iteration

BPSO   Number of particles               30
       c1, c2                            2, 2
       Inertia weight w                  Decreased linearly from 0.9 to 0.4
       Max iteration                     500
       Max velocity                      6
       Stopping criterion                Max iteration


Table 4: Mean, SD, and Med of the results obtained by IBBA, BBA, and BPSO on the unimodal benchmark functions.

Table 5: Mean, SD, and Med of the results obtained by IBBA, BBA, and BPSO on the multimodal benchmark functions.