Research Article | Open Access

Volume 2018 | Article ID 3847951 | https://doi.org/10.1155/2018/3847951

Xiao-Xu Ma, Jie-Sheng Wang, "Optimized Parameter Settings of Binary Bat Algorithm for Solving Function Optimization Problems", Journal of Electrical and Computer Engineering, vol. 2018, Article ID 3847951, 9 pages, 2018. https://doi.org/10.1155/2018/3847951

Optimized Parameter Settings of Binary Bat Algorithm for Solving Function Optimization Problems

Academic Editor: Jit S. Mandeep
Accepted: 11 Apr 2018
Published: 14 May 2018

Abstract

The bat algorithm (BA) is a bionic intelligent optimization algorithm that simulates the foraging behavior and echolocation principle of bats. The parameter initialization of the binary bat algorithm (BBA) discussed here has an important influence on its convergence speed, convergence precision, and global search ability. The convergence speed and search precision are determined mainly by the pulse loudness and the pulse emission rate. Simulation experiments are carried out on six typical test functions to study this influence. The simulation results show that the convergence speed of the BBA is relatively sensitive to the parameter settings: the convergence precision decreases when the pulse emission rate is increased alone, and the convergence speed increases when the launch loudness is decreased alone. A proper combination of the BBA parameters (the pulse emission rate and the launch loudness) can therefore improve both the convergence speed of the algorithm and the accuracy of the obtained solutions.

1. Introduction

Optimization is the selection of the best element from a set of available alternatives with regard to some criterion. Optimization algorithms trade off computational efficiency against global search ability in different ways and have a vast variety of applications in research and industry [1]. Function optimization provides a formalized framework for modelling and solving such problems: given an objective function that takes a number of parameters as inputs, the goal is to find the combination of parameter values that returns the best objective value. This framework is abstract enough that a wide variety of different problems can be interpreted as function optimization problems [2].

However, traditional function optimization algorithms were designed for typical low-dimensional problems and are often not applicable in practice, so researchers have turned to nature, which provides rich models for solving such problems (fireflies, bats, ants, and so on). By simulating natural biological systems, swarm intelligence optimization algorithms were developed; these models inspire computer scientists to use nontraditional tools to solve practical application problems [3]. Many swarm intelligence optimization algorithms have now been proposed, such as particle swarm optimization (PSO) [4], the ant colony algorithm (ACO) [5], the bat algorithm (BA) [6], the social learning optimization (SLO) algorithm [7], and the chicken swarm optimization (CSO) algorithm [8]. They have been used in dictionary learning for remote sensing data, automotive safety integrity level allocation, economic dispatch, and QoS-aware cloud service composition. Clearly, swarm intelligence optimization has become an important research direction. The bat algorithm (BA), a swarm intelligence heuristic search algorithm proposed by Professor Yang in 2010, is an effective method for searching for the global optimal solution [9]. BA has attracted more and more attention because of its simplicity, few parameters, strong robustness, and ease of implementation. BA was first proposed for problems with a continuous real search space; however, many optimization problems have a discrete binary search space, so the binary bat algorithm (BBA) was put forward to solve this kind of problem [10]. Many scholars have since studied the algorithm and shown that it is competitive with other algorithms.
BA has been applied to multiobjective function optimization with an artificial neural network model [11], economic operation [6], and economic load dispatch in wind power systems [12]. BBA has been applied to simulation test point selection [13], low-speed rolling bearing fault diagnosis [14], and the optimization of echo state networks [15].

The parameter initialization of the binary bat algorithm (BBA) has an important influence on its convergence speed, convergence precision, and global search ability. The convergence speed and search precision are determined mainly by the pulse emission rate and the launch loudness. In this paper, function optimization problems are solved with the BBA, and parameter performance comparisons and analyses are carried out through simulation experiments in order to verify the algorithm's behavior. The paper is organized as follows. Sections 2 and 3 introduce the bat algorithm and the binary bat algorithm. Section 4 describes the simulation experiments and analyzes the results in detail. Finally, Section 5 concludes the paper.

2. Bat Algorithm

The bat algorithm is an intelligent optimization algorithm inspired by the echolocation behavior of bats [9]. Echolocation works as a type of sonar: bats, mainly microbats, emit loud, short sound pulses. When the pulses hit an object, the echo returns to the bats' ears after a fraction of a second, and in this way the bat detects the location of its prey. This remarkable orientation mechanism also enables bats to distinguish an obstacle from prey and allows them to hunt even in complete darkness. The bat algorithm simulates this foraging behavior under the following idealized assumptions:

(1) All bats use echolocation to sense distance, and they also know the difference between food/prey and the background barriers in some magical way.

(2) A bat flies randomly with velocity v_i at position x_i with a fixed frequency f_min, varying wavelength λ, and loudness A_0 to search for prey. Bats can automatically adjust the wavelength (or frequency) of their emitted pulses and adjust the rate of pulse emission r depending on the proximity of their targets.

(3) Although the loudness can vary in many ways, Yang assumes that the loudness varies from a large positive value A_0 to a minimum constant value A_min.

On the basis of these three idealized assumptions, the algorithm generates a set of solutions randomly and then searches for the optimal solution iteratively, strengthening the local search during the process. By producing local solutions near the current optimum through random flights, BA finally finds the global optimal solution.

The foraging space of the bats is d-dimensional. At time t, the location and flight velocity of the ith bat are x_i^t and v_i^t, respectively, and x* is the current global optimal location. At time t, the frequency, velocity, and position of the ith bat are updated by using the following equations:

f_i = f_min + (f_max − f_min) · β, (1)

v_i^t = v_i^(t−1) + (x_i^(t−1) − x*) · f_i, (2)

x_i^t = x_i^(t−1) + v_i^t, (3)

where f_min and f_max are the minimum and maximum frequencies of the sound waves emitted by the bats, respectively, and β is a random number obeying the uniform distribution on [0, 1]. When setting up the initial values, the frequency f_i of the sound waves launched by each bat obeys the uniform distribution on [f_min, f_max]. First, the frequencies are obtained according to (1); then the velocities and positions are updated according to (2) and (3).

For the local search, each bat carries out a random walk around the optimal solution. The following equation is used to produce a new solution:

x_new = x_old + ε · A^t, (4)

where ε is a random number obeying the uniform distribution on [−1, 1], x_old is a solution randomly selected from the current optimal solutions, and A^t is the average loudness of all bats at the tth iteration.

The update rules for the loudness and the pulse emission rate of a bat are as follows: if a bat finds prey, it reduces its loudness and increases its rate of pulse emission. In BA, the loudness A_i and the pulse emission rate r_i are adjusted by the following equations:

A_i^(t+1) = α · A_i^t, (5)

r_i^(t+1) = r_i^0 · [1 − exp(−γ · t)], (6)

where r_i^0 is the initial pulse emission rate and A_i^0 is the initial loudness, both selected randomly. α and γ are constants (0 < α < 1; γ > 0).
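The update loop described by (1)–(6) can be sketched in Python. This is a minimal illustration, not the authors' implementation: the bounds, frequency range, α = γ = 0.9, and the random initialization of A_i and r_i^0 are assumptions chosen for the sketch.

```python
import numpy as np

def bat_algorithm(obj, dim, n_bats=30, n_iter=500,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
                  lower=-5.0, upper=5.0, seed=0):
    """Minimal continuous bat algorithm sketch following Eqs. (1)-(6)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n_bats, dim))   # positions x_i
    v = np.zeros((n_bats, dim))                    # velocities v_i
    A = rng.uniform(1.0, 2.0, n_bats)              # loudness A_i (assumed range)
    r0 = rng.uniform(0.0, 1.0, n_bats)             # initial pulse rates r_i^0
    r = r0.copy()
    fit = np.array([float(obj(xi)) for xi in x])
    best_i = int(fit.argmin())
    best, best_fit = x[best_i].copy(), float(fit[best_i])
    for t in range(1, n_iter + 1):
        for i in range(n_bats):
            f = f_min + (f_max - f_min) * rng.random()     # Eq. (1)
            v[i] = v[i] + (x[i] - best) * f                # Eq. (2)
            cand = np.clip(x[i] + v[i], lower, upper)      # Eq. (3)
            if rng.random() > r[i]:
                # local random walk around the best solution, Eq. (4)
                cand = np.clip(best + rng.uniform(-1.0, 1.0, dim) * A.mean(),
                               lower, upper)
            f_cand = float(obj(cand))
            # accept improving solutions with probability A_i; then anneal
            if f_cand <= fit[i] and rng.random() < A[i]:
                x[i], fit[i] = cand, f_cand
                A[i] = alpha * A[i]                        # Eq. (5)
                r[i] = r0[i] * (1.0 - np.exp(-gamma * t))  # Eq. (6)
            if f_cand < best_fit:                          # track global best
                best, best_fit = cand.copy(), f_cand
    return best, best_fit
```

For example, `bat_algorithm(lambda z: float(np.sum(z ** 2)), dim=5)` drives the sphere function toward its minimum at the origin; the loudness decay in (5) gradually shrinks the random-walk step, turning coarse exploration into fine local search.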

3. Binary Bat Algorithm

A binary search space can be considered a hypercube, and the search agents (particles) of a binary optimization algorithm can only shift to nearer or farther corners of this hypercube by flipping various numbers of binary bits. Hence, when designing the binary version of BA, the basic velocity and position updating rules must be modified. In the continuous version of BA, the artificial bats move around the search space using position and velocity vectors within the continuous real domain; in discrete binary space, updating a position means switching between "0" and "1." This switching should be done based on the velocities of the agents. In other words, a transfer function defines the probability of changing a position vector element from 0 to 1 and vice versa, forcing the particles to move in a binary space. The transfer function of the discussed binary BA is defined in (7) and Figure 1 [10]:

S(v_i^k(t)) = 1 / (1 + e^(−v_i^k(t))), (7)

where v_i^k(t) is the velocity of particle i in dimension k at iteration t.

After calculating the probabilities with the transfer function, a new position updating equation is necessary to update the particles' positions:

x_i^k(t + 1) = 1 if rand < S(v_i^k(t + 1)), and x_i^k(t + 1) = 0 otherwise, (8)

where x_i^k(t) and v_i^k(t) indicate the position and velocity of the ith particle at the tth iteration in the kth dimension, and rand is a uniform random number in [0, 1].

This method has a drawback: the particles are forced to take values of 0 or 1 and cannot remain in their current positions as their velocity values increase. However, according to the concepts mentioned above for designing a transfer function, a better way is to oblige particles with high absolute velocities to switch their positions. A v-shaped transfer function and a corresponding position updating rule are proposed to do this, as in (9) and Figure 2:

V(v_i^k(t)) = |(2/π) · arctan((π/2) · v_i^k(t))|,

x_i^k(t + 1) = ¬x_i^k(t) if rand < V(v_i^k(t + 1)), and x_i^k(t + 1) = x_i^k(t) otherwise, (9)

where x_i^k(t) and v_i^k(t) indicate the position and velocity of the ith particle at the tth iteration in the kth dimension and ¬x_i^k(t) is the complement of x_i^k(t).
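The two transfer functions and their position-updating rules in (7)–(9) can be sketched as follows; the function names and the array-based formulation are illustrative choices, not part of the original paper.

```python
import numpy as np

def s_transfer(v):
    """S-shaped transfer function, Eq. (7): S(v) = 1 / (1 + e^{-v})."""
    return 1.0 / (1.0 + np.exp(-v))

def v_transfer(v):
    """V-shaped transfer function, Eq. (9): V(v) = |(2/pi) arctan((pi/2) v)|."""
    return np.abs((2.0 / np.pi) * np.arctan((np.pi / 2.0) * v))

def update_position_s(x, v, rng):
    """Eq. (8): set each bit to 1 with probability S(v), to 0 otherwise."""
    return (rng.random(np.shape(x)) < s_transfer(v)).astype(int)

def update_position_v(x, v, rng):
    """V-shaped rule in Eq. (9): flip each bit with probability V(v), else keep it."""
    x = np.asarray(x)
    flip = rng.random(x.shape) < v_transfer(v)
    return np.where(flip, 1 - x, x)
```

Note the qualitative difference: with the S-shaped rule a large |v| pins the bit toward 1, whereas with the v-shaped rule a large |v| makes the bit likely to *switch*, which is exactly the behavior the text argues for.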

4. Simulation Experiments and Results Analysis

4.1. Test Functions

In the simulation experiments, six typical functions are adopted to verify the performance of BBA. The simulation environment is a Windows 10 operating system with a 2.40 GHz processor and 3 GB of memory. The test functions are shown in Table 1, where f1–f3 are unimodal functions and f4–f6 are multimodal functions.

Table 1: The expression, search range, and minimum value of each test function.
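The expressions of the six benchmark functions are not reproduced here; as a hedged illustration of the two categories in Table 1, the sphere and Rastrigin functions are standard stand-ins for a unimodal and a multimodal benchmark, respectively, each with minimum value 0 at the origin.

```python
import numpy as np

def sphere(x):
    """Unimodal stand-in benchmark: f(x) = sum(x_j^2), minimum 0 at x = 0."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

def rastrigin(x):
    """Multimodal stand-in benchmark: f(x) = 10 d + sum(x_j^2 - 10 cos(2 pi x_j)),
    minimum 0 at x = 0, with many regularly spaced local minima."""
    x = np.asarray(x, dtype=float)
    return float(10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)))
```

Unimodal functions like the sphere mainly test convergence speed, while multimodal functions like Rastrigin test the ability to escape local minima, which is why the paper separates f1–f3 from f4–f6.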
4.2. Simulation Experiments and Results Analysis
4.2.1. Change of a Single Variable A

The initialization parameters of BBA are set as follows: the population size (noP) is 30, the number of iterations (Max_iteration) is 500, and the problem dimension (noV) is 100. In order to reduce the influence of random disturbances, each test function is run independently 10 times. The optimal and average values of BBA under different launch loudness A are shown in Table 2. The simulation results for the six test functions are shown in Figure 3.

Table 2: Simulation results of BBA under different A.

Function | Result  | A = 0.25 | A = 0.4  | A = 0.6  | A = 0.9   | Minimum time (s)
f1       | Average | 1.6      | 1.05     | 0.656    | 1.074     | 10.065
f1       | Std.    | 3.6423   | 3.3974   | 3.0026   | 2.913     |
f1       | Optimum | 2        | 2        | 2        | 0         |
f2       | Average | 26.58    | 28.176   | 27.068   | 26.984    | 11.618
f2       | Std.    | 6.4989   | 7.8753   | 6.0687   | 6.5658    |
f2       | Optimum | 25       | 25       | 29       | 27        |
f3       | Average | 0.42459  | 0.13147  | 0.55776  | 0.16727   | 11.112
f3       | Std.    | 0.30212  | 0.30663  | 0.39227  | 0.33251   |
f3       | Optimum | 0.39603  | 0.55776  | 0.20291  | 0.39603   |
f4       | Average | −83.269  | −83.287  | −83.282  | −82.781   | 12.209
f4       | Std.    | 2.5872   | 2.7421   | 3.194    | 2.9798    |
f4       | Optimum | −83.306  | −82.464  | −84.147  | −83.306   |
f5       | Average | 0.040308 | 0.0072643| 0.012536 | 0.0078837 | 11.426
f5       | Std.    | 0.040308 | 0.028619 | 0.037926 | 0.031519  |
f5       | Optimum | 0.0061266| 0.011729 | 0.0068217| 0.0053477 |
f6       | Average | 1.172    | 1.008    | 0.85     | 0.986     | 11.01
f6       | Std.    | 3.4621   | 2.9782   | 3.608    | 2.7468    |
f6       | Optimum | 1        | 3        | 1        | 1         |
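The 10-run protocol that produces the Average, Std., and Optimum rows of Table 2 can be sketched as below. The `random_search` optimizer is a hypothetical stand-in used only to make the sketch self-contained; in the paper the optimizer is the BBA itself.

```python
import numpy as np

def random_search(obj, dim=5, n_iter=500, lower=-5.0, upper=5.0, seed=0):
    """Hypothetical stand-in optimizer used only to illustrate the protocol."""
    rng = np.random.default_rng(seed)
    best, best_fit = None, np.inf
    for _ in range(n_iter):
        cand = rng.uniform(lower, upper, dim)   # sample a candidate solution
        f = float(obj(cand))
        if f < best_fit:                        # keep the best value seen so far
            best, best_fit = cand, f
    return best, best_fit

def run_trials(optimizer, obj, n_trials=10, seed0=0):
    """Repeat the optimizer over independent seeds and summarise the results,
    mirroring the Average / Std. / Optimum rows reported in Table 2."""
    vals = np.array([optimizer(obj, seed=seed0 + k)[1] for k in range(n_trials)])
    return {"Average": float(vals.mean()),
            "Std.": float(vals.std()),
            "Optimum": float(vals.min())}
```

Calling `run_trials(random_search, lambda z: float(np.sum(z ** 2)))` yields one (Average, Std., Optimum) triple; repeating this for each parameter setting fills one column of the table.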

It can be seen from the convergence curves and the numerical results of the six functions after 500 iterations and 10 independent runs that the optimization ability increases gradually as A changes from 0.25 to 0.9. Functions f3 and f4 obtain their optimal values when A = 0.6, and functions f1, f5, and f6 obtain their optimal values when A = 0.9. The convergence curves of the six functions differ markedly in both shape and speed.

Compared with the others, some convergence curves are considerably more volatile. The influence of A on the search capability of a function is relatively small, and the running time is related to the function's complexity. The convergence trends show that the convergence rate does not increase or decrease regularly with increasing A; it also has a certain relationship with the solution space, and its effect differs from simply taking the maximum or minimum value of A. Hence, each function has its own optimal value of A: when A is 0.4, the optimization effect on function f2 is best; when A is 0.6, the optimization effects on functions f3 and f4 are best; and when A is 0.9, the optimization effects on functions f1, f5, and f6 are best.

4.2.2. Change of a Single Variable r

In view of r (the rate of bat transmitted pulse), the simulation experiments are carried out according to (6), and the simulation results are shown in Table 3 and Figure 4.

Table 3: The simulation results of BBA under different r.

Function | A   | r = 0.2 | r = 0.5 | r = 2   | r = 10  | Minimum time (s)
f1       | 0.9 | 10      | 18      | 24      | 30      | 9.844
f2       | 0.4 | 37      | 69      | 85      | 81      | 11.263
f3       | 0.6 | 1.9033  | 1.7515  | 2.0421  | 1.9391  | 11.372
f4       | 0.6 | −77.415 | −66.476 | −62.269 | −60.586 | 11.934
f5       | 0.9 | 0.1069  | 0.18355 | 0.34503 | 0.31911 | 11.482
f6       | 0.9 | 8       | 19      | 24      | 28      | 11.083

When r = 0.2, the test functions f1, f2, f4, f5, and f6 obtain their optimal solutions; function f3 obtains its optimal solution when r = 0.5. As r increases, the solution quality of functions f1, f2, and f6 degrades gradually, and the solutions of functions f3, f4, and f5 also tend to worsen, though less sharply. The running times of the functions are almost the same. The convergence curves show that the optimization precision becomes worse and the convergence becomes slower as r increases. In short, for the different functions, the smaller r is, the better the convergence performance and the stronger the optimization ability.

4.2.3. Change of Variables A and r

In view of A (the loudness of bat transmitted pulse) and r (the rate of bat transmitted pulse), the simulation experiments are carried out and the simulation results are shown in Table 4 and Figure 5.

Table 4: The simulation results of BBA under different A and r.

Function | A    | r   | Optimum   | Average
f1       | 0.9  | 0.1 | 0         | 0.79
f1       | 0.9  | 0.2 | 8         | 9.272
f1       | 0.6  | 0.1 | 1         | 0.58
f1       | 0.6  | 0.2 | 8         | 7.474
f2       | 0.4  | 0.1 | 1         | 1.292
f2       | 0.4  | 0.2 | 7         | 7.07
f2       | 0.25 | 0.1 | 0         | 1.152
f2       | 0.25 | 0.2 | 9         | 9.376
f3       | 0.6  | 0.1 | 0         | 0.772
f3       | 0.6  | 0.5 | 22        | 16.872
f3       | 0.4  | 0.1 | 1         | 1.158
f3       | 0.4  | 0.5 | 20        | 17.022
f4       | 0.6  | 0.1 | 7         | 7.15
f4       | 0.6  | 0.2 | 5         | 6.568
f4       | 0.4  | 0.1 | 1         | 0.75
f4       | 0.4  | 0.2 | 9         | 9.03
f5       | 0.9  | 0.1 | 0.0068217 | 0.010572
f5       | 0.9  | 0.2 | 0.0751    | 0.072148
f5       | 0.6  | 0.1 | 0.0069992 | 0.0075418
f5       | 0.6  | 0.2 | 0.071033  | 0.060404
f6       | 0.9  | 0.1 | 1         | 0.614
f6       | 0.9  | 0.2 | 9         | 5.75
f6       | 0.6  | 0.1 | 0         | 0.66
f6       | 0.6  | 0.2 | 8         | 7.914

For the loudness A and the pulse emission rate r: when A is 0.9 and r is 0.1, the optimization of function f1 is best. When A is 0.9 and r is 0.1, the optimization of function f2 is best. When A is 0.6 and r is 0.1, the optimization of function f3 is best. When A is 0.4 and r is 0.1, the optimization of function f4 is best. When A is 0.9 and r is 0.1, the optimization of function f5 is best. When A is 0.6 and r is 0.1, the optimization of function f6 is best. It can be seen from the convergence curves that the convergence precision decreases when r is increased alone and the convergence speed increases when A is decreased alone. The optimization effect of BBA differs from function to function, and the optimal combination of algorithm parameters is specific to each function.

5. Conclusions

Based on the basic principle of the binary bat algorithm (BBA), its optimization performance is verified by simulation experiments on six test functions. The pulse emission rate r is connected with the convergence precision, and the launch loudness A relates to the convergence rate, but different functions require different values of A. Therefore, for different functions, simulation experiments should be carried out in order to obtain appropriate parameter settings. The simulation results show that the convergence speed of the algorithm is relatively sensitive to the settings of the algorithm parameters.

Conflicts of Interest

The authors declare that there are no conflicts of interest.

Authors’ Contributions

Xiao-Xu Ma participated in the data collection, analysis, algorithm simulation, draft writing, and critical revision of this paper. Jie-Sheng Wang participated in the concept, design, and interpretation of the study and commented on the manuscript.

Acknowledgments

This work was supported by the Basic Scientific Research Project of Institution of Higher Learning of Liaoning Province (Grant no. 2017FWDF10), the Program for Research Special Foundation of University of Science and Technology of Liaoning (Grant no. 2015TD04), and the Opening Project of National Financial Security and System Equipment Engineering Research Center (Grant no. USTLKFGJ201502).

References

1. M. M. Noack and S. W. Funke, "Hybrid genetic deflated Newton method for global optimisation," Journal of Computational and Applied Mathematics, vol. 325, pp. 97–112, 2017.
2. N. Bell and B. J. Oommen, "A novel abstraction for swarm intelligence: particle field optimization," Autonomous Agents and Multi-Agent Systems, vol. 31, no. 2, pp. 362–385, 2017.
3. X.-S. Yang, S. Deb, S. Fong, X. He, and Y.-X. Zhao, "From swarm intelligence to metaheuristics: nature-inspired optimization algorithms," The Computer Journal, vol. 49, no. 9, Article ID 7562319, pp. 52–59, 2016.
4. L. Wang, H. Geng, P. Liu et al., "Particle swarm optimization based dictionary learning for remote sensing big data," Knowledge-Based Systems, vol. 79, pp. 43–50, 2015.
5. Y. Gheraibia, K. Djafri, and H. Krimou, "Ant colony algorithm for automotive safety integrity level allocation," Applied Intelligence, pp. 1–15, 2017.
6. B. R. Adarsh, T. Raghunathan, T. Jayabarathi, and X.-S. Yang, "Economic dispatch using chaotic bat algorithm," Energy, vol. 96, pp. 666–675, 2016.
7. Z.-Z. Liu, D.-H. Chu, C. Song, X. Xue, and B.-Y. Lu, "Social learning optimization (SLO) algorithm paradigm and its application in QoS-aware cloud service composition," Information Sciences, vol. 326, pp. 315–333, 2016.
8. D. Wu, S. Xu, and F. Kong, "Convergence analysis and improvement of the chicken swarm optimization algorithm," IEEE Access, vol. 4, pp. 9400–9412, 2016.
9. X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), vol. 284, pp. 65–74, Springer, 2010.
10. S. Mirjalili, S. M. Mirjalili, and X.-S. Yang, "Binary bat algorithm," Neural Computing and Applications, vol. 25, no. 3-4, pp. 663–681, 2013.
11. N. S. Jaddi, S. Abdullah, and A. Hamdan, "Multi-population cooperative bat algorithm-based optimization of artificial neural network model," Information Sciences, vol. 294, pp. 628–644, 2015.
12. Y. Duan and M. Zhao, "Bat algorithm on economic load dispatch in wind power system," Electrical Engineering, 2015.
13. D. Zhao and Y. He, "Chaotic binary bat algorithm for analog test point selection," Analog Integrated Circuits and Signal Processing, vol. 84, no. 2, article no. A006, pp. 201–214, 2015.
14. M. Kang, J. Kim, and J.-M. Kim, "Reliable fault diagnosis for incipient low-speed bearings using fault feature analysis based on a binary bat algorithm," Information Sciences, vol. 294, pp. 423–438, 2015.
15. H. Wang and X. Yan, "Optimizing the echo state network with a binary particle swarm optimization algorithm," Knowledge-Based Systems, vol. 86, pp. 182–193, 2015.

Copyright © 2018 Xiao-Xu Ma and Jie-Sheng Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.