International Journal of Mathematics and Mathematical Sciences

International Journal of Mathematics and Mathematical Sciences / 2020 / Article

Research Article | Open Access

Volume 2020 |Article ID 4795793 |

Huda I. Ahmed, Eman T. Hamed, Hamsa Th. Saeed Chilmeran, "A Modified Bat Algorithm with Conjugate Gradient Method for Global Optimization", International Journal of Mathematics and Mathematical Sciences, vol. 2020, Article ID 4795793, 14 pages, 2020.

A Modified Bat Algorithm with Conjugate Gradient Method for Global Optimization

Academic Editor: Birendra Nath Mandal
Received04 Mar 2020
Revised17 Apr 2020
Accepted18 May 2020
Published04 Jun 2020


Metaheuristic algorithms are used to solve many optimization problems. Firefly algorithm, particle swarm improvement, harmonic search, and bat algorithm are used as search algorithms to find the optimal solution to the problem field. In this paper, we have investigated and analyzed a new scaled conjugate gradient algorithm and its implementation, based on the exact Wolfe line search conditions and the restart Powell criterion. The new spectral conjugate gradient algorithm is a modification of the Birgin and Martínez method, a manner to overcome the lack of positive definiteness of the matrix defining the search direction. The preliminary computational results for a set of 30 unconstrained optimization test problems show that this new spectral conjugate gradient outperforms a standard conjugate gradient in this field and we have applied the newly proposed spectral conjugate gradient algorithm in bat algorithm to reach the lowest possible goal of bat algorithm. The newly proposed approach, namely, the directional bat algorithm (CG-BAT), has been then tested using several standard and nonstandard benchmarks from the CEC’2005 benchmark suite with five other algorithms and has been then tested using nonparametric statistical tests and the statistical test results show the superiority of the directional bat algorithm, and also we have adopted the performance profiles given by Dolan and More which show the superiority of the new algorithm (CG-BAT).

1. Introduction

In 2010, Yang proposed a new optimization algorithm, namely, bat algorithm (BA), based on swarm intelligence and the inspiration from observing the bats. Although the original BA presents superior results in the experiments than PSO, we notice that the performance and the accuracy of the original BA still have the capacity to present better. The algorithm exploits the so-called echolocation of bats.

Bats use sonar echoes to detect and avoid obstacles. It is generally known that sound pulses are transformed to frequency which reflects from obstacle. Bats can use time delay from emission to reflection and use it for navigation. They typically emit short loud, sound impulses. The pulse rate is usually defined as 10 to 20 times per second. After hitting and reflecting, the bats transform their own pulse into useful information to gauge how far away the prey is. The bats are using wavelengths that vary from the range 0.7 to 17 mm or inbound frequencies of 20–500 kHz. To implement the algorithm, the pulse frequency and rate have to be defined. The pulse rate can be simply determined in the range from 0 to 1, where 0 means that there is no emission and 1 means that bats are emitting maximum [1].

The bat-inspired algorithm is a recent swarm-based intelligent system which mimics the echolocation system of microbats. In the bat-inspired algorithm, the bats randomly fly around the best bat locations found during the search so as to improve their hunting of prey. In practice, one bat location from a set of best bat locations is selected. Thereafter, that best bat location is used by local search with a random walk strategy to inform other bats about the prey location. This selection mechanism can be improved using other natural selection mechanisms adopted from other advanced algorithms such as genetic algorithm. Therefore, six selection mechanisms are studied to choose the best bat location: global-best, tournament, proportional, linear rank, exponential rank, and random. Consequently, six versions of the bat-inspired algorithm are proposed and studied which are global-best bat-inspired algorithm (GBA), tournament bat-inspired algorithm (TBA), proportional bat-inspired algorithm (PBA), linear rank bat-inspired algorithm (LBA), exponential rank bat-inspired algorithm (EBA), and random bat-inspired algorithm (RBA). Using two sets of global optimization functions, the bat-inspired versions are evaluated and the sensitivity analyses of each version to its parameters studied [2]. A success of an algorithm always depends on well balanced of these components. The aim of this study is to improve the performance of the standard bat algorithm by increasing its exploration and exploitation abilities along the main line of the BA. In this paper, two improvement strategies are presented. The first improvement strategy is the development of a spectral conjugate gradient technique, which can be used to guide the research process, and the second improvement strategy is to improve the bat algorithm using the conjugate gradient method to arrive at the best solution for the current iteration, which can be used to enhance the ability to local search. The newly proposed optimally directional bat algorithm (CG-BAT) will be tested on several benchmark problems chosen from the well-known CEC’2005 benchmark set and compared with several other swarm and evolutionary algorithms. Therefore, this study is organized as follows. A new scalar in a spectral conjugate gradient is described in Section 2. A global convergence is described in Section 3, the standard bat algorithm is presented in Section 4. Then, the enhanced bat algorithm is presented in Section 5. Finally, the results of the numerical experiments are presented in Section 6, followed by the conclusions in Section 7. Too much exploration but too little exploitation may cause difficulties that algorithm converges towards optimal solutions. The conjugate gradient technique could be a helpful procedure to search out the minimum value of any nonlinear function to find optimal solutions:where is a real-valued function. The numerical formula is given bywhere is a step length to be computed by a line search procedure [3]. The search direction is outlined as follows:and is gradient and is a parameter of conjugacy condition. Some famed formulas of this parameter are outlined as follows:where , which are referred to as Hestenes–Stiefel (HS) [4], Polak–Ribière (PR) [5], Fletcher–Reeves (FR) [6], and Dai–Yuan (DY) [7] severally. Many authors have studied the convergence of the on top of formulas for years [811].

To prove the convergence analysis of the conjugate gradient technique, the following weak Wolfe conditions are used:

Used the strong Wolfe conditions consist of (5) and

The constants are within the period , and additional details are found in [3]. Well, the sufficient descent property is defined as follows:where denotes the Euclidean norm, provided that c is any positive constant [7].

2. A New Scalar in Spectral CG Method ()

Birgin and Martínez (SS) [12] instructed a spectral conjugate gradient technique outlined bywhere .

The parameter has the following form:where the spectral in [13] is determined by using the following equation:

In this section, we will derive a new spectral CG method as follows:

The matrix is asymmetrical and positive definite and the scalar is defined by Al-Bayati and Salah [14] as .

By equating (9) and (12), we get

Multiplying both sides of (13) by , we get

Since and , we get

Since ,

If we use exact line search, then the new scalar is equal to one.

The new direction is defined by the following equation:

Theorem 1. Let the line search in (2) satisfies the strong Wolfe condition, then the new search direction given by (17) is a sufficient descent direction.

Proof. Under some algebraic operations, the direction of (17) can be written as follows:Now, multiplying both sides of (18) by , then we getSince and , we get Since ,Let (where is the positive constant), then

3. Global Convergence

In this section, the subsequent assumption is usually used in proving the global conjugate gradient methods.

Assumption 1. (see [15]).(i)The level set is bounded, that is, there exists a constant z > 0, such as .(ii)In neighborhood N of S, the function f is continuously differentiable and its gradient is Lipschitz continuous, i.e., there exists a constant L > 0 such thatBelow the assumptions (i) and (ii) on f, we can deduce that there exists  > 0 such as the following equations:

Lemma 1. (see [16]). Assume that Assumption 1 holds and suppose that, for any conjugate gradient method, is a descent direction and the step size satisfies conditions (5) and (7) ifThen

Theorem 2. Suppose that Assumption 1 holds, and the direction defined by (17) is descent and is computed using (5) and (7), then

Proof. By using some algebraic operations of (17) and taking the absolute value, we getSince and , we getSince ,Since , we getthat is, , the proof is complete.

4. Standard Bat Algorithm

The bat algorithm proposed by Yang [17] is an intelligent optimization algorithm inspired by the echolocation behavior of bats. When flying and hunting, bats emit some short, ultrasonic pulses to the environment and list to their echoes. Studies show that the information from the echoes will enable bats to build a precise image of their surroundings and determine precisely the distance, shapes, and prey’s location. The capability of such echolocation of microbats is fascinating, as these bats can find their prey and discriminate against different types of insects even in complete darkness [17]. The earlier studies showed that BA can solve unconstrained optimization problems with much more efficiency and robustness compared to GA and PSO [18, 19].

The used idealized rules in bat algorithm are as follows:(a)All bats use echolocation to sense distance and the location of a bat xi is encoded as a solution to an optimization problem under consideration.(b)Bats fly randomly with velocity at position xi with a varying frequency (from a minimum fmin to a maximum frequency fmax) or a varying wavelength λ and loudness A to search for prey. They can automatically adjust the wavelengths (or frequencies) of their emitted pulses and the rate of pulse emission r depending on the proximity of the target.(c)Loudness varies from a large positive value A0 to a minimum constant value Amin [17].

For each bat (i), its position (xi) and velocity () in a d-dimensional search space should be defined. xi and should be subsequently updated during the iterations. The rules for updating the position and velocities of a virtual bat (i) are given as in [17]:where rand ∈ [0, 1] is a random vector drawn from a uniform distribution. Here, is the current global best location (solution) which is located after comparing all solutions among all the n bats. A new solution for each bat is generated locally using random walk given bywhere ε ∈ [−1, 1] is a random number, while is the average loudness of all the bats at this time step.

The loudness and the rate of pulses emission are updated as the iterations proceed. The loudness decreases and the pulse rate increases as the bat gets closer to its prey. The equation for updating the loudness and the pulse rate is given bywhere and  > 0 are constants. As t ⟶ ∞, we have  ⟶ 0 and .

The initial loudness A0 can typically be A0 ∈ [1, 2], while the initial emission rate r0 ∈ [0, 1].

The basic steps of the standard bat algorithm are summarized in the pseudocode as shown in Algorithm 1.

(1)define objective function
(2)initialize the bat population and for k = 1, ⋯, n
(3)Define pulse frequency at .
(4)Initialize pulse rates and the loudness .
(5)While (t ≤ tmax).
(6)Adjust frequency equation (33)
(7)Update velocities equation (34)
(8)Update locations/solutions equation (35)
(9)if (rand > )
(10)Select a solution among the best solutions
(11)Generate a local solution around the selected best solution equation (36)
(12)end if
(13)Generate a new solution by flying randomly
(14)if (rand <  & F() < F())
(15)Accept the new solutions
(16)Increase equation (37)
(17)Reduce equation (38)
(18)end if
(19)Rank the bats and find the current best
(20)end while
(21)Output results for post-processing

5. Enhanced Bat Algorithm

This paper attempts to improve the bat algorithm from a different perspective from the previous improvements by hybridizing the bat method using optimization methods, by using the optimal size of the cubic step and the optimal search direction for the synchronous gradient feature of the optimal search direction for echo detection. First, local movements can be improved by controlling the optimum step sizes, while the second bat movement should be directed by other bats and the best local moves toward optimal movement. More specifically, two different adjustments will be made to improve the efficiency of the bat algorithm.

5.1. The First Modification (Optimal Step Size)

The first modification concerns local search mechanisms: in standard bats, they are allowed to move from their current locations to new random locations using local random walk. In modified bats, they are allowed to switch from their current locations to new locations optimally using local optimal walking, as we adjust this step to the optimal size using one of the optimization methods called optimal step size () when the step length is calculated by performing a line search [1].

5.2. The Second Modification Using New Spectral Conjugate Gradient Method

A bat emits two pulses in two different directions, one to the direction of the bat with the best position (the best solution is steepest descent) and the other to the direction of the new conjugate gradient bat. From the echoes, the bat can know if the food exists around these two bats or not. The best position is determined by the objective fitness, while, around the optimally selected bat, it depends on its fitness value. If it has a better fitness value as the actual bat, then the food is considered to exist; otherwise, there is not a food source in the neighborhood. If the food is confirmed to exist around the two bats (Choice 1), the current bat moves to a direction at the surrounding neighborhood of the two bats where the food is supposed to be plenty. If not (Choice 2), it moves toward the best bat.

The mathematical formulas of the bats’ movements are thus given by

The directions of the movement generated by equation (17) are directed towards the bat with the best position. This mechanism allows the BA to exploit more around the best position; however, if the best bat is not near the global optimality, there is a risk that the solutions generated by such moves could be trapped in local optima. The new proposed movement in equation (17) has the ability to diversify the movement directions which can enhance the exploration capability, especially at the different repetition stages especially at the initial stages of iterations, and can thus avoid premature convergence. Furthermore, when it approaches the end of the iteration process, the bats tend to get around the best bats with stronger exploitability which in turn can reduce the distances between them and thus enhance the speed of convergence which gives stability to the algorithm. The new algorithm CG-BAT is illustrated by presenting the algorithm and flowchart as follows.

5.2.1. CG-BAT Algorithm

In this section, we develop the movement of bat algorithm to reach the goal by using the new direction which is defined in equation (17).(1)objective function (2)initialize the bat population and for k = 1, …, n(3)Define pulse frequency at .(4)Initialize pulse rates and the loudness .(5)While (t ≤ tmax).(6)Adjust frequency equation (1)(7)Update locations/solutions equation (2)(8)Update velocities equation (3)(9)if (10)Generate new search movement using equation (17)(11)end if.(12)Generate new solution by a flying optimally step length using equation (39).(13)if .(14)Accept the new solutions(15)Increase equation (37)(16)Reduce equation (38)(17)end if(18)Rank the bats and find the current best (19)end while(20)Output results for postprocessing

5.2.2. CG-BAT Flowchart

In this section, we describe the movement of CG-BAT algorithm to reach the goal by using the flowchart of new search direction which is defined in equation (17) (Figure 1).

6. Experimental Results and Comparisons

To prove the efficiency performance of all newly proposed algorithms, two comparison experiments have been conducted. The first is a comparison between the new spectral conjugate gradient and the standard algorithms in this field, and the second experiment is a comparison between the new bat algorithm (CG-BAT) and cuckoo search, firefly search, and practical swarm.

6.1. Experimental Results and Comparisons in New CG

In this section, we have reported some numerical experiments that are performed on a set of 30 unconstrained optimization test problems to analyze the efficiency of . Detail of these test problems, with their given initial points. The termination criterion used in our experiments is . In our comparisons below, we employ the following algorithms:(i)SS: scalar in spectral in Birgin and Martínez algorithm with the Wolfe line search(ii)HS: Hestenes–Stiefel algorithm with the Wolfe line search(iii)New: new algorithm using equation (17) with the Wolfe line search

Table 1 shows the numerical computations of these newly proposed CG algorithms against other well-known CG-algorithms to check their performance and we have used the following well-known measures or tools used normally for this type of comparison of CG algorithms:NOI = the total number of iterationsNOF = the total number of function evaluationTIME = the total CPU time required for the processor to execute the CG algorithm and reach the minimum value of the required function minimization

Prob.New algorithm


To evaluate the modified conjugate gradient technique, this technique is analyzed and tested in some numerical tests (see [20]) and to demonstrate the performance of those methods, we applied Dolan and Moré [21], a new tool to analyze the efficiency of algorithms.

They introduced the notion of a performance profile as means to evaluate and compare the performance of the set of solvers S on a test set P. Assuming that there exist ns solvers and np problems, for each problem p and solvers, they defined computing time (the number of function evaluations or others) required to solve problem p by solver s

Requiring a baseline for comparisons, they compared the performance on problem p by solver s with the best performance by any solver on this problem, based on the performance ratio:

Suppose that a parameter for all is chosen, and if and only if solver s does not solve problem p (Figure 2).

Figure 3 shows the Dolan–More performance profile for these methods, which are subject to the frequency of a suitable performance compared to the basic methods. Figure 4 shows us through the Dolan–More performance profile for these methods, which are measured by the CPU time, which makes us deduce from the three forms presented. The new method is very suitable for solving issues of many dimensions.

6.2. Experimental Results and Comparisons in CG-BAT

To validate the performance of the proposed optimally directional bat algorithm, we have carried out various numerical experiments that have been then tested using several standard and nonstandard benchmarks from the CEC’2005 benchmark suite, which can be summarized as two comparison experiments. The first one is a comparison between the new directional bat algorithm and the standard algorithms including the bat algorithm on the classical benchmark functions, cuckoo search, firefly search, and practical swarm and the second one is a comparison has been performed against some advanced optimization algorithms such as Dolan and Moré [19], a new tool to analyze the efficiency of algorithms.

6.2.1. Benchmarking and Parameter Settings

Thirty popular benchmark functions are shown in Tables 24. We have been used to verify the performance of the new bat algorithm (CG-BAT), compared with that of standard BA, FA, CS, and PSO. The description and the setting parameters of these algorithms are as follows:(1)CG-BAT: an extensive analysis was performed to carry out parameter settings of BA; for best practice, we recommend the following settings: r0 = 0.1, r = 0.7, A0 = 0.9, A = 0.6, fmin = 0 and fmax = 2, , and .(2)BA: the standard bat algorithm was implemented as it is described in [17] with r0 = 0.1, A0 = 0.9, α = γ = 0.9, fmin = 0, and fmax = 2.(3)FA: the firefly algorithm where A0 = 0.9 is the intensity at the source point described in [15].(4)CS: the cuckoo search via Lèvy flights described in [22] is considered with the probability of the discovery of alien egg spa = 0.25.(5)PSO: a classical particle swarm optimization [23, 24] model has been considered. The parameter settings are c1 = 1.5 and c2 = 1.2 and the inertia coefficient is a monotonically decreasing function from 0.9 to 0.4.


F1Best1.927E − 033.052E − 012.481E + 012.340E + 022.340E + 02
Median1.408E − 025.480E + 044.120E + 014.357E + 023.330E + 02
Worst2.233E + 006.569E + 048.028E + 016.119E + 024.140E + 02
Mean2.256E − 014.920E + 044.411E + 014.153E + 021.340E + 02
SD4.869E − 011.859E + 041.259E + 019.518E + 012.000E + 02

F2Best2.011E + 063.313E + 099.080E + 083.229E + 175.140E + 12
Median1.171E + 091.294E + 151.177E + 117.654E + 196.3157E + 12
Worst5.713E + 105.893E + 501.553E + 122.433E + 226.9119E + 12
Mean2.363E + 124.310E + 493.051E + 112.263E + 214.153E + 12
SD1.261E + 121.461E + 504.102E + 115.976E + 219.518E + 11

F3Best1.634E + 008.563E + 009.877E + 011.062E + 033.229E + 07
Median3.115E + 012.996E + 051.618E + 021.996E + 037.654E + 09
Worst1.256E + 024.370E + 053.850E + 023.409E + 039.433E + 12
Mean1.461E + 012.612E + 051.742E + 022.138E + 032.263E + 01
SD7.456E + 001.348E + 056.173E + 015.493E + 025.976E + 01

F4Best5.049E − 033.210E + 029.989E − 033.026E + 101.082E + 03
Median8.544E − 025.949E + 028.997E − 024.448E + 101.886E + 03
Worst5.630E − 016.848E + 022.136E + 006.797E + 003.809E + 03
Mean1.405E − 015.816E + 022.303E − 014.567E + 002.138E + 03
SD1.481E − 017.884E + 014.210E − 019.934E − 019.493E + 02

F5Best2.685E + 032.967E + 06−3.276E − 032.831E + 073.026E + 10
Median2.553E + 042.529E + 063.007E − 034.084E + 074.248E + 10
Worst3.907E + 043.495E + 062.215E − 048.620E + 076.497E + 10
Mean2.423E + 045.436E + 064.901E − 034.242E + 044.567E + 10
SD6.590E + 046.360E + 056.627E + 031.118E + 049.934E + 10

F6Best1.812E + 002.420E + 022.998E + 011.129E + 022.931E + 07
Median3.8097E + 003.074E + 021.575E + 021.378E + 024.984E + 07
Worst8.1481E + 003.670E + 022.047E + 021.644E + 028.920E + 07
Mean1.6193E + 003.086E + 021.551E + 021.366E + 024.242E + 04
SD1.0623E + 013.603E + 013.368E + 011.349E + 011.228E + 04

F7Best1.5168E + 003.024E − 011.053E + 002.414E + 001.339E + 02
Median4.2901E + 006.876E − 011.928E + 004.5175E + 001.378E + 02
Worst9.2997E + 001.135E − 023.388E + 008.1813E + 001.554E + 02
Mean4.2716E + 007.176E − 012.017E + 005.153E + 001.366E + 02
SD1.5826E + 002.927E − 015.223E − 011.865E + 001.349E + 01

F8Best0.214E + 002.096E + 012.302E + 008.391E + 002.414E + 00
Median1.681E + 002.196E + 013.191E + 001.400E + 012.988E + 01
Worst2.881E + 005.996E + 013.648E + 001.750E + 019.575E + 02
Mean5.899E + 006.996E + 013.191E + 001.209E + 012.047E + 02
SD1.730E + 007.062E − 042.904E − 011.753E + 001.451E + 02

F9Best2.895E + 037.685E + 034.745E + 034.522E − 033.368E + 01
Median4.492E + 038.365E + 035.370E + 035.045E − 031.753E + 02
Worst5.646E + 031.017E + 046.006E + 035.426E − 031.928E + 03
Mean4.357E + 038.940E + 035.407E + 035.056E − 033.398E + 00
SD6.414E + 021.242E + 033.363E + 021.747E − 022.667E + 00

F10Best2.911E − 033.336E + 014.637E + 026.691E + 025.223E + 01
Median1.038E − 022.473E + 026.892E + 029.105E + 022.311E + 02
Worst2.011E − 012.944E + 031.304E + 032.290E + 033.188E + 03
Mean2.645E − 024.916E + 027.193E + 021.073E + 033.648E + 00
SD2.926E − 026.275E + 022.121E + 023.967E + 023.191E + 00


F11Best7.536E + 012.799E + 001.414E + 021.337E + 022.904E − 05
Median1.561E + 022.103E + 011.879E + 022.190E + 029.745E − 03
Worst2.506E + 020.334E + 022.352E + 023.009E + 024.370E − 03
Mean1.515E + 023.629E + 011.937E + 022.214E + 026.006E − 03
SD4.105E + 011.971E + 012.433E + 014.094E + 015.407E − 03

F12Best9.448E − 031.323E + 002.650E + 011.059E + 023.343E + 02
Median1.528E − 022.187E + 016.164E + 012.200E + 024.437E + 02
Worst3.044E − 019.385E + 021.438E + 026.159E + 026.492E + 02
Mean1.911E − 011.181E + 026.790E + 012.611E + 021.304E + 03
SD4.917E − 012.293E + 022.559E + 011.384E + 027.193E + 02

F13Best2.094E − 009.637E + 011.259E + 011.673E + 022.121E + 02
Median1.470E − 038.037E + 021.123E + 021.444E + 011.484E + 02
Worst9.5017E − 023.847E + 021.011E + 021.333E + 021.889E + 02
Mean1.495E − 018.475E + 001.121E + 011.455E + 012.452E + 02
SD3.135E + 006.847E − 017.085E − 017.923E − 010.937E + 02

F14Best1.344E + 008.475E + 002.073E + 031.778E + 021.433E + 01
Median2.815E + 014.501E + 014.076E + 033.536E + 022.950E + 01
Worst1.918E + 027.523E + 029.651E + 037.081E + 026.164E + 01
Mean4.898E + 011.695E + 024.532E + 033.554E + 021.438E + 02
SD5.028E + 012.220E + 021.861E + 031.311E + 026.790E + 01

F15Best4.499E + 011.382E + 041.283E + 051.542E + 080.559E + 08
Median3.283E + 024.133E + 052.388E + 053.709E + 080.259E + 09
Worst2.518E + 031.247E + 073.528E + 056.179E + 081.123E + 10
Mean4.926E + 021.929E + 062.392E + 053.760E + 081.011E + 02
SD5.304E + 023.115E + 066.522E + 041.192E + 081.121E + 01

F16Best3.462E − 026.731E + 001.271E + 011.112E + 013.085E + 01
Median0.239E + 011.505E + 011.522E + 011.404E + 018.073E + 03
Worst2.046E + 012.832E + 021.912E + 021.834E + 029.076E + 03
Mean3.716E + 001.647E + 011.528E + 011.452E + 019.651E + 03
SD4.409E + 005.854E + 001.682E + 001.746E + 004.532E + 03

F17Best2.719E + 023.067E + 011.483E + 012.157E + 011.551E + 03
Median3.085E + 023.181E + 012.232E + 012.788E + 011.743E + 05
Worst3.320E + 023.270E + 012.805E + 012.988E + 012.388E + 06
Mean3.053E + 013.178E + 012.276E + 012.718E + 013.528E + 05
SD1.668E + 004.720E − 013.374E + 002.463E + 002.392E + 05

F18Best1.131E + 021.637E + 022.948E + 022.564E + 026.522E + 04
Median1.979E + 022.550E + 023.687E + 023.163E + 021.271E + 11
Worst2.686E + 024.179E + 024.154E + 023.596E + 021.522E + 11
Mean1.959E + 022.651E + 023.627E + 023.168E + 021.912E + 02
SD3.767E + 016.899E + 013.236E + 012.533E + 011.528E + 01
F19Best5.054E − 015.082E − 012.426E + 011.337E + 021.682E + 00
Median3.328E − 005.697E + 014.450E + 012.190E + 061.993E + 01
Worst3.357E − 007.542E + 018.646E + 013.377E + 032.762E + 01
Mean2.417E − 005.172E + 014.591E + 015.082E + 062.805E + 01
SD4.826E − 011.981E + 031.265E + 018.308E + 047.542E + 01

F20Best2.861E − 005.453E + 005.729E + 005.193E + 015.1842E + 01
Median4.319E − 005.964E + 006.177E + 001.241E + 027.981E + 03
Worst5.766E − 006.693E + 006.674E + 002.769E + 039.453E + 00
Mean1.262E − 006.019E + 006.187E + 003.347E + 005.554E + 00
SD7.905E − 013.268E − 012.475E − 013.761E + 006.693E + 00


F21Best1.118E + 035.517E + 005.919E + 035.919E − 036.019E + 00
Median2.554E + 036.560E + 029.621E + 039.621E − 033.268E − 01
Worst5.626E + 037.964E + 031.568E + 041.568E − 025.517E + 00
Mean2.852E + 031.678E + 039.618E + 019.618E − 036.560E + 02
SD1.105E + 032.032E + 032.226E + 072.226E − 037.964E + 03

F22Best1.009E + 011.488E + 022.573E + 052.573E + 031.678E + 03
Median1.585E + 011.245E + 037.580E + 067.580E + 032.032E + 03
Worst1.824E + 012.330E + 048.664E + 078.664E + 041.488E + 02
Mean1.046E + 011.049E + 023.533E + 043.533E + 041.245E + 03
SD3.837E + 054.971E + 047.697E + 041.697E + 052.330E + 04

F23Best5.828E + 038.280E + 014.124E − 044.124E + 041.049E + 02
Median2.383E + 045.373E + 035.220E − 045.220E + 044.971E + 04
Worst5.416E + 043.294E + 047.472E − 047.472E + 048.280E + 01
Mean2.562E + 048.130E + 035.336E − 045.336E + 045.373E + 03
SD7.676E + 038.472E + 038.132E − 038.132E + 033.294E + 04

F24Best0.041E − 031.080E − 014.375E + 014.375E + 018.130E + 03
Median1.258E − 021.507E + 018.306E + 018.306E + 018.472E + 03
Worst2.684E − 025.574 E + 011.201E + 021.201E + 021.080E − 01
Mean8.481E − 011.900E + 018.040E + 018.040E + 011.507E + 01
SD1.717E − 011.828E + 011.588E + 011.588E + 015.574E + 01

F25Best7.078E − 070.326E + 035.169E + 045.169E + 031.900E + 01
Median8.827E − 062.920E + 038.395E + 058.395E + 051.828E + 01
Worst2.223E − 057.001E + 051.329E + 061.329E + 060.326E + 03
Mean6.204E − 053.194E + 058.815E + 058.815E + 052.920E + 03
SD2.312E − 051.916E + 051.932E + 051.932E + 057.001E + 05

F26Best1.707E + 022.994E + 011.330E + 021.330E + 023.194E + 15
Median2.517E + 025.895E + 011.625E + 021.625E + 021.916E + 15
Worst3.456E + 029.913E + 011.845E + 021.845E + 022.994E + 61
Mean2.599E + 025.746E + 011.580E + 021.580E + 025.885E + 01
SD3.756E + 011.825E + 011.558E + 011.558E + 019.913E + 01

F27Best2.126E + 011.093E + 001.366E + 011.366E + 015.746E + 04
Median3.604E + 014.073E + 002.384E + 012.384E + 011.825E + 05
Worst8.057E + 019.562E + 003.540E + 013.540E + 011.993E + 06
Mean3.979E + 015.675E + 002.417E + 012.417E + 014.373E + 00
SD1.681E + 013.920E + 005.004E + 005.004E + 009.772E + 00

F28Best1.252E − 032.595E + 001.338E + 011.338E + 018.335E + 05
Median1.462E − 025.744E + 001.559E + 011.559E + 011.329E + 06
Worst1.737E − 011.145E + 011.640E + 011.640E + 018.995E + 09
Mean1.474E + 015.920E + 001.540E + 011.540E + 011.932E + 05
SD1.235E + 002.453E + 007.839E − 017.839E − 011.220E + 02

F29Best7.293E + 032.736E + 042.281E + 047.281E + 031.895E + 09
Median8.803E + 034.228E + 043.698E + 048.698E + 031.888E + 09
Worst9.480E + 035.993E + 044.624E + 049.624E + 032.580E + 09
Mean3.722E + 034.208E + 038.712E + 038.712E + 031.998E + 01
SD5.060E + 027.320E + 025.463E + 025.463E + 021.336E + 01

F30Best8.437E + 041.048E + 028.566E + 031.566E + 011.384E + 02
Median1.588E + 052.756E + 035.394E + 042.394E + 011.540E + 03
Worst2.346E + 054.793E + 042.811E + 053.811E + 012.417E + 01
Mean1.597E + 055.961E + 038.159E + 048.159E + 019.090E + 03
SD4.048E + 049.588E + 036.481E + 046.481E + 044.237E + 01

For a fair comparison, the common parameters are considered the same. The population size was set to N = 50, and the number of function evaluations is the same as 15000, without counting the initial evaluations, though all algorithms were initialized randomly in the similar manner. Therefore, we set tmax = 500 except for CS. Due to the fact that the CS algorithm uses a number of 2N function evaluations at each iteration, we adjust tmax for this case to 250. The dimensionality of all benchmark functions is D = 30.

6.2.2. The First Experiment

For meaningful statistical analysis, each algorithm was run 51 times using a different initial population at each turn. The global minimum obtained after each trial was recorded for further statistical analysis. Subsequently, the mean value of the global minimum, the standard deviation (SD), the best solution, the median, and the worst solution values have been computed and are presented in Tables 24. From the results presented in Tables 24, the new directional bat algorithm achieved better results for 20 functions (F1, F2, F3, F4, F8, F10, F12, F13, F14, F15, F16, F18, F19, F20, F22, F23, F24, F25, F28, and F29), while the BA obtained better results for 3 functions (F7, F26, and F27). The FA has better scores for 2 functions (F5 and F17). CS obtained best results for F9, F21, and F30 and PSO for F11. We can show that the following monotonically decreasing function is more suitable and gives stability to the algorithm.

6.2.3. The Second Experiment (Nonparametric Statistical Tests)

In this section, to evaluate CG-BAT performance, nonparametric statistical tests were carried out. We performed Friedman's test and pairwise comparisons. Table 5 shows the descriptive statistics for the five algorithms, which gives the number of values studied, the mean and the standard deviation, and the highest value of the values for each method and the lowest value for it. Note that CG-BAT has the least arithmetic mean of 72929.28687 and the standard deviation of 372879.0388, which is lower than the rest of the algorithms, with the lowest and greatest value for the method.

AlgorithmNMeanStd. deviationMinimumMaximum

CS291.11379E + 165.99796E + 160.00452000003.23000E + 17
PSO291.14457E + 122.55784E + 1222.600000009.43000E + 12

Table 6 presents the Friedman rank test. For this test, an algorithm is considered better if it has a low rank. From the results, CG-BAT has the lowest rank for the two tests, which means that it is the best performing algorithm from the comparison. In addition, the last two rows present the test statistic and value. The statistic is distributed according to the chi-square distribution with 4 degrees of freedom. The lower value of the different tests suggests the existence of significant differences among the considered algorithms at α = 0.01 level of significance.



To highlight the differences between CG-BAT with each of the other algorithms, Table 7 presents the pairwise comparison results using the Friedman test. The control method is CG-BAT. The analysis of the Friedman rank test shows significant differences between CG-BAT and four algorithms (BA, FA, CS, and PSO) according to the values of chi-square statistic which all are less than α = 0.05 level of significance, and since CG-BAT algorithm for all pairwise comparisons has a minimum rank (1.28 with BA, 1.24 with FA, 1.24 with CS, and 1.10 with PSO), the results reveal that CG-BAT algorithm is significantly superior to BA, FA, CS, and PSO algorithms.

ProcedureAlgorithmRankChi-square statistic value


6.2.4. The Third Experiment (Convergence Curve Analysis)

The convergence curve is an important indicator for the performance of the algorithm, through which we can see the convergence speed and the ability of the new algorithm optimum. In order to evaluate the modified CG-BAT, this technique is analyzed and tested in some numerical tests, and to illustrate the performance of these methods, we applied Dolan and Moré [21] to analyze the efficiency of the algorithm. Performance profiles based on mean performance, standard deviation (SD), and best solution are shown in Figures 57.

7. Conclusions

In this study, we have submitted new spectral CG methods. A crucial property of proposed CG methods is that it secures sufficient descent directions. Under mild conditions, we have demonstrated that the new algorithms are globally convergent for each uniformly convex and general functions using the strong Wolfe line search conditions. The preliminary numerical results show that new algorithms perform very well and also an improved version of the standard bat algorithm, called the new directional bat algorithm (CG-BAT), has been proposed and presented. Two modifications have been embedded to the BA to increase its exploitation and exploration capabilities and consequently have significantly enhanced the BA performance. Three sets of experiments have been carried out to prove the superiority of the proposed CG-BA). The performance is compared by using thirty test functions, under seven optimization algorithms (SS, HS, New, CG-BAT, BA, CS, FA, and PSO). The comparison results show that the enhanced algorithms (New and CG-BAT) are better than the original algorithms and have relatively stable performance in both the optimization ability and the convergence speed.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.


The research was supported by College of Computer Sciences and Mathematics, University of Mosul, Republic of Iraq, under Project no. 4795793.


  1. I. Fister Jr., D. Fister, and X.-S. Yang, “A hybrid bat algorithm,” Elektrote-Hniski Vestnik, vol. 80, no. 1‐2, pp. 1–7, 2013. View at: Google Scholar
  2. M. A. Al-Betar, M. A. Awadallah, H. Faris, X.-S. Yang, A. Tajudin Khader, and O. A. Alomari, “Bat-inspired algorithms with natural selection mechanisms for global optimization,” Neurocomputing, vol. 273, pp. 448–465, 2018. View at: Publisher Site | Google Scholar
  3. P. Wolfe, “Convergence conditions for ascent methods. II: some corrections,” SIAM Review, vol. 13, no. 2, pp. 185–188, 1971. View at: Publisher Site | Google Scholar
  4. M. R. Hestense and E. X. Stiefel, “Methods of conjugate gradients for solving linear system,” Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952. View at: Publisher Site | Google Scholar
  5. E. Polak and G. Ribiere, “Note sur la convergence de méthodes de directions conjuguées,” ESAIM, Mathematical Modeling and Numerical Analysis, vol. 3, no. 16, pp. 35–43, 1969. View at: Publisher Site | Google Scholar
  6. R. Fletcher and C. M. Revees, “Function minimization by conjugate gradients,” The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964. View at: Publisher Site | Google Scholar
  7. Y. H. Dai and Y. Yuan, “A nonlinear conjugate gradient method with a strong global convergence property,” SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 1999. View at: Publisher Site | Google Scholar
  8. L. Guanghui, H. Jiye, H. Jiye, and Y. Hongxia, “Global convergence of the Fletcher-Reeves algorithm with inexact linesearch,” Applied Mathematics—A Journal of Chinese Universities, vol. 10, no. 1, pp. 75–82, 1995. View at: Publisher Site | Google Scholar
  9. H. Liu, “A new conjugate gradient method for unconstrained optimization,” Far East Journal of Mathematical Sciences (FJMS), vol. 40, pp. 145–152, 2010. View at: Google Scholar
  10. S. Nocedal and S. J. Wright, Numerical Optimization, Springer, Berlin, Germany, 1999.
  11. L. Zhang, W. Zhou, and D. Li, “Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search,” Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006. View at: Publisher Site | Google Scholar
  12. E. G. Birgin and J. M. Martínez, “A spectral conjugate gradient method for unconstrained optimization,” Applied Mathematics and Optimization, vol. 43, no. 2, pp. 117–128, 2001. View at: Publisher Site | Google Scholar
  13. J. Barzilai and J. M. Borwein, “Two-point step size gradient methods,” IMA Journal of Numerical Analysis, vol. 8, no. 1, pp. 141–148, 1988. View at: Publisher Site | Google Scholar
  14. A. Y. Al-Bayati and M. Salah, New Variable Metric Method for Unconstrained Non- Linear Optimization, Academic Press, London, UK, 1994.
  15. H. Yabe and M. Takano, “Global convergence properties of nonlinear conjugate gradient methods with modified secant condition,” Computational Optimization and Applications, vol. 28, no. 2, pp. 203–225, 2004. View at: Publisher Site | Google Scholar
  16. E. T. Hamed, H. I. Ahmed, and A. Y. Al-Bayati, “A new hybrid algorithm for convex nonlinear unconstrained optimization,” Journal of Applied Mathematics, vol. 2019, Article ID 8728196, 6 pages, 2019. View at: Publisher Site | Google Scholar
  17. X.-S. Yang, “A new met heuristic bat-inspired algorithm,” in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), J. González, D. Pelta, C. Cruz et al., Eds., vol. 284, pp. 65–74, Springer, Berlin-Heidelberg, Germany, 2010. View at: Google Scholar
  18. A. H. Gandomi, X.-S. Yang, A. H. Alavi, and S. Talatahari, “Bat algorithm for constrained optimization tasks,” Neural Computing and Applications, vol. 22, no. 6, pp. 1239–1255, 2013. View at: Publisher Site | Google Scholar
  19. X. S. Yang and A. Hossein Gandomi, “Bat algorithm: a novel approach for global engineering optimization,” Engineering Computations, vol. 29, no. 5, pp. 464–483, 2012. View at: Publisher Site | Google Scholar
  20. N. Andrei, “An unconstrained optimization test functions collection,” Advanced Modeling and Optimization, vol. 10, pp. 147–161, 2008. View at: Google Scholar
  21. E. D. Dolan and J. J. Moré, “Benchmarking optimization software with performance profiles,” Mathematical Programming, vol. 91, no. 2, pp. 201–213, 2002. View at: Publisher Site | Google Scholar
  22. X.-S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Salamanca, Spain, October 2009. View at: Publisher Site | Google Scholar
  23. R. C. Eberhart and J. Kennedy, “A new optimizer using particle swarm theory,” in Proceedings of the Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya. Japan, October 1995. View at: Publisher Site | Google Scholar
  24. R. C. Eberhart and S. Yuhui, “Particle swarm optimization: developments, applications and resources,” in Proceedings of the 2001 Congress on Evolutionary Computation Seoul, Seoul, Korea, May 2001. View at: Publisher Site | Google Scholar

Copyright © 2020 Huda I. Ahmed et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

More related articles

 PDF Download Citation Citation
 Download other formatsMore
 Order printed copiesOrder

Related articles

We are committed to sharing findings related to COVID-19 as quickly as possible. We will be providing unlimited waivers of publication charges for accepted research articles as well as case reports and case series related to COVID-19. Review articles are excluded from this waiver policy. Sign up here as a reviewer to help fast-track new submissions.