Computational Intelligence and Neuroscience
Volume 2019, Article ID 8718571, 25 pages
Review Article

A Systematic and Meta-Analysis Survey of Whale Optimization Algorithm

1Technical College of Informatics, Sulaimani Polytechnic University, Sulaimani, KRG, Iraq
2Applied Computer, College of Medicals and Applied Sciences, Charmo University, Sulaimani, Chamchamal, KRG, Iraq
3Network Department, College of Computer Science and Information Technology, Kirkuk University, Kirkuk, KRG, Iraq
4Computer Science and Engineering, University of Kurdistan Hewler (UKH), Erbil, KRG, Iraq

Correspondence should be addressed to Hardi M. Mohammed; hardi.mohammed@charmouniversity.org

Received 23 January 2019; Revised 13 March 2019; Accepted 18 March 2019; Published 28 April 2019

Academic Editor: Maciej Lawrynczuk

Copyright © 2019 Hardi M. Mohammed et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


The whale optimization algorithm (WOA) is a nature-inspired metaheuristic optimization algorithm proposed by Mirjalili and Lewis in 2016. The algorithm has shown its ability to solve many problems. Comprehensive surveys have been conducted on other nature-inspired algorithms, such as ABC and PSO, but no survey has yet been conducted on WOA. Therefore, in this paper, a systematic and meta-analysis survey of WOA is conducted to help researchers apply it in different areas or hybridize it with other common algorithms. WOA is presented in depth in terms of its algorithmic background, characteristics, limitations, modifications, hybridizations, and applications. Next, WOA performance on different problems is presented. Then, the statistical results of WOA modifications and hybridizations are established and compared with the most common optimization algorithms and with WOA itself. The survey's results indicate that WOA performs better than other common algorithms in terms of convergence speed and balancing between exploration and exploitation. WOA modifications and hybridizations also perform well compared to the original WOA. In addition, our investigation paves the way for a new technique that hybridizes the WOA and BAT algorithms: the BAT algorithm is used for the exploration phase, whereas the WOA algorithm is used for the exploitation phase. Finally, the statistical results obtained from WOA-BAT are very competitive and better than WOA on 16 benchmark functions. WOA-BAT also outperforms WOA on 13 functions from CEC2005 and 7 functions from CEC2019.

1. Introduction

Recently, optimization has become one of the most interesting issues in different aspects of life, such as engineering design, browsing the Internet, and business management. Reducing time, achieving high quality, and earning financial profit can be challenging for most real-world applications. Therefore, most optimization methods try to find an ideal way to deal with limited resources under various restrictions. Many effective search algorithms, which use mathematical formulae and computational simulations, have been implemented to solve optimization problems. Metaheuristic algorithms try to balance randomization and local search, so most of these algorithms are used for global optimization [1, 2].

Metaheuristic algorithms have two basic elements, exploitation and exploration. In exploration, different solutions are generated to explore the search space and find the global optimum, whereas in exploitation, a local search is performed by exploiting information about the best solutions found so far. This combination, together with selecting the best solutions, helps guarantee that solutions reach optimality; exploration also bypasses the local optima problem through randomization and raises the diversity of the solutions [1, 3].

Swarm-based nature-inspired metaheuristic algorithms solve optimization problems by imitating the biological behavior of certain animals. Mirjalili and Lewis proposed the whale optimization algorithm to simulate the hunting behavior of humpback whales, using two main attacking mechanisms: first, chasing the prey with a random or the best search agent, and second, simulating the bubble net hunting strategy. Humpback whales like to hunt groups of small fish close to the surface. They swim around the target inside and along a thin circle, following a winding path and creating distinctive bubbles along a circle or '9'-shaped path. This remarkable hunting behavior is called the bubble net feeding method; it has been observed that foraging is done by creating unique bubbles along a circle or '9'-shaped path, as shown in Figure 1 [5].

Figure 1: Spiral shape bubble net [4].

The aim of this research consists of several aspects. First, it highlights all studies conducted on WOA in which metaheuristic hybridization models have been used to combine WOA with other techniques to enhance the performance of the resulting algorithm. Second, this work focuses on all modification methods that have been applied to WOA to improve its ability to search for the best solution. Third, we have collected most of the research works on the various applications of WOA. Finally, a new hybridization of the WOA and BAT algorithms is presented. The proposed algorithm is used to overcome the problem of stagnating in local optima and to increase the speed of convergence to the best solution. Consequently, this research work paves the way for researchers to make other modifications to the WOA algorithm to suit their different purposes.

The rest of the paper starts by describing WOA, its characteristics, and its limitations, followed by various WOA modifications and hybridizations that have been applied to different problems. Next, various applications of WOA are presented. After that, results from different benchmark functions and experiments are analyzed and compared across WOA modifications and other metaheuristic optimization algorithms. Then, the BAT algorithm is presented, and WOA-BAT is proposed. The results of WOA-BAT are evaluated against the original WOA; WOA-BAT turns out to be very competitive and better than WOA on 16 out of 23 benchmark test functions, 13 out of 25 CEC2005 test functions, and 7 out of 10 CEC2019 test functions. Finally, the conclusion is presented together with future work on WOA and WOA-BAT.

1.1. Whale Optimization Algorithm

This algorithm consists of two main phases: in the first phase, encircling the prey and spiral position updating are implemented (exploitation phase), whereas in the second phase, searching for prey is done randomly (exploration phase) [5]. The mathematical model of each phase is illustrated in the following subsections.

1.1.1. Bubble Net Attacking Method

Two approaches are designed to mathematically model the bubble net behavior of humpback whales; together they form the exploitation phase. The two approaches are described as follows:

(1) Encircling Prey. After the humpback whales discover the position of the prey, they encircle it. Since the location of the optimal design in the search space is initially unknown, the WOA algorithm assumes that the current best candidate solution is the target prey or is near the optimum. The other search agents then attempt to move their locations toward the best search agent. This behavior is represented by the following equations:

D = |C · X*(t) − X(t)| (1)

X(t + 1) = X*(t) − A · D (2)

where X*(t) indicates the whale's best location found so far at iteration t, X(t) is the whale's current position, D is the distance vector between the whale and the prey, and | | denotes the absolute value. C and A are coefficient vectors calculated as follows:

A = 2a · r − a (3)

C = 2 · r (4)

where r is a random vector in [0, 1] and a is decreased linearly from 2 to 0 over the course of the iterations.

To apply shrinking, the value of a is reduced in Equation (3); thus, the oscillating range of A is also reduced by a. The value of A lies in the interval (−a, a), where a is decreased from 2 to 0 through the iterations. By selecting random values for A in (−1, 1), the new position of a search agent can be placed anywhere between the agent's original position and the position of the current best agent.
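As a minimal sketch of the shrinking encircling mechanism (in Python with NumPy; the function and variable names are illustrative, not from the original paper), Equations (1)-(4) can be written as:

```python
import numpy as np

def encircle(x, x_best, a, rng):
    """One encircling-prey update: move a whale toward the best
    solution found so far; shrinking is controlled by `a`."""
    r1, r2 = rng.random(x.size), rng.random(x.size)
    A = 2 * a * r1 - a          # Eq. (3): components fall in (-a, a)
    C = 2 * r2                  # Eq. (4)
    D = np.abs(C * x_best - x)  # Eq. (1): distance to the prey
    return x_best - A * D       # Eq. (2): new position

rng = np.random.default_rng(0)
x_new = encircle(np.ones(5), np.zeros(5), a=1.0, rng=rng)
```

Note that when a reaches 0, the coefficient A vanishes and the whale lands exactly on the current best position, which is the shrinking behavior described above.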

(2) Spiral Updating Position. First, the distance between the whale located at (X, Y) and the prey located at (X*, Y*) is calculated. A spiral equation is then generated between the location of the whale and that of the prey to imitate the helix-shaped movement of humpback whales:

X(t + 1) = D′ · e^(bk) · cos(2πk) + X*(t) (5)

where D′ = |X*(t) − X(t)| is the distance between the whale and the prey, b is a constant defining the shape of the logarithmic spiral, and k is a random number in the range [−1, 1]. This behavior is used in WOA to change the position of the whales during optimization. There is a 50% chance of selecting between the shrinking encircling mechanism and the spiral model, designed as follows:

X(t + 1) = X*(t) − A · D if p < 0.5, or D′ · e^(bk) · cos(2πk) + X*(t) if p ≥ 0.5 (6)

where p is a random number in (0, 1).
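A hedged sketch of the spiral move, with illustrative naming (here k is passed explicitly so the step is reproducible, rather than drawn at random as in the algorithm):

```python
import numpy as np

def spiral_update(x, x_best, k, b=1.0):
    """Spiral updating position, Eq. (5): move along a logarithmic
    spiral toward the best solution; b shapes the spiral, k in [-1, 1]."""
    d_prime = np.abs(x_best - x)  # distance between whale and prey
    return d_prime * np.exp(b * k) * np.cos(2 * np.pi * k) + x_best

# With k = 0 the spiral factor is 1, so the whale moves to x_best + |x_best - x|
x_new = spiral_update(np.zeros(3), np.ones(3), k=0.0)
```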

1.1.2. Search for Prey

In the search phase for the prey, called the exploration phase, a similar method based on varying the vector A can be used. The whales actually search randomly for their prey depending on the positions of each other. Therefore, to force the search agents to move far away from a reference whale, WOA uses the vector A with random values greater than 1 or less than −1. Throughout the exploration phase, the location of a search agent is updated according to a randomly selected search agent rather than the best search agent (as in the exploitation phase). This procedure helps the WOA algorithm perform a global search and overcome the local optima problem. The mathematical model is expressed as follows:

D = |C · X_rand − X| (7)

X(t + 1) = X_rand − A · D (8)

where X_rand is a random position vector (a random whale) chosen from the current population.
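The exploration update above can be sketched as follows (again a minimal illustration in Python with NumPy; names are not from the original paper):

```python
import numpy as np

def explore(x, population, a, rng):
    """Search-for-prey update, Eqs. (7)-(8): move relative to a randomly
    chosen whale instead of the best one; used when |A| >= 1."""
    x_rand = population[rng.integers(len(population))]  # random whale
    r1, r2 = rng.random(x.size), rng.random(x.size)
    A = 2 * a * r1 - a
    C = 2 * r2
    D = np.abs(C * x_rand - x)  # Eq. (7)
    return x_rand - A * D       # Eq. (8)
```

The only difference from the encircling step is the reference point: a random whale X_rand rather than the best whale X*, which is what pushes agents away from the incumbent solution.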

1.2. Operation of Whale Optimization Algorithm

The WOA algorithm starts by initializing the whale population with random solutions and assuming that the best value of the objective function is a minimum or a maximum (depending on the problem); the objective function of each search agent is then calculated. At each iteration, each search agent updates its location depending either on the best solution found so far, when |A| < 1, or on a randomly chosen search agent, when |A| ≥ 1. To realize the exploration and exploitation phases, respectively, the value of the parameter a is decreased from 2 to 0. WOA also selects either a spiral or a circular movement through another parameter, p (a random number in [0, 1]), with a 50% probability for each of the two mechanisms: if p is greater than 0.5, the search agents change their positions using Equation (5); otherwise, they use Equation (1). Finally, the WOA algorithm ends when the termination condition is satisfied [5] (Algorithm 1).

Algorithm 1: The whale optimization algorithm pseudocode.
1.3. Whale Optimization Algorithm Pseudocode
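Since the pseudocode itself is not reproduced here, the following is a minimal, illustrative Python sketch of the loop described above. The names, the bounds, and the componentwise check of |A| < 1 are implementation choices (implementations of WOA vary on this point), not prescriptions from the original paper:

```python
import numpy as np

def woa(objective, dim, n_whales=30, max_iter=200, lb=-100.0, ub=100.0, seed=0):
    """A minimal sketch of the WOA loop (Algorithm 1), minimizing `objective`."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_whales, dim))     # random initial population
    fitness = np.array([objective(x) for x in X])
    best = X[np.argmin(fitness)].copy()
    best_f = fitness.min()

    for t in range(max_iter):
        a = 2 - 2 * t / max_iter                 # a decreases linearly 2 -> 0
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2        # Eqs. (3)-(4)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):        # exploitation: encircle best
                    D = np.abs(C * best - X[i])
                    X[i] = best - A * D          # Eqs. (1)-(2)
                else:                            # exploration: random whale
                    x_rand = X[rng.integers(n_whales)]
                    D = np.abs(C * x_rand - X[i])
                    X[i] = x_rand - A * D        # Eqs. (7)-(8)
            else:                                # spiral update, Eq. (5), b = 1
                k = rng.uniform(-1, 1)
                D = np.abs(best - X[i])
                X[i] = D * np.exp(k) * np.cos(2 * np.pi * k) + best
            X[i] = np.clip(X[i], lb, ub)         # keep agents in the search space
            f = objective(X[i])
            if f < best_f:                       # greedily track the best whale
                best_f, best = f, X[i].copy()
    return best, best_f

# Usage: minimize the sphere function (the unimodal benchmark F1)
best, best_f = woa(lambda x: np.sum(x ** 2), dim=5)
```

On the sphere function this sketch converges toward the origin, since the shrinking of a collapses the population onto the best solution in the final iterations.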
1.4. Characteristics of WOA

The process of obtaining a suitable balance between exploitation and exploration in the design of any metaheuristic algorithm is a topmost challenge due to the stochastic nature of the optimization algorithm. WOA stands out among the different optimization approaches through the following:

(1) Exploitation ability
(2) Exploration ability
(3) Ability to get rid of local minima

WOA has an important exploration capability due to the position-updating mechanism of the whales in Equation (7). During the initial steps of the algorithm, this equation forces the whales to move randomly around each other. In the later steps, Equation (5) makes the whales update their positions rapidly and move along a spiral-shaped route toward the best solution found so far. Since these two stages are carried out separately, each in roughly half of the iterations, WOA avoids local optima while maintaining convergence speed across the iterations. Most other optimization algorithms (such as PSO and GSA), by contrast, have no operators to dedicate a particular iteration to exploration or to exploitation, because they use only one formula to update the positions of the search agents, so the probability of falling into local optima is higher [2].

Figure 2 shows the number of publications on WOA since 2016.

Figure 2: The number of publications on the whale optimization algorithm since 2016.
1.5. Limitation of WOA

Metaheuristic algorithms have both strengths and limitations regarding convergence speed and obtaining the optimal solution; thus, the limitations of WOA should also be identified [6]. Randomization plays a crucial role in exploration and exploitation, and using the current randomization technique in WOA increases the computational time, especially for highly complex problems [4].

Besides, convergence and speed depend on a single control parameter, a, which has an excessive impact on the performance of WOA [7]. For that reason, WOA has poor convergence speed in both the exploration and exploitation phases [8, 9]. Thus, the formulation balancing exploration and exploitation requires proper enhancement [10].

In addition, WOA uses the encircling mechanism in the search space, and this mechanism has little capability to jump out of local optima, which results in poor performance [11]. It also has a drawback in improving the best solution after each iteration [12].

It is worth mentioning that WOA cannot work in the fields of classification and dimensionality reduction, as it is not suitable for binary spaces [13]. Likewise, the original WOA cannot deal with complex environmental constraints such as those of the vehicle fuel consumption problem [14]. It also cannot solve single- and multidimensional 0–1 knapsack problems at different scales, as it requires additional functions [15].

2. WOA Modifications and Hybridizations

In this review, we focus on reporting recently published developments of WOA; this section is separated into three parts:

(a) Modifications of WOA: including AWOA, IWOA, chaotic WOA, ILWOA, and MWOA
(b) Hybridizations of WOA: with metaheuristic algorithms, such as SA, PSO, local search, EWGC, and BS-WOA
(c) Problems solved by WOA

2.1. Modifications of WOA

WOA has been modified in several different ways. The following subsections summarize these modifications.

2.1.1. AWOA and SAWOA

Randomization has an essential influence on exploration and exploitation in optimization algorithms, and different randomization techniques have been used, for example, Markov chains, Lévy flights, and the normal (Gaussian) distribution. Beyond these techniques, an adaptive technique has been applied to WOA, called adaptive WOA (AWOA); the same technique has also been used in the cuckoo search algorithm. It is valuable because it decreases the computational time for highly complicated problems [4, 16]. Its best feature is reduced parameter dependency: there is no need to initialize parameters and step sizes, because these parameters change according to the fitness values during the iterations. As a result, AWOA reaches an optimal solution in less computational time and avoids local optima with fast convergence [16, 17]. Trivedi et al. [4] showed that AWOA was better than WOA in terms of computational time and convergence speed.

Another adaptive variant was proposed in [18] for cluster head selection based on the Internet of Things (IoT). IoT is another vital research area with room for performance improvement [19, 20]. In addition to parameters such as the distance, energy, and delay of sensor nodes in a wireless sensor network, self-adaptive WOA (SAWOA) considers the temperature and load parameters of IoT devices. Results showed that SAWOA performed better than other algorithms such as GSA, GA, ABC, PSO, and WOA [21-23].

2.1.2. IWOA

The value of the distance control parameter a affects the ability of exploration and exploitation. This parameter starts at 2 and is decreased to 0 during the iterations [7], which yields fast convergence and accurate results for most problems. Despite these effects, it is linear and cannot adapt to the search process of WOA, which is nonlinear and complex [24]. Therefore, in improved WOA (IWOA), several strategies were defined for the distance control parameter in order to adapt it to the nonlinear search process and achieve better results. There are five kinds of IWOA according to the distance control variable used: SinWOA, CosWOA, TanWOA, LogWOA, and SquareWOA.

Because of the poor balance between exploration and exploitation, the researchers in [8] proposed a novel constitutional appraising approach based on WOA. Results from clinical data analysis showed that IWOA had better convergence performance than the original WOA.

The IWOA proposed in [25] used a new control parameter, an inertia weight, to adjust the impact of the current best solution. To evaluate the performance of IWOA, it was tested on 31 benchmark functions from [5] and compared with the WOA, FOA, and ABC algorithms, using a population size of 1000 and 30 iterations. IWOA outperformed ABC, FOA, and WOA in terms of means and standard deviations.

According to [26], the mean values of ABC and FOA were greater than those of IWOA and WOA for functions f1, f2, f3, f7, f10, f11, f12, f13, f16, and f27. The mean value of IWOA was the lowest, compared to FOA and ABC, for functions f4, f8, f14, f15, f17, f18, f19, f20, f21, and f26. However, the mean values for functions f5 and f9 were equal across all algorithms. The mean value of FOA was greater than that of the other algorithms for function f6. IWOA and FOA had greater mean values than the other algorithms for functions f23, f24, and f25.

The ABC mean value was better than those of IWOA, WOA, and FOA for functions f4 and f4. Function f22 did not suit any of the algorithms, because the mean values were far from the optimal value. The WOA mean value for f8 was the lowest, as was the ABC mean value for f8. In addition, IWOA converged faster and obtained lower values than WOA, FOA, and ABC. It can be said that IWOA was better than ABC and FOA; IWOA also enriched the original WOA.

2.1.3. Chaotic WOA

Metaheuristic algorithms struggle with convergence speed and with obtaining good performance. The theory of nonlinear chaos has widely been used in different applications [9]: dynamical chaotic systems can control unstable periodic motions, and chaos can be used in both stochastic and deterministic algorithms. To improve performance and convergence speed, chaos was combined with WOA [27]. Chaos theory has been used with a variety of algorithms, such as the genetic algorithm [24], harmony search [28], PSO [29], ABC [30, 31], FA [32], KH [33], BOA [34], and GWO [35]. Different chaotic maps were used with WOA to control its main parameter and provide a balance between exploration and exploitation. Chaos refers to the unpredictable behavior of a complicated system, and a map relates parameters through functions with chaotic behavior. Ten one-dimensional chaotic maps were used with WOA [36] to produce chaotic sets. The initial point is crucial because it affects the chaotic maps; 0.7, which lies between 0 and 1, was chosen as the initial point [37]. Twenty benchmark functions were then tested with CWOA, and the chaotic maps enhanced the efficiency of WOA.
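As an illustration, the logistic map, one of the classic one-dimensional chaotic maps, can generate such a chaotic sequence from the initial point 0.7 (the function name and parameters here are illustrative, not taken from [36]):

```python
def logistic_map(x0=0.7, n=10, mu=4.0):
    """Generate a chaotic sequence with the logistic map
    x_{t+1} = mu * x_t * (1 - x_t). Sequences like this can replace
    the uniform random numbers that drive WOA's coefficient vectors."""
    seq, x = [], x0
    for _ in range(n):
        x = mu * x * (1 - x)
        seq.append(x)
    return seq

chaotic = logistic_map()  # starts from 0.7, as chosen in the survey
```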

2.1.4. ILWOA

Cloud computing is a computing paradigm that provides services to clients via the Internet [26]. It is divided into two parts, the front end and the back end: the front end includes all the software that clients need, while the back end is related to the servers and data storage [38]. Effective and intelligent use of cloud datacentre resources is the goal of virtual machine (VM) consolidation, and the most significant problem in VM consolidation is VM placement. The researchers' aim was to minimize the number of physical machines running in a cloud datacentre. Abdel-Basset et al. [39] proposed the improved Lévy WOA (ILWOA) to solve this minimization problem with respect to the available bandwidth. The CloudSim toolkit was used to test ILWOA on 25 different datasets, and it was then compared with WOA, first fit, best fit, particle swarm optimization, the genetic algorithm, and intelligent tuned harmony search. ILWOA showed better performance than the other algorithms.

2.1.5. MWOA

With the development of technology, protecting information transmitted over the Internet is crucial. A modified version of WOA (MWOA), used for cryptanalysis of the Merkle–Hellman knapsack cryptosystem (MHKC), was developed in [40]. Continuous values were converted to discrete ones by a sigmoid function; the evaluation function dealt with infeasible solutions by adding a penalty function; and a mutation operation was added to improve the solution. MHKC was one of the earliest public key cryptosystems (PKC), invented in 1978. MHKC uses two keys, a private key and a public key: the plaintext is encrypted with the public key and decrypted with the private key [11]. MWOA was used to break MHKC given the ciphertext; consequently, an attacker could recover the plaintext by applying MWOA to the ciphertext [40].

2.1.6. Memetic WOA

WOA is very competitive with other common metaheuristic algorithms. However, its performance is restricted by its search dynamics: the encircling mechanism mostly focuses on exploitation of the search space, so WOA has poor ability to jump out of local optima. To solve this problem, memetic WOA (MWOA) was proposed in [12], which embeds a chaotic local search inside WOA to extend its search capacity and create a balance between the exploration and exploitation phases. MWOA was tested on 48 benchmark functions, and the results showed that it outperformed its competitors in terms of accuracy and convergence speed.

2.2. Hybridization of WOA

WOA has been combined with common metaheuristic algorithms to achieve better solutions and overcome the weaknesses of WOA and the other algorithms. The hybridizations of WOA are summarized in the following subsections.

2.2.1. With SA

Majdi and Mirjalili [41] combined WOA with simulated annealing (SA). SA was embedded inside WOA to improve the best solution found at the end of each iteration. While WOA was able to search efficiently for the best solution, the blind operator it used in the exploitation phase was replaced: SA was applied to enhance the exploitation phase and overcome stagnation in local optima.

2.2.2. With PSO

Trivedi et al. [42] combined PSO and WOA to obtain superior solutions for global numerical functions in an uncertain environment, using PSO for the exploitation phase and WOA for the exploration phase. WOA used its spiral path to explore possible solutions in less computational time and avoid local optima [5]. The results showed the efficiency of PSO-WOA compared to PSO and WOA individually.

2.2.3. With Local Search

In [40], the authors hybridized WOA with a strategy called local search in order to solve the permutation flow shop scheduling problem (FSSP). FSSP is an NP-hard problem for which it is difficult to find a solution in polynomial time. Given its importance, several algorithms have been developed to achieve two goals: reducing the time complexity and decreasing the makespan of the best schedule. Other algorithms that solved FSSP had drawbacks, such as high computational cost and local optima [40]. Therefore, the largest rank value (LRV) rule was needed to handle the discrete search space of the problem. The resulting hybrid whale algorithm (HWA) was able to reach an optimal solution quickly by using various techniques, for example, swap mutation, the insert-reversed block operation, a local search strategy, and integration with a heuristic algorithm known as Nawaz–Enscore–Ham (NEH). The swap mutation operation was used to improve the diversity of the candidate schedules, and local optima were avoided by using the insert-reversed block operation. HWA, combined with NEH to improve basic WOA performance, showed better results than the basic WOA [40].

2.2.4. With EWGC

Data volumes are increasing nowadays, and controlling data has become a difficult task; data may be too complex, so decision-making is affected by the way data are organized. Data clustering is therefore essential to extract knowledge and make efficient decisions. Exponential grey wolf optimization (EGWO) with whale optimization for data clustering (EWGC) was proposed to identify optimal centroids through the clustering process. WGC used a hybridization of WOA and EGWO [43].

WGC used the hunting mechanism of the WOA algorithm to find centroids and the EGWO algorithm for position updates in the exploration phase. Three datasets were used to test the proposed algorithm, and the results were compared with particle swarm clustering (PSC), modified PSC (mPSC), grey wolf optimization (GWO), exponential GWO, kernel-based EGWO, and WOA. WGC showed better results than those algorithms.

2.2.5. With BS

Cloud computing plays a major role in the digital era because it serves a large number of users at the same time. Besides its many advantages, the security of data stored in the cloud platform is a big challenge. Brainstorm WOA (BS-WOA) is a hybridized algorithm based on brain storm optimization and WOA. BS-WOA was used to identify the secret key for the database, because the privacy of users should be preserved: it generated a key for the data coming from the data owner in order to protect the data from being used by third-party users. As a result, BS-WOA improved the privacy and utility of data in the cloud, with the secret key identified during the optimization process [44].

3. Applications of WOA

WOA has been used in various academic and industrial fields; the most important application classes are shown in the following subsections.

3.1. Electrical and Electronics Engineering

In recent years, electric power distribution systems have required an extensive voltage ratio to supply inductive loads, which causes more power losses in the distribution networks and weakens the power factor. To control these problems, an appropriate placement of capacitors is needed; eliminating the network line losses can enhance the stability and accuracy of the system. WOA was proposed to find the optimal sizing and placement of capacitors for standard radial distribution systems, taking several aspects into consideration, such as decreasing the operating cost and minimizing the power losses subject to inequality limits on the voltage range. The suggested algorithm was validated on standard radial systems: the IEEE-34 and IEEE-85 bus radial distribution test systems. The obtained results were efficient compared with existing algorithms [45].

The main goal of the economic operation of power plants is scheduling the generating units to obtain the minimum generation cost for the power utilities, which means low-cost electricity. WOA is one of the most important new strategies for solving the economic dispatch problem. The performance of the algorithm was verified on the standard IEEE 30-bus test system; the results obtained from the proposed algorithm were compared with other metaheuristic approaches, such as PSO, ant colony optimization, and the genetic algorithm, and the comparison indicated that the obtained results were roughly similar [46, 47].

3.2. Economic Scheduling

With a massive number of real-world applications, interest in the flow shop scheduling problem (FSSP) has increased intensely. FSSP is regarded as an NP-hard problem, since finding a solution in polynomial time is difficult. In order to decrease the makespan of the best schedule and reduce the required time, WOA was merged with a local search technique for handling the flow shop scheduling problem. A swap mutation operation was utilized to enhance the diversity of the schedules, and the local optima problem was overcome by using the insert-reversed block operation. The hybrid whale algorithm (HWA) obtained competitive results compared with previous algorithms [10].

3.3. Civil Engineering

The enhanced whale optimization algorithm (EWOA) was suggested to deal with sizing and optimization problems of truss and frame structures. EWOA was used to solve four structural optimization problems: two truss optimization problems (spatial 72-bar truss and spatial 582-bar tower) and two frame optimization problems (3-bay 15-story frame and 3-bay 24-story frame). The obtained numerical results showed that the suggested EWOA had better efficiency than the standard whale optimization algorithm [48].

3.4. Fuel and Energy

WOA is widely used in the fields of saving, processing, and improving energy and fuel sources; the following are some of these applications:

(1) The need for clean sources of energy has caused a rise in the use of solar energy; thus, researchers have given great importance to the design of photovoltaic cells. They face two important problems: the first is finding a useful model to describe the solar cells, and the second is the lack of information about photovoltaic cells, which badly influences the efficiency of the photovoltaic modules (panels). The chaotic whale optimization algorithm (CWOA) was developed for the parameter estimation of solar cells, calculating and automatically adapting the internal parameters of the optimization algorithm. The improved technique was able to optimize difficult and multimodal objective functions, and the experimental results of the proposed approach showed higher performance in terms of accuracy and robustness [49].

(2) More recently, researchers have been searching for alternative energy sources, such as solar, wind, and biomass, because of the depletion of conventional energy sources, such as petrol and coal, which are among the main causes of environmental pollution. Under different and varying conditions, it is very important to extract the maximum solar power from the photovoltaic panels; thus, a modified artificial killer whale optimization algorithm (MAKWO) was suggested to trace and find the highest-power region of the photovoltaic module in a partially cloudy atmosphere. The findings obtained from MAKWO were compared with different metaheuristic algorithms (the modified wolf pack algorithm (MWPA), artificial bee colony (ABC), and particle swarm optimization (PSO)), with a significant performance advantage for the proposed algorithm over all the others [50].

3.5. Medical Engineering

Lately, the analysis of medical images has become the focus of many researchers, because diagnosis and surgery depend heavily on these images. The liver is one of the organs most studied in computer-aided diagnosis systems, with the goals of detecting the correct position of the organ inside the abdomen and avoiding intensity values that overlap with other organs. The whale optimization algorithm was proposed for liver segmentation in MRI images. For the segmentation process, a number of clusters in the abdominal image were determined, and WOA split the image into these clusters. After conversion to a binary image, it was multiplied by the previously WOA-clustered image to remove parts of other organs; the required clusters were then extracted, leading to the liver area. A set of 70 images was tested using the suggested method and validated by radiology specialists. Measures such as the structural similarity index measure (SSIM), the similarity index (SI), and five other measures were used to verify the correctness of the segmentation. The final results showed 96.75% accuracy using SSIM and 97.5% using SI [51].

3.6. Problems Solved by WOA

WOA is a metaheuristic optimization algorithm that can be used to solve different problems, such as engineering problems, binary problems, multiobjective problems, and scheduling problems. Table 1 summarizes several problems that have been solved by WOA.

Table 1: Problems solved by WOA.

4. Benchmark Functions Experiment

According to [5], WOA was compared with different algorithms, such as GSA, PSO, FEP, and DE. These algorithms were tested on 29 benchmark functions, which can be divided into four types: unimodal, multimodal, fixed-dimension multimodal, and composite functions, as shown in Table 2. These benchmark functions are used as a validation procedure to test WOA, and the results are then compared with other common algorithms to determine whether WOA performs better. Each algorithm was run 30 times in order to obtain the optimum solution.

Table 2: Description of unimodal, multimodal, fixed-dimension multimodal, and composite benchmark functions used in this work.

The following subsections include the comparison and discussion, solving a classical engineering problem with WOA, comparing WOA with IWOA, comparing WOA with other algorithms for feature selection, and finally evaluating the performance of WOA against AWOA and ILWOA.

4.1. Comparison and Discussion

WOA characteristics were assessed based on 29 benchmark functions. These benchmark functions are standard functions that are used as a validation procedure to assess WOA and its modifications. Tables 3–5 in the following sections show the average and standard deviation. The following points explain the exploitation, exploration, escaping from local minima, and convergence behavior.

4.1.1. Capability Exploitation Assessment

F1 to F7 are unimodal functions that have only one optimum. Therefore, they can be used to evaluate the exploitation performance of each algorithm. Table 3 shows that WOA is as good as other optimization algorithms for unimodal functions in exploitation capability. Specifically, for F1 and F2, WOA is the most efficient optimizer, and it ranks second in almost all the other functions. Therefore, WOA is good at exploitation [5].
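For concreteness, the first two unimodal benchmarks in this suite have simple closed forms (F1 is the sphere function, F2 is Schwefel 2.22 [5]); the following is a minimal Python sketch, not the original Matlab code:

```python
import math

def f1(x):
    # F1 (sphere): sum of squares, single global minimum 0 at the origin
    return sum(v * v for v in x)

def f2(x):
    # F2 (Schwefel 2.22): sum plus product of absolute values
    return sum(abs(v) for v in x) + math.prod(abs(v) for v in x)

# Both reach their global optimum 0 at x = [0, ..., 0]
print(f1([0.0] * 30), f2([0.0] * 30))  # 0.0 0.0
```

An optimizer's exploitation quality on such functions is simply how closely it drives these values toward 0.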

Table 3: Result comparison among optimization algorithms [5].
4.1.2. Capability Exploration Assessment

Multimodal functions have numerous local optima, whereas unimodal functions have a single optimum. The number of local optima increases with the number of design variables. As a result, these types of functions are vital for evaluating the exploration capability of optimization algorithms. Table 4 shows that WOA has a good capability for exploration. Because of its integrated exploration mechanism, WOA ranks second compared with the other optimization algorithms.
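To illustrate why such functions stress exploration, the Rastrigin function (one of the multimodal benchmarks in this suite) gains local minima with every added dimension; a minimal Python sketch:

```python
import math

def rastrigin(x):
    # Multimodal: a local minimum near every integer point,
    # global minimum 0 at the origin
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

print(rastrigin([0.0] * 30))   # global optimum: 0.0
print(rastrigin([1.0] * 30))   # a nearby local minimum, value ~30
```

A purely exploitative optimizer started near x = 1 would settle in that local minimum and never reach the origin, which is exactly what these benchmarks are designed to expose.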

Table 4: Result comparison among optimization algorithms [5].
4.1.3. Escaping from Local Minima

Balancing between exploration and exploitation is the only way to avoid local optima because composite functions are mathematically challenging. Table 5 shows that the WOA algorithm ranked first in three tests and is as good as the other optimization algorithms. It also demonstrates that WOA balances the exploration and exploitation phases well.

Table 5: Composite benchmark functions comparison result [5].
4.1.4. Convergence Behavior Analysis

When comparing different metaheuristic algorithms (WOA, PSO, and GSA) on some problems, it can be seen that the convergence rate of WOA is competitive with the other algorithms when tested on 29 benchmark functions [5]. WOA has several main characteristics that make it faster than other algorithms. In the initial iterations, the search agents relocate their positions randomly around each other through Equation (8), which gives WOA high exploration capability; using Equation (7), the search agents then reposition themselves along a spiral-shaped path toward the best solution found so far. Each phase runs in roughly half of the iterations; thus, WOA has higher local-optima avoidance and a faster convergence rate than other similar metaheuristic algorithms. However, PSO and GSA have a greater probability of stagnating in local optima simply because they do not have parameters to dedicate specific iterations to the exploration or exploitation phases; in other words, they use only one equation to update the search agents' positions. WOA also requires fewer iterations to reach the global optimum compared to other algorithms.
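The two update rules referred to above can be sketched as follows; this is an illustrative Python rendering of the standard WOA position update from [5] (with spiral constant b = 1 and a decreasing linearly from 2 to 0), not the authors' code:

```python
import math
import random

def woa_update(agent, best, agents, t, max_iter, b=1.0):
    """One WOA position update for a single search agent."""
    a = 2 - 2 * t / max_iter                 # decreases linearly from 2 to 0
    A = 2 * a * random.random() - a          # |A| >= 1 triggers exploration
    C = 2 * random.random()
    if random.random() < 0.5:
        # encircling: follow the best agent if |A| < 1, else a random agent
        ref = best if abs(A) < 1 else random.choice(agents)
        return [ref[i] - A * abs(C * ref[i] - agent[i]) for i in range(len(agent))]
    # spiral move toward the best solution found so far
    l = random.uniform(-1, 1)
    return [abs(best[i] - agent[i]) * math.exp(b * l) * math.cos(2 * math.pi * l)
            + best[i] for i in range(len(agent))]
```

Because a shrinks each iteration, |A| >= 1 (exploration) dominates early, while the encircling and spiral moves toward the best solution dominate later, which is the two-phase behavior described above.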

4.2. WOA for Classical Engineering Problem

Mirjalili and Lewis [5] used WOA to solve the following engineering problems, which are shown in Table 6.

Table 6: Different engineering problem comparison result.
4.3. WOA Feature Selection Experiment

Sixteen datasets were chosen in this paper [28]. Each dataset was randomly separated into three parts: training, validation, and testing. The training part was used to build the classifier, the validation part was used to assess the classification capability, and the test part was used to evaluate the selected features. WOA, PSO, and GA were run on this test in order to obtain the comparison results.

The results were computed in the Matlab environment over 20 runs. Overall, WOA outperformed the other methods in feature selection, which confirmed the ability of the wrapper-based approach to avoid premature convergence while searching for the optimal feature subset in the search space. WOA was better than PSO and GA in its ability to search for optimal features, and it can avoid the local optima that premature convergence may cause. Moreover, the results showed that WOA can find an optimal solution with maximum classification accuracy. It was also capable of maintaining a balance between exploration and exploitation.

4.4. Performance Evaluation on Benchmark Functions between Several Variants of WOA

WOA has several modified variants. In the following subsections, it is compared with two of them, AWOA and ILWOA.

4.4.1. WOA and AWOA Comparison

AWOA was tested on different unconstrained benchmark functions, and it achieved better results than WOA. AWOA improved the solutions through fast convergence, randomness, and stochastic behavior. It can also be used as a random search in the workspace when no optimal solution is known. Thus, AWOA is an effective technique for solving problems within an unknown search space [4].

4.4.2. WOA and ILWOA Comparison

Statistical results from [39] show the difference between ILWOA and WOA in performance. The Friedman test is used to analyze the experimental results; it can be applied to more than two dependent samples because it is a nonparametric, rank-based counterpart of one-way ANOVA with repeated measures.
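As a sketch of how the test works (not the authors' code): for n problem instances and k algorithms, each instance ranks the algorithms 1..k, and the statistic is computed from the mean ranks. A minimal pure-Python version, assuming no tied results:

```python
def friedman(results):
    """results[i][j]: score of algorithm j on instance i (lower is better).
    Returns (mean rank per algorithm, Friedman chi-square statistic)."""
    n, k = len(results), len(results[0])
    totals = [0.0] * k
    for row in results:
        order = sorted(range(k), key=lambda j: row[j])
        for rank, j in enumerate(order, start=1):
            totals[j] += rank
    mean_ranks = [t / n for t in totals]
    # chi-square form of the Friedman statistic
    chi2 = 12 * n / (k * (k + 1)) * sum(r * r for r in mean_ranks) - 3 * n * (k + 1)
    return mean_ranks, chi2

# 3 algorithms on 4 instances; the first always scores best
mean_ranks, chi2 = friedman([[0.1, 0.5, 0.9]] * 4)
print(mean_ranks, chi2)  # [1.0, 2.0, 3.0] 8.0
```

The mean ranks are what Figures 3 and 4 report; the chi-square value would then be compared against a critical value to decide significance.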

Figures 3 and 4 show the datasets with three and five host types together with the Friedman mean rank of each algorithm. The results show that ILWOA had the best performance in minimizing the utilization of host machines [39]. The Friedman test and datacenter host utilization were used to analyze the obtained results. ILWOA was tested on the three- and five-bin datasets, and its efficiency increased as the number of bin datasets increased.

Figure 3: Friedman test of datasets with 3 host types.
Figure 4: Friedman test of datasets with 5 host types.

5. Standard Bat Algorithm

The bat algorithm is a metaheuristic algorithm developed by Xin-She Yang in 2010 [57]. It is based on the echolocation capabilities of microbats. Before illustrating the details of this algorithm, we briefly summarize echolocation.

5.1. Echolocation of Microbats

Bats are mammals with echolocation capabilities. They use echolocation sonar to detect prey or avoid obstacles. These bats emit a very loud sound pulse and receive the echo that bounces back from the surrounding objects. In zones with uniform atmospheric pressure, these sound pulses travel at a constant velocity, which changes if the atmospheric pressure changes [57]. Bats can estimate the positions of surrounding objects using the time delay of the returning pulse, and they determine the shape and direction of objects by comparing the amplitudes of the sound pulses collected at each ear. Finally, the data obtained are analyzed and interpreted in the brain to construct an image of their surroundings [58].

5.2. Bat Algorithm

Using the concept of bat echolocation abilities, Yang developed the bat-inspired algorithm, or bat algorithm [57]. He simulated this behavior to solve different optimization problems. Bats can determine the position of their prey, objects, or food exactly by emitting very loud sound waves and receiving the echoes that come back from these objects; the time delay between emission and echo, together with the difference between the signals received at the two ears, is used to locate the prey. While searching for prey, bats fly randomly in the search space with a velocity v_i and update their positions x_i using a frequency f_i, wavelength λ_i, and loudness A_i. In the algorithm, to update the values of these parameters, Yang used the following three equations [57]:

    f_i = f_min + (f_max − f_min)β
    v_i(t) = v_i(t − 1) + (x_i(t − 1) − x*) f_i
    x_i(t) = x_i(t − 1) + v_i(t)

where x_i is the position of bat i, v_i is its velocity, f_i is the frequency of its waves, and β is a random vector in the range [0, 1] drawn from a uniform distribution. Here, x* refers to the global best solution found so far among all bats in the search space. The upper and lower limits of the frequency are determined based on the domain size of the optimization problem; usually, the upper boundary is set to 100 and the lower boundary to 0. Initially, each bat takes a random frequency within the range [f_min, f_max]. The velocity of each search agent follows its frequency, and the new position is obtained from the new velocity. When a bat finds its prey or food, its loudness decreases while its pulse emission rate rises. The pseudocode listed by Yang [57] is shown in Algorithm 2.
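The three update equations can be rendered as a short Python sketch (illustrative, not Yang's original implementation; the loudness and pulse-rate updates are omitted):

```python
import random

def bat_update(x, v, best, f_min=0.0, f_max=100.0):
    """One velocity/position update for a single bat.

    x, v: current position and velocity; best: best solution found so far."""
    beta = random.random()                     # uniform random value in [0, 1]
    f = f_min + (f_max - f_min) * beta         # frequency equation
    v_new = [v[i] + (x[i] - best[i]) * f for i in range(len(x))]
    x_new = [x[i] + v_new[i] for i in range(len(x))]
    return x_new, v_new, f

# A bat already sitting on the best solution does not move
x_new, v_new, f = bat_update([1.0, 2.0], [0.0, 0.0], [1.0, 2.0])
print(x_new, v_new)  # [1.0, 2.0] [0.0, 0.0]
```

The velocity term pulls each bat relative to the global best, so bats far from x* move faster when their sampled frequency is high.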

Algorithm 2: BAT algorithm pseudocode.

6. Hybrid WOA-BAT Algorithm

WOA is an optimization algorithm that has shown high performance in solving many optimization problems. Despite these results, WOA exhibits slow convergence while closing in on the global optimum [53]. Therefore, the BAT algorithm is used to improve the exploration capability of WOA. In this approach, two basic techniques are used: (1) the BAT algorithm is partially embedded inside the WOA search phase, and (2) a greedy condition is applied after changing the position of each search agent: if the new position is better than the old position, the old position is replaced. As a result, the WOA-BAT algorithm can obtain better results in fewer iterations compared to WOA. The details of this modification can be seen in the WOA-BAT pseudocode (Algorithm 3) and Figure 5.
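The structure just described can be sketched as a skeleton (illustrative only, not the released WOA-BAT code; a BAT-style frequency move and a simplified WOA encircling move stand in for the full update rules):

```python
import random

def sphere(x):
    # simple test objective: global minimum 0 at the origin
    return sum(v * v for v in x)

def woa_bat(objective, dim=30, pop=30, max_iter=500, lo=-100.0, hi=100.0):
    """Skeleton of the hybrid loop: BAT-style moves for exploration,
    WOA encircling for exploitation, greedy replacement of positions."""
    agents = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best = min(agents, key=objective)
    for t in range(max_iter):
        a = 2 - 2 * t / max_iter
        for i, x in enumerate(agents):
            A = 2 * a * random.random() - a
            if abs(A) >= 1:
                # exploration: BAT-style frequency-driven move
                f = 100.0 * random.random()
                cand = [x[d] + (x[d] - best[d]) * f for d in range(dim)]
            else:
                # exploitation: WOA shrinking-encircling move toward best
                C = 2 * random.random()
                cand = [best[d] - A * abs(C * best[d] - x[d]) for d in range(dim)]
            cand = [min(hi, max(lo, v)) for v in cand]
            if objective(cand) < objective(x):      # greedy condition (2)
                agents[i] = cand
        best = min(agents + [best], key=objective)
    return best, objective(best)

random.seed(0)
_, value = woa_bat(sphere, dim=5, pop=10, max_iter=50)
print(value)  # best value never worsens across iterations
```

The greedy condition guarantees that each agent's objective value never worsens, which is why the hybrid tends to need fewer iterations than plain WOA in the reported experiments.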

Algorithm 3: WOA-BAT algorithm pseudocode.
Figure 5: WOA-BAT flowchart.

7. Implementation and Results

The proposed WOA-BAT algorithm is implemented and evaluated using three different benchmark suites: 23 mathematical optimization problems (Table 2 and [5]), CEC2005 (Table 7), and CEC2019 (Table 8). The code of WOA-BAT is available at the following link: . To build WOA-BAT, the WOA code implemented by Mirjalili and Lewis was extended, and the WOA and BAT algorithms were hybridized in Matlab. The following subsections include a description of the benchmark functions, the experimental setup, the evaluation criteria, a comparison of WOA-BAT with WOA, and a comparison of WOA-BAT with common algorithms.

Table 7: Summary of the 25 CEC2005 test functions.
Table 8: Summary of the basic CEC2019 functions.
7.1. Benchmark Functions

First, the 23 mathematical benchmark functions are implemented [5]. These test functions can be classified into two groups: f1–f7 (unimodal benchmark functions) and f8–f23 (multimodal benchmark functions). Second, CEC2005 includes four types of benchmark functions: unimodal functions, multimodal functions, expanded multimodal functions, and hybrid composition functions, containing 5, 7, 2, and 11 functions, respectively [59–61]; the CEC2005 function details are given in Table 7 [62]. Third, 10 benchmark functions are used from CEC2019; all of them are multimodal and are listed in Table 8 [63].

7.2. Experimental Setup

To enable a fair comparison with other common algorithms, the initial population is randomly generated. The population size is 30, the maximum number of iterations is 500, and the dimension is 30. Each experiment is executed 30 times, and the average result is reported.

7.3. Evaluation Criteria

Three criteria are used to evaluate the algorithms and obtain a sound comparison: (1) computing the average and standard deviation, (2) comparing WOA-BAT with WOA using box-and-whisker plots, and (3) comparing WOA-BAT with other metaheuristic algorithms.
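For criterion (1), the 30 per-run results of an algorithm are reduced to an average and a standard deviation; a minimal pure-Python illustration (the sample values are invented):

```python
import statistics

# hypothetical best-fitness values from 30 independent runs of one algorithm
runs = [0.012, 0.009, 0.015, 0.011, 0.010] * 6   # 30 illustrative values

avg = statistics.mean(runs)
std = statistics.pstdev(runs)   # population std; use stdev() for sample std
print(f"avg = {avg:.4f}, std = {std:.4f}")
```

The same 30-value samples feed criterion (2): a box-and-whisker plot of each sample makes the spread of the two algorithms directly comparable.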

7.4. Comparison with Original WOA Algorithm
7.4.1. Evaluation of F1–F7 Exploitation

These are unimodal functions, as they have a single global optimum. Using these functions, we can easily investigate the exploitation capability of the developed algorithm. Table 9 and Figure 6 show that WOA-BAT is the better optimizer for f3, f4, f5, f6, and f7, while WOA is better for f1 and f2. As a result, WOA-BAT has an effective exploitation ability.

Table 9: Comparison result of WOA-BAT and WOA.
Figure 6: Comparison of average results of WOA-BAT and WOA.
7.4.2. Evaluation of F8–F23 Exploration

Functions f8–f23 are multimodal functions, which can have many local optima; their number increases with the number of design variables. Therefore, these functions can be used to test the exploration capability of the WOA-BAT algorithm. Table 9 and Figure 6 illustrate that WOA-BAT obtains the better average on 11 benchmark functions (f8, f11, f12, f13, f14, f15, f17, f19, f20, f22, and f23), while WOA has the optimum value in 4 functions (f9, f10, f18, and f21).

The 25 benchmark functions of CEC2005 are used to test WOA-BAT and WOA. Table 10 and Figure 7 show the comparison results of WOA and WOA-BAT in box-and-whisker plots. Table 10 shows that WOA-BAT has better performance than the original WOA in 13 functions: f1, f2, f3, f4, f6, f9, f10, f12, f13, f18, f19, f22, and f25. WOA outperforms in the remaining functions, while WOA-BAT and WOA have the same performance in f7 and f8, as can be seen in Figure 7. Overall, the proposed algorithm improves on the original WOA in approximately 13 functions.

Table 10: Comparison result of WOA-BAT and WOA on CEC2005.
Figure 7: Comparison of average results of WOA-BAT and WOA CEC2005.

Like CEC2005, CEC2019 is used to test the WOA-BAT algorithm against WOA. Table 11 and Figure 8 show that WOA-BAT has a lower average result than WOA in seven functions: f1, f2, f3, f5, f7, f8, and f10. However, WOA-BAT is not competitive with WOA in f4, f6, and f9. Overall, WOA-BAT improves on WOA in 7 benchmark functions from CEC2019.

Table 11: Comparison results of WOA-BAT and WOA CEC2019.
Figure 8: Comparison average result of WOA-BAT and WOA CEC2019.
7.5. Comparison with Metaheuristic Algorithms

Results from different papers are included in this paper in order to compare WOA-BAT with other well-known evolutionary algorithms, namely, GA, DE, ABC, and BSO. The results of these algorithms are obtained on CEC2005, which includes 25 benchmark functions [59–61]. The CEC2005 results shown in Table 12 indicate that WOA-BAT takes first rank because it outperforms the others in 13 functions: f3, f11, f12, f15, f16, f17, f18, f19, f20, f21, f22, f23, and f25. BSO and DE take the second and third ranks, respectively: BSO performs best in 8 functions, while DE performs best in 3 functions (f4, f5, and f6).

Table 12: Comparison of WOA-BAT with GA, DE, ABC, and BSO.

However, in terms of standard deviation, ABC obtains the best result in 8 functions. GA has the worst results across all functions and does not perform well compared to the other algorithms; DE is the second-worst algorithm.

Table 13 is created in order to obtain the ranking of the optimization algorithms from Table 12. It illustrates that WOA-BAT has the best ranking among the five optimization algorithms: the overall rank of WOA-BAT is 1.6, whereas that of BSO is 2.6. Accordingly, the difference between WOA-BAT and BSO is 1, which can be considered significant. WOA-BAT and BSO have approximately the same ranking for f1–f12; however, there is a significant difference over f13–f14 and f15–f25, where WOA-BAT is better than BSO in f15–f25 by 1.9. Overall, WOA-BAT has a better ranking compared to GA, DE, ABC, and BSO.

Table 13: Ranking of WOA-BAT optimization compared to GA, DE, ABC, and BSO.

8. Conclusion

In this study, WOA was explained in detail. WOA characteristics and its functionality were presented. In addition, the use of WOA was described in different areas, such as electrical and electronics engineering, automatic control system, civil engineering, fuel and energy, and medical engineering. Furthermore, researchers have modified and hybridized WOA in order to overcome optimization problems in the above areas.

WOA was tested on 23 benchmark functions in order to determine its capabilities in exploitation, exploration, escaping from local minima, and convergence behavior. WOA had better exploitation performance when tested on unimodal functions, and it also performed well in exploration on multimodal functions. In addition, testing WOA on composite functions showed its ability to balance exploration and exploitation. Therefore, it can be said that WOA increases its convergence speed over the iterations, while the majority of optimization algorithms (such as PSO and GSA) do not have operators to dedicate particular iterations to exploration or exploitation; because they use only one formula to update the positions of the search agents, their probability of falling into local optima is higher.

It is safe to say that WOA achieves fast convergence and avoids local optima at the same time because it has two independent stages (exploration and exploitation). Both exploration and exploitation are performed over the course of the iterations.

It is obvious that WOA cannot solve every optimization problem; however, it is very competitive with other common optimization algorithms. Another limitation of WOA is its poor convergence speed while searching around the global optimum.

It is established that there are many WOA modifications and hybridizations. It is impossible to compare each newly proposed WOA variant with all the others, and there are many different benchmark functions that can be used to test new modifications. Therefore, it is believed that creating a platform where researchers can upload their programs is essential; it would then be easy to run and compare all the modifications and hybridizations and decide which one is best.

WOA demonstrated high performance in solving many optimization problems. Nevertheless, WOA exhibited slow convergence while approaching the global optimum. As a result, the BAT algorithm was used to improve the exploration capability of WOA, and the WOA-BAT algorithm was presented to obtain better results in fewer iterations compared to WOA.

In this paper, WOA-BAT and WOA were tested on 25 functions from CEC2005. The results indicate that WOA-BAT performs much better than WOA in 13 functions and has the same result in two functions. WOA-BAT was also tested on CEC2019 and compared with WOA; it has a lower average than WOA in 7 out of 10 functions.

WOA-BAT was evaluated against other competitive algorithms using CEC2005. The results showed that WOA-BAT ranks first among GA, DE, ABC, and BSO.

There are several areas of WOA that can be further researched in the future. The following areas might be interesting for researchers:

(1) Hybridization of WOA with other population-based metaheuristic algorithms, such as the ant-lion algorithm
(2) Investigation of the adaptive value responsible for the exploration and exploitation ability of WOA-BAT
(3) Solving real-world problems in the health care field by hybridizing WOA-BAT with another optimization algorithm
(4) Hybridization of other optimization algorithms with WOA-BAT for cluster head selection in IoT
(5) Using WOA-BAT to train other advanced types of machine learning techniques, such as Capsule Net, LSTM, and CNN
(6) Applying WOA-BAT to constrained optimization problems
(7) Applying WOA-BAT to discrete optimization problems
(8) Solving different business applications using the WOA-BAT algorithm
(9) Using WOA-BAT for feature selection in data mining
(10) Using WOA-BAT in the text mining field

Conflicts of Interest

The authors declare that they have no conflicts of interest.


Acknowledgments

The authors wish to express their deep thanks to the University of Kurdistan Hewler (UKH) for providing funds for conducting this research study.


  1. X.-S. Yang, Nature-Inspired Metaheuristic Algorithms, Luniver Press, Middlesex University, UK, 2010.
  2. V. Ho-Huu, T. Nguyen-Thoi, M. Nguyen-Thoi, and L. Le-Anh, “An improved constrained differential evolution using discrete variables (D-ICDE) for layout optimization of truss structures,” Expert Systems with Applications, vol. 42, no. 20, pp. 7057–7069, 2015. View at Publisher · View at Google Scholar · View at Scopus
  3. R. V. Rao, V. J. Savsani, and D. Vakharia, “Teaching–learning-based optimization: an optimization method for continuous non-linear large scale problems,” Information Sciences, vol. 183, no. 1, pp. 1–15, 2012. View at Publisher · View at Google Scholar · View at Scopus
  4. I. N. Trivedi, J. Pradeep, J. Narottam, K. Arvind, and L. Dilip, “Novel adaptive whale optimization algorithm for global optimization,” Indian Journal of Science and Technology, vol. 9, no. 38, 2016. View at Publisher · View at Google Scholar · View at Scopus
  5. S. Mirjalili and A. Lewis, “The whale optimization algorithm,” Advances in Engineering Software, vol. 95, pp. 51–67, 2016. View at Publisher · View at Google Scholar · View at Scopus
  6. P. Niu, S. Niu, and L. Chang, “The defect of the Grey Wolf optimization algorithm and its verification method,” Knowledge-Based Systems, vol. 171, pp. 37–43, 2019. View at Publisher · View at Google Scholar
  7. M. Zhong, “Long W Whale optimization algorithm with nonlinear control parameter,” MATEC Web of Conferences, vol. 139, Article ID 00157, 2017. View at Publisher · View at Google Scholar · View at Scopus
  8. R. K. Saidala and N. Devarakonda, “Improved whale optimization algorithm case study: clinical data of anaemic pregnant woman,” in Data Engineering and Intelligent Computing, Springer, Cham, Switzerland, 2018. View at Publisher · View at Google Scholar · View at Scopus
  9. L. M. Pecora and T. L. Carroll, “Synchronization of chaotic systems,” Chaos: An Interdisciplinary Journal of Nonlinear Science, vol. 25, no. 9, Article ID 097611, 2015. View at Publisher · View at Google Scholar · View at Scopus
  10. M. Abdel-Basset, G. Manogaran, D. El-Shahat, and S. Mirjalili, “A hybrid whale optimization algorithm based on local search strategy for the permutation flow shop scheduling problem,” Future Generation Computer Systems, vol. 85, pp. 129–145, 2018. View at Publisher · View at Google Scholar · View at Scopus
  11. S. Nagaraj, G. Raju, and V. Srinadth, “Data encryption and authetication using public key approach,” Procedia Computer Science, vol. 48, pp. 126–132, 2015. View at Publisher · View at Google Scholar · View at Scopus
  12. Z. Xu, Y. Yu, H. Yachi, J. Ji, Y. Todo, and S. Gao, “A novel memetic whale optimization algorithm for optimization,” in International Conference on Swarm Intelligence, Springer, Cham, Switzerland, 2018. View at Publisher · View at Google Scholar · View at Scopus
  13. A. Kumar, V. Bhalla, P. Kumar, T. Bhardwaj, and N. Jangir, “Whale optimization algorithm for constrained economic load dispatch problems—a cost optimization,” in Ambient Communications and Computer Systems, 2018. View at Publisher · View at Google Scholar · View at Scopus
  14. A. G. Hussien, A. E. Hassanien, E. H. Houssein, S. Bhattacharyya, and M. Amin, “S-shaped binary whale optimization algorithm for feature selection,” in Recent Trends in Signal and Image Processing, Springer, Cham, Switzerland, 2019. View at Publisher · View at Google Scholar · View at Scopus
  15. G. N. Reddy, “Kumar SP multi objective task scheduling algorithm for cloud computing using whale optimization technique,” in International Conference on Next Generation Computing Technologies, Springer, Cham, Switzerland, 2017. View at Publisher · View at Google Scholar · View at Scopus
  16. M. K. Naik and R. Panda, “A novel adaptive cuckoo search algorithm for intrinsic discriminant analysis based face recognition,” Applied Soft Computing, vol. 38, pp. 661–675, 2016. View at Publisher · View at Google Scholar · View at Scopus
  17. G.-G. Wang, A. H. Gandomi, X.-S. Yang, and A. H. Alavi, “A new hybrid method based on krill herd and cuckoo search for global optimisation tasks,” International Journal of Bio-Inspired Computation, vol. 8, no. 5, p. 286, 2016. View at Publisher · View at Google Scholar · View at Scopus
  18. M. P. K. Reddy and M. R. Babu, “Implementing self adaptiveness in whale optimization for cluster head section in Internet of Things,” Cluster Computing, vol. 20, pp. 1–12, 2018. View at Publisher · View at Google Scholar
  19. J. Duan, D. Gao, D. Yang, C. H. Foh, and H.-H. Chen, “An energy-aware trust derivation scheme with game theoretic approach in wireless sensor networks for IoT applications,” IEEE Internet of Things Journal, vol. 1, no. 1, pp. 58–69, 2014. View at Publisher · View at Google Scholar · View at Scopus
  20. S. Raza, P. Misra, Z. He, and T. Voigt, “Building the internet of things with bluetooth smart,” Ad Hoc Networks, vol. 57, pp. 19–31, 2017. View at Publisher · View at Google Scholar · View at Scopus
  21. I.-G. Lee and M. Kim, “Interference-awareself-optimizingWi-Fi for high efficiency internet of things in dense networks,” Computer Communications, vol. 89, pp. 60–74, 2016. View at Publisher · View at Google Scholar · View at Scopus
  22. C.-L. Hsu and J. C.-C. Lin, “An empirical examination of consumer adoption of Internet of Things services: network externalities and concern for information privacy perspectives,” Computers in Human Behavior, vol. 62, pp. 516–527, 2016. View at Publisher · View at Google Scholar · View at Scopus
  23. Z. Zheng, N. Saxena, K. Mishra, and A. K. Sangaiah, “Guided dynamic particle swarm optimization for optimizing digital image watermarking in industry applications,” Future Generation Computer Systems, vol. 88, pp. 92–106, 2018. View at Publisher · View at Google Scholar · View at Scopus
  24. Y. Li-Jiang and C. Tian-Lun, “Application of chaos in genetic algorithms,” Communications in Theoretical Physics, vol. 38, no. 2, p. 168, 2002. View at Publisher · View at Google Scholar
  25. D. Karaboga, B. Gorkemli, C. Ozturk, and N. Karaboga, “A comprehensive survey: artificial bee colony (ABC) algorithm and applications,” Artificial Intelligence Review, vol. 42, no. 1, pp. 21–57, 2014. View at Publisher · View at Google Scholar · View at Scopus
  26. P. Ong, “Adaptive cuckoo search algorithm for unconstrained optimization,” Scientific World Journal, vol. 2014, pp. 1–8, 2014. View at Publisher · View at Google Scholar · View at Scopus
  27. G. Kaur and S. Arora, “Chaotic whale optimization algorithm,” Journal of Computational Design and Engineering, vol. 5, no. 3, pp. 275–284, 2018. View at Publisher · View at Google Scholar · View at Scopus
  28. B. Alatas, “Chaotic harmony search algorithms,” Applied Mathematics and Computation, vol. 216, no. 9, pp. 2687–2699, 2010. View at Publisher · View at Google Scholar · View at Scopus
  29. B. Liu, L. Wang, Y.-H. Jin, F. Tang, and D.-X. Huang, “Improved particle swarm optimization combined with chaos,” Chaos, Solitons & Fractals, vol. 25, no. 5, pp. 1261–1271, 2005. View at Publisher · View at Google Scholar · View at Scopus
  30. B. Alatas, “Chaotic bee colony algorithms for global numerical optimization,” Expert Systems with Applications, vol. 37, no. 8, pp. 5682–5687, 2010. View at Publisher · View at Google Scholar · View at Scopus
  31. H.-S. Chiang, A. K. Sangaiah, M.-Y. Chen, and J.-Y. Liu, “A novel artificial bee colony optimization algorithm with SVM for bio-inspiredsoftware-defined networking,” International Journal of Parallel Programming, pp. 1–19, 2018. View at Publisher · View at Google Scholar · View at Scopus
  32. D. Yang, G. Li, and G. Cheng, “On the efficiency of chaos optimization algorithms for global optimization,” Chaos, Solitons & Fractals, vol. 34, no. 4, pp. 1366–1375, 2007. View at Publisher · View at Google Scholar · View at Scopus
  33. G.-G. Wang, L. Guo, A. H. Gandomi, G.-S. Hao, and H. Wang, “Chaotic krill herd algorithm,” Information Sciences, vol. 274, pp. 17–34, 2014. View at Publisher · View at Google Scholar · View at Scopus
  34. S. Arora and S. Singh, “An improved butterfly optimization algorithm with chaos,” Journal of Intelligent & Fuzzy Systems, vol. 32, no. 1, pp. 1079–1088, 2017. View at Publisher · View at Google Scholar · View at Scopus
  35. M. Kohli and S. Arora, “Chaotic grey wolf optimization algorithm for constrained optimization problems,” Journal of Computational Design and Engineering, vol. 5, no. 4, pp. 458–472, 2017.
  36. A. H. Gandomi and X.-S. Yang, “Chaotic bat algorithm,” Journal of Computational Science, vol. 5, no. 2, pp. 224–232, 2014.
  37. A. H. Gandomi, X.-S. Yang, S. Talatahari, and A. H. Alavi, “Firefly algorithm with chaos,” Communications in Nonlinear Science and Numerical Simulation, vol. 18, no. 1, pp. 89–98, 2013.
  38. D. Tancock, S. Pearson, and A. Charlesworth, “A privacy impact assessment tool for cloud computing,” in Privacy and Security for Cloud Computing, Springer, Cham, Switzerland, 2013.
  39. M. Abdel-Basset, L. Abdle-Fatah, and A. K. Sangaiah, “An improved Lévy based whale optimization algorithm for bandwidth-efficient virtual machine placement in cloud computing environment,” Cluster Computing, vol. 21, pp. 1–16, 2018.
  40. M. Abdel-Basset, D. El-Shahat, I. El-henawy, A. K. Sangaiah, and S. H. Ahmed, “A novel whale optimization algorithm for cryptanalysis in merkle-hellman cryptosystem,” Mobile Networks and Applications, vol. 23, no. 4, pp. 723–733, 2018.
  41. M. M. Mafarja and S. Mirjalili, “Hybrid Whale Optimization Algorithm with simulated annealing for feature selection,” Neurocomputing, vol. 260, pp. 302–312, 2017.
  42. I. N. Trivedi, P. Jangir, A. Kumar, N. Jangir, and R. Totlani, “A novel hybrid PSO–WOA algorithm for global numerical functions optimization,” in Advances in Computer and Computational Sciences, Springer, Cham, Switzerland, 2018.
  43. A. N. Jadhav and N. Gomathi, “WGC: hybridization of exponential grey wolf optimizer with whale optimization for data clustering,” Alexandria Engineering Journal, vol. 57, no. 3, pp. 1569–1584, 2017.
  44. S. T. Revathi, N. Ramaraj, and S. Chithra, “Brain storm-based Whale Optimization Algorithm for privacy-protected data publishing in cloud computing,” Cluster Computing, vol. 21, pp. 1–10, 2018.
  45. D. Prakash and C. Lakshminarayana, “Optimal siting of capacitors in radial distribution network using Whale Optimization Algorithm,” Alexandria Engineering Journal, vol. 56, no. 4, pp. 499–509, 2017.
  46. H. J. Touma, “Study of the economic dispatch problem on IEEE 30-bus system using whale optimization algorithm,” International Journal of Engineering, Science and Technology, vol. 5, no. 1, p. 1, 2016.
  47. E. B. Tirkolaee, M. Alinaghian, A. A. R. Hosseinabadi, M. B. Sasi, and A. K. Sangaiah, “An improved ant colony optimization for the multi-trip Capacitated Arc Routing Problem,” Computers & Electrical Engineering, 2018, in press.
  48. A. Kaveh and M. I. Ghazaan, “Enhanced whale optimization algorithm for sizing optimization of skeletal structures,” Mechanics Based Design of Structures and Machines, vol. 45, no. 3, pp. 345–362, 2017.
  49. D. Oliva, M. A. El Aziz, and A. E. Hassanien, “Parameter estimation of photovoltaic cells using an improved chaotic whale optimization algorithm,” Applied Energy, vol. 200, pp. 141–154, 2017.
  50. S. Gupta and K. Saurabh, “Modified artificial killer whale optimization algorithm for maximum power point tracking under partial shading condition,” in Proceedings of the 2017 International Conference on Recent Trends in Electrical, Electronics and Computing Technologies (ICRTEECT), pp. 87–92, IEEE, Warangal, Telangana, India, 2017.
  51. A. Mostafa, A. E. Hassanien, M. Houseni, and H. Hefny, “Liver segmentation in MRI images based on whale optimization algorithm,” Multimedia Tools and Applications, vol. 76, no. 23, pp. 24931–24954, 2017.
  52. M.-F. Horng, T.-K. Dao, and C.-S. Shieh, “A multi-objective optimal vehicle fuel consumption based on whale optimization algorithm,” in Advances in Intelligent Information Hiding and Multimedia Signal Processing, Springer, Cham, Switzerland, 2017.
  53. I. R. Kumawat, S. J. Nanda, and R. K. Maddila, “Positioning LED panel for uniform illuminance in indoor VLC system using whale optimization,” in Optical and Wireless Technologies, Springer, Singapore, 2018.
  54. M. A. El Aziz, A. A. Ewees, A. E. Hassanien, M. Mudhsh, and S. Xiong, “Multi-objective whale optimization algorithm for multilevel thresholding segmentation,” in Advances in Soft Computing and Machine Learning in Image Processing, Springer, Cham, Switzerland, 2018.
  55. M. Abdel-Basset, D. El-Shahat, and A. K. Sangaiah, “A modified nature inspired meta-heuristic whale optimization algorithm for solving 0–1 knapsack problem,” International Journal of Machine Learning and Cybernetics, vol. 10, no. 3, pp. 1–20, 2017.
  56. H. Zhang, H. Wang, N. Li, Y. Yu, Z. Su, and Y. Liu, “Time-optimal memetic whale optimization algorithm for hypersonic vehicle reentry trajectory optimization with no-fly zones,” Neural Computing and Applications, vol. 30, pp. 1–15, 2018.
  57. X.-S. Yang, “A new metaheuristic bat-inspired algorithm,” in Nature Inspired Cooperative Strategies for Optimization (NICSO 2010), pp. 65–74, Springer, Berlin, Heidelberg, Germany, 2010.
  58. I. Albayrak, N. Aşan, and T. Yorulmaz, “The natural history of the Egyptian fruit bat, Rousettus aegyptiacus, in Turkey (Mammalia: chiroptera),” Turkish Journal of Zoology, vol. 32, no. 1, pp. 11–18, 2008.
  59. J. Ji, S. Song, C. Tang, S. Gao, Z. Tang, and Y. Todo, “An artificial bee colony algorithm search guided by scale-free networks,” Information Sciences, vol. 473, pp. 142–165, 2019.
  60. Y. Yu, S. Gao, S. Cheng, Y. Wang, S. Song, and F. Yuan, “CBSO: a memetic brain storm optimization with chaotic local search,” Memetic Computing, vol. 10, no. 4, pp. 353–367, 2018.
  61. Y. Yu, S. Gao, Y. Wang, J. Cheng, and Y. Todo, “ASBSO: an improved brain storm optimization with flexible search length and memory-based selection,” IEEE Access, vol. 6, pp. 36977–36994, 2018.
  62. P. N. Suganthan, N. Hansen, J. J. Liang et al., “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” KanGAL Report, 2005.
  63. F. Bornemann, D. Laurie, S. Wagon, and J. Waldvogel, The SIAM 100-Digit Challenge: A Study in High-Accuracy Numerical Computing, vol. 86, SIAM, New Delhi, India, 2004.