The Scientific World Journal
Volume 2013 (2013), Article ID 370172, 11 pages
http://dx.doi.org/10.1155/2013/370172
Research Article

An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization

Yuksel Celik1 and Erkan Ulker2

1Department of Computer Programming, Karamanoglu Mehmetbey University, Karaman, Turkey
2Computer Engineering Department, Selcuk University, Konya, Turkey

Received 5 May 2013; Accepted 11 June 2013

Academic Editors: P. Agarwal, V. Bhatnagar, and Y. Zhang

Copyright © 2013 Yuksel Celik and Erkan Ulker. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm inspired by the mating and fertilization process of honey bees and is a kind of swarm intelligence optimization. In this study we propose an improved marriage in honey bees optimization (IMBO) by adding the Levy flight algorithm for the queen's mating flight and a single-neighborhood step for improving the worker drones. The IMBO algorithm's performance and its success are tested on six well-known unconstrained test functions and compared with other metaheuristic optimization algorithms.

1. Introduction

Optimization means finding the best among the possible designs of a system. In other words, selecting real or integer values from an identified range, placing them into a real function, and systematically examining or solving the problem in order to minimize or maximize that function is referred to as optimization. For the solution of optimization problems, mathematical and heuristic optimization techniques are used. In problems with wide and large solution spaces, heuristic algorithms produce results close to the optimum within very short durations, without scanning the whole solution space. Metaheuristic algorithms are quite effective in solving global optimization problems [1]. The main metaheuristic algorithms are genetic algorithm (GA) [2], simulated annealing (SA) [3], particle swarm optimization (PSO) [4], ant colony optimization (ACO) [5], differential evolution (DE) [6], marriage in honey bees optimization (MBO) [7, 8], artificial bee colony algorithm (ABC) [9], and evolutionary algorithms (EAs) [9, 10].

Performance of algorithms carrying out nature-inspired or evolutionary calculations can be monitored through their application on test functions. Karaboga and Basturk implemented the artificial bee colony (ABC) algorithm, which they proposed from the inspiration of the food-searching activities of honey bees, on unconstrained test functions [11]. Digalakis and Margaritis developed two algorithms, titled the generational replacement model (GRM) and the steady state replacement model, by making modifications on the genetic algorithm and monitored their performances on unconstrained test functions [12]. By combining the GA and SA algorithms, Hassan et al. proposed the geno-simulated annealing (GSA) algorithm and implemented it on the most commonly used unconstrained test functions [13]. In order to obtain a better performance in the multidimensional search space, Chatterjee et al. suggested a nonlinear variation of the standard PSO and measured its performance on several unconstrained test functions [14]. By integrating the opposition-based learning (OBL) approach for population initialization and generation jumping into the DE algorithm, Rahnamayan et al. proposed the opposition-based DE (ODE) algorithm and compared the results obtained from implementing the algorithm on well-known unconstrained test functions with DE [15]. It is difficult to exhibit a good performance on all test functions. Rather than expecting a developed algorithm to provide accurate results on all types of problems, it is more reasonable to determine the types of problems where the algorithm functions well and decide accordingly which algorithm to use on a specific problem.

Test functions reveal whether an algorithm becomes trapped in local minima and whether it searches the solution space broadly during the solution process.

In 2001, Abbass [8] proposed the MBO algorithm, a swarm-intelligence-based metaheuristic algorithm predicated on the marriage and fertilization of honey bees. Later on, Abbass and Teo used the annealing approach in the MBO algorithm for determining the gene pool of male bees [16]. Chang made modifications on MBO for solving combinatorial problems and applied it to their solution; in addition, for the solution of infinite-horizon discounted cost stochastic dynamic programming problems, he applied MBO by adapting the algorithm he titled "Honey Bees Policy Iteration" (HBPI) [17]. In 2007, Afshar et al. recast the MBO algorithm as the honey bee mating optimization (HBMO) algorithm and implemented it on water resources management applications [18]. Marinakis et al. implemented the HBMO algorithm, obtaining HBMOTSP, in order to solve the Euclidean travelling salesman problem (TSP) [19]. Chiu and Kuo [20] proposed a clustering method which integrates particle swarm optimization with honey bee mating optimization; simulations for three benchmark criteria (MSE, intracluster distance, and intercluster distance) were performed.

In the original MBO algorithm, annealing algorithm is used during the queen bee’s mating flight, mating with drones, generation of new genotype, and adding these into the spermatheca. In the present study, we used Levy flight [1] instead of the annealing algorithm. Also, during the improvement of the genotype of worker bees, we applied single neighborhood and single inheritance from the queen. We tested the IMBO algorithm we developed on the most commonly known six unconstrained numeric test functions, and we compared the results with the PSO and DE [21] algorithms from the literature.

This paper is organized as follows: in Section 2, the MBO algorithm and unconstrained test problems are described in detail. Section 3 presents the proposed unconstrained test problems solution procedure using IMBO. Section 4 compares the empirical studies and unconstrained test results of IMBO, MBO, and other optimization algorithms. Section 5 is the conclusion of the paper.

2. Material and Method

2.1. The Marriage in Honey Bee Optimization (MBO) Algorithm
2.1.1. Honey Bee Colony

Bees take the first place among the insects that can be characterized as a swarm and that possess swarm intelligence. A typical bee colony is composed of three types of bees: the queen, drones (male bees), and workers (female workers). The queen lives for a few years and is the mother of the colony. She is the only bee capable of laying eggs.

Drones are produced from unfertilized eggs and are the fathers of the colony. Their numbers are around a few hundred. Worker bees are produced from fertilized eggs, and all procedures such as feeding the colony and the queen, maintaining broods, building combs, and searching for and collecting food are carried out by these bees. Their numbers are around 10–60 thousand [22].

The mating flight happens only once during the life of the queen bee. Mating starts with the dance of the queen. Drones follow and mate with the queen during the flight. Mating of a drone with the queen depends on the queen's speed and the drone's fitness. Sperms of the drone are stored in the spermatheca of the queen. The gene pool of future generations is created here. The queen lays approximately two thousand fertilized eggs a day (two hundred thousand a year). After her spermatheca is discharged, she lays unfertilized eggs [23].

2.1.2. Honey Bee Optimization Algorithm

The mating flight can be explained as the queen's acceptance of some of the drones she meets in a solution space, mating, and the improvement of the broods generated from these matings. The queen has a certain amount of energy at the start of the flight and turns back to the nest when her energy falls to a minimum or when her spermatheca is full. After going back to the nest, broods are generated and are improved by the worker bees through crossover and mutation.

Mating of the drone with the queen bee takes place according to the probability of the following annealing function [8]:

prob(Q, D) = exp(−Δ(f) / S(t)),  (1)

where prob(Q, D) is the probability of the drone D being added to the spermatheca of the queen Q (the probability that the drone and queen mate), Δ(f) is the absolute difference between D's fitness and Q's fitness, and S(t) is the speed of the queen at time t. This acts as the annealing function: in cases where at first the queen's speed is high, or the fitness of the drone is as good as the queen's fitness, the mating probability is high. The time-dependent speed and energy of the queen in each pass within the search space are formulated as follows:

S(t + 1) = α · S(t),   E(t + 1) = E(t) − γ,  (2)

where α ∈ (0, 1] is the decay factor of the speed and γ is the amount of energy reduction in each pass. On the basis of (1) and (2), the original MBO algorithm was proposed by Abbass [8] as shown in Algorithm 1.
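As a concrete illustration of the annealing-style mating rule of (1) and the speed/energy decay of (2), the following is a minimal Python sketch. The function names and the parameter values (alpha, gamma) are illustrative assumptions, not the authors' code.

```python
import math

# Sketch of the MBO mating rule, eq. (1): a drone joins the spermatheca
# with probability exp(-|f(Q) - f(D)| / S(t)).
def mating_probability(queen_fitness, drone_fitness, speed):
    delta_f = abs(queen_fitness - drone_fitness)
    return math.exp(-delta_f / speed)

# Sketch of eq. (2): after each pass the queen's speed decays by a factor
# alpha in (0, 1] and her energy drops by a fixed amount gamma.
def update_speed_energy(speed, energy, alpha=0.98, gamma=0.05):
    return alpha * speed, energy - gamma

# Example: a drone whose fitness is close to the queen's mates with high
# probability while the queen's speed is still high.
p = mating_probability(queen_fitness=1.0, drone_fitness=1.2, speed=1.0)
speed, energy = update_speed_energy(speed=1.0, energy=1.0)
print(round(p, 3), round(speed, 3), round(energy, 3))
```

Note how the rule mirrors simulated annealing: as S(t) decays, only drones with fitness very close to the queen's retain a high mating probability.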

alg1
Algorithm 1: Original MBO algorithm [8].

2.2. Unconstrained Numerical Benchmark Functions

Performance of evolutionary computation algorithms can be monitored by implementing the algorithm on test functions. A well-defined problem set is useful for measuring the performance of optimization algorithms. By their structure, test functions are divided into two groups: constrained and unconstrained test functions. Unconstrained test functions can be classified as unimodal and multimodal. While unimodal functions have a single optimum within the search space, multimodal functions have more than one optimum. If the function is predicated on a continuous mathematical objective function within the defined search space, then it is a continuous benchmark function. However, if the bit strings are not defined and continuous, then the function is described as a discrete benchmark function [24]. Alcayde et al. [25] proposed a novel extension of the well-known Pareto archived evolution strategy (PAES) which combines simulated annealing and tabu search; applying it to several mathematical problems showed that this hybridization improves the quality of the nondominated solutions in comparison with PAES. We solved six well-known unconstrained single-objective numeric benchmark functions. The details of the benchmark functions are given in Table 1.

tab1
Table 1: Unconstrained test functions.
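The six functions of Table 1 can be sketched in their common textbook forms, as below. This is a hedged illustration: the bounds, dimensions, and constants used in the paper's Table 1 may differ from these standard definitions.

```python
import math

def sphere(x):
    """Unimodal; global minimum 0 at the origin."""
    return sum(xi**2 for xi in x)

def rosenbrock(x):
    """Narrow curved valley; global minimum 0 at (1, ..., 1)."""
    return sum(100*(x[i+1] - x[i]**2)**2 + (x[i] - 1)**2
               for i in range(len(x) - 1))

def rastrigin(x):
    """Highly multimodal; global minimum 0 at the origin."""
    return sum(xi**2 - 10*math.cos(2*math.pi*xi) + 10 for xi in x)

def griewank(x):
    """Multimodal with product term; global minimum 0 at the origin."""
    s = sum(xi**2 for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return s - p + 1

def schwefel(x):
    """Deceptive; global minimum near x_i = 420.9687 (value close to 0)."""
    return 418.9829*len(x) - sum(xi*math.sin(math.sqrt(abs(xi))) for xi in x)

def ackley(x):
    """Nearly flat outer region; global minimum 0 at the origin."""
    n = len(x)
    s1 = sum(xi**2 for xi in x) / n
    s2 = sum(math.cos(2*math.pi*xi) for xi in x) / n
    return -20*math.exp(-0.2*math.sqrt(s1)) - math.exp(s2) + 20 + math.e

print(sphere([0.0, 0.0]), rosenbrock([1.0, 1.0]))
```

All six except Schwefel attain their global minimum of 0 at the origin, which is why convergence plots such as Figure 1 track the distance of the best value from zero.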

3. IMBO for Unconstrained Test Functions

In the original MBO, the mating probability of the queen bee in the mating flight is calculated through the annealing function. In the proposed study, the improved MBO (IMBO) algorithm was obtained by replacing the annealing algorithm with the Levy flight algorithm in order to enable the queen to make a better search in the search space. The flight behaviors of many animals and insects exhibit the typical characteristics of Levy flight [26]. In addition, there are many studies to which Levy flight was successfully adapted. Pavlyukevich solved a problem of nonconvex stochastic optimization with the help of simulated annealing of Levy flights of a variable stability index [27]. In biological phenomena, Viswanathan et al. used Levy flight in the search of biological organisms for target organisms [28]. Reynolds conducted a study integrating the Levy flight algorithm with the honey bees' food-searching strategies [29]. Tran et al. proposed Levy flight optimization (LFO) for global optimization problems, implemented it on test functions, and compared the results they obtained with simulated annealing (SA) [30]. By adapting the Levy flight algorithm instead of the Gaussian random walk in the group search optimizer (GSO) algorithm developed for artificial neural networks (ANN), He applied the algorithm on a set of five optimization benchmark functions [31].

In general terms, Levy flight is a random walk whose steps are drawn from a Levy distribution [1]. Levy flight is implemented in two steps: the first is a random selection of direction, and the second is the selection of a step length following the Levy distribution. While the direction is selected from a uniform distribution, step selection is a harder process. Although there are several methods for step selection, the most effective and simplest one is the Mantegna algorithm.

The Mantegna algorithm computes the step length s as shown in the following equation:

s = u / |v|^(1/β),  (3)

where u and v are drawn from normal distributions. Here, the number of step components is obtained by taking the magnitude (size) of the genotype as basis.

σ_u, on the other hand, is calculated as shown in the following equation, in which β = 1.5 and Γ is the Gamma function:

σ_u = { [Γ(1 + β) · sin(πβ/2)] / [Γ((1 + β)/2) · β · 2^((β−1)/2)] }^(1/β),   σ_v = 1,  (4)

so that u ~ N(0, σ_u²) and v ~ N(0, σ_v²).

In consequence, the direction of the next step is determined with the u and v parameters, and the step length s is found by substituting u and v into the Mantegna equation (3). Based on s, a new step vector of the same size as the genotype is generated and added to the previous position:

x_new = x_old + s.  (5)

Creation of the new genotype in this step is completed by subjecting the new solution set (the genotype) to the maximum and minimum bounds defined for the test problem and correcting any deviations. Accordingly, through these equations, the queen bee moves from her previous position to the next, in the direction obtained from the Levy distribution and by the step length obtained from the Mantegna algorithm:

x_queen(t + 1) = x_queen(t) + s.  (6)
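The Mantegna step computation and the resulting Levy move can be sketched in Python as follows. This is a sketch under common assumptions (β = 1.5, a small scaling factor on the step); the helper names and the 0.01 scale are illustrative, not from the paper.

```python
import math
import random

def mantegna_sigma_u(beta=1.5):
    """Standard deviation of u in the Mantegna algorithm, eq. (4)."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2**((beta - 1) / 2)
    return (num / den)**(1 / beta)

def levy_step(dim, beta=1.5):
    """One Levy step per genotype component: s = u / |v|^(1/beta), eq. (3)."""
    sigma_u = mantegna_sigma_u(beta)
    return [random.gauss(0, sigma_u) / abs(random.gauss(0, 1))**(1 / beta)
            for _ in range(dim)]

def levy_move(position, step_scale=0.01, beta=1.5):
    """Move from the previous position by a scaled Levy step, as in (5)-(6)."""
    step = levy_step(len(position), beta)
    return [x + step_scale * s for x, s in zip(position, step)]

random.seed(0)
queen = [0.5, -0.3, 0.1]
print(levy_move(queen))
```

Because |v| occasionally comes out very small, the step distribution is heavy-tailed: most moves are short local refinements, but rare long jumps let the queen escape local minima, which is the point of replacing the annealing walk.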

In the crossover operator, the genotype of the queen bee and all genotypes in the current population are crossed over. Crossover is carried out by randomly choosing the number of elements subjected to crossover, within the Hamming distance between the genotypes to be crossed over.
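One way to read this crossover is: count the positions where the queen's genotype differs from a population member (their Hamming distance), pick a random number of those positions, and copy the queen's genes there. The sketch below implements that reading; the function name and the exact exchange rule are assumptions, since the paper does not spell them out.

```python
import random

def hamming_crossover(queen, member, rng=random):
    # Positions where the two genotypes differ (their Hamming distance).
    differing = [i for i in range(len(queen)) if queen[i] != member[i]]
    # Randomly choose how many of those positions to cross over.
    k = rng.randint(0, len(differing)) if differing else 0
    child = list(member)
    for i in rng.sample(differing, k):
        child[i] = queen[i]   # inherit the queen's gene at position i
    return child

rng = random.Random(42)
print(hamming_crossover([1, 2, 3, 4], [1, 9, 9, 9], rng))
```

Bounding the number of exchanged elements by the Hamming distance guarantees the child never receives more queen genes than the two parents actually disagree on.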

In the improvement of the genotypes (broods) by the worker bees, a single neighborhood and single inheritance from the queen were used:

B(j) = B(j) + r · (Q(j) − B(j)),  (7)

where r is a random value in (0, 1), B is the current brood genotype, Q is the queen genotype, and j is a randomly selected position in the genotype. With these modifications, the developed IMBO algorithm was observed to exhibit better performance than the other metaheuristic optimization algorithms.
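The worker-bee improvement step above can be sketched as follows: a single randomly chosen gene of the brood is pulled toward the corresponding gene of the queen, and the change is kept only if fitness improves. The greedy accept-if-better rule is a common convention and an assumption here, as are the function names.

```python
import random

def improve_brood(brood, queen, fitness, rng=random):
    j = rng.randrange(len(brood))     # single randomly chosen position
    r = rng.random()                  # random value in (0, 1)
    candidate = list(brood)
    # Pull gene j toward the queen's gene: B(j) = B(j) + r * (Q(j) - B(j)).
    candidate[j] = brood[j] + r * (queen[j] - brood[j])
    # Greedy acceptance: keep the change only if it improves fitness.
    return candidate if fitness(candidate) < fitness(brood) else brood

sphere = lambda x: sum(xi * xi for xi in x)
rng = random.Random(1)
brood = [2.0, -2.0]
for _ in range(100):
    brood = improve_brood(brood, [0.0, 0.0], sphere, rng)
print(sphere(brood))
```

Because only one gene changes per improvement, the step is a cheap local search around the brood, while the inheritance term keeps it biased toward the best-known solution (the queen).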

The MBO algorithm we modified is shown in Algorithm 2.

alg2
Algorithm 2: Proposed IMBO algorithm.
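To show how the pieces fit together, the following is a compact, self-contained Python sketch of the overall IMBO flow of Algorithm 2 under simplifying assumptions: a single queen, greedy acceptance, a scaled Levy mating move (eqs. (3)-(6)), and the single-gene worker improvement of eq. (7). Population size, bounds, step scale, and generation counts are illustrative, not the paper's settings.

```python
import math
import random

def sphere(x):
    return sum(xi * xi for xi in x)

def levy_step(dim, beta=1.5, rng=random):
    # Mantegna algorithm, eqs. (3)-(4).
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2**((beta - 1) / 2)
    sigma = (num / den)**(1 / beta)
    return [rng.gauss(0, sigma) / abs(rng.gauss(0, 1))**(1 / beta)
            for _ in range(dim)]

def imbo(fitness, dim=5, pop_size=20, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    queen = min(pop, key=fitness)
    for _ in range(generations):
        # Mating flight: the queen explores by a scaled Levy move, eqs. (5)-(6).
        trial = [q + 0.01 * s for q, s in zip(queen, levy_step(dim, rng=rng))]
        if fitness(trial) < fitness(queen):
            queen = trial
        # Worker improvement of each brood, eq. (7): one gene toward the queen.
        for i, brood in enumerate(pop):
            j = rng.randrange(dim)
            cand = list(brood)
            cand[j] = brood[j] + rng.random() * (queen[j] - brood[j])
            if fitness(cand) < fitness(brood):
                pop[i] = cand
        # Requeening: the best brood replaces the queen if it is better.
        best = min(pop, key=fitness)
        if fitness(best) < fitness(queen):
            queen = best
    return queen

best = imbo(sphere)
print(sphere(best))
```

The queen is only ever replaced by a strictly better solution, so the best fitness is monotonically nonincreasing over generations, matching the convergence behavior reported in Figure 1.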

4. Experimental Results

In this study, we used the Sphere, Rosenbrock, Rastrigin, Griewank, Schwefel, and Ackley unconstrained test problems; a program was developed in MATLAB 2009 for the tests of MBO and IMBO. Genotype sizes of 10, 50, 100, 300, 500, 800, and 1000 were taken for each test. The population size (spermatheca size) was fixed to the same value in all tests. At the end of each generation, mean values and standard deviations were calculated for the test functions. Each genotype was developed for 10,000 generations. Each experiment was run 30 times. Global minimum convergence graphs of each test function for IMBO are presented in Figure 1.

fig1
Figure 1: Mean global minimum convergence graphs of benchmark functions in 1000 generations for genotype sizes of 10, 50, 100, 300, 500, 800 and 1000.

Examining the six test functions presented in Figure 1 shows that the initial global optimum values were far from the solution value in direct proportion to genotype size. Accordingly, for large genotypes, that is, in cases where the number of input parameters is high, convergence of the proposed algorithm to the optimum solution takes longer. The test results for MBO and IMBO are given in Tables 2 and 3.

tab2
Table 2: Test results of the MBO algorithm for the genotype sizes of 10, 50, 100, 300, 500, 800, and 1000; number of runs = 30; SD: standard deviation, AV: global minimum average; generation = 10000.
tab3
Table 3: Test results of the IMBO algorithm for the genotype sizes of 10, 50, 100, 300, 500, 800, and 1000; number of runs = 30; SD: standard deviation, AV: global minimum average; generation = 10000.

When Tables 2 and 3 are examined, it is seen that, as genotype size increases, the global minimum values obtained by both the MBO and IMBO algorithms move away from the optimal values on all functions. In Table 2, the MBO algorithm reached the optimum value of the Sphere function for genotype sizes 10, 50, and 100, while for the other genotype sizes it converged toward the optimal solution. For the Rosenbrock function, the optimal minimum was reached at all genotype sizes. The optimal solution was reached for the Rastrigin function at genotype sizes 10, 50, and 100, and for the Griewank function at sizes 50 and 100. For all other functions and genotype sizes, values close to the optimal solution were reached, with the exception of the Schwefel function.

When Table 3 is examined, it is seen that, as genotype size increases, the IMBO algorithm moves away from the optimal minimum on the Sphere, Rastrigin, Griewank, Schwefel, and Ackley functions. The Sphere function reached its optimum value at genotype sizes 10, 50, and 100, while the other genotype sizes converged toward the optimal solution. For the Rosenbrock function, the optimal minimum was reached at all genotype sizes except 10. The Rastrigin function at sizes 10 and 50 and the Griewank function at sizes 50 and 100 were observed to reach the optimal solution. The other functions and genotype sizes were observed to reach values close to the optimal solution.

Comparative results of the best and mean solutions of the MBO and IMBO algorithms are presented in Table 4.

tab4
Table 4: Success rates of MBO when compared with those of the IMBO algorithm. + indicates that the algorithm is better while − indicates that it is worse than the other. If both algorithms show similar performance, they are both +.

According to Table 4, when the MBO and IMBO algorithms are compared by genotype size, IMBO exhibited better performance than MBO at all genotype sizes. Counting the better-or-equal cases marked with "+", the MBO algorithm received a total of 19 "+" marks and the IMBO algorithm a total of 36. Accordingly, IMBO demonstrates better performance than the MBO algorithm.

CPU time results for all genotype sizes are given in Table 5 for MBO and in Table 6 for IMBO.

tab5
Table 5: CPU time results of the MBO algorithm for the genotype sizes of 10, 50, 100, 300, 500, 800, and 1000; number of runs = 1; iteration = 10000.
tab6
Table 6: CPU time results of the IMBO algorithm for the genotype sizes of 10, 50, 100, 300, 500, 800, and 1000; number of runs = 1; iteration = 10000.

Analysis of the CPU time values in Tables 5 and 6 shows that solving time grows in proportion to the genotype size of the problem. Comparing the CPU times of the two algorithms shows that the IMBO algorithm solves the problems in less CPU time than the MBO algorithm.

For the 10, 50, 100, and 1000 problem sizes of the six unconstrained numeric benchmark functions, comparisons were made between the test results of the IMBO algorithm and the algorithms in the literature, including DE, PSO, ABC [32], bee swarm optimization (BSO) [33], bee and foraging algorithm (BFA) [34], teaching-learning-based optimization (TLBO) [35], bumble bees mating optimization (BBMO) [36], and honey bees mating optimization (HBMO) [36]. Table 7 presents the comparison between the experimental test results obtained for the 10-sized genotype (problem) on the unconstrained test functions of the IMBO algorithm and the results for the same problem size in the literature, including the PSO, DE, ABC, BFA, and BSO optimization algorithms, while the comparison of the success of each algorithm and the IMBO algorithm is given in Table 8. Table 9 shows the comparison between the experimental test results obtained for the 50-sized genotype on the unconstrained test functions of the IMBO algorithm and the results for the same problem size in the literature, including the TLBO, HBMO, and BBMO optimization algorithms, while the comparison of the success of each algorithm and the IMBO algorithm is given in Table 10. Table 11 shows the comparison between the experimental test results for the 100-sized genotype on the unconstrained test functions of the IMBO algorithm and the results for the same problem size in the literature, including the PSO, DE, and ABC optimization algorithms, while the comparison of the success of each algorithm and the IMBO algorithm is given in Table 12. Table 13 shows the comparison between the experimental test results obtained for the 1000-sized genotype on the unconstrained test functions of the IMBO algorithm and the results for the same problem size in the literature, including the PSO, DE, and ABC optimization algorithms, while the comparison of the success of each algorithm and the IMBO algorithm is given in Table 14.

tab7
Table 7: The mean solutions obtained by the PSO, DE, ABC, BFA, BSO, and IMBO algorithms for 6 test functions over 30 independent runs and total success numbers of algorithms. Genotype size: 10; (—): not available value, SD: standard deviation, AV: global minimum average.
tab8
Table 8: Comparative results of IMBO with PSO, DE, ABC, BFA, and BSO algorithms over 30 independent runs for genotype size 50. + indicates that the algorithm is better while − indicates that it is worse than the other, (—): not available value. If both algorithms show similar performance, they are both +.
tab9
Table 9: The mean solutions obtained by the TLBO, HBMO, BBMO, and IMBO algorithms for 6 test functions over 30 independent runs and total success numbers of algorithms. Genotype size: 50; (—): not available value, SD: standard deviation, AV: global minimum average.
tab10
Table 10: Comparative results of IMBO with TLBO, HBMO, and BBMO algorithms over 30 independent runs for genotype size 50. + indicates that the algorithm is better while − indicates that it is worse than the other, (—): not available value. If both algorithms show similar performance, they are both +.
tab11
Table 11: The mean solutions obtained by the DE, PSO, ABC, and IMBO algorithms for 6 test functions over 30 independent runs and total success numbers of algorithms. Genotype size: 100; (—): not available value, SD: standard deviation, AV: global minimum average.
tab12
Table 12: Comparative results of IMBO with DE, PSO, and ABC algorithms over 30 independent runs for genotype size 100. + indicates that the algorithm is better while − indicates that it is worse than the other. If both algorithms show similar performance, they are both +.
tab13
Table 13: The mean solutions obtained by the DE, PSO, ABC, and IMBO algorithms for 6 test functions over 30 independent runs and total success numbers of algorithms. Genotype size: 1000; (—): not available value, SD: standard deviation, AV: global minimum average.
tab14
Table 14: Comparative results of IMBO with DE, PSO, and ABC algorithms over 30 independent runs for genotype size 1000. + indicates that the algorithm is better while − indicates that it is worse than the other. If both algorithms show similar performance, they are both +.

Tables 7, 11, and 13 demonstrate that, as the problem size increases in ABC, DE, and PSO, the solution becomes more distant and difficult to reach. However, the results obtained with IMBO showed that, despite the increasing problem size, the optimum value could be obtained or converged to very closely. There are big differences among the results obtained for the 10, 100, and 1000 genotype sizes in DE and PSO; this difference is smaller in the IMBO algorithm, which indicates that IMBO performs better even at large problem sizes. In Tables 7 and 8, it is seen that IMBO performs equally to DE and ABC and better than PSO, BFA, and BSO. In Tables 9 and 10, showing the comparison of IMBO with TLBO, HBMO, and BBMO for genotype (problem) size 50, it is seen that IMBO performs better than all the other algorithms. In Tables 11 and 12, showing the comparison of IMBO with the DE, PSO, and ABC algorithms on problem size 100, it is seen that IMBO performs equally to ABC and better than DE and PSO. In Tables 13 and 14, showing the comparison of IMBO with DE, PSO, and ABC on problem size 1000, IMBO is seen to perform better than all the other algorithms.

5. Conclusion

In the proposed study, we developed a new IMBO by replacing annealing algorithm in the queen bee’s mating flight with the Levy flight algorithm and using single inheritance and single neighborhood in the genotype improvement stage. We tested the MBO algorithm we improved on the most commonly known six unconstrained numeric benchmark functions. We compared the results obtained with the results of other metaheuristic optimization algorithms in the literature for the same test functions.

In order to observe the improvement of IMBO, the experimental test results of MBO and IMBO were compared for the 10, 50, 100, 300, 500, 800, and 1000 problem sizes. Consequently, the IMBO algorithm was concluded to perform better than the MBO algorithm. Furthermore, in terms of the CPU time of the problem-solving process, the IMBO algorithm works in shorter CPU times. The test results obtained with IMBO were compared with the results of DE, ABC, PSO, BSO, BFA, TLBO, BBMO, and HBMO in the literature.

Accordingly, IMBO is observed to perform equally to or slightly better than the other algorithms at small genotype sizes, while it performs much better than the other algorithms at large genotype sizes. A total of 14 comparisons were made between the experimental results of IMBO and those of other optimization algorithms in the literature; IMBO showed better performance in 11 comparisons and equal performance in 3.

In future studies, different improvements can be made on the MBO algorithm and tests can be made on different test functions. Also, comparisons can be made with other metaheuristic optimization algorithms not used in the present study.

References

  1. X.-S. Yang, "Levy flight," in Nature-Inspired Metaheuristic Algorithms, pp. 14–17, Luniver Press, 2nd edition, 2010.
  2. D. Bunnag and M. Sun, "Genetic algorithm for constrained global optimization in continuous variables," Applied Mathematics and Computation, vol. 171, no. 1, pp. 604–636, 2005.
  3. H. E. Romeijn and R. L. Smith, "Simulated annealing for constrained global optimization," Journal of Global Optimization, vol. 5, no. 2, pp. 101–126, 1994.
  4. J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
  5. J. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, 1975.
  6. R. Storn and K. Price, "Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, 1997.
  7. O. B. Haddad, A. Afshar, and M. A. Mariño, "Honey-bees mating optimization (HBMO) algorithm: a new heuristic approach for water resources optimization," Water Resources Management, vol. 20, no. 5, pp. 661–680, 2006.
  8. H. A. Abbass, "MBO: marriage in honey bees optimization, a haplometrosis polygynous swarming approach," in Proceedings of the Congress on Evolutionary Computation (CEC '01), pp. 207–214, May 2001.
  9. T. Bäck, Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms, Oxford University Press, New York, NY, USA, 1996.
  10. C. Coello Coello and G. B. Lamont, Evolutionary Algorithms for Solving Multi-Objective Problems, Genetic Algorithms and Evolutionary Computation, Kluwer Academic Publishers, Boston, Mass, USA, 2nd edition, 2007.
  11. D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing Journal, vol. 8, no. 1, pp. 687–697, 2008.
  12. J. G. Digalakis and K. G. Margaritis, "On benchmarking functions for genetic algorithms," International Journal of Computer Mathematics, vol. 77, no. 4, pp. 481–506, 2001.
  13. M. M. Hassan, F. Karray, M. S. Kamel, and A. Ahmadi, "An integral approach for Geno-Simulated Annealing," in Proceedings of the 10th International Conference on Hybrid Intelligent Systems (HIS '10), pp. 165–170, August 2010.
  14. A. Chatterjee and P. Siarry, "Nonlinear inertia weight variation for dynamic adaptation in particle swarm optimization," Computers and Operations Research, vol. 33, no. 3, pp. 859–871, 2006.
  15. R. S. Rahnamayan, H. R. Tizhoosh, and M. M. A. Salama, "Opposition-based differential evolution," IEEE Transactions on Evolutionary Computation, vol. 12, no. 1, pp. 64–79, 2008.
  16. H. A. Abbass and J. Teo, "A true annealing approach to the marriage in honey-bees optimization algorithm," International Journal of Computational Intelligence and Applications, vol. 3, pp. 199–208, 2003.
  17. H. S. Chang, "Converging marriage in honey-bees optimization and application to stochastic dynamic programming," Journal of Global Optimization, vol. 35, no. 3, pp. 423–441, 2006.
  18. A. Afshar, O. Bozorg Haddad, M. A. Mariño, and B. J. Adams, "Honey-bee mating optimization (HBMO) algorithm for optimal reservoir operation," Journal of the Franklin Institute, vol. 344, no. 5, pp. 452–462, 2007.
  19. Y. Marinakis, M. Marinaki, and G. Dounias, "Honey bees mating optimization algorithm for the Euclidean traveling salesman problem," Information Sciences, vol. 181, no. 20, pp. 4684–4698, 2011.
  20. C.-Y. Chiu and T. Kuo, "Applying honey-bee mating optimization and particle swarm optimization for clustering problems," Journal of the Chinese Institute of Industrial Engineers, vol. 26, no. 5, pp. 426–431, 2009.
  21. D. Karaboga and B. Basturk, "A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm," Journal of Global Optimization, vol. 39, no. 3, pp. 459–471, 2007.
  22. R. F. A. Moritz and C. Brandes, "Behavior genetics of honeybees (Apis mellifera L.)," in Neurobiology and Behavior of Honeybees, pp. 21–35, Springer, Berlin, Germany, 1987.
  23. R. F. A. Moritz and E. E. Southwick, Bees as Super-Organisms, Springer, Berlin, Germany, 1992.
  24. B. Bilgin, E. Özcan, and E. E. Korkmaz, "An experimental study on hyper-heuristics and exam scheduling," in Practice and Theory of Automated Timetabling VI, vol. 3867 of Lecture Notes in Computer Science, pp. 394–412, Springer, 2007.
  25. A. Alcayde, R. Baños, C. Gil, F. G. Montoya, J. Moreno-Garcia, and J. Gómez, "Annealing-tabu PAES: a multi-objective hybrid meta-heuristic," Optimization, vol. 60, no. 12, pp. 1473–1491, 2011.
  26. X.-S. Yang and S. Deb, "Multiobjective cuckoo search for design optimization," Computers and Operations Research, vol. 40, no. 6, pp. 1616–1624, 2013.
  27. I. Pavlyukevich, "Lévy flights, non-local search and simulated annealing," Journal of Computational Physics, vol. 226, no. 2, pp. 1830–1844, 2007.
  28. G. M. Viswanathan, F. Bartumeus, and S. V. Buldyrev, "Lévy flight random searches in biological phenomena," Physica A, vol. 314, pp. 208–213, 2002.
  29. A. M. Reynolds, "Cooperative random Lévy flight searches and the flight patterns of honeybees," Physics Letters A, vol. 354, no. 5-6, pp. 384–388, 2006.
  30. T. T. Tran, T. T. Nguyen, and H. L. Nguyen, "Global optimization using Levy flight," in Proceedings of the 3rd National Symposium on Research, Development and Application of Information and Communication Technology (ICT.rda '06), Hanoi, Vietnam, September 2004.
  31. S. He, "Training artificial neural networks using Lévy group search optimizer," Journal of Multiple-Valued Logic and Soft Computing, vol. 16, no. 6, pp. 527–545, 2010.
  32. B. Akay, Nümerik Optimizasyon Problemlerinde Yapay Arı Kolonisi (Artificial Bee Colony, ABC) Algoritmasının Performans Analizi [Performance Analysis of the Artificial Bee Colony (ABC) Algorithm on Numerical Optimization Problems], Kayseri Üniversitesi, Fen Bilimleri Enstitüsü, Kayseri, Turkey, 2009.
  33. R. Akbari, A. Mohammadi, and K. Ziarati, "A novel bee swarm optimization algorithm for numerical function optimization," Communications in Nonlinear Science and Numerical Simulation, vol. 15, no. 10, pp. 3142–3155, 2010.
  34. K. Sundareswaran and V. T. Sreedevi, "Development of novel optimization procedure based on honey bee foraging behavior," in Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (SMC '08), pp. 1220–1225, October 2008.
  35. R. Venkata Rao and V. Patel, "An improved teaching-learning-based optimization algorithm for solving unconstrained optimization problems," Scientia Iranica. In press.
  36. Y. Marinakis, M. Marinaki, and N. Matsatsinis, "A bumble bees mating optimization algorithm for global unconstrained optimization problems," Studies in Computational Intelligence, vol. 284, pp. 305–318, 2010.