Mathematical Problems in Engineering
Volume 2016 (2016), Article ID 1423930, 22 pages
http://dx.doi.org/10.1155/2016/1423930
Research Article

Lévy-Flight Moth-Flame Algorithm for Function Optimization and Engineering Design Problems

1College of Information Science and Engineering, Guangxi University for Nationalities, Nanning 530006, China
2Key Laboratory of Guangxi High Schools Complex System and Computational Intelligence, Nanning 530006, China

Received 18 April 2016; Accepted 12 July 2016

Academic Editor: Jose J. Muñoz

Copyright © 2016 Zhiming Li et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The moth-flame optimization (MFO) algorithm is a novel nature-inspired heuristic paradigm. Its main inspiration is the navigation method of moths in nature called transverse orientation: moths fly at night by maintaining a fixed angle with respect to the moon, a very effective mechanism for travelling in a straight line over long distances. However, these insects become trapped in a spiral path around artificial lights. To address the slow convergence and low precision of the MFO algorithm, an improved version based on a Lévy-flight strategy, named LMFO, is proposed. Lévy-flight increases the diversity of the population against premature convergence and helps the algorithm escape local optima more effectively. This approach yields a better trade-off between the exploration and exploitation abilities of MFO, making LMFO faster and more robust than MFO. LMFO is compared with ABC, BA, GGSA, DA, PSOGSA, and MFO on 19 unconstrained benchmark functions and 2 constrained engineering design problems. The results demonstrate the superior performance of LMFO.

1. Introduction

Optimization is the process of finding the best possible solution(s) to a given problem. In the real world, many problems can be viewed as optimization problems. As the complexity of problems increases, the need for new optimization techniques becomes more evident than before. Over the past several decades, many methods have been proposed to solve optimization problems, and great progress has been made. Before heuristic optimization techniques were proposed, mathematical optimization techniques were the only tool for optimizing problems; however, these methods require knowledge of properties of the optimization problem, such as continuity or differentiability. In recent years, metaheuristic optimization algorithms have become more and more popular. Some popular algorithms in this field are Genetic Algorithms (GA) [1, 2], Particle Swarm Optimization (PSO) [3], Ant Colony Optimization (ACO) [4], Evolutionary Strategy (ES) [5], Differential Evolution (DE) [6], and Evolutionary Programming (EP) [7]. Applications of these algorithms can be found in different branches of science and industry as well. Despite the merits of these optimizers, a fundamental question is whether any single optimizer can solve all optimization problems. According to the No-Free-Lunch (NFL) theorem [8] for optimization, no such optimizer exists, which motivates researchers to develop new algorithms that solve particular optimization problems more effectively. Some of the latest algorithms are the Artificial Bee Colony (ABC) algorithm [9], Bat Algorithm (BA) [10], Cuckoo Search (CS) algorithm [11], Cuckoo Optimization Algorithm (COA) [12], Gravitational Search Algorithm (GSA) [13], Charged System Search (CSS) [14], Firefly Algorithm (FA) [15], Ray Optimization (RO) [16], and Dragonfly Algorithm (DA) [17].

Moth-flame optimization (MFO) [18] is a new metaheuristic optimization method that imitates the navigation method of moths in nature called transverse orientation. In this algorithm, moths and flames are both solutions. The inventor of this algorithm, Seyedali Mirjalili, showed that it delivers very competitive results compared with other state-of-the-art metaheuristic optimization algorithms. However, the MFO algorithm is still at the research stage, and its convergence speed and calculation accuracy can be further improved. To improve the performance of MFO, a Lévy-flight moth-flame optimization (LMFO) algorithm is proposed.

Lévy-flight [11, 19] strengthens global search and overcomes the problem of being trapped in local minima. In order to exploit this good behavior, we propose a Lévy-flight moth-flame optimization. MFO and Lévy-flight have complementary advantages, so the proposed algorithm can lead to a faster and more robust method. The proposed algorithm is verified on nineteen benchmark functions and two engineering problems.

The rest of the paper is organized as follows: Section 2 presents a brief introduction to MFO and Lévy-flight. An improved version of the MFO algorithm, LMFO, is proposed in Section 3. The experimental results on test functions and engineering design problems are shown in Sections 4 and 5, respectively. Results and discussion are provided in Section 6. Finally, Section 7 concludes the work.

2. Related Works

In this section, a background about the moth-flame optimization algorithm and Lévy-flight will be provided briefly.

2.1. MFO Algorithm

The moth-flame optimization [18] algorithm is a new metaheuristic optimization method, proposed by Seyedali Mirjalili, based on a simulation of the special navigation behavior of moths at night. Moths utilize a mechanism called transverse orientation for navigation. In this method, a moth flies by maintaining a fixed angle with respect to the moon, which is a very effective mechanism for travelling a long distance in a straight path because the moon is far away from the moth. This mechanism guarantees that moths fly in a straight line at night. However, we usually observe that moths fly spirally around lights; in fact, moths are tricked by artificial lights into showing this behavior. Because such a light is extremely close compared with the moon, maintaining a similar angle to the light source causes a spiral fly path.

In the MFO algorithm, the set of moths is represented in a matrix M. For all the moths, there is an array OM for storing the corresponding fitness values. The second key components in the algorithm are flames: a matrix F similar to the moth matrix is considered, and for the flames it is also assumed that there is an array OF for storing the corresponding fitness values.

The MFO algorithm is a three-tuple that approximates the global optimum of optimization problems and is defined as follows:

MFO = (I, P, T). (1)

I is a function that creates a random population of moths and the corresponding fitness values. The systematic model of this function is as follows:

I: ∅ → {M, OM}. (2)

The P function, which is the main function, moves the moths around the search space. This function receives the matrix M and eventually returns its updated version:

P: M → M. (3)

The T function returns true if the termination criterion is satisfied and false if it is not:

T: M → {true, false}. (4)

With I, P, and T, the general framework of the MFO algorithm is defined as follows:

M = I();
while T(M) is equal to false
    M = P(M);
end
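The (I, P, T) framework above can be sketched in Python. This is a hedged illustration only: the sphere objective, the iteration-budget termination, and the placeholder move inside P are assumptions, not the authors' code.

```python
import numpy as np

def I(n_moths, dim, lb, ub, objective):
    """Create a random moth population M and its fitness values OM."""
    M = lb + np.random.rand(n_moths, dim) * (ub - lb)
    OM = np.array([objective(m) for m in M])
    return M, OM

def T(iteration, max_iter):
    """Termination test (here simply an iteration budget)."""
    return iteration >= max_iter

def P(M, OM, objective):
    """Main loop body: move the moths (placeholder: small random walk)."""
    M = M + 0.01 * np.random.randn(*M.shape)
    OM = np.array([objective(m) for m in M])
    return M, OM

def mfo_framework(n_moths=5, dim=2, lb=-10.0, ub=10.0, max_iter=20):
    objective = lambda x: float(np.sum(x ** 2))  # assumed sphere objective
    M, OM = I(n_moths, dim, lb, ub, objective)
    it = 0
    while not T(it, max_iter):
        M, OM = P(M, OM, objective)
        it += 1
    return M[np.argmin(OM)], float(np.min(OM))
```

The real P function performs the spiral moves described next; the skeleton only shows how the three components interact.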

After the initialization, the P function is iteratively run until the T function returns true. To simulate the behavior of moths mathematically, the position of each moth is updated with respect to a flame using the following equation:

M_i = S(M_i, F_j), (5)

where M_i indicates the i-th moth, F_j indicates the j-th flame, and S is the spiral function.

Any type of spiral can be utilized here subject to the following conditions: (1) the spiral's initial point should start from the moth; (2) the spiral's final point should be the position of the flame; (3) the fluctuation range of the spiral should not exceed the search space.

Considering these points, a logarithmic spiral is defined for the MFO algorithm as follows:

S(M_i, F_j) = D_i · e^(bt) · cos(2πt) + F_j, (6)

where D_i indicates the distance of the i-th moth from the j-th flame, b is a constant for defining the shape of the logarithmic spiral, and t is a random number in [−1, 1].

D_i is calculated as follows:

D_i = |F_j − M_i|, (7)

where M_i indicates the i-th moth, F_j indicates the j-th flame, and D_i indicates the distance of the i-th moth from the j-th flame.

Equation (6) describes the spiral flying path of moths. From this equation, the next position of a moth is defined with respect to a flame. The parameter t in the spiral equation defines how close the next position of the moth should be to the flame (t = −1 is the closest position to the flame, while t = 1 shows the farthest).

A question that may arise here is that the position updating in (6) only requires the moths to move towards a flame, which can cause the MFO algorithm to be trapped in local optima quickly. In order to prevent this, each moth is obliged to update its position using only one of the flames in (6). Another concern is that updating the positions of moths with respect to many different locations in the search space may degrade the exploitation of the best promising solutions. To resolve this concern, an adaptive mechanism is provided for the number of flames. The following formula is utilized in this regard:

flame no = round(N − l × (N − 1)/T), (8)

where l is the current iteration number, N is the maximum number of flames, and T indicates the maximum number of iterations.
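The spiral move and the flame-count decrement described above can be sketched as follows. This is an illustrative sketch: the helper names and the default shape constant b = 1 are assumptions, not the authors' code.

```python
import numpy as np

def spiral_update(moth, flame, b=1.0):
    """Move a moth along a logarithmic spiral towards a flame:
    S(M_i, F_j) = D_i * e^(b t) * cos(2 pi t) + F_j."""
    D = np.abs(flame - moth)                      # distance D_i = |F_j - M_i|
    t = np.random.uniform(-1.0, 1.0, moth.shape)  # t = -1: closest, t = 1: farthest
    return D * np.exp(b * t) * np.cos(2 * np.pi * t) + flame

def flame_count(l, N, T):
    """Adaptive number of flames: round(N - l * (N - 1) / T),
    decreasing gradually from N to 1 over the run."""
    return int(round(N - l * (N - 1) / T))
```

With N = 30 and T = 1000, `flame_count` returns 30 at the first iteration and 1 at the last, which matches the gradual decrement the text describes.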

The gradual decrement in the number of flames balances exploration and exploitation of the search space. The general steps of the P function are described in Algorithm 1.

Algorithm 1: MFO algorithm.

As described in Algorithm 1, the P function is executed until the T function returns true. After termination of the P function, the best moth is returned as the best obtained approximation of the optimum.

Note that the Quicksort method is utilized in MFO and the sort's computational complexity is O(n log n) and O(n^2) in the best and worst case, respectively (where n is the number of moths).

2.2. Lévy-Flight

Lévy-flight was originally introduced by the French mathematician Paul Lévy in 1937. It is a statistical description of motion that extends beyond the more traditional Brownian motion discovered over one hundred years earlier. A diverse range of both natural and artificial phenomena are now described in terms of Lévy statistics [19].

Generally speaking, animals search for food in a random manner, moving from one place to another. The choice of direction can be described by a mathematical model [20] called Lévy-flight, and many studies have shown that the flight behavior of many animals and insects exhibits its typical characteristics [21–24]. According to [24], fruit flies (Drosophila melanogaster) explore their landscape using a series of straight flight paths punctuated by sudden turns, resulting in a Lévy-flight-style intermittent scale-free pattern. Studies of human behavior, such as the Ju/'hoansi hunter-gatherer foraging patterns [21], also show the typical features of Lévy-flight. Pavlyukevich used Lévy-flight in his research to present and theoretically justify a new stochastic algorithm for global optimization. Even light can be related to Lévy-flight [20]. Subsequently, Lévy-flight has been applied to optimization and optimal search, and preliminary results show its promising capability [22, 25].

3. The Proposed LMFO Approach

In order to increase the diversity of the population against premature convergence and accelerate the convergence speed, this paper proposes an improved Lévy-flight moth-flame optimization (LMFO) algorithm. Lévy-flight has the prominent property of increasing the diversity of the population, which enables the algorithm to jump out of local optima effectively. In other words, this approach helps to obtain a better trade-off between the exploration and exploitation abilities of MFO. Thus, we let each moth perform one Lévy-flight using (9) after the position updating, formulated as follows [11, 26]:

X_i^(t+1) = X_i^t + α · sign(rand − 1/2) ⊕ Lévy(λ), (9)

where X_i^t is the i-th moth (solution vector) at iteration t, α is a random parameter that conforms to a uniform distribution, ⊕ denotes the dot product (entrywise multiplication), and rand is a random number in [0, 1]. It should be noted that sign(rand − 1/2) takes only three values: 1, 0, and −1. In (9), the combination of sign(rand − 1/2) and Lévy-flight makes the moths' walk more random; this combination enables the basic MFO to get rid of local minima and improves its global search capability. Lévy-flight is a kind of random walk in which the steps are determined by the step lengths, and the jumps conform to a Lévy distribution as follows [11, 27]:

Lévy(λ) ~ u = t^(−λ), 1 < λ ≤ 3. (10)

Formula (11) is used to calculate the Lévy random numbers:

Lévy(λ) = 0.01 × (μ × φ) / |ν|^(1/β), (11)

where μ and ν are both drawn from standard normal distributions, Γ is the standard Gamma function, β = 1.5, and φ is defined as follows:

φ = [Γ(1 + β) × sin(πβ/2) / (Γ((1 + β)/2) × β × 2^((β−1)/2))]^(1/β). (12)
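This way of generating Lévy-distributed step lengths is Mantegna's algorithm; a sketch with β = 1.5 follows. The 0.01 scaling factor is carried over from common Lévy-flight implementations and should be treated as an assumption here.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5):
    """Draw a dim-dimensional Lévy step via Mantegna's algorithm."""
    # phi from the Gamma-function expression with beta = 1.5
    phi = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
           (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    mu = np.random.randn(dim) * phi   # mu ~ N(0, phi^2)
    nu = np.random.randn(dim)         # nu ~ N(0, 1)
    return 0.01 * mu / np.abs(nu) ** (1 / beta)
```

Occasional very large steps (heavy tails) are exactly what lets a moth escape a local minimum.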

To sum up, the global search ability of the proposed algorithm is strengthened by a random walk with Lévy-flight to eliminate the weakness of MFO: being trapped in local minima is prevented, and the algorithm gives more successful results, particularly on unimodal and multimodal benchmark functions. Because of these features, the proposed algorithm has the potential to provide superior performance compared with MFO. In the following section, various benchmark functions are employed to verify the effectiveness of the proposed algorithm. The main steps of Lévy-flight moth-flame optimization are presented in Algorithm 2.

Algorithm 2: LMFO algorithm.
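As a rough illustration of how Algorithm 2 combines the spiral move with the Lévy perturbation, the following Python sketch puts the pieces together under assumed details: the objective is supplied by the caller, b = 1, β = 1.5, the 0.01 Lévy scaling, and flames taken from the current sorted population rather than the best-so-far union used in the full MFO. It is a sketch of the idea, not the authors' implementation.

```python
import math
import numpy as np

def lmfo(objective, dim, lb, ub, n=30, max_iter=200, beta=1.5, b=1.0):
    # Mantegna scaling constant for the Lévy step
    phi = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
           (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    M = lb + np.random.rand(n, dim) * (ub - lb)       # random moth population
    for l in range(max_iter):
        OM = np.array([objective(m) for m in M])
        order = np.argsort(OM)                        # sort moths by fitness
        F = M[order].copy()                           # flames (simplified: current best)
        n_flames = int(round(n - l * (n - 1) / max_iter))
        for i in range(n):
            j = min(i, n_flames - 1)                  # each moth follows one flame
            D = np.abs(F[j] - M[i])
            t = np.random.uniform(-1, 1, dim)
            M[i] = D * np.exp(b * t) * np.cos(2 * np.pi * t) + F[j]  # spiral move
            # Lévy-flight perturbation after the position update
            step = 0.01 * (np.random.randn(dim) * phi) / np.abs(np.random.randn(dim)) ** (1 / beta)
            M[i] = M[i] + np.random.rand() * np.sign(np.random.rand(dim) - 0.5) * step
            M[i] = np.clip(M[i], lb, ub)              # keep moths in the search space
    OM = np.array([objective(m) for m in M])
    return M[np.argmin(OM)], float(np.min(OM))
```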

4. Simulation Experiments

4.1. Simulation Platform

All the algorithms are tested in MATLAB R2012a (7.14), and the numerical experiments are run on an Intel Core (TM) i5-4590 processor, 3.30 GHz, with 4 GB RAM, running Windows 7.

4.2. Benchmark Functions

It is common in this field to benchmark the performance of an algorithm on a set of mathematical functions with known global optima. The same process is followed here: nineteen standard benchmark functions are employed from the literature [27, 28] as test beds for comparison. Three groups of benchmark functions with different characteristics are selected to benchmark the performance of the LMFO algorithm from different perspectives. As shown in Tables 1–3, these benchmark functions are divided into three groups: unimodal functions, multimodal functions, and fixed-dimension multimodal functions. As their names imply, unimodal functions are suitable for benchmarking the exploitation and convergence of an algorithm, since they have one global optimum and no local optima. In contrast, multimodal functions have more than one optimum, which makes them more challenging than unimodal functions. One of the optima is called the global optimum, and the rest are called local optima. An algorithm should avoid all the local optima to approach and approximate the global optimum. Therefore, the exploration and local optima avoidance of algorithms can be benchmarked by multimodal functions. The mathematical formulations of the employed benchmark functions are presented in Tables 1, 2, and 3, respectively. In these three tables, Range represents the boundary of the function's search space, Dim denotes the dimension of the function, and f_min is the theoretical minimum of the function.

Table 1: Unimodal benchmark functions.
Table 2: Multimodal benchmark functions.
Table 3: Fixed-dimension multimodal benchmark functions.
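For concreteness, two representative functions of the kinds listed in the tables can be written as follows: the sphere function is a typical unimodal benchmark (one global minimum, f_min = 0 at the origin) and Rastrigin a typical multimodal one (many local minima, f_min = 0 at the origin). Whether either appears verbatim in the suite taken from [27, 28] is not confirmed here.

```python
import numpy as np

def sphere(x):
    """Unimodal: f(x) = sum(x_i^2), minimum 0 at x = 0."""
    return float(np.sum(x ** 2))

def rastrigin(x):
    """Multimodal: f(x) = 10 d + sum(x_i^2 - 10 cos(2 pi x_i)), minimum 0 at x = 0."""
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))
```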

Heuristic algorithms are stochastic optimization techniques, and therefore they have to be run more than 10 times to generate meaningful statistical results. The best solution obtained in the last iteration is used as the metric of performance. The same method is adopted here, and the results are reported over 30 independent runs. However, averages and standard deviations only compare the overall performance of the algorithms.

To explore the performance of the proposed LMFO algorithm, some of the recent and well-known algorithms in the literature are chosen: ABC [9], BA [10], GGSA [29], DA [17], PSOGSA [30], and MFO [18]. Note that 30 search agents and 1000 iterations are utilized for each of the algorithms. It should be noted that the number of moths (or other candidate solutions in the other algorithms) should be selected experimentally.

In this paper, Best, Mean, Worst, and Std represent the optimal fitness value, mean fitness value, worst fitness value, and standard deviation, respectively. Experimental results are listed in Tables 4, 5, and 6. The best results are denoted in bold type.

Table 4: Results of unimodal benchmark functions.
Table 5: Results of multimodal benchmark functions.
Table 6: Results of fixed-dimension multimodal benchmark functions.

Due to the stochastic nature of the algorithms, statistical tests should be conducted to confirm the significance of the results [31]. The averages and standard deviations only compare the overall performance of the algorithms, while a statistical test considers each run's results and shows whether the differences are statistically significant. In order to determine whether the results of LMFO differ from the best results of ABC, BA, GGSA, DA, PSOGSA, and MFO in a statistical sense, a nonparametric test known as Wilcoxon's rank-sum test [32, 33] is performed at the 5% significance level. Tables 7, 8, and 9 report the p values produced by Wilcoxon's test for the pairwise comparison of the best values of six groups: ABC versus LMFO, BA versus LMFO, GGSA versus LMFO, DA versus LMFO, PSOGSA versus LMFO, and MFO versus LMFO. In general, p values < 0.05 can be considered sufficient evidence against the null hypothesis, ensuring that the results are not generated by chance. The resulting p values of the rank-sum test are listed in Tables 7, 8, and 9.
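The significance test described above can be reproduced with SciPy's rank-sum implementation. The two samples below are synthetic stand-ins for 30 per-run results, for illustration only.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)
lmfo_runs = rng.normal(loc=0.001, scale=0.0005, size=30)   # hypothetical LMFO results
other_runs = rng.normal(loc=0.010, scale=0.0050, size=30)  # hypothetical competitor results

# Two-sided Wilcoxon rank-sum test at the 5% significance level
stat, p_value = ranksums(lmfo_runs, other_runs)
significant = p_value < 0.05  # evidence against the null hypothesis
```

A p value below 0.05 indicates that the difference between the two sets of runs is unlikely to have arisen by chance.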

Table 7: Results of the p value rank-sum test on unimodal benchmark functions.
Table 8: Results of the p value rank-sum test on multimodal benchmark functions.
Table 9: Results of the p value rank-sum test on fixed-dimension multimodal benchmark functions.
4.3. Unimodal Benchmark Functions

The unimodal benchmark functions have only one global minimum and no local minima. Therefore, these kinds of functions are very suitable for benchmarking the convergence capability of algorithms. According to the results in Table 4, LMFO is able to provide very competitive results: it outperforms all the other algorithms on F1–F8. Therefore, the proposed algorithm has a high capability to find the global minimum of unimodal benchmark functions. According to the p values for F1–F8 in Table 7, LMFO achieves significant improvement on all the unimodal benchmark functions compared with the other algorithms. Hence, LMFO performs better than the other algorithms in searching for the global optimum of unimodal benchmark functions.

Figures 1–8 illustrate the averaged convergence curves of all algorithms on the unimodal benchmark functions over 30 independent runs. It can be noted here that all the convergence curves in the following subsections are also averaged curves. As may be seen from these curves, LMFO has the fastest convergence speed of all the algorithms. From Table 4 and Figures 20–27, LMFO's Std is much smaller than those of the other algorithms. These results show that LMFO is more stable and robust than the other algorithms.

Figure 1: The convergence curves for F1.
Figure 2: The convergence curves for F2.
Figure 3: The convergence curves for F3.
Figure 4: The convergence curves for F4.
Figure 5: The convergence curves for F5.
Figure 6: The convergence curves for F6.
Figure 7: The convergence curves for F7.
Figure 8: The convergence curves for F8.
4.4. Multimodal Benchmark Functions

In contrast to the unimodal benchmark functions, multimodal benchmark functions have many local minima, the number of which increases exponentially with dimension. This makes them suitable for benchmarking the exploration ability of an algorithm, and the final results are the more important because these benchmark functions reflect the ability of the algorithm to escape poor local optima and reach the global optimum. The statistical results of the algorithms on the multimodal benchmark functions are presented in Table 5. As the Best, Worst, Mean, and Std values show, LMFO is also able to provide very competitive results on the multimodal benchmark functions, which indicates that the LMFO algorithm has merit in terms of exploration. According to the p values for F9–F15 reported in Table 8, LMFO achieves significant improvement at 200 dimensions compared with the other algorithms. Comparing LMFO with the other algorithms across the six comparison groups, we can conclude that LMFO performs significantly better. The p values for F9–F15 reported in Table 8 are less than 0.05, which is strong evidence against the null hypothesis. Hence, the results of LMFO are statistically significant and do not occur by coincidence.

As seen from Table 5 and Figures 9–15, the convergence rate of LMFO on the multimodal benchmark functions is in the majority of cases better than those of the other algorithms. On the basis of Table 5 and Figures 9–15, we can conclude that LMFO is able to avoid local minima on multimodal benchmark functions with a good convergence speed. From Table 5 and Figures 28–34, LMFO's Std is much smaller than those of the other algorithms. These results show that LMFO is more stable and robust than the other algorithms.

Figure 9: The convergence curves for F9.
Figure 10: The convergence curves for F10.
Figure 11: The convergence curves for F11.
Figure 12: The convergence curves for F12.
Figure 13: The convergence curves for F13.
Figure 14: The convergence curves for F14.
Figure 15: The convergence curves for F15.
4.5. Fixed-Dimension Multimodal Benchmark Functions

The fixed-dimension multimodal benchmark functions have only a few local minima, and their dimensions are also small; under such circumstances, it is difficult to judge the performance of an individual algorithm. The major difference from the multimodal functions is that fixed-dimension multimodal functions appear simpler because of their low dimensions and smaller numbers of local minima. In this experiment, the Best, Worst, Mean, and Std values on the fixed-dimension multimodal benchmark functions are summarized in Table 6. For all fixed-dimension multimodal functions, LMFO gives the best solution in terms of Best. As Table 6 shows, the LMFO algorithm provides the best results on two of the fixed-dimension multimodal benchmark functions, followed by the MFO, PSOGSA, and ABC algorithms. In addition, the Wilcoxon rank-sum p values in Table 9 show that on one of these functions the result of LMFO is not significantly better than those of the DA and GGSA algorithms (at the 5% significance level), but it is significantly different from those of ABC, MFO, PSOGSA, and BA. On the remaining three functions, however, the results of LMFO are significantly better than those of the other algorithms. So, it can be concluded that the results of LMFO on these benchmark functions are better than those of ABC, BA, GGSA, DA, PSOGSA, and MFO.

In addition, the convergence behavior of LMFO on the 2-dimensional fixed-dimension benchmark functions is shown in Figures 16–19. As can be seen from these figures, LMFO has a faster convergence rate on several of these functions. From Figures 35–38, we find that all of the algorithms except BA show strong stability on the fixed-dimension functions.

Figure 16: The convergence curves for F16.
Figure 17: The convergence curves for F17.
Figure 18: The convergence curves for F18.
Figure 19: The convergence curves for F19.
Figure 20: Standard deviation for F1.
Figure 21: Standard deviation for F2.
Figure 22: Standard deviation for F3.
Figure 23: Standard deviation for F4.
Figure 24: Standard deviation for F5.
Figure 25: Standard deviation for F6.
Figure 26: Standard deviation for F7.
Figure 27: Standard deviation for F8.
Figure 28: Standard deviation for F9.
Figure 29: Standard deviation for F10.
Figure 30: Standard deviation for F11.
Figure 31: Standard deviation for F12.
Figure 32: Standard deviation for F13.
Figure 33: Standard deviation for F14.
Figure 34: Standard deviation for F15.
Figure 35: Standard deviation for F16.
Figure 36: Standard deviation for F17.
Figure 37: Standard deviation for F18.
Figure 38: Standard deviation for F19.

Overall, the results from Tables 4–6, Tables 7–9, Figures 1–19, and Figures 20–38 show that the proposed method is effective in optimizing not only unimodal and multimodal functions but also fixed-dimension multimodal functions.

Since constraints are one of the major challenges in solving real problems, and the main objective of designing the LMFO algorithm is to solve real problems, two constrained real engineering problems are employed in the next section to further investigate the performance of the LMFO algorithm and provide a comprehensive study.

5. LMFO for Engineering Optimization Problems

In this section, two engineering problems (welded beam design and speed reducer design) are solved to further verify the performance of the proposed algorithm. Real problems involve inequality constraints, so the LMFO algorithm should be capable of dealing with them during optimization. Several methods have been applied to handle constraints in the literature: penalty functions, special operators, repair algorithms, separation of objectives and constraints, and hybrid methods [34]. In this paper, the penalty method is employed to handle the constraints of the welded beam and speed reducer problems.
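The penalty method mentioned above can be sketched as a simple wrapper that adds a large cost proportional to the squared constraint violation, so that any unconstrained optimizer (such as LMFO) can be applied. The toy objective, constraint, and penalty weight below are illustrative assumptions, not the paper's actual penalty settings.

```python
def penalized(objective, constraints, weight=1e6):
    """Wrap an objective so that violations of g_i(x) <= 0 are penalized."""
    def f(x):
        # Static penalty: sum of squared positive constraint violations
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return objective(x) + weight * violation
    return f

# Toy example: minimize x^2 subject to g(x) = 1 - x <= 0 (i.e. x >= 1).
f = penalized(lambda x: x ** 2, [lambda x: 1.0 - x])
```

Feasible points are evaluated at their true cost, while infeasible points are pushed back towards the feasible region by the large penalty term.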

5.1. Welded Beam Design

The objective is to find the minimum fabrication cost of a welded beam, as shown in Figure 39 [35]. The constraints of the beam are the shear stress (τ), the bending stress in the beam (θ), the buckling load on the bar (P_c), the end deflection of the beam (δ), and side constraints.

Figure 39: Structure of welded beam design.

This problem has four variables: the thickness of the weld (h), the length of the attached part of the bar (l), the height of the bar (t), and the thickness of the bar (b). The objective is to minimize the fabrication cost f(h, l, t, b) = 1.10471 h^2 l + 0.04811 t b (14.0 + l), subject to the constraints on τ, θ, P_c, δ, and the side constraints.

Mirjalili solved this problem using MFO [18] and GGSA [29, 36]. Coello Coello [37] and Deb [38, 39] employed GA, whereas Lee and Geem [40] used HS. Richardson's random method, the simplex method, Davidon-Fletcher-Powell, and Griffith and Stewart's successive linear approximation are the mathematical approaches adopted by Ragsdell and Philips [41] for this problem. The comparison results for the welded beam design problem are shown in Table 10.

Table 10: Comparison results of the welded beam design problem.

The results in Table 10 show that the LMFO algorithm is able to find the best optimal design compared with the other algorithms, closely followed by the MFO and GGSA algorithms.

5.2. Speed Reducer Design

The objective of this problem is to minimize the total weight of the speed reducer illustrated in Figure 40 [42]. The seven variables x1~x7 denote the face width (b), module of the teeth (m), number of teeth in the pinion (z), length of the first shaft between bearings (l1), length of the second shaft between bearings (l2), diameter of the first shaft (d1), and diameter of the second shaft (d2), respectively. The problem is formulated as a constrained minimization of the reducer's weight over these seven variables, following the standard statement of the speed reducer design problem [42].

Figure 40: Structure of speed reducer design.

This problem has also been popular among researchers and has been optimized in many studies, including the approaches of Akhtar et al. [43], Mezura-Montes et al. [44], CS [11, 45], HCPS [46], SCA [47], (μ + λ) ES [5, 48], and ABC [9, 49]. The results of this problem are provided in Table 11. According to this table, the LMFO and HCPS algorithms find the design with the minimum weight for this problem.

Table 11: Comparison results of the speed reducer design problem.

6. Results and Discussion

In this paper, an improved version of MFO algorithm based on Lévy-flight strategy, which is named as LMFO, is proposed. In order to benchmark the performance of LMFO, nineteen unconstrained benchmark functions and two constrained engineering design problems were conducted.

According to the Best, Worst, Mean, and Std values and the p values in Section 4, the LMFO algorithm significantly outperforms the others in terms of numerical optimization. There are several reasons why the LMFO algorithm performs well on most of the test cases. First, the Lévy-flight strategy: Lévy-flight increases the diversity of the population and makes the algorithm jump out of local optima more effectively, which helps to make LMFO faster and more robust than MFO. Second, the update mechanism of moths: moths are required to update their positions with respect to the best recent feasible flames; this approach promotes exploration of promising feasible regions and is a main reason for the superiority of the LMFO algorithm. Third, the Quicksort method is utilized in the LMFO algorithm. Another finding in the results is the poor performance of ABC, BA, and DA. These three algorithms belong to the class of swarm-based algorithms. In contrast to evolutionary algorithms, they have no mechanism for significant abrupt movements in the search space, and this is likely the reason for their poor performance.

As we can see in Section 4, LMFO performs better than or is highly competitive with the other algorithms. The advantages of LMFO include its simplicity and the few parameters that need to be tuned. The work here shows LMFO to be robust, powerful, and effective on all types of benchmark functions. Benchmark evaluation is a good way to test the performance of metaheuristic algorithms, but it also has limitations. For example, different tuning parameter values in the optimization methods might lead to significant differences in their performance, and a benchmark test may arrive at entirely different conclusions if the termination criterion changes. If we changed the population size or the number of iterations, we might draw different conclusions.

In Section 5, the results show that LMFO outperforms the other algorithms in the majority of the real case studies. Since the search space of these problems is unknown, these results are strong evidence for the applicability of LMFO to real problems. In addition, due to the constrained nature of the case studies, it can be stated that the LMFO algorithm is able to optimize search spaces with infeasible regions as well. This is due to the update mechanism of moths, in which they are required to update their positions with respect to the best recent feasible flames; this approach is a main reason for the superiority of the LMFO algorithm.

In our study, nineteen benchmark functions have been applied to evaluate the performance of LMFO, the proposed method has been tested on real-world engineering problems, and LMFO has been compared with other optimization algorithms.

7. Conclusion and Future Works

Due to the limited performance of MFO, a Lévy-flight strategy has been introduced into the standard MFO to develop a novel Lévy-flight moth-flame optimization algorithm for optimization problems. As shown in Section 4, LMFO is very efficient, with an almost exponential convergence rate, and its results were compared with a wide range of algorithms for verification. The proposed algorithm demonstrated superior performance on nineteen benchmark functions in terms of enhanced convergence speed and improved avoidance of local minima. This paper also identified and discussed the reasons for the poor performance of other algorithms: it was observed that the swarm-based algorithms suffer from low exploration, whereas LMFO does not.

Furthermore, this paper also applies the LMFO algorithm to two classical engineering problems. The high level of exploration and exploitation of this algorithm was the motivation for this study. The comparative results in Section 5 show that the LMFO algorithm performs well on challenging constrained problems with unknown search spaces. In this work, LMFO combines the merits of MFO and Lévy flight in order to avoid local optima; with both techniques combined, LMFO balances exploration and exploitation and effectively solves complex benchmark and real-world engineering problems.
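One hypothetical way to combine the two components is to perturb each moth with a Lévy step scaled by its distance to its flame, so most moves stay local (exploitation) while rare long jumps escape local optima (exploration). This is our own illustrative sketch, not the paper's exact LMFO update rule; the scaling factor `alpha` and all names are assumptions.

```python
import numpy as np
from math import gamma, pi, sin

def lmfo_style_perturbation(moth, flame, beta=1.5, alpha=0.01, rng=None):
    """Hypothetical LMFO-style move: a Mantegna Levy step, scaled by the
    moth-flame distance, nudges the moth off its deterministic spiral path."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    step = (rng.normal(0.0, sigma, size=moth.shape) /
            np.abs(rng.normal(0.0, 1.0, size=moth.shape)) ** (1 / beta))
    return moth + alpha * step * (moth - flame)   # long jumps become rare but possible
```

Scaling the step by `(moth - flame)` makes perturbations shrink as moths converge on their flames, which is one way to trade exploration for exploitation over the run.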

For future works, two research directions can be recommended. First, we plan to apply LMFO to more real-world engineering problems. Second, it is worth developing binary and multiobjective versions of the LMFO algorithm.

Competing Interests

The authors declare that there is no conflict of interest regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grants no. 61463007 and 6153008.
