International Journal of Antennas and Propagation
Volume 2015 (2015), Article ID 124675, 11 pages
http://dx.doi.org/10.1155/2015/124675
Research Article

Modified Fruit Fly Optimization Algorithm for Analysis of Large Antenna Array

1Faculty of Engineering, King Mongkut’s Institute of Technology Ladkrabang, 1 Chalongkrung Road, Ladkrabang, Bangkok 10520, Thailand
2Faculty of Engineering, Rajamangala University of Technology Lanna, 128 Huay Kaew Road, Muang, Chiangmai 50300, Thailand

Received 2 March 2015; Revised 27 May 2015; Accepted 14 June 2015

Academic Editor: Jun Hu

Copyright © 2015 Nattaset Mhudtongon et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This research paper deals with the optimization of a large antenna array for maximum directivity using a modified fruit fly optimization algorithm (MFOA) with a random search over two swarm groups and an adaptive fruit fly swarm population size. The MFOA is applied to three nonlinear mathematical test functions, to the analysis of the optimal number of elements and optimal element spacing of the large antenna array, and to the analysis of the nonuniform amplitudes of an antenna array. The numerical results demonstrate that the MFOA is effective in solving all of the test functions and electromagnetic problems. The advantages of the proposed algorithm are ease of implementation, a large search range, less processing time, and a reduced memory requirement.

1. Introduction

The nature of an electromagnetic (EM) problem, which contains a myriad of local and global optimal solutions, contributes to the complexity and difficulty of locating the best solution. A large EM problem refers to a radiation problem with a vast search space. Prior research on large EM problems focused mostly on antenna array radiation [1, 2] and on radio wave scattering from ships, aircraft, and rough surfaces. Obtaining a globally optimal solution to an EM problem requires high-performance optimization algorithms, particularly for EM problems associated with antenna array radiation and wave scattering. Currently, the application of optimization algorithms has extended beyond engineering to other fields, for example, the sciences and finance.

Optimization algorithms can be classified into two types: local optimization and global optimization. This paper focuses on global optimization, since this type is most often applied to EM problems. Examples of global optimization algorithms applied to EM problems are the genetic algorithm (GA), evolution strategy (ES), particle swarm optimization (PSO), ant colony optimization (ACO), and simulated annealing (SA). In [3], the GA was adopted to optimize the radiation patterns of linear and planar antenna arrays and was found to be suitable for the EM problems of both array types. Although the GA returned an optimal solution in a very short time, it is difficult, according to [4], to identify the optimal initial GA parameters. To address this issue, [5] proposed evolutionary programming (EP) and evolution strategies (ES) in which the optimal parameters of the algorithms are self-adaptive. The hybrid EP and ES proved effective in solving EM problems, but convergence to the optimum solution required a great amount of time.

In [6, 7], a PSO algorithm was proposed to minimize the side lobe level (SLL) of a phased array antenna; the algorithm was employed to solve a number of EM problems related to radio communications. In [8], a multiobjective particle swarm optimization (MOPSO) algorithm was used in a symmetric phased linear array to optimize energy consumption, and it was found to be effective in saving energy and minimizing the SLL. In [9], a thinned linear array and a planar antenna array, both with minimum SLL, were designed using ACO. In [10], SA was successfully applied to minimize the SLL of an array antenna.

The aforementioned optimization algorithms have proved to be well suited to their respective antenna array designs; however, these algorithms are so advanced and complicated that it takes a beginner a great amount of time and effort to comprehend them. A new optimization algorithm, called the fruit fly optimization algorithm (FOA), has therefore recently been proposed by Pan [11]. FOA is a stochastic search algorithm based on the principle of natural selection. The algorithm has found use in numerous applications, for example, financial distress detection, web-auction logistics service, neural networks, PID controller parameter tuning, key control characteristics optimization, and swarms of miniautonomous surface vehicles [12–18]. However, FOA has never been applied to EM problems, because its search space is not large enough and it often becomes trapped in a local optimum for EM problems.

This research proposes the modified fruit fly optimization algorithm (MFOA) to analyze the radiation pattern of a large antenna array. The MFOA improves upon the conventional FOA by incorporating a random search over two swarm groups and a self-adaptive population size. The modified algorithm is found to be effective in solving three nonlinear test functions and the EM problems of the large antenna array. In addition, the advantages of the MFOA are ease of implementation, a large search range, less processing time, and a reduced memory requirement.

The organization of the rest of this research is as follows: Section 2 discusses the modified fruit fly optimization algorithm with adaptive population size. Section 3 deals with the geometry of the large antenna array and its problem formulation. Section 4 presents the numerical results. The conclusions are provided in Section 5.

2. Modified Fruit Fly Optimization Algorithm

The conventional FOA is a technique to search for a global optimization. The algorithm is modeled after the food-seeking behavior of fruit flies [11]. Figure 1 illustrates a food-seeking iterative process of a fruit fly swarm [11].

Figure 1: A food-seeking iterative process of a fruit fly swarm.

The FOA can be used efficiently on several problems [11–18], but it has never been applied to EM problems. EM problems naturally have a large search space, complexity, and discontinuous behaviour, which can cause an optimization algorithm to become trapped in a local optimum. The FOA procedure mimics the food-seeking behaviour of fruit flies: when part of the swarm finds a promising solution, the rest of the swarm follows it, which can make the algorithm converge to a wrong solution or a local optimum. Therefore, the FOA is modified to provide a larger search space and a global search capability for complex and discontinuous functions such as EM problems; the result is referred to as the "modified FOA" (MFOA). The conventional FOA procedure is given below, and the flow diagram is shown in Figure 2(a):
(1) Initialize the maximum number of iterations.
(2) Initialize the population size.
(3) Randomly generate the initial location of the fruit fly swarm.
(4) For the first iteration, generate a random direction and distance for every fly in the population. For each fly, estimate the distance to the origin, calculate the smell concentration judgment value, evaluate the smell function, and find the best smell. Record the best smell concentration as "bestSmell."
(5) From the second to the last iteration, again randomly assign a direction and distance to every fly, estimate the distance and smell concentration, and evaluate the smell function. If the new best smell is better than the previous one, update bestSmell and move the swarm location to that fly.
(6) Repeat step (5) until the optimum solution is reached, as judged by bestSmell, or the last iteration is completed.
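The steps above can be sketched in a few lines of Python. This is a minimal sketch of Pan's conventional FOA [11] for maximization; the function and parameter names are ours, and the two-dimensional swarm location with a smell concentration judgment value S = 1/Dist follows the original formulation.

```python
import numpy as np

def foa(smell_function, maxgen=100, popsize=30, scale=1.0, rng=None):
    """Conventional fruit fly optimization (maximization), after Pan [11]."""
    rng = np.random.default_rng(rng)
    # Step (3): random initial swarm location (X_axis, Y_axis).
    x_axis, y_axis = rng.uniform(-1, 1, size=2)
    best_smell = -np.inf
    for gen in range(maxgen):
        # Steps (4)-(5): each fly takes a random direction and distance
        # around the current swarm location.
        x = x_axis + scale * rng.uniform(-1, 1, popsize)
        y = y_axis + scale * rng.uniform(-1, 1, popsize)
        # Distance to the origin and smell concentration judgment S = 1/Dist.
        dist = np.sqrt(x**2 + y**2)
        s = 1.0 / dist
        smell = smell_function(s)
        i = np.argmax(smell)
        # Keep the best smell; the whole swarm flies to that location.
        if smell[i] > best_smell:
            best_smell = smell[i]
            x_axis, y_axis = x[i], y[i]
    return best_smell, (x_axis, y_axis)
```

Note how the single retained location makes the whole swarm follow the current best fly, which is exactly the behaviour the modification below is designed to relax.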

Figure 2: Flow diagrams of the (a) conventional FOA and (b) MFOA with adaptive population size.

In the MFOA, the population is separated into two groups. The first group is assigned to explore new regions over a wide area, and the second group searches near the current optimum; this procedure achieves a wider overall search space. Moreover, [19] identified a limitation of the FOA in some applications: it cannot produce negative values of the search parameters. Their remedy, which is also employed in this paper, is to place a random sign on the candidate value fed to the smell function. Although the MFOA can search a large space, it may disperse too much and fail to converge to the optimum solution; to avoid this, the best parameters are collected and reused in the next iteration. The MFOA procedure is given below, and the flow diagram is shown in Figure 2(b):
(1) Initialize the maximum number of iterations.
(2) Initialize the population size range.
(3) Randomly generate the initial location of the fruit fly swarm.
(4) For the first iteration, generate a direction and distance for every fly: group 1 searches over the wide area, while group 2 searches near the current swarm location. For each fly, estimate the distance, calculate the smell concentration, evaluate the smell function, and find the best smell. Record the best smell concentration as "bestSmell" and collect the best location.
(5) From the second to the last iteration, first determine the population size at random (the adaptive population size), then randomly assign a direction and distance to every fly: group 1 again searches the wide area, while group 2 uses the best direction and distance collected from the previous iteration. Estimate the distance and smell concentration, evaluate the smell function, and find the best smell. If the new best smell is better than the previous one, update bestSmell and collect the best location.
(6) Repeat step (5) until the optimum solution is reached, as judged by bestSmell, or the last iteration is completed.
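The modified procedure can likewise be sketched in Python. This is a hedged sketch, not the authors' exact implementation: the wide/local search ranges, the group split, and all names are our assumptions, while the random sign on the candidate value follows [19] and the adaptive population size is redrawn each iteration as the paper describes.

```python
import numpy as np

def mfoa(smell_function, maxgen=100, pop_min=5, pop_max=30,
         wide=10.0, local=0.5, rng=None):
    """Modified FOA sketch: two search groups plus an adaptive
    population size, per the description in the text."""
    rng = np.random.default_rng(rng)
    best_smell, best_xy = -np.inf, tuple(rng.uniform(-1, 1, size=2))
    for gen in range(maxgen):
        # Adaptive population size: redrawn each iteration in [pop_min, pop_max].
        popsize = int(rng.integers(pop_min, pop_max + 1))
        half = popsize // 2
        # Group 1 explores a wide area; group 2 searches near the best point.
        x1 = rng.uniform(-wide, wide, half)
        y1 = rng.uniform(-wide, wide, half)
        x2 = best_xy[0] + local * rng.uniform(-1, 1, popsize - half)
        y2 = best_xy[1] + local * rng.uniform(-1, 1, popsize - half)
        x = np.concatenate([x1, x2])
        y = np.concatenate([y1, y2])
        # Random sign on S allows negative candidate values (after [19]).
        sign = rng.choice([-1.0, 1.0], size=popsize)
        s = sign / np.sqrt(x**2 + y**2)
        smell = smell_function(s)
        i = np.argmax(smell)
        if smell[i] > best_smell:  # collect the best location for reuse
            best_smell, best_xy = smell[i], (x[i], y[i])
    return best_smell, best_xy
```

A smell function with a negative optimum, such as one maximized at S = -0.3, is solvable by this sketch but not by the conventional FOA above, which only ever produces positive S.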

To demonstrate the efficiency of the MFOA, it is applied to three basic test functions and compared with the conventional FOA and the GA. The algorithms are employed to find the maximum value of three nonlinear mathematical functions, each with two variables adjusted by the optimization algorithm; the smell function (for the FOA and MFOA) and the fitness function (for the GA) are the values of the functions under consideration. The three nonlinear test functions [20] are (1) the four-peak function, (2) the parabolic function, and (3) the Goldstein-Price function.
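For reference, the commonly used forms of these benchmark functions can be written in Python as follows. The four-peak and Goldstein-Price forms are the standard ones from the benchmark literature; the parabolic form shown is a generic placeholder assumption, since several variants exist and the exact form used in the paper is not reproduced here.

```python
import numpy as np

def four_peak(x, y):
    """Standard four-peak benchmark; maximum value 2 near (0, 0) and (0, -4)."""
    return (np.exp(-((x - 4)**2 + (y - 4)**2))
            + np.exp(-((x + 4)**2 + (y - 4)**2))
            + 2 * np.exp(-(x**2 + y**2))
            + 2 * np.exp(-(x**2 + (y + 4)**2)))

def parabolic(x, y):
    """Simple unimodal surface (placeholder form); maximum 0 at (0, 0)."""
    return -(x**2 + y**2)

def goldstein_price(x, y):
    """Standard Goldstein-Price function; global minimum 3 at (0, -1).
    For a maximization-based search, use its negative."""
    a = 1 + (x + y + 1)**2 * (19 - 14*x + 3*x**2 - 14*y + 6*x*y + 3*y**2)
    b = 30 + (2*x - 3*y)**2 * (18 - 32*x + 12*x**2 + 48*y - 36*x*y + 27*y**2)
    return a * b
```

The Goldstein-Price function is the hard case here: its narrow curved valley is what defeats the conventional FOA in the comparison that follows.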

From the numerical results in Table 1, the FOA, MFOA, and GA all find the optimum parameters of the three mathematical functions, except that the FOA cannot find the optimum of the Goldstein-Price function. The convergence rates of the FOA, MFOA, and GA are shown in Figure 3. In Figure 3(a), the GA and MFOA converge rapidly, while the FOA converges only at the fourth iteration. Next, all three algorithms find the optimum of the parabolic function within the first iteration, as shown in Figure 3(b). For the Goldstein-Price function, the MFOA and GA find the optimum parameters, but the FOA cannot, as shown in Figure 3(c). It is evident that the MFOA is more efficient than the FOA, although the MFOA converges more slowly than the GA. Moreover, the MFOA is not a complex algorithm and is easy to implement, whereas for the GA it is difficult to specify the optimum parameters, such as the crossover and mutation probabilities, for a given problem.

Table 1: Compared results of the FOA, MFOA, and GA with nonlinear mathematical test functions.
Figure 3: Compared convergence rate of the FOA, MFOA, and GA with nonlinear mathematical test functions.

To monitor the behavior of the FOA and MFOA, the parameter distributions produced by each algorithm are shown in Figures 4, 5, 6, and 7, respectively. The MFOA clearly spreads more widely than the FOA, while the FOA converges faster for the four-peak function, as shown in Figures 4 and 5. For the Goldstein-Price function, the FOA does not converge, whereas the MFOA converges despite its larger distribution. Thus the MFOA has a larger search space than the FOA and still converges to the optimum parameters. Next, the adaptive population size is introduced into the MFOA to reduce the computation time; the algorithm still converges while retaining a large search space.

Figure 4: Parameter distribution of the MFOA for four-peak function.
Figure 5: Parameter distribution of the FOA for four-peak function.
Figure 6: Parameter distribution of the MFOA for Goldstein-Price function.
Figure 7: Parameter distribution of the FOA for Goldstein-Price function.

The parameter distribution of the MFOA for the four-peak function is shown in Figure 4, which demonstrates that the distributions of the two parameters are similar. In the initial phase, the random distribution produced by the first population group explores the entire search space; then the distribution produced by the second population group converges toward the optimum solution. It can therefore be concluded that the solution found by the first population group steers the distribution of the second population group toward the optimum solution.

The MFOA can also be viewed as follows. The population is divided into two individual groups, each of which adds a randomizing step to the search for the best solution. The first group randomizes over the large search space, so the solution it produces lies in the region near an optimum point. The solution from this first step is then used as the initial point for the second group, which searches the local space around it.

The parameter distribution of the FOA for the four-peak function is illustrated in Figure 5. The distributions of both parameters show that the population converges to the optimum solution only when the number of iterations reaches 10, whereas the MFOA converges by iteration 4. In this respect, the MFOA offers higher efficiency in terms of computation time and memory requirement.

The distributions of the two parameters obtained by the MFOA for the Goldstein-Price function are plotted in Figures 6(a) and 6(b), respectively. The MFOA distributions for the Goldstein-Price function resemble those for the four-peak function: the distribution first spreads over the large search space and later concentrates around the optimum solution of this complex function. The spread is driven by the randomizing procedures of the two population groups, which help to increase population diversity.

From Figures 7(a) and 7(b), the parameter distributions obtained by the FOA for the Goldstein-Price function do not converge to the optimum solution, owing to a lack of the diversity that this function requires for locating the optimum.

3. Geometry and Problem Formulation

The performance of a single element antenna is limited because of its broad radiation pattern and low directivity. Nevertheless, several applications, for example, radar and sonar communications, require an antenna with narrow radiation pattern and high directivity. To overcome the limitations of single element antennas, in other words, to obtain a narrow beam and high gain, more single elements must be added to the antenna design to produce an array antenna.

A linear array antenna is an antenna in which the individual elements are arranged in a straight line and spaced equally apart. Let us assume a linear array of $N$ isotropic elements aligned along the $z$-axis and equidistant with spacing $d$. The geometry of the $N$-element array is shown in Figure 8, and the array factor $AF(\theta)$ can be written as [21]
\[ AF(\theta) = \sum_{n=1}^{N} a_n \, e^{\,j\left[(n-1)\,kd\cos\theta + \beta_n\right]}. \]

Figure 8: The geometry of the $N$-element array with isotropic sources along the $z$-axis.

If we further assume that the excitation amplitude $a_n$ and phase $\beta_n$ are 1 and 0, respectively, for all elements, the array factor can be expressed as
\[ AF(\theta) = \sum_{n=1}^{N} e^{\,j(n-1)\,kd\cos\theta}, \]
where $N$ is the number of elements, $a_n$ is the amplitude current excitation coefficient, $\beta_n$ is the phase current excitation coefficient, $d$ is the spacing between elements, $k = 2\pi/\lambda$ is the wave number, and $\theta$ is the angle between the field direction and the $z$-axis.

In this research paper, the focus is on the application of the MFOA with adaptive population size to optimize the number of elements and the element spacing of the broadside linear array antenna (i.e., $N$ and $d$) to achieve maximum directivity. Thus, the fitness function (or smell function) is the directivity to be maximized,
\[ F(N, d) = D_0 = \frac{2\,|AF(\theta_0)|^2}{\int_0^{\pi} |AF(\theta)|^2 \sin\theta \, d\theta}, \]
where $\theta_0 = 90^\circ$ for the broadside array.
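The array factor and the directivity-based fitness can be evaluated numerically; the sketch below follows the standard formulation for a linear array of isotropic elements [21], with the function names, default arguments, and integration grid chosen by us.

```python
import numpy as np

def array_factor(theta, n_elem, d_lambda, amps=None, betas=None):
    """Array factor AF(theta) of an N-element linear array along the z-axis,
    with element spacing d given in wavelengths (standard form [21])."""
    n = np.arange(n_elem)
    amps = np.ones(n_elem) if amps is None else np.asarray(amps)
    betas = np.zeros(n_elem) if betas is None else np.asarray(betas)
    kd = 2 * np.pi * d_lambda                      # k*d with k = 2*pi/lambda
    psi = np.outer(np.cos(theta), n * kd) + betas  # (n-1)*k*d*cos(theta) + beta_n
    return (amps * np.exp(1j * psi)).sum(axis=1)

def directivity_dbi(n_elem, d_lambda, amps=None, n_theta=4001):
    """Directivity in dBi: peak of |AF|^2 over its average,
    D0 = 2*max|AF|^2 / integral_0^pi |AF|^2 sin(theta) dtheta."""
    theta = np.linspace(0.0, np.pi, n_theta)
    af2 = np.abs(array_factor(theta, n_elem, d_lambda, amps))**2
    integrand = af2 * np.sin(theta)
    dth = theta[1] - theta[0]
    p_rad = 0.5 * dth * (integrand[1:] + integrand[:-1]).sum()  # trapezoidal rule
    return 10 * np.log10(af2.max() / (p_rad / 2.0))
```

As a sanity check, `directivity_dbi(2, 0.5)` gives about 3.0 dBi, the textbook value for two isotropic elements spaced half a wavelength apart; an optimizer such as the MFOA would call this function as its smell function with candidate (N, d) pairs.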

4. Numerical Results

The numerical results of the conventional FOA and of the MFOA with adaptive population size are compared to verify the accuracy and efficiency of the MFOA. The large antenna array under consideration is a uniform linear array with a broadside radiation pattern. The MFOA is employed to optimize the large antenna array, that is, to obtain the optimal number of elements and element spacing that yield the maximum directivity.

In this research, the population size of the conventional FOA is set to 10, 20, 30, 40, and 50, while that of the MFOA with adaptive population size is varied in the ranges 5–10, 5–20, 5–30, 5–40, and 5–50. The conventional FOA and the MFOA yield the identical global optimum for the large broadside array antenna. As shown in Figure 9, the computational time of both algorithms increases with population size; nevertheless, the MFOA with adaptive population size requires significantly less computational time than the conventional FOA.

Figure 9: The computational time versus the population size for the conventional FOA and MFOA with adaptive population size.

Figures 10(a)–10(e) illustrate the population size as a function of the number of iterations for the conventional FOA and for the MFOA with adaptive population size, with a maximum of 50 iterations. The population size of the MFOA is self-adaptive in the range 5–10 for a population size of 10; 5–20 for 20; 5–30 for 30; 5–40 for 40; and 5–50 for 50. The self-adaptation of the MFOA helps avoid premature convergence, owing to the diverse population sizes, and reduces the processing time in comparison with the conventional FOA. Figure 11 illustrates the convergence rates of the GA, MFOA, and conventional FOA for the uniform linear array with a broadside radiation pattern. The figure shows that the MFOA with adaptive population size performs well in searching for the best global solution, that is, the maximum directivity, whereas the FOA becomes trapped in a local solution. In addition, the MFOA and GA converge to the same solution with a stable convergence rate. Although the solutions are identical, the proposed MFOA requires less memory and time as the population size increases, in comparison with the conventional FOA.

Figure 10: The population size relative to iteration of the conventional FOA and MFOA for a maximum iteration of 50: (a) population sizes are 10 and 5–10 for the conventional and modified FOA, (b) 20 and 5–20, (c) 30 and 5–30, (d) 40 and 5–40, and (e) 50 and 5–50.
Figure 11: The convergence rates of the GA, MFOA with adaptive population size, and conventional FOA for uniform linear array with a broadside radiation pattern.

Figure 12 shows the optimum radiation pattern of the large broadside antenna array, for which the optimal number of elements is 81 together with the corresponding optimal element spacing. Under these optimal conditions, the directivity is 21 dBi with a low side lobe level.

Figure 12: The optimal radiation pattern of the large broadside antenna array from the MFOA with adaptive population size.

To gain insight into the design procedure, a 9-element broadside linear array with nonuniform amplitudes is considered using the proposed optimization and compared with the GA and conventional FOA. The fitness function (or smell function) is again the maximum directivity, and the spacing between elements is uniform. In the numerical results, the GA and MFOA both find the optimum amplitudes of the 9 elements to be 0.39, 0.85, 0.79, 0.89, 1.00, 0.89, 0.79, 0.85, and 0.39, respectively, and the obtained maximum directivity is 9.23 dBi. Figure 13 depicts the convergence rates of the GA, MFOA, and conventional FOA. The conventional FOA converges to a different, worse fitness value, whereas the GA and MFOA converge to the best directivity within 4 and 5 iterations, respectively. The optimal radiation pattern of the 9-element nonuniform-amplitude broadside linear array is shown in Figure 14.
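As a consistency check, the reported amplitudes can be substituted into the array factor directly. Half-wavelength spacing is our assumption here; under it, the numerically integrated directivity reproduces the 9.23 dBi reported above.

```python
import numpy as np

# Reported optimum amplitudes of the 9-element broadside array (from the text).
amps = np.array([0.39, 0.85, 0.79, 0.89, 1.00, 0.89, 0.79, 0.85, 0.39])
d = 0.5  # element spacing in wavelengths -- an assumption, not taken from the text

# Array factor over theta in [0, pi] for a broadside array along the z-axis.
theta = np.linspace(0.0, np.pi, 3601)
psi = np.outer(np.cos(theta), 2 * np.pi * d * np.arange(amps.size))
af2 = np.abs((amps * np.exp(1j * psi)).sum(axis=1))**2

# Directivity: peak intensity over average intensity (trapezoidal integration).
integrand = af2 * np.sin(theta)
dth = theta[1] - theta[0]
p_rad = 0.5 * dth * (integrand[1:] + integrand[:-1]).sum()
d0_dbi = 10 * np.log10(af2.max() / (p_rad / 2.0))
print(round(d0_dbi, 2))  # -> 9.23, matching the reported directivity
```

The symmetric amplitude taper places the main beam exactly at broadside (theta = 90 degrees) and lowers the side lobes relative to uniform excitation, at a small cost in directivity.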

Figure 13: The convergence rates of the GA, MFOA with adaptive population size, and conventional FOA for nonuniform amplitude of broadside linear array with 9 elements.
Figure 14: The optimal radiation pattern of nonuniform amplitude of broadside linear array with 9 elements from the MFOA with adaptive population size.

5. Conclusion

The modified fruit fly optimization algorithm (MFOA) with adaptive population size can be effectively applied to optimize an electrically large antenna array. The algorithm is applied to the large antenna array to determine the optimal number of elements and element spacing that yield the maximum broadside directivity. The numerical results show that the MFOA and GA converge to the same solution with a stable convergence rate, whereas the FOA does not converge. For the linear array with uniform spacing and nonuniform amplitudes, the MFOA is employed to determine the amplitudes of all 9 elements, and the optimum amplitudes are found within 5 iterations. The MFOA has a slower convergence rate than the GA; nevertheless, the advantages of the proposed algorithm are easy implementation, a stable convergence rate, a large search range, less processing time, and a reduced memory requirement, whereas for the GA it is difficult to set the initial optimization parameters.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The part of the research undertaken by Nattaset Mhudtongon was funded by the Royal Golden Jubilee Ph.D. Program under Grant no. PHD/0331/2551 and that under the care of Supakit Kawdungta was sponsored by the Thailand Research Fund (TRF) and Rajamangala University of Technology Lanna under Grant no. TRG5780080. The authors would like to extend deep gratitude to the aforementioned organizations for the financial support, without which this research would not have materialized.

References

  1. S. Kawdungta, C. Phongcharoenpanich, and D. Torrungrueng, “A novel analysis of planar dipole antenna arrays in free space with the multiple-sweep method of moments,” Electromagnetics, vol. 31, no. 4, pp. 258–272, 2011. View at Publisher · View at Google Scholar · View at Scopus
  2. S. Kawdungta, C. Phongcharoenpanich, and D. Torrungrueng, “An analysis of electrically large planar dipole antenna arrays with an efficient hybrid MSMM/CG method,” Journal of Electromagnetic Waves and Applications, vol. 25, no. 2, pp. 189–202, 2011. View at Publisher · View at Google Scholar · View at Scopus
  3. F. J. Ares-Pena, J. A. Rodriguez-Gonzalez, E. Villanueva-Lopez, and S. R. Rengarajan, “Genetic algorithms in the design and optimization of antenna array patterns,” IEEE Transactions on Antennas and Propagation, vol. 47, no. 3, pp. 506–510, 1999. View at Publisher · View at Google Scholar · View at Scopus
  4. J. J. Grefenstette, “Optimization of control parameters for genetic algorithms,” IEEE Transactions on Systems, Man and Cybernetics, vol. 16, no. 1, pp. 122–128, 1986. View at Publisher · View at Google Scholar · View at Scopus
  5. A. Hoorfar, “Evolutionary programming in electromagnetic optimization: a review,” IEEE Transactions on Antennas and Propagation, vol. 55, no. 3, pp. 523–537, 2007. View at Publisher · View at Google Scholar · View at Scopus
  6. W. T. Li, X. W. Shi, and Y. Q. Hei, “An improved particle swarm optimization algorithm for pattern synthesis of phased arrays,” Progress in Electromagnetics Research, vol. 82, pp. 319–332, 2008. View at Publisher · View at Google Scholar · View at Scopus
  7. D. I. Abu-Al-Nadi, T. H. Ismail, H. Al-Tous, and M. J. Mismar, “Design of linear phased array for interference suppression using array polynomial method and particle swarm optimization,” Wireless Personal Communications, vol. 63, no. 2, pp. 501–513, 2012. View at Publisher · View at Google Scholar · View at Scopus
  8. K. A. Papadopoulos, C. A. Papagianni, P. K. Gkonis, I. S. Venieris, and D. I. Kaklamani, “Particle swarm optimization of antenna arrays with efficiency constraints,” Progress in Electromagnetics Research M, vol. 17, pp. 237–251, 2011. View at Publisher · View at Google Scholar · View at Scopus
  9. Ó. Quevedo-Teruel and E. Rajo-Iglesias, “Ant colony optimization in thinned array synthesis with minimum side lobe level,” IEEE Antennas and Wireless Propagation Letters, vol. 5, no. 1, pp. 349–352, 2006. View at Publisher · View at Google Scholar · View at Scopus
  10. V. Murino, A. Trucco, and C. S. Regazzoni, “Synthesis of unequally spaced arrays by simulated annealing,” IEEE Transactions on Signal Processing, vol. 44, no. 1, pp. 119–123, 1996. View at Publisher · View at Google Scholar · View at Scopus
  11. W.-T. Pan, “A new fruit fly optimization algorithm: taking the financial distress model as an example,” Knowledge-Based Systems, vol. 26, pp. 69–74, 2012. View at Publisher · View at Google Scholar · View at Scopus
  12. H. Dai, G. Zhao, J. Lu, and S. Dai, “Comment and improvement on ‘a new fruit fly optimization algorithm: taking the financial distress model as an example’,” Knowledge-Based Systems, vol. 59, pp. 159–160, 2014. View at Publisher · View at Google Scholar · View at Scopus
  13. S.-M. Lin, “Analysis of service satisfaction in web auction logistics service using a combination of fruit fly optimization algorithm and general regression neural network,” Neural Computing and Applications, vol. 22, no. 3-4, pp. 783–791, 2013. View at Publisher · View at Google Scholar · View at Scopus
  14. H.-Z. Li, S. Guo, C.-J. Li, and J.-Q. Sun, “A hybrid annual power load forecasting model based on generalized regression neural network with fruit fly optimization algorithm,” Knowledge-Based Systems, vol. 37, pp. 378–387, 2013. View at Publisher · View at Google Scholar · View at Scopus
  15. W. Sheng and Y. Bao, “Fruit fly optimization algorithm based fractional order fuzzy-PID controller for electronic throttle,” Nonlinear Dynamics, vol. 73, no. 1-2, pp. 611–619, 2013. View at Publisher · View at Google Scholar · View at MathSciNet · View at Scopus
  16. Y. F. Xing, “Design and optimization of key control characteristics based on improved fruit fly optimization algorithm,” Kybernetes, vol. 42, no. 3, pp. 466–481, 2013. View at Publisher · View at Google Scholar · View at Scopus
  17. Z. Z. Abidin, M. R. Arshad, and U. K. Ngah, “A simulation based fly optimization algorithm for swarms of mini autonomous surface vehicles application,” Indian Journal of Marine Sciences, vol. 40, no. 2, pp. 250–266, 2011. View at Google Scholar · View at Scopus
  18. L. Wang, X.-L. Zheng, and S.-Y. Wang, “A novel binary fruit fly optimization algorithm for solving the multidimensional knapsack problem,” Knowledge-Based Systems, vol. 48, pp. 17–23, 2013. View at Publisher · View at Google Scholar · View at Scopus
  19. F. Xu and Y. Tao, “The improvement of fruit fly optimization algorithm,” in Proceedings of the 2nd International Conference on Computer and Information Application (ICCIA '12), pp. 1516–1520, Taiyuan, China, 2012.
  20. S. K. Pal, C. S. Rai, and A. P. Singh, “Comparative study of firefly algorithm and particle swarm optimization for noisy non-linear optimization problems,” International Journal of Intelligent Systems and Applications, vol. 4, no. 10, pp. 50–57, 2012. View at Publisher · View at Google Scholar
  21. C. A. Balanis, Antenna Theory: Analysis and Design, John Wiley & Sons, New York, NY, USA, 2nd edition, 1997.