Journal of Applied Mathematics
Volume 2013 (2013), Article ID 139464, 20 pages
Cellular Harmony Search for Optimization Problems
1School of Computer Sciences, Universiti Sains Malaysia, 11800 Penang, Malaysia
2Department of Information Technology, Al-Huson University College, Al-Balqa Applied University, P.O. Box 50, Al-Huson, Irbid, Jordan
3Department of Computer Science, Jadara University, P.O. Box 733, Irbid, Jordan
Received 8 July 2013; Revised 15 August 2013; Accepted 16 August 2013
Academic Editor: Zong Woo Geem
Copyright © 2013 Mohammed Azmi Al-Betar et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Structured population in evolutionary algorithms (EAs) is an important research track in which an individual interacts only with its neighboring individuals in the breeding step. The main rationale is to maintain a high level of diversity and so counteract genetic drift. Cellular automata concepts have been embedded into EAs to provide a decentralized method that preserves the population structure. Harmony search (HS) is a recent EA that considers all individuals in the breeding step. In this paper, cellular automata concepts are embedded into the HS algorithm to produce a new version called cellular harmony search (cHS). In cHS, the population is arranged as a two-dimensional toroidal grid, where each individual in the grid is a cell that interacts only with its neighbors. The memory consideration and population update are modified according to cellular EA theory. Experimental results using benchmark functions show that embedding cellular automata concepts into the HS processes directly affects performance. Finally, a parameter sensitivity analysis of the cHS variation is carried out, and a comparative evaluation shows the success of cHS.
Optimization techniques navigate the search space using effective operators driven by control parameters. The key to the success of any optimization method is its ability to strike a suitable balance between exploration (diversification) and exploitation (intensification) of the problem search space. Exploration is the capability of navigating promising regions of the search space, if necessary, while exploitation refers to the capability of fine-tuning the already-navigated regions so as to converge to local optima.
The harmony search (HS) algorithm is a recent evolutionary algorithm (EA) proposed by Geem et al. to imitate the musical improvisation process. It has several advantages over other optimization methods: it stipulates fewer mathematical requirements in the initial search; it has a novel stochastic derivative that reduces the number of iterations required to converge towards local minima; and it is simple, adaptable, general, and scalable. Therefore, the HS algorithm has been intensively tailored for several optimization problems such as timetabling [6, 7], nurse rostering, space allocation, and many others [10–13]. However, due to the complex nature of some optimization problems and to avoid premature convergence, HS theories have been modified. Furthermore, control parameter adaptation for HS has also been studied [5, 15–17].
In a procedural context, HS is an iterative improvement algorithm that starts with a population of random individuals stored in the harmony memory (HM). At each iteration, a new individual is generated using three operators: (i) memory consideration, which selects the variables of the new individual from the individuals stored in HM; (ii) pitch adjustment, which is responsible for local improvement; and (iii) random consideration, which provides random elements for the new individual. The new individual is then evaluated, and it replaces the worst individual in HM if it is better. This process is cycled until a desired condition is met (see Table 3).
Like other EAs, the HS algorithm interacts with all individuals in the HM during each breeding step. The update process selects the worst individual from a single HM and replaces it with the new one, if better. Decentralized methods used in other structured EAs have been shown to improve EA performance. Examples include the cellular genetic algorithm (cGA), distributed EA (dEA), cellular PSO, and others. The main idea of these structured methods is to partition the population into several sets with common features.
The cellular genetic algorithm (cGA), in particular, is a decentralized method in which the population is represented as a two-dimensional toroidal grid, as shown in Figure 3 [23, 24]. The individuals are located on this torus in a predefined topology and interact only with their nearest neighbors in the breeding step. Note that all neighborhoods have the same size and identical shape. This concept, embedded in cGA, provides useful advantages for the optimization domain and for parallel implementations [25, 26], because it helps maintain a high level of diversity and yields a slow diffusion of solutions through the search. The cGA provides proper exploitation power inside each individual's neighborhood through the GA operators. Theoretically, it has been shown that nonrandom mating keeps genetic diversity at a higher level, thus preventing the algorithm from converging prematurely to local optima.
The main objective of this paper is to embed cellular automata concepts in the HS algorithm optimization framework, where the HM is arranged as a two-dimensional toroidal grid. The population diffusion is thereby preserved, maintaining a high level of diversity during the search and avoiding genetic drift. The improvisation process of the HS algorithm is adjusted to interact with the neighborhood of a specific individual, and the HM update is performed within that neighborhood. The results show that the new decentralized version of HS (i.e., cHS) improves the performance of HS on standard benchmark functions.
The rest of the paper is organized as follows. The basics of HS algorithm are described in Section 2. The proposed cellular harmony search (cHS) is discussed in Section 3. Results of the experiments are presented in Section 4. Finally, the conclusion and promising future research directions are provided in Section 5.
2. The Harmony Search Algorithm
The harmony search (HS) algorithm is a recent evolutionary approach proposed by Geem et al. It starts with a set of individuals stored in an augmented matrix called the harmony memory (HM). It is a centralized algorithm: at each breeding step, it generates a new individual by interacting with all individuals in HM. HS follows three rules in the breeding step to generate a new individual: memory consideration, random consideration, and pitch adjustment. If the new individual is better than the worst individual in the whole HM, the replacement process is triggered. This process is repeated until a stopping criterion is met. The steps of the HS algorithm are flowcharted in Figure 1 and described below in more detail.
Step 1 (initialize parameters). The optimization problem is initially represented as min f(x), x = (x_1, x_2, ..., x_N), where f(x) is the objective function and x is the set of decision variables. Each decision variable x_i takes values from the range [LB_i, UB_i], where LB_i and UB_i are the lower and upper bounds of x_i, respectively, and N is the number of decision variables. The parameters of the HS algorithm are also predefined in this step:
(a) the harmony memory consideration rate (HMCR), used in the breeding step to determine whether the value of a decision variable is to be selected from the individuals stored in the harmony memory (HM);
(b) the harmony memory size (HMS), which determines the number of individuals in HM;
(c) the pitch adjustment rate (PAR), which decides whether decision variables selected from memory are further adjusted;
(d) the distance bandwidth (BW), which determines the distance of the adjustment applied in the pitch adjustment operator;
(e) the number of improvisations (NI), which is analogous to the number of generations.
Step 2 (initialize the harmony memory (HM)). The harmony memory (HM) is a matrix of size HMS × N which stores the set of individuals determined by HMS (see (1)). In this step, these individuals are randomly generated as x_i^j = LB_i + U(0,1) × (UB_i − LB_i), for all i = 1, ..., N and all j = 1, ..., HMS, where U(0,1) generates a uniform random number between 0 and 1.
Step 3 (improvise a new individual). The HS algorithm generates a new individual, x' = (x'_1, x'_2, ..., x'_N), using three operators: memory consideration, random consideration, and pitch adjustment.
Memory Consideration. In memory consideration, the value of the first decision variable x'_1 in the new individual is randomly selected from the historical values {x_1^1, x_1^2, ..., x_1^HMS} stored across all HM individuals. The values of the other decision variables, x'_2, ..., x'_N, are sequentially assigned in the same way with a probability of HMCR, where HMCR ∈ [0, 1]. It is worth mentioning that this process interacts with all individuals in HM, which might lead to premature convergence due to genetic drift.
Random Consideration. Decision variables that are not assigned values by memory consideration are randomly assigned from their possible range by random consideration with probability (1 − HMCR), as follows: x'_i = LB_i + U(0,1) × (UB_i − LB_i).
Pitch Adjustment. Each decision variable x'_i of the new individual that has been assigned a value by memory consideration is pitch adjusted with probability PAR, where PAR ∈ [0, 1]. If the decision for x'_i is Yes, its value is shifted to a neighboring value as follows: x'_i = x'_i ± U(0,1) × BW.
Step 4 (update HM). If the new individual, x', has a better objective function value than the worst individual stored in HM (i.e., the last individual, in case HM is sorted), the worst individual is replaced by the new one. Note that the worst individual is selected from the whole HM; the decentralized structure of the population is not observed.
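Putting Steps 1 to 4 together, the classic HS loop can be sketched as follows. This is a minimal Python illustration, not the authors' code; the parameter defaults are illustrative values in the ranges discussed later in the paper.

```python
import random

def harmony_search(f, lb, ub, hms=25, hmcr=0.9, par=0.3, bw=0.01, ni=20000):
    """Minimal classic harmony search minimizing f over the box [lb, ub]."""
    n = len(lb)
    # Step 2: fill the harmony memory with random individuals.
    hm = [[lb[i] + random.random() * (ub[i] - lb[i]) for i in range(n)]
          for _ in range(hms)]
    cost = [f(x) for x in hm]
    for _ in range(ni):
        new = []
        for i in range(n):
            if random.random() < hmcr:
                # Memory consideration: take the value from any stored individual.
                value = hm[random.randrange(hms)][i]
                if random.random() < par:
                    # Pitch adjustment: shift to a neighboring value within BW.
                    value += random.uniform(-1.0, 1.0) * bw
                    value = min(max(value, lb[i]), ub[i])
            else:
                # Random consideration: draw uniformly from the variable's range.
                value = lb[i] + random.random() * (ub[i] - lb[i])
            new.append(value)
        # Step 4: replace the worst HM individual if the new one is better.
        worst = max(range(hms), key=cost.__getitem__)
        fc = f(new)
        if fc < cost[worst]:
            hm[worst], cost[worst] = new, fc
    best = min(range(hms), key=cost.__getitem__)
    return hm[best], cost[best]

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = harmony_search(sphere, [-100.0] * 2, [100.0] * 2)
```

Note that the replacement step scans the whole HM for the worst individual; this is exactly the centralized behavior that cHS later restricts to a neighborhood.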
3. Cellular Harmony Search (cHS) Algorithm
There has been much interest from researchers and scientists in different fields in exploiting cellular automata (CA) in physics, biology, social science, computer science, and so on. The initial concepts of CA were developed by von Neumann, and CA have been an effective research tool incorporated into a wide variety of disciplines.
The concepts of cellular automata (CA) are normally concerned with the individual's perspective. The main idea of CA is to give the population a particular structure, formulated as a toroidal grid. A cell in the toroidal grid refers to an individual that communicates with its closest neighboring individuals, so that all individuals have exactly the same number of neighbors. This leads to a kind of locality known as isolation by distance. Normally, the Manhattan distance is used to measure the distance between an individual and its neighbors. Note that the neighborhoods have identical shapes and the same size.
There exist two different kinds of cellular models, distinguished by how the breeding cycle is applied to the individuals. If the cycle is applied to all individuals at the same time, the cellular model is said to be synchronous: the individuals of the next generation are built simultaneously. If, on the other hand, the individuals of the population are updated sequentially under a particular order policy, the model is asynchronous. For more discussion of the theory of cellular automata, see [22, 29].
The cellular harmony search (cHS) algorithm can be considered a new decentralized variation of HS that hinges on a structured HM. The individuals in the HM are arranged in a two-dimensional toroidal grid. This is meant to keep a high level of diversity during the breeding step and thus to increase the chance of converging to the global minimum, which is achieved by avoiding genetic drift and providing more suitable population diffusion during the search.
Several steps of the original HS algorithm presented in Section 2 have been adjusted in cHS. The adjustments are flowcharted in Figure 2. The breeding step of cHS interacts only with the neighboring individuals of a randomly selected individual, using the "cellular memory consideration" operator. The update-HM step is adjusted to replace the worst individual among those neighboring individuals, rather than the worst in the whole HM. Figure 3 shows the population on the cellular structure (toroidal grid population), inspired by the concept of small neighborhood regions in cellular automata. In particular, the overlap of the neighborhoods provides an implicit migration mechanism within the population: the best solutions diffuse smoothly through the whole population, and the diversity of cellular harmony search is preserved throughout the search. The cellular HS model includes several components:
(1) cell: the randomly selected individual in the population (the number of individuals is HMS);
(2) cell space: the set of all individuals in HM;
(3) neighborhood: the set of potential mates of any individual;
(4) neighborhood shape: the way the neighbors of the cell are selected, as seen in Figure 5;
(5) discrete time limit: the number of generations of the HS algorithm, normally determined by the NI parameter.
The detailed steps of cHS are discussed in the steps below.
Step 1 (initialize cHS parameters and optimization problem). It is clear that the successful search of any metaheuristic method is based on skillful parameter setting, and the parameters have different effects on the optimization solutions. The parameters of cHS are the harmony memory size (HMS), harmony memory consideration rate (HMCR), pitch adjustment rate (PAR), number of improvisations (NI), and the size of the neighborhood (NH), determined by the cellular structure (see Figure 5); for example, under the L9 shape each cell has 8 neighbors.
Step 2 (initialize the harmony memory (HM)). This step is the same as Step 2 in the original version of HS. Note that the individuals of the HM are arranged as a two-dimensional toroidal grid as shown in Figure 4.
Step 3 (initialize the neighborhood matrix (NM)). NM is a binary matrix of size HMS × HMS (see (4)). The binary values are assigned to each element based on the NH parameter. This matrix is used during the breeding step to determine the neighboring individuals of any randomly selected individual; NM is filled with binary values as in (5).
The set N(j) contains all neighboring individuals of individual j as arranged in the two-dimensional toroidal mesh. This set is determined by a neighborhood shape, as seen in Figure 5.
Figure 4 shows how the HM individuals are mapped to the toroidal grid. Note that the first element of the grid Y holds the index of individual 1 in HM, while the last element holds index HMS. The HMS value is the square of K (i.e., HMS = K × K).
To map an element (r, c) in the toroidal grid Y to the corresponding individual index j in HM, the following row-major mapping is used: j = (r − 1) × K + c.
To map the index j of an individual in HM back to the element (r, c) in the grid, the inverse mapping is used: r = ⌈j / K⌉ and c = j − (r − 1) × K.
This mapping between the individual indexes of HM and the elements of Y makes it easy to determine the neighbors of any individual in HM. As shown in Figure 5, several neighborhood shapes can be used. For example, the L5 shape takes the nearest neighbors of a given cell in the axial directions. Therefore, to determine the HM indexes that belong to the neighbors of individual j under L5, the coordinates (r, c) of j are first computed using the inverse mapping, the neighboring grid elements are collected into the set N(j), and the forward mapping is then applied to convert those elements of Y back to HM indexes. The same procedure applies if a different neighborhood shape (e.g., L9, C9, ...) is used to determine the neighboring individuals of j.
As can be noted, the individuals in the set N(j) are assigned the value 1 in row j of NM, while all others are assigned 0.
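Under the row-major mapping sketched above (a reconstruction, since equations (4) to (8) are not reproduced here), the index conversions, the L5 neighborhood on a K × K torus, and a row of NM can be written as:

```python
def grid_to_index(r, c, k):
    """Map grid element (r, c) (1-based) to an HM individual index (1-based)."""
    return (r - 1) * k + c

def index_to_grid(j, k):
    """Inverse mapping: HM index j (1-based) to grid coordinates (r, c)."""
    r = (j - 1) // k + 1
    c = j - (r - 1) * k
    return r, c

def l5_neighbors(j, k):
    """HM indexes of the 4 axial (N, S, E, W) neighbors of j on the K x K torus."""
    r, c = index_to_grid(j, k)
    wrap = lambda v: (v - 1) % k + 1  # toroidal wrap-around into 1..K
    shifts = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    return [grid_to_index(wrap(r + dr), wrap(c + dc), k) for dr, dc in shifts]

def nm_row(j, k):
    """Row j of the binary neighborhood matrix NM for the L5 shape."""
    hms = k * k
    neigh = set(l5_neighbors(j, k))
    return [1 if i in neigh else 0 for i in range(1, hms + 1)]
```

The wrap-around in `l5_neighbors` is what makes the grid toroidal: every cell, including those on the border, has exactly NH neighbors.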
Step 4 (select a random individual from HM). A random individual j ∈ {1, ..., HMS} is selected, and its neighboring individuals are taken from the matrix NM (i.e., the set N(j), whose size is NH). The neighboring individuals interact together to generate the new individual in the next step.
Step 5 (generate a new individual). In this step, a new individual, x', is generated based on three operators: cellular memory consideration, pitch adjustment, and random consideration. The whole process is drawn in Figure 6.
The pitch adjustment and random consideration operators in the cHS algorithm are the same as those in the original HS algorithm. However, memory consideration is modified to be in line with the concepts of cellular automata, as follows.
Cellular Memory Consideration. As in cellular GA, cellular memory consideration interacts only with the individuals close to the selected individual j, taken from the set N(j); the other individuals are not used in the breeding step. The value of the first decision variable x'_1 is randomly selected from the historical values stored in the individuals of the set N(j). The other decision variables, x'_2, ..., x'_N, are sequentially assigned in the same way with probability HMCR.
It is worth mentioning that cellular memory consideration is able to control the diffusion between the individuals in HM and thus to preserve the diversity of cHS as the search proceeds. With this strategy, the population is structured, and the numerical behavior of the cHS algorithm can be improved.
Step 6 (update the harmony memory). This step is modified in the cHS algorithm. The worst individual among the set of neighbors N(j) is replaced by the new individual x', if the latter is better. Note that the replacement process considers only the neighboring individuals in the set N(j).
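Steps 4 to 6 can be sketched as a single breeding step. This is an illustrative Python sketch, not the authors' implementation; `neighbors_of` is a hypothetical callback that returns the 0-based indexes of the set N(j) for any cell j, so any neighborhood shape (L5, L9, C9, ...) can be plugged in.

```python
import random

def chs_breed_step(f, hm, cost, neighbors_of, j, lb, ub,
                   hmcr=0.98, par=0.3, bw=0.01):
    """One cHS breeding step around the randomly chosen cell j.

    hm and cost are modified in place; neighbors_of(j) returns the
    0-based indexes of j's grid neighbors (the set N(j))."""
    n = len(lb)
    neigh = neighbors_of(j)
    new = []
    for i in range(n):
        if random.random() < hmcr:
            # Cellular memory consideration: draw only from the neighbor set.
            value = hm[random.choice(neigh)][i]
            if random.random() < par:
                # Pitch adjustment, as in classic HS.
                value += random.uniform(-1.0, 1.0) * bw
                value = min(max(value, lb[i]), ub[i])
        else:
            # Random consideration, as in classic HS.
            value = lb[i] + random.random() * (ub[i] - lb[i])
        new.append(value)
    # Update within the neighborhood: replace the worst neighbor if improved.
    worst = max(neigh, key=lambda idx: cost[idx])
    fc = f(new)
    if fc < cost[worst]:
        hm[worst], cost[worst] = new, fc
```

The only differences from the classic HS step are the source of memory-considered values (N(j) instead of the whole HM) and the scope of the replacement (the worst neighbor instead of the global worst).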
Step 7 (check stopping criterion). In this step, cHS stops if the maximum number of iterations (i.e., NI) is reached; otherwise the algorithm repeats Steps 4 to 6. The pseudocode of cHS is given in Algorithm 1.
4. Experimental and Comparative Evaluation
In this section, the cHS algorithm is evaluated using benchmark functions commonly used in the literature to evaluate different variations of the HS algorithm. A comparative evaluation is presented, and a sensitivity analysis of the control parameters of the proposed method is carried out.
4.1. Comparative Evaluation
In this section, the set of test functions designed for the special session on real-parameter optimization organized at the 2005 IEEE Congress on Evolutionary Computation (CEC 2005) is used. CEC 2005 comprises 25 test functions, including 5 unimodal and 20 multimodal functions, as shown in Table 1. A full discussion of these functions is given by Suganthan et al. CEC 2005 also provides a suitable number of comparative methods against which the proposed method can be evaluated, as abbreviated in Table 2.
The experiments followed the conditions of CEC 2005, with 25 repeated runs performed for each test function. The runs are summarized in terms of the average error (AE) of the best individual, that is, the difference between the best objective value found and the known optimum, averaged over the 25 runs. The cHS performs 100,000 evaluations of the fitness function per run.
Notably, most of the winning comparative methods are hybrid versions of a particular EA, and their results on the tested functions are very efficient. It appears that the AE values of HS and cHS are very close to, and sometimes better than, those achieved by the comparative methods. In particular, the HS algorithm achieves very strong results for most test functions and exceeds some of the best results reported by the comparative methods, obtaining the smallest AE on several of them.
It is noted that the AE obtained by cHS is not the best in most cases. This is because the main idea of cHS is to ensure a high level of diversity, and thus cHS requires a higher number of evaluations to reach better results. However, the results obtained by cHS are very close to the winning results.
4.2. Sensitivity Analysis of cHS Parameters
All the experiments are run on a computer with a 2.66 GHz Intel Core 2 Quad processor and 4 GB of RAM. The operating system used is Microsoft Windows Vista Enterprise Service Pack 1. The source code is implemented in MATLAB 7.6 (R2008a).
The common parameters among all algorithms used in the experiments are set based on empirical guidelines. For the sake of studying the effect of different parameter settings, a fixed default parameter setting is used when evaluating the cHS method, and the neighborhood shape used is C9, presented in Figure 5.
All functions are implemented in 30 dimensions (30D). For the scalability study in Section 4.2.6, the functions are implemented in 10 dimensions (10D), 50 dimensions (50D), and 100 dimensions (100D). The stopping criterion used is the maximum number of improvisation (NI).
All the results in this section are presented in Tables 4 to 8, demonstrating the mean and standard deviation for thirty independent runs. The best solution has been highlighted in bold font. A comparative analysis between cHS algorithm and the original variant of HS for the common parameters is also conducted.
4.2.1. Benchmark Functions
Global minimization benchmark functions are used to study the sensitivity of the parameters of the proposed method (cHS) against the original HS algorithm. Five of the functions were defined by Whitley et al. and the other five were described by Yao et al. Together they provide a balance between unimodal and multimodal functions, and they are commonly used to evaluate state-of-the-art variations of harmony search algorithms [5, 44, 45].
Most of the benchmark functions have a standard solution-space range for the objective function; otherwise, unsymmetrical initialization ranges are used for those functions whose global optima lie at the center of the solution space. These benchmark functions are as follows.
Sphere function, defined as f1(x) = Σ_{i=1}^{n} x_i^2, with global optimum f1(x*) = 0 at x* = (0, ..., 0) and search range [−100, 100]^n (Figure 7(a)).
Schwefel's problem 2.22, defined as f2(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i|, with global optimum f2(x*) = 0 at x* = (0, ..., 0) and search range [−10, 10]^n (Figure 7(b)).
Step function, defined as f3(x) = Σ_{i=1}^{n} (⌊x_i + 0.5⌋)^2, with global optimum f3(x*) = 0 and search range [−100, 100]^n (Figure 7(c)).
Rosenbrock function, defined as f4(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i^2)^2 + (x_i − 1)^2], with global optimum f4(x*) = 0 at x* = (1, ..., 1) and search range [−30, 30]^n (Figure 7(d)).
Rotated hyper-ellipsoid function, defined as f5(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)^2, with global optimum f5(x*) = 0 at x* = (0, ..., 0) and search range [−100, 100]^n (Figure 7(e)).
Generalized Schwefel's problem 2.26, defined as f6(x) = −Σ_{i=1}^{n} x_i sin(√|x_i|), with global optimum f6(x*) ≈ −418.9829 n at x_i ≈ 420.9687 and search range [−500, 500]^n (Figure 7(f)).
Rastrigin function, defined as f7(x) = Σ_{i=1}^{n} [x_i^2 − 10 cos(2πx_i) + 10], with global optimum f7(x*) = 0 at x* = (0, ..., 0) and search range [−5.12, 5.12]^n (Figure 7(g)).
Ackley's function, defined as f8(x) = −20 exp(−0.2 √((1/n) Σ_{i=1}^{n} x_i^2)) − exp((1/n) Σ_{i=1}^{n} cos(2πx_i)) + 20 + e, with global optimum f8(x*) = 0 at x* = (0, ..., 0) and search range [−32, 32]^n (Figure 7(h)).
Griewank function, defined as f9(x) = (1/4000) Σ_{i=1}^{n} x_i^2 − Π_{i=1}^{n} cos(x_i/√i) + 1, with global optimum f9(x*) = 0 at x* = (0, ..., 0) and search range [−600, 600]^n (Figure 7(i)).
Six-hump camel-back function, defined as f10(x) = 4x_1^2 − 2.1x_1^4 + (1/3)x_1^6 + x_1 x_2 − 4x_2^2 + 4x_2^4, with global optimum f10(x*) = −1.0316 at (x_1, x_2) = (±0.0898, ∓0.7126) and search range [−5, 5]^2 (Figure 7(j)).
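For reference, a few of these benchmarks can be written in Python from the standard definitions above; this is an illustrative sketch for checking implementations, not the authors' code.

```python
import math

def sphere(x):
    """f1: sum of squares; global minimum 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """f7: highly multimodal; global minimum 0 at the origin."""
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def ackley(x):
    """f8: global minimum 0 at the origin."""
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / n
    return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e

def griewank(x):
    """f9: global minimum 0 at the origin."""
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1.0
```

A quick sanity check is to evaluate each function at its known optimum and confirm the stated minimum value, e.g. `sphere([0.0] * 30)` returns 0.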
4.2.2. The Effect of HMCR
Table 4 summarizes the effect of HMCR on the performance of cHS and the original version of HS using four values of HMCR (i.e., 0.7, 0.9, 0.94, and 0.98). The best solution obtained at each corresponding value is highlighted in bold font. Figure 8 is the box plot visualizing the effect of various values of HMCR on the behavior of cHS algorithm using the ten global minimization functions.
The results show that increasing the HMCR value improves the performance of cHS for all functions except the six-hump camel-back function, where the opposite is true. When a small HMCR value is used, diversity increases and prevents cHS from converging (i.e., the search becomes almost random). Thus it is generally better to use a large value for HMCR (e.g., 0.98).
A high HMCR value means a high probability of using the harmony memory, which leads to less exploration of the search space; an HMCR close to 1 might lead the algorithm to fall into local minima. A lower HMCR allows more randomly generated solutions, so diversity increases in a way that prevents convergence. The results reveal that cHS and HS have similar sensitivity to the different HMCR values for all functions, and at an HMCR of 0.98 both obtain the best results for the majority of the optimization functions. Furthermore, the results produced by cHS are better than those produced by HS on almost all tested functions.
4.2.3. The Effect of HMS
Table 5 summarizes the effect of HMS on the performance of cHS and basic HS using four values of HMS (i.e., 16, 25, 36, and 100). The best solution obtained at each corresponding value is highlighted in bold font. Figure 9 is the box plot visualizing the effect of various values of HMS on the behavior of cHS algorithm using the ten global minimization functions.
It is revealed that both cHS and the basic HS are not very sensitive to HMS. For some functions, cHS obtained the best results when the value of HMS is large (see Figure 9). Across the different values of HMS, cHS performs better on most of the functions because the overlap of the neighborhoods provides an implicit migration mechanism. Since the best solutions spread smoothly through the whole population, the diversity of the structured cHS population is preserved longer than in the classical HS algorithm.
HM is analogous to the short-term memory of a musician, which is known to be small. A plausible interpretation is the high number of similar harmonies within the HM when HMS is large, which reduces diversity and hence leads to falling into local minima. With the cellular structure, cHS is therefore more capable of maintaining diversity than HS.
4.2.4. The Effect of PAR
Table 6 summarizes the effect of PAR on the performance of cHS and basic HS using four values of PAR (0.1, 0.3, 0.7, and 0.9). The best solution at each corresponding value is highlighted in bold font. Figure 10 is the box plot visualizing the effect of various values of PAR on the behavior of cHS algorithm using the ten global minimization functions.
It seems that using a relatively small value of PAR (i.e., ≤0.5) improves the performance of cHS and HS. A large value of PAR can increase the convergence speed of the HS algorithm, while a small value of PAR increases diversity in HM: the small value allows more exploration of the search space, whereas the large value lowers the exploration rate, reduces diversity, and may trap the algorithm in local optima. It is observed that cHS obtains better results than the original HS algorithm for some benchmark functions, but not for the others.
4.2.5. The Effect Number of Neighbors NH
Table 7 summarizes the effect of the number of neighbors (NH), determined by the cellular structure, on the performance of cHS across the benchmark functions using four values of NH. The best solution at each corresponding value is highlighted in bold font, as are the best results when comparing cHS with the original HS. Figure 11 is the box plot visualizing the effect of the various NH values on the behavior of the cHS algorithm using the ten global minimization functions.
The results show that cHS obtained the best result when the number of neighbors is large, except for one function, which obtained its best result with a small neighborhood. Generally, this leads to the conclusion that cHS often performs better at larger values of NH, between 8 and 12. Figure 11 shows box plots of the recorded results, revealing the distribution of the 30 test runs against the various values of NH.
4.2.6. Scalability Study
In this section, the results produced by cHS and HS when the dimension of the function is set to 10, 30, 50, and 100, as shown in Table 8, are recorded.
In general, decreasing the dimensionality leads to better results for both cHS and HS, in line with previous theory. However, the results in Table 8 show that the proposed cHS algorithm outperforms HS when the dimensionality is large for the majority of the benchmark optimization functions, suggesting that higher-dimensional problems benefit from an algorithm like cHS.
5. Conclusion and Future Work
In this paper, a new version of the HS algorithm called cellular harmony search (cHS) is proposed. cHS is an HS algorithm embedded with cellular automata concepts. The main idea behind cHS is to provide a structured population that preserves a high level of diversity during the search. In cHS, the HM individuals are arranged as a two-dimensional toroidal grid, where each new individual is generated by interacting only with its neighboring individuals. The operators of the original HS algorithm are adjusted to observe cellular GA theory, employing the concepts of cell and cell space.
Using ten global optimization functions circulated in the literature, the cHS is evaluated. The results support the theory of cellular automata, where in almost all cases the cHS outperforms HS algorithm. The sensitivity analysis of cHS parameters has suggested that the cHS is sensitive to the values of HMCR, PAR, , and NH. The comparative evaluation is also conducted with two versions of HS algorithms proposed in [44, 45], where comparable results were obtained.
This is an initial investigation of using cellular automata concepts in the HS algorithm optimization framework. Several promising research directions remain:
(i) analyzing the selection pressure and time complexity of the cHS algorithm;
(ii) studying the effect of the neighborhood shapes on the performance of cHS;
(iii) studying a new migration strategy to strengthen the interaction between individuals and their neighbors.
The first author is grateful for the award of a Postdoctoral Research Fellowship from the School of Computer Sciences, Universiti Sains Malaysia (USM). This work is supported by Grant no. 203/PTS6728001 (Grant Type: LRGS).
- C. Blum and A. Roli, “Metaheuristics in combinatorial optimization: overview and conceptual comparison,” ACM Computing Surveys, vol. 35, no. 3, pp. 268–308, 2003.
- Z. W. Geem, J. H. Kim, and G. V. Loganathan, “A new heuristic optimization algorithm: harmony search,” Simulation, vol. 76, no. 2, pp. 60–68, 2001.
- M. Mahdavi, M. Fesanghary, and E. Damangir, “An improved harmony search algorithm for solving optimization problems,” Applied Mathematics and Computation, vol. 188, no. 2, pp. 1567–1579, 2007.
- Z. W. Geem, “Novel derivative of harmony search algorithm for discrete design variables,” Applied Mathematics and Computation, vol. 199, no. 1, pp. 223–230, 2008.
- M. A. Al-Betar, I. A. Doush, A. T. Khader, and M. A. Awadallah, “Novel selection schemes for harmony search,” Applied Mathematics and Computation, vol. 218, no. 10, pp. 6095–6117, 2012.
- M. A. Al-Betar, A. T. Khader, and M. Zaman, “University course timetabling using a hybrid harmony search metaheuristic algorithm,” IEEE Transactions on Systems, Man and Cybernetics C, vol. 42, pp. 664–681, 2012.
- M. A. Al-Betar and A. T. Khader, “A harmony search algorithm for university course timetabling,” Annals of Operations Research, vol. 194, no. 1, pp. 3–31, 2012.
- M. Awadallah, A. Khader, M. Al-Betar, and A. Bolaji, “Nurse rostering using modified harmony search algorithm,” in Swarm, Evolutionary, and Memetic Computing, B. Panigrahi, P. Suganthan, S. Das, and S. Satapathy, Eds., vol. 7077 of Lecture Notes in Computer Science, pp. 27–37, Springer, Berlin, Germany, 2011.
- M. Awadallah, A. Khader, M. Al-Betar, and P. Woon, “Office-space-allocation problem using harmony search algorithm,” in Neural Information Processing, T. Huang, Z. Zeng, C. Li, and C. Leung, Eds., vol. 7664 of Lecture Notes in Computer Science, pp. 365–374, Springer, Berlin, Germany, 2012.
- L. Guo, “A novel hybrid bat algorithm with harmony search for global numerical optimization,” Journal of Applied Mathematics, vol. 2013, Article ID 696491, 21 pages, 2013.
- L. Zhang, Y. Xu, and Y. Liu, “An elite decision making harmony search algorithm for optimization problem,” Journal of Applied Mathematics, vol. 2012, Article ID 860681, 15 pages, 2012.
- Z. W. Geem, “Economic dispatch using parameter-setting-free harmony search,” Journal of Applied Mathematics, vol. 2013, Article ID 427936, 5 pages, 2013.
- J. Fourie, R. Green, and Z. W. Geem, “Generalised adaptive harmony search: a comparative analysis of modern harmony search,” Journal of Applied Mathematics, vol. 2013, Article ID 380985, 13 pages, 2013.
- M. G. H. Omran and M. Mahdavi, “Global-best harmony search,” Applied Mathematics and Computation, vol. 198, no. 2, pp. 643–656, 2008.
- A. Mukhopadhyay, A. Roy, S. Das, S. Das, and A. Abraham, “Population-variance and explorative power of harmony search: an analysis,” in Proceedings of the 3rd International Conference on Digital Information Management (ICDIM '08), pp. 775–781, November 2008.
- X.-S. Yang, “Harmony search as a metaheuristic algorithm,” in Music-Inspired Harmony Search Algorithm, Z. Geem, Ed., vol. 191 of Studies in Computational Intelligence, pp. 1–14, Springer, Berlin, Germany, 2009.
- M. A. Al-Betar, A. T. Khader, Z. W. Geem, I. A. Doush, and M. A. Awadallah, “An analysis of selection methods in memory consideration for harmony search,” Applied Mathematics and Computation, vol. 219, no. 22, pp. 10753–10767, 2013.
- E. Alba and B. Dorronsoro, “The exploration/exploitation tradeoff in dynamic cellular genetic algorithms,” IEEE Transactions on Evolutionary Computation, vol. 9, no. 2, pp. 126–142, 2005.
- D. Whitley and T. Starkweather, “GENITOR II: a distributed genetic algorithm,” Journal of Experimental & Theoretical Artificial Intelligence, vol. 2, pp. 189–214, 1990.
- Y. Shi, H. Liu, L. Gao, and G. Zhang, “Cellular particle swarm optimization,” Information Sciences, vol. 181, no. 20, pp. 4460–4493, 2011.
- T. Y. Lim, “Structured population genetic algorithms: a literature survey,” Artificial Intelligence Review, 2012.
- E. Alba and B. Dorronsoro, Cellular Genetic Algorithms, vol. 42, Springer, Berlin, Germany, 2008.
- S. Janson, E. Alba, B. Dorronsoro, and M. Middendorf, “Hierarchical cellular genetic algorithm,” in Evolutionary Computation in Combinatorial Optimization, J. Gottlieb and G. Raidl, Eds., vol. 3906 of Lecture Notes in Computer Science, pp. 111–122, Springer, Berlin, Germany, 2006.
- J. Błazewicz, M. Drozdowski, F. Guinand, and D. Trystram, “Scheduling a divisible task in a two-dimensional toroidal mesh,” Discrete Applied Mathematics, vol. 94, no. 1–3, pp. 35–50, 1999.
- E. Cantu-Paz, Efficient and Accurate Parallel Genetic Algorithms, vol. 1, Springer, London, UK, 2000.
- H. Mühlenbein, M. Schomisch, and J. Born, “The parallel genetic algorithm as function optimizer,” Parallel Computing, vol. 17, no. 6-7, pp. 619–632, 1991.
- C. Fernandes and A. Rosa, “A study on non-random mating and varying population size in genetic algorithms using a royal road function,” in Proceedings of the Congress on Evolutionary Computation, pp. 60–66, May 2001.
- J. von Neumann, Theory of Self-Reproducing Automata, University of Illinois Press, Urbana, Ill, USA, 1966.
- B. Schönfisch and A. de Roos, “Synchronous and asynchronous updating in cellular automata,” BioSystems, vol. 51, no. 3, pp. 123–143, 1999.
- P. Suganthan, N. Hansen, J. Liang et al., “Problem definitions and evaluation criteria for the CEC 2005 special session on real-parameter optimization,” Tech. Rep., Nanyang Technological University, 2005.
- C. García-Martínez and M. Lozano, “Hybrid real-coded genetic algorithms with female and male differentiation,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 896–903, September 2005.
- D. Molina, F. Herrera, and M. Lozano, “Adaptive local search parameters for real-coded memetic algorithms,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 888–895, September 2005.
- P. Posik, “Real-parameter optimization using the mutation step coevolution,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 872–879, 2005.
- J. Rönkkönen, S. Kukkonen, and K. V. Price, “Real-parameter optimization with differential evolution,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 506–513, September 2005.
- J. J. Liang and P. N. Suganthan, “Dynamic multi-swarm particle swarm optimizer with local search,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 522–528, September 2005.
- B. Yuan and M. Gallagher, “Experimental results for the special session on real-parameter optimization at CEC 2005: a simple, continuous EDA,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 1792–1799, September 2005.
- A. Sinha, S. Tiwari, and K. Deb, “A population-based, steady-state procedure for real-parameter optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 514–521, September 2005.
- A. Auger and N. Hansen, “Performance evaluation of an advanced local search evolutionary algorithm,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 1777–1784, September 2005.
- A. Auger and N. Hansen, “A restart CMA evolution strategy with increasing population size,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 1769–1776, September 2005.
- A. K. Qin and P. N. Suganthan, “Self-adaptive differential evolution algorithm for numerical optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 1785–1791, September 2005.
- P. J. Ballester, J. Stephenson, J. N. Carter, and K. Gallagher, “Real-parameter optimization performance study on the CEC-2005 benchmark with SPC-PNX,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '05), pp. 498–505, September 2005.
- D. Whitley, S. Rana, J. Dzubera, and K. E. Mathias, “Evaluating evolutionary algorithms,” Artificial Intelligence, vol. 85, no. 1-2, pp. 245–276, 1996.
- X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
- Y. M. Cheng, L. Li, T. Lansivaara, S. C. Chi, and Y. J. Sun, “An improved harmony search minimization algorithm using different slip surface generation methods for slope stability analysis,” Engineering Optimization, vol. 40, no. 2, pp. 95–115, 2008.
- Q.-K. Pan, P. N. Suganthan, J. J. Liang, and M. F. Tasgetiren, “A local-best harmony search algorithm with dynamic subpopulations,” Engineering Optimization, vol. 42, no. 2, pp. 101–117, 2010.
- M. A. Al-Betar, A. T. Khader, and F. Nadi, “Selection mechanisms in memory consideration for examination timetabling with harmony search,” in Proceedings of the 12th Annual Genetic and Evolutionary Computation Conference (GECCO '10), pp. 1203–1210, ACM, Portland, Ore, USA, July 2010.