Special Issue: Nature-Inspired Optimization Algorithms for Neuro-Fuzzy Models in Real World Control and Robotics Applications
Solving the Manufacturing Cell Design Problem through Binary Cat Swarm Optimization with Dynamic Mixture Ratios
In this research, we present a Binary Cat Swarm Optimization algorithm for solving the Manufacturing Cell Design Problem (MCDP). This problem divides an industrial production plant into a certain number of cells. Each cell contains machines with similar types of processes or part families. The goal is to identify a cell organization such that the transportation of the different parts between cells is minimized. The organization of these cells is performed through Cat Swarm Optimization, a recent swarm metaheuristic technique based on the behavior of cats. In that technique, cats have two modes of behavior, seeking mode and tracing mode, selected by means of a mixture ratio. For experimental purposes, an Autonomous Search version of the algorithm was developed with dynamic mixture ratios. The experimental results for both the normal Binary Cat Swarm Optimization (BCSO) and the Autonomous Search BCSO reach all known global optima, both for a set of 90 instances with known optima and for a set of 35 new instances with 13 known optima.
Group technology is a manufacturing philosophy in which similar parts are identified and grouped together to take advantage of their similarities in design and production, organizing similar parts into part families, where each part of the family has similar design and manufacturing characteristics. The basic concept of group technology has been practiced for many years around the world as part of good engineering and scientific management practice [2, 3], which holds that similar things should be manufactured in a similar way.
The Manufacturing Cell Design Problem (MCDP) is an application of group technology to organize cells containing a set of machines to process a family of parts. In this context, MCDP involves the creation of an optimal design of production plants, in which the main objective is to minimize the movement and exchange of material between these cells, thus generating greater productivity and reducing production costs.
The Manufacturing Cell Design Problem belongs to the NP-hard class of problems, so exploring good search algorithms is always a challenging task from the standpoint of optimization and, now, of artificial intelligence as well. In particular, in this paper, an efficient metaheuristic implementation is proposed to tackle this problem, demonstrating its performance through several benchmark instances (various global optima are reached), which is also valuable from an artificial intelligence and optimization standpoint. Additionally, this algorithm includes an Autonomous Search component (a dynamic mixture ratio), which is currently an important research trend in the optimization and metaheuristic sphere. Metaheuristics are intrinsically complex to configure in order to reach good results, and Autonomous Search facilitates this task by letting the metaheuristic self-tune its internal configuration without the need for an expert user. To the best of our knowledge, the work done on Autonomous Search in metaheuristics is very recent, and no Autonomous Search work for cat swarm exists.
The research work that has been done to solve the problem of cell formation has followed two complementary lines, which can be organized into two groups: approximate methods and exact methods. Approximate methods mostly focus on finding a good solution in a limited time; however, they do not guarantee a global optimum. Exact methods, on the contrary, aim to fully analyze the search space to ensure a global optimum; however, these algorithms are quite time-consuming and can only solve cases of very limited size. For this reason, many research efforts have focused on the development of heuristics, which find near-optimal solutions within a reasonable period of time.
This research focuses on solving the MCDP through a recent metaheuristic in the vein of Swarm Intelligence (SI) called Binary Cat Swarm Optimization (BCSO). This algorithm was generated from observations of cat behavior in nature, in which cats either hunt or remain alert. BCSO is based on the CSO algorithm proposed by Chu and Tsai. The difference is that in BCSO, the position vector consists of ones and zeros instead of real numbers (CSO), and the proposed alternative version makes use of a dynamic mixture ratio.
As aforementioned, reaching good results for problems belonging to the NP class is always a challenging and appealing task from the optimization and artificial intelligence world. In this research, our goal was to provide an intelligent algorithm for solving this problem while additionally integrating self-tuning features, which is a very recent research trend in the optimization and metaheuristic sphere.
2. Theoretical Framework
The formation of manufacturing cells has been researched for many years. One of the first investigations focused on resolving this set of problems was Burbidge's work in 1963, which proposed the use of an incidence matrix reorganized into a Block Diagonal Form (BDF). In recent years, many exact and heuristic algorithms have been proposed in the literature to solve the MCDP. Such metaheuristic techniques include the Genetic Algorithm (GA), inspired by biological evolution and its genetic-molecular basis; the Neural Network (NN), which takes the behavior of neurons and the connections of the human brain; and Constraint Programming (CP), where the relationships between the variables are expressed as constraints. For extensive reviews of previous research and other methods of cell formation, see Selim et al.
Among the metaheuristics used for cell formation, there is also the branch of Swarm Intelligence, which was initially introduced by Beni and Wang in 1989. Inspired by nature, Swarm Intelligence systems are typically formed by a population of simple agents that interact locally with each other and with their environment and that are able to optimize an overall objective through collaborative search of a space. Within this branch, the main techniques are Particle Swarm Optimization (PSO), designed and presented by Eberhart et al. [7, 9] in 1995; Ant Colony Optimization (ACO), a family of algorithms derived from Dorigo's 1991 work based on the social behavior of ants [15, 16]; the Migrating Birds Optimization (MBO) algorithm, based on the alignment of migratory birds during flight; the Artificial Fish Swarm Algorithm (AFSA), based on the behavior of fish that find food by themselves or by following other fish; and the discrete Cat Swarm Optimization (CSO) technique, presented in 2007 by Chu and Tsai, which is based on the behavior of cats. Interestingly, the CSO cat corresponds to a particle in PSO, with a small difference in their algorithms [19, 20]. CSO and PSO were originally developed for continuous value spaces, but there are a number of optimization problems where the values are discrete.
3. The Manufacturing Cell Design Problem
The Manufacturing Cell Design Problem (MCDP) divides an industrial production plant into a number of cells. Each cell contains machines with similar process types or part families, determined according to the similarity between parts. A manufacturing cell can be defined as an independent group of functionally different machines, located together, dedicated to the manufacture of a family of similar parts. In addition, a family of parts can be defined as a collection of parts that are similar, either because of their geometric shape and size or because similar processing steps are required to manufacture them.
The goal of the MCDP is to identify a cell organization that minimizes the transport of different parts between cells, in order to reduce production costs and increase productivity. The idea is to represent the processing requirements of parts on machines through an incidence matrix called the machine-part matrix. The reorganization involves the formulation of two new matrices, called the machine-cell and part-cell matrices.
A detailed mathematical definition of the formulation of the machine-part clustering problem is defined by the optimization model explained below: (i) M: the number of machines; (ii) P: the number of parts; (iii) C: the number of cells; (iv) i: the machine index (i = 1, …, M); (v) j: the part index (j = 1, …, P); (vi) k: the cell index (k = 1, …, C); (vii) Mmax: the maximum number of machines per cell; (viii) A = [aij]: the machine-part binary incidence matrix, where aij = 1 if machine i processes part j and 0 otherwise; (ix) Y = [yik]: the machine-cell binary membership matrix, where yik = 1 if machine i belongs to cell k and 0 otherwise; (x) Z = [zjk]: the part-cell binary membership matrix, where zjk = 1 if part j belongs to cell k and 0 otherwise.
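The objective function and constraints of this model can be stated as follows (this is the standard Boctor-style formulation of machine-part clustering; here aij indicates whether machine i processes part j, yik whether machine i is assigned to cell k, and zjk whether part j is assigned to cell k):

```latex
\min \sum_{k=1}^{C} \sum_{i=1}^{M} \sum_{j=1}^{P} a_{ij}\, z_{jk}\, (1 - y_{ik})
\quad \text{subject to} \quad
\sum_{k=1}^{C} y_{ik} = 1 \;\; \forall i, \qquad
\sum_{k=1}^{C} z_{jk} = 1 \;\; \forall j, \qquad
\sum_{i=1}^{M} y_{ik} \le M_{\max} \;\; \forall k,
```

with yik, zjk ∈ {0, 1}. The objective counts, for each part j assigned to cell k, the machines that process j but lie outside cell k, i.e., the inter-cell movements to be minimized.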
4. Binary Cat Swarm Optimization
There are about thirty different species of known felines, e.g., lions, tigers, leopards, and the common housecat. Although they have different living environments, cats share similar behavioral patterns. For wild cats, the ability to hunt ensures the food supply and survival of the species. To hunt their food, wild cats form groups ranging from 2 to 15 individuals. Domestic cats also show the same ability to hunt and are curious about moving objects [26–28]. Although cats might seem to be resting most of the time, even when awake [29, 30], they are actually in a constant state of alert: without moving, they may be listening or have their eyes open to look around. BCSO was formulated on the basis of all these behaviors and is an optimization algorithm that mimics the natural behavior of cats [9, 32, 33]. The authors identified two main modes of behavior for simulating cats [3, 34–39]: (i) Seeking mode: an exploration-oriented mode, where cats are attracted by moving objects and have a high hunting capacity. Cats may seem to spend most of their time resting, but in fact they are constantly alert while moving slowly. (ii) Tracing mode: an exploitation-oriented mode, where cats detect a prey and run after it, spending a lot of energy due to their rapid movements. In this mode, the cats follow the best cat in their group.
In BCSO, these two behaviors are mathematically modeled to solve complex optimization problems. The first decision is to define the number of cats needed for each iteration. Each cat, represented by catk, where k ranges over the number of cats, has its own position consisting of M dimensions composed of ones and zeros (1 and 0). In addition, each cat has a velocity for each dimension d, a flag to indicate whether it is in the seeking or tracing mode, and finally a fitness value that is calculated based on the MCDP. The BCSO keeps looking for the best solution until the iterations are exhausted. In BCSO, each catk represents an MCDP solution through a machine-cell matrix, where k identifies the cat and d indexes the position bits of the cat. In addition, the constraint matrix ensures that each row i is covered by at least one column.
Algorithm 1 describes the general BCSO pseudocode where the mixture ratio (MR) is a percentage that determines the number of cats in the seeking mode.
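The overall flow of Algorithm 1 can be sketched as follows. This is a minimal illustration, not the authors' Java implementation: the helper callables (fitness, new_cat, seeking, tracing) are placeholders for the operators described in the next sections, and MR splits the population between the two modes exactly as stated above.

```python
import random

def bcso(num_cats, iterations, mr, fitness, new_cat, seeking, tracing):
    """Sketch of the top-level BCSO loop (Algorithm 1). The helper
    callables are placeholders for the operators described in the text;
    fitness is minimized."""
    cats = [new_cat() for _ in range(num_cats)]
    best = min(cats, key=fitness)
    for _ in range(iterations):
        for i, cat in enumerate(cats):
            # MR is the fraction of cats placed in seeking mode;
            # the remainder trace towards the best cat found so far.
            if random.random() < mr:
                cats[i] = seeking(cat)
            else:
                cats[i] = tracing(cat, best)
        best = min(cats + [best], key=fitness)
    return best
```

With MR = 0.75 (as in the experimental section), roughly three quarters of the cats explore while one quarter exploits the incumbent best solution at every iteration.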
4.1. Seeking Mode
This submodel represents the state of the cat that is resting, looking around, and seeking its next position to move towards. The seeking mode has the following essential factors: (i) PMO: probability of mutation operation, a percentage that defines the mutation probability for each selected dimension. (ii) CDC: counts of dimensions to change, a percentage that indicates how many dimensions are candidates for change. (iii) SMP: seeking memory pool, a positive integer used to define the memory size for each cat. SMP indicates the points to be scanned by the cat and can differ between cats.
The following pseudocode describes the behavior of the cat in the seeking mode. Here, FSi is the fitness of the ith candidate copy, and FSb = FSmax for finding the minimum solution, or FSb = FSmin for finding the maximum solution. To solve the MCDP, which is a minimization problem, we use FSb = FSmax. Step 1: create SMP copies of the current catk. Step 2: for each copy, for the dimensions that are candidates for change (based on the CDC percentage), get a random number (rand) between 0 and 1; if rand < PMO, then the position bit changes. Step 3: evaluate the fitness of all copies. Step 4: calculate the selection probability of each copy, Pi = |FSi − FSb|/(FSmax − FSmin), applying a roulette wheel or, by default, choosing the best copy according to fitness. Step 5: evaluate whether the chosen copy is a better solution than the currently selected cat, and replace it accordingly.
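The seeking steps above can be sketched as follows. This is a simplified illustration: Step 4's roulette-wheel selection is reduced to picking the best copy (the default the text allows), and the parameter defaults mirror the values listed later in the experimental section (SMP = 15, CDC = 0.2, PMO = 0.76).

```python
import random

def seeking(cat, fitness, smp=15, cdc=0.2, pmo=0.76):
    """Sketch of the seeking mode (Steps 1-5) for a binary position
    vector. fitness is minimized."""
    copies = []
    for _ in range(smp):                       # Step 1: SMP copies
        copy = list(cat)
        # Step 2: choose CDC% of the dimensions as mutation candidates
        candidates = random.sample(range(len(copy)), max(1, int(cdc * len(copy))))
        for d in candidates:
            if random.random() < pmo:
                copy[d] = 1 - copy[d]          # flip the candidate bit
        copies.append(copy)
    # Steps 3-4: evaluate all copies and (simplified) take the best one
    best_copy = min(copies, key=fitness)
    # Step 5: replace the cat only if the chosen copy improves on it
    return best_copy if fitness(best_copy) < fitness(cat) else list(cat)
```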
Figure 1 shows the flow chart of the behavior of the cat in the seeking mode.
4.2. Tracing Mode
This submodel is used to model the state of the cat in hunting or tracing behavior, where the cats move towards the best solution obtained so far. Once a cat enters the tracing mode, it moves according to its own velocities for each dimension. Each cat has two velocity vectors, defined as Vkd0 and Vkd1, where Vkd0 is the probability that the bits of the cat change to zero and Vkd1 is the probability that they change to one. The velocity vector changes its meaning through the probability of mutation for each dimension d. The tracing mode action is described in the following pseudocode. Step 1: calculate dkd1 and dkd0: if Xbest,d (the dimension d of the best cat) is 1, then dkd1 = r1c1 and dkd0 = −r1c1; otherwise, dkd1 = −r1c1 and dkd0 = r1c1, where r1 takes random values in the range [0, 1] and c1 is a user-defined constant. Step 2: update the values of Vkd1 and Vkd0 as Vkd1 = wVkd1 + dkd1 and Vkd0 = wVkd0 + dkd0, where w is the inertia weight and M is the number of columns. Step 3: calculate the velocity of catk, V′kd, as V′kd = Vkd1 if Xkd = 0, and V′kd = Vkd0 if Xkd = 1. Step 4: calculate the probability of mutation in each dimension, defined by the parameter tkd = 1/(1 + e^(−V′kd)), which takes a value in the interval [0, 1]. Step 5: based on the value of tkd, each dimension of the cat is set to the best cat's value Xbest,d with probability tkd; otherwise it remains unchanged.
The velocity V′kd must be limited to a maximum value Vmax: if V′kd surpasses Vmax, then Vmax is selected for the corresponding velocity dimension.
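Under the standard BCSO update rules described above, the tracing mode can be sketched as follows. The concrete values of w, c1, and Vmax are illustrative assumptions (the paper leaves them to its parameter list), and v0/v1 are the per-cat velocity vectors carried across iterations.

```python
import math
import random

def tracing(cat, best, v0, v1, w=0.7, c1=2.0, vmax=4.0):
    """Sketch of the tracing mode for one cat: v0/v1 are its velocity
    vectors (tendency of each bit towards 0 or 1), updated in place."""
    new_cat = list(cat)
    for d in range(len(cat)):
        r1 = random.random()
        # Step 1: pull the velocities towards the best cat's bit value
        if best[d] == 1:
            d1, d0 = r1 * c1, -r1 * c1
        else:
            d1, d0 = -r1 * c1, r1 * c1
        # Step 2: inertia-weighted velocity update, clamped to Vmax
        v1[d] = max(-vmax, min(vmax, w * v1[d] + d1))
        v0[d] = max(-vmax, min(vmax, w * v0[d] + d0))
        # Step 3: the relevant velocity depends on the current bit value
        v = v1[d] if cat[d] == 0 else v0[d]
        # Step 4: a sigmoid turns the velocity into a mutation probability
        t = 1.0 / (1.0 + math.exp(-v))
        # Step 5: move the bit towards the best cat with probability t
        if random.random() < t:
            new_cat[d] = best[d]
    return new_cat
```

Repeated application drives each cat towards the best solution found so far, which is the exploitation behavior the text describes.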
The following is a flow chart for a cat in the tracing mode (Figure 2).
5. Solving the Manufacturing Cell Design Problem (MCDP)
To solve the MCDP, it is essential to use a repair method for solutions that were not feasible. Algorithm 2 describes the pseudocode used to solve the MCDP.
6. Repair Method
A solution may not satisfy the constraints, resulting in an infeasible solution. For this reason, the value that violates a constraint is repaired instead of the whole matrix being discarded. In this section, a function is described that transforms nonfeasible solutions into feasible ones.
Thus, Algorithm 3 presents a repair method in which all rows not covered are identified and assigned to a cell accordingly, so that all constraints are satisfied.
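A minimal sketch of such a repair function follows, assuming a machine-cell 0/1 matrix representation. The assignment policy (send an uncovered machine to the least-loaded cell with spare capacity) is our assumption for illustration; Algorithm 3 may use a different rule.

```python
def repair(y, mmax):
    """Sketch of a repair method for the machine-cell matrix y
    (rows = machines, columns = cells): every machine must belong to
    exactly one cell, and no cell may hold more than mmax machines."""
    cells = len(y[0])
    load = [sum(y[i][k] for i in range(len(y))) for k in range(cells)]
    for row in y:
        if sum(row) != 1:                      # row uncovered or over-covered
            for k in range(cells):             # clear the row first
                load[k] -= row[k]
                row[k] = 0
            k = min(range(cells), key=lambda c: load[c])  # least-loaded cell
            if load[k] < mmax:                 # respect the Mmax capacity
                row[k] = 1
                load[k] += 1
    return y
```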
7. Autonomous Search
Autonomous Search (AS) is a modern approach that allows the solver to automatically reconfigure its resolution parameters in order to provide better performance when bad results are detected.
In this context, performance is assessed through indicators that collect relevant information during the search. Search parameters are then updated advantageously according to the results obtained by the fitness evaluation.
This approach has been effectively applied to different optimization and satisfaction techniques, such as Constraint Programming , SAT , mixed integer programming [43, 44], and various other metaheuristic techniques [45–47].
In the present investigation, a version of the BCSO with Autonomous Search has been implemented, where the mixture ratio (MR) variable is used as an autonomous parameter; i.e., the MR value changes while the program is executed to give a more dynamic algorithm that directly influences the mode that the cat will take.
Algorithm 4 is the pseudocode describing the Autonomous Search BCSO.
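The dynamic mixture ratio can be illustrated with a small sketch. The text does not spell out the exact update rule, so the stagnation-based policy below is purely an assumption: when the best fitness stagnates, shift cats towards seeking (exploration); when it improves, shift towards tracing (exploitation).

```python
def update_mr(mr, improved, step=0.05, lo=0.3, hi=0.9):
    """Assumed dynamic-MR policy for the Autonomous Search variant:
    nudge MR down (more tracing) after an improvement and up (more
    seeking) after stagnation, clamped to [lo, hi]."""
    mr = mr - step if improved else mr + step
    return max(lo, min(hi, mr))
```

Called once per iteration with a flag saying whether the best fitness improved, this keeps MR moving within a safe band instead of staying fixed at 0.75.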
The BCSO implementation process of MCDP has led to results that will be presented in the following section. The metaheuristic was programmed in the JAVA programming language. For the execution of the algorithm, the parameters considered were the following:(i)Iterations = 5000(ii)Number of cats = 30(iii)MR = 0.75 (75% seeking; 25% tracing)(iv)SMP = 15(v)CDC = 0.2(vi)PMO = 0.76(vii)(viii)(ix)
9. Boctor Instances
Tests with the implemented solution were carried out on 90 instances of 16 × 30 matrices, obtained from 10 problems found in the paper of Boctor, hereafter called Boctor Instances. These problems consider the use of 2 or 3 cells. In the case of 2 cells, the maximum number of machines per cell (Mmax) took values between 8 and 12; in the case of 3 cells, Mmax varied between 6 and 9 machines per cell. In both cases, the value of Mmax remained constant throughout the execution of the algorithm.
The values obtained by submitting each problem to the classic BCSO and the BCSO with Autonomous Search are summarized in Tables 1–9, where “O” denotes the known global optimum; “BCSO,” the best value obtained by the BCSO proposed here; “A,” the average number of optima obtained; “I,” the average number of iterations in which the optimum is reached; “Ms,” the time (in milliseconds) used to reach the optimum; and “RPD,” the Relative Percent Difference, calculated as RPD = 100 × (Z − Zopt)/Zopt, where Zopt is the best known optimal value and Z is the best value achieved by BCSO.
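As a small worked example, the RPD measure translates directly to code (for this minimization problem, Z ≥ Zopt, so RPD is 0% exactly when the global optimum is reached):

```python
def rpd(z, z_opt):
    """Relative Percent Difference between the best value z found by
    the metaheuristic and the best known optimum z_opt."""
    return 100.0 * (z - z_opt) / z_opt
```

For instance, a best-found value of 9 against a known optimum of 7 yields an RPD of about 28.57%.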
The above results were obtained over 40 runs for each of the 90 Boctor Instances. It is important to point out that the global optimum was reached in 100% of them, proving that BCSO can work with any of these MCDP instances. The performance of the BCSO metaheuristic in its Autonomous Search version was slightly better, as shown by some of the averages of optima reached in the experimental results.
10. Other Author Instances
To analyze the effectiveness of the implemented algorithm in a wider range of problems, new instances from different authors were investigated. Matrix sizes ranged from 5 to 40 machines and from 7 to 100 parts. Table 10 shows the instances used:
In order to better assess the behavior exhibited by the autonomous version of the Binary Cat Swarm Optimization, we performed a detailed comparison using these new instances, because they are harder. This comparison includes two well-known metaheuristics: the first is inspired by the behavior of the Egyptian vulture (EVOA), and the second mimics the flashing behavior of fireflies (MBFA). Table 11 reports the comparison between our proposal and the published methods.
Observing the results for instances CF01 to CF11, we can conclude that BCSO presents performance similar to EVOA: in both cases, the optimal values are reached, and the worst and mean values are equal. This behavior can be attributed to the similarity of the operations in both algorithms. Evaluating MBFA with respect to BCSO, we can report a similar conclusion; nevertheless, in CF05 and CF07, BCSO achieves two optimal values that are not reached by MBFA.
From CF12 onwards, BCSO begins to exhibit outstanding performance. For instance, in CF12, BCSO is the only algorithm that finds the best solution (the optimum value), reaching an RPD of 0%, while its closest competitor (MBFA) obtains an RPD of 28.57%. The most significant difference, however, can be seen from CF15: in this instance, BCSO is more efficient than EVOA and surpasses the value reached by MBFA. For any instance between CF16 and CF35 (more than 57% of the instances), BCSO exceeds both compared approaches in terms of the best-, average-, and worst-found values. Therefore, we can state that BCSO is more than a competitive technique: it is a real alternative for solving the Manufacturing Cell Design Problem.
The above results were obtained after 40 executions for each of the 35 new instances. It should be noted that all 13 known optima were reached by both algorithms, proving that BCSO can work with almost any instance. The performance of the BCSO metaheuristic in its Autonomous Search version was slightly better, as demonstrated by some of the optima achieved, improving by 3% with respect to the original.
11. Results for Boctor Instances Using BCSO and BCSO with Autonomous Search
Figure 3 shows the results of the experiments conducted for the Boctor Instances presented above. Thanks to the operation mode of the BCSO, a fast convergence to the optimum is obtained at C = 2; however, when C = 3, the BCSO does not converge as quickly. That said, the optimum is reached in most cases before 100 executions, which demonstrates the effectiveness of the proposed approach.
Figure 4 shows the results of problem 3, with C = 2 and Mmax = 8, over the iterations. Both versions converge quickly: while the Autonomous Search BCSO reaches the optimum early (iteration 10), the normal BCSO remains stuck at a fitness of 5 from iteration 4.
The following graph (Figure 3) shows the results of problem 7, with C = 3, Mmax = 8, reaching the overall optimum in both cases at similar iterations: normal BCSO, iteration 30; and Autonomous Search BCSO, iteration 40.
12. Results for New Instances Using BCSO and BCSO with Autonomous Search
Figure 5 shows the results of the experiments performed for new instances, in which it can be seen that the Autonomous Search algorithm helps the solution not to get trapped at some local optimum; however, not all results with Autonomous Search present an advantage over the original version.
Figure 5 represents the results of problem 26, with M = 24, , C = 12, and Mmax = 3, in which it can be seen that the Autonomous Search BCSO does not differ greatly from the normal BCSO; however, the Autonomous Search BCSO is able to explore new solutions, which allows it to achieve better results.
The graph in Figure 6 represents the results of problem 30, with M = 30, , C = 14, and Mmax = 4, in which it can be seen that the Autonomous Search BCSO solutions continue to change without being trapped in a local optimum, whereas the normal BCSO becomes trapped near iteration 4000.
The graph in Figure 7 represents the results of problem 35, with M = 40, , C = 10, and Mmax = 6, in which the Autonomous Search BCSO solutions keep changing, exploring new solutions and expanding the search space early on, before iteration 3000; the normal BCSO, in contrast, is trapped in a local optimum near iteration 1000.
In the present investigation, an algorithm inspired by cat behavior, called Binary Cat Swarm Optimization, was presented for solving the Manufacturing Cell Design Problem, which concerns the placement of machinery in a manufacturing plant.
The proposed BCSO was implemented and tested using 90 Boctor Instances plus 35 new instances, for a total of 125 instances. The BCSO managed to obtain 100% of the known optima in the 90 Boctor Instances, achieving rapid convergence and reduced execution times. In the case of the 35 new instances, it was possible to obtain 100% of the 13 known optima. It should be noted that these results were obtained after a long testing process in which the different parameters of the algorithm were calibrated through experimentation. For that reason, Autonomous Search was implemented as an optimization method to influence variables in real time, which resulted in a dynamic MR that slightly improved the results obtained: 3% compared to the original, with 100% of the known optima, both for the 90 Boctor Instances and for the 35 new instances.
As can be seen from the results, this metaheuristic behaves well in all observed cases. This research demonstrates that BCSO is a valid alternative for solving the MCDP. The algorithm works well, regardless of the scale of the problem. However, solutions obtained could be improved by using different parameters for each set of instances.
The BCSO performance was significantly increased after selecting a good repair technique. However, relying on a repair method leads us not to recommend the use of this algorithm for other types of problems because it is far less efficient than other techniques for more complex problems.
For future research, a more extensible configuration could be developed to cover a wider set of problems. It would also be interesting to implement this technique in conjunction with other recent metaheuristics where limited work on Autonomous Search exists such as cuckoo search, firefly optimization, or bat algorithms . Finally, hybridization with learning techniques is another interesting research line to pursue, where feedback gathered for the self-tune phase could be processed with machine learning in order to better track the complete solving process.
Data Availability
The authors declare that the data used to support the findings of this study are available from the corresponding author.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments
Broderick Crawford was supported by the Grant CONICYT/FONDECYT/REGULAR/1171243, and Ricardo Soto was supported by the Grant CONICYT/FONDECYT/REGULAR/1160455.
I. Ham, K. Hitomi, and T. Yoshida, Group Technology: Applications to Production Management, Springer Science & Business Media, Berlin, Germany, 2012.
H. Zhenggang, Z. Guo, and J. Wang, “Integrated scheduling of production and distribution operations in a global MTO supply chain,” Enterprise Information Systems, vol. 2018, pp. 1–25, 2018.
P. D. Medina, E. A. Cruz, and M. Pinzón, “Generación de celdas de manufactura usando el algoritmo de ordenamiento binario (AOB),” Scientia Et Technica, vol. 1, no. 44, pp. 106–110, 2010.
J. L. Burbidge, The Introduction of Group Technology, Halsted Press, New York, NY, USA, 1975.
A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, Hoboken, NJ, USA, 2006.
Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, “Discrete binary cat swarm optimization algorithm,” in Proceedings of the 2013 3rd International Conference on Computer, Control & Communication (IC4), pp. 1–6, IEEE, Karachi, Pakistan, September 2013.
S.-C. Chu and P.-W. Tsai, “Computational intelligence based on the behavior of cats,” International Journal of Innovative Computing, Information and Control, vol. 3, no. 1, pp. 163–173, 2007.
G. Beni and J. Wang, “Swarm intelligence in cellular robotic systems,” in Robots and Biological Systems: Towards a New Bionics?, pp. 703–712, Springer, Berlin, Germany, 1993.
J. F. Kennedy, J. Kennedy, R. C. Eberhart, and Y. Shi, Swarm Intelligence, Morgan Kaufmann, Burlington, MA, USA, 2001.
R. Soto, B. Crawford, B. Almonacid, and F. Paredes, “A migrating birds optimization algorithm for machine-part cell formation problems,” in Proceedings of the Mexican International Conference on Artificial Intelligence, pp. 270–281, Springer, Cuernavaca, Mexico, October 2015.
J. Kennedy and R. C. Eberhart, “A discrete binary version of the particle swarm algorithm,” in Proceedings of the 1997 IEEE International Conference on Computational Cybernetics and Simulation, pp. 4104–4108, IEEE, Orlando, FL, USA, October 1997.
J. So and W. Jenkins, “Comparison of cat swarm optimization with particle swarm optimization for IIR system identification,” in Proceedings of the 2013 Asilomar Conference on Signals, Systems and Computers, pp. 903–910, IEEE, Pacific Grove, CA, USA, November 2013.
M. A. Khanesar, M. Teshnehlab, and M. A. Shoorehdeli, “A novel binary particle swarm optimization,” in Proceedings of the 2007 Mediterranean Conference on Control & Automation (MED’07), pp. 1–6, IEEE, Marrakech, Morocco, June 2007.
S. A. Irani, Handbook of Cellular Manufacturing Systems, John Wiley & Sons, Hoboken, NJ, USA, 1999.
V. Aspinall, Complete Textbook of Veterinary Nursing, Butterworth Heinemann, Oxford, UK, 2006.
J. Dards, “Feral cat behaviour and ecology,” Bulletin of the Feline Advisory Bureau, vol. 15, 1976.
S. Crowell-Davis, “Cat behaviour: social organization, communication and development,” in The Welfare of Cats, I. Rochlitz, Ed., pp. 1–22, Springer, Berlin, Germany, 2005.
W. Sung, “Effect of gender on initiation of proximity in free ranging domestic cats (Felis catus),” M.Sc. thesis, University of Georgia, Athens, GA, USA, 1998.
B. Santosa and M. K. Ningrum, “Cat swarm optimization for clustering,” in Proceedings of the 2009 International Conference of Soft Computing and Pattern Recognition, pp. 54–59, Malacca, Malaysia, December 2009.
X. Zhang, K. Tang, S. Li, K. Xia, and D. Zhao, “Design of slow-wave structure based on multi-objective quantum particle swarm optimization algorithm with inertia weight,” Chinese Journal of Vacuum Science & Technology, vol. 30, no. 6, pp. 651–656, 2010.
Y. Hamadi, E. Monfroy, and F. Saubion, “What is autonomous search?,” in Hybrid Optimization, Springer, Berlin, Germany, 2011.
F. Hutter, Y. Hamadi, H. H. Hoos, and K. Leyton-Brown, “Performance prediction and automated tuning of randomized and parametric algorithms,” in Proceedings of the International Conference on Principles and Practice of Constraint Programming, pp. 213–228, Springer, Nantes, France, September 2006.
F. Hutter, H. H. Hoos, and K. Leyton-Brown, “Automated configuration of mixed integer programming solvers,” in Proceedings of the International Conference on Integration of Artificial Intelligence (AI) and Operations Research (OR) Techniques in Constraint Programming, pp. 186–202, Springer, Bologna, Italy, June 2010.
F. Hutter, H. H. Hoos, and K. Leyton-Brown, “Sequential model-based optimization for general algorithm configuration,” in Proceedings of the International Conference on Learning and Intelligent Optimization, pp. 507–523, Springer, Rome, Italy, January 2011.
J. Maturana and F. Saubion, “On the design of adaptive control strategies for evolutionary algorithms,” in Proceedings of the International Conference on Artificial Evolution (Evolution Artificielle), pp. 303–315, Springer, Tours, France, October 2007.
J. Maturana and F. Saubion, “A compass to guide genetic algorithms,” in Proceedings of the International Conference on Parallel Problem Solving from Nature, pp. 256–265, Springer, Birmingham, UK, September 2008.
K. R. Kumar and A. Vannelli, “Strategic subcontracting for efficient disaggregated manufacturing,” BEBR Faculty Working Paper no. 1252, University of Illinois, Champaign, IL, USA, 1986.
C. Sur, S. Sharma, and A. Shukla, “Egyptian vulture optimization algorithm–a new nature inspired meta-heuristics for knapsack problem,” in Proceedings of the 9th International Conference on Computing and Information Technology (IC2IT2013), pp. 227–237, Bangkok, Thailand, May 2013.
B. Almonacid, F. Aspée, R. Soto, B. Crawford, and J. Lama, “Solving manufacturing cell design problem using modified binary firefly algorithm and Egyptian vulture optimization algorithm,” IET Software, 2016.