Computational Intelligence and Neuroscience
Volume 2019, Article ID 4787856, 16 pages
https://doi.org/10.1155/2019/4787856
Research Article

Solving the Manufacturing Cell Design Problem through Binary Cat Swarm Optimization with Dynamic Mixture Ratios

1Pontificia Universidad Católica de Valparaíso, Avenida Brasil 2241, Valparaíso 2362807, Chile
2Universidad Técnica Federico Santa María, Avenida España 1680, Valparaíso 2390123, Chile
3Universidad Diego Portales, Av. Ejército 441, Santiago 8370109, Chile
4Universidad de Valparaíso, General Cruz 222, Valparaíso 2603631, Chile

Correspondence should be addressed to Hanns de la Fuente-Mella; hanns.delafuente@pucv.cl

Received 29 October 2018; Revised 11 January 2019; Accepted 14 January 2019; Published 14 February 2019

Academic Editor: Oscar Castillo

Copyright © 2019 Ricardo Soto et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

In this research, we present a Binary Cat Swarm Optimization for solving the Manufacturing Cell Design Problem (MCDP). This problem divides an industrial production plant into a certain number of cells. Each cell contains machines with similar types of processes or part families. The goal is to identify a cell organization such that the transportation of the different parts between cells is minimized. The organization of these cells is performed through Cat Swarm Optimization, a recent swarm metaheuristic technique based on the behavior of cats. In this technique, cats exhibit two modes of behavior, seeking mode and tracing mode, selected by means of a mixture ratio. For experimental purposes, a version of the algorithm with Autonomous Search was also developed, in which the mixture ratio is adjusted dynamically. The experimental results show that both the standard Binary Cat Swarm Optimization (BCSO) and the Autonomous Search BCSO reach all known global optima, both for a set of 90 instances with known optima and for a set of 35 new instances with 13 known optima.

1. Introduction

Group technology is a manufacturing philosophy in which similar parts are identified and grouped together to take advantage of their similarities in design and production [1]. Parts are organized into part families, where each member of a family shares similar design and manufacturing characteristics. The basic concept of group technology, namely that similar things should be manufactured in a similar way [4], has been practiced for many years around the world as part of good engineering and scientific management [2, 3].

The Manufacturing Cell Design Problem (MCDP) is an application of group technology to organize cells containing a set of machines to process a family of parts [5]. In this context, MCDP involves the creation of an optimal design of production plants, in which the main objective is to minimize the movement and exchange of material between these cells, thus generating greater productivity and reducing production costs.

The Manufacturing Cell Design Problem belongs to the NP-hard class of problems, so exploring good search algorithms remains a challenging task from the standpoint of both optimization and artificial intelligence [5]. In this paper, an efficient metaheuristic implementation is proposed to tackle this problem, and its performance is demonstrated through several benchmark instances (various global optima are reached), which is also valuable from an artificial intelligence and optimization standpoint. Additionally, the algorithm includes an Autonomous Search component (a dynamic mixture ratio), which is currently an important research trend in the optimization and metaheuristic sphere. Metaheuristics are intrinsically complex to configure in order to reach good results, and Autonomous Search facilitates this task by letting the metaheuristic self-tune its internal configuration without requiring an expert user. To the best of our knowledge, the work on Autonomous Search in metaheuristics is very recent, and no Autonomous Search work for cat swarm optimization exists.

The research work done to solve the cell formation problem has followed two complementary lines, which can be organized into two groups: approximate methods and exact methods. Approximate methods focus on finding good solutions in a limited time; however, they do not guarantee a global optimum. Exact methods, on the contrary, aim to fully explore the search space to ensure a global optimum [6]; however, these algorithms are quite time-consuming and can only solve instances of very limited size. For this reason, many research efforts have focused on the development of heuristics, which find near-optimal solutions within a reasonable period of time.

This research focuses on solving the MCDP through a recent metaheuristic in the vein of Swarm Intelligence (SI) [7] called Binary Cat Swarm Optimization (BCSO) [8]. This algorithm was derived from observations of cat behavior in nature, in which cats either hunt or remain alert. BCSO is based on the CSO algorithm proposed by Chu and Tsai [9]; the difference is that in BCSO the position vector consists of ones and zeros instead of the real numbers used in CSO, and the alternative version proposed here makes use of a dynamic mixture ratio.

As mentioned above, reaching good results for problems belonging to the NP-hard class is always a challenging and appealing task for the optimization and artificial intelligence communities. In this research, our goal was to provide an intelligent algorithm for solving this problem while additionally integrating self-tuning features, which is a very recent research trend in the optimization and metaheuristic sphere.

2. Theoretical Framework

The formation of manufacturing cells has been researched for many years. One of the first investigations focused on this family of problems was Burbidge’s work in 1963 [4], which proposed the use of an incidence matrix reorganized into a Block Diagonal Form (BDF). In recent years, many exact and heuristic algorithms have been proposed in the literature to solve the MCDP. Such metaheuristic techniques include the Genetic Algorithm (GA) [10], inspired by biological evolution and its genetic-molecular basis; Neural Networks (NN) [11], which mimic the behavior of neurons and the connections of the human brain; and Constraint Programming (CP) [12], where the relationships between variables are expressed as constraints. For extensive reviews of previous research and other cell formation methods, see Selim et al. [1].

Among the metaheuristics used for cell formation, there is also the branch of Swarm Intelligence, initially introduced by Beni and Wang in 1989 [13]. Inspired by nature, Swarm Intelligence systems are typically formed by a population of simple agents that interact locally with each other and with their environment and that are able to optimize an overall objective through collaborative search in a space [14]. Within this branch, the main techniques are Particle Swarm Optimization (PSO), designed and presented by Kennedy and Eberhart in 1995 [7, 9]; Ant Colony Optimization (ACO), a family of algorithms derived from Dorigo’s 1991 work based on the social behavior of ants [15, 16]; the Migrating Birds Optimization (MBO) algorithm [17], based on the alignment of migratory birds during flight; the Artificial Fish Swarm Algorithm (AFSA) [18], based on the behavior of fish that find food by themselves or by following other fish; and the discrete Cat Swarm Optimization (CSO) technique presented in 2007 by Chu and Tsai [9], which is based on the behavior of cats. Interestingly, a CSO cat corresponds to a particle in PSO, with small differences between the algorithms [19, 20]. CSO and PSO were originally developed for continuous value spaces, but there are a number of optimization problems where the values are discrete [21].

3. The Manufacturing Cell Design Problem

The Manufacturing Cell Design Problem (MCDP) divides an industrial production plant into a number of cells. Each cell contains machines with similar process types or part families, determined according to the similarity between parts [4]. A manufacturing cell can be defined as an independent group of functionally different machines, located together, dedicated to the manufacture of a family of similar parts. In addition, a family of parts can be defined as a collection of parts that are similar, either because of their geometric shape and size or because similar processing steps are required to manufacture them [22].

The goal of the MCDP is to identify a cell organization that minimizes the transport of the different parts between cells, in order to reduce production costs and increase productivity. The idea is to represent the processing requirements of parts on machines through an incidence matrix called the machine-part matrix. Reorganizing this matrix involves the formulation of two new matrices, called machine-cell and part-cell.

A detailed mathematical formulation of the machine-part clustering problem is given by the optimization model explained below [6]:
(i) $M$: number of machines
(ii) $P$: number of parts
(iii) $C$: number of cells
(iv) $i$: machine index ($i = 1, \ldots, M$)
(v) $j$: part index ($j = 1, \ldots, P$)
(vi) $k$: cell index ($k = 1, \ldots, C$)
(vii) $M_{\max}$: maximum number of machines per cell
(viii) $A = [a_{ij}]$: machine-part binary incidence matrix, where $a_{ij} = 1$ if machine $i$ processes part $j$, and $a_{ij} = 0$ otherwise
(ix) $Y = [y_{ik}]$: machine-cell binary matrix, where $y_{ik} = 1$ if machine $i$ belongs to cell $k$, and $y_{ik} = 0$ otherwise
(x) $Z = [z_{jk}]$: part-cell binary matrix, where $z_{jk} = 1$ if part $j$ belongs to cell $k$, and $z_{jk} = 0$ otherwise

The objective is to minimize the number of inter-cell movements:
$$\min \sum_{k=1}^{C} \sum_{i=1}^{M} \sum_{j=1}^{P} a_{ij}\, z_{jk}\, (1 - y_{ik})$$
subject to
$$\sum_{k=1}^{C} y_{ik} = 1 \;\; \forall i, \qquad \sum_{k=1}^{C} z_{jk} = 1 \;\; \forall j, \qquad \sum_{i=1}^{M} y_{ik} \le M_{\max} \;\; \forall k.$$
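To make the objective concrete, the following minimal Java sketch (not the authors' code; the class name and the toy matrices in main are ours) evaluates the inter-cell movement count for given machine-part, machine-cell, and part-cell matrices, using the notation above.

```java
public class McdpObjective {

    // Number of inter-cell movements: sum over cells, machines, and parts of
    // a[i][j] * z[j][k] * (1 - y[i][k]), following the model above.
    static int interCellMovements(int[][] a, int[][] y, int[][] z) {
        int m = a.length;        // machines
        int p = a[0].length;     // parts
        int c = y[0].length;     // cells
        int cost = 0;
        for (int k = 0; k < c; k++) {
            for (int i = 0; i < m; i++) {
                for (int j = 0; j < p; j++) {
                    cost += a[i][j] * z[j][k] * (1 - y[i][k]);
                }
            }
        }
        return cost;
    }

    public static void main(String[] args) {
        // Toy instance: 3 machines, 4 parts, 2 cells (illustrative values only).
        int[][] a = { {1, 1, 0, 0}, {1, 0, 0, 1}, {0, 0, 1, 1} };
        int[][] y = { {1, 0}, {1, 0}, {0, 1} };          // machine-cell matrix
        int[][] z = { {1, 0}, {1, 0}, {0, 1}, {0, 1} };  // part-cell matrix
        System.out.println("Inter-cell movements: " + interCellMovements(a, y, z));
    }
}
```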

4. Binary Cat Swarm Optimization

There are about thirty different species of known felines, e.g., lions, tigers, leopards, the common housecat, etc. [23]. Although they live in different environments, cats share similar behavioral patterns [24]. For wild cats, the ability to hunt ensures the food supply and survival of the species [25]. To hunt their food, wild cats form groups ranging from 2 to 15 individuals [26]. Domestic cats also show the same ability to hunt and are curious about moving objects [26–28]. Although cats might seem to be resting most of the time, even when awake [29, 30], they are actually in a constant state of alert; without moving, they may be listening or have their eyes open to look around [31]. BCSO [8] was formulated on the basis of all these behaviors and is an optimization algorithm that mimics the natural behavior of cats [9, 32, 33]. The authors identified two main modes of behavior for simulating cats [3, 34–39]:
(i) Seeking mode: exploration-oriented mode, in which cats are attracted by moving objects and have a high hunting capacity. Cats may seem to spend most of their time resting, but in fact they are constantly alert while moving slowly.
(ii) Tracing mode: exploitation-oriented mode, in which cats detect prey and run after it, spending a lot of energy due to their rapid movements. In this mode, the cats follow the best individual in their group.

In BCSO, these two behaviors are mathematically modeled to solve complex optimization problems. The first decision is to define the number of cats needed for each iteration. Each cat, represented by cat_k, where k ranges over the defined number of cats, has its own position consisting of M dimensions composed of ones and zeros. In addition, each cat has a velocity for each dimension d, a flag to indicate whether it is in the seeking or tracing mode, and finally a fitness value that is calculated from the MCDP objective. BCSO keeps looking for the best solution until the maximum number of iterations is reached. In BCSO, each cat_x represents an MCDP solution through a machine-cell matrix, where x identifies the cat and d indexes its position bits. In addition, the constraints ensure that each row i (machine) is covered by at least one column (cell).
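As a concrete illustration (ours, not the published implementation), the cat described above could be represented in Java roughly as follows; the class and field names are our own.

```java
import java.util.Random;

// One possible representation of a cat: a binary position of M dimensions,
// two velocity vectors (one per target bit value), a mode flag, and the
// fitness of the encoded MCDP solution.
public class Cat {
    final boolean[] position;       // M bits encoding a machine-cell assignment
    final double[] velocityToOne;   // velocity associated with moving bits towards 1
    final double[] velocityToZero;  // velocity associated with moving bits towards 0
    boolean seekingMode;            // true = seeking mode, false = tracing mode
    int fitness;                    // number of inter-cell movements (to be minimized)

    Cat(int dimensions, Random rng) {
        position = new boolean[dimensions];
        velocityToOne = new double[dimensions];
        velocityToZero = new double[dimensions];
        for (int d = 0; d < dimensions; d++) {
            position[d] = rng.nextBoolean();   // random initial bit
        }
    }

    public static void main(String[] args) {
        Cat cat = new Cat(16, new Random());
        System.out.println("Cat created with " + cat.position.length + " dimensions");
    }
}
```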

Algorithm 1 describes the general BCSO pseudocode where the mixture ratio (MR) is a percentage that determines the number of cats in the seeking mode.

Algorithm 1: Binary Cat Swarm Algorithm.
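The following Java skeleton is our own sketch of the overall loop of Algorithm 1, not the published code: in every iteration a fraction MR of the cats is sent to seeking mode and the rest to tracing mode, each cat is moved according to its mode, and the best solution found so far is kept. The seekingMode and tracingMode bodies are placeholders for the procedures detailed in Sections 4.1 and 4.2, and the fitness function and dimension count are illustrative only.

```java
import java.util.Random;

public class BcsoSkeleton {
    static final Random RNG = new Random();

    // Placeholder fitness: in the real algorithm this is the number of
    // inter-cell movements of the MCDP solution encoded by the position.
    static int fitness(boolean[] position) {
        int ones = 0;
        for (boolean b : position) if (b) ones++;
        return ones;
    }

    static void seekingMode(boolean[] position) { /* procedure of Section 4.1 */ }
    static void tracingMode(boolean[] position, boolean[] bestPosition) { /* procedure of Section 4.2 */ }

    public static void main(String[] args) {
        int iterations = 5000, numberOfCats = 30, dimensions = 32;  // dimensions illustrative
        double mr = 0.75;   // fraction of cats placed in seeking mode each iteration

        boolean[][] cats = new boolean[numberOfCats][dimensions];
        for (boolean[] cat : cats)
            for (int d = 0; d < dimensions; d++) cat[d] = RNG.nextBoolean();

        boolean[] best = cats[0].clone();
        for (int it = 0; it < iterations; it++) {
            for (boolean[] cat : cats) {
                if (RNG.nextDouble() < mr) seekingMode(cat);   // exploration
                else tracingMode(cat, best);                   // exploitation
                if (fitness(cat) < fitness(best)) best = cat.clone();
            }
        }
        System.out.println("Best fitness found: " + fitness(best));
    }
}
```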
4.1. Seeking Mode

This submodel describes the state of a cat that is resting, looking around, and seeking the next position to move towards. The seeking mode has the following essential factors:
(i) PMO (probability of mutation operation): a percentage that defines the mutation probability for each selected dimension.
(ii) CDC (counts of dimensions to change): a percentage that indicates how many dimensions are candidates to change.
(iii) SMP (seeking memory pool): a positive integer that defines the memory size for each cat. SMP indicates the number of candidate points to be scanned by the cat and can differ between cats.

The following pseudocode describes the behavior of a cat in the seeking mode. Here, $FS_i$ is the fitness of the i-th candidate copy, and $FS_{\max}$ and $FS_{\min}$ are the maximum and minimum fitness values among the copies. Since the MCDP is a minimization problem, the base fitness $FS_b = FS_{\max}$ is used when computing the selection probabilities.
Step 1: create SMP copies of the current cat_x.
Step 2: for each copy, and for each dimension that is a candidate for change (based on the CDC percentage), draw a random number rand between 0 and 1; if rand < PMO, flip the bit in that dimension.
Step 3: evaluate the fitness of all copies.
Step 4: calculate the selection probability of each copy and apply a roulette wheel or, by default, choose the best copy according to its fitness.
Step 5: if the chosen copy is a better solution than the current cat, replace the cat with it.
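A minimal Java sketch of these steps follows; it is our own illustration, the toy fitness function and class name are ours, and the parameter values used in main are those listed in Section 8.

```java
import java.util.Random;
import java.util.function.ToIntFunction;

public class SeekingMode {
    static final Random RNG = new Random();

    // Returns the (possibly improved) position after one seeking-mode step.
    static boolean[] seek(boolean[] current, int smp, double cdc, double pmo,
                          ToIntFunction<boolean[]> fitness) {
        boolean[][] copies = new boolean[smp][];
        for (int c = 0; c < smp; c++) {
            boolean[] copy = current.clone();                       // Step 1: SMP copies
            int candidates = (int) Math.round(cdc * copy.length);   // Step 2: CDC dimensions
            for (int n = 0; n < candidates; n++) {
                int d = RNG.nextInt(copy.length);
                if (RNG.nextDouble() < pmo) copy[d] = !copy[d];     // mutate with probability PMO
            }
            copies[c] = copy;
        }
        boolean[] best = copies[0];                                 // Steps 3-4: pick the best copy
        for (boolean[] copy : copies)
            if (fitness.applyAsInt(copy) < fitness.applyAsInt(best)) best = copy;
        // Step 5: keep the chosen copy only if it improves on the current cat.
        return fitness.applyAsInt(best) < fitness.applyAsInt(current) ? best : current;
    }

    public static void main(String[] args) {
        ToIntFunction<boolean[]> toyFitness = pos -> {   // toy objective: minimize number of ones
            int ones = 0;
            for (boolean b : pos) if (b) ones++;
            return ones;
        };
        boolean[] cat = new boolean[16];
        for (int d = 0; d < cat.length; d++) cat[d] = RNG.nextBoolean();
        boolean[] next = seek(cat, 15, 0.2, 0.76, toyFitness);
        System.out.println("Fitness after seeking step: " + toyFitness.applyAsInt(next));
    }
}
```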

Figure 1 shows the flow chart of the behavior of the cat in the seeking mode.

Figure 1: Seeking mode.
4.2. Tracing Mode

This submodel is used to model the state of the cat in hunting or tracing behavior, where the cats move towards the best solution obtained so far. Once a cat enters the tracing mode, it moves according to its own velocities for each dimension. Each cat has two velocity vectors, defined as $V^1_{kd}$ and $V^0_{kd}$, where $V^1_{kd}$ is the probability that the bits of the cat change to one and $V^0_{kd}$ is the probability that they change to zero. These velocities are turned into a probability of mutation for each dimension $d$. The tracing mode action is described in the following steps.
Step 1: calculate $d^1_{kd}$ and $d^0_{kd}$, where $X_{gd}$ is dimension $d$ of the best cat, $r_1$ takes random values in the range $[0,1]$, and $c_1$ is a user-defined constant:
$$d^1_{kd} = r_1 c_1,\; d^0_{kd} = -r_1 c_1 \quad \text{if } X_{gd} = 1; \qquad d^1_{kd} = -r_1 c_1,\; d^0_{kd} = r_1 c_1 \quad \text{if } X_{gd} = 0.$$
Step 2: update $V^1_{kd}$ and $V^0_{kd}$, where $w$ is the inertia weight and $M$ is the number of columns:
$$V^1_{kd} = w V^1_{kd} + d^1_{kd}, \qquad V^0_{kd} = w V^0_{kd} + d^0_{kd}, \qquad d = 1, \ldots, M.$$
Step 3: calculate the velocity of cat_k, $V'_{kd}$, according to
$$V'_{kd} = \begin{cases} V^1_{kd} & \text{if } X_{kd} = 0, \\ V^0_{kd} & \text{if } X_{kd} = 1. \end{cases}$$
Step 4: calculate the probability of mutation in each dimension, defined by the parameter $t_{kd}$, which takes a value in the interval $[0,1]$:
$$t_{kd} = \frac{1}{1 + e^{-V'_{kd}}}.$$
Step 5: based on the value of $t_{kd}$, the new value of each dimension of the cat is updated as follows:
$$X_{kd} = \begin{cases} X_{gd} & \text{if } \mathrm{rand} < t_{kd}, \\ X_{kd} & \text{otherwise,} \end{cases}$$
where rand is a random number in $[0,1]$.

The velocity $V'_{kd}$ must be limited to a maximum value $V_{\max}$.

If the value of $V'_{kd}$ surpasses $V_{\max}$, then $V_{\max}$ is selected for the corresponding velocity dimension.
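The following Java fragment is our own sketch of one tracing-mode step implementing the update equations above; the values chosen for w, c1, and Vmax in the example are illustrative only, since their exact settings are not listed here.

```java
import java.util.Random;

public class TracingMode {
    static final Random RNG = new Random();

    // One tracing-mode step for a cat: updates its two velocity vectors and
    // moves each bit towards the best cat's bit with a sigmoid-based probability.
    static void trace(boolean[] cat, boolean[] bestCat,
                      double[] vToOne, double[] vToZero,
                      double w, double c1, double vMax) {
        for (int d = 0; d < cat.length; d++) {
            double r1 = RNG.nextDouble();
            double d1 = bestCat[d] ? r1 * c1 : -r1 * c1;    // Step 1
            double d0 = -d1;
            vToOne[d] = w * vToOne[d] + d1;                 // Step 2
            vToZero[d] = w * vToZero[d] + d0;
            double v = cat[d] ? vToZero[d] : vToOne[d];     // Step 3
            v = Math.max(-vMax, Math.min(vMax, v));         // limit velocity to Vmax
            double t = 1.0 / (1.0 + Math.exp(-v));          // Step 4: mutation probability
            if (RNG.nextDouble() < t) cat[d] = bestCat[d];  // Step 5
        }
    }

    public static void main(String[] args) {
        int m = 16;
        boolean[] cat = new boolean[m], best = new boolean[m];
        for (int d = 0; d < m; d++) { cat[d] = RNG.nextBoolean(); best[d] = RNG.nextBoolean(); }
        trace(cat, best, new double[m], new double[m], 0.7, 2.0, 4.0);  // illustrative parameters
    }
}
```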

The following is a flow chart for a cat in the tracing mode (Figure 2).

Figure 2: Tracing mode.

5. Solving the Manufacturing Cell Design Problem (MCDP)

To solve the MCDP, it is essential to use a repair method for solutions that are not feasible. Algorithm 2 describes the pseudocode used to solve the MCDP.

Algorithm 2: Solving MCDP.

6. Repair Method

A candidate solution may not satisfy the constraints, resulting in an infeasible solution. For this reason, rather than discarding the matrix, the value that violates a constraint is repaired. In this section, a function is described that transforms infeasible solutions into feasible ones.

Thus, Algorithm 3 presents a repair method in which all rows (machines) that are not covered are identified and assigned accordingly, so that all constraints are satisfied.

Algorithm 3: Repairing solutions.
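Since the body of Algorithm 3 is not reproduced here, the following Java sketch shows one plausible repair that is consistent with the description above and should not be read as the authors' exact procedure: each machine is forced to belong to exactly one cell, and machines exceeding a cell's capacity Mmax are moved to the least-loaded cell with free space.

```java
public class RepairSketch {

    // Repairs a machine-cell matrix y (machines x cells) in place so that every
    // machine belongs to exactly one cell and no cell holds more than mMax machines.
    static void repair(int[][] y, int mMax) {
        int machines = y.length, cells = y[0].length;
        int[] load = new int[cells];

        // Ensure each row (machine) is covered by exactly one cell.
        for (int i = 0; i < machines; i++) {
            int assigned = -1;
            for (int k = 0; k < cells; k++) {
                if (y[i][k] == 1) {
                    if (assigned == -1) assigned = k;
                    else y[i][k] = 0;                 // drop duplicate assignments
                }
            }
            if (assigned == -1) {                     // uncovered row: assign to least-loaded cell
                assigned = leastLoaded(load);
                y[i][assigned] = 1;
            }
            load[assigned]++;
        }

        // Move surplus machines out of overfull cells.
        for (int k = 0; k < cells; k++) {
            for (int i = 0; i < machines && load[k] > mMax; i++) {
                if (y[i][k] == 1) {
                    int target = leastLoaded(load);
                    if (target == k || load[target] >= mMax) break;  // no cell with free space
                    y[i][k] = 0; y[i][target] = 1;
                    load[k]--; load[target]++;
                }
            }
        }
    }

    static int leastLoaded(int[] load) {
        int best = 0;
        for (int k = 1; k < load.length; k++) if (load[k] < load[best]) best = k;
        return best;
    }

    public static void main(String[] args) {
        int[][] y = { {1, 1}, {0, 0}, {1, 0}, {1, 0} };   // toy: 4 machines, 2 cells
        repair(y, 2);
        for (int[] row : y) System.out.println(java.util.Arrays.toString(row));
    }
}
```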

7. Autonomous Search

Autonomous Search (AS) is a modern approach that allows the solver to automatically reconfigure its resolution parameters to provide better performance when bad results are detected [40].

In this context, performance is assessed through indicators that collect relevant information during the search. Search parameters are then updated advantageously according to the results obtained by the fitness evaluation.

This approach has been effectively applied to different optimization and satisfaction techniques, such as Constraint Programming [41], SAT [42], mixed integer programming [43, 44], and various other metaheuristic techniques [4547].

In the present investigation, a version of BCSO with Autonomous Search has been implemented, where the mixture ratio (MR) is used as the autonomous parameter; i.e., the MR value changes while the program is executed, yielding a more dynamic algorithm, since MR directly influences the mode that each cat will take.

Algorithm 4 is the pseudocode describing the Autonomous Search BCSO.

Algorithm 4: Autonomous search.
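The exact adaptation rule is part of Algorithm 4. Purely as an illustration of the idea, the snippet below shows one hypothetical way a dynamic MR could react to search feedback, raising MR (more seeking, i.e., exploration) when the best fitness stagnates and lowering it while the search keeps improving; the class name, thresholds, and step sizes are invented for this example.

```java
// Hypothetical dynamic mixture-ratio update (illustrative only, not Algorithm 4):
// widen exploration when the best fitness has not improved for a while,
// and narrow it again while the search keeps improving.
public class DynamicMixtureRatio {
    private double mr = 0.75;          // initial MR, as in Section 8
    private int stagnantIterations = 0;
    private int bestFitnessSoFar = Integer.MAX_VALUE;

    double update(int bestFitnessThisIteration) {
        if (bestFitnessThisIteration < bestFitnessSoFar) {
            bestFitnessSoFar = bestFitnessThisIteration;
            stagnantIterations = 0;
            mr = Math.max(0.5, mr - 0.01);   // improving: slightly favour tracing
        } else if (++stagnantIterations > 50) {
            mr = Math.min(0.95, mr + 0.01);  // stagnating: slightly favour seeking
        }
        return mr;                            // fraction of cats sent to seeking mode
    }

    public static void main(String[] args) {
        DynamicMixtureRatio dmr = new DynamicMixtureRatio();
        int[] fakeBestFitness = {40, 38, 38, 38, 35};   // illustrative feedback values
        for (int f : fakeBestFitness) System.out.println("MR = " + dmr.update(f));
    }
}
```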

8. Results

The implementation of BCSO for the MCDP led to the results presented in the following sections. The metaheuristic was programmed in the Java programming language. For the execution of the algorithm, the following parameters were considered:
(i) Iterations = 5000
(ii) Number of cats = 30
(iii) MR = 0.75 (75% seeking; 25% tracing)
(iv) SMP = 15
(v) CDC = 0.2
(vi) PMO = 0.76
(vii)–(ix) the tracing-mode parameters w, c1, and Vmax

9. Boctor Instances

Tests with the implemented solution were carried out based on 90 instances of 16 × 30 matrices, obtained from 10 problems found in the paper of Boctor [48], hereafter called Boctor Instances. These problems included the use of 2 or 3 cells. In the case of 2 cells, the maximum number of machines (Mmax) in each took values between 8 and 12. In the case of 3, Mmax varied between 6 and 9 machines per cell. In both cases, the value of Mmax remained constant throughout the execution of the algorithm.

The values obtained by submitting each problem to the classic BCSO and to BCSO with Autonomous Search are summarized in Tables 1–9, where “O” denotes the global optimum given in [48]; “BCSO,” the best value obtained by the BCSO proposed here; “A,” the average of the optima obtained; “I,” the average number of iterations in which the optimum is reached; “Ms,” the time (in milliseconds) used to reach the optimum; and “RPD,” the Relative Percent Difference, calculated as
$$\mathrm{RPD} = \frac{Z - Z_{\mathrm{opt}}}{Z_{\mathrm{opt}}} \times 100,$$
where $Z_{\mathrm{opt}}$ is the best known optimal value and $Z$ is the best value achieved by BCSO.
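For example, if the best known optimum for an instance were $Z_{\mathrm{opt}} = 11$ and BCSO obtained $Z = 12$, the RPD would be $((12 - 11)/11) \times 100 \approx 9.09\%$; an RPD of 0% therefore indicates that the known optimum was reached.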

Table 1: Experimental results with cell = 2 and Mmax = 8.
Table 2: Experimental results with cell = 2 and Mmax = 9.
Table 3: Experimental results with cell = 2 and Mmax = 10.
Table 4: Experimental results with cell = 2 and Mmax = 11.
Table 5: Experimental results with cell = 2 and Mmax = 12.
Table 6: Experimental results with cell = 3 and Mmax = 6.
Table 7: Experimental results with cell = 3 and Mmax = 7.
Table 8: Experimental results with cell = 3 and Mmax = 8.
Table 9: Experimental results with cell = 3 and Mmax = 9.

The above results were obtained from 40 runs of each of the 90 Boctor Instances. It is important to point out that the known optimum was reached in 100% of them, showing that BCSO can handle any of these MCDP instances. The performance of the BCSO metaheuristic in its Autonomous Search version was slightly better, as shown by some of the averages of optima reached in the experimental results.

10. Other Author Instances

To analyze the effectiveness of the implemented algorithm on a wider range of problems, new instances from different authors were investigated. Matrix sizes range from 5 to 40 machines and from 7 to 100 parts. Table 10 lists the instances used:

Table 10: New instances from other authors.

In order to better assess the behavior exhibited by the autonomous version of Binary Cat Swarm Optimization, we performed a detailed comparison using these new instances, since they are harder. This comparison includes two well-known metaheuristics: the first is inspired by the behavior of the Egyptian vulture (EVOA) [71], and the second mimics the flashing behavior of fireflies (MBFA) [72]. Table 11 reports the comparison between our proposal and the methods published in [73].

Table 11: Comparison of classic BCSO with EVOA and MBFA.

Observing the results shown for instances CF01 to CF11, we can conclude that BCSO presents a performance similar to EVOA: in both cases the optimal values are reached, and the worst and mean values are equal. This behavior can be attributed to the similarity of the operations in both algorithms. If we now evaluate MBFA with respect to BCSO, we can report a similar conclusion; nevertheless, in CF05 and CF07, BCSO achieves two optimal values that are not reached by MBFA.

From CF12 onwards, BCSO begins to exhibit an outstanding performance. For instance, in CF12, BCSO is the only method that finds the best solution (the optimum value), reaching an RPD of 0%, while its closest competitor (MBFA) obtains an RPD of 28.57%. The most significant difference, however, can be seen from CF15: in this instance, BCSO exhibits higher efficiency than EVOA and surpasses the value reached by MBFA. Moreover, for any instance between CF16 and CF35 (more than 57% of the instances), BCSO exceeds the two compared approaches in terms of the best, average, and worst values found. Therefore, we can state that BCSO is more than a competitive technique; it is a real alternative for solving the Manufacturing Cell Design Problem.

Now, the values obtained by submitting each problem to classic BCSO and to BCSO with Autonomous Search are summarized in Table 12, where the global optima are those given in [74].

Table 12: Experimental results for new instances.

The above results were obtained after 40 executions for each of the 35 new instances. It should be noted that the known optima were reached in 100% of cases for both algorithms, showing that BCSO can work with almost any instance. The performance of the BCSO metaheuristic in its Autonomous Search version was slightly better, as shown by some of the optima achieved, improving by 3% with respect to the original.

11. Results for Boctor Instances Using BCSO and BCSO with Autonomous Search

Figures 3 and 4 show the results of the experiments conducted for the Boctor Instances presented above. Thanks to the operation mode of BCSO, fast convergence to the optimum is obtained for C = 2; when C = 3, BCSO does not converge as quickly. That said, the optimum is reached in most cases before 100 iterations, which demonstrates the effectiveness of the proposed approach.

Figure 3: Graph showing the results of problem 7 for BCSO and BCSO AS with C = 3.

Figure 4 shows the results of problem 3, with C = 2 and Mmax = 8, over the iterations. Both versions converge quickly: the Autonomous Search BCSO reaches the optimum early (iteration 10), while the normal BCSO already settles at the optimum fitness of 5 by iteration 4.

Figure 4: Graph showing the results of problem 3 for BCSO and BCSO AS with C = 2.

The graph in Figure 3 shows the results of problem 7, with C = 3 and Mmax = 8, reaching the overall optimum in both cases at similar iterations: normal BCSO at iteration 30 and Autonomous Search BCSO at iteration 40.

12. Results for New Instances Using BCSO and BCSO with Autonomous Search

Figure 5 shows the results of the experiments performed for the new instances, in which it can be seen that the Autonomous Search algorithm helps the solution avoid getting trapped in local optima; however, not all results with Autonomous Search present an advantage over the original version.

Figure 5: Graph showing the results of problem 26 for BCSO and BCSO AS.

Figure 5 represents the results of problem 26, with M = 24, C = 12, and Mmax = 3, in which it can be seen that the Autonomous Search BCSO does not differ greatly from the normal BCSO; however, the Autonomous Search BCSO is able to explore new solutions, which allows it to achieve better results.

The graph in Figure 6 represents the results of problem 30, with M = 30, C = 14, and Mmax = 4, in which it can be seen that the Autonomous Search BCSO solutions continue to change without being trapped in a local optimum, whereas the normal BCSO becomes trapped near iteration 4000.

Figure 6: Graph showing the results of problem 30 for BCSO and BCSO AS.

The graph in Figure 7 represents the results of problem 35, with M = 40, C = 10, and Mmax = 6, in which the Autonomous Search BCSO solutions keep changing, exploring new solutions and expanding the search space early on, before iteration 3000, whereas the normal BCSO becomes trapped in a local optimum near iteration 1000.

Figure 7: Graph showing the results of problem 35 for BCSO and BCSO AS.

13. Conclusions

In the present investigation, an algorithm inspired by cat behavior, Binary Cat Swarm Optimization, was presented and applied to the Manufacturing Cell Design Problem, which concerns the placement of machinery in a manufacturing plant.

The proposed BCSO was implemented and tested using the 90 Boctor Instances plus the 35 new instances, for a total of 125 instances. BCSO managed to obtain 100% of the known optima in the 90 Boctor Instances, achieving rapid convergence and reduced execution times. In the case of the 35 new instances, it was possible to obtain 100% of the 13 known optima. It should be noted that these results were obtained after a long testing process, in which the different parameters of the algorithm were calibrated experimentally. For that reason, Autonomous Search was implemented as a mechanism to adjust parameters at run time, resulting in a dynamic MR that slightly improved the results obtained: 3% compared to the original version, with 100% of the known optima reached for both the 90 Boctor Instances and the 35 new instances.

As can be seen from the results, this metaheuristic behaves well in all observed cases. This research demonstrates that BCSO is a valid alternative for solving the MCDP. The algorithm works well, regardless of the scale of the problem. However, solutions obtained could be improved by using different parameters for each set of instances.

The BCSO performance increased significantly after selecting a good repair technique. However, this reliance on a repair method leads us not to recommend the algorithm for other types of problems, since it is far less efficient than other techniques on more complex problems.

For future research, a more extensible configuration could be developed to cover a wider set of problems. It would also be interesting to implement this technique in conjunction with other recent metaheuristics for which limited work on Autonomous Search exists, such as cuckoo search, firefly optimization, or bat algorithms [75]. Finally, hybridization with learning techniques is another interesting research line to pursue, where the feedback gathered in the self-tuning phase could be processed with machine learning in order to better guide the complete solving process.

Data Availability

The authors declare that the data used to support the findings of this study are available from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

Broderick Crawford was supported by the Grant CONICYT/FONDECYT/REGULAR/1171243, and Ricardo Soto was supported by the Grant CONICYT/FONDECYT/REGULAR/1160455.

References

1. H. M. Selim, R. G. Askin, and A. J. Vakharia, “Cell formation in group technology: review, evaluation and directions for future research,” Computers & Industrial Engineering, vol. 34, no. 1, pp. 3–20, 1998.
2. I. Ham, K. Hitomi, and T. Yoshida, Group Technology: Applications to Production Management, Springer Science & Business Media, Berlin, Germany, 2012.
3. H. Zhenggang, Z. Guo, and J. Wang, “Integrated scheduling of production and distribution operations in a global MTO supply chain,” Enterprise Information Systems, vol. 2018, pp. 1–25, 2018.
4. P. D. Medina, E. A. Cruz, and M. Pinzón, “Generación de celdas de manufactura usando el algoritmo de ordenamiento binario (AOB),” Scientia et Technica, vol. 1, no. 44, pp. 106–110, 2010.
5. J. L. Burbidge, The Introduction of Group Technology, Halsted Press, New York, NY, USA, 1975.
6. R. Soto, H. Kjellerstrand, O. Durán, B. Crawford, E. Monfroy, and F. Paredes, “Cell formation in group technology using constraint programming and boolean satisfiability,” Expert Systems with Applications, vol. 39, no. 13, pp. 11423–11427, 2012.
7. A. P. Engelbrecht, Fundamentals of Computational Swarm Intelligence, John Wiley & Sons, Hoboken, NJ, USA, 2006.
8. Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, “Discrete binary cat swarm optimization algorithm,” in Proceedings of the 2013 3rd International Conference on Computer, Control & Communication (IC4), pp. 1–6, IEEE, Karachi, Pakistan, September 2013.
9. S.-C. Chu and P.-W. Tsai, “Computational intelligence based on the behavior of cats,” International Journal of Innovative Computing, Information and Control, vol. 3, no. 1, pp. 163–173, 2007.
10. A. Kusiak, “The part families problem in flexible manufacturing systems,” Annals of Operations Research, vol. 3, no. 6, pp. 277–300, 1985.
11. M. Shargal, S. Shekhar, and S. A. Irani, “Evaluation of search algorithms and clustering efficiency measures for machine-part matrix clustering,” IIE Transactions, vol. 27, no. 1, pp. 43–59, 1995.
12. H. Seifoddini and C.-P. Hsu, “Comparative study of similarity coefficients and clustering algorithms in cellular manufacturing,” Journal of Manufacturing Systems, vol. 13, no. 2, pp. 119–127, 1994.
13. G. Beni and J. Wang, “Swarm intelligence in cellular robotic systems,” in Robots and Biological Systems: Towards a New Bionics?, pp. 703–712, Springer, Berlin, Germany, 1993.
14. J. F. Kennedy, J. Kennedy, R. C. Eberhart, and Y. Shi, Swarm Intelligence, Morgan Kaufmann, Burlington, MA, USA, 2001.
15. M. Dorigo, M. Birattari, and T. Stutzle, “Ant colony optimization,” IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.
16. F. Olivas, F. Valdez, O. Castillo, C. I. Gonzalez, G. Martinez, and P. Melin, “Ant colony optimization with dynamic parameter adaptation based on interval type-2 fuzzy logic systems,” Applied Soft Computing, vol. 53, pp. 74–87, 2017.
17. R. Soto, B. Crawford, B. Almonacid, and F. Paredes, “A migrating birds optimization algorithm for machine-part cell formation problems,” in Proceedings of the Mexican International Conference on Artificial Intelligence, pp. 270–281, Springer, Cuernavaca, Mexico, October 2015.
18. G. Srinivasan, “A clustering algorithm for machine cell formation in group technology using minimum spanning trees,” International Journal of Production Research, vol. 32, no. 9, pp. 2149–2158, 2007.
19. J. Kennedy and R. C. Eberhart, “A discrete binary version of the particle swarm algorithm,” in Proceedings of the 1997 IEEE International Conference on Computational Cybernetics and Simulation, pp. 4104–4108, IEEE, Orlando, FL, USA, October 1997.
20. J. So and W. Jenkins, “Comparison of cat swarm optimization with particle swarm optimization for IIR system identification,” in Proceedings of the 2013 Asilomar Conference on Signals, Systems and Computers, pp. 903–910, IEEE, Pacific Grove, CA, USA, November 2013.
21. M. A. Khanesar, M. Teshnehlab, and M. A. Shoorehdeli, “A novel binary particle swarm optimization,” in Proceedings of the 2007 Mediterranean Conference on Control & Automation (MED’07), pp. 1–6, IEEE, Marrakech, Morocco, June 2007.
22. S. A. Irani, Handbook of Cellular Manufacturing Systems, John Wiley & Sons, Hoboken, NJ, USA, 1999.
23. V. Aspinall, Complete Textbook of Veterinary Nursing, Butterworth Heinemann, Oxford, UK, 2006.
24. B. Pallaud, “Hypotheses on mechanisms underlying observational learning in animals,” Behavioural Processes, vol. 9, no. 4, pp. 381–394, 1984.
25. J. Dards, “Feral cat behaviour and ecology,” Bulletin of the Feline Advisory Bureau, vol. 15, 1976.
26. A. Yamane, T. Doi, and Y. Ono, “Mating behaviors, courtship rank and mating success of male feral cat (Felis catus),” Journal of Ethology, vol. 14, no. 1, pp. 35–44, 1996.
27. S. Crowell-Davis, “Cat behaviour: social organization, communication and development,” in The Welfare of Cats, I. Rochlitz, Ed., pp. 1–22, Springer, Berlin, Germany, 2005.
28. W. Sung, “Effect of gender on initiation of proximity in free ranging domestic cats (Felis catus),” M.Sc. thesis, University of Georgia, Athens, GA, USA, 1998.
29. R. E. Adamec, “The interaction of hunger and preying in the domestic cat (Felis catus): an adaptive hierarchy?” Behavioral Biology, vol. 18, no. 2, pp. 263–272, 1976.
30. H. Adler, “Some factors of observation learning in cats,” Journal of Genetic Psychology, vol. 86, no. 1, pp. 159–177, 1995.
31. B. Santosa and M. K. Ningrum, “Cat swarm optimization for clustering,” in Proceedings of the 2009 International Conference of Soft Computing and Pattern Recognition, pp. 54–59, Malacca, Malaysia, December 2009.
32. G. Panda, P. M. Pradhan, and B. Majhi, “IIR system identification using cat swarm optimization,” Expert Systems with Applications, vol. 38, no. 10, pp. 12671–12683, 2011.
33. P.-W. Tsai, J.-S. Pan, S.-M. Chen, and B.-Y. Liao, “Enhanced parallel cat swarm optimization based on the Taguchi method,” Expert Systems with Applications, vol. 39, no. 7, pp. 6309–6319, 2012.
34. G.-G. Wang, A. H. Gandomi, X. Zhao, and H. C. E. Chu, “Hybridizing harmony search algorithm with cuckoo search for global numerical optimization,” Soft Computing, vol. 20, no. 1, pp. 273–285, 2014.
35. M. Yazdani and F. Jolai, “Lion optimization algorithm (LOA): a nature-inspired metaheuristic algorithm,” Journal of Computational Design and Engineering, vol. 3, no. 1, pp. 24–36, 2016.
36. X. Zhang, K. Tang, S. Li, K. Xia, and D. Zhao, “Design of slow-wave structure based on multi-objective quantum particle swarm optimization algorithm with inertia weight,” Chinese Journal of Vacuum Science & Technology, vol. 30, no. 6, pp. 651–656, 2010.
37. D. Zhao, K. Xia, H. Liu, and X. Shi, “A pitch distribution in slow-wave structure of STWT using Cauchy mutated cat swarm optimization with gravitational search operator,” Journal of the Chinese Institute of Engineers, vol. 41, no. 4, pp. 297–307, 2018.
38. M. Zhao, “A novel compact cat swarm optimization based on differential method,” Enterprise Information Systems, vol. 2018, pp. 1–25, 2018.
39. C. Peraza, F. Valdez, and P. Melin, “Optimization of intelligent controllers using a type-1 and interval type-2 harmony search algorithm,” Algorithms, vol. 10, no. 3, p. 82, 2017.
40. Y. Hamadi, E. Monfroy, and F. Saubion, “What is autonomous search?” in Hybrid Optimization, Springer, Berlin, Germany, 2011.
41. B. Crawford, R. Soto, E. Monfroy, W. Palma, C. Castro, and F. Paredes, “Parameter tuning of a choice-function based hyperheuristic using particle swarm optimization,” Expert Systems with Applications, vol. 40, no. 5, pp. 1690–1695, 2013.
42. F. Hutter, Y. Hamadi, H. H. Hoos, and K. Leyton-Brown, “Performance prediction and automated tuning of randomized and parametric algorithms,” in Proceedings of the International Conference on Principles and Practice of Constraint Programming, pp. 213–228, Springer, Nantes, France, September 2006.
43. F. Hutter, H. H. Hoos, and K. Leyton-Brown, “Automated configuration of mixed integer programming solvers,” in Proceedings of the International Conference on Integration of Artificial Intelligence (AI) and Operations Research (OR) Techniques in Constraint Programming, pp. 186–202, Springer, Bologna, Italy, June 2010.
44. F. Hutter, H. H. Hoos, and K. Leyton-Brown, “Sequential model-based optimization for general algorithm configuration,” in Proceedings of the International Conference on Learning and Intelligent Optimization, pp. 507–523, Springer, Rome, Italy, January 2011.
45. J. Maturana, F. Lardeux, and F. Saubion, “Autonomous operator management for evolutionary algorithms,” Journal of Heuristics, vol. 16, no. 6, pp. 881–909, 2010.
46. J. Maturana and F. Saubion, “On the design of adaptive control strategies for evolutionary algorithms,” in Proceedings of the International Conference on Artificial Evolution (Evolution Artificielle), pp. 303–315, Springer, Tours, France, October 2007.
47. J. Maturana and F. Saubion, “A compass to guide genetic algorithms,” in Proceedings of the International Conference on Parallel Problem Solving from Nature, pp. 256–265, Springer, Birmingham, UK, September 2008.
48. F. F. Boctor, “A linear formulation of the machine-part cell formation problem,” International Journal of Production Research, vol. 29, no. 2, pp. 343–356, 1991.
49. J. R. King and V. Nakornchai, “Machine-component group formation in group technology: review and extension,” International Journal of Production Research, vol. 20, no. 2, pp. 117–133, 2007.
50. P. H. Waghodekar and S. Sahu, “Machine-component cell formation in group technology: MACE,” International Journal of Production Research, vol. 22, no. 6, pp. 937–948, 2007.
51. H. Seifoddini, “A note on the similarity coefficient method and the problem of improper machine assignment in group technology applications,” International Journal of Production Research, vol. 27, no. 7, pp. 1161–1165, 2007.
52. A. Kusiak and M. Cho, “Similarity coefficient algorithms for solving the group technology problem,” International Journal of Production Research, vol. 30, no. 11, pp. 2633–2646, 2007.
53. A. Kusiak and W. S. Chow, “Efficient solving of the group technology problem,” Journal of Manufacturing Systems, vol. 6, no. 2, pp. 117–124, 1987.
54. H. Seifoddini and P. M. Wolfe, “Application of the similarity coefficient method in group technology,” IIE Transactions, vol. 18, no. 3, pp. 271–277, 1986.
55. M. P. Chandrasekharan and R. Rajagopalan, “MODROC: an extension of rank order clustering for group technology,” International Journal of Production Research, vol. 24, no. 5, pp. 1221–1233, 1986.
56. M. P. Chandrasekharan and R. Rajagopalan, “An ideal seed non-hierarchical clustering algorithm for cellular manufacturing,” International Journal of Production Research, vol. 24, no. 2, pp. 451–463, 1986.
57. C. Mosier and L. Taube, “The facets of group technology and their impacts on implementation: a state-of-the-art survey,” Omega, vol. 13, no. 5, pp. 381–391, 1985.
58. H. M. Chan and D. A. Milner, “Direct clustering algorithm for group formation in cellular manufacture,” Journal of Manufacturing Systems, vol. 1, no. 1, pp. 65–75, 1982.
59. R. G. Askin and S. P. Subramanian, “A cost-based heuristic for group technology configuration,” International Journal of Production Research, vol. 25, no. 1, pp. 101–113, 2007.
60. L. E. Stanfel, “Machine clustering for economic production,” Engineering Costs and Production Economics, vol. 9, no. 1–3, pp. 73–81, 1985.
61. W. T. McCormick Jr., P. J. Schweitzer, and T. W. White, “Problem decomposition and data reorganization by a clustering technique,” Operations Research, vol. 20, no. 5, pp. 993–1009, 1972.
62. G. Srinivasan, T. Narendran, and B. Mahadevan, “An assignment model for the part families problem in group technology,” International Journal of Production Research, vol. 28, no. 1, pp. 145–152, 1990.
63. J. R. King, “Machine-component grouping in production flow analysis: an approach using a rank order clustering algorithm,” International Journal of Production Research, vol. 18, no. 2, pp. 213–232, 2007.
64. A. S. Carrie, “Numerical taxonomy applied to group technology and plant layout,” International Journal of Production Research, vol. 11, no. 4, pp. 399–416, 1973.
65. C. Mosier and L. Taube, “Weighted similarity measure heuristics for the group technology machine clustering problem,” Omega, vol. 13, no. 6, pp. 577–579, 1985.
66. K. R. Kumar, A. Kusiak, and A. Vannelli, “Grouping of parts and components in flexible manufacturing systems,” European Journal of Operational Research, vol. 24, no. 3, pp. 387–397, 1986.
67. W. J. Boe and C. H. Cheng, “A close neighbour algorithm for designing cellular manufacturing systems,” International Journal of Production Research, vol. 29, no. 10, pp. 2097–2116, 1991.
68. M. P. Chandrasekharan and R. Rajagopalan, “GROUPABILITY: an analysis of the properties of binary data matrices for group technology,” International Journal of Production Research, vol. 27, no. 6, pp. 1035–1052, 2007.
69. K. R. Kumar and A. Vannelli, “Strategic subcontracting for efficient disaggregated manufacturing,” BEBR Faculty Working Paper no. 1252, University of Illinois, Champaign, IL, USA, 1986.
70. M. P. Chandrasekharan and R. Rajagopalan, “ZODIAC: an algorithm for concurrent formation of part-families and machine-cells,” International Journal of Production Research, vol. 25, no. 6, pp. 835–850, 2007.
71. C. Sur, S. Sharma, and A. Shukla, “Egyptian vulture optimization algorithm: a new nature inspired meta-heuristics for knapsack problem,” in Proceedings of the 9th International Conference on Computing and Information Technology (IC2IT 2013), pp. 227–237, Bangkok, Thailand, May 2013.
72. A. Ritthipakdee, A. Thammano, N. Premasathian, and D. Jitkongchuen, “Firefly mating algorithm for continuous optimization problems,” Computational Intelligence and Neuroscience, vol. 2017, Article ID 8034573, 10 pages, 2017.
73. B. Almonacid, F. Aspée, R. Soto, B. Crawford, and J. Lama, “Solving the manufacturing cell design problem using the modified binary firefly algorithm and the Egyptian vulture optimisation algorithm,” IET Software, vol. 11, no. 3, pp. 105–115, 2017.
74. B. Almonacid, F. Aspée, R. Soto, B. Crawford, and J. Lama, Solving Manufacturing Cell Design Problem Using Modified Binary Firefly Algorithm and Egyptian Vulture Optimization Algorithm, IET Software, Wales, UK, 2016.
75. R. Soto, B. Crawford, B. Almonacid, and F. Paredes, “Efficient parallel sorting for migrating birds optimization when solving machine-part cell formation problems,” Scientific Programming, vol. 2016, Article ID 9402503, 39 pages, 2016.