Discrete Dynamics in Nature and Society
Volume 2012 (2012), Article ID 698057, 28 pages
Research Article

Bacterial Colony Optimization

Ben Niu and Hong Wang

1College of Management, Shenzhen University, Shenzhen 518060, China
2Hefei Institute of Intelligent Machines, Chinese Academy of Sciences, Hefei 230031, China

Received 27 May 2012; Accepted 24 August 2012

Academic Editor: Binggen Zhang

Copyright © 2012 Ben Niu and Hong Wang. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


This paper investigates the behaviors at different developmental stages of the Escherichia coli (E. coli) lifecycle and develops a new biologically inspired optimization algorithm named bacterial colony optimization (BCO). BCO is based on a lifecycle model that simulates some typical behaviors of E. coli bacteria during their whole lifecycle, including chemotaxis, communication, elimination, reproduction, and migration. A newly created chemotaxis strategy combined with a communication mechanism is developed to simplify bacterial optimization; this combined mechanism operates over the whole optimization process, whereas the other behaviors, elimination, reproduction, and migration, are executed only when given conditions are satisfied. Two types of interactive communication schemes, an individual exchange scheme and a group exchange scheme, are designed to improve the optimization efficiency. In the simulation studies, a set of 12 benchmark functions belonging to three classes (unimodal, multimodal, and rotated problems) is used, and the performance of the proposed algorithm is compared with that of five recent evolutionary algorithms to demonstrate the superiority of BCO.

1. Introduction

Swarm intelligence is the emergent collective intelligent behavior of a large number of autonomous individuals. It provides an alternative way to design novel intelligent algorithms for solving complex real-world problems. Different from conventional computing paradigms [1–3], such algorithms are free of central control, and the search result of the group is not affected by individual failures. Moreover, swarm intelligence algorithms maintain a population of potential solutions to a problem instead of only one solution.

Nowadays, most swarm intelligence optimization algorithms are inspired by the behavior of animals of higher complexity. Particle swarm optimization (PSO) [4, 5] gleaned its ideas from the swarm behavior of bird flocking and fish schooling. Ant colony optimization (ACO) was motivated by the foraging behavior of ants [6, 7]. The artificial fish swarm algorithm (AFSA) originated in the swarming behavior of fish [8], and the artificial bee colony algorithm (ABCA) [9, 10] was stimulated by the social specialization behavior of bees. However, the states of the above animals are relatively complex, and their behaviors are difficult to describe precisely.

As prokaryotes, bacteria behave in a simple pattern that can be easily described. Inspired by the foraging behavior of Escherichia coli (E. coli) in the human intestine, Passino proposed an optimization algorithm known as bacterial foraging optimization (BFO) [11]. In the same year, another well-known study based on bacterial behavior, bacteria chemotaxis (BC), was presented by Müller et al. [12]. These two papers opened a new horizon in bacteria-inspired optimization. Since then, a growing number of researchers have turned their attention to this field [13–15] and extended the concept to algorithms such as the fast bacterial swarming algorithm (FBSA) [16], the bacterial-GA foraging algorithm [17], and the adaptive bacterial foraging algorithm (ABFO) [18]. Besides these newly proposed algorithms, improvements of bacterial foraging optimization (BFO) have also been investigated. Some of the key models involve the chemotaxis step length (BFO-LDC and BFO-NDC [19]) or an adaptive chemotaxis step [20]. This research on bacterial foraging optimization suggests that predicting and controlling the dynamical behavior of microbial colonies may have profound practical implications.

However, traditional bacterial behavior-based algorithms (BFO and BC) only consider individual behaviors rather than the social behaviors underlying swarm intelligence. Each individual in the colony searches for food independently, based on its own experience, without any information exchange with others. What makes the situation worse is the complicated structure of the original bacterial behavior-based algorithms. Taking BFO as an example, a long period of time is spent on random chemotaxis. Additionally, the chemotaxis, reproduction, and elimination-dispersal processes are nested inner iterations, which leads to high computational complexity, and the frequencies of chemotaxis, reproduction, and elimination-dispersal are predefined without considering the dynamics of the environment.

To deal with the aforementioned problems, we propose a new bacterial behavior-based optimization algorithm, bacterial colony optimization (BCO), to formulate bacterial behavior with swarm intelligence. The main contributions of this paper are as follows.
(i) A new description of the artificial bacteria lifecycle is formulated, including five basic behaviors and their corresponding models.
(ii) A newly created bacterial behavior model is proposed to simplify the bacterial optimization process.
(iii) A novel chemotaxis and communication mechanism is used to update the bacterium positions.
(iv) Two methods of communication, individual interaction and group exchange, are introduced to improve the optimization efficiency.

The rest of the paper is organized as follows. Section 2 describes the basic behaviors of artificial bacteria; their corresponding models and the proposed algorithm are presented in Section 3. Section 4 presents the results of the simulation studies, followed by the conclusions in Section 5.

2. Artificial Bacteria Behavior

Bacteria swim by rotating whip-like flagella driven by a reversible motor embedded in the cell wall (Figure 1(a)). As the environment changes, the fittest individuals survive while the poorer ones are eliminated. After a while, the surviving bacteria generate new offspring (Figure 1(b)). With the depletion of nutrients and the growth of the population, the resources can no longer sustain the colony, and some individuals may migrate to a new area with a better nutrient supply (Figure 1(c)). During the search for a new habitat, information sharing and self-experience are both essential (Figure 1(c)). This process brings together the macro- and microscales, allowing bacteria to make chemotactic movements as well as to interact with other bacteria. This is realized by incorporating the mechanisms of chemotaxis and communication into the whole optimization process. In addition to chemotaxis and communication, three further mechanisms, reproduction, elimination, and migration, are viewed as evolutionary operators that cover the entire search process (Figure 1(d)).

Figure 1: Individual and social behavior of artificial bacteria.

The basic behaviors of bacteria in the lifecycle can be divided into five parts: chemotaxis, elimination, reproduction, migration, and communication. Detailed descriptions of these processes are given as follows.

2.1. Chemotaxis

A fascinating property of E. coli is its chemotactic behavior. Flagellated E. coli bacteria produce motion by the movement of their flagella. The whole process of chemotaxis consists of two operations: running and tumbling. During a run, all flagella rotate counterclockwise to form a compact bundle that propels the cell along a helical trajectory, so the bacterium swims straight in one direction. During a tumble, the flagella rotate clockwise, which pulls the bacterium into a different direction [11].

Bacteria need to migrate up the concentration gradient of nutrients. Hence, the alternation between the two operations of chemotaxis must be well organized. A basic strategy used by microbes is to move in one direction for several steps. If the new environment does not satisfy the bacteria, they tumble to pull themselves into a new direction and start a second run.
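
This run-and-tumble strategy can be sketched in Python. The step size, run count, and acceptance rule below are illustrative assumptions for a minimization setting, not the full BCO update:

```python
import random

def run_and_tumble(f, x, step=0.1, runs=50, dim=2):
    """Minimal run-and-tumble sketch: keep swimming in the current direction
    while fitness improves; tumble to a fresh random direction otherwise."""
    direction = [random.uniform(-1, 1) for _ in range(dim)]
    best = f(x)
    for _ in range(runs):
        trial = [xi + step * di for xi, di in zip(x, direction)]
        if f(trial) < best:                 # run: the move helped, keep going
            x, best = trial, f(trial)
        else:                               # tumble: pick a new random direction
            direction = [random.uniform(-1, 1) for _ in range(dim)]
    return x, best

# toy usage on the sphere function, starting from an arbitrary point
random.seed(1)
x, fx = run_and_tumble(lambda p: sum(v * v for v in p), [2.0, -1.5])
```

Since only improving moves are accepted, the returned fitness can never be worse than that of the starting point.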

2.2. Elimination, Reproduction, and Migration

Based on the theory of natural selection, bacteria with poorer searching ability have a higher chance of being eliminated. In contrast, those that perform well in the chemotaxis process obtain more energy for survival and thus have a high probability of reproduction. In our proposed model, bacterial colony optimization (BCO), artificial bacteria that search well for nutrients are endowed with a correspondingly high energy grade. Whether a bacterium has the chance to reproduce depends on the level of its energy grade.

After a long period of chemotaxis, elimination, and reproduction in the same area, the nutrients are used up or can no longer satisfy all the bacteria. At this point, some bacteria have to move to a new nutritious place; this process is called "migration."

2.3. Communication

Communication is an essential behavior that exists throughout a bacterium's life. Three basic communication mechanisms are employed in bacterial colony optimization (see Figure 2): dynamic neighbor oriented (Figure 2(a)), random oriented (Figure 2(b)), and group oriented (Figure 2(c)), representing three types of topological structures. Bacteria share information between individuals in the first two exchange mechanisms, while in the third mechanism (group oriented) bacteria exchange search information in units of groups.

Figure 2: Communication mechanism.

3. Bacterial Colony Optimization Principle

3.1. Lifecycle Model

The behavior of artificial bacteria in this paper includes five parts, but these behaviors are continuous and intertwined. Chemotaxis is always accompanied by communication throughout the lifecycle, so chemotaxis and communication are treated as one model in bacterial colony optimization (BCO). After a long period of chemotaxis and communication, bacteria face two possible outcomes: they may die for lack of food, or they may reproduce if they are capable of finding food. Within a complicated environment, some individuals may also run into dangerous places (go out of the boundary); situations like this are worth special treatment in the lifecycle model (LCM). Migration acts as an independent model, driven by energy depletion, group diversity, and chemotaxis efficiency. The overall model of the bacterial lifecycle is shown in Figure 3.

Figure 3: Lifecycle model.

The framework of this model is based on an agent-environment-rule (AER) schema; that is, there are three fundamental elements: agent, environment, and rule. The detailed description is listed below:
(i) A: artificial bacteria,
(ii) E: artificial environment,
(iii) R: the environment/organism interaction mechanisms.
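
To make the AER schema concrete, here is a minimal Python sketch; the class and function names are hypothetical, chosen only to mirror the three elements:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Bacterium:                      # A: an artificial bacterium (agent)
    position: List[float]
    energy: float = 0.0

@dataclass
class Environment:                    # E: the artificial environment
    lb: float
    ub: float
    nutrient: Optional[Callable[[List[float]], float]] = None

def in_bounds(agent: Bacterium, env: Environment) -> bool:
    """R: one environment/organism interaction rule (a boundary check)."""
    return all(env.lb <= v <= env.ub for v in agent.position)

env = Environment(lb=-5.0, ub=5.0)
ok = in_bounds(Bacterium(position=[1.0, -2.0]), env)
```

Further rules (elimination, reproduction, migration triggers) would be expressed the same way: as functions of an agent and its environment.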

The LCM differs from the original population-based model, in which all individuals share the same state properties. The LCM instead embraces the uniqueness of the individuals in a system: each member has its own set of state variables and parameters. Viewed in state space, the population resembles clouds of individuals with similar behaviors, with other clouds amounting to separate individuals. Fundamentally, this allows individuals to exist, speciation to occur, and extinction to happen. In general, the lifecycle model of artificial bacteria in BCO can be divided into four submodels: the chemotaxis and communication model, the reproduction model, the elimination model, and the migration model.

The detailed explanations of each submodel are formulated in the following.

3.2. Chemotaxis and Communication Model

Chemotaxis is accompanied by communication throughout the whole optimization process. Bacteria run and tumble in the competitive environment, but they also offer their information to the colony in exchange for overall information that guides the direction and manner of their movement. As shown in Figure 4, bacterial chemotaxis is directed by three elements: group information, personal previous information, and a random direction. All three factors guide the bacteria to run and tumble toward the optimum.

Figure 4: Chemotaxis.

The running or tumbling of a bacterium combined with the communication process can be formulated as

$$x_i(T) = x_i(T-1) + C(i)\left[f_i\,\big(G_{\text{best}} - x_i(T-1)\big) + (1-f_i)\,\big(P_{\text{best},i} - x_i(T-1)\big) + \Delta_i\right] \quad (\text{tumbling}),$$

$$x_i(T) = x_i(T-1) + C(i)\left[f_i\,\big(G_{\text{best}} - x_i(T-1)\big) + (1-f_i)\,\big(P_{\text{best},i} - x_i(T-1)\big)\right] \quad (\text{swimming}),$$

where $x_i(T)$ is the position of the $i$th bacterium at iteration $T$, $G_{\text{best}}$ and $P_{\text{best},i}$ are the global best position and the personal best position of the $i$th bacterium, $f_i$ weighs group information against personal information, $C(i)$ is the chemotaxis step size, and $\Delta_i$ is a random turbulent direction.

Actually, the above position updating equations only consider the relationship between the individuals and the group. The information sharing between individuals is also merged into the communication model. Pseudocode for E. coli chemotaxis and communication is listed in Pseudocode 1.

Pseudocode 1: Pseudocode for individual exchange.
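
As a concrete illustration, a minimal Python sketch of one chemotaxis-with-communication step, under the assumption that the new position blends the global best, the personal best, and (when tumbling) a random turbulent term; the weight `f_weight` and step `C` values are illustrative:

```python
import random

def chemotaxis_step(x, p_best, g_best, C, f_weight=0.5, tumble=True):
    """One position update: tumbling adds a random turbulent direction per
    dimension, swimming omits it; f_weight balances group (g_best) against
    personal (p_best) guidance."""
    new_x = []
    for xi, pi, gi in zip(x, p_best, g_best):
        delta = random.uniform(-1, 1) if tumble else 0.0   # turbulent direction
        new_x.append(xi + C * (f_weight * (gi - xi)
                               + (1 - f_weight) * (pi - xi)
                               + delta))
    return new_x

random.seed(0)
x = chemotaxis_step([0.0, 0.0], [1.0, 1.0], [2.0, 2.0], C=0.1)
```

With `tumble=False` the update is deterministic: each coordinate moves a fraction `C` of the way toward the weighted blend of the two best positions.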

3.3. Elimination and Reproduction Model

Each bacterium is assigned an energy degree based on its search capability; a higher energy level indicates better performance. The energy level determines the probability of elimination and reproduction. The distribution of bacterial energy degrees is sorted and analyzed and then used as a criterion to judge the qualification of the bacteria, as follows.

All bacterial behaviors are restricted to a bounded area. As a general principle, individuals are not allowed to leave this region, so boundary control is especially important. If bacteria move out of the feasible domain, at least two strategies can be applied based on experience: one is to generate new individuals to replace the outlying ones, and the other is to keep the outlying ones at the boundary but change their forward direction to keep them effective. In this paper, such outlying individuals, labeled "unhealthy," are put into a set that holds the candidate bacteria for elimination.
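
A minimal Python sketch of energy-based elimination and reproduction; the rank-by-fitness criterion and the keep ratio below are illustrative assumptions:

```python
def eliminate_and_reproduce(population, fitness, keep_ratio=0.5):
    """Rank bacteria by fitness (lower is better), treat the bottom portion
    as 'unhealthy' candidates for elimination, and let the survivors
    reproduce to refill the colony to its original size."""
    ranked = sorted(range(len(population)), key=lambda i: fitness[i])
    n_keep = max(1, int(len(population) * keep_ratio))
    survivors = [population[i] for i in ranked[:n_keep]]
    # reproduction: each survivor splits into an identical offspring
    offspring = [list(s) for s in survivors]
    return (survivors + offspring)[:len(population)]

pop = [[0.0], [5.0], [1.0], [9.0]]
fit = [0.0, 25.0, 1.0, 81.0]
new_pop = eliminate_and_reproduce(pop, fit)
```

Here the two worst bacteria (fitness 25.0 and 81.0) are dropped and the two best are duplicated, keeping the colony size constant.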

3.4. Migration Model

Naturally, bacteria can pursue more nutrients by migration. From the optimization point of view, migration helps the search escape local optima. Notably, the migration of artificial bacteria in BCO is not based on a given probability but on a given condition. When the condition is fulfilled, a bacterium migrates to a new random place, as described by

$$x_i = \mathrm{rand} \cdot (ub - lb) + lb,$$

where $\mathrm{rand}$ is a random number in $[0, 1]$ and $ub$ and $lb$ are the upper and lower boundaries, respectively. Bacteria keep searching for nutrients as long as they do not need to migrate. Migration in the BCO algorithm is influenced by the average energy degree, individual similarity, and chemotaxis efficiency; these factors together make up the migration condition. Pseudocode 2 shows the entire migration procedure.

Pseudocode 2: Pseudocode for migration.
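
The condition-triggered migration can be sketched as follows; the concrete condition used here (below-average energy combined with high group similarity) and its threshold are assumptions for illustration:

```python
import random

def migrate(bacterium, lb, ub, energy, avg_energy, similarity, threshold=0.9):
    """Relocate a bacterium to a uniformly random point in [lb, ub] when the
    migration condition holds; otherwise leave it in place."""
    if energy < avg_energy and similarity > threshold:
        # x = rand * (ub - lb) + lb, applied per dimension
        return [random.random() * (u - l) + l for l, u in zip(lb, ub)]
    return bacterium

random.seed(2)
moved = migrate([0.5, 0.5], [-5.0, -5.0], [5.0, 5.0],
                energy=0.1, avg_energy=0.6, similarity=0.95)
```

A bacterium whose energy is above average keeps its position unchanged, so migration never disturbs well-performing individuals.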

3.5. Bacterial Colony Optimization
3.5.1. Implementation of Chemotaxis and Communication

As illustrated above, bacterial chemotaxis over the whole lifetime can be divided into two modes: tumbling and swimming. In tumbling, a stochastic direction participates in the actual movement, so the turbulent direction and the optimal searching direction together influence the search orientation; the position of each bacterium is updated as in (3.4). In swimming, no turbulent direction affects the movement toward the optimum, as formulated in (3.5):

$$x_i(T) = x_i(T-1) + C(i)\left[f_i\,\big(G_{\text{best}} - x_i(T-1)\big) + (1-f_i)\,\big(P_{\text{best},i} - x_i(T-1)\big) + \Delta_i\right], \tag{3.4}$$

$$x_i(T) = x_i(T-1) + C(i)\left[f_i\,\big(G_{\text{best}} - x_i(T-1)\big) + (1-f_i)\,\big(P_{\text{best},i} - x_i(T-1)\big)\right], \tag{3.5}$$

where $\Delta_i$ is the turbulent direction variance of the $i$th bacterium, $G_{\text{best}}$ and $P_{\text{best},i}$ are the global best position and the personal best position of the $i$th bacterium, and $C(i)$ is the chemotaxis step size. BCO uses the adaptive chemotaxis step of Niu et al. [19, 20]:

$$C(i) = C_{\min} + \left(\frac{iter_{\max} - iter_j}{iter_{\max}}\right)^{n} \big(C_{\max} - C_{\min}\big), \tag{3.6}$$

where $iter_{\max}$ is the maximal number of iterations, $iter_j$ is the current iteration, and $C(i)$ is the chemotaxis step of the $i$th bacterium. With $n = 0$, the system becomes a special case with a fixed chemotaxis step length, as in the originally proposed BFO algorithm. With $n = 1$, the chemotaxis step decreases linearly; otherwise, the chemotaxis step changes with a nonlinearly decreasing strategy. As shown by Niu et al. in their papers discussing the chemotaxis step [19, 20], simple low-dimensional problems favor the linearly decreasing strategy, while for high-dimensional multimodal complex problems, nonlinearly decreasing strategies are preferable.
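
The adaptive step rule can be written directly as a small Python function; the lower bound 0.01 follows the experimental settings in Section 4.2, while the upper bound 0.2 is an assumed value for illustration:

```python
def chemotaxis_step_size(iter_j, iter_max, c_min=0.01, c_max=0.2, n=1):
    """Adaptive chemotaxis step: C = C_min + ((iter_max - iter_j)/iter_max)^n
    * (C_max - C_min). n = 0 gives a fixed step (C_max), n = 1 a linear
    decrease, and n > 1 a nonlinear decrease."""
    return c_min + ((iter_max - iter_j) / iter_max) ** n * (c_max - c_min)
```

At the first iteration the step equals the upper bound, and with the linear strategy it shrinks to the lower bound by the final iteration.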

3.5.2. Implementation of Interactive Exchange

Interactive exchange in BCO can be divided into individual exchange and group exchange, as described above. Individual exchange can be further specified as dynamic neighbor oriented (Figure 2(a)) or random oriented (Figure 2(b)); group exchange corresponds to the group-oriented scheme (Figure 2(c)). In each generation, every bacterium can choose only one type of exchange model. Since interactive exchange may affect the diversity of the bacterial group, each bacterium has only one chance to exchange per generation. Pseudocode 3 shows the process of interactive exchange.

Pseudocode 3: Pseudocode of interactive exchange.
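
The three exchange topologies can be sketched as follows; the adopt-the-better-personal-best rule is an assumption, chosen only to show how the topology changes which individuals are consulted:

```python
import random

def exchange(i, p_best, mode):
    """p_best is a list of (position, fitness) pairs, one per bacterium.
    Depending on the topology, bacterium i consults its ring neighbours,
    one random individual, or the whole group, and adopts better information."""
    n = len(p_best)
    if mode == "dynamic_neighbor":         # ring neighbours i-1 and i+1
        candidates = [(i - 1) % n, (i + 1) % n]
    elif mode == "random":                 # one randomly chosen individual
        candidates = [random.randrange(n)]
    else:                                  # "group": every member of the group
        candidates = list(range(n))
    best = min(candidates, key=lambda j: p_best[j][1])
    if p_best[best][1] < p_best[i][1]:     # adopt the better personal best
        p_best[i] = (list(p_best[best][0]), p_best[best][1])

p_best = [([0.0], 5.0), ([1.0], 1.0), ([2.0], 3.0)]
exchange(0, p_best, "dynamic_neighbor")
```

Here bacterium 0 consults its ring neighbours (indices 1 and 2) and adopts the better personal best of neighbour 1.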

3.5.3. Framework of Bacterial Colony Optimization

In the BCO algorithm, artificial bacterial behaviors are executed based on given conditions, responding to the dynamic environment. The sequence of chemotaxis, communication, reproduction, elimination, and migration is not premeditated; each operation is triggered only when certain given conditions are reached.

As shown in Figure 5, the processes of reproduction, elimination, and migration are all independent, without a fixed order, and the number of iterations of the BCO algorithm equals the frequency of chemotaxis and communication. This is consistent with the theoretical understanding of biological systems. Environment-based operation rules control the basic behaviors of the bacteria. Taking migration as an example, Figure 5 also shows that influence factors such as position, direction, and energy level can affect the migration process, so the migration rules can be set by consulting these influence factors. As one of the migration factors, position is related to group diversity, individual fitness, and so forth.

Figure 5: The flowchart of BCO.

The overall procedure of bacterial colony optimization (BCO) is presented in Pseudocode 4.

Pseudocode 4: Pseudocode of bacterial colony optimization.
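
Putting the pieces together, the overall flow can be sketched as one compact, self-contained Python loop. Several simplifying assumptions are made for illustration: a linearly decreasing step, best-half reproduction every 50 iterations, and migration of out-of-bounds ("unhealthy") bacteria to random positions:

```python
import random

def bco(f, dim=2, n_bact=20, iters=200, lb=-5.0, ub=5.0,
        c_min=0.01, c_max=0.2, f_weight=0.5):
    """Simplified end-to-end BCO loop: chemotaxis with communication every
    iteration; migration and elimination/reproduction triggered by conditions."""
    pop = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(n_bact)]
    p_best = [(list(x), f(x)) for x in pop]
    g_best = min(p_best, key=lambda pb: pb[1])
    for t in range(iters):
        C = c_min + (iters - t) / iters * (c_max - c_min)   # linear step decrease
        for i in range(n_bact):
            x, px = pop[i], p_best[i][0]
            # tumbling update: group best + personal best + turbulent term
            new = [xi + C * (f_weight * (gi - xi) + (1 - f_weight) * (pi - xi)
                             + random.uniform(-1, 1))
                   for xi, pi, gi in zip(x, px, g_best[0])]
            if any(v < lb or v > ub for v in new):          # migration condition
                new = [random.uniform(lb, ub) for _ in range(dim)]
            pop[i], fx = new, f(new)
            if fx < p_best[i][1]:
                p_best[i] = (list(new), fx)
        g_best = min(p_best, key=lambda pb: pb[1])
        if (t + 1) % 50 == 0:               # eliminate worst half, reproduce best half
            order = sorted(range(n_bact), key=lambda k: p_best[k][1])
            half = order[:n_bact // 2]
            p_best = [(list(p_best[k][0]), p_best[k][1]) for k in half + half]
            pop = [list(pb[0]) for pb in p_best]
    return g_best

random.seed(3)
best_x, best_f = bco(lambda p: sum(v * v for v in p))
```

On the 2-D sphere function this sketch drives the colony close to the optimum at the origin within the 200 iterations.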

4. Experiments and Results

4.1. Test Functions

To test the effectiveness of the newly proposed BCO algorithm, twelve well-known test functions with 15 and 40 dimensions are adopted. The test problems include two unimodal functions [21], six multimodal functions [21], and four rotated multimodal functions [22]. All test functions have a minimum function value of zero. A detailed description of the benchmark functions is given in Appendix A.

4.2. Parameters Settings

To evaluate the performance of the proposed BCO, five other algorithms were used for comparison: particle swarm optimization (PSO), the genetic algorithm (GA), bacterial foraging optimization (BFO), bacterial foraging optimization with a linearly decreasing chemotaxis step (BFO-LDC), and bacterial foraging optimization with a nonlinearly decreasing chemotaxis step (BFO-NDC). The parameters used for these five algorithms were taken from the recommendations in [4, 11, 19, 20, 23] or hand selected. The parameter settings of these algorithms on the benchmark functions are summarized in Table 1. The population size of all algorithms in our experiments was set to 100, and the maximum number of iterations was 2000 for all algorithms. For BCO, a linearly decreasing chemotaxis step length is adopted, with the lower step length set to 0.01 and a corresponding upper step length. The test functions used in this paper all have a minimum of 0. All experiments were repeated for 20 runs.

Table 1: Global optimum, search ranges, and initialization ranges of test functions.
4.3. Experimental Results and Discussions
4.3.1. Results for the 15-D Problems

Tables 2, 3, and 4 show the means and variances of the 20 runs of the algorithms on the twelve test functions with 15 dimensions, and Figure 6 presents the convergence characteristics in terms of the best fitness value of the median run of each algorithm on the twelve unconstrained functions. In comparison with the five other algorithms, the results in Tables 2~4 show that the proposed BCO performs significantly better than GA, BFO, BFO-LDC, and BFO-NDC on all test functions, and BCO generates better results than PSO on most functions. The three exceptions among the twelve are the Sphere, Sum of Powers, and Rotated Griewank functions, for which PSO reaches smaller values after the iterations finish. From Table 2, PSO achieves better performance on most functions in this group: PSO handles easy problems efficiently, whereas BCO has to spend additional time communicating and migrating, with the result that BCO cannot converge as fast as PSO.

Table 2: Experimental results on benchmark functions ~ (15-D).
Table 3: Experimental results on benchmark functions ~ (15-D).
Table 4: Experimental results on benchmark functions ~ (15-D).
Figure 6: The median convergence characteristics of 15-D test functions.

The means and variances of the median run of each algorithm on the multimodal functions in 15-D are presented in Table 3. From Table 3, it is observed that, for these four multimodal test problems, BCO achieves the best results compared with the other algorithms and converges quickly (as shown in Figures 6(e)~6(h)). In this group, BCO clearly performs better than PSO, because PSO has a high chance of getting trapped in a local optimum on multimodal functions. In contrast, with its migration ability, BCO surpasses all the other approaches in maintaining the tradeoff between local exploitation and global exploration, and thus shows better search ability on multimodal functions.

Table 4 presents the comparisons with the other algorithms on the rotated benchmark functions. The proposed BCO is superior to all the other algorithms (PSO, GA, BFO, BFO-LDC, and BFO-NDC) on these optimization problems except the Rotated Griewank function. From the values in Tables 3 and 4, BCO converges to the best fitness value most often in 15-D, which indicates that BCO has a higher optimization capability for multimodal complex problems.

Figure 6 shows the comparisons on the twelve unimodal, multimodal, and rotated functions in 15-D. The figure illustrates that BCO converges toward the global optimum while keeping good diversity and high speed on the 15-D functions. As seen in Figures 6(a)~6(c), Figures 6(e)~6(g), and others, BCO converges more quickly than the other algorithms, which is owed to the communication mechanism between bacterial individuals and bacterial groups. For the fourth function, the trigonometric Sin function shown in Figure 6(d), PSO has the quickest convergence rate, but BCO finds the smallest optimum value after the maximum number of generations.

4.3.2. Results for the 40-D Problems

The experiments conducted on the 15-D problems were repeated on the 40-D problems. As in the 15-D case, Tables 5~7 list the experimental results (i.e., the means and standard deviations of the function values found in 20 runs) for each algorithm on the twelve test functions. The average convergence results on the 40-D benchmark functions over 20 runs are presented in Figure 7. From the results in Table 5, it is observed that the ranking of the algorithms is similar to the ranking on the 15-D problems. The newly proposed BCO finds a relatively good optimum within 2000 generations and obtains the best results on most of the functions in this group. Table 6 and Figures 7(e)~7(h) show that the proposed algorithm converges much faster to the best results than all the other algorithms on the multimodal functions. As for the more difficult rotated multimodal functions, the proposed approach consistently finds the best minimum on three of them in Table 7, and Figures 7(i), 7(k), and 7(l) once again verify its fastest convergence rate on those functions. Although the 40-D problems are more difficult, BCO still achieves the best results compared with the other algorithms in most cases.

Table 5: Experimental results on benchmark functions ~ (40-D).
Table 6: Experimental results on multimodal benchmark functions ~ (40-D).
Table 7: Experimental results on rotated benchmark functions ~ (40-D).
Figure 7: The median convergence characteristics of 40-D test functions.
4.4. Bacterial Behavior in Bacterial Colony Optimization

According to the comparative experiments, the proposed BCO algorithm shows superior search ability in most cases. In this section, simulation studies are conducted in a varying environment with a nutrient-noxious distribution. The nutrient distribution of the environment is set by the function given in Appendix B and illustrated in Figure 8.

Figure 8: Nutrient-noxious environment.

From Figure 9, we can conclude that the newly proposed optimization algorithm finds the optimum at high speed. When a relative optimum is found, the strategy changes so that more time is spent on local search. After a long period of chemotaxis, communication, reproduction, and elimination, the final fitness value of each bacterium is shown in Figure 10, from which we see that all of the bacteria have found the optimum by the end of the 2000 runs.

Figure 9: The average fitness values obtained with iterations.
Figure 10: The final optimal function values of each individual.

Figures 11 and 12 show, in two dimensions, how the position of the bacterial group changes with the chemotaxis process. In BCO, the number of chemotaxis steps equals the running frequency, and chemotaxis accompanies the entire optimization process. What the figures show is that bacteria can find the optimum quickly with the help of communication; most individuals find the optimum within the first 500 runs.

Figure 11: 2-D Position with the iteration process.
Figure 12: The process of finding the optimum.

The figures above only demonstrate the group's search ability without answering how microcommunities adapt their behavior to nutrients. Figures 13 and 14 show the optimization process of four bacteria. These four bacteria are initialized at different positions, but after 100 runs, all four have found the optimum.

Figure 13: Four bacteria find the optimum when chemotaxis ranges between 1~100.
Figure 14: Optimal process of four bacteria when chemotaxis ranges between 1~100.

Figure 15 depicts the chemotaxis process of a single bacterium over chemotaxis steps 1 to 100. The optimum is quickly reached with the proposed method: within the first 25 steps, the bacterium has already entered the optimum region, and thereafter it switches to local search. Figure 16 presents the whole optimization procedure and shows that the bacterium finds the optimum within 100 runs.

Figure 15: Single bacterium finds the optimum when chemotaxis ranges between 1~100.
Figure 16: Optimal process of one bacterium when chemotaxis ranges between 1~100.

5. Conclusions and Future Work

In this paper, a lifecycle model (LCM) of the ecological and evolutionary processes of E. coli bacteria is proposed. Typical evolutionary behaviors such as chemotaxis, reproduction, elimination, and migration are described in detail, together with the models used to represent them. Based on the LCM, a new optimization algorithm, bacterial colony optimization (BCO), is developed. In BCO, the bacterial behaviors over the whole lifecycle are viewed as evolutionary operators used to find the best solution to a given optimization problem. Additionally, to improve the search ability of BCO, three types of communication model are designed so that each individual interacts locally with the others.

Based on the results of the six algorithms on the twelve chosen test problems belonging to three classes, we conclude that, compared with the five other algorithms, BCO gives the best performance on almost all the benchmark problems, whether unrotated or rotated.

However, BCO is still in its infancy. Further work may focus on (i) incorporating a dynamic population size strategy into BCO, (ii) hybridizing BCO with other swarm intelligence algorithms, and (iii) applying BCO to multiobjective problems.


A. Benchmark Functions

A.1. Group A: Unimodal Functions

(1) Sphere function:
$$f_1(x) = \sum_{i=1}^{n} x_i^2.$$
(2) Rosenbrock function:
$$f_2(x) = \sum_{i=1}^{n-1} \left[100\,(x_{i+1} - x_i^2)^2 + (x_i - 1)^2\right].$$

A.2. Group B: Multimodal Functions

(3) Sum of different powers function:
$$f_3(x) = \sum_{i=1}^{n} |x_i|^{\,i+1}.$$
(4) Sin function.
(5) Rastrigin function:
$$f_5(x) = \sum_{i=1}^{n} \left[x_i^2 - 10\cos(2\pi x_i) + 10\right].$$
(6) Griewank function:
$$f_6(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\!\left(\frac{x_i}{\sqrt{i}}\right) + 1.$$
(7) Ackley function:
$$f_7(x) = -20\exp\!\left(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}\right) - \exp\!\left(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e.$$
(8) Weierstrass function:
$$f_8(x) = \sum_{i=1}^{n}\sum_{k=0}^{k_{\max}} a^k \cos\!\big(2\pi b^k (x_i + 0.5)\big) - n\sum_{k=0}^{k_{\max}} a^k \cos\!\big(\pi b^k\big), \quad a = 0.5,\; b = 3,\; k_{\max} = 20.$$

A.3. Group C: Rotated Multimodal Functions

(9) Rotated Rastrigin function.
(10) Rotated Griewank function.
(11) Rotated Ackley function.
(12) Rotated Weierstrass function.
Each rotated function is obtained by evaluating the corresponding function of Group B at the rotated point $y = Mx$, where $M$ is an orthogonal rotation matrix [22].

B. The Nutrient Distribution Function


Acknowledgments

The first author would like to thank Dr. Yujuan Chai for modifying the manuscript and giving many valuable comments. This work is supported by the National Natural Science Foundation of China (Grant nos. 71001072, 71271140, and 71210107016), the China Postdoctoral Science Foundation (Grant no. 20100480705), the Science and Technology Project of Shenzhen (Grant no. JC201005280492A), and the Natural Science Foundation of Guangdong Province (Grant nos. 9451806001002294 and S2012010008668).


References

1. Q. Tan, Q. He, W. Zhao, Z. Shi, and E. S. Lee, “An improved FCMBP fuzzy clustering method based on evolutionary programming,” Computers & Mathematics with Applications, vol. 61, no. 4, pp. 1129–1144, 2011.
2. J. A. Vasconcelos, J. A. Ramírez, R. H. C. Takahashi, and R. R. Saldanha, “Improvements in genetic algorithms,” IEEE Transactions on Magnetics, vol. 37, no. 5, pp. 3414–3417, 2001.
3. R. Akbari and K. Ziarati, “A multilevel evolutionary algorithm for optimizing numerical functions,” International Journal of Industrial Engineering Computations, vol. 2, no. 2, pp. 419–430, 2011.
4. J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
5. J. Kennedy and R. C. Eberhart, Swarm Intelligence, Morgan Kaufmann, San Francisco, Calif, USA, 2001.
6. M. Dorigo, M. Birattari, and T. Stützle, “Ant colony optimization: artificial ants as a computational intelligence technique,” IEEE Computational Intelligence Magazine, vol. 1, no. 4, pp. 28–39, 2006.
7. M. Dorigo and C. Blum, “Ant colony optimization theory: a survey,” Theoretical Computer Science, vol. 344, no. 2-3, pp. 243–278, 2005.
8. X. L. Li, Z. J. Shao, and J. X. Qian, “Optimizing method based on autonomous animats: fish-swarm algorithm,” System Engineering Theory and Practice, vol. 22, no. 11, pp. 32–38, 2002.
9. D. Karaboga and B. Akay, “A comparative study of artificial bee colony algorithm,” Applied Mathematics and Computation, vol. 214, no. 1, pp. 108–132, 2009.
10. D. Karaboga and B. Akay, “A survey: algorithms simulating bee swarm intelligence,” Artificial Intelligence Review, vol. 31, no. 1–4, pp. 61–85, 2009.
11. K. M. Passino, “Biomimicry of bacterial foraging for distributed optimization and control,” IEEE Control Systems Magazine, vol. 22, no. 3, pp. 52–67, 2002.
12. S. D. Müller, J. Marchetto, S. Airaghi, and P. Koumoutsakos, “Optimization based on bacterial chemotaxis,” IEEE Transactions on Evolutionary Computation, vol. 6, no. 1, pp. 16–29, 2002.
13. S. Das, S. Dasgupta, A. Biswas, A. Abraham, and A. Konar, “On stability of the chemotactic dynamics in bacterial-foraging optimization algorithm,” IEEE Transactions on Systems, Man, and Cybernetics Part A, vol. 39, no. 3, pp. 670–679, 2009.
14. M. S. Li, T. Y. Ji, W. J. Tang, Q. H. Wu, and J. R. Saunders, “Bacterial foraging algorithm with varying population,” BioSystems, vol. 100, no. 3, pp. 185–197, 2010.
15. S. Dasgupta, A. Biswas, A. Abraham, and S. Das, “Adaptive computational chemotaxis in bacterial foraging algorithm,” in Proceedings of the 2nd International Conference on Complex, Intelligent and Software Intensive Systems (CISIS '08), pp. 64–71, March 2008.
16. Y. Chu, H. Mi, H. Liao, Z. Ji, and Q. H. Wu, “A fast bacterial swarming algorithm for high-dimensional function optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '08), pp. 3135–3140, June 2008.
17. D. H. Kim, “Hybrid GA-BF based intelligent PID controller tuning for AVR system,” Applied Soft Computing Journal, vol. 11, no. 1, pp. 11–22, 2011.
18. H. N. Chen, Y. L. Zhu, and K. Y. Hu, “Adaptive bacterial foraging algorithm,” Abstract and Applied Analysis, vol. 2011, Article ID 108269, 27 pages, 2011.
19. B. Niu, Y. Fan, H. Wang, L. Li, and X. Wang, “Novel bacterial foraging optimization with time-varying chemotaxis step,” International Journal of Artificial Intelligence, vol. 7, no. 11, pp. 257–273, 2011.
20. B. Niu, H. Wang, L. J. Tan, and L. Li, “Improved BFO with adaptive chemotaxis step for global optimization,” in Proceedings of the International Conference on Computational Intelligence and Security (CIS '11), pp. 76–80, 2011.
21. X. Yao, Y. Liu, and G. Lin, “Evolutionary programming made faster,” IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
22. R. Salomon, “Re-evaluating genetic algorithm performance under coordinate rotation of benchmark functions. A survey of some theoretical and practical aspects of genetic algorithms,” BioSystems, vol. 39, no. 3, pp. 263–278, 1996.
23. D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley Professional, Boston, Mass, USA, 1989.