The Scientific World Journal
Special Issue: Swarm Intelligence and Its Applications 2014
Research Article | Open Access
Volume 2014 | Article ID 375358 | https://doi.org/10.1155/2014/375358

I. C. Obagbuwa, A. O. Adewumi, "An Improved Cockroach Swarm Optimization", The Scientific World Journal, vol. 2014, Article ID 375358, 13 pages, 2014. https://doi.org/10.1155/2014/375358

An Improved Cockroach Swarm Optimization

Academic Editor: Y. Zhang
Received: 06 Dec 2013
Accepted: 18 Mar 2014
Published: 14 May 2014

Abstract

A hunger component is introduced into the existing cockroach swarm optimization (CSO) algorithm to improve its searching ability and population diversity. The original CSO was modelled with three components: chase-swarming, dispersion, and ruthless behaviour; an additional hunger component, modelled using a partial differential equation (PDE) method, is included in this paper, and an improved cockroach swarm optimization (ICSO) is proposed. The performance of the proposed algorithm is tested on well-known benchmarks and compared with the existing CSO, modified cockroach swarm optimization (MCSO), roach infestation optimization (RIO), and hungry roach infestation optimization (HRIO). The comparison results clearly show that the proposed algorithm outperforms the existing algorithms.

1. Introduction

Swarm intelligence (SI) is a method of computing whereby simple decentralized agents gather information by interacting locally with one another and with their environment [1]. The local information received is not centrally controlled; the local interactions of agents give rise to emergent global patterns which can be harnessed for solving problems [1].

SI algorithms draw inspiration from the social behaviour of insects and animals and have been shown in the literature to be efficient in solving global optimization problems. Examples of existing SI algorithms include particle swarm optimization (PSO), ant colony optimization (ACO), and bee colony optimization (BCO). PSO, based on bird social behaviour and introduced by Kennedy and Eberhart [2], has been applied to several problems, including power and management processes [3, 4] and a combinatorial optimization problem in [5]. ACO, based on ant social behaviour and introduced by Dorigo [6], has been applied to problems such as the vehicle routing problem [7] and the network routing problem [8]. BCO, based on bee social behaviour and introduced by Pham et al. [9], has been applied to real world problems by Karaboga and his research group [10-12].

One of the recent developments in SI is cockroach optimization [13-16]. The cockroach belongs to the insect order Blattodea, dwells in warm, dark, and moist shelters, and exhibits habits that include chasing, swarming, dispersing, ruthless and omnivorous behaviour, and food searching. Cockroaches interact with peers, respond to their immediate environment, and make decisions based on these interactions, such as selecting shelter, searching for food sources and friends, dispersing when danger is sensed, and eating one another when food is scarce.

The original cockroach swarm optimization (CSO) algorithm, introduced by Zhaohui and Haiyan [14], was modified by ZhaoHui with the introduction of inertial weight [15]. CSO algorithms [14, 15] mimic chase swarming, dispersion, and ruthless social behaviour of cockroaches.

Global optimization problems are considered very hard and are ever increasing in complexity, which makes it necessary to design better optimization algorithms; this motivated the design of a better cockroach algorithm. This paper extends MCSO with the introduction of another social behaviour called hunger behaviour, which helps the swarm escape local optima and enhances population diversity. An improved cockroach swarm optimization (ICSO) is presented in this paper.

The organization of this paper is as follows: Section 2 presents CSO, MCSO, and ICSO models with algorithmic steps; Section 3 shows the experiments carried out and results obtained; the paper is summarised in Section 4.

2. Cockroach Swarm Optimization

The CSO algorithm is a population-based global optimization algorithm which has been applied to problems in the literature, including [17-19]. The CSO models [14] are given as follows.

(1) Chase-Swarming Behaviour. Consider

x_i = x_i + step · rand · (p_i - x_i),  if x_i ≠ p_i,
x_i = x_i + step · rand · (p_g - x_i),  if x_i = p_i,     (1)

where x_i is the cockroach position, step is a fixed value, rand is a random number within [0, 1], p_i is the personal best position, and p_g is the global best position. Consider

p_i = Opt_j { x_j : |x_i - x_j| ≤ visual },     (2)

where the perception distance visual is a constant, i = 1, 2, ..., N, j = 1, 2, ..., N. Consider

p_g = Opt_i { x_i },  i = 1, 2, ..., N.     (3)

(2) Dispersion Behaviour. Consider

x_i = x_i + rand(1, D),     (4)

where rand(1, D) is a D-dimensional random vector that can be set within a certain range.

(3) Ruthless Behaviour. Consider

x_k = p_g,     (5)

where k is a random integer within [1, N] and p_g is the global best position.
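For concreteness, the three operators above can be sketched as a single iteration in Python. This is an illustration for a minimization problem; the name cso_step and all parameter values (step, visual, the population size, and the test function) are illustrative choices, not the paper's settings.

```python
import numpy as np

def cso_step(X, fitness, step=2.0, visual=5.0, rng=np.random.default_rng(0)):
    """One iteration of the three CSO operators (minimization).
    X is an (N, D) array of cockroach positions."""
    N, D = X.shape
    X0 = X.copy()
    f = np.array([fitness(x) for x in X0])
    pg = X0[np.argmin(f)].copy()                       # global best p_g

    # (1) chase-swarming: each cockroach moves toward its personal best p_i,
    # the best cockroach within perception distance 'visual'; a cockroach
    # that is itself the local best moves toward the global best instead.
    for i in range(N):
        near = np.linalg.norm(X0 - X0[i], axis=1) <= visual
        pi = X0[near][np.argmin(f[near])]
        target = pg if np.array_equal(pi, X0[i]) else pi
        X[i] = X0[i] + step * rng.random() * (target - X0[i])

    # (2) dispersion: a random D-dimensional jump to preserve diversity
    X += rng.uniform(-1.0, 1.0, size=(N, D))

    # (3) ruthless: a randomly chosen cockroach is replaced by the global best
    X[rng.integers(N)] = pg
    return X, pg

sphere = lambda x: float(np.sum(x**2))
X = np.random.default_rng(1).uniform(-5.0, 5.0, (20, 3))
X, pg = cso_step(X, sphere)
```

Because the ruthless step reinserts p_g into the population, the best position found so far is never lost between iterations.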

2.1. Modified Cockroach Swarm Optimization

ZhaoHui presented a modified cockroach swarm optimization (MCSO) [15] with the introduction of inertial weight to chase swarming component of original CSO as shown below. Other models remain as in original CSO.

Chase-swarming behaviour is as follows:

x_i = w · x_i + step · rand · (p_i - x_i),  if x_i ≠ p_i,
x_i = w · x_i + step · rand · (p_g - x_i),  if x_i = p_i,     (6)

where w is an inertial weight which is a constant.

2.2. Improved Cockroach Swarm Optimization

In this paper, MCSO is extended with an additional component called hunger behaviour.

2.2.1. Hunger Behaviour

At intervals of time, when a cockroach is hungry, it migrates from its comfortable shelter and the company of its friends to look for food [13, 20]. Hunger behaviour is modelled using a partial differential equation (PDE) migration technique [21]. The cockroach migrates from its shelter to any available food source within the search space. A hunger threshold is defined; when a cockroach is hungry and the hunger threshold is reached, it migrates to a food source. Hunger behaviour helps prevent entrapment in local optima and enhances population diversity.

The PDE migration equation is described by Kerckhove [21]:

∂N/∂t + v · ∂N/∂x = 0,  with N(x, 0) = f(x).     (7)

Parameter v controls the speed of the migration. N is the population size, t is time, and x is location or position. N(x, t) is the population size at time t in location x, with f(x) being the initial population distribution.

The characteristic equations are

dt/ds = 1,  dx/ds = v,  dN/ds = 0.     (8)

By integration, we have

t = s,  x = x_0 + v · t,  N = constant along each characteristic.     (9)

Consider displacement = speed × time.

In f(x - vt), the quantity vt displaces x.

N(x, t) = f(x - vt)     (10)

satisfies the migration equation for any initial population distribution [21].
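As a quick numerical check (not part of the original derivation), one can verify with finite differences that N(x, t) = f(x - vt) satisfies the migration equation for a smooth initial distribution f; the Gaussian below is an assumed example.

```python
import numpy as np

# Check that N(x, t) = f(x - v*t) satisfies N_t + v * N_x = 0
# for an arbitrary smooth initial population distribution f.
v = 1.5                       # migration speed
f = lambda x: np.exp(-x**2)   # assumed initial population distribution

def N(x, t):
    return f(x - v * t)

x, t, h = 0.7, 0.3, 1e-5
N_t = (N(x, t + h) - N(x, t - h)) / (2 * h)   # central difference in t
N_x = (N(x + h, t) - N(x - h, t)) / (2 * h)   # central difference in x
residual = N_t + v * N_x
print(abs(residual))  # a value near 0: the migration equation holds
```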

Hunger behaviour is modelled as follows.

If hunger_i = threshold_hunger, then

x_i = (x_i + v · t) + x_food,

where x_i denotes the cockroach position, x_i + v · t denotes cockroach migration from its present position, v is a constant which controls the migration speed at time t, x_food denotes the food location, threshold_hunger denotes the hunger threshold, and hunger_i is a random number within [0, 1].

2.2.2. Improved Cockroach Swarm Optimization Models

(1) Chase-Swarming Behaviour. Consider

x_i = w · x_i + step · rand · (p_i - x_i),  if x_i ≠ p_i,
x_i = w · x_i + step · rand · (p_g - x_i),  if x_i = p_i,     (11)

where w is an inertial weight which is a constant, step is a fixed value, rand is a random number within [0, 1], p_i is the personal best position, and p_g is the global best position. Consider

p_i = Opt_j { x_j : |x_i - x_j| ≤ visual },     (12)

where the perception distance visual is a constant, i = 1, 2, ..., N, j = 1, 2, ..., N. Consider

p_g = Opt_i { x_i },  i = 1, 2, ..., N.     (13)

(2) Hunger Behaviour. If hunger_i = threshold_hunger, then

x_i = (x_i + v · t) + x_food,     (14)

where x_i denotes the cockroach position, x_i + v · t denotes cockroach migration from its present position, v is a constant which controls the migration speed at time t, x_food denotes the food location, threshold_hunger denotes the hunger threshold, and hunger_i is a random number within [0, 1].

(3) Dispersion Behaviour. Consider

x_i = x_i + rand(1, D),     (15)

where rand(1, D) is a D-dimensional random vector that can be set within a certain range.

(4) Ruthless Behaviour. Consider

x_k = p_g,     (16)

where k is a random integer within [1, N] and p_g is the global best position.

The algorithm for ICSO is illustrated in Algorithm 1, and its computational steps are given as follows.

(1) Initialise the cockroach swarm with uniformly distributed random numbers and set all parameter values.
(2) Find p_i and p_g using (12) and (13).
(3) Perform chase-swarming using (11).
(4) Perform hunger behaviour using (14).
(5) Perform dispersion behaviour using (15).
(6) Perform ruthless behaviour using (16).
(7) Repeat the loop until the stopping criterion is reached.

A series of experiments is conducted in Section 3 using established global optimization problems to test ICSO's performance. The performance of ICSO is compared with that of the existing algorithms RIO, HRIO, CSO, and MCSO.

INPUT: Fitness function f
set parameters and generate an initial population of N cockroaches
set p_g = x_1
for i = 1 to N do
      if f(x_i) < f(p_g) then
            p_g = x_i
      end if
end for
for t = 1 to max_iteration do
      for i = 1 to N do
            p_i = x_i
            for j = 1 to N do
                  if |x_i - x_j| ≤ visual and f(x_j) < f(p_i) then
                        p_i = x_j
                  end if
            end for
            if x_i = p_i then
                  x_i = w · x_i + step · rand · (p_g - x_i)
            else
                  x_i = w · x_i + step · rand · (p_i - x_i)
            end if
            if f(x_i) < f(p_g) then
                  p_g = x_i
            end if
      end for
      if hunger_i = threshold_hunger then
            x_i = (x_i + v · t) + x_food
            hunger_i = rand[0, 1]
            Increment counters
      end if
      for i = 1 to N do
            x_i = x_i + rand(1, D)
            if f(x_i) < f(p_g) then
                  p_g = x_i
            end if
      end for
      k = random integer within [1, N]
      x_k = p_g;
end for
Check termination condition
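The loop above can be sketched in Python as follows. This is a minimal interpretation, assuming minimization, an elitist record of the global best, a hunger event triggered when a random hunger level reaches the threshold, and a food location x_food drawn uniformly from the search space (the paper does not specify how x_food is chosen). All names and parameter values are illustrative, not the authors' settings.

```python
import numpy as np

def icso(fitness, bounds, N=30, D=2, iters=200, w=0.6, step=2.0,
         visual=5.0, t_hunger=0.5, v_mig=0.02, seed=0):
    """Sketch of the ICSO loop (minimization). The hunger trigger and
    the treatment of the food source x_food are assumptions."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, (N, D))
    fit = np.array([fitness(x) for x in X])
    pg = X[np.argmin(fit)].copy()            # global best p_g (kept elitist)

    for t in range(1, iters + 1):
        X0, f0 = X.copy(), fit.copy()
        for i in range(N):
            # p_i: best cockroach within perception distance 'visual'
            near = np.linalg.norm(X0 - X0[i], axis=1) <= visual
            pi = X0[near][np.argmin(f0[near])]
            # chase-swarming with inertia weight w
            tgt = pg if np.array_equal(pi, X0[i]) else pi
            X[i] = w * X0[i] + step * rng.random() * (tgt - X0[i])
            # hunger: when the random hunger level reaches the threshold,
            # relocate to an assumed food source with a time-scaled jitter
            if rng.random() >= t_hunger:
                x_food = rng.uniform(lo, hi, D)
                X[i] = x_food + v_mig * t * rng.uniform(-1.0, 1.0, D)
            # dispersion: random jump for diversity, clipped to the bounds
            X[i] = np.clip(X[i] + rng.uniform(-1.0, 1.0, D), lo, hi)
        fit = np.array([fitness(x) for x in X])
        if fit.min() < fitness(pg):
            pg = X[np.argmin(fit)].copy()
        # ruthless: a random cockroach is overwritten by the global best
        k = rng.integers(N)
        X[k], fit[k] = pg.copy(), fitness(pg)
    return pg, float(fitness(pg))

sphere = lambda x: float(np.sum(x**2))
best, val = icso(sphere, bounds=(-100.0, 100.0))
```

Keeping p_g outside the population (and reinserting it via the ruthless step) makes the sketch elitist, so random hunger relocations cannot destroy the best solution found so far.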

3. Simulation Studies

The speed, accuracy, robustness, stability, and searching capabilities of ICSO are evaluated in this section with 23 benchmark test functions. The test functions were adopted from [22-24]; further information about them can be found in those references. The test functions have different characteristics: unimodal (U), multimodal (M), separable (S), and nonseparable (N). Table 1 shows the test functions used, whose problem dimensions range from 2 to 30, as in [22-24].


Number | Range | D | C | Function
1 | [-100, 100] | 30 | US | Step
2 | [-100, 100] | 30 | US | Sphere
3 | [-10, 10] | 30 | US | Sumsquares
4 | [-100, 100] | 2 | MS | Bohachevsky1
5 | [-100, 100] | 2 | MN | Bohachevsky2
6 | [-100, 100] | 2 | MN | Bohachevsky3
7 | [0, 180] | 20 | UN | Sinusoidal
8 | [-100, 100] | 30 | UN | Quadric
9 | [-100, 100] | 2 | UN | Easom
10 | [-10, 10] | 2 | UN | Matyas
11 | [-5, 10] | 10 | UN | Zakharov
12 | [-10, 10] | 24 | UN | Powell
13 | [-10, 10] | 30 | UN | Schwefel2.22
14 | [-30, 30] | 30 | UN | Rosenbrock
15 | [-5.12, 5.12] | 30 | MS | Rastrigin
16 | [-100, 100] | 2 | MN | Schaffer1
17 | [-100, 100] | 30 | MN | Schaffer2
18 | [-600, 600] | 30 | MN | Griewangk
19 | [-32, 32] | 30 | MN | Ackley
20 | [-5, 5] | 2 | MN | Three hump camel back
21 | [-5, 5] | 2 | MN | Six hump camel back
22 | | 9 | UN | Storn's Tchebychev 9
23 | | 17 | | Storn's Tchebychev 17

D: dimension; C: characteristic; U: unimodal; M: multimodal; S: separable; N: non-separable.
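For reference, a few of the Table 1 benchmarks are written out in Python below using their standard definitions from the benchmark literature [22-24]; each has a global minimum of 0 at the origin, matching the optima used in Tables 2-4.

```python
import numpy as np

# Standard definitions of three Table 1 benchmarks; each has its
# global minimum of 0 at the origin x = (0, ..., 0).
def sphere(x):
    return float(np.sum(x**2))

def rastrigin(x):
    return float(np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def ackley(x):
    d = len(x)
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / d))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / d)
                 + 20.0 + np.e)

x0 = np.zeros(30)
print(sphere(x0), rastrigin(x0), ackley(x0))  # all numerically 0
```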

All algorithms were implemented in MATLAB 7.14 (R2012a) and run on a computer with a 2.30 GHz processor and 4.00 GB of RAM. The experimental settings of [13-15] are used for the experiments in this paper: each experiment runs 20 times with a maximum of 1000 iterations, with fixed values for the perception distance visual, the largest step, and the inertia weight w; for ICSO we defined a hunger threshold, with hunger generated as a random number in each iteration. The cockroach parameters of [13] are used for RIO and HRIO, likewise with a hunger threshold and a randomly generated hunger value. The same cockroach population size is used in this paper for all the algorithms. Further details about RIO, HRIO, CSO, and MCSO can be found in [13-15].

ICSO, along with the similar algorithms CSO, MCSO, RIO, and HRIO, was implemented, and several simulation experiments were conducted and reported. Success rate, average and best fitness, standard deviation (STD), and execution time in seconds are used as performance measures for comparison (see Tables 2, 3, and 4).


SN | Fn. | Dim. | Opt. (result columns: RIO | HRIO | CSO | MCSO | ICSO)

1 | Boha1 | 2 | 0
    Ave.     0.0000
    STD      0.0000
    Best     0.0000 | 0.0000
    Success  20/20 | 20/20 | 5/20 | 20/20 | 20/20
    Time     1.137525 | 0.886356 | 23.913237 | 0.075212 | 0.097187

2 | Boha2 | 2 | 0
    Ave.     0.0000
    STD      0.0000
    Best     0.0000 | 0.0000
    Success  20/20 | 20/20 | 4/20 | 20/20 | 20/20
    Time     0.998178 | 0.946887 | 26.492095 | 0.072021 | 0.074106

3 | Boha3 | 2 | 0
    Ave.     0.0000
    STD      0.0000
    Best     0.0000 | 0.0000
    Success  20/20 | 20/20 | 3/20 | 20/20 | 20/20
    Time     1.089920 | 0.885252 | 25.028054 | 0.080908 | 0.068189

4 | 3camel | 2 | 0
    Ave.
    STD
    Best
    Success  19/20 | 20/20 | 12/20 | 20/20 | 20/20
    Time     4.231533 | 0.794983 | 18.281683 | 0.104132 | 0.078845

5 | 6camel | 2 |
    Ave.
    STD
    Best
    Success
    Time     0.406355 | 0.330198 | 5.723039 | 0.0945856 | 0.086637

6 | Easom | 2 |
    Ave.
    STD
    Best
    Success  20/20 | 20/20 | 20/20 | 20/20 | 20/20
    Time     0.124022 | 0.107303 | 0.106738 | 0.077179 | 0.092393

7 | Matyas | 2 | 0
    Ave.     7.5712
    STD
    Best
    Success  20/20 | 20/20 | 11/20 | 20/20 | 20/20
    Time     0.973322 | 0.711734 | 13.559576 | 0.88536 | 0.076693

8 | Schaffer1 | 2 |
    Ave.
    STD
    Best
    Success  20/20 | 20/20 | 20/20 | 20/20 | 20/20
    Time     0.109048 | 0.086433 | 0.119076 | 0.072400 | 0.081599

9 | Schaffer2 | 2 | 0
    Ave.
    STD
    Best
    Success
    Time     62.567654 | 31.415836 | 29.194283 | 0.084127 | 0.082320

Dim. denotes dimension. Opt. denotes optimum value. Boha1 denotes Bohachevsky1. Boha2 denotes Bohachevsky2. Boha3 denotes Bohachevsky3. 3camel denotes three hump camel back. 6camel denotes six hump camel back.

SN | Fn. | Dim. | Opt. (result columns: RIO | HRIO | CSO | MCSO | ICSO)

10 | Sphere | 30 | 0
    Ave.
    STD
    Best
    Success  20/20 | 20/20 | 19/20 | 20/20 | 20/20
    Time     0.617544 | 0.557871 | 25.378161 | 0.82512 | 0.199373

11 | Rastrigin | 30 | 0
    Ave.     0.0000
    STD      0.0000
    Best     0.0000 | 0.0000
    Success  20/20 | 20/20 | 5/20 | 20/20 | 20/20
    Time     0.956329 | 0.826770 | 71.811170 | 0.175563 | 0.369987

12 | Rosenbrock | 30 | 0
    Ave.
    STD      0.0000 | 0.0000
    Best
    Success  0/20 | 0/20 | 0/20 | 0/20 | 0/20
    Time     126.618734 | 127.469638 | 81.361663 | 76.084929 | 78.572185

13 | Ackley | 30 | 0
    Ave.
    STD
    Best
    Success  0/20 | 0/20 | 0/20 | 20/20 | 20/20
    Time     122.216187 | 117.635854 | 82.227210 | 0.235012 | 0.192339

14 | Quadric | 30 | 0
    Ave.
    STD
    Best
    Success  20/20 | 20/20 | 20/20 | 20/20 | 20/20
    Time     0.718785 | 0.512242 | 31.075809 | 0.247456 | 0.227244

15 | Schwefel2.22 | 30 | 0
    Ave.
    STD
    Best
    Success  0/20 | 0/20 | 0/20 | 20/20 | 20/20
    Time     128.445013 | 127.084387 | 79.924516 | 0.217104 | 0.219296

16 | Griewangk | 30 | 0
    Ave.     0.0000
    STD      0.0000
    Best     0.0000 | 0.0000
    Success  0/20 | 0/20 | 5/20 | 20/20 | 20/20
    Time     126.872461 | 126.210153 | 70.852376 | 0.211351 | 0.210934

17 | Sumsquare | 30 | 0
    Ave.
    STD
    Best
    Success  0/20 | 0/20 | 0/20 | 20/20 | 20/20
    Time     122.748646 | 125.154349 | 78.809270 | 0.273780 | 0.236129

18 | Sinusoidal | 30 |
    Ave.
    STD      1.0203
    Best
    Success  20/20 | 20/20 | 20/20 | 20/20 | 20/20
    Time     0.204559 | 0.240200 | 0.234205 | 0.200361 | 0.217635

Dim. denotes dimension. Opt. denotes optimum value.

SN | Function | Dim. | Opt. (result columns: RIO | HRIO | CSO | MCSO | ICSO)

19 | Zakharov | 30 | 0
    Ave.
    STD
    Best
    Success  0/20 | 0/20 | 0/20 | 20/20 | 20/20
    Time     115.192226 | 114.691827 | 79.926232 | 0.205280 | 0.259202

20 | Step | 30 | 0
    Ave.     0.0000 | 0.0000 | | 0.0000 | 0.0000
    STD      0.0000 | 0.0000 | | 0.0000 | 0.0000
    Best     0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    Success  20/20 | 20/20 | 16/20 | 20/20 | 20/20
    Time     0.686403 | 0.633264 | 39.136696 | 0.239525 | 0.225102

21 | Powell | 24 | 0
    Ave.
    STD
    Best
    Success  2/20 | 12/20 | 0/20 | 20/20 | 20/20
    Time     122.796991 | 92.876086 | 74.794730 | 1.527170 | 0.853751

22 | ST9 | 9 | 0
    Ave.     0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    STD      0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    Best     0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    Success  20/20 | 20/20 | 20/20 | 20/20 | 20/20
    Time     0.435911 | 0.426320 | 0.437944 | 0.431122 | 0.436741

23 | ST17 | 17 | 0
    Ave.     0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    STD      0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    Best     0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
    Success  20/20 | 20/20 | 20/20 | 20/20 | 20/20
    Time     1.066161 | 1.052169 | 1.159830 | 1.089657 | 1.147114

Dim. denotes dimension. Opt. denotes optimum value.

ICSO locates the minimum values of the tested benchmark problems, such as the Bohachevsky, Rastrigin, Easom, Schaffer, Step, and Storn's Tchebychev problems, as shown in Tables 2, 3, and 4. The comparison of the average performance of ICSO with that of RIO, HRIO, CSO, and MCSO is shown in Table 5; the comparison clearly shows that ICSO outperforms the other algorithms. Similarly, the best performance of ICSO is compared with that of RIO, HRIO, CSO, and MCSO in Table 6; ICSO performs better than the others.


SN | Function | RIO | HRIO | CSO | MCSO | ICSO | Optimum

1 | Bohachevsky1 | 0.0000 | 0
2 | Bohachevsky2 | 0.0000 | 0
3 | Bohachevsky3 | 0.0000 | 0
4 | 3 hump camel back | | 0
5 | 6 hump camel back | |
6 | Easom | |
7 | Matyas | | 0
8 | Schaffer1 | |
9 | Schaffer2 | | 0
10 | Sphere | | 0
11 | Rastrigin | 0.0000 | 0
12 | Rosenbrock | | 0
13 | Ackley | | 0
14 | Quadric | | 0
15 | Schwefel2.22 | | 0
16 | Griewangk | 0.0000 | 0
17 | Sumsquare | | 0
18 | Sinusoidal | |
19 | Zakharov | | 0
20 | Step | 0.0000 | 0.0000 | | 0.0000 | 0.0000 | 0
21 | Powell | | 0
22 | ST9 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0
23 | ST17 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0

Number of good optimums | 4 | 4 | 2 | 7 | 23

ST9 denotes Storn's Tchebychev 9. ST17 denotes Storn's Tchebychev 17.

SN | Function | RIO | HRIO | CSO | MCSO | ICSO | Optimum

1 | Bohachevsky1 | 0.0000 | 0.0000 | 0
2 | Bohachevsky2 | 0.0000 | 0.0000 | 0
3 | Bohachevsky3 | 0.0000 | 0.0000 | 0
4 | 3 hump camel back | | 0
5 | 6 hump camel back | |
6 | Easom | |
7 | Matyas | | 0
8 | Schaffer1 | |
9 | Schaffer2 | | 0
10 | Sphere | | 0
12 | Rosenbrock | | 0
14 | Quadric | | 0
15 | Schwefel2.22 | | 0
16 | Griewangk | 0.0000 | 0.0000 | 0
17 | Sumsquare | | 0
18 | Sinusoidal | |
19 | Zakharov | | 0
20 | Step | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0
21 | Powell | | 0
22 | ST9 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0
23 | ST17 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0

Number of good optimums | 4 | 4 | 5 | 11 | 22

ST9 denotes Storn's Tchebychev 9. ST17 denotes Storn's Tchebychev 17.

The ICSO algorithm performs consistently in each run, as evidenced by the very low standard deviation of the average optima recorded during the experiments. The STD of the ICSO average optima is compared with the STDs of RIO, HRIO, CSO, and MCSO in Table 7; ICSO attains the minimum STD more often than the others.


SN | Function | RIO | HRIO | CSO | MCSO | ICSO

1 | Bohachevsky1 | 0.0000
2 | Bohachevsky2 | 0.0000
3 | Bohachevsky3 | 0.0000
4 | 3 hump camel back |
5 | 6 hump camel back |
6 | Easom |
7 | Matyas |
8 | Schaffer1 |
9 | Schaffer2 | 5.3095
10 | Sphere |
11 | Rastrigin | 0.0000
12 | Rosenbrock | 0.0000 | 0.0000
13 | Ackley | 5.8258
14 | Quadric |
15 | Schwefel2.22 |
16 | Griewangk | 0.0000
17 | Sumsquare |
18 | Sinusoidal | 1.0203
19 | Zakharov |
20 | Step | 0.0000 | 0.0000 | | 0.0000 | 0.0000
21 | Powell |
22 | ST9 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000
23 | ST17 | 0.0000 | 0.0000 | 0.0000 | 0.0000 | 0.0000

Number of good STD | 2 | 2 | 2 | 4 | 23

ST9 denotes Storn's Tchebychev 9. ST17 denotes Storn's Tchebychev 17.

ICSO locates good solutions in each experiment, as evidenced by its success rate. Table 8 compares the success rate of the proposed algorithm with those of the existing algorithms RIO, HRIO, CSO, and MCSO. ICSO has a 100% success rate on all test functions except Rosenbrock.


SN | Function | RIO | HRIO | CSO | MCSO | ICSO

1 | Bohachevsky1 | 1 | 1 | 0.25 | 1 | 1
2 | Bohachevsky2 | 1 | 1 | 0.2 | 1 | 1
3 | Bohachevsky3 | 1 | 1 | 0.15 | 1 | 1
4 | 3 hump camel back | 0.95 | 1 | 0.6 | 1 | 1
5 | 6 hump camel back | 1 | 1 | 0.95 | 1 | 1
6 | Easom | 1 | 1 | 1 | 1 | 1
7 | Matyas | 1 | 1 | 0.55 | 1 | 1
8 | Schaffer1 | 1 | 1 | 1 | 1 | 1
9 | Schaffer2 | 0.1 | 0.65 | 0 | 1 | 1
10 | Sphere | 1 | 1 | 0.95 | 1 | 1
11 | Rastrigin | 1 | 1 | 0.25 | 1 | 1
12 | Rosenbrock | 0 | 0 | 0 | 0 | 0
13 | Ackley | 0 | 0 | 0 | 1 | 1
14 | Quadric | 1 | 1 | 1 | 1 | 1
15 | Schwefel2.22 | 0 | 0 | 0 | 1 | 1
16 | Griewangk | 0 | 0 | 0.25 | 1 | 1
17 | Sumsquare | 0 | 0 | 0 | 1 | 1
18 | Sinusoidal | 1 | 1 | 1 | 1 | 1
19 | Zakharov | 0 | 0 | 0 | 1 | 1
20 | Step | 1 | 1 | 0.8 | 1 | 1
21 | Powell | 0.1 | 0.6 | 0 | 1 | 1
22 | ST9 | 1 | 1 | 1 | 1 | 1
23 | ST17 | 1 | 1 | 1 | 1 | 1

Number of 100% success rates | 14 | 15 | 6 | 22 | 22

ST9 denotes Storn's Tchebychev 9. ST17 denotes Storn's Tchebychev 17.

ICSO uses little time in executing the selected test functions. Table 9 compares the execution time of ICSO with that of RIO, HRIO, CSO, and MCSO; ICSO records the minimum execution time on more functions than any other algorithm.


SN | Function | RIO | HRIO | CSO | MCSO | ICSO

1 | Bohachevsky1 | 1.137525 | 0.886356 | 23.913237 | 0.075212 | 0.097187
2 | Bohachevsky2 | 0.998178 | 0.946887 | 26.492095 | 0.072021 | 0.074106
3 | Bohachevsky3 | 1.089920 | 0.885252 | 25.028054 | 0.080908 | 0.068189
4 | 3 hump camel back | 4.231533 | 0.794983 | 18.281683 | 0.104132 | 0.078845
5 | 6 hump camel back | 0.406355 | 0.330198 | 5.723039 | 0.0945856 | 0.086637
6 | Easom | 0.124022 | 0.107303 | 0.106738 | 0.077179 | 0.092393
7 | Matyas | 0.973322 | 0.711734 | 13.559576 | 0.88536 | 0.076693
8 | Schaffer1 | 0.109048 | 0.086433 | 0.119076 | 0.072400 | 0.081599
9 | Schaffer2 | 62.567654 | 31.415836 | 29.194283 | 0.084127 | 0.082320
10 | Sphere | 0.617544 | 0.557871 | 25.378161 | 0.82512 | 0.199373
11 | Rastrigin | 0.956329 | 0.826770 | 71.811170 | 0.175563 | 0.369987
12 | Rosenbrock | 126.618734 | 127.469638 | 81.361663 | 76.084929 | 78.572185
13 | Ackley | 122.216187 | 117.635854 | 82.227210 | 0.235012 | 0.192339
14 | Quadric | 0.718785 | 0.512242 | 31.075809 | 0.247456 | 0.227244
15 | Schwefel2.22 | 128.445013 | 127.084387 | 79.924516 | 0.217104 | 0.219296
16 | Griewangk | 126.872461 | 126.210153 | 70.852376 | 0.211351 | 0.210934
17 | Sumsquare | 122.748646 | 125.154349 | 78.809270 | 0.273780 | 0.236129
18 | Sinusoidal | 0.204559 | 0.240200 | 0.234205 | 0.200361 | 0.217635
19 | Zakharov | 115.192226 | 114.691827 | 79.926232 | 0.205280 | 0.259202
20 | Step | 0.686403 | 0.633264 | 39.136696 | 0.239525 | 0.225102
21 | Powell | 122.796991 | 92.876086 | 74.794730 | 1.527170 | 0.853751
22 | ST9 | 0.435911 | 0.426320 | 0.437944 | 0.431122 | 0.436741
23 | ST17 | 1.066161 | 1.052169 | 1.159830 | 1.089657 | 1.147114

Number of minimum execution times | 0 | 2 | 0 | 9 | 12

ST9 denotes Storn's Tchebychev 9. ST17 denotes Storn's Tchebychev 17.

To determine whether there is a significant difference between the performance of the proposed algorithm and the existing algorithms, the Jonckheere-Terpstra (J-T) test was conducted using the Statistical Package for the Social Sciences (SPSS). The null hypothesis of the J-T test is that there is no difference among several independent groups. Following the usual practice in the literature, the p value threshold for the hypothesis test was set to 0.05: if the p value is less than 0.05, the null hypothesis is rejected, meaning there is a significant difference between the groups; otherwise the null hypothesis is retained. Table 10 shows the result of the J-T test; the p value (Asymp. Sig.) was computed to be 0.001. Since this value is less than the threshold 0.05, there is a significant difference between the performance of ICSO and that of RIO, HRIO, CSO, and MCSO on the benchmarks evaluated.


Fitness

Number of levels in algorithm | 5
N | 114
Observed J-T statistic | 1952.000
Mean J-T statistic | 2599.500
STD of J-T statistic | 199.355
Standardized J-T statistic | -3.248
Asymp. Sig. (2-tailed) | 0.001

Grouping variable: algorithm.

The effect size of the significant difference is a measure of the magnitude of the observed effect. The effect size r of the significant difference of the J-T test was calculated as

r = z / sqrt(N),

where z is the standardized J-T statistic as shown in Table 10 and N is the total number of samples, N = 114. Consider

z = (J - mu_J) / sigma_J,

where J denotes the observed J-T statistic, mu_J denotes the mean J-T statistic, and sigma_J denotes the standard deviation of the J-T statistic. Substituting the Table 10 values gives z = (1952.000 - 2599.500)/199.355 ≈ -3.248 and r ≈ -3.248/sqrt(114) ≈ -0.30.

The distance between the observed data and the mean, in units of standard deviation, is the absolute value of z (z is negative when the observed data lies below the mean and positive when above). The effect size |r| ≈ 0.30 is of medium size by Cohen's guideline on effect size [25, 26]. The effect-size statistic shows that there is a significant difference of medium magnitude between the proposed algorithm and the existing algorithms.
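The z and r arithmetic can be reproduced directly from the Table 10 values (taking N = 114 as reported):

```python
import math

# Standardized J-T statistic and effect size r = z / sqrt(N),
# using the values reported in Table 10 (N = 114 observations).
J_obs, J_mean, J_std, N = 1952.0, 2599.5, 199.355, 114
z = (J_obs - J_mean) / J_std
r = z / math.sqrt(N)
print(round(z, 3), round(r, 3))  # -3.248 -0.304: a medium effect by Cohen's guideline
```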

4. Conclusion

The cockroach swarm optimization algorithm is extended in this paper with a new component called the hunger component, which enhances the algorithm's diversity and searching capability. An improved cockroach swarm optimization algorithm is proposed. The efficiency of the proposed algorithm is shown through empirical studies in which its performance was compared with that of the existing algorithms CSO, MCSO, RIO, and HRIO. The results show its outstanding performance compared to the existing algorithms. Application of the algorithm to real-life problems can be considered in further studies.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgment

The authors thank College of Agriculture, Engineering and Science, University of KwaZulu-Natal, for supporting this paper through bursary award.

References

  1. S. Camazine, J. Deneubourg, N. Franks, J. Sneyd, G. Theraulaz, and E. Bonabeau, Self-Organization in Biological Systems, Princeton University Press, Princeton, NJ, USA, 2001.
  2. J. Kennedy and R. C. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948, 1995.
  3. F. Yoshikazu, N. Hideyuki, and T. Yuji, "Particle swarm optimization for the optimal operational planning of energy plants," in Innovations in Swarm Intelligence, Studies in Computational Intelligence, C. P. Lim, L. C. Jain, and S. Dehuri, Eds., pp. 159–174, Springer, Berlin, Germany, 2009.
  4. P. Chen, "Particle swarm optimization for power dispatch with pumped hydro," in Particle Swarm Optimization, A. Lazinica, Ed., pp. 131–144, In-Tech, Zagreb, Croatia, 2009.
  5. I. Omar and B. Cees, "A particle optimization approach to graph permutation," in Particle Swarm Optimization, A. Lazinica, Ed., pp. 291–312, In-Tech, Zagreb, Croatia, 2009.
  6. M. Dorigo, Optimization, learning and natural algorithms [Ph.D. thesis], Politecnico di Milano, Milan, Italy, 1992.
  7. A. Rizzoli, R. Montemanni, E. Lucibello, and L. Gambardella, Ant Colony Optimization for Real-World Vehicle Routing Problems: From Theory to Applications, Springer, 2007.
  8. A. A. A. Radwan, T. M. Mahmoud, and E. H. Hussein, "AntNet-RSLR: a proposed Ant routing protocol for MANETs," in Proceedings of the IEEE Saudi International Electronics, Communications and Photonics Conference (SIECPC '11), April 2011.
  9. D. Pham, A. Ghanbarzadeh, E. Koc, S. Otri, S. Rahim, and M. Zaidi, "The bees algorithm," Tech. Rep., Manufacturing Engineering Centre, Cardiff University, Cardiff, UK, 2005.
  10. D. Karaboga and B. Basturk, "On the performance of artificial bee colony (ABC) algorithm," Applied Soft Computing, vol. 8, no. 1, pp. 687–697, 2008.
  11. D. Karaboga and B. Akay, "A survey: algorithms simulating bee swarm intelligence," Artificial Intelligence Review, vol. 31, no. 1–4, pp. 61–85, 2009.
  12. B. Basturk and D. Karaboga, "An artificial bee colony (ABC) algorithm for numeric function optimization," in Proceedings of the IEEE Swarm Intelligence Symposium, Indianapolis, Ind, USA, 2006.
  13. T. C. Havens, C. J. Spain, N. G. Salmon, and J. M. Keller, "Roach infestation optimization," in Proceedings of the IEEE Swarm Intelligence Symposium (SIS '08), September 2008.
  14. C. ZhaoHui and T. HaiYan, "Cockroach swarm optimization," in Proceedings of the 2nd International Conference on Computer Engineering and Technology (ICCET '10), vol. 6, pp. 652–655, April 2010.
  15. C. ZhaoHui, "A modified cockroach swarm optimization," Energy Procedia, vol. 11, p. 49, 2011.
  16. I. C. Obagbuwa, A. O. Adewumi, and A. A. Adebiyi, "A dynamic step-size adaptation roach infestation optimization," in Proceedings of the IEEE International Conference on Advance Computing (IACC '14), Gurgaon, India, 2014.
  17. L. Cheng, Z. Wang, S. Yanhong, and A. Guo, "Cockroach swarm optimization algorithm for TSP," Advanced Engineering Forum, vol. 1, pp. 226–229, 2011.
  18. C. ZhaoHui and T. HaiYan, "Cockroach swarm optimization for vehicle routing problems," Energy Procedia, vol. 13, pp. 30–35, 2011.
  19. I. C. Obagbuwa, A. O. Adewumi, and A. A. Adebiyi, "Stochastic constriction cockroach swarm optimization for multidimensional space function problems," Mathematical Problems in Engineering, vol. 2014, Article ID 430949, 12 pages, 2014.
  20. W. J. Bell, L. M. Roth, and C. A. Nalepa, Cockroaches: Ecology, Behavior, and Natural History, Johns Hopkins University Press, Baltimore, Md, USA, 2007.
  21. M. Kerckhove, "From population dynamics to partial differential equations," The Mathematica Journal, vol. 14, 2012.
  22. M. M. Ali, C. Khompatraporn, and Z. B. Zabinsky, "A numerical evaluation of several stochastic algorithms on selected continuous global optimization test problems," Journal of Global Optimization, vol. 31, no. 4, pp. 635–672, 2005.
  23. D. Karaboga and B. Akay, "A comparative study of Artificial Bee Colony algorithm," Applied Mathematics and Computation, vol. 214, no. 1, pp. 108–132, 2009.
  24. P. Civicioglu and E. Besdok, "A conceptual comparison of the Cuckoo-search, particle swarm optimization, differential evolution and artificial bee colony algorithms," Artificial Intelligence Review, vol. 39, pp. 1–32, 2013.
  25. J. Cohen, Statistical Power Analysis for the Behavioral Sciences, L. Erlbaum Associates, Hillsdale, NJ, USA, 2nd edition, 1988.
  26. J. Cohen, "Statistical power analysis," Current Directions in Psychological Science, vol. 1, no. 3, pp. 98–101, 1992.

Copyright © 2014 I. C. Obagbuwa and A. O. Adewumi. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

