Mathematical Tools of Soft Computing 2014
Improving the Bin Packing Heuristic through Grammatical Evolution Based on Swarm Intelligence
In recent years Grammatical Evolution (GE) has been used as a representation of Genetic Programming (GP) and has been applied to many optimization problems such as symbolic regression, classification, Boolean functions, constructed problems, and algorithmic problems. GE can use a diversity of search strategies, including Swarm Intelligence (SI). Particle Swarm Optimization (PSO) is an SI algorithm that suffers from two main problems: premature convergence and poor diversity. Particle Evolutionary Swarm Optimization (PESO) is a recent and novel SI algorithm that uses two perturbations to avoid PSO's problems. In this paper we propose using PESO and PSO within the framework of GE as strategies to generate heuristics that solve the Bin Packing Problem (BPP); the methodology can, however, be applied to other kinds of problems by designing another Grammar for the problem at hand. A comparison between PESO, PSO, and the BPP's heuristics is performed through the nonparametric Friedman test. The main contribution of this paper is a Grammar that generates online and offline heuristics, depending on the test instance, which tries to improve on the heuristics generated by other grammars and by humans; the paper also proposes a way to implement different algorithms, such as PESO, as search strategies in GE, obtaining better results than those obtained by PSO.
Developing a methodology to solve a specific problem is a process that entails studying the problem and analyzing instances of it. There are many problems for which no methodology can provide the exact solution, because the size of the problem's search space makes it intractable in time; this makes it necessary to search for and improve methodologies that can give a solution in finite time. There are methodologies based on Artificial Intelligence which do not yield exact solutions but do provide an approximation; among them we can find the following.
Heuristics are defined as “a type of strategy that dramatically limits the search for solutions” [2, 3]. One important characteristic of heuristics is that they can obtain a result for a problem instance in polynomial time, although heuristics are developed for a specific problem instance.
Metaheuristics are defined as “a master strategy that guides and modifies other heuristics to obtain solutions generally better than the ones obtained with a local search optimization”. Metaheuristics can work over several instances of a given problem, or over various problems, but it is necessary to adapt the metaheuristic to each problem.
It has been shown that the metaheuristic Genetic Programming can generate a heuristic that can be applied to a problem instance. There also exist metaheuristics based on Genetic Programming's paradigm, such as Grammatical Differential Evolution, Grammatical Swarm, Particle Swarm Programming, and Geometric Differential Evolution.
The Bin Packing Problem (BPP) has been widely studied because of its many industrial applications, like wood and glass cutting, packing in transportation and warehousing, and job scheduling on uniform processors [13, 14]. It is an NP-Hard problem, and due to its complexity many heuristics have been developed that attempt to give an approximation [15–19]. Some metaheuristics have also been applied to try to obtain better results than those obtained by heuristics [20–22]. Some exact algorithms have been developed as well [23–25]; however, given the nature of the problem, the running time reported for these algorithms grows with instance size and, depending on the instance, may grow exponentially.
The contribution of this paper is a generic methodology to generate heuristics using GE with different search strategies. It is shown that it is possible to use this methodology to generate BPP heuristics by using PESO and PSO as search strategies; it is also shown that the heuristics generated with the proposed Grammar perform better than the BPP's classical heuristics, which were designed by experts in Operational Research. These results were obtained by comparing the results of GE and of the BPP heuristics by means of the Friedman nonparametric test.
GE is described in Section 2, including PSO and PESO. Section 3 describes the Bin Packing Problem, the state-of-the-art heuristics, the instances used, and the fitness function. The experiments performed are described in Section 4. Finally, general conclusions about the present work, including future perspectives, are presented in Section 5.
2. Grammatical Evolution
Grammatical Evolution (GE) is a grammar-based form of Genetic Programming (GP). GE joins the principles of molecular biology, which are used by GP, with the power of formal grammars. Unlike GP, GE adopts a population of linear genotypic integer strings, or binary strings, which are transformed into functional phenotypic programs through a genotype-to-phenotype mapping process; this process is also known as Indirect Representation. The genotype strings evolve with no knowledge of their phenotypic equivalent, using only the fitness measure.
The transformation is governed by a Backus Naur Form (BNF) grammar, which is made up of the tuple {N, T, P, S}, where N is the set of all nonterminal symbols, T is the set of terminals, P is the set of production rules that map elements of N to sequences over N ∪ T, and S ∈ N is the initial start symbol. There may be several production rules that can be applied to a nonterminal; an “|” (or) symbol separates the options.
Even though GE uses the Genetic Algorithm (GA) [7, 28, 30] as its usual search strategy, it is possible to use another search strategy such as Particle Swarm Optimization, in which case the resulting algorithm is called Grammatical Swarm (GS).
In GE each individual is mapped into a program using the BNF grammar, choosing the next production for the current nonterminal symbol with (1): rule = c mod r, where c is the codon value and r is the number of production rules available for the current nonterminal. An example of the mapping process employed by GE is shown in Figure 1.
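The mapping process can be sketched in a few lines: codons are consumed left to right, each one selecting a production via codon mod number-of-rules, wrapping the genotype if it runs out. The toy arithmetic grammar below is only illustrative; it is not the paper's Bin Packing grammar.

```python
# Minimal sketch of GE's genotype-to-phenotype mapping (rule = codon mod
# number-of-productions). The toy grammar is illustrative only.
GRAMMAR = {
    "<expr>": [["<expr>", "<op>", "<expr>"], ["x"], ["1"]],
    "<op>": [["+"], ["*"]],
}

def map_genotype(codons, start="<expr>", max_wraps=2):
    symbols = [start]          # work list of symbols still to expand
    out = []
    i, wraps = 0, 0
    while symbols:
        sym = symbols.pop(0)
        if sym in GRAMMAR:
            if i == len(codons):       # wrap the genotype when exhausted
                if wraps == max_wraps:
                    return None        # mapping failed (invalid individual)
                i, wraps = 0, wraps + 1
            rules = GRAMMAR[sym]
            choice = rules[codons[i] % len(rules)]  # Eq. (1)
            i += 1
            symbols = list(choice) + symbols
        else:
            out.append(sym)    # terminal symbol goes straight to the program
    return " ".join(out)

print(map_genotype([3, 1, 4, 1, 5]))   # a small, fully mapped individual
```

An individual whose codons keep selecting the recursive rule never terminates, which is why the wrap count is bounded and the mapping can fail.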
GE can use different search strategies; our proposed model is shown in Figure 2. This model includes the problem instance and the search strategy as inputs. In the usual description the search strategy is part of the process; however, it can be seen as an additional element that can be chosen to work with GE. GE generates a solution through the selected search strategy, and the solution is evaluated with the objective function on the problem instance.
2.1. Particle Swarm Optimization
Particle Swarm Optimization (PSO) [31–35] is a metaheuristic bioinspired by flocks of birds and schools of fish. It was developed by Kennedy and Eberhart based on a concept called social metaphor. This metaheuristic simulates a society where all individuals contribute their knowledge to obtain a better solution. Three factors influence the change of status or behavior of an individual:
(i) the knowledge of the environment, or adaptation, related to the importance given to the experience of the individual;
(ii) its experience, or local memory, related to the importance given to the best result found by the individual;
(iii) the experience of its neighbors, or global memory, related to how important the best result obtained by its neighbors or other individuals is.
In this metaheuristic each individual is considered a particle and moves through a multidimensional space that represents the social space; the search space depends on the dimension of the space, which in turn depends on the variables used to represent the problem.
To update each particle we use the velocity vector, which tells how fast the particle will move in each of the dimensions; the PSO velocity update is given by (2) and the position update by (3). Algorithm 1 shows the complete PSO algorithm:

v_i = ω v_i + c_1 r_1 (p_i − x_i) + c_2 r_2 (g − x_i), (2)

x_i = x_i + v_i, (3)

where
(i) v_i is the velocity of the ith particle,
(ii) ω is the adjustment factor to the environment,
(iii) c_2 is the memory coefficient in the neighborhood,
(iv) c_1 is the local memory coefficient,
(v) x_i is the position of the ith particle,
(vi) g is the best position found so far by all particles,
(vii) p_i is the best position found by the ith particle,
and r_1, r_2 are uniform random numbers in [0, 1].
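The updates above can be sketched as a complete minimization loop. The parameter values (omega, c1, c2, swarm size) below are common illustrative choices, not the ones tuned in this paper.

```python
import random

# Minimal PSO sketch for minimizing a function over R^d, following the
# standard velocity/position updates; parameter values are illustrative.
def pso(f, dim, n_particles=20, iters=100, omega=0.7, c1=1.5, c2=1.5,
        lo=-5.0, hi=5.0, seed=1):
    rnd = random.Random(seed)
    x = [[rnd.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]          # best position of each particle
    gbest = min(pbest, key=f)[:]         # best position of the whole swarm
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rnd.random(), rnd.random()
                # velocity update: inertia + local memory + global memory
                v[i][d] = (omega * v[i][d]
                           + c1 * r1 * (pbest[i][d] - x[i][d])
                           + c2 * r2 * (gbest[d] - x[i][d]))
                x[i][d] += v[i][d]       # position update
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda p: sum(t * t for t in p)
best = pso(sphere, dim=2)
print(best, sphere(best))
```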
2.2. Particle Evolutionary Swarm Optimization
Particle Evolutionary Swarm Optimization (PESO) [36–38] is based on PSO but introduces two perturbations in order to avoid two problems observed in PSO:
(i) premature convergence,
(ii) poor diversity.
Algorithm 2 shows the PESO algorithm with the two perturbations, given as Algorithms 3 and 4. The C-Perturbation has the advantage of keeping the self-organization potential of the flock, as no separate probability distribution needs to be computed; meanwhile the M-Perturbation helps keep diversity in the population.
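A rough sketch of the two perturbations follows; the exact update rules are given in Algorithms 3 and 4 of the paper, so the recombination form and the reset probability used here are assumptions for illustration only.

```python
import random

rnd = random.Random(7)

def c_perturbation(swarm):
    """C-Perturbation sketch: recombine each particle with two random flock
    members (no separate probability distribution is computed, so the
    flock's self-organization potential is kept)."""
    out = []
    for p in swarm:
        a, b = rnd.sample(swarm, 2)
        r = rnd.random()
        out.append([pi + r * (ai - bi) for pi, ai, bi in zip(p, a, b)])
    return out

def m_perturbation(swarm, lo=-5.0, hi=5.0):
    """M-Perturbation sketch: reset each coordinate with small probability
    to a random value in range, which helps keep diversity."""
    d = len(swarm[0])
    return [[rnd.uniform(lo, hi) if rnd.random() < 1.0 / d else pi
             for pi in p] for p in swarm]

swarm = [[rnd.uniform(-5, 5) for _ in range(3)] for _ in range(4)]
print(m_perturbation(c_perturbation(swarm)))
```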
3. Bin Packing Problem
The Bin Packing Problem (BPP) can be described as follows: given n items that need to be packed into the lowest possible number of bins, each item j has a weight w_j, with j = 1, …, n; the maximum capacity c of the bins is also given. The objective is to minimize the number of bins used to pack all the items, given that each item is assigned to exactly one bin and the sum of the weights of the items in a bin cannot exceed the bin's capacity.
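The two constraints, each item in exactly one bin and no bin over capacity, can be sketched as a feasibility check on a candidate packing; the weights and the solution below are made-up illustrative data.

```python
# Sketch of the BPP constraints as a feasibility check: every item packed
# exactly once, and no bin over capacity.
def is_feasible(bins, weights, capacity):
    packed = [j for b in bins for j in b]
    if sorted(packed) != list(range(len(weights))):  # each item exactly once
        return False
    return all(sum(weights[j] for j in b) <= capacity for b in bins)

weights = [4, 8, 1, 4, 2, 1]
capacity = 10
solution = [[1, 2, 5], [0, 3, 4]]   # objective value: 2 bins used
print(is_feasible(solution, weights, capacity))
```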
This problem has been widely studied, including the following:
(i) proposing new theorems [41, 42],
(ii) developing new heuristic algorithms based on Operational Research concepts [18, 43],
(iii) characterizing the problem instances [44–46],
(iv) implementing metaheuristics [20, 47–49].
This problem has been shown to be an NP-Hard optimization problem. A mathematical definition of the BPP is as follows:

minimize z = Σ_{i=1}^{n} y_i

subject to the following constraints and conditions:

Σ_{j=1}^{n} w_j x_{ij} ≤ c y_i, i = 1, …, n,
Σ_{i=1}^{n} x_{ij} = 1, j = 1, …, n,
y_i ∈ {0, 1}, x_{ij} ∈ {0, 1},

where
(i) w_j is the weight of item j,
(ii) y_i is a binary variable that shows whether bin i has items,
(iii) x_{ij} indicates whether item j is in bin i,
(iv) n is the number of available bins,
(v) c is the capacity of each bin.
The algorithms for the BPP instances can be classified as online or offline. An algorithm is considered online if the items are not known before the packing process starts and offline if all the items are known beforehand. In this research we worked with both kinds of algorithms.
3.1. Test Instances
Beasley proposed a collection of test data sets, known as the OR-Library and maintained by Beasley, which were studied by Falkenauer. This collection contains test data sets for a variety of Operational Research problems, including the BPP in several dimensions. For the one-dimensional BPP the collection contains eight data sets that can be classified in two classes.
(i) Uniform. The data sets binpack1 to binpack4 consist of items of sizes uniformly distributed in (20, 100) to be packed into bins of size 150. The number of bins in the currently known solution was reported in the literature.
(ii) Triplets. The data sets binpack5 to binpack8 consist of items with sizes in (24, 50) to be packed into bins of size 100. The number of bins can be obtained by dividing the size of the data set by three.
Scholl et al. proposed another collection of data sets, of which only 1184 problems had been solved optimally; Alvim et al. reported the optimal solutions for the remaining 26 problems. The collection contains three data sets.
(i) Set 1. It has 720 instances with items drawn from a uniform distribution on three intervals. The bin capacity is c = 100, 120, and 150 and n = 50, 100, 200, and 500.
(ii) Set 2. It has 480 instances with c = 1000 and n = 50, 100, 200, and 500. Each bin holds an average of 3–9 items.
(iii) Set 3. It has 10 instances with n = 200 and c = 100,000, and items are drawn from a uniform distribution on (20,000, 35,000). Set 3 is considered the most difficult of the three sets.
3.2. Classic Heuristics
Heuristics have been used to solve the BPP, obtaining good results. The following heuristics are known as Classical Heuristics; they can be used as online heuristics, if the items must be packed as they arrive, or as offline heuristics, if the items can be sorted before the packing process starts.
(i) Best Fit puts the piece in the fullest bin that has room for it and opens a new bin if the piece does not fit in any existing bin.
(ii) Worst Fit puts the piece in the emptiest bin that has room for it and opens a new bin if the piece does not fit in any existing bin.
(iii) Almost Worst Fit puts the piece in the second emptiest bin if that bin has room for it and opens a new bin if the piece does not fit in any open bin.
(iv) Next Fit puts the piece in the right-most bin and opens a new bin if there is not enough room for it.
(v) First Fit puts the piece in the left-most bin that has room for it and opens a new bin if it does not fit in any open bin.
Even though some heuristics perform better than the ones shown in this section [16, 19, 42, 52, 53], those heuristics are the result of research on lower and upper bounds for the minimal number of bins.
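The bin-selection rules above differ only in which open bin they try first, so they can be sketched with one shared packing loop (Almost Worst Fit is omitted for brevity); the item list is made-up illustrative data.

```python
# Sketches of the classic online heuristics; each open bin is represented
# by its remaining free space.
def pack(items, capacity, choose):
    bins = []                          # free space left in each open bin
    for item in items:
        idx = choose(bins, item)
        if idx is None:
            bins.append(capacity - item)   # open a new bin
        else:
            bins[idx] -= item
    return len(bins)

def best_fit(bins, item):              # fullest bin with room for the item
    feas = [i for i, free in enumerate(bins) if free >= item]
    return min(feas, key=lambda i: bins[i]) if feas else None

def worst_fit(bins, item):             # emptiest bin with room for the item
    feas = [i for i, free in enumerate(bins) if free >= item]
    return max(feas, key=lambda i: bins[i]) if feas else None

def first_fit(bins, item):             # left-most bin with room for the item
    return next((i for i, free in enumerate(bins) if free >= item), None)

def next_fit(bins, item):              # right-most (most recently opened) bin
    return len(bins) - 1 if bins and bins[-1] >= item else None

items, cap = [4, 8, 1, 4, 2, 1], 10
for h in (best_fit, worst_fit, first_fit, next_fit):
    print(h.__name__, pack(items, cap, h))
```

On this small example Next Fit opens three bins while the other rules need only two, since Next Fit never revisits an earlier bin.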
3.3. Fitness Measure
There are many fitness measures used to discern the results obtained by heuristic and metaheuristic algorithms. Two fitness measures are commonly used: the first, (6), is the difference between the number of bins used and the theoretical upper bound on the bins needed; the second, (7), was proposed by Falkenauer and rewards full or almost full bins, the objective being to fill each bin and minimize the free space:

F_BPP = (Σ_{i=1}^{N} (F_i / C)^k) / N, (7)

where
(i) N is the number of bins used,
(ii) F_i is the sum of the sizes of the pieces in bin i,
(iii) C is the bin capacity,
(iv) k is a constant, k > 1, that rewards almost-full bins.
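Falkenauer's measure can be sketched directly; the usual choice k = 2 and the tiny weight list below are illustrative, not taken from the paper's experiments.

```python
# Sketch of Falkenauer's fitness (the second measure), which rewards full
# or almost-full bins: F = (sum over bins of (fill_i / C)^k) / N, with the
# common choice k = 2; higher is better.
def falkenauer_fitness(bins, weights, capacity, k=2):
    fills = [sum(weights[j] for j in b) for b in bins]
    return sum((f / capacity) ** k for f in fills) / len(bins)

weights, cap = [4, 8, 1, 4, 2, 1], 10
tight = [[1, 2, 5], [0, 3, 4]]        # two completely full bins
loose = [[0], [1], [2, 3], [4, 5]]    # four partly filled bins
print(falkenauer_fitness(tight, weights, cap))
print(falkenauer_fitness(loose, weights, cap))
```

Because the fill ratio is raised to a power greater than one, two full bins score strictly higher than the same load spread thinly over more bins, which is exactly the pressure the measure is meant to apply.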
4. Grammar Design and Testing
Algorithm 5 shows the proposed approach, which allows the use of different fitness functions and search strategies to generate heuristics automatically.
To improve the Bin Packing heuristics it was necessary to design a grammar that represents the Bin Packing Problem. Grammar 1 was based on heuristic elements taken from earlier work; however, about 10% of the solutions it produced could not be applied to the instance, and for this reason that approach is not included in the comparison against the results obtained here.
That Grammar was improved as Grammar 2 to obtain results similar to those obtained by the BestFit heuristic. However, Grammar 2 cannot be applied to offline Bin Packing Problems because it does not sort the pieces. Grammar 3 is proposed to improve on the results obtained by Grammar 2, given that it can generate both online and offline heuristics:
Grammar 2. The grammar proposed in earlier work was based on the BestFit heuristic:
Grammar 3. The grammar proposed to generate online and offline heuristics is based on Grammar 2, where
(i) the first symbol denotes the size of the current piece,
(ii) the second denotes the bin capacity,
(iii) the third denotes the sum of the pieces already in the bin,
(iv) Elements sorts the elements,
(v) Bin sorts the bins based on the bin number,
(vi) Cont sorts the bins based on the bin contents,
(vii) Asc sorts in ascending order,
(viii) Des sorts in descending order.
To generate the heuristics, Grammar 3 was used. The search strategies applied to GE were PESO and PSO. The number of function calls was taken from previous work, where it was explained that this number is only 10% of the number of function calls used elsewhere. To obtain the parameters shown in Table 1, a fine-tuning process was applied based on Covering Arrays (CA); in this case the CA was generated using the Covering Array Library (CAS) from the National Institute of Standards and Technology (NIST) (http://csrc.nist.gov/groups/SNS/acts/index.html).
To generate the heuristics, one instance from each set was used. Once the heuristic was obtained for each instance set, it was applied to all the sets to obtain the heuristic's fitness. The instance sets used are detailed in Section 3.1. Thirty-three independent experiments were performed, and the median was used to compare the results against those obtained with the heuristics described in Section 3. The comparison was implemented through the Friedman nonparametric test [26, 60]; this test uses a post hoc analysis to discern the performance of the experiments and gives a ranking of them.
The method to apply the heuristics generated by Grammar 3 to an instance set is described below.
(i) The generated heuristic is applied to each instance in the instance set.
(ii) The generated heuristic has the option to sort the items before the packing process starts, treating the instances as offline instances.
(iii) The next part of the generated heuristic specifies how to sort the bins; many heuristics require sorting the bins before packing an item.
(iv) The last part, the inequality, determines the rule used to pack an item.
Sometimes the generated heuristic does not sort the items, which makes it work like an online heuristic. If it does not sort the bins, the items are packed into the bins in the order in which the bins were created.
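The three-part structure described above (optional item sort, optional bin sort, packing inequality) can be sketched as follows. The encoding of the sort options and the default inequality are hypothetical, chosen only to illustrate how one generated heuristic is executed over an instance.

```python
# Sketch of applying a Grammar-3-style heuristic: optional item sort
# (offline behaviour), optional bin sort, then an inequality that decides
# where each piece goes. Option names and the rule are illustrative.
def apply_heuristic(items, capacity, sort_items=None, sort_bins=None,
                    rule=lambda piece, free: piece <= free):
    if sort_items:                       # offline: order the pieces first
        items = sorted(items, reverse=(sort_items == "Des"))
    bins = []                            # remaining free space per bin
    for piece in items:
        order = range(len(bins))         # default: bin creation order
        if sort_bins == "ContAsc":       # fullest (least free space) first
            order = sorted(order, key=lambda i: bins[i])
        elif sort_bins == "ContDes":     # emptiest (most free space) first
            order = sorted(order, key=lambda i: -bins[i])
        target = next((i for i in order if rule(piece, bins[i])), None)
        if target is None:
            bins.append(capacity - piece)
        else:
            bins[target] -= piece
    return len(bins)

# Sorting pieces descending and trying the fullest feasible bin first
# behaves like a BestFit Decreasing heuristic:
print(apply_heuristic([4, 8, 1, 4, 2, 1], 10,
                      sort_items="Des", sort_bins="ContAsc"))
```

With neither sort selected, the same loop degrades to an online first-fit-style packing, which mirrors the remark above that an unsorted heuristic works like an online heuristic.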
Table 2 shows the results obtained with the online and offline heuristics described in Section 3.2. Results obtained by an exact algorithm, the MTP algorithm, were included, and the values of the fitness function from Section 3.3 are shown together with the number of bins used. A row was added showing the difference in containers with respect to the optimum. These results were obtained by applying the heuristics to each instance and adding up the results over each instance set.
Table 3 shows examples of heuristics generated with the proposed Grammar and GE for each instance set; some heuristics could be simplified, but that is beyond the scope of the present work.
The results obtained by PSO and PESO with the Grammars are shown in Table 4; these results are the medians of 33 independent experiments. Using the results obtained by the heuristics and by GE with PESO and PSO, the Friedman nonparametric test was performed to discern the results. The statistic obtained by the Friedman nonparametric test is 85.789215 and the p value is 6.763090E-11; this means that the tested heuristics have different performance. It was therefore necessary to apply a post hoc procedure to obtain the heuristics ranking shown in Table 5.
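The Friedman statistic used for this comparison can be sketched in a few lines: rank the k algorithms on each of the n instance sets, then compute chi2 = 12n/(k(k+1)) · Σ R_j² − 3n(k+1), where R_j is the mean rank of algorithm j. The data below is made up for illustration; it is not the paper's results, and ties are not handled in this sketch.

```python
# Sketch of the Friedman test statistic for comparing algorithms across
# instance sets (lower bin counts rank better; no tie handling here).
def mean_ranks(results):
    # results[i][j] = bins used by algorithm j on instance set i
    n, k = len(results), len(results[0])
    ranks = [[0.0] * k for _ in range(n)]
    for i, row in enumerate(results):
        order = sorted(range(k), key=lambda j: row[j])
        for pos, j in enumerate(order):
            ranks[i][j] = pos + 1.0
    return [sum(ranks[i][j] for i in range(n)) / n for j in range(k)]

def friedman_statistic(results):
    n, k = len(results), len(results[0])
    R = mean_ranks(results)
    return 12.0 * n / (k * (k + 1)) * sum(r * r for r in R) - 3.0 * n * (k + 1)

data = [[201, 205, 210],    # three algorithms over four instance sets
        [120, 122, 125],
        [333, 335, 340],
        [ 98, 100, 104]]
print(mean_ranks(data), friedman_statistic(data))
```

A large statistic (small p value), as reported above, indicates the algorithms do not share the same performance, which is what triggers the post hoc ranking procedure.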
Both Tables 2 and 4 have an extra row at the bottom with the total number of remaining bins. The results obtained by PESO using Grammar 3 show that this automatically generated heuristic uses fewer bins than the classic heuristics.
5. Conclusions and Future Works
In the present work a Grammar was proposed to generate online and offline heuristics in order to improve on heuristics generated by other grammars and by humans. The use of PESO, a search strategy based on Swarm Intelligence, was also proposed to avoid the problems observed in PSO.
From the results presented in Section 4, it was concluded that it is possible to generate good heuristics with the proposed Grammar. Additionally, it can be seen that the quality of these heuristics strongly depends on the grammar used to evolve them.
The grammar proposed in the present work shows that it is possible to generate heuristics with better performance than the well-known BestFit, FirstFit, NextFit, WorstFit, and AlmostWorstFit heuristics from Section 3.2, regardless of whether the heuristics are online or offline. While the classic heuristics are designed to work with all the instance sets, GE adjusts its heuristics automatically to work with one instance set, which makes it possible for GE to generate offline or online heuristics. GE can generate as many heuristics as there are instance sets, adapting the best heuristic that can be generated with the given Grammar to each set.
The current investigation is based on the one-dimensional Bin Packing Problem, but this methodology can be used to solve other problems, owing to the generality of the approach. It remains necessary to apply heuristic generation to other problems and to investigate whether GE with PESO as a search strategy gives better results than GP or than GE with other search strategies.
It will be necessary to find a methodology to choose the instance or instances for the training process, as well as to determine whether the instances share the same features or should be classified into groups with the same features so as to generate only one heuristic per group.
It will also be necessary to research metaheuristics that do not need parameter tuning, because the metaheuristics shown in the present paper were tuned using Covering Arrays.
Conflict of Interests
The authors declare that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The authors want to thank the Instituto Tecnológico de León (ITL) for the support provided for this research. Additionally, the authors want to acknowledge the generous support from the Consejo Nacional de Ciencia y Tecnología (CONACyT) of Mexico for this research project.
References
M. R. Garey and D. S. Johnson, Computers and Intractability: A Guide to the Theory of NP-Completeness, W. H. Freeman, New York, NY, USA, 1979.
E. A. Feigenbaum and J. Feldman, Computers and Thought, AAAI Press, 1963.
M. H. J. Romanycia and F. J. Pelletier, “What is a heuristic?” Computational Intelligence, vol. 1, no. 1, pp. 47–58, 1985.
J. R. Koza, “Hierarchical genetic algorithms operating on populations of computer programs,” in Proceedings of the 11th International Joint Conference on Artificial Intelligence, pp. 768–774, San Mateo, Calif, USA, 1989.
E. K. Burke, M. Hyde, and G. Kendall, “Evolving bin packing heuristics with genetic programming,” in Parallel Problem Solving from Nature—PPSN IX, T. Runarsson, H.-G. Beyer, J. Merelo-Guervós, L. Whitley, and X. Yao, Eds., vol. 4193 of Lecture Notes in Computer Science, pp. 860–869, Springer, Berlin, Germany, 2006.
C. Ryan, J. Collins, and M. O'Neill, “Grammatical evolution: evolving programs for an arbitrary language,” in Proceedings of the 1st European Workshop on Genetic Programming, vol. 1391 of Lecture Notes in Computer Science, pp. 83–95, Springer, 1998.
M. O'Neill and A. Brabazon, “Grammatical differential evolution,” in Proceedings of the International Conference on Artificial Intelligence (ICAI '06), CSREA Press, Las Vegas, Nev, USA, 2006.
A. Moraglio and S. Silva, “Geometric differential evolution on the space of genetic programs,” in Genetic Programming, A. Esparcia-Alcázar, A. Ekárt, S. Silva, S. Dignum, and A. Uyar, Eds., vol. 6021 of Lecture Notes in Computer Science, pp. 171–183, Springer, Berlin, Germany, 2010.
E. Coffman Jr., G. Galambos, S. Martello, and D. Vigo, Bin Packing Approximation Algorithms: Combinatorial Analysis, Kluwer Academic Publishers, 1998.
E. Falkenauer, “A hybrid grouping genetic algorithm for bin packing,” Journal of Heuristics, vol. 2, pp. 5–30, 1996.
A. Ponce-Pérez, A. Pérez-Garcia, and V. Ayala-Ramirez, “Bin-packing using genetic algorithms,” in Proceedings of the 15th International Conference on Electronics, Communications and Computers (CONIELECOMP '05), pp. 311–314, IEEE Computer Society, Los Alamitos, Calif, USA, March 2005.
J. Puchinger and G. Raidl, “Combining metaheuristics and exact algorithms in combinatorial optimization: a survey and classification,” in Artificial Intelligence and Knowledge Engineering Applications: A Bioinspired Approach, J. Mira and J. Alvarez, Eds., vol. 3562 of Lecture Notes in Computer Science, pp. 41–53, Springer, Berlin, Germany, 2005.
J. Derrac, S. García, D. Molina, and F. Herrera, “A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms,” Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.
J. R. Koza and R. Poli, “Genetic programming,” in Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques, E. K. Burke and G. Kendall, Eds., pp. 127–164, Kluwer, Boston, Mass, USA, 2005.
I. Dempsey, M. O'Neill, and A. Brabazon, Foundations in Grammatical Evolution for Dynamic Environments, vol. 194, Springer, New York, NY, USA, 2009.
H.-L. Fang, P. Ross, and D. Corne, “A promising genetic algorithm approach to job-shop scheduling, rescheduling, and open-shop scheduling problems,” in Proceedings of the 5th International Conference on Genetic Algorithms, pp. 375–382, Morgan Kaufmann, Burlington, Mass, USA, 1993.
J. H. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, 1975.
J. Kennedy and R. Eberhart, “Particle swarm optimization,” in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, December 1995.
R. Poli, J. Kennedy, and T. Blackwell, “Particle swarm optimization,” Swarm Intelligence, vol. 1, no. 1, pp. 33–57, 2007.
M. F. Tasgetiren, P. N. Suganthan, and Q. Pan, “A discrete particle swarm optimization algorithm for the generalized traveling salesman problem,” in Proceedings of the 9th Annual Genetic and Evolutionary Computation Conference (GECCO '07), pp. 158–167, New York, NY, USA, July 2007.
A. E. M. Zavala, A. H. Aguirre, and E. R. Villa Diharce, “Constrained optimization via Particle Evolutionary Swarm Optimization algorithm (PESO),” in Proceedings of the Conference on Genetic and Evolutionary Computation (GECCO '05), pp. 209–216, New York, NY, USA, June 2005.
A. E. Muñoz-Zavala, A. Hernández-Aguirre, E. R. Villa-Diharce, and S. Botello-Rionda, “PESO+ for constrained optimization,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC '06), pp. 231–238, July 2006.
S. Martello and P. Toth, Knapsack Problems: Algorithms and Computer Implementations, John Wiley & Sons, New York, NY, USA, 1990.
E. G. Coffman Jr., C. Courcoubetis, M. R. Garey, P. W. Shor, and R. R. Weber, “Bin packing with discrete item sizes. I. Perfect packing theorems and the average case behavior of optimal packings,” SIAM Journal on Discrete Mathematics, vol. 13, no. 3, pp. 384–402, 2000.
E. Falkenauer and A. Delchambre, “A genetic algorithm for bin packing and line balancing,” in Proceedings of the IEEE International Conference on Robotics and Automation, vol. 2, pp. 1186–1192, May 1992.
C. D. T. Suárez, E. P. González, and M. V. Rendón, “A heuristic algorithm for the offline one-dimensional bin packing problem inspired by the point Jacobi matrix iterative method,” in Proceedings of the 5th Mexican International Conference on Artificial Intelligence (MICAI '06), pp. 281–286, Mexico City, Mexico, November 2006.
S. Tam, H. Tam, L. Tam, and T. Zhang, “A new optimization method, the algorithm of changes, for Bin Packing Problem,” in Proceedings of the IEEE 5th International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA '10), pp. 994–999, September 2010.
M. Hyde, A Genetic Programming Hyper-Heuristic Approach to Automated Packing [Ph.D. thesis], University of Nottingham, 2010.
M. A. Sotelo-Figueroa, H. J. Puga Soberanes, J. Martín Carpio et al., “Evolving bin packing heuristic using micro-differential evolution with indirect representation,” in Recent Advances on Hybrid Intelligent Systems, vol. 451 of Studies in Computational Intelligence, pp. 349–359, Springer, Berlin, Germany, 2013.
M. Sotelo-Figueroa, H. Puga Soberanes, J. Martín Carpio, H. Fraire Huacuja, L. Cruz Reyes, and J. Soria-Alcaraz, “Evolving and reusing bin packing heuristic through grammatical differential evolution,” in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '13), pp. 92–98, Fargo, ND, USA, August 2013.
E. K. Burke and G. Kendall, Search Methodologies: Introductory Tutorials in Optimization and Decision Support Techniques, Springer, New York, NY, USA, 2006.
A. Rodriguez-Cristerna, J. Torres-Jimenez, I. Rivera-Islas, C. Hernandez-Morales, H. Romero-Monsivais, and A. Jose-Garcia, “A mutation-selection algorithm for the problem of minimum Brauer chains,” in Advances in Soft Computing, I. Batyrshin and G. Sidorov, Eds., vol. 7095 of Lecture Notes in Computer Science, pp. 107–118, Springer, Berlin, Germany, 2011.
D. J. Sheskin, Handbook of Parametric and Nonparametric Statistical Procedures, Chapman & Hall/CRC, Boca Raton, Fla, USA, 2nd edition, 2000.