Abstract

The permutation flowshop scheduling problem (PFSP) is an important problem in the manufacturing industry. The objective of this study is to minimize the makespan, that is, the total time needed to complete the processing of all jobs. Although hybrid genetic algorithms are popular for solving the PFSP, their local search methods are easily trapped in local optima, which yields poorer solutions. This study proposes a new hybrid genetic algorithm for the PFSP which makes use of an extensive neighborhood search method. To evaluate its performance, the results of this study were compared against other state-of-the-art hybrid genetic algorithms. The comparisons showed that the proposed algorithm outperformed the other algorithms, and about 50% of the test instances achieved the known optimal solutions. The proposed algorithm is simple and easy to implement, and it can easily be extended to similar combinatorial optimization problems.

1. Introduction

The permutation flowshop scheduling problem (PFSP) [17] is a production scheduling problem that combines n jobs and m machines. Each job must be processed on all m machines. All jobs are processed in the same sequence on each machine, but with different processing times. Each machine can process only one job at a time, and the processing cannot be interrupted or preempted. Specifically, let p(i, j) denote the processing time of job i on Machine j, and let π = (π(1), π(2), ..., π(n)) denote a permutation of the jobs. The jobs are processed consecutively on Machine 1 in the order given by π, with no delays between the executions of consecutive jobs. The jobs are processed in the same order on Machine 2, but each job must wait until its processing on Machine 1 is finished before it can be started on Machine 2. Similarly, a job must wait to be processed on Machine k until it has finished processing on Machine k − 1.

Subject to the foregoing stipulations, the goal is to find a permutation π that minimizes the makespan C_max, that is, the time to complete the processing of the last job π(n) on the last Machine m. By the preceding definition, the PFSP has n! possible schedules. With an increase in the number of jobs and machines, the complexity of the PFSP grows exponentially. Garey et al. [8] have proven that the PFSP is NP-complete for m ≥ 3. Thus, in recent years, approximation methods have been widely used to solve the PFSP. One such class of methods is the metaheuristic algorithms [9]. Among metaheuristic algorithms, much research has focused on hybrid genetic algorithms, which combine genetic algorithms and local search methods. The performance of hybrid genetic algorithms, or memetic algorithms (MAs) [10], is affected by their local search methods. However, local search methods are easily trapped in local optima, which increases the difficulty of finding global optima.
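
For reference, the completion times implied by this definition can be written with the standard recurrence used throughout the PFSP literature, where C(π(i), j) denotes the completion time of the job in position i on Machine j (the symbol C is introduced here only for illustration):

C(π(1), 1) = p(π(1), 1),
C(π(i), 1) = C(π(i − 1), 1) + p(π(i), 1) for i = 2, ..., n,
C(π(1), j) = C(π(1), j − 1) + p(π(1), j) for j = 2, ..., m,
C(π(i), j) = max{C(π(i − 1), j), C(π(i), j − 1)} + p(π(i), j) for i = 2, ..., n and j = 2, ..., m,
C_max(π) = C(π(n), m).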

This study proposes a new hybrid genetic algorithm (GA_ENS algorithm) which combines genetic algorithm and extensive neighborhood search. The diversity of the local search is thus significantly increased. When the GA_ENS algorithm explores the neighborhood of the current solution, the neighborhood will be dynamically changed to prevent redundant searches in the same region. This increases the probability of finding the global optimum.

In this study, we compared the GA_ENS algorithm with two other state-of-the-art hybrid genetic algorithms for the PFSP: the HGA_RMA algorithm proposed by Ruiz et al. [11] and the NEGAVNS algorithm proposed by Zobolas et al. [12]. The two algorithms use different local search methods.

The remainder of this paper is organized as follows: Section 2 gives a brief overview of related algorithms for the PFSP. Section 3 describes the proposed new hybrid genetic algorithm in detail. Section 4 presents the experimental results and analyses. Finally, conclusions are presented in Section 5.

2. Related Algorithms for PFSP

In 1954, Johnson [13] proposed Johnson’s rule to solve the two-machine flowshop scheduling problem. As a result, much research has been focused on solving different flowshop scheduling problems. Currently, the m-machine (m ≥ 3) PFSP is the main trend of research.

Reza Hejazi and Saghafian [14] presented a complete review of flowshop scheduling problems with the makespan criterion, covering research from 1954 to 2005. They reviewed various production scheduling problems of flowshops, including the PFSP.

The PFSP is an NP-complete problem [8]. Therefore, approximation methods are applied to solve it. However, approximation methods cannot guarantee optimal solutions; they can only provide acceptable solutions within a reasonable time. Approximation methods can generally be classified into heuristic algorithms and metaheuristic algorithms [15].

The design core of heuristic algorithms is based on experience. In other words, the design of such algorithms relies heavily on an understanding of the problem and on accumulated experience. Heuristic algorithms can solve problems quickly and obtain reasonable solutions within a reasonable time. However, they cannot guarantee solutions of the same quality on different data. Their greatest drawback is that they cannot guarantee globally and consistently optimal solutions.

Heuristic algorithms can mainly be divided into two categories: constructive methods and improvement methods [15]. The constructive methods construct solutions from scratch according to some special rules and can provide solutions rather quickly. However, their solution quality is not guaranteed; examples include Palmer’s heuristic [16], the CDS heuristic [17], rapid access (RA) [18], and the NEH heuristic [19]. The NEH heuristic is regarded as the best of the constructive methods [15, 20]. NEH quickly builds a permutation with a low makespan: the n jobs are first sorted in decreasing order of their total processing times, and then, for k = 2, ..., n, the kth job of this ordering is inserted into the position of the current partial permutation that minimizes the partial makespan. Taillard [20] proposed an acceleration for evaluating these insertion moves, with which the whole NEH sequence can be constructed in O(n²m) time, and insertion-based neighborhoods built on this move are widely used for the PFSP. Current research has been focused on improving the NEH mechanisms, including hybrid designs [21, 22]. The improvement methods are also called local search methods. They can provide good solutions; however, they are time-consuming. The basic design idea of the improvement methods is to improve an initial solution by some specific rules in the expectation of obtaining better solutions; examples are RACS and RAES [18].
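
To make the NEH mechanism concrete, the following C++ sketch shows one straightforward (unaccelerated) implementation; it is an illustrative example under the stated sorting-and-insertion scheme, not the code of [19], and the helper makespan() re-evaluates each trial sequence naively instead of using Taillard’s acceleration.

#include <algorithm>
#include <climits>
#include <numeric>
#include <vector>

// p[i][j] = processing time of job i on machine j (jobs and machines 0-indexed).
int makespan(const std::vector<int>& seq, const std::vector<std::vector<int>>& p) {
    std::vector<int> span(p[0].size(), 0);          // completion time on each machine
    for (int job : seq) {
        span[0] += p[job][0];
        for (size_t j = 1; j < span.size(); ++j)
            span[j] = std::max(span[j], span[j - 1]) + p[job][j];
    }
    return span.back();
}

std::vector<int> neh(const std::vector<std::vector<int>>& p) {
    int n = static_cast<int>(p.size());
    std::vector<int> order(n);
    std::iota(order.begin(), order.end(), 0);
    // Sort jobs by decreasing total processing time.
    std::sort(order.begin(), order.end(), [&](int a, int b) {
        return std::accumulate(p[a].begin(), p[a].end(), 0) >
               std::accumulate(p[b].begin(), p[b].end(), 0);
    });
    std::vector<int> seq;                            // partial permutation
    for (int job : order) {
        int bestPos = 0, bestCmax = INT_MAX;
        for (int pos = 0; pos <= static_cast<int>(seq.size()); ++pos) {
            std::vector<int> trial = seq;
            trial.insert(trial.begin() + pos, job);
            int c = makespan(trial, p);              // naive partial-makespan evaluation
            if (c < bestCmax) { bestCmax = c; bestPos = pos; }
        }
        seq.insert(seq.begin() + bestPos, job);
    }
    return seq;
}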

In the last 20 years, much work has been done on a relatively new type of approximation method, the metaheuristic algorithms, which are also known as modern heuristics. These algorithms inject probabilistic concepts into the process of solving problems. Compared with heuristic algorithms, metaheuristic algorithms require more time to obtain solutions; however, they produce higher-quality solutions than heuristic algorithms [15].

In order to enhance the quality of the solutions of metaheuristic algorithms, the algorithms must be designed with a balance between diversification and intensification. Diversification refers to the capability of global search and of maintaining the population diversity; it can explore all areas of the search space. Intensification refers to the capability of local search; it can exploit the neighborhood of the current solution and find a local optimum.

Metaheuristic algorithms can be classified as trajectory methods and population-based methods according to the number of solutions used in the search process [9]. Trajectory methods generate a feasible solution in each iteration and attempt to find the best solution along the trajectory of the search space, such as simulated annealing (SA) [23–28], Tabu search (TS) [29–34], iterated local search (ILS) [9, 15], and variable neighborhood search (VNS) [9, 35]. Population-based methods generate a set of feasible solutions to perform parallel search in each generation and get the best solution after iterative evolution, such as genetic algorithms (GAs) [36–40], ant colony optimization (ACO), and particle swarm optimization (PSO).

3. A New Hybrid Genetic Algorithm

Traditional genetic algorithms utilize the population to execute a multiple-point search using the genetic operators. As a result, they have the capability of global search and of maintaining population diversity. In order to enhance the efficiency of the local search and obtain better-quality solutions, this study proposes a new hybrid genetic algorithm: the GA_ENS algorithm (Figure 1). The difference between the GA_ENS algorithm and traditional genetic algorithms is that the GA_ENS algorithm adds an operator of extensive neighborhood search, which enhances the force of intensification; we name this operator the “extensive neighborhood search operator” (ENS operator). In the following, we describe all operators of the GA_ENS algorithm.

3.1. Encoding

In accordance with the definition of the PFSP and the objective of this study, permutation encoding was used to design the chromosomes in the GA_ENS algorithm. Every chromosome consisted of n genes, where n is the number of jobs. The order of the genes of a chromosome was the processing sequence of all jobs; in other words, the jobs were processed according to the order of the chromosome encoding until all jobs had been processed.

We give an instance of the PFSP in Table 1: the number of jobs is 3 (n = 3) and the number of machines is also 3 (m = 3). Each number in Table 1 denotes the processing time of a job on a machine. Assume that a chromosome representation is (π(1), π(2), π(3)); the processing sequence of this chromosome is then “Job π(1) → Job π(2) → Job π(3)” according to the encoding of the GA_ENS algorithm.

The objective of this study is to find the minimum makespan (C_max); the makespan denotes the completion time of the last job, that is, the total time needed to process all jobs. Algorithm 1 is the pseudocode for the makespan calculation: n is the total number of jobs; m is the total number of machines; span[j] denotes the accumulated processing time on the jth machine (default value = 0); p(π(i), j) denotes the processing time of the job in the ith position of the chromosome on Machine j. The computed makespan of the above chromosome is 19 (see Table 2 and Figure 2).

 for i = 1 to n
  for j = 1 to m
   if j = 1 then
    span[j] = span[j] + p(π(i), j)
   else
    span[j] = max(span[j], span[j − 1]) + p(π(i), j)
   end if
  end for
 end for
Cmax = span[m]
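
For concreteness, a C++ counterpart of Algorithm 1 might look as follows; this is an illustrative sketch rather than the authors’ implementation, with jobs and machines 0-indexed, pi holding the chromosome (job indices), and p[i][j] holding the processing time of job i on Machine j.

#include <algorithm>
#include <vector>

// Computes Cmax of the permutation pi following Algorithm 1:
// span[j] accumulates the completion time on machine j.
int makespan(const std::vector<int>& pi, const std::vector<std::vector<int>>& p) {
    const int m = static_cast<int>(p[0].size());
    std::vector<int> span(m, 0);
    for (size_t i = 0; i < pi.size(); ++i) {
        for (int j = 0; j < m; ++j) {
            if (j == 0)
                span[j] = span[j] + p[pi[i]][j];
            else
                span[j] = std::max(span[j], span[j - 1]) + p[pi[i]][j];
        }
    }
    return span[m - 1];   // completion time of the last job on the last machine
}
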
3.2. Reproduction

The initial population for a test instance in the GA_ENS algorithm must be generated randomly before evolution. The population consists of multiple chromosomes according to the population size, which is a parameter. A chromosome is usually called an individual, and an individual is a solution of the PFSP, namely, a point in the search space.

The GA_ENS algorithm used multiple solutions to execute a multiple-point search of the search space and was thus expected to find high-quality solutions. First, the GA_ENS algorithm picked chromosomes and reproduced them into the mating pool. Second, it evolved these chromosomes through the crossover and mutation operators, which simulate biological evolution. Next, it utilized the ENS operator to improve the quality of the solutions. Finally, it utilized the selection operator to select better-quality chromosomes for the next evolutionary generation. Before the evolution of each generation, the total number of chromosomes, called the population size, was kept the same, and the best solution, which is an optimal schedule, was obtained through iterative evolution until a termination condition was reached.

In order to maintain the population diversity, the mechanism of pick at random was adopted in the reproduction operator of this study. In other words, multiple pairs of chromosomes (called parents) were selected and reproduced to the mating pool for breeding new chromosomes (called offspring) in the next stage.

The total number of reproductions in this stage was the population size multiplied by the crossover rate. After this stage, the successive evolution stages continued.
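
The following C++ fragment sketches one way this reproduction step could be implemented; the function name and the pairing convention are illustrative assumptions, not the authors’ code.

#include <random>
#include <vector>

using Chromosome = std::vector<int>;   // a permutation of job indices

// Pick population_size * crossover_rate chromosomes at random into the mating
// pool; consecutive entries are treated as parent pairs in the crossover stage.
std::vector<Chromosome> reproduce(const std::vector<Chromosome>& population,
                                  double crossoverRate, std::mt19937& rng) {
    std::uniform_int_distribution<int> pick(0, static_cast<int>(population.size()) - 1);
    int count = static_cast<int>(population.size() * crossoverRate);
    if (count % 2 != 0) ++count;       // keep whole pairs of parents
    std::vector<Chromosome> matingPool;
    for (int k = 0; k < count; ++k)
        matingPool.push_back(population[pick(rng)]);
    return matingPool;
}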

3.3. Crossover

The main duty of the crossover operator was to generate offspring from the parents in the mating pool by mating. The purpose of this stage was that the offspring obtain good genes from their parents by simulating biological evolution and obtain better-quality solutions through the interchange of information. The difference between offspring and parents can guide the search to different regions and thus expand the searched regions. This study used the two-point crossover (Figure 3) because Murata et al. [41] showed that this two-point crossover is effective for the PFSP.
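
As an illustration, the sketch below implements one common permutation-preserving variant of the two-point crossover: the segment between the two cut points is copied from the first parent, and the remaining positions are filled with the missing jobs in the order in which they appear in the second parent. The exact operator of Figure 3 may differ in detail; job indices are assumed to be 0, ..., n − 1.

#include <random>
#include <utility>
#include <vector>

using Chromosome = std::vector<int>;

Chromosome twoPointCrossover(const Chromosome& a, const Chromosome& b, std::mt19937& rng) {
    int n = static_cast<int>(a.size());
    std::uniform_int_distribution<int> cut(0, n - 1);
    int c1 = cut(rng), c2 = cut(rng);
    if (c1 > c2) std::swap(c1, c2);
    Chromosome child(n, -1);
    std::vector<bool> used(n, false);
    for (int i = c1; i <= c2; ++i) {   // copy the segment between the cut points from parent a
        child[i] = a[i];
        used[a[i]] = true;
    }
    int pos = 0;
    for (int job : b) {                // fill the remaining positions in parent b's order
        if (used[job]) continue;
        while (child[pos] != -1) ++pos;
        child[pos] = job;
    }
    return child;
}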

3.4. Mutation

The mutation operator mutated the genes of the selected chromosomes themselves in order to enhance the population diversity and search the entire search space more completely. If the mutation operator cooperates well with the crossover operator, premature convergence to a local optimum can be avoided and better-quality solutions can be obtained.

In addition, we need to set the mutation rate for deciding the total number of randomly selected chromosomes to undergo the mutation operator. For example, we assume the number of chromosomes after the crossover operator was performed is 15 and the mutation rate is 0.2; according to our mutation mechanism, there are 3 chromosomes (15 × 0.2 = 3) that will be randomly selected to perform the mutation operator. The mutation mechanism used in this study was the 3-position change mutation (see Figure 4).
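
The following sketch shows one plausible reading of the 3-position change mutation of Figure 4: three distinct positions are selected at random and their genes are shifted cyclically. The exact move illustrated in Figure 4 may differ.

#include <algorithm>
#include <numeric>
#include <random>
#include <vector>

using Chromosome = std::vector<int>;

void threePositionMutation(Chromosome& c, std::mt19937& rng) {
    int n = static_cast<int>(c.size());
    if (n < 3) return;
    std::vector<int> pos(n);
    std::iota(pos.begin(), pos.end(), 0);
    std::shuffle(pos.begin(), pos.end(), rng);
    int i = pos[0], j = pos[1], k = pos[2];   // three distinct random positions
    int tmp = c[k];                           // cyclic shift: gene i -> j -> k -> i
    c[k] = c[j];
    c[j] = c[i];
    c[i] = tmp;
}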

3.5. Extensive Neighborhood Search (ENS)

The design concepts of the framework of the local search in this study were inspired by variable neighborhood search (VNS) [9, 35] and the iterated greedy algorithm [6]. We integrated the above two algorithms and our own designs into the proposed local search method. The neighborhood searched by this method is dynamically changed with the perturbation of the current solution, so we named this method “extensive neighborhood search” (ENS).

The neighborhood structure must be defined efficiently to enhance the search efficiency. The Insertion-Move [20] was used to construct the neighborhood structures in this study.

The probability of local search was set to 1; namely, all offspring produced by the crossover and mutation operators must undergo the ENS operator. The steps of the ENS operator are described as follows (Algorithm 2 and Figure 5 show the procedure of the ENS operator).

function ENS(s)
 π_best = s
 for i = 1 to d
  π′ = Destruction_Construction(π_best)
  N(π′) = Insertion_Move(π′)
  π′′ = best neighbor in N(π′)
  if Cmax(π′′) < Cmax(π_best) then
   π_best = π′′
  end if
 end for
 if Cmax(π_best) < Cmax(s) then
  s = π_best
  call ENS(s)
 end if

Step 1. This is the initial step. Let the best solution π_best = the current solution s, and let the number of perturbations i = 1.

Step 2. Execute the shaking (perturbation) mechanism. The NEH heuristic [19] is used to reconstruct a new solution π′ (π_best is converted to π′) through the destruction and construction phases [6].

Step 3. Construct the neighborhood N(π′) of π′ by the Insertion-Move [20].

Step 4. Let π′′ be the best neighbor in the neighborhood N(π′), that is, the permutation in N(π′) with the lowest makespan.

Step 5. If C_max(π′′) < C_max(π_best), then π_best = π′′; namely, the best solution is updated when a better solution is found.

Step 6. Let i = i + 1 and go to Step 2 when i ≤ d (d is the maximum number of perturbations); otherwise, go to Step 7.

Step 7. If C_max(π_best) < C_max(s), then s = π_best and go to Step 1; otherwise, end the ENS operator.
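
Putting Steps 1–7 together, the following C++ sketch shows one possible realization of the ENS operator. The destruction–construction perturbation (here: remove dJobs random jobs and reinsert them NEH-style), the naive makespan-based evaluation of the insertion neighborhood, and the parameter name dJobs are illustrative assumptions; only the overall control flow follows the steps above.

#include <algorithm>
#include <climits>
#include <random>
#include <vector>

using Chromosome = std::vector<int>;

// Same span-array makespan computation as Algorithm 1 (Section 3.1).
int makespan(const Chromosome& s, const std::vector<std::vector<int>>& p) {
    std::vector<int> span(p[0].size(), 0);
    for (int job : s) {
        span[0] += p[job][0];
        for (size_t j = 1; j < span.size(); ++j)
            span[j] = std::max(span[j], span[j - 1]) + p[job][j];
    }
    return span.back();
}

// Insert `job` into the position of `seq` that minimizes the makespan (naive evaluation).
Chromosome bestInsertion(const Chromosome& seq, int job, const std::vector<std::vector<int>>& p) {
    Chromosome best;
    int bestC = INT_MAX;
    for (size_t pos = 0; pos <= seq.size(); ++pos) {
        Chromosome trial = seq;
        trial.insert(trial.begin() + pos, job);
        int c = makespan(trial, p);
        if (c < bestC) { bestC = c; best = trial; }
    }
    return best;
}

// Step 2: destruction-construction perturbation in the spirit of the iterated
// greedy algorithm [6]: remove dJobs random jobs and reinsert them NEH-style.
Chromosome perturb(Chromosome s, int dJobs, const std::vector<std::vector<int>>& p,
                   std::mt19937& rng) {
    Chromosome removed;
    for (int k = 0; k < dJobs && !s.empty(); ++k) {
        std::uniform_int_distribution<int> pick(0, static_cast<int>(s.size()) - 1);
        int i = pick(rng);
        removed.push_back(s[i]);
        s.erase(s.begin() + i);
    }
    for (int job : removed) s = bestInsertion(s, job, p);
    return s;
}

// Steps 3-4: best neighbor under the insertion-move neighborhood of s.
Chromosome bestNeighbor(const Chromosome& s, const std::vector<std::vector<int>>& p) {
    Chromosome best = s;
    int bestC = makespan(s, p);
    for (size_t i = 0; i < s.size(); ++i) {
        Chromosome partial = s;
        int job = partial[i];
        partial.erase(partial.begin() + i);
        Chromosome trial = bestInsertion(partial, job, p);
        int c = makespan(trial, p);
        if (c < bestC) { bestC = c; best = trial; }
    }
    return best;
}

// Steps 1-7: d is the maximum number of perturbations.
Chromosome ens(Chromosome s, const std::vector<std::vector<int>>& p,
               int d, int dJobs, std::mt19937& rng) {
    while (true) {
        Chromosome best = s;                                   // Step 1
        for (int i = 1; i <= d; ++i) {                         // Steps 2-6
            Chromosome shaken = perturb(best, dJobs, p, rng);  // Step 2
            Chromosome neighbor = bestNeighbor(shaken, p);     // Steps 3-4
            if (makespan(neighbor, p) < makespan(best, p))     // Step 5
                best = neighbor;
        }
        if (makespan(best, p) < makespan(s, p)) s = best;      // Step 7: improved, restart
        else return s;                                         // otherwise end the operator
    }
}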

3.6. Selection

After the crossover, mutation, and ENS operators were performed in each generation, a new population was selected into the next generation until a termination condition was reached; this is the task of the selection operator. The selection operator belongs to evolutionary selection, and its duty was to select the chromosomes for the next generation’s evolution.

The selection operator can guide the search in the correct direction so that the quality of the solutions can be continuously improved. In order to maintain the population diversity and promote the evolution, the selection mechanism of this study was the tournament selection.

In the tournament selection, each offspring chromosome was compared with a randomly selected parent chromosome, and the chromosome with the lower makespan (better quality) was selected into the next generation. This process was repeated until every offspring chromosome had taken part in the tournament selection.

For maintaining the population diversity and avoiding losing particular chromosomes which can guide the search to the correct direction, the acceptance criterion [24, 25] was applied to inferior offspring to decide whether to accept or reject them into the next generation.

The acceptance criterion (see (1)) proposed by Osman and Potts [27] for the PFSP was used in this study; an inferior offspring could be accepted into the next generation when condition (2) [6] is met:

Temperature = T × (Σ_{i=1}^{n} Σ_{j=1}^{m} p(i, j)) / (n × m × 10).  (1)

random ≤ exp(−(C_max(offspring) − C_max(parent)) / Temperature).  (2)

In (1), the parameter T is 0.4 as suggested by Ruiz and Stützle [6]; n is the total number of jobs; m is the total number of machines; p(i, j) denotes the processing time of job i on Machine j.

In (2), random is a random number drawn uniformly from [0, 1]; C_max(offspring) − C_max(parent) denotes the difference in makespan between the inferior offspring chromosome and the parent chromosome; Temperature is the value of (1).

Based on the above description, there are two situations in the tournament selection:
(1) An offspring chromosome is better than a parent chromosome: the offspring chromosome replaces the parent chromosome in the next generation.
(2) An offspring chromosome is worse than a parent chromosome: when (2) is met, the offspring chromosome will still replace the parent chromosome in the next generation.
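
A minimal sketch of this acceptance test is given below; it assumes that the makespans of the two compared chromosomes and the sum of all processing times are already available, and it uses the constant-temperature formula of (1) with T = 0.4 as suggested in [6].

#include <cmath>
#include <random>

// Returns true if the offspring replaces the parent in the next generation:
// a better (or equal) offspring is always accepted; a worse offspring is
// accepted with probability exp(-(difference) / Temperature), cf. (1) and (2).
bool acceptOffspring(int offspringCmax, int parentCmax,
                     long long totalProcessingTime,   // sum of all p(i, j)
                     int n, int m, std::mt19937& rng) {
    if (offspringCmax <= parentCmax) return true;
    const double T = 0.4;                             // value suggested by Ruiz and Stuetzle [6]
    const double temperature = T * static_cast<double>(totalProcessingTime) / (n * m * 10.0);
    std::uniform_real_distribution<double> uniform(0.0, 1.0);
    return uniform(rng) <= std::exp(-(offspringCmax - parentCmax) / temperature);
}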

4. Experimental Results and Analyses

The GA_ENS algorithm was implemented in Microsoft Visual C++, running on an Intel Pentium 2.5 GHz PC with 1 GB of main memory.

Test datasets of all the experiments in this study were taken from Taillard’s benchmark [42, 43]. It includes 12 sets of different PFSP problems, and each set is composed of 10 different instances. So Taillard’s benchmark has 120 different PFSP instances altogether.

All the experimental results in this study were normalized for fair comparison by converting the makespan values obtained by the GA_ENS algorithm into relative percentage deviation (RPD) values with the following equation:

RPD = ((C_alg − C_opt) / C_opt) × 100.  (3)

In (3), C_alg denotes the makespan value obtained by the GA_ENS algorithm; C_opt denotes the known optimal makespan value of Taillard’s benchmark [43]. According to the above equation, the quality of a solution is better when the solution has a lower RPD value.

An experiment on the termination condition was executed first: the test datasets were the 120 different PFSP instances of Taillard’s benchmark, and the total number of generations was 2000. Observing the experimental result (see Figure 6), the improvement in RPD is very small or absent after 500 generations; therefore, the termination condition of each follow-up experiment with the GA_ENS algorithm was set to 500 generations.

We proceeded to execute an experiment on the parameters, which included the population size, crossover rate, mutation rate, and maximum number of perturbations in the ENS operator. Several values of these parameters were set as follows based on related experimental experience:
(1) Population size: 10, 20, and 30.
(2) Crossover rate: 0.5, 0.6, 0.7, 0.8, 0.9, and 1.0.
(3) Mutation rate: 0.1, 0.2, 0.3, and 0.4.
(4) Maximum number of perturbations: 5, 10, and 15.

The 216 combinations (3 × 6 × 4 × 3 = 216) of the aforementioned parameters were tested with 9 instances of Taillard’s benchmark covering different combinations of n and m. The termination condition of this experiment was 500 generations, and each combination was executed 5 times. According to the experimental results, we obtained the best parameter combination, namely, the one with the lowest value of the average RPD:

Average RPD = (1/R) × Σ_{r=1}^{R} (((C_r − C_opt) / C_opt) × 100).  (4)

In (4), C_r denotes the makespan value obtained by the GA_ENS algorithm in the rth run; C_opt denotes the known optimal makespan value of Taillard’s benchmark [43]; R is the number of runs, which is 5 in this study.

Four best parameter values were obtained from this experiment, shown as follows:
(1) Population size: 30.
(2) Crossover rate: 0.8.
(3) Mutation rate: 0.2.
(4) Maximum number of perturbations: 15.

This completes the setting of all operators and parameters of the GA_ENS algorithm. The next step was to execute the main experiment of the GA_ENS algorithm (see Table 3). The makespan values from the GA_ENS algorithm were converted to the RPD in (3) and the average RPD in (4) for the comparison tests with the two selected algorithms [11, 12]. When a solution has a lower RPD value than other solutions, it is closer to the known optimal solution; this means that the quality of this solution is better than that of the other solutions.

In Table 3, n is the total number of jobs and m is the total number of machines. Table 3 presents the average experimental results for the 12 sets of different PFSP problems; each set is composed of 10 different instances, so there are 120 different PFSP instances in total from Taillard’s benchmark.

Table 4 presents the comparison of GA_ENS with other algorithms: the test datasets include the 12 sets of different PFSP problems from Taillard’s benchmark; n is the total number of jobs; m is the total number of machines. The main compared algorithms were described in Section 1 of this paper, namely, the HGA_RMA algorithm [11] and the NEGAVNS algorithm [12].

The “GA_ENS (without ENS)” entry in Table 4 denotes the GA_ENS algorithm without local search (i.e., local search probability = 0), so as to observe the contribution of the ENS operator. The values of GA_ENS (without ENS) are the average experimental results over 5 runs.

In Table 3, the average RPD over 5 runs of the GA_ENS algorithm is 0.268 and the standard deviation is about 0.009. According to these results, both the solution quality and the stability of the GA_ENS algorithm in solving the PFSP are good, so it is a stable and effective algorithm.

For several of the problem sets in Table 3, all the test instances achieved the known optimal solutions. In addition, some instances of the other problem sets also achieved the known optimal solutions, so about 50% of the test instances achieved the known optimal solutions.

In Table 4, first observe the results of the GA_ENS algorithm versus the GA_ENS (without ENS) algorithm: the former is far better than the latter (0.268 versus 4.773), which shows that the proposed ENS operator can substantially enhance the quality of the solutions. Next, compared with the other algorithms (the HGA_RMA algorithm [11] and the NEGAVNS algorithm [12]), the results show that the proposed GA_ENS algorithm outperforms the compared algorithms and can effectively improve the probability of finding the known optimal solutions.

5. Conclusions

This study proposes the GA_ENS algorithm for the PFSP, which combines two design strategies: the genetic algorithm (the kernel algorithm) and the extensive neighborhood search mechanism (namely, the ENS operator). The genetic algorithm is responsible for the diversification force, and the ENS operator controls the intensification force. They are equally important and complementary to each other. On Taillard’s benchmark, the experimental results show that about 50% of the test instances achieve the known optimal solutions, especially the small instances. For the remaining instances, the difference in makespan between the obtained solutions and the known optimal solutions is reduced to within 1%. For the aforementioned reasons, the GA_ENS algorithm can effectively solve the PFSP.

Competing Interests

The authors declare that they have no competing interests.