Abstract

This paper studies the flow shop scheduling problem of minimizing the makespan with release dates. By resequencing the jobs, a modified heuristic algorithm is obtained for handling large-sized problems. Moreover, based on several properties, a local search scheme is provided that improves the heuristic and yields high-quality solutions for moderate-sized problems. A sequence-independent lower bound is presented to evaluate the performance of the algorithms. A series of simulation results demonstrates the effectiveness of the proposed algorithms.

1. Introduction

In a flow shop scheduling model, each job must be processed on a set of machines in the same order. The goal is to determine the job sequence that optimizes a given objective function. At any given time, each machine can process at most one job, and each job can be handled by at most one machine. Moreover, no job may be preempted by another job. Flow shop scheduling problems arise widely in industrial production and mechanical manufacturing. For example, in a steel-making process, molten steel is cast into semifinished slabs by a continuous caster; after being heated in a heating furnace, the slabs are rolled into products in a rolling mill. This is a typical flow shop production model. Because most of these problems are strongly NP-hard, the global optimum cannot be obtained in polynomial time unless P = NP. The study of flow shop scheduling algorithms is therefore important for reducing running time and boosting productivity.

Since Johnson [1] presented the first scheduling rule for the two-machine flow shop problem with the objective of makespan (i.e., maximum completion time) minimization, many works have been devoted to this research area. A comprehensive survey of flow shop makespan problems up to 2010 can be found in Potts and Strusevich [2] or Bai and Ren [3]. More recent work is summarized as follows. A. Rudek and R. Rudek [4] proved the ordinary NP-hardness of the two-machine flow shop makespan problem when job processing times are described by nondecreasing position-dependent functions (aging effect) on at least one machine and showed its strong NP-hardness when job processing times vary on both machines. Aydilek and Allahverdi [5] presented a polynomial-time heuristic algorithm for the two-machine flow shop makespan problem with release dates. For minimizing the makespan in an m-machine flow shop with learning considerations, Chung and Tong [6] proposed a dominance theorem and a lower bound to accelerate a branch-and-bound algorithm for finding the optimal solution. For the makespan criterion in the flow shop model, Ying and Lin [7] proposed a high-performing constructive heuristic with an effective tie-breaking strategy to improve solution quality. Similarly, Gupta et al. [8] proposed an alternative heuristic algorithm, compared with the benchmark Palmer, CDS, and NEH algorithms, for the n-job, m-machine flow shop scheduling problem with makespan minimization. For job-related criteria, Bai [9] established the asymptotic optimality of shortest-processing-time-based algorithms for the flow shop problem of minimizing the total quadratic completion time with release dates, and Bai and Zhang [10] extended the results to a more general objective, the total power completion time.

In this paper, the flow shop scheduling problem of minimizing the makespan with release dates is addressed. In contrast to the static setting in which all jobs are simultaneously available, jobs arrive over time according to their release dates, which is closer to practical scheduling environments. Lenstra et al. [11] proved that the two-machine flow shop makespan problem with release dates is strongly NP-hard, which implies that its optimal solution cannot be found in polynomial time unless P = NP; a heuristic algorithm may therefore be more effective for obtaining approximate solutions to large-sized problems. Accordingly, a new modified GS algorithm (MGS), based on the algorithm of Gonzalez and Sahni [12], is presented for flow shop makespan minimization with release dates. An improved scheme is then provided to promote the performance of the MGS algorithm. Moreover, a sequence-independent lower bound for the problem is presented. Computational experiments reveal the performance of the MGS algorithm, the improved scheme, and the lower bound on problems of different sizes.

The remainder of this paper is organized as follows. The problem is formulated in Section 2, and the MGS algorithm and improved scheme are provided in Sections 3 and 4, respectively. The new lower bound and computational results are given in Section 5. This paper closes with the conclusion in Section 6.

2. Problem Statement

In a flow shop problem, a set of n jobs has to be processed on m different machines in the same order. Job $J_j$, $j = 1, \ldots, n$, is processed on machines $M_1, \ldots, M_m$ with a nonnegative processing time $p_{i,j}$ on machine $M_i$ and has a release date $r_j$, which is the earliest time at which the job is permitted to start processing. Each machine can process at most one job, and each job can be handled by at most one machine, at any given time. Each machine processes the jobs in a first-come, first-served manner. Permutation schedules are considered in this paper, and the intermediate storage between successive machines is unlimited. The completion time of job $J_j$, $j = 1, \ldots, n$, on machine $M_i$, $i = 1, \ldots, m$, is denoted by $C_{i,j}$. The goal is to determine a job sequence that minimizes the makespan, that is, $C_{\max} = \max_{1 \le j \le n} C_{m,j}$.
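For a fixed permutation, the makespan can be evaluated by the standard completion-time recursion: on the first machine a job starts no earlier than its release date and no earlier than the machine becomes idle, and on each subsequent machine it starts only after both its previous operation and the machine are free. The following Python sketch illustrates this evaluation; the data layout (a processing-time matrix indexed by machine and job, plus a list of release dates) and all identifiers are illustrative assumptions rather than notation taken from the paper.

```python
from typing import List, Sequence

def makespan(seq: Sequence[int], p: List[List[int]], r: List[int]) -> int:
    """Evaluate C_max of a permutation schedule with release dates.

    p[i][j] is the processing time of job j on machine i,
    r[j] is the release date of job j, and seq is a job permutation.
    """
    m = len(p)
    # C[i] holds the completion time of the last job scheduled so far on machine i.
    C = [0] * m
    for j in seq:
        # On the first machine, the job waits for its release date and for the machine.
        C[0] = max(C[0], r[j]) + p[0][j]
        for i in range(1, m):
            # On later machines, the job waits for its previous operation and for the machine.
            C[i] = max(C[i], C[i - 1]) + p[i][j]
    return C[-1]  # completion time of the last job on the last machine
```

For a hypothetical instance `p` and `r`, `makespan([2, 0, 1, 3], p, r)` returns the makespan of processing jobs 2, 0, 1, 3 in that order.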

3. The Modified GS Algorithm

Gonzalez and Sahni [12] presented the GS algorithm for the flow shop makespan problem. Based on this idea, a new heuristic, the modified GS (MGS) algorithm, is presented to deal with the flow shop makespan problem with release dates. A formal description of the MGS algorithm is given below; an illustrative code sketch follows Step 4.

3.1. The MGS Algorithm

Step 1. Divide the m machines into two-machine groups, following the GS algorithm [12].

Step 2. For each machine group, whenever its machine becomes idle or new jobs arrive, process the available jobs by Johnson’s rule (i.e., first schedule the jobs whose processing time on the first machine of the group does not exceed that on the second machine, in order of nondecreasing first-machine processing time, and then schedule the remaining jobs in order of nonincreasing second-machine processing time); if no job is available, go to Step 3.

Step 3. If all the jobs have been scheduled, go to Step 4; otherwise, wait until a new job arrives and return to Step 2.

Step 4. Terminate the construction and calculate the objective values of the schedules generated by the machine groups. Select the schedule with the minimum objective value as the final solution.
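To make the steps concrete, the sketch below gives one possible reading of the MGS algorithm in Python: the machines are paired into two-machine groups, each group induces a candidate sequence by dispatching the jobs that have already arrived according to Johnson’s rule, and the candidate with the smallest makespan on the full m-machine shop (evaluated with the `makespan` routine sketched in Section 2) is returned. The pairing of consecutive machines, the single-machine dispatching simulation inside each group, and all identifiers are assumptions made for illustration, not details fixed by the paper.

```python
def johnson_order(jobs, a, b):
    """Johnson's rule restricted to `jobs`: jobs with a[j] <= b[j] first,
    by nondecreasing a[j]; the remaining jobs afterwards, by nonincreasing b[j]."""
    first = sorted((j for j in jobs if a[j] <= b[j]), key=lambda j: a[j])
    second = sorted((j for j in jobs if a[j] > b[j]), key=lambda j: -b[j])
    return first + second

def mgs(p, r):
    """Sketch of the MGS heuristic for the flow shop makespan problem with release dates."""
    m, n = len(p), len(p[0])
    # Step 1 (assumed grouping): pair consecutive machines; a leftover machine
    # forms a group on its own and is paired with itself.
    groups = [(i, min(i + 1, m - 1)) for i in range(0, m, 2)]
    best_seq, best_val = None, float("inf")
    for i1, i2 in groups:
        a, b = p[i1], p[i2]
        t, sequenced, remaining = 0, [], set(range(n))
        while remaining:
            available = [j for j in remaining if r[j] <= t]
            if not available:
                # Step 3: no job is available, so wait for the next arrival.
                t = min(r[j] for j in remaining)
                continue
            # Step 2: among the available jobs, pick the next one by Johnson's rule.
            j = johnson_order(available, a, b)[0]
            sequenced.append(j)
            remaining.remove(j)
            t += a[j]  # the first machine of the group becomes idle again
        # Step 4: evaluate the candidate sequence on the full shop and keep the best.
        val = makespan(sequenced, p, r)
        if val < best_val:
            best_seq, best_val = sequenced, val
    return best_seq, best_val
```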

The flowchart of the algorithm is shown in Figure 1. An example is given to illustrate the execution of the MGS algorithm.

Example 1. A flow shop scheduling problem involves three machines, M1, M2, and M3, and four jobs, J1, J2, J3, and J4, with release dates. The release dates and processing times of the jobs are listed in the table below. The objective is to minimize the makespan.

The final sequence produced by the MGS algorithm and the corresponding objective value are illustrated in Figure 2, which shows the scheduling process.

4. The Improved Scheme

To further improve the quality of the solution for medium-scale problems, some properties of the two-machine flow shop makespan problem with release dates are presented as follows.

Property 1. For the two-machine flow shop makespan problem with release dates, if two adjacent jobs $J_u$ and $J_v$ satisfy conditions (i), (ii), and (iii), then it is optimal to schedule job $J_u$ before job $J_v$.

Proof. Consider the two candidate sequences in which the pair is processed in the two possible orders, and denote the completion times of the two jobs in each sequence accordingly. Writing out the makespan of each sequence explicitly, subtracting the two expressions, and invoking conditions (i)-(iii) shows that the sequence with job $J_u$ before job $J_v$ has a makespan no larger than that of the alternative, which establishes its optimality.

Property 2. For the two-machine flow shop makespan problem with release dates, if two adjacent jobs $J_u$ and $J_v$ satisfy a second set of conditions (i), (ii), and (iii), then it is optimal to schedule job $J_u$ before job $J_v$.

Proof. The argument parallels the proof of Property 1: the makespans of the two candidate orders of the pair are written explicitly, their difference is evaluated, and conditions (i)-(iii) imply that scheduling job $J_u$ before job $J_v$ yields a makespan no larger than that of the reverse order.

Property 3. For the two-machine flow shop makespan problem with release dates, if two adjacent jobs $J_u$ and $J_v$ satisfy a third set of conditions (i), (ii), and (iii), then it is optimal to schedule job $J_u$ before job $J_v$.

Proof. As in the proof of Property 1, writing out and subtracting the makespans of the two candidate orders and applying conditions (i)-(iii) shows that scheduling job $J_u$ before job $J_v$ is optimal.

Property 4. For the two-machine flow shop makespan problem with release dates, if two adjacent jobs $J_u$ and $J_v$ satisfy a fourth set of conditions (i), (ii), and (iii), then it is optimal to schedule job $J_u$ before job $J_v$.

Proof. The argument is the same as in the proof of Property 1: the makespans of the two candidate orders are compared, and conditions (i)-(iii) imply that scheduling job $J_u$ before job $J_v$ yields the smaller makespan.

With these properties, an improved scheme is provided to improve the original solution obtained by the MGS algorithm. In the formal expression of the improved scheme, the following quantities are used: the job occupying the ith position of the original sequence; the number of comparisons in which the current job is compared, position by position, with a given job of a seed sequence; the number of comparisons remaining from the current job to the last job; the number of groups; and the sequence set generated by exchanging the job in the ith position with each different job of a seed sequence. The formal expression can be summarized as follows; a code sketch of the scheme’s control flow is given after Step 6.

4.1. Improved Scheme

Step 1. Generate the initial sequence with the MGS algorithm and calculate its objective value.

Step 2. Initialize the position index and the comparison counters.

Step 3. Compare the current job forward with the next several jobs in the sequence (if fewer jobs remain, compare with all remaining jobs). If the two jobs satisfy requirement (1) and, in addition, one of the conditions (i)-(iv) corresponding to Properties 1-4, then exchange the two jobs, calculate the objective value, and proceed to Step 4.

Step 4. If the objective value obtained in Step 3 is smaller than the current best value, update the best value and take the corresponding sequence as the new incumbent. If comparisons remain for the current job, return to Step 3; otherwise, proceed to Step 5.

Step 5. If unexamined jobs remain in the sequence, advance to the next job and return to Step 3; otherwise, proceed to Step 6.

Step 6. The incumbent sequence is the final schedule. Stop.
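The control flow of the improved scheme can be pictured as a bounded pairwise-exchange local search seeded by the MGS solution. Because the exchange conditions of Step 3 depend on the detailed inequalities of Properties 1-4, the Python sketch below abstracts them into a caller-supplied predicate `exchange_allowed`; the parameter `horizon`, the default predicate that permits every exchange, and all identifiers are illustrative assumptions, and the sketch reuses the `mgs` and `makespan` routines from the earlier sketches.

```python
def improved_scheme(p, r, horizon=3, exchange_allowed=None):
    """Pairwise-exchange local search seeded by the MGS solution (sketch only).

    `horizon` bounds how many positions ahead each job is compared, and
    `exchange_allowed(u, v)` stands in for the property-based conditions
    (i)-(iv) of Step 3; by default every exchange is tried.
    """
    if exchange_allowed is None:
        exchange_allowed = lambda u, v: True
    seq, best = mgs(p, r)                  # Step 1: initial sequence and objective value
    improved = True
    while improved:                        # repeat Steps 3-5 until no further improvement
        improved = False
        for i in range(len(seq) - 1):
            for k in range(i + 1, min(i + 1 + horizon, len(seq))):
                if not exchange_allowed(seq[i], seq[k]):
                    continue
                cand = list(seq)
                cand[i], cand[k] = cand[k], cand[i]
                val = makespan(cand, p, r)
                if val < best:             # Step 4: accept only improving exchanges
                    seq, best = cand, val
                    improved = True
    return seq, best                       # Step 6: the incumbent sequence is final
```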

5. Computational Results

In this section, a series of computational experiments is designed to reveal the performance of the proposed algorithm and improved scheme on problems of different sizes. For each combination of parameters, ten random tests were performed, and the averages are reported. All algorithms were coded in MATLAB R2012a and run on a PC with an Intel Core i7-2600 CPU (3.4 GHz, four cores) and 4 GB of RAM. The processing times of the jobs were randomly generated from a discrete uniform distribution and from a discrete normal distribution with expectation 5.5 and variance 1.72.

To evaluate the performance of an algorithm for this problem, Bai et al. [13] presented a lower bound.

However, the value of this lower bound may sometimes be larger than the optimal value because the bound is sequence-dependent. Consider the following example.

Example 2. A two-machine flow shop scheduling problem involves two jobs with release dates. The release dates and processing times of the jobs are listed below; the objective is the makespan. Computing the lower bound of [13] for one job sequence and comparing it with the makespan of the optimal schedule shows that the lower bound value exceeds the optimal value.

To guarantee that the lower bound value never exceeds the optimal value, a new lower bound is provided.

The last two terms of the new lower bound guarantee that it is independent of the job sequence.

Theorem 3. Let the processing times be independent random variables having the same continuous distribution with a bounded density on a finite interval. Then, for every fixed number of machines, the following asymptotic relation holds with probability one as the number of jobs tends to infinity.

Proof. Without loss of generality, relation (28) holds. Dividing both sides of (28) and taking the limit yields (29). Bai et al. [13] proved the limit result (30). Combining (29) and (30) yields the result of Theorem 3.

Recomputing Example 2 with the new lower bound shows that its value no longer exceeds the optimal makespan.

5.1. Tests for the MGS Algorithm

Several numerical tests are conducted to reveal the effectiveness of the MGS algorithm. Instances with three, five, and ten machines and with 50, 100, 200, 500, and 1000 jobs are tested. The release dates are drawn from a discrete uniform distribution whose range depends on the number of jobs n and on a multiplier taking the values 1, 2, 5, and 8. The DSJF heuristic algorithm presented by Bai and Tang [14] is used as a reference for comparison. First, in Tables 1 and 2, we compare the performance of the MGS algorithm and the DSJF heuristic by means of the mean relative percentage of the MGS objective value with respect to the DSJF objective value.
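The exact distribution ranges and the precise formula of the mean relative percentage are not reproduced here, so the following Python sketch should be read only as a hedged illustration of the experimental setup: processing times are drawn from a discrete uniform distribution with an assumed upper bound, release dates from a discrete uniform range proportional to the multiplier times the number of jobs, and the comparison metric is taken as a standard relative gap. All bounds, identifiers, and the metric formula are assumptions rather than values taken from the paper.

```python
import random

def random_instance(n, m, c, p_high=10):
    """Generate a random test instance (assumed bounds, for illustration only).

    Processing times are drawn from the discrete uniform distribution on
    {1, ..., p_high} and release dates from {0, ..., c * n}, where c is the
    multiplier; the exact ranges used in the paper are not reproduced here.
    """
    p = [[random.randint(1, p_high) for _ in range(n)] for _ in range(m)]
    r = [random.randint(0, int(c * n)) for _ in range(n)]
    return p, r

def relative_percentage(f_a, f_b):
    """A standard relative gap, (f_a - f_b) / f_b * 100, in percent."""
    return (f_a - f_b) / f_b * 100.0

# Illustrative use: average the gap between the MGS objective value and a
# reference value (e.g., the DSJF objective or a lower bound) over ten instances.
```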

The data in Tables 1 and 2 show that the relative performance of the two algorithms depends on the multiplier. To further determine the dominance relation between the two algorithms, Tables 3 and 4 report, as a percentage, the number of instances in which the DSJF heuristic dominates the MGS algorithm.

The results of Tables 3 and 4 indicate that the DSJF heuristic completely dominates the MGS algorithm for one range of the multiplier, while the opposite holds for another. Therefore, in these two cases, it is more practical to obtain the near-optimal solution directly with whichever of the two algorithms dominates. To demonstrate the asymptotic optimality of the MGS algorithm, we compare its objective value with the corresponding value of the new lower bound, employing the mean relative percentage of the MGS objective value with respect to the lower bound.

The data in Tables 5 and 6 indicate the asymptotic optimality of the MGS algorithm. In contrast, for a fixed number of jobs, the ratios grow as the number of machines increases from 3 to 10. A likely cause is that a larger number of machines induces more idle time, which enlarges the gap between the objective value and its lower bound.

5.2. Tests for the Improved Scheme

We compare the effectiveness of the improved scheme (IS) with that of the DSJF heuristic and of the MGS algorithm. In Tables 7 and 8, the mean relative percentage between the objective value of the DSJF heuristic or the MGS algorithm and the objective value of the IS is employed. Instances with three, five, and ten machines and with 20 and 50 jobs are tested. The multiplier takes the values 0.1, 0.04 (for 50 jobs), and 0.05 (for 20 jobs). The processing times of the jobs were randomly generated from a discrete uniform distribution.

The results indicate that the improved scheme effectively improves the performance of the algorithms for moderate-sized problems. As the problem scale and the range of the release dates grow, the improvement weakens and the running time lengthens. Therefore, for large-scale problems, it is more practical to obtain a near-optimal solution directly with the MGS algorithm.

6. Conclusions

This paper presented a modified heuristic, the MGS algorithm, for the flow shop makespan problem with release dates, together with an improved scheme that raises the quality of the original solution. To evaluate the algorithms numerically, a new sequence-independent lower bound was provided for the problem. The computational results demonstrate the dominance of the MGS algorithm and the effectiveness of the improved scheme.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is partially supported by the National Natural Science Foundation of China (61473073, 61104074, and 61203329), Fundamental Research Funds for the Central Universities (N130417006, N110417005), Science and Technology Plan Project of Shenyang City (F11-264-1-63), and Program for Liaoning Excellent Talents in University (LJQ2014028).