Discrete Dynamics in Nature and Society

Special Issue: Recent Advances in Scheduling and Its Applications

Research Article | Open Access

Tao Ren, Meiting Guo, Lin Lin, Yunhui Miao, "A Local Search Algorithm for the Flow Shop Scheduling Problem with Release Dates", Discrete Dynamics in Nature and Society, vol. 2015, Article ID 320140, 8 pages, 2015. https://doi.org/10.1155/2015/320140

A Local Search Algorithm for the Flow Shop Scheduling Problem with Release Dates

Academic Editor: Baoqiang Fan
Received: 25 Aug 2014
Accepted: 04 Nov 2014
Published: 19 Mar 2015

Abstract

This paper discusses the flow shop scheduling problem of minimizing the makespan with release dates. By resequencing the jobs, a modified heuristic algorithm is obtained for handling large-sized problems. Moreover, based on several structural properties, a local search scheme is provided that improves the heuristic and yields high-quality solutions for moderate-sized problems. A sequence-independent lower bound is presented to evaluate the performance of the algorithms. A series of simulation results demonstrates the effectiveness of the proposed algorithms.

1. Introduction

In a flow shop scheduling model, each job must be processed on a set of machines in identical order. The goal is to determine the job sequence that optimizes a predetermined objective function. At any given time, each machine can process at most one job and each job can be handled by at most one machine; moreover, no job may be preempted by another. Flow shop scheduling problems arise widely in industrial production and mechanical manufacturing. For example, in a steel-making process, molten steel is cast into semifinished slabs by a continuous caster; after being heated in a furnace, the slabs are rolled into products in a rolling mill. This is a typical flow shop production model. Since most of these problems are strongly NP-hard, no polynomial-time algorithm can find the global optimum unless P = NP. The study of flow shop scheduling algorithms is therefore important for reducing running time and boosting productivity.

Since the first scheduling rule was presented by Johnson [1] for the two-machine flow shop problem with the objective of makespan (i.e., maximum completion time) minimization, much work has been devoted to this research area. A comprehensive survey of flow shop makespan problems up to 2010 can be found in Potts and Strusevich [2] or Bai and Ren [3]. More recent work includes the following. A. Rudek and R. Rudek [4] proved the ordinary NP-hardness of the two-machine flow shop makespan problem when job processing times are described by nondecreasing position-dependent functions (aging effect) on at least one machine, and established strong NP-hardness when job processing times vary on both machines. Aydilek and Allahverdi [5] presented a polynomial-time heuristic algorithm for the two-machine flow shop makespan problem with release dates. For minimizing the makespan in an m-machine flow shop with learning considerations, Chung and Tong [6] proposed a dominance theorem and a lower bound to accelerate a branch-and-bound algorithm that seeks the optimal solution. For the makespan criterion in the flow shop model, a high-performing constructive heuristic with an effective tie-breaking strategy was proposed by Ying and Lin [7] to improve solution quality. Similarly, Gupta et al. [8] proposed an alternative heuristic algorithm, compared against the benchmark Palmer, CDS, and NEH algorithms, for the n-job, m-machine flow shop scheduling problem with makespan minimization. For job-related criteria, Bai [9] established the asymptotic optimality of shortest-processing-time-based algorithms for the flow shop problem of minimizing total quadratic completion time with release dates, and Bai and Zhang [10] extended these results to the more general total power-of-completion-time objective.

In this paper, the flow shop scheduling problem of minimizing the makespan with release dates is addressed. In contrast to the static setting in which all jobs are simultaneously available, jobs arrive over time according to their release dates, which more closely matches practical scheduling environments. Lenstra et al. [11] proved that the two-machine flow shop makespan problem with release dates is strongly NP-hard, so the optimal solution cannot be found in polynomial time unless P = NP, and a heuristic algorithm is more practical for obtaining approximate solutions to large-sized problems. Therefore, a new modified GS algorithm (MGS), based on the algorithm of Gonzalez and Sahni [12], is presented for the flow shop makespan problem with release dates. An improved scheme is then provided to boost the performance of the MGS algorithm. Moreover, a sequence-independent lower bound for the problem is presented. Computational experiments reveal the performance of the MGS algorithm, the improved scheme, and the lower bound on problems of different sizes.

The remainder of this paper is organized as follows. The problem is formulated in Section 2, and the MGS algorithm and improved scheme are provided in Sections 3 and 4, respectively. The new lower bound and computational results are given in Section 5. This paper closes with the conclusion in Section 6.

2. Problem Statement

In a flow shop problem, a set of n jobs has to be processed on m different machines in the same order. Job J_j, j = 1, …, n, is processed on machines M_1, …, M_m, with a nonnegative processing time p_ij on machine M_i and a release date r_j, the earliest time at which the job is permitted to start processing. Each machine can process at most one job, and each job can be handled by at most one machine, at any given time. Each machine processes the jobs in a first-come, first-served manner. Permutation schedules are considered in this paper, and the intermediate storage between successive machines is unlimited. The completion time of job J_j on machine M_i is denoted by C_ij. The goal is to determine a job sequence that minimizes the makespan, that is, C_max = max_{1≤j≤n} C_mj.
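The completion times defined above satisfy a simple recursion: a job starts on the first machine no earlier than its release date, and on each later machine no earlier than both its own completion on the previous machine and that machine's previous completion. A minimal Python sketch of evaluating a permutation schedule, assuming p[i][j] holds the processing time of job j on machine i (an indexing convention chosen here, not the paper's notation):

```python
def makespan(seq, p, r):
    """Makespan of a permutation schedule in an m-machine flow shop
    with release dates.

    seq: job indices in processing order
    p[i][j]: processing time of job j on machine i
    r[j]: release date of job j
    """
    m = len(p)
    C = [0] * m  # completion time of the last scheduled job on each machine
    for j in seq:
        # first machine: wait for the release date and the machine itself
        C[0] = max(C[0], r[j]) + p[0][j]
        for i in range(1, m):
            # later machines: wait for this job upstream and for the machine
            C[i] = max(C[i], C[i - 1]) + p[i][j]
    return C[-1]
```

Such an evaluator is all that is needed to compare the objective values of candidate sequences in the algorithms that follow.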

3. The Modified GS Algorithm

Gonzalez and Sahni [12] presented the GS algorithm for the flow shop makespan problem. Building on this idea, a new heuristic, the modified GS (MGS) algorithm, is presented to handle the flow shop makespan problem with release dates. A formal statement of the MGS algorithm follows.

3.1. The MGS Algorithm

Step 1. Divide the machines into groups.

Step 2. For each machine group, whenever a machine becomes idle or new jobs arrive, sequence the available jobs by Johnson's rule (i.e., first schedule the jobs whose processing time on the first machine of the group does not exceed that on the second machine, in nondecreasing order of the first-machine time, and then schedule the remaining jobs in nonincreasing order of the second-machine time); if no job is available, go to Step 3.

Step 3. Wait until a job arrives and go to Step 2. If all the jobs have been scheduled, go to Step 4.

Step 4. Terminate the program and calculate the objective values of the generated schedules. Select the minimum one as the final solution.
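Johnson's rule in Step 2 can be sketched as follows. This is an illustrative Python version for one two-machine group; the tuple layout (job id, first-machine time, second-machine time) is an assumption made here for clarity, not the paper's notation:

```python
def johnson_order(jobs):
    """Order jobs for a two-machine flow shop by Johnson's rule.

    jobs: list of (job_id, a, b), where a and b are the processing
    times on the group's first and second machine, respectively.
    """
    # jobs with a <= b go first, in nondecreasing order of a
    first = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    # the remaining jobs go last, in nonincreasing order of b
    second = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])
    return first + second
```

For example, jobs (1, 3, 6), (2, 7, 2), and (3, 4, 4) are ordered 1, 3, 2: jobs 1 and 3 satisfy a ≤ b and are sorted by a, while job 2 is placed last.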

The flowchart of the algorithm is shown in Figure 1. An example illustrates the execution of the MGS algorithm.

Example 1. A flow shop scheduling problem involves three machines, M_1, M_2, and M_3, and four jobs, J_1, J_2, J_3, and J_4, with release dates. The release dates and processing times of the jobs are listed below; the objective is to minimize the makespan.

The final sequence of the MGS algorithm and the corresponding objective value follow from the schedule; the scheduling process is shown in Figure 2.

4. The Improved Scheme

To further improve the quality of the solution for medium-scale problems, some properties of the two-machine flow shop makespan problem with release dates are presented as follows.

Property 1. For problem F2 | r_j | C_max, if two adjacent jobs J_u and J_v satisfy conditions (i), (ii), and (iii), then the optimal sequence schedules job J_u before job J_v.

Proof. Let C(uv) and C(vu) denote the completion times of the two jobs in the sequences (J_u, J_v) and (J_v, J_u), respectively. Writing out the completion time for sequence (J_u, J_v) and for sequence (J_v, J_u) and subtracting, conditions (i)–(iii) give C(uv) ≤ C(vu). This establishes the optimality of sequence (J_u, J_v).

Property 2. For problem F2 | r_j | C_max, if two adjacent jobs J_u and J_v satisfy a second set of conditions (i), (ii), and (iii), then the optimal sequence schedules job J_u before job J_v.

Proof. The argument parallels the proof of Property 1: subtracting the completion-time expressions of the two sequences and applying conditions (i)–(iii) yields C(uv) ≤ C(vu).

Property 3. For problem F2 | r_j | C_max, if two adjacent jobs J_u and J_v satisfy a third set of conditions (i), (ii), and (iii), then the optimal sequence schedules job J_u before job J_v.

Proof. The argument parallels the proof of Property 1.

Property 4. For problem F2 | r_j | C_max, if two adjacent jobs J_u and J_v satisfy a fourth set of conditions (i), (ii), and (iii), then the optimal sequence schedules job J_u before job J_v.

Proof. The argument parallels the proof of Property 1.

With these properties, an improved scheme is provided to refine the original solution obtained by the MGS algorithm. In the formal expression of the improved scheme, the following notation is used: the job found in the i-th position of the original sequence; the number of comparisons in which a job is sequentially compared forward with subsequent jobs in a seed sequence; the number of comparisons from the current job to the last job; the number of groups; and the sequence set generated by exchanging the i-th job of the original sequence with each different job in a seed sequence. The formal expression can be summarized as follows.

4.1. Improved Scheme

Step 1. Generate the initial sequence with the MGS algorithm and calculate its objective value.

Step 2. Initialize the position index and the comparison counters.

Step 3. Compare the current job forward with the following jobs in the sequence. If the two compared jobs satisfy (1) the required precondition and (2) one of conditions (i)–(iv), then exchange the jobs, calculate the objective value, and proceed to Step 4.

Step 4. If the objective value obtained in Step 3 is smaller than the current best value, update the best value and the corresponding sequence. If comparisons remain for the current job, return to Step 3; otherwise, proceed to Step 5.

Step 5. If unexamined jobs remain, advance to the next job and return to Step 3; otherwise, proceed to Step 6.

Step 6. The current best sequence is the final schedule. Stop.
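At its core, the improved scheme is a pairwise-interchange local search. The sketch below substitutes a plain first-improvement rule that accepts any swap lowering the objective, so it illustrates the control flow of Steps 1–6 rather than the paper's exact exchange conditions, which are specific to Properties 1–4:

```python
def local_search(seq, cost):
    """First-improvement pairwise-interchange local search.

    seq: initial job sequence (e.g., produced by the MGS algorithm)
    cost: callable mapping a sequence to its objective value
          (e.g., the makespan)
    """
    best = list(seq)
    best_cost = cost(best)
    improved = True
    while improved:
        improved = False
        for i in range(len(best) - 1):
            for j in range(i + 1, len(best)):
                cand = list(best)
                cand[i], cand[j] = cand[j], cand[i]  # try one exchange
                c = cost(cand)
                if c < best_cost:  # accept any improving swap
                    best, best_cost = cand, c
                    improved = True
    return best, best_cost
```

In the paper's scheme, the acceptance test would additionally check the conditions of Properties 1–4 before evaluating the objective, which prunes unpromising exchanges.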

5. Computational Results

In this section, a series of computational experiments is designed to reveal the performance of the proposed algorithm and improved scheme on problems of different sizes. Ten random tests were performed for each combination of the parameters, and the averages are reported. All the algorithms were coded in MATLAB R2012a and run on a PC with an Intel Core i7-2600 CPU (3.4 GHz, 4 cores) and 4 GB RAM. The processing times of the jobs were randomly generated from a discrete uniform distribution and from a discrete normal distribution with expectation 5.5 and variance 1.72.

To evaluate the performance of an algorithm for this problem, Bai et al. [13] presented a lower bound:

However, the value of this bound may sometimes exceed the optimal value because it is sequence-dependent. Consider the following example.

Example 2. A two-machine flow shop scheduling problem involves two jobs with release dates. The release dates and processing times of the jobs are listed below; the objective is the makespan. Calculating the bound of [13] for one of the two possible sequences and comparing it with the optimal schedule, which is the other sequence, shows that the bound value exceeds the optimal value.

To guarantee that the lower bound value does not exceed the optimal value, a new lower bound is provided:

Obviously, the last two terms in the new bound guarantee that it is independent of the job sequence.
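As an illustration of how a sequence-independent bound of this kind is computed, the sketch below implements a classical machine-based bound: for each machine, the earliest possible start on it, plus its total workload, plus the smallest unavoidable tail of downstream work. This is an assumption for illustration, not necessarily the bound derived in the paper:

```python
def machine_lower_bound(p, r):
    """Classical machine-based lower bound on the makespan of a flow
    shop with release dates (illustrative sketch).

    p[i][j]: processing time of job j on machine i
    r[j]: release date of job j
    """
    m, n = len(p), len(p[0])
    lb = 0
    for i in range(m):
        # earliest time any job can reach machine i
        head = min(r[j] + sum(p[k][j] for k in range(i)) for j in range(n))
        # machine i must process every job
        work = sum(p[i])
        # after machine i finishes, some job still has downstream work
        tail = min(sum(p[k][j] for k in range(i + 1, m)) for j in range(n))
        lb = max(lb, head + work + tail)
    return lb
```

Because head, work, and tail involve only sums and minima over the job set, the value is independent of the job sequence, which is the property the paper's new bound is designed to have.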

Theorem 3. Let the processing times p_ij be independent random variables having the same continuous distribution with bounded density. Then, with probability one, the following asymptotic relation holds:

Proof. Without loss of generality, start from (28). Dividing both sides of (28) and taking the limit gives (29). Bai et al. [13] proved (30). Combining (29) and (30) yields the result of Theorem 3.

Applying the new lower bound to Example 2 gives:

5.1. Tests for the MGS Algorithm

Several numerical tests were conducted to reveal the effectiveness of the MGS algorithm. Three, five, and ten machines and 50, 100, 200, 500, and 1000 jobs were tested. The release dates are drawn from a discrete uniform distribution whose range grows with the number of jobs and a multiplier taking the values 1, 2, 5, and 8. The DSJF heuristic algorithm of Bai and Tang [14] is used as a reference for comparison. First, in Tables 1 and 2, the MGS algorithm and the DSJF heuristic are compared by the mean relative percentage gap between the objective value of the MGS algorithm and that of the DSJF heuristic.
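The comparison metric can be sketched as follows; a common definition of the mean relative percentage gap, 100 · mean((Z_A − Z_B) / Z_B), is assumed here for illustration:

```python
def mean_relative_percentage(a_values, b_values):
    """Mean relative percentage gap between two algorithms' objective
    values over a set of instances: 100 * mean((A - B) / B).

    a_values: objective values of the algorithm under test (e.g., MGS)
    b_values: objective values of the reference (e.g., DSJF)
    """
    assert len(a_values) == len(b_values)
    gaps = [(a - b) / b for a, b in zip(a_values, b_values)]
    return 100.0 * sum(gaps) / len(gaps)
```

A negative value means the algorithm under test produced smaller objective values on average than the reference.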


[Tables 1 and 2: mean relative percentage gaps between the MGS algorithm and the DSJF heuristic for m = 3, 5, and 10 machines; 50, 100, 200, 500, and 1000 jobs; and multiplier values 1, 2, 5, and 8.]

The data in Tables 1 and 2 show that the relative performance of the two algorithms depends on the multiplier. To further determine dominance between the two algorithms, Tables 3 and 4 report experiments using the percentage of instances in which the DSJF heuristic dominates the MGS algorithm.


[Tables 3 and 4: percentages of instances in which the DSJF heuristic dominates the MGS algorithm, for m = 3, 5, and 10 machines; 50, 100, 200, 500, and 1000 jobs; and multiplier values 1, 2, 5, and 8.]

The results of Tables 3 and 4 indicate that the DSJF heuristic completely dominates the MGS algorithm in one range of the multiplier, with the opposite holding in the other range. In these two cases, directly obtaining the near-optimal solution with the appropriate algorithm is more practical. To demonstrate the asymptotic optimality of the MGS algorithm, its objective value is compared with the associated value of the new lower bound, using the mean relative percentage gap between the objective value of the MGS algorithm and the lower bound value.

The data in Tables 5 and 6 indicate the asymptotic optimality of the MGS algorithm. In contrast, for a fixed number of jobs, the ratios grow as the number of machines increases from 3 to 10. A likely cause is that a larger number of machines induces more idle time, which enlarges the gap between the objective value and its lower bound.


[Tables 5 and 6: mean relative percentage gaps between the MGS objective value and the lower bound for m = 3, 5, and 10 machines; 50 to 1000 jobs; and multiplier values 1, 2, 5, and 8.]