Because alternative process plans contribute significantly to the production efficiency of a manufacturing system, researchers have studied the integration of manufacturing functions, which can be divided into two groups, namely, integrated process planning and scheduling (IPPS) and scheduling with due date assignment (SWDDA). Although IPPS and SWDDA are well-known and well-studied problems in the literature, there are limited works on the integration of process planning, scheduling, and due date assignment (IPPSDDA). In this study, the due date assignment function was added to IPPS in a dynamic manufacturing environment, and the studied problem is introduced as dynamic integrated process planning, scheduling, and due date assignment (DIPPSDDA). The objective of DIPPSDDA is to determine due dates for each job and to minimize earliness and tardiness (E/T). Furthermore, three pure metaheuristic algorithms, namely, genetic algorithm (GA), tabu algorithm (TA), and simulated annealing (SA), as well as their hybrid (combination) algorithms GA/SA and GA/TA, have been developed to optimize DIPPSDDA on 8 different sized shop floors. Performance comparisons of the algorithms on each shop floor are given to show their efficiency and effectiveness. In conclusion, computational results show that the proposed combination algorithms are competitive, give better results than the pure metaheuristics, and can effectively generate good solutions for DIPPSDDA problems.

1. Introduction

Process planning, scheduling, and due date assignment functions have an important role in modern manufacturing systems. During the process planning stage, the operations of jobs are sequenced and the production information is transferred to the scheduling stage. In a production facility, the process planning stage comprises tasks such as determining the manufacturing method for the product and selecting appropriate machines for each part. In a job shop scheduling problem (JSSP), machines are assigned to the operations of jobs with the objective of optimizing given performance measures. Although process planning is used as an input to scheduling, process planning and scheduling are considered two independent and separate functions in classic manufacturing models [1]. In recent years, various researchers have observed that these two functions are interrelated and need to be handled together [2]. Integration of process planning and scheduling (IPPS) studies aim to improve scheduling performance measures, such as makespan, job flow times, earliness and tardiness (E/T) of each job, and machine utilization level, using different metaheuristic algorithms. Several review articles on IPPS are available [3–6]. On the other hand, many studies have proposed integrating scheduling with due date assignment (SWDDA) as well. Although there are numerous works on IPPS and SWDDA, only limited work on integrated process planning, scheduling, and due date assignment (IPPSDDA) has been conducted so far.

Obviously, integrating the due date assignment function with the IPPS problem can significantly improve the objective function and the global performance of the manufacturing system. The first study on IPPSDDA was conducted by Demir and Taskin [7]. Demir et al. [8–12] also presented genetic algorithm (GA), simulated annealing (SA), tabu algorithm (TA), evolutionary strategies (ES), and hybrid algorithms to improve the global performance of the IPPSDDA problem. In this study, in addition to the other studies on IPPSDDA, we consider a dynamic scheduling model in which each job enters the system at a random time, and we introduce the integrated process planning, dynamic scheduling, and due date assignment (DIPPSDDA) problem. The main contribution, and the point at which this study differs from other studies, is a discrete event simulation model for the IPPSDDA problem that can adapt to unexpected and sudden internal or external changes (dynamic events) in a manufacturing system. We studied new job arrivals, which occur in real shop floors, as the dynamic event of this study. Furthermore, the DIPPSDDA problem is solved with different efficient metaheuristic algorithms, namely, simulated annealing (SA), tabu algorithm (TA), genetic algorithm (GA), and the combinations GA/SA and GA/TA, to improve the quality of the solutions. The proposed model was built with the advantages of object-oriented programming (OOP) and discrete event simulation (DES) to enhance its flexibility.

This paper is organized as follows: Section 1 is the introduction of the study. A review of related works is presented in Section 2. The background and definition of DIPPSDDA are given in Section 3. Section 4 discusses solution techniques for DIPPSDDA. Section 5 gives the computational results and their analysis. Conclusions and future research directions are summarized in Section 6.

2. Related Works

Minimizing costs by eliminating waste is a core principle of the just-in-time (JIT) production philosophy. Thus, completing jobs as close as possible to their due dates is inseparable from this philosophy. If companies can complete their production on the given due dates, they can make more realistic plans and achieve optimum capacity usage on their shop floors. In this way, resources are used efficiently and customer satisfaction increases. In most modern JIT companies, a job is expected to be completed on time. The completion of jobs after their due dates can lead to increased costs, decreased customer satisfaction and, at worst, customer loss. On the other hand, the completion of jobs before their due dates can cause inventory holding costs. Integrated systems considering process planning and scheduling, or scheduling with due date assignment, were proposed to avoid ineffective production. Some of the earlier studies reported that it is possible to find the optimal due dates and optimal sequences of jobs using different heuristic algorithms [13–20]. The objective of these studies is to minimize the cost related to the due date assignment and scheduling functions.

Besides, some of the recent studies on SWDDA can be given as [21–30]. There are three types of due date assignment models in the literature:
(i) Common due date assignment (CON): a predefined common due date for all jobs.
(ii) Due date assignment considering processing times: slack due dates (SLK), total work content (TWK), processing plus wait (PPW), etc.
(iii) Assignment of due dates according to given values.
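As an illustration, these rule families can be sketched in code as follows (a minimal sketch using the standard textbook forms of CON, SLK, and TWK; the exact equations used in this study are the ones listed in Table 4, and all parameter values here are illustrative):

```python
def con_due_date(arrival, common_allowance):
    # CON: every job receives the same flow allowance k
    return arrival + common_allowance

def slk_due_date(arrival, total_processing, slack):
    # SLK: total processing time plus a common slack q
    return arrival + total_processing + slack

def twk_due_date(arrival, total_processing, multiplier):
    # TWK: due date proportional to the total work content
    return arrival + multiplier * total_processing

print(con_due_date(0, 50))        # 50
print(slk_due_date(10, 30, 5))    # 45
print(twk_due_date(10, 30, 2.0))  # 70.0
```

With dynamic job arrivals, the arrival time argument lets each rule anchor the due date to the moment the job enters the shop floor.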

Scheduling problems with optimized due dates have been the subject of considerable work over the last several decades. Determining the due date is a matter that needs to be emphasized in terms of customer satisfaction, and companies that fail to meet due dates may face various problems. Studies on scheduling together with due dates cover both single machine and multimachine problems. Scheduling studies considering due dates for single machine problems are common in the literature, such as [17, 23, 26, 31–35]. Panwalkar and Smith [36] studied the problem of determining the common due date for a scheduling problem with several jobs and one machine on the shop floor; the problem was solved using a polynomially bounded scheduling algorithm. Cheng [37] set common due dates for each job to minimize lateness in a single machine scheduling problem. Biskup and Cheng [38] addressed single machine scheduling with due date assignment and controllable processing times. In Gupta and Sen's [39] study, decisions about scheduling, batching, and due date assignment were combined for a single machine scheduling problem with groups of jobs. Bank and Werner [40] considered release dates and processing times for each job and assigned a common due date. Cheng [41] aimed to find the optimal due dates for a scheduling problem and emphasized that the heuristic method finds effective solutions for it.

An early study on the single machine scheduling problem with earliness and tardiness penalties was conducted by Sidney [42]. In another study, the same problem was addressed again by Seidmann et al. [43], and an algorithm was developed to improve the solution performance. Other studies in which E/T are penalized for single machine schedules can be found in [32–34, 44].

On the other hand, there are also many studies in the literature involving more than one machine, all of which attempt to determine common due dates for each job. Many multimachine scheduling problems, such as job shop production, can be solved with heuristic algorithms. There is also a published literature review for multimachine scheduling problems with specific due dates, conducted by Lauff and Werner [45]. In another review, Gordon et al. [46] drew a framework for common due date and scheduling problems, covering single machine and parallel machine models as well as the other studies in the literature on this subject. A list of studies on SWDDA classified by subject or objective is given in Table 1. Other examples of studies on scheduling with due date assignment can be found in [46–48].

In a classic manufacturing model, process planning and scheduling functions are carried out separately, and only fixed process plans are transferred to the scheduling stage. Fixed process plans may cause inefficient schedules because of their inflexible nature. To overcome these problems, the idea of integrating process planning with scheduling arose from the observation that working with alternative process plans in scheduling improves the efficiency of manufacturing systems. Wilhelm and Shin [49] addressed the efficiency of "alternative operations" for the first time in their 1985 study. After that, studies on IPPS were conducted by several researchers. Chryssolouris et al. [50] first tried to integrate process planning and scheduling in 1985. Other early IPPS studies were carried out by Sundaram and Fu [51].

The most direct way to integrate process planning and scheduling is to consider the two functions simultaneously. The integration of process planning and scheduling is an NP-hard problem, and it is difficult to find optimal solutions in a reasonable time, even for small sized problems [52]. There are many studies in the literature on the integration of these two functions, which can be classified by solution approach as given in Table 2.

In the early 2000s, a very early study on IPPSDDA was the Ph.D. thesis conducted by Demir and Taskin [7]. Integration of these three functions is useful to avoid the problems that may arise in process planning, scheduling, and due date assignment. The intent of working with integrated functions is to exploit flow times, processing times, variations in machine capacities, and manufacturing flexibility more efficiently. In the literature, the integration of all three functions has received little attention. The studies of Demir et al. discussed the feasibility, advantages, solution methods, and results of operating the three functions in an integrated manner. In addition to the models discussed in IPPS problems, these studies also considered preparation times and discussed the importance of doing so. In the thesis [7], the three functions of process planning, scheduling, and due date assignment were integrated using genetic algorithms. It was observed that weighted due date assignment and weighted scheduling improved the overall performance of the problem substantially.

One of the current research areas related to the job shop scheduling problem is dynamic scheduling. Dynamic scheduling problems are frequently encountered in real shop floors, for example, jobs arriving at the shop floor randomly over time, machine breakdowns, urgent jobs, or cancellation of existing jobs. Zandieh and Adibi [72] used a variable neighborhood search (VNS) algorithm to handle dynamic events such as random job arrivals and machine failures. The dynamic scheduling problem can show deterministic or stochastic properties according to the arrival times of the jobs: scheduling is deterministic if the arrival times of jobs are known in advance; conversely, if arrival times are randomly distributed according to a specific distribution, scheduling is stochastic [73].

The first literature review on dynamic scheduling problems can be found in [98]. Dominic et al. [99] developed two new dispatching rules for the dynamic production shop floor, which were shown to perform better than the existing SPT, LIFO, and LPT rules. Aydin and Oztemel [100] solved the dynamic scheduling problem using reinforcement learning agents for proper dispatching rule selection. Li et al. [101] developed an artificial neural network (ANN) trained with a GA for the problem. Sha and Liu [102] developed a data mining tool that adapts to the dynamic conditions of jobs. Zandieh and Adibi [72] predicted the appropriate parameters of the scheduling method to minimize the average processing time. Zhang et al. [103] developed a hybrid tabu search algorithm for a dynamic flexible job shop scheduling problem.

As mentioned before, studies on dynamic integrated process planning and scheduling are limited in the literature. Lin et al. [104] addressed an integrated process planning and scheduling problem to cope with dynamic events that may occur on the shop floor. Xia et al. [92] used alternative process plans to deal with machine breakdowns on the shop floor and solved the problem with random job arrivals using a neighborhood search algorithm. Wong et al. [105] solved the dynamic integrated process planning and scheduling (DIPPS) problem by proposing a hybrid multiagent-based system. Yu et al. [106] employed a discrete particle swarm optimization (DPSO) algorithm to solve the dynamic IPPS problem. Meissner and Aurich [107] applied a cyberphysical system for IPPS. A sustainable IPPS problem was solved by Lee and Ha [108] using a standard GA. Yin et al. [109] used two competing agents for integrated production, inventory, and batch delivery scheduling and due date assignment. Mor [110] studied common SWDDA with a focus on min-max objective functions. Teymourifar and Ozturk [111] developed a new dispatching rule for dynamic JSSP.

3. Problem Definition

Job shop scheduling problem (JSSP) mainly consists of 3 elements, which are machine configuration, process characteristics, and objective function [112]. The main objective of JSSP is to find a feasible schedule and to optimize given performance measures such as optimizing makespan, earliness and tardiness time of each job, and machine utilization level.

Basically, scheduling models can be divided into two groups: static and dynamic models. If stochastic events such as new job arrivals, machine breakdowns, and order cancellations occur over time, the problem is called a "dynamic scheduling problem". Most studies on scheduling have focused on static job shop scheduling, in which all scheduling conditions are static and all job information is known and ready at time zero. A static JSSP deals with assigning jobs to machines with the aim of simultaneously optimizing given performance measures. Although static job shop scheduling is a well-known problem in the literature, dynamic events often occur in a real manufacturing system and need to be handled to increase productivity and machine balance rates [98]. Actual manufacturing scheduling models are naturally dynamic and harder to solve than static models [90, 91], because when there is an interruption, the proposed plans and schedules must be revised to respond to such dynamic events.

In fact, JSSP belongs to the NP-hard class even without any integration, which is why the integrated problems are seen as the hardest among scheduling problems [71]. DIPPSDDA therefore has a huge solution space and is difficult to solve in a reasonable time, so the use of metaheuristic algorithms is essential. In this study, we consider that jobs arrive at the shop floor over time, which makes our model dynamic [113, 114]. Jobs arrive at the system stochastically and have different weights, due dates, routes, and precedence rules.

The assumptions of the problem to be addressed are listed below:
(i) There is no ready job at time zero, and the interarrival times of jobs follow an exponential distribution.
(ii) Due dates for jobs are not assigned at time zero.
(iii) Each machine can process only one operation at a time.
(iv) The processing times of the jobs follow a normal distribution.
(v) Machines do not break down.
(vi) The preparation times of the machines are ignored.
(vii) Each job can be delivered to the customer only after all of its operations are finished.

The notation used for the formulation of the problem is given in Table 3.

3.1. Due Date Assignment Rules

DIPPSDDA begins with the determination of due dates, that is, the assignment of the due dates of the jobs. In many cases, negotiations with customers can be made to arrange due dates in this process [63, 64]. Inconsistent due dates can lead to unwanted price discounts, customer dissatisfaction, or even customer loss. In classic manufacturing models, a job is expected to be completed before its due date. In contrast, a just-in-time (JIT) manufacturing environment requires all jobs to be completed exactly on their due dates. In most cases, early due dates lead to increased inventory costs, while late due dates lead to customer dissatisfaction, price discounts, and, at worst, customer loss. Many studies use quadratic penalty functions for tardiness and linear penalty functions for earliness and tardiness (E/T). In this study, linear penalty functions are utilized to overcome these problems and generate more accurate process plans.

Both dynamic and static due date assignment rules were employed in this study. Dynamic rules use shop floor information, such as the shop floor state and average processing times, in addition to job information. Static rules use only information about the job itself, such as its arrival time, route, and operations. All due date assignment rules used in the study are given in Table 4, with explanations and equations.

3.2. Dispatching Rules

Dispatching rules are widely used for job shop scheduling owing to their simple implementation. Generally, dispatching rules are used by machines to select operations on the shop floor. 8 different dispatching rules were used in this study; the list of dispatching rules, with the priority index and descriptions, is given in Table 5. The priority index is calculated as in (1).
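As an illustration of how a dispatching rule becomes a selection decision, the following sketch implements three common rules (SPT, EDD, and FIFO; the queue fields and job data are generic stand-ins, and the full rule set of this study is the one listed in Table 5):

```python
def select_next(queue, rule):
    # queue: pending operations, each a dict with arrival, processing_time, due_date
    if rule == "SPT":      # shortest processing time first
        key = lambda op: op["processing_time"]
    elif rule == "EDD":    # earliest due date first
        key = lambda op: op["due_date"]
    elif rule == "FIFO":   # first in, first out
        key = lambda op: op["arrival"]
    else:
        raise ValueError(f"unknown rule: {rule}")
    return min(queue, key=key)

queue = [
    {"job": 1, "arrival": 0, "processing_time": 9, "due_date": 40},
    {"job": 2, "arrival": 3, "processing_time": 4, "due_date": 25},
    {"job": 3, "arrival": 5, "processing_time": 7, "due_date": 60},
]
print(select_next(queue, "SPT")["job"])   # 2
print(select_next(queue, "EDD")["job"])   # 2
print(select_next(queue, "FIFO")["job"])  # 1
```

Each rule reduces to a different priority key over the waiting operations, which is why dispatching rules are cheap to evaluate whenever a machine becomes free.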

3.3. Objective Function

The objective of scheduling problems is often to minimize the makespan; sometimes the objective is instead to minimize the average flow time, balance the machine loads, etc. In this study, we use a penalty function for both E/T and due dates for each job. A positive lateness value means the job is completed late, and a negative lateness value means it is completed early. For an early job the tardiness penalty is 0, and for a late job the earliness penalty is 0. Earliness, tardiness, and due date penalties are calculated as in (2), (3), and (4), respectively.

Weights are applied for a better objective function in the model. We considered a working day as one shift of 8 working hours, which makes 480 minutes per day. All the weighted E/T and due date related costs are penalized according to (5) and (6), while (7) gives the sum of the total penalty.

The objective of the model is to minimize the total penalties for E/T and due dates. The final objective function, which is the fitness value of a solution, is calculated as shown in (8).
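A minimal sketch of the weighted E/T and due date penalty described above (the helper names and example weights are illustrative assumptions; the exact forms are those of equations (2)-(8)):

```python
def job_penalty(completion, due_date, w_e=1.0, w_t=1.0, w_d=1.0):
    # lateness > 0: job is late; lateness < 0: job is early
    lateness = completion - due_date
    earliness = max(0, -lateness)   # late job: earliness term is 0
    tardiness = max(0, lateness)    # early job: tardiness term is 0
    return w_e * earliness + w_t * tardiness + w_d * due_date

def total_penalty(jobs):
    # fitness: sum of weighted E/T and due date penalties over all jobs
    return sum(job_penalty(*job) for job in jobs)

jobs = [(100, 120, 1.0, 2.0, 0.1),   # completed 20 early
        (150, 130, 1.0, 2.0, 0.1)]   # completed 20 late
print(total_penalty(jobs))  # (20 + 12.0) + (40 + 13.0) = 85.0
```

Including the due date itself in the penalty discourages the due date assignment rules from inflating due dates merely to avoid tardiness.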

3.4. Data Studied

The configurations of the 8 shop floors are given in Table 6. There is also a mini shop floor for testing algorithm performance by hand. For example, the first shop floor includes 25 jobs, 10 operations, 5 different routes, and 5 machines. The number of iterations of the algorithms for the shop floors is given in Table 3. The processing times on the shop floors follow a normal distribution with a mean of 6 and a standard deviation of 12.

The data were generated specifically for this study because there are no comparable data in the literature in which process planning, scheduling, and due date assignment rules are employed together. The shop floor data were produced using the NumPy library in the Python programming language, separated by shop floor, and saved as ".txt" files. First, the job arrival times were generated according to the exponential distribution and saved in the file "arrivals_shop_floor_number.txt". Second, the machine sequences for each alternative process plan of the jobs were saved in the file "machine_numbers_shop_floor_number.txt", and the processing times were saved in the file "operation_durations_shop_floor_number.txt". Finally, the weights are given in the file "weights_shop_floor_number.txt". The data files are provided as Supplementary Materials.
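The data generation described above can be sketched as follows (a hedged sketch: apart from the stated processing-time mean of 6 and standard deviation of 12, all parameter values, the seed, and the clipping of negative normal draws to 1 minute are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_jobs, n_ops = 25, 10  # first shop floor size

# Interarrival times ~ exponential; the arrival of job i is their cumulative sum.
interarrivals = rng.exponential(scale=20.0, size=n_jobs)
arrivals = np.cumsum(interarrivals).astype(int)

# Processing times ~ normal(mean=6, sd=12), clipped to at least 1 minute.
durations = np.clip(rng.normal(6, 12, size=(n_jobs, n_ops)), 1, None)

np.savetxt("arrivals_shop_floor_1.txt", arrivals, fmt="%d")
np.savetxt("operation_durations_shop_floor_1.txt", durations, fmt="%.1f")
```

Cumulative summing of exponential interarrivals gives a nondecreasing arrival stream, matching the dynamic job arrival assumption of the model.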

3.5. Simulation Study

At the beginning of the study, the shop floor is scheduled according to the jobs available on hand. When a new job arrives at the shop floor according to an exponential distribution, the job list is updated and the problem turns into a dynamic JSSP. To solve the dynamic problem, a discrete event simulation of a job shop configuration was established to validate the performance of the dispatching and due date rules. Choosing appropriate distributions for job arrivals and operation durations yields more realistic simulation results. The simulation input data, comprising the numbers of jobs, machines, operations, and routes, are given in Table 5. Briefly, jobs have different routes, and each job must be processed on each machine only once. When a new job arrives at the shop floor, it is placed in a machine queue according to the route selected by the algorithm. Machines are allocated to the jobs waiting in their queues according to the dispatching rule. Processing times of each job follow a normal distribution, and due dates are determined using the selected due date assignment rule. In total, 23 dispatching and 37 due date assignment rules are utilized, and the simulation runs until all jobs on the shop floor are finished.

The steps of the simulation are as follows:
(i) Execute the algorithm to generate an individual solution.
(ii) When a new job arrives at the shop floor, determine its weights, operation precedence and durations, and route.
(iii) Calculate the due date for the job according to the selected due date assignment rule.
(iv) The first operation of the job enters the queue of the machine to which it is assigned.
(v) The machine selects an operation from the pending operations according to the selected dispatching rule.
(vi) The completion time of the last operation of the job is taken as the departure time of the job.
(vii) The earliness, tardiness (E/T), and due date (D) penalties for the job are calculated using the objective function given in (8).
(viii) The solution is optimized by running the steps of the proposed metaheuristics.
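The simulation steps above can be sketched as a minimal discrete event loop (plain Python with heapq rather than the Salabim library used in the study; machine-queue dispatching is simplified to earliest-event order, and all job data are illustrative):

```python
import heapq

def simulate(jobs, n_machines):
    # jobs: list of (arrival_time, [(machine, duration), ...]) in route order
    events = [(arr, j, 0) for j, (arr, _) in enumerate(jobs)]  # (time, job, op index)
    heapq.heapify(events)
    machine_free = [0] * n_machines   # earliest time each machine is available
    completion = {}
    while events:
        t, j, op = heapq.heappop(events)
        machine, dur = jobs[j][1][op]
        start = max(t, machine_free[machine])   # wait for the machine if busy
        finish = start + dur
        machine_free[machine] = finish
        if op + 1 < len(jobs[j][1]):
            heapq.heappush(events, (finish, j, op + 1))  # release next operation
        else:
            completion[j] = finish   # departure time of the job
    return completion

jobs = [(0, [(0, 5), (1, 3)]),   # job 0: machine 0 then machine 1
        (2, [(1, 4), (0, 6)])]   # job 1: machine 1 then machine 0
print(simulate(jobs, 2))  # {0: 9, 1: 12}
```

The event heap plays the role of the simulation clock: each popped event either starts an operation or, for the last operation, records the job's departure time used in the E/T calculation.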

4. Solution Techniques

4.1. Genetic Algorithm (GA)

In 1975, Holland [115] proposed GA, which performs well on difficult problems by mimicking the process of biological evolution. GA works on a population of solutions and is commonly used in the fields of scheduling and IPPS. Thanks to its popularity and easy-to-implement structure, researchers commonly apply GA to a variety of problems, including scheduling. Genetic operators are the most important part of GA modeling; in this study, selection, crossover, and mutation operators were employed. To obtain good performance with GA, several parameters need to be considered, such as the initial population size, crossover rate, mutation rate, maximum number of generations, chromosome length, encoding of chromosomes and decoding of individuals, and an efficient formulation of the fitness function. Algorithm 1 shows the pseudocode for GA.

(1) Perform random search
(2) Initialize population with the best random solutions
(3) Evaluate population
(4) Rank chromosomes
(5) for i ← 1 to iter_size do
(6)  Ranking selection of parents
(7)  Crossover
(8)  Mutation
(9)  Evaluate and replace population
(10) end for
(11) return best chromosome

The main steps of GA are as follows:
(1) Initialization of the population: A random search is performed for 10% of the total number of iterations before the genetic algorithm is applied. The best 10 chromosomes from the random search form the initial population for GA. Figure 1 shows a sample chromosome.
(2) GA operators: The next step is to generate a new population from the mating pool through the genetic operators, crossover and mutation.
(a) Crossover: The operator selects two chromosomes as parents and creates new child individuals. First, the crossover operator determines the number of crossover points, which is calculated from the chromosome length N; for example, for the first shop floor, where N equals 27, the number of crossover points is 2. Each gene on a chromosome has a different probability of being selected as a crossover point: the due date rule gene and the dispatching rule gene together carry a dominant probability of 0.5, and the remaining genes share the other 0.5 equally. An example of the crossover operation is given in Figure 2.
(b) Mutation: The operator selects a number of mutation points, calculated from the length of the chosen chromosome, and changes each selected gene to an appropriate value from its domain. As in crossover, the due date rule gene and the dispatching rule gene together carry a dominant probability of 0.5 of being selected as a mutation point, and the remaining genes share the other 0.5 equally. An example of the mutation operation is given in Figure 3.
(c) Selection: The process of selecting parents is repeated until the stopping criterion, the maximum iteration size, is met. The average fitness value of the population decreases over time, as the chromosomes with the best fitness values are selected for the new generation.
(i) Ranking selection method: A linear ranking selection method is used in which the best chromosome in a population has the highest probability of being selected. This method is chosen because the best and worst fitness values get closer as the iterations proceed; in the last iterations, the fitness values are almost equal, which would cause the algorithm to use nearly the same probabilities when selecting among the best and the worst chromosomes. In contrast to using fitness values alone, sorting individuals and assigning probabilities according to their positions avoids this problem. In the ranking selection method, chromosomes are selected with probabilities fixed by their rank for all iterations. In this study, the preferred probabilities for the 10 chromosomes are 0.3, 0.2, 0.15, 0.12, 0.10, 0.07, 0.03, 0.02, 0.006, and 0.004, respectively. Table 7 shows the differences in selection probabilities between the roulette wheel and ranking selection methods for the initial and last populations.
(3) Updating the population: As the population size is limited to a fixed number, 10 in this study, the 10 best chromosomes always survive at the end of each iteration.
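The GA loop described above can be sketched as follows (a hedged sketch: the fitness function, gene domain, and chromosome layout are stand-ins, while the ranking probabilities and the 0.5 bias toward the first two genes, standing in for the due date rule and dispatching rule genes, follow the description above):

```python
import random

RANK_PROBS = [0.3, 0.2, 0.15, 0.12, 0.10, 0.07, 0.03, 0.02, 0.006, 0.004]

def rank_select(population, fitness):
    # Population sorted best-first; rank probabilities are fixed for all iterations.
    ranked = sorted(population, key=fitness)
    return random.choices(ranked, weights=RANK_PROBS, k=2)

def crossover(a, b):
    # Genes 0 and 1 stand in for the due date rule and dispatching rule genes;
    # together they carry probability 0.5 of being the crossover point.
    n = len(a)
    weights = [0.25, 0.25] + [0.5 / (n - 2)] * (n - 2)
    point = random.choices(range(n), weights=weights)[0]
    return a[:point] + b[point:]

def ga(population, fitness, iters=100):
    for _ in range(iters):
        p1, p2 = rank_select(population, fitness)
        child = crossover(p1, p2)
        i = random.randrange(len(child))
        child[i] = random.randint(0, 9)   # single-point mutation; domain 0..9 assumed
        # Elitist replacement: keep the 10 best chromosomes.
        population = sorted(population + [child], key=fitness)[:10]
    return min(population, key=fitness)

random.seed(0)
pop = [[random.randint(0, 9) for _ in range(8)] for _ in range(10)]
best = ga(pop, fitness=sum)   # toy fitness: minimize the gene sum
print(best, sum(best))
```

Because replacement is elitist, the best solution found can never be lost, which matches the rule that the 10 best chromosomes always survive.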

4.2. Tabu Algorithm (TA)

Glover [116] proposed the fundamental framework of TA, a metaheuristic algorithm designed to find near-optimal solutions for complex problems. It starts from an initial solution generated by local search. After finding a feasible solution, TA tries to obtain better fitness values by selecting the best individual in each iteration. TA searches the solution space for good neighboring solutions and stores previously visited poor solutions in a tabu list. This process is repeated until the maximum iteration number is reached. The tabu list is updated and compared with the current solution at every iteration, and its size is kept fixed to use memory effectively. Algorithm 2 shows the pseudocode for TA, and the variable neighborhood search algorithm used within TA and SA is illustrated in Figure 4.

(1) Choose shop floor
(2) Generate initial solution s
(3) while stopping criterion not met do
(4)  Generate neighbor s' using VNS
(5)  if s' not in tabu list then
(6)   s ← s'
(7)   Update tabu list
(8)  end if
(9) end while
(10) return best solution
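Algorithm 2 can be sketched in code as follows (a simple swap move stands in for the VNS step, and the fitness function is a toy stand-in; the tabu list size and iteration count are assumptions):

```python
import random
from collections import deque

def tabu_search(initial, fitness, neighbor, iters=200, tabu_size=20):
    current = best = initial
    tabu = deque(maxlen=tabu_size)      # fixed size: oldest entry drops out
    tabu.append(tuple(current))
    for _ in range(iters):
        candidate = neighbor(current)   # the VNS move in the paper
        if tuple(candidate) in tabu:
            continue                    # skip previously visited solutions
        tabu.append(tuple(candidate))
        current = candidate
        if fitness(candidate) < fitness(best):
            best = candidate
    return best

def swap_neighbor(solution):
    s = list(solution)
    i, j = random.sample(range(len(s)), 2)  # swap two genes
    s[i], s[j] = s[j], s[i]
    return s

random.seed(1)
fit = lambda s: sum(abs(a - b) for a, b in zip(s, sorted(s)))  # toy fitness
start = [5, 3, 8, 1, 9, 2]
best = tabu_search(start, fit, swap_neighbor)
print(fit(start), fit(best))
```

The `deque` with `maxlen` implements the fixed-size tabu list directly: appending beyond the limit silently evicts the oldest forbidden solution.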
4.3. Simulated Annealing (SA)

SA, developed by Kirkpatrick et al. [117] based on the cooling and recrystallization process of hot materials, is used to solve large-scale optimization problems. SA is another neighborhood search algorithm and is widely used for large combinatorial optimization problems across numerous disciplines, in particular when a global extremum is sought among many local extrema [118]. SA evaluates neighbors by their fitness values but may accept a worse move in an iteration to escape from a local minimum. The probability of accepting a worse solution is given by an exponential control function whose parameter decreases during execution, as shown in (9). SA differs from TA in that it diversifies the solutions by randomizing them, while TA diversifies by forcing new solutions. Algorithm 3 shows the pseudocode for SA:

(1) Choose shop floor
(2) Generate initial solution s
(3) while stopping criterion not met do
(4)  Generate neighbor s' using VNS
(5)  if f(s') < f(s) then
(6)   s ← s'
(7)  else
(8)   Calculate acceptance probability P as in (9)
(9)   if random(0, 1) < P then s ← s'
(10)  end if
(11)  solutions.append(s)
(12) end while
(13) return best solution

where P is the acceptance probability, T is the temperature, k is the Boltzmann constant, f(s) is the fitness value of the current solution, and f(s') is the fitness value of the candidate solution. P depends on the value of T: the temperature is high at the first iteration and is decreased gradually in the following iterations.
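The acceptance step of Algorithm 3 and equation (9) can be sketched as follows (the geometric cooling schedule, its parameters, the swap move, and the toy fitness function are assumptions for illustration):

```python
import math
import random

def accept(delta, temperature, k=1.0):
    # Better or equal solutions (delta <= 0) are always accepted; worse ones
    # are accepted with probability exp(-delta / (k * T)) as in equation (9).
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / (k * temperature))

def swap(solution):
    s = list(solution)
    i, j = random.sample(range(len(s)), 2)  # swap two genes
    s[i], s[j] = s[j], s[i]
    return s

def simulated_annealing(initial, fitness, neighbor, t0=100.0, alpha=0.95, iters=500):
    current = best = initial
    t = t0
    for _ in range(iters):
        candidate = neighbor(current)
        if accept(fitness(candidate) - fitness(current), t):
            current = candidate
            if fitness(current) < fitness(best):
                best = current
        t *= alpha   # gradual geometric cooling
    return best

random.seed(2)
fit = lambda s: sum(abs(a - b) for a, b in zip(s, sorted(s)))  # toy fitness
start = [7, 1, 9, 3, 5, 2]
best = simulated_annealing(start, fit, swap)
print(fit(start), fit(best))
```

At high temperature nearly every move is accepted, which diversifies the search; as T shrinks, the acceptance probability for worse moves approaches zero and the search converges.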

5. Experimental Results

A computer program was developed in the Python programming language to solve DIPPSDDA, using the Jupyter IDE on a computer with an Intel i5-6200U processor at 2.30 GHz and 8 GB of memory. The most heavily used packages are as follows: sequences, matrices, and probability calculations were coded using the NumPy library [119]; the Pandas library [120], which provides important conveniences for data analysis and data science, was used to analyze tabular data; the Matplotlib library [121] was used to visualize and analyze the graphs in the study; and the Salabim library [122] was used for the simulation, an important part of this study.

A discrete event simulation (DES) model was developed to represent the job shop scheduling system for this study. At t = 0, there are no jobs and the machines are ready to operate. The discrete event simulation updates the system when a job arrives. The arrival time of each job is the cumulative sum of the interarrival times generated so far. The arrival intervals of jobs and the number of jobs within these intervals on each shop floor are provided as histograms in Figure 5. The first graph shows the arrival times for the first 4 shop floors and the second graph shows the remaining 4, as plotting all shop floors in the same chart would reduce its readability. It is also assumed that the time between arrivals is an integer. The arrival time of the next job is calculated by adding a value generated from the exponential distribution to the arrival time of the previous job.
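The arrival-time generation described above can be sketched as follows. The mean interarrival time and the seed are illustrative assumptions; the study generates one such stream per shop floor.

```python
import numpy as np

def generate_arrivals(n_jobs, mean_interarrival, seed=42):
    """Cumulative arrival times with integer exponential interarrivals.

    `mean_interarrival` is a placeholder value, not a parameter taken
    from the paper's shop floor data.
    """
    rng = np.random.default_rng(seed)
    # Round to integers, as the model assumes integer interarrival
    # times; a floor of 1 keeps successive arrivals distinct.
    gaps = np.maximum(
        1, np.rint(rng.exponential(mean_interarrival, n_jobs))
    ).astype(int)
    # Each arrival is the cumulative sum of the gaps so far.
    return np.cumsum(gaps)

arrivals = generate_arrivals(10, mean_interarrival=5.0)
```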

Another purpose of this study is to determine the benefit of integrating due date assignment and dispatching rules with process planning by testing 4 different levels of integration. At the first level, the SIRO dispatching rule and the RDM due date assignment rule were used; in effect, no optimized rules are applied at this level. At the second level, dispatching rules are included in the problem, while due dates are still assigned with RDM, so only the benefit of selecting the dispatching rule is measured. At the third level, the dispatching rule is left as SIRO and the due date assignment rules are applied. At the fourth level, all dispatching rules and due date assignment rules are applied, so that process planning, weighted scheduling, and weighted due date assignment are fully integrated. The results of these tests show that the full integration level always gives the best result. Solutions of the different integration levels for the 8th shop floor are compared in Table 8, where full integration is observed to have the best performance.

Experimental results of the VNS, SA, TA, GA, and hybrid algorithms are compared with each other and with the ordinary solution, which is obtained without applying any algorithm. The number of iterations is set according to the characteristics of the algorithms: GA is population-based, whereas SA and TA are individual-based. In each iteration of the GA, 10 individuals are generated, while only one individual is produced in each iteration of SA and TA. Therefore, to equalize the search effort, the iteration numbers of SA and TA are taken as ten times the GA iteration number. The best, mean, and worst results of the VNS, SA, TA, GA, and hybrid algorithms are recorded. The performance graphs of the algorithms are plotted separately for each shop floor in Figure 6.

As mentioned before, the developed model runs on a discrete-event simulation. Simulation outputs have been examined step by step, verifying that the system works correctly under the different due date and dispatching rules, shop floor types, and types of jobs and machinery. The simulation trace records all movements over time until the last job is completed. For easier reading of the simulation results, Gantt charts for the first two shop floors are given in Figures 7 and 8. Figure 7 shows the Gantt chart for the first shop floor, in which the best solution, found by the TA algorithm, has the chromosome [9, 6, 4, 1, 3, 0, 4, 3, 2, 4, 2, 2, 0, 3, 3, 0, 2, 1, 4, 1, 2, 2, 1, 1, 3, 0, 1] with a fitness value of 163.28. Figure 8 shows the Gantt chart for the second shop floor, in which the best solution, found by GA/TA, has the chromosome [3, 17, 3, 2, 2, 2, 2, 2, 4, 3, 3, 1, 2, 2, 4, 4, 1, 1, 0, 0, 1, 3, 3, 3, 0, 3, 0, 0, 3, 2, 2, 2, 2, 2, 3, 2, 0, 1, 4, 2, 3, 2, 2, 4, 1, 2, 1, 3, 2, 0, 1, 3] with a fitness value of 311.09. The Gantt charts of the larger shop floors are not presented because they are too complex to follow. In addition to the classical Gantt chart elements, markers indicating job arrivals are added to the graphs. In the Gantt charts, each job is represented by a color, and each box represents a single operation of that job.
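A Gantt chart of the kind described above can be drawn with Matplotlib's `broken_barh`, which the study's plotting library supports. The schedule tuples below are hypothetical illustration data, not the paper's actual solutions.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical schedule: (job, machine, start, duration) per operation.
schedule = [
    ("J1", 0, 0, 4), ("J1", 1, 4, 3),
    ("J2", 1, 0, 4), ("J2", 0, 5, 2),
]
colors = {"J1": "tab:blue", "J2": "tab:orange"}
arrival_times = {"J1": 0, "J2": 0}  # assumed arrivals, marked on the chart

fig, ax = plt.subplots()
for job, machine, start, dur in schedule:
    # One colored box per operation, placed on the machine's row.
    ax.broken_barh([(start, dur)], (machine - 0.4, 0.8),
                   facecolors=colors[job])
for job, t in arrival_times.items():
    ax.axvline(t, linestyle=":", color=colors[job])  # arrival marker
ax.set_yticks([0, 1])
ax.set_yticklabels(["M0", "M1"])
ax.set_xlabel("time")
fig.savefig("gantt.png")
```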

6. Discussion and Conclusions

We studied the integration of the process planning, scheduling, and due date assignment manufacturing functions, which are interrelated and affect each other's performance, on a dynamic shop floor where jobs arrive stochastically. In an actual manufacturing system, these functions are also affected by various events such as urgent orders, order cancellations, maintenance, machine breakdowns, and delays in supplies. However, studies on the integration of process planning, scheduling, and due date assignment are limited, and most of them consider a static environment; our work focused on designing effective algorithms for the dynamic integrated problem. Since the scheduling problem alone is NP-hard, its integration with other manufacturing functions is also NP-hard; thus, heuristic solutions are essential for such problems. We developed 4 different pure metaheuristic algorithms and their combination algorithms to solve DIPPSDDA. Our objective was to minimize the earliness, tardiness, and due date times of each job on 8 different shop floors. Even with this integration, the model does not capture real-world problems thoroughly. The experiments show that the proposed hybrid algorithm GA/TA and the TA algorithm generally provide a good framework for the integrated problem, and GA/TA and TA show higher reliability in solving DIPPSDDA. Unfortunately, it is not possible to benchmark the performance of the proposed system, as this is a new field in the literature. The main contribution of this study is the development of a new model, DIPPSDDA, which extends the current IPPS and DIPPS models.

The study also demonstrates the benefit of applying due date assignment and dispatching rules together with process plan selection. The shop floors were first run without any due date assignment or dispatching rule; then the rules were integrated step by step. As a result, the best results are obtained at the full integration level.

This study is one of the few studies on dynamic integrated process planning and scheduling. The following three suggestions can be objectives of future work. Firstly, flexibility in operations and operation numbers, or in processes and process numbers, can be integrated into the structure of DIPPSDDA. Secondly, the work can be extended to dynamic events such as machine breakdowns, job cancellations, and the arrival of new urgent jobs. Lastly, the current objective function penalizes earliness, tardiness, and due date times; additional objectives, such as makespan, machine balance rates, and minimum waiting times of each job, can be specified, making the problem multiobjective.

Data Availability

The “.txt” data used to support the findings of this study are included within the Supplementary Materials.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Supplementary Materials

The data used in this study were generated specifically for it, because no similar data exist in the literature in which process planning, scheduling, and due date assignment rules are employed together. The shop floor data, produced with the NumPy library in the Python programming language, were separated by shop floor and saved as “.txt” files. Firstly, the job arrival times, generated according to the exponential distribution, are saved in the file “arrivals_shop_floor_number.txt”. Secondly, the machine sequences for each alternative process plan of the jobs are saved in the file “machine_numbers_shop_floor_number.txt”, and the processing times are saved in the file “operation_durations_shop_floor_number.txt”. Finally, the weights are given in the file “weights_shop_floor_number.txt”. (Supplementary Materials)
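Files following this naming pattern can be written and read back with NumPy's `savetxt`/`loadtxt`. The sample values and the one-value-per-line layout below are assumptions for illustration; the actual supplementary files may use a different delimiter.

```python
import numpy as np

# Write a tiny sample arrivals file, then load it back, mirroring the
# "<name>_<shop_floor_number>.txt" naming pattern described above.
# The values are made up for this sketch.
np.savetxt("arrivals_1.txt", np.array([3, 7, 12, 18]))
arrivals = np.loadtxt("arrivals_1.txt")
```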