This paper outlines the development of a new evolutionary algorithms based timetabling (EAT) tool, incorporating a genetic algorithm (GA) and a memetic algorithm (MA), for solving course scheduling problems. Reproduction processes may generate infeasible solutions. Previous research has used repair processes that are applied after a population of chromosomes has been generated. This research developed a new approach which (i) modified the genetic operators to prevent the creation of infeasible solutions before chromosomes were added to the population and (ii) included the clonal selection algorithm (CSA) and the elitist strategy (ES) to improve the quality of the solutions produced. This approach was adopted by both the GA and MA within the EAT. The MA was further modified to include hill climbing local search. The EAT program was tested using 14 benchmark timetabling problems from the literature using a sequential experimental design, which included a fractional factorial screening experiment. Experiments were conducted to (i) test the performance of the proposed modified algorithms; (ii) identify which factors and interactions were statistically significant; (iii) identify appropriate parameters for the GA and MA; and (iv) compare the performance of the various hybrid algorithms. The genetic algorithm with modified genetic operators produced an average improvement of over 50%.

1. Introduction

Metaheuristics are a class of approximation methods that solve complex optimisation problems that are beyond the scope of classical heuristics and optimisation methods [1]. They have been widely used to solve nondeterministic polynomial (NP) hard problems within acceptable computational time [2]. However, metaheuristic methods are stochastic and cannot guarantee an optimal solution [3]. Evolutionary algorithms (EA) are particularly popular metaheuristics and have been widely applied in the literature. There are three types of EA: evolutionary programming, evolutionary strategies, and genetic algorithms (GA) [4]. Evolutionary programming and evolutionary strategies have been used to solve continuous optimisation problems whilst GA have been mainly used for solving discrete optimisation problems [5].

GA are population-based, stochastic search approaches that were inspired by biological evolution. GA include crossover and mutation genetic operations, which are artificial processes for producing new chromosomes. Chromosome selection mimics natural evolution to select a new population for the next generation based on individual fitness [6]. GA have been widely applied to solve various optimisation problems [7] including production scheduling [8], course timetabling [9], examination timetabling [10], container packing [11], travelling salesman [12], bankruptcy prediction [13], and machine layout [14]. However, the simple GA may not be effective for solving problems with a very large solution space and many constraints [5].

The term memetic algorithm (MA) is used to describe evolutionary algorithms in which local search is used to a large extent [15]. MAs have received considerable attention from researchers in many fields [5] including job shop scheduling [16], vehicle routing [17], exam timetabling [18], and nurse scheduling [19]. The MA has also been applied to solve course timetabling problems [20–25].

Genetic operations frequently produce infeasible solutions, which can be (i) discarded; (ii) penalised; or (iii) repaired [6]. However, discarding infeasible solutions or applying a high penalty is only an option when a large proportion of the chromosomes are feasible [9]. Gen and Cheng [6] recommended the repair option. In the algorithms adopted by previous research, the mutation and crossover processes produce a population that includes feasible and infeasible chromosomes. The infeasible chromosomes are then identified and repaired. However, for very large problems that are subject to numerous constraints, the repair process is likely to be highly complex and difficult to design [6]. A complex repair process may be very time consuming [5]. The literature has not considered the development of modified crossover and mutation operators that only produce feasible chromosomes. Such a strategy would likely be more computationally efficient, which would make it possible to conduct more searches within a given execution time.

The performance of evolutionary algorithms is dependent upon the parameters used (such as the population size, number of generations, and the probabilities of crossover and mutation). It is important to identify appropriate values for the parameters in order to obtain the best solutions [26]. There are four experimental strategies: (i) the best-guess approach; (ii) the trial and error approach; (iii) the one factor at a time experimental strategy; and (iv) the factorial experiment [27]. Montgomery [27] suggested that the factorial experiment is the best approach for dealing with several factors. The strategy is to systematically vary the factors together, instead of one at a time. Thus, it is best to use a factorial experiment when investigating appropriate parameter settings for metaheuristic methods. The approach is more reliable, leads to better results, and is more efficient than the alternatives [28].

Artificial immune systems (AIS) are metaheuristics that were inspired by the immune system in biology [29]. There are four main variants of the AIS: danger theory, immune network algorithm (INA), negative selection algorithm (NSA), and clonal selection algorithm (CSA) [30]. AIS have been successfully applied in three application areas: (i) learning; (ii) anomaly detection; and (iii) optimisation [31]. There is only a limited literature on the use of AIS for timetabling. He et al. [32] applied CSA to solve university course timetabling problems in Singapore and benchmark problems. The CSA produced better timetables than GA for all of the problems considered. Malim et al. [33] applied the INA, NSA, and CSA to solve course timetabling problems. The INA produced timetables with the best average fitness, whereas CSA was best in terms of average execution time. Bhaduri [34] was the only researcher to develop a hybrid AIS for timetabling, called GAIN, which included the INA and GA. The GAIN was able to produce optimal feasible timetables faster than GA. However, other researchers have used AIS hybrids in other domains. For example, Zhang et al. [35] combined AIS, the chaos operator, and particle swarm optimisation (PSO), to produce CIPSO, which was used for transportation planning. The approach outperformed GA and PSO in respect of route optimality and convergence time.

The objectives of this paper were to (i) briefly review the literature on evolutionary algorithms and course timetabling; (ii) explain the development, process, and features of a novel timetabling tool that incorporates genetic algorithms, local search, the clonal selection algorithm, roulette wheel selection, and the elitist strategy; (iii) outline a new modified regeneration mutation operator (MRMO) that is based on roulette wheel selection; (iv) describe the development of a novel local search (LS) algorithm that guarantees the feasibility of new chromosomes generated with the MRMO, a hybrid called the modified memetic algorithm (MMA); (v) explain experiments that demonstrated that performance can be improved by the MRMO and MMA using an elitist strategy (ES); (vi) outline new hybrid algorithms that include the clonal selection algorithm, MRMO+CSA and MMA+CSA; (vii) describe the testing of the tool using widely used benchmark problems; (viii) explain the experimental design and analysis used to investigate the significance of GA parameters and interactions and to identify appropriate parameter settings; and (ix) provide a comparison of the performance of the proposed hybridisations and other EA hybridisations used to find the best timetables for 14 benchmark problems obtained from the literature.

The next section of this paper briefly reviews course timetabling problems, which is followed by a detailed outline of the development of the evolutionary algorithms based timetabling tool and its features, the experimental programme, results, analysis and conclusions.

2. Course Timetabling Problems

“Timetabling is the allocation, subject to constraints, of given resources to objects being placed in space time, in such a way as to satisfy as nearly as possible a set of desirable objectives” [36, page 266]. There are many types of timetabling problems including employee timetabling, sports timetabling, transportation timetabling, and educational timetabling [28]. Course timetabling arises every academic year in educational institutions (such as high schools, colleges, or universities) and is solved by academic or administrative staff with or without an automated timetabling tool. Course timetabling is known to be an NP-hard problem [37], in which the computational time required to find a solution increases exponentially with problem size [9].

Timetabling problems include hard constraints that must be satisfied in order to produce a feasible timetable and soft constraints, which are desirable but may be violated [9, page 903]. In the case of course timetabling, it is necessary for a timetable to be feasible for students, lecturers, and classrooms [23]. In a university, a degree programme comprises a set of modules that must be completed by the students registered on the programme. Di Gaspero et al. [38] adopted the following constraints.

(i) Hard constraints.
(a) Lectures: all lectures within a module must be assigned to regular periods, and all lectures must be scheduled.
(b) Room occupancy: only one lecture can take place in a room at a given time.
(c) Conflicts: students and staff can only attend one lecture at a time.
(d) Availabilities: lecturers must be available for a lecture to be scheduled.

(ii) Soft constraints.
(a) Room capacity: the room must have sufficient seats for the students on the module.
(b) Minimum working days between lectures: for a particular module there should be a minimum amount of time between lectures.
(c) Curriculum compactness: students on a degree programme should have lectures that are consecutive with no gaps.
(d) Room stability: all lectures of a module should be given in the same room.

Another issue is that events or courses may have differing priorities; the generation of infeasible solutions can be avoided by scheduling the highest priority activities first [39].
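The hard constraints above lend themselves to a simple feasibility check before an event is placed in a timetable. The following is an illustrative Python sketch (the EAT tool itself was written in Tcl/Tk), covering only the room occupancy and conflicts constraints; the representation of an event as a (room, day, timeslot) triple and the `attendees` mapping are assumptions for this example, not the paper's data structures.

```python
# Illustrative sketch: checking two of the hard constraints from
# Di Gaspero et al. on a candidate assignment. Each event is assigned a
# (room, day, timeslot) triple; `attendees` maps each event to the set of
# student groups/lecturers involved (hypothetical names).

def violates_hard_constraints(new_event, assignment, attendees):
    """Return True if new_event breaks room occupancy or conflict constraints."""
    room, day, slot = assignment[new_event]
    for event, (r, d, s) in assignment.items():
        if event == new_event:
            continue
        if (d, s) != (day, slot):
            continue  # different period: cannot clash
        if r == room:
            return True  # room occupancy: two lectures in one room at one time
        if attendees[event] & attendees[new_event]:
            return True  # conflict: shared students/staff at the same time
    return False
```

A full implementation would also check lecturer availabilities and that every lecture is scheduled; those checks follow the same pattern.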

3. Evolutionary Algorithms Based Timetabling (EAT) Tool

The aim of this research was to generate timetables for lecturers, students, and classrooms that must satisfy all of the hard constraints and minimise the number of violations of the soft constraints proposed by Di Gaspero et al. [38].

The Evolutionary Algorithm based Timetabling (EAT) program was coded using the Tool Command Language and Toolkit (Tcl/Tk) [40]. It was developed in order to construct effective course timetables by using a genetic algorithm (GA) [41] and a memetic algorithm (MA) [42]. Both methods are population based and perform multiple directional search, which achieves a greater diversity than conventional optimisation methods that conduct a single directional search [6]. The MA and GA chromosomes have different components. For MAs, the chromosomes consist of a set of memes, whereas with GAs the chromosomes comprise a set of genes [43]. The key difference is that the memes used by the MA can be self-adapting based upon local search and refinement, whereas genes do not have this capability [6].

The artificial immune system (AIS) was initially proposed in the mid 1980s by Farmer et al. [44]. The clonal selection algorithm is a well-known variant of the AIS that is based upon two immune system principles: clonal selection and affinity maturation [45]. Each antibody (candidate solution) is cloned in proportion to its antigenic affinity (fitness) value: the higher the antigenic affinity, the higher the number of cloned antibodies [46]. Affinity maturation is related to hypermutation and receptor editing [46]. Hypermutation is a rapid accumulation of mutations regulated by receptor affinity, in which a cell receptor with higher affinity is mutated using a mutation rate that is lower than that used for solutions with lower fitness [46]. Receptor editing provides a mechanism for escaping from local optima, which increases the diversity of solutions [46]. An elimination percentage parameter specifies how many low affinity antibodies are eliminated from the antibody repertoire.

The main procedures within the evolutionary algorithms based timetabling tool are shown in Figure 1. The first step is to represent the events within the timetable as memes/genes. The second step is to combine memes/genes to produce an initial population that represents a set of possible timetables. This part of the algorithm is designed to ensure that all of the candidate solutions are feasible. This is followed sequentially by genetic algorithms, local search, and a clonal selection algorithm, which is repeated for the required number of generations. The GA operators, LS, and CSA are designed to ensure that all of the chromosomes produced are feasible. There is an elitist strategy selection mechanism after the local search processes that selects the chromosomes for the CSA algorithm and also remembers the good solutions in its memory. There is a subsequent roulette wheel selection process after the CSA, which produces a population of chromosomes. A further elitist replacement process substitutes weaker solutions within the population with solutions remembered by the elitist strategy if they are better. The following subsections describe these processes in more detail.

3.1. Meme/Gene Representation

This research used the same data structures for genetic algorithms, memetic algorithms, and the clonal selection algorithm. The terminology used to describe the data structures varies according to the algorithm. With a genetic algorithm a chromosome comprises a set of genes. With a memetic algorithm a chromosome comprises a set of memes. The clonal selection algorithm described in Section 3.7 uses an identical structure to represent antibodies.

A meme/gene can be encoded using either numeric (binary, integer, or real) or alphanumeric characters [9]. In this work, an integer encoded meme consists of three coded numbers: the classroom, the day of the week, and the period or timeslot within the day. Each meme/gene therefore contains a reference to a classroom, a day, and a timeslot; for example, the triple (1, 2, 4) represents an event in the first classroom that takes place on the second day in the fourth timeslot. A chromosome comprises a set of memes/genes that represent a complete timetable.
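The encoding above can be sketched in a few lines. This is an illustrative Python sketch, not the paper's Tcl/Tk implementation; the class and field names are assumptions for this example.

```python
from dataclasses import dataclass

# Minimal sketch of the meme/gene encoding: each scheduled event is a
# (classroom, day, timeslot) integer triple, and a chromosome is the list
# of memes for every class to be timetabled.

@dataclass(frozen=True)
class Meme:
    room: int   # classroom index
    day: int    # day of the week
    slot: int   # timeslot within the day

# An event in the first classroom, on the second day, in the fourth timeslot:
meme = Meme(room=1, day=2, slot=4)

# A chromosome is simply an ordered list of memes, one per class.
chromosome = [meme, Meme(room=2, day=1, slot=3)]
```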

3.2. Chromosome Initialisation

The chromosome initialisation process takes the following steps.
(1) The length of the chromosome required is calculated taking into account the number of degree programmes, modules, and their associated classes.
(2) An empty chromosome is generated with the appropriate length.
(3) The modules are then sorted based upon their relative importance.
(4) The highest priority module is scheduled first: this entails generating memes/genes for all of its classes and randomly assigning them to the chromosome. Before a meme/gene is added, a check is made to ensure that the hard constraints are not violated. If there is a violation, the algorithm sequentially looks for the next meme/gene that does not contravene the constraints (taking into account the modules in priority order); the process is then repeated in priority order until all the modules have been scheduled.
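The steps above can be sketched as follows. This is an illustrative Python sketch under simplifying assumptions: `is_feasible` is an assumed caller-supplied hard-constraint check, and modules are given as (name, number of classes, priority) tuples; neither name appears in the paper.

```python
import random

# Hedged sketch of priority-ordered initialisation: modules are scheduled from
# highest to lowest priority, with a random first placement attempt followed by
# a sequential search for a feasible meme when the random choice is infeasible.

def initialise_chromosome(modules, slots, is_feasible, rng=random.Random(0)):
    """modules: list of (name, n_classes, priority); slots: (room, day, slot) triples."""
    chromosome = []
    for name, n_classes, _prio in sorted(modules, key=lambda m: -m[2]):
        for _ in range(n_classes):
            candidates = slots[:]
            rng.shuffle(candidates)      # random first attempt ...
            for meme in candidates:      # ... then sequential search
                if is_feasible(name, meme, chromosome):
                    chromosome.append((name, meme))
                    break
            else:
                raise ValueError("no feasible slot for " + name)
    return chromosome
```

Because every meme is checked before insertion, the initial population contains only feasible timetables, as the text requires.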

3.3. Evolutionary Processes

The parent chromosomes are randomly selected for the crossover and mutation genetic operations according to the probabilities of crossover (pc) and mutation (pm). The selection of these parameters determines the balance between exploration and exploitation. The crossover operation (COP) produces offspring chromosomes from two parent chromosomes, whereas the mutation operation produces random meme/gene changes in one chromosome. The number of memes/genes within the chromosome that are changed is determined by the mutation rate. Thus, the selection of chromosomes is governed by pc and pm, whereas the selection of memes/genes within a chromosome is governed by the mutation rate.

The EAT includes three types of crossover operation: one-point crossover (OP), two-point crossover (TP) [47], and position based crossover (PB) [48], which were modified to ensure that only feasible chromosomes can be produced. The modified version of the regeneration mutation [9] was developed to ensure feasible solutions.

Figure 2 illustrates the regeneration mutation operator. It includes three steps. First, a chromosome is randomly selected from the population. Secondly, a section (subchromosome) is selected for regeneration. Finally, a new subchromosome is generated randomly. The remaining genes within the chromosome are inherited from the parent.

The modified regeneration mutation operator (MRMO) made four changes to this operator: (i) only some memes from the parent chromosome are randomly regenerated, so that beneficial memes are inherited by the offspring; (ii) new feasible memes are assigned to the empty chromosome positions by using roulette wheel selection; (iii) all of the offspring are guaranteed to be feasible chromosomes because the hard constraints are checked before memes are inserted into the empty positions of a new chromosome; and (iv) a parameter specifies the percentage of memes to be regenerated. A higher setting of this parameter increases the amount of exploration, but this may result in beneficial memes from the parent being lost. The modified regeneration mutation procedure is illustrated in Figure 3.
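The four modifications can be sketched together. This is an illustrative Python sketch, not the paper's implementation: `candidate_memes`, `is_feasible`, and `weight` are assumed helper inputs standing in for the tool's meme pool, hard-constraint check, and roulette-wheel weighting.

```python
import random

# Sketch of the modified regeneration mutation (MRMO): a fraction of positions
# is regenerated; each emptied position is refilled by roulette-wheel selection
# over only those memes that pass the hard-constraint check, so the offspring
# is feasible by construction.

def mrmo(parent, regen_fraction, candidate_memes, is_feasible, weight,
         rng=random.Random(1)):
    n = len(parent)
    regen = set(rng.sample(range(n), int(regen_fraction * n)))
    child = [None if i in regen else parent[i] for i in range(n)]
    for i in sorted(regen):
        feas = [m for m in candidate_memes
                if is_feasible(i, m, child)]   # only feasible memes compete
        total = sum(weight(m) for m in feas)
        pick = rng.uniform(0, total)           # roulette-wheel spin
        for m in feas:
            pick -= weight(m)
            if pick <= 0:
                child[i] = m
                break
    return child
```

With `regen_fraction` playing the role of the regeneration percentage parameter: a higher value regenerates more positions (more exploration) at the cost of losing more parental memes.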

3.4. Fitness Measurement

The total violation index (TVI) for a timetable may be calculated using (1) [49]:

TVI = Σ_{j=1..H} h_j + Σ_{i=1..S} w_i s_i,     (1)

where i is an index relating to the soft constraints and S is the number of soft constraints; j is an index relating to the hard constraints and H is the number of hard constraints; s_i is a variable used to count the number of violations of soft constraint i; h_j is the variable used to count the number of violations of hard constraint j; and w_i is the weighting of soft constraint i. For a timetable to be feasible, h_j must be zero for all of the hard constraints. The user can specify the relative importance of the soft constraints by adjusting the weightings for each soft constraint; higher weightings indicate higher priority. In this work, the weights (w_1, w_2, w_3, w_4) were set at 1, 5, 2, and 1, respectively, as recommended by Di Gaspero et al. [38].

The GA and MA measure the quality of each chromosome using the objective function in (1) to calculate the total violation index (TVI). As the objective is to minimise the number of violations, the fitness value is inversely related to the violation index and is determined by (2) [23]:

fitness = 1 / (1 + TVI).     (2)
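The violation index and fitness calculation can be sketched directly from the definitions above. This is an illustrative Python sketch; the inverse mapping 1/(1 + TVI) is a common choice in timetabling GAs and is assumed here rather than quoted from [23].

```python
# Sketch of the violation index and fitness calculation: hard-constraint
# violation counts must all be zero for feasibility, and soft-constraint
# violation counts are weighted (1, 5, 2, 1 as in ITC2007 track 3).

SOFT_WEIGHTS = (1, 5, 2, 1)

def total_violation_index(hard_counts, soft_counts, weights=SOFT_WEIGHTS):
    return sum(hard_counts) + sum(w * s for w, s in zip(weights, soft_counts))

def fitness(tvi):
    # Minimising violations maximises fitness; 1/(1 + TVI) is one common
    # mapping used in the timetabling literature (an assumption here).
    return 1.0 / (1.0 + tvi)
```

For example, a feasible timetable with soft-violation counts (2, 1, 0, 3) has TVI = 1·2 + 5·1 + 2·0 + 1·3 = 10.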

3.5. Local Search (LS)

The objective of the local search (LS) within the MA is to (i) improve the quality of a chromosome or solution through increased exploitation and (ii) increase the opportunity to quickly discover the global best solution. In this work, two hill-climbing LS heuristics, LS1 and LS2, were adopted from previous work by Thepphakorn et al. [28], as it had been demonstrated that they improved chromosome quality and prevented the generation of infeasible chromosomes. The aim of LS1 is to reduce the number of violations of the first and the fourth soft constraints (SC1 and SC4), whilst LS2 aims to reduce the number of violations of the second and the third soft constraints (SC2 and SC3). After the LS1 and LS2 procedures, the total violation index and the fitness values for the new chromosomes are measured again before performing chromosome selection.
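The general shape of such a hill climber can be sketched as follows. This is an illustrative Python sketch in the spirit of LS1/LS2, not their actual move rules; `neighbours` and `penalty` are assumed problem-specific callbacks (e.g. single-meme room or timeslot moves scored by the violation index).

```python
# Minimal hill-climbing local-search sketch: try candidate moves and keep a
# move only when it strictly lowers the penalty, stopping at a local optimum.

def hill_climb(chromosome, neighbours, penalty, max_passes=10):
    best, best_pen = chromosome, penalty(chromosome)
    for _ in range(max_passes):
        improved = False
        for cand in neighbours(best):
            p = penalty(cand)
            if p < best_pen:               # strictly-improving move only
                best, best_pen, improved = cand, p, True
        if not improved:
            break                          # local optimum reached
    return best, best_pen
```

Because the neighbourhood can be restricted to feasible moves only, this style of local search cannot reintroduce hard-constraint violations.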

3.6. Elitist Strategy (ES)

The ES aims to maintain high quality chromosomes from one generation to the next. The ES helps GAs to reach convergence more quickly [13]. This ES is divided into two subprocesses: elitist memory updating, which records the best solutions (with no duplicates), and elitist replacement, which substitutes the worst chromosomes with those remembered if they are better. The elitist replacement process takes place after chromosome selection. The proportion of chromosomes remembered by the ES is determined by a user specified parameter %ES. Previous research had indicated that the most appropriate value for this parameter is 75% [50].
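The two subprocesses can be sketched as follows. This is an illustrative Python sketch under assumed names; `penalty` stands in for the total violation index, and `capacity` corresponds to the number of chromosomes implied by the %ES parameter.

```python
# Sketch of the two ES subprocesses: memory updating keeps the best unique
# solutions, and elitist replacement swaps the worst population members for
# remembered elites whenever the elites are better.

def update_elite_memory(memory, population, penalty, capacity):
    pool = {tuple(c): penalty(c) for c in memory + population}  # dedupe
    ranked = sorted(pool, key=pool.get)                         # best first
    return [list(c) for c in ranked[:capacity]]

def elitist_replacement(population, memory, penalty):
    pop = sorted(population, key=penalty)                       # best ... worst
    for elite in sorted(memory, key=penalty):
        if penalty(elite) < penalty(pop[-1]):                   # elite beats worst
            pop[-1] = list(elite)
            pop = sorted(pop, key=penalty)
    return pop
```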

3.7. Clonal Selection Algorithms (CSA)

In this research, the memory of the ES is used to produce the hybridisations for (i) the genetic algorithm combined with the CSA and (ii) the memetic algorithm (which had been modified to include hill climbing local search) which was combined with the CSA. When chromosomes are assigned to the elitist memory, the procedure attempts to further improve them through the application of the CSA. The elitist memory is updated if the resultant chromosome is better after the application of the CSA.

All of the chromosomes (antibodies) in the elitist memory are sorted according to their affinities (fitnesses); the chromosome with the highest affinity is assigned the highest rank (i = 1), whilst the chromosome with the lowest affinity is ranked last. The total number of antibodies for cloning is equal to the number of chromosomes in the ES memory. In the following step, each rank i of the antibodies contained in the elitist memory is cloned according to (3) [46]:

Nc = Σ_i round(β N / i),     (3)

where Nc is the total number of cloned antibodies, round() is an operator for changing real values into integers, β is the multiplying factor, and N is the population size. In the last step, all of the cloned antibodies are mutated using affinity maturation. The regeneration mutation operator is used in this process together with a variable mutation rate that is adapted according to an antibody's ranking. The initial setting of the mutation rate is determined by a user parameter. Antibodies with a lower affinity (ranking) require a mutation rate that is greater than for those with a higher ranking [46].
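The rank-proportional cloning rule can be sketched numerically. This is an illustrative Python sketch of the standard CLONALG cloning count, with β and N following the notation in the text; note that Python's built-in `round` is used for the rounding operator.

```python
# Sketch of the rank-proportional cloning rule (3): the antibody of rank i
# receives round(beta * N / i) clones, so fitter (higher-ranked) antibodies
# are cloned more often.

def clone_counts(n_ranked, beta, pop_size):
    """Number of clones for each antibody rank i = 1..n_ranked."""
    return [round(beta * pop_size / i) for i in range(1, n_ranked + 1)]

def total_clones(n_ranked, beta, pop_size):
    """Nc: the total number of cloned antibodies across all ranks."""
    return sum(clone_counts(n_ranked, beta, pop_size))
```

For example, with β = 0.1, N = 100, and three ranked antibodies, the clone counts are 10, 5, and 3, giving Nc = 18.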

3.8. Chromosome Selection

The classical roulette wheel approach [41] was used in this research. The general concept of the roulette wheel selection is to randomly select which chromosomes in the current population survive into the next population in such a way that their probability of survival depends upon their fitness. This process is terminated when the desired population size has been generated.
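The roulette wheel described above can be sketched as follows. This is an illustrative Python sketch (names are assumptions); each chromosome occupies a segment of the wheel proportional to its fitness, and selection is with replacement until the new population is full.

```python
import random

# Classical roulette-wheel selection sketch: a chromosome's probability of
# surviving into the next population is proportional to its fitness.

def roulette_wheel(population, fitnesses, new_size, rng=random.Random(42)):
    total = sum(fitnesses)
    wheel, acc = [], 0.0
    for f in fitnesses:          # cumulative fitness boundaries
        acc += f
        wheel.append(acc)
    selected = []
    for _ in range(new_size):
        spin = rng.uniform(0, total)
        for chrom, bound in zip(population, wheel):
            if spin <= bound:    # spin landed in this chromosome's segment
                selected.append(chrom)
                break
    return selected
```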

4. Experimental Results and Analysis

The objective of the EAT is to construct course timetables with the lowest number of soft constraint violations (). The aims of the computational experiments were to (i) identify which main factors and their interactions were statistically significant for the GA; (ii) identify and verify the best parameter settings; (iii) explore the performance of the GA with modified regeneration mutation (called MRMO); and (iv) explore the performance of the proposed hybridisations including MRMO+CSA and MMA+CSA.

The research considered fourteen course timetabling problems that were provided by the third track of ITC2007 [38]. These are summarized in Table 1. The experiments were performed on a personal computer with Intel 2.67 GHz Core 2 Duo CPU and 4 GB of RAM.

4.1. Screening Experiment

The screening experiment had two objectives: to identify which factors and first level interactions were statistically significant and to identify the best settings for these factors. The experimental design, shown in Table 2, was used together with data from timetabling problem 1 (a small problem). The factors included (i) the combination of population size and the number of generations (PG), which determines the total number of chromosomes generated and hence the amount of search and the execution time; in the computational experiments the PG value was fixed at 2,500 to limit the time taken for the computational search; (ii) the probability of crossover (pc); (iii) the probability of mutation (pm); (iv) the crossover operation (COP); and (v) the mutation rate.

A full factorial experiment based on the design in Table 2 would consider all combinations of the factor levels in each replication. With five factors at three levels, the total number of runs would therefore be 3^5 = 243 per replication. When resources are limited it is common for researchers to use fractional factorial designs, which use a carefully chosen subset (fraction) of the experimental runs required for a full factorial design. This approach is based upon the sparsity of effects principle, which states that a system is usually dominated by main effects and low order interactions [27].
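The run-count arithmetic behind the screening design is worth making explicit; a small Python check of the figures used in this experiment:

```python
# Full factorial vs one-third fraction run counts for the screening design:
# five factors at three levels, five replications.

levels, factors, replications = 3, 5, 5
full = levels ** factors                  # runs per replication, full factorial
fraction = full // 3                      # one-third fraction
total_runs = fraction * replications      # runs across all replications
reduction = 100 * (full - fraction) / full  # percentage saving per replication
```

This gives 243 runs per replication for the full factorial, 81 for the one-third fraction (a 66.67% reduction), and 405 runs in total across the five replications.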

In this experiment, the one-third fraction experimental design shown in Table 3 was adopted for the screening experiment, which decreased the number of computational runs by 66.67% per replication compared to the full factorial approach. The first problem instance (see Table 1) was selected and replicated five times using different random seeds. The computational results obtained from the 405 (81 × 5) runs were analysed using a general linear model form of analysis of variance (ANOVA). Table 4 shows the ANOVA table, which shows the source of variation (Source), degrees of freedom (DF), sum of squares (SS), mean square (MS), F value, and P value. ANOVA was used to test the null hypothesis (H0) that there was no effect and the alternative hypothesis (H1) that there is an effect for each factor and interaction [27]. If a 95% confidence interval is used, H1 is accepted if the P value is ≤ 0.05, but H0 cannot be rejected if the P value is > 0.05.

Table 4 shows the GA parameters in terms of the main effects and first level interactions. PG, COP, and several of the other factors and interactions were statistically significant with a 95% confidence interval. The random seed number (seeds) did not statistically affect the GA performance. However, it is best not to discard parameters having a P value more than 0.05 but less than 0.2 in a screening experiment [26]. Moreover, the factor with the highest F value was the most influential in this experiment, followed by the COP factor.

4.2. Multiple Comparison Analysis

The experimental design considered three different levels for each factor. The alternative hypothesis (H1) obtained from the ANOVA only identifies that at least one level of a factor has a statistically different mean; it is not known whether the other levels are significant [27]. Thus, in some cases it is not possible to select the appropriate parameter settings from the ANOVA because it is not known which pairs of results are significantly different. After the screening experiment, appropriate parameter settings for the GA were determined by using the lowest mean obtained from the main effect and interaction plots. These are shown in Figures 4 and 5 for the statistically significant GA factors. Figure 4 indicates the best settings for the significant main effects, including COP = PB. Figure 5 shows the best level combinations for the significant interactions.

Tukey’s method [27] is a statistical analysis tool that may be used for multiple or pairwise comparisons. The hypothesis testing used by Tukey’s method can be defined as follows: H0 cannot be rejected if the means of a pair are equal, that is, if the mean difference between the pair is zero (P value > 0.05); otherwise, H1 is accepted for the pair (P value ≤ 0.05) [27]. Many statisticians prefer this approach because the overall error rate is controlled [27]. Tukey’s method was therefore applied to detect significant differences in pairs of means in terms of the main and interaction effects. The results obtained with Tukey’s comparison are shown in Tables 5 and 6, each of which lists the significant factors, the pairs of factor levels considered, the mean difference between each pair (Mean dif.), the T value, and the P value.

The comparative results obtained using Tukey’s method for the significant main effects, shown in Table 5, indicated that the mean difference in penalty between two of the PG levels was not statistically significant (H0 cannot be rejected), whilst the means of the other pairs were different (H1 accepted). Moreover, the means obtained for each pair of levels of the COP factor were statistically different. As there were many pairs of interactions, only the pairs that had the lowest mean in the interaction plots shown in Figure 5 were selected for pairwise comparisons. The analysis of the results obtained with Tukey’s method for the selected significant interactions is shown in Table 6. The mean obtained with pm at 0.1 was not statistically different across the levels of the PG factor, which indicates that all three levels of the PG factor are appropriate for use with the PB crossover. The pc settings were practicable at all levels for the preferred PG settings, and the means obtained for all of the levels of the mutation rate were usable when pm was set at 0.1. Therefore, the appropriate parameter settings for the GA factors PG, COP, pm, pc, and the mutation rate were established as the preferred PG levels, PB, 0.1, 0.6–0.9, and 0.1–0.3, respectively.

4.3. Verifying Appropriate Parameter Settings

The significant factors, interactions, and important differences in the means were investigated by the screening experiment and pairwise comparisons. Montgomery [27] suggested that the region of the significant factors leading to the best possible response should be explored by conducting a second optimisation experiment after the screening experiment. The COP factor was a discrete GA parameter, and the previous experiment identified PB as its best setting. The candidate PG settings showed little difference in the pairwise analysis (see Table 5). The exception was the pm factor; therefore, the region around pm = 0.1 was verified using an experimental design before carrying out a comparative study.

Five levels of pm were therefore considered: 0.02, 0.06, 0.1, 0.14, and 0.18. The appropriate settings identified by the previous experiments were used for the other parameters. The first problem instance from the ITC2007 was selected and each configuration was repeated ten times using different random seed numbers. The results obtained from the computational runs for the best-so-far solution were statistically analysed in terms of the minimum (Min), maximum (Max), and average (Avg) penalty values as well as the standard deviation (SD) and the execution time in hours (Time).

The experimental results are shown in Table 7. The pm setting of 0.1 produced the best performance with the lowest average, minimum, and maximum values. The higher values of pm required more computational time than the lower parameter settings. This analysis verified that the best setting of pm for the GA is 0.1.

4.4. Performances of GA with/without the Modified Regeneration Mutation

The objective of this experiment was to explore and compare the performance of the GA with and without the modified regeneration mutation operator (MRMO) in terms of the speed of convergence and the quality of the solutions. The appropriate parameter settings for the GA with the MRMO for PG, COP, pm, pc, and the mutation rate were found to be the previously identified PG level, PB, 0.1, 0.75, and 0.2, respectively. The benchmark problems adopted from the third track of the ITC2007 (14 instances) [38] were used to test and compare the ability of the proposed algorithms to find the course timetable with the lowest penalty. The computational run for each instance was repeated ten times using different random seeds. The computational results were analysed in terms of Avg, SD, time (in hours), and the percentage improvement achieved by the GA with the MRMO (%Imp). A t-test was used to compare the means.

Table 8 shows that the performance differences between the GA with and without the MRMO were statistically significant with a 95% confidence interval using t-test analysis (P value ≤ 0.05) for all of the problems. This means that the GA with the MRMO outperformed the GA without the MRMO for all instances, with lower Avg and SD values in each case. Moreover, the %Imp values were distributed between 35.55% and 87.56%, although the execution times were longer. The average improvement was 51.88%. A comparison of the convergence speed of the proposed methods towards the best-so-far solution is shown in Figures 6 and 7, using problems number 7 and number 14 from the ITC2007 datasets.

The GA with the MRMO converged more quickly than the GA without the MRMO for problems number 7 and number 14 (see Figures 6 and 7). It can therefore be concluded that the new regeneration mutation based upon roulette wheel selection improved the GA's performance in terms of both solution quality and speed.
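The roulette wheel selection underlying the new regeneration mutation can be sketched as below. This is an illustrative minimisation-oriented version, not the paper's implementation: the inverse-penalty fitness transform and the toy timetable names are assumptions.

```python
import random

def roulette_select(population, penalties, rng):
    """Roulette-wheel selection for a minimisation problem:
    a lower penalty gives a larger slice of the wheel."""
    # Assumed transform: invert penalties so better solutions get more fitness.
    fitnesses = [1.0 / (1.0 + p) for p in penalties]
    pick = rng.uniform(0, sum(fitnesses))
    cumulative = 0.0
    for individual, f in zip(population, fitnesses):
        cumulative += f
        if pick <= cumulative:
            return individual
    return population[-1]  # guard against floating-point rounding

rng = random.Random(42)
pop = ["timetable_a", "timetable_b", "timetable_c"]  # hypothetical solutions
pens = [10, 200, 500]
picks = [roulette_select(pop, pens, rng) for _ in range(1000)]
print(picks.count("timetable_a"))  # lowest-penalty solution dominates
```

Selection remains stochastic, so poorer solutions are still chosen occasionally, which preserves diversity while biasing the mutation towards promising material.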

4.5. Analysing the Performance of the Evolutionary Algorithm Hybridisations

The objective of this experiment was to explore the performance of (i) the MRMO with and without local search (the combination constituting the modified memetic algorithm (MMA)); (ii) the addition of the elitist strategy (MRMO+ES and MMA+ES); and (iii) the use of the clonal selection algorithm (MRMO+CSA and MMA+CSA), in terms of both convergence speed and solution quality. The appropriate parameter settings for the MRMO and the MMA were adopted from the previous experiments. The benchmark problems adopted from the third track of the ITC2007 [38] were again used to test and compare the performance of the proposed algorithms in finding the course timetable with the lowest penalty. The computational run for each instance was repeated ten times using different random seeds. The computational results obtained were analysed statistically in terms of Avg, SD, and time (in hours), as shown in Table 9. The percentage improvement (%Imp) achieved by the MRMO with and without the hybrid heuristics was calculated, whilst the t values obtained from the t-test and the corresponding p values are shown in Table 10.

Table 10 shows that almost all of the comparisons between the results obtained from the MRMO and the other hybridisation approaches were statistically significant at the 95% confidence level (p value ≤ 0.05). For all of the problems, the results obtained from the MMA+ES, the MRMO+CSA, and the MMA+CSA were statistically significant at the 95% confidence level. Moreover, the MMA+CSA achieved the highest t value, %Imp, and Avg %Imp, which indicates that it was the best configuration. However, the sign of the t value in Table 10 indicates that some hybrid approaches did not outperform the MRMO on some problems.

According to Tables 9 and 10, the MMA+CSA outperformed the other methods for all instances, achieving the maximum Avg %Imp of 49.85% and the minimum Avg values, although it also had the longest execution time. Although the Avg %Imp of the MRMO+CSA and the MMA+ES were nearly equal at around 42%, the MRMO+CSA required less computational time than both the MMA+CSA and the MMA+ES; it was up to 6.3 times quicker for some instances. Moreover, the Avg %Imp obtained by the MRMO using CSA was better than that using LS (the MMA) or ES by approximately 23–26%. The MRMO's execution times using CSA were also up to 5.7 times faster than those using LS, but up to 3.2 times slower than those using ES for some instances. Although the Avg %Imp obtained from the MRMO using ES or LS alone was less than that of the proposed hybrid methods, the MRMO+ES and the MMA still performed better than the MRMO without hybridisation (see Table 10); the average improvement for almost all problems was around 16–19%. A comparison of the average convergence speeds of the proposed hybrid methods towards the best-so-far solution is shown in Figures 8 and 9, based upon problems number 7 and number 14 from the ITC2007, which represent medium and large problem sizes.

The MMA+CSA converged more quickly than the other algorithms for problems 7 and 14, with the next best convergence achieved by the MMA+ES (Figures 8 and 9). The MRMO+CSA performed poorly in the early generations; however, the average of the best-so-far solutions found in the last generation was close to that obtained by the MMA+CSA. Moreover, the methods that hybridised LS with the MRMO (the MMA+CSA, the MMA+ES, and the MMA) found better averages of the best-so-far solutions in the early generations than the methods without LS. It can therefore be concluded that the LS, ES, and CSA strategies improved the MRMO's performance in terms of both solution quality and speed.
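The local search that distinguishes the memetic variants from the plain MRMO can be sketched as a simple first-improvement hill climber. This is a generic illustration under assumptions: the paper's LS operates on timetable assignments, whereas the toy penalty and neighbour functions below are invented so the sketch runs end to end.

```python
import random

def hill_climb(solution, penalty_fn, neighbour_fn, max_iters, rng):
    """First-improvement hill climbing: accept a neighbour only if it
    strictly lowers the penalty (illustrative; the paper's LS may differ)."""
    best, best_pen = solution, penalty_fn(solution)
    for _ in range(max_iters):
        cand = neighbour_fn(best, rng)
        cand_pen = penalty_fn(cand)
        if cand_pen < best_pen:
            best, best_pen = cand, cand_pen
    return best, best_pen

# Toy stand-in problem: minimise the sum of a list of integers.
def penalty(sol):
    return sum(sol)

def neighbour(sol, rng):
    """Replace one randomly chosen entry with a fresh random value."""
    out = list(sol)
    out[rng.randrange(len(out))] = rng.randint(0, 9)
    return out

rng = random.Random(1)
start = [9] * 8
best, pen = hill_climb(start, penalty, neighbour, 500, rng)
print(pen)
```

In a memetic algorithm this refinement step is applied to offspring after crossover and mutation, which explains why the LS-based hybrids in the paper found better solutions in the early generations at the cost of extra evaluation time.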

5. Conclusions

The evolutionary algorithms based timetabling (EAT) tool was developed to use genetic algorithms (GA) and memetic algorithms (MA) to solve university course timetabling problems. The work made a number of significant research contributions. A common problem with genetic algorithms is that many chromosomes within a population may represent infeasible solutions. This work developed new one-point, two-point, and position-based crossover operators and a modified regeneration mutation operator that guaranteed that all of the chromosomes generated represented feasible solutions. Likewise, the chromosome initialisation process was designed to produce feasible chromosomes. The research also developed novel hybrids that combined genetic algorithms, local search, and a clonal selection algorithm with roulette wheel and elitist selection. The tool was tested using 14 datasets obtained from the third track of the ITC2007 [38], which have been widely used by previous researchers.

The experimental work adopted a sequential experimental design. The screening experiment used a one-third fractional factorial design [27] with factors at three levels each. The factors PG, COP, , , , , and were statistically significant at the 95% confidence level. Main effect plot analysis found the best settings to be , COP = PB, and . The best combinations for the interactions were and ; with COP = PB; with ; and with .

A further analysis using pairwise comparisons found the appropriate parameter settings for PG, COP, , , and to be or , PB, 0.1, 0.6–0.9, and 0.1–0.3, respectively. A further experiment verified that the best setting was 0.1, as it produced the lowest average, minimum, and maximum penalty values.

The comparative results indicated that the MRMO outperformed the GA for all problems, with an average improvement of 51.88%, and the MRMO also converged more quickly than the GA. In terms of the hybrid comparisons, the MMA+CSA outperformed all the other methods, with an average improvement of 49.85% compared to the MRMO; the second best hybrid was the MRMO+CSA. The MMA+CSA also converged more quickly, and its best-so-far solutions were better than those of all the other hybrid methods across all generations. Although the MRMO+CSA ranked second in terms of average %Imp, it required up to 6.3 times less computational time than the MMA+CSA. The ES, LS, and CSA strategies embedded within the EAT tool improved the EA's performance in terms of solution quality and convergence, but at the expense of longer execution times.

Thus, the development of novel hybrids has been shown to be an effective approach for solving a wide range of timetabling problems, and the proposed approaches have been shown to provide good solutions quickly.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgment

The authors would like to acknowledge Naresuan University for financial support for this publication.