Abstract

Assessing the components of a solution provides useful information for solving an optimization problem. This paper presents a new population-based problem-reduction evolutionary algorithm (PREA) based on the assessment of solution components. An individual solution is regarded as being constructed from basic elements, and the concept of acceptability is introduced to evaluate them. The PREA consists of a searching phase and an evaluation phase. The acceptability of basic elements is calculated in the evaluation phase and passed to the searching phase. In the searching phase, the original optimization problem is reduced, for each individual solution, to a new smaller-size problem. As the algorithm evolves, the number of common basic elements in the population increases until all individual solutions become identical, and this common solution is taken as the near-optimal solution of the optimization problem. The new algorithm is applied to a large variety of capacitated vehicle routing problems (CVRPs) with up to nearly 500 customers. Experimental results show that the proposed algorithm has advantages over the comparative algorithms in both convergence speed and robustness of solution quality.

1. Introduction

Large-scale NP-hard problems, for which exact algorithms typically require exponential time, cannot in practice be solved to optimality. Instead, evolutionary algorithms are widely used to search for near-optimal solutions. Many different types of evolutionary algorithms have been proposed so far, for example, genetic algorithms (GAs) [1], particle swarm optimization (PSO) [2], the shuffled frog-leaping algorithm (SFLA) [3], memetic algorithms (MAs) [4], differential evolution (DE) [5], ant-colony optimization (ACO) [6], and extremal optimization (EO) [7]. Among these algorithms, ACO allocates and modifies the pheromone on each edge, and the ants search for new tours under the guidance of the pheromone; in extremal optimization, fitness is defined on the components of a feasible solution, and undesirable components are more liable to be eliminated. Both algorithms attempt to evaluate the components of a solution quantitatively, and this evaluation can be used to guide the optimization process. They have found wide application in many fields, and the idea of exploring the intrinsic properties of each component of a feasible solution forms the foundation of our work.

In this paper, a novel problem-reduction evolutionary algorithm (PREA) is proposed. A feasible solution of the problem is regarded as being composed of a series of basic elements, each with its own acceptability. The PREA consists of a searching phase and an evaluation phase. In the evaluation phase, the acceptability of the basic elements is calculated and passed to the searching phase. The searching phase attempts to find better solutions with a group of carefully designed encapsulation processes and optimizers. Similar to the idea in the search for backbones [8] that "if the backbones (i.e., the easy part of the problem) are held constant, the optimization process is able to concentrate on other parts, which are more difficult to solve," for each individual solution the basic elements with higher acceptability (i.e., the better components) are encapsulated as a whole. The resulting solution represents a reduced-size optimization problem. Specifically, an encapsulation probability is introduced, which helps to adjust the size of the search area of the subsequent optimization. The acceptability of individual solutions is also defined in the searching phase to guide the search direction, which provides a new way for an individual to learn from others. As the algorithm evolves, the original optimization problem is reduced to a series of smaller-size optimization problems, thus significantly improving the convergence speed.

In this paper, the problem-reduction evolutionary algorithm is applied to the capacitated vehicle routing problem (CVRP). The CVRP, proposed by Dantzig and Ramser [9], is an extension of the well-known NP-hard traveling salesman problem (TSP) and has been studied extensively in the literature. It is reported that the best exact algorithms can solve instances involving approximately 100 customers [10]. Researchers therefore mainly focus on heuristics which can find near-optimal solutions in acceptable time. Variable neighborhood methods and the simple iterated local search (ILS) are commonly used for the CVRP. Li et al. [11] proposed the VRTR algorithm, which combines the record-to-record (RTR) principle with a variable-length neighbor list. Chen et al. [12] presented an ILS algorithm together with a variable neighborhood descent based on multioperator optimization. Subramanian et al. [13] designed an ILS-based heuristic incorporating a set partitioning (SP) approach, in which sequences of SP models representing routes found by a metaheuristic are solved by a mixed integer programming (MIP) solver. To overcome the limitations of a single method, hybrid algorithms have been extensively proposed; for example, Luo et al. [14] presented an improved shuffled frog-leaping algorithm (SFLA) combined with power-law extremal optimization (τ-EO), and Nagata and Bräysy [15] put forward a memetic algorithm (MA) with edge assembly (EAX) crossover in the local search, in which an efficient modification algorithm addresses the constraint violations of infeasible solutions. Some techniques have also been put forward to reduce the computational complexity of an algorithm. Zachariadis and Kiranoudis [16] presented a penalized static move descriptors algorithm (PSMDA), in which a static move descriptor (SMD) data structure is constructed to reduce the computational cost of evaluating solution neighborhoods. Liu and Li [17] provided a fast feasibility evaluation of solution neighborhoods by introducing the concepts of "preload" and "postload." In this paper, we propose a new evolutionary algorithm in which the evolution of both the individual solutions and the reduced problems is considered. Experimental results show that it is computationally efficient, and new best solutions are found for 7 well-studied CVRP instances.

The rest of the paper is organized as follows. Section 2 briefly describes the capacitated vehicle routing problem together with the coding mechanism for feasible solutions. In Section 3, the problem-reduction evolutionary algorithm is presented in detail. Experimental results and analysis are reported in Section 4, followed by the conclusion in Section 5.

2. The Capacitated Vehicle Routing Problem

The capacitated vehicle routing problem (CVRP) aims to find minimum total cost routes for a fleet of vehicles to serve given customers with known locations and demands, subject to the constraint that the vehicle assigned to a route must carry no more than a fixed quantity of goods Q. For the capacity and distance VRP (CDVRP), the length of each route must additionally not exceed an upper bound D.

In this paper, variable length codes (VLC) are used. A feasible solution for the N-customer CVRP is expressed as an integer sequence

X = (x_1, x_2, …, x_L),  x_1 = x_L = 0,  (1)

where L is the coding length defined below. Equation (1) implies that the first and the last elements of X represent the central depot (denoted by 0), and each element in between is either the depot or a customer (denoted by its index 1, …, N), with every customer appearing exactly once. For example, X = (0, 1, 3, 6, 8, 0, 4, 5, 7, 2, 0) is a solution of an 8-customer CVRP, and there are two routes in the solution:

Route 1: 0, 1, 3, 6, 8, 0,
Route 2: 0, 4, 5, 7, 2, 0.

The coding length is L = N + m + 1, with m being the number of vehicle routes (L = 11 in the example above). It should be noted that the number of vehicle routes is not fixed in the proposed algorithm, which means that the coding length is variable. This implies that the CVRP with the proposed coding scheme is essentially a multiobjective optimization problem in which both the number of vehicles and the traveling cost need to be optimized simultaneously.
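To make the coding scheme concrete, the following minimal Python sketch (illustrative only; the names dist and demand denote an assumed distance matrix and demand list, not objects from the paper) decodes a VLC sequence into its routes and evaluates the cost and load of a single route:

def decode_routes(X):
    """Split a VLC sequence (depot = 0 at both ends) into its routes."""
    routes, current = [], []
    for node in X[1:]:              # skip the leading depot
        if node == 0:               # a depot visit closes the current route
            if current:
                routes.append(current)
            current = []
        else:
            current.append(node)
    return routes

def route_cost_and_load(route, dist, demand):
    """Travel distance and total demand of one route 0 -> ... -> 0."""
    cost, load, prev = 0.0, 0, 0
    for c in route:
        cost += dist[prev][c]
        load += demand[c]
        prev = c
    return cost + dist[prev][0], load   # close the tour at the depot

For the example above, decode_routes([0, 1, 3, 6, 8, 0, 4, 5, 7, 2, 0]) returns [[1, 3, 6, 8], [4, 5, 7, 2]], and each route's load can then be checked against the capacity Q.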

3. Problem-Reduction Evolutionary Algorithm

The PREA is a population-based evolutionary algorithm with two phases: the searching phase and the evaluation phase. During the searching phase, we start from randomly chosen initial feasible solutions, which are fed into individual optimizers to find "good" solutions. The optimizers may or may not be of the same type. In the evaluation phase, the components of the solutions are analyzed based on two factors: whether the solution is good (i.e., has high fitness) or not, and how frequently the component appears among the individual solutions. Naturally, components found in most of the "good" solutions can be regarded as "reliable" components. Next, the evaluation values are fed back to the searching phase to reformulate the problem through encapsulation, which will be described in detail later. As a consequence, the problem is reduced to a smaller-size one, and the individual solutions can be optimized by their respective optimizers with the help of the attained evaluation values of the components. Again, the results are sent to the evaluation phase. The two phases are carried out alternately until all individuals converge to the same solution, which is taken as the near-optimal solution. The novelty of PREA lies in the evaluation of the components extracted from the individual solutions and in the careful design of the encapsulation process, which reduces the optimization scale. The framework of PREA is depicted in Figure 1.

3.1. Basic Element and Its Acceptability

In the evaluation phase of PREA, solutions are decomposed into smaller components. In this paper, we confine our attention to combinatorial optimization problems whose feasible solutions can be expressed as integer sequences. To make this concrete, we have the following definitions.

Definition 1. Given an optimization problem P of size N and a feasible solution X = (x_1, x_2, …, x_L), the collection of adjoining pairs

B(X) = {(x_i, x_{i+1}) | i = 1, 2, …, L − 1}

is defined as the basic element set with respect to X. Define B(P) = ⋃_X B(X), the union taken over all feasible solutions of P, as the basic element set of problem P.

Definition 2. For an optimization problem with feasible solutions X_1, X_2, …, X_M, suppose that f(X_i) denotes the respective fitness (the larger, the better); then the normalized fitness for X_i is defined as

f̃(X_i) = (f(X_i) − f_w) / Σ_{j=1}^{M} (f(X_j) − f_w),  (2)

where f_w is the fitness of the worst solution in the population.
It can be seen from (2) that 0 ≤ f̃(X_i) ≤ 1 and that the sum Σ_{i=1}^{M} f̃(X_i) is equal to 1. Now, we present the idea of the acceptability of basic elements. As the name implies, acceptability is a quantitative measure of how widely a basic element is accepted within a group of individuals; it is defined as follows.

Definition 3. For a population-based optimization with M agents, let {X_1, X_2, …, X_M} be the feasible solution set with respective basic element sets B(X_1), …, B(X_M). The acceptability of a basic element e is defined as

a(e) = Σ_{i=1}^{M} f̃(X_i) · I(e ∈ B(X_i)),  (3)

where I(·) is the indicator function.
Obviously, 0 ≤ a(e) ≤ 1. If an arbitrary pair e is not a basic element of any of the solutions, then a(e) = 0; on the contrary, a(e) = 1 implies that the pair inevitably appears as a common component of all individual solutions.
Similarly, we can define the acceptability of a feasible solution X by accumulating those of its individual basic elements:

A(X) = Σ_{e ∈ B(X)} a(e).  (4)
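A minimal Python sketch of Definitions 1–3 may help fix ideas; it assumes fitness values with "the larger, the better," as in Definition 2, and treats basic elements as directed pairs (an assumption; for symmetric CVRPs they could equally be treated as undirected). All names are illustrative:

from collections import defaultdict

def basic_elements(X):
    """Basic element set B(X): the adjoining pairs of the sequence X."""
    return [(X[i], X[i + 1]) for i in range(len(X) - 1)]

def normalized_fitness(fitness):
    """Normalized fitness (2): shift by the worst value so the values
    are nonnegative and sum to 1."""
    worst = min(fitness)
    shifted = [f - worst for f in fitness]
    total = sum(shifted)
    if total == 0:                        # all individuals equally fit
        return [1.0 / len(fitness)] * len(fitness)
    return [s / total for s in shifted]

def acceptability(population, fitness):
    """Acceptability (3) of every basic element present in the population."""
    acc = defaultdict(float)
    for X, w in zip(population, normalized_fitness(fitness)):
        for e in set(basic_elements(X)):  # count each element once per solution
            acc[e] += w
    return acc

def solution_acceptability(X, acc):
    """Acceptability (4) of a feasible solution."""
    return sum(acc[e] for e in basic_elements(X))

An element contained in every solution accumulates the full weight Σ f̃(X_i) = 1, matching the remark after Definition 3.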

3.2. Encapsulation Process

The encapsulation process aims to reduce the size of an optimization problem with the use of the acceptability of basic elements. To be specific, for an individual solution, a basic element or a group of basic elements can be encapsulated to form a new entity, which correspondingly represents a new reduced-size problem. The critical point lies in the way the problem is reformulated. Here, we define the encapsulation probability for a basic element e of an individual solution X as

p(e) = p_m · (a(e) − a_λ) / (a_max − a_λ) if a(e) ≥ a_λ, and p(e) = 0 otherwise,  (5)

with the threshold a_λ = a_max − λ(a_max − a_min), where a_max and a_min, respectively, represent the maximum and minimum acceptability of the basic elements in B(X). p_m is a predefined encapsulation probability attained at the maximal acceptability a_max. λ ∈ [0, 1] is used to adjust the acceptability threshold for basic elements; a large λ implies an acceptability threshold closer to a_min, so that more basic elements become candidates. A basic element in B(X) is encapsulated with the probability p(e).

It can be seen from (5) that the parameter p_m basically reflects the chance of basic elements being encapsulated: with the increase of p_m, more basic elements will be encapsulated, and vice versa. Similarly, the bigger the value of λ, the greater the chance for basic elements to be encapsulated. At the beginning of the algorithm, there are few common basic elements among the individual solutions, which implies that there is much room for improvement; increasing the values of p_m and λ decreases the scale of the reduced problem, which focuses the search on the more promising areas. As the individual solutions improve, the common basic elements in the population increase and fewer parts of the individuals remain improvable; then, conversely, decreasing the values of p_m and λ increases the problem scale, which opens up wider areas of the search space.
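The sketch below realizes the linear-threshold reading of (5) given above, reusing basic_elements from the previous sketch; since the printed equation did not survive extraction, the exact functional form should be treated as an assumption, although it reproduces the stated behavior of p_m and λ:

import random

def encapsulation_probability(a_e, a_min, a_max, lam, p_m):
    """Probability of encapsulating an element with acceptability a_e:
    zero below the threshold a_lambda, rising linearly to p_m at a_max."""
    a_lam = a_max - lam * (a_max - a_min)   # larger lam -> threshold nearer a_min
    if a_e < a_lam:
        return 0.0
    if a_max == a_lam:                      # lam = 0 or all acceptabilities equal
        return p_m
    return p_m * (a_e - a_lam) / (a_max - a_lam)

def select_for_encapsulation(X, acc, lam, p_m):
    """Sample the basic elements of X to be encapsulated."""
    elems = basic_elements(X)
    vals = [acc[e] for e in elems]
    a_min, a_max = min(vals), max(vals)
    return {e for e in elems
            if random.random() < encapsulation_probability(acc[e], a_min, a_max, lam, p_m)}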

In the encapsulation process, the better basic elements are encapsulated as new nodes [18] which remain unchanged in the later optimization process. The new solution obtained by encapsulation is called the encapsulation solution; it is used as the starting point for the next-stage optimization. For example, let X be a feasible solution of the CVRP as shown in Figure 2(a), and assume that five of its basic elements are selected to be encapsulated. Figure 2(b) illustrates the resulting encapsulation solution with six new nodes. From Figure 2(b) we can see that each new node has a head and a tail, which differs from the original nodes, and the distance between two new nodes also needs to be redefined. For instance, for the reduced problem shown in Figure 2(b), the distance between the new nodes ③ and ⑤ may take a value derived from the distances between the four original head and tail node pairs. Obviously, the reformulation of the reduced configuration is problem specific; for vehicle routing problems, we simply enclose parts of a route as new nodes.
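As an illustration of this node-merging step, the following sketch (a simplified rendition, not the paper's exact data structure) collapses runs of selected basic elements into super-nodes that keep their head and tail customers:

def encapsulate(X, selected):
    """Merge runs of selected adjoining pairs of X into super-nodes.
    Each super-node is a tuple (head, ..., tail); unmerged nodes stay as
    one-element tuples. Distances to a super-node are then measured from
    its head or tail, as discussed for Figure 2(b)."""
    nodes, run = [], [X[0]]
    for i in range(len(X) - 1):
        if (X[i], X[i + 1]) in selected:
            run.append(X[i + 1])        # extend the current super-node
        else:
            nodes.append(tuple(run))    # close the super-node
            run = [X[i + 1]]
    nodes.append(tuple(run))
    return nodes

def unclose(nodes):
    """Flatten an encapsulated sequence back to the original coding."""
    return [c for node in nodes for c in node]

For X = [0, 1, 3, 6, 8, 0] with selected = {(1, 3)}, encapsulate yields [(0,), (1, 3), (6,), (8,), (0,)], a sequence one node shorter, and unclose recovers the original coding.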

By encapsulating, the original optimization problem is reduced to a smaller-size optimization problem, thus greatly speeding up the process of searching for the “better” solutions. It is worth noting that the encapsulation varies from individual to individual and also from iteration to iteration, which can be explained as exploring diverse regions at different levels under the guidance of acceptability of basic elements.

In the optimization process for the reduced problem, it is obvious that a simple neighborhood operation in the reduced problem domain usually implies a complicated local search strategy in the original problem [18]. This is illustrated in Figure 2(d), which depicts a 1-1-interchange neighborhood for an encapsulation solution. Actually, this is the 4-opt neighborhood for the original solution shown in Figure 2(c).

3.3. Problem-Reduction Evolutionary Algorithm

In the framework of PREA shown in Figure 1, there are many possible choices of optimizers. For simplicity, we assume that all optimizers are of the same type and use the well-known, simply structured iterated local search (ILS) [19]. The main idea of ILS is to start from a randomly generated solution, to repeatedly apply local search combined with mutation, and to keep the better solutions until the stopping criterion is met. To take advantage of the solution acceptability and to suit the CVRP at hand, not only the better solutions but also worse solutions with larger acceptability are retained during the ILS. The acceptance criterion is described as follows:

Accept(X_c, X_b) = 1 if c(X_c) < c(X_b) or (A(X_c) − A(X_b))/A(X_b) > (c(X_c) − c(X_b))/c(X_b), and Accept(X_c, X_b) = 0 otherwise,  (6)

where c(·) denotes the routing cost (to be minimized) and A(·) the solution acceptability in (4). In other words, Accept(X_c, X_b) = 1 means that the candidate X_c replaces the current solution X_b either when its cost is lower or when the growth rate of its acceptability is bigger than the reduction rate of its fitness, which indicates that X_c is a more promising candidate. This is reasonable in that an individual with higher acceptability (i.e., with many better basic elements) is more likely to lead to a good solution. Meanwhile, the new acceptance criterion helps guide individuals towards the common ground of all individuals, thus speeding up the convergence of the algorithm and providing a completely new way for an individual to learn from others. The pseudocode of the improved ILS (ILS*) is presented in Algorithm 1.

function ILS*(X)
  X_b ← X;  X* ← X;  c* ← c(X)
  do
    X_c ← Perturb(X_b)
    X_c ← LocalSearch(X_c)
    if (Accept(X_c, X_b) = 1)
      X_b ← X_c
      if c(X_c) < c*
        X* ← X_c;  c* ← c(X_c)
      endif
    else
      keep X_b unchanged
    endif
  while (stop criterion not satisfied)
  return X*, c*
end function
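For concreteness, the acceptance test used in Algorithm 1 can be read in Python as follows; this is a sketch of the reconstructed rule (6), with f_new/f_cur as routing costs and A_new/A_cur as the solution acceptabilities of (4):

def accept(f_new, f_cur, A_new, A_cur):
    """Acceptance rule (6): take the candidate if it is cheaper, or if its
    relative gain in acceptability outweighs its relative loss in cost."""
    if f_new < f_cur:
        return True
    if A_cur <= 0 or f_cur <= 0:          # guard against degenerate values
        return False
    return (A_new - A_cur) / A_cur > (f_new - f_cur) / f_cur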

There are many choices of local search. In this paper, we use a fast local search method [17] which calculates the difference in objective value (i.e., Δc) between the current solution X and its neighbor X′. The complexity of calculating Δc is reduced by introducing the concepts of "preload" and "postload." The four operators used in the local search algorithm are insertion, 1-1-interchange, 2-opt, and 2-opt*.
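The "preload"/"postload" idea can be sketched as follows (a generic rendition under our own naming, not the exact data structures of [17]): with cumulative loads precomputed per route, the capacity feasibility of an inter-route move such as 2-opt* is checked in O(1):

def pre_post_loads(route, demand):
    """preload[i]: demand served from the depot through position i;
    postload[i]: demand served from position i through the route's end."""
    pre, total = [], 0
    for c in route:
        total += demand[c]
        pre.append(total)
    post = [total - p + demand[c] for p, c in zip(pre, route)]
    return pre, post

def two_opt_star_feasible(pre1, post1, pre2, post2, i, j, Q):
    """O(1) capacity check for swapping the tails of two routes
    after positions i and j."""
    tail2 = post2[j + 1] if j + 1 < len(post2) else 0
    tail1 = post1[i + 1] if i + 1 < len(post1) else 0
    return pre1[i] + tail2 <= Q and pre2[j] + tail1 <= Q

The arrays are rebuilt only for the routes actually modified by an accepted move, so neighborhood evaluation avoids rescanning whole routes.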

Given the definitions of the preceding section, the PREA can be stated as follows. At first, an initial population is created randomly. Then the processes of ILS, acceptability calculation, and encapsulation are carried out alternately until the termination rule is met. The pseudocode for PREA is shown in Algorithm 2.

Initialize population
for i ← 1 to M
  X_i ← a random feasible solution
  P_i ← X_i,  f_i ← c(X_i)
end for
repeat
  Cal_acceptability(P_1, …, P_M);
  for i ← 1 to M
    Y_i ← Encapsulate(P_i);
    (Y_i, f_i) ← ILS*(Y_i);
    P_i ← Unclose(Y_i);
  end for
until termination rule is met

In Algorithm 2, M denotes the population size; P_i and f_i, respectively, represent the best solution individual i has achieved and the corresponding fitness. Cal_acceptability(P_1, …, P_M) calculates the acceptability of the basic elements on the basis of the available best individual set {P_1, …, P_M}. Encapsulate(P_i) is the encapsulation process for the individual P_i, in which the selected basic elements of P_i are encapsulated to form a solution Y_i of a reduced-size problem. It is used as the starting point for the improved ILS algorithm, which aims to find a "better" local optimum of the encapsulated solution. Unclose(Y_i) transforms the resulting encapsulated solution back into the original representation.

The general idea underlying the PREA is that the basic elements with higher acceptability are encapsulated and kept constant in the subsequent search. With the evolution of the individual solutions and the corresponding reduced problems, the common basic elements of the individual solutions are preserved from iteration to iteration. The other basic elements are still passed to the next iterations and evolve in the successive search processes, which helps to maintain the diversity of the population. As the common basic elements accumulate, all individual solutions eventually become identical, and this common solution is taken as the near-optimal solution of the problem.
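Putting the pieces together, a skeleton of Algorithm 2 might look as follows in Python. Here random_solution, local_search, cost, and ils_star are problem-specific stubs (hypothetical names standing in for the routines of Sections 3.3 and the local search of [17]), while acceptability, select_for_encapsulation, encapsulate, and unclose are the helpers sketched in the previous subsections; the defaults mirror the parameter values used in Section 4:

def prea(instance, M=10, lam=0.5, p_m=0.9, max_iter=2000):
    """Skeleton of the PREA main loop (Algorithm 2); illustrative only."""
    pop = [local_search(random_solution(instance)) for _ in range(M)]
    best = [(X, cost(X)) for X in pop]                    # (P_i, f_i)
    for _ in range(max_iter):
        solutions = [X for X, _ in best]
        costs = [f for _, f in best]
        acc = acceptability(solutions, [-c for c in costs])  # cost -> fitness
        for i, (X, f) in enumerate(best):
            sel = select_for_encapsulation(X, acc, lam, p_m)
            Y = encapsulate(X, sel)          # reduced-size problem
            Y, f_new = ils_star(Y)           # improved ILS on the reduced problem
            best[i] = (unclose(Y), f_new)
        if len({tuple(X) for X, _ in best}) == 1:         # population converged
            break
    return min(best, key=lambda t: t[1])

Negating the costs turns the minimized objective into the "larger is better" fitness assumed by Definition 2, and the loop terminates exactly under the rule stated above: convergence of the whole population or the iteration cap.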

4. Experimental Results and Analysis

Experiments have been conducted to analyze the performance of the proposed PREA for the CVRP. They were carried out on the standard benchmarks of Christofides and Eilon [20] and Golden et al. [21]. The problem set of Christofides includes 14 instances C1–C14, and that of Golden et al. includes 20 instances G1–G20. It is noted that the customers in problems C1–C10 are randomly distributed, while in problems C11–C14 the customers are clustered. Problems C1–C5, C11-C12, and G9–G20 are CVRPs, and problems C6–C10, C13-C14, and G1–G8 are CDVRPs. Over all problem instances, the number of customers ranges from 50 to 483. The proposed PREA is implemented in C++ and runs on an Intel Core E7500 2.93 GHz PC.

The critical point in PREA is how to encapsulate parts of a solution to form a new smaller-size optimization problem. This is governed by the appropriate choice of the parameters λ and p_m, which are directly related to both the accuracy and the convergence of the PREA. To evaluate the influence of these two parameters, λ and p_m are set to different values in our experiments. The test instances are the 5 CVRPs of Christofides (C1–C5, with the number of customers ranging from 50 to 199). For each pair of λ and p_m, 50 independent runs are carried out on each instance. The results are shown in Figure 3.

In Figure 3, time represents the average running time of a single run, while gap (%) measures the difference (in percentage) between the average solutions (the average over 50 runs for each instance) and the given best known solutions (BKSs). It is clear that, with the decrease of λ and p_m, more accurate solutions can be achieved, yet at the cost of more running time. This coincides with the analysis that smaller λ and p_m lead to less reduction in the size of the reduced problem, which implies a larger search area and more time needed. To evaluate the influence of the population size, experiments are also conducted on the 5 CVRPs of Christofides (C1–C5) with different M. The parameters λ and p_m are set to 0.5 and 0.9, respectively, in these experiments. The results are shown in Figure 4.

Figure 4 shows that increasing the population size yields better results but requires more computational time. With more individuals in the population, poor basic elements have less chance of becoming common parts of the individuals, and the solutions improve accordingly. Even when M is set to 2, the PREA can find solutions close to the BKSs in a short time. When M increases to 10, the average results are very close to the BKSs, and the computation time is quite acceptable. By carefully adjusting the three parameters M, λ, and p_m, the PREA can achieve good solutions quickly. This provides a convenient way for users to trade off computational time against solution quality.

To investigate the convergence behavior of the PREA, and especially the mechanism of the encapsulation strategy, we illustrate the detailed evolution process of PREA on problem G10 (323 customers) in Figure 5, with the parameters chosen as λ = 0.5, p_m = 0.9, and M = 10. Figure 5(a) displays the evolution of the average scale of the reduced problems in the group. The size of the reduced problem decreases from the original 268 to the final 52, with a drastic decrease occurring from around the 180th iteration (we call this the acute phase). In no more than 20 iterations, the scale of the problems halves, whereas at the beginning the same relative decrease required more than 100 iterations (we call this the obtuse phase). It is worth noting that the gentle decrease in size during the obtuse phase is necessary, since it guarantees a sufficiently thorough and diverse search of each reduced problem to find "better" components to encapsulate. Naturally, the running time tracks the scale of the reduced problem and also decreases quickly in the acute phase. A careful examination of Figure 5(b) reveals that, even in the acute phase, the improvement in the objective value is significant and the PREA can still find better solutions efficiently. This is easily explained: during the later iterations, more and more common basic elements are picked up and encapsulated, so that the PREA can concentrate on smaller and more promising areas, which greatly improves the search efficiency.

The results for the 14 CVRP instances of Christofides and Eilon and the 20 CVRP instances of Golden et al. are presented in Tables 1 and 2, respectively. The parameters used in the experiments are the same as before. The PREA terminates when all solutions have converged to the same optimum or when the maximum of 2000 iterations is reached. For comparison, the results achieved by IVND [12] and by the recently proposed metaheuristic ILS-RVND-SP [13] are also listed. Similar to PREA, both use ILS as the basic optimizer. To ease the comparison, the performance indices considered are the same as those of the counterpart algorithms, and the results are obtained from 10 independent runs. In Tables 1 and 2, the previous best known solutions (total travel distance/number of routes) [13, 22] are listed in the second column; 3 performance indices are recorded in the following columns: the gap between the best solutions and the BKSs, the average deviation from the BKSs, and the average time in seconds for a single run. The last three rows give the overall average deviation, the average CPU time per problem in seconds, and the computational platform used in the different experiments.

The Christofides and Golden problem sets consist of instances representative of a variety of CVRPs, varying in scale, customer distribution, and the presence of an additional constraint on travel distance. Table 1 presents the results for the small-scale Christofides set, which vary from case to case, with an obvious decrease in computational time at the price of slightly degraded solution quality. For the Golden set, the results are more interesting and worth noting. The second to last row in Table 2 indicates that, for PREA with 10 agents, the performance in best/average gaps is very close to that of the recently proposed ILS-RVND-SP method, but the time needed is only about one-seventh of the latter. This implies that the proposed algorithm is more efficient for large-scale problems, in the sense that the PREA achieves comparable best/average gaps in a shorter time than the IVND and ILS-RVND-SP methods. More agents in PREA result in better performance at the cost of more time; with 20 agents, the PREA achieves lower best/average gaps than ILS-RVND-SP with less running time. To further show the effectiveness of the proposed PREA, we also present the 5 new best solutions obtained by PREA during all our experiments, together with the new best solutions found for two other instances, which improve upon the previous best known solutions [23]. These are displayed in Table 3 and Figures 6 and 7.

5. Conclusion

This paper introduces a new population-based evolutionary algorithm called the problem-reduction evolutionary algorithm (PREA). The PREA consists of two alternating phases: a searching phase and an evaluation phase. The acceptability of basic elements, obtained in the evaluation phase, is used to guide the encapsulation in the searching phase. Encapsulation is the key scheme for cutting down the scale of the optimization problem. As the algorithm evolves, the common basic elements increase until all individuals converge to the same near-optimal solution. The proposed algorithm is applied to two benchmark sets of capacitated vehicle routing problems (CVRPs) with up to several hundred customers. Experimental results show that the proposed algorithm compares favorably with recently proposed algorithms in both convergence speed and solution quality.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work is supported by the National Natural Science Foundation of China (Grants nos. 61171124, 61373158, and 61301298) and Shenzhen Key Project for Foundation Research (JC201105170613A).