Abstract
The continuous planar facility location problem with a connected region of feasible solutions bounded by arcs is a particular case of the constrained Weber problem. It is a continuous optimization problem with a nonconvex feasible set. This paper suggests appropriate modifications of four metaheuristic algorithms designed for solving this type of nonconvex optimization problem, and compares these algorithms to each other as well as to a heuristic algorithm. The artificial bee colony algorithm, the firefly algorithm, and their recently proposed improved versions for constrained optimization are appropriately modified and applied to the case study. A heuristic algorithm based on a modified Weiszfeld procedure is also implemented for the purpose of comparison with the metaheuristic approaches. The obtained numerical results show that the metaheuristic algorithms can successfully solve instances of this problem with up to 500 constraints. Among the four algorithms, the improved version of the artificial bee colony algorithm is the most efficient with respect to solution quality, robustness, and computational efficiency.
1. Introduction
The Weber problem is one of the most studied problems in location theory [1–3]. This optimization problem searches for an optimal facility location $x$ in the plane which satisfies

$$\min_{x}\; f(x) = \sum_{i=1}^{n} w_i\, d(x, a_i). \quad (1)$$

In (1), it is assumed that $a_i$, $i = 1, \ldots, n$, are known demand points, $w_i > 0$ are weight coefficients, and $d(\cdot, \cdot)$ is a norm-induced distance function.
The basic Weber problem is stated with the Euclidean norm underlying the definition of the distance function. Also, many other types of distances have been used in the facility location problems [3–5]. In general, a lot of extensions and modifications of the Weber location problem are known. Detailed reviews of these problems can be found in [3, 6].
The most popular method for solving the Weber problem with Euclidean distances is given by a one-point iterative procedure which was first proposed by Weiszfeld [7]. Later, Vardi and Zhang developed a different extension of Weiszfeld's algorithm [8], while Szegedy partially extended Weiszfeld's algorithm to a more general problem [9]. In particular, some variants of the continuous Weber problem represent nonconvex optimization problems which are hard to solve exactly [10]. A nonconvex optimization problem may have multiple feasible regions and multiple locally optimal points within each region [11]. Consequently, finding the global solution of a nonconvex optimization problem is very difficult.
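The one-point iterative procedure mentioned above can be sketched compactly. The following Python sketch (an illustration of the classical fixed-point update for the unconstrained Euclidean Weber problem, not the exact Algorithm 2.1 of [44]) uses illustrative function and parameter names:

```python
import math

def weiszfeld(points, weights, tol=1e-9, max_iter=1000):
    """Classical Weiszfeld iteration (sketch) for the unconstrained
    Weber problem with Euclidean distances."""
    # start from the weighted centroid of the demand points
    total_w = sum(weights)
    x = sum(w * px for (px, py), w in zip(points, weights)) / total_w
    y = sum(w * py for (px, py), w in zip(points, weights)) / total_w
    for _ in range(max_iter):
        num_x = num_y = den = 0.0
        for (px, py), w in zip(points, weights):
            d = math.hypot(x - px, y - py)
            if d < 1e-12:          # iterate landed on a demand point
                return (px, py)
            num_x += w * px / d
            num_y += w * py / d
            den += w / d
        nx, ny = num_x / den, num_y / den
        if math.hypot(nx - x, ny - y) < tol:
            return (nx, ny)
        x, y = nx, ny
    return (x, y)
```

For symmetric configurations the iteration converges to the geometric median; convergence can stall when an iterate coincides with a demand point, which is why that case is returned explicitly.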
Heuristics and metaheuristics represent the main types of stochastic methods [12]. Both can be used to speed up the process of finding a high-quality solution in cases where finding an optimal solution is very hard. The distinction between heuristic and metaheuristic methods is subtle [12]. Heuristics are algorithms developed to solve a specific problem, without the possibility of generalization or application to other similar problems [13]. A metaheuristic, on the other hand, is a higher-level strategy that guides the design of underlying heuristics. In this way, either kind of method can be used to design a specific procedure for computing an approximate solution of an optimization problem.
In the last several decades, there has been a trend in the scientific community to solve complex optimization problems by using metaheuristic optimization algorithms. Some applications of metaheuristic algorithms include neural networks, data mining, industrial, mechanical, electrical, and software engineering, as well as certain problems from location theory [14–21]. The most interesting and most widely used metaheuristic algorithms are swarm-intelligence algorithms, which are based on the collective intelligence of colonies of ants, termites, bees, flocks of birds, and so forth [22]. The reason for their success lies in the fact that they share information among multiple agents, so that self-organization, coevolution, and learning during cycles may help in creating high-quality results. Although not all swarm-intelligence algorithms are successful, a few techniques have proved to be very efficient and thus have become prominent tools for solving real-world problems [23]. Some of the most efficient and most widely studied examples are ant colony optimization (ACO) [24–26], particle swarm optimization (PSO) [15, 27–29], artificial bee colony (ABC) [19, 30–35], and the recently proposed firefly algorithm (FA) [18, 36–38] and cuckoo search (CS) [17, 39–41].
Different heuristic methods have been proposed that provide encouraging results for the challenging continuous Weber problem with regard to solution quality and computational effort [42–46]. Also, some variants of the Weber problem have been successfully solved by different metaheuristic approaches [47–52]. In [52], the authors studied a capacitated multi-source Weber problem as an extended facility location problem that involves both facility locations and service allocations simultaneously. The method proposed in [52] is based on the integration of two genetic algorithms. The problem of locating one new facility with respect to a given set of existing facilities in the plane and in the presence of convex polyhedral barriers was considered in [47]. The general strategy in [47] arises from the iterative application of a genetic algorithm for the selection of subproblems. A hybrid particle swarm optimization approach was applied to the uncapacitated continuous location-allocation problem in [48]. In [49], the authors compared the performance of four metaheuristic algorithms, modified to solve the single-facility location problem with barriers. The method for solving a kind of Weber problem from [50] was developed using an evolutionary algorithm enhanced with variable neighborhood search.
The aim of this paper is to investigate the performance of some prominent swarm-intelligence metaheuristic approaches for solving the constrained Weber problem with the feasible region bounded by arcs. This variant of the Weber problem has a nonconvex feasible set, which makes it much harder to find the global optimum using deterministic algorithms. Hence, metaheuristic optimization algorithms can be employed in order to provide promising results.
In this paper, four swarm-intelligence techniques are applied to solve this version of the constrained Weber problem: the artificial bee colony for constrained optimization [53], the crossover-based artificial bee colony (CBABC) algorithm [54], the firefly algorithm for constrained optimization [37], and the enhanced firefly algorithm (EFA) [55]. The CBABC and the EFA are two of the most recently proposed improved variants of the ABC and FA for solving constrained problems, respectively. Also, a heuristic algorithm proposed in [44] for solving this version of the constrained Weber problem is implemented for the purpose of comparison with the metaheuristic approaches. These five techniques are tested on randomly generated instances of the constrained Weber problem with the feasible region bounded by arcs, with up to 500 constraints.
The rest of the paper is organized as follows. A formulation of the constrained Weber problem with feasible region bounded by arcs and the heuristic approach developed to solve this variant of the constrained Weber problem are presented in Section 2. Section 3 presents the four metaheuristic optimization techniques used to solve this variant of the Weber problem. Description of the generated benchmark functions and comparative results of the four implemented metaheuristic techniques are given in Section 4. Concluding remarks are provided in Section 5.
2. The Heuristic Method for Solving a Constrained Weber Problem
The constrained Weber problem with feasible region bounded by arcs in the continuous space was introduced in [44]. In order to complete our presentation, we briefly restate the method. The problem can be formulated by the goal function defined in (1) and by the feasible region which is defined on the basis of constraints of two opposite types:

$$d(x, a_i) \le r_i,\ i \in I^{+}, \qquad d(x, a_i) \ge r_i,\ i \in I^{-}, \quad (2)$$

where $r_i > 0$ are given radii, $n$ is the total number of demand points, and $I^{+}$ and $I^{-}$ are subsets of the set of demand point indices satisfying $I^{+} \subseteq \{1, \ldots, n\}$, $I^{-} \subseteq \{1, \ldots, n\}$, and $I^{+} \cap I^{-} = \emptyset$. For the sake of simplicity, the optimization problem given by (1) with constraints (2) is denoted as the CWP problem.
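The two opposite constraint types can be illustrated with a small feasibility check. This Python sketch assumes the reading of (2) with per-point radii $r_i$; all names are illustrative:

```python
import math

def is_feasible(x, points, radii, close_idx, far_idx):
    """Check the two opposite constraint types of the CWP (sketch):
    the facility must lie within radius r_i of every point indexed by
    close_idx, and at least r_i away from every point in far_idx."""
    for i in close_idx:
        (px, py), r = points[i], radii[i]
        if math.hypot(x[0] - px, x[1] - py) > r:
            return False          # too far from a "must be close" point
    for i in far_idx:
        (px, py), r = points[i], radii[i]
        if math.hypot(x[0] - px, x[1] - py) < r:
            return False          # too close to a "must be far" point
    return True
```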
Such a problem may occur if some demand points coincide with locations of some important facilities and the searched optimal location must be close to them. Other demand points may coincide with dangerous facilities and the facility must be located far from them.
The metric used in practically important location problems depends on various factors, including properties of the transportation means [44]. In the case of public transportation systems, the price usually depends on the distance. However, some minimum price is usually defined. For example, the initial fare of a taxi cab may include some distance, usually 1–5 km. Having rescaled the distances so that the distance included in the initial price is equal to 1, we can define the price function as

$$c(x, a) = \max\{1, \|x - a\|\}, \quad (3)$$

where $\|\cdot\|$ is a norm.
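Assuming the rescaled price function takes the form $\max\{1, \|x - a\|\}$ as reconstructed above, it can be coded directly; the function name is illustrative:

```python
import math

def price(x, a):
    """Rescaled price (sketch): the initial fare covers distances up
    to 1, so any shorter trip still costs the minimum price of 1."""
    return max(1.0, math.hypot(x[0] - a[0], x[1] - a[1]))
```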
In the case of distance function defined by (3), the problem can be decomposed into series of constrained location problems with the Euclidean metric where the area of the feasible solutions is bounded by arcs. Each of the problems has the feasible region equal to the same intersection of the discs with centers in the demand points. For more details, see [44, 56].
The Weiszfeld procedure for solving the Weber problem with a given tolerance , based on the results from [57], is presented as Algorithm 2.1 in [44].
An algorithm based on the Weiszfeld procedure for solving the CWP defined by objective (1) and constraints (2) was proposed in [44]. The feasible set of our constrained optimization problems is generally nonconvex, while the objective function given by (1) is convex [58]. A solution of a constrained optimization problem with a convex objective function coincides with the solution of the unconstrained problem or lies on the border of the forbidden region [59]. Thus, if $x^{*}$ is a solution of the constrained problem given by (1) with constraints (2), then it is either the solution of the unconstrained problem (1) or a point on the boundary of the feasible region.
Step 2.2 of Algorithm 2.1 from [44] can lead to generating a new point outside the feasible region determined by constraints (2). Let us denote this region by $F$. It is assumed that $F \neq \emptyset$.
For an arbitrary point $x \notin F$, let us denote the closest point in $F$ by $P(x)$. It can be computed using

$$P(x) = \arg\min_{y \in F} \|x - y\|. \quad (4)$$
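Computing the exact closest feasible point for a region bounded by arcs is the technical core of [44]. As a rough illustration only, one can repeatedly project onto the boundary of each violated disc; this Python sketch is a heuristic approximation under that assumption, not the exact rule (4), and all names are illustrative:

```python
import math

def project_to_feasible(x, points, radii, close_idx, far_idx, max_iter=100):
    """Heuristic sketch: repeatedly project the point onto the boundary
    of every violated disc.  For the nonconvex CWP region this is only
    an approximation of the exact closest point."""
    x = list(x)
    for _ in range(max_iter):
        moved = False
        for i in close_idx:
            (px, py), r = points[i], radii[i]
            d = math.hypot(x[0] - px, x[1] - py)
            if d > r:                      # outside a "must be close" disc
                x[0] = px + (x[0] - px) * r / d
                x[1] = py + (x[1] - py) * r / d
                moved = True
        for i in far_idx:
            (px, py), r = points[i], radii[i]
            d = math.hypot(x[0] - px, x[1] - py)
            if 1e-12 < d < r:              # inside a "must be far" disc
                x[0] = px + (x[0] - px) * r / d
                x[1] = py + (x[1] - py) * r / d
                moved = True
        if not moved:
            break
    return tuple(x)
```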
Algorithm 1 was proposed as Algorithm 2.2 in [44], and it is based on the substitution of the point generated in Step 2.2 of Algorithm 2.1 from [44] with its closest point in the feasible region.

3. Review of the Metaheuristic Optimization Techniques
The four metaheuristics used to solve the constrained Weber problem with the feasible region bounded by arcs are described in the following subsections.
3.1. Artificial Bee Colony Algorithm for Solving the CWP
A numerical variant of the ABC algorithm for constrained optimization problems (COPs) proposed in [60] is applied to solve the CWP. In the ABC the population is iteratively refined through employed, onlooker, and scout bee phases.
The update process used in the employed and onlooker bee phases is the same and is determined by

$$v_{ij} = \begin{cases} x_{ij} + \phi_{ij}\,(x_{ij} - x_{kj}), & \text{if } R_j < MR,\\ x_{ij}, & \text{otherwise}, \end{cases} \quad (5)$$

where $R_j$ is a uniform random number in the range $[0, 1)$, $x_k$ represents another solution selected randomly from the population, $MR$ is the modification rate control parameter, $\phi_{ij}$ is a randomly chosen real number in the range $[-1, 1]$, and $j \in \{1, \ldots, D\}$. The update process is completed when the selection between $x_i$ and $v_i$ is carried out.
The ABC uses Deb’s rules in order to decide which solution will be kept for the next iteration. This constraint handling method consists of a set of three feasibility rules introduced by Deb [61]. They are the following: (1) any feasible solution is preferred to any infeasible solution, (2) between two feasible solutions, the one having a better fitness value is preferred, and (3) if both solutions are infeasible, the one with the lowest sum of constraint violations is preferred.
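Deb's three rules translate directly into a pairwise comparison. A minimal Python sketch, assuming inequality constraints written as $g_i(x) \le 0$ (function and parameter names are illustrative):

```python
def violation(sol, constraints):
    """Sum of constraint violations for g_i(x) <= 0 constraints."""
    return sum(max(0.0, g(sol)) for g in constraints)

def deb_better(a, b, f, constraints):
    """Deb's feasibility rules (sketch): True if solution a is preferred
    over solution b."""
    va, vb = violation(a, constraints), violation(b, constraints)
    if va == 0 and vb == 0:      # rule 2: both feasible, better fitness wins
        return f(a) < f(b)
    if va == 0:                  # rule 1: feasible beats infeasible
        return True
    if vb == 0:
        return False
    return va < vb               # rule 3: smaller total violation wins
```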
In the employed bee phase, every solution undergoes the update process. On the other hand, in the onlooker bee phase only the solutions selected probabilistically, proportionally to their fitness values, have the chance to be updated [60].
In the scout phase, solutions that do not improve over a certain number of trials are replaced by new randomly generated solutions. The control parameters $limit$ and $SPP$ are used in this phase: the parameter $limit$ is used to signify an exhausted food source, while the parameter $SPP$ denotes a predetermined period of cycles for producing scout bees.
The pseudocode of the ABC is given as Algorithm 2.
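Since Algorithm 2 is given only by reference, the overall loop can be summarized in a compact sketch. The following Python illustration keeps the three bee phases but replaces Deb's rules with a plain greedy comparison on an unconstrained, nonnegative test objective; all parameter names and values are illustrative assumptions:

```python
import random

def abc_minimize(f, lb, ub, dim, sn=10, limit=20, cycles=200, mr=0.8):
    """Minimal ABC skeleton (sketch).  Assumes f(x) >= 0 so that the
    fitness weights 1/(1+f) used for onlooker selection stay positive."""
    pop = [[random.uniform(lb, ub) for _ in range(dim)] for _ in range(sn)]
    fit = [f(x) for x in pop]
    trials = [0] * sn

    def try_update(i):
        k = random.choice([j for j in range(sn) if j != i])
        v = list(pop[i])
        for j in range(dim):
            if random.random() < mr:            # modification rate, as in (5)
                phi = random.uniform(-1.0, 1.0)
                v[j] = min(ub, max(lb, pop[i][j] + phi * (pop[i][j] - pop[k][j])))
        fv = f(v)
        if fv < fit[i]:                          # greedy selection
            pop[i], fit[i], trials[i] = v, fv, 0
        else:
            trials[i] += 1

    for _ in range(cycles):
        for i in range(sn):                      # employed bee phase
            try_update(i)
        total = sum(1.0 / (1.0 + ft) for ft in fit)
        for _ in range(sn):                      # onlooker bee phase
            r, acc, i = random.random() * total, 0.0, 0
            for j in range(sn):
                acc += 1.0 / (1.0 + fit[j])
                if acc >= r:
                    i = j
                    break
            try_update(i)
        for i in range(sn):                      # scout phase
            if trials[i] > limit:
                pop[i] = [random.uniform(lb, ub) for _ in range(dim)]
                fit[i], trials[i] = f(pop[i]), 0
    best = min(range(sn), key=lambda i: fit[i])
    return pop[best], fit[best]
```

On a simple sphere function this loop converges quickly; for the CWP the greedy comparison would be replaced by Deb's rules described above.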

3.2. CrossoverBased Artificial Bee Colony Algorithm for Solving the CWP
A recently proposed improved variant of the ABC for COPs, called the crossover-based artificial bee colony, is also used to solve the constrained Weber problem [54]. The main modifications introduced in the CBABC are related to the search operators used in each bee phase, with the aim of improving the distribution of good information between solutions [54]. The differences between the CBABC and the ABC for COPs are as follows.
In the employed bee phase, the CBABC algorithm uses the modified search equation (5), in which the same random number is used for each parameter that will be changed. Also, the CBABC does not use a fixed value of the $MR$ control parameter: the value of $MR$ linearly increases from an initial value to a predefined maximum during the first part of the iterations, while the maximum value is used in the remaining iterations. The values of these parameters are given in Table 1.
In the onlooker bee phase, the CBABC introduces a new search equation with the aim of enabling better exploration of the neighborhood of a high-quality solution. This equation is given by

$$v_{ij} = \begin{cases} x_{ij} + \phi_{ij}\,(x_{r_1 j} - x_{r_2 j}), & \text{if } R_j < MR,\\ x_{ij}, & \text{otherwise}, \end{cases} \quad (6)$$

where $R_j$ is a uniform random number in the range $[0, 1)$, $x_{r_1}$ and $x_{r_2}$ represent two other solutions selected randomly from the population, $\phi_{ij}$ is a randomly chosen real number in the range $[-1, 1]$, and $j \in \{1, \ldots, D\}$.
In the scout bee phase, the CBABC uses a uniform crossover operator to generate new solutions in a promising region of the search space. Therefore, after every $SPP$th iteration, each solution which did not improve $limit$ number of times is replaced with a new solution $v$ created by

$$v_{j} = \begin{cases} y_{j}, & \text{if } R_j < 0.5,\\ x_{j}, & \text{otherwise}, \end{cases} \quad (7)$$

where $y_j$ is the $j$th element of the global best solution found so far, $R_j$ is a randomly chosen real number in the range $[0, 1)$, and $j \in \{1, \ldots, D\}$.
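Reading the scout replacement as a component-wise uniform crossover with the global best, it can be sketched as below; the crossover rate of 0.5 and all names are assumptions for illustration:

```python
import random

def scout_crossover(abandoned, global_best, rate=0.5):
    """Sketch of a uniform-crossover scout replacement: each component
    is inherited from the global best with probability `rate`,
    otherwise kept from the abandoned solution."""
    return [gb if random.random() < rate else xa
            for xa, gb in zip(abandoned, global_best)]
```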
3.3. Firefly Algorithm for Solving the CWP
In order to solve the CWP we have employed a numerical optimization version of the FA for COPs, introduced in [37]. In the FA, a colony of artificial fireflies searches for good solutions in every iteration.
The search operator represents the movement of a firefly $i$ toward another, more attractive (brighter) firefly $j$, and it is given by

$$x_{ik} = x_{ik} + \beta\,(x_{jk} - x_{ik}) + \alpha\, s_k\, (rand_k - 1/2), \quad k = 1, \ldots, D, \quad (8)$$

where the second term is due to the attraction and the third term is a randomization term.
In the second term of (8), the parameter $\beta$ is the attractiveness of fireflies, which is calculated according to the following monotonically decreasing function [62]:

$$\beta = \beta_0\, e^{-\gamma r_{ij}^{2}}, \quad (9)$$

where $r_{ij}$ denotes the distance between firefly $i$ and firefly $j$, while $\beta_0$ and $\gamma$ are predetermined algorithm parameters: the maximum attractiveness value and the absorption coefficient, respectively. The distance between fireflies is calculated as the Euclidean distance.
In the third term of (8), $\alpha$ is a randomization parameter, $s_k$ are the scaling parameters, and $rand_k$ is a random number uniformly distributed between $0$ and $1$. The scaling parameters are calculated by $s_k = |u_k - l_k|$, where $l_k$ and $u_k$ are the lower and upper bounds of the $k$th parameter. Diversity of solutions is controlled by the randomization parameter $\alpha$, which needs to be reduced gradually during iterations so that it can vary with the iteration counter [63].
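A single movement step combining the attraction and randomization terms can be sketched as follows; the component-wise form and the parameter names are assumptions mirroring the text:

```python
import math
import random

def firefly_move(xi, xj, beta0, gamma, alpha, scale):
    """One firefly movement step (sketch): xi moves toward the brighter
    firefly xj with distance-dependent attractiveness, plus a scaled
    uniform random perturbation."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)          # attractiveness
    return [a + beta * (b - a) + alpha * s * (random.random() - 0.5)
            for a, b, s in zip(xi, xj, scale)]
```

With the absorption coefficient and randomization switched off, the firefly jumps exactly onto its brighter neighbor, which is a convenient sanity check.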
In the FA for solving the CWP, the penalty function approach is used in order to handle the constraints. In this way, a constrained problem is solved as an unconstrained one. A general formula for calculating penalty functions is given in [64] by

$$F(x) = f(x) + \sum_{i=1}^{q} r_i \cdot \max\{0, g_i(x)\}^{2} + \sum_{j=1}^{m} c_j \cdot |h_j(x)|, \quad (10)$$

where $F(x)$ is the new (expanded) objective function to be optimized, $r_i$ and $c_j$ are positive constants normally called "penalty factors," $q$ is the number of inequality constraints, and $m$ is the number of equality constraints for a given problem. We found it suitable to set each $r_i$ to the same fixed value. The penalty factors $c_j$ for equality constraints were not used, since these problems have only inequality constraints.
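Restricted to inequality constraints as in the CWP, the expanded objective can be sketched directly; the quadratic penalty form is an assumption consistent with common practice, and all names are illustrative:

```python
def penalized(f, x, ineq, r):
    """Penalty-function sketch for inequality constraints g_i(x) <= 0:
    F(x) = f(x) + sum_i r_i * max(0, g_i(x))**2."""
    return f(x) + sum(ri * max(0.0, g(x)) ** 2 for g, ri in zip(ineq, r))
```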
The pseudocode of the FA is given as Algorithm 3.

3.4. An Enhanced Firefly Algorithm for Solving the CWP
An enhanced firefly algorithm for COPs is presented in [55] and it is also applied to solve the CWP. Two modifications are incorporated in the EFA in order to improve the performance of the firefly algorithm for COPs.
The first modification is related to using Deb's rules instead of the penalty approach. The three feasibility rules are employed instead of the greedy selection in order to decide which firefly is brighter. These rules are also used each time after (8) is applied in order to decide whether the solution will be updated. Evaluation of the solution population is given as Algorithm 4.

The second modification is employing a geometric progression reduction scheme to reduce the scaling factors at the end of each cycle, by the rule

$$s_k^{(t)} = s_k^{(0)}\, q^{\,t/MCN}, \quad (11)$$

where $MCN$ is the maximum cycle number, $t$ is the current iteration number, and $0 < q < 1$.
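Assuming the geometric reduction has the form $s_k(t) = s_k(0) \cdot q^{t/MCN}$ (one plausible instance of such a scheme; the exact rule is the one given as (11)), it can be coded as:

```python
def reduce_scale(s0, q, t, mcn):
    """Geometric reduction sketch: the scaling factor shrinks from s0
    at t = 0 toward s0 * q at t = mcn, with 0 < q < 1."""
    return s0 * q ** (t / mcn)
```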
4. Experimental Study
The ABC, CBABC, FA, and EFA are implemented in the Java programming language on a PC with an Intel Core processor and 4 GB of RAM. The heuristic algorithm based on the modified Weiszfeld procedure is also implemented for the purpose of comparison with the metaheuristic approaches.
4.1. Benchmark Functions
The performance of the four metaheuristic techniques and the behavior of the heuristic algorithm are evaluated on eighteen test instances of the single-facility constrained Weber problem with a connected feasible region bounded by arcs of equal radius.
The benchmark problems with an increasing number of input points are randomly generated according to the algorithm given in [44]. These problems have 5, 10, 50, 100, 250, and 500 input points, and three different random test problems are generated for each number of input points. Hence, these test instances have a nonconvex feasible set defined by from 5 up to 500 constraints.
Four example problems, named P1, P4, P7, and P10 with 5, 10, 50, and 100 input points, respectively, are shown in Figure 1. In each test image, the feasible region is represented by a gray surface area and the final solution obtained by the heuristic algorithm [44] is represented by a red cross.
4.2. Parameter Settings
The solution number (SN) in the four metaheuristic algorithms was set to 20. The maximum number of fitness function evaluations (FEs) was used as the stopping criterion, with the same FE budget allowed for all algorithms. In addition, the metaheuristic algorithms presented in Section 3 have several other control parameters that considerably influence their performance. The values of these control parameters are presented in Table 1.
In order to calculate FEs, researchers usually use the rule $FEs = SN \times T$, where $T$ is the maximum number of iterations [65, 66]. Hence, the FA and EFA were terminated after $T = FEs/SN$ iterations. The number of fitness evaluations consumed in each iteration of the ABC and CBABC algorithms is $2 \cdot SN$, since these algorithms evaluate solutions both in the employed bee and in the onlooker bee phase [65]. Therefore, to ensure a fair comparison, the ABC and CBABC algorithms were terminated after half as many iterations.
For the FA, it is widely reported in the literature that the light absorption coefficient $\gamma = 1$, the initial attractiveness $\beta_0 = 1$, and an initial randomness factor $\alpha_0 \in (0, 1)$ can be used for most applications [36, 62]. It can be seen from Table 1 that these values of $\gamma$ and $\beta_0$ were set for both the FA and EFA. A typical value of $\alpha_0$ is used in the FA, while it was empirically determined that a slightly higher value of this parameter is more suitable for the EFA. For the ABC and CBABC algorithms, the values of the specific control parameters were taken from [53, 54], where these algorithms were proposed for solving COPs. For the CBABC in particular, it was empirically determined that a lower value of the scout production period $SPP$ is more appropriate for solving the CWP. Each of the experiments was repeated for a fixed number of independent runs.
4.3. Analysis of Solution Quality and Robustness
The coordinates of the solution, the corresponding objective function value, and the CPU time (in seconds) obtained by the heuristic algorithm are arranged in Table 2. To analyze the solution quality of the four tested metaheuristic algorithms, the best values, mean values, and standard deviations obtained by the ABC, CBABC, FA, and EFA algorithms over all runs are reported. Significance tests are used to achieve reliable comparisons. Following [67], a two-sample t-test at the 95% confidence level was conducted between each pair of compared metaheuristics on every benchmark function. The calculated best results are presented in Table 3, while the mean values and standard deviations are arranged in Table 4. Results of the two-sample t-tests are reported in Table 5. The sign "+" indicates that the associated algorithm is significantly better than the other one, while the sign "−" indicates that it is significantly worse. If both algorithms show similar performance, they are both marked by "+."
Kazakovtsev in [44] experimentally proved the convergence of the heuristic algorithm on randomly generated test problems. Hence, the calculated best values of the metaheuristics can be compared to the results found by the heuristic approach in order to show the ability of the metaheuristic algorithm to reach the nearoptimal result. The obtained mean and standard deviation values indicate the robustness of the metaheuristic approaches.
It can be seen from Table 3 that each of the metaheuristic algorithms found the best results which are very close to the results obtained by the heuristic algorithm. More precisely, the FA obtained 10 better best results (P2, P4, P5, P6, P8, P9, P11, P16, P17, and P18) and 8 worse best results with respect to the heuristic approach. The EFA obtained 11 better best results (P1, P2, P4, P5, P6, P8, P9, P11, P16, P17, and P18), one equal best result (P3) and 6 slightly worse best results in comparison with the heuristic approach. The ABC algorithm achieved 12 better best results (P2, P4, P5, P6, P8, P9, P10, P11, P12, P16, P17, and P18), one equal best result (P3), and 5 worse best results with respect to the heuristic approach. The algorithm CBABC was able to find better or the same best solution for all problems with respect to the heuristic algorithm, with the exception of the problem P7, where the CBABC obtained slightly worse best result.
In terms of best results from Table 3, it can be noticed that the CBABC achieved better and in several cases the same values in comparison with each considered metaheuristic approach. Further, each of the improved metaheuristics, the EFA and CBABC, obtained better best results with respect to both original metaheuristic algorithms for the majority of test problems. If we compare the performance of the original ABC to that of the original FA, it can be seen that both algorithms show similar ability to reach the nearoptimal result; that is, the ABC has found 9 slightly better best results and 9 slightly worse ones compared to the FA.
From Table 4, it can be seen that the mean and standard deviation results obtained by the CBABC are much better than those obtained by the other metaheuristic algorithms. The CBABC converged consistently to the same solution with the same objective function value and a very low standard deviation. If we compare the robustness of the remaining three metaheuristics, it can be noticed that the EFA outperformed the FA and ABC. Compared with the ABC, the FA obtained better mean results and standard deviation values on P1, P2, P4, P6, P11, P13, P14, P17, and P18. The remaining mean and standard deviation results are better in the case of the ABC algorithm, with the exception of P5 and P15, where the FA and ABC show similar performance.
Results of the two-sample t-tests are given in Table 5, and they show that the CBABC is significantly better than each of the FA, EFA, and ABC on a number of test problems and similar on the remaining ones. It is worth noting that the FA, EFA, and ABC cannot outperform the CBABC on any problem. Further, it can be observed that the EFA is significantly better than the FA on every test problem. In comparison with the ABC, the EFA is superior on a number of test problems, inferior on 4 problems, and similar on the remaining benchmarks. When comparing the performances of the FA and ABC, each of them is significantly better than the other on some problems, while the two show similar performance on the remaining benchmarks.
According to the results reported in Tables 3, 4, and 5, we can conclude that the CBABC and EFA exhibit superior performances compared to both original versions, ABC and FA, in solving constrained Weber problems with the connected feasible region bounded by arcs. Further, from these results and according to the results from Table 2, it is clear that the CBABC outperformed all other three metaheuristic algorithms as well as the heuristic algorithm with respect to the quality of the obtained results. Although the CBABC has more accurate and more stable results than the remaining three metaheuristics, all four metaheuristic approaches perform better than or equal to the heuristic approach with respect to the quality of the obtained results for most of the tested problems.
4.4. Computational Time Analysis
In order to compare the computational cost of the four metaheuristic algorithms, we computed the mean of the CPU times over all runs taken by each metaheuristic algorithm. These results are reported in Table 6. They show that the execution time of each of the metaheuristic approaches increases linearly as the number of constraints or input points increases.
By comparing the computational times of the ABC and CBABC algorithms with those of the FA and EFA, it is observable that the ABC and CBABC algorithms are several times faster than the FA, and faster still than the EFA, for the majority of test problems. The computational times of the ABC and CBABC algorithms are not significantly different, and they remain below one second even for larger numbers of constraints. The computational time requirements of the EFA are about five times greater than those of the FA, and when the number of constraints or input points is 500, that time is about two seconds.
Compared with the computational time results of the heuristic approach, which are presented in Table 2, it can be seen that the heuristic algorithm requires less computational time than the four metaheuristic algorithms. However, the computational time of the four metaheuristics is reasonable and it can be considered as negligible, since it is less than one second in most cases.
5. Conclusion
The constrained Weber problem with the feasible region bounded by arcs is a nonconvex optimization problem. Finding the global optimum of such a problem is difficult, considering the fact that it may have multiple locally optimal points within the feasible region. Metaheuristic approaches are a suitable choice for solving this problem, since these techniques can obtain quality results in a reasonable amount of time.
The performances of two prominent swarmintelligence algorithms (the artificial bee colony and firefly algorithm) and their recently proposed improved versions for constrained optimization (the crossoverbased artificial bee colony and enhanced firefly algorithm) are compared. The heuristic algorithm based on modified Weiszfeld procedure is also implemented for the purpose of the comparison with the metaheuristic approaches.
The four metaheuristic algorithms are compared on eighteen randomly generated test instances in which the number of input points or constraints increases up to 500. Numerical results indicate that all four metaheuristic algorithms compare favorably with the heuristic approach with respect to the precision of the results, with the CBABC algorithm standing out. In terms of execution time, the ABC and CBABC are more efficient than the FA and EFA. Although these four algorithms require a somewhat higher computational cost than the heuristic approach, the CPU times of all of them are reasonable and grow at a linear rate as the number of input points or constraints increases. Finally, it turns out that the CBABC algorithm is superior to the other metaheuristics with respect to the quality of the results, robustness, and computational efficiency.
From this research it can be concluded that metaheuristic approaches can be successfully used for problems with maximum and minimum distance limits. Further, this research encourages the application of the metaheuristic algorithms for solving some other complex constrained optimization problems of practical importance.
Conflicts of Interest
The authors declare that there are no conflicts of interest regarding the publication of this paper.
Acknowledgments
The second and third authors gratefully acknowledge support from the Ministry of Education and Science of the Republic of Serbia, Grant no. 174013. The first, third, and fifth authors gratefully acknowledge support from the project Applying Direct Methods for Digital Image Restoring of the Goce Delčev University. The fourth author gratefully acknowledges support from the Ministry of Education and Science of the Russian Federation (Project 2.5527.2017/8.9).