Mathematical Problems in Engineering, Volume 2012 (2012), Article ID 475018, 16 pages. http://dx.doi.org/10.1155/2012/475018
Research Article

A VNS Metaheuristic with Stochastic Steps for Max 3-Cut and Max 3-Section

Ai-fan Ling1,2

1Research Center of Security and Future, School of Finance, Jiangxi University of Finance and Economics, Nanchang 330013, China
2Key Laboratory of Management, Decision and Information Systems, Academy of Mathematics and Systems Science, CAS, Beijing 100190, China

Received 15 February 2012; Accepted 30 May 2012

Copyright © 2012 Ai-fan Ling. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

A heuristic algorithm based on variable neighborhood search (VNS) is proposed to solve the Max 3-cut and Max 3-section problems. By establishing a neighborhood structure for the Max 3-cut problem, we propose a local search algorithm and a variable neighborhood global search algorithm with two stochastic search steps to obtain a global solution. We report numerical results and comparisons with the well-known 0.836-approximation algorithm. The results show that the proposed heuristic efficiently obtains high-quality solutions and has better numerical performance than the 0.836-approximation algorithm on the NP-hard Max 3-cut and Max 3-section problems.

1. Introduction

Given a graph $G=(V,E)$ with node set $V$ and edge set $E$, the Max 3-cut problem is to find a partition $V_1$, $V_2$, $V_3$ of $V$, with $V_1\cup V_2\cup V_3=V$ and $V_i\cap V_j=\emptyset$ for $i\neq j$, such that the sum of the weights on the edges connecting the different parts is maximized. Like the Max cut problem, the Max 3-cut problem has long been known to be NP-complete [1], even for unweighted graphs [2], and it has applications in circuit layout design, statistical physics, and so on [3]. Due to the complexity of this problem, however, research progress on it has been much slower than for the Max cut problem. Based on the semidefinite programming relaxation proposed by Goemans and Williamson [4], Frieze and Jerrum [5] obtained a 0.800217-approximation algorithm for the Max 3-cut problem. Later, Goemans and Williamson [6] and Zhang and Huang [7] improved Frieze and Jerrum's 0.800217 approximation ratio to 0.836 using a complex semidefinite programming relaxation of the Max 3-cut problem.

For the purpose of our analysis, we first introduce some notation. We denote the complex conjugate of $z$ by $\bar z$, where $\mathrm{i}=\sqrt{-1}$ is the imaginary unit, and the real and imaginary parts of a complex number $z$ by $\operatorname{Re} z$ and $\operatorname{Im} z$, respectively. For an $n$-dimensional complex vector, written as a bold letter $\mathbf z$, and an $n\times n$ complex matrix $Z$, we write $\bar{\mathbf z}$, $\bar Z$ and $\mathbf z^{\mathrm T}$, $Z^{\mathrm T}$ to denote their conjugates and transposes; that is, $\mathbf z^{\mathrm H}=\bar{\mathbf z}^{\mathrm T}$ and $Z^{\mathrm H}=\bar Z^{\mathrm T}$ are the conjugate transposes. The set of $n\times n$ real symmetric (positive semidefinite) matrices and the set of $n\times n$ complex Hermitian (positive semidefinite) matrices are denoted by $\mathcal S^n$ ($\mathcal S^n_+$) and $\mathcal H^n$ ($\mathcal H^n_+$), respectively. We sometimes write $Z\succeq 0$ to indicate $Z\in\mathcal S^n_+$ (or $Z\in\mathcal H^n_+$). For any two complex vectors $\mathbf y,\mathbf z\in\mathbb C^n$, we write $\langle\mathbf y,\mathbf z\rangle=\mathbf y^{\mathrm H}\mathbf z$ as their inner product. For any two complex matrices $Y,Z\in\mathbb C^{n\times n}$, we write $\langle Y,Z\rangle=\operatorname{tr}(Y^{\mathrm H}Z)$ as their inner product. $|\cdot|$ denotes the modulus of a complex number, the 2-norm of a complex vector, or the Frobenius norm of a complex matrix.

Let the third roots of unity be denoted by $1$, $\omega=e^{\mathrm i 2\pi/3}$, and $\omega^2=e^{\mathrm i 4\pi/3}$. Introduce a complex variable $y_j\in\{1,\omega,\omega^2\}$ for each node $j\in V$; then it is not hard to verify that, for $y_i,y_j\in\{1,\omega,\omega^2\}$,

\[ \frac{2}{3}\bigl(1-\operatorname{Re}(\bar y_i y_j)\bigr)=\begin{cases}1, & y_i\neq y_j,\\ 0, & y_i=y_j.\end{cases} \]

Denote $\mathbb T_3=\{1,\omega,\omega^2\}$ and $\mathbf y=(y_1,\dots,y_n)^{\mathrm T}$. Then the Max 3-cut problem can be expressed as

\[ (\mathrm{M3C})\quad \max\ f(\mathbf y)=\frac{2}{3}\sum_{i<j}w_{ij}\bigl(1-\operatorname{Re}(\bar y_i y_j)\bigr)\quad \text{s.t.}\ y_j\in\mathbb T_3,\ j=1,\dots,n, \]

where $W=(w_{ij})$, $w_{ij}\ge 0$, is the weight matrix of the given graph.
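To make the encoding concrete, the following sketch (plain Python; the helper names are illustrative, not from the paper) verifies the indicator identity above and evaluates the M3C objective both through the roots-of-unity formula and by directly counting crossing edges:

```python
import cmath

# Primitive third root of unity; assigning y_j = w**p encodes node j's part p.
w = cmath.exp(2j * cmath.pi / 3)

def indicator(yi, yj):
    """(2/3)(1 - Re(conj(yi) * yj)): equals 1 when yi != yj and 0 when yi == yj."""
    return 2.0 / 3.0 * (1 - (yi.conjugate() * yj).real)

def cut_value(weights, parts):
    """M3C objective: total weight of edges whose endpoints lie in different parts."""
    y = [w ** p for p in parts]
    return sum(wij * indicator(y[i], y[j]) for (i, j), wij in weights.items())

# A 4-node example: only edges crossing the partition contribute their weight.
weights = {(0, 1): 3, (1, 2): 2, (0, 2): 5, (2, 3): 1}
parts = [0, 1, 2, 0]
direct = sum(wij for (i, j), wij in weights.items() if parts[i] != parts[j])
```

Here `direct` and `cut_value` agree (both give 11 on this example), which is exactly the content of the displayed identity.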

By relaxing each complex variable $y_j$ into an $n$-dimensional complex unit vector $\mathbf z_j$, we get a complex semidefinite programming (CSDP) relaxation of (M3C) as follows:

\[ (\mathrm{CSDP})\quad \max\ \frac{2}{3}\sum_{i<j}w_{ij}\bigl(1-\operatorname{Re}(Z_{ij})\bigr)\quad \text{s.t.}\ Z_{jj}=1,\ j=1,\dots,n;\ \operatorname{Re}(Z_{ij})\ge -\tfrac12,\ i\neq j;\ Z\in\mathcal H^n_+, \]

where $Z_{ij}=\mathbf z_i^{\mathrm H}\mathbf z_j$, and $\mathbf e_j$ denotes the vector with zeros everywhere except for a unit in the $j$th component. It is easy to verify that the constraints $Z_{jj}=1$ can be expressed as $\langle \mathbf e_j\mathbf e_j^{\mathrm T},Z\rangle=1$. To get an approximate solution of M3C, Goemans and Williamson [6] do not solve the CSDP directly, but solve an equivalent real SDP, denoted RSDP, obtained by writing $Z=X+\mathrm i Y$ and replacing $Z\in\mathcal H^n_+$ with the $2n\times 2n$ real constraint

\[ \begin{pmatrix} X & -Y\\ Y & X \end{pmatrix}\succeq 0 \]

(although some software, such as SeDuMi [8] and the earlier version of SDPT3-4.0 [9], can deal with SDPs with complex data, this does not reduce the dimension of the problem), where the objective is written in terms of $L$, the Laplace matrix of the given graph, and $0_n$ denotes an $n\times n$ all-zeros matrix.

In RSDP, the first, third, and fourth classes of equality constraints ensure that the diagonal entries of $X$ equal one and that the matrix variable has the block form $\begin{pmatrix} X & -Y\\ Y & X\end{pmatrix}$. The final two classes of equality constraints ensure that $X$ is symmetric and $Y$ is a skew-symmetric matrix.

If $(\hat X,\hat Y)$ is an optimal solution of RSDP, then the complex matrix $\hat Z=\hat X+\mathrm i\hat Y$ is an optimal solution of CSDP. Factoring $\hat Z$ so that $\hat Z_{ij}=\mathbf z_i^{\mathrm H}\mathbf z_j$, one can then randomly generate a complex Gaussian vector $\mathbf r$ and set $y_j=\omega^k$ whenever $\arg(\mathbf z_j^{\mathrm H}\mathbf r)$ falls into $[2k\pi/3,\,2(k+1)\pi/3)$, $k=0,1,2$, where $\arg(\cdot)$ means the principal value of the complex angle of a complex number. Goemans and Williamson [6] verified that this rounding achieves the 0.836 approximation guarantee; see also Zhang and Huang [7].
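A minimal sketch of this rounding step, assuming NumPy (the function name `round_csdp` and the eigendecomposition-based factorization are our own illustrative choices, not prescribed in [6]):

```python
import numpy as np

def round_csdp(Z, rng):
    """Round a Hermitian PSD solution matrix Z: factor Z = V^H V, draw a random
    complex Gaussian vector r, and assign node j the part k for which the
    principal angle of z_j^H r lies in [2*k*pi/3, 2*(k+1)*pi/3)."""
    vals, vecs = np.linalg.eigh(Z)
    vals = np.clip(vals, 0.0, None)                  # guard tiny negative eigenvalues
    V = np.sqrt(vals)[:, None] * vecs.conj().T       # columns of V are the vectors z_j
    n = Z.shape[0]
    r = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    angles = np.angle(V.conj().T @ r) % (2 * np.pi)  # principal angles in [0, 2*pi)
    return (angles // (2 * np.pi / 3)).astype(int)   # part index k in {0, 1, 2}
```

For the identity matrix (orthonormal vectors $\mathbf z_j$), this reduces to assigning parts uniformly at random, as expected.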

The algorithm proposed by Goemans and Williamson [6] achieves a very good approximation ratio, and RSDP can be solved by an interior point algorithm, but the 0.836-approximation algorithm is not practical for numerical study of the Max 3-cut problem. From RSDP, one can see that for a graph with $n$ nodes, RSDP acquires $O(n^2)$ constraints and slack variables via the inequality constraints. That is to say, RSDP has a $2n\times 2n$ unknown symmetric positive semidefinite matrix variable, a high-dimensional unknown vector of slack variables, and $O(n^2)$ constraints, and many of its data matrices lack an explicit block diagonal structure although they are sparse. For instance, when $n=100$, RSDP becomes a very high-dimensional semidefinite programming problem with 14850 slack variables and 24950 constraints. Further, as is well known, instances with 50 to 100 nodes are only a class of universal, medium-scale instances for the Max 3-cut problem. Hence, it is very time consuming to solve such an RSDP relaxation of M3C using any currently existing SDP software. As a result, the 0.836-approximation algorithm is not suitable for computational study of the Max 3-cut problem. This limitation of solving M3C via the CSDP (or RSDP) relaxation motivates us to find a new efficient and fast algorithm for practical purposes.

In the current paper, we first establish a definition of the $k$-neighborhood structure of the Max 3-cut problem and design a local search algorithm to find a local maximizer. We then propose a variable neighborhood search (VNS) metaheuristic with stochastic steps, in the spirit of the scheme originally proposed by Mladenović and Hansen [10], by which we can efficiently find a high-quality global approximate solution of the Max 3-cut problem. Further, by combining it with a greedy algorithm, we extend the proposed algorithm to the Max 3-section problem. To the best of our knowledge, this is the first computational study of the Max 3-cut problem. In order to test the performance of the proposed algorithm, we compare its numerical results with Goemans and Williamson's 0.836-approximation algorithm.

This paper is organized as follows. In Section 2, we give some definitions and lemmas. In Section 3, we present the VNS metaheuristic for solving the Max 3-cut problem. The VNS is extended to the Max 3-section problem in Section 4. In Section 5, we give some numerical results and comparisons.

2. Preliminaries

In this section, we establish some definitions and facts for later use. For the third roots of unity $1$, $\omega$, $\omega^2$, we have the following fact: for any $u,v\in\{1,\omega,\omega^2\}$,

\[ \frac{2}{3}\bigl(1-\operatorname{Re}(\bar u v)\bigr)=\begin{cases}1,& u\neq v,\\ 0,& u=v.\end{cases}\quad (2.1) \]

Denote $\mathbb T_3^n=\{\mathbf y=(y_1,\dots,y_n)^{\mathrm T}: y_j\in\{1,\omega,\omega^2\}\}$. Then, based on (2.1), for any $\mathbf y\in\mathbb T_3^n$, we may define a $k$-neighborhood of $\mathbf y$ as follows.

Definition 2.1. For any $\mathbf y\in\mathbb T_3^n$ and any positive integer $k\le n$, one defines the $k$-neighborhood of $\mathbf y$, denoted by $N_k(\mathbf y)$, as the set of points of $\mathbb T_3^n$ differing from $\mathbf y$ in at most $k$ components. In particular, if $k=1$, we write the 1-neighborhood of $\mathbf y$ as $N(\mathbf y)$.

The boundary of the $k$-neighborhood is defined by $\partial N_k(\mathbf y)=N_k(\mathbf y)\setminus N_{k-1}(\mathbf y)$, the set of points differing from $\mathbf y$ in exactly $k$ components. Clearly, $N_k(\mathbf y)=\{\mathbf y\}\cup\bigcup_{j=1}^{k}\partial N_j(\mathbf y)$. If $\mathbf x\in\partial N_k(\mathbf y)$, we call $\mathbf x$ a $k$-neighbor of $\mathbf y$. From Definition 2.1, the difference between a point $\mathbf y$ and its $k$-neighbor is that they have exactly $k$ different components. By a straightforward computation, the number of elements of $\partial N_k(\mathbf y)$ is $\binom{n}{k}2^k$; in particular, $|\partial N_1(\mathbf y)|=2n$ when $k=1$.

Example 2.2. Let $n=2$ and $\mathbf y=(1,1)^{\mathrm T}$. Then $\partial N_1(\mathbf y)=\{(\omega,1)^{\mathrm T},(\omega^2,1)^{\mathrm T},(1,\omega)^{\mathrm T},(1,\omega^2)^{\mathrm T}\}$, $|\partial N_1(\mathbf y)|=4$, and $|\partial N_2(\mathbf y)|=\binom{2}{2}2^2=4$.
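The boundary sets can be enumerated directly; the sketch below (Python, with the three parts encoded as 0, 1, 2 rather than powers of $\omega$) reproduces the count $\binom{n}{k}2^k$:

```python
from itertools import combinations, product

def k_neighbors(y, k):
    """Enumerate the boundary set: all points differing from y in exactly k
    components, each changed component having 2 alternative values."""
    n = len(y)
    out = []
    for idx in combinations(range(n), k):            # which k components change
        choices = [[v for v in (0, 1, 2) if v != y[i]] for i in idx]
        for repl in product(*choices):
            x = list(y)
            for i, v in zip(idx, repl):
                x[i] = v
            out.append(tuple(x))
    return out
```

For $n=4$ this gives $|\partial N_1|=2\cdot 4=8$ distinct 1-neighbors and $|\partial N_2|=\binom{4}{2}2^2=24$, matching the formula above.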

Definition 2.3. For any $j\in\{1,\dots,n\}$, define two maps $T_j^{+}$ and $T_j^{-}$ from $\mathbb T_3^n$ to itself as follows: $T_j^{+}$ (respectively, $T_j^{-}$) multiplies the $j$th component of its argument by $\omega$ (respectively, by $\omega^2$) and leaves all other components unchanged.

Clearly, for any $\mathbf y\in\mathbb T_3^n$ and any $j$, $T_j^{+}\mathbf y,T_j^{-}\mathbf y\in\mathbb T_3^n$. Applying Definition 2.3, for any $\mathbf x\in\partial N_1(\mathbf y)$ there exists a unique component, say the $j$th, of $\mathbf y$ such that $x_j\neq y_j$ and either $x_j=\omega y_j$ or $x_j=\omega^2 y_j$, and all other components of $\mathbf x$ and $\mathbf y$ are the same. For simplicity, for any such $\mathbf x$, we denote $\mathbf x$ by $T_j^{+}\mathbf y$ or $T_j^{-}\mathbf y$ corresponding to $x_j=\omega y_j$ or $x_j=\omega^2 y_j$. By Definitions 2.1 and 2.3, for any $\mathbf y\in\mathbb T_3^n$, we can construct its 1-neighborhood points using the maps defined in Definition 2.3; that is, we have the following result.

Lemma 2.4. Let $T_j^{+}$ and $T_j^{-}$ be defined by Definition 2.3. Then, for any $\mathbf y\in\mathbb T_3^n$ and any fixed index $j$, one has $T_j^{+}\mathbf y,\,T_j^{-}\mathbf y\in\partial N_1(\mathbf y)$; that is, $T_j^{+}\mathbf y$ and $T_j^{-}\mathbf y$ are two 1-neighborhood points of $\mathbf y$.

Definition 2.5. A point $\mathbf y^{*}\in\mathbb T_3^n$ is called a $k$-local maximizer of the function $f$ over $\mathbb T_3^n$ if $f(\mathbf y^{*})\ge f(\mathbf x)$ for all $\mathbf x\in N_k(\mathbf y^{*})$. Furthermore, if $f(\mathbf y^{*})\ge f(\mathbf x)$ for all $\mathbf x\in\mathbb T_3^n$, then $\mathbf y^{*}$ is called a global maximizer of $f$ over $\mathbb T_3^n$. A 1-local maximizer of $f$ is also simply called a local maximizer of $f$ over $\mathbb T_3^n$.

3. VNS for Max 3-Cut

3.1. Local Search Algorithm

Let $\mathbf y$ be a feasible solution of problem M3C. If $\mathbf y$ is not a local maximizer of $f$, then we may find an $\mathbf x\in N(\mathbf y)$ such that $f(\mathbf x)>f(\mathbf y)$. It is clear that $\mathbf x\in\mathbb T_3^n$. If $\mathbf x$ is still not a local maximizer of $f$, then we replace $\mathbf y$ with $\mathbf x$ and repeat the process until a point $\mathbf y^{*}$ satisfying $f(\mathbf y^{*})\ge f(\mathbf x)$ for all $\mathbf x\in N(\mathbf y^{*})$ is found, which indicates that $\mathbf y^{*}$ is a local maximizer of $f$.

For any positive integer $j\le n$, let $\mathbf x\in\partial N_1(\mathbf y)$ be a 1-neighbor of $\mathbf y$ differing from $\mathbf y$ only in the $j$th component; that is,

\[ \mathbf x=T_j^{+}\mathbf y \quad\text{or}\quad \mathbf x=T_j^{-}\mathbf y, \quad (3.1) \]

with $T_j^{\pm}$ given by Definition 2.3. Denote the corresponding change of the objective value by

\[ \Delta_j(\mathbf y)=f(\mathbf x)-f(\mathbf y). \quad (3.2) \]

Then, we have the following result, whose proof is clear.

Lemma 3.1. Consider $\mathbf x$ satisfying (3.1). Then

\[ f(\mathbf x)=f(\mathbf y)+\Delta_j(\mathbf y), \quad (3.3) \]

and $\Delta_j(\mathbf y)$ depends only on the weights of the edges incident to node $j$.

Based on Lemma 3.1, if we know the value $f(\mathbf y)$, then we can obtain the objective value at the next iterate by calculating $\Delta_j(\mathbf y)$ via (3.3), instead of calculating the values $f(\mathbf x)$ directly, which sharply reduces the computational cost. By Definition 2.1, there exist two points satisfying (3.1) for a fixed $j$; that is, when $\mathbf x\in\partial N_1(\mathbf y)$ and (3.1) is satisfied, then either $\mathbf x=T_j^{+}\mathbf y$ or $\mathbf x=T_j^{-}\mathbf y$. For our convenience, we denote the corresponding changes by $\Delta_j^{+}(\mathbf y)$ and $\Delta_j^{-}(\mathbf y)$. In what follows, we describe the local search algorithm for the Max 3-cut problem, denoted LSM3C; by this algorithm, we can get a local maximizer of the function $f$ over $\mathbb T_3^n$.

For LSM3C, one has the following.
(1) Input any initial feasible solution $\mathbf y$ of problem (M3C).
(2) For $j$ from 1 to $n$, set $\mathbf x=T_j^{+}\mathbf y$ and calculate $\Delta_j^{+}(\mathbf y)$; set again $\mathbf x=T_j^{-}\mathbf y$ and calculate $\Delta_j^{-}(\mathbf y)$.
(3) Find the largest change $\Delta^{*}=\max_{j}\max\{\Delta_j^{+}(\mathbf y),\Delta_j^{-}(\mathbf y)\}$ together with an attaining index $j^{*}$ and sign.
(4) If $\Delta^{*}\le 0$, then set $\mathbf y^{*}=\mathbf y$, return $\mathbf y^{*}$, and stop. Otherwise, go to the next step.
(5) Set $\mathbf y=T_{j^{*}}^{+}\mathbf y$ or $\mathbf y=T_{j^{*}}^{-}\mathbf y$ according to the sign found in Step 3; go to Step 2.
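A hedged Python sketch of this local search (the data layout and helper names are ours; the per-move gain is computed from the incident edges only, in the spirit of Lemma 3.1):

```python
def cut_value(weights, parts):
    """f(y): total weight of edges whose endpoints lie in different parts."""
    return sum(wij for (i, j), wij in weights.items() if parts[i] != parts[j])

def lsm3c(weights, parts):
    """Best-improvement local search over the 1-neighborhood: repeatedly apply
    the single-component move (part -> part + 1 or part + 2 mod 3) with the
    largest positive gain, until no move improves the cut."""
    n = len(parts)
    inc = {i: [] for i in range(n)}                  # incident edges per node
    for (i, j), wij in weights.items():
        inc[i].append((j, wij))
        inc[j].append((i, wij))
    while True:
        best_gain, best_move = 0, None
        for i in range(n):
            for shift in (1, 2):                     # multiply y_i by w or w**2
                new_p = (parts[i] + shift) % 3
                gain = sum(wij * ((new_p != parts[j]) - (parts[i] != parts[j]))
                           for j, wij in inc[i])
                if gain > best_gain:
                    best_gain, best_move = gain, (i, new_p)
        if best_move is None:                        # local maximizer reached
            return parts
        i, new_p = best_move
        parts[i] = new_p
```

On a weighted triangle with weights 3, 2, 5, starting from all nodes in one part, the search separates the three nodes and returns a cut of value 10.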

3.2. Variable Neighborhood Stochastic Search

Let $\mathbf y^{*}$ be a local maximizer obtained by LSM3C and $k_{\max}$ a fixed positive integer. We now describe the variable neighborhood search (VNS) with stochastic steps, by which we can find an approximate global maximizer of problem (M3C). The proposed VNS algorithm actually has three phases. First, for any given positive integer $k\le k_{\max}$, a neighborhood point $\mathbf y'$ of $\mathbf y^{*}$ is randomly selected. Next, a solution $\mathbf y''$ is obtained by applying algorithm LSM3C to $\mathbf y'$. Finally, the current solution jumps from $\mathbf y^{*}$ to $\mathbf y''$ if it improves the former one. Otherwise, the order of the neighborhood is increased by one when $k<k_{\max}$, and the above steps are repeated until some stopping condition is met. This VNS, in which $k_{\max}$ bounds the neighborhood order [11], is denoted VNS-k and can be illustrated as follows.

For VNS-k, one has the following.
(1) Arbitrarily choose a point $\mathbf y\in\mathbb T_3^n$, implement LSM3C starting from $\mathbf y$, and denote the obtained local maximizer by $\mathbf y^{*}$. Set $k=1$.
(2) Randomly take a point $\mathbf y'\in\partial N_{g(k)}(\mathbf y^{*})$, where the index $g(k)$ is specified below, implement LSM3C again from $\mathbf y'$, and denote the obtained new local maximizer by $\mathbf y''$.
(3) If $f(\mathbf y'')>f(\mathbf y^{*})$, then set $\mathbf y^{*}=\mathbf y''$ and $k=1$; go to Step 2.
(4) If $k<k_{\max}$, set $k=k+1$; go to Step 2. Otherwise, return $\mathbf y^{*}$ as an approximate global solution of problem M3C and stop.
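The scheme can be sketched as follows (self-contained Python; `local_search` is a compact first-improvement stand-in for LSM3C, and for simplicity the shake draws a point differing in exactly $k$ components rather than using the neighborhood-block device described below):

```python
import random

def cut_value(weights, parts):
    return sum(w for (i, j), w in weights.items() if parts[i] != parts[j])

def local_search(weights, parts):
    """First-improvement search over single-component moves (LSM3C stand-in)."""
    improved = True
    while improved:
        improved = False
        base = cut_value(weights, parts)
        for i in range(len(parts)):
            for p in (0, 1, 2):
                if p == parts[i]:
                    continue
                old = parts[i]
                parts[i] = p
                if cut_value(weights, parts) > base:
                    improved = True                  # keep the improving move
                    break
                parts[i] = old
            if improved:
                break
    return parts

def vns_k(weights, n, k_max=3, seed=0):
    """VNS-k sketch: shake to a random k-neighbor of the incumbent, re-run the
    local search, jump only on improvement (resetting k to 1); otherwise grow
    k until k_max is exceeded."""
    rng = random.Random(seed)
    best = local_search(weights, [rng.randrange(3) for _ in range(n)])
    k = 1
    while k <= k_max:
        shaken = best[:]
        for i in rng.sample(range(n), min(k, n)):    # change exactly k components
            shaken[i] = (shaken[i] + rng.choice((1, 2))) % 3
        cand = local_search(weights, shaken)
        if cut_value(weights, cand) > cut_value(weights, best):
            best, k = cand, 1
        else:
            k += 1
    return best
```

On a unit-weight triangle, the global maximum puts each node in its own part (cut value 3), and the sketch reaches it from any start.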

The subscript in Step 2 is a function $g(k)$ of $k$ and is also a positive integer not greater than $n$. The function $g$ embodies the main device of converting the current neighborhood of the local maximizer $\mathbf y^{*}$ into another neighborhood of $\mathbf y^{*}$. For a given $k_{\max}$, let $q=\lfloor n/k_{\max}\rfloor$ and $s=n-qk_{\max}$, where $\lfloor\cdot\rfloor$ means the integer part. We divide the neighborhoods $\partial N_1(\mathbf y^{*}),\dots,\partial N_n(\mathbf y^{*})$ into $k_{\max}$ neighborhood blocks $B_1,\dots,B_{k_{\max}}$. To obtain the neighborhood blocks, we divide the index set $\{1,\dots,n\}$ into $k_{\max}$ disjoint subsets, where each of the first $s$ subsets has $q+1$ integers and each of the last $k_{\max}-s$ subsets has $q$ integers; the $k$th block $B_k$ then consists of the boundary sets $\partial N_j(\mathbf y^{*})$ with $j$ in the $k$th index subset, as in (3.5) or (3.6). Then we can randomly choose a point in $B_k$ by letting $g(k)$ be a random number drawn uniformly from the $k$th index subset, so that the chosen point satisfies (3.5) or (3.6).

VNS-k stops when the maximum neighborhood block is reached. Additionally, we also consider another termination criterion for VNS, based on a maximum CPU time $t_{\max}$ and denoted VNS-t. VNS-t can obtain a better solution than VNS-k, since VNS-t actually runs VNS-k several times within the maximum allowed time $t_{\max}$, but it generally has to spend more computational time. VNS-t can be stated as follows.

For VNS-t, one has the following.
(1) Set the elapsed time $t=0$, run VNS-k from an arbitrary initial point, and let $\mathbf y^{*}$ be the obtained local optimal solution.
(2) If the solution returned by VNS-k improves $\mathbf y^{*}$, update $\mathbf y^{*}$; go to Step 3.
(3) If $t<t_{\max}$, then set $k=1$ and go to Step 2 of VNS-k. Otherwise, return $\mathbf y^{*}$ as an approximate global solution of problem M3C and stop.

We mention that this differs from the classical variable neighborhood search metaheuristic originally proposed by Mladenović and Hansen [10]. In order to obtain a global optimal solution or a high-quality approximate solution of problem M3C, we use two stochastic steps in VNS. First, for a fixed $k$, a $g(k)$-neighbor of $\mathbf y^{*}$ is chosen randomly. Second, by the definition of $g(k)$, when we change the neighborhood of $\mathbf y^{*}$ from one block to the next, $g(k)$ may select any one neighborhood within the current block, which is decided by a random number. In VNS, the positive integer $k_{\max}$ decides the maximum search neighborhood block of $\mathbf y^{*}$, which also directly decides the CPU time of VNS. Thanks to the second stochastic step, we may choose a $k_{\max}$ that is relatively small compared with $n$. This decreases our computational time.

4. A Greedy Algorithm for Max 3-Section

When the number of nodes $n$ is a multiple of three and the condition $|V_1|=|V_2|=|V_3|=n/3$ is required, the Max 3-cut problem becomes the Max 3-section problem. Notice that, for $y_j\in\{1,\omega,\omega^2\}$, this balance condition holds if and only if $\sum_{j=1}^{n}y_j=0$; then the Max 3-section problem can be formulated as the programming problem M3S obtained by adding the constraint $\sum_{j=1}^{n}y_j=0$ to (M3C), and its CSDP relaxation adds the constraint $\langle \mathbf e\mathbf e^{\mathrm T},Z\rangle=0$, where $\mathbf e$ is the column vector of all ones. Andersson [12] extended Frieze and Jerrum's random rounding method to M3S and obtained an approximation ratio slightly better than $2/3$, which is the current best approximation ratio for M3S; see also the recent research of Gaur et al. [13]. The author of the current paper considered a special Max 3-section problem and obtained a 0.6733-approximation algorithm; see Ling (2009) [14].
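The balance condition has a handy algebraic form: since $1+\omega+\omega^2=0$ and $1,\omega$ are linearly independent over the reals, $n_1+n_2\omega+n_3\omega^2=0$ forces $n_1=n_2=n_3$. A small Python check:

```python
import cmath

w = cmath.exp(2j * cmath.pi / 3)   # primitive third root of unity

def balanced(parts):
    """sum_j y_j == 0 (with y_j = w**p_j) exactly when the three parts have
    equal sizes, so this tests feasibility for Max 3-section."""
    return abs(sum(w ** p for p in parts)) < 1e-9
```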

Clearly, the feasible region of problem M3S is a subset of that of M3C, and the optimal value of problem M3S is not greater than that of problem M3C. Assume that we have obtained a global optimal solution or a high-quality approximate solution $\mathbf y$ of problem M3C, with induced partition $(V_1,V_2,V_3)$. It is clear that $\mathbf y$ may not satisfy the condition $|V_1|=|V_2|=|V_3|$. But we may adjust $\mathbf y$ using a greedy algorithm to get a new feasible solution satisfying $|V_1|=|V_2|=|V_3|=n/3$. This is the motivation for the greedy algorithm we propose for the Max 3-section problem.

For the sake of our analysis, without loss of generality, we assume that the partition $(V_1,V_2,V_3)$ induced by the local maximizer satisfies $|V_1|\ge|V_2|\ge|V_3|$; that is, $V_1$ is the part with maximum cardinality. If this ordering fails, we may simply relabel the parts; the resulting new solution does not change the objective value, since the objective is symmetric in the three parts, and the relabeled partition satisfies $|V_1|\ge|V_2|\ge|V_3|$. Under our assumption, the partition still admits four possible cases.

Case 1. .

Case 2. .

Case 3. .

Case 4. .
The size-adjusting greedy algorithms for Cases 3 and 4 are similar to those for Cases 1 and 2. Hence, we mainly consider Cases 1 and 2, adjusting the partition $(V_1,V_2,V_3)$ so that $|V_1|=|V_2|=|V_3|=n/3$. For a node $v\in V_i$, denote by $\delta(v,V_j)$ the total weight of the edges joining $v$ to part $V_j$. Then it follows from a simple computation that moving $v$ from $V_i$ to $V_j$ changes the objective value by $\delta(v,V_i)-\delta(v,V_j)$.
In what follows, we describe the size-adjusting greedy algorithms (SAGAs) for Cases 1 and 2 and denote the greedy algorithms for the two cases by SAGA1 and SAGA2, respectively.
For SAGA1, one has the following.
(1) Calculate, for each candidate node $v$ and pair of parts $V_i,V_j$, the cost $\delta(v,V_i)-\delta(v,V_j)$ of moving $v$ from $V_i$ to $V_j$, where $\delta(v,V_j)$ is the total weight of the edges joining $v$ to $V_j$.
(2) If $|V_1|>n/3$, repeatedly move the node of $V_1$ with the smallest cost into the deficient part, recalculating the costs after each move, until $|V_1|=n/3$.
(3) If $|V_2|>n/3$, move the nodes of $V_2$ with the smallest cost into $V_3$ in the same greedy manner, recalculating the costs after each move, until $|V_2|=|V_3|=n/3$.
(4) Return the current partition $(V_1,V_2,V_3)$; stop.
For SAGA2, one has the following.
(1) Calculate the move costs as in SAGA1 for the nodes of the oversized parts.
(2) If $|V_1|>n/3$, move the cheapest nodes of $V_1$ into the deficient parts one at a time, recalculating the costs after each move, until $|V_1|=n/3$.
(3) If $|V_2|>n/3$, move the cheapest nodes of $V_2$ into $V_3$ one at a time, recalculating the costs after each move, until $|V_2|=|V_3|=n/3$.
(4) Return the current partition $(V_1,V_2,V_3)$; stop.
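A hedged sketch of the common idea behind SAGA1 and SAGA2 (our own simplified rendering, not the paper's exact update formulas): while some part is oversized, move the node whose transfer to an undersized part loses the least cut weight.

```python
def balance_parts(weights, parts):
    """Greedy size adjustment: repeatedly move, from an oversized part to an
    undersized one, the node whose move changes the cut value the least
    (ties broken arbitrarily), until all three parts have size n/3."""
    n = len(parts)
    assert n % 3 == 0
    target = n // 3
    inc = {i: [] for i in range(n)}
    for (i, j), wij in weights.items():
        inc[i].append((j, wij))
        inc[j].append((i, wij))

    def gain(i, new_p):
        # change in cut value if node i moves to part new_p
        return sum(wij * ((new_p != parts[j]) - (parts[i] != parts[j]))
                   for j, wij in inc[i])

    sizes = [parts.count(p) for p in range(3)]
    while max(sizes) > target:
        src = max(range(3), key=lambda p: sizes[p])  # an oversized part
        dst = min(range(3), key=lambda p: sizes[p])  # an undersized part
        mover = max((i for i in range(n) if parts[i] == src),
                    key=lambda i: gain(i, dst))      # least-damaging move
        parts[mover] = dst
        sizes[src] -= 1
        sizes[dst] += 1
    return parts
```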

5. Numerical Results

This section describes the experimental results obtained for some instances of the Max 3-cut and Max 3-section problems using the proposed VNS metaheuristic. We also show a quantitative comparison with the 0.836-approximation algorithm. The computational experiments were performed on an Intel Pentium 4 processor at 2.0 GHz with 512 MB of RAM, and all algorithms are coded in Matlab. Because the RSDP relaxation of M3C includes many slack variables, many constraints, and matrix variables without a block diagonal structure, in our numerical comparisons we choose SDPT3-4.0 [9], one of the best-known solvers for semidefinite programming, to solve the RSDP relaxation of M3C.

All our test problems are generated randomly in the following way. Let $p\in(0,1)$ be a constant and $r\in(0,1)$ a uniform random number generated for each node pair. If $r\le p$, then there is an edge between nodes $i$ and $j$ with weight $w_{ij}$, a random integer between 1 and 10. Otherwise, $w_{ij}=0$; that is, there is no edge between nodes $i$ and $j$. Because of the memory limits of SDPT3, when $n>200$, RSDP becomes a huge semidefinite programming problem with no fewer than 59700 slack variables and 99900 constraints and runs out of the memory of SDPT3. Hence, in the numerical experiments, we consider 30 instances with densities $p$ between 0.1 and 0.6 and $n$ varying from 20 to 200.
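The generator can be sketched in a few lines (Python; `random_graph` is an illustrative name):

```python
import random

def random_graph(n, p, seed=0):
    """Random test instance: each pair (i, j), i < j, gets an edge with
    probability p; edge weights are random integers between 1 and 10."""
    rng = random.Random(seed)
    weights = {}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() <= p:
                weights[(i, j)] = rng.randint(1, 10)
    return weights
```

For example, `random_graph(30, 0.6)` corresponds to the instance class W30.6 used below.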

Firstly, we check the influence of $k_{\max}$ on the quality of the solution obtained by VNS-k. For a given graph, we take several values of $k_{\max}$; Table 1 presents the results, where Wnp in the first column of this table and the following tables means that a graph is randomly generated with $n$ nodes and density $p$; for instance, W30.6 denotes a graph generated randomly with $n=30$ and $p=0.6$. We find from Table 1 that the influence of $k_{\max}$ on the objective value (denoted by Obj in Table 1) is slight, but the CPU time increases sharply as $k_{\max}$ increases. This result is actually not surprising. Indeed, because of the second stochastic step, we choose a point randomly within a neighborhood block rather than within a single neighborhood; this avoids choosing too large a $k_{\max}$, which would lead to more CPU time. Hence, in the sequel numerical comparisons, we fix $k_{\max}$ for all test problems.

Table 1: The objective value obtained by VNS for M3C with different $k_{\max}$.

Secondly, we compare the VNS (VNS-k, VNS-t) metaheuristic with the 0.836-approximation algorithm on all test problems. To avoid the effect of initial points, for each test problem, after RSDP is solved, we run the rounding procedure of the 0.836-approximation algorithm and the VNS metaheuristic ten times each.

Table 2 gives the results of the numerical comparisons. In Table 2, Objrsdp is the optimal value of problem RSDP; that is, it is an upper bound for M3C. ObjGM is the largest value obtained by the 0.836-approximation algorithm in the ten tests. Objvns stands for the largest value obtained by VNS for M3C in the ten tests. Two further columns report the numbers of constraints and slack variables (s.v.), and two time columns report the average time (seconds) of the two algorithms over the ten tests. For the maximum CPU time of VNS-t, we set a limit $t_{\max}$, but the real CPU time of VNS-t may be greater than $t_{\max}$. Additionally, to measure the quality of solutions, we take the ratio of the best objective value found to the upper bound Objrsdp, for M3C and for M3S, respectively. Clearly, this ratio reflects how close the solution obtained by VNS is to the optimal solution of RSDP. One can see from Table 2 that (1) the VNS metaheuristic not only obtains a better solution than the 0.836-approximation algorithm on all test problems, but its elapsed CPU time is also much less than that of the 0.836-approximation algorithm on all test problems; (2) the performance can be improved by VNS-t on most test problems when the termination criterion of VNS is based on the maximum CPU time, but VNS-t spends more computational time than VNS-k. The improvement is reflected in the final column of Table 2. On average, VNS-t improves the ratio by 0.91 percentage points.

Table 2: The numerical comparisons of 0.836-approximate algorithm with VNS metaheuristic.

Finally, we consider the solution of M3S by combining VNS-k with the size-adjusting greedy algorithm SAGA stated in Section 4. Let $\mathbf y$ be an approximate solution of M3C obtained by VNS; we then obtain an approximate solution of M3S from SAGA. The numerical results are reported in Table 3, in which the reported objective stands for the largest value obtained by VNS-k plus SAGA for M3S in the ten tests. Although our size-adjusting algorithm may decrease the objective value obtained by VNS, Table 3 shows that the changes in objective values are very slight. In particular, the objective values of some problems, such as W150.3, do not decrease but instead increase. We do not compare the obtained results with Andersson's 2/3-approximation algorithm, because we find that all approximate solutions of M3S obtained by VNS plus SAGA are still better than those of the 0.836-approximation algorithm, with the only exceptions of W30.1 and W30.3.

Table 3: The numerical results of combining VNS-k metaheuristic with SAGA for M3S.

6. Conclusions

A variable neighborhood stochastic metaheuristic was proposed in this paper to solve the Max 3-cut and Max 3-section problems. Our algorithms can solve Max 3-cut and Max 3-section problems of different sizes and densities. Although the 0.836-approximation algorithm has very good theoretical guarantees, our numerical comparisons indicate that, in practice, the proposed VNS metaheuristic is superior to the well-known 0.836-approximation algorithm and can efficiently obtain very high-quality solutions of the Max 3-cut and Max 3-section problems.

We mention that the proposed algorithm can in fact deal with higher-dimensional G-set graph problems created by Prof. Rinaldi using the graph generator rudy. But we cannot give numerical comparisons with the 0.836-approximation algorithm, since the RSDP relaxations of these problems exceed the memory limits of all current SDP software. In addition, if we increase $k_{\max}$ or $t_{\max}$ in the numerical implementation, then the quality of the solution of M3C obtained by VNS can be further improved.

Funding

This work is supported by the National Natural Science Foundation of China (nos. 71001045 and 10971162), the China Postdoctoral Science Foundation (no. 20100480491), the Natural Science Foundation of Jiangxi Province of China (no. 20114BAB211008), and the Jiangxi University of Finance and Economics Support Program Funds for Outstanding Youths.

Acknowledgment

The author would like to thank the editor and an anonymous referee for their numerous suggestions for improving the paper.

References

1. R. M. Karp, “Reducibility among combinatorial problems,” in Complexity of Computer Computations, R. Miller and J. Thatcher, Eds., pp. 85–103, Plenum Press, New York, NY, USA, 1972.
2. M. R. Garey, D. S. Johnson, and L. Stockmeyer, “Some simplified NP-complete graph problems,” Theoretical Computer Science, vol. 1, no. 3, pp. 237–267, 1976.
3. F. Barahona, M. Grötschel, G. Reinelt, and M. Jünger, “An application of combinatorial optimization to statistical physics and circuit layout design,” Operations Research, vol. 36, no. 3, pp. 493–513, 1988.
4. M. X. Goemans and D. P. Williamson, “Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming,” Journal of the Association for Computing Machinery, vol. 42, no. 6, pp. 1115–1145, 1995.
5. A. Frieze and M. Jerrum, “Improved approximation algorithms for MAX $k$-CUT and MAX BISECTION,” in Integer Programming and Combinatorial Optimization, E. Balas and J. Clausen, Eds., vol. 920, pp. 1–13, 1995.
6. M. X. Goemans and D. P. Williamson, “Approximation algorithms for MAX-3-CUT and other problems via complex semidefinite programming,” Journal of Computer and System Sciences, vol. 68, no. 2, pp. 442–470, 2004.
7. S. Zhang and Y. Huang, “Complex quadratic optimization and semidefinite programming,” SIAM Journal on Optimization, vol. 16, no. 3, pp. 871–890, 2006.
8. J. F. Sturm, “Using SeDuMi 1.02, ‘a MATLAB toolbox for optimization over symmetric cones’,” Optimization Methods and Software, vol. 11, no. 1–4, pp. 625–653, 1999.
9. K.-C. Toh, M. J. Todd, and R. H. Tütüncü, “SDPT3 version 4.0 (beta)—a MATLAB software for semidefinite-quadratic-linear programming,” 2004, http://www.math.nus.edu.sg/~mattohkc/sdpt3.html.
10. N. Mladenović and P. Hansen, “Variable neighborhood search,” Computers & Operations Research, vol. 24, no. 11, pp. 1097–1100, 1997.
11. P. Hansen, N. Mladenović, and J. A. Moreno Pérez, “Variable neighbourhood search: methods and applications,” Annals of Operations Research, vol. 175, pp. 367–407, 2010.
12. G. Andersson, “An approximation algorithm for Max $p$-Section,” Lecture Notes in Computer Science, vol. 1563, pp. 237–247, 1999.
13. D. R. Gaur, R. Krishnamurti, and R. Kohli, “The capacitated max $k$-cut problem,” Mathematical Programming, vol. 115, no. 1, pp. 65–72, 2008.
14. A.-F. Ling, “Approximation algorithms for Max 3-section using complex semidefinite programming relaxation,” in Combinatorial Optimization and Applications, vol. 5573 of Lecture Notes in Computer Science, pp. 219–230, 2009.