Abstract

This paper presents a novel optimization method for effectively solving nonconvex quadratically constrained quadratic programming (NQCQP) problems. By applying a novel parametric linearizing approach, the initial NQCQP problem and its subproblems are transformed into a sequence of parametric linear programming relaxation problems. To enhance the computational efficiency of the presented algorithm, a cutting down approach is incorporated into the branch and bound algorithm. By solving a sequence of parametric linear programming problems, the presented algorithm converges to the global optimum point of the NQCQP problem. Finally, numerical experiments demonstrate the performance and computational superiority of the presented algorithm.

1. Introduction

Nonconvex quadratically constrained quadratic programming problems have attracted the attention of practitioners and researchers for 30 years. During the past 10 years, interest in these problems has been especially intense. In part, this is because NQCQP problems have a large number of practical applications, for example, pooling problems in petrochemistry [1], modularization of product subassemblies [2], chance-constrained optimization problems, production planning or portfolio optimization [3–5], the fuel mixture problem encountered in the oil industry [6], and placement and layout problems appearing in integrated circuit design (see [7, 8]). In addition, many nonlinear optimization problems can be transformed into this form; for example, special classes of structured stochastic games [9] can be interpreted as quadratic programming problems, the problem of packing within the unit square can be formulated as a quadratic programming problem with concave quadratic constraints, 0-1 variables may also be represented by concave quadratic constraints, and minimax location problems [4] likewise lead to quadratic programming problems with quadratic constraints. Another reason for the strong interest in NQCQP problems is that, from a research point of view, this class of problems poses significant theoretical and computational challenges. This is mainly because these problems are global optimization problems; that is, they are well known to generally possess multiple local optima that are not global optima. Therefore, it is essential to develop a good global optimization method for solving NQCQP problems.

In this paper, we investigate the following NQCQP problem: where are all symmetric matrices, , , , and , .
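For concreteness, a problem of this class can be stated in the following standard form; the symbols $A_j$, $c_j$, $b_j$, and the box $X^0$ below are illustrative notation only, introduced to fix ideas for the sketches given later, and need not coincide with the notation of the original formulation:
\[
\begin{aligned}
(\mathrm{P})\quad \min_{x \in X^0}\;\; & f_0(x) = x^{T} A_0 x + c_0^{T} x \\
\text{s.t.}\;\; & f_j(x) = x^{T} A_j x + c_j^{T} x \le b_j, \quad j = 1,\dots,m, \\
& X^0 = \{\, x \in \mathbb{R}^n : l^0 \le x \le u^0 \,\},
\end{aligned}
\]
where each $A_j$ is a symmetric, possibly indefinite, matrix.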

In the last several decades, some algorithms have been developed for globally solving special or general cases of the NQCQP problems. For example, based on a novel reformulation-linearization/convexification approach, Sherali and Tuncbilek [10] proposed a global optimization algorithm for linearly constrained nonconvex quadratic programming problems. Based on outer approximation and a branch and bound scheme that solves linear programming subproblems, Al-Khayyal et al. [7] presented an algorithm for computing approximate global optimal solutions of the NQCQP problems. Using a Lagrangian underestimation method to compute lower bounds and the interval Newton method to accelerate convergence in the neighborhood of the global optimum point, Van Voorhis [11] developed a branch and bound algorithm for globally solving the NQCQP problems. By partitioning the feasible region into the Cartesian product of two-dimensional triangles and rectangles and by utilizing the convex and concave envelopes of bilinear functions over triangles and rectangles, a simplicial branch and bound algorithm [12] was presented for globally solving the NQCQP problems. Based on semidefinite relaxations and a finite KKT-branching method, Burer and Vandenbussche [13] presented a finite branch and bound algorithm for globally solving the NQCQP problems. In [14], Zheng et al. presented a decomposition-approximation method for constructing convex relaxations of the NQCQP problems, which offers tighter lower bounds for solving the problem . Using a duality bounds approach, Thoai [15] presented a branch and bound algorithm for solving the NQCQP problems. Based on a linear relaxation approximation technique and a linearity-based range-deleting strategy, Gao et al. [16] presented a rectangular branch and reduce method for linearly constrained quadratic programming problems; by utilizing a linearizing technique and a quadratic constraint-based range-compressing technique, Gao et al. [17] presented a branch and reduce approach for globally solving the NQCQP problems. By exploiting the special structure of quadratic functions and a linearization technique, Shen et al. [18] and Shen and Liu [19] proposed two effective global optimization algorithms for solving the NQCQP problems. By making use of a linear relaxation approximation technique, Qu et al. [20] and Jiao and Chen [21] proposed two deterministic algorithms for solving the NQCQP problems. In addition to the references reviewed above, several algorithms for the generalized geometric programming problem presented in [22–30] can also be used to solve the NQCQP problems. For an excellent review of recent advances in global optimization, the reader is referred to Floudas and Gounaris [31].

In this paper, by combining a parametric linearizing approach with a cutting down approach, we present a novel global optimization method for solving the NQCQP problems. The main features of the presented approach are as follows. Firstly, a novel linear relaxation approximation technique, that is, a parametric linearizing approach, is constructed for successively transforming the NQCQP problems into a sequence of parametric linear programming relaxation problems, and by successively subdividing the initial hyperrectangle, the optimal solutions of the parametric linear programming relaxation problems can approach the global optimum point of the problem arbitrarily closely. Secondly, the proposed parametric linear programming relaxation problems are embedded in a branch and bound framework without adding any new variables or constraint functions, and they can be solved easily by any effective linear programming procedure. Thirdly, a cutting down approach is exploited to eliminate a large part of the currently investigated subhyperrectangle that does not contain the global optimum point of the problem . Combining the parametric linear programming relaxation problem with the cutting down approach in a branch and bound procedure, a new optimization method is developed for globally solving the NQCQP problems. Finally, numerical results indicate that the proposed method can be employed to obtain the global optimum point of the problem .

The remainder of this paper is organized as follows. The next section describes the novel parametric linearizing approach and constructs the parametric linear programming relaxation of the problem . In Section 3 a cutting down approach is presented. Section 4 embeds the cutting down approach within a branch and bound scheme and describes the resulting optimization algorithm and its global convergence. In Section 5 some test examples and their results are reported to demonstrate the feasibility and efficiency of the presented algorithm. Finally, some concluding remarks are given.

2. Parametric Linear Relaxation

The principal task in the construction of a branch and bound algorithm for globally solving the problem is the computation of lower bounds for the problem and its partitioned subproblems. The lower bounds of the global minimum values of the problem and its partitioned subproblems can be calculated by solving a sequence of parametric linear programming relaxation problems . To construct the PLPRP, the proposed approach replaces each quadratic function by a parametric linear function.
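Schematically, if $\ell_j(\cdot, Y)$ denotes a parametric linear underestimator of the quadratic function $f_j$ over a subhyperrectangle $Y$ (the symbols $\ell_j$, $f_j$, $b_j$, $\mathrm{LB}(Y)$, and $v(Y)$ are illustrative notation, not necessarily that of the paper), the role of the relaxation is to guarantee the lower-bounding relation
\[
\mathrm{LB}(Y) \;=\; \min_{\substack{x \in Y \\ \ell_j(x,Y) \le b_j,\; j=1,\dots,m}} \ell_0(x,Y)
\;\;\le\;\; \min_{\substack{x \in Y \\ f_j(x) \le b_j,\; j=1,\dots,m}} f_0(x) \;=\; v(Y),
\]
which holds because every point feasible for the quadratic constraints on $Y$ is also feasible for their linear underestimators, and $\ell_0(x,Y) \le f_0(x)$ on $Y$.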

Let with . For each , define , where , , , , and . For convenience of exposition, for all , for each , the following symbols are introduced:

Obviously, we have

Theorem 1. For any , for each , consider the functions and ; then the following conclusions hold: (i) ; (ii) and as .

Proof. (i) The gradient function of the function , can be expressed as follows: Obviously, we have
By the mean value theorem, for any , there exists a point , where , such that If , then we have If , it follows that Therefore, we can get that
Analogously, if , then it follows that If , the following inequalities hold: Hence, it follows as above that Conclusion (i) follows.
(ii) Considering , we have where with .
Since as , we can get that Similarly, we can prove that This completes the proof.

By Theorem 1, for any , for each , we define

Theorem 2. For any , for each , the following conclusions hold:(i); (ii) and as .

Proof. (i) By the definitions of and and conclusion (i) of Theorem 1, we have Therefore, conclusion (i) follows.
(ii) By the expressions of , , , , , and , we can get that
By the proof of Theorem 1(ii), we can get This completes the proof.

By Theorem 2, we can construct the parametric linear programming relaxation problem , which underestimates the problem over the subhyperrectangle , as follows: where

Based on the above parametric linearizing approach, every feasible point of the problem is feasible for the problem over the subhyperrectangle , and the objective function value of the problem at each feasible point is less than or equal to that of the problem over the subhyperrectangle . Thus, the optimal value of the problem provides a valid lower bound for the optimal value of the problem over the subhyperrectangle .
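The following Python sketch illustrates how such a linear relaxation can be assembled and solved. It is not the paper's exact parametric linearization (whose formulas are given in Theorems 1 and 2); instead, a term-wise McCormick/secant/tangent underestimator is used as a stand-in, under the notation of the generic problem stated in the Introduction. The function names and the use of scipy.optimize.linprog are illustrative choices.

```python
# Sketch only: term-wise linear underestimation of f(x) = x'Ax + c'x over a box
# [l, u], followed by an LP lower bound.  The McCormick/secant/tangent bounds
# below stand in for the parametric linear functions of Theorems 1 and 2.
import numpy as np
from scipy.optimize import linprog


def linear_underestimator(A, c, l, u):
    """Return (a, b) such that a'x + b <= x'Ax + c'x for all l <= x <= u."""
    n = len(c)
    a = np.array(c, dtype=float)
    b = 0.0
    for i in range(n):
        for j in range(n):
            q = float(A[i][j])
            if q == 0.0:
                continue
            if i == j:
                if q > 0.0:                 # convex term: tangent at the midpoint
                    m = 0.5 * (l[i] + u[i])
                    a[i] += 2.0 * q * m
                    b -= q * m * m
                else:                       # concave term: secant over [l_i, u_i]
                    a[i] += q * (l[i] + u[i])
                    b -= q * l[i] * u[i]
            else:
                if q > 0.0:                 # underestimate x_i * x_j (McCormick)
                    a[i] += q * l[j]
                    a[j] += q * l[i]
                    b -= q * l[i] * l[j]
                else:                       # overestimate x_i * x_j (McCormick)
                    a[i] += q * l[j]
                    a[j] += q * u[i]
                    b -= q * u[i] * l[j]
    return a, b


def lp_lower_bound(A0, c0, quad_cons, l, u):
    """Lower bound of min x'A0 x + c0'x  s.t.  x'Aj x + cj'x <= bj,  l <= x <= u.

    quad_cons is a list of (Aj, cj, bj) triples.  Returns (bound, x), or
    (inf, None) if the linear relaxation is infeasible.
    """
    a0, b0 = linear_underestimator(A0, c0, l, u)
    A_ub, b_ub = [], []
    for Aj, cj, bj in quad_cons:
        aj, cst = linear_underestimator(Aj, cj, l, u)
        A_ub.append(aj)
        b_ub.append(bj - cst)
    res = linprog(a0,
                  A_ub=np.array(A_ub) if A_ub else None,
                  b_ub=np.array(b_ub) if b_ub else None,
                  bounds=list(zip(l, u)), method="highs")
    if not res.success:
        return np.inf, None
    return res.fun + b0, res.x
```

Replacing linear_underestimator by the paper's parametric bounds would leave the surrounding branch and bound machinery unchanged; for instance, for the objective 2*x1*x2 over [0, 2]^2 with no quadratic constraints, lp_lower_bound returns the valid lower bound 0.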

3. Cutting Down Approach

To enhance the computational speed of the presented algorithm, based on the above parametric linear relaxation problem, a novel cutting down approach is described in the following Theorem 3. At the -th iteration of the proposed algorithm, we judge whether or not the subhyperrectangle contains a global optimum point of the problem . The cutting down approach can be used to reject a part of the subhyperrectangle , or the whole of it, without deleting any global optimum point of the problem . For convenience, for any with , without loss of generality, we express the in the problem over the subhyperrectangle in the following form:

Assume that is the currently known upper bound of the proposed branch and bound algorithm, and for any fixed , let

Define , and , where

Theorem 3. For any subhyperrectangle , one has the following results.
(i) If , then the subhyperrectangle does not contain the global optimum point of the problem .
(ii) If , then for any , if , then the subhyperrectangle does not contain the global optimum point of the problem ; if , then the subhyperrectangle does not contain the global optimum point of the problem .
(iii) If there exists some such that , then the subhyperrectangle does not contain the global optimum point of the problem .
(iv) If for all , then for each , if , then the subhyperrectangle does not contain the global optimum point of the problem ; if , then the subhyperrectangle does not contain the global optimum point of the problem .

Proof. (i) If , then for all , by Theorem 2 we have Hence, the subhyperrectangle does not contain the global optimum point of the problem .
(ii) If , then for any , if , for all , we have ; that is, . Thus, we have Therefore, by the above inequality and Theorem 2, we get that Hence, the rectangle does not contain the global optimum point of the problem .
Similarly, if , we can prove that the subhyperrectangle does not contain the global optimum point of the problem .
Using a similar argument to the above, we can obtain conclusions (iii) and (iv).

By Theorem 3, the above cutting down approach can be used to reject the part of the investigated subhyperrectangle that does not contain a global minimum point of the problem , thus enhancing the computational speed of the proposed branch and bound algorithm.
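As an illustration of the idea (not the paper's exact cutting rules, which also exploit the constraint underestimators and the sign conditions of Theorem 3), the following sketch tightens a box Y = [l, u] against a known upper bound UB using a linear underestimator a'x + b of the objective over Y; all names are assumptions.

```python
# Generic range-reduction sketch in the spirit of Theorem 3: any point whose
# linear underestimator value already exceeds the incumbent UB cannot be a
# global optimum, so the corresponding part of the box can be cut away.
def cut_down(a, b, l, u, UB):
    """Return tightened bounds (l, u), or None if the whole box can be discarded."""
    l, u = list(l), list(u)
    n = len(a)
    # smallest possible value of a'x + b over the box
    low = b + sum(min(a[i] * l[i], a[i] * u[i]) for i in range(n))
    if low > UB:
        return None                      # the box contains no global optimum
    for i in range(n):
        # contribution of all coordinates except x_i at their minimizing ends
        rest = low - min(a[i] * l[i], a[i] * u[i])
        if a[i] > 0.0:
            u[i] = min(u[i], (UB - rest) / a[i])   # cut the upper part of [l_i, u_i]
        elif a[i] < 0.0:
            l[i] = max(l[i], (UB - rest) / a[i])   # cut the lower part of [l_i, u_i]
        if l[i] > u[i]:
            return None
    return l, u
```

In the spirit of Theorem 3, a box returned as None is discarded entirely, while the tightened bounds remove the portion of the box that cannot contain a global optimum.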

4. Algorithm and Its Convergence

In this section, based on the foregoing parametric linear programming relaxation problem, we present a novel optimization method for globally solving the problem . There are three fundamental components in the presented method: a branching approach, an upper-bound updating approach, and a lower-bound updating approach.

The branching approach iteratively subdivides the investigated hyperrectangle into two subhyperrectangles, which produces a progressively refined partition for computing the global optimum point of the problem . In this paper we select a simple partitioning approach, which suffices to guarantee the global convergence of the presented branch and bound algorithm. For any selected subhyperrectangle , the partitioning approach is described as follows.
(a) Let .
(b) Let
By making use of this branching approach, the selected hyperrectangle is subdivided into two subhyperrectangles and .
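A minimal sketch of such a bisection rule is given below; since the exact edge-selection rule in (a) and (b) is not reproduced here, the sketch splits the longest edge of the selected box at its midpoint, a common exhaustive choice.

```python
def branch(l, u):
    """Split the box [l, u] into two sub-boxes by bisecting its longest edge."""
    i = max(range(len(l)), key=lambda k: u[k] - l[k])   # edge to be bisected
    mid = 0.5 * (l[i] + u[i])
    l_left, u_left = list(l), list(u)
    l_right, u_right = list(l), list(u)
    u_left[i] = mid          # left child:  x_i in [l_i, mid]
    l_right[i] = mid         # right child: x_i in [mid, u_i]
    return (l_left, u_left), (l_right, u_right)
```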

The lower-bound updating approach requires solving a sequence of parametric linear programming relaxation problems by the simplex method. The upper-bound updating approach calculates the objective function value of the problem at feasible points, where the feasible points are found by solving the parametric linear programming relaxation problem and by probing the feasibility of the midpoint of the investigated subhyperrectangle , respectively.
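The upper-bound update can be sketched as follows: a candidate point (the relaxation solution or the box midpoint) is checked against the original quadratic constraints and, if feasible, its objective value is compared with the incumbent. The helper below assumes the generic data (A_j, c_j, b_j) from the Introduction as NumPy arrays and is illustrative only.

```python
import numpy as np


def try_update_upper_bound(x, A0, c0, quad_cons, UB, best, tol=1e-8):
    """If x is feasible for the quadratic constraints, use it to update the incumbent."""
    x = np.asarray(x, dtype=float)

    def q(A, c):                         # evaluate x'Ax + c'x
        return float(x @ np.asarray(A) @ x + np.asarray(c) @ x)

    if all(q(Aj, cj) <= bj + tol for Aj, cj, bj in quad_cons):
        val = q(A0, c0)
        if val < UB:
            return val, x                # new incumbent found
    return UB, best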

4.1. Novel Optimization Algorithm

Let and be the optimal value and the optimal solution of the problem over subhyperrectangle , respectively. Combining the former parametric linear programs relaxation problem with the cutting down approach in a branch and bound framework, a novel global optimization method for the problem is described as follows.

Algorithm Steps

Step 1 (initializing). Initialize the iteration counter , the collection of all active nodes , the feasible solution set , the convergence error , and the upper bound .
Solve the problem over the hyperrectangle to compute and . For all , if , then let and .
If , then the algorithm terminates with as the global optimum point of the problem . Otherwise, go to Step 2.

Step 2 (partitioning the hyperrectangle). Utilizing the proposed branching approach, select a branching variable to partition into two new subhyperrectangles, and denote the new collection of partitioned subhyperrectangles by .

Step 3 (cutting down region). For each subhyperrectangle , for any fixed parameter vector , compute , , and .
For each , if , then let ; otherwise, if and for some , then let ; otherwise, if and for some , then let .
If , then let ; otherwise, if and for some , then let ; otherwise, if and for some , then let .
Finally, denote the remaining subhyperrectangle by , and denote the remaining partitioned set by .

Step 4 (feasibility fathoming). For each new subhyperrectangle , compute the lower bounds and by solving the problem over .
If , set ; otherwise, if the midpoint of satisfies for all , then let , and if satisfies for all , then let .

Step 5 (updating bounds). Update the upper bound . If , denote the best known feasible solution by . Let , and update the lower bound .

Step 6 (convergence fathoming). If , then the algorithm stops, and is the global -minimum value of the problem and is a global -optimum point. Otherwise, set , select a new subhyperrectangle such that , and return to Step 2.
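Putting the pieces together, the skeleton below mirrors Steps 1–6 using the illustrative helpers sketched earlier (lp_lower_bound, cut_down, branch, try_update_upper_bound). It simplifies the parameter vectors and the cutting rules and assumes best-first node selection, so it is a sketch of the algorithmic flow rather than the exact algorithm.

```python
import numpy as np


def solve_nqcqp(A0, c0, quad_cons, l0, u0, eps=1e-6, max_iter=10000):
    """Simplified branch and bound skeleton following Steps 1-6 (sketch only)."""
    UB, best = np.inf, None
    LB0, x0 = lp_lower_bound(A0, c0, quad_cons, l0, u0)        # Step 1
    if x0 is not None:
        UB, best = try_update_upper_bound(x0, A0, c0, quad_cons, UB, best)
    nodes = [(LB0, list(l0), list(u0))]
    for _ in range(max_iter):
        if not nodes:
            break
        nodes.sort(key=lambda nd: nd[0])
        LB, l, u = nodes.pop(0)                                 # smallest lower bound
        if UB - LB <= eps:                                      # Step 6: convergence
            break
        for lc, uc in branch(l, u):                             # Step 2: partition
            lb, x = lp_lower_bound(A0, c0, quad_cons, lc, uc)   # Step 4: lower bound
            if x is None or lb > UB - eps:
                continue                                        # fathom the child
            # Step 5: candidate upper bounds from the LP solution and the midpoint
            UB, best = try_update_upper_bound(x, A0, c0, quad_cons, UB, best)
            mid = [(a + b) / 2.0 for a, b in zip(lc, uc)]
            UB, best = try_update_upper_bound(mid, A0, c0, quad_cons, UB, best)
            # Step 3: cut the child box using the objective underestimator
            a, bconst = linear_underestimator(A0, c0, lc, uc)
            cut = cut_down(a, bconst, lc, uc, UB)
            if cut is not None:
                nodes.append((lb, cut[0], cut[1]))
    return UB, best
```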

4.2. Global Convergence of the Algorithm

The global convergence of the presented algorithm is described as follows.

Theorem 4. If the proposed algorithm terminates finitely at the -th iteration, then upon termination is the global optimum point of the problem ; otherwise it generates an infinite sequence of iterations such that any accumulation point of the sequence is the global optimum point of the problem , the sequence is nonincreasing, and the sequence is nondecreasing; moreover, they satisfy , where is the global minimum value of the problem .

Proof. (i) If the proposed algorithm terminates finitely at the -th iteration, then upon termination we have . Thus, by the structure of the proposed branch and bound algorithm, we obtain the global optimum point of the problem . If the proposed algorithm does not terminate, then it generates an infinite subhyperrectangle sequence ; since the branching approach used is exhaustive, the subhyperrectangle sequence converges to a point. By the branch and bound structure of the algorithm, the sequence is nonincreasing and the sequence is nondecreasing; therefore the sequence is a positive and nonincreasing sequence. From Theorem 2, we know that the sequence must converge to zero. Moreover, for each , indicates that . Since is always a feasible solution of the problem and the upper bound , any cluster point of the sequence must be feasible for the problem with objective function value . Therefore, the conclusion follows.

5. Numerical Experiments

To validate the performance and computational efficiency of the presented optimization method, several test examples common in the literature are implemented on a microcomputer. The algorithm is coded in C++, the simplex method is applied to solve the sequence of parametric linear programming relaxation problems, and the termination tolerance is set to . These test examples are described and their numerical results are listed below. In Tables 1, 2, and 3, the number of algorithm iterations and the computational time in seconds are denoted by "Iter" and "Time," respectively.

Example 1 (see [21, 32]). Consider

Example 2 (see [19, 21, 22]). Consider

Example 3 (see [19, 21, 22]). Consider

Example 4 (see [21, 23]). Consider

Example 5 (see [17, 21]). Consider

Example 6 (see [21, 25]). Consider

Example 7 (see [33, 34]). Consider
This problem arises from the heat exchanger design problem. Solving it by the proposed method with the given convergence error and parameter vector , , a global -optimal solution is found after iterations with objective function value , and the computational time is  s.
Using the approach proposed in Floudas et al. [33], a global optimal solution, under the given convergence tolerance , is found with global optimal value .
Using the approach proposed in Lin and Tsai [34], under the convergence tolerance , a global optimal solution is found with global optimal value .

Example 8 (see [24]). Consider
This test problem has a relatively high degree of difficulty, and it contains both negative and positive terms. Using the algorithm proposed in this paper, initializing the parameters , , with the given convergence error , the global -optimal solution is obtained after iterations.
In contrast, using the algorithm proposed in [24], with the given convergence error , the global -optimal solution is obtained after iterations.

Example 9 (see [16, 21]). Consider
Using the algorithm proposed in this paper, initializing the parameters , , the numerical results are compared with those in [16, 21] and are reported in Table 3.
The numerical results for Examples 1–9 show that our algorithm is competitive.

6. Concluding Remarks

In this paper, a novel optimization method based on the parametric linear programming relaxation problem is proposed for globally solving the NQCQP problem. The parametric linear programming relaxation problem is constructed by underestimating each quadratic function with a parametric linear function. By making use of the currently known upper bound and the parametric linear programming relaxation of the problem , a cutting down approach is constructed and used to enhance the computational speed of the branch and bound algorithm. The algorithm converges to the global optimum point by subdividing the initial hyperrectangle and solving a sequence of parametric linear programming relaxation problems. Numerical results demonstrate that the presented method can effectively solve the problem .

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research was supported by the National Natural Science Foundation of China under Grant 11171094 and by the Science and Technology Key Project of the Education Department of Henan Province (14A110024).