Abstract

The reverse bridge theorem (RBTH) has been proved to be a necessary and sufficient condition for solving nonlinear programming problems. In this paper, we first propose three algorithms for finding constrained minimum points of continuous, discrete, and mixed-integer nonlinear programming problems based on the reverse bridge theorem. Moreover, we prove that RBTH under constraint partition is also a necessary and sufficient condition for solving nonlinear programming problems. This property allows us to develop an algorithm that uses RBTH under constraint partition. Specifically, the algorithm first partitions a mixed-integer nonlinear programming problem (MINLP) by its constraints into several subproblems of similar form, then solves each subproblem by using RBTH directly, and finally resolves the unsatisfied global constraints by choosing appropriate penalties. We prove the soundness and completeness of this algorithm, and experimental results show that it is effective and efficient.

1. Introduction

Nonlinear programming problems (NLPs) play an important role in both manufacturing systems and industrial processes and have been widely used in the fields of operations research, planning and scheduling, optimal control, engineering design, and production management [1–7]. Due to their significance in both academic and engineering applications, many different approaches have been proposed to solve NLPs, with considerable success. In [8], we proposed a general approach, called the reverse bridge theorem (RBTH), which can significantly reduce the complexity of solving NLPs. We also proved in [8] that RBTH is a necessary and sufficient condition for solving NLPs. Compared to other methods such as the extended saddle-point condition (ESPC) [9, 10], RBTH has two obvious advantages. First, the core inequality of RBTH consists of only one subinequality and one subequality; thus RBTH is easier to handle. Second, RBTH does not need extra conditions for solving NLPs and can therefore be applied more widely. However, in [8] we did not provide any concrete algorithm for solving NLPs using RBTH. Consequently, in this paper, we first present three algorithms for solving continuous nonlinear programming problems (CNLPs), discrete nonlinear programming problems (DNLPs), and mixed-integer nonlinear programming problems (MINLPs), respectively. After that, we prove the soundness and completeness of these algorithms.

On the other hand, constraint partition has recently proved to be an attractive approach for solving large-scale NLPs [11–18]. Based on the regular constraint structure of a problem instance, we can cluster its constraints into multiple loosely coupled partitions. Accordingly, the original problem can be partitioned by its constraints into several subproblems, each of which is a relaxation of the original problem and can be solved in exponentially less time. In Section 4 of this paper, we first prove that RBTH under constraint partition is a necessary and sufficient condition for solving nonlinear programming problems. We then prove that this necessary and sufficient condition can be rewritten into several necessary conditions. This property allows us to develop a novel algorithm using RBTH under constraint partition. Specifically, the algorithm first partitions a nonlinear programming problem into several relaxed subproblems, each of which is then solved by using RBTH. After that, the algorithm resolves the unsatisfied global constraints by choosing appropriate penalties. Finally, we prove that this algorithm is sound and complete.

The paper is organized as follows. After this introduction, we recall some basic notions and related work in Section 2. Then in Section 3, we introduce how to solve nonlinear programming problems using RBTH. In Section 4, we show how to solve nonlinear programming problems using RBTH under constraint partition. In Section 5, simulations and comparisons based on some benchmarks are carried out, which show that our algorithm is both effective and efficient. In the last section we conclude this paper.

2. Preliminaries

In this section, we first recall some basic concepts that will be used in this paper. For details, we refer to [1–7].

2.1. Basic Concepts

Generally speaking, nonlinear programming problems can be divided into three categories, that is, continuous nonlinear programming problems, discrete nonlinear programming problems, and mixed-integer nonlinear programming problems. The general forms of these nonlinear programming problems are as follows:

Definition 2.1 (Continuous Nonlinear Programming). A continuous nonlinear programming problem (CNLP) is defined as

(Pc): min f(x)  subject to  h(x) = 0, g(x) ≤ 0,

where x = (x1, …, xn) ∈ Rⁿ are continuous variables, h(x) = (h1(x), …, hm(x))ᵀ, and g(x) = (g1(x), …, gr(x))ᵀ. The function f is assumed to be continuous and differentiable, and the constraint functions h(x) and g(x) can be discontinuous, nondifferentiable, and not in closed form.

Definition 2.2 (Discrete Nonlinear Programming). A discrete nonlinear programming problem (DNLP) is defined as

(Pd): min f(y)  subject to  h(y) = 0, g(y) ≤ 0,

where y = (y1, …, yn) are discrete variables. The functions f, h = (h1, …, hm)ᵀ, and g = (g1, …, gr)ᵀ may be discontinuous, nondifferentiable, or not even given in closed form.

Definition 2.3 (Mixed-Integer Nonlinear Programming). A mixed-integer nonlinear programming problem (MINLP) is defined as

(Pm): min f(x, y)  subject to  h(x, y) = 0, g(x, y) ≤ 0,

where x are continuous variables and y are discrete variables. The function f is bounded below and is assumed to be continuous and differentiable with respect to x, and the constraint functions h = (h1, …, hm)ᵀ and g = (g1, …, gr)ᵀ are general functions that can be discontinuous, nondifferentiable, or not even given in closed form.

The aims of solving continuous, discrete, and mixed-integer nonlinear programming problems are, respectively, to find the constrained minima with respect to the neighbourhood of a continuous point x, a discrete point y, and a mixed point (x, y).

Definition 2.4 (Constrained Minimum of CNLP). Point x* is a constrained minimum of Pc (CMc) if x* is feasible and f(x*) ≤ f(x) for all feasible x in the neighbourhood of x*, that is, x ∈ N(x*).

Definition 2.5 (Constrained Minimum of DNLP). Point y* is a constrained minimum of Pd (CMd) if y* is feasible and f(y*) ≤ f(y) for all feasible y in a neighbourhood of y*.

Definition 2.6 (Constrained Minimum of MINLP). Point (x*, y*) is a constrained minimum of Pm (CMm) if (x*, y*) is feasible and f(x*, y*) ≤ f(x, y) for all feasible (x, y) in a neighbourhood of (x*, y*).

2.2. Related Work

In the following, we introduce some existing methods for solving nonlinear programming problems.

2.2.1. Necessary Karush-Kuhn-Tucker (KKT) Condition [18, 19]

KKT is mainly designed for solving continuous nonlinear programming problems. The penalty function of Pc is a Lagrangian function with Lagrange-multiplier vectors λ and μ, which is defined as

L(x, λ, μ) = f(x) + λᵀh(x) + μᵀg(x). (2.4)

Theorem 2.7 (see [19]). If point x* is a CMc of Pc and a regular point (gradient vectors of equality constraints and active inequality constraints are linearly independent), then there exist unique λ* and μ* such that

∇f(x*) + (λ*)ᵀ∇h(x*) + (μ*)ᵀ∇g(x*) = 0, (2.5)

where μj* ≥ 0 for all j ∈ A(x*) (the set of active constraints) and μj* = 0 otherwise.
The unique x*, λ*, and μ* that satisfy equation (2.5) can be found by solving equation (2.5) as a system of nonlinear equations. Because KKT is only a necessary condition, some points satisfying it are not CMc. Moreover, the approach is limited to solving CNLPs with continuous and differentiable functions.
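As an illustration, the KKT system can indeed be solved directly as simultaneous equations. The following sketch is ours, not from the paper: the toy instance, the `solve_linear` helper, and all names are hypothetical. For min x1² + x2² subject to x1 + x2 − 1 = 0, stationarity of the Lagrangian together with the constraint yields a 3 × 3 linear system.

```python
# Hedged sketch: solving the KKT stationarity system as a system of equations,
# here for the toy CNLP  min x1^2 + x2^2  s.t.  h(x) = x1 + x2 - 1 = 0.
# Stationarity (grad f + lam * grad h = 0) plus h(x) = 0 is linear in this case.

def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# KKT system: 2*x1 + lam = 0;  2*x2 + lam = 0;  x1 + x2 = 1
A = [[2.0, 0.0, 1.0],
     [0.0, 2.0, 1.0],
     [1.0, 1.0, 0.0]]
b = [0.0, 0.0, 1.0]
x1, x2, lam = solve_linear(A, b)   # expected: x1 = x2 = 0.5, lam = -1
```

By symmetry the minimizer splits the constraint evenly, which the solved system confirms.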

2.2.2. Sufficient Saddle-Point (SP) Condition [6]

The concept of saddle point has been widely studied during the past decades.

Theorem 2.8 (see [6]). For continuous and differentiable constraint functions, point x* is a CMc of Pc if there exist unique λ* and μ* that satisfy the following saddle-point condition at x*:

L(x*, λ, μ) ≤ L(x*, λ*, μ*) ≤ L(x, λ*, μ*) (2.6)

for all x in a neighbourhood of x* and all λ and μ ≥ 0.
The existing saddle-point condition is only a sufficient but not a necessary condition. This means that, for some CMc of Pc, there may exist no λ* and μ* that satisfy (2.6).

2.2.3. The Necessary and Sufficient Reverse Bridge Theorem (RBTH) [8]

In [8], we proposed RBTH as a method for finding constrained minima.

Definition 2.9 (Penalty Function for CRBTH). The penalty function of Pc with penalty multipliers α and β is defined as

Lc(x, α, β) = f(x) + αᵀ|h(x)| + βᵀ max(0, g(x)), (2.7)

where α ∈ Rᵐ with α ≥ 0 and β ∈ Rʳ with β ≥ 0, |h(x)| = (|h1(x)|, …, |hm(x)|)ᵀ, and max(0, g(x)) is taken componentwise.

Theorem 2.10 (see [8]). Point x* is a CMc of Pc if and only if there exist finite α* ≥ 0 and β* ≥ 0 such that, for any α ≥ α* and β ≥ β*, the following condition is satisfied:

Lc(x, α, β) ≥ Lc(x*, α, β) = f(x*) (2.8)

for each x in the neighbourhood of x*.
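To make the condition concrete, here is a minimal numerical sketch, assuming the absolute-value/max penalty form given in Definition 2.9 (the toy problem and all names are ours, not from [8]): for min x² subject to g(x) = 1 − x ≤ 0, the constrained minimum is x* = 1, and for a sufficiently large β the RBTH inequality Lc(x, β) ≥ Lc(x*, β) = f(x*) holds throughout a neighbourhood of x*.

```python
# Hedged sketch: the penalty form below is an assumption consistent with the
# proof of Theorem 4.2; [8] should be consulted for the exact definition.
# Toy CNLP:  min f(x) = x^2  s.t.  g(x) = 1 - x <= 0,  with CMc at x* = 1.

def L_c(x, beta, f=lambda x: x * x, g=lambda x: 1.0 - x):
    """Penalty function L_c(x, beta) = f(x) + beta * max(0, g(x))."""
    return f(x) + beta * max(0.0, g(x))

x_star, beta = 1.0, 4.0           # beta chosen larger than the finite threshold beta*
neighbourhood = [x_star + k / 100.0 for k in range(-50, 51)]
# RBTH-style check: L_c(x, beta) >= L_c(x*, beta) = f(x*) on the neighbourhood
ok = all(L_c(x, beta) >= L_c(x_star, beta) - 1e-12 for x in neighbourhood)
```

With a smaller penalty (e.g., β = 1) the inequality fails for points just inside the infeasible region, which illustrates why the theorem requires penalties beyond some finite threshold.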

Definition 2.11 (Penalty Function for DRBTH). The penalty function of Pd with penalty multipliers α and β is defined as

Ld(y, α, β) = f(y) + αᵀ|h(y)| + βᵀ max(0, g(y)), (2.9)

where α ∈ Rᵐ with α ≥ 0 and β ∈ Rʳ with β ≥ 0.

Theorem 2.12 (see [8]). Point y* is a CMd of Pd if and only if there exist finite α* ≥ 0 and β* ≥ 0 such that, for any α ≥ α* and β ≥ β*, the following condition is satisfied:

Ld(y, α, β) ≥ Ld(y*, α, β) = f(y*) (2.10)

for each y in the neighbourhood of y*.

Definition 2.13 (Penalty Function for MRBTH). The penalty function of Pm with penalty multipliers α and β is defined as

Lm(x, y, α, β) = f(x, y) + αᵀ|h(x, y)| + βᵀ max(0, g(x, y)), (2.11)

where α ∈ Rᵐ with α ≥ 0 and β ∈ Rʳ with β ≥ 0.

Definition 2.14 (Mixed Neighbourhood). A mixed neighbourhood Nm(x, y) in mixed space is defined as

Nm(x, y) = {(x′, y) : x′ ∈ Nc(x)} ∪ {(x, y′) : y′ ∈ Nd(y)}, (2.12)

where Nc(x) is a continuous neighbourhood of x and Nd(y) is a discrete neighbourhood of y.

Theorem 2.15 (see [8]). Point (x*, y*) is a CMm of Pm if and only if there exist finite α* ≥ 0 and β* ≥ 0 such that, for any α ≥ α* and β ≥ β*, the following condition is satisfied:

Lm(x, y, α, β) ≥ Lm(x*, y*, α, β) = f(x*, y*) (2.13)

for all (x, y) in the mixed neighbourhood of (x*, y*).

3. Solving NLPs Using Reverse Bridge Theorem

We proved in [8] that RBTH is a necessary and sufficient condition for constrained local optima under a range of penalties. However, no concrete algorithm for solving NLPs using RBTH was provided there. Consequently, in this paper, we first present three algorithms for solving CNLPs, DNLPs, and MINLPs, respectively.

We first present an algorithm, called RBTH_CNLP, to find the constrained minimum of CNLPs, as shown in Algorithm 1. According to Theorem 2.10, if x* is a local minimum of the penalty function with respect to its neighbourhood for sufficiently large finite penalties, then x* must be a constrained minimum of the CNLP as well. Therefore, given an arbitrary CNLP Pc, to find its constrained minimum, we only need to find such a local minimum. According to formula (2.8), let Lc(x, α, β) be the penalty function of Pc; a reverse bridge point is a local minimum of Lc(x, α, β). Consequently, in order to find the constrained minimum of a CNLP, we only need to find its reverse bridge point. In this way, we design our algorithm RBTH_CNLP. The key idea of the algorithm is to increase α and β gradually and to minimize Lc(x, α, β) simultaneously until h(x) = 0 and g(x) ≤ 0.

Procedure RBTH_CNLP (Pc, x, α̅, β̅)
    α ← 0, β ← 0;
    repeat
     increase αi by δ if (hi(x) ≠ 0 and αi < α̅i) for i = 1, …, m;
     increase βj by δ if (gj(x) > 0 and βj < β̅j) for j = 1, …, r;
     repeat
      perform descent of Lc(x, α, β) with respect to x;
     until a local minimum of Lc(x, α, β) is found;
    until a CMc of Pc is found or (αi > α̅i for all hi(x) ≠ 0 and βj > β̅j for all gj(x) > 0);
    return CMc if found;
end_procedure

As we can see in Algorithm 1, we first initialize the values of α and β. Then the inner loop is executed to find a local minimum of Lc(x, α, β) according to formula (2.8). If point x is not a feasible solution of Pc, the algorithm increases the penalties corresponding to the violated constraints. The process is repeated until a CMc is found or αi (resp., βj) is larger than its maximum bound α̅i (resp., β̅j).
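A minimal executable sketch of RBTH_CNLP follows, restricted to equality constraints and using a derivative-free axis descent in place of a generic descent routine; the toy problem, step sizes, and tolerances are our assumptions, not from the paper.

```python
# Hedged sketch of RBTH_CNLP: the outer loop raises penalties on violated
# constraints, the inner loop performs descent of the assumed penalty function
# L_c(x, a) = f(x) + sum_i a_i * |h_i(x)|  (equality constraints only).
# Toy CNLP: min (x1-2)^2 + (x2-2)^2  s.t.  h(x) = x1 + x2 - 2 = 0  (CMc at (1,1)).

def rbth_cnlp(f, hs, x, a_bar, delta=1.0, step=1e-3):
    a = [0.0] * len(hs)
    feas_tol = 5 * step
    L = lambda x: f(x) + sum(ai * abs(h(x)) for ai, h in zip(a, hs))
    while True:
        for i, h in enumerate(hs):            # raise penalties on violated h_i
            if abs(h(x)) > feas_tol and a[i] < a_bar[i]:
                a[i] += delta
        improved = True                       # inner loop: descent of L_c w.r.t. x
        while improved:
            improved = False
            for k in range(len(x)):
                for s in (step, -step):
                    y = x[:]; y[k] += s
                    if L(y) < L(x) - 1e-15:
                        x, improved = y, True
        violated = [i for i, h in enumerate(hs) if abs(h(x)) > feas_tol]
        if not violated:
            return x, a                       # an approximate CMc found
        if all(a[i] >= a_bar[i] for i in violated):
            return None, a                    # penalty bounds exhausted

f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2
h = lambda x: x[0] + x[1] - 2.0
sol, a = rbth_cnlp(f, [h], [0.0, 0.0], a_bar=[20.0])
```

On this instance the descent first settles at the unconstrained-biased point near (1.5, 1.5); one more penalty increase moves the minimum of the penalty function onto the constraint at (1, 1).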

Algorithm 2 shows the algorithm RBTH_DNLP for solving discrete nonlinear programming problems. The idea of this algorithm is similar to that of RBTH_CNLP. Algorithm RBTH_DNLP first initializes the values of α and β, then gradually increases α and β, and minimizes Ld(y, α, β) until h(y) = 0 and g(y) ≤ 0.

Procedure RBTH_DNLP (Pd, y, α̅, β̅)
    α ← 0, β ← 0;
    repeat
     increase αi by δ if (hi(y) ≠ 0 and αi < α̅i) for i = 1, …, m;
     increase βj by δ if (gj(y) > 0 and βj < β̅j) for j = 1, …, r;
     repeat
      perform descent of Ld(y, α, β) with respect to y;
     until a local minimum of Ld(y, α, β) is found;
    until a CMd of Pd is found or (αi > α̅i for all hi(y) ≠ 0 and βj > β̅j for all gj(y) > 0);
    return CMd if found;
end_procedure
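A minimal executable sketch of RBTH_DNLP for a toy integer problem, where the discrete neighbourhood of y is the set of points reachable by a ±1 move in one coordinate; the problem and all parameter choices are our assumptions, not from the paper.

```python
# Hedged sketch of RBTH_DNLP with the assumed penalty L_d(y, a) = f(y) + a*|h(y)|
# (one equality constraint).  Descent moves over the +/-1 coordinate neighbourhood.
# Toy DNLP: min (y1-2)^2 + (y2-2)^2  s.t.  h(y) = y1 + y2 - 3 = 0,  y integer.

def rbth_dnlp(f, h, y, a_bar, delta=1.0):
    a = 0.0
    L = lambda y: f(y) + a * abs(h(y))
    while True:
        if abs(h(y)) > 0 and a < a_bar:
            a += delta                        # raise penalty on the violated h
        improved = True                       # descent over discrete neighbours
        while improved:
            improved = False
            for k in range(len(y)):
                for s in (1, -1):
                    v = y[:]; v[k] += s
                    if L(v) < L(y):
                        y, improved = v, True
        if abs(h(y)) == 0:
            return y, a                       # CMd found
        if a >= a_bar:
            return None, a                    # penalty bound exhausted

f = lambda y: (y[0] - 2) ** 2 + (y[1] - 2) ** 2
h = lambda y: y[0] + y[1] - 3
sol, a = rbth_dnlp(f, h, [0, 0], a_bar=20.0)
```

The feasible set y1 + y2 = 3 contains two tied constrained minima, (2, 1) and (1, 2), both with objective value 1; the deterministic neighbour ordering above reaches (2, 1).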

In order to solve MINLPs using RBTH, we need to define two neighbourhoods, that is, discrete neighbourhood and mixed neighbourhood.

Definition 3.1 (Discrete Neighbourhood). A discrete neighbourhood Nd(y) of y in discrete space is a finite set of points such that each y′ ∈ Nd(y) is reachable from y in one step, that y ∉ Nd(y), and that it is possible to reach every y′′ from any y in one or more steps through neighbouring points.

Definition 3.2 (Mixed Neighbourhood). A mixed neighbourhood Nm(x, y) in mixed space is defined as

Nm(x, y) = {(x′, y) : x′ ∈ Nc(x)} ∪ {(x, y′) : y′ ∈ Nd(y)},

where Nc(x) is a continuous neighbourhood of variable x and Nd(y) is the discrete neighbourhood of variable y.

Corollary 3.3. According to Definition 3.2 and Theorem 2.15, the RBTH in formula (2.13) can be rewritten into the two following necessary conditions that, collectively, are sufficient:

Lm(x, y*, α, β) ≥ Lm(x*, y*, α, β) = f(x*, y*)  for all x ∈ Nc(x*),
Lm(x*, y, α, β) ≥ Lm(x*, y*, α, β) = f(x*, y*)  for all y ∈ Nd(y*).

Based on this corollary, we present an algorithm RBTH_MINLP, shown in Algorithm 3, to find the constrained minimum of MINLPs. According to Theorem 2.15, if the point (x*, y*) is a local minimum of the penalty function with respect to the mixed neighbourhood for sufficiently large finite penalties, this point must also be a constrained minimum of the MINLP. This means that, given an arbitrary MINLP Pm, to find its constrained minimum, we only need to find such a local minimum. On the other hand, according to formula (2.13), let Lm(x, y, α, β) be the penalty function of Pm; a reverse bridge point is a local minimum of Lm(x, y, α, β). In this sense, in order to find an MINLP's constrained minimum, we only need to find the reverse bridge point.

Procedure RBTH_MINLP (Pm, x, y, α̅, β̅)
    α ← 0, β ← 0;
    repeat
     increase αi by δ if (hi(x, y) ≠ 0 and αi < α̅i) for i = 1, …, m;
     increase βj by δ if (gj(x, y) > 0 and βj < β̅j) for j = 1, …, r;
     repeat
      perform descent of Lm(x, y, α, β) with respect to x for given y;
     until a local minimum of Lm(x, y, α, β) with respect to x is found;
     repeat
      perform descent of Lm(x, y, α, β) with respect to y for given x;
     until a local minimum of Lm(x, y, α, β) with respect to y is found;
    until a CMm of Pm is found or (αi > α̅i for all hi(x, y) ≠ 0 and βj > β̅j for all gj(x, y) > 0);
    return CMm if found;
end_procedure

In the procedure RBTH_MINLP, we first initialize the values of α and β. In the first inner loop, we focus on finding the local minimum of Lm(x, y, α, β) with respect to the continuous neighbourhood of x; in the second inner loop, we look for the local minimum of Lm(x, y, α, β) with respect to the discrete neighbourhood of y. If the local minimum point violates constraints of Pm, we increase the penalties of the violated constraints in the outer loop. The process is repeated until a CMm of Pm is found or αi (resp., βj) is larger than α̅i (resp., β̅j).
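A minimal executable sketch of RBTH_MINLP with one continuous and one integer variable and a single equality constraint, alternating a continuous axis descent with a descent over the discrete neighbourhood {y − 1, y + 1}; the toy problem and parameters are our assumptions, not from the paper.

```python
# Hedged sketch of RBTH_MINLP with the assumed penalty
# L_m(x, y, a) = f(x, y) + a * |h(x, y)|  (one equality constraint).
# Toy MINLP: min (x-1.3)^2 + (y-1.3)^2  s.t.  h = x - y = 0,  y integer.
# The constrained minimum is x = y = 1 (nearest integer tie broken by f).

def rbth_minlp(f, h, x, y, a_bar, delta=1.0, step=1e-3):
    a, feas_tol = 0.0, 5 * step
    L = lambda x, y: f(x, y) + a * abs(h(x, y))
    while True:
        if abs(h(x, y)) > feas_tol and a < a_bar:
            a += delta                 # raise penalty on the violated constraint
        improved = True                # descent w.r.t. continuous x, y fixed
        while improved:
            improved = False
            for s in (step, -step):
                if L(x + s, y) < L(x, y) - 1e-15:
                    x, improved = x + s, True
        improved = True                # descent w.r.t. discrete y, x fixed
        while improved:
            improved = False
            for yn in (y - 1, y + 1):  # discrete neighbourhood of y
                if L(x, yn) < L(x, y) - 1e-15:
                    y, improved = yn, True
        if abs(h(x, y)) <= feas_tol:
            return x, y, a             # an approximate CMm found
        if a >= a_bar:
            return None                # penalty bound exhausted

f = lambda x, y: (x - 1.3) ** 2 + (y - 1.3) ** 2
h = lambda x, y: x - y
res = rbth_minlp(f, h, 0.0, 0, a_bar=20.0)
```

With zero penalty the two descents settle at x ≈ 1.3, y = 1; one penalty increase then pulls the continuous variable onto the constraint at x ≈ 1.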

Obviously, the main idea of all three algorithms RBTH_CNLP, RBTH_DNLP, and RBTH_MINLP is to find the reverse bridge points of CNLPs, DNLPs, and MINLPs, respectively. According to Theorems 2.10, 2.12, and 2.15, these reverse bridge points are also constrained minima of the corresponding NLPs. Because CRBTH, DRBTH, and MRBTH are all necessary and sufficient conditions for solving CNLPs, DNLPs, and MINLPs, respectively, as proved in [8], the following theorem stands.

Theorem 3.4. Given a CNLP (DNLP, MINLP, resp.), if there exists a solution, the algorithm RBTH_CNLP (RBTH_DNLP, RBTH_MINLP, resp.) can find it; if the algorithm finds a solution, then that solution must be a constrained minimum of the CNLP (DNLP, MINLP, resp.).

In this section, we have shown that RBTH is an effective method for solving NLPs. However, it is difficult and expensive to solve some large-scale NLPs by RBTH directly because of their huge search spaces and their numerous constraints of different forms. Thus, in the next section, we propose an approach to solving large-scale NLPs using RBTH under constraint partition.

4. RBTH under Constraint Partition

In this section, we show how to solve a nonlinear programming problem using RBTH under constraint partition. Because CNLPs and DNLPs can be regarded as special cases of MINLPs, in this section we focus only on MINLPs. Constraint partitioning has led to a major breakthrough in solving nonlinear programming problems in operations research and engineering applications [11–18]. We first prove that RBTH under constraint partition is also a necessary and sufficient condition for solving MINLPs. Then we prove that this necessary and sufficient condition can be rewritten into several necessary conditions by providing a partitioned neighbourhood. After that, we present an algorithm for solving MINLPs and prove that this algorithm is sound and complete.

4.1. RBTH for Partitioned Subproblems

Given an arbitrary mixed-integer nonlinear programming problem, we can partition its constraints into N + 1 stages. Each stage t (t = 0, 1, …, N) includes local variables z(t), local equality constraints h(t), and local inequality constraints g(t). Here local constraints restrict only the variables of their own stage, and global constraints restrict variables across stages. By applying this partition, the variable vector z of the problem can be decomposed into N + 1 subvectors z(0), …, z(N), where z(t) is a vector of dynamic state variables in mixed space in stage t. In this sense, the MINLP formulation Pt is as follows:

(Pt): min f(z)  subject to  h(t)(z(t)) = 0, g(t)(z(t)) ≤ 0 for t = 0, 1, …, N, and H(z) = 0, G(z) ≤ 0,

where f is assumed to be continuous and differentiable with respect to the continuous part of z, h(t) and g(t) are vectors of local-constraint functions that involve z(t) and time in stage t, and H = (H1, …, Hp)ᵀ and G = (G1, …, Gq)ᵀ are vectors of global-constraint functions that involve state variables and time in two or more stages.

A solution of Pt can be regarded as an assignment of all the variables in z. The goal of solving Pt is then to find a constrained minimum with respect to all the feasible solutions in its mixed neighbourhood. Strictly speaking, the partition of each stage needs to be further decomposed into discrete and continuous parts; however, we do not consider this situation, for the purpose of simplification. In the following, we define the penalty function of Pt and then propose the partitioned necessary and sufficient RBTH condition on CMm of Pt.

Definition 4.1 (Penalty Function). The penalty function Γ for Pt and the corresponding penalty function Γt in stage t are defined as follows:

Γ(z, α, β, γ, η) = f(z) + Σ_{t=0}^{N} [α(t)ᵀ|h(t)(z(t))| + β(t)ᵀ max(0, g(t)(z(t)))] + γᵀ|H(z)| + ηᵀ max(0, G(z)), (4.1)

Γt(z, α(t), β(t), γ, η) = f(z) + α(t)ᵀ|h(t)(z(t))| + β(t)ᵀ max(0, g(t)(z(t))) + γᵀ|H(z)| + ηᵀ max(0, G(z)), (4.2)

where α(t) ≥ 0 and β(t) ≥ 0 are the penalty vectors for the local constraints in stage t, t = 0, 1, …, N, and γ ≥ 0 and η ≥ 0 are the penalty vectors for the global constraints.

Theorem 4.2. Solution z* is a CMm of Pt with respect to its mixed neighbourhood Nm(z*) if and only if there exist finite α*, β*, γ*, and η* such that, for any α ≥ α*, β ≥ β*, γ ≥ γ*, and η ≥ η*, the following RBTH condition is satisfied:

Γ(z, α, β, γ, η) ≥ Γ(z*, α, β, γ, η) = f(z*) (4.3)

for all z ∈ Nm(z*).

Proof. The proof consists of two parts.
"⇒" part: given a constrained minimum z*, we need to prove that there exist finite α*, β*, γ*, and η* that satisfy formula (4.3).
Equality Part:
z* is a feasible solution, so it satisfies all local constraints and global constraints. Therefore, we obtain |h(t)(z*(t))| = 0 and max(0, g(t)(z*(t))) = 0 for all t, |H(z*)| = 0, and max(0, G(z*)) = 0. So Γ(z*, α, β, γ, η) = f(z*).
Inequality Part:
We distinguish two cases.
(1) z* is the unique minimum of f over Nm(z*). For any z ∈ Nm(z*), f(z) > f(z*), and all penalty terms are nonnegative. Therefore, the following inequality holds regardless of the choice of the penalties: Γ(z, α, β, γ, η) ≥ f(z) > f(z*) = Γ(z*, α, β, γ, η).
(2) z* is not the unique minimum of f over Nm(z*) or is not a minimum of f. Assume that there exists some z′ ∈ Nm(z*) that satisfies f(z′) ≤ f(z*). (i) If f(z′) = f(z*) and z′ is feasible, this means that z′ is another CMm of Pt. Therefore, regardless of the choice of penalties, Γ(z′, α, β, γ, η) = f(z′) = f(z*), and the inequality of formula (4.3) is satisfied. (ii) If f(z′) = f(z*) and z′ violates some constraints, suppose that it violates a global equality constraint Hi (the case with a global inequality constraint or a local constraint is similar); then any γi > 0 is enough, since Γ(z′, α, β, γ, η) ≥ f(z′) + γi|Hi(z′)| > f(z*). (iii) The case f(z′) < f(z*) with z′ a feasible solution is impossible, because z* would not be a CMm of Pt in this situation. (iv) If f(z′) < f(z*) and z′ violates some constraints, suppose that it violates a global equality constraint Hi (the other cases are similar), so |Hi(z′)| > 0. Therefore, let γi* = max{(f(z*) − f(z))/|Hi(z)| : z ∈ Nm(z*), Hi(z) ≠ 0}. Then for any γi ≥ γi* we have γi|Hi(z′)| ≥ f(z*) − f(z′); thus Γ(z′, α, β, γ, η) ≥ f(z′) + γi|Hi(z′)| ≥ f(z*) = Γ(z*, α, β, γ, η).

"⇐" part: assume that formula (4.3) is satisfied; we need to prove that z* is a CMm of Pt. Because Γ(z*, α, β, γ, η) = f(z*) holds for any α ≥ α*, β ≥ β*, γ ≥ γ*, and η ≥ η*, all the penalty terms at z* must vanish. Thus we can obtain h(t)(z*(t)) = 0 and g(t)(z*(t)) ≤ 0 for all t, H(z*) = 0, and G(z*) ≤ 0. Therefore z* is a feasible solution.
In the following, we prove that f(z*) is minimal over all feasible solutions. Assume that z′ ∈ Nm(z*) is another feasible solution; then Γ(z′, α, β, γ, η) = f(z′). Because Γ(z′, α, β, γ, η) ≥ Γ(z*, α, β, γ, η), we get f(z′) ≥ f(z*). Therefore, z* is a constrained minimum of Pt.

In order to partition RBTH into several independent necessary conditions efficiently, we define the mixed neighbourhood of solution z as follows.

Definition 4.3 (Partitioned Mixed Neighbourhood). The partitioned mixed neighbourhood of z, denoted by Np(z), is defined as

Np(z) = ⋃_{t=0}^{N} {z′ : z′(t) ∈ Nm(t)(z(t)) and z′(u) = z(u) for all u ≠ t},

where Nm(t)(z(t)) is the mixed-space neighbourhood of variable vector z(t) in stage t.

Based on this definition, we can further partition the condition described in formula (4.3) into multiple conditions.

Theorem 4.4. Given Np(z*), the RBTH in formula (4.3) can be rewritten into the following necessary conditions that, collectively, are sufficient:

Γt(z, α(t), β(t), γ, η) ≥ Γt(z*, α(t), β(t), γ, η) = f(z*)  for all z ∈ Np(z*) with z(u) = z*(u) for u ≠ t, t = 0, 1, …, N, (4.10)

γᵀ|H(z*)| + ηᵀ max(0, G(z*)) = 0, (4.11)

for all α(t) ≥ α(t)*, β(t) ≥ β(t)*, γ ≥ γ*, and η ≥ η*.

Proof. We prove that formula (4.3) is equivalent to the combined formula (4.10) and formula (4.11).
"⇒" part: given a z* satisfying formula (4.3), we need to prove that it also satisfies formula (4.10) and formula (4.11). For all t, any point z ∈ Np(z*) that differs from z* only in stage t is also a point in Nm(z*); therefore Γ(z, α, β, γ, η) ≥ Γ(z*, α, β, γ, η) = f(z*). Since z and z* agree on all stages u ≠ t, the local penalty terms of those stages are identical on both sides and cancel; therefore Γt(z, α(t), β(t), γ, η) ≥ Γt(z*, α(t), β(t), γ, η) = f(z*).
Moreover, the equality in formula (4.10) and the equality in formula (4.11) hold because z* satisfies all the constraints.
"⇐" part: we prove this part by contradiction. Assume that z* satisfies formula (4.10) and formula (4.11) but does not satisfy formula (4.3). The equality in formula (4.3) cannot be violated, because the equality in formula (4.10) and the equality in formula (4.11) imply that all local and global constraints are satisfied. Therefore, the inequality in formula (4.3) is unsatisfied. In this case, there exist some stage t and a point z′ ∈ Nm(z*), where z′ differs from z* only in z′(t), such that Γ(z′, α, β, γ, η) < Γ(z*, α, β, γ, η) = f(z*). But formula (4.10) gives Γt(z′, α(t), β(t), γ, η) ≥ Γt(z*, α(t), β(t), γ, η) = f(z*), and since z′ and z* agree outside stage t, Γ(z′, α, β, γ, η) − Γ(z*, α, β, γ, η) = Γt(z′, α(t), β(t), γ, η) − Γt(z*, α(t), β(t), γ, η) ≥ 0. This implies that Γ(z′, α, β, γ, η) ≥ f(z*), which contradicts our assumption.
Therefore, any z* that satisfies formula (4.10) and formula (4.11) must also satisfy formula (4.3).
Theorem 4.4 proves that the original RBTH mentioned in Theorem 2.15 can be partitioned into N + 1 necessary conditions as formula (4.10) and a global necessary condition as formula (4.11). Consequently, local reverse bridge points, which satisfy formula (4.10) in stage t, are local minima of the stage-t penalty function. These reverse bridge points are essentially the solutions of the following MINLP Pt(t), in which we add the global constraints as penalties to the original objective function:

(Pt(t)): min_{z(t)} f(z) + γᵀ|H(z)| + ηᵀ max(0, G(z))  subject to  h(t)(z(t)) = 0, g(t)(z(t)) ≤ 0. (4.17)

In brief, solving the original problem Pt can be reduced to solving the multiple MINLPs defined by Pt(t) in formula (4.17) and increasing the penalties γ and η of the violated global constraints in formula (4.11). Therefore, by partitioning the original problem and solving each subproblem, Theorem 4.4 leads to a significant reduction in computational complexity.
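A minimal executable sketch of the partition-and-resolve idea for a separable toy problem with two stages and one global equality constraint H: each stage subproblem minimizes its own objective term plus the γ-weighted global penalty, with the other stage fixed; the problem and all names are our assumptions, not from the paper.

```python
# Hedged sketch of partition-and-resolve.  Toy problem:
#   min (z0-2)^2 + (z1-2)^2  s.t. global H(z) = z0 + z1 - 3 = 0 (no local constraints),
# whose constrained minimum is z = (1.5, 1.5).  Each stage-t subproblem is
#   min_{z[t]}  f_t(z[t]) + g * |H(z)|   with the other stage fixed.

def solve_stage(L, z, t, step=1e-3):
    """Descent of the stage-t penalty function w.r.t. z[t], other stages fixed."""
    improved = True
    while improved:
        improved = False
        for s in (step, -step):
            y = z[:]; y[t] += s
            if L(y) < L(z) - 1e-15:
                z, improved = y, True
    return z

def rbth_partition_resolve(fs, H, z, g_bar, delta=1.0, step=1e-3):
    g, feas_tol = 0.0, 5 * step
    while True:
        if abs(H(z)) > feas_tol and g < g_bar:
            g += delta                        # raise penalty on the violated global H
        for t in range(len(z)):               # solve each stage subproblem P_t(t)
            L = lambda z: fs[t](z[t]) + g * abs(H(z))
            z = solve_stage(L, z, t, step)
        if abs(H(z)) <= feas_tol:
            return z, g                       # an approximate CMm found
        if g >= g_bar:
            return None, g                    # penalty bound exhausted

fs = [lambda v: (v - 2.0) ** 2, lambda v: (v - 2.0) ** 2]
H = lambda z: z[0] + z[1] - 3.0               # global constraint across stages
z, g = rbth_partition_resolve(fs, H, [0.0, 0.0], g_bar=20.0)
```

Each outer iteration touches one stage variable at a time, so the per-iteration search space is that of a single stage rather than the full problem, which is the source of the complexity reduction claimed above.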

4.2. The Partitioning and Resolving Procedure

Algorithm 4 shows the partitioning and resolving procedure for finding the reverse bridge points satisfying the conditions in Theorem 4.4. As shown in the algorithm, γ and η are initialized first; the inner loop is then carried out to find a reverse bridge point of the MINLP Pt(t) in each stage t, which can be implemented by the procedure RBTH_MINLP introduced in Section 3. After all the subproblems are solved, the penalties corresponding to the violated global constraints are increased in the outer loop. The process is repeated until a CMm of Pt is found or γ and η are larger than their maximum bounds γ̅ and η̅.

Procedure RBTH_partition_resolve_mixed (Pt, z, α̅, β̅, γ̅, η̅)
  γ ← 0, η ← 0;
  repeat
    increase γi by δ if (Hi(z) ≠ 0 and γi < γ̅i) for i = 1, …, p;
    increase ηj by δ if (Gj(z) > 0 and ηj < η̅j) for j = 1, …, q;
    for t = 0 to N
     call RBTH_MINLP (Pt(t), z, α̅, β̅) to solve Pt(t);
    end for;
  until a CMm of Pt is found or (γi > γ̅i for all Hi(z) ≠ 0 and ηj > η̅j for all Gj(z) > 0);
  return CMm of Pt if found;
end_procedure

According to Theorems 4.2 and 4.4, solving the original problem can be reduced to solving multiple MINLPs. For every MINLP Pt(t) in stage t of the original problem, algorithm RBTH_partition_resolve_mixed calls RBTH_MINLP to solve it. Moreover, Theorem 3.4 states that algorithm RBTH_MINLP is sound and complete for solving MINLPs. Therefore, the following theorem stands.

Theorem 4.5. Given an MINLP, if there exists a solution, the algorithm RBTH_partition_resolve_mixed can find it; if the algorithm finds a solution, then that solution must be a constrained minimum of the MINLP.

5. Numerical Simulation Results and Comparisons

All of our algorithms are coded in C, and in our simulation, numerical experiments are performed on a PC with a Pentium 3.0 GHz processor and 1.0 GB of memory. We first compare our algorithm CPRBTH (RBTH under constraint partition) to two of the best CNLP solvers, Lancelot [20] and SNOPT [21]. To test the performance of the proposed algorithms, computational simulations are carried out with some well-studied benchmark problems taken from the CUTE library [22]. These problems are all minimization problems. Some of them were constructed by researchers to test optimization algorithms, while others come from real applications, such as computer production planning in operations research. For each instance, the algorithm is independently executed 15 times. The experimental results are shown in Table 1. The first three columns show the problem IDs, the number of constraints (nc), and the number of variables (nv). The last six columns show the solutions (Sol.) and CPU times obtained by LANCELOT, SNOPT, and CPRBTH. Both the CPU times and the solutions are averages over the 15 executions of the algorithms. The numerical results indicate that our algorithm usually performs quite well in terms of both CPU time and quality of the solutions found.

We then compare our algorithm with three well-known MINLP solvers, MINLP_BB [23], BARON [24], and CPOPT [9]. MINLP_BB performs a branch-and-bound algorithm with a sequential-quadratic-programming solver for the continuous subproblems. BARON is an MINLP solver implementing the branch-and-reduce algorithm. CPOPT is an MINLP solver implementing the extended saddle-point condition under constraint partitioning. To test the performance of the proposed algorithms, computational simulations are carried out with some well-studied benchmark problems taken from the MacMINLP library [25]. The first three columns show the problem IDs, the number of constraints (nc), and the number of variables (nv). The last eight columns show the solutions (Sol.) and CPU times obtained by MINLP_BB, BARON, CPOPT, and CPRBTH. Both the CPU times and the solutions are averages over the 15 executions of the algorithms. Because the results of MINLP_BB and BARON in [13] were obtained by submitting jobs to the NEOS server and BARON's site, respectively, we adopt the results reported in [13]. The other two solvers were run on a PC with a Pentium 3.0 GHz processor and 1.0 GB of memory. The experimental results are shown in Table 2. Compared to CPOPT, the solutions of CPRBTH are at least competitive, and the running cost of CPRBTH is relatively lower.

6. Conclusion

RBTH is a necessary and sufficient condition for constrained local optima under a range of penalties. In this paper, we first proposed three algorithms to solve NLPs using RBTH and proved that these algorithms are both sound and complete. Additionally, we combined RBTH with constraint partition to solve large-scale MINLPs. Specifically, we decompose an MINLP by its constraints into several easier subproblems that are significant relaxations of the original problem, each of which can be solved by using RBTH directly, and then resolve the violated global constraints across the subproblems by increasing the corresponding penalties. Finally, we presented an algorithm implementing this search procedure and proved that the algorithm is sound and complete for solving MINLPs under constraint partition. Experimental results show that our algorithm is both effective and efficient.

Acknowledgments

This project was supported by the National Natural Science Foundation of China under Grant nos. 60473042, 60573067, 60803102, and 60773097. The authors are grateful to the anonymous reviewers for providing their detailed, thoughtful, and helpful comments on improving the work presented here.