Abstract

In this work, we deal with constrained optimization methods via a three-term Conjugate Gradient (CG) technique based on the Dai–Liao (DL) formula. The new proposed technique satisfies the conjugacy property and the descent conditions of Karush–Kuhn–Tucker (KKT). Our proposed constrained technique uses the strong Wolfe line search condition under some assumptions. We prove the global convergence property of the new proposed technique. Numerical comparisons on thirty constrained optimization problems confirm the effectiveness of the new proposed formula.

1. Introduction

All strategies for constrained problems can be classified into two basic categories, namely direct and indirect methods. Methods of the latter type, which generate unconstrained sub-problems, are important even for some special optimization problems: interior and exterior penalty function techniques transform the constrained problem into unconstrained problems. This approach is simple and quite robust, and is known as the Sequential Unconstrained Minimization Technique (SUMT). The basic optimization problem with inequality constraints considered by this approach is defined as

This problem is converted into an unconstrained minimization problem by constructing a function of the form

where and are defined by [1]:

Therefore, we can rewrite Equation (2) as follows

The derivatives of this function are linearly independent, so that
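To make the SUMT idea above concrete, here is a minimal Python sketch of an interior (inverse-barrier) penalty transformation. The inverse-barrier form, the shrinking parameter schedule, and all function names here are illustrative assumptions, not the exact formulation of this paper.

```python
# Illustrative SUMT-style interior penalty sketch (assumed form, not the
# paper's exact barrier). We minimize f(x) subject to g_j(x) >= 0 by
# minimizing a sequence of barriers phi(x, r) = f(x) + r * sum_j 1/g_j(x)
# with the barrier parameter r driven toward zero.

def make_barrier(f, constraints, r):
    """Return phi(x) = f(x) + r * sum_j 1/g_j(x) (inverse barrier)."""
    def phi(x):
        return f(x) + r * sum(1.0 / g(x) for g in constraints)
    return phi

def sumt(f, constraints, x0, minimize, r0=1.0, shrink=0.1, n_outer=5):
    """Solve a sequence of unconstrained sub-problems with decreasing r.

    `minimize(phi, x)` is any unconstrained inner solver supplied by the
    caller; each outer iteration tightens the barrier.
    """
    x, r = x0, r0
    for _ in range(n_outer):
        x = minimize(make_barrier(f, constraints, r), x)  # inner solve
        r *= shrink                                       # tighten barrier
    return x
```

With any reasonable inner solver, the iterates approach the constrained minimizer from inside the feasible region as r decreases.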

We now turn to the second part, parallel in importance to the previous one: the unconstrained optimization technique. Consider problem (2), where the objective is a real-valued, continuously differentiable function. The iterative formula is

$x_{k+1} = x_k + \alpha_k d_k,$

where $\alpha_k > 0$ is the step-length. The new search direction is:

$d_{k+1} = -g_{k+1} + \beta_k d_k.$

Here $g_{k+1} = \nabla f(x_{k+1})$ is the value of the gradient at the current point, and $\beta_k$ is a positive scalar called the conjugate gradient parameter.

Some well-known formulas for $\beta_k$ are those of Hestenes–Stiefel (HS) [2], Fletcher–Reeves (FR) [3], Polak–Ribière (PR) [4], Liu–Storey (LS) [5], and Dai–Liao (DL) [6].
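For reference, these classical parameters can be collected in one short routine. This is a sketch of the standard textbook formulas only, not of the modified parameter developed later in this paper; the vector names and the Dai–Liao scaling `t` follow common conventions and are assumptions here.

```python
import numpy as np

def cg_betas(g_new, g_old, d, alpha, t=0.1):
    """Classical CG parameters, with y = g_new - g_old and s = alpha * d."""
    y = g_new - g_old
    s = alpha * d
    return {
        "HS": (g_new @ y) / (d @ y),              # Hestenes-Stiefel
        "FR": (g_new @ g_new) / (g_old @ g_old),  # Fletcher-Reeves
        "PR": (g_new @ y) / (g_old @ g_old),      # Polak-Ribiere
        "LS": (g_new @ y) / (-(g_old @ d)),       # Liu-Storey
        "DL": (g_new @ (y - t * s)) / (d @ y),    # Dai-Liao, t > 0
    }
```

Note that for exact line searches (g_new orthogonal to d) several of these formulas coincide.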

In the existing convergence analysis and implementations of the CG technique, the weak Wolfe conditions [7] are defined as:

$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k$

and $g(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k$, where $0 < \delta < \sigma < 1$.

By strengthening the curvature condition, the strong Wolfe conditions [7] consist of (8) and

$|g(x_k + \alpha_k d_k)^T d_k| \le -\sigma g_k^T d_k.$

Furthermore, the sufficient descent property is

$g_k^T d_k \le -c \|g_k\|^2,$

where the constant $c$ is a positive number.
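As a sketch, the Armijo, weak curvature, and strong curvature conditions can be checked numerically as follows. The parameter names `delta` and `sigma` (with 0 < delta < sigma < 1) are the usual conventions, assumed here.

```python
import numpy as np

def wolfe_conditions(f, grad, x, d, alpha, delta=1e-4, sigma=0.9):
    """Check the Armijo, weak curvature, and strong curvature conditions
    for a trial step-length alpha along a descent direction d."""
    gd = grad(x) @ d                  # directional derivative; must be < 0
    x_new = x + alpha * d
    armijo = f(x_new) <= f(x) + delta * alpha * gd
    gd_new = grad(x_new) @ d
    weak_curv = gd_new >= sigma * gd            # weak Wolfe curvature
    strong_curv = abs(gd_new) <= -sigma * gd    # strong Wolfe curvature
    return armijo, weak_curv, strong_curv
```

A practical line search would bracket and bisect until all required conditions hold; this helper only tests a given alpha.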

2. A Modified Dai–Liao Three-Term CG Technique

Many researchers have provided different updates suitable for the parameter of the Dai–Liao (DL) CG-method, including:

Recall the work of Livieris and Pintelas [8], who put forward a new update of the parameter based on a modified secant equation, replacing the classical one. Babaie-Kafaki and Ghanbari [9] derived two modified CG-methods based on Perry's work and obtained better numerical results than the original DL method. Various further updates of the DL parameter have been proposed in order to obtain suitable formulas; see, for example, [10–12]. Moreover, Zhang et al. [13] proposed a three-term CG-technique based on the DL-technique as follows:

This direction satisfies the condition () for all . Now, using (13) within the constrained CG-technique defined in (1)–(4) yields

Updating this formula using the modified Dai–Liao CG techniques in (14), we obtain:

Rewriting the new search direction, we have:

where

Since $d_k^T y_k > 0$ (by the strong Wolfe condition), combining this inequality with the quasi-Newton condition we get:

which means that is a positive definite matrix.
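Since the paper's exact three-term formulas (12)–(15) appear only in the displayed equations, the sketch below uses a Zhang-et-al-style three-term direction as an illustrative stand-in: with a Hestenes–Stiefel beta and theta = (g_new·d)/(d·y), the resulting direction satisfies g_new·d_new = -||g_new||^2 exactly, i.e. sufficient descent with c = 1.

```python
import numpy as np

def three_term_direction(g_new, d, y):
    """Illustrative Zhang-et-al-style three-term direction (a stand-in,
    not the paper's formula (15)):

        d_new = -g_new + beta * d - theta * y,

    with beta the Hestenes-Stiefel parameter and theta chosen so that
    g_new @ d_new = -||g_new||^2 holds independently of the line search."""
    dy = d @ y
    beta = (g_new @ y) / dy
    theta = (g_new @ d) / dy
    return -g_new + beta * d - theta * y
```

The extra `theta * y` term cancels the contribution of `beta * d` in the inner product with `g_new`, which is what makes the descent identity exact.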

3. New Theorem

Theorem. The new direction in (15) satisfies the sufficient descent condition (11).

Proof. Multiply both sides of (15) by $g_{k+1}^T$, which is applicable for unconstrained optimization; then we get

Let

The scalar is known, which means (). Moreover, multiplying the other part of the direction by we get:

where () is a positive constant. Now, we have

which can be written equivalently as

and is the barrier function at the point .

Then, relying on one of the Karush–Kuhn–Tucker (KKT) [14] optimality conditions and some regularity conditions of [15], if , we have the formula

So, when , and in order to obtain a minimum of the function , we take the limit of the function in the form:

we obtain the required result: a sufficient descent direction for our new algorithm.

Lemma 1 [16]. The new direction defined in (15) satisfies the conjugacy condition.

Proof. Let

where , and this condition is equivalent to , where

4. Global Convergence Property

In this part of the article, we address the convergence analysis of the new algorithm. The following assumptions are commonly used in CG techniques.

Assumptions [17, 18]. (i) The level set is bounded; i.e., there exists a constant q > 0 such that . (ii) In some neighborhood of , the function is continuously differentiable and its gradient is Lipschitz continuous; i.e., there exists a constant $L > 0$ such that $\|\nabla f(x) - \nabla f(z)\| \le L \|x - z\|$ for all $x, z$ in that neighborhood. Assuming that conditions (i) and (ii) are satisfied, we can deduce that there exists a constant such that

5. Global Convergence of the New Technique

Consider the new three-term CG-technique (15), which satisfies (13), and assume that the step-size satisfies (8) and (10); then

Proof. The new search direction is:

From the Lipschitz condition and

Hence, by taking the summation over the search directions we get:

This means that (33) holds.

6. Numerical Experiments

In order to assess the performance of the proposed new algorithm defined in (15), the new constrained CG technique is tested on thirty nonlinear selected test functions (see the appendices of [1, 19, 20] for the details of those test problems). For all cases the stopping criterion is taken to be

The comparative performance of all considered algorithms is evaluated by NOF, NOI, and NOC, where NOF denotes the number of function evaluations, NOI the number of iterations required to minimize the test problem, and NOC the number of constraint evaluations. We adopt the performance profiles given by Dolan and Moré [21].

The following three figures clarify the performance of the algorithm:
(i) Figure 1 illustrates the behavior of the new algorithm in terms of the number of function evaluations.
(ii) Figure 2 shows the improvement in the number of iterations.
(iii) Figure 3 illustrates the efficiency of the new algorithm in the number of constraint evaluations.
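The Dolan–Moré profiles shown in Figures 1–3 can be computed with a short routine such as the following; the cost matrix `T` below is a hypothetical stand-in for the paper's NOF/NOI/NOC data.

```python
import numpy as np

def performance_profile(T, taus):
    """Dolan-More performance profile.

    T: (n_problems, n_solvers) array of positive costs (e.g. NOF, NOI, NOC).
    Returns rho of shape (len(taus), n_solvers), where rho[i, s] is the
    fraction of problems on which solver s is within a factor taus[i]
    of the best solver for that problem."""
    ratios = T / T.min(axis=1, keepdims=True)          # r_{p,s}
    return np.array([(ratios <= tau).mean(axis=0) for tau in taus])
```

Plotting each column of `rho` against `taus` reproduces the familiar profile curves: the higher a solver's curve, the more robust and efficient it is.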

To measure the improvement percentages with better accuracy, Table 1 shows the percentage effectiveness of the new algorithm and the efficiency in the number of updates required to reach the optimal solution.

From Table 1, it is evident that the new proposed constrained CG-technique formulated in (15) outperforms the standard three-term CG-technique formulated in (12) by about 59% in NOF, 47% in NOI, and 7% in NOC.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The research is supported by College of Computer Sciences and Mathematics, University of Mosul, Republic of Iraq, under Project No. 6378368.