Abstract

In this paper we put forward a family of algorithms for lifting solutions of a polynomial congruence modulo an odd prime p to solutions modulo higher powers p^n. For this purpose, root-finding iterative methods are employed to solve polynomial congruences of the form x^r ≡ a (mod p^n), where a and r are integers which are not divisible by an odd prime p. It is shown that the algorithms suggested in this paper drastically reduce the complexity of such computations to a logarithmic scale. The efficacy of the proposed technique for solving negative-exponent equations of the form x^{-r} ≡ a (mod p^n) has also been addressed.

1. Introduction and Preliminaries

Congruences are of vital importance in number theory, and iterative methods for solving nonlinear equations have become a valuable device for numerical analysts. This research work addresses iterative methods for solving polynomial congruences of the form x^r ≡ a (mod p^n), where a and r are integers which are not divisible by an odd prime p. Root-finding recursive techniques were discussed in [14] for computing inverses of numbers modulo prime powers, which is the motivation of the proposed research work. In this work, we use higher-order iterative methods, with a particular focus on Householder's methods and the Basic Family of iteration functions (for details, see [5-7]), in order to find solutions of polynomial congruences of the form x^r ≡ a (mod p^n).

Hensel's lemma is one of the most popular of the existing techniques for solving a polynomial congruence modulo p^n. Applying Hensel's lemma directly, however, is strenuous and laborious. The proposed technique therefore aims to lower this effort by finding the solutions of such congruences through explicit iteration formulas, which are quite fast. The following are two versions of the well-known Hensel's lemma.

Theorem 1 (see [8]). Suppose that f(x) is a polynomial with integral coefficients. If f(a) ≡ 0 (mod p^j) and f'(a) ≢ 0 (mod p), then there is a unique t (mod p) such that f(a + t p^j) ≡ 0 (mod p^{j+1}).

The following theorem can easily be deduced from Hensel's lemma by applying Taylor's theorem. This is the typical procedure for finding solutions of congruences modulo p^k by means of Hensel's lemma. For details, see [9, page 106].

Theorem 2 (see [9]). Let p be a prime and k ≥ 2 an arbitrary positive integer, and suppose that x0 is a solution of f(x) ≡ 0 (mod p^{k-1}).(1)If f'(x0) ≢ 0 (mod p), then there is precisely one solution of f(x) ≡ 0 (mod p^k) that is congruent to x0 modulo p^{k-1}; it is given by x0 + t p^{k-1}, where t is the unique solution of t f'(x0) ≡ -f(x0)/p^{k-1} (mod p).(2)If f'(x0) ≡ 0 (mod p) and f(x0) ≡ 0 (mod p^k), then there are p solutions of f(x) ≡ 0 (mod p^k) that are congruent to x0, given by x0 + t p^{k-1} for t = 0, 1, ..., p - 1.(3)If f'(x0) ≡ 0 (mod p) and f(x0) ≢ 0 (mod p^k), then there are no solutions of f(x) ≡ 0 (mod p^k) that are congruent to x0.

Let us solve the congruence x^2 ≡ 2 (mod 7^4) using Theorem 2. Let f(x) = x^2 - 2. First, we solve f(x) ≡ 0 (mod 7). By trial, it is easy to find that x ≡ 3 and x ≡ 4 are the solutions of the congruence x^2 ≡ 2 (mod 7). To perform iterations by Theorem 2, we proceed as follows.

Take x1 = 3; then f'(x1) = 6 ≢ 0 (mod 7). By Theorem 2, there is a unique solution of f(x) ≡ 0 (mod 7^2) lying above 3. To find it, we find the integer t from the congruence t f'(3) ≡ -f(3)/7 (mod 7). This gives 6t ≡ -1 (mod 7), or t ≡ 1 (mod 7). Hence the unique choice t = 1 gives x2 = 3 + 1·7 = 10. Therefore x ≡ 10 is the unique solution of f(x) ≡ 0 (mod 7^2).

Next we take x2 = 10; then t f'(10) ≡ -f(10)/7^2 (mod 7) becomes 20t ≡ -2 (mod 7). That is, 6t ≡ 5 (mod 7), or t ≡ 2 (mod 7). Hence the unique choice t = 2 gives x3 = 10 + 2·7^2 = 108. Therefore, x ≡ 108 is the unique solution of f(x) ≡ 0 (mod 7^3).

Finally we take x3 = 108; then f'(108) = 216 is not divisible by 7. Thus there is a unique solution of f(x) ≡ 0 (mod 7^4). To find it, we solve 216t ≡ -34 (mod 7). This gives 6t ≡ 1 (mod 7), or t ≡ 6 (mod 7). Then x4 = 108 + 6·7^3 = 2166. Hence, x ≡ 2166 is the solution of x^2 ≡ 2 (mod 7^4).
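The lifting step of Theorem 2 is mechanical and is easily automated. The following Python sketch implements the nondegenerate case (1); the polynomial x^2 - 2, the prime p = 7, and the starting root 3 are assumed example values for illustration only.

```python
def hensel_lift(f, fprime, x, p, j):
    """Lift a solution x of f(x) ≡ 0 (mod p^j) to a solution mod p^(j+1).

    Assumes case (1) of Theorem 2: f'(x) is a unit modulo p.
    Returns x + t*p^j, where t solves t*f'(x) ≡ -f(x)/p^j (mod p).
    """
    pj = p ** j
    assert f(x) % pj == 0, "x must solve f(x) ≡ 0 (mod p^j)"
    assert fprime(x) % p != 0, "derivative must be a unit mod p"
    t = (-(f(x) // pj) * pow(fprime(x), -1, p)) % p
    return x + t * pj

# Assumed example: lift the root 3 of x^2 ≡ 2 (mod 7) up to modulus 7^4.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
x, p = 3, 7
roots = []
for j in range(1, 4):      # lift mod 7^2, then 7^3, then 7^4
    x = hensel_lift(f, fp, x, p, j)
    roots.append(x)
print(roots)               # → [10, 108, 2166]
```

Each pass of the loop adds one base-p digit to the root, which is exactly the linear behaviour that the higher-order methods below are designed to beat.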

From the above example, it is noticed that several iterations are required to compute a solution of a congruence modulo a high power of a prime, which is computationally intensive. Moreover, at each step we need to calculate the derivative of the function at the current root. Hence one may hesitate to solve polynomial congruences with moduli of higher prime powers using this lemma. Thus we need explicit algorithms in which the needed derivatives are already incorporated into the steps. In this paper we solve polynomial congruences with higher prime-power moduli by means of algorithms developed from root-finding iterative methods. The p-adic proofs of these algorithms are derived using elementary number theory. Notation used in this paper is standard, and we follow [13, 10, 11].

2. A Solution of Congruences Using Newton’s Method

Newton's method is a well-known iterative procedure for finding the roots of an equation, and in many ways it is the best tool for attacking a nonlinear problem; its simplicity and great speed always make it attractive. Assume that an initial estimate x0 is known for the desired root of f(x) = 0. Then the iterations of Newton's method are performed by the formula

x_{n+1} = x_n - f(x_n)/f'(x_n). (1)

Let us take f(x) = x^r - a. Then using (1), the explicit form of Newton's method is

x_{n+1} = ((r - 1) x_n^r + a) / (r x_n^{r-1}). (2)

As over the real numbers, it can be proved that Newton's method is quadratically convergent. Now if x_n is a solution of x^r ≡ a (mod p^m), then for some integer c we have x_n^r = a + c p^m. Substituting this in (2), we obtain (3). Now, since p ∤ r and p ∤ x_n, the denominator r x_n^{r-1} is invertible modulo p^{2m}. Hence, by the cancellation law, (3) yields that x_{n+1} is a solution of the congruence x^r ≡ a (mod p^{2m}).

Let us solve the congruence x^2 ≡ 2 (mod 7^4). In order to solve a polynomial congruence of the form x^r ≡ a (mod p^n), we first note that it is sufficient to start from a solution of x^r ≡ a (mod p), since every solution of x^r ≡ a (mod p^n) is a solution of x^r ≡ a (mod p). Once we have such a solution, we can apply (2) to find the solutions of x^r ≡ a (mod p^n) from the solutions of the congruence x^r ≡ a (mod p). Therefore, we first solve the congruence x^2 ≡ 2 (mod 7). By inspection we see that x ≡ 3 (mod 7) is a solution of this congruence. Thus we choose x0 = 3 as our initial guess. Then by (2), we have x1 = (x0^2 + 2)/(2 x0) ≡ 11 · 6^{-1} ≡ 10 (mod 7^2). Hence, x1 = 10 solves x^2 ≡ 2 (mod 7^2). We repeat the above process and find that the next iterate x2 = 2166 is the solution of the given congruence modulo 7^4.
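The quadratic behaviour of (2) can be checked directly in a few lines. The sketch below performs the modular Newton step for f(x) = x^r - a, doubling the exponent of the modulus at each pass; the values r = 2, a = 2, p = 7 are assumed for illustration.

```python
def newton_lift(a, r, x, p, m):
    """One Newton step for f(x) = x^r - a: a root mod p^m -> a root mod p^(2m).

    Implements x_{n+1} = ((r-1)*x^r + a) / (r*x^(r-1)), computed mod p^(2m).
    """
    mod = p ** (2 * m)
    num = ((r - 1) * pow(x, r, mod) + a) % mod
    den_inv = pow(r * pow(x, r - 1, mod) % mod, -1, mod)
    return num * den_inv % mod

# Assumed example: lift 3, a root of x^2 ≡ 2 (mod 7), to mod 7^2 and mod 7^4.
x, m = 3, 1
seq = []
for _ in range(2):
    x = newton_lift(2, 2, x, 7, m)
    m *= 2
    seq.append(x)
print(seq)          # → [10, 2166]
```

Two steps already reach the modulus 7^4, where the Hensel iteration of Theorem 2 needed three.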

2.1. Order of Convergence

As far as an iterative method of order k is concerned, its order of convergence asserts that each iteration multiplies the number of correct digits of the current approximation by k. In the present setting this means that if we start with an initial estimate that is correct to m base-p digits (that is, a solution modulo p^m), then the next iterate is a new approximation correct to km base-p digits (a solution modulo p^{km}).

To see that Newton's method is exactly quadratically convergent, we show that x_{n+1} will, in general, not be a solution modulo p^{2m+1}: expanding the binomial up to the terms involving p^{2m}, the coefficient of p^{2m} need not vanish modulo p. For this, we rewrite the Newton step and expand x_{n+1}^r binomially.
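This exact-doubling behaviour can also be confirmed numerically. A minimal check, assuming the example values f(x) = x^2 - 2, p = 7, and the root 108 modulo 7^3:

```python
# 108 solves x^2 ≡ 2 (mod 7^3); apply one Newton step working modulo 7^6.
p = 7
x = 108
mod = p ** 6
y = (x * x + 2) * pow(2 * x, -1, mod) % mod   # Newton step for x^2 - 2

# Largest j ≤ 12 with y^2 ≡ 2 (mod 7^j): measures the 7-adic precision of y.
exact_order = max(j for j in range(1, 13) if (y * y - 2) % p ** j == 0)
print(exact_order)
```

The 7-adic precision rises from three digits to exactly six, not seven, which is consistent with purely quadratic convergence.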

3. Third Order Iterative Methods

The following are third-order iterative methods for which explicit formulas for finding the roots of congruences are presented. The p-adic proofs of their convergence are given in the following two theorems.

3.1. A Variant of Newton’s Method

Several variants of Newton's method have been given by many researchers to improve the order of convergence. For third-order convergence, the following three variants of Newton's method, studied earlier in [11, 12] for solving nonlinear equations, are given in (7). In the following theorem, we use (7) to find the solutions of congruences of the form x^r ≡ a (mod p^{3n}) from the solutions of the congruence x^r ≡ a (mod p^n).

Theorem 3. Let a and r be integers which are not divisible by an odd prime p. If x0 satisfies x0^r ≡ a (mod p^n), then x1 satisfies the congruence x1^r ≡ a (mod p^{3n}), where x1 is computed from x0 by one step of (7).

Proof. To prove this, let f(x) = x^r - a. By (8), we get f(x0) ≡ 0 (mod p^n). Then by (7), we obtain (12). If x0 is a solution of x^r ≡ a (mod p^n), then for some integer c we have x0^r = a + c p^n. Putting this in (12), we get (13). Since p ∤ r and p ∤ x0, we have p ∤ r x0, so the relevant factors are invertible modulo p^{3n}. Then by (13), this implies that x1^r ≡ a (mod p^{3n}), as required.
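The variant formulas (7) from [11, 12] are not reproduced here; as a stand-in, the sketch below uses Halley's method, another third-order iteration which Section 4.3 notes is a subcase of both families treated later. The example values f(x) = x^2 - 2 and p = 7 are assumptions of this sketch.

```python
def halley_lift(a, r, x, p, m):
    """One Halley step for f(x) = x^r - a: a root mod p^m -> a root mod p^(3m).

    Halley's method: x - 2*f*f' / (2*f'^2 - f*f''), evaluated mod p^(3m).
    """
    mod = p ** (3 * m)
    f = (pow(x, r, mod) - a) % mod
    f1 = r * pow(x, r - 1, mod) % mod
    f2 = r * (r - 1) * pow(x, r - 2, mod) % mod
    num = 2 * f * f1 % mod
    den = (2 * f1 * f1 - f * f2) % mod
    return (x - num * pow(den, -1, mod)) % mod

# Assumed example: one step lifts the root 3 of x^2 ≡ 2 (mod 7) to mod 7^3.
x1 = halley_lift(2, 2, 3, 7, 1)
print(x1)        # → 108
```

One third-order step covers what two Hensel steps covered in the example of Section 1.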

3.2. Abbasbandy's Method

In [10], Abbasbandy used the Adomian decomposition method to improve Newton's method for solving nonlinear equations; the improved method is called Abbasbandy's method (AM). Solving polynomial congruences is one of the most interesting problems in number theory, and in this section we use AM to solve polynomial congruences of the form x^r ≡ a (mod p^n). It can be seen that AM lifts a solution modulo p to one modulo p^3, then to p^9, and by iteration to p^{3^k}. To perform iterations, the formula for AM is expressed in (16).

Theorem 4. Let a and r be integers which are not divisible by a prime p. If x0 satisfies x0^r ≡ a (mod p^n), then x1 satisfies the congruence x1^r ≡ a (mod p^{3n}), where x1 is computed from x0 by one step of (16).

Proof. To prove this, let f(x) = x^r - a. By (16), we obtain (18), which can be simplified as (19). Next we show that x1 is a solution of the congruence x^r ≡ a (mod p^{3n}). If x0 is a solution of x^r ≡ a (mod p^n), then for some integer c we have x0^r = a + c p^n. Putting this in (19), we get (20), which implies (21). Using the binomial series expansion, we obtain (22). Since p ∤ r and p ∤ x0, we have p ∤ r x0, and hence gcd(r x0^{r-1}, p^{3n}) = 1. Finally, by the cancellation law, (22) yields that x1 is a solution of the congruence x^r ≡ a (mod p^{3n}).
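A modular sketch of one AM step for f(x) = x^r - a is given below, using the classical third-order truncation x_{n+1} = x_n - f/f' - f^2 f''/(2 f'^3) attributed to [10]; the example values r = 2, a = 2, p = 7 are assumed.

```python
def abbasbandy_lift(a, r, x, p, m):
    """One step of Abbasbandy's method for f(x) = x^r - a, mod p^(3m).

    AM: x_{n+1} = x_n - f/f' - f^2*f''/(2*f'^3); lifts a root mod p^m
    to a root mod p^(3m).
    """
    mod = p ** (3 * m)
    f = (pow(x, r, mod) - a) % mod
    f1 = r * pow(x, r - 1, mod) % mod
    f2 = r * (r - 1) * pow(x, r - 2, mod) % mod
    inv_f1 = pow(f1, -1, mod)
    step = f * inv_f1 + f * f * f2 % mod * pow(2, -1, mod) * pow(inv_f1, 3, mod)
    return (x - step) % mod

# Assumed example: lift the root 3 of x^2 ≡ 2 (mod 7) to mod 7^3 in one step.
x1 = abbasbandy_lift(2, 2, 3, 7, 1)
print(x1)        # → 108
```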

3.3. Remark

Note that the variants of Newton's method discussed in Section 3 are two-step (predictor-corrector) methods, while the following are one-step methods. This clearly shows that the suggested technique works equally well for two-step methods and could be extended to multistep methods.

4. Higher Order Iterative Families

The following are two well-known one-step root-finding iterative families of higher order. In this section, we make use of these families in order to find the solutions of congruences modulo p^n. The following theorems demonstrate how one can exploit the order of convergence of these families to solve such problems in number theory.

4.1. Householder's Family

Householder's methods are a class of well-known iterative algorithms for solving a nonlinear equation in one variable. Let f be a function of one variable with continuous derivatives up to order d + 1. The formula for Householder's method of order d (with rate of convergence d + 1) is

x_{n+1} = x_n + d (1/f)^{(d-1)}(x_n) / (1/f)^{(d)}(x_n). (23)

Let us establish a formula for f(x) = x^r - a using (23).

The (d - 1)-th derivative of 1/f is given in (24); similarly, the d-th derivative of 1/f is given in (25). Substituting the values of (24) and (25) into (23), we obtain (26). In the following theorem, we use the fourth-order member of Householder's family (d = 3) in order to find the solutions of congruences of the form x^r ≡ a (mod p^{4n}) from the solutions of the congruence x^r ≡ a (mod p^n).

Theorem 5. Let a and r be integers which are not divisible by a prime p. If x0 is a solution of the congruence x^r ≡ a (mod p^n), then the iterate x1 given by (26) is a solution of the congruence x^r ≡ a (mod p^{4n}).

Proof. To prove this, let f(x) = x^r - a and apply (26); we get (27). Next we show that x1 is a root of the congruence x^r ≡ a (mod p^{4n}). If x0 is a solution of x^r ≡ a (mod p^n), then for some integer c we have x0^r = a + c p^n. Putting this in (27), we obtain (29), which implies (30). Using the binomial series expansion, we obtain (31). Similarly, we expand the denominator to get (32). Substituting (31) and (32) into (30), we get (33). Next we claim that p ∤ x0. To prove our assertion, suppose on the contrary that p | x0. But then p | x0^r. Since x0^r ≡ a (mod p^n), this shows that p | a, a contradiction, since p ∤ a. Hence we conclude that p ∤ x0, and since p ∤ r as well, the denominator in (33) is invertible modulo p^{4n}. Thus by (33), x1 is a solution of the congruence x^r ≡ a (mod p^{4n}).
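For f(x) = x^r - a, the fourth-order Householder step reduces to a ratio of polynomials in f and its first three derivatives. The sketch below implements this simplified form (an assumption consistent with the d = 3 member of the family, not a transcription of (26)); the example values r = 2, a = 2, p = 7 are assumed.

```python
def householder4_lift(a, r, x, p, m):
    """Fourth-order Householder step for f(x) = x^r - a, mod p^(4m).

    x_{n+1} = x - 3*f*(2*f'^2 - f*f'') / (6*f'^3 - 6*f*f'*f'' + f^2*f'''),
    obtained from x + 3*(1/f)''/(1/f)'''.  Lifts a root mod p^m to p^(4m).
    """
    mod = p ** (4 * m)
    f = (pow(x, r, mod) - a) % mod
    f1 = r * pow(x, r - 1, mod) % mod
    f2 = r * (r - 1) * pow(x, r - 2, mod) % mod
    f3 = r * (r - 1) * (r - 2) * pow(x, r - 3, mod) % mod if r >= 3 else 0
    num = 3 * f * (2 * f1 * f1 - f * f2) % mod
    den = (6 * pow(f1, 3, mod) - 6 * f * f1 * f2 + f * f * f3) % mod
    return (x - num * pow(den, -1, mod)) % mod

# Assumed example: one step lifts the root 3 of x^2 ≡ 2 (mod 7) to mod 7^4.
x1 = householder4_lift(2, 2, 3, 7, 1)
print(x1)        # → 2166
```

A single fourth-order step thus covers all three Hensel steps of the example in Section 1.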

4.2. Basic Family of Iteration Functions

The Basic Family of iteration functions, denoted by {B_m}, is a well-known class of iterative algorithms of order m for solving a nonlinear equation in one variable. The details of the Basic Family and its mathematical interpretation regarding existence and characterizations were discussed earlier in [5]. It has been proved that the members B_2 and B_3 of this family coincide with the well-known Newton's and Halley's methods. We further see that the member B_4 coincides with the member of Householder's family of order four. However, we demonstrate that the Basic Family of iteration functions is more convenient for finding roots of the given congruence with the desired convergence. To find the solutions of congruences modulo p^n, we need to recall the basics of this family as given in [5] (for details see pages 1-3 in [13]).

Let f be a polynomial of degree n over the field of complex numbers. For an integer m ≥ 1, let D_m(x) be a square matrix of order m whose diagonal elements are f'(x), whose entries above the diagonal are the normalized derivatives f^{(j-i+1)}(x)/(j - i + 1)!, and whose subdiagonal entries are f(x), as defined in (34). For m ≥ 2, the associated matrices obtained by removing the appropriate first rows and last columns of D_m(x) are given in (35). The family of iteration functions {B_m} defined by (36) is termed the Basic Family of iteration functions of order m, where

B_m(x) = x - f(x) det(D_{m-2}(x)) / det(D_{m-1}(x)). (37)

Let us establish a formula for B_4 with f(x) = x^r - a. By (36), we obtain det(D_2) as in (38)-(39); similarly, det(D_3) is given in (40)-(41). Substituting (39) and (41) into (37) and simplifying, we obtain (42). Now by (26) and (42), it is interesting to notice that both families map onto each other even for the fourth-order members; that is, the iteration function of Householder's family for d = 3 and the iteration function B_4 of the Basic Family are the same. Therefore, the fourth-order p-adic convergence of the Basic Family, as proved in Theorem 5, is immediately established. However, the Basic Family is certainly more expedient for finding solutions of congruences modulo p^n, as one can find the desired solution using (36) directly. Let us again find the solution of x^2 ≡ 2 (mod 7^4) through determinants. Here f(x) = x^2 - 2 and x0 = 3 is the initial solution. Then by (37), det(D_2(3)) = 29. Similarly, det(D_3(3)) = 132. Substituting these values into (37), we obtain x1 ≡ 3 - 7 · 29 · 132^{-1} ≡ 2166 (mod 7^4).
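The determinant description makes it easy to generate members of the Basic Family of any order mechanically. Below is a small Python sketch that evaluates B_m for f(x) = x^r - a using exact rational arithmetic for the determinants, followed by a single modular reduction; the Toeplitz convention (f' on the diagonal, normalized higher derivatives above it, f on the subdiagonal) and the example values f(x) = x^2 - 2, p = 7 are assumptions of this sketch.

```python
from fractions import Fraction
from math import factorial

def det(M):
    """Determinant by cofactor expansion (fine for the small matrices here)."""
    if not M:
        return Fraction(1)
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def basic_family_step(a, r, x, m, mod):
    """B_m for f(x) = x^r - a:  x - f * det(D_{m-2}) / det(D_{m-1})."""
    def deriv(j):
        # j-th derivative of x^r - a at x, divided by j!
        if j == 0:
            return Fraction(x ** r - a)
        if j > r:
            return Fraction(0)
        return Fraction(factorial(r) // (factorial(j) * factorial(r - j)) * x ** (r - j))
    def D(k):
        # k x k Toeplitz matrix: f' on diagonal, f^{(j)}/j! above, f below
        return [[deriv(c - row + 1) if c - row + 1 >= 0 else Fraction(0)
                 for c in range(k)] for row in range(k)]
    ratio = deriv(0) * det(D(m - 2)) / det(D(m - 1))
    q = Fraction(x) - ratio
    return q.numerator * pow(q.denominator, -1, mod) % mod

# Assumed example: B_4 applied to the root 3 of x^2 ≡ 2 (mod 7), mod 7^4.
x1 = basic_family_step(2, 2, 3, 4, 7 ** 4)
print(x1)        # → 2166
```

With m = 2 and m = 3 the same routine reproduces the Newton and Halley lifts, matching the claim that B_2 and B_3 coincide with those methods.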

4.3. Remarks

(1) The above methods also give an equally efficient technique for solving polynomial congruences modulo p^n with a negative exponent. To find the solutions of the congruence x^{-r} ≡ a (mod p^n), we solve the congruence x^r ≡ a (mod p^n), where p ∤ ar. This means that the solution of the first congruence is the multiplicative inverse of the solution of the latter congruence. We claim that the linear congruence uy ≡ 1 (mod p^n) is always solvable, where u is the solution of the congruence x^r ≡ a (mod p^n). Then u must satisfy u^r ≡ a (mod p^n). Since p ∤ a, we get p ∤ u^r. This clearly shows that p ∤ u. Hence gcd(u, p^n) = 1. Consequently, the linear congruence uy ≡ 1 (mod p^n) is solvable, as we know that a linear congruence by ≡ c (mod m) is solvable if and only if gcd(b, m) | c. Let y0 be the solution of the linear congruence uy ≡ 1 (mod p^n). Then y0 is the desired solution of x^{-r} ≡ a (mod p^n). For instance, x ≡ 108 satisfies x^2 ≡ 2 (mod 7^3). To find a solution of x^{-2} ≡ 2 (mod 7^3), we solve the linear congruence 108y ≡ 1 (mod 7^3). It is easy to see that y ≡ 54 (mod 7^3) is its solution. Hence, x ≡ 54 is the desired solution of the congruence x^{-2} ≡ 2 (mod 7^3).

(2) The text does not discuss the well-known third-order Halley's method, as it is a subcase of both families discussed above: Householder's family for d = 2 and the Basic Family for m = 3 both yield Halley's method.
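The inversion step in remark (1) is a single modular-inverse computation. A short sketch, assuming the example congruence x^2 ≡ 2 (mod 7^3) with root 108:

```python
def solve_negative_exponent(u, p, n):
    """Given u with u^r ≡ a (mod p^n), return the solution of x^(-r) ≡ a:
    the multiplicative inverse of u modulo p^n (it exists since p ∤ u)."""
    return pow(u, -1, p ** n)

# Assumed example: 108^2 ≡ 2 (mod 7^3), so its inverse solves x^(-2) ≡ 2.
v = solve_negative_exponent(108, 7, 3)
print(v)        # → 54
```

Python's three-argument pow with exponent -1 performs exactly the extended-Euclid inversion described above.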

In the following algorithm, we summarize the solutions of congruences discussed in Theorems 3, 4, and 5.

4.4. Algorithm

Step  1: Set x_0, a solution of x^r ≡ a (mod p), as the initial estimate.

Step  2: Set m_0 = 1.

Step  3: For i = 1 to ⌈log_3 n⌉ do: set m_i = 3 m_{i-1} and compute x_i from x_{i-1} by one step of the third-order iteration of Theorem 3 or Theorem 4, reduced modulo p^{m_i}. (For the fourth-order iteration of Theorem 5, take m_i = 4 m_{i-1} and ⌈log_4 n⌉ steps.)

Step  4: Solution = x_i (mod p^n).

The following algorithm is an improved form of the algorithms given above for an arbitrary value of k, where k is the order of convergence of the iterative method used.

Step  1: Set x_0, a solution of x^r ≡ a (mod p), as the initial estimate.

Step  2: Set m_0 = 1.

Step  3: For i = 1 to ⌈log_k n⌉ do: set m_i = k m_{i-1} and compute x_i from x_{i-1} by one step of the chosen order-k iteration, reduced modulo p^{m_i}.

Step  4: Solution = x_i (mod p^n).
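The steps above can be sketched as a generic driver that repeatedly applies a one-step order-k lifting rule until the target exponent is reached. The Newton step (k = 2) below is an assumed stand-in for whichever order-k iteration is chosen.

```python
def lift_root(a, r, p, n, x0, step, k):
    """Lift x0, a root of x^r ≡ a (mod p), to a root mod p^n using an
    order-k one-step method step(a, r, x, p, m) that maps a root mod p^m
    to a root mod p^(k*m).  Needs about ceil(log_k(n)) iterations."""
    x, m = x0, 1
    while m < n:
        x = step(a, r, x, p, m)
        m *= k
    return x % p ** n

def newton_step(a, r, x, p, m):
    # order-2 example rule: formula (2) evaluated modulo p^(2m)
    mod = p ** (2 * m)
    return ((r - 1) * pow(x, r, mod) + a) * pow(r * pow(x, r - 1, mod), -1, mod) % mod

# Assumed example: solve x^2 ≡ 2 (mod 7^4) starting from the root 3 mod 7.
sol = lift_root(2, 2, 7, 4, 3, newton_step, 2)
print(sol)        # → 2166
```

Swapping newton_step for a third- or fourth-order rule only changes k; the driver is otherwise identical.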

5. Numerical Examples

Let us solve the congruence x^2 ≡ 2 modulo a high power of 7 by using Hensel's lemma (HL), Abbasbandy's method (AM), a variant of Newton's method (VN), Householder's method (HM), and the Basic Family (BM) method. In order to solve a polynomial congruence of the form x^r ≡ a (mod p^n), it is sufficient to first solve x^r ≡ a (mod p). Once we do this, we can apply the algorithm given in Section 4.4 for finding the solutions of x^r ≡ a (mod p^n) from the solutions of the congruence x^r ≡ a (mod p). Therefore, we first solve the congruence x^2 ≡ 2 (mod 7). By inspection we see that x ≡ 3 and x ≡ 4 are the solutions of this congruence. Thus we choose x0 = 3 as our initial guess. Then, by the algorithm given in Section 4.4, we obtain the first lifted solution. We repeat the above process to find the roots of the given congruence modulo 7^2, 7^3, and so on, until we get the solution of the congruence modulo the desired power of 7. The necessary computations are summarized in Table 1.
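The iteration counts behind a table such as Table 1 follow directly from the orders of the methods: reaching the modulus p^n takes about ⌈log_k n⌉ lifting steps for an order-k method, against n - 1 steps of Theorem 2. A small integer-only illustration with an assumed target exponent n = 64:

```python
def lifting_steps(n, k):
    """Number of order-k lifting steps to go from exponent 1 to exponent n."""
    steps, m = 0, 1
    while m < n:
        m *= k
        steps += 1
    return steps

n = 64                        # assumed target exponent for illustration
hensel_steps = n - 1          # Theorem 2 gains one power of p per step
counts = {k: lifting_steps(n, k) for k in (2, 3, 4)}
print(hensel_steps, counts)   # 63 Hensel steps vs 6, 4, and 3 lifting steps
```

Integer doubling of the exponent avoids the floating-point pitfalls of computing log_k(n) directly.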

6. Conclusion

The complexity of a typical computation using Theorem 2 is linear in the exponent n. In the underlying work we have suggested various methods that drastically reduce the complexity of such computations to a logarithmic scale: it is easily deduced from the algorithm given in Section 4.4 that the complexity of the described method is O(log n) lifting steps. Additionally, the developed method is an explicit technique which does not require any numerical computation of derivatives at run time. Therefore, the techniques developed in this paper perform much faster for moduli that are high powers of p, in contrast with existing techniques for solving polynomial congruences. Moreover, this work proves that both families of iteration functions, that is, Householder's family and the Basic Family, work p-adically, as shown in the various results. Furthermore, we have shown the efficacy of the given method for solving negative-exponent equations of the form x^{-r} ≡ a (mod p^n).

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.