Some Improvements to a Third Order Variant of Newton's Method from Simpson's Rule

In this paper, we present three improvements to a three-point third order variant of Newton's method derived from Simpson's rule. The first is a fifth order method using the same number of functional evaluations as the third order method, the second is a four-point 10th order method and the last is a five-point 20th order method. From a computational point of view, our methods require four evaluations (one function and three first derivatives) to reach fifth order, five evaluations (two functions and three derivatives) to reach 10th order and six evaluations (three functions and three derivatives) to reach 20th order. Hence, these methods have efficiency indexes of 1.495, 1.585 and 1.648, respectively, which are better than the efficiency index of 1.316 of the third order method. We test the methods through some numerical experiments, which show that the 20th order method is very efficient.


Introduction
Newton's method has remained one of the best root-finding methods for solving the nonlinear scalar equation f(x) = 0. In the last 15 years, many higher order variants of Newton's method have been developed. One of them is a third order variant developed by Hasanov et al. [1] by approximating the indefinite integral in the Newton theorem by Simpson's formula. This method is a three-point method requiring 1 function and 3 first derivative evaluations, and has an efficiency index of 3^(1/4) = 1.316, which is lower than the 2^(1/2) = 1.414 of the one-point Newton method. Recently, the order of many variants of Newton's method has been improved, using the same number of functional evaluations, by means of weight functions (see [2-7] and the references therein).
In this work, we improve the order of the three-point variant from three to five using a weight function. Using polynomial interpolation, we develop four-point 10th order and five-point 20th order methods. Finally, we test the efficiency of the methods through numerical experiments.

Developments of the Methods
Let x_{n+1} = ψ(x_n) define an Iterative Function (I.F.).

Definition 1. [8] If the sequence {x_n} tends to a limit x* in such a way that

lim_{n→∞} (x_{n+1} − x*)/(x_n − x*)^p = C, C ≠ 0,

for some p ≥ 1, then the order of convergence of the sequence is said to be p, and C is known as the asymptotic error constant. If p = 1, p = 2 or p = 3, the convergence is said to be linear, quadratic or cubic, respectively.
Let e_n = x_n − x*; then the relation

e_{n+1} = C e_n^p + O(e_n^{p+1})

is called the error equation. The value of p is called the order of convergence of the method.
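When x* is known, the order p in the error equation can be checked numerically from three consecutive errors (the computational order of convergence, a standard device not stated in the text). A minimal sketch, assuming the test function f1(x) = x^3 + 4x^2 − 10, which is consistent with the zero quoted in the numerical section:

```python
import math

def estimated_order(xs, x_star):
    """Computational order of convergence from the last three iterates:
    p ~ ln|e_{n+1}/e_n| / ln|e_n/e_{n-1}|, where e_n = x_n - x*."""
    e = [abs(x - x_star) for x in xs[-3:]]
    return math.log(e[2] / e[1]) / math.log(e[1] / e[0])

# Newton iterates for f1(x) = x^3 + 4*x^2 - 10 (assumed test function).
f = lambda x: x**3 + 4*x**2 - 10
fp = lambda x: 3*x**2 + 8*x
xs, x = [], 1.5
for _ in range(3):
    x = x - f(x) / fp(x)
    xs.append(x)
p = estimated_order(xs, 1.3652300134140969)  # close to 2 for Newton's method
```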

Definition 2. [9]
The Efficiency Index is given by EI = p^(1/d), where p is the order of convergence and d is the total number of new function evaluations (the values of f and its derivatives) per iteration.
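As a quick check of the efficiency indexes quoted in this paper (p is the order, d the number of evaluations per iteration):

```python
# Efficiency index EI = p**(1/d) for each method discussed in the paper.
methods = {
    "2nd NR":    (2, 2),   # order 2 with 2 evaluations
    "3rd 3pVS":  (3, 4),   # order 3 with 4 evaluations
    "5th 3pVS":  (5, 4),
    "10th 4pVS": (10, 5),
    "20th 5pVS": (20, 6),
}
ei = {name: p ** (1 / d) for name, (p, d) in methods.items()}
for name, value in ei.items():
    print(f"{name:>9s}: EI = {value:.3f}")
```

These reproduce the values 1.414, 1.316, 1.495, 1.585 and 1.648 used throughout the text.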
Let x_{n+1} be determined by new information at x_n and at the intermediate points w_1(x_n), ..., w_k(x_n); no old information is reused. Thus,

x_{n+1} = ψ(x_n, w_1(x_n), ..., w_k(x_n)).

Then ψ is called a multipoint I.F. without memory.
Kung-Traub Conjecture [10]. Let ψ be an I.F. without memory with d evaluations. Then

p_opt = 2^(d−1),

where p_opt is the maximum order.
The second order Newton (also called Newton-Raphson) I.F. (2nd NR) is given by

ψ_{2nd NR}(x) = x − f(x)/f'(x).

The 2nd NR I.F. is a one-point I.F. with 2 function evaluations, and it satisfies the Kung-Traub conjecture with d = 2.
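A minimal Python sketch of the 2nd NR iteration; the test function f1(x) = x^3 + 4x^2 − 10 and the starting point are assumptions consistent with the zero quoted in the numerical section:

```python
def newton(f, fp, x0, tol=1e-12, max_iter=50):
    """2nd NR I.F.: x_{n+1} = x_n - f(x_n)/f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fp(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# f1(x) = x^3 + 4x^2 - 10 (assumed), starting point x0 = 1.5.
root = newton(lambda x: x**3 + 4*x**2 - 10,
              lambda x: 3*x**2 + 8*x, x0=1.5)
```

The computed root agrees with the zero x* = 1.3652300134140968457... quoted in the numerical section.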
Thus, EI_{2nd NR} = 2^(1/2) = 1.414. The Newton I.F. can be constructed from the local linear model of the function f(x), which is the tangent drawn to f(x) at the current point x. The local linear model at x is

L(x) = f(x) + f'(x)(ψ − x).

This local linear model can be interpreted from the viewpoint of the Newton Theorem:

f(ψ) = f(x) + INT, where INT = ∫_x^ψ f'(t) dt.

Weerakoon and Fernando [11] showed that if the indefinite integral INT is approximated by the rectangle rule, INT ≈ f'(x)(ψ − x), the Newton I.F. is obtained by setting L(x) = 0. Hasanov et al. [1] obtained a new linear model by approximating INT by Simpson's formula:

INT ≈ ((ψ − x)/6) [f'(x) + 4f'((x + ψ)/2) + f'(ψ)].

Solving the new linear model, they obtained the implicit I.F.:

ψ(x) = x − 6f(x) / [f'(x) + 4f'((x + ψ(x))/2) + f'(ψ(x))].

Estimating ψ(x) in the first derivative terms by the Newton I.F. ψ_{2nd NR}(x), they obtained the 3rd 3pVS I.F.:

ψ_{3rd 3pVS}(x) = x − 6f(x) / [f'(x) + 4f'((x + ψ_{2nd NR}(x))/2) + f'(ψ_{2nd NR}(x))].

The 3rd 3pVS I.F. is a special case of the Frontini-Sormani family of third order I.F.s from quadrature rules [12]. It was extended to systems of equations in [13]. However, this I.F. is considered inefficient: according to the Kung-Traub conjecture, the optimal order of an I.F. with d = 4 is eight. In fact, we can have optimal three-point eighth order I.F.s with 4 function evaluations or with 3 function and 1 first derivative evaluations (see [5,14] and the references therein).
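The 3rd 3pVS step above can be sketched directly in Python (one function and three first derivative evaluations per iteration); the test function f1(x) = x^3 + 4x^2 − 10 is again an assumption consistent with the numerical section:

```python
def vs3_step(f, fp, x):
    """One 3rd 3pVS step: a Newton predictor y, then a correction whose
    denominator is Simpson's rule applied to f' on [x, y]."""
    y = x - f(x) / fp(x)                           # 2nd NR estimate of psi(x)
    simpson = fp(x) + 4 * fp((x + y) / 2) + fp(y)  # equals 6*INT/(y - x)
    return x - 6 * f(x) / simpson

f = lambda x: x**3 + 4*x**2 - 10   # assumed test function f1
fp = lambda x: 3*x**2 + 8*x
x = 1.5
for _ in range(4):
    x = vs3_step(f, fp, x)   # converges cubically to x* = 1.36523001341...
```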
However, we can achieve only a maximum of sixth order with I.F.s using 2 function and 2 first derivative evaluations. The question we now pose is: what is the maximum order we can achieve with I.F.s using 1 function and 3 first derivative evaluations?
Let us define τ = f'(ψ_{2nd NR}(x))/f'(x). We improve the 3rd 3pVS I.F. using a weight function in τ to obtain a three-point fifth order I.F. (5th 3pVS). It is remarkable that, with the same number of functional evaluations, we have improved the efficiency index from EI_{3rd 3pVS} = 1.316 to EI_{5th 3pVS} = 5^(1/4) = 1.495. However, the maximum order we could achieve with I.F.s using 1 function and 3 first derivative evaluations is five. Babajee [2] developed a technique to improve the order of existing methods.
Theorem 3 (Babajee's theorem for improving the order of old methods). [2] Let a sufficiently smooth function f : D ⊂ R → R have a simple root x* in the open interval D, and let ψ_old(x) be an Iteration Function (I.F.) of order p. Then the I.F. defined as

ψ_new(x) = ψ_old(x) − f(ψ_old(x))/G,

where G approximates f'(ψ_old(x)) to order q with asymptotic error constant C_G, satisfies the following: if the error equation of the old I.F. is e_{n+1} = C e_n^p + O(e_n^{p+1}), then the new I.F. has order p + q. Usually, G is a weight function or an approximation to f'(ψ_old(x)) obtained from polynomial interpolation.

Using Babajee's theorem and applying the Newton I.F. at ψ_{5th 3pVS}(x), we obtain a 10th order I.F. (10th VS). However, this requires two more function evaluations. So we estimate f'(ψ_{5th 3pVS}(x)) by an interpolating polynomial q_1 which matches the function and first derivative values already computed during the step. Using the divided differences f[a, b] = (f(b) − f(a))/(b − a), Equation (14) reduces to a system of 3 linear equations in matrix form, whose solution gives the coefficients of q_1. Using Babajee's theorem with p = q = 5, we obtain a four-point 10th-order I.F. (10th 4pVS). The efficiency index has now increased, since EI_{10th 4pVS} = 10^(1/5) = 1.585 with 2 function and 3 first derivative evaluations. We point out that the optimal order of a four-point I.F. with 5 functional evaluations is 16, but this can be achieved only with either 5 function evaluations or with 4 function and 1 first derivative evaluations (see [5,14] and the references therein).
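The derivative-estimation idea can be illustrated with a generic sketch (not the paper's exact interpolant q_1, whose conditions are garbled in this copy): fit a cubic to one function value and three first derivative values, all of which a 5th 3pVS step has already computed, and read off the fitted derivative at a new point at no extra cost.

```python
import numpy as np

def fit_cubic(x0, f0, nodes, dvals):
    """Coefficients (a, b, c, d) of q(t) = a + b*t + c*t**2 + d*t**3 solving
    q(x0) = f0 and q'(x_i) = dvals[i] at three distinct nodes x_i."""
    rows = [[1.0, x0, x0**2, x0**3]]             # q(x0) = f0
    rhs = [f0]
    for xi, di in zip(nodes, dvals):
        rows.append([0.0, 1.0, 2*xi, 3*xi**2])   # q'(xi) = di
        rhs.append(di)
    return np.linalg.solve(np.array(rows), np.array(rhs))

f = lambda x: x**3 + 4*x**2 - 10   # assumed test function f1
fp = lambda x: 3*x**2 + 8*x
# One function value plus f' at three points, as available within a step
# (the nodes here are illustrative, not the method's actual points).
a, b, c, d = fit_cubic(1.5, f(1.5),
                       [1.5, 1.43, 1.37],
                       [fp(1.5), fp(1.43), fp(1.37)])
z = 1.3652
deriv_est = b + 2*c*z + 3*d*z**2   # estimate of f'(z), no new evaluation
```

Because f1 is itself a cubic, the fit reproduces f' exactly here; for a general f the estimate carries an interpolation error, which is what limits the attainable order.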
Using a similar approach, we estimate f'(ψ_{10th 4pVS}(x)) by an interpolating polynomial q_2 which satisfies, among its conditions, q_2(ψ_{5th 3pVS}(x)) = f(ψ_{5th 3pVS}(x)) and q_2(ψ_{10th 4pVS}(x)) = f(ψ_{10th 4pVS}(x)). Equation (17) reduces, after using the divided differences, to a system of 4 linear equations in matrix form, whose solution gives the coefficients of q_2. Using the Taylor series and symbolic software such as Maple, we expand the resulting quantities; substituting Equations (19), (20) and (21) into Equation (11), we obtain, after simplifications, the error expansion of the estimate. Finally, using Babajee's theorem with p = q = 10, we obtain the five-point 20th order I.F. (20th 5pVS) given in Equation (24).

Numerical Examples
In this section, we give numerical results on some test functions to compare the efficiency of the proposed methods (5th 3pVS, 10th 4pVS and 20th 5pVS) with the 3rd 3pVS method and Newton's method (2nd NR). Numerical computations have been carried out in MATLAB, rounding to 1000 significant digits. Depending on the precision of the computer, we use the stopping criterion E_n = |x_n − x_{n−1}| < ε for the iterative process, where ε = 10^(−100). Let N be the number of iterations required for convergence. The test functions and their simple zeros are given below; the first zero is x* = 1.3652300134140968457...
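A double-precision sketch of such an experiment for f1 (assumed to be x^3 + 4x^2 − 10, consistent with the quoted zero); the paper uses 1000-digit arithmetic and ε = 10^(−100), which plain Python floats cannot reproduce, so a looser tolerance stands in:

```python
def iterate(step, x0, tol=1e-12, max_iter=100):
    """Iterate until the paper's criterion |x_n - x_{n-1}| < tol holds;
    return the final iterate and the iteration count N."""
    x = x0
    for n in range(1, max_iter + 1):
        x_new = step(x)
        if abs(x_new - x) < tol:
            return x_new, n
        x = x_new
    return x, max_iter

f = lambda x: x**3 + 4*x**2 - 10   # assumed test function f1
fp = lambda x: 3*x**2 + 8*x
nr = lambda x: x - f(x) / fp(x)    # 2nd NR step

def vs3(x):                        # 3rd 3pVS step
    y = nr(x)
    return x - 6 * f(x) / (fp(x) + 4 * fp((x + y) / 2) + fp(y))

root_nr, n_nr = iterate(nr, 1.5)
root_vs, n_vs = iterate(vs3, 1.5)  # higher order: no more steps than 2nd NR
```

The higher order methods of the paper shorten the iteration count further still, at the price of more evaluations per step, which is exactly the trade-off the efficiency index measures.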
Table 1 shows the corresponding results for f_1(x) to f_4(x). It can be seen that the 20th 5pVS I.F. converges in fewer iterations, with the smallest error E_N, for the functions and starting points considered. This I.F. takes at most half the number of iterations of the 2nd NR I.F. to converge. The number of iterations and the error are smaller when we choose a starting point close to the root.

Table 1. Results for the 5th 3pVS, 10th 4pVS and 20th 5pVS Iterative Functions (I.F.s) for f_1(x)-f_4(x), along with the 2nd NR and 3rd 3pVS I.F.s.

Conclusion
In this work, we have developed three-point fifth order, four-point 10th order and five-point 20th order methods using weight functions and polynomial interpolation. Our proposed methods require only four evaluations per iterative step to obtain fifth order, five evaluations to obtain 10th order and six evaluations to obtain 20th order. We have thus increased the order of convergence to five, 10 and 20, compared to the third order method suggested in [1], with efficiency indexes EI = 1.495, EI = 1.585 and EI = 1.648, respectively. Our proposed methods are also better than Newton's method in terms of efficiency index (EI = 1.414). Numerical results show that the five-point 20th order method is the most efficient.