A New Newton Method with Memory for Solving Nonlinear Equations

A new Newton method with memory is proposed by using a variable self-accelerating parameter. First, a modified Newton method without memory, with an invariant parameter, is constructed for solving nonlinear equations. By replacing the invariant parameter of the Newton method without memory with a variable self-accelerating parameter, we obtain a novel Newton method with memory. The convergence order of the new Newton method with memory is 1 + √2. The acceleration of the convergence rate is attained without any additional function evaluations. The main innovation is that the self-accelerating parameter is constructed in a simple way. Numerical experiments show that the presented method has faster convergence speed than existing methods.


Introduction
Using information from the current and previous iterations, iterative methods with memory for solving nonlinear equations can attain high convergence order and computational efficiency without any additional function evaluations. Traub [1] first proposed the following iterative method with memory, with convergence order 1 + √2 ≈ 2.414:
where f[x_n, x_{n−1}] = (f(x_n) − f(x_{n−1}))/(x_n − x_{n−1}) is a divided difference and the parameter T_n is called the self-accelerating parameter. Method (1) is denoted TRM in this paper.
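For reference, the divided difference used throughout is straightforward to compute; a minimal Python helper (the function name is ours, not the paper's):

```python
def divided_difference(f, a, b):
    """First-order divided difference f[a, b] = (f(a) - f(b)) / (a - b)."""
    return (f(a) - f(b)) / (a - b)

# For f(x) = x^2, f[3, 1] = (9 - 1) / (3 - 1) = 4 = f'(2): the divided
# difference equals the derivative at an intermediate point.
value = divided_difference(lambda x: x**2, 3.0, 1.0)
```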
Inspired by Traub's idea, Džunić et al. [2] obtained a modified Newton method with memory of order 1 + √2, where the self-accelerating parameter T_n is calculated by an interpolating polynomial of Hermite-Birkhoff type. Method (2) is denoted DZM in this paper. McDougall and Wotherspoon [3] proposed the following method with convergence order 1 + √2 ≈ 2.414:
where x*_0 = x_0. Method (3) is denoted MWM in this paper. Many efficient iterative methods with memory have been studied in recent years; see [4-10]. Most of them achieve higher convergence order by using self-accelerating parameters, which are usually constructed by interpolation. In this paper, a new way to construct the self-accelerating parameter is proposed and a simple modified Newton method with memory is obtained.
The major innovation of this paper is a novel way to construct the self-accelerating parameter. In Section 2, we derive a modified Newton method with convergence order 2 for solving nonlinear equations. In Section 3, based on the modified Newton method, a new iterative method with memory is obtained by using the novel self-accelerating parameter; its order of convergence is increased from 2 to 1 + √2 without any additional function evaluations. In Section 4, numerical tests are used to compare the new methods with some well-known methods. Numerical experiments show that the new method has faster convergence speed than the existing methods.

Modified Newton Method
It is well known that Newton's method [11] converges quadratically; it is denoted NM in this paper. If the sequence {x_n} generated by NM converges to a simple root ζ of a nonlinear equation, then it satisfies the following relation: e_{n+1} = c_2 e_n² + O(e_n³), (4) where c_2 = f''(ζ)/(2f'(ζ)) is the asymptotic error constant, e_n = x_n − ζ and e_{n+1} = x_{n+1} − ζ.
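The quadratic error relation can be checked numerically; a short sketch (the test function x³ − 2 is our choice, not one of the paper's test problems), tracking the ratio e_{n+1}/e_n² against c_2 = f''(ζ)/(2f'(ζ)):

```python
def newton(f, df, x0, n_iter=6):
    """Plain Newton iteration, returning the whole iterate sequence."""
    xs = [x0]
    for _ in range(n_iter):
        x0 = x0 - f(x0) / df(x0)
        xs.append(x0)
    return xs

f  = lambda x: x**3 - 2.0
df = lambda x: 3.0 * x**2
zeta = 2.0 ** (1.0 / 3.0)          # simple root of f

xs = newton(f, df, 1.0)
e = [x - zeta for x in xs]         # errors e_n = x_n - zeta
# e[n+1] / e[n]^2 should approach c_2 = f''(zeta) / (2 f'(zeta)) = 1/zeta
```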
We consider the following scheme: where T ∈ R is a free parameter. The first step of our method (5) is Newton's method.

New Newton Method with Memory
In this section, a new way to construct the self-accelerating parameter is given. The new Newton method (5) can be accelerated by using information from the current and previous iterations. The error relation Equation (9) is minimized by recalculating the free parameter as T = T_n = c_2. If the variable parameter T_n satisfies lim_{n→∞} T_n = c_2, then the asymptotic error constant in Equation (10) tends to zero. From Equation (4), T_n = (x_{n+1} − ζ)/(x_n − ζ)² can serve as the self-accelerating parameter. However, since the zero ζ in Equation (4) is unknown, we use the iterative sequence of method (5) from the current and previous iterations to approximate ζ and obtain the new self-accelerating parameter T_n, given by the following formulas: Formula 1: Formula 2: Formula 3: Replacing the parameter T in Equation (5) with T_n, we obtain the following iterative method with memory: where T_n is calculated by one of Formulas (11)-(13). The parameter T_n depends on the iterates x_{n−1}, y_{n−1}, x_n and y_n. The convergence order of the iterative method with memory Equation (14) will be estimated by the concept of R-order of convergence [11] and the following theorem (see [12] (p. 287)).
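Formulas (11)-(13) are displayed equations in the original; as an illustrative stand-in (not the authors' formulas), c_2 can be estimated from stored iterates with divided differences, since f[a, b, c] → f''(ζ)/2 and f[a, b] → f'(ζ) as the points approach ζ:

```python
def dd1(f, a, b):
    """First-order divided difference f[a, b]."""
    return (f(a) - f(b)) / (a - b)

def dd2(f, a, b, c):
    """Second-order divided difference f[a, b, c]."""
    return (dd1(f, a, b) - dd1(f, b, c)) / (a - c)

f  = lambda x: x**3 - 2.0
df = lambda x: 3.0 * x**2

# generate a few Newton iterates converging to zeta = 2**(1/3)
xs = [1.0]
for _ in range(4):
    xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))

# illustrative self-accelerating estimate:
# T_n = f[x_n, x_{n-1}, x_{n-2}] / f[x_n, x_{n-1}]  ->  c_2
Tn = dd2(f, xs[4], xs[3], xs[2]) / dd1(f, xs[4], xs[3])
zeta = 2.0 ** (1.0 / 3.0)
c2 = 1.0 / zeta   # f''(zeta) / (2 f'(zeta)) = 6*zeta / (6*zeta**2)
```

Estimates of this kind use only already-computed function values, which is why no additional function evaluations are needed.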

Theorem 2.
If the errors of approximations e_k = x_k − a obtained in an iterative method (IM) satisfy: Let the self-accelerating parameter T_n be calculated by (11), (12) or (13) in the iterative method (14). If x_0 is an initial approximation sufficiently close to a simple root ζ of f(x), then the R-order of convergence of the iterative method (14) is at least 1 + √2. Proof. Let the sequence {x_n} be generated by an iterative method converging to the root ζ of f(x) with R-order O_R(IM, a) ≥ r. Then we obtain e_{n+1} ∼ D_{n,r} e_n^r, e_n = x_n − ζ, where D_{n,r} tends to the asymptotic error constant of Equation (15) as n → ∞. Therefore, e_{n+1} ∼ D_{n,r}(D_{n−1,r} e_{n−1}^r)^r = D_{n,r} D_{n−1,r}^r e_{n−1}^{r²}.
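The value 1 + √2 arises as the positive root of r² − 2r − 1 = 0. Assuming (consistently with the proof's error relations and the stated order) that the overall error satisfies e_{n+1} ∼ C e_n² e_{n−1}, the R-order r is obtained by matching exponents of e_{n−1}:

```latex
e_{n+1} \sim C\, e_n^{2}\, e_{n-1}, \qquad e_{n+1} \sim D_{n,r}\, e_n^{r}
\;\Longrightarrow\;
e_{n-1}^{r^{2}} \sim e_{n-1}^{2r}\cdot e_{n-1}
\;\Longrightarrow\;
r^{2} = 2r + 1
\;\Longrightarrow\;
r = 1 + \sqrt{2} \approx 2.414.
```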
From Equations (22), (24) and (26), we can see that Formula 1, Formula 2 and Formula 3 lead to the same error level. Thus, the convergence order of the iterative method (14) with memory is 1 + √2 ≈ 2.414 when any of Equations (11), (12) or (13) is used to compute the parameter T_n.
This completes the proof.

Numerical Examples
Now, the modified Newton method (5) without memory and method (14) with memory are compared with Newton's method (NM), method TRM (1), method DZM (2) and method MWM (3) for solving some nonlinear equations. Tables 1-7 give the absolute errors |x_k − ζ|, where the root ζ is computed with 1200 significant digits. We use the parameters T = 0.1 and T_0 = 0.1 in the first iteration. Here ρ denotes the computational convergence order [13], which is used to approximate the theoretical convergence order of an iterative method; it is defined as follows: We use the following test functions: Tables 1-10 show that the new Newton method with memory (14) presents an increased rate of convergence over Newton's method at no additional computational cost. The new method (14) has higher precision than the other methods.

Table 5. Numerical results for function f_5(x).
Table 6. Numerical results for function f_6(x).
Table 9. Numerical results for function f_9(x).
Tables 11 and 12 give the mean CPU time (in seconds) of the programs over 50 runs. The stopping criterion is |x_{k+1} − x_k| < 10^(−150) in Table 11 and |x_{k+1} − x_k| < 10^(−300) in Table 12. Tables 11 and 12 show that method (14) costs less computing time than the other methods, mainly because the structure of the self-accelerating parameter of method (14) is simple.
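The definition of ρ is a displayed equation taken from [13]; the standard computational-order approximation from successive differences (our implementation, assuming that formula) can be coded as:

```python
import math

def computational_order(xs):
    """Approximate the computational convergence order rho from the last
    four iterates, using successive differences |x_{k+1} - x_k|."""
    d1 = abs(xs[-1] - xs[-2])
    d2 = abs(xs[-2] - xs[-3])
    d3 = abs(xs[-3] - xs[-4])
    return math.log(d1 / d2) / math.log(d2 / d3)

# sanity check on Newton's method for x^3 - 2 (rho should be close to 2)
f  = lambda x: x**3 - 2.0
df = lambda x: 3.0 * x**2
xs = [1.0]
for _ in range(5):
    xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
rho = computational_order(xs)
```

In practice ρ is evaluated with the high-precision iterates used for the tables, since double precision exhausts its digits after a few iterations of a method of order above 2.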

Conclusions
In this paper, we have presented a new way to construct the variable self-accelerating parameter. Using this novel self-accelerating parameter, we obtain a modified Newton method (14) with memory. Theoretical analysis and numerical experiments confirm that the modified Newton method with memory achieves convergence order 1 + √2 while requiring only one function and one derivative evaluation per iteration. The convergence speed of the modified Newton method (14) is faster than that of the other methods. Thus, the new method (14) can be considered an improvement of Newton's method.