Search Results (8)

Search Parameters:
Keywords = with-memory methods

26 pages, 5138 KB  
Article
On Traub–Steffensen-Type Iteration Schemes With and Without Memory: Fractal Analysis Using Basins of Attraction
by Moin-ud-Din Junjua, Shahid Abdullah, Munish Kansal and Shabbir Ahmad
Fractal Fract. 2024, 8(12), 698; https://doi.org/10.3390/fractalfract8120698 - 26 Nov 2024
Cited by 2 | Viewed by 2045
Abstract
This paper investigates the design and stability of Traub–Steffensen-type iteration schemes with and without memory for solving nonlinear equations. Steffensen’s method overcomes the drawback of the derivative evaluation of Newton’s scheme, but it has, in general, smaller sets of initial guesses that converge to the desired root. Despite this drawback, several researchers have developed higher-order iterative methods based on Steffensen’s scheme. Traub introduced a free parameter in Steffensen’s scheme to obtain the first parametric iteration method, which provides larger basins of attraction for specific values of the parameter. In this paper, we introduce a two-step derivative-free fourth-order optimal iteration scheme based on Traub’s method by employing three free parameters and a weight function. We further extend it into a two-step eighth-order iteration scheme with memory by approximating the involved parameters using Newton’s interpolation. The convergence analysis demonstrates that the proposed iteration scheme without memory has an order of convergence of 4, while its memory-based extension achieves an order of convergence of at least 7.993, attaining the efficiency index 7.993^{1/3} ≈ 2. Two special cases of the proposed iteration scheme are also presented. Notably, the proposed methods compete with any optimal j-point method without memory. We affirm the superiority of the proposed iteration schemes in terms of efficiency index, absolute error, computational order of convergence, basins of attraction, and CPU time using comparisons with several existing iterative methods of similar kinds across diverse nonlinear equations. In general, for the comparison of iterative schemes, the basins of attraction are investigated on simple polynomials of the form z^n − 1 in the complex plane.
However, we investigate the stability and regions of convergence of the proposed iteration methods in comparison with some existing methods on a variety of nonlinear equations in terms of fractals of basins of attraction. The proposed iteration schemes generate the basins of attraction in less time with simple fractals and wider regions of convergence, confirming their stability and superiority in comparison with the existing methods. Full article
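The building block the abstract describes can be illustrated with a minimal Python sketch of the classical one-parameter Traub–Steffensen step. This is a simplified, hypothetical version for orientation only: the paper's fourth- and eighth-order schemes add three free parameters and a weight function not shown here.

```python
def traub_steffensen(f, x0, gamma=1.0, tol=1e-12, max_iter=50):
    """One-parameter Traub-Steffensen iteration (second order).

    Newton's derivative f'(x) is replaced by the divided difference
    f[x, w] with w = x + gamma*f(x); gamma = 1 recovers Steffensen's
    method, and other values of gamma reshape the basins of attraction.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        w = x + gamma * fx
        dd = (f(w) - fx) / (w - x)   # divided difference f[x, w]
        x -= fx / dd
    return x

root = traub_steffensen(lambda x: x**2 - 2.0, 1.5, gamma=-1.0)
```

Because no derivative is evaluated, only the free parameter gamma controls how far the auxiliary point w strays from x, which is what makes the basins of attraction parameter-dependent.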

15 pages, 558 KB  
Article
Enhanced Ninth-Order Memory-Based Iterative Technique for Efficiently Solving Nonlinear Equations
by Shubham Kumar Mittal, Sunil Panday and Lorentz Jäntschi
Mathematics 2024, 12(22), 3490; https://doi.org/10.3390/math12223490 - 8 Nov 2024
Cited by 4 | Viewed by 1245
Abstract
In this article, we present a novel three-step with-memory iterative method for solving nonlinear equations. We have improved the convergence order of a well-known optimal eighth-order iterative method by converting it into a with-memory version. The Hermite interpolating polynomial is utilized to compute a self-accelerating parameter that improves the convergence order. The proposed uni-parametric with-memory iterative method improves its R-order of convergence from 8 to 8.8989. Additionally, no further function evaluations are required to achieve this improvement in convergence order. Furthermore, the efficiency index has increased from 1.6818 to 1.7272. Extensive numerical testing on a variety of problems demonstrates that the proposed method is more effective than some well-known existing methods. Full article
(This article belongs to the Special Issue New Trends and Developments in Numerical Analysis: 2nd Edition)
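The self-accelerating idea behind such with-memory methods can be sketched in its simplest classical form: re-estimating the free parameter of a Steffensen-type step from the previous iterate lifts the order from 2 to 1 + √2 without any extra function evaluations. The sketch below uses a first-degree Newton (secant) interpolant for the update; the method above applies the same principle with a Hermite interpolant at eighth order, which is not reproduced here.

```python
def steffensen_with_memory(f, x0, gamma0=-0.5, tol=1e-12, max_iter=50):
    """Traub-Steffensen iteration with a self-accelerating parameter.

    After each step, gamma is re-estimated as -1/N1'(x), where N1 is the
    Newton (secant) interpolating polynomial through the last two
    iterates; this memory raises the R-order from 2 to 1 + sqrt(2)
    while reusing function values already computed.
    """
    x, gamma = x0, gamma0
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            break
        w = x + gamma * fx
        dd = (f(w) - fx) / (w - x)       # divided difference f[x, w]
        x_new = x - fx / dd
        f_new = f(x_new)
        # self-accelerating update: the secant slope through the previous
        # and current iterate approximates f'(root)
        gamma = -(x_new - x) / (f_new - fx)
        x, fx = x_new, f_new
    return x

root = steffensen_with_memory(lambda x: x**3 + x - 1.0, 1.0)
```

The key accounting point is that f_new is needed for the next iteration anyway, so the parameter update is free.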

14 pages, 434 KB  
Article
A New Adaptive Eleventh-Order Memory Algorithm for Solving Nonlinear Equations
by Sunil Panday, Shubham Kumar Mittal, Carmen Elena Stoenoiu and Lorentz Jäntschi
Mathematics 2024, 12(12), 1809; https://doi.org/10.3390/math12121809 - 11 Jun 2024
Cited by 4 | Viewed by 1528
Abstract
In this article, we introduce a novel three-step iterative algorithm with memory for finding the roots of nonlinear equations. The convergence order of an established eighth-order iterative method is elevated by transforming it into a with-memory variant. The improvement in the convergence order is achieved by introducing two self-accelerating parameters, calculated using the Hermite interpolating polynomial. As a result, the R-order of convergence for the proposed bi-parametric with-memory iterative algorithm is enhanced from 8 to 10.5208. Notably, this enhancement in the convergence order is accomplished without the need for extra function evaluations. Moreover, the efficiency index of the newly proposed with-memory iterative algorithm improves from 1.5157 to 1.6011. Extensive numerical testing across various problems confirms the usefulness and superior performance of the presented algorithm relative to some well-known existing algorithms. Full article
(This article belongs to the Special Issue New Trends in Nonlinear Analysis)

15 pages, 488 KB  
Article
An Efficient Bi-Parametric With-Memory Iterative Method for Solving Nonlinear Equations
by Ekta Sharma, Shubham Kumar Mittal, J. P. Jaiswal and Sunil Panday
AppliedMath 2023, 3(4), 1019-1033; https://doi.org/10.3390/appliedmath3040051 - 11 Dec 2023
Cited by 3 | Viewed by 1910
Abstract
New three-step with-memory iterative methods for solving nonlinear equations are presented. We have enhanced the convergence order of an existing eighth-order memory-less iterative method by transforming it into a with-memory method. Enhanced acceleration of the convergence order is achieved by introducing two self-accelerating parameters computed using the Hermite interpolating polynomial. The corresponding R-order of convergence of the proposed uni- and bi-parametric with-memory methods is increased from 8 to 9 and 10, respectively. This increase in convergence order is accomplished without requiring additional function evaluations, making the with-memory method computationally efficient. The efficiency of our with-memory methods NWM9 and NWM10 increases from 1.6818 to 1.7320 and 1.7783, respectively. Numerical testing confirms the theoretical findings and emphasizes the superior efficacy of the suggested methods when compared to some well-known methods in the existing literature. Full article
(This article belongs to the Special Issue Contemporary Iterative Methods with Applications in Applied Sciences)
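The efficiency figures quoted in this abstract follow directly from Ostrowski's efficiency index, p^{1/d} for a method of order p using d function evaluations per iteration. A quick check of the numbers, assuming the standard four evaluations per iteration of a three-step eighth-order scheme:

```python
def efficiency_index(order, evals):
    """Ostrowski's efficiency index p**(1/d) for a method of convergence
    order p that uses d function evaluations per iteration."""
    return order ** (1.0 / evals)

# A three-step optimal eighth-order scheme uses four evaluations per
# iteration; with memory, the R-order rises to 9 and 10 at no extra cost:
ei8 = efficiency_index(8, 4)     # ≈ 1.6818
ei9 = efficiency_index(9, 4)     # ≈ 1.7321 (the abstract truncates to 1.7320)
ei10 = efficiency_index(10, 4)   # ≈ 1.7783
```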

13 pages, 514 KB  
Article
Derivative-Free Families of With- and Without-Memory Iterative Methods for Solving Nonlinear Equations and Their Engineering Applications
by Ekta Sharma, Sunil Panday, Shubham Kumar Mittal, Dan-Marian Joița, Lavinia Lorena Pruteanu and Lorentz Jäntschi
Mathematics 2023, 11(21), 4512; https://doi.org/10.3390/math11214512 - 1 Nov 2023
Cited by 6 | Viewed by 1982
Abstract
In this paper, we propose a new fifth-order family of derivative-free iterative methods for solving nonlinear equations. Numerous iterative schemes found in the existing literature either exhibit divergence or fail to work when the function derivative is zero. However, the proposed family of methods successfully works even in such scenarios. We extended this idea to memory-based iterative methods by utilizing self-accelerating parameters derived from the current and previous approximations. As a result, we increased the convergence order from five to ten without requiring additional function evaluations. Analytical proofs of the proposed family of derivative-free methods, both with and without memory, are provided. Furthermore, numerical experimentation on diverse problems reveals the effectiveness and good performance of the proposed methods when compared with well-known existing methods. Full article
(This article belongs to the Special Issue Advances in Linear Recurrence System)
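The robustness claim here, that derivative-based schemes break down when the derivative vanishes while derivative-free ones do not, can be seen already with the plain secant method. The sketch below is a generic illustration, not the paper's fifth-order family: the test function is hypothetical, chosen so that Newton's method fails at a natural starting point.

```python
def secant(f, x0, x1, tol=1e-12, max_iter=60):
    """Derivative-free secant iteration: the derivative is replaced by the
    divided difference through the two most recent iterates."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            break
        x0, x1 = x1, x1 - f1 * (x1 - x0) / (f1 - f0)
        f0, f1 = f1, f(x1)
    return x1

f = lambda x: x**3 - x - 3.0
# Newton's method started at x0 = 1/sqrt(3) divides by zero, because
# f'(x) = 3x**2 - 1 vanishes there; the divided-difference step never
# evaluates the derivative and proceeds normally.
root = secant(f, 1.0, 2.0)
```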

18 pages, 848 KB  
Article
Efficient Families of Multi-Point Iterative Methods and Their Self-Acceleration with Memory for Solving Nonlinear Equations
by G Thangkhenpau, Sunil Panday, Liviu C. Bolunduţ and Lorentz Jäntschi
Symmetry 2023, 15(8), 1546; https://doi.org/10.3390/sym15081546 - 6 Aug 2023
Cited by 12 | Viewed by 1760
Abstract
In this paper, we have constructed new families of derivative-free three- and four-parametric methods with and without memory for finding the roots of nonlinear equations. Error analysis verifies that the without-memory methods are optimal as per Kung–Traub’s conjecture, with orders of convergence of 4 and 8, respectively. To further enhance their convergence capabilities, the with-memory methods incorporate accelerating parameters, elevating their convergence orders to 7.5311 and 15.5156, respectively, without introducing extra function evaluations. As such, they exhibit exceptional efficiency indices of 1.9601 and 1.9847, respectively, nearing the maximum efficiency index of 2. The convergence domains are also analysed using the basins of attraction, which exhibit symmetrical patterns and shed light on the fascinating interplay between symmetry, dynamic behaviour, the number of diverging points, and efficient root-finding methods for nonlinear equations. Numerical experiments and comparison with existing methods are carried out on some nonlinear functions, including real-world chemical engineering problems, to demonstrate the effectiveness of the new proposed methods and confirm the theoretical results. Notably, our numerical experiments reveal that the proposed methods outperform their existing counterparts, offering superior precision in computation. Full article
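Convergence orders such as 7.5311 and 15.5156 are verified numerically via the computational order of convergence (COC), estimated from successive iterate differences. A minimal estimator, checked against Newton's method as a known second-order baseline:

```python
import math

def coc(xs):
    """Computational order of convergence estimated from the last four
    iterates: rho ~ ln(e_k/e_{k-1}) / ln(e_{k-1}/e_{k-2}),
    where e_k = |x_{k+1} - x_k|."""
    e1 = abs(xs[-3] - xs[-4])
    e2 = abs(xs[-2] - xs[-3])
    e3 = abs(xs[-1] - xs[-2])
    return math.log(e3 / e2) / math.log(e2 / e1)

# Sanity check on Newton's method for f(x) = x**2 - 2:
xs = [1.5]
for _ in range(3):
    x = xs[-1]
    xs.append(x - (x * x - 2.0) / (2.0 * x))
rho = coc(xs)   # ≈ 2 for a second-order method
```

In practice the estimate is taken from the last iterates before machine precision is reached, since later differences underflow and corrupt the logarithms.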

23 pages, 2165 KB  
Article
Extension of King’s Iterative Scheme by Means of Memory for Nonlinear Equations
by Saima Akram, Maira Khalid, Moin-ud-Din Junjua, Shazia Altaf and Sunil Kumar
Symmetry 2023, 15(5), 1116; https://doi.org/10.3390/sym15051116 - 19 May 2023
Cited by 11 | Viewed by 2609
Abstract
We developed a new family of optimal eighth-order derivative-free iterative methods for finding simple roots of nonlinear equations based on King’s scheme and Lagrange interpolation. By incorporating four self-accelerating parameters and a weight function in a single variable, we extend the proposed family to an efficient iterative scheme with memory. Without performing additional functional evaluations, the order of convergence is boosted from 8 to 15.51560, and the efficiency index is raised from 1.6817 to 1.9847. To compare the performance of the proposed and existing schemes, some real-world problems are selected, such as the eigenvalue problem, continuous stirred-tank reactor problem, and energy distribution for Planck’s radiation. The stability and regions of convergence of the proposed iterative schemes are investigated through graphical tools, such as 2D symmetric basins of attraction for the case of memory-based schemes and 3D stereographic projections in the case of schemes without memory. The stability analysis demonstrates that our newly developed schemes have wider symmetric regions of convergence than the existing schemes in their respective domains. Full article
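The basin-of-attraction plots used for such stability comparisons are produced by iterating the scheme from a grid of starting points and colouring each point by the root it converges to. The sketch below shows the mechanism with plain Newton iteration on z^3 − 1 (a generic stand-in; the papers above apply the same procedure to their King-type and Traub–Steffensen-type schemes):

```python
import cmath

def basin_labels(n=81, box=1.5, max_iter=60, tol=1e-8):
    """Label each point of an n-by-n grid in the complex plane by the
    cube root of unity that Newton's iteration for z**3 - 1 converges
    to (label 0 marks non-convergence); the set of points sharing a
    label is one basin of attraction."""
    roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]
    labels = []
    for i in range(n):
        for j in range(n):
            z = complex(-box + 2 * box * j / (n - 1),
                        -box + 2 * box * i / (n - 1))
            lab = 0
            for _ in range(max_iter):
                if abs(z) < 1e-12:          # derivative 3z**2 vanishes
                    break
                z = z - (z**3 - 1) / (3 * z**2)
                near = [k + 1 for k, r in enumerate(roots)
                        if abs(z - r) < tol]
                if near:
                    lab = near[0]
                    break
            labels.append(lab)
    return labels

labels = basin_labels()
```

Plotting the label grid reveals the fractal basin boundaries; wider single-colour regions and fewer non-converging points indicate a more stable scheme.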

15 pages, 308 KB  
Article
Using Matrix Eigenvalues to Construct an Iterative Method with the Highest Possible Efficiency Index Two
by Malik Zaka Ullah, Vali Torkashvand, Stanford Shateyi and Mir Asma
Mathematics 2022, 10(9), 1370; https://doi.org/10.3390/math10091370 - 20 Apr 2022
Cited by 3 | Viewed by 2284
Abstract
In this paper, we first derive a family of fourth-order iterative schemes. A weight function is used to maintain optimality. Then, we transform the family into methods with several self-accelerating parameters to reach the highest possible convergence rate of 8. To this end, we employ properties of matrix eigenvalues together with the with-memory technique. Solving several nonlinear test equations shows that the proposed variants attain a computational efficiency index of two (the maximum possible) in practice. Full article
(This article belongs to the Special Issue New Trends and Developments in Numerical Analysis)