The Role of Fractional Calculus in Modern Optimization: A Survey of Algorithms, Applications, and Open Challenges
Abstract
1. Introduction
2. Fundamentals of Fractional Calculus
2.1. Historical Background and Definitions
2.2. Key Concepts: Caputo, Riemann–Liouville, Grünwald–Letnikov, and Nabla Operators
- Riemann–Liouville derivative: for $n-1 < \alpha < n$, $n \in \mathbb{N}$,
$${}_{a}D_{t}^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)} \frac{d^{n}}{dt^{n}} \int_{a}^{t} \frac{f(\tau)}{(t-\tau)^{\alpha-n+1}}\, d\tau.$$
- Caputo derivative: differentiation acts inside the integral, so the derivative of a constant is zero and classical initial conditions can be imposed:
$${}_{a}^{C}D_{t}^{\alpha} f(t) = \frac{1}{\Gamma(n-\alpha)} \int_{a}^{t} \frac{f^{(n)}(\tau)}{(t-\tau)^{\alpha-n+1}}\, d\tau.$$
- Grünwald–Letnikov (GL) derivative: the limit of a fractional-order backward difference, which makes it the natural basis for discrete implementations:
$${}_{a}D_{t}^{\alpha} f(t) = \lim_{h \to 0} \frac{1}{h^{\alpha}} \sum_{k=0}^{\lfloor (t-a)/h \rfloor} (-1)^{k} \binom{\alpha}{k} f(t-kh).$$
- Nabla fractional operators: discrete-time counterparts built on the backward difference $\nabla f(t) = f(t) - f(t-1)$; for instance, the nabla fractional sum of order $\alpha > 0$ reads
$$\left(\nabla_{a}^{-\alpha} f\right)(t) = \frac{1}{\Gamma(\alpha)} \sum_{s=a+1}^{t} \left(t-\rho(s)\right)^{\overline{\alpha-1}} f(s), \qquad \rho(s) = s-1,$$
where $t^{\overline{\alpha}} = \Gamma(t+\alpha)/\Gamma(t)$ denotes the rising factorial.
2.3. Technical Aspects: Window Length, Truncation, and Stability
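A minimal numerical sketch helps fix these ideas: the GL definition above discretizes directly, and the short-memory principle simply truncates the sum to a sliding window, trading accuracy for per-step cost. The step size, window length, test function, and identifiers below are illustrative choices, not values taken from the cited references.

```python
def gl_coefficients(alpha, length):
    # c_k = (-1)^k * C(alpha, k), via the recurrence c_k = c_{k-1} * (1 - (alpha + 1) / k)
    c = [1.0]
    for k in range(1, length):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def gl_derivative(samples, alpha, h, window=None):
    """Truncated GL derivative of order `alpha` at the newest sample.

    `samples` holds f on a uniform grid (oldest first); `window` applies the
    short-memory principle by keeping only the most recent terms.
    """
    n = len(samples) if window is None else min(window, len(samples))
    c = gl_coefficients(alpha, n)
    return sum(c[k] * samples[-1 - k] for k in range(n)) / h ** alpha

# D^0.5 of f(t) = t at t = 1; the exact value is 2*sqrt(t/pi) ~ 1.1284.
h = 1e-3
ts = [k * h for k in range(1001)]
print(gl_derivative(ts, 0.5, h))              # full memory: close to 1.1284
print(gl_derivative(ts, 0.5, h, window=200))  # truncated: cheaper, less accurate
```

Comparing the two printed values makes the window-length trade-off of Section 2.3 concrete: shortening the memory reduces the summation cost per evaluation but introduces a truncation error that grows as the ignored tail carries more weight.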

2.4. Properties Relevant to Optimization
- Non-locality: Fractional derivatives are sensitive to changes over extended domains, enabling global information to be considered during optimization [14].
- Control of smoothness: The fractional order serves as a tuning parameter to adjust the degree of smoothness and sensitivity, offering flexibility to balance global search and local refinement [23].
3. Fractional Calculus in Classical Algorithms
3.1. Fractional-Order Least Mean Square (FOLMS)
| Algorithm 1: FOLMS with Variable and Fixed Fractional Order [35] |
| (pseudocode figure not reproduced; surviving fragment: “// Variable gradient order update”) |
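Since the pseudocode box is not reproduced, the sketch below illustrates the core FOLMS recursion in a form widely reported in the fractional-LMS literature, where the fractional gradient of the quadratic cost contributes a term proportional to $|w|^{1-\alpha}/\Gamma(2-\alpha)$. The fixed order, the $|w|$ safeguard for negative weights, and all identifiers are assumptions of this sketch; the variable gradient-order schedule of Algorithm 1 in [35] is not reproduced.

```python
import numpy as np
from math import gamma

def folms_step(w, x, d, mu, mu_f, alpha):
    """One fractional-LMS update for a linear filter y = w @ x.

    Combines the ordinary stochastic-gradient term with a fractional term
    scaled by |w|^(1-alpha) / Gamma(2-alpha) (a common FOLMS formulation).
    """
    e = d - w @ x                                        # instantaneous error
    frac = np.abs(w) ** (1.0 - alpha) / gamma(2.0 - alpha)
    return w + mu * e * x + mu_f * e * x * frac, e

# Toy system identification: recover w_true from noisy observations.
rng = np.random.default_rng(0)
w_true = np.array([0.8, -0.4, 0.2])
w = np.zeros(3)
for _ in range(2000):
    x = rng.standard_normal(3)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w, e = folms_step(w, x, d, mu=0.01, mu_f=0.01, alpha=0.9)
print(w)  # should approach w_true = [0.8, -0.4, 0.2]
```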
3.2. Fractional-Order Adam (FOAdam)
| Algorithm 2: Fractional-Order Adam (FOAdam) [22] |
| (pseudocode figure not reproduced) |
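As the pseudocode is not reproduced, the following minimal sketch conveys the idea described in the text: an Adam-style update whose first moment accumulates a Caputo-type fractional gradient surrogate (a first-term truncation scaling the gradient by $|\theta - \theta_{\text{prev}}|^{1-\alpha}/\Gamma(2-\alpha)$). The surrogate form and the constant order are assumptions of this sketch; [22] additionally schedules the order during training via the FOS.

```python
import numpy as np
from math import gamma

def foadam_step(theta, theta_prev, g, m, v, t, lr=1e-3,
                beta1=0.9, beta2=0.999, alpha=0.9, eps=1e-8):
    """One Adam-like step with a Caputo-style fractional gradient surrogate
    feeding the first moment (a sketch of the FOAdam idea, not [22]'s exact update)."""
    frac_g = g * np.abs(theta - theta_prev) ** (1.0 - alpha) / gamma(2.0 - alpha)
    m = beta1 * m + (1.0 - beta1) * frac_g           # fractional first moment
    v = beta2 * v + (1.0 - beta2) * g * g            # classical second moment
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize f(theta) = 0.5 * ||theta||^2, whose gradient is theta itself.
theta = np.array([2.0, -1.5])
theta_prev = theta + 1e-2          # small offset so the first fractional term is nonzero
m = np.zeros(2); v = np.zeros(2)
for t in range(1, 501):
    g = theta
    new_theta, m, v = foadam_step(theta, theta_prev, g, m, v, t, lr=0.05)
    theta_prev, theta = theta, new_theta
print(theta)  # close to the origin (within Adam's constant-step dithering)
```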
3.3. Fractional Gradient Descent Method with Adaptive Momentum (FBPAM)
| Algorithm 3: FBPAM: Fractional BP with Adaptive Momentum [43] |
| Notation (symbols not reproduced): initial parameter value; general parameter vector; definitions of the increments; adaptive momentum coefficients; fractional derivative of the error function |
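A single-weight sketch of the FBPAM idea follows: the descent direction is a GL-weighted combination of recent gradients (short-memory window), and the momentum coefficient adapts according to whether the previous increment still points downhill with respect to the current fractional gradient. The two-level coefficient rule (0.9/0.2) and all identifiers are illustrative assumptions; [43] defines its own adaptive coefficients.

```python
def gl_weights(alpha, length):
    # c_k = (-1)^k * C(alpha, k) via c_k = c_{k-1} * (1 - (alpha + 1) / k)
    c = [1.0]
    for k in range(1, length):
        c.append(c[-1] * (1.0 - (alpha + 1.0) / k))
    return c

def fbpam_train(grad_fn, w0, alpha=0.8, lr=0.1, window=8, steps=300):
    """Descend a GL-type fractional gradient with a sign-based adaptive
    momentum coefficient (illustrative rule, not the exact one in [43])."""
    coeffs = gl_weights(alpha, window)
    grads, w, dw_prev = [], w0, 0.0
    for _ in range(steps):
        grads.insert(0, grad_fn(w))                  # newest gradient first
        grads = grads[:window]                       # short-memory truncation
        frac_grad = sum(c * g for c, g in zip(coeffs, grads))
        # Boost momentum while the previous increment still descends the
        # current fractional gradient; damp it after an overshoot.
        eta = 0.9 if frac_grad * dw_prev < 0 else 0.2
        dw = -lr * frac_grad + eta * dw_prev
        w, dw_prev = w + dw, dw
    return w

print(fbpam_train(lambda w: 2.0 * (w - 3.0), w0=0.0))  # ~3, minimizer of (w - 3)^2
```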
3.4. Applications and Performance Evaluation
4. Fractional Calculus in Metaheuristic-Based Algorithms
| Algorithm | Year | Authors | Summary of Main Features |
|---|---|---|---|
| FO-Darwinian PSO (for ORPD) | 2020 | Y. Muhammad et al. [77] | Applies fractional-order Darwinian PSO to optimal reactive power dispatch on IEEE 30/57-bus systems; objectives are line-loss minimization and voltage-deviation reduction; reports consistent gains vs. classical counterparts. |
| FPSOGSA (FO-PSO + GSA) | 2020 | N. H. Khan et al. [78] | Hybridizes PSO and Gravitational Search by introducing a fractional derivative in the velocity term; solves ORPD (IEEE 30/57), minimizing losses and voltage deviation; shows superior performance across single/multiple runs. |
| FMNSICS (FMW-ANN–PSO–SQP) | 2021 | Z. Sabir et al. [76] | Neuro-swarm solver that uses fractional Meyer wavelet ANNs with PSO global search and SQP local refinement to solve non-linear fractional Lane–Emden systems; accuracy validated against exact solutions and statistical tests. |
| FO-Darwinian PSO (PV cells) | 2022 | W. A. E. M. Ahmed et al. [79] | “Improved” FODPSO for parameter identification of solar PV cells/modules using single- and double-diode models; consistently outperforms standard PSO in estimation accuracy. |
| IFPSO (Improved Fractional PSO) | 2022 | J. Li et al. [80] | Addresses the trade-off of a single fractional operator in FPSO; builds an IFPSO-SVM prediction model where IFPSO tunes SVM penalty/kernel parameters, improving convergence and prediction metrics. |
| FBBA (Fractional-Order Binary Bat Algorithm) | 2023 | A. Esfandiari et al. [81] | Introduces fractional-order memory into the binary bat algorithm to control convergence; used in a two-stage feature selection pipeline (correlation-based clustering + wrapper) on high-dimensional microarray data. |
| FMACA (Fractional-Order Memristive ACO) | 2023 | W. Zhu et al. [82] | Uses a physical fracmemristor system to store probabilistic transfer information and pass “future” transition tendencies to the current node, enabling prediction and speeding up ACO search. |
| FOWFO (FO Water Flow Optimizer) | 2024 | Z. Tang et al. [83] | Injects fractional-order difference (memory) into WFO; benchmarked on CEC-2017 functions and several large real problems; improves robustness and solution quality vs. the original WFO and peers. |
| FDBO (FO Dung Beetle Optimizer) | 2024 | H. Xia et al. [84] | Enhances DBO with fractional-order calculus to retain/use historical information and an adaptive mechanism to balance exploration–exploitation; applied to global optimization and multilevel CT image thresholding. |
| FO-ASMFO (FO Archimedean Spiral Moth–Flame) | 2024 | A. Wadood et al. [85] | Fractional-order MFO with Archimedean spiral for ORPD; minimizes active-power loss and sets reactive-power flows on IEEE 30/57 test systems; reports improved loss reduction and voltage profiles. |
| FO-DE (Fractional-Order-Driven DE) | 2025 | S. Tao et al. [86] | Introduces fractional-order difference guidance into DE to handle the discrete nature of wind farm layout optimization; across 10 wind-field conditions, outperforms GA/PSO/DE variants in performance and robustness. |
4.1. Representative Fractional Metaheuristics
Fractional Particle Swarm Optimization
| Algorithm 4: Fractional-Order Particle Swarm Optimization (FO-PSO) [72] |
| (pseudocode figure not reproduced) |
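The defining step of FO-PSO is the velocity rule: the inertia term is replaced by a truncated GL fractional derivative of the velocity, conventionally keeping four memory terms. The sketch below implements this standard form; swarm size, coefficients, and the toy sphere run are illustrative choices, not settings from [72].

```python
import numpy as np

def fopso_velocity(v_hist, x, pbest, gbest, alpha=0.6, c1=1.5, c2=1.5, rng=None):
    """FO-PSO velocity update using the usual 4-term GL truncation of D^alpha v.

    v_hist = [v_t, v_{t-1}, v_{t-2}, v_{t-3}], most recent first.
    """
    rng = rng or np.random.default_rng()
    a = alpha
    w = [a,
         0.5 * a * (1 - a),
         (1.0 / 6.0) * a * (1 - a) * (2 - a),
         (1.0 / 24.0) * a * (1 - a) * (2 - a) * (3 - a)]
    memory = sum(wk * vk for wk, vk in zip(w, v_hist))   # fractional memory of velocity
    r1, r2 = rng.random(x.size), rng.random(x.size)
    return memory + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

# Toy run on the sphere function f(x) = ||x||^2.
rng = np.random.default_rng(1)
n, dim = 20, 5
X = rng.uniform(-5, 5, (n, dim))
V = [np.zeros((n, dim)) for _ in range(4)]               # v_t ... v_{t-3}
P, Pf = X.copy(), (X ** 2).sum(axis=1)
g = P[Pf.argmin()].copy()
for _ in range(100):
    newV = np.empty_like(X)
    for i in range(n):
        newV[i] = fopso_velocity([V[0][i], V[1][i], V[2][i], V[3][i]],
                                 X[i], P[i], g, rng=rng)
    V = [newV] + V[:3]
    X = X + newV
    f = (X ** 2).sum(axis=1)
    better = f < Pf
    P[better], Pf[better] = X[better], f[better]
    g = P[Pf.argmin()].copy()
print(Pf.min())  # should be small (near 0)
```

Note how the four GL weights decay with lag: the fractional order simultaneously sets the effective inertia and how quickly older velocities are forgotten, which is precisely the memory effect the fractional formulation contributes.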
4.2. Novel and Promising: FO-DE and FO-WFO
- Mutation: For each target vector $x_i$, a mutant vector $v_i$ is generated using scaled differences of randomly selected individuals:
$$v_i = x_{r_0} + F\,(x_{r_1} - x_{r_2}).$$
Here, $r_0$ denotes the index of the base vector (which may be the best or a random individual), and each pair such as $(x_{r_1}, x_{r_2})$ corresponds to two distinct individuals (one difference pair is shown; some variants use several). The scaling factor $F$ controls the step size. The indices are chosen so that $r_0 \neq r_1 \neq r_2 \neq i$.
- Crossover: The mutant vector $v_i$ is combined with the target vector $x_i$ to produce a trial vector $u_i$. Two crossover schemes are commonly used:
  - Binomial crossover: each component is taken from either the mutant or the target based on the crossover rate $CR$:
$$u_{i,j} = \begin{cases} v_{i,j}, & \text{if } \mathrm{rand}_j \le CR \ \text{or} \ j = j_{\mathrm{rand}}, \\ x_{i,j}, & \text{otherwise.} \end{cases}$$
  - Exponential crossover: A contiguous segment of components is copied from the mutant. Let $n$ and $L$ be two integers such that $1 \le n \le D$ and $1 \le L \le D$; then
$$u_{i,j} = \begin{cases} v_{i,j}, & \text{for } j \in \{\langle n \rangle_D, \langle n+1 \rangle_D, \dots, \langle n+L-1 \rangle_D\}, \\ x_{i,j}, & \text{otherwise,} \end{cases}$$
where $\langle \cdot \rangle_D$ denotes the modulo-$D$ index.
- Selection: The trial vector competes with its parent based on the objective function $f$. The individual with the lower cost is retained in the next generation:
$$x_i^{g+1} = \begin{cases} u_i^{g}, & \text{if } f(u_i^{g}) \le f(x_i^{g}), \\ x_i^{g}, & \text{otherwise.} \end{cases}$$
- Laminar Operator. In low-velocity conditions, particles follow smooth, parallel trajectories. This is mathematically modeled as follows:
$$y_i = x_i + s\,\vec{d}, \qquad \vec{d} = x_{\mathrm{best}} - x_k,$$
where $s \in (0,1)$ is a random shifting coefficient, and $\vec{d}$ is the flow direction vector pointing from a randomly selected particle $x_k$ toward the best particle $x_{\mathrm{best}}$ in the current iteration.
- Turbulent Operator. To mimic the chaotic behavior of fluids when water encounters obstacles, WFO employs a turbulence mechanism. This operator randomly perturbs a selected dimension of the particle using two transformation strategies:
$$y_{i,j_1} = m,$$
where $j_1, j_2$ are random dimensions, and $m$ is a mutation value determined as follows:
$$m = \begin{cases} \psi_{\mathrm{ed}}(x_{k,j_1}), & \text{if } r < p_e, \\ \psi_{\mathrm{ol}}(x_{k,j_2}), & \text{otherwise,} \end{cases}$$
with $r \sim U(0,1)$ a uniform random number and $p_e$ as the eddying probability. The transformation functions are eddying and over-layer moving as follows:
$$\psi_{\mathrm{ed}}(x_{k,j_1}) = x_{i,j_1} + \rho\,\theta \cos\theta, \qquad \psi_{\mathrm{ol}}(x_{k,j_2}) = lb_{j_1} + \frac{(ub_{j_1} - lb_{j_1})\,(x_{k,j_2} - lb_{j_2})}{ub_{j_2} - lb_{j_2}},$$
where $\rho = |x_{k,j_1} - x_{i,j_1}|$ is the shear force of the k-th particle to the i-th particle, and $\theta \in (-\pi, \pi)$ is a randomly generated angle. The bounds $lb$ and $ub$ define the search space for each dimension.
- Flow Decision. The operator selection at each iteration is controlled by a laminar probability $p_l$, such that the algorithm alternates between smooth flow and turbulence, enabling a balance between global exploration and local exploitation.
| Algorithm 5: Fractional-Order Differential Evolution (FODE) [86] |
| (pseudocode figure not reproduced) |
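For concreteness, the sketch below implements classical DE/rand/1/bin exactly as the operators above describe, and marks in a comment where FODE would inject fractional-order difference guidance; the GL-weighted memory term mentioned there is an illustrative reading of [86], not its exact operator, and all identifiers are assumptions of this sketch.

```python
import numpy as np

def de_rand_1_bin(f, bounds, np_=30, F=0.5, CR=0.9, gens=200, seed=0):
    """Classical DE/rand/1/bin; FODE [86] modifies the mutation step."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = lo.size
    X = rng.uniform(lo, hi, (np_, dim))
    fit = np.apply_along_axis(f, 1, X)
    for _ in range(gens):
        for i in range(np_):
            r0, r1, r2 = rng.choice([j for j in range(np_) if j != i], 3, replace=False)
            v = X[r0] + F * (X[r1] - X[r2])          # mutation
            # FODE (sketch): add GL-weighted combinations of past difference
            # vectors here, supplying fractional-order memory to the mutation.
            jrand = rng.integers(dim)
            mask = rng.random(dim) <= CR
            mask[jrand] = True                       # binomial crossover
            u = np.clip(np.where(mask, v, X[i]), lo, hi)
            fu = f(u)
            if fu <= fit[i]:                         # selection
                X[i], fit[i] = u, fu
    return X[fit.argmin()], fit.min()

sphere = lambda x: float((x ** 2).sum())
best, val = de_rand_1_bin(sphere, (np.full(5, -5.0), np.full(5, 5.0)))
print(best, val)  # val should be near 0
```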
| Algorithm 6: Fractional-Order Water Flow Optimizer (FOWFO) [83] |
| (pseudocode figure not reproduced) |
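Likewise, here is a compact sketch of one WFO move for a single particle, following the laminar and turbulent equations as reconstructed above; the comment marks where FOWFO [83] adds fractional-order memory to the position update. The operator details mirror this section's reconstruction and should be checked against [83] before reuse.

```python
import numpy as np

def wfo_move(X, i, best_idx, lb, ub, p_l=0.7, p_e=0.3, rng=None):
    """One WFO position proposal for particle i (laminar or turbulent)."""
    rng = rng or np.random.default_rng()
    n, dim = X.shape
    k = rng.integers(n)                              # randomly selected particle
    y = X[i].copy()
    if rng.random() < p_l:                           # laminar flow
        s = rng.random()                             # random shifting coefficient
        d = X[best_idx] - X[k]                       # flow direction vector
        y = X[i] + s * d
        # FOWFO [83] (sketch): add GL-weighted differences of past positions here.
    else:                                            # turbulent flow
        j1, j2 = rng.integers(dim), rng.integers(dim)
        if rng.random() < p_e:                       # eddying
            rho = abs(X[k, j1] - X[i, j1])           # shear force
            theta = rng.uniform(-np.pi, np.pi)       # random angle
            y[j1] = X[i, j1] + rho * theta * np.cos(theta)
        else:                                        # over-layer moving
            y[j1] = lb[j1] + (ub[j1] - lb[j1]) * (X[k, j2] - lb[j2]) / (ub[j2] - lb[j2])
    return np.clip(y, lb, ub)
```

The laminar branch performs the global, directed move toward the current best, while the turbulent branch perturbs a single coordinate; the laminar probability $p_l$ arbitrates between the two, which is the exploration–exploitation switch the text describes.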
4.3. Applications and Performance Evaluation
5. Open Challenges and Future Research Directions
5.1. Classical Optimization Methodologies with Fractional-Order Calculus
5.1.1. Theoretical Challenges
5.1.2. Computational and Implementation Challenges
5.1.3. Future Research Directions
- Adaptive and variable-order methods: Future work should explore the adaptive adjustment of the fractional order during iterations, possibly guided by curvature, entropy, or reinforcement signals [98]. Such strategies could dynamically balance exploration and exploitation.
- Hybrid frameworks: Combining fractional gradient descent with momentum methods (e.g., Adam, RMSProp) has shown promise [22,101], but systematic studies comparing fractional and classical momentum mechanisms remain scarce. Hybrid designs such as the AP-FOGDL algorithms proposed by Ma et al. [50] demonstrate the potential of adaptive parameterization for convergence guarantees in practice.
- Theoretical foundations: Stronger convergence proofs, particularly in non-convex optimization and stochastic regimes, are required to establish fractional methods as reliable tools for deep learning and scientific computing [46,47,96]. The tempered memory mechanism introduced by Naifar [102] offers a promising path to stabilize convergence in noisy, high-dimensional optimization problems.
- Application-driven design: Emerging fields such as federated learning and physics-informed neural networks (PINNs) could benefit from fractional-order memory effects [21]. Tailoring classical algorithms with fractional dynamics for these contexts may open new pathways for practical adoption.
- Numerical benchmarks and reproducibility: A standardized suite of benchmarks for fractional gradient-based algorithms would allow rigorous comparison and accelerate the maturity of the field. Current evaluations are fragmented and problem-specific, as noted in the recent survey by Elnady et al. [96].
5.2. Metaheuristic Optimization Methodologies with Fractional-Order Calculus
5.2.1. Theoretical Challenges
- The stability and boundedness of fractional dynamics in discrete-time metaheuristics are not fully understood [89].
- The optimal choice of the fractional order often relies on empirical tuning rather than analytical justification [90].
- There is a need for general frameworks that can relate the fractional order to problem complexity, landscape ruggedness, or system memory [104].
5.2.2. Computational and Implementation Issues
- Increased computational cost, especially in large-scale problems or real-time applications [106];
- Scalability concerns when deploying fractional models on parallel or embedded hardware [22];
- Challenges in integrating fractional solvers into existing software ecosystems and optimization toolkits [107].
5.2.3. Directions for Future Research
- Fractional quantum and neuromorphic computing: These leverage emerging paradigms to execute fractional dynamics efficiently, with recent proposals investigating fractional-inspired kernels in quantum annealers [67].
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Salcan-Reyes, G.; Cajo, R.; Aguirre, A.; Espinoza, V.; Plaza, D.; Martín, C. Comparison of Control Strategies for Temperature Control of Buildings, Volume 6: Dynamics, Vibration, and Control. In Proceedings of the ASME International Mechanical Engineering Congress and Exposition, New Orleans, LA, USA, 29 October–2 November 2023.
- Fu, H.; Yao, W.; Cajo, R.; Zhao, S. Trajectory Tracking Predictive Control for Unmanned Surface Vehicles with Improved Nonlinear Disturbance Observer. J. Mar. Sci. Eng. 2023, 11, 1874.
- Fernandez Cornejo, E.R.; Diaz, R.C.; Alama, W.I. PID Tuning based on Classical and Meta-heuristic Algorithms: A Performance Comparison. In Proceedings of the 2020 IEEE Engineering International Research Conference (EIRCON), Lima, Peru, 21–23 October 2020; pp. 1–4.
- Aghababa, M.P. Fractional-Neuro-Optimizer: A Neural-Network-Based Optimization Method. Neural Process. Lett. 2014, 40, 169–189.
- Cajo, R.; Zhao, S.; Birs, I.; Espinoza, V.; Fernández, E.; Plaza, D.; Salcan-Reyes, G. An Advanced Fractional Order Method for Temperature Control. Fractal Fract. 2023, 7, 172.
- Zhao, H.; Chen, J.; Jiang, P.; Zeng, Z. Optimizing Neural Network Image Classification with Fractional Order Gradient Methods. SSRN Electron. J. 2023.
- Zhao, S.; Mu, J.; Liu, H.; Sun, Y.; Cajo, R. Heading control of USV based on fractional-order model predictive control. Ocean Eng. 2025, 322, 120476.
- Hai, P.V.; Rosenfeld, J.A. The gradient descent method from the perspective of fractional calculus. Math. Methods Appl. Sci. 2021, 44, 5520–5547.
- Saleh, M.; Ajarmah, B. Fractional Gradient Descent Learning of Backpropagation Artificial Neural Networks with Conformable Fractional Calculus. In Fuzzy Systems and Data Mining VIII; IOS Press: Amsterdam, The Netherlands, 2022; pp. 72–79.
- Song, Z.; Fan, Q.; Dong, Q. Convergence Analysis and Application for Multi-Layer Neural Network Based on Fractional-Order Gradient Descent Learning. Adv. Theory Simul. 2023, 7, 2300662.
- Raubitzek, S.; Mallinger, K.; Neubauer, T. Combining Fractional Derivatives and Machine Learning: A Review. Entropy 2022, 25, 35.
- Herrera-Alcántara, O. Fractional Derivative Gradient-Based Optimizers for Neural Networks and Human Activity Recognition. Appl. Sci. 2022, 12, 9264.
- Herrera-Alcántara, O. Fractional Gradient Optimizers for PyTorch: Enhancing GAN and BERT. Fractal Fract. 2023, 7, 500.
- Wei, Y.; Chen, Y.; Zhao, X.; Cao, J. Analysis and Synthesis of Gradient Algorithms Based on Fractional-Order System Theory. IEEE Trans. Syst. Man Cybern. Syst. 2023, 53, 1895–1906.
- Lou, W.; Gao, W.; Han, X.; Zhang, Y. Variable Order Fractional Gradient Descent Method and Its Application in Neural Networks Optimization. In Proceedings of the Chinese Control and Decision Conference (CCDC), Hefei, China, 15–17 August 2022; pp. 109–114.
- Zhou, X.; Zhao, C.; Huang, Y. A Deep Learning Optimizer Based on Grünwald–Letnikov Fractional Order Definition. Mathematics 2023, 11, 316.
- Trigka, M.; Dritsas, E. A comprehensive survey of deep learning approaches in image processing. Sensors 2025, 25, 531.
- Muftah, M.N.; Faudzi, A.A.M.; Sahlan, S.; Mohamaddan, S. Fuzzy fractional order PID tuned via PSO for a pneumatic actuator with ball beam (PABB) system. Fractal Fract. 2023, 7, 416.
- Xie, J.; Dmour, A.A.; Lakys, Y. Application of Nonlinear Fractional Differential Equations in Computer Artificial Intelligence Algorithms. Appl. Math. Nonlinear Sci. 2022, 8, 1145–1154.
- Subramanian, S.; Bhojane, N.U.; Madhnani, H.M.; Pant, S.; Kumar, A.; Kotecha, K. A Comprehensive Review of Nature-Inspired Optimization Techniques and Their Varied Applications. In Advances in Computer and Electrical Engineering; IGI Global Publishing: Hershey, PA, USA, 2024; pp. 105–174.
- Lixandru, A.; van Gerven, M.; Pequito, S. Fractional Order Distributed Optimization. arXiv 2024, arXiv:2412.02546.
- Chen, G.; Liang, Y.; Sihao, L.; Zhao, X. A Novel Gradient Descent Optimizer based on Fractional Order Scheduler and its Application in Deep Neural Networks. Appl. Math. Model. 2024, 128, 26–57.
- Szente, T.A.; Harrison, J.; Zanfir, M.; Sminchisescu, C. Applications of Fractional Calculus in Learned Optimization. arXiv 2024, arXiv:2411.14855.
- Cajo, R.; Mac, T.T.; Plaza, D.; Copot, C.; De Keyser, R.; Ionescu, C. A Survey on Fractional Order Control Techniques for Unmanned Aerial and Ground Vehicles. IEEE Access 2019, 7, 66864–66878.
- Podlubny, I. Fractional Differential Equations: An Introduction to Fractional Derivatives, Fractional Differential Equations, to Methods of Their Solution and Some of Their Applications; Elsevier: Amsterdam, The Netherlands, 1998; Volume 198.
- Ostalczyk, P. Discrete Fractional Calculus: Applications in Control and Image Processing; Series in Computer Vision; World Scientific Publishing: Singapore, 2015.
- Duhé, J.F.; Victor, S.; Melchior, P.; Abdelmounen, Y.; Roubertie, F. Fractional derivative truncation approximation for real-time applications. Commun. Nonlinear Sci. Numer. Simul. 2023, 119, 107096.
- Joshi, D.D.; Bhalekar, S.; Gade, P.M. Stability Analysis of Fractional Difference Equations with Delay. arXiv 2023, arXiv:2305.06686.
- Shoaib, B.; Qureshi, I.M.; Ihsanulhaq; Shafqatullah. A modified fractional least mean square algorithm for chaotic and nonstationary time series prediction. Chin. Phys. B 2014, 23, 030502.
- Khan, Z.A.; Chaudhary, N.I.; Zubair, S. Fractional stochastic gradient descent for recommender systems. Electron. Mark. 2019, 29, 275–285.
- Aslam, M.S.; Raja, M.A.Z. A new adaptive strategy to improve online secondary path modeling in active noise control systems using fractional signal processing approach. Signal Process. 2015, 107, 433–443.
- Tan, Y.; He, Z.; Tian, B. A novel generalization of modified LMS algorithm to fractional order. IEEE Signal Process. Lett. 2015, 22, 1244–1248.
- Chen, Y.; Gao, Q.; Wei, Y.; Wang, Y. Study on fractional order gradient methods. Appl. Math. Comput. 2017, 314, 310–321.
- Chen, Y.; Wei, Y.; Du, B.; Wang, Y. A novel fractional order gradient method for identifying a linear system. In Proceedings of the 2017 32nd Youth Academic Annual Conference of Chinese Association of Automation (YAC), Hefei, China, 19–21 May 2017; pp. 352–356.
- Cheng, S.; Wei, Y.; Chen, Y.; Li, Y.; Wang, Y. An innovative fractional order LMS based on variable initial value and gradient order. Signal Process. 2017, 133, 260–269.
- Chen, Y.; Wei, Y.; Wang, Y. A novel perspective to gradient method: The fractional order approach. arXiv 2019, arXiv:1903.03239.
- Wei, Y.; Kang, Y.; Yin, W.; Wang, Y. Generalization of the gradient method with fractional order gradient direction. J. Frankl. Inst. 2020, 357, 2514–2532.
- Liu, J.; Zhai, R.; Liu, Y.; Li, W.; Wang, B.; Huang, L. A quasi fractional order gradient descent method with adaptive stepsize and its application in system identification. Appl. Math. Comput. 2021, 393, 125797.
- Chaudhary, N.I.; Raja, M.A.Z.; Khan, Z.A.; Cheema, K.M.; Milyani, A.H. Hierarchical quasi-fractional gradient descent method for parameter estimation of nonlinear ARX systems using key term separation principle. Mathematics 2021, 9, 3302.
- Fang, Q. Estimation of Navigation Mark Floating Based on Fractional-Order Gradient Descent with Momentum for RBF Neural Network. Math. Probl. Eng. 2021, 2021, 6681651.
- Wang, Y.; He, Y.; Zhu, Z. Study on fast speed fractional order gradient descent method and its application in neural networks. Neurocomputing 2022, 489, 366–376.
- Chaudhary, N.I.; Khan, Z.A.; Kiani, A.K.; Raja, M.A.Z.; Chaudhary, I.I.; Pinto, C.M. Design of auxiliary model based normalized fractional gradient algorithm for nonlinear output-error systems. Chaos Solitons Fractals 2022, 163, 112611.
- Han, X.; Dong, J. Applications of fractional gradient descent method with adaptive momentum in BP neural networks. Appl. Math. Comput. 2023, 448, 127944.
- Ye, L.; Chen, Y.; Liu, Q. Development of an Efficient Variable Step-Size Gradient Method Utilizing Variable Fractional Derivatives. Fractal Fract. 2023, 7, 789.
- Vieira, N.; Rodrigues, M.M.; Ferreira, M. Fractional gradient methods via ψ-Hilfer derivative. Fractal Fract. 2023, 7, 275.
- Sun, S.; Gao, Z.; Jia, K. State of charge estimation of lithium-ion battery based on improved Hausdorff gradient using wavelet neural networks. J. Energy Storage 2023, 64, 107184.
- Shin, Y.; Darbon, J.; Karniadakis, G.E. Accelerating gradient descent and Adam via fractional gradients. Neural Netw. 2023, 161, 185–201.
- Chen, G.; Xu, Z. λ-FAdaMax: A novel fractional-order gradient descent method with decaying second moment for neural network training. Expert Syst. Appl. 2025, 279, 127156.
- Naifar, O. Theoretical Framework for Tempered Fractional Gradient Descent: Application to Breast Cancer Classification. arXiv 2025, arXiv:2504.18849.
- Ma, M.; Chen, S.; Zheng, L. Novel adaptive parameter fractional-order gradient descent learning for stock selection decision support systems. Eur. J. Oper. Res. 2025, 324, 276–289.
- Partohaghighi, M.; Marcia, R.; Chen, Y. Effective Dimension Aware Fractional-Order Stochastic Gradient Descent for Convex Optimization Problems. arXiv 2025, arXiv:2503.13764.
- Shin, Y.; Darbon, J.; Karniadakis, G.E. A Caputo fractional derivative-based algorithm for optimization. arXiv 2021, arXiv:2104.02259.
- Pu, Y.F.; Yi, Z.; Zhou, J.L. Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. 2017, 28, 2319–2333.
- Viera-Martin, E.; Gómez-Aguilar, J.F.; Solís-Pérez, J.E.; Hernández-Pérez, J.A.; Escobar-Jiménez, R. Artificial Neural Networks: A Practical Review of Applications Involving Fractional Calculus. Eur. Phys. J. Spec. Top. 2022, 231, 2059–2095.
- Coelho, C.; Ferrás, L.L. Fractional Calculus Meets Neural Networks for Computer Vision: A Survey. AI 2024, 5, 1391–1426.
- Shah, S.M.; Samar, R.; Khan, N.M.; Raja, M.A.Z. Fractional-Order Adaptive Signal Processing Strategies for Active Noise Control Systems. Nonlinear Dyn. 2017, 85, 1363–1376.
- Shah, S.M.; Samar, R.; Khan, N.M.; Raja, M.A.Z. Design of Fractional-Order Variants of Complex LMS and NLMS Algorithms for Adaptive Channel Equalization. Nonlinear Dyn. 2017, 88, 839–858.
- Aslam, M.S.; Chaudhary, N.I.; Raja, M.A.Z. A Sliding-Window Approximation-Based Fractional Adaptive Strategy for Hammerstein Nonlinear ARMAX Systems. Nonlinear Dyn. 2018, 87, 519–533.
- Chaudhary, N.I.; Zubair, S.; Aslam, M.S.; Raja, M.A.Z.; Machado, J.T. Design of Momentum Fractional LMS for Hammerstein Nonlinear System Identification with Application to Electrically Stimulated Muscle Model. Eur. Phys. J. Plus 2019, 134, 407.
- Chaudhary, N.I.; Zubair, S.; Raja, M.A.Z. A New Computing Approach for Power Signal Modeling Using Fractional Adaptive Algorithms. ISA Trans. 2020, 68, 189–202.
- Chen, Y.; Wei, Y.; Liang, S.; Wang, Y. Indirect model reference adaptive control for a class of fractional order systems. Commun. Nonlinear Sci. Numer. Simul. 2016, 39, 458–471.
- Matusiak, M. Optimization for Software Implementation of Fractional Calculus Numerical Methods in an Embedded System. Entropy 2020, 22, 566.
- Pu, Y.F. Fractional-Order Euler-Lagrange Equation for Fractional-Order Variational Method: A Necessary Condition for Fractional-Order Fixed Boundary Optimization Problems in Signal Processing and Image Processing. IEEE Access 2016, 4, 10110–10135.
- Wei, Y.; Kang, Y.; Yin, W.; Wang, Y. Design of Generalized Fractional Order Gradient Descent Method. arXiv 2019, arXiv:1901.05294.
- Cao, Y.; Su, S. Fractional Gradient Descent Algorithms for Systems with Outliers: A Matrix Fractional Derivative or a Scalar Fractional Derivative. Chaos Solitons Fractals 2023, 174, 113881.
- Aggarwal, P. Convergence Analysis of a Fractional Gradient Descent Method. arXiv 2024, arXiv:2409.10390.
- Nassef, A.M.; Abdelkareem, M.A.; Maghrabie, H.M.; Baroutaji, A. Metaheuristic-Based Algorithms for Optimizing Fractional-Order Controllers—A Recent, Systematic, and Comprehensive Review. Fractal Fract. 2023, 7, 553.
- Raubitzek, R.; Koundal, A.; Kuhn, M. Memory-aware metaheuristics using fractional-order control: Review and framework. Fractal Fract. 2022, 6, 208.
- Machado, J.A.T.; Kiryakova, V.; Mainardi, F. Recent history of fractional calculus. Commun. Nonlinear Sci. Numer. Simul. 2011, 16, 1140–1153.
- Nakisa, B.; Rastgoo, M.N.; Norodin, M.J. Balancing exploration and exploitation in particle swarm optimization on search tasking. Res. J. Appl. Sci. Eng. Technol. 2014, 8, 1429–1434.
- Fidalgo, J.; Silva, T.; Marques, R. Hybrid fractional-order PSO-SQP algorithm for thermal system identification. Fractal Fract. 2025, 9, 51.
- Khan, M.W.; Muhammad, Y.; Raja, M.A.Z.; Ullah, F.; Chaudhary, N.I.; He, Y. A new fractional particle swarm optimization with entropy diversity based velocity for reactive power planning. Entropy 2020, 22, 1112.
- Khan, M.W.; He, Y.; Li, X. Memory-enhanced PSO via fractional operators: Design and application. Appl. Sci. 2020, 10, 6517.
- Ma, M.; Yang, J. Convergence Analysis of Novel Fractional-Order Backpropagation Neural Networks with Regularization Terms. IEEE Trans. Cybern. 2023, 54, 3039–3050.
- Cuevas, E.; Luque, A.; Morales Castañeda, B.; Rivera, B. Fractional fuzzy controller using metaheuristic techniques. In Metaheuristic Algorithms: New Methods, Evaluation, and Performance Analysis; Springer: Cham, Switzerland, 2024; pp. 223–243.
- Sabir, Z.; Raja, M.A.Z.; Umar, M.; Shoaib, M.; Baleanu, D. FMNSICS: Fractional Meyer neuro-swarm intelligent computing solver for nonlinear fractional Lane–Emden systems. Neural Comput. Appl. 2021, 34, 4193–4206.
- Muhammad, Y.; Khan, R.; Ullah, F.; Rehman, A.u.; Aslam, M.S.; Raja, M.A.Z. Design of fractional swarming strategy for solution of optimal reactive power dispatch. Neural Comput. Appl. 2020, 32, 10501–10518.
- Khan, N.H.; Wang, Y.; Tian, D.; Raja, M.A.Z.; Jamal, R.; Muhammad, Y. Design of Fractional Particle Swarm Optimization Gravitational Search Algorithm for Optimal Reactive Power Dispatch Problems. IEEE Access 2020, 8, 146785–146806.
- Ahmed, W.A.E.M.; Mageed, H.M.A.; Mohamed, S.A.; Saleh, A.A. Fractional order Darwinian particle swarm optimization for parameters identification of solar PV cells and modules. Alex. Eng. J. 2022, 61, 1249–1263.
- Li, J.; Zhao, C. Improvement and Application of Fractional Particle Swarm Optimization Algorithm. Math. Probl. Eng. 2022, 2022, 5885235.
- Esfandiari, A.; Farivar, F.; Khaloozadeh, H. Fractional-order binary bat algorithm for feature selection on high-dimensional microarray data. J. Ambient Intell. Humaniz. Comput. 2023, 14, 7453–7467.
- Zhu, W.; Pu, Y. A Study of Fractional-Order Memristive Ant Colony Algorithm: Take Fracmemristor into Swarm Intelligent Algorithm. Fractal Fract. 2023, 7, 211.
- Tang, Z.; Wang, K.; Zang, Y.; Zhu, Q.; Todo, Y.; Gao, S. Fractional-Order Water Flow Optimizer. Int. J. Comput. Intell. Syst. 2024, 17, 84.
- Xia, H.; Ke, Y.; Liao, R.; Sun, Y. Fractional order calculus enhanced dung beetle optimizer for function global optimization and multilevel threshold medical image segmentation. J. Supercomput. 2024, 81, 90.
- Wadood, A.; Ahmed, E.; Rhee, S.B.; Sattar Khan, B. A Fractional-Order Archimedean Spiral Moth–Flame Optimization Strategy to Solve Optimal Power Flows. Fractal Fract. 2024, 8, 225.
- Tao, S.; Liu, S.; Zhao, R.; Yang, Y.; Todo, H.; Yang, H. A State-of-the-Art Fractional Order-Driven Differential Evolution for Wind Farm Layout Optimization. Mathematics 2025, 13, 282.
- Li, Q.; Liu, S.Y.; Yang, X.S. Influence of initialization on the performance of metaheuristic optimizers. Appl. Soft Comput. 2020, 91, 106193.
- Abdulkadirov, R.; Lyakhov, P.; Nagornov, N. Survey of Optimization Algorithms in Modern Neural Networks. Mathematics 2023, 11, 2466.
- Abedi Pahnehkolaei, S.M.; Alfi, A.; Tenreiro Machado, J. Analytical stability analysis of the fractional-order particle swarm optimization algorithm. Chaos Solitons Fractals 2022, 155, 111658.
- Peng, Y.; Sun, S.; He, S.; Zou, J.; Liu, Y.; Xia, Y. A fractional-order JAYA algorithm with memory effect for solving global optimization problem. Expert Syst. Appl. 2025, 270, 126539.
- Li, Z.; Liu, L.; Dehghan, S.; Chen, Y.; Xue, D. A review and evaluation of numerical tools for fractional calculus and fractional order controls. Int. J. Control 2016, 90, 1165–1181.
- Wadood, A.; Park, H. A Novel Application of Fractional Order Derivative Moth Flame Optimization Algorithm for Solving the Problem of Optimal Coordination of Directional Overcurrent Relays. Fractal Fract. 2024, 8, 251.
- Muhammad, Y.; Chaudhary, N.I.; Sattar, B.; Siar, B.; Awan, S.E.; Raja, M.A.Z.; Shu, C.M. Fractional order swarming intelligence for multi-objective load dispatch with photovoltaic integration. Eng. Appl. Artif. Intell. 2024, 137, 109073.
- Chen, G.; Liang, Y.; Jiang, Z.; Li, S.; Li, H.; Xu, Z. Fractional-order PID-based search algorithms: A math-inspired meta-heuristic technique with historical information consideration. Adv. Eng. Inform. 2025, 65, 103088.
- Chou, F.I.; Huang, T.H.; Yang, P.Y.; Lin, C.H.; Lin, T.C.; Ho, W.H.; Chou, J.H. Controllability of Fractional-Order Particle Swarm Optimizer and Its Application in the Classification of Heart Disease. Appl. Sci. 2021, 11, 11517.
- Elnady, S.M.; El-Beltagy, M.; Radwan, A.G.; Fouda, M.E. A comprehensive survey of fractional gradient descent methods and their convergence analysis. Chaos Solitons Fractals 2025, 194, 116154.
- Esfandiari, A.; Khaloozadeh, H.; Farivar, F. A scalable memory-enhanced swarm intelligence optimization method: Fractional-order Bat-inspired algorithm. Int. J. Mach. Learn. Cybern. 2024, 15, 2179–2197.
- Huang, Z.; Mao, S.; Yang, Y. MFFGD: An adaptive Caputo fractional-order gradient algorithm for DNN. Neurocomputing 2024, 610, 128606.
- Zhang, Y.; Xu, H.; Li, Y.; Lin, G.; Zhang, L.; Tao, C.; Wu, Y. An Integer-Fractional Gradient Algorithm for Back Propagation Neural Networks. Algorithms 2024, 17, 220.
- Yang, Y.; Mo, L.; Hu, Y.; Long, F. The Improved Stochastic Fractional Order Gradient Descent Algorithm. Fractal Fract. 2023, 7, 631.
- Zhou, X.; You, Z.; Sun, W.; Zhao, D.; Yan, S. Fractional-order stochastic gradient descent method with momentum and energy for deep neural networks. Neural Netw. 2025, 181, 106810.
- Naifar, O. Tempered Fractional Gradient Descent: Theory, Algorithms, and Robust Learning Applications. Neural Netw. 2025, 193, 108005.
- Wang, C.H.; Hu, K.; Wu, X.; Ou, Y. Rethinking Metaheuristics: Unveiling the Myth of “Novelty” in Metaheuristic Algorithms. Mathematics 2025, 13, 2158.
- Liu, J.; Chen, S.; Cai, S.; Xu, C. The Novel Adaptive Fractional Order Gradient Decent Algorithms Design via Robust Control. arXiv 2023, arXiv:2303.04328.
- Yang, Q.; Wang, Y.; Liu, L.; Zhang, X. Adaptive Fractional-Order Multi-Scale Optimization TV-L1 Optical Flow Algorithm. Fractal Fract. 2024, 8, 179.
- Tlelo-Cuautle, E.; González-Zapata, A.M.; Díaz-Muñoz, J.D.; de la Fraga, L.G.; Cruz-Vega, I. Optimization of fractional-order chaotic cellular neural networks by metaheuristics. Eur. Phys. J. Spec. Top. 2022, 231, 2037–2043.
- Sattar, D.; Shehadeh Braik, M. Metaheuristic methods to identify parameters and orders of fractional-order chaotic systems. Expert Syst. Appl. 2023, 228, 120426.
| Algorithm | Year | Authors | Summary of Main Features |
|---|---|---|---|
| Fractional-Order Gradient Method (FOGM-V1) | 2017 | Y. Chen et al. [33,34]. | Satisfactory convergence in the initial stages. Subsequently, it encountered convergence difficulties due to the long-term memory that is inherent in fractional derivatives. |
| Fractional-Order Least Mean Square (FOLMS) | 2017 | S. Cheng et al. [35]. | Proposed a balance between convergence speed and estimation accuracy by integrating a variable initial value strategy and a variable gradient order. |
| Fractional variable-Order Least Mean Square (FVOLMS) | 2018 | S. Cheng et al. [35]. | An iterative adjustment strategy of the fractional gradient order was applied to minimize the approximation error (steady-state error). |
| Fractional-Order Gradient Method (FOGM-V2) | 2019 | Y. Chen et al. [36]. | Improved and faster convergence. It successfully mitigated the prolonged memory effect of FOGM-V1 by limiting the influence of distant past data. |
| Short-Memory Fractional Gradient Descent-Version 1 (SMFGD-V1) | 2020 | Y. Wei et al. [37]. | Introduced the short-memory principle to truncate the memory length and reduce the influence of distant past iterations. This approach effectively minimized non-local effects and achieved convergence to the true optimum. |
| Higher-Order Truncation for Fractional Gradient Descent (HOTFGD) | 2020 | Y. Wei et al. [37]. | A truncation of the infinite series expansion of the fractional derivative was proposed, preserving only the dominant term to guarantee true convergence. |
| Variable-Order Fractional Gradient Descent (VOFGD-V1) | 2020 | Y. Wei et al. [37]. | It introduced a variable-order fractional approach. This strategy enabled the adaptation of adaptive control techniques to fractional calculus, enhancing convergence and robustness. |
| Quasi-Fractional-Order Gradient Descent Method (QFOGDM-V1) | 2021 | J. Liu et al. [38]. | The FOGM was extended to accommodate high-dimensional optimization (multidimensional convex optimization) through the implementation of a novel iterative update mechanism designed to ensure both accuracy and computational efficiency. |
| Quasi-Fractional-Order Gradient Descent Method (QFOGDM-V2) | 2021 | J. Liu et al. [38]. | QFOGDM-V1 enhancement introducing the Hadamard product to modify the gradient direction and reduce its zigzagging in multidimensional spaces. It dynamically adjusts the step size at each iteration to achieve faster convergence. |
| Hierarchical Quasi-Fractional Gradient Descent (HQFGD) | 2021 | N. Chaudhary et al. [39]. | The issue of QFOGDM overparameterization was addressed by combining hierarchical identification techniques with a parameter separation methodology, prioritizing the most relevant parameters to reduce dimensionality and complexity. |
| Fractional-Order Gradient Descent with Momentum Radial Basis Function (FOGDM-RBF) | 2021 | Q. Fang et al. [40]. | Proposed the Fractional-Order Gradient Descent Radial Basis Function Neural Network with Momentum to enhance the training of RBF neural networks by accelerating convergence and avoiding local minima. |
| Fractional-order gradient descent method with random weight particle swarm optimization (FOGD-RPSO) | 2022 | Y. Wang et al. [41]. | The methodology incorporates a Particle Swarm Optimization (PSO) algorithm with stochastic inertia weight, which facilitates the selection of an appropriate initial search point, accelerates convergence, and strengthens global optimization capabilities. |
| Fractional-order gradient descent with momentum (FOM). Fractional-order gradient descent with variable learning rate (FOVLR). Fractional-order gradient descent with momentum and variable learning rate (FOMVLR). Short-Memory Fractional Gradient Descent-Version 2 (SMFGD-V2) | 2022 | Y. Wang et al. [41]. | These fractional-order gradient descent methods were simultaneously introduced and tailored for quadratic loss functions. These approaches incorporate the Riemann–Liouville fractional derivative in conjunction with a variable-initial-value scheme, thereby ensuring convergence to the true extremum. The FOMVLR algorithm represents an advancement over the previously established FOM and FOVLR methods. An analogous improvement is observed in the transition from SMFGD-V1 to SMFGD-V2. |
| Variable-Order Fractional Gradient Descent (VOFGD-V2) | 2022 | W. Lou et al. [15]. | The proposed method generalizes classical gradient descent by embedding a Caputo fractional derivative with an order that dynamically evolves across iterations. Such progressive adaptation enables the exploitation of multiple fractional orders to achieve enhanced optimization efficiency. The algorithm initially employs a high fractional order to accelerate convergence, then gradually decreases it as the solution approaches the optimum, ultimately improving precision in the final stages. |
| Auxiliary Model-based Normalized Variable Initial Value with Fractional Least Mean Square (AM-NVIV-FLMS) | 2022 | N. Chaudhary et al. [42]. | This approach was introduced for identifying systems exhibiting non-linear dynamics. It integrates the variable-initial-value strategy with the concept of short-term memory, thereby enabling the adaptive regulation of the learning rate throughout the optimization process. |
| Conformable Local Fractional Gradient Descent (CLFGD) | 2022 | M. Saleh et al. [9]. | A fractional gradient descent method was proposed for training neural networks via backpropagation, employing Conformable Fractional Calculus (CFD). The CLFGD algorithm enables the dynamic adjustment of the fractional order during optimization, thereby enhancing training efficiency by allowing the learning rate to adapt naturally according to the weight values. |
| Fractional gradient descent method with adaptive momentum (FBPAM) | 2023 | X. Han et al. [43]. | The method was proposed to enhance the convergence rate and training stability in neural networks. Specifically, the gradient computation is extended by incorporating the Grünwald–Letnikov (GL) fractional derivative. At the same time, an adaptive momentum coefficient is introduced to dynamically regulate the update process as a function of the fractional gradient and variations in historical weights. |
| Truncation Variable Fractional-Order Gradient Method (TVFOGM) | 2023 | L. Ye et al. [44]. | The method consolidates the advantages of various fractional gradient approaches to strike a balance between convergence rate and solution accuracy. The TVFOGM dynamically modifies the fractional orders throughout the iterative process, enhancing computational efficiency while preserving numerical precision. Additionally, it autonomously adjusts the step size based on inter-iteration variations, which improves accuracy as the algorithm converges toward the optimal solution. |
| ψ-Hilfer Fractional Gradient Method (ψ-FGD) | 2023 | N. Vieira et al. [45]. | The objective function is represented by a power series expansion, in which higher-order terms are truncated to reduce computational complexity. The experimental findings revealed that integrating variable-order differentiation with step-size optimization significantly enhances both the speed and precision of convergence. |
| Hausdorff–Local Fractional Gradient Descent (H-LFGD) | 2023 | S. Sun et al. [46]. | A novel approach was developed to estimate the state of charge (SOC) in lithium-ion batteries (H-LFGD). It integrates an enhanced Hausdorff derivative for optimizing neural network parameters, enabling precise SOC estimation from real-time data. The adaptive fractional-order tuning dynamically adjusts the derivative during optimization, significantly strengthening both convergence speed and estimation accuracy. |
| Adaptive Terminal Fractional Gradient Descent (AT-FGD) | 2023 | Y. Shin et al. [47]. | A class of functions with specific regularity and decay properties was established, ensuring the convergence of the infinite series associated with Caputo fractional derivatives. This approach achieves a balance between global exploration and local exploitation by regulating the method's control parameters, thereby optimizing the convergence dynamics. |
| Fractional-Order Adam (FOAdam). Fractional-Order Scheduler (FOS) | 2024 | G. Chen et al. [22]. | The algorithm incorporates the Caputo fractional derivative into the conventional Adam structure to accelerate convergence in deep neural networks. First-order momentum enhancement was achieved through Caputo fractional derivatives, and a Fractional-Order Scheduler (FOS) was implemented to dynamically adjust the fractional order during training, balancing convergence speed and accuracy. |
| λ-Fractional AdaMax (λ-FAdaMax) | 2025 | G. Chen et al. [48]. | A decay factor λ is incorporated into the Caputo fractional derivative, and an Adam-based optimization framework is employed to formulate a novel fractional gradient descent method (FGDM), denoted as λ-FAdaMax. The proposed algorithm begins with a first-order derivative and gradually transitions to a fractional-order derivative as iterations proceed, thus establishing a trade-off between convergence rate and solution accuracy. Experimental evaluation indicates that the method yields competitive performance in training deep neural networks (DNNs) for engineering-oriented tasks. |
| Tempered Fractional Gradient Descent (TFGD) | 2025 | O. Naifar [49]. | It proposes a novel optimization framework that integrates fractional calculus with exponential tempering to enhance gradient-based learning. The method incorporates a tempered memory mechanism, where historical gradients are weighted by fractional coefficients and exponentially decayed through a tempering parameter λ. This tempered memory mechanism ensures convergence in convex settings and proves to be particularly effective in medical classification tasks, where feature correlations benefit from a stable gradient-averaging process. |
| Adaptive parameter fractional-order gradient descent learning (AP-FOGDL) | 2025 | M. Ma et al. [50]. | An adaptive learning rate is proposed by introducing computable upper bounds. Convergence is established for both Caputo and Riemann–Liouville fractional derivatives, under scenarios with and without adaptive learning rate adjustment. Moreover, to enhance predictive accuracy, an amplification factor is incorporated to augment the adaptive learning rate, yielding satisfactory outcomes that confirm the algorithm’s feasibility, high accuracy, and robust performance. |
| Two-Scale Effective Dimension Fractional-Order Stochastic Gradient Descent (2SEDFOSGD) | 2025 | M. Partohaghighi et al. [51]. | Integrates the two-scale effective dimension (2SED) algorithm with FOSGD to adapt the fractional exponent of the fractional-order stochastic gradient based on the data, thereby capturing long-memory effects in the optimization process, mitigating oscillations, and ensuring rapid convergence to the true optimum. |
| Algorithm | Year | Authors | Summary of Main Features |
|---|---|---|---|
| Fractional-Order Least Mean Square (FOLMS) | 2017 | S. Cheng et al. [35] | Introduces fractional gradient in LMS; order tunes speed/accuracy trade-off; analyses of convergence behavior and steady-state MSE. |
| Complex/NLMS Fractional Variants (ANC, Equalization) | 2017 | S. Shah et al. [56,57] | Fractional complex LMS/NLMS for active noise control and channel equalization; improved adaptation under noise and non-stationarity. |
| Fractional Hopfield / Recurrent NNs | 2017 | Y.-F. Pu et al. [53] | Fractional dynamics in associative neural nets; memory effect enhances recall stability and convergence properties. |
| Sliding-Window Fractional Adaptive Strategy (Hammerstein-ARMAX) | 2018 | M. S. Aslam et al. [58] | Finite-memory fractional update to reduce cost; better tracking for non-linear systems with hereditary effects. |
| Fractional-Order Variational / Euler–Lagrange for Optimization | 2018 | Y.-F. Pu [63] | Necessary conditions for fractional fixed-boundary optimization; foundation for fractional gradient-based solvers. |
| Generalized Fractional Gradient Descent (G-FGD) | 2019 | Y. Wei et al. [64] | Unifies multiple fractional GD designs; parameterized orders/weights to shape convergence and stability. |
| Momentum Fractional LMS (Hammerstein ID) | 2019 | N. I. Chaudhary et al. [59] | Adds momentum to FOLMS; faster transients and lower steady-state error in non-linear identification. |
| Fractional Adaptive Algorithms for Power Signal Modeling | 2020 | N. I. Chaudhary et al. [60] | Fractional adaptive filters for distorted power signals; improved tracking with non-stationary harmonics. |
| Embedded GL-based Fractional Numerical Methods | 2020 | M. Matusiak [62] | Software optimizations for GL discretization on embedded targets; finite-memory/truncation strategies. |
| Caputo Fractional Gradient Descent (CFGD) | 2022 | Y. Shin et al. [52] | Convergence guarantees for Caputo-based GD; non-adaptive and adaptive-order variants; links to smoothed objectives. |
| Fractional Gradient Descent with Outlier Modeling | 2023 | Y. Cao et al. [65] | Matrix/scalar fractional derivatives to improve robustness to outliers in system learning tasks. |
| Fractional GD for Multilayer NNs (Analysis & Application) | 2023 | Z. Song et al. [10] | Theoretical/experimental study of FGD in deep nets; demonstrates convergence behavior and application cases. |
| Synthesis of Gradient Algorithms via Fractional-Order System Theory | 2023 | Y. Wei et al. [14] | System-theoretic framing that derives/compares fractional gradient algorithms; guidance on parameterization. |
| FC Meets Neural Networks for Computer Vision (Survey) | 2024 | C. Coelho & L. L. Ferrás [55] | Survey of FC operators embedded in CV/NN pipelines; documents recent applications and trends (2017–2024). |
| Convergence Analysis of Fractional GD (Recent Theory) | 2024 | P. Aggarwal [66] | Rates for smooth/convex and strongly convex settings; clarifies conditions under which sublinear and linear convergence are obtained. |
| Algorithm | Year | Authors | Summary of Main Features |
|---|---|---|---|
| FO-DE | 2025 | S. Tao et al. [86] | Applies fractional-order-driven DE to the wind farm layout optimization problem (WFLOP); improves energy efficiency and power output; outperforms GA, PSO, and classical DE in convergence, robustness, and adaptability. |
| FO-DPSO (Fractional-Order Darwinian PSO) | 2020 | Y. Muhammad et al. [77] | Uses fractional-order Darwinian PSO for optimal reactive power dispatch (IEEE 30-, 57-, 19-bus systems); objectives include line-loss minimization and voltage-profile improvement; best performance at a tuned fractional order, outperforming classical metaheuristics across 100 trials in accuracy, robustness, and voltage profile. |
| FDBO (Fractional Dung Beetle Optimizer) | 2024 | H. Xia et al. [84] | Enhances dung beetle optimizer with fractional-order calculus for lung CT image segmentation (Otsu-based multilevel thresholding) and global optimization (CEC2019); improves PSNR, SSIM, and global optima accuracy over PSO, BOA, and GWO; better memory use and convergence. |
| FMAC (Fractional-order Memristive Ant Colony) | 2023 | W. Zhu et al. [82] | Applies memristor-based fractional-order ant colony optimization to the Traveling Salesman Problem (TSP); reduces TSP cost and execution time; outperforms PACO-3opt, FACA, and ACO/MMAS, achieving ∼30% faster results than FACA. |
| FODMFO (Fractional-Order Derivative Moth Flame Optimizer) | 2024 | A. Wadood et al. [92] | Solves optimal coordination of directional overcurrent relays (DOCRs) in IEEE test systems; minimizes total DOCR operating time and improves convergence speed and reliability; outperforms TLBO, PSO, SA, GWO, GA, BSA, DJAYA, and others in all case studies. |
| FO-AMFO (Fractional-Order Archimedean Spiral MFO) | 2024 | A. Wadood et al. [85] | Implements fractional-order Archimedean Spiral Moth–Flame Optimization for optimal reactive power dispatch (IEEE 30- and 57-bus systems); minimizes power loss and optimizes reactive power flow; best performance at separately tuned fractional orders for the 30-bus and 57-bus systems. |
| FPSO (Fractional-order Particle Swarm Optimization) | 2024 | Y. Muhammad et al. [93] | Uses fractional-order PSO for combined economic emission and load dispatch (CEED) with PV integration (IEEE 30-bus); minimizes fuel cost, emissions, and line losses; achieves statistically significant improvements over PSO, DE, GA, and GSA. |
| FoPSA-I, II, III (Fractional-order PID-based Search Algorithms) | 2025 | G. Chen et al. [94] | Introduces fractional-order PID-based search algorithms for global optimization (CEC benchmarks + engineering problems); improves convergence speed and precision; FoPSA-III significantly outperforms PSA, with historical position data enhancing diversity and accuracy. |
| CFPSO (Controllable Fractional-Order PSO) | 2021 | F.-I. Chou et al. [95] | Applies controllable fractional-order PSO to heart disease classification (hyperparameter tuning); achieves its best results at a tuned fractional order, outperforming XGBoost's default hyperparameters in average, best, and standard deviation metrics. |
| FPSOGSA (Fractional PSO + GSA) | 2021 | N. H. Khan et al. [78] | Hybridizes fractional-order PSO with Gravitational Search Algorithm for optimal reactive power dispatch (IEEE 30- and 57-bus systems); reduces power losses by 19.03–24.07%, voltage deviation, and execution time; consistent improvements across 100 trials. |
| FO-PSO with entropy-based velocity | 2020 | M.W. Khan et al. [72] | Uses entropy-based velocity update in fractional-order PSO for optimal reactive power dispatch (IEEE 30- and 57-bus systems); minimizes power loss, improves voltage profile, reduces operating cost; best performance at a tuned fractional order, surpassing traditional PSO. |
| FOWFO (Fractional-Order Water Flow Optimizer) | 2024 | Z. Tang et al. [83] | Integrates fractional-order memory into the Water Flow Optimizer; tested on real-world high-dimensional problems and CEC2017 benchmarks; improves fitness, exploration–exploitation balance, and computational cost over nine peer algorithms. |
| Function | Method | Iterations | Final Value | Distance to Optimum |
|---|---|---|---|---|
| Rosenbrock | Adam (Gradient-based) | 182 |  |  |
| Rosenbrock | FO Adam (Fractional Gradient-based) | 155 |  |  |
| Rosenbrock | PSO (Particle Swarm Optimization) | 450 |  |  |
| Rosenbrock | FO PSO (Fractional Particle Swarm Optimization) | 450 |  |  |
| Beale | Adam (Gradient-based) | 2240 |  |  |
| Beale | FO Adam (Fractional Gradient-based) | 489 |  |  |
| Beale | PSO (Particle Swarm Optimization) | 900 |  |  |
| Beale | FO PSO (Fractional Particle Swarm Optimization) | 900 |  |  |
| Algorithm | Time Complexity | Memory Complexity | Dependence on r |
|---|---|---|---|
| FOGM/ψ-FGD |  |  | Constants (gamma, powers). |
| FOLMS |  |  | Adjustment in update rule. |
| FOAdam |  | with extra buffers | Affects stability, not order. |
| 2SEDFOSGD |  |  | r adapted dynamically (cost in metrics). |
| FPSOGSA |  |  | r in velocity update (constants). |
| FMACA |  |  | r in memristive predictor. |
| FDBO |  |  | r in movement rules. |
| FO-DE |  |  | r in differential combination. |
| Aspect | Classical Metaheuristics | Fractional-Order Metaheuristics |
|---|---|---|
| Memory | Memoryless updates (Markovian) | Incorporate historical memory via fractional derivatives |
| Search Dynamics | Local or short-term influence | Non-local, history-sensitive influence via fractional operators |
| Exploration vs. Exploitation | Fixed via parameters (e.g., inertia, learning factors) | Tunable via the fractional order; dynamic exploration–exploitation balance |
| Convergence Behavior | Faster but prone to local minima | Smoother trajectories, reduced premature convergence |
| Robustness to Noise | Sensitive to irregularities | Improved due to averaging of past states |
| Computational Cost | Lower (single-step updates) | Higher due to non-local operators and memory |
| Interpretability of Parameters | Well-established tuning strategies | Optimal fractional order often problem-dependent and empirical |
| Hybridization | Common with local search or ensembles | Naturally suited to hybrid models (NNs, wavelets, SQP) |