Designing a Bayesian Regularization Approach to Solve the Fractional Layla and Majnun System

: The present work provides numerical solutions of the mathematical model based on the fractional-order Layla and Majnun model (MFLMM). A soft computing, stochastic-based Bayesian regularization neural network approach (BRNNA) is provided to investigate the numerical accomplishments of the MFLMM. The nonlinear system is classified into two dynamics, and the correctness of the BRNNA is observed through a comparison of results. Furthermore, the reducible performance of the absolute error improves the exactitude of the computational BRNNA. Twenty neurons have been chosen, with 74% of the data used for training and 13% each for authorization and testing. The consistency of the designed BRNNA is demonstrated using the correlation/regression, error histograms, and transition of state values in order to solve the MFLMM.


Introduction
Many Americans have claimed that psychology is the scientific study of the mind's active learning [1][2][3]. The mathematical connection in the modeling of love tales has rarely been conveyed to psychologists and scientists [4,5]. Manifestations of psychology have been observed in human growth, social environments, cognitive processes, and clinical behaviors [6,7]. A common psychological question about the meaning of love always comes to mind. Every person has their own justifications, with different meanings attached to them. Physical and spiritual love are the two main types of love. Physical love displays an inherent desire, whereas spiritual love is considered true, as it does not alter depending on the circumstances. Someone who is experiencing love can occasionally become frightened and enraged. Those who are really in love have the sensation of the planet whirling. When someone falls in love, there is no discrimination based on creed, race, or religion. Feelings of love also occur in creatures other than humans. Romeo and Juliet, Sassi Punnuh, Heer Ranjha, Sohni Mahiwal, Haleema Ertugrul, and Layla Majnun are just a few examples of historical real-life love stories.
The current study displays the numerical results for the historical mathematical fractional Layla and Majnun model (MFLMM). This tale is so historic that Persia was aware of it in the ninth century. The concepts of this tale have also been expressed in the languages of Pakistan, as well as in Persian, Arabic, Turkish, and Indian, among other languages. In addition to being described in the literature, this love story has also appeared in several film/drama industries. All religions preach love, whether it is the Torat, the Geeta, the Quran, or the Bible. No one can secure himself from love, whether Khusro, Zuleka, Iqbal, Ghalib, or Bulay Shah.

•
The stochastic BRNNA is applied successfully for the numerical performances of the differential MFLMM.

•
The fractional derivatives are implemented to accomplish the accurate performances of the differential MFLMM.

•
Three different variations based on the MFLMM are numerically simulated through the process of the BRNNA.

•
The exactness of the BRNNA is perceived via a comparison of the achieved and reference solutions.

•
The reducible absolute error (AE) performances authenticate the precision of the BRNNA for solving the MFLMM.

•
The correlation/regression, error histograms (EHs), and transition of state (TS) values to solve the MFLMM demonstrate the reliability of the BRNNA.
The remaining parts of this work are as follows: Section 2 presents the MFLMM.Section 3 is the stochastic BRNNA.Section 4 represents the numerical MFLMM performances.Section 5 provides the concluding remarks.

Fractional LMM
The current section represents the MFLMM, which is based on two relations, real and complex. The simplified form of the model, including its two classes, is shown as [29,30]: where µ_a, µ_b, µ_c, and µ_d are constants, while the initial conditions are k_1 and k_2. Layla's and Majnun's feelings are stated as L(θ) and M(θ). The parameter constants µ_a and µ_b represent the environmental spirit properties. µ_a is a positive fixed value that shows the compassion and sympathy for Majnun, while µ_b < 0 represents the cruel people's behavior towards Layla. L² and M² represent the extreme level of love. µ_b < 0 and µ_d < 0 are the emotions of real love. The simplified form of system (1) using the complex forms, i.e., M = iM_i + M_r and L = iL_i + L_r, is given as [31,32]: where L_i(θ) and M_i(θ) are the emotions of Layla and Majnun based on the imaginary values, while L_r(θ) and M_r(θ) are the real emotions of these two characters. The subject of this study is to solve the MFLMM through the soft computing BRNNA. The nonlinear MFLMM is shown as [33]: In the above model, α shows the Caputo derivative. The fractional derivatives are presented to accomplish specific and accurate outcomes. The examination of minute features is not easy to manage with the integer-order kinds, but applying the fractional kinds reveals more information about the system's dynamics. In terms of proficiency, the fractional derivatives outperformed those based on the integer kind when the requirement was attainable [34,35]. Additionally, the fractional derivatives have been used in many applications that emerge in control networks, engineering, and mathematical systems [36][37][38]. Over the last 30 years, considerable operators have been used to solve various models [39,40]. Some of them are Riemann-Liouville, Caputo, Erdélyi-Kober, Grünwald-Letnikov, and Weyl-Riesz [41,42]. Each of these operators has its own specific effects; on the other hand, the Caputo derivative is considered easy to apply and can be implemented for non-homogeneous/homogeneous initial conditions. The authors are encouraged to achieve the numerical performances of the MFLMM through the BRNNA by keeping in view the significance of these submissions.
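The Caputo operator named above is typically discretized on a uniform grid; a minimal sketch of the standard L1 approximation for order 0 < α < 1 is given below. The helper name `caputo_l1` and the grid setup are illustrative assumptions, not part of the paper's actual solver.

```python
import math

def caputo_l1(f_vals, alpha, h):
    """Approximate the Caputo derivative of order 0 < alpha < 1 at the
    final grid point using the classical L1 discretization.
    f_vals: samples of f on a uniform grid with step h."""
    n = len(f_vals) - 1
    coeff = h ** (-alpha) / math.gamma(2.0 - alpha)
    total = 0.0
    for k in range(n):
        # weight (n-k)^(1-alpha) - (n-k-1)^(1-alpha) multiplies the
        # k-th first difference of f
        b = (n - k) ** (1.0 - alpha) - (n - k - 1) ** (1.0 - alpha)
        total += b * (f_vals[k + 1] - f_vals[k])
    return coeff * total
```

For f(θ) = θ the scheme is exact up to rounding, since the Caputo derivative of θ is θ^(1−α)/Γ(2−α).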

Designed Methodology
In this section, the methodology based on the proposed BRNNA for solving the differential MFLMM, including the essential practices of the scheme and its execution, is presented. The optimization based on the BRNNA is presented in Figure 1, which is categorized into the MFLMM and model presentations.
In mathematical theory, there are various innovations based on supervised neural networks, which have contributed to the understanding of training, behavior, and performance. Some of them are the universal approximation theorem and calculation depth, gradient descent and backpropagation, optimization surfaces and loss landscapes, generalization theory, weight initialization, normalization schemes, regularization and dropout, margin theory and loss functions, adaptive learning rates and schedules, network pruning and compression, Bayesian neural networks, elastic net, and weight regularization. These advances in mathematical theory have improved the understanding of supervised neural networks and have led to more efficient training schemes, better system interpretability, and improved generalization. To create the dataset, the numerical performances are obtained by using the values of the default parameters. Twenty neurons are chosen, with 74% of the data used for training and 13% each for testing and certification. The stochastic BRNNA is presented using the best relationship, taking into account premature convergence, complexity, and underfitting and overfitting measures. The program's settings are also changed through understanding, simulations, training, and minor link inconsistencies. The designed BRNNA is applied in "Matlab" with the command "nftool" to achieve the assortment of learning schemes, proper hidden neurons, and certification/testing actions.
The computing performances have been achieved by using the BRNNA for the differential MFLMM. The same data need not necessarily be chosen for validation, training, and testing. Due to biased inputs and outputs, the training portion is selected to be >74% to obtain enhanced and superior performances. If the training data are <74%, then the accuracy of the proposed BRNNA is reduced. Therefore, it is important to select these values with care and concentration. Figure 2 presents the structure of the different layers.
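The 74%/13%/13% partition described above can be sketched as follows. This is a minimal illustration: the helper `split_indices` and the sample count are assumptions for demonstration, not the paper's actual dataset.

```python
import random

def split_indices(n_samples, train=0.74, val=0.13, seed=0):
    """Shuffle sample indices and partition them into training,
    validation (certification), and testing subsets; the test set
    receives whatever remains after the first two splits."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    n_train = int(round(train * n_samples))
    n_val = int(round(val * n_samples))
    return (idx[:n_train],
            idx[n_train:n_train + n_val],
            idx[n_train + n_val:])

train_idx, val_idx, test_idx = split_indices(1000)
print(len(train_idx), len(val_idx), len(test_idx))  # 740 130 130
```

Shuffling before splitting avoids the biased input-output issue the text warns about, since each subset then samples the whole trajectory rather than one contiguous segment.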

Bayesian Regularization (BR) Scheme
As compared to other traditional backpropagation approaches, BR is an optimization technique that exhibits reliable and operational solutions. Cumbersome validation criteria are removed by using BR. Many mathematical models have described BR, which transforms regression representations into a constrained predictive method based on regression analysis. BR is an approach that is applied in statistics and machine learning to address uncertainty and overfitting by incorporating previous knowledge about the model's parameters. BR is commonly used in scenarios such as limited data or balancing the trade-off between data fitting and averting overly complicated systems. BR suggests a principled pathway to control system complexity, combine previous knowledge, and account for ambiguity in statistical and machine learning tasks. Currently, BR is applied in areas such as exposure map reconstruction [43], the sensorless monitoring of pumps [44], inverse acoustic systems [45], the identification of groundwater pollution sources along with hydraulic restrictions [46], economic systems [47], and permeability calculations based on tight gas sandstones [48].
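At its core, the BR scheme minimizes a weighted combination of the data-fit error and a penalty on the squared weights, F = β·E_D + α·E_W, with α and β set by the Bayesian evidence framework. The sketch below applies this regularized objective to a one-parameter linear model, holding α and β fixed rather than re-estimating them, which is a simplifying assumption; the name `br_style_fit` is illustrative.

```python
def br_style_fit(xs, ys, alpha=0.01, beta=1.0, lr=0.01, steps=5000):
    """Fit y ~ w*x + b by gradient descent on the BR-style objective
    F = beta * sum((y - w*x - b)^2) + alpha * (w^2 + b^2).
    alpha penalizes large weights (the prior term), while beta
    weights the data-fit term."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            e = y - (w * x + b)
            gw += -2.0 * beta * e * x
            gb += -2.0 * beta * e
        # gradient of the weight-penalty (regularization) term
        gw += 2.0 * alpha * w
        gb += 2.0 * alpha * b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b
```

With α → 0 the objective reduces to ordinary least squares; a larger α shrinks the weights toward zero, which is how BR averts overly complicated models.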



Numerical Performances
The current section provides three different cases by fixing the values µ_d = −1, µ_b = −1, µ_a = 1, and µ_c = −1, while the values of α are taken as 0.6, 0.7, and 0.8. Figures 3-7 display the numerical MFLMM results based on the designed BRNNA. Figures 3-5 present the MFLMM, which is developed using the computational BRNNA. The mean square error (MSE) results achieved by employing the BRNNA to solve the MFLMM are presented in Figure 3. The optimal results are presented as 4.67150 × 10⁻¹¹, 1.77309 × 10⁻¹¹, and 2.65858 × 10⁻¹² by using 260, 359, and 871 epochs for cases 1-3. By increasing the epochs, the testing, training, and validation curves reach a steady-state position with performances up to 10⁻¹². The gradient operator values for the first to the third case are presented as 5.1178 × 10⁻⁸, 1.4841 × 10⁻⁸, and 9.1037 × 10⁻⁹. An error gradient shows the magnitude and direction values that are computed during the training process of the proposed neural network, which is applied to update the weights of the network with the right amount and direction. Figure 4 shows the obtained calculated results along with the EHs measures to solve the MFLMM. The EHs for cases 1 to 3 of the MFLMM have been performed as 1.81 × 10⁻⁷, 1.97 × 10⁻⁶, and 6.45 × 10⁻⁷. Figure 4 shows a plot, based on validation, of the testing and training that perform the best. It also shows the training that uses the momentum constant or parameter, which is contained in the updated weight expression to evade local minimum issues. Figure 5 depicts the correlation graphs produced through the BRNNA for the MFLMM, which is one (perfect value) for each case. The correlation coefficient (R) varies between −1 and +1; if the R performance is found to be +1, high performance of the network is obtained along with a positive linear relationship. The precision of the designed BRNNA for the MFLMM is achieved in the form of test/train and verification. Figures 6 and 7
present the comparisons and the AE performances for the MFLMM using the BRNNA. Figure 6 provides the graphs of result comparisons that have been obtained through the matching of proposed and reference results. These corresponding results indicate the accuracy of the scheme for solving the MFLMM. Figure 7 indicates the AE performances for each dynamic of the MFLMM. AE performances for M_r(θ) are illustrated in Figure 7a and lie as 10⁻⁶-10⁻⁸, 10⁻⁵-10⁻⁷, and 10⁻⁶-10⁻⁷ for cases 1 to 3. For the class M_i(θ), the AE for cases 1 to 3 is performed as 10⁻⁵-10⁻⁶, 10⁻⁵-10⁻⁷, and 10⁻⁶-10⁻⁷. These measures for the classes L_r(θ) and L_i(θ) are derived in Figure 7c,d and are performed as 10⁻⁶-10⁻⁹, 10⁻⁶-10⁻⁸, and 10⁻⁷-10⁻⁸ for the first to the third case. These matching results, with reducible AE, imply the accuracy of the BRNNA for solving the MFLMM.
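The MSE, AE, and correlation measures reported above can be computed as follows. This is a minimal sketch with hypothetical helper names; the paper's reported values come from the MATLAB toolchain, not from this code.

```python
import math

def mse(predicted, reference):
    """Mean square error between obtained and reference solutions."""
    return sum((p - r) ** 2 for p, r in zip(predicted, reference)) / len(reference)

def abs_errors(predicted, reference):
    """Pointwise absolute error (AE), reported per class of the MFLMM."""
    return [abs(p - r) for p, r in zip(predicted, reference)]

def pearson_r(x, y):
    """Correlation coefficient R in [-1, +1]; R = +1 indicates a
    perfect positive linear relationship."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den
```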

Concluding Remarks
The numerical representations using artificial neural networks for solving the fractional-order mathematical Layla and Majnun system are provided in the current study. The mathematical nonlinear system is classified into two dynamics. A few concluding remarks of this study are:

•
A soft computing Bayesian regularization-based neural network approach has been suggested successfully for the numerical representations of the MFLMM.

•
For the accuracy of the results, the fractional derivatives have been provided to solve the mathematical model.

•
The exactness of the proposed BRNNA has been validated through the overlapping of the results.

•
The reducible absolute error performances confirm the accuracy of the designed BRNNA.

•
Twenty neurons have been selected, with 74% of the data used for training and 13% each for certification and testing.

•
The reliability and consistency of the designed BRNNA are demonstrated based on the correlation, transition of state, and error histogram performances to solve the MFLMM.

Figure 1 .
Figure 1.Depictions of the BRNNA for the MFLMM.

Figure 2 .
Figure 2. Different layer presentations for the MFLMM.


Figure 3 .
Figure 3. Best authorization and gradient values for the MFLMM.


Figure 4 .
Figure 4. EHs measures to solve the MFLMM.

Figure 6 .
Figure 6.Result comparison for each class of the MFLMM using the BRNNA.



Figure 7 .
Figure 7. AE for the nonlinear dynamics based on the MFLMM.
Table 1 presents the MSE values obtained through the BRNNA for the MFLMM.

Table 1 .
MSE performances through the BRNNA for the MFLMM.