Search Results (59)

Search Parameters:
Keywords = reparametrization

19 pages, 622 KB  
Article
Q-Function-Based Diagnostic and Spatial Dependence in Reparametrized t-Student Linear Model
by Miguel A. Uribe-Opazo, Rosangela C. Schemmer, Fernanda De Bastiani, Manuel Galea, Rosangela A. B. Assumpção and Tamara C. Maltauro
Mathematics 2025, 13(18), 3035; https://doi.org/10.3390/math13183035 - 20 Sep 2025
Viewed by 353
Abstract
Characterizing the spatial variability of agricultural data is a fundamental step in precision agriculture, especially in soil management and the creation of differentiated management units for increasing productivity. Modeling the spatial dependence structure using geostatistical methods is of great importance for efficiently estimating the parameters that define this structure and performing kriging-based interpolation. This work presents diagnostic techniques for global and local influence and generalized leverage based on the displacement of the conditional expectation of the logarithm of the joint likelihood, called the Q-function. This method is used to identify influential observations that can interfere with parameter estimation, geostatistical model selection, map construction, and spatial variability. To study spatially correlated data, we used linear spatial modeling with the reparametrized t-Student distribution. This distribution has been used as an alternative to the normal distribution when data have outliers, and it has the same form of covariance matrix as the normal distribution, which enables a direct comparison between them. The methodology is illustrated using one real data set, and the results showed that the modeling was more robust in the presence of influential observations. The study of these observations is indispensable for decision-making in precision agriculture. Full article
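The covariance structure shared by the Gaussian and reparametrized t-Student spatial models can be made concrete. Below is a minimal sketch (function and parameter names are ours, not the paper's) of the usual nugget-plus-exponential covariance matrix used in geostatistical linear models:

```python
import numpy as np

def exponential_covariance(coords, sill, range_phi, nugget):
    """Sigma = nugget * I + sill * exp(-d / range_phi), where d is the
    matrix of pairwise Euclidean distances between sample sites."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return nugget * np.eye(len(coords)) + sill * np.exp(-d / range_phi)
```

Because the reparametrized t-Student model shares this form of Sigma with the Gaussian one, fitted spatial-dependence structures can be compared directly, which is the comparison the abstract exploits.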

21 pages, 654 KB  
Article
Regression Modeling for Cure Factors on Uterine Cancer Data Using the Reparametrized Defective Generalized Gompertz Distribution
by Dionisio Silva-Neto, Francisco Louzada-Neto and Vera Lucia Tomazella
Math. Comput. Appl. 2025, 30(5), 93; https://doi.org/10.3390/mca30050093 - 31 Aug 2025
Viewed by 497
Abstract
Recent advances in medical research have improved survival outcomes for patients with life-threatening diseases. As a result, the existence of long-term survivors from these illnesses is becoming common. However, conventional models in survival analysis assume that all individuals remain at risk of death after the follow-up, disregarding the presence of a cured subpopulation. An important methodological advancement in this context is the use of defective distributions. In the defective models, the survival function converges to a constant value p ∈ (0, 1) as a function of the parameters. Among these models, the defective generalized Gompertz distribution (DGGD) has emerged as a flexible approach. In this work, we introduce a reparametrized version of the DGGD that incorporates the cure parameter and accommodates covariate effects to assess individual-level factors associated with long-term survival. A Bayesian model is presented, with parameter estimation via the Hamiltonian Monte Carlo algorithm. A simulation study demonstrates good asymptotic results of the estimation process under vague prior information. The proposed methodology is applied to a real-world dataset of patients with uterine cancer. Our results reveal statistically significant protective effects of surgical intervention, alongside elevated risk associated with age over 50 years, diagnosis at the metastatic stage, and treatment with chemotherapy. Full article
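The defective mechanism can be illustrated with the plain defective Gompertz distribution, a simpler relative of the DGGD (the generalized version and its exact parametrization are not given in the abstract, so this is only an illustrative sketch with our own parameter names):

```python
import math

def gompertz_survival(t, a, b):
    """Gompertz survival function S(t) = exp((a/b) * (1 - exp(b*t))).
    For b < 0 the model is *defective*: S(t) converges to a constant
    in (0, 1) instead of 0, interpreted as the cure fraction."""
    return math.exp((a / b) * (1.0 - math.exp(b * t)))

def cure_fraction(a, b):
    """Limit of S(t) as t -> infinity when b < 0: p = exp(a/b)."""
    assert b < 0, "the distribution is defective only for b < 0"
    return math.exp(a / b)

def b_from_cure_fraction(p, a):
    """Reparametrization: recover the shape b from a target cure
    fraction p, so p itself can be modeled (e.g. via covariates)."""
    return a / math.log(p)
```

Reparametrizing the model in terms of p is what lets covariates act directly on the cure fraction, as the paper does for its generalized version.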
(This article belongs to the Special Issue Statistical Inference in Linear Models, 2nd Edition)

26 pages, 18271 KB  
Article
ECAN-Detector: An Efficient Context-Aggregation Network for Small-Object Detection
by Gaofeng Xing, Zhikang Xu, Yulong He, Hailong Ning, Menghao Sun and Chunmei Wang
AppliedMath 2025, 5(2), 58; https://doi.org/10.3390/appliedmath5020058 - 20 May 2025
Viewed by 1648
Abstract
Over the past decade, the field of object detection has advanced remarkably, especially in the accurate recognition of medium- and large-sized objects. Nevertheless, detecting small objects is still difficult because their low-resolution appearance provides insufficient discriminative features, and they often suffer severe occlusions, particularly in the safety-critical context of autonomous driving. Conventional detectors often fail to extract sufficient information from shallow feature maps, which limits their ability to detect small objects with high precision. To address this issue, we propose the ECAN-Detector, an efficient context-aggregation method designed to enrich the feature representation of the shallow layers, which are particularly beneficial for small-object detection. The model first employs an additional shallow detection layer to extract high-resolution features that provide more detailed information for subsequent stages of the network, and then incorporates a dynamic scaled transformer (DST) that enriches spatial perception by adaptively fusing global semantics and local context. Concurrently, a context-augmentation module (CAM) embedded in the shallow layer complements both global and local features relevant to small objects. To further boost the average precision of small-object detection, we implement a faster detection head utilizing two reparametrized convolutions. Finally, extensive experiments conducted on the VisDrone2012-DET and VisDrone2021-DET datasets verified that our proposed method surpasses the baseline model, achieving a significant improvement of 3.1% in AP and 3.5% in APs. Compared with recent state-of-the-art (SOTA) detectors, the ECAN-Detector delivers comparable accuracy yet preserves real-time throughput, reaching 54.3 FPS. Full article
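The "reparametrized convolutions" in the detection head are not spelled out in the abstract; the standard structural-reparameterization identity that such heads rely on (RepVGG-style) merges parallel convolution branches into a single kernel at inference time. A NumPy sketch of that identity (all names ours):

```python
import numpy as np

def merge_reparam_convs(k3, k1):
    """Fold a parallel 1x1 convolution into a 3x3 convolution.
    k3: (C_out, C_in, 3, 3), k1: (C_out, C_in, 1, 1). Since convolution
    is linear, conv(x, k3) + conv(x, k1) == conv(x, k3 + pad(k1)), so a
    single kernel reproduces both branches at inference time."""
    merged = k3.copy()
    merged[:, :, 1, 1] += k1[:, :, 0, 0]  # 1x1 kernel sits at the centre tap
    return merged

def conv2d(x, k):
    """Naive 'same' cross-correlation with zero padding, one image.
    x: (C_in, H, W), k: (C_out, C_in, kh, kw)."""
    C_out, C_in, kh, kw = k.shape
    H, W = x.shape[1:]
    xp = np.pad(x, ((0, 0), (kh // 2,) * 2, (kw // 2,) * 2))
    out = np.zeros((C_out, H, W))
    for o in range(C_out):
        for i in range(kh):
            for j in range(kw):
                out[o] += (k[o, :, i, j, None, None] * xp[:, i:i + H, j:j + W]).sum(0)
    return out
```

Training keeps the multi-branch form for richer gradients; the merged kernel is what makes the head "faster" at test time.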
(This article belongs to the Special Issue Optimization and Machine Learning)

9 pages, 459 KB  
Article
Invariance Property of Cauchy–Stieltjes Kernel Families Under Free and Boolean Multiplicative Convolutions
by Fahad Alsharari, Raouf Fakhfakh and Fatimah Alshahrani
Mathematics 2025, 13(7), 1044; https://doi.org/10.3390/math13071044 - 23 Mar 2025
Viewed by 397
Abstract
This article delves into some properties of free and Boolean multiplicative convolutions, in connection with the theory of Cauchy–Stieltjes kernel (CSK) families and their respective variance functions (VFs). Consider K(μ) = {Q_m^μ(ds) : m ∈ (m_−^μ, m_0^μ)}, a CSK family induced by a non-degenerate probability measure μ on the positive real line with finite first moment m_0^μ. For γ > 1, we introduce a new family of measures: K(μ)^γ = {(Q_m^μ)^γ(ds) : m ∈ (m_−^μ, m_0^μ)}. We show that if K(μ)^γ represents a re-parametrization of the CSK family K(μ), then μ is characterized by its corresponding VF V_μ(m) = c m² ln(m), with c > 0. We also prove that if K(μ)^γ is a re-parametrization of K(D_{1/γ}(μ^{⊞γ})) (where ⊞ is the additive free convolution and D_a(μ) denotes the dilation of μ by a number a ≠ 0), then μ is characterized by its corresponding VF V_μ(m) = c₁ (m ln(m))², with c₁ > 0. Similar results are obtained if we substitute the free multiplicative convolution ⊠ with the Boolean multiplicative convolution ⨃. Full article
(This article belongs to the Section D1: Probability and Statistics)
28 pages, 1473 KB  
Article
Maximum Trimmed Likelihood Estimation for Discrete Multivariate Vasicek Processes
by Thomas M. Fullerton, Michael Pokojovy, Andrews T. Anum and Ebenezer Nkum
Economies 2025, 13(3), 68; https://doi.org/10.3390/economies13030068 - 6 Mar 2025
Viewed by 928
Abstract
The multivariate Vasicek model is commonly used to capture mean-reverting dynamics typical for short rates, asset price stochastic log-volatilities, etc. Reparametrizing the discretized problem as a VAR(1) model, the parameters are oftentimes estimated using the multivariate least squares (MLS) method, which can be susceptible to outliers. To account for potential model violations, a maximum trimmed likelihood estimation (MTLE) approach is utilized to derive a system of nonlinear estimating equations, and an iterative procedure is developed to solve the latter. In addition to robustness, our new technique allows for reliable recovery of the long-term mean, unlike existing methodologies. A set of simulation studies across multiple dimensions, sample sizes and robustness configurations is performed. MTLE outcomes are compared to those of multivariate least trimmed squares (MLTS), MLE and MLS. Empirical results suggest that MTLE not only maintains good relative efficiency for uncontaminated data but also significantly improves overall estimation quality in the presence of data irregularities. Additionally, real data examples containing daily log-volatilities of six common assets (commodities and currencies) and US/Euro short rates are also analyzed. The results indicate that MTLE provides an attractive instrument for interest rate forecasting, stochastic volatility modeling, risk management and other applications requiring statistical robustness in complex economic and financial environments. Full article
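The VAR(1) reparametrization of the discretized Vasicek model, and the recovery of the long-term mean the abstract emphasizes, can be sketched as follows (a minimal illustration with our own names; the paper's trimmed estimator itself is not reproduced here):

```python
import numpy as np

def expm(A):
    """Matrix exponential via eigendecomposition (assumes A is
    diagonalizable, which suffices for this illustration)."""
    w, V = np.linalg.eig(A)
    return ((V * np.exp(w)) @ np.linalg.inv(V)).real

def vasicek_to_var1(kappa, theta, dt):
    """Exact discretization of dX_t = kappa (theta - X_t) dt + Sigma dW_t
    over a step dt, written as the VAR(1) model
    X_{t+1} = c + B X_t + eps_t  with  B = exp(-kappa dt), c = (I - B) theta."""
    B = expm(-kappa * dt)
    c = (np.eye(len(theta)) - B) @ theta
    return c, B

def long_term_mean(c, B):
    """Invert the reparametrization: theta = (I - B)^{-1} c."""
    return np.linalg.solve(np.eye(len(c)) - B, c)
```

When the mean reversion is weak, I − B is nearly singular, which is why recovering theta reliably from estimated (c, B) is non-trivial and outlier-sensitive.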

13 pages, 1180 KB  
Article
Ustekinumab Drug Clearance Is Better Associated with Disease Control than Serum Trough Concentrations in a Prospective Cohort of Inflammatory Bowel Disease
by Andres J. Yarur, Thierry Dervieux, Ryan Ungaro, Elizabeth A. Spencer, Alexandra Bruss, Lizbeth Nunez, Brandon Berens, Séverine Vermeire, Zhigang Wang, John C. Panetta, Erwin Dreesen and Marla C. Dubinsky
Pharmaceutics 2025, 17(2), 187; https://doi.org/10.3390/pharmaceutics17020187 - 2 Feb 2025
Cited by 1 | Viewed by 1572
Abstract
Background/Objectives: This study aimed to compare the association of ustekinumab (UST) drug clearance (CL) and trough drug concentrations with disease activity in patients with inflammatory bowel diseases (IBDs). Methods: A prospective cohort of 83 patients with IBD receiving maintenance therapy with 90 mg subcutaneous UST was analyzed using Bayesian PK modeling. UST concentrations and antibodies to UST (ATU) were collected at the trough and measured using a drug-tolerant homogenous mobility shift assay (HMSA). CL was estimated using Bayesian estimation methods with priors from a previous population pharmacokinetic study specifically reparametrized using HMSA. Outcomes were combined clinical and biochemical remission and endoscopic healing index (EHI) score, a validated marker of endoscopic active disease in IBD. Statistical analysis consisted of linear and nonlinear mixed effect models for repeated time-to-event analysis. Results: A total of 83 patients with IBD were enrolled (median age 42 years, 52% female) and evaluated across 312 dose cycles (median follow-up: 279 days, median of 3 cycles/patient). Median concentrations and CL were 5.0 µg/mL and 0.157 L/day, respectively. Most patients (89%) were exposed to other biologics before starting UST, which was associated with lower rates of clinical and biochemical remission (p = 0.01). Longitudinal changes in concentrations were not associated with remission (p = 0.53). Conversely, higher CL was associated with a lower likelihood of remission (p < 0.01). EHI > 50 points (endoscopic active disease, n = 303 cycles) was associated with higher UST CL (p < 0.01). Conclusions: UST CL was more strongly associated with clinical and biochemical outcomes than trough concentrations, highlighting its potential role in therapy optimization. Full article

22 pages, 8009 KB  
Article
Modeling of Spiral Wound Membranes for Gas Separations—Part IV: Real-Time Monitoring Based on Detailed Phenomenological Model
by Marília Caroline C. de Sá, Diego Q. F. de Menezes, Tahyná B. Fontoura, Luiz Felipe de O. Campos, Thiago K. Anzai, Fábio C. Diehl, Pedro H. Thompson and José Carlos Pinto
Processes 2024, 12(11), 2597; https://doi.org/10.3390/pr12112597 - 19 Nov 2024
Viewed by 1607
Abstract
This study presents, for the first time, the real-time monitoring of an actual spiral-wound membrane unit used for CO2 removal from natural gas on an industrial offshore platform, utilizing a detailed phenomenological model. An Object-Oriented Programming (OOP) paradigm was employed to simulate the offshore membrane separation unit, accounting for the diverse levels of the membrane separation setup. A parameter estimation procedure was implemented to fit the phenomenological model to the real industrial data in real time. In addition, estimated permeance parameters and calculated unmeasured variables (soft sensor) were used for monitoring Key Performance Indicators (KPIs), such as membrane selectivity, dew point temperature, and hydrocarbon loss. Finally, a reparametrization of the parameters was implemented to improve the robustness of the optimization procedure. Thus, the model variables presented good adjustments to the data, indicating the satisfactory performance of the estimation. Consequently, the good accuracy of the model provided reliable information to the soft sensors and KPIs. Full article
(This article belongs to the Section Process Control and Monitoring)

20 pages, 11655 KB  
Article
Variational Color Shift and Auto-Encoder Based on Large Separable Kernel Attention for Enhanced Text CAPTCHA Vulnerability Assessment
by Xing Wan, Juliana Johari and Fazlina Ahmat Ruslan
Information 2024, 15(11), 717; https://doi.org/10.3390/info15110717 - 7 Nov 2024
Cited by 2 | Viewed by 1345
Abstract
Text CAPTCHAs are crucial security measures deployed on global websites to deter unauthorized intrusions. The presence of anti-attack features incorporated into text CAPTCHAs limits the effectiveness of evaluating them, despite CAPTCHA recognition being an effective method for assessing their security. This study introduces a novel color augmentation technique called Variational Color Shift (VCS) to boost the recognition accuracy of different networks. VCS generates a color-shift range for every input image and then resamples the image within that range to generate a new image, thus expanding the number of samples of the original dataset to improve training effectiveness. In contrast to Random Color Shift (RCS), which treats the color offsets as hyperparameters, VCS estimates color shifts by reparametrizing the points sampled from the uniform distribution using offsets predicted for every image, which makes the color shifts learnable. To better balance computation and performance, we also propose two variants of VCS: Sim-VCS and Dilated-VCS. In addition, to solve the overfitting problem caused by disturbances in text CAPTCHAs, we propose an Auto-Encoder (AE) based on Large Separable Kernel Attention (AE-LSKA) to replace the convolutional module with large kernels in the text CAPTCHA recognizer. This new module employs an AE to compress the interference while expanding the receptive field using Large Separable Kernel Attention (LSKA), reducing the impact of local interference on model training and improving the overall perception of characters. The experimental results show that the recognition accuracy of the model after integrating the AE-LSKA module is improved by at least 15 percentage points on both the M-CAPTCHA and P-CAPTCHA datasets. In addition, the experiments demonstrate that color augmentation using VCS is more effective in enhancing recognition, achieving higher accuracy than RCS and PCA Color Shift (PCA-CS). Full article
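The contrast between RCS and the VCS reparametrization can be sketched in a few lines (a simplified illustration with our own names; the network that predicts the per-image offsets is omitted):

```python
import numpy as np

def random_color_shift(img, max_shift=0.1, rng=None):
    """RCS baseline: the shift range is a fixed hyperparameter and the
    sampled shift is not a function of any learnable quantity."""
    if rng is None:
        rng = np.random.default_rng()
    shift = rng.uniform(-max_shift, max_shift, size=3)
    return np.clip(img + shift, 0.0, 1.0)

def variational_color_shift(img, lo, hi, eps):
    """VCS-style reparametrization (sketch): per-image predicted offsets
    (lo, hi) define the shift range, and the sample is written as
    lo + (hi - lo) * eps with eps ~ U(0, 1)^3.  The shift is then a
    differentiable function of lo and hi, so the range can be learned
    by gradient descent."""
    shift = lo + (hi - lo) * eps
    return np.clip(img + shift, 0.0, 1.0)
```

Moving the randomness into eps while keeping (lo, hi) in the computation graph is the same reparametrization trick used in variational auto-encoders.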
(This article belongs to the Special Issue Computer Vision for Security Applications)

24 pages, 5846 KB  
Article
Enhanced Drag Force Estimation in Automotive Design: A Surrogate Model Leveraging Limited Full-Order Model Drag Data and Comprehensive Physical Field Integration
by Kalinja Naffer-Chevassier, Florian De Vuyst and Yohann Goardou
Computation 2024, 12(10), 207; https://doi.org/10.3390/computation12100207 - 16 Oct 2024
Cited by 2 | Viewed by 2462
Abstract
In this paper, a novel surrogate model for shape-parametrized vehicle drag force prediction is proposed. It is assumed that only a limited dataset of high-fidelity CFD results is available, typically fewer than ten high-fidelity CFD solutions for different shape samples. The idea is to take advantage not only of the drag coefficients but also of physical fields such as velocity, pressure, and kinetic energy evaluated on a cutting plane in the wake of the vehicle and perpendicular to the road. This additional “augmented” information provides a more accurate and robust prediction of the drag force compared to a standard surface response methodology. As a first step, an original reparametrization of the shape based on combination coefficients of shape principal components is proposed, leading to a low-dimensional representation of the shape space. The second step consists in determining principal components of the x-direction momentum flux through a cutting plane behind the car. The final step is to find the mapping between the reduced shape description and the momentum flux formula to achieve an accurate drag estimation. The resulting surrogate model is a space-parameter separated representation with shape principal component coefficients and spatial modes dedicated to drag-force evaluation. The algorithm can deal with shapes of variable mesh by using an optimal transport procedure that interpolates the fields on a shared reference mesh. The machine learning algorithm is challenged on a car concept with a three-dimensional shape design space. With only two well-chosen samples, the numerical algorithm is able to return a drag surrogate model with reasonable uniform error over the validation dataset. An incremental learning approach involving additional high-fidelity computations is also proposed. The learning algorithm is shown to improve the model accuracy. The study also shows the sensitivity of the results with respect to the initial experimental design. As feedback, we discuss and suggest what appear to be the correct choices of experimental designs for the best results. Full article
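The first step, re-expressing each shape by combination coefficients of shape principal components, can be sketched with a plain SVD (names ours; the paper's optimal-transport mesh alignment is omitted):

```python
import numpy as np

def shape_pca(shapes, k):
    """Low-dimensional shape reparametrization (sketch): each row of
    `shapes` is one flattened shape (e.g. stacked mesh coordinates
    after interpolation onto a shared reference mesh).  Each shape is
    re-expressed by k combination coefficients of the principal
    components of the centered shape data."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    basis = Vt[:k]               # (k, d) principal directions
    coeffs = centered @ basis.T  # (n, k) low-dimensional coordinates
    return mean, basis, coeffs

def reconstruct(mean, basis, coeffs):
    """Map the low-dimensional coefficients back to full shapes."""
    return mean + coeffs @ basis
```

The surrogate then only has to map these few coefficients (rather than the full mesh) to the drag-related quantities.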
(This article belongs to the Special Issue Synergy between Multiphysics/Multiscale Modeling and Machine Learning)

13 pages, 2551 KB  
Article
Lactobacilli and Bifidobacteria: A Parapostbiotic Approach to Study and Explain Their Mutual Bioactive Influence
by Clelia Altieri, Alfonso Filippone, Antonio Bevilacqua, Maria Rosaria Corbo and Milena Sinigaglia
Foods 2024, 13(18), 2966; https://doi.org/10.3390/foods13182966 - 19 Sep 2024
Viewed by 2798
Abstract
Three strains of Lactiplantibacillus plantarum and three bifidobacteria (Bifidobacterium animalis subsp. lactis, Bifidobacterium breve, and Bifidobacterium subtile) were used as target strains; in addition, for each microorganism, the cell-free supernatant (CFS) was produced and used as an ingredient of the growth medium. Namely, CFSs from lactobacilli were used on bifidobacteria and CFSs from bifidobacteria were used on lactobacilli. The viable count was assessed, and the data were modelled through a reparametrized Gompertz equation cast both in the positive and the negative form to evaluate the parameters t-7log (the time after which the viable count was above 7 log CFU/mL) and t-7log* (the time after which the viable count fell below 7 log CFU/mL); the difference between t-7log* and t-7log defines the stability time. Statistics through a multiparametric ANOVA (analysis of variance) provided evidence for the presence of a bifidogenic and/or bioactive factor produced by bifidobacteria and active on lactobacilli, and vice versa (a bioactive factor of lactobacilli with a functional effect on bifidobacteria), although further studies are required to better explain the mechanisms behind the positive effects. In addition, the influence on the target strains can be found during the growth phase (stimulation), as well as during the senescence and death phase (protective effect), with a strong strain/species dependence on both CFS production and target strain. Full article
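Once a reparametrized Gompertz curve is fitted, t-7log can be computed in closed form. The sketch below uses the common Zwietering reparametrization in its positive form (an assumption on our part; the abstract does not give the exact equation used):

```python
import math

E = math.e

def gompertz_log_count(t, y0, A, mu, lam):
    """Zwietering-reparametrized Gompertz growth curve (positive form):
    log10 count at time t, with y0 the initial log count, A the total
    increase, mu the maximum specific growth rate, lam the lag time."""
    return y0 + A * math.exp(-math.exp(mu * E / A * (lam - t) + 1.0))

def time_to_reach(target, y0, A, mu, lam):
    """Invert the curve for the time at which the log count reaches
    `target` (e.g. t-7log for target = 7 log CFU/mL)."""
    z = math.log(-math.log((target - y0) / A))
    return lam + (1.0 - z) * A / (mu * E)
```

The negative (death-phase) form gives t-7log* the same way, and the stability time is then simply their difference.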
(This article belongs to the Section Food Microbiology)

24 pages, 413 KB  
Article
Bayesian Methods for Step-Stress Accelerated Test under Gamma Distribution with a Useful Reparametrization and an Industrial Data Application
by Hassan S. Bakouch, Fernando A. Moala, Shuhrah Alghamdi and Olayan Albalawi
Mathematics 2024, 12(17), 2747; https://doi.org/10.3390/math12172747 - 4 Sep 2024
Cited by 1 | Viewed by 1276
Abstract
This paper presents a multiple step-stress accelerated life test using type II censoring. Assuming that the lifetimes of the test item follow the gamma distribution, the maximum likelihood estimation and Bayesian approaches are used to estimate the distribution parameters. In the Bayesian approach, new parametrizations can lead to new prior distributions and can be a useful technique to improve the efficiency and effectiveness of Bayesian modeling, particularly when dealing with complex or high-dimensional models. Therefore, in this paper, we present two sets of prior distributions for the parameters of the accelerated test where one of them is based on the reparametrization of the other. The performance of the proposed prior distributions and maximum likelihood approach are investigated and compared by examining the summaries and frequentist coverage probabilities of intervals. We introduce the Markov Chain Monte Carlo (MCMC) algorithms to generate samples from the posterior distributions in order to evaluate the estimators and intervals. Numerical simulations are conducted to examine the approach’s performance and one-sample lifetime data are presented to illustrate the proposed methodology. Full article
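A minimal example of the kind of reparametrization involved: writing the gamma density in terms of its mean μ = α/β, so that a prior can be placed directly on the mean lifetime (an illustrative choice on our part; the paper's specific reparametrization is not stated in the abstract):

```python
import math

def gamma_logpdf(x, alpha, beta):
    """Log density of Gamma(alpha, beta) with rate parameter beta."""
    return (alpha * math.log(beta) - math.lgamma(alpha)
            + (alpha - 1.0) * math.log(x) - beta * x)

def gamma_logpdf_mean_param(x, mu, alpha):
    """The same density reparametrized by its mean mu = alpha / beta
    (i.e. beta = alpha / mu).  A prior elicited on mu in this
    parametrization induces a different, often more interpretable,
    prior than one placed on beta directly."""
    return gamma_logpdf(x, alpha, alpha / mu)
```

Such one-to-one reparametrizations leave the likelihood unchanged but change which quantity the prior is expressed on, which is the efficiency point the abstract makes.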
(This article belongs to the Special Issue Reliability Estimation and Mathematical Statistics)

27 pages, 2286 KB  
Review
The Scale-Invariant Vacuum Paradigm: Main Results and Current Progress Review (Part II)
by Vesselin G. Gueorguiev and Andre Maeder
Symmetry 2024, 16(6), 657; https://doi.org/10.3390/sym16060657 - 26 May 2024
Cited by 8 | Viewed by 1782
Abstract
This is a summary of the main results within the Scale-Invariant Vacuum (SIV) paradigm based on Weyl integrable geometry. We also review the mathematical framework and utilize alternative derivations of the key equations based on the reparametrization invariance as well. The main results discussed are related to the early universe; that is, applications to inflation, Big Bang Nucleosynthesis, and the growth of the density fluctuations within the SIV. Some of the key SIV results for the early universe are a natural exit from inflation within the SIV at a later time t_exit, with a value related to the parameters of the inflationary potential, along with the possibility for the density fluctuations to grow sufficiently fast within the SIV without the need for dark matter to seed the growth of structure in the universe. In the late-time universe, the applications of the SIV paradigm are related to scale-invariant dynamics of galaxies, MOND, dark matter, and dwarf spheroidals, where one can find MOND to be a peculiar case of the SIV theory. Finally, within the recent time epoch, we highlight that some of the change in the length-of-the-day (LOD), about 0.92 cm/yr, can be accounted for by SIV effects in the Earth–Moon system. Full article
(This article belongs to the Special Issue Nature and Origin of Dark Matter and Dark Energy, 2nd Edition)

28 pages, 844 KB  
Article
Sparse Bayesian Neural Networks: Bridging Model and Parameter Uncertainty through Scalable Variational Inference
by Aliaksandr Hubin and Geir Storvik
Mathematics 2024, 12(6), 788; https://doi.org/10.3390/math12060788 - 7 Mar 2024
Cited by 4 | Viewed by 3889
Abstract
Bayesian neural networks (BNNs) have recently regained a significant amount of attention in the deep learning community due to the development of scalable approximate Bayesian inference techniques. There are several advantages of using a Bayesian approach: parameter and prediction uncertainties become easily available, facilitating more rigorous statistical analysis. Furthermore, prior knowledge can be incorporated. However, the construction of scalable techniques that combine both structural and parameter uncertainty remains a challenge. In this paper, we apply the concept of model uncertainty as a framework for structural learning in BNNs and, hence, make inferences in the joint space of structures/models and parameters. Moreover, we suggest an adaptation of a scalable variational inference approach with reparametrization of marginal inclusion probabilities to incorporate the model space constraints. Experimental results on a range of benchmark datasets show that we obtain comparable accuracy results with the competing models, but based on methods that are much more sparse than ordinary BNNs. Full article
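One standard way to reparametrize marginal inclusion probabilities so that they admit stochastic-gradient variational inference is the binary-Concrete relaxation (an assumption on our part; the abstract does not specify the paper's construction):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relaxed_inclusion(lmbda, u, tau=0.5):
    """Binary-Concrete reparametrization of Bernoulli inclusion
    indicators: with logits lmbda and uniform noise u in (0, 1), the
    relaxed sample sigmoid((lmbda + log u - log(1 - u)) / tau) is a
    differentiable function of lmbda.  The inclusion probabilities
    sigmoid(lmbda) can therefore be optimized with stochastic
    gradients; tau -> 0 recovers hard 0/1 indicators."""
    logistic_noise = np.log(u) - np.log1p(-u)
    return sigmoid((lmbda + logistic_noise) / tau)
```

Constraining or sparsifying the model space then amounts to penalizing or thresholding sigmoid(lmbda), which is how structural and parameter uncertainty can share one variational objective.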
(This article belongs to the Special Issue Neural Networks and Their Applications)

13 pages, 428 KB  
Article
A Geometrical Study about the Biparametric Family of Anomalies in the Elliptic Two-Body Problem with Extensions to Other Families
by José Antonio López Ortí, Francisco José Marco Castillo and María José Martínez Usó
Algorithms 2024, 17(2), 66; https://doi.org/10.3390/a17020066 - 4 Feb 2024
Viewed by 3018
Abstract
In the present paper, we efficiently solve the two-body problem for extreme cases such as those with high eccentricities. The use of numerical methods, with the usual variables, cannot maintain the perihelion passage accurately. In previous articles, we have verified that this problem is treated more adequately through temporal reparametrizations related to the mean anomaly through the partition function. The biparametric family of anomalies, with an appropriate partition function, allows a systematic study of these transformations. In the present work, we consider the elliptical orbit as a meridian section of the ellipsoid of revolution, and the partition function depends on two variables raised to specific parameters. One of the variables is the mean radius of the ellipsoid at the secondary, and the other is the distance to the primary. One parameter regulates the concentration of points in the apoapsis region, and the other produces a symmetrical displacement between the polar and equatorial regions. The three most commonly used geodetic latitude variables are also studied; one of them does not belong to the biparametric family, but it does belong to the family introduced here, which implies an extension of the biparametric method. The results obtained using the method presented here allow a causal interpretation of the operation of numerous reparametrizations used in the study of orbital motion. Full article
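The classical member of such families of anomalies is the eccentric anomaly, itself a temporal reparametrization of the mean anomaly obtained by solving Kepler's equation; a minimal solver (names ours, independent of the paper's biparametric construction):

```python
import math

def eccentric_anomaly(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) by Newton iteration.
    Sampling uniformly in E instead of M concentrates points near
    periapsis, which is exactly the kind of redistribution the
    partition-function reparametrizations generalize."""
    E = M if e < 0.8 else math.pi  # standard starting guesses
    for _ in range(100):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def radius(E, a, e):
    """Orbital radius r = a (1 - e cos E) at eccentric anomaly E."""
    return a * (1.0 - e * math.cos(E))
```

For e close to 1 the Newton step in M becomes ill-conditioned near periapsis, which is why reparametrized anomalies are preferred for high-eccentricity integration.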
(This article belongs to the Special Issue Mathematical Modelling in Engineering and Human Behaviour)

16 pages, 1501 KB  
Article
Wald Intervals via Profile Likelihood for the Mean of the Inverse Gaussian Distribution
by Patchanok Srisuradetchai, Ausaina Niyomdecha and Wikanda Phaphan
Symmetry 2024, 16(1), 93; https://doi.org/10.3390/sym16010093 - 11 Jan 2024
Cited by 7 | Viewed by 2271
Abstract
The inverse Gaussian (IG) distribution, known for its flexible shape, is widely used across various applications. Existing confidence intervals for the mean parameter, such as profile likelihood, reparametrized profile likelihood, and Wald-type reparametrized profile likelihood with observed Fisher information intervals, are generally effective. However, our simulation study identifies scenarios where the coverage probability falls below the nominal confidence level. Wald-type intervals are widely used in statistics and have a symmetry property. We mathematically derive the Wald-type profile likelihood (WPL) interval and the Wald-type reparametrized profile likelihood with expected Fisher information (WRPLE) interval and compare their performance to existing methods. Our results indicate that the WRPLE interval outperforms others in terms of coverage probability, while the WPL typically yields the shortest interval. Additionally, we apply the proposed intervals to a real dataset, demonstrating their potential applicability to other datasets that follow the IG distribution. Full article
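A simplified version of the Wald-type construction the paper compares can be sketched directly from the IG maximum likelihood estimates (our sketch, using the plug-in standard error from Var(x̄) = μ³/(nλ); the profile-likelihood refinements studied in the paper are omitted):

```python
import math

def ig_wald_interval(x, z=1.959963984540054):
    """Wald-type interval for the inverse Gaussian mean (sketch).
    MLEs: mu_hat = sample mean; lambda_hat = n / sum(1/x_i - 1/mu_hat).
    Since the sample mean of an IG(mu, lambda) sample has variance
    mu^3 / (n * lambda), the interval is
    mu_hat +/- z * sqrt(mu_hat^3 / (n * lambda_hat))."""
    n = len(x)
    mu = sum(x) / n
    lam = n / sum(1.0 / xi - 1.0 / mu for xi in x)
    se = math.sqrt(mu ** 3 / (n * lam))
    return mu - z * se, mu + z * se
```

The symmetry of the resulting interval around the MLE is the property the abstract refers to; the reparametrized variants differ in which scale the normal approximation is applied on.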
(This article belongs to the Special Issue Symmetry in Probability Theory and Statistics)
