Search Results (29)

Search Parameters:
Keywords = binomial removal

15 pages, 1168 KB  
Article
Disinfection Strategies for Euplotes spp. Control in Marine Copepod Cultures
by Maribeth Wichterman, Grace McCranie, Chase Taylor, Olivia Markham, Brittney Lacy, Matthew DiMaggio and Casey Murray
Fishes 2026, 11(2), 91; https://doi.org/10.3390/fishes11020091 - 2 Feb 2026
Viewed by 662
Abstract
Marine copepods are an essential live feed for the culture of many marine ornamental fish and other finfish species, yet their production is frequently constrained by contamination from free-living ciliates. To address this challenge, the efficacy of three disinfectants (sodium hypochlorite, iodine, and hydrogen peroxide) was evaluated for ciliate removal in cultures of two copepod species, Parvocalanus crassirostris and Oithona colcarva. Appropriate ranges of disinfectant concentrations and exposure durations were identified through a preliminary trial assessing the toxicity to Euplotes spp. over a 5-min period. Subsequent experiments tested three doses of each disinfectant to quantify ciliate removal success and egg hatch rates for each copepod species. Ciliate presence/absence showed no variation (100% in controls, 0% after disinfection), precluding statistical analysis except for one variable iodine trial, which was analyzed using Fisher’s Exact Test. Hatch and recovery rates were analyzed using binomial GLMMs with treatment as a fixed effect and replicate as a random effect, with Tukey-adjusted pairwise comparisons and α = 0.05. Sodium hypochlorite and hydrogen peroxide consistently removed all ciliates across tested concentrations, whereas iodine only achieved complete removal at the highest dose. The effects on hatch rate differed between species, with hydrogen peroxide producing the highest hatch rates in P. crassirostris (approximately 44 to 46% at 50–100 g/L for one minute) and sodium hypochlorite supporting the highest hatch in O. colcarva (up to 92% at 250 mg/L for one minute). These findings demonstrate that disinfectant performance is species-specific and that species-specific disinfection protocols are warranted to improve the reliability of copepod production in marine aquaculture. Full article
(This article belongs to the Special Issue Zooplankton Production Applied to Aquaculture)
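The presence/absence comparison described above reduces to a 2×2 contingency table, for which Fisher's Exact Test gives an exact p-value even with small replicate counts. A minimal SciPy sketch with illustrative counts (not the study's data):

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table (illustrative, not the study's counts):
# replicates still contaminated vs. clean after two iodine doses.
table = [[5, 1],                         # low dose:  5 contaminated, 1 clean
         [1, 5]]                         # high dose: 1 contaminated, 5 clean
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.4f}")
```

With these margins the exact two-sided p-value is 74/924 ≈ 0.08, illustrating how small replicate numbers limit power in this kind of trial.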

12 pages, 1655 KB  
Article
Impact of Integrated Control Interventions on Sandfly Populations in Human and Canine Visceral Leishmaniasis Control in Araçatuba, State of São Paulo, Brazil
by Keuryn Alessandra Mira Luz-Requena, Tania Mara Tomiko Suto, Osias Rangel, Regina Célia Loverdi de Lima Stringheta, Thais Rabelo Santos-Doni, Lilian Aparecida Colebrusco Rodas and Katia Denise Saraiva Bresciani
Insects 2026, 17(1), 125; https://doi.org/10.3390/insects17010125 - 21 Jan 2026
Viewed by 353
Abstract
Visceral leishmaniasis (VL) is a serious vector-borne disease affecting humans and dogs, posing major public health challenges in endemic regions. Control efforts often target sandfly vectors, whose larvae and pupae develop in soil. Environmental management, such as removing organic matter, reducing moisture, and pruning vegetation, aims to limit breeding sites and reduce sandfly populations. This study evaluated the impact of integrated interventions on sandfly behavior in priority areas for VL control in Araçatuba, São Paulo, Brazil. The control strategy combined environmental management, canine surveys, and educational actions across seven local work areas (LWAs). Between 2019 and 2021, CDC-type light traps were installed in intra- and peridomiciliary settings at twelve properties in LWA 5. Spatial risk analysis for canine transmission was conducted in LWAs 3 and 5 using a Generalized Additive Model, with results presented as spatial odds ratios. Vector prevalence was analyzed using negative binomial regression compared to historical municipal data. Intervention coverage averaged 52.91% of visited properties (n = 15,905), ranging from 48% to 76.8% across LWAs. Adherence to environmental management exceeded 85%. Of the 150 sandflies collected, 98.67% were Lutzomyia longipalpis and 1.33% Nyssomyia neivai. A 6% reduction in vector density was observed compared with historical data, although this difference was not statistically significant. Spatial risk varied among LWAs, indicating heterogeneous transmission levels. These findings suggest that integrated environmental and educational interventions may contribute to reducing vector density and that identifying priority areas tends to support surveillance and the effectiveness of disease control actions. Full article
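Negative binomial regression, as used above to compare vector prevalence against historical data, models overdispersed counts with a log link; the exponentiated slope is the rate ratio (e.g., a value of 0.94 corresponds to a 6% reduction). A self-contained sketch fitting the model by maximum likelihood on synthetic counts (all numbers illustrative, not the study's data):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)

def draw_nb(mu, alpha, size):
    # NB2 parameterisation: variance = mu + alpha * mu**2
    n = 1.0 / alpha
    return rng.negative_binomial(n, n / (n + mu), size)

# Synthetic trap counts: period 0 = historical baseline, period 1 =
# post-intervention, with a built-in 6% density reduction.
period = np.repeat([0.0, 1.0], 200)
counts = np.concatenate([draw_nb(3.0, 0.5, 200), draw_nb(2.82, 0.5, 200)])

def neg_loglik(theta):
    b0, b1, log_alpha = theta
    n = np.exp(-log_alpha)               # 1 / alpha
    mu = np.exp(b0 + b1 * period)
    ll = (gammaln(counts + n) - gammaln(n) - gammaln(counts + 1)
          + n * np.log(n / (n + mu)) + counts * np.log(mu / (n + mu)))
    return -ll.sum()

fit = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
rate_ratio = float(np.exp(fit.x[1]))     # multiplicative change in density
print(f"estimated rate ratio: {rate_ratio:.2f}")
```

As in the study, a modest reduction of this size is hard to distinguish from no change without large samples.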

31 pages, 12350 KB  
Article
Statistical Evaluation of Beta-Binomial Probability Law for Removal in Progressive First-Failure Censoring and Its Applications to Three Cancer Cases
by Ahmed Elshahhat, Osama E. Abo-Kasem and Heba S. Mohammed
Mathematics 2025, 13(18), 3028; https://doi.org/10.3390/math13183028 - 19 Sep 2025
Viewed by 766
Abstract
Progressive first-failure censoring is a flexible and cost-efficient strategy that captures real-world testing scenarios where only the first failure is observed at each stage while randomly removing remaining units, making it ideal for biomedical and reliability studies. By applying the α-power transformation to the exponential baseline, the proposed model introduces an additional flexibility parameter that enriches the family of lifetime distributions, enabling it to better capture varying failure rates and diverse hazard rate behaviors commonly observed in biomedical data, thus extending the classical exponential model. This study develops a novel computational framework for analyzing an α-powered exponential model under beta-binomial random removals within the proposed censoring test. To address the inherent complexity of the likelihood function arising from simultaneous random removals and progressive censoring, we derive closed-form expressions for the likelihood, survival, and hazard functions and propose efficient estimation strategies based on both maximum likelihood and Bayesian inference. For the Bayesian approach, gamma and beta priors are adopted, and a tailored Metropolis–Hastings algorithm is implemented to approximate posterior distributions under symmetric and asymmetric loss functions. To evaluate the empirical performance of the proposed estimators, extensive Monte Carlo simulations are conducted, examining bias, mean squared error, and credible interval coverage under varying censoring levels and removal probabilities. Furthermore, the practical utility of the model is illustrated through three oncological datasets, including multiple myeloma, lung cancer, and breast cancer patients, demonstrating superior goodness of fit and predictive reliability compared to traditional models. The results show that the proposed lifespan model, under the beta-binomial probability law and within the examined censoring mechanism, offers a flexible and computationally tractable framework for reliability and biomedical survival analysis, providing new insights into censored data structures with random withdrawals. Full article
(This article belongs to the Special Issue New Advance in Applied Probability and Statistical Inference)
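The beta-binomial removal mechanism described above can be simulated directly: at each stage a removal probability is drawn from a Beta prior and the number of withdrawn units follows a binomial draw on the survivors. A sketch with illustrative parameters (the paper's group structure and hyperparameters may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

# At each of m stages one first failure is observed, then R_i of the
# remaining units are withdrawn with p_i ~ Beta(a, b) and
# R_i ~ Binomial(remaining, p_i) - i.e., R_i is beta-binomial.
def beta_binomial_removals(n_units, m_stages, a=2.0, b=5.0):
    removals, remaining = [], n_units
    for _ in range(m_stages):
        remaining -= 1                   # the observed failure at this stage
        removals.append(int(rng.binomial(remaining, rng.beta(a, b))))
        remaining -= removals[-1]
        if remaining <= 0:
            break
    return removals, remaining

removals, left = beta_binomial_removals(n_units=30, m_stages=8)
```

Every unit is accounted for: observed failures plus removals plus survivors always sum to the initial sample size, which is the bookkeeping the likelihood in such models is built on.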

11 pages, 711 KB  
Article
Therapeutic Plasma Exchange in Acute Liver Failure: A Real-World Study in Mexico
by Jose Carlos Gasca-Aldama, Jesús Enrique Castrejón-Sánchez, Mario A. Carrasco Flores, Enzo Vásquez-Jiménez, Paulina Carpinteyro-Espin, Juanita Pérez-Escobar, Karlos Dhamian Gutierrez-Toledo, Pablo E. Galindo, Marcos Vidals-Sanchez and Paula Costa-Urrutia
Healthcare 2025, 13(16), 2059; https://doi.org/10.3390/healthcare13162059 - 20 Aug 2025
Cited by 2 | Viewed by 3499
Abstract
Background/Objectives: Acute liver failure (ALF) is a life-threatening condition with high mortality in nontransplant candidates. Therapeutic plasma exchange (TPE) has emerged as a promising intervention for removing inflammatory mediators and toxic metabolites. In Latin America, data on the efficacy of TPE in ALF patients are limited. This real-world study aimed to compare 30-day survival outcomes between patients receiving standard medical treatment (SMT) and those receiving SMT plus TPE. Methods: We analyzed 25 ALF patients admitted to the tertiary intensive care unit (ICU) of Hospital Juárez of Mexico City, Mexico, from 2018 to 2024. Patients received either standard medical treatment (SMT group, n = 12) or SMT with TPE (TPE group, n = 13), including high-volume TPE (n = 8) and standard-volume TPE (n = 5). Survival analysis was performed via Kaplan–Meier estimates, and binomial regression analysis was run to estimate the mortality probability stratified by the hepatic encephalopathy grade. Results: At 30 days, survival was significantly greater in the TPE group (92%) than in the SMT group (50%) (p = 0.02). The greatest survival benefit was observed in patients with Grade 4 encephalopathy. The ICU stay was longer in the TPE group, reflecting the complexity of ALF management. Conclusions: TPE significantly improves 30-day survival in ALF patients compared with SMT alone, supporting its role as an adjunct therapy. Further studies are needed to refine patient selection and optimize treatment protocols. Full article
(This article belongs to the Section Critical Care)
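The Kaplan–Meier estimate used for the 30-day survival comparison is a simple product-limit calculation: at each observed death, survival is multiplied by the fraction of at-risk patients who survived that time. A minimal sketch on illustrative follow-up data (not the study's patients):

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate; returns (time, S(t)) at each death."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if events[i]:                    # death observed at this time
            surv *= 1.0 - 1.0 / at_risk
            curve.append((times[i], surv))
        at_risk -= 1                     # subject leaves the risk set
    return curve

# Illustrative data: 1 = death, 0 = censored (alive at day 30).
times  = [5, 9, 12, 15, 20, 30, 30, 30]
events = [1, 1, 0,  1,  0,  0,  0,  0]
curve = kaplan_meier(times, events)
```

Censored patients still shrink the risk set without dropping the curve, which is what lets 30-day survivors contribute information to each earlier step.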

34 pages, 18712 KB  
Article
Statistical Computation of Hjorth Competing Risks Using Binomial Removals in Adaptive Progressive Type II Censoring
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Mathematics 2025, 13(12), 2010; https://doi.org/10.3390/math13122010 - 18 Jun 2025
Cited by 1 | Viewed by 698
Abstract
In complex reliability applications, it is common for the failure of an individual or an item to be attributed to multiple causes known as competing risks. This paper explores the estimation of the Hjorth competing risks model based on an adaptive progressive Type II censoring scheme via a binomial removal mechanism. For parameter and reliability metric estimation, both frequentist and Bayesian methodologies are developed. Maximum likelihood estimates for the Hjorth parameters are computed numerically due to their intricate form, while the binomial removal parameter is derived explicitly. Confidence intervals are constructed using asymptotic approximations. Within the Bayesian paradigm, gamma priors are assigned to the Hjorth parameters and a beta prior for the binomial parameter, facilitating posterior analysis. Markov Chain Monte Carlo techniques yield Bayesian estimates and credible intervals for parameters and reliability measures. The performance of the proposed methods is compared using Monte Carlo simulations. Finally, to illustrate the practical applicability of the proposed methodology, two real-world competing risk data sets are analyzed: one representing the breaking strength of jute fibers and the other representing the failure modes of electrical appliances. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation: 3rd Edition)
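The remark above that the binomial removal parameter "is derived explicitly" reflects a standard result: if R_i ~ Binomial(n_i, p) units are withdrawn from the n_i survivors after each failure, the MLE is simply p̂ = ΣR_i / Σn_i, with no numerical optimisation needed. An illustrative simulation (not the paper's data or its adaptive threshold logic):

```python
import numpy as np

rng = np.random.default_rng(7)

# After the i-th observed failure, R_i of the n_i surviving units are
# withdrawn with R_i ~ Binomial(n_i, p); p_hat = sum(R) / sum(N) is the
# closed-form MLE, unlike the Hjorth parameters, which need optimisation.
def simulate_removals(n_units, m_failures, p):
    avail, R, N = n_units, [], []
    for _ in range(m_failures):
        avail -= 1                       # one observed failure
        N.append(avail)
        R.append(int(rng.binomial(avail, p)))
        avail -= R[-1]
        if avail <= 0:
            break
    return R, N

R, N = simulate_removals(n_units=200, m_failures=10, p=0.2)
p_hat = sum(R) / sum(N)
```

With a few hundred unit-stages at risk, p̂ recovers the true removal probability closely, which is why the simulation studies can focus their effort on the lifetime parameters.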

11 pages, 2664 KB  
Article
Physical and Chemical Characteristics of Aedes aegypti Larval Habitats in Nouakchott, Mauritania
by Mohamed Haidy Massa, Mohamed Aly Ould Lemrabott, Osman Abdillahi Guedi, Sébastien Briolant and Ali Ould Mohamed Salem Boukhary
Trop. Med. Infect. Dis. 2025, 10(6), 147; https://doi.org/10.3390/tropicalmed10060147 - 23 May 2025
Viewed by 3732
Abstract
Aedes aegypti, the main urban vector of dengue fever, represents a growing public health problem in Nouakchott, the capital of Mauritania. Identifying the factors influencing the distribution and productivity of its breeding sites is essential for the development of effective control strategies. From May 2023 to April 2024, physico-chemical characteristics were recorded and mosquito larvae were collected, using a standard dipping method, from 60 water collections each month during the dry season and twice a month during the rainy season, totaling 294 observations. The larval positivity of water collections and larval abundance of breeding sites over time were modeled using a random-effect logistic regression model and a negative binomial regression model, respectively. The depth, distance from habitat, type of water collection and exposure to sunlight were statistically significant and independently associated with water collection positivity for Ae. aegypti larvae (aOR = 5.18, 95%CI [1.66–16.18], p-value = 0.005; aOR = 0.00, 95%CI [0.00–0.02], p-value < 0.001; aOR = 252.88, 95%CI [4.05–15,786.84], p-value = 0.009 and aOR = 0.04, 95%CI [0.01–0.26], p-value < 0.001, respectively). Aedes aegypti larval habitats were mainly artificial (90%), temporary (n = 217 observations), close to dwellings (n = 114) and shaded (n = 96). Plastic water tanks (n = 17, 48.6%), wells (n = 6, 17.1%) and barrels (n = 4, 11.4%) were the most common breeding sites. Larval abundance was negatively associated with increasing container pH and surface area (aOR = 0.50, 95%CI [0.33–0.75], p-value = 0.001 and aOR = 0.48, 95%CI [0.27–0.87], p-value = 0.016, respectively). As Ae. aegypti mosquitoes are multi-resistant to adult insecticides and dengue has become endemo-epidemic since 2014, vector control should give priority to the physical removal or treatment of shaded, peridomestic containers—particularly plastic water tanks and barrels—and consider the use of biological larvicides to target breeding sites with low pH and small surface areas. Full article
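The adjusted odds ratios (aOR) and 95% CIs reported above come from exponentiating logistic-regression coefficients: aOR = exp(β), with interval exp(β ± 1.96·SE). The β/SE pair below is a back-calculation chosen to reproduce the depth estimate quoted above (aOR = 5.18, 95%CI [1.66–16.18]); it is illustrative, not model output:

```python
import math

# Hypothetical fitted coefficient and standard error for the depth variable.
beta, se = 1.645, 0.581
aor = math.exp(beta)                               # adjusted odds ratio
lo, hi = math.exp(beta - 1.96 * se), math.exp(beta + 1.96 * se)
print(f"aOR = {aor:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

The asymmetry of the interval (1.66 to 16.18 around 5.18) is typical of odds ratios: the CI is symmetric on the log scale, not the raw scale.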

26 pages, 12878 KB  
Article
Reliability Estimation for the Inverse Chen Distribution Under Adaptive Progressive Censoring with Binomial Removals: A Framework for Asymmetric Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Symmetry 2025, 17(6), 812; https://doi.org/10.3390/sym17060812 - 23 May 2025
Cited by 2 | Viewed by 912
Abstract
Traditional reliability methods using fixed removal plans often overlook withdrawal randomness, leading to biased estimates for asymmetric data. This study advances classical and Bayesian frameworks for the inverse Chen distribution, which is suited for modeling asymmetric data under adaptive progressively Type-II censoring with binomial removals. Here, removals post-failure follow a dynamic binomial process, offering a more realistic approach for reliability studies. Maximum likelihood estimates are computed numerically, with confidence intervals derived asymptotically. Bayesian approaches employ gamma priors, symmetric squared error loss, and posterior sampling for estimates and credible intervals. A simulation study validates the methods, while two asymmetric real-world applications demonstrate practicality: (1) analyzing diamond sizes from South-West Africa, capturing skewed geological distributions, and (2) modeling failure times of airborne communication transceivers, vital for aviation safety. The flexibility of the inverse Chen in handling asymmetric data addresses the limitations of symmetric assumptions, offering precise reliability tools for complex scenarios. This integration of adaptive censoring and asymmetric distributions advances reliability analysis, providing robust solutions where traditional approaches falter. Full article

28 pages, 1067 KB  
Article
Inference Based on Progressive-Stress Accelerated Life-Testing for Extended Distribution via the Marshall-Olkin Family Under Progressive Type-II Censoring with Optimality Techniques
by Ehab M. Almetwally, Osama M. Khaled and Haroon M. Barakat
Axioms 2025, 14(4), 244; https://doi.org/10.3390/axioms14040244 - 23 Mar 2025
Cited by 3 | Viewed by 1020
Abstract
This paper explores a progressive-stress accelerated life test under progressive type-II censoring with binomial random removal. It assumes a cumulative exposure model in which the lifetimes of test units follow a Marshall–Olkin length-biased exponential distribution. The study derives maximum likelihood and Bayes estimates of the model parameters and constructs Bayes estimates of the unknown parameters under various loss functions. In addition, this study provides approximate, credible, and bootstrapping confidence intervals for the estimators. Moreover, it evaluates three optimal test methods to determine the most effective censoring approach based on various optimality criteria. A real-life dataset is analyzed to demonstrate the proposed procedures, and simulation studies are used to compare two different designs of the progressive-stress test. Full article
(This article belongs to the Special Issue Stochastic Modeling and Optimization Techniques)

21 pages, 9147 KB  
Article
Exploration and Enrichment Analysis of the QTLome for Important Traits in Livestock Species
by Francisco J. Jahuey-Martínez, José A. Martínez-Quintana, Felipe A. Rodríguez-Almeida and Gaspar M. Parra-Bracamonte
Genes 2024, 15(12), 1513; https://doi.org/10.3390/genes15121513 - 26 Nov 2024
Cited by 5 | Viewed by 2428
Abstract
Background: Quantitative trait loci (QTL) are genomic regions that influence essential traits in livestock. Understanding QTL distribution and density across species’ genomes is crucial for animal genetics research. Objectives: This study explored the QTLome of cattle, pigs, sheep, and chickens by analyzing QTL distribution and evaluating the correlation between QTL, gene density, and chromosome size with the aim to identify QTL-enriched genomic regions. Methods: Data from 211,715 QTL (1994–2021) were retrieved from the AnimalQTLdb and analyzed using R software v4.2.1. Unique QTL annotations were identified, and redundant or inconsistent data were removed. Statistical analyses included Pearson correlations and binomial, hypergeometric, and bootstrap-based enrichment tests. Results: QTL densities per Mbp were 10 for bovine, 4 for pig, 1 for sheep, and 3 for chicken genomes. Analysis of QTL distribution across chromosomes revealed uneven patterns, with certain regions enriched for QTL. Correlation analysis revealed a strong positive relationship between QTL and gene density/chromosome size across all species (p < 0.05). Enrichment analysis identified pleiotropic regions, where QTL affect multiple traits, often aligning with known candidate and major genes. Significant QTL-enriched windows (p < 0.05) were detected, with 699 (187), 355 (68), 50 (15), and 38 (17) genomic windows for cattle, pigs, sheep, and chickens, respectively, associated with overall traits (and specific phenotypic categories). Conclusions: This study provides critical insights into QTL distribution and its correlation with gene density, offering valuable data for advancing genetic research in livestock species. The identification of QTL-enriched regions also highlights key areas for future exploration in trait improvement programs. Full article
(This article belongs to the Special Issue Functional Genomics and Breeding of Animals)
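The hypergeometric enrichment test mentioned above asks, for each genomic window, whether it contains more trait-associated QTL than expected if annotations were placed at random. A SciPy sketch with illustrative numbers (not the study's counts):

```python
from scipy.stats import hypergeom

# M annotations genome-wide, K in the trait class of interest,
# n falling in the window, k of those n belonging to the class.
M, K, n, k = 10_000, 400, 120, 15
p_enrich = hypergeom.sf(k - 1, M, K, n)   # P(X >= k) under random placement
print(f"enrichment p-value: {p_enrich:.3g}")
```

Here the expected overlap is n·K/M = 4.8, so observing 15 is strong evidence of enrichment; in a genome-wide scan, such p-values would still need multiple-testing correction across windows.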

24 pages, 607 KB  
Article
Bivariate Length-Biased Exponential Distribution under Progressive Type-II Censoring: Incorporating Random Removal and Applications to Industrial and Computer Science Data
by Aisha Fayomi, Ehab M. Almetwally and Maha E. Qura
Axioms 2024, 13(10), 664; https://doi.org/10.3390/axioms13100664 - 26 Sep 2024
Cited by 3 | Viewed by 1604
Abstract
In this paper, we address the analysis of bivariate lifetime data from a length-biased exponential distribution observed under Type II progressive censoring with random removals, where the number of units removed at each failure time follows a binomial distribution. We derive the likelihood function for the progressive Type II censoring scheme with random removals and apply it to the bivariate length-biased exponential distribution. The parameters of the proposed model are estimated using both likelihood and Bayesian methods for point and interval estimators, including asymptotic confidence intervals and bootstrap confidence intervals. We also employ different loss functions to construct Bayesian estimators. Additionally, a simulation study is conducted to compare the performance of censoring schemes. The effectiveness of the proposed methodology is demonstrated through the analysis of two real datasets from the industrial and computer science domains, providing valuable insights for illustrative purposes. Full article
(This article belongs to the Special Issue Applications of Bayesian Methods in Statistical Analysis)

22 pages, 420 KB  
Article
Estimation of Marshall–Olkin Extended Generalized Extreme Value Distribution Parameters under Progressive Type-II Censoring by Using a Genetic Algorithm
by Rasha Abd El-Wahab Attwa, Shimaa Wasfy Sadk and Taha Radwan
Symmetry 2024, 16(6), 669; https://doi.org/10.3390/sym16060669 - 29 May 2024
Cited by 6 | Viewed by 1744
Abstract
In this article, we consider the statistical analysis of the parameter estimation of the Marshall–Olkin extended generalized extreme value under linear normalization distribution (MO-GEVL) within the context of progressively type-II censored data. The progressively type-II censored data are considered under three removal patterns: fixed, discrete uniform, and binomial random removal. The challenge lies in the computation of maximum likelihood estimations (MLEs), as there is no straightforward analytical solution. The classical numerical methods are considered inadequate for solving the complex MLE equation system, leading to the necessity of employing artificial intelligence algorithms. This article utilizes the genetic algorithm (GA) to overcome this difficulty. This article considers parameter estimation through both maximum likelihood and Bayesian methods. For the MLE, the confidence intervals of the parameters are calculated using the Fisher information matrix. In the Bayesian estimation, the Lindley approximation is applied, considering LINEX and squared error loss functions, suitable for both non-informative and informative contexts. The effectiveness and applicability of these proposed methods are demonstrated through numerical simulations and practical real-data examples. Full article

9 pages, 860 KB  
Article
Number Bias in Clinicians’ Documentation of Actinic Keratosis Removal
by Phillip G. Holovach, Wei-Wen Hsu and Alan B. Fleischer
J. Clin. Med. 2024, 13(1), 202; https://doi.org/10.3390/jcm13010202 - 29 Dec 2023
Viewed by 1385
Abstract
Background: Actinic keratosis (AK) is a pre-cancerous skin condition caused by sun exposure. Number bias, a phenomenon that occurs when meaning other than numerical value is associated with numbers, may influence the reporting of AK removal. The present study aims to determine if number bias is affecting healthcare providers’ documentation of patient-provider encounters. Methods: A single-center retrospective chart review of 1415 patients’ charts was conducted at the University of Cincinnati Medical Center. To determine if there was a significant difference between even and odd-numbered AK removals reported, an exact binomial test was used. The frequency of removals per encounter was fitted to a zero-truncated negative binomial distribution to predict the number of removals expected. All data were analyzed with RStudio. Results: There were 741 odd and 549 even encounters. Odd removals were reported at a significantly greater frequency than even removals (p < 0.001). Age may be contributing to the observed number bias (p < 0.001). Removals of one, two, and eight were reported more frequently than expected, while removals of nine, 13, and 14 were reported less frequently than expected. Conclusion: Number bias may be affecting clinicians’ documentation of AK removal and should be investigated in other clinical settings. Full article
(This article belongs to the Section Dermatology)
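The odd-versus-even comparison above is an exact binomial test: under the null hypothesis, an encounter is equally likely to record an odd or even removal count. Using the counts reported in the abstract (741 odd, 549 even):

```python
from scipy.stats import binomtest

# Two-sided exact binomial test of "odd" successes against p = 0.5.
result = binomtest(k=741, n=741 + 549, p=0.5)
print(f"proportion odd = {result.statistic:.3f}, p = {result.pvalue:.2e}")
```

With 741 of 1290 encounters odd (57.4%), the deviation from 50% is roughly five standard errors, consistent with the reported p < 0.001.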

23 pages, 832 KB  
Article
Stress–Strength Reliability Analysis for Different Distributions Using Progressive Type-II Censoring with Binomial Removal
by Ibrahim Elbatal, Amal S. Hassan, L. S. Diab, Anis Ben Ghorbal, Mohammed Elgarhy and Ahmed R. El-Saeed
Axioms 2023, 12(11), 1054; https://doi.org/10.3390/axioms12111054 - 15 Nov 2023
Cited by 9 | Viewed by 2395
Abstract
In the statistical literature, one of the most important subjects that is commonly used is stress–strength reliability, which is defined as δ = P(W < V), where V and W are the strength and stress random variables, respectively, and δ is the reliability parameter. Type-II progressive censoring with binomial removal is used in this study to examine the inference of δ = P(W < V) for a component with strength V subjected to stress W. We suppose that V and W are independent random variables taken from the Burr XII distribution and the Burr III distribution, respectively, with a common shape parameter. The maximum likelihood estimator of δ is derived. The Bayes estimator of δ under the assumption of independent gamma priors is derived. Because the Bayes estimates under the squared error and linear exponential loss functions lack explicit forms, the Metropolis–Hastings method is used to obtain them. Utilizing comprehensive simulations and two metrics (average of estimates and root mean squared errors), we compare these estimators. Further, an analysis is performed on two actual data sets based on breakdown times for insulating fluid between electrodes recorded under varying voltages. Full article
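The quantity δ = P(W < V) has a direct Monte Carlo interpretation: draw stress and strength samples and count how often strength exceeds stress. A sketch with SciPy's Burr XII and Burr III distributions sharing a shape parameter (parameter values are illustrative, not estimates from the paper's data):

```python
import numpy as np
from scipy.stats import burr, burr12

rng = np.random.default_rng(3)

# Strength V ~ Burr XII(c, d), stress W ~ Burr III(c, k), common shape c.
c = 2.0
V = burr12.rvs(c, 3.0, size=100_000, random_state=rng)   # strength
W = burr.rvs(c, 0.8, size=100_000, random_state=rng)     # stress
delta_hat = float(np.mean(W < V))
# For these shapes the integral reduces to 3*B(1.8, 3) ~ 0.313, so the
# Monte Carlo estimate should land close to that value.
print(f"delta_hat = {delta_hat:.3f}")
```

The shared shape parameter is what makes the closed-form reduction (and the paper's exact MLE) tractable; with unequal shapes the probability generally needs numerical integration.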

22 pages, 514 KB  
Article
Enhancement of Non-Permutation Binomial Power Functions to Construct Cryptographically Strong S-Boxes
by Herman Isa, Syed Alwee Aljunid Syed Junid, Muhammad Reza Z’aba, Rosdisham Endut, Syed Mohammad Ammar and Norshamsuri Ali
Mathematics 2023, 11(2), 446; https://doi.org/10.3390/math11020446 - 14 Jan 2023
Cited by 12 | Viewed by 3033
Abstract
A Substitution box (S-box) is an important component used in symmetric key cryptosystems to satisfy Shannon’s property on confusion. As the only nonlinear operation, the S-box must be cryptographically strong to thwart any cryptanalysis tools on cryptosystems. Generally, the S-boxes can be constructed using any of the following approaches: the random search approach, heuristic/evolutionary approach or mathematical approach. However, the current S-box construction has some drawbacks, such as low cryptographic properties for the random search approach and the fact that it is hard to develop mathematical functions that can be used to construct a cryptographically strong S-box. In this paper, we explore the non-permutation function that was generated from the binomial operation of the power function to construct a cryptographically strong S-box. By adopting the method called the Redundancy Removal Algorithm, we propose some enhancement in the algorithm such that the desired result can be obtained. The analytical results of our experiment indicate that all criteria such as bijective, nonlinearity, differential uniformity, algebraic degree and linear approximation are found to hold in the obtained S-boxes. Our proposed S-box also surpassed several bijective S-boxes available in the literature in terms of cryptographic properties. Full article
(This article belongs to the Special Issue New Advances in Coding Theory and Cryptography)
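A binomial power function over GF(2^8) maps x to x^a + x^b in the field, but such a map is not guaranteed to be a permutation, which is the defect the paper's Redundancy Removal Algorithm repairs. The sketch below builds a candidate S-box and tests bijectivity; the AES reduction polynomial 0x11B and the exponents are illustrative choices, not necessarily the paper's:

```python
# Carry-less multiplication in GF(2^8) modulo an irreducible polynomial.
def gf_mul(x, y, mod=0x11B):
    r = 0
    while y:
        if y & 1:
            r ^= x
        x <<= 1
        if x & 0x100:                    # reduce when degree reaches 8
            x ^= mod
        y >>= 1
    return r

# Square-and-multiply exponentiation in the field.
def gf_pow(x, e):
    r = 1
    while e:
        if e & 1:
            r = gf_mul(r, x)
        x = gf_mul(x, x)
        e >>= 1
    return r

# Candidate S-box from the binomial power function f(x) = x^a + x^b
# (addition in GF(2^8) is XOR).
def binomial_sbox(a, b):
    return [gf_pow(x, a) ^ gf_pow(x, b) for x in range(256)]

sbox = binomial_sbox(254, 2)             # x^254 is field inversion
is_permutation = len(set(sbox)) == 256   # bijectivity check
```

If `is_permutation` is false, some outputs collide and others never occur; the redundancy-removal idea is to reassign those collisions so the table becomes bijective before measuring nonlinearity and the other criteria listed above.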

13 pages, 387 KB  
Review
Safety of Nicotine Replacement Therapy during Pregnancy: A Narrative Review
by María Morales-Suárez-Varela, Beatriz Marcos Puig, Linda Kaerlev, Isabel Peraita-Costa and Alfredo Perales-Marín
Int. J. Environ. Res. Public Health 2023, 20(1), 250; https://doi.org/10.3390/ijerph20010250 - 23 Dec 2022
Cited by 13 | Viewed by 6947
Abstract
Background: Smoking during pregnancy is a public health problem worldwide and the leading preventable cause of fetal morbidity and mortality and obstetric disease. Although the risk of tobacco-related harm can be substantially reduced if mothers stop smoking in the first trimester, the proportion of women who do so remains modest; therefore, the treatment of smoking in pregnant women will be the first therapeutic measure that health professionals should adopt when providing care to pregnant women. The recommendation of nicotine replacement therapy during pregnancy remains controversial due to the potential effects on the health of the fetus. Purpose: The aim of this review was to provide an overview of human studies about the use of nicotine replacement therapy during pregnancy, evaluating the efficacy and safety of the different formulations. Methods: The electronic databases PubMed and EMBASE were searched from May 2012 to May 2022. A total of 95 articles were identified through database searching using a combination of keywords. Out of 79 screened articles and after the removal of duplicates, 28 full-text articles were assessed for eligibility and 12 articles were finally included for review. Results: Although demonstrated to be effective in adult smokers, evidence in support of NRT in pregnant women is limited. The results of the apparent safety of the use of NRT during pregnancy contradict the FDA classification of the different NRT formulations. Faster-acting formulations seem to be the safest and even most beneficial forms for the offspring. Conclusions: NRT is not completely harmless for the fetus or for the mother; however, if an adequate assessment of the risk-benefit binomial is made, its use during pregnancy to aid in quitting smoking does seem appropriate. It is necessary to establish individual recommendations on the formulation and dose to be used during pregnancy based on individual nicotinic needs. Full article