Search Results (68)

Search Parameters:
Keywords = non-homogeneous Poisson process

16 pages, 716 KB  
Article
Improving Bovine Tuberculosis Surveillance Through Risk-Based Prioritization of Slaughterhouse-Triggered Trace-Back Investigations
by Luiz Felipe Crispim Lourenço and Ricardo Evandro Mendes
Animals 2026, 16(8), 1224; https://doi.org/10.3390/ani16081224 - 16 Apr 2026
Viewed by 273
Abstract
Slaughterhouse detection of lesions compatible with bovine tuberculosis represents a key passive surveillance component in Santa Catarina, Brazil, yet subsequent trace-back investigations often fail to identify infected farms. This study developed a quantitative framework to prioritize epidemiological investigations by estimating the probability of infection associated with each farm connected to PCR-confirmed cases. Using official movement records and historical diagnostic data, we reconstructed the lifetime contact networks of slaughtered cattle presenting confirmed Mycobacterium bovis lesions (n = 502). For each sentinel animal–farm interaction (n = 1452), infection probability was estimated through a non-homogeneous Poisson process incorporating exposure duration and the time-weighted average herd size as determinants of infectious pressure. After evaluating stochastic variability through Monte Carlo simulation, a deterministic model using the mean infectious-pressure parameter was applied to classify farms into high-, medium-, and low-risk categories. Model performance was assessed using validated field diagnostic outcomes within a three-year temporal window. High-risk farms accounted for most validated contacts (58%) and demonstrated a relative risk of 3.48 compared with the lower-risk categories. These findings indicate that a standardized risk-based classification can substantially improve the prioritization of trace-back investigations, offering a practical decision-support tool to enhance bovine tuberculosis surveillance and contribute to eradication strategies in Santa Catarina. Full article
(This article belongs to the Section Cattle)
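The per-contact risk estimate in this abstract follows from the NHPP's defining property: the probability of at least one event over a stay is one minus the exponential of the integrated intensity. A minimal sketch, assuming a cumulative intensity simply proportional to exposure duration and herd size; the `base_rate` scale is a hypothetical stand-in for the paper's fitted infectious-pressure parameter, not the authors' model:

```python
import math

def infection_probability(exposure_days, herd_size, base_rate=1e-5):
    # Under an NHPP, P(at least one infection event during the stay)
    # = 1 - exp(-cumulative intensity). Here the cumulative intensity
    # is taken proportional to exposure duration and time-weighted
    # herd size; base_rate is an illustrative infectious-pressure scale.
    cumulative_intensity = base_rate * exposure_days * herd_size
    return 1.0 - math.exp(-cumulative_intensity)

# Longer stays on larger herds carry a higher estimated risk.
p_short = infection_probability(exposure_days=30, herd_size=50)
p_long = infection_probability(exposure_days=365, herd_size=200)
```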

20 pages, 3015 KB  
Article
A Comprehensive Cost Estimation Model for Energy-Efficient and Reliable Operation of Rainwater Pumping Stations
by Jin-Gul Joo, In-Seon Jeong, Jin-Ho You, Seungwan Han and Seung-Ho Kang
Water 2026, 18(6), 676; https://doi.org/10.3390/w18060676 - 13 Mar 2026
Viewed by 357
Abstract
The increasing frequency of torrential rainfall due to global warming has resulted in a significant rise in urban flooding and river overflows. Rainwater pumping stations, typically located near rivers, serve as buffers between sewer systems and receiving water bodies, helping to mitigate flood risks. A primary challenge in operating these stations is optimizing pump performance to prevent flooding while minimizing energy consumption and costs. Various computational methods, including meta-heuristics and deep learning, have been proposed to tackle this optimization problem. However, most studies either overlook or inadequately address pump maintenance costs, which are essential for long-term operational efficiency. This gap stems from the lack of a comprehensive model that accurately captures the full spectrum of costs involved in pump operation. This paper introduces a cost estimation model that integrates both deterministic and probabilistic elements to enhance the energy-efficient operation of rainwater pumping stations. The model focuses on pumps with capacities of 100 m3/min and 170 m3/min, which are commonly used. It takes into account electricity consumption costs as well as maintenance costs arising from frequent on/off cycles and dry-run events. Predictions of failures due to these operational stresses are modeled using the Crow–AMSAA non-homogeneous Poisson process (NHPP) and Weibull distributions—probabilistic models widely used in mechanical failure analysis. To evaluate the proposed model, simulations were conducted using the Storm Water Management Model (SWMM), comparing a deep reinforcement learning-based control strategy with the current operational method at the Gasan Pumping Station in Seoul, South Korea. The pump operating costs associated with each method were calculated and analyzed using the proposed model, demonstrating its potential for ensuring cost-effective and reliable pump operation. Full article
(This article belongs to the Section Urban Water Management)
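The Crow–AMSAA model named in this abstract is the power-law NHPP: expected cumulative failures E[N(t)] = λt^β, with intensity λβt^(β−1), so β > 1 captures wear-out from repeated on/off cycles and dry runs. A sketch with illustrative parameter values (not the paper's fitted pump parameters):

```python
def crow_amsaa_mean(t, lam, beta):
    # Expected cumulative failures by time t: E[N(t)] = lam * t**beta.
    return lam * t ** beta

def crow_amsaa_intensity(t, lam, beta):
    # Failure intensity rho(t) = d/dt E[N(t)] = lam * beta * t**(beta - 1);
    # beta > 1 means failures accelerate as operating stress accumulates.
    return lam * beta * t ** (beta - 1)

# Illustrative wear-out case (beta = 1.4): expected failures by 1000 h.
expected_by_1000h = crow_amsaa_mean(1000.0, lam=0.002, beta=1.4)
```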

27 pages, 3061 KB  
Article
LEO Satellite and UAV-Assisted Maritime Internet of Things: Modeling and Performance Analysis for Data Acquisition
by Xu Hu, Bin Lin, Ping Wang and Xiao Lu
Future Internet 2026, 18(1), 24; https://doi.org/10.3390/fi18010024 - 1 Jan 2026
Viewed by 743
Abstract
The integration of low Earth orbit (LEO) satellites and unmanned aerial vehicles (UAVs) into the maritime Internet of Things (MIoT) offers an effective solution to overcoming the limitations of connectivity and transmission reliability in conventional MIoT, thereby supporting marine data acquisition. However, the highly dynamic ocean environment necessitates a theoretical framework for system-level performance evaluation before practical deployment. In this article, we consider a LEO satellite and UAV-assisted MIoT (LSU-MIoT) network and develop an analytical framework to evaluate its transmission performance. Specifically, marine devices and relaying buoys are modeled as a Matérn cluster process on the sea surface, UAVs as a homogeneous Poisson point process, and LEO satellites as a spherical Poisson point process. Signal transmissions over marine, aerial, and space links are characterized by Nakagami-m, Rician, and shadowed Rician fading, respectively, with the two-ray path loss model applied to sea and air links for accurately capturing propagation characteristics. By leveraging stochastic geometry, we derive analytical expressions for transmission success probability and end-to-end delay of regular and emergency data under the time division multiple access and non-orthogonal multiple access schemes. Simulation results validate the accuracy of derived expressions and reveal the impact of key parameters on the performance of LSU-MIoT networks. Full article
(This article belongs to the Special Issue Wireless Sensor Networks and Internet of Things)
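Of the three point processes above, the homogeneous Poisson point process used for the UAV layer is the simplest to simulate: draw a Poisson point count with mean density × area, then place points uniformly. A sketch on a flat rectangle (units and density are assumed; the paper works on a sphere for the satellite tier):

```python
import math
import random

def sample_hppp(density, width, height, rng=None):
    # Homogeneous Poisson point process on a width x height rectangle:
    # the point count is Poisson(density * area), and conditional on the
    # count, locations are i.i.d. uniform over the region.
    rng = rng or random.Random(42)
    mean = density * width * height
    # Knuth's multiplicative Poisson sampler (fine for moderate means;
    # prefer a library sampler for large ones to avoid underflow).
    threshold, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            break
        k += 1
    return [(rng.uniform(0, width), rng.uniform(0, height)) for _ in range(k)]

# e.g. UAVs at 0.5 per unit area over a 10 x 10 region (mean count 50).
uavs = sample_hppp(density=0.5, width=10.0, height=10.0)
```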

25 pages, 2764 KB  
Article
Integrated Quality Inspection and Production Run Optimization for Imperfect Production Systems with Zero-Inflated Non-Homogeneous Poisson Deterioration
by Chih-Chiang Fang and Ming-Nan Chen
Mathematics 2025, 13(24), 3901; https://doi.org/10.3390/math13243901 - 5 Dec 2025
Cited by 1 | Viewed by 542
Abstract
This study develops an integrated quality inspection and production optimization framework for an imperfect production system, where system deterioration follows a zero-inflated non-homogeneous Poisson process (ZI-NHPP) characterized by a power-law intensity function. Parameters are estimated from historical data using the Expectation-Maximization (EM) algorithm, with a zero-inflation parameter π modeling scenarios where the system remains defect-free. Operating in either an in-control or out-of-control state, the system produces products with Weibull hazard rates, exhibiting higher failure rates in the out-of-control state. The proposed model integrates system status, defect rates, employee efficiency, and market demand to jointly optimize the number of conforming items inspected and the production run length, thereby minimizing total costs—including production, inspection, correction, inventory, and warranty expenses. Numerical analyses, supported by sensitivity studies, validate the effectiveness of this integrated approach in achieving cost-efficient quality control. This framework enhances quality assurance and production management, offering practical insights for manufacturing across diverse industries. Full article
(This article belongs to the Section C: Mathematical Analysis)
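The zero-inflation mechanism is simple to state: with probability π a run produces no deterioration events at all, otherwise events follow the power-law NHPP, so the unconditional expected count is (1 − π)·λt^β. A sketch using the abstract's symbols with made-up parameter values:

```python
def zi_nhpp_expected_events(t, pi, lam, beta):
    # Zero-inflated NHPP with power-law intensity: with probability pi
    # the system stays defect-free; otherwise E[N(t)] = lam * t**beta.
    # Unconditionally, E[N(t)] = (1 - pi) * lam * t**beta.
    return (1.0 - pi) * lam * t ** beta

# Zero inflation scales the expected defect count down uniformly in t.
plain = zi_nhpp_expected_events(8.0, pi=0.0, lam=0.3, beta=1.2)
inflated = zi_nhpp_expected_events(8.0, pi=0.25, lam=0.3, beta=1.2)
```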

26 pages, 1957 KB  
Article
Win–Win Pricing of Follow-Up Policies Under Healthcare Warranties for Chronic Diseases: A Mathematical Modeling Approach
by Mei Li, Zixian Liu and Lijun Liang
Healthcare 2025, 13(19), 2461; https://doi.org/10.3390/healthcare13192461 - 28 Sep 2025
Viewed by 864
Abstract
Implementing follow-up policies under healthcare warranties for chronic disease patients plays a crucial role in reducing the risk of adverse outcomes (AOs) and controlling long-term medical costs. However, the additional cost associated with these services often discourages hospitals from providing them. Background/Objectives: To incentivize participation from both hospitals and patients in follow-up programs, this paper introduces a patient copayment mechanism. We propose a theoretical mathematical modeling framework to investigate the optimal pricing of follow-up policies from both patients’ and hospitals’ perspectives to achieve win–win outcomes. Methods: Using the Cox frailty model, we stratify patients by risk level and model hazard rate functions for three follow-up policies featuring periodic checkups, incorporating the virtual age method. Building on this framework, we employ the Non-Homogeneous Poisson Process to analyze the total expected costs incurred by hospitals and patients across different policies and risk strata. This analysis derives the minimum price acceptable to hospitals for providing follow-up services and the maximum additional cost patients are willing to bear for them. The feasibility and applicability of the proposed model are demonstrated through a case study of pediatric type 1 diabetes mellitus (T1DM). Results: Win–win price intervals for T1DM patients are more achievable for higher-risk individuals. These intervals narrow or widen with the age reduction factor, checkup cost, and AO treatment cost. Hospitals should prioritize higher-risk patients, improve checkup effectiveness, and balance costs of checkups and treatments when optimizing pricing decisions. Conclusions: These insights provide valuable guidance for hospitals in strategically designing follow-up policies tailored to diverse risk cohorts and determining optimal price intervals. Full article
(This article belongs to the Special Issue Evaluation and Potential of Effective Decision-Making in Healthcare)

21 pages, 1150 KB  
Article
Modeling and Assessing Software Reliability in Open-Source Projects
by Maria T. Vasileva and Georgi Penchev
Computation 2025, 13(9), 214; https://doi.org/10.3390/computation13090214 - 3 Sep 2025
Cited by 2 | Viewed by 1889
Abstract
One of the key components of the software quality model is reliability. Its importance has grown with the increasing use and reuse of open-source components in software development. Software reliability growth models are commonly employed to address this aspect by predicting future failure rates and estimating the number of remaining defects throughout the development process. This paper investigates two software reliability growth models derived from the Verhulst model, with a particular focus on a structural property known as Hausdorff saturation. We provide analytical estimates for this characteristic and propose it as an additional criterion for model selection. The models are evaluated using four open-source datasets, where the Hausdorff saturation metric supports the conclusions drawn from standard goodness-of-fit measures. Furthermore, we introduce an interactive software reliability assessment tool that integrates with GitHub, enabling expert users to analyze real-time issue-tracking data from open-source repositories. The tool facilitates model comparison and enhances practical applicability. Overall, the proposed approach contributes to more robust reliability assessment by combining theoretical insights with actionable diagnostics. Full article
(This article belongs to the Section Computational Engineering)

18 pages, 844 KB  
Article
LINEX Loss-Based Estimation of Expected Arrival Time of Next Event from HPP and NHPP Processes Past Truncated Time
by M. S. Aminzadeh
Analytics 2025, 4(3), 20; https://doi.org/10.3390/analytics4030020 - 26 Aug 2025
Viewed by 908
Abstract
This article introduces a computational tool for Bayesian estimation of the expected time until the next event occurs in both homogeneous Poisson processes (HPPs) and non-homogeneous Poisson processes (NHPPs), following a truncated time. The estimation utilizes the linear exponential (LINEX) asymmetric loss function and incorporates both gamma and non-informative priors. Furthermore, it presents a minimax-type criterion to ascertain the optimal sample size required to achieve a specified percentage reduction in posterior risk. Simulation studies indicate that estimators employing gamma priors for both HPP and NHPP demonstrate greater accuracy compared to those based on non-informative priors and maximum likelihood estimates (MLE), provided that the proposed data-driven method for selecting hyperparameters is applied. Full article
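For the HPP case with a gamma prior, the LINEX Bayes estimator of the rate has a standard closed form: minimizing E[exp(a(δ − λ)) − a(δ − λ) − 1] under a Gamma(α, β) posterior (rate parameterization) gives δ = −(1/a)·ln E[e^(−aλ)] = (α/a)·ln(1 + a/β). A sketch of this closed form only, not the article's full computational tool:

```python
import math

def linex_bayes_rate(alpha, beta, a):
    # LINEX Bayes estimator of a Poisson/HPP rate under a Gamma(alpha, beta)
    # posterior: delta = -(1/a) * log E[exp(-a * lambda)]
    #                  = (alpha / a) * log(1 + a / beta).
    # a > 0 penalizes overestimation, pulling the estimate below the
    # posterior mean alpha / beta; a -> 0 recovers squared-error loss.
    return (alpha / a) * math.log(1.0 + a / beta)

posterior_mean = 12.0 / 4.0
linex_estimate = linex_bayes_rate(alpha=12.0, beta=4.0, a=0.5)
```

Since ln(1 + x) < x for x > 0, the asymmetric penalty always shades the estimate below the posterior mean when a > 0.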

15 pages, 1572 KB  
Article
AI-Driven Optimization Framework for Smart EV Charging Systems Integrated with Solar PV and BESS in High-Density Residential Environments
by Md Tanjil Sarker, Marran Al Qwaid, Siow Jat Shern and Gobbi Ramasamy
World Electr. Veh. J. 2025, 16(7), 385; https://doi.org/10.3390/wevj16070385 - 9 Jul 2025
Cited by 25 | Viewed by 6372
Abstract
The rapid growth of electric vehicle (EV) adoption necessitates advanced energy management strategies to ensure sustainable, reliable, and efficient operation of charging infrastructure. This study proposes a hybrid AI-based framework for optimizing residential EV charging systems through the integration of Reinforcement Learning (RL), Linear Programming (LP), and real-time grid-aware scheduling. The system architecture includes smart wall-mounted chargers, a 120 kWp rooftop solar photovoltaic (PV) array, and a 60 kWh lithium-ion battery energy storage system (BESS), simulated under realistic load conditions for 800 residential units and 50 charging points rated at 7.4 kW each. Simulation results, validated through SCADA-based performance monitoring using MATLAB/Simulink and OpenDSS, reveal substantial technical improvements: a 31.5% reduction in peak transformer load, voltage deviation minimized from ±5.8% to ±2.3%, and solar utilization increased from 48% to 66%. The AI framework dynamically predicts user demand using a non-homogeneous Poisson process and optimizes charging schedules based on a cost-voltage-user satisfaction reward function. The study underscores the critical role of intelligent optimization in improving grid reliability, minimizing operational costs, and enhancing renewable energy self-consumption. The proposed system demonstrates scalability, resilience, and cost-effectiveness, offering a practical solution for next-generation urban EV charging networks. Full article
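Demand prediction with a non-homogeneous Poisson process, as used here for charging sessions, is typically simulated by Lewis–Shedler thinning: generate candidates from a dominating homogeneous process and accept each with probability λ(t)/λ_max. A sketch with a hypothetical evening-peak intensity; the actual demand model is the paper's and is not reproduced here:

```python
import math
import random

def nhpp_thinning(intensity, t_max, lam_max, rng=None):
    # Lewis-Shedler thinning: candidate arrivals come from an HPP with
    # rate lam_max (which must upper-bound intensity on [0, t_max]);
    # each candidate is kept with probability intensity(t) / lam_max.
    rng = rng or random.Random(7)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > t_max:
            return arrivals
        if rng.random() <= intensity(t) / lam_max:
            arrivals.append(t)

# Hypothetical daily profile: charging demand peaking around hour 19.
rate = lambda t: 1.0 + 3.0 * math.exp(-((t - 19.0) ** 2) / 8.0)
sessions = nhpp_thinning(rate, t_max=24.0, lam_max=4.0)
```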

16 pages, 856 KB  
Article
Comparison of Parametric Rate Models for Gap Times Between Recurrent Events
by Ivo Sousa-Ferreira, Ana Maria Abreu and Cristina Rocha
Mathematics 2025, 13(12), 1931; https://doi.org/10.3390/math13121931 - 10 Jun 2025
Viewed by 1191
Abstract
Over the past two decades, substantial efforts have been made to develop survival models for gap times between recurrent events. An emerging approach involves considering rate models derived from a non-homogeneous Poisson process, thus allowing the conditional distribution of a gap time given the previous recurrence time to be deduced. Under this approach, some parametric rate models have been proposed, differing in their distributional assumptions on gap times. In particular, the extended exponential–Poisson, Weibull and extended Chen–Poisson distributions have been considered. Alternatively, a flexible rate model using restricted cubic splines is proposed here to capture complex non-monotonic rate shapes. Moreover, a comprehensive comparison of parametric rate models is presented. The maximum likelihood method is applied for parameter estimation in the presence of right-censoring. It is shown that some models include important special cases that allow testing of the independence assumption between a gap time and the previous recurrence time. The likelihood ratio test, as well as two information criteria, are discussed for model selection. Model fit is assessed using Cox–Snell residuals. Applications to two well-known clinical data sets illustrate the comparative performance of both the existing and proposed models, as well as their practical relevance. Full article
(This article belongs to the Special Issue Advances in Statistics, Biostatistics and Medical Statistics)

17 pages, 3131 KB  
Article
Non-Homogeneous Poisson Process Software Reliability Model and Multi-Criteria Decision for Operating Environment Uncertainty and Dependent Faults
by Youn Su Kim, Kwang Yoon Song, Hoang Pham and In Hong Chang
Appl. Sci. 2025, 15(9), 5184; https://doi.org/10.3390/app15095184 - 7 May 2025
Viewed by 2004
Abstract
The importance of software has increased significantly over time, and software failures have become a critical concern. As software systems have diversified in functionality, their structure has become more complex, and the types of failures that can occur in software have also diversified, stimulating the development of diverse software reliability models. In this study, we make certain assumptions regarding complex software, thereby proposing a new type of non-homogeneous Poisson process (NHPP) software reliability model that considers both dependent cases of software failure and cases that originate from the differences between the developed and actual operating environments. In addition, a new multi-criteria decision method (MCDM) that uses the maximum value for a comprehensive evaluation is proposed to demonstrate the effectiveness of the developed model. This improves the judgment of model excellence through multiple criteria and is suitable for multiple interpretations. To demonstrate the effectiveness of the proposed model, 15 NHPP software reliability models were compared using 13 evaluation criteria and three MCDM methods across two datasets. The results showed that one dataset performed well for all the criteria, whereas the other dataset performed well for the newly proposed multi-criteria decision method using the maximum (MCDMM). The sensitivity analysis also showed a change in the mean value function with a change in the parameters. These results demonstrate that an extended structure for complex software can lead to improved software reliability. Full article

31 pages, 4051 KB  
Article
A Stochastic Model for Traffic Incidents and Free Flow Recovery in Road Networks
by Fahem Mouhous, Djamil Aissani and Nadir Farhi
Mathematics 2025, 13(3), 520; https://doi.org/10.3390/math13030520 - 4 Feb 2025
Viewed by 1608
Abstract
This study addresses the disruptive impact of incidents on road networks, which often lead to traffic congestion. If not promptly managed, congestion can propagate and intensify over time, significantly delaying the recovery of free-flow conditions. We propose an enhanced model based on an exponential decay of the time required for free flow recovery between incident occurrences. Our approach integrates a shot noise process, assuming that incidents follow a non-homogeneous Poisson process. The increases in recovery time following incidents are modeled using exponential and gamma distributions. We derive key performance metrics, providing insights into congestion risk and the unlocking phenomenon, including the probability of the first passage time for our process to exceed a predefined congestion threshold. This probability is analyzed using two methods: (1) an exact simulation approach and (2) an analytical approximation technique. Utilizing the analytical approximation, we estimate critical extreme quantities, such as the minimum incident clearance rate, the minimum intensity of recovery time increases, and the maximum intensity of incident occurrences required to avoid exceeding a specified congestion threshold with a given probability. These findings offer valuable tools for managing and mitigating congestion risks in road networks. Full article
(This article belongs to the Section E: Applied Mathematics)
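The shot noise construction above can be written directly: the recovery-time process is X(t) = Σ over incidents with t_i ≤ t of Y_i·e^(−δ(t − t_i)), where each incident adds a jump Y_i that decays exponentially at clearance rate δ. A sketch evaluating the process at a query time; the incident times and jump sizes are made-up inputs, not data from the paper:

```python
import math

def shot_noise(t, incident_times, jumps, decay):
    # Recovery-time increase at time t: each past incident i contributes
    # its jump Y_i, decayed exponentially since its occurrence:
    # X(t) = sum over t_i <= t of Y_i * exp(-decay * (t - t_i)).
    return sum(y * math.exp(-decay * (t - ti))
               for ti, y in zip(incident_times, jumps) if ti <= t)

# Two incidents; by t = 5 the first has mostly decayed, the second less so.
x = shot_noise(5.0, incident_times=[1.0, 4.0], jumps=[10.0, 6.0], decay=0.5)
```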

14 pages, 290 KB  
Article
Bayesian Assessment of Corrosion-Related Failures in Steel Pipelines
by Fabrizio Ruggeri, Enrico Cagno, Franco Caron, Mauro Mancini and Antonio Pievatolo
Entropy 2024, 26(12), 1111; https://doi.org/10.3390/e26121111 - 19 Dec 2024
Viewed by 1554
Abstract
The probability of gas escapes from steel pipelines due to different types of corrosion is studied with real failure data from an urban gas distribution network. Both the design and maintenance of the network are considered, identifying and estimating (in a Bayesian framework) an elementary multinomial model in the first case, and a more sophisticated non-homogeneous Poisson process in the second case. Special attention is paid to the elicitation of the experts’ opinions. We conclude that the corrosion process behaves quite differently depending on the type of corrosion, and that, in most cases, cathodically protected pipes should be installed. Full article
(This article belongs to the Special Issue Bayesianism)

14 pages, 1244 KB  
Article
Semiparametric Analysis of Additive–Multiplicative Hazards Model with Interval-Censored Data and Panel Count Data
by Tong Wang, Yang Li, Jianguo Sun and Shuying Wang
Mathematics 2024, 12(23), 3667; https://doi.org/10.3390/math12233667 - 22 Nov 2024
Viewed by 1373
Abstract
In survival analysis, interval-censored data and panel count data represent two prevalent types of incomplete data. Given that, within certain research contexts, the events of interest may simultaneously involve both data types, it is imperative to perform a joint analysis of these data to fully comprehend the occurrence process of the events being studied. In this paper, a novel semiparametric joint regression analysis framework is proposed for the analysis of interval-censored data and panel count data. It is hypothesized that the failure time follows an additive–multiplicative hazards model, while the recurrent events follow a nonhomogeneous Poisson process. Additionally, a gamma-distributed frailty is introduced to describe the correlation between the failure time and the count process of recurrent events. To estimate the model parameters, a sieve maximum likelihood estimation method based on Bernstein polynomials is proposed. The performance of this estimation method under finite sample conditions is evaluated through a series of simulation studies, and an empirical study is illustrated. Full article
(This article belongs to the Special Issue Statistical Analysis and Data Science for Complex Data)

30 pages, 4521 KB  
Article
NHPP Software Reliability Model with Rayleigh Fault Detection Rate and Optimal Release Time for Operating Environment Uncertainty
by Kwang Yoon Song and In Hong Chang
Appl. Sci. 2024, 14(21), 10072; https://doi.org/10.3390/app142110072 - 4 Nov 2024
Cited by 1 | Viewed by 2130
Abstract
Software is used in diverse settings and depends on development and testing environments. Software development should improve the reliability, quality, cost, and stability of software, making the software testing period crucial. We propose a software reliability model (SRM) that considers the uncertainty of software environments and the fault detection rate function as a Rayleigh distribution, with an explicit mean value function solution in the model. The goodness-of-fit of the proposed model relative to several existing nonhomogeneous Poisson process (NHPP) SRMs is presented based on three software application failure datasets. Further, a cost model is also presented that addresses the error-removal risk level and required time. The optimal testing release policy for minimizing the expected total cost (ETC) is also determined for NHPP SRMs. The impact of the software environment is studied by varying it, and the optimal release times and minimum ETCs are compared. The goodness-of-fit comparison confirmed that the proposed model has more accurate prediction values than other models. Further, whereas the existing models applied to the cost model do not change after a certain operation period, the proposed model yields changes in release time even for long operating periods. Full article
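A Rayleigh fault-detection rate produces the familiar bell-shaped detection profile; in the simplest NHPP form, without the operating-environment uncertainty term this paper adds, the mean value function is m(t) = a(1 − e^(−t²/(2σ²))), saturating at the total fault content a. An illustrative sketch of that basic form only, not the paper's explicit solution:

```python
import math

def rayleigh_mvf(t, a, sigma):
    # Mean value function of a basic NHPP SRM with Rayleigh-shaped
    # detection: m(t) = a * (1 - exp(-t**2 / (2 * sigma**2))).
    # m(0) = 0 and m(t) -> a (total expected faults) as t grows.
    return a * (1.0 - math.exp(-(t * t) / (2.0 * sigma * sigma)))

# Detected faults accumulate toward an assumed fault content a = 120.
m_early = rayleigh_mvf(5.0, a=120.0, sigma=20.0)
m_late = rayleigh_mvf(100.0, a=120.0, sigma=20.0)
```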

18 pages, 5746 KB  
Article
Remaining Useful Life Prediction for Power Storage Electronic Components Based on Fractional Weibull Process and Shock Poisson Model
by Wanqing Song, Xianhua Yang, Wujin Deng, Piercarlo Cattani and Francesco Villecco
Fractal Fract. 2024, 8(8), 485; https://doi.org/10.3390/fractalfract8080485 - 19 Aug 2024
Cited by 8 | Viewed by 2102
Abstract
For lithium-ion batteries and supercapacitors in hybrid power storage facilities, both steady degradation and random shock contribute to their failure. To this end, in this paper, we propose to introduce the degradation-threshold-shock (DTS) model for their remaining useful life (RUL) prediction. Non-homogeneous compound Poisson process (NHCP) is proposed to simulate the shock effect in the DTS model. Considering the long-range dependence and heavy-tailed characteristics of the degradation process, fractional Weibull process (fWp) is employed in the diffusion term of the stochastic degradation model. Furthermore, the drift and diffusion coefficients are constantly updated to describe the environmental interference. Prior to the model training, steady degradation and shock data must be separated, based on the three-sigma principle. Degradation data for the lithium-ion batteries (LIBs) and ultracapacitors are employed for model verification under different operation protocols in the power system. Recent deep learning models and stochastic process-based methods are utilized for model comparison, and the proposed model shows higher prediction accuracy. Full article
