Recent Advances of Computational Statistics in Industry and Business II

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Computational and Applied Mathematics".

Deadline for manuscript submissions: closed (30 June 2022) | Viewed by 20524

Special Issue Editor


Guest Editor
Department of Statistics, Tamkang University, Tamsui District, New Taipei City 251, Taiwan
Interests: reliability analysis; quality control; statistical modeling

Special Issue Information

Dear Colleagues,

The field of computational statistics (CS) emphasizes algorithms and numerical methods and plays an essential role in industry, science, economics, and business. Many novel CS methods have been proposed in the past decade, and numerous researchers and practitioners have dedicated their time to developing novel CS methods and to applying them to data in various fields, such as engineering, reliability, economics, business, medicine, biology, surveys, and physics. The main purpose of this Special Issue of Mathematics is to provide a collection of manuscripts that propose novel CS methods for statistical inference and decision making or use CS methods for simulations and relevant case studies. Potential topics covered by the Special Issue include:

  • Applications in economics or business;
  • Bayesian methods and their applications;
  • Maintainability and availability;
  • Machine learning and its applications;
  • Modeling analysis and simulation;
  • Optimization and simulation;
  • Quality control and its applications;
  • Reliability modeling and life testing;
  • Supply chain management.

Prof. Dr. Tzong-Ru Tsai
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Bayesian estimation
  • machine learning
  • reliability analysis
  • quality control
  • preventive maintenance
  • supply chain management

Published Papers (11 papers)

Research

18 pages, 2048 KiB  
Article
Classical and Bayesian Inference of the Inverse Nakagami Distribution Based on Progressive Type-II Censored Samples
by Liang Wang, Sanku Dey and Yogesh Mani Tripathi
Mathematics 2022, 10(12), 2137; https://doi.org/10.3390/math10122137 - 19 Jun 2022
Cited by 2 | Viewed by 1261
Abstract
This paper explores statistical inference when the lifetime of a product follows the inverse Nakagami distribution and the data are progressively Type-II censored. Likelihood-based and maximum product of spacing (MPS)-based methods are considered for estimating the model parameters. In addition, approximate confidence intervals are constructed via asymptotic theory using both the likelihood and the product spacing functions. Based on the traditional likelihood and the product of spacing functions, Bayesian estimates are also considered under a squared error loss function using non-informative priors, and Gibbs sampling based on the MCMC algorithm is proposed to approximate the Bayes estimates, from which the highest posterior density credible intervals of the parameters are obtained. Numerical studies are presented to compare the proposed estimators using Monte Carlo simulations. To demonstrate the proposed methodology in a real-life scenario, a well-known data set on agricultural machine elevators with high defect rates is also analyzed for illustration.
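The maximum product of spacings idea generalizes beyond this model. A minimal sketch for an exponential lifetime model (not the inverse Nakagami itself, and with a simple grid search rather than the authors' optimizer; parameter values are hypothetical) is:

```python
import numpy as np

def mps_rate(sample, grid=None):
    """Maximum product of spacings: choose the rate that maximizes the
    mean log-spacing of the fitted CDF evaluated at the order statistics."""
    if grid is None:
        grid = np.linspace(0.05, 5.0, 1000)
    x = np.sort(np.asarray(sample, dtype=float))
    best_lam, best_val = grid[0], -np.inf
    for lam in grid:
        # CDF at the sorted sample, padded with F(0) = 0 and F(inf) = 1
        u = np.concatenate(([0.0], 1.0 - np.exp(-lam * x), [1.0]))
        val = np.mean(np.log(np.clip(np.diff(u), 1e-300, None)))
        if val > best_val:
            best_lam, best_val = lam, val
    return best_lam

rng = np.random.default_rng(1)
lifetimes = rng.exponential(scale=2.0, size=500)  # true rate = 0.5
rate_hat = mps_rate(lifetimes)
```

With a moderate sample, the MPS estimate lands close to the true rate, behaving much like the MLE while avoiding unbounded-likelihood pathologies that motivate MPS in the first place.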

14 pages, 312 KiB  
Article
Robust Variable Selection for Single-Index Varying-Coefficient Model with Missing Data in Covariates
by Yunquan Song, Yaqi Liu and Hang Su
Mathematics 2022, 10(12), 2003; https://doi.org/10.3390/math10122003 - 10 Jun 2022
Viewed by 1111
Abstract
As the applied sciences grow by leaps and bounds, semiparametric regression analyses find broad applications in fields such as engineering, finance, medicine, and public health. The single-index varying-coefficient model is a popular class of semiparametric models owing to its flexibility and ease of interpretation. Standard single-index varying-coefficient regression models, whether parametric or semiparametric, assume that all covariates can be observed. We relax this assumption by taking models with missing covariates into consideration. To eliminate the possibility of bias due to missing data, we propose a probability-weighted objective function. In this paper, we investigate robust variable selection for the single-index varying-coefficient model with missing covariates. We examine parametric and nonparametric estimators of the probability that the covariates of an observation are fully observed, which supplies the weights. For variable selection, we use the weighted objective function penalized by the non-convex SCAD penalty. Theoretical challenges include the treatment of missing data and of a single-index varying-coefficient model that uses both a non-smooth loss function and a non-convex penalty function. We provide Monte Carlo simulations to evaluate the performance of our approach.
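The probability-weighting idea can be sketched generically. The toy objective below (hypothetical names, and a plain linear predictor rather than the single-index varying-coefficient structure) up-weights complete cases by the inverse of their estimated observation probability:

```python
import numpy as np

def ipw_loss(y, X, beta, observed, pi):
    """Inverse-probability-weighted absolute-error loss: rows with missing
    covariates (observed == 0) drop out, and complete cases are re-weighted
    by 1 / P(covariates observed) to remove the selection bias."""
    resid = y - X @ beta
    w = observed / np.clip(pi, 1e-6, 1.0)
    return float(np.mean(w * np.abs(resid)))

# with everything observed and pi = 1, this is the ordinary mean absolute error
y = np.array([1.0, 2.0, 3.0])
X = np.array([[1.0], [2.0], [3.0]])
beta = np.array([0.9])
loss_all = ipw_loss(y, X, beta, observed=np.ones(3), pi=np.ones(3))
```

In practice the `pi` values would come from a parametric or nonparametric fit of the missingness mechanism, as the abstract describes.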
22 pages, 3019 KiB  
Article
A Stochastic Multi-Strain SIR Model with Two-Dose Vaccination Rate
by Yen-Chang Chang and Ching-Ti Liu
Mathematics 2022, 10(11), 1804; https://doi.org/10.3390/math10111804 - 25 May 2022
Cited by 8 | Viewed by 1595
Abstract
Infectious diseases remain a substantial public health concern, as they are among the leading causes of death. Immunization by vaccination can reduce the risk of suffering and death related to infectious diseases. Many countries developed COVID-19 vaccines in the past two years to control the COVID-19 pandemic. Because of the urgent need, COVID-19 vaccines were administered under emergency use authorization to facilitate their availability and use. The vaccine development time was therefore extraordinarily short, while administering two doses within a specific time is generally recommended to achieve sufficient protection, so it is essential to identify an appropriate interval between the two vaccinations. We constructed a stochastic multi-strain SIR model for a two-dose vaccine administration to address this issue. We introduced randomness into this model mainly through the transmission rate parameters. We discuss the uniqueness of the positive solution to the model and present conditions for the extinction and persistence of the disease. In addition, we explore the optimal cost of mitigating the epidemic based on two cost functions. The numerical simulations show that the administration rates of both vaccine doses have a significant effect on disease transmission.
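A single-strain Euler–Maruyama discretization illustrates how randomness enters through the transmission rate. This is a hedged sketch with hypothetical parameter values, not the authors' multi-strain, two-dose model:

```python
import numpy as np

def stochastic_sir(beta, gamma, sigma, s0, i0, days, dt=0.1, seed=0):
    """Simulate an SIR model whose transmission rate is perturbed by
    Brownian increments (Euler-Maruyama scheme); returns the infected path."""
    rng = np.random.default_rng(seed)
    s, i = s0, i0
    infected_path = [i0]
    for _ in range(int(days / dt)):
        dw = rng.normal(0.0, np.sqrt(dt))
        new_inf = beta * s * i * dt + sigma * s * i * dw  # noisy transmission
        new_rec = gamma * i * dt
        s = min(max(s - new_inf, 0.0), 1.0)
        i = min(max(i + new_inf - new_rec, 0.0), 1.0)
        infected_path.append(i)
    return infected_path

path = stochastic_sir(beta=0.4, gamma=0.1, sigma=0.05,
                      s0=0.99, i0=0.01, days=160)
```

The noise term `sigma * s * i * dw` is the stochastic counterpart of the deterministic incidence `beta * s * i * dt`; extinction versus persistence conditions in such models typically depend on the balance between `beta`, `gamma`, and `sigma`.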

21 pages, 5001 KiB  
Article
Graph Network Techniques to Model and Analyze Emergency Department Patient Flow
by Iris Reychav, Roger McHaney, Sunil Babbar, Krishanthi Weragalaarachchi, Nadeem Azaizah and Alon Nevet
Mathematics 2022, 10(9), 1526; https://doi.org/10.3390/math10091526 - 2 May 2022
Cited by 1 | Viewed by 2862
Abstract
This article moves beyond analysis methods tied to a traditional relational database or conventional network analysis and offers a novel graph network technique to yield insights from a hospital’s emergency department work model. The modeled data were saved in a Neo4j graph database as a time-varying graph (TVG), and related metrics, including degree centrality and shortest paths, were calculated and used to obtain time-related insights into the overall system. This study demonstrated the value of using a TVG method to model patient flows during emergency department stays. It illustrated dynamic relationships among hospital and consulting units that could not be shown with traditional analyses. The TVG approach augments traditional network analysis with temporal outcomes, including time-related patient flows, details of temporal congestion points, and periodic resource constraints. The TVG approach is crucial in health analytics for understanding both general factors and unique influences that define relationships between time-influenced events. The resulting insights are useful to administrators for making decisions related to resource allocation and offer promise for understanding the impact of physicians and nurses engaged in specific patient emergency department experiences. We also analyzed customer ratings and reviews to better understand overall patient satisfaction during the journey through the emergency department.
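The time-varying-graph bookkeeping can be sketched without Neo4j. The snippet below (hypothetical unit names and window size, not the authors' schema) computes windowed degree centrality from time-stamped patient-transfer events:

```python
from collections import defaultdict

def windowed_degree_centrality(events, window):
    """events: iterable of (time, unit_a, unit_b) transfer records.
    Returns {window_index: {unit: centrality}}, where centrality is the
    fraction of the other units a unit connects to within that window."""
    buckets = defaultdict(set)
    units = set()
    for t, a, b in events:
        buckets[int(t // window)].add(frozenset((a, b)))
        units.update((a, b))
    denom = max(len(units) - 1, 1)
    centrality = {}
    for w, edges in buckets.items():
        deg = defaultdict(int)
        for edge in edges:
            for u in edge:
                deg[u] += 1
        centrality[w] = {u: deg[u] / denom for u in units}
    return centrality

events = [(0.5, "triage", "ER"), (0.8, "ER", "imaging"),
          (2.1, "ER", "cardiology"), (2.9, "triage", "ER")]
cent = windowed_degree_centrality(events, window=2.0)
```

Tracking how these per-window scores change over time is exactly what surfaces temporal congestion points that a single static network would hide.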

9 pages, 1104 KiB  
Article
Efficiency Evaluation of Software Faults Correction Based on Queuing Simulation
by Yuka Minamino, Yusuke Makita, Shinji Inoue and Shigeru Yamada
Mathematics 2022, 10(9), 1438; https://doi.org/10.3390/math10091438 - 24 Apr 2022
Cited by 2 | Viewed by 1168
Abstract
Fault-counting data are collected during the testing process of software development. However, such data are not used for evaluating the efficiency of fault correction activities, because the information on the detection and correction times of each fault is not recorded in the fault-counting data. Furthermore, it is difficult to collect new data on the detection time of each fault, owing to the cost of personnel and data collection. In this paper, we apply the thinning method, using the intensity functions of the delayed S-shaped and inflection S-shaped software reliability growth models (SRGMs), to generate sample data of the fault detection times from the fault-counting data. Additionally, we perform simulations based on an infinite-server queuing model, using the generated sample data of the fault detection times, to visualize the efficiency of fault correction activities.
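The thinning method can be sketched as Lewis–Shedler rejection sampling against the delayed S-shaped intensity λ(t) = a·b²·t·e^(−bt), whose maximum a·b/e occurs at t = 1/b. Parameter values below are hypothetical, and this is not the authors' exact data pipeline:

```python
import math
import random

def thinning(intensity, t_max, lam_max, seed=42):
    """Lewis-Shedler thinning: draw candidate times from a homogeneous
    Poisson process with rate lam_max, then keep each candidate with
    probability intensity(t) / lam_max."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > t_max:
            return events
        if rng.random() <= intensity(t) / lam_max:
            events.append(t)

# delayed S-shaped SRGM intensity with hypothetical a, b
a, b = 100.0, 0.1
intensity = lambda t: a * b * b * t * math.exp(-b * t)
fault_times = thinning(intensity, t_max=60.0, lam_max=a * b / math.e)
```

The accepted times are then usable as simulated fault detection times, e.g. as arrival times in an infinite-server queuing simulation of the correction process.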

19 pages, 350 KiB  
Article
Bias Correction Method for Log-Power-Normal Distribution
by Tzong-Ru Tsai, Yuhlong Lio, Ya-Yen Fan and Che-Pin Cheng
Mathematics 2022, 10(6), 955; https://doi.org/10.3390/math10060955 - 17 Mar 2022
Viewed by 1458
Abstract
The log-power-normal distribution is a generalized version of the log-normal distribution. Maximum likelihood estimation is the most popular method for obtaining estimates of the log-power-normal distribution parameters. In this article, we investigate the performance of the maximum likelihood estimation method for point and interval inference. Moreover, a simple method that is less affected by the subjective selection of initial values for the model parameters is proposed. The bootstrap bias correction method is used to enhance the performance of the maximum likelihood estimation method, and the proposed bias correction method is simple to use. Monte Carlo simulations are conducted to check the quality of the proposed bias correction method. The simulation results indicate that the proposed method can reduce the bias of the maximum likelihood estimates and provide coverage probabilities close to the nominal confidence coefficient. Two real examples, concerning air pollution and the concrete strength of cement, are used for illustration.
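The bootstrap bias correction itself is generic. A minimal sketch for the (downward-biased) normal-variance MLE, with a hypothetical sample, reads:

```python
import numpy as np

def bias_corrected(estimator, sample, n_boot=2000, seed=0):
    """Bootstrap bias correction: theta_bc = 2*theta_hat - mean(theta*),
    where theta* are the estimates computed on bootstrap resamples."""
    rng = np.random.default_rng(seed)
    theta_hat = estimator(sample)
    boot = [estimator(rng.choice(sample, size=len(sample), replace=True))
            for _ in range(n_boot)]
    return 2.0 * theta_hat - float(np.mean(boot))

# the plain variance MLE is biased low by the factor (n - 1) / n
var_mle = lambda s: float(np.mean((s - np.mean(s)) ** 2))
rng = np.random.default_rng(7)
x = rng.normal(10.0, 3.0, size=30)  # true variance = 9
corrected = bias_corrected(var_mle, x)
```

Because the bootstrap replicates underestimate the plug-in value on average, the correction pushes the estimate upward, toward the unbiased target.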

13 pages, 3177 KiB  
Article
Construct Six Sigma DMAIC Improvement Model for Manufacturing Process Quality of Multi-Characteristic Products
by Chun-Min Yu, Tsun-Hung Huang, Kuen-Suan Chen and Tsung-Yu Huang
Mathematics 2022, 10(5), 814; https://doi.org/10.3390/math10050814 - 4 Mar 2022
Cited by 6 | Viewed by 3143
Abstract
After a product has undergone a manufacturing process, it usually has several important quality characteristics. When the process quality of all quality characteristics meets the requirements of the quality level, the process quality of the product can be guaranteed to satisfy customers’ needs. A large number of studies have pointed out that good process quality can raise product yield and product value; at the same time, it can reduce the rates of rework and scrap, achieve energy saving and waste reduction, and contribute to the sustainable operation of enterprises as well as of the environment. Since the six sigma method combines statistical analysis with manufacturing cost and production data, it is a useful tool for process improvement and process quality enhancement. Therefore, this paper adopts the six sigma define, measure, analyze, improve, and control (DMAIC) improvement process to raise the manufacturing process quality of multi-characteristic products. In addition, the Taguchi process capability index is one of the tools commonly used for quality assessment in industry: it reflects process loss, and it guarantees the process yield when the index value is large enough. Consequently, this paper discusses the relationship between the Taguchi process capability index and the six sigma quality level. The entire six sigma DMAIC improvement process is built on the basis of the process capability index and developed by the method of statistical quality control. Hence, the proposed method is convenient for process engineers to apply and helpful for enterprises moving toward the goal of smart manufacturing and sustainability.
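The Taguchi index mentioned above is the standard Cpm, which penalizes both the process spread and the deviation of the mean from the target. A small sketch with hypothetical specification limits:

```python
import math

def taguchi_cpm(mu, sigma, lsl, usl, target):
    """Taguchi process capability index Cpm = (USL - LSL) / (6 * tau),
    where tau^2 = sigma^2 + (mu - target)^2 captures both spread and
    off-target loss."""
    tau = math.sqrt(sigma ** 2 + (mu - target) ** 2)
    return (usl - lsl) / (6.0 * tau)

# on target, Cpm reduces to the ordinary Cp = (USL - LSL) / (6 * sigma)
cpm_on_target = taguchi_cpm(mu=10.0, sigma=0.5, lsl=7.0, usl=13.0, target=10.0)
cpm_off_target = taguchi_cpm(mu=10.5, sigma=0.5, lsl=7.0, usl=13.0, target=10.0)
```

Shifting the mean away from the target lowers the index even when the spread is unchanged, which is exactly the loss-function behavior that links Cpm to six sigma quality levels.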

14 pages, 376 KiB  
Article
Inference for the Process Performance Index of Products on the Basis of Power-Normal Distribution
by Jianping Zhu, Hua Xin, Chenlu Zheng and Tzong-Ru Tsai
Mathematics 2022, 10(1), 35; https://doi.org/10.3390/math10010035 - 23 Dec 2021
Cited by 6 | Viewed by 2318
Abstract
The process performance index (PPI) is a simple metric linked to the conforming rate of products. The properties of the PPI have been well studied for the normal distribution and other widely used lifetime distributions, such as the Weibull, Gamma, and Pareto distributions. Assuming that the quality characteristic of a product follows the power-normal distribution, statistical inference procedures for the PPI are established. The maximum likelihood estimation method for the model parameters and the PPI is investigated, and the exact Fisher information matrix is derived. We discuss the drawbacks of using the exact Fisher information matrix to obtain confidence intervals for the model parameters. The parametric bootstrap percentile and bootstrap bias-corrected percentile methods are proposed to obtain approximate confidence intervals for the model parameters and the PPI. Monte Carlo simulations are conducted to evaluate the performance of the proposed methods. An example on the flow width of the resist in the hard-bake process is used for illustration.
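The parametric bootstrap percentile method can be sketched for a simpler statistic, here a normal mean with hypothetical flow-width-like data rather than the PPI under the power-normal model:

```python
import numpy as np

def percentile_ci(sample, alpha=0.05, n_boot=2000, seed=0):
    """Parametric bootstrap percentile CI: fit the model, simulate
    resamples from the fitted model, and take empirical quantiles of
    the re-estimated statistic."""
    rng = np.random.default_rng(seed)
    mu_hat = float(np.mean(sample))
    sd_hat = float(np.std(sample, ddof=1))
    boot = [float(np.mean(rng.normal(mu_hat, sd_hat, size=len(sample))))
            for _ in range(n_boot)]
    return (float(np.quantile(boot, alpha / 2)),
            float(np.quantile(boot, 1 - alpha / 2)))

rng = np.random.default_rng(3)
widths = rng.normal(1.5, 0.1, size=50)  # hypothetical flow-width readings
lo, hi = percentile_ci(widths)
```

For the PPI one would refit the power-normal model on each resample and re-evaluate the index; the bias-corrected variant additionally shifts the percentile levels by the estimated median bias of the bootstrap distribution.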

24 pages, 415 KiB  
Article
Instrumental Variable Quantile Regression of Spatial Dynamic Durbin Panel Data Model with Fixed Effects
by Danqing Chen, Jianbao Chen and Shuangshuang Li
Mathematics 2021, 9(24), 3261; https://doi.org/10.3390/math9243261 - 15 Dec 2021
Cited by 3 | Viewed by 1983
Abstract
This paper studies a quantile regression spatial dynamic Durbin panel data (SDDPD) model with fixed effects. Conventional fixed effects estimators of the quantile regression specification are usually biased in the presence of spatially and temporally lagged response variables as regressors. To reduce this bias, we propose an instrumental variable quantile regression (IVQR) estimator with spatially and temporally lagged covariates as instruments. Under some regularity conditions, the consistency and asymptotic normality of the estimators are derived. Monte Carlo simulations show that our estimators not only perform well in finite samples at different quantiles but are also robust to different spatial weights matrices and different disturbance distributions. The proposed method is used to analyze the factors influencing the international tourism foreign exchange earnings of 31 provinces in China from 2011 to 2017.
11 pages, 284 KiB  
Article
A Deterministic Learning Algorithm Estimating the Q-Matrix for Cognitive Diagnosis Models
by Meng-Ta Chung and Shui-Lien Chen
Mathematics 2021, 9(23), 3062; https://doi.org/10.3390/math9233062 - 28 Nov 2021
Viewed by 1459
Abstract
The goal of an exam in cognitive diagnostic assessment is to uncover whether an examinee has mastered certain attributes. Different cognitive diagnosis models (CDMs) have been developed for this purpose. The core of these CDMs is the Q-matrix, an item-to-attribute mapping traditionally designed by domain experts. An expert-designed Q-matrix is not without issues. For example, domain experts might neglect some attributes or hold different opinions about the inclusion of some entries in the Q-matrix. It is therefore of practical importance to develop an automated method to estimate the Q-matrix. This research proposes a deterministic learning algorithm for estimating the Q-matrix. To obtain a sensible binary Q-matrix, a dichotomizing method is also devised. Results from the simulation study show that the proposed method for estimating the Q-matrix is useful. The empirical study analyzes the ECPE data, and the estimated Q-matrix is compared with the expert-designed one. All analyses in this research are carried out in R.
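The dichotomizing step can be sketched generically (hypothetical threshold rule and toy matrix; the authors' method, implemented in R, may differ):

```python
import numpy as np

def dichotomize(q_cont, threshold=0.5):
    """Turn a real-valued Q-matrix estimate into a binary item-to-attribute
    mapping; every item is forced to load on at least one attribute."""
    q_bin = (q_cont >= threshold).astype(int)
    for i, row in enumerate(q_bin):
        if row.sum() == 0:  # no attribute survived the cut-off
            q_bin[i, np.argmax(q_cont[i])] = 1
    return q_bin

# rows = items, columns = attributes (toy continuous estimates)
q_est = np.array([[0.9, 0.2, 0.1],
                  [0.4, 0.7, 0.6],
                  [0.3, 0.2, 0.1]])
q_matrix = dichotomize(q_est)
```

The fallback for all-zero rows reflects the constraint that a test item measuring no attribute would make the CDM unidentifiable for that item.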

Other

17 pages, 6490 KiB  
Systematic Review
A Bibliometric Review of the Mathematics Journal
by Hansin Bilgili and Chwen Sheu
Mathematics 2022, 10(15), 2701; https://doi.org/10.3390/math10152701 - 30 Jul 2022
Viewed by 1042
Abstract
In this study, we conduct a bibliometric review of the Mathematics journal to map its thematic structure and to identify major research trends for future research to build on. Our review focuses primarily on the bibliometric clusters derived from an application of a bibliographic coupling algorithm and offers insights into how studies included in the review sample relate to one another to form coherent research streams. We combine this analysis with keyword frequency and topic modeling analyses to reveal the discourse taking place in the journal more recently. We believe that a systematic, computer-assisted review of the Mathematics journal can open a path for new developments and discoveries in research and help editors assess the performance and historical evolution of the journal and predict future developments. In so doing, the findings should advance our cumulative understanding in areas consistent with the scope of the Mathematics journal, such as applied mathematics, analytics, and computational sciences.
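Bibliographic coupling itself is simple to state: two articles are coupled in proportion to the references they share. A toy sketch with hypothetical paper and reference identifiers:

```python
def coupling_strength(refs_a, refs_b):
    """Bibliographic coupling strength: the number of cited references
    shared by two articles."""
    return len(set(refs_a) & set(refs_b))

papers = {
    "A": ["r1", "r2", "r3"],
    "B": ["r2", "r3", "r4"],
    "C": ["r5"],
}
# pairwise coupling strengths, one entry per unordered pair
pairs = {(p, q): coupling_strength(papers[p], papers[q])
         for p in papers for q in papers if p < q}
```

Clustering the resulting weighted pair graph is what yields the coherent research streams the review describes.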
