Search Results (831)

Search Parameters:
Keywords = process chart

22 pages, 3126 KB  
Review
Integrated Pretreatment and Microbial Matching for PHA Production from Lignocellulosic Agro-Forestry Residues
by Dongna Li, Shanshan Liu, Qiang Wang, Xiaojun Ma and Jianing Li
Fermentation 2025, 11(10), 563; https://doi.org/10.3390/fermentation11100563 - 29 Sep 2025
Abstract
Lignocellulosic agro-forestry residues (LARs), such as rice straw, sugarcane bagasse, and wood wastes, are abundant and low-cost feedstocks for polyhydroxyalkanoate (PHA) bioplastics. However, their complex cellulose–hemicellulose–lignin matrix requires integrated valorization strategies. This review presents a dual-framework approach, “pretreatment–co-substrate compatibility” and “pretreatment–microbial platform matching”, to align advanced pretreatment methods (including deacetylation–microwave integration, deep eutectic solvents, and non-sterilized lignin recovery) with engineered or extremophilic microbial hosts. A “metabolic interaction” perspective on co-substrate fermentation, encompassing dynamic carbon flux allocation, synthetic consortia cooperation, and one-pot process coupling, is used to elevate PHA titers and tailor copolymer composition. In addition, we synthesize comprehensive kinetic analyses from the literature that elucidate microbial growth, substrate consumption, and dynamic carbon flux allocation under feast–famine conditions, thereby informing process optimization and scalability. Microbial platforms are reclassified into broad-substrate, process-compatible, or product-customized categories to emphasize adaptive evolution, CRISPR-guided precision design, and consortia engineering. Finally, next-generation techno-economic analyses, embracing multi-product integration, regional adaptation, and carbon-efficiency metrics, are surveyed to chart viable paths for scaling LAR-to-PHA into circular bioeconomy manufacturing.

30 pages, 1350 KB  
Review
Glial Cells as Emerging Therapeutic Targets in Neurodegenerative Diseases: Mechanistic Insights and Translational Perspectives
by Thirupathirao Vishnumukkala, Che Mohd Nasril Che Mohd Nassir, Zaw Myo Hein, Prarthana Kalerammana Gopalakrishna, Barani Karikalan, Aisyah Alkatiri, Saravanan Jagadeesan, Venkatesh R. Naik, Warren Thomas, Mohamad Aris Mohd Moklas and Mohd Amir Kamaruzzaman
Cells 2025, 14(19), 1497; https://doi.org/10.3390/cells14191497 - 24 Sep 2025
Abstract
Neurodegenerative diseases such as Alzheimer’s disease (AD), Parkinson’s disease (PD), Huntington’s disease, multiple sclerosis, and amyotrophic lateral sclerosis share converging mechanisms of neuronal dysfunction, including protein aggregation, oxidative stress, and chronic neuroinflammation. Glial cells, once considered passive supporters, are now recognized as central drivers of these processes, offering both pathogenic triggers and therapeutic opportunities. Yet, despite compelling preclinical evidence, the translation of glial-targeted therapies into clinical success has been limited. This review provides a critical synthesis of current knowledge by examining therapeutic strategies through the lens of their translational challenges and failures. This narrative review highlights how interspecies variability of glial phenotypes, shifting neuroprotective versus neurotoxic states, limited biomarker stratification, and delivery barriers have constrained trials, such as anti-triggering receptor expressed on myeloid cells 2 (anti-TREM2) antibodies in AD and glial cell line-derived neurotrophic factor (GDNF) in PD. By analyzing these obstacles across major neurodegenerative disorders, this review argues that the next stage of glial medicine requires precision approaches that integrate stage-specific phenotyping, biomarker-guided patient selection, and innovative delivery platforms. Understanding not only what has been tried but why translation has stalled is essential to chart a roadmap for effective, disease-modifying glial therapies in the aging brain.
(This article belongs to the Special Issue Glial Cells in Aging Neuroscience)

21 pages, 488 KB  
Review
Entangled Autopoiesis: Reframing Psychotherapy and Neuroscience Through Cognitive Science and Systems Engineering
by Dana Rad, Monica Maier, Zorica Triff and Radiana Marcu
Brain Sci. 2025, 15(10), 1032; https://doi.org/10.3390/brainsci15101032 - 24 Sep 2025
Abstract
The increasing intersection of psychotherapy, cognitive science, neuroscience, and systems engineering beckons us to rethink what it means to speak the language of the human mind in the clinical setting. This position paper proposes the idea of entangled autopoiesis, a metatheoretical paradigm that addresses the mind and therapy not as linear processes but as self-organizing, adaptive processes enfolded across neural, cognitive, relational, and cultural domains. Psychotherapy, from this viewpoint, is less a corrective technique and more a zone of systemic integration, wherein resilience and meaning are co-created in the interaction of embodied brains, lived stories, and relational fields. Neuroscience informs us about plasticity and regulation; cognitive science emphasizes the embodied and extended nature of cognition; and systems engineering sheds light on feedback, emergence, and adaptive dynamics. Artificial intelligence appears as a double presence: as a metaphor for complexity and as a practical tool able to chart patterns below human sensibility. By adopting a complexity-aware epistemology, we advocate a reorientation of clinical thinking—one recognizing the psyche as an autopoietic network, entangled with culture and technology and able to renew itself in therapeutic encounters. The implications for clinical methodology, therapist training, and future interdisciplinary research are discussed.

15 pages, 1559 KB  
Article
Visualization of Medical Record with 3D Human Body Models
by Tz-Jie Liu, Chia-Yi Lai and Yi-Cheng Chiang
Healthcare 2025, 13(19), 2393; https://doi.org/10.3390/healthcare13192393 - 23 Sep 2025
Abstract
Background/Objectives: With the rapid development of smart healthcare, medical records have shifted from a disease-centered to a patient-centered approach. However, traditional formats, such as narratives and tables, often make it challenging for physicians to quickly grasp a patient’s condition within a limited timeframe, potentially leading to diagnostic errors and a decline in the quality of care. Recently, advances in information visualization and 3D technology have led many medical institutions to employ charts and graphs or use 3D simulations of organs to support clinical practice and education. However, few have integrated 3D models into medical records for use during physician consultations. Methods: This study presents the development and evaluation of a novel web-based 3D EMR system that integrates real-time ICD-10 diagnostic code mapping with interactive 3D human body models, enabling physicians to visualize patient-specific anatomical and diagnostic information in a dynamic and context-aware manner. Results: We employed the System Usability Scale (SUS) to evaluate the system’s usability through a satisfaction survey. Results indicate that participants rated the system highly in terms of ease of use, satisfaction, and efficiency, with an average SUS score of 70.42, reflecting usability between moderate and good. Comparative evaluations and future expansion plans are also discussed. Conclusions: These findings demonstrate that integrating a 3D human model into the medical record retrieval process significantly improves visualization and interactivity, meeting the needs of healthcare professionals and enhancing both their efficiency and patient satisfaction.
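For context, the SUS score cited above (70.42) follows the standard ten-item scoring rule. A minimal sketch, assuming the usual five-point Likert questionnaire (the function name and input format are illustrative, not from the paper):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribute score - 1);
    even-numbered items are negatively worded (contribute 5 - score).
    The summed contributions are scaled by 2.5 onto a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i = 0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5

# A fully neutral respondent (all 3s) scores 50; scores near 70, as in
# this study, are commonly read as between "moderate" and "good".
print(sus_score([3] * 10))  # 50.0
```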

31 pages, 668 KB  
Article
A Novel Moving Average–Exponentiated Exponentially Weighted Moving Average (MA-Exp-EWMA) Control Chart for Detecting Small Shifts
by Jun-Hao Lu and Chang-Yun Lin
Mathematics 2025, 13(18), 3049; https://doi.org/10.3390/math13183049 - 22 Sep 2025
Abstract
Process monitoring plays a vital role in ensuring quality stability and operational efficiency across fields such as manufacturing, finance, biomedical science, and environmental monitoring. Among statistical tools, control charts are widely adopted for detecting variability and abnormal patterns. Since the introduction of the basic X-bar control chart by Shewhart in the 1920s, various improved methods have emerged to address the challenge of identifying small and latent process shifts, including CUSUM, MA, EWMA, and Exp-EWMA control charts. This study introduces a novel control chart—the Moving Average–Exponentiated Exponentially Weighted Moving Average (MA-Exp-EWMA) control chart—combining the smoothing effect of MA and the adaptive weighting of Exp-EWMA. Its goal is to improve the detection of small shifts and gradual changes. Performance is evaluated using average run length (ARL), standard deviation of run length (SDRL), and median run length (MRL). Monte Carlo simulations under different distributions (normal, exponential, gamma, and Student’s t) and parameter settings assess the control chart’s sensitivity under various shift scenarios. Comparisons with existing control charts and an application to real data demonstrate the practical effectiveness of the proposed method in detecting small shifts.
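To make the run-length evaluation concrete: a minimal sketch of the plain EWMA chart with time-varying limits and a Monte Carlo ARL estimate, for standard-normal data. This is the underlying EWMA machinery, not the authors' MA-Exp-EWMA combination; the values of lam, L, and the shift sizes are illustrative assumptions.

```python
import math
import random

def ewma_run_length(lam=0.1, L=2.7, shift=0.5, max_n=10_000, rng=random):
    """Simulate one monitored sequence on a two-sided EWMA chart and return
    the run length: the index of the first out-of-control signal.

    Z_t = lam*X_t + (1 - lam)*Z_{t-1}, signalling when |Z_t| exceeds
    L * sqrt(lam/(2 - lam) * (1 - (1 - lam)**(2t))) for N(shift, 1) data.
    """
    z = 0.0
    for t in range(1, max_n + 1):
        x = rng.gauss(shift, 1.0)        # observation from the shifted process
        z = lam * x + (1 - lam) * z      # EWMA update
        half_width = L * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z) > half_width:          # exact (time-varying) control limits
            return t
    return max_n

def average_run_length(shift, reps=2000, seed=1):
    """Monte Carlo estimate of the ARL at a given mean shift."""
    rng = random.Random(seed)
    return sum(ewma_run_length(shift=shift, rng=rng) for _ in range(reps)) / reps
```

As expected, the estimated ARL shrinks as the shift grows, which is the property the SDRL and MRL summaries in the paper refine further.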
(This article belongs to the Special Issue Mathematical Modelling and Statistical Methods of Quality Engineering)

19 pages, 1059 KB  
Article
Performance Evaluation of Shiryaev–Roberts and Cumulative Sum Schemes for Monitoring Shape and Scale Parameters in Gamma-Distributed Data Under Type I Censoring
by He Li, Peile Chen, Ruicheng Ma and Jiujun Zhang
Axioms 2025, 14(9), 713; https://doi.org/10.3390/axioms14090713 - 22 Sep 2025
Abstract
This paper proposes two process monitoring schemes, namely the Shiryaev–Roberts (SR) procedure and the cumulative sum (CUSUM) procedure, to detect shifts in the shape and scale parameters of Type I right-censored Gamma-distributed lifetime data. The performance of the proposed schemes is compared with that of an exponentially weighted moving average (EWMA) control chart based on deep learning networks. The schemes are evaluated under various censoring rates using Monte Carlo simulations, with the average run length (ARL) as the primary metric. Furthermore, the SR and CUSUM schemes are compared for both zero-state and steady-state shifts. Simulation results indicate that the SR and CUSUM procedures exhibit superior performance, with the SR scheme showing particular advantages when the actual shift is small, while the CUSUM chart proves more effective for identifying larger shifts. The shape parameter has a significant effect on chart performance: reducing it improves the ability to detect shifts early. Increased censoring rates reduce detection sensitivity, so to maintain an in-control ARL (ARL0) of 370, the control limit h must be recalibrated for each censoring rate to mitigate performance losses under heavier censoring. The monitoring performance of both the SR and CUSUM charts improves as the sample size increases. Finally, a practical example is provided to illustrate the application of the proposed monitoring schemes.
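For orientation, the tabular one-sided CUSUM recursion at the core of such schemes can be sketched as follows. This is the textbook mean-shift form for uncensored data, not the authors' censored-Gamma likelihood version; mu0, the reference value k, and the decision interval h are illustrative assumptions.

```python
def cusum_run_length(samples, mu0=0.0, k=0.5, h=4.0):
    """Tabular one-sided upper CUSUM: C_t = max(0, C_{t-1} + x_t - (mu0 + k)).

    Signals when C_t exceeds the decision interval h. Returns the 1-based
    index of the first signal, or None if the sequence ends in control.
    """
    c = 0.0
    for t, x in enumerate(samples, start=1):
        c = max(0.0, c + x - (mu0 + k))  # accumulate evidence of an upward shift
        if c > h:
            return t
    return None

# In-control data never accumulates; a clear upward shift signals quickly.
print(cusum_run_length([0.0] * 10))   # None
print(cusum_run_length([2.0] * 5))    # 3
```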

29 pages, 1718 KB  
Review
Bacillus Pectinases as Key Biocatalysts for a Circular Bioeconomy: From Green Extraction to Process Optimization and Industrial Scale-Up
by Fatima Zohra Kaissar, Khelifa Bouacem, Mohammed Lamine Benine, Sondes Mechri, Shubha Rani Sharma, Vishal Kumar Singh, Mahfoud Bakli, Seif El Islam Lebouachera and Giovanni Emiliani
BioTech 2025, 14(3), 74; https://doi.org/10.3390/biotech14030074 - 19 Sep 2025
Abstract
Pectins are high-value plant cell-wall polysaccharides with extensive applications in the food, pharmaceutical, textile, paper, and environmental sectors. Traditional extraction and processing methodologies rely heavily on harsh acids, high temperatures, and non-renewable solvents, generating substantial environmental and economic costs. This review consolidates recent advances across the entire Bacillus–pectinase value chain, from green pectin extraction and upstream substrate characterization, through process and statistical optimization of enzyme production, to industrial biocatalysis applications. We propose a practical roadmap for developing high-efficiency, low-environmental-footprint enzyme systems that support circular bioeconomy objectives. Critical evaluation of optimization strategies, including submerged versus solid-state fermentation, response surface methodology, artificial neural networks, and design of experiments, is supported by comparative data on strain performance, fermentation parameters, and industrial titers. Sector-specific case studies demonstrate the efficacy of Bacillus pectinases in fruit-juice clarification, textile bio-scouring, paper bio-bleaching, bio-based detergents, coffee and tea processing, oil extraction, animal feed enhancement, wastewater treatment, and plant-virus purification. Remaining challenges, including enzyme stability in complex matrices, techno-economic scale-up, and structure-guided protein engineering, are identified. Future directions are charted toward CRISPR-driven enzyme design and fully integrated circular-economy bioprocessing platforms.
(This article belongs to the Section Industry, Agriculture and Food Biotechnology)

35 pages, 2034 KB  
Article
A Nonparametric Double Homogeneously Weighted Moving Average Signed-Rank Control Chart for Monitoring Location Parameter
by Vasileios Alevizakos
Mathematics 2025, 13(18), 3027; https://doi.org/10.3390/math13183027 - 19 Sep 2025
Abstract
Nonparametric control charts are widely used in many manufacturing processes when there is a lack of knowledge about the distribution that the quality characteristic of interest follows. If there is evidence that the unknown distribution is symmetric, then the signed-rank statistic is preferred over other nonparametric statistics because it makes control charts more efficient. In this article, a nonparametric double homogeneously weighted moving average control chart based on the signed-rank statistic, namely, the DHWMA-SR chart, is introduced for monitoring the location parameter of an unknown, continuous and symmetric distribution. Monte Carlo simulations are used to study the run-length distribution of the proposed chart. A performance comparison study with the EWMA-SR, DEWMA-SR and HWMA-SR charts indicates that the DHWMA-SR chart is more effective under the zero-state scenario, while its steady-state performance is poor. Finally, two illustrative examples are given to demonstrate the application of the proposed chart.
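The signed-rank statistic on which this family of charts is built can be sketched as follows: only the per-subgroup SR statistic is shown, without the DHWMA smoothing layer the authors add. Tie handling via average ranks is a standard convention assumed here.

```python
def signed_rank(sample, target=0.0):
    """Wilcoxon-type signed-rank statistic SR for one subgroup.

    Rank |x_i - target| (average ranks over ties), then sum the ranks with
    the sign of x_i - target. SR near 0 is consistent with the location
    being on target; a large |SR| suggests an upward or downward shift.
    """
    diffs = [x - target for x in sample if x != target]  # zeros are dropped
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):  # assign average ranks across ties in |diff|
        j = i
        while j + 1 < len(order) and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg = (i + j) / 2 + 1
        for idx in order[i:j + 1]:
            ranks[idx] = avg
        i = j + 1
    return sum(r if d > 0 else -r for d, r in zip(diffs, ranks))

print(signed_rank([1, 2, 3]))   # 6.0  (all observations above target 0)
print(signed_rank([1, -1]))     # 0.0  (balanced around the target)
```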

25 pages, 783 KB  
Systematic Review
KAVAI: A Systematic Review of the Building Blocks for Knowledge-Assisted Visual Analytics in Industrial Manufacturing
by Adrian J. Böck, Stefanie Größbacher, Jan Vrablicz, Christina Stoiber, Alexander Rind, Josef Suschnigg, Tobias Schreck, Wolfgang Aigner and Markus Wagner
Appl. Sci. 2025, 15(18), 10172; https://doi.org/10.3390/app151810172 - 18 Sep 2025
Abstract
Industry 4.0 produces large volumes of sensor and machine data, offering new possibilities for manufacturing analytics but also creating challenges in combining domain knowledge with visual analysis. We present a systematic review of 13 peer-reviewed knowledge-assisted visual analytics (KAVA) systems published between 2014 and 2024, following PRISMA guidelines for the identification, screening, and inclusion processes. The survey is organized around six predefined building blocks, namely, user group, industrial domain, visualization, knowledge, data, and machine learning, with a specific emphasis on the integration of knowledge and visualization in the reviewed studies. We find that ontologies, taxonomies, rule sets, and knowledge graphs provide explicit representations of expert understanding, sometimes enriched with annotations and threshold specifications. These structures are stored in RDF or graph databases, relational tables, or flat files, though interoperability is limited, and post-design contributions are not always persisted. Explicit knowledge is visualized through standard and specialized techniques, including thresholds in time-series plots, annotated dashboards, node–link diagrams, customized machine views from ontologies, and 3D digital twins with expert-defined rules. Line graphs, bar charts, and scatterplots are the most frequently used chart types, often augmented with thresholds and annotations derived from explicit knowledge. Recurring challenges include fragmented storage, heterogeneous data and knowledge types, limited automation, inconsistent validation of user input, and scarce long-term evaluations. Addressing these gaps will be essential for developing adaptable, reusable KAVA systems for industrial analytics.
(This article belongs to the Section Applied Industrial Technologies)

26 pages, 1253 KB  
Article
Integrated Production, EWMA Scheme, and Maintenance Policy for Imperfect Manufacturing Systems of Bolt-On Vibroseis Equipment Considering Quality and Inventory Constraints
by Nuan Xia, Zilin Lu, Yuting Zhang and Jundong Fu
Axioms 2025, 14(9), 703; https://doi.org/10.3390/axioms14090703 - 17 Sep 2025
Abstract
In recent years, the synergistic effect among production, maintenance, and quality control within manufacturing systems has garnered increasing attention in academic and industrial circles. In high-quality production settings, the real-time identification of minute process deviations holds significant importance for ensuring product quality. Traditional approaches, such as routine quality inspections or Shewhart control charts, exhibit limitations in sensitivity and response speed, rendering them inadequate for meeting the stringent requirements of high-precision quality control. To address this issue, this paper presents an integrated framework that combines stochastic process modeling, dynamic optimization, and quality monitoring. For quality monitoring, an exponentially weighted moving average (EWMA) control chart is employed to monitor the production process; the statistic derived from this chart forms a Markov process, enabling it to detect minor shifts in the process mean more acutely. Regarding maintenance strategies, a state-dependent preventive maintenance (PM) and corrective maintenance (CM) mechanism is introduced: preventive maintenance is initiated when the system is in a statistically controlled state and the inventory level falls below a predefined threshold, whereas corrective maintenance is triggered when the EWMA control chart generates an out-of-control (OOC) signal. To sustain production during maintenance activities, an inventory buffer mechanism is incorporated into the model. Building upon this foundation, a joint optimization model is formulated, with system states, including equipment degradation state, inventory level, and quality state, serving as decision variables and the minimization of the expected total cost (ETC) per unit time as the objective. This problem is formalized as a constrained dynamic optimization problem and solved using a genetic algorithm (GA). Finally, through a case study of the production process of vibroseis equipment, the superiority of the proposed model in terms of cost savings and system performance enhancement is empirically verified.

14 pages, 3985 KB  
Article
Quantitatively Evaluating Formation Pressure Distribution After Hydraulic Fracturing in Tight Sand Oil
by Yu Tang, Chunting Liu, Hong Xiang, Jin Zhang, Heng Zheng, Wenting Lu and Ruiquan Liao
Energies 2025, 18(18), 4894; https://doi.org/10.3390/en18184894 - 15 Sep 2025
Abstract
Hydraulic fracturing with a horizontal well is the core technology for the efficient development of unconventional oil and gas resources such as tight oil. Quantitative characterization of formation pressure changes in tight oil reservoirs is of great significance for improving their development efficiency. In response to the difficulty of quantitatively characterizing the range, size, and release process of the formation pressure controlled by fractured wells in tight oil reservoirs, this work proposes a numerical simulation method to quantitatively evaluate reservoir and fluid elastic properties. Based on the simulation, the elastic energy control zone was divided into a fracture network control zone and a matrix control zone, enabling accurate calculation of the different zones and their elastic energies. The effects of fracturing parameters, formation and fluid elastic parameters, and well spacing on the elastic energy control range were analyzed, and elastic energy calculation charts were drawn for different permeability, fracture half-length, and fluid elastic parameter conditions. Based on analysis of the elastic energy release process, the elastic recovery rate of this type of reservoir was predicted. These research results are of great significance for optimizing hydraulic fracturing parameters and development systems for unconventional oil and gas.

19 pages, 4226 KB  
Article
Comparison of Statistical Process Control Models for Monitoring the Biological Burden of a Buffer Solution Used as Input to Produce an Attenuated Viral Vaccine
by Josiane Machado Vieira Mattoso, Greice Maria Silva da Conceição, Ana Paula Roque da Silva, Paulo Vinicius Pereira Miranda, Letícia de Alencar Pereira Rodrigues, Marcelo Luiz Lima Brandão and Jeancarlo Pereira dos Anjos
Processes 2025, 13(9), 2917; https://doi.org/10.3390/pr13092917 - 12 Sep 2025
Abstract
The pharmaceutical industry faces various production challenges. Bioburden control is essential, and appropriate strategies and procedures must be implemented at all stages of production to prevent microbial contamination and comply with regulatory standards. Quality tools can provide important information for data management in production processes. The objective of this study was to compare two types of statistical process control charts (Laney’s U-chart and Bell distribution) in monitoring the bioburden of a buffer solution used as an input to produce an attenuated viral vaccine. Bioburden data for the buffer solution were obtained over a two-year period. The results showed that the analyzed products met the regulatory specifications, as 99% of them presented ≤ 10 colony-forming units (CFU)/100 mL after filtration. Various microorganisms were identified in the buffer solution, including species of the genera Bacillus, Micrococcus, Kocuria, Staphylococcus, and Acinetobacter. The Bell distribution proved to be statistically more suitable for application in the management of bioburden data for the buffer solution since its limits were closer to the specified value and could more effectively assist in the investigation of process deviations in the production of an attenuated viral vaccine.

25 pages, 13639 KB  
Article
Simulation Study on Optimization of Structural Parameters of Stope Based on Ground Pressure Control
by Yun Lin, Rui Zhou, Keping Zhou, Jielin Li, Chengye Yang, Chaoyang Que, Fengfeng Wu and Yigai Xiao
Appl. Sci. 2025, 15(18), 9998; https://doi.org/10.3390/app15189998 - 12 Sep 2025
Abstract
To address the surrounding-rock instability easily induced by high ground stress in deep-well mining, the optimization of stope structure parameters is studied by combining numerical simulation with theoretical analysis. Firstly, the physical and mechanical properties of the rock mass are characterized through laboratory experiments. Then, six stope structure parameter schemes are preliminarily designed using the Mathews chart method. According to the geological conditions of the Ruihai Gold Mine, a large three-dimensional numerical model is established. Based on FLAC3D, the follow-filling continuous mining method is used to simulate the six schemes. By analyzing the influence of different stope structures on the stress, displacement, and plastic zone evolution of the surrounding rock, the mining strategy that best balances safety and economic benefits in the target area is determined. In the area with good rock mass quality, the optimal stope dimensions are 20 m in height, 15 m in width, and 80 m in length. In rock mass areas crossed by faults or with relatively developed joint fissures, a reduced configuration of 20 m height, 10 m width, and 70 m length is recommended to enhance stability and stress management. Finally, comparative analysis of mining methods confirms that the follow-filling continuous mining method effectively mitigates ground pressure, offering a theoretical foundation for the safe and efficient extraction of deep mineral resources.
(This article belongs to the Special Issue Advanced Technology in Geotechnical Engineering)

16 pages, 751 KB  
Article
Enhancing Sensitivity of Nonparametric Tukey Extended EWMA-MA Charts for Effective Process Mean Monitoring
by Khanittha Talordphop, Yupaporn Areepong and Saowanit Sukparungsee
Symmetry 2025, 17(9), 1457; https://doi.org/10.3390/sym17091457 - 4 Sep 2025
Abstract
A control chart is a crucial statistical process control (SPC) instrument for identifying process variation that may undermine product quality, and combined control charts have been used to enhance detection capability. Nonparametric statistics are a strong and compelling choice when the distribution of the quality characteristic is uncertain. The primary focus of this work is to offer a novel control chart for monitoring the process mean, incorporating a Tukey method, an extended exponentially weighted moving average control chart, and a moving average control chart, called the nonparametric EEWMA-MA chart. Monte Carlo simulation is used to evaluate performance via the zero-state average run length (ARL). The comparison analysis demonstrates that the sensitivity of the proposed chart surpasses that of conventional control charts (including the moving average (MA) chart, the extended exponentially weighted moving average (EEWMA) chart, and the mixed extended exponentially weighted moving average–moving average (EEWMA-MA) chart) in rapidly detecting changes under varying parameter settings, as judged by the minimal ARL. A simplified monitoring scenario using data on vinyl chloride demonstrates the feasibility of the proposed technique.
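The Tukey component of such charts derives nonparametric limits from quartiles. A minimal sketch of Phase I Tukey fences only, without the EEWMA-MA smoothing the authors add; k = 1.5 is the conventional choice and the quantile interpolation rule is one common convention, both assumptions here:

```python
def tukey_limits(phase1, k=1.5):
    """Tukey-style nonparametric control limits from in-control Phase I data:
    LCL = Q1 - k*IQR, UCL = Q3 + k*IQR. No distributional assumption is made
    beyond the Phase I sample representing the in-control process."""
    xs = sorted(phase1)

    def quantile(p):
        # linear interpolation between adjacent order statistics
        pos = p * (len(xs) - 1)
        lo, frac = int(pos), pos - int(pos)
        return xs[lo] + frac * (xs[min(lo + 1, len(xs) - 1)] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

lcl, ucl = tukey_limits(list(range(1, 9)))
print(lcl, ucl)  # -2.5 11.5
```

Any Phase II observation falling outside (LCL, UCL), or any smoothed statistic built from such observations, would then be flagged.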
(This article belongs to the Section Mathematics)
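The zero-state ARL evaluation described in the abstract can be illustrated with a short Monte Carlo sketch. The snippet below estimates the zero-state ARL of a plain EWMA chart applied to a moving average of the observations; the smoothing parameter `lam`, limit multiplier `L`, and window size `span` are hypothetical settings, and the Tukey-based nonparametric limits of the actual EEWMA-MA chart are omitted. This is a minimal sketch of the simulation idea, not the authors' method.

```python
import random
import statistics

def zero_state_arl(lam=0.1, L=2.7, span=5, shift=0.0, reps=2000, seed=42):
    """Estimate the zero-state ARL of an EWMA chart applied to a
    moving average (MA) of standard-normal observations.

    lam, L, and span are hypothetical settings chosen for illustration;
    the limit below is the asymptotic i.i.d. EWMA limit and is only
    approximate here, because successive MA values are autocorrelated.
    """
    rng = random.Random(seed)
    sigma_ma = 1.0 / span ** 0.5                  # std. dev. of a full MA(span) window
    limit = L * sigma_ma * (lam / (2.0 - lam)) ** 0.5
    run_lengths = []
    for _ in range(reps):
        window, z, t = [], 0.0, 0
        while True:
            t += 1
            window.append(rng.gauss(shift, 1.0))  # observation with mean `shift`
            if len(window) > span:
                window.pop(0)
            ma = sum(window) / len(window)        # moving-average statistic
            z = lam * ma + (1.0 - lam) * z        # EWMA of the MA statistic
            if abs(z) > limit or t >= 100_000:    # signal (or safety cap)
                run_lengths.append(t)
                break
    return statistics.mean(run_lengths)
```

Under a sustained mean shift the estimated ARL drops sharply relative to the in-control case, which is the behavior that the paper's minimal-ARL comparison quantifies.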

18 pages, 1007 KB  
Review
Comprehensive Medication Management for Hypertension in the United States: A Scoping Review of Therapeutic, Humanistic, Safety and Economic Outcomes
by Dalia Regos-Stewart, Noel C. Barragan, Scott Weber, Alexander Cantres, Devin Lee, Luis Larios, Evans Pope, Steven Chen and Tony Kuo
Encyclopedia 2025, 5(3), 133; https://doi.org/10.3390/encyclopedia5030133 - 30 Aug 2025
Viewed by 642
Abstract
Emerging research has shown that pharmacist-led comprehensive medication management (CMM) can be an effective strategy for controlling hypertension. A synthesis of the evidence on the overall effects of CMM on clinical, quality, and economic outcomes could help inform improvements in programming and practice, yet such a synthesis is currently lacking in the literature. To address this gap, we conducted a scoping review of CMM effects organized into four domains: therapeutic, humanistic, safety, and economic. Using predefined search terms, we searched the MEDLINE, Cochrane Library, and CINAHL databases for studies published between 2010 and 2024. For each identified study, we applied a multi-stage screening process to extract data, chart results, and synthesize findings, taking into account study design, patient population, CMM implementation, relevance of outcomes to clinical improvement, and other factors deemed relevant to study selection. In total, 49 experimental, observational, and simulation-based studies were included in the scoping review. The synthesis focused on the outcomes most frequently reported and most rigorously evaluated: blood pressure reduction and control, frequency and duration of healthcare visits, and changes in medication regimen and adherence. Overall, CMM interventions had significantly favorable effects on systolic blood pressure reduction, hypertension control, and medication changes. Other outcomes showing positive effects included self-reported patient experience and behaviors, emergency department visits, hospitalizations, mortality, and program costs and related savings from implementing a CMM program. Some results, however, were mixed: a number of studies reported outcome data without significance testing, and many lacked consistent characterization of their programming and implementation processes. Future research and practice evaluations should include these elements in their documentation. Furthermore, a more consistent approach to implementing CMM in the field may better support program delivery fidelity, helping to move CMM from demonstrated efficacy to real-world intervention effectiveness. Full article
(This article belongs to the Section Medicine & Pharmacology)