Search Results (48)

Search Parameters:
Keywords = scale model room approach

50 pages, 28354 KiB  
Article
Mobile Mapping Approach to Apply Innovative Approaches for Real Estate Asset Management: A Case Study
by Giorgio P. M. Vassena
Appl. Sci. 2025, 15(14), 7638; https://doi.org/10.3390/app15147638 - 8 Jul 2025
Viewed by 578
Abstract
Technological development has strongly impacted all processes related to the design, construction, and management of real estate assets. In fact, the introduction of the BIM approach has required the application of three-dimensional survey technologies, and in particular the use of LiDAR instruments, both in their static (TLS—terrestrial laser scanner) and dynamic (iMMS—indoor mobile mapping system) implementations. To implement scan-to-BIM procedures, operators and developers of LiDAR technologies initially focused on the 3D surveying accuracy obtainable from such tools. The incorporation of RGB sensors into these instruments has progressively expanded LiDAR-based applications from basic topographic surveying to geospatial applications, where the emphasis is no longer on the accurate three-dimensional reconstruction of buildings but on the capability to create three-dimensional image-based visualizations, such as virtual tours, which allow the recognition of assets located in every area of the buildings. Although much has been written about obtaining the best possible accuracy for extensive asset surveying of large-scale building complexes using iMMS systems, it is now essential to develop and define suitable procedures for quality-controlling such surveys, targeted at specific geospatial applications. We especially address the design, field acquisition, quality control, and mass data management techniques that might be used in such complex environments. This work contributes by defining the technical specifications for the geospatial mapping of vast asset survey activities on large building sites using iMMS instrumentation. Three-dimensional models can also facilitate virtual tours, enable local measurements inside rooms, and particularly support the subsequent integration of self-locating image-based technologies that can efficiently perform field updates of surveyed databases.
Full article
(This article belongs to the Section Civil Engineering)

16 pages, 1506 KiB  
Article
Data-Driven Fault Detection for HVAC Control Systems in Pharmaceutical Manufacturing Workshops
by Daiyuan Huang and Wenjun Yan
Processes 2025, 13(7), 2015; https://doi.org/10.3390/pr13072015 - 25 Jun 2025
Viewed by 364
Abstract
Large-scale heating, ventilation, and air conditioning (HVAC) control systems in pharmaceutical manufacturing are characterized by complex operational parameters, delayed and often challenging fault detection, and stringent regulatory compliance requirements. To address these issues, this study presents an innovative data-driven fault detection framework that integrates Principal Component Analysis (PCA) with Nonlinear State Estimation Technology (NSET), specifically tailored for highly regulated pharmaceutical production environments. A dataset comprising 13,198 operational records was collected from the SCADA system of a pharmaceutical facility in Zhejiang, China. The data underwent preprocessing and key parameter extraction, after which a nonlinear state estimation predictive model was constructed, with PCA applied for dimensionality reduction and sensitivity enhancement. Fault detection was performed by monitoring deviations in the mixing room temperature, identifying faults when the residuals between observed and predicted values exceeded a statistically determined threshold (mean ± three standard deviations), in accordance with the Laida criterion. The framework’s effectiveness was validated through comparative analysis before and after documented fault events, including temperature sensor drift and abnormal equipment operation. Experimental results demonstrate that the proposed PCA-NSET model enables timely and accurate detection of both gradual and abrupt faults, facilitating early intervention and reducing potential production downtime. Notably, this framework outperforms traditional fault detection methods by providing higher sensitivity and specificity, while also supporting continuous quality assurance and regulatory compliance in pharmaceutical HVAC applications. The findings underscore the practical value and novelty of the integrated PCA-NSET approach for robust, real-time fault detection in mission-critical industrial environments. Full article
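The core detection recipe in this abstract (PCA for dimensionality reduction, a state-estimation prediction, and a mean ± three-standard-deviation residual band) can be sketched in a few lines. The snippet below is a minimal Python illustration on synthetic data, with a similarity-weighted average standing in for the full NSET formulation; all variables, values, and thresholds are hypothetical, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical healthy HVAC records: mixing-room temp, humidity, airflow
healthy = rng.normal([22.0, 45.0, 1.2], [0.3, 2.0, 0.05], size=(200, 3))
memory, val = healthy[:150], healthy[150:]

# PCA for dimensionality reduction (SVD on mean-centred memory data)
mu = memory.mean(axis=0)
_, _, Vt = np.linalg.svd(memory - mu, full_matrices=False)
W = Vt[:2].T                      # keep the two leading components
Z = (memory - mu) @ W             # reduced "memory matrix" of healthy states

def predict(x):
    """Similarity-weighted average of healthy states (a simplified stand-in
    for NSET), computed in the PCA-reduced space and mapped back."""
    z = (x - mu) @ W
    w = 1.0 / (np.linalg.norm(Z - z, axis=1) + 1e-8)
    z_hat = (w / w.sum()) @ Z
    return z_hat @ W.T + mu       # back to sensor space

# Residuals of the monitored variable on held-out healthy data define
# the normal band via the Laida (3-sigma) criterion
resid = np.array([v[0] - predict(v)[0] for v in val])
lo, hi = resid.mean() - 3 * resid.std(), resid.mean() + 3 * resid.std()

# A sample with several degrees of sensor drift falls outside the band
drifted = np.array([26.0, 45.0, 1.2])
r = drifted[0] - predict(drifted)[0]
print(r < lo or r > hi)
```

The same residual test applies to both gradual drift and abrupt faults; only the trajectory of the residual over time differs.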
(This article belongs to the Section Process Control and Monitoring)

24 pages, 2850 KiB  
Article
Solving Three-Stage Operating Room Scheduling Problems with Uncertain Surgery Durations
by Yang-Kuei Lin and Chin Soon Chong
Mathematics 2025, 13(12), 1973; https://doi.org/10.3390/math13121973 - 15 Jun 2025
Viewed by 580
Abstract
Operating room (OR) scheduling problems are often addressed using deterministic models that assume surgery durations are known in advance. However, such assumptions fail to reflect the uncertainty that often occurs in real surgical environments, especially during the surgery and recovery stages. This study focuses on a robust scheduling problem involving a three-stage surgical process that includes pre-surgery, surgery, and post-surgery stages. The scheduling needs to coordinate multiple resources—pre-operative holding unit (PHU) beds, ORs, and post-anesthesia care unit (PACU) beds—while following a strict no-wait rule to keep patient flow continuous without delays between stages. The main goal is to minimize the makespan and improve schedule robustness when surgery and post-surgery durations are uncertain. To solve this problem, we propose a Genetic Algorithm for Robust Scheduling (GARS), which evaluates solutions using a scenario-based robustness criterion derived from multiple sampled instances. GARS is compared with four other algorithms: a deterministic GA (GAD), a random search (BRS), a greedy randomized insertion and swap heuristic (GRIS), and an improved version of GARS with simulated annealing (GARS_SA). The results from different problem sizes and uncertainty levels show that GARS and GARS_SA consistently perform better than the other algorithms. In large-scale tests with moderate uncertainty (30 surgeries, α = 0.5), GARS achieves an average makespan of 633.85, a standard deviation of 40.81, and a worst-case performance ratio (WPR) of 1.00, while GAD reaches 673.75, 54.21, and 1.11, respectively. GARS can achieve robust performance without using any extra techniques to strengthen the search process. Its structure remains simple and easy to use, making it a practical and effective approach for creating reliable and efficient surgical schedules under uncertainty. Full article
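The scenario-based robustness criterion can be illustrated compactly. The sketch below scores a surgery sequence by its worst-case makespan over sampled duration scenarios; it uses a plain three-stage permutation flow shop with one resource per stage and ignores the no-wait constraint and bed counts, so it is a deliberate simplification of the paper's model, with made-up durations.

```python
import itertools
import random

def flowshop_makespan(seq, p):
    """Permutation flow-shop makespan for three stages
    (pre-surgery -> surgery -> post-surgery), one resource per stage."""
    C = [0.0, 0.0, 0.0]            # last completion time on each stage
    for j in seq:
        prev = 0.0                 # current job's finish on previous stage
        for s in range(3):
            prev = max(C[s], prev) + p[j][s]
            C[s] = prev
    return C[-1]

def robust_score(seq, scenarios):
    """Scenario-based criterion: worst-case makespan over samples."""
    return max(flowshop_makespan(seq, sc) for sc in scenarios)

rng = random.Random(42)
# Hypothetical nominal durations (pre, surgery, post) for four surgeries
nominal = {0: [15, 90, 40], 1: [10, 60, 50], 2: [20, 120, 30], 3: [12, 75, 45]}

def sample(alpha=0.5):
    """Perturb each nominal duration by up to +/- alpha (uniform)."""
    return {j: [d * (1 + rng.uniform(-alpha, alpha)) for d in ds]
            for j, ds in nominal.items()}

scenarios = [sample() for _ in range(20)]

# Exhaustive search stands in for the GA on this tiny instance
best = min(itertools.permutations(nominal),
           key=lambda s: robust_score(s, scenarios))
print(best, round(robust_score(best, scenarios), 1))
```

In GARS, this `robust_score` would serve as the fitness function inside a genetic algorithm rather than an exhaustive search.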
(This article belongs to the Special Issue Theory and Applications of Scheduling and Optimization)

15 pages, 429 KiB  
Article
Uncovering the Technical Efficiency Divide Among Apple Farmers in China: Insights from Stochastic Frontier Analysis and Micro-Level Data
by Ruopin Qu, Yongchang Wu and Jing Chen
Horticulturae 2025, 11(6), 655; https://doi.org/10.3390/horticulturae11060655 - 9 Jun 2025
Viewed by 371
Abstract
Based on a sample of 412 apple farmer households across Gansu, Shaanxi, Shanxi, and Shandong provinces in China, this study estimates production efficiency and its determinants for apple growers. The stochastic frontier analysis model estimates technical efficiency while the Tobit model identifies influencing factors. Results show that the average production efficiency of smallholder apple farmers is relatively low at 0.45, indicating significant room for improvement. Production efficiency exhibits an inverted U-shaped relationship with farm scale, and excessive pesticide inputs have a significant negative impact on efficiency. Farmers’ use of computers to search for information was found to significantly improve apple production efficiency, indicating the potential benefits of ICT adoption. However, membership in cooperatives had no significant effect on efficiency. Overall, these findings suggest approaches to enhance the productivity of China’s apple growers through improved resource allocation, optimized farm scale, and the promotion of information technology. Full article
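For readers unfamiliar with frontier-based efficiency scores, the sketch below estimates technical efficiency with corrected OLS (COLS), a simpler deterministic cousin of the stochastic frontier model used in the paper, on synthetic Cobb-Douglas farm data; all inputs and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
# Hypothetical inputs: labour and fertiliser per household
lab = rng.uniform(1, 5, n)
fert = rng.uniform(0.5, 3, n)
ineff = rng.exponential(0.3, n)                  # one-sided inefficiency term
ln_y = (0.5 + 0.4 * np.log(lab) + 0.3 * np.log(fert)
        - ineff + rng.normal(0, 0.05, n))        # log output

# Corrected OLS: fit by least squares, then shift the intercept up so the
# frontier envelops the data (a stand-in for maximum-likelihood SFA)
X = np.column_stack([np.ones(n), np.log(lab), np.log(fert)])
beta, *_ = np.linalg.lstsq(X, ln_y, rcond=None)
resid = ln_y - X @ beta
te = np.exp(resid - resid.max())                 # technical efficiency in (0, 1]
print(round(te.mean(), 2))
```

The most efficient synthetic farm sits on the frontier (efficiency 1); the mean score plays the role of the 0.45 average reported in the study.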

23 pages, 4936 KiB  
Article
A Practical Image Augmentation Method for Construction Safety Using Object Range Expansion Synthesis
by Jaemin Kim, Ingook Wang, Jungho Yu and Seulki Lee
Buildings 2025, 15(9), 1447; https://doi.org/10.3390/buildings15091447 - 24 Apr 2025
Viewed by 592
Abstract
This study aims to propose a practical and realistic synthetic data generation method for object recognition in hazardous and data-scarce environments, such as construction sites. Artificial intelligence (AI) applications in such dynamic domains require domain-specific datasets, yet collecting real-world data can be challenging due to safety concerns, logistical constraints, and high labor costs. To address these limitations, we introduce object range expansion synthesis (ORES), a lightweight and non-generative method for generating synthetic image data by inserting real object masks into varied background scenes using open datasets. ORES synthesizes new scenes, while preserving scale and ground alignment, enabling controllable and realistic data augmentation. A dataset of 30,000 synthetic images was created using the proposed method and used to train an object recognition model. When tested on real-world construction site images, the model achieved a mean average precision at IoU 0.50 (mAP50) of 98.74% and a recall of 54.55%. While recall indicates room for improvement, the high precision highlights the practical value of synthetic data in enhancing model performance without requiring extensive field data collection. This research contributes a scalable approach to data generation in safety-critical and data-deficient environments, reducing dependence on direct data acquisition, while maintaining model efficacy. It provides a foundation for accelerating the deployment of AI technologies in high-risk industries by overcoming data bottlenecks and supporting real-world applications through practical synthetic augmentation. Full article
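The core ORES operation, inserting a real object mask into a new background so the object stays on the ground plane, reduces to a masked copy at a chosen ground line. A minimal NumPy sketch (grayscale images and hypothetical sizes; the real method also handles scale selection and scene variety):

```python
import numpy as np

def paste_on_ground(background, obj, mask, ground_y, x):
    """Insert an object crop into a background so its bottom edge sits on
    the ground line at row `ground_y`, copying only the masked pixels."""
    h, w = obj.shape[:2]
    out = background.copy()
    top, left = ground_y - h, x
    region = out[top:top + h, left:left + w]   # view into the output image
    region[mask] = obj[mask]                   # object pixels only
    return out

bg = np.zeros((100, 100), dtype=np.uint8)          # hypothetical site photo
obj = np.full((20, 10), 255, dtype=np.uint8)       # hypothetical worker crop
mask = obj > 0
out = paste_on_ground(bg, obj, mask, ground_y=80, x=30)
print(out[79, 35], out[59, 35])   # inside the object vs. just above it
```

Repeating this over many backgrounds, positions, and object crops yields the kind of synthetic training set described above.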
(This article belongs to the Special Issue Automation and Robotics in Building Design and Construction)

16 pages, 695 KiB  
Article
Hierarchical Early Wireless Forest Fire Prediction System Utilizing Virtual Sensors
by Ahshanul Haque and Hamdy Soliman
Electronics 2025, 14(8), 1634; https://doi.org/10.3390/electronics14081634 - 18 Apr 2025
Viewed by 459
Abstract
Deploying thousands of sensors across remote and challenging environments—such as the Amazon rainforest, Californian wilderness, or Australian bushlands—is a critical yet complex task for forest fire monitoring. While our backyard emulation confirmed the feasibility of small-scale deployment as a proof of concept, large-scale scenarios demand a scalable, efficient, and fault-tolerant network design. This paper proposes a Hierarchical Wireless Sensor Network (HWSN) deployment strategy with adaptive head node selection to maximize area coverage and energy efficiency. The network architecture follows a three-level hierarchy: at the first level, cells of individual sensor nodes connect to dynamically assigned cell heads; at the second level, these cell heads are aggregated into clusters, each with an assigned cluster head; finally, the cluster heads are divided into regions, each with a region head that reports the collected information from the forest floor directly to a central control sink room for decision-making analysis. Unlike traditional centralized or uniformly distributed models, our adaptive approach leverages a greedy coverage maximization algorithm to dynamically select the head nodes that contribute most to sensed-data coverage at each level. Through extensive simulations, the adaptive model achieved over 96.26% coverage using significantly fewer nodes, while reducing node transmission distances and energy consumption. This facilitates the real-world deployment of our HWSN model in large-scale, remote forest regions, with very promising performance. Full article
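Greedy coverage maximization, as used here for head-node selection, has a standard set-cover-style form: at each step, pick the node that adds the most newly covered area. A small self-contained sketch with hypothetical node-to-cell coverage sets:

```python
def greedy_heads(coverage, k):
    """Pick up to k head nodes that greedily maximise newly covered cells.
    `coverage` maps node id -> set of cell ids that node can cover."""
    covered, heads = set(), []
    for _ in range(k):
        best = max(coverage, key=lambda n: len(coverage[n] - covered))
        gain = coverage[best] - covered
        if not gain:              # no node adds new coverage; stop early
            break
        heads.append(best)
        covered |= gain
    return heads, covered

# Hypothetical candidate nodes and the forest cells each can sense
cov = {
    "A": {1, 2, 3, 4},
    "B": {3, 4, 5},
    "C": {5, 6},
    "D": {1, 6},
}
heads, covered = greedy_heads(cov, 2)
print(heads, sorted(covered))   # ['A', 'C'] [1, 2, 3, 4, 5, 6]
```

The same selection step can be reapplied at each hierarchy level (cell heads, cluster heads, region heads) with the appropriate coverage sets.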

18 pages, 1131 KiB  
Article
A Retrospective Cohort Study on the Side Effects of Intrathecal Morphine Administration Combined with General Anaesthesia Versus General Anaesthesia Alone in Prostatectomy Patients
by Timon Marvin Schnabel, Katharina Fetz, Hanaa Baagil, Kim Kutun, Claus Eisenberger and Mark Ulrich Gerbershagen
Medicina 2025, 61(4), 732; https://doi.org/10.3390/medicina61040732 - 15 Apr 2025
Viewed by 600
Abstract
Background and Objectives: Prostatectomy is a common surgical procedure for prostate cancer, the most frequently diagnosed cancer in the male population. The choice of anaesthetic technique has a significant impact on postoperative pain management. The changes in recommendations between 2015 and 2021 prompted this study to evaluate the impact of intrathecal morphine administration in combination with general anaesthesia compared to general anaesthesia alone on postoperative analgesic consumption and the associated side effects. Material and Methods: A single-centre retrospective cohort study was conducted, analysing data from 202 patients who underwent a prostatectomy between 2015 and 2021. Patients were divided into two groups: 147 patients received intrathecal morphine combined with general anaesthesia, while 49 patients received general anaesthesia alone. Key postoperative parameters, including numerical rating scale (NRS) scores, analgesic consumption, and side effects (e.g., nausea, pruritus, hypotension, and respiratory depression) were evaluated. Statistical analyses were performed using Mann–Whitney U-tests and multiple regression models. Results: The group receiving intrathecal morphine showed a significant decrease in NRS pain scores at rest and during movement in the recovery room (p < 0.001). The need for postoperative analgesics, especially opioids such as piritramide, was reduced in this group. No significant increase in serious side effects such as respiratory depression was observed. Conclusions: The present study investigates the potential of intrathecal morphine combined with general anaesthesia as a promising approach to improve pain management in prostatectomy patients. By reducing pain intensity, this method shows significant clinical benefits. In addition, the absence of a significant increase in serious adverse events reinforces the safety of this approach. 
However, further studies are warranted to assess the long-term outcomes and explore optimal dosing strategies. The reintroduction of this anaesthetic technique has great potential to improve patient recovery and satisfaction following major surgery. Full article
(This article belongs to the Special Issue Advanced Research on Anesthesiology and Pain Management)

12 pages, 1303 KiB  
Article
The Effect of Hydrogen Peroxide on Biogas and Methane Produced from Batch Mesophilic Anaerobic Digestion of Spent Coffee Grounds
by Siham Sayoud, Kerroum Derbal, Antonio Panico, Ludovico Pontoni, Massimiliano Fabbricino, Francesco Pirozzi and Abderrezzaq Benalia
Fermentation 2025, 11(2), 60; https://doi.org/10.3390/fermentation11020060 - 29 Jan 2025
Cited by 1 | Viewed by 1400
Abstract
This paper aims to explore both experimental and modeling anaerobic digestion (AD) processes as innovative methods for managing the substantial quantities of spent coffee grounds (SCG) generated in Algeria, transforming them into valuable renewable energy sources (biogas/methane). AD of SCG, while promising, is hindered by its complex lignocellulosic structure, which poses a significant challenge. This study investigates the efficacy of hydrogen peroxide (H2O2) pretreatment in addressing this issue, with a particular focus on enhancing biogas and methane production. The AD of SCG was conducted over a 46-day period, and the impact of H2O2 pretreatment was evaluated using laboratory-scale batch anaerobic reactors. Four H2O2 concentrations (0.5, 1, 2, and 4% w/w) were applied as a 24 h pretreatment at room temperature, with digestion carried out under mesophilic conditions (37 ± 2 °C), providing basic data on biogas and methane production. The results showed a significant increase in soluble chemical oxygen demand (SCOD) and total sugar solubilization in the range of 555.96–713.02% and 748.48–817.75%, respectively. The optimal pretreatment was found to be 4% H2O2 w/w, resulting in 16.28% and 16.93% improvements in biogas and methane yield over the untreated SCG. Further, while previous research has established oxidative pretreatment efficacy, this study uniquely combines the empirical analysis of H2O2 pretreatment with a detailed kinetic modeling approach using the modified Gompertz (MG) and logistic function (LF) models to estimate kinetic parameters and determine the accuracy of fit. The MG model showed the most accurate prediction, thus making the present investigation a contribution to understanding the performance of the AD system under oxidative pretreatment and designing and scaling up new systems with predictability. These findings highlight the potential of H2O2-pretreated SCG as a more efficient and readily available resource for sustainable waste management and renewable energy production.
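The modified Gompertz model referenced for the kinetic fits has a standard closed form. A small sketch evaluating it over the 46-day window with hypothetical parameters (the fitted values for SCG are reported in the paper, not reproduced here):

```python
import math

def modified_gompertz(t, P, Rm, lam):
    """Cumulative yield B(t) = P * exp(-exp(Rm*e/P * (lam - t) + 1)),
    with ultimate potential P, maximum rate Rm, and lag phase lam."""
    return P * math.exp(-math.exp(Rm * math.e / P * (lam - t) + 1))

# Hypothetical fit: P in mL/g VS, Rm in mL/(g VS*day), lam in days
P, Rm, lam = 200.0, 15.0, 2.0
curve = [round(modified_gompertz(t, P, Rm, lam), 1) for t in (0, 5, 15, 46)]
print(curve)
```

In practice the three parameters are estimated by nonlinear least squares against the measured cumulative biogas or methane curve, and goodness of fit decides between the MG and logistic forms.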
Full article
(This article belongs to the Special Issue Biofuels Production and Processing Technology, 3rd Edition)

23 pages, 5966 KiB  
Article
Intelligent Human–Computer Interaction for Building Information Models Using Gesture Recognition
by Tianyi Zhang, Yukang Wang, Xiaoping Zhou, Deli Liu, Jingyi Ji and Junfu Feng
Inventions 2025, 10(1), 5; https://doi.org/10.3390/inventions10010005 - 16 Jan 2025
Cited by 2 | Viewed by 1292
Abstract
Human–computer interaction (HCI) with three-dimensional (3D) Building Information Modelling/Model (BIM) is the crucial ingredient to enhancing the user experience and fostering the value of BIM. Current BIMs mostly use a keyboard, mouse, or touchscreen as media for HCI. Using these hardware devices for HCI with BIM may lead to space constraints and a lack of visual intuitiveness. Somatosensory interaction, e.g., gesture interaction, is an emergent interaction modality that requires no equipment or direct touch and presents a potential approach to solving these problems. This paper proposes a computer-vision-based gesture interaction system for BIM. Firstly, a set of gestures for BIM model manipulation was designed, grounded in human ergonomics. These gestures include selection, translation, scaling, rotation, and restoration of the 3D model. Secondly, a gesture understanding algorithm dedicated to 3D model manipulation is introduced in this paper. Then, an interaction system for 3D models based on machine vision and gesture recognition was developed. A series of systematic experiments was conducted to confirm the effectiveness of the proposed system. In various environments, including pure white backgrounds, offices, and conference rooms, and even when wearing gloves, the system achieved an accuracy rate of over 97% and maintained a frame rate between 26 and 30 frames per second. The final experimental results show that the method has good performance, confirming its feasibility, accuracy, and fluidity. Somatosensory interaction with 3D models enhances the interaction experience and operation efficiency between the user and the model, further expanding the application scene of BIM. Full article
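A gesture-interaction pipeline of this kind ultimately maps recognized gesture labels to model transformations. The sketch below shows that dispatch step on a toy 3D point set; the gesture names and their mappings are invented for illustration and are not the paper's gesture set.

```python
import numpy as np

def rotate_z(points, deg):
    """Rotate an (n, 3) point array about the z axis by `deg` degrees."""
    t = np.radians(deg)
    R = np.array([[np.cos(t), -np.sin(t), 0.0],
                  [np.sin(t),  np.cos(t), 0.0],
                  [0.0,        0.0,       1.0]])
    return points @ R.T

# Hypothetical mapping from recognised gesture labels to model operations
ACTIONS = {
    "pinch":  lambda p: p * 0.5,                       # scale down
    "spread": lambda p: p * 2.0,                       # scale up
    "swipe":  lambda p: p + np.array([1.0, 0.0, 0.0]), # translate
    "circle": lambda p: rotate_z(p, 90),               # rotate about z
}

def apply_gesture(label, points, original):
    """Dispatch a recognised gesture; 'restore' resets the model."""
    if label == "restore":
        return original.copy()
    return ACTIONS[label](points)

model = np.array([[1.0, 0.0, 0.0]])
moved = apply_gesture("circle", model, model)
print(np.round(moved, 6))   # the point rotates onto the y axis
```

The recognition stage (hand detection and gesture classification) would feed the `label` argument; everything downstream is ordinary linear algebra on the model geometry.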

30 pages, 18075 KiB  
Article
A Workflow for a Building Information Modeling-Based Thermo-Hygrometric Digital Twin: An Experimentation in an Existing Building
by Tullio De Rubeis, Annamaria Ciccozzi, Mattia Ragnoli, Vincenzo Stornelli, Stefano Brusaporci, Alessandra Tata and Dario Ambrosini
Sustainability 2024, 16(23), 10281; https://doi.org/10.3390/su162310281 - 24 Nov 2024
Cited by 3 | Viewed by 1641
Abstract
A Building Information Modeling (BIM)-based digital twin (DT) could play a fundamental role in overcoming the limitations of traditional monitoring methods by driving the digitalization of the construction sector. While existing studies on the topic have provided valuable insights, significant knowledge gaps remain, which continue to hinder the large-scale adoption of this approach. Moreover, to date, no standardized procedure is available to guide the step-by-step creation of a DT. Another significant challenge concerns the choice of technologies that can integrate seamlessly with each other throughout the process. This paper outlines a comprehensive workflow for creating a DT of an existing building and proposes various solutions to improve the integration of the different technologies involved. These enhancements aim to address the limitations of current monitoring methods and leverage the advantages of BIM and DT for accessing and managing monitoring data, ultimately facilitating the implementation of energy-efficient interventions. This work examines the concept of a “Living Lab” in an office building also used as an academic laboratory. The created DT allowed for real-time remote monitoring of four rooms, each with different functional and occupancy characteristics, and is also useful for future predictive analyses. Full article

11 pages, 1096 KiB  
Article
The Scale Model Room Approach to Test the Performance of Airtight Membranes to Control Indoor Radon Levels and Radiation Exposure
by Manuela Portaro, Ilaria Rocchetti, Paola Tuccimei, Gianfranco Galli, Michele Soligo, Cristina Longoni and Dino Vasquez
Atmosphere 2024, 15(10), 1260; https://doi.org/10.3390/atmos15101260 - 21 Oct 2024
Cited by 1 | Viewed by 1216
Abstract
Indoor radon is one of the most significant contributors to lung cancer after smoking. Mitigation strategies based on protecting buildings with radon barrier materials, combined with home ventilation or room pressurization, are regularly used. A scale model room made from a porous ignimbrite rich in radon precursors was used as an analogue to test the efficiency of fifteen airtight membranes to reduce radon levels, also in combination with room pressurization. The results of these experiments were considered together with previous ones to propose the scale model room approach as a tool for rapidly evaluating the performance of specially designed radon barrier materials, and for radiation exposure assessment. Relative reduction of indoor radon (RIR) ranges from −20 to −94%. The most effective materials were FPO membrane, single-component silane-terminated polymer membranes and synthetic resins. The presence of additives likely modified the composition and structure of some products, improving their radon barrier capacity. The introduction of room pressurization further reduced radon levels in the model room where the membranes were applied. The overpressure necessary to reach RIRs of the order of 85–90% is very low for materials that powerfully stop radon even without ventilation, but necessarily higher for poorer membranes. Full article
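The relative indoor radon reduction (RIR) reported here is a simple percentage change relative to the unprotected model room. A one-function sketch with hypothetical concentrations:

```python
def relative_indoor_radon_reduction(c_bare, c_membrane):
    """RIR (%) of a membrane-protected room relative to the bare model
    room; negative values mean the membrane lowered radon levels."""
    return (c_membrane - c_bare) / c_bare * 100.0

# Hypothetical indoor radon concentrations in Bq/m^3
c_bare = 5000.0       # model room with no membrane
c_membrane = 300.0    # same room sealed with a high-performing membrane
print(round(relative_indoor_radon_reduction(c_bare, c_membrane), 1))
```

A value near -94% corresponds to the best membranes in the reported -20% to -94% range; room pressurization shifts the protected-room concentration, and hence the RIR, further down.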
(This article belongs to the Special Issue Environmental Radon Measurement and Radiation Exposure Assessment)

13 pages, 2723 KiB  
Article
Research on Wi-Fi Fingerprint Database Construction Method Based on Environmental Feature Awareness
by Jiaxuan Wu, Tianzhong Yang and Zengting Zhang
Appl. Syst. Innov. 2024, 7(5), 99; https://doi.org/10.3390/asi7050099 - 18 Oct 2024
Cited by 1 | Viewed by 1216
Abstract
Indoor localization technology is becoming increasingly widespread, but traditional methods for constructing Wi-Fi fingerprint databases face significant challenges, particularly in large, multi-room environments. These methods often suffer from low efficiency and high costs associated with manual data collection. To address these issues, various approaches like crowdsourcing and sparse collection have been introduced, but they still struggle with limitations such as inadequate data accuracy and uneven distribution. In this paper, we present a novel method for constructing Wi-Fi fingerprint databases based on environmental feature awareness. By leveraging deep learning to analyze the relationship between environmental features and Wi-Fi signal strength, our method enables faster and more efficient database construction. Experimental results demonstrate that our environmental feature-aware model significantly outperforms existing methods in prediction accuracy, greatly enhancing both the efficiency and accuracy of Wi-Fi fingerprint database construction. This approach also reduces the need for manual intervention and improves generalization capabilities. Our method proves to be highly practical and adaptable, especially in large-scale structures like nursing homes. It holds a substantial potential for broader application in extensive indoor environments, offering considerable value for widespread adoption. Full article
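Once a fingerprint database exists, localization is typically a nearest-neighbour lookup in signal space. A minimal sketch with a hypothetical database (the paper's contribution is building such a database efficiently, not this lookup step):

```python
import math

# Hypothetical fingerprint database: (room, x, y) -> RSSI vector (dBm),
# one entry per access point
fingerprints = {
    ("room1", 1.0, 2.0): [-40, -70, -80],
    ("room1", 3.0, 2.0): [-45, -65, -82],
    ("room2", 8.0, 1.0): [-75, -42, -60],
}

def locate(rssi):
    """Return the reference location whose stored RSSI vector is closest
    (Euclidean distance in signal space) to the observed scan."""
    return min(fingerprints,
               key=lambda loc: math.dist(fingerprints[loc], rssi))

print(locate([-44, -66, -81]))   # matches the nearest room1 reference point
```

The environmental-feature-aware model in the paper predicts these stored RSSI vectors instead of measuring them all manually, which is what cuts the collection cost.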

7 pages, 2075 KiB  
Review
Biomedical Flat and Nested Named Entity Recognition: Methods, Challenges, and Advances
by Yesol Park, Gyujin Son and Mina Rho
Appl. Sci. 2024, 14(20), 9302; https://doi.org/10.3390/app14209302 - 12 Oct 2024
Cited by 3 | Viewed by 2806
Abstract
Biomedical named entity recognition (BioNER) aims to identify and classify biomedical entities (i.e., diseases, chemicals, and genes) from text into predefined classes. This process serves as an important initial step in extracting biomedical information from textual sources. Considering the structure of the entities it addresses, BioNER tasks are divided into two categories: flat NER, where entities are non-overlapping, and nested NER, which identifies entities embedded within other entities. While early studies primarily addressed flat NER, recent advances in neural models have enabled more sophisticated approaches to nested NER, which is gaining increasing relevance in the biomedical field, where entity relationships are often complex and hierarchically structured. This review, thus, focuses on the latest progress in large-scale pre-trained language model-based approaches, which have significantly improved NER performance. The state-of-the-art flat NER models have achieved average F1-scores of 84% on BC2GM, 89% on NCBI Disease, and 92% on BC4CHEM, while nested NER models have reached 80% on the GENIA dataset, indicating room for enhancement. In addition, we discuss persistent challenges, including inconsistencies in named entity annotations across different corpora and the limited availability of named entities of various entity types, particularly for multi-type or nested NER. To the best of our knowledge, this paper is the first comprehensive review of pre-trained language model-based flat and nested BioNER models, providing a categorical analysis of the methods and related challenges for future research and development in the field. Full article
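The flat/nested distinction comes down to whether entity spans may contain one another. A tiny sketch using character offsets, with one nested biomedical mention (the offsets and labels are illustrative, not from any of the cited corpora):

```python
def is_nested(span_a, span_b):
    """True if span_a lies inside span_b (character offsets, end-exclusive)
    without being identical to it."""
    (s1, e1), (s2, e2) = span_a, span_b
    return s2 <= s1 and e1 <= e2 and (s1, e1) != (s2, e2)

text = "human epidermal growth factor receptor 2"
outer = (0, 40)    # the full mention, a gene name
inner = (6, 29)    # "epidermal growth factor", a protein mention inside it

print(text[inner[0]:inner[1]], is_nested(inner, outer))
```

A flat NER annotation scheme would keep only one of the two spans; a nested scheme keeps both, which is why nested models need span-based rather than per-token BIO decoding.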
(This article belongs to the Special Issue Advances and Applications of Complex Data Analysis and Computing)
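The flat/nested distinction the abstract draws can be illustrated with a minimal span-decoding sketch. The span scores below stand in for the output of a pre-trained language model's span classifier; the example phrase, labels, and threshold are hypothetical, not taken from the reviewed models:

```python
def decode_flat(spans, threshold=0.5):
    """Greedy flat decoding: keep the highest-scoring spans, rejecting overlaps."""
    chosen = []
    for start, end, label, score in sorted(spans, key=lambda s: -s[3]):
        if score < threshold:
            continue
        # A span is kept only if it does not overlap any already-chosen span.
        if all(end <= c_start or start >= c_end for c_start, c_end, _, _ in chosen):
            chosen.append((start, end, label, score))
    return sorted(chosen)

def decode_nested(spans, threshold=0.5):
    """Nested decoding: keep every span above threshold; overlaps are allowed."""
    return sorted(s for s in spans if s[3] >= threshold)

# Hypothetical span scores over the tokens "human immunodeficiency virus gene":
spans = [
    (0, 4, "Gene", 0.91),   # the whole phrase
    (0, 3, "Virus", 0.88),  # entity embedded inside it
    (2, 4, "Gene", 0.31),   # low-confidence candidate
]
flat = decode_flat(spans)      # only the top non-overlapping span survives
nested = decode_nested(spans)  # both high-confidence spans survive
```

Flat decoding discards the embedded "Virus" mention because it overlaps the higher-scoring "Gene" span, while nested decoding retains both, which is exactly the structural difference that makes nested BioNER harder to evaluate and annotate.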
15 pages, 3035 KiB  
Article
Multicenter Analysis of Emergency Patient Severity through Local Model Evaluation Client Selection: Optimizing Client Selection Based on Local Model Evaluation
by Yong-gyom Kim, SeMo Yang and KangYoon Lee
Appl. Sci. 2024, 14(16), 6876; https://doi.org/10.3390/app14166876 - 6 Aug 2024
Cited by 1 | Viewed by 1093
Abstract
In multi-institutional emergency room settings, the early identification of high-risk patients is crucial for effective severity management. This necessitates the development of advanced models capable of accurately predicting patient severity based on initial conditions. However, collecting and analyzing large-scale data for high-performance predictive models is challenging due to privacy and data security concerns in integrating data from multiple emergency rooms. To address this, our work applies federated learning (FL) techniques, maintaining privacy without centralizing data. Medical data, which are often non-independent and identically distributed (non-IID), pose challenges for existing FL methods, in which random client selection can degrade overall FL performance. Therefore, we introduce a new client selection mechanism based on local model evaluation (LMECS), enhancing performance and practicality. This approach shows that the proposed FL model can achieve performance comparable to centralized models while maintaining data privacy. The execution time was reduced by up to 27% compared to the existing FL algorithm. In addition, compared to the average performance of local models without FL, our LMECS improved the AUC by 2% and achieved up to 23% performance improvement compared to the existing FL algorithm. This work presents the potential for effective patient severity management in multi-institutional emergency rooms using FL without data movement, offering an innovative approach that satisfies both medical data privacy and efficient utilization. Full article
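The client-selection idea can be sketched in a few lines: evaluate each client's local model, keep the top performers for the round, then average their updates. The hospital names, AUC values, weight vectors, and `top_k` parameter below are illustrative assumptions, not values from the paper:

```python
def select_clients(local_metrics, top_k):
    """Rank clients by their local model's evaluation metric (e.g., AUC)
    and keep only the top_k for this federated round."""
    ranked = sorted(local_metrics, key=lambda c: -c[1])
    return [name for name, _ in ranked[:top_k]]

def fedavg(updates):
    """Average the selected clients' model weights coordinate-wise."""
    n = len(updates)
    return [sum(ws) / n for ws in zip(*updates)]

# Hypothetical local AUCs reported by four emergency-room sites:
metrics = [("hospital_a", 0.82), ("hospital_b", 0.74),
           ("hospital_c", 0.88), ("hospital_d", 0.69)]
selected = select_clients(metrics, top_k=2)  # best two sites by local AUC

# Hypothetical weight vectors from the selected sites:
weights = {"hospital_c": [0.2, 0.4], "hospital_a": [0.4, 0.0]}
global_update = fedavg([weights[c] for c in selected])  # ≈ [0.3, 0.2]
```

Selecting by a local evaluation metric rather than at random is what lets this kind of scheme cope with non-IID data: clients whose local models generalize poorly contribute less often to the global average.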
17 pages, 5784 KiB  
Article
Advanced Computational Analysis of Cobalt-Based Superalloys through Crystal Plasticity
by Shahriyar Keshavarz, Carelyn E. Campbell and Andrew C. E. Reid
Materials 2024, 17(10), 2458; https://doi.org/10.3390/ma17102458 - 20 May 2024
Cited by 2 | Viewed by 1439
Abstract
This study introduces an advanced computational method aimed at accelerating continuum-scale processes using crystal plasticity approaches to predict mechanical responses in cobalt-based superalloys. The framework integrates two levels, namely, sub-grain and homogenized, at the meso-scale through crystal plasticity finite element (CPFE) platforms. The model is applicable across a temperature range from room temperature up to 900 °C, accommodating various dislocation mechanisms in the microstructure. The sub-grain level explicitly incorporates precipitates and employs a size-dependent, dislocation density-based constitutive model. In contrast, the homogenized level utilizes an activation energy-based constitutive model, implicitly representing the γ phase for computational efficiency. This level considers the effects of composition and morphology on mechanical properties, demonstrating the potential for cobalt-based superalloys to rival nickel-based superalloys. The study aims to investigate the impacts of elements including tungsten, tantalum, titanium, and chromium through the homogenized constitutive model. The model accounts for the locking mechanism to address the cross-slip of screw dislocations at lower temperatures, as well as the glide and climb mechanism to simulate diffusion at higher temperatures. The model’s validity is established across diverse compositions, morphologies, and temperatures through comparison with experimental data. This advanced computational framework not only enables accurate predictions of mechanical responses in cobalt-based superalloys across a wide temperature range, but also provides valuable insights into the design and optimization of these materials for high-temperature applications. Full article
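As a numerical illustration of the kind of rate-dependent slip law that CPFE frameworks evaluate on each slip system, the sketch below uses the generic power-law viscoplastic form, not the paper's specific dislocation density-based or activation energy-based models; all parameter values are hypothetical:

```python
import math

def slip_rate(tau, g, gamma0=1e-3, m=0.05):
    """Generic power-law viscoplastic slip rate on one slip system:
    gamma_dot = gamma0 * |tau / g|**(1/m) * sign(tau),
    where tau is the resolved shear stress, g the slip resistance,
    gamma0 a reference rate, and m the rate-sensitivity exponent."""
    return gamma0 * abs(tau / g) ** (1.0 / m) * math.copysign(1.0, tau)

# Hypothetical state: 10% overstress (tau/g = 1.1) on a slip system.
rate = slip_rate(tau=330.0, g=300.0, gamma0=1e-3, m=0.05)  # ≈ 6.73e-3
```

With a small rate-sensitivity exponent such as m = 0.05, the response is sharply nonlinear: a 10% overstress multiplies the slip rate by roughly a factor of seven, which is why constitutive parameters in such models must be calibrated against experiments over the full temperature range.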