Search Results (3,797)

Search Parameters:
Keywords = trend extraction

12 pages, 1839 KiB  
Article
A Knowledge–Data Dual-Driven Groundwater Condition Prediction Method for Tunnel Construction
by Yong Huang, Wei Fu and Xiewen Hu
Information 2025, 16(8), 659; https://doi.org/10.3390/info16080659 - 1 Aug 2025
Abstract
This paper introduces a knowledge–data dual-driven method for predicting groundwater conditions during tunnel construction. Unlike existing methods, our approach effectively integrates trend characteristics of apparent resistivity from detection results with geological distribution characteristics and expert insights. This dual-driven strategy significantly enhances the accuracy of the prediction model. The intelligent prediction process for tunnel groundwater conditions proceeds in the following steps: First, the apparent resistivity data matrix is obtained from transient electromagnetic detection results and standardized. Second, to improve data quality, trend characteristics are extracted from the apparent resistivity data, and outliers are eliminated. Third, expert insights are systematically integrated to fully utilize prior information on groundwater conditions at the construction face, leading to the establishment of robust predictive models tailored to data from various construction surfaces. Finally, the relevant prediction segment is extracted to complete the groundwater condition forecast. Full article
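The preprocessing pipeline this abstract describes (standardize the resistivity matrix, extract a trend, eliminate outliers) can be sketched generically. This is not the paper's code; the moving-average window and 3-sigma rule are illustrative assumptions:

```python
import numpy as np

def standardize(x):
    # Z-score standardization of an apparent-resistivity series
    return (x - x.mean()) / x.std()

def moving_average_trend(x, window=5):
    # Centered moving average as a simple trend estimate (assumed form)
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def remove_outliers(x, trend, k=3.0):
    # Replace points deviating more than k sigma from the trend
    # with the trend value itself
    resid = x - trend
    cleaned = x.copy()
    mask = np.abs(resid) > k * resid.std()
    cleaned[mask] = trend[mask]
    return cleaned
```

Detrending before thresholding is what lets a fixed k catch spikes even when the underlying resistivity drifts across the section.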

37 pages, 2065 KiB  
Review
Research Activities on Acid Mine Drainage Treatment in South Africa (1998–2025): Trends, Challenges, Bibliometric Analysis and Future Directions
by Tumelo M. Mogashane, Johannes P. Maree, Lebohang Mokoena and James Tshilongo
Water 2025, 17(15), 2286; https://doi.org/10.3390/w17152286 - 31 Jul 2025
Abstract
Acid mine drainage (AMD) remains a critical environmental challenge in South Africa due to its severe impact on water quality, ecosystems and public health. Numerous studies on AMD management, treatment and resource recovery have been conducted over the past 20 years. This study presents a comprehensive review of research activities on AMD in South Africa from 1998 to 2025, highlighting key trends, emerging challenges and future directions. The study reveals a significant focus on passive and active treatment methods, environmental remediation and the recovery of valuable resources, such as iron, rare earth elements (REEs) and gypsum. A bibliometric analysis was conducted to identify the most influential studies and thematic research areas over the years. Bibliometric tools (Biblioshiny and VOSviewer) were used to analyse the data that was extracted from the PubMed database. The findings indicate that research production has increased significantly over time, with substantial contributions from top academics and institutions. Advanced treatment technologies, the use of artificial intelligence and circular economy strategies for resource recovery are among the new research prospects identified in this study. Despite substantial progress, persistent challenges, such as scalability, economic viability and policy implementation, remain. Furthermore, few technologies have moved beyond pilot-scale implementation, underscoring the need for greater investment in field-scale research and technology transfer. This study recommends stronger industry–academic collaboration, the development of standardised treatment protocols and enhanced government policy support to facilitate sustainable AMD management. The study emphasises the necessity of data-driven approaches, sustainable technology and interdisciplinary cooperation to address AMD’s socioeconomic and environmental effects in the ensuing decades. Full article
27 pages, 2137 KiB  
Article
DKWM-XLSTM: A Carbon Trading Price Prediction Model Considering Multiple Influencing Factors
by Yunlong Yu, Xuan Song, Guoxiong Zhou, Lingxi Liu, Meixi Pan and Tianrui Zhao
Entropy 2025, 27(8), 817; https://doi.org/10.3390/e27080817 - 31 Jul 2025
Abstract
Forestry carbon sinks play a crucial role in mitigating climate change and protecting ecosystems, significantly contributing to the development of carbon trading systems. Remote sensing technology has become increasingly important for monitoring carbon sinks, as it allows for precise measurement of carbon storage and ecological changes, which are vital for forecasting carbon prices. Carbon prices fluctuate due to the interaction of various factors, exhibiting non-stationary characteristics and inherent uncertainties, making accurate predictions particularly challenging. To address these complexities, this study proposes a method for predicting carbon trading prices influenced by multiple factors. We introduce a Decomposition (DECOMP) module that separates carbon price data and its influencing factors into trend and cyclical components. To manage non-stationarity, we propose the KAN with Multi-Domain Diffusion (KAN-MD) module, which efficiently extracts relevant features. Furthermore, a Wave-MH attention module, based on wavelet transformation, is introduced to minimize interference from uncertainties, thereby enhancing the robustness of the model. Empirical research using data from the Hubei carbon trading market demonstrates that our model achieves superior predictive accuracy and resilience to fluctuations compared to other benchmark methods, with an MSE of 0.204% and an MAE of 0.0277. These results provide reliable support for pricing carbon financial derivatives and managing associated risks. Full article
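The DECOMP idea — separating each input series into a trend and a cyclical component — can be illustrated with a minimal sketch. The least-squares polynomial trend used here is an assumption for illustration; the paper does not specify this form:

```python
import numpy as np

def decompose(series, degree=1):
    # Fit a low-order polynomial as the trend component and treat the
    # residual as the cyclical component, so trend + cyclical == series.
    t = np.arange(len(series), dtype=float)
    trend = np.polyval(np.polyfit(t, series, degree), t)
    return trend, series - trend
```

By construction the two components sum back to the original series, which makes the split trivial to verify before feeding each part to a downstream predictor.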
52 pages, 2383 KiB  
Review
Enhancing Human Health Through Nutrient and Bioactive Compound Recovery from Agri-Food By-Products: A Decade of Progress
by Cinzia Ingallina, Mattia Spano, Sabrina Antonia Prencipe, Giuliana Vinci, Antonella Di Sotto, Donatella Ambroselli, Valeria Vergine, Maria Elisa Crestoni, Chiara Di Meo, Nicole Zoratto, Luana Izzo, Abel Navarré, Giuseppina Adiletta, Paola Russo, Giacomo Di Matteo, Luisa Mannina and Anna Maria Giusti
Nutrients 2025, 17(15), 2528; https://doi.org/10.3390/nu17152528 - 31 Jul 2025
Abstract
In light of pressing global nutritional needs, the valorization of agri-food waste constitutes a vital strategy for enhancing human health and nutrition, while simultaneously supporting planetary health. This integrated approach is increasingly indispensable within sustainable and equitable food systems. Recently, a sustainability-driven focus has shifted attention toward the valorization of the agri-food by-products as rich sources of bioactive compounds useful in preventing or treating chronic diseases. Agri-food by-products, often regarded as waste, actually hold great potential as they are rich in bioactive components, dietary fiber, and other beneficial nutrients from which innovative food ingredients, functional foods, and even therapeutic products are developed. This review aims to provide a comprehensive analysis of the current advances in recovering and applying such compounds from agri-food waste, with a particular focus on their roles in human health, sustainable packaging, and circular economy strategies. Methods: This review critically synthesizes recent scientific literature on the extraction, characterization, and utilization of bioactive molecules from agri-food by-products. After careful analysis of the PubMed and Scopus databases, only English-language articles from the last 10 years were included in the final narrative review. The analysis also encompasses applications in the nutraceutical, pharmaceutical, and food packaging sectors. Results: Emerging technologies have enabled the efficient and eco-friendly recovery of compounds such as polyphenols, carotenoids, and dietary fibers that demonstrate antioxidant, antimicrobial, and anti-inflammatory properties. These bioactive compounds support the development of functional foods and biodegradable packaging materials. Furthermore, these valorization strategies align with global health trends by promoting dietary supplements that counteract the effects of the Western diet and chronic diseases. 
Conclusions: Valorization of agri-food by-products offers a promising path toward sustainable development by reducing waste, enhancing public health, and driving innovation. This strategy not only minimizes waste and supports sustainability, but also promotes a more nutritious and resilient food system. Full article
(This article belongs to the Special Issue Nutrition 3.0: Between Tradition and Innovation)
23 pages, 5770 KiB  
Article
Assessment of Influencing Factors and Robustness of Computable Image Texture Features in Digital Images
by Diego Andrade, Howard C. Gifford and Mini Das
Tomography 2025, 11(8), 87; https://doi.org/10.3390/tomography11080087 - 31 Jul 2025
Abstract
Background/Objectives: There is significant interest in using texture features to extract hidden image-based information. In medical imaging applications using radiomics, AI, or personalized medicine, the quest is to extract patient or disease specific information while being insensitive to other system or processing variables. While we use digital breast tomosynthesis (DBT) to show these effects, our results would be generally applicable to a wider range of other imaging modalities and applications. Methods: We examine factors in texture estimation methods, such as quantization, pixel distance offset, and region of interest (ROI) size, that influence the magnitudes of these readily computable and widely used image texture features (specifically Haralick’s gray level co-occurrence matrix (GLCM) textural features). Results: Our results indicate that quantization is the most influential of these parameters, as it controls the size of the GLCM and range of values. We propose a new multi-resolution normalization (by either fixing ROI size or pixel offset) that can significantly reduce quantization magnitude disparities. We show reduction in mean differences in feature values by orders of magnitude; for example, reducing it to 7.34% between quantizations of 8–128, while preserving trends. Conclusions: When combining images from multiple vendors in a common analysis, large variations in texture magnitudes can arise due to differences in post-processing methods like filters. We show that significant changes in GLCM magnitude variations may arise simply due to the filter type or strength. These trends can also vary based on estimation variables (like offset distance or ROI) that can further complicate analysis and robustness. We show pathways to reduce sensitivity to such variations due to estimation methods while increasing the desired sensitivity to patient-specific information such as breast density. 
Finally, we show that our results obtained from simulated DBT images are consistent with what we see when applied to clinical DBT images. Full article
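The quantization dependence this abstract highlights is easy to reproduce with a bare-bones GLCM. This plain-NumPy sketch (uint8 input assumed) is illustrative only, not the authors' pipeline:

```python
import numpy as np

def glcm(image, levels=8, offset=(0, 1)):
    # Quantize to `levels` gray levels, then count pixel-pair
    # co-occurrences at the given (dy, dx) offset; normalized.
    q = (image.astype(float) / 256.0 * levels).astype(int)
    dy, dx = offset
    h, w = q.shape
    m = np.zeros((levels, levels))
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            m[q[y, x], q[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    # Haralick contrast: sum over (i - j)^2 * p(i, j)
    i, j = np.indices(p.shape)
    return float((p * (i - j) ** 2).sum())
```

On a black-and-white checkerboard this yields a contrast of 49 at 8 levels but 1 at 2 levels: the same image, with feature magnitudes differing purely through quantization, which is the disparity the proposed normalization targets.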

28 pages, 2174 KiB  
Article
Validating Lava Tube Stability Through Finite Element Analysis of Real-Scene 3D Models
by Jiawang Wang, Zhizhong Kang, Chenming Ye, Haiting Yang and Xiaoman Qi
Electronics 2025, 14(15), 3062; https://doi.org/10.3390/electronics14153062 - 31 Jul 2025
Abstract
The structural stability of lava tubes is a critical factor for their potential use in lunar base construction. Previous studies could not reflect the details of lava tube boundaries and perform accurate mechanical analysis. To this end, this study proposes a robust method to construct a high-precision, real-scene 3D model based on ground lava tube point cloud data. By employing finite element analysis, this study investigated the impact of real-world cross-sectional geometry, particularly the aspect ratio, on structural stability under surface pressure simulating meteorite impacts. A high-precision 3D reconstruction was achieved using UAV-mounted LiDAR and SLAM-based positioning systems, enabling accurate geometric capture of lava tube profiles. The original point cloud data were processed to extract cross-sections, which were then classified by their aspect ratios for analysis. Experimental results confirmed that the aspect ratio is a significant factor in determining stability. Crucially, unlike the monotonic trends often suggested by idealized models, analysis of real-world geometries revealed that the greatest deformation and structural vulnerability occur in sections with an aspect ratio between 0.5 and 0.6. For small lava tubes buried 3 m deep, the ground pressure they can withstand does not exceed 6 GPa. This process helps identify areas with weaker load-bearing capacity. The analysis demonstrated that a realistic 3D modeling approach provides a more accurate and reliable assessment of lava tube stability. This framework is vital for future evaluations of lunar lava tubes as safe habitats and highlights that complex, real-world geometry can lead to non-intuitive structural weaknesses not predicted by simplified models. Full article

42 pages, 3045 KiB  
Review
HBIM and Information Management for Knowledge and Conservation of Architectural Heritage: A Review
by Maria Parente, Nazarena Bruno and Federica Ottoni
Heritage 2025, 8(8), 306; https://doi.org/10.3390/heritage8080306 - 30 Jul 2025
Abstract
This paper presents a comprehensive review of research on Historic Building Information Modeling (HBIM), focusing on its role as a tool for managing knowledge and supporting conservation practices of Architectural Heritage. While previous review articles and most research works have predominantly addressed geometric modeling—given its significant challenges in the context of historic buildings—this study places greater emphasis on the integration of non-geometric data within the BIM environment. A systematic search was conducted in the Scopus database to extract the 451 relevant publications analyzed in this review, covering the period from 2008 to mid-2024. A bibliometric analysis was first performed to identify trends in publication types, geographic distribution, research focuses, and software usage. The main body of the review then explores three core themes in the development of the information system: the definition of model entities, both semantic and geometric; the data enrichment phase, incorporating historical, diagnostic, monitoring and conservation-related information; and finally, data use and sharing, including on-site applications and interoperability. For each topic, the review highlights and discusses the principal approaches documented in the literature, critically evaluating the advantages and limitations of different information management methods with respect to the distinctive features of the building under analysis and the specific objectives of the information model. Full article

18 pages, 10854 KiB  
Article
A Novel Method for Predicting Landslide-Induced Displacement of Building Monitoring Points Based on Time Convolution and Gaussian Process
by Jianhu Wang, Xianglin Zeng, Yingbo Shi, Jiayi Liu, Liangfu Xie, Yan Xu and Jie Liu
Electronics 2025, 14(15), 3037; https://doi.org/10.3390/electronics14153037 - 30 Jul 2025
Abstract
Accurate prediction of landslide-induced displacement is essential for the structural integrity and operational safety of buildings and infrastructure situated in geologically unstable regions. This study introduces a novel hybrid predictive framework that synergistically integrates Gaussian Process Regression (GPR) with Temporal Convolutional Neural Networks (TCNs), herein referred to as the GTCN model, to forecast displacement at building monitoring points subject to landslide activity. The proposed methodology is validated using time-series monitoring data collected from the slope adjacent to the Zhongliang Reservoir in Wuxi County, Chongqing, an area where slope instability poses a significant threat to nearby structural assets. Experimental results demonstrate the GTCN model’s superior predictive performance, particularly under challenging conditions of incomplete or sparsely sampled data. The model proves highly effective in accurately characterizing both abrupt fluctuations within the displacement time series and capturing long-term deformation trends. Furthermore, the GTCN framework outperforms comparative hybrid models based on Gated Recurrent Units (GRUs) and GPR, with its advantage being especially pronounced in data-limited scenarios. It also exhibits enhanced capability for temporal feature extraction relative to conventional imputation-based forecasting strategies like forward-filling. By effectively modeling both nonlinear trends and uncertainty within displacement sequences, the GTCN framework offers a robust and scalable solution for landslide-related risk assessment and early warning applications. Its applicability to building safety monitoring underscores its potential contribution to geotechnical hazard mitigation and resilient infrastructure management. Full article
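The GPR half of the hybrid described above can be sketched in a few lines of NumPy. The RBF kernel and all hyperparameters here are illustrative assumptions, not values from the paper:

```python
import numpy as np

def rbf(a, b, length=1.0):
    # Squared-exponential kernel between two 1-D coordinate arrays
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-4, length=1.0):
    # Posterior mean and pointwise variance of GP regression; the
    # variance is what makes GPR informative on sparse monitoring data.
    K = rbf(x_train, x_train, length) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train, length)
    mean = Ks @ np.linalg.solve(K, y_train)
    cov = rbf(x_test, x_test, length) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)
```

Far from the training points the predictive variance reverts to the prior, flagging exactly the data-sparse stretches of a displacement record where the abstract reports the hybrid model's advantage.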

24 pages, 8636 KiB  
Article
Oil Film Segmentation Method Using Marine Radar Based on Feature Fusion and Artificial Bee Colony Algorithm
by Jin Xu, Bo Xu, Xiaoguang Mou, Boxi Yao, Zekun Guo, Xiang Wang, Yuanyuan Huang, Sihan Qian, Min Cheng, Peng Liu and Jianning Wu
J. Mar. Sci. Eng. 2025, 13(8), 1453; https://doi.org/10.3390/jmse13081453 - 29 Jul 2025
Abstract
In the wake of the continuous development of the international strategic petroleum reserve system, the tonnage and quantity of oil tankers have been increasing. This trend has driven the expansion of offshore oil exploration and transportation, resulting in frequent incidents of ship oil spills. Catastrophic impacts have been exerted on the marine environment by these accidents, posing a serious threat to economic development and ecological security. Therefore, there is an urgent need for efficient and reliable methods to detect oil spills in a timely manner and minimize potential losses as much as possible. In response to this challenge, a marine radar oil film segmentation method based on feature fusion and the artificial bee colony (ABC) algorithm is proposed in this study. Initially, the raw experimental data are preprocessed to obtain denoised radar images. Subsequently, grayscale adjustment and local contrast enhancement operations are carried out on the denoised images. Next, the gray level co-occurrence matrix (GLCM) features and Tamura features are extracted from the locally contrast-enhanced images. Then, the generalized least squares (GLS) method is employed to fuse the extracted texture features, yielding a new feature fusion map. Afterwards, the optimal processing threshold is determined to obtain effective wave regions by using the bimodal graph direct method. Finally, the ABC algorithm is utilized to segment the oil films. This method can provide data support for oil spill detection in marine radar images. Full article
(This article belongs to the Section Ocean Engineering)

14 pages, 1487 KiB  
Article
On the Interplay Between Roughness and Elastic Modulus at the Nanoscale: A Methodology Study with Bone as Model Material
by Alessandro Gambardella, Gregorio Marchiori, Melania Maglio, Marco Boi, Matteo Montesissa, Jessika Bertacchini, Stefano Biressi, Nicola Baldini, Gianluca Giavaresi and Marco Bontempi
J. Funct. Biomater. 2025, 16(8), 276; https://doi.org/10.3390/jfb16080276 - 29 Jul 2025
Abstract
Atomic force microscopy (AFM)-based nanoindentation enables investigation of the mechanical response of biological materials at a subcellular scale. However, quantitative estimates of mechanical parameters such as the elastic modulus (E) remain unreliable because the influence of sample roughness on E measurements at the nanoscale is still poorly understood. This study re-examines the interpretation of roughness from a more rigorous perspective and validates an experimental methodology to extract roughness at each nanoindentation site—i.e., the local roughness γs—with which the corresponding E value can be accurately correlated. Cortical regions of a murine tibia cross-section, characterized by complex nanoscale morphology, were selected as a testbed. Eighty non-overlapping nanoindentations were performed using two different AFM tips, maintaining a maximum penetration depth of 10 nm for each measurement. Our results show a slight decreasing trend of E versus γs (Spearman’s rank correlation coefficient ρ = −0.27187). A total of 90% of the E values are reliable when γs < 10 nm (coefficient of determination R2 > 0.90), although low γs values are associated with significant dispersion around E (γs = 0) = E0 = 1.18 GPa, with variations exceeding 50%. These findings are consistent with a qualitative tip-to-sample contact model that accounts for the pronounced roughness heterogeneity typical of bone topography at the nanoscale. Full article
(This article belongs to the Section Biomaterials and Devices for Healthcare Applications)
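The reported correlation (Spearman's ρ = −0.27) is simply the Pearson correlation of the ranked values; a tie-free sketch follows (the tie-corrected estimator used by statistics packages differs slightly):

```python
import numpy as np

def spearman_rho(x, y):
    # Rank both series (no tie handling), center the ranks,
    # then compute their Pearson correlation
    rank = lambda a: np.argsort(np.argsort(a)).astype(float)
    rx = rank(x) - (len(x) - 1) / 2.0
    ry = rank(y) - (len(y) - 1) / 2.0
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))
```

Because it works on ranks, the statistic captures the "slight decreasing trend of E versus γs" without assuming the relationship is linear.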

17 pages, 2524 KiB  
Article
A Model-Driven Approach to Assessing the Fouling Mechanism in the Crossflow Filtration of Laccase Extract from Pleurotus ostreatus 202
by María Augusta Páez, Mary Casa-Villegas, Vanesa Naranjo-Moreno, Neyda Espín Félix, Katty Cabezas-Terán and Alfonsina Andreatta
Membranes 2025, 15(8), 226; https://doi.org/10.3390/membranes15080226 - 29 Jul 2025
Abstract
Membrane technology is primarily used for the separation and purification of biotechnological products, which contain proteins and enzymes. Membrane fouling during crossflow filtration remains a significant challenge. This study aims to initially validate crossflow filtration models, particularly related to pore-blocking mechanisms, through a comparative analysis with dead-end filtration models. One crossflow microfiltration (MF) and six consecutive ultrafiltration (UF) stages were implemented to concentrate laccase extracts from Pleurotus ostreatus 202 fungi. The complete pore-blocking mechanism significantly impacts the MF, UF 1000, UF 100 and UF 10 stages, with the highest related filtration constant (KbF) estimated at 12.60 × 10−4 (m−1). Although the intermediate pore-blocking mechanism appears across all filtration stages, UF 100 is the most affected, with an associated filtration constant (KiF) of 16.70 (m−1). This trend is supported by the highest purification factor (6.95) and the presence of 65, 62 and 56 kDa laccases in the retentate. Standard pore blocking occurs at the end of filtration, only in the MF and UF 1000 stages, with filtration constants (KsF) of 29.83 (s−0.5m−0.5) and 31.17 (s−0.5m−0.5), respectively. The absence of cake formation and the volume of permeate recovered indicate that neither membrane was exposed to exhaustive fouling that could not be reversed by backwashing. Full article
(This article belongs to the Section Membrane Applications for Other Areas)
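The pore-blocking regimes named above (complete, standard, intermediate, cake) are conventionally distinguished through Hermia's unified dead-end filtration law, stated here for orientation; the paper's crossflow analysis adapts these dead-end forms rather than using them directly:

```latex
\frac{d^2 t}{dV^2} = k\left(\frac{dt}{dV}\right)^{n},
\qquad
n = \begin{cases}
2   & \text{complete pore blocking}\\
3/2 & \text{standard pore blocking}\\
1   & \text{intermediate pore blocking}\\
0   & \text{cake filtration}
\end{cases}
```

Here t is filtration time, V the cumulative permeate volume, and k the blocking constant; fitting the exponent n to flux data is what identifies which mechanism dominates each stage.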

24 pages, 3694 KiB  
Article
Enhancing the Distinguishability of Minor Fluctuations in Time Series Classification Using Graph Representation: The MFSI-TSC Framework
by He Nai, Chunlei Zhang and Xianjun Hu
Sensors 2025, 25(15), 4672; https://doi.org/10.3390/s25154672 - 29 Jul 2025
Abstract
In industrial systems, sensors often classify collected time series data for incipient fault diagnosis. However, time series data from sensors during the initial stages of a fault often exhibits minor fluctuation characteristics. Existing time series classification (TSC) methods struggle to achieve high classification accuracy when these minor fluctuations serve as the primary distinguishing feature. This limitation arises because the low-amplitude variations of these fluctuations, compared with trends, lead the classifier to prioritize and learn trend features while ignoring the minor fluctuations crucial for accurate classification. To address this challenge, this paper proposes a novel graph-based time series classification framework, termed MFSI-TSC. MFSI-TSC first extracts the trend component of the raw time series. Subsequently, both the trend series and the raw series are represented as graphs by extracting the “visible relationship” of the series. By performing a subtraction operation between these graphs, the framework isolates the differential information arising from the minor fluctuations. The subtracted graph effectively captures minor fluctuations by highlighting topological variations, thereby making them more distinguishable. Furthermore, the framework incorporates optimizations to reduce computational complexity, facilitating its deployment in resource-constrained sensor systems. Finally, empirical evaluation of MFSI-TSC on both real-world and publicly available datasets demonstrates its effectiveness. Compared with ten benchmark methods, MFSI-TSC exhibits both high accuracy and computational efficiency, making it more suitable for deployment in sensor systems to complete incipient fault detection tasks. Full article
(This article belongs to the Section Fault Diagnosis & Sensors)
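The "visible relationship" used for the graph representation is presumably the natural visibility criterion; a brute-force sketch of that construction and of the subtraction step is given below. Both function names and the O(n³) loop structure are illustrative assumptions, not the paper's optimized implementation:

```python
import numpy as np

def visibility_edges(series):
    # Natural visibility graph: indices i < j are linked when every
    # intermediate sample lies strictly below the chord from i to j.
    y = np.asarray(series, dtype=float)
    edges = set()
    for i in range(len(y) - 1):
        for j in range(i + 1, len(y)):
            k = np.arange(i + 1, j)
            chord = y[j] + (y[i] - y[j]) * (j - k) / (j - i)
            if np.all(y[k] < chord):
                edges.add((i, j))
    return edges

def differential_edges(raw, trend):
    # Edges present in one graph but not the other isolate the
    # topology contributed by minor fluctuations around the trend.
    return visibility_edges(raw) ^ visibility_edges(trend)
```

A small bump that barely registers in amplitude can still create or destroy visibility edges, which is why the subtracted graph makes minor fluctuations separable from the dominant trend.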

40 pages, 6652 KiB  
Systematic Review
How Architectural Heritage Is Moving to Smart: A Systematic Review of HBIM
by Huachun Cui and Jiawei Wu
Buildings 2025, 15(15), 2664; https://doi.org/10.3390/buildings15152664 - 28 Jul 2025
Abstract
Heritage Building Information Modeling (HBIM) has emerged as a key tool in advancing heritage conservation and sustainable management. Preceding reviews had typically concentrated on specific technical aspects but did not provide sufficient bibliometric analysis. This study aims to integrate existing HBIM research to identify key research patterns, emerging trends, and forecast future directions. A total of 1516 documents were initially retrieved from the Web of Science Core Collection using targeted search terms. Following a relevance screening, 1175 documents were related to the topic. CiteSpace 6.4.R1, VOSviewer 1.6.20, and Bibliometrix 4.1, three bibliometric tools, were employed to conduct both quantitative and qualitative assessments. The results show three historical phases of HBIM, identify core journals, influential authors, and leading regions, and extract six major keyword clusters: risk assessment, data acquisition, semantic annotation, digital twins, and energy and equipment management. Nine co-citation clusters further outline the foundational literature in the field. The results highlight growing scholarly interest in workflow integration and digital twin applications. Future projections emphasize the transformative potential of artificial intelligence in HBIM, while also recognizing critical implementation barriers, particularly in developing countries and resource-constrained contexts. This study provides a comprehensive and systematic framework for HBIM research, offering valuable insights for scholars, practitioners, and policymakers involved in heritage preservation and digital management. Full article

25 pages, 3545 KiB  
Article
Combined Effects of PFAS, Social, and Behavioral Factors on Liver Health
by Akua Marfo and Emmanuel Obeng-Gyasi
Med. Sci. 2025, 13(3), 99; https://doi.org/10.3390/medsci13030099 - 28 Jul 2025
Abstract
Background: Environmental exposures such as per- and polyfluoroalkyl substances (PFAS), in conjunction with social and behavioral factors, can significantly impact liver health. This research investigates the combined effects of PFAS (perfluorooctanoic acid (PFOA) and perfluorooctane sulfonate (PFOS)), alcohol consumption, smoking, income, and education on liver function among the U.S. population, utilizing data from the 2017–2018 National Health and Nutrition Examination Survey (NHANES). Methods: PFAS concentrations in blood samples were analyzed using online solid-phase extraction combined with liquid chromatography–tandem mass spectrometry (LC-MS/MS), a highly sensitive and specific method for detecting PFAS levels. Liver function was evaluated using biomarkers including alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), gamma-glutamyltransferase (GGT), total bilirubin, and the fatty liver index (FLI). Descriptive statistics and multivariable linear regression analyses were employed to assess associations between exposures and liver outcomes. Bayesian Kernel Machine Regression (BKMR) was used to explore nonlinear and interactive effects of these exposures, and Posterior Inclusion Probabilities (PIPs) were calculated to determine the relative influence of each factor on liver health. Results: Linear regression analyses indicated that income and education were inversely associated with several liver injury biomarkers, while alcohol use and smoking demonstrated stronger and more consistent associations. BKMR further highlighted alcohol and smoking as the most influential predictors, particularly for GGT and total bilirubin, with PIPs close to 1.0. In contrast, PFAS showed weaker associations: regression coefficients were small and largely non-significant, and PIPs were comparatively lower across most liver outcomes. Notably, education had a higher PIP for ALT and GGT than PFAS, suggesting a more protective role in liver health; people with higher education levels tend to live healthier lifestyles, have better access to healthcare, and are generally more aware of health risks, all of which can help reduce the risk of liver problems. Overall mixture effects demonstrated nonlinear trends, including U-shaped relationships for ALT and GGT and inverse associations for AST, FLI, and ALP. Conclusion: These findings underscore the importance of considering both environmental and social–behavioral determinants of liver health. While PFAS exposures remain a long-term concern, modifiable lifestyle and structural factors, particularly alcohol, smoking, income, and education, exert more immediate and pronounced effects on hepatic biomarkers in the general population.
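The multivariable linear regression step described in the abstract can be sketched on synthetic data. Note the assumptions: the study used NHANES measurements and BKMR (available as the R package `bkmr`), whereas the variable names, distributions, and effect sizes below are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-ins for the exposures (not NHANES data).
pfoa    = rng.lognormal(0.5, 0.4, n)   # serum PFOA, illustrative units
alcohol = rng.poisson(3, n)            # drinks per week, illustrative
smoking = rng.integers(0, 2, n)        # current-smoker indicator
educ    = rng.integers(1, 5, n)        # education category

# Simulate a GGT-like outcome driven mostly by behavior, weakly by PFAS,
# mirroring the direction of the reported associations.
ggt = (20 + 0.5 * pfoa + 2.0 * alcohol + 5.0 * smoking
       - 1.0 * educ + rng.normal(0, 3, n))

# Multivariable OLS: each coefficient is the association of one exposure
# with GGT, adjusted for the others.
X = np.column_stack([np.ones(n), pfoa, alcohol, smoking, educ])
beta, *_ = np.linalg.lstsq(X, ggt, rcond=None)
```

In the actual study, BKMR's variable-selection machinery, not a single least-squares fit, yields the posterior inclusion probabilities; this sketch only mirrors the adjusted linear-regression comparison of exposures.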

17 pages, 1307 KiB  
Review
Starch Valorisation as Biorefinery Concept Integrated by an Agro-Industry Case Study to Improve Sustainability
by Maider Gomez Palmero, Ana Carrasco, Paula de la Sen, María Dolores Mainar-Toledo, Sonia Ascaso Malo and Francisco Javier Royo Herrer
Sustainability 2025, 17(15), 6808; https://doi.org/10.3390/su17156808 - 27 Jul 2025
Abstract
The production of bio-based products for different purposes has become an increasingly common strategy over the last few decades, both in Europe and worldwide. This trend seeks to mitigate the impacts associated with climate change and to meet the ambitious objectives established at the European level. Over recent decades, agro-industries have shown significant potential as biomass suppliers, triggering the development of robust logistical supply chains and the valorization of by-products to obtain bio-based products that can be marketed at competitive prices. However, this transformation may, in some cases, involve restructuring traditional business models to incorporate the biorefinery concept. The first step in developing a bio-based value chain is therefore assessing resource availability and characterizing the feedstock in order to select the valorization pathway and the bio-application with the greatest potential. The paper incorporates inputs from a case study on PATURPAT, a company commercializing a wide range of ready-prepared potato products, which has commissioned a starch extraction facility to process rejected potato pieces and process water to obtain starch that can be further valorized for different bio-applications. This study aims to comprehensively review current trends and frameworks for potato-processing agro-industries, define the most suitable bio-applications to target, and identify opportunities and challenges.
