Search Results (236)

Search Parameters:
Keywords = multiscale procedures

26 pages, 10910 KB  
Article
A Framework for Cultural Heritage Documentation, Safeguarding and Preservation Planning in Urban Environments—The Case of the Morosini Fountain
by Dimitrios Makris, Christina Sakellariou, Leonidas Karampinis, Maria Deli, Alexios-Nikolaos Stefanis, Georgios Bardis and Maria Mertzani
Heritage 2026, 9(3), 97; https://doi.org/10.3390/heritage9030097 - 28 Feb 2026
Abstract
This research establishes a high-fidelity documentation framework utilizing multi-sensor 3D data to support critical decisions regarding the conservation and preservation of monuments in urban environments. Focus is placed on the Morosini Fountain, Heraklion, Crete, a 17th-century monument facing significant deterioration due to environmental stressors, material-specific decay of limestone and marble, and cumulative historical interventions. Placed within the context of contemporary cultural heritage management, the research establishes a high-fidelity 3D digital representation to support interdisciplinary documentation and a decision-support framework for restoration. The methodology combines handheld structured-light scanning, for high geometric accuracy, with close-range digital photogrammetry to ensure high-fidelity color acquisition. Strategic semantic segmentation of the monument into architectural components—such as lobes, lions, and basins—facilitated large-scale dataset management and optimized alignment procedures under challenging urban conditions, including intense direct sunlight and active water flow. Results include the delivery of metrically accurate multi-resolution models and 2D orthographic products. Quantitative pathology mapping identified extensive affected surface areas on specific panels, while multi-scale geometric morphological analysis identified high-complexity surface areas, which expert visual assessment subsequently classified as either intentional artistic form or active decay (alveolar erosion or exogenous accretions). The study concludes that this enhanced digital model serves as an indispensable tool for sustainable management, transforming passive records into active predictive simulations.
The implementation of multi-sensor 3D data provides the essential evidentiary basis for high-stakes conservation decisions, demonstrating that comprehensive digital recording is vital for the resilience of urban heritage landmarks. Full article
(This article belongs to the Special Issue Applications of Digital Technologies in the Heritage Preservation)
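The multi-scale geometric morphological analysis described above can be illustrated with a toy sketch (the window sizes and the synthetic height profile below are invented for illustration, not taken from the paper): at each scale, roughness is the spread of what a moving average of that width removes, so fine carved detail stands out only at coarser scales.

```python
import numpy as np

def multiscale_roughness(z, scales=(3, 9, 27)):
    """Roughness of a height profile z at several window sizes:
    std of the residual after moving-average smoothing."""
    out = {}
    for w in scales:
        kernel = np.ones(w) / w
        smooth = np.convolve(z, kernel, mode="same")
        out[w] = float(np.std(z - smooth))
    return out

# Toy profile: smooth base shape plus fine, carving-like detail.
x = np.linspace(0, 4 * np.pi, 400)
z = np.sin(x) + 0.05 * np.sin(40 * x)
r = multiscale_roughness(z)
# Fine detail survives small windows but is removed by large ones,
# so the residual grows with scale.
```

Thresholding such per-scale roughness maps is one simple way to flag "high-complexity" regions for subsequent expert assessment.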

23 pages, 4959 KB  
Article
LMD-YOLO: An Efficient Silkworm Cocoon Defect Detection Model via Large Separable Kernel Attention and Dynamic Upsampling
by Jiajun Zhu, Depeng Gao, Xiangxiang Mei, Yipeng Geng, Shuxi Chen, Jianlin Qiu and Yuanzhi Zhang
Agriculture 2026, 16(5), 515; https://doi.org/10.3390/agriculture16050515 - 26 Feb 2026
Abstract
Sorting defective cocoons is a critical procedure in the silk reeling industry to ensure the quality of raw silk products. Currently, this process relies heavily on manual inspection, which is labor-intensive, subjective, and inefficient. While automated sorting based on machine vision offers a promising alternative, existing object detection algorithms struggle to balance accuracy and computational complexity, particularly when detecting tiny surface defects or distinguishing morphologically similar cocoons in dense scenarios. To address these challenges, this paper proposes an efficient silkworm cocoon defect detection model named LMD-YOLO, based on the YOLOv10 architecture. In this model, we introduce three key improvements to enhance feature extraction and multi-scale perception. First, we integrate a Large Separable Kernel Attention (LSKA) module into the C2f structure (C2f-LSKA) of the backbone. This design decomposes large kernels to capture global shape features with minimal computational cost, effectively distinguishing double cocoons from normal ones. Second, we replace standard upsampling with a DySample module in the neck, which utilizes dynamic point sampling to recover fine-grained texture details of tiny defects like surface stains. Third, a Multi-Scale Dilated Attention (MSDA) mechanism is embedded before the detection heads to aggregate semantic information across different scales, improving robustness against background interference. YOLOv10 was selected as the baseline due to its NMS-free characteristic, which mitigates the latency caused by post-processing in high-speed sorting tasks. Evaluations on a self-constructed multi-category dataset indicate that LMD-YOLO surpasses established detectors, including YOLOv8n and Faster R-CNN. Relative to the YOLOv10n baseline, our method improves mAP@0.5 by 3.11%, achieving 94.46%. Notably, Precision and Recall are increased by 3.50% and 2.97%, reaching 89.98% and 93.61%, respectively. 
With a compact size of 2.68 M parameters and an inference speed of 115 FPS, the proposed model offers a practical trade-off between accuracy and latency for real-time cocoon defect detection. Full article
(This article belongs to the Section Artificial Intelligence and Digital Agriculture)
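The core idea behind LSKA's large-kernel decomposition, applying a separable kernel as two 1D passes instead of one dense 2D pass, can be checked numerically. This sketch uses a plain rank-1 kernel and a naive convolution, not the paper's actual attention module:

```python
import numpy as np

def conv2d_valid(img, k):
    """Naive 'valid' 2D cross-correlation."""
    kh, kw = k.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
img = rng.normal(size=(16, 16))
v = rng.normal(size=7)   # 7x1 vertical factor
h = rng.normal(size=7)   # 1x7 horizontal factor
full = np.outer(v, h)    # the equivalent dense rank-1 7x7 kernel

dense = conv2d_valid(img, full)
separable = conv2d_valid(conv2d_valid(img, v[:, None]), h[None, :])
# Two 1D passes cost 2*7 = 14 multiplies per output pixel
# versus 49 for the dense 7x7 kernel, with identical output.
```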

19 pages, 2815 KB  
Article
Federated Intrusion Detection via Unidirectional Serialization and Multi-Scale 1D Convolutions with Attention Reweighting
by Wenqing Li, Di Gao and Tianrong Zhang
Future Internet 2026, 18(3), 117; https://doi.org/10.3390/fi18030117 - 26 Feb 2026
Abstract
Deployed in distributed organizations and edge networks, contemporary intrusion detection increasingly requires high-performing models without centralizing sensitive traffic logs. This study presents a lightweight federated intrusion detection framework that integrates (i) unidirectional serialization to convert tabular flow records into short sequences, (ii) multi-scale one-dimensional convolutions to capture heterogeneous temporal–statistical patterns at different receptive fields, and (iii) an attention-based reweighting module that emphasizes informative feature channels prior to classification. A sample-size-weighted FedAvg aggregation protocol is used to train a global detector without transferring raw data. Experiments on three widely used benchmarks (UNSW-NB15, KDD Cup 99, and NSL-KDD) under multiple client configurations report consistently high detection effectiveness, with peak accuracies of 99.38% (UNSW-NB15), 99.86% (KDD Cup 99), and 99.02% (NSL-KDD), alongside strong precision, recall, and F1 scores. In addition, the proposed framework is quantitatively benchmarked on UNSW-NB15 against two recent federated intrusion detection baselines, FedMSP-SPEC and a multi-view federated CAE-NSVM model, demonstrating improvements of more than 10 percentage points in macro F1-score while retaining a compact architecture. The manuscript further specifies a concrete threat model, clarifies the client data partitioning strategy and Non-IID quantification, and provides a reproducibility protocol (hyperparameters, random seeds, and evaluation procedures) to facilitate independent verification. Full article
(This article belongs to the Section Cybersecurity)
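The sample-size-weighted FedAvg aggregation the framework uses is straightforward to sketch; the client parameters and sample counts below are toy values:

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Sample-size-weighted FedAvg: average each parameter tensor,
    weighting every client by its number of local training samples."""
    sizes = np.asarray(client_sizes, dtype=float)
    coef = sizes / sizes.sum()
    return [
        sum(c * w[i] for c, w in zip(coef, client_weights))
        for i in range(len(client_weights[0]))
    ]

# Two clients, one "layer" each; client B holds 3x the data,
# so its parameters dominate the aggregate.
w_a = [np.array([0.0, 0.0])]
w_b = [np.array([4.0, 8.0])]
global_w = fedavg([w_a, w_b], client_sizes=[1, 3])
```

Only the parameter tensors and sample counts cross the network; the raw flow records stay on each client, which is the privacy property the abstract emphasizes.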

14 pages, 1435 KB  
Article
Recurrence with Correlation Network for Medical Image Registration
by Vignesh Sivan, Teodora Vujovic, Raj Kumar Ranabhat, Alexander Wong, Stewart Mclachlin and Michael Hardisty
Appl. Sci. 2026, 16(4), 2084; https://doi.org/10.3390/app16042084 - 20 Feb 2026
Abstract
This work presents Recurrence with Correlation Network (RWCNet), a novel multi-scale recurrent neural network architecture for medical image registration that integrates core principles from optical flow, including correlation volume computation and inference-time instance optimization. In evaluations on the large-displacement National Lung Screening Test (NLST) dataset, RWCNet exhibited superior performance (total registration error (TRE) of 2.11 mm) compared to other deep learning alternatives, and achieved results on par with variational optimization techniques. In contrast, on the OASIS dataset, which is characterized by smaller displacements, RWCNet achieved an average Dice similarity of 81.7%, representing only a modest improvement over other multi-scale deep learning models. Ablation experiments showed that multi-scale features consistently improved performance, whereas the correlation volume, number of recurrent steps, and inference-time instance optimization had large impacts on performance within the large-displacement NLST dataset. The performance of RWCNet compared to approaches that use instance optimization shows that deep learning-based methods can find local minima that escape instance optimization methods. The results highlight the need for algorithm hyperparameter selection that adjusts with the dataset characteristics. RWCNet’s promising results may improve registration accuracy and computation efficiency, enabling many potential applications such as treatment planning, intra-procedural guidance, and longitudinal monitoring. Full article
(This article belongs to the Special Issue Advanced Biomedical Imaging Technologies and Their Applications)
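The Dice similarity used to score the OASIS registrations is a standard overlap metric; a minimal sketch with toy binary masks (not the paper's data):

```python
import numpy as np

def dice(a, b):
    """Dice similarity between two binary label masks:
    2|A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Two 4x4 squares offset by one pixel: 16 pixels each, 9 overlapping,
# so Dice = 2*9 / (16 + 16) = 0.5625.
a = np.zeros((8, 8), dtype=int); a[2:6, 2:6] = 1
b = np.zeros((8, 8), dtype=int); b[3:7, 3:7] = 1
```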

25 pages, 3654 KB  
Article
MDF2Former: Multi-Scale Dual-Domain Feature Fusion Transformer for Hyperspectral Image Classification of Bacteria in Murine Wounds
by Decheng Wu, Wendan Liu, Rui Li, Xudong Fu, Lin Tao, Yinli Tian, Anqiang Zhang, Zhen Wang and Hao Tang
J. Imaging 2026, 12(2), 90; https://doi.org/10.3390/jimaging12020090 - 19 Feb 2026
Abstract
Bacterial wound infection poses a major challenge in trauma care and can lead to severe complications such as sepsis and organ failure. Therefore, rapid and accurate identification of the pathogen, along with targeted intervention, is of vital importance for improving treatment outcomes and reducing risks. However, current detection methods are still constrained by procedural complexity and long processing times. In this study, a hyperspectral imaging (HSI) acquisition system for bacterial analysis and a multi-scale dual-domain feature fusion transformer (MDF2Former) were developed for classifying wound bacteria. MDF2Former integrates three modules: a multi-scale feature enhancement and fusion module that generates tokens with multi-scale discriminative representations, a spatial–spectral dual-branch attention module that strengthens joint feature modeling, and a frequency and spatial–spectral domain encoding module that captures global and local interactions among tokens through a hierarchical stacking structure, thereby enabling more efficient feature learning. Extensive experiments on our self-constructed HSI dataset of typical wound bacteria demonstrate that MDF2Former achieved outstanding performance across five metrics: Accuracy (91.94%), Precision (92.26%), Recall (91.94%), F1-score (92.01%), and Kappa coefficient (90.73%), surpassing all comparative models. These results verify the effectiveness of combining HSI with deep learning for bacterial identification and highlight its potential for assisting in identifying bacterial species and making personalized treatment decisions for wound infections. Full article
(This article belongs to the Section Color, Multi-spectral, and Hyperspectral Imaging)
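Of the five reported metrics, the Kappa coefficient is the least common; a minimal sketch of Cohen's kappa computed from a confusion matrix (the matrix below is invented for illustration):

```python
import numpy as np

def cohens_kappa(cm):
    """Cohen's kappa from a confusion matrix (rows: true, cols: predicted):
    agreement beyond what class frequencies alone would produce."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                          # observed agreement
    pe = (cm.sum(0) * cm.sum(1)).sum() / n ** 2    # chance agreement
    return (po - pe) / (1.0 - pe)

# Toy 2-class matrix: po = 0.85, pe = 0.5, so kappa = 0.7.
cm = [[45, 5],
      [10, 40]]
```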

45 pages, 2207 KB  
Article
Integrating the Contrasting Perspectives Between the Constrained Disorder Principle and Deterministic Optical Nanoscopy: Enhancing Information Extraction from Imaging of Complex Systems
by Yaron Ilan
Bioengineering 2026, 13(1), 103; https://doi.org/10.3390/bioengineering13010103 - 15 Jan 2026
Abstract
This paper examines the contrasting yet complementary approaches of the Constrained Disorder Principle (CDP) and Stefan Hell’s deterministic optical nanoscopy for managing noise in complex systems. The CDP suggests that controlled disorder within dynamic boundaries is crucial for optimal system function, particularly in biological contexts, where variability acts as an adaptive mechanism rather than being merely a measurement error. In contrast, Hell’s recent breakthrough in nanoscopy demonstrates that engineered diffraction minima can achieve sub-nanometer resolution without relying on stochastic (random) molecular switching, thereby replacing randomness with deterministic measurement precision. Philosophically, these two approaches are distinct: the CDP views noise as functionally necessary, while Hell’s method seeks to overcome noise limitations. However, both frameworks address complementary aspects of information extraction. The primary goal of microscopy is to provide information about structures, thereby facilitating a better understanding of their functionality. Noise is inherent to biological structures and functions and is part of the information in complex systems. This manuscript achieves integration through three specific contributions: (1) a mathematical framework combining CDP variability bounds with Hell’s precision measurements, validated through Monte Carlo simulations showing 15–30% precision improvements; (2) computational demonstrations with N = 10,000 trials quantifying performance under varying biological noise regimes; and (3) practical protocols for experimental implementation, including calibration procedures and real-time parameter optimization. The CDP provides a theoretical understanding of variability patterns at the system level, while Hell’s technique offers precision tools at the molecular level for validation. 
Integrating these approaches enables multi-scale analysis, allowing for deterministic measurements to accurately quantify the functional variability that the CDP theory predicts is vital for system health. This synthesis opens up new possibilities for adaptive imaging systems that maintain biologically meaningful noise while achieving unprecedented measurement precision. Specific applications include cancer diagnostics through chromosomal organization variability, neurodegenerative disease monitoring via protein aggregation disorder patterns, and drug screening by assessing cellular response heterogeneity. The framework comprises machine learning integration pathways for automated recognition of variability patterns and adaptive acquisition strategies. Full article
(This article belongs to the Section Biosignal Processing)
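The paper's Monte Carlo precision analysis can be illustrated, in a much simpler form, by the textbook 1/√N scaling of averaged noisy localizations; the noise level and trial counts below are arbitrary toy parameters, not the paper's N = 10,000-trial protocol:

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0       # single-measurement localization error (toy units)
trials = 10_000   # Monte Carlo repetitions

def precision(n_repeats):
    """Empirical std of the mean of n_repeats noisy localizations,
    estimated over many Monte Carlo trials."""
    samples = rng.normal(0.0, sigma, size=(trials, n_repeats))
    return samples.mean(axis=1).std()

p1, p16 = precision(1), precision(16)
# Averaging 16 measurements should shrink the error by about
# 1/sqrt(16) = 4x, which the simulation confirms empirically.
```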

19 pages, 3110 KB  
Article
Multi-Scale Decomposition and Autocorrelation Modeling for Classical and Machine Learning-Based Time Series Forecasting
by Khawla Al-Saeedi, Andrew Fish, Diwei Zhou, Katerina Tsakiri and Antonios Marsellos
Mathematics 2026, 14(2), 283; https://doi.org/10.3390/math14020283 - 13 Jan 2026
Abstract
Environmental time series, such as near-surface air temperature, exhibit strong multi-scale structure and persistent autocorrelation. Accurate forecasting therefore requires careful consideration of both temporal scale separation and serial dependence. In this study, we evaluate a unified framework that integrates Kolmogorov–Zurbenko (KZ) filtering with two classes of models: (i) classical regression with Cochrane–Orcutt autocorrelation correction, and (ii) an autocorrelation-adjusted Long Short-Term Memory (LSTM) network that learns an embedded correlation coefficient (ρ). All models are assessed using standardized meteorological predictors of T2M under walk-forward validation. The LSTM trained on raw predictors shows moderate performance (RMSE = 0.73, R² = 0.46, DW = 0.79), which improves after KZ filtering (RMSE = 0.59, R² = 0.63, DW = 1.84). Classical regression applied to KZ-decomposed predictors and corrected using the Cochrane–Orcutt procedure achieves substantially higher accuracy (RMSE = 0.41, R² = 0.89, DW ≈ 2.0), outperforming the LSTM in both predictive precision and residual behavior. Visual diagnostics further confirm tighter predicted–actual alignment and near-white residuals in the classical models, whereas the LSTM retains small systematic deviations even after filtering. Overall, the results demonstrate that addressing multi-scale structure and autocorrelation had a greater impact than increasing model complexity. Integrating spectral decomposition with autocorrelation correction thus produces more reliable, statistically valid forecasts, demonstrating that classical regression with KZ filtering can surpass LSTM models in both accuracy and interpretability. These findings emphasize the value of combining time series–aware pre-processing with both traditional and neural network approaches for environmental prediction. Full article
(This article belongs to the Section D1: Probability and Statistics)
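The Kolmogorov–Zurbenko filter at the core of this framework is simply an iterated moving average; a minimal sketch (the window length, iteration count, and synthetic series are illustrative only, not the paper's settings):

```python
import numpy as np

def kz_filter(x, m, k):
    """Kolmogorov-Zurbenko filter: k iterations of a centered
    moving average with (odd) window length m."""
    kernel = np.ones(m) / m
    for _ in range(k):
        x = np.convolve(x, kernel, mode="same")
    return x

# Toy daily series: an annual cycle buried in noise.
t = np.arange(1000)
rng = np.random.default_rng(1)
series = np.sin(2 * np.pi * t / 365) + 0.5 * rng.normal(size=t.size)
trend = kz_filter(series, m=29, k=3)
# Iterating the average sharpens the filter's frequency cutoff,
# so the slow cycle survives while day-to-day noise is suppressed.
```

Subtracting successive KZ outputs at different window lengths yields the scale-separated components that the regression and LSTM models then consume.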

29 pages, 5537 KB  
Article
A Multi-Scale Approach for the Piezoelectric Modal Analysis in Periodically Perforated Structures
by Mengyu Zhang, Shuyu Ye and Qiang Ma
Mathematics 2025, 13(24), 3967; https://doi.org/10.3390/math13243967 - 12 Dec 2025
Abstract
Piezoelectric composites have found a wide range of applications in smart structures and devices, and effective numerical methods should be developed to simulate not only the macroscopic coupled piezoelectric performance, but also the details of the local distributions of the stress and electric field. In this paper, we propose a multi-scale asymptotic algorithm based on the Second-Order Two-Scale (SOTS) analysis method for the piezoelectric eigenvalue problem in a perforated domain with periodic micro-configurations. The eigenfunctions and eigenvalues are expanded to second order about the homogenized eigensolutions, and the expressions of the first- and second-order correctors are derived successively. The first- and second-order correctors of the eigenvalues are determined from the integral forms of the correctors of the corresponding eigenfunctions. Explicit expressions of the homogenized material coefficients are derived for laminated structures, and finite element procedures are proposed to compute the homogenized solutions and the correctors numerically. The error estimates for the eigenvalue approximations are proved under some regularity assumptions, and a typical numerical experiment is carried out for a two-dimensional perforated domain. The computed results show that the SOTS analysis method is efficient in identifying the piezoelectric eigenvalues accurately and reproducing the original eigenfunctions effectively. This approach also provides an efficient computational tool for piezoelectric eigenvalue analysis and can be extended to other multi-physics problems with complex microstructures. Full article
(This article belongs to the Special Issue Multiscale Modeling in Engineering and Mechanics, 2nd Edition)
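The second-order two-scale ansatz underlying the SOTS method can be sketched in generic homogenization notation (the symbols below are the standard ones, assumed rather than taken verbatim from the paper):

```latex
% SOTS ansatz for the eigenfunction and eigenvalue in a perforated
% domain with cell period \varepsilon: a macroscopic (homogenized)
% term plus periodic correctors in the fast variable y = x/\varepsilon.
u^{\varepsilon}(x) = u^{0}(x)
  + \varepsilon\, u^{1}\!\left(x, \tfrac{x}{\varepsilon}\right)
  + \varepsilon^{2}\, u^{2}\!\left(x, \tfrac{x}{\varepsilon}\right)
  + \cdots,
\qquad
\lambda^{\varepsilon} = \lambda^{0}
  + \varepsilon\,\lambda^{1} + \varepsilon^{2}\,\lambda^{2} + \cdots
```

As the abstract states, λ¹ and λ² are then recovered from integral forms of the corresponding eigenfunction correctors u¹ and u².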

19 pages, 6720 KB  
Article
Deep Model Based on Mamba Fusion Multi-Scale Convolution LSTM for OSA Severity Grading
by Changyun Li, Shengwen He, Xi Xu and Zhibing Wang
Appl. Sci. 2025, 15(24), 12990; https://doi.org/10.3390/app152412990 - 10 Dec 2025
Abstract
Obstructive sleep apnea (OSA) affects nearly one billion individuals worldwide, and its rising prevalence exacerbates cardiovascular and metabolic burdens. Although overnight polysomnography (PSG) is the gold standard for diagnosis, its cost and procedural complexity constrain population-level deployment. To address this gap, an end-to-end model is developed that operates directly on full-night peripheral oxygen saturation (SpO2) sequences sampled at 1 Hz. The model integrates multi-scale convolution, state space modeling, and channel-wise attention to classify obstructive sleep apnea severity into four levels. Evaluated on the Sleep Heart Health Study cohorts, the approach achieved overall accuracies of 80.51% on SHHS1 and 76.61% on SHHS2, with mean specificity exceeding 92%. These results suggest that a single-channel SpO2 signal is sufficient for four-class OSA classification without extensive preprocessing or additional PSG channels. This approach may enable low-cost, large-scale home screening and provide a basis for future multimodal, real-time monitoring. Full article
(This article belongs to the Section Computing and Artificial Intelligence)
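Four-level OSA grading is conventionally defined by apnea–hypopnea index (AHI) cutoffs at 5, 15, and 30 events/hour; the sketch below uses those common clinical thresholds (the paper's exact label definition is not stated in the abstract and may differ):

```python
def osa_severity(ahi):
    """Four-level OSA grading by apnea-hypopnea index (events/hour),
    using the common clinical cutoffs 5/15/30. These thresholds are
    an assumption here, not confirmed from the paper."""
    if ahi < 5:
        return "normal"
    if ahi < 15:
        return "mild"
    if ahi < 30:
        return "moderate"
    return "severe"
```

The model in the paper predicts these class labels directly from the SpO2 sequence rather than first estimating an AHI.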

30 pages, 8375 KB  
Article
MC-H-Geo: A Multi-Scale Contextual Hierarchical Framework for Fine-Grained Lithology Classification
by Lang Liu, Yanlin Shao, Yaxiong Shao, Peijin Li, Qingqing Yang and Rui Zeng
Sensors 2025, 25(22), 6859; https://doi.org/10.3390/s25226859 - 10 Nov 2025
Cited by 1
Abstract
High-resolution lithological mapping of outcrops is fundamental for reservoir characterization and petroleum geology, yet distinguishing lithologies with subtle petrophysical contrasts remains a major challenge. This study proposes MC-H-Geo, a multi-scale contextual hierarchical framework for automated lithology classification from terrestrial laser scanning (TLS) point clouds. The framework integrates three modules: (i) a multi-scale contextual feature engine that extracts spectral, geometric, and textural descriptors across local and stratigraphic contexts, enhanced by cross-scale differentials to capture stratigraphic variability; (ii) a gated expert classifier with task-adaptive feature subsets for hierarchical vegetation–rock and intra-rock discrimination; and (iii) a two-step geological post-processing procedure that enforces stratigraphic continuity through Z-axis correction and neighborhood smoothing. Experiments on the Qianwangjiahe outcrop (Ordos Basin, China) demonstrate state-of-the-art performance (OA = 94.3%, Macro F1 = 0.944), outperforming PointNet++ (77.1%), SG-RFGeo (74.2%), and XGBoost (61.7%). Error analysis reveals that residual sandstone–vegetation confusion results from feature aliasing in weathered zones, highlighting the intrinsic limitations of TLS-only data. Overall, MC-H-Geo establishes an advanced framework for fine-grained lithological mapping and identifies multi-sensor data fusion as a promising pathway toward robust, geologically consistent outcrop interpretation. Full article
(This article belongs to the Special Issue Application of LiDAR Remote Sensing and Mapping)
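Geometric descriptors for TLS point clouds are typically derived from the eigenvalues of a neighborhood's covariance matrix; a sketch of the standard linearity/planarity/sphericity features (the synthetic patch below is illustrative, not the paper's feature set or data):

```python
import numpy as np

def geometric_features(neighborhood):
    """Linearity, planarity, and sphericity from the sorted eigenvalues
    of a point neighborhood's covariance matrix, a standard descriptor
    family for laser-scanning point clouds."""
    pts = neighborhood - neighborhood.mean(axis=0)
    l1, l2, l3 = sorted(np.linalg.eigvalsh(np.cov(pts.T)), reverse=True)
    return {
        "linearity": (l1 - l2) / l1,
        "planarity": (l2 - l3) / l1,
        "sphericity": l3 / l1,
    }

# A nearly flat patch (bedding-plane-like) scores high planarity.
rng = np.random.default_rng(0)
plane = np.c_[rng.uniform(size=(200, 2)), 0.001 * rng.normal(size=200)]
f = geometric_features(plane)
```

Computing such features at several neighborhood radii, as the multi-scale engine above does, lets the classifier see both local texture and stratigraphic context.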

29 pages, 4325 KB  
Article
A 1-Dimensional Physiological Signal Prediction Method Based on Composite Feature Preprocessing and Multi-Scale Modeling
by Peiquan Chen, Jie Li, Bo Peng, Zhaohui Liu and Liang Zhou
Sensors 2025, 25(21), 6726; https://doi.org/10.3390/s25216726 - 3 Nov 2025
Abstract
The real-time, precise monitoring of physiological signals such as intracranial pressure (ICP) and arterial blood pressure (BP) holds significant clinical importance. However, traditional methods like invasive ICP monitoring and invasive arterial blood pressure measurement present challenges including complex procedures, high infection risks, and difficulties in continuous measurement. Consequently, learning-based prediction utilizing observable signals (e.g., BP/pulse waves) has emerged as a crucial alternative approach. Existing models struggle to simultaneously capture multi-scale local features and long-range temporal dependencies, while their computational complexity remains prohibitively high for real-time clinical demands. To address this, this paper proposes a physiological signal prediction method combining composite feature preprocessing with multi-scale modeling. First, a seven-dimensional feature matrix is constructed based on physiological prior knowledge to enhance feature discriminative power and mitigate phase mismatch issues. Second, a CNN-LSTM-Attention network architecture (CBAnet), integrating multi-scale convolutions, long short-term memory (LSTM), and attention mechanisms, is designed to effectively capture both local waveform details and long-range temporal dependencies, thereby improving waveform prediction accuracy and temporal consistency. Experiments on GBIT-ABP, CHARIS, and our self-built PPG-HAF dataset show that CBAnet achieves competitive performance relative to bidirectional long short-term memory (BiLSTM), convolutional neural network-long short-term memory (CNN-LSTM), Transformer, and Wave-U-Net baselines across Root Mean Square Error (RMSE), Mean Absolute Error (MAE), and Coefficient of Determination (R²). This study provides a promising, efficient approach for non-invasive, continuous physiological parameter prediction. Full article
(This article belongs to the Section Biomedical Sensors)
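The RMSE, MAE, and R² metrics used for the comparison are standard; minimal NumPy definitions with toy values (not the paper's results):

```python
import numpy as np

def rmse(y, yhat):
    """Root mean square error."""
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mae(y, yhat):
    """Mean absolute error."""
    return float(np.mean(np.abs(y - yhat)))

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

y    = np.array([1.0, 2.0, 3.0, 4.0])
yhat = np.array([1.1, 1.9, 3.2, 3.8])
```

RMSE penalizes large waveform excursions quadratically while MAE weights all errors equally, which is why both are reported alongside the variance-normalized R².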

22 pages, 1633 KB  
Article
On the Autonomic Control of Heart Rate Variability: How the Mean Heart Rate Affects Spectral and Complexity Analysis and a Way to Mitigate Its Influence
by Paolo Castiglioni, Antonio Zaza, Giampiero Merati and Andrea Faini
Mathematics 2025, 13(18), 2955; https://doi.org/10.3390/math13182955 - 12 Sep 2025
Abstract
Heart Rate Variability (HRV) analysis allows for assessing autonomic control from the beat-by-beat dynamics of the time series of cardiac intervals. However, some HRV indices may strongly correlate with the mean heart rate, potentially confounding the interpretation of HRV changes in terms of autonomic control. Therefore, this study aims to (1) investigate how faithfully HRV indices of fluctuation amplitude and multiscale complex dynamics of cardiac time series describe the autonomic control at different heart rates, through a mathematical model of the generation of cardiac action potentials driven by realistically synthesized autonomic modulations; and (2) propose an alternative procedure of HRV analysis less sensitive to the mean heart rate. Results on the synthesized series confirm a strong dependency of amplitude indices of HRV on the mean heart rate due to a nonlinearity in the model, which can be removed by our procedure. Application of our procedure to real cardiac intervals recorded in different postures suggests that the dependency of these indices on the heart rate may substantially affect the physiological interpretation of HRV. By contrast, multiscale complexity indices do not substantially depend on the heart rate, provided that multiscale analyses are defined on a time- rather than a beat-basis. Full article
(This article belongs to the Special Issue Recent Advances in Time Series Analysis)
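Defining multiscale analysis on a time basis rather than a beat basis, as the conclusion recommends, amounts to resampling the RR-interval series onto a uniform time grid before analysis; a sketch (the 4 Hz grid and toy RR values below are illustrative, not the paper's protocol):

```python
import numpy as np

def rr_to_time_base(rr_s, fs=4.0):
    """Resample a beat-by-beat RR-interval series (in seconds) onto a
    uniform time grid by linear interpolation, so that one 'scale'
    always corresponds to a fixed duration rather than a beat count."""
    beat_times = np.cumsum(rr_s)          # occurrence time of each beat
    t = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
    return t, np.interp(t, beat_times, rr_s)

rr = np.array([0.8, 0.82, 0.79, 1.0, 0.98, 0.81])   # toy RR series (s)
t, rr_even = rr_to_time_base(rr)
```

On a beat basis, a given scale spans less real time at high heart rates than at low ones; the uniform grid removes that implicit heart-rate dependence.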

39 pages, 35445 KB  
Article
A GIS-Based Common Data Environment for Integrated Preventive Conservation of Built Heritage Systems
by Francisco M. Hidalgo-Sánchez, Ignacio Ruiz-Moreno, Jacinto Canivell, Cristina Soriano-Cuesta and Martin Kada
Buildings 2025, 15(16), 2962; https://doi.org/10.3390/buildings15162962 - 21 Aug 2025
Cited by 2
Abstract
Preventive conservation (PC) of built heritage has proved to be one of the most efficient and sustainable approaches to ensure its long-term preservation. Nevertheless, the management of all the areas involved in a PC project is complex, often resulting in poor interaction between them. This research proposes a GIS-based methodology for integrating data from different PC areas into a centralised digital model, establishing a Common Data Environment (CDE) to optimise PC strategies for heritage systems in complex contexts. Applying this method to the pavilions of the 1929 Ibero-American Exhibition in Seville (Spain), the study addresses five key PC areas: active follow-up, damage detection and assessment, risk analysis, maintenance, and dissemination and valorisation. The approach involved designing a robust relational database structure—using PostgreSQL—tailored for heritage management, defining several data standardisation criteria, and testing semi-automated procedures for generating multi-scale 2D and 3D GIS (LOD2 and LOD4) entities using remote sensing data sources. The proposed spatial database has been designed to function seamlessly with major GIS platforms (QGIS and ArcGIS Pro), demonstrating successful integration and interoperability for data management, analysis, and decision-making. Geographic web services derived from the database content were created and uploaded to a WebGIS platform. While limitations exist, this research demonstrates that simplified GIS models are sufficient for managing PC data across various working scales, offering a resource-efficient alternative compared to more demanding existing methods. Full article
(This article belongs to the Section Construction Management, and Computers & Digitization)
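The centralised relational structure described in the abstract can be sketched in miniature. This is an illustrative example only: the table and column names are hypothetical, and while the study uses PostgreSQL, the stdlib sqlite3 module keeps the sketch self-contained.

```python
import sqlite3

# Hypothetical simplified schema linking heritage elements (at LOD2 or LOD4)
# to damage records, one of the five PC areas the study integrates.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE heritage_element (
    element_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL,
    lod        INTEGER CHECK (lod IN (2, 4))   -- multi-scale: LOD2 or LOD4
);
CREATE TABLE damage_record (
    damage_id   INTEGER PRIMARY KEY,
    element_id  INTEGER REFERENCES heritage_element(element_id),
    damage_type TEXT,
    severity    INTEGER CHECK (severity BETWEEN 1 AND 5)
);
""")
cur.execute("INSERT INTO heritage_element VALUES (1, 'Pavilion facade', 2)")
cur.execute("INSERT INTO damage_record VALUES (1, 1, 'material loss', 3)")

# A typical PC query: worst recorded damage per element, for prioritisation.
row = cur.execute("""
    SELECT e.name, MAX(d.severity)
    FROM heritage_element AS e
    JOIN damage_record AS d USING (element_id)
    GROUP BY e.element_id
""").fetchone()
print(row)  # → ('Pavilion facade', 3)
```

Because the schema lives in one relational database, the same tables can back QGIS, ArcGIS Pro, and WebGIS services, which is the interoperability point the abstract makes.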
27 pages, 24146 KB  
Article
Large-Scale Flood Detection and Mapping in the Yangtze River Basin (2016–2021) Using Convolutional Neural Networks with Sentinel-1 SAR Images
by Xuan Wu, Zhijie Zhang, Wanchang Zhang, Bangsheng An, Zhenghao Li, Rui Li and Qunli Chen
Remote Sens. 2025, 17(16), 2909; https://doi.org/10.3390/rs17162909 - 21 Aug 2025
Viewed by 2257
Abstract
Synthetic Aperture Radar (SAR) technology offers unparalleled advantages by delivering high-quality images under all-weather conditions, enabling effective flood monitoring. This capability provides massive remote sensing data for flood mapping, while recent rapid advances in deep learning (DL) offer methodologies for large-scale flood mapping. However, the full potential of deep learning in large-scale flood monitoring utilizing remote sensing data remains largely untapped, necessitating further exploration of both data and methodologies. This paper presents an innovative approach that harnesses convolutional neural networks (CNNs) with Sentinel-1 SAR images for large-scale inundation detection and dynamic flood monitoring in the Yangtze River Basin (YRB). An efficient CNN model entitled FloodsNet was constructed based on multi-scale feature extraction and reuse. The study compiled 16 flood events comprising 32 Sentinel-1 images for CNN training, validation, inundation detection, and flood mapping. A semi-automatic inundation detection approach was developed to generate representative flood samples with labels, resulting in a total of 5296 labeled flood samples. The proposed model FloodsNet achieves a 1–2% higher F1-score than the other five DL models on this dataset. Experimental inundation detection in the YRB from 2016 to 2021 and dynamic flood monitoring in the Dongting and Poyang Lakes corroborated the scheme’s outstanding performance through various validation procedures. This study marks the first application of deep learning with SAR images for large-scale flood monitoring in the YRB, providing a valuable reference for future research in flood disaster studies.
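The abstract does not specify how the semi-automatic labeling works, but a common building block for separating low-backscatter water pixels from land in SAR scenes is histogram thresholding. The sketch below applies Otsu's method to synthetic backscatter values (in dB) rather than real Sentinel-1 data; all numbers are illustrative assumptions, not the paper's procedure.

```python
import random

def otsu_threshold(values, bins=64):
    """Return the threshold maximising between-class variance (Otsu)."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    # Sum of bin-centre values over the whole histogram.
    sum_all = sum((lo + (i + 0.5) * width) * h for i, h in enumerate(hist))
    best_t, best_var = lo, -1.0
    w_b, sum_b = 0, 0.0
    for i, h in enumerate(hist):
        w_b += h
        if w_b == 0 or w_b == total:
            continue
        sum_b += (lo + (i + 0.5) * width) * h
        m_b = sum_b / w_b                      # background (water) mean
        m_f = (sum_all - sum_b) / (total - w_b)  # foreground (land) mean
        var = w_b * (total - w_b) * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, lo + (i + 1) * width
    return best_t

# Synthetic backscatter: water near -18 dB, land near -8 dB (illustrative).
random.seed(0)
pixels = ([random.gauss(-18, 1.5) for _ in range(500)] +
          [random.gauss(-8, 1.5) for _ in range(500)])
t = otsu_threshold(pixels)
flood_mask = [v < t for v in pixels]  # True = candidate water/inundated pixel
print(f"threshold ≈ {t:.1f} dB")
```

Masks produced this way can then be manually curated, which is roughly what "semi-automatic" labeling pipelines do before CNN training.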
14 pages, 2309 KB  
Article
Multiscale and Failure Analysis of Periodic Lattice Structures
by Young Kwon and Matthew Minck
Appl. Sci. 2025, 15(12), 6701; https://doi.org/10.3390/app15126701 - 14 Jun 2025
Cited by 2 | Viewed by 861
Abstract
A full-cycle, multiscale analysis technique was developed for periodic lattice structures with geometric repetition, aiming for more efficient modeling to predict their failure loads. The full-cycle analysis includes both upscaling and downscaling procedures. The objective of the upscaling procedure is to obtain the effective material properties of the lattice structures such that the lattice structures can be analyzed as continuum models. The continuum models are analyzed to determine the structures’ displacements or buckling failure loads. Then, the downscaling process is applied to the continuum models to determine the stresses in actual lattice members, which were then applied to a stress- and stress-gradient-based failure criterion to predict failure. Example problems were presented to demonstrate the accuracy and reliability of the proposed multiscale analysis technique. The results from the multiscale analysis were compared to those of the discrete finite element analysis without any homogenization. Furthermore, physical experiments were also conducted to determine the failure loads. Then, multiscale analysis was undertaken in conjunction with the failure criterion, based on both stress and stress-gradient conditions, to compare the predicted failure loads to the experimental data.
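The upscaling/downscaling cycle described above can be sketched for the simplest possible case: a uniaxial bar. This is not the paper's actual homogenization procedure; a Gibson–Ashby-type scaling law stands in for the upscaling step, and all material and geometric values are illustrative assumptions.

```python
# --- Upscaling: effective continuum property of the lattice ---
# Gibson-Ashby-type law E_eff = C * E_solid * rho_rel^n (n = 1 assumed for a
# stretch-dominated lattice); constants here are illustrative.
E_solid = 2.0e9          # Pa, modulus of the solid parent material
rel_density = 0.2        # lattice relative density (volume fraction)
C, n = 1.0, 1.0
E_eff = C * E_solid * rel_density ** n

# --- Continuum model: uniaxial bar under axial load ---
L, A, P = 0.1, 1e-4, 500.0           # length (m), cross-section (m^2), load (N)
strain = P / (A * E_eff)             # continuum axial strain
displacement = strain * L            # continuum end displacement

# --- Downscaling: stress in an actual lattice member ---
# For struts aligned with the load, member strain ~ continuum strain, so the
# strut stress follows from the SOLID modulus: load concentrates in members.
sigma_member = E_solid * strain
sigma_continuum = E_eff * strain     # smeared continuum stress (= P / A)

print(f"E_eff = {E_eff:.3e} Pa")
print(f"strut stress / continuum stress = {sigma_member / sigma_continuum:.1f}")
```

The ratio of strut stress to smeared continuum stress is 1/rel_density here, which is why the downscaling step matters: the continuum model alone would underestimate the stress that the failure criterion must be checked against.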
