Search Results (6,481)

Search Parameters:
Keywords = temporal variability

16 pages, 2455 KB  
Article
Classification of Hemiplegic Gait and Mimicked Hemiplegic Gait: A Treadmill Gait Analysis Study in Stroke Patients and Healthy Individuals
by Young-ung Lee, Seungwon Kwon, Cheol-Hyun Kim, Jeong-Woo Seo and Sangkwan Lee
Bioengineering 2025, 12(10), 1074; https://doi.org/10.3390/bioengineering12101074 - 2 Oct 2025
Abstract
Differentiating genuine hemiplegic gait (HG) in stroke survivors from hemiplegic-like gait voluntarily imitated by healthy adults (MHG) is essential for reliable assessment and intervention planning. Treadmill-based gait data were obtained from 79 participants—39 stroke patients (HG) and 40 healthy adults—instructed to mimic HG (MHG). Forty-eight spatiotemporal and force-related variables were extracted. Random Forest, support vector machine (SVM), and logistic regression classifiers were trained with (i) the full feature set and (ii) the 10 most important features selected via Random Forest Gini importance. Performance was assessed with 5-fold stratified cross-validation and an 80/20 hold-out test, using accuracy, F1-score, and the area under the receiver operating characteristic curve (AUC). All models achieved high discrimination (AUC > 0.93). The SVM attained perfect discrimination (AUC = 1.000, test set) with the full feature set and maintained excellent accuracy (AUC = 0.983) with only the top 10 features. Temporal asymmetries, delayed vertical ground reaction force peaks, and mediolateral spatial instability ranked highest in importance. Reduced-feature models showed negligible performance loss, highlighting their parsimony and interpretability. Supervised machine learning algorithms can accurately distinguish true hemiplegic gait from mimicked patterns using a compact subset of gait features. The findings support data-driven, time-efficient gait assessments for clinical neurorehabilitation and for validating experimental protocols that rely on gait imitation. Full article
(This article belongs to the Special Issue Biomechanics and Motion Analysis)
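The classification protocol this abstract describes (Random Forest Gini-importance feature selection followed by 5-fold stratified cross-validation of an SVM) can be sketched with scikit-learn. The synthetic matrix below merely mimics the shape of the problem (79 participants, 48 features); it is not the study's gait data, and all parameter values are illustrative:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Synthetic stand-in: 79 "participants", 48 "gait features", 10 informative.
X, y = make_classification(n_samples=79, n_features=48, n_informative=10,
                           random_state=0)

# Step 1: rank features by Random Forest Gini importance, keep the top 10.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
top10 = np.argsort(rf.feature_importances_)[::-1][:10]

# Step 2: 5-fold stratified cross-validation of an SVM on the reduced set.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc = cross_val_score(SVC(), X[:, top10], y, cv=cv, scoring="roc_auc")
print(round(float(auc.mean()), 3))
```

The same loop run once with all 48 columns and once with `top10` reproduces the paper's full-set versus reduced-set comparison.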

17 pages, 1172 KB  
Article
Data-Driven Baseline Analysis of Climate Variability at an Antarctic AWS (2020–2024)
by Arpitha Javali Ashok, Shan Faiz, Raja Hashim Ali and Talha Ali Khan
Digital 2025, 5(4), 50; https://doi.org/10.3390/digital5040050 - 2 Oct 2025
Abstract
Climate change in Antarctica has profound global implications, influencing sea level rise, atmospheric circulation, and the Earth’s energy balance. This study presents a data-driven baseline analysis of meteorological observations from a British Antarctic Survey automatic weather station (2020–2024). Temporal and seasonal analyses reveal strong insolation-driven variability in temperature, snow depth, and solar radiation, reflecting the extreme polar day–night cycle. Correlation analysis highlights solar radiation, upwelling longwave flux, and snow depth as the most reliable predictors of near-surface temperature, while humidity, pressure, and wind speed contribute minimally. A linear regression baseline and a Random Forest model are evaluated for temperature prediction, with the ensemble approach demonstrating superior accuracy. Although the short data span limits long-term trend attribution, the findings underscore the potential of lightweight, reproducible pipelines for site-specific climate monitoring. All analysis code is openly available on GitHub, enabling transparency and future methodological extensions to advanced, non-linear models and multi-site datasets. Full article
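The baseline-versus-ensemble comparison can be sketched as follows; the toy temperature relation and predictor ranges are invented, chosen only so that a mild non-linearity exists for the forest to exploit, and are not BAS measurements:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 1000
solar = rng.uniform(0, 400, n)   # downwelling shortwave, W/m^2 (synthetic)
snow = rng.uniform(0, 2, n)      # snow depth, m (synthetic)
# Toy near-surface temperature relation with a mild non-linearity.
temp = -25 + 0.04 * solar - 3 * snow + 2 * np.sin(solar / 60) + rng.normal(0, 1, n)

X = np.column_stack([solar, snow])
Xtr, Xte, ytr, yte = train_test_split(X, temp, random_state=0)

# Fit the linear baseline and the Random Forest on identical splits.
scores = {}
for model in (LinearRegression(), RandomForestRegressor(random_state=0)):
    model.fit(Xtr, ytr)
    scores[type(model).__name__] = mean_absolute_error(yte, model.predict(Xte))
print({k: round(v, 2) for k, v in scores.items()})
```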
14 pages, 21399 KB  
Article
Temporal Variability of Major Stratospheric Sudden Warmings in CMIP5 Climate Change Scenarios
by Víctor Manuel Chávez-Pérez, Juan A. Añel, Citlalli Almaguer-Gómez and Laura de la Torre
Climate 2025, 13(10), 207; https://doi.org/10.3390/cli13100207 - 2 Oct 2025
Abstract
Major stratospheric sudden warmings are key processes in the coupling between the stratosphere and the troposphere, exerting a direct influence on mid-latitude climate variability. This study evaluates projected changes in the frequency of these phenomena during the 2006–2100 period using six high-top general circulation models from the CMIP5 project under the Representative Concentration Pathway scenarios 2.6, 4.5, and 8.5. The analysis combines the full future period with a moving-window approach of 27 and 48 years, compared against both the satellite-era (1979–2005) and extended historical (1958–2005) periods. This methodology reveals that model responses are highly heterogeneous, with alternating periods of significant increases and decreases in event frequency, partially modulated by internal variability. The magnitude and statistical significance of the projected changes strongly depend on the chosen historical reference period, and most models tend to reproduce displacement-type polar vortex events preferentially over split-type events. These results indicate that assessments based solely on multi-model means or long aggregated periods may mask subperiods with robust signals, although some of these may arise by chance given the 5% significance threshold. This underscores the need for temporally resolved analyses to improve the understanding of stratospheric variability and its potential impact on climate predictability. Full article
(This article belongs to the Section Climate and Environment)

28 pages, 11737 KB  
Article
Comparative Evaluation of SNO and Double Difference Calibration Methods for FY-3D MERSI TIR Bands Using MODIS/Aqua as Reference
by Shufeng An, Fuzhong Weng, Xiuzhen Han and Chengzhi Ye
Remote Sens. 2025, 17(19), 3353; https://doi.org/10.3390/rs17193353 - 2 Oct 2025
Abstract
Radiometric consistency across satellite platforms is fundamental to producing high-quality Climate Data Records (CDRs). Because different cross-calibration methods have distinct advantages and limitations, comparative evaluation is necessary to ensure record accuracy. This study presents a comparative assessment of two widely applied calibration approaches—Simultaneous Nadir Overpass (SNO) and Double Difference (DD)—for the thermal infrared (TIR) bands of FY-3D MERSI. MODIS/Aqua serves as the reference sensor, while radiative transfer simulations driven by ERA5 inputs are generated with the Advanced Radiative Transfer Modeling System (ARMS) to support the analysis. The results show that SNO performs effectively when matchup samples are sufficiently large and globally representative but is less applicable under sparse temporal sampling or orbital drift. In contrast, the DD method consistently achieves higher calibration accuracy for MERSI Bands 24 and 25 under clear-sky conditions. It reduces mean biases from ~−0.5 K to within ±0.1 K and lowers RMSE from ~0.6 K to 0.3–0.4 K during 2021–2022. Under cloudy conditions, DD tends to overcorrect because coefficients derived from clear-sky simulations are not directly transferable to cloud-covered scenes, whereas SNO remains more stable though less precise. Overall, the results suggest that the two methods exhibit complementary strengths, with DD being preferable for high-accuracy calibration in clear-sky scenarios and SNO offering greater stability across variable atmospheric conditions. Future work will validate both methods under varied surface and atmospheric conditions and extend their use to additional sensors and spectral bands. Full article
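The Double Difference idea is compact enough to show numerically: observation-minus-simulation differences are formed for each sensor and then differenced, so the common simulated scene cancels. Everything below is synthetic (bias magnitudes, noise levels); it illustrates only the arithmetic, not the MERSI/MODIS processing chain:

```python
import numpy as np

rng = np.random.default_rng(1)
truth = rng.uniform(220, 300, 500)   # scene brightness temperatures, K (synthetic)

# Each sensor sees the truth plus its own bias and noise (values invented).
obs_modis = truth + 0.10 + rng.normal(0, 0.1, truth.size)  # reference sensor
obs_mersi = truth - 0.50 + rng.normal(0, 0.1, truth.size)  # sensor to calibrate
sim = truth + rng.normal(0, 0.05, truth.size)              # RTM simulation (ERA5-driven)

# Double difference: (O - B)_MERSI - (O - B)_MODIS isolates the inter-sensor
# bias, because the common simulation term cancels scene/atmosphere effects.
dd = float(np.mean((obs_mersi - sim) - (obs_modis - sim)))
obs_corrected = obs_mersi - dd

print(round(dd, 2))                                       # near -0.6 K here
print(round(float(np.mean(obs_corrected - obs_modis)), 3))
```

The same differencing explains the overcorrection noted under cloud: if the clear-sky `dd` coefficient is applied to scenes where the sensor biases differ, the cancellation no longer holds.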

21 pages, 11783 KB  
Article
Spatio-Temporal Pattern Analysis of African Swine Fever Spreading in Northwestern Italy—The Role of Habitat Interfaces
by Samuele De Petris, Tommaso Orusa, Annalisa Viani, Francesco Feliziani, Marco Sordilli, Sabatino Troisi, Simona Zoppi, Marco Ragionieri, Riccardo Orusa and Enrico Borgogno-Mondino
Animals 2025, 15(19), 2886; https://doi.org/10.3390/ani15192886 - 2 Oct 2025
Abstract
African swine fever (ASF) is a highly contagious viral disease with significant impacts on domestic pigs and wild boar populations. This study applies GIS-based spatial analysis to monitor ASF outbreaks in northwestern Italy (Piedmont and Liguria) and identify areas at increased risk. Key factors considered include pig density, wildlife proximity, and environmental conditions. The spatial analysis revealed that central–western municipalities exhibited higher risk due to favorable environmental conditions and dense wild boar populations, while peripheral areas showed a temporal delay in outbreak emergence. Mapping the spreading rate and habitat interfaces allowed the development of a spatial risk model, which was further analyzed using geostatistical techniques to understand disease dynamics. The results demonstrate the effectiveness of geospatial modeling in identifying high-risk zones, characterizing spatio-temporal patterns, and supporting targeted prevention and surveillance strategies. These findings provide actionable insights for ASF management and resource allocation. Future studies may refine these models by integrating additional datasets and environmental variables, enhancing predictive capacity and applicability across different regions. Full article

16 pages, 1005 KB  
Article
A Two-Step Machine Learning Approach Integrating GNSS-Derived PWV for Improved Precipitation Forecasting
by Laura Profetto, Andrea Antonini, Luca Fibbi, Alberto Ortolani and Giovanna Maria Dimitri
Entropy 2025, 27(10), 1034; https://doi.org/10.3390/e27101034 - 2 Oct 2025
Abstract
Global Navigation Satellite System (GNSS) meteorology has emerged as a valuable tool for atmospheric monitoring, providing high-resolution, near-real-time data that can significantly improve precipitation nowcasting. This study aims to enhance short-term precipitation forecasting by integrating GNSS-derived Precipitable Water Vapor (PWV)—a key indicator of atmospheric moisture—with traditional meteorological observations. A novel two-step machine learning framework is proposed that combines a Random Forest (RF) model and a Long Short-Term Memory (LSTM) neural network. The RF model first estimates current precipitation based on PWV, surface weather parameters, and auxiliary atmospheric variables. Then, the LSTM network leverages temporal dependencies within the data to predict precipitation for the subsequent hour. This hybrid method capitalizes on the RF’s ability to model complex nonlinear relationships and the LSTM’s strength in handling time series data. The results demonstrate that the proposed approach improves forecasting accuracy, particularly during extreme weather events such as intense rainfall and thunderstorms, outperforming conventional models. By integrating GNSS meteorology with advanced machine learning techniques, this study offers a promising tool for meteorological services, early warning systems, and disaster risk management. The findings highlight the potential of GNSS-based nowcasting for real-time decision-making in weather-sensitive applications. Full article
(This article belongs to the Special Issue Entropy in Machine Learning Applications, 2nd Edition)
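The two-step design (a Random Forest nowcast feeding a temporal model) can be outlined with synthetic data. One deliberate substitution: to keep the sketch dependency-free, the second-step LSTM is replaced here by a lagged-feature Random Forest; the variable names and the toy PWV-to-rain relation are assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 500
pwv = np.cumsum(rng.normal(0, 0.3, n)) + 20       # GNSS PWV, mm (synthetic walk)
pressure = 1013 + rng.normal(0, 3, n)             # surface pressure, hPa
rain = np.clip(0.5 * (pwv - 22) + rng.normal(0, 1, n), 0, None)  # toy rain rate

# Step 1: estimate *current* precipitation from PWV + surface observations.
X1 = np.column_stack([pwv, pressure])
rf1 = RandomForestRegressor(random_state=0).fit(X1, rain)
rain_now = rf1.predict(X1)

# Step 2: predict the next step from a short history of step-1 estimates
# (the paper uses an LSTM here; a lagged-feature regressor keeps this light).
lags = 3
X2 = np.column_stack([rain_now[i:n - lags + i] for i in range(lags)])
y2 = rain[lags:]
rf2 = RandomForestRegressor(random_state=0).fit(X2, y2)
forecast = rf2.predict(X2)
print(round(float(np.mean(np.abs(forecast - y2))), 2))  # in-sample MAE
```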

20 pages, 990 KB  
Article
Hybrid Stochastic–Machine Learning Framework for Postprandial Glucose Prediction in Type 1 Diabetes
by Irina Naskinova, Mikhail Kolev, Dilyana Karova and Mariyan Milev
Algorithms 2025, 18(10), 623; https://doi.org/10.3390/a18100623 - 1 Oct 2025
Abstract
This research introduces a hybrid framework that integrates stochastic modeling and machine learning for predicting postprandial glucose levels in individuals with Type 1 Diabetes (T1D). The primary aim is to enhance the accuracy of glucose predictions by merging a biophysical Glucose–Insulin–Meal (GIM) model with advanced machine learning techniques. This framework is tailored to utilize the Kaggle BRIST1D dataset, which comprises real-world data from continuous glucose monitoring (CGM), insulin administration, and meal intake records. The methodology employs the GIM model as a physiological prior to generate simulated glucose and insulin trajectories, which are then utilized as input features for the machine learning (ML) component. For this component, the study leverages the Light Gradient Boosting Machine (LightGBM) due to its efficiency and strong performance with tabular data, while Long Short-Term Memory (LSTM) networks are applied to capture temporal dependencies. Additionally, Bayesian regression is integrated to assess prediction uncertainty. A key advancement of this research is the transition from a deterministic GIM formulation to a stochastic differential equation (SDE) framework, which allows the model to represent the probabilistic range of physiological responses and improves uncertainty management when working with real-world data. The findings reveal that this hybrid methodology enhances both the precision and applicability of glucose predictions by integrating the physiological insights of the GIM model with the flexibility of data-driven machine learning techniques to accommodate real-world variability. This innovative framework facilitates the creation of robust, transparent, and personalized decision-support systems aimed at improving diabetes management. Full article
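The deterministic-to-stochastic step is the part that benefits from a concrete example. Below is an Euler–Maruyama simulation of a deliberately simplified glucose SDE (a mean-reverting model with a meal pulse); the rate constants, noise level, and meal profile are invented for illustration and are not the paper's GIM parameters:

```python
import numpy as np

# Euler-Maruyama integration of a toy stochastic glucose model:
#   dG = (-k*(G - Gb) + meal(t)) dt + sigma dW
rng = np.random.default_rng(42)
k, Gb, sigma = 0.02, 100.0, 1.5     # 1/min, mg/dL, noise scale (all invented)
dt, steps, n_paths = 1.0, 240, 200  # 4 h at 1-min resolution, 200 sample paths

G = np.full(n_paths, Gb)
trajectories = np.empty((steps, n_paths))
for t in range(steps):
    meal = 2.0 if 30 <= t < 60 else 0.0        # 30-min meal absorption pulse
    drift = -k * (G - Gb) + meal
    G = G + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
    trajectories[t] = G

# The path ensemble yields a probabilistic range rather than a point forecast.
p10, p90 = np.percentile(trajectories[-1], [10, 90])
print(round(float(np.mean(trajectories[59])), 1))  # postprandial peak region
print(round(float(p10), 1), round(float(p90), 1))
```

The spread between the quantiles is exactly the kind of uncertainty band a deterministic GIM cannot produce on its own.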
31 pages, 1105 KB  
Article
MoCap-Impute: A Comprehensive Benchmark and Comparative Analysis of Imputation Methods for IMU-Based Motion Capture Data
by Mahmoud Bekhit, Ahmad Salah, Ahmed Salim Alrawahi, Tarek Attia, Ahmed Ali, Esraa Eldesouky and Ahmed Fathalla
Information 2025, 16(10), 851; https://doi.org/10.3390/info16100851 - 1 Oct 2025
Abstract
Motion capture (MoCap) data derived from wearable Inertial Measurement Units is essential to applications in sports science and healthcare robotics. However, much of the potential of these data is limited by missing values caused by sensor limitations, network issues, and environmental interference. Such gaps can introduce bias, prevent the fusion of critical data streams, and ultimately compromise the integrity of human activity analysis. Despite the plethora of data imputation techniques available, there have been few systematic performance evaluations of these techniques explicitly for IMU-derived MoCap time series. We address this gap with a systematic comparative analysis of statistical, machine learning, and deep learning imputation techniques, evaluated across three distinct contexts: univariate time series, multivariate across players, and multivariate across kinematic angles. We also introduce the first publicly available MoCap dataset specifically for benchmarking missing value imputation, with three missingness mechanisms: missing completely at random, block missingness, and a value-dependent missingness pattern simulated at the signal transition points. Using data from 53 karate practitioners performing standardized movements, we artificially generated missing values to create controlled experimental conditions. Experiments across the 53 subjects and 39 kinematic variables showed that multivariate imputation frameworks surpass univariate approaches when working with more complex missingness mechanisms. 
Specifically, multivariate approaches achieved up to a 50% error reduction (with the MAE improving from 10.8 ± 6.9 to 5.8 ± 5.5) compared to univariate methods for transition point missingness. Specialized time series deep learning models (i.e., SAITS, BRITS, GRU-D) demonstrated a superior performance with MAE values consistently below 8.0 for univariate contexts and below 3.2 for multivariate contexts across all missing data percentages, significantly surpassing traditional machine learning and statistical methods. Notable traditional methods such as Generative Adversarial Imputation Networks and Iterative Imputers exhibited a competitive performance but remained less stable than the specialized temporal models. This work offers an important baseline for future studies, in addition to recommendations for researchers looking to increase the accuracy and robustness of MoCap data analysis, as well as integrity and trustworthiness. Full article
(This article belongs to the Section Information Processes)
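The benchmarking logic can be illustrated in a few lines: create MCAR gaps in a synthetic joint-angle signal, impute with two simple baselines, and score with MAE on the held-out true values. The signal shape and the 20% missingness rate are assumptions for the sketch, not the benchmark's settings:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 400)
angle = 30 * np.sin(t) + rng.normal(0, 0.5, t.size)  # synthetic joint angle, deg

# MCAR mechanism: knock out 20% of samples completely at random.
mask = rng.random(t.size) < 0.2
observed = angle.copy()
observed[mask] = np.nan

def mae(imputed):
    # Score only the artificially removed samples against the known truth.
    return float(np.mean(np.abs(imputed[mask] - angle[mask])))

# Baseline 1: series-mean imputation (ignores temporal structure).
mean_imp = np.where(mask, np.nanmean(observed), observed)

# Baseline 2: linear interpolation over the gaps (uses temporal structure).
idx = np.arange(t.size)
lin_imp = observed.copy()
lin_imp[mask] = np.interp(idx[mask], idx[~mask], observed[~mask])

print(round(mae(mean_imp), 2), round(mae(lin_imp), 2))
```

Stronger methods (SAITS, BRITS, GRU-D, or multivariate imputers) would slot into the same harness in place of the two baselines.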
18 pages, 30918 KB  
Article
Beyond Local Indicators: Integrating Aggregated Runoff into Rainwater Harvesting Potential Mapping
by Christy Mathew Damascene, Irene Pomarico, Aldo Fiori and Antonio Zarlenga
Water 2025, 17(19), 2866; https://doi.org/10.3390/w17192866 - 1 Oct 2025
Abstract
Water scarcity, driven by over-consumption, population growth, climate change, and pollution, poses severe threats to both human health and ecosystems. Rainwater harvesting (RWH) has emerged as a sustainable solution to mitigate these impacts, offering environmental, social, and economic benefits. Traditional RWH site selection methods rely heavily on GIS-based Multi-Criteria Approaches, such as the Analytical Hierarchy Process, which typically assess runoff potential at the pixel scale using proxy indicators like runoff coefficients or drainage density. However, these methods often overlook horizontal water fluxes and temporal variability, leading to underestimation of the actual runoff available for harvesting. This study introduces an innovative enhancement to AHP/GIS-based methodologies for rainwater harvesting (RWH) site selection by incorporating Aggregated Runoff (AR) as a key criterion. Unlike traditional approaches, the use of AR—representing the total upstream surface water collected at each pixel—enables a more realistic and accurate assessment of RWH potential without increasing data or computational requirements. The proposed criterion is independent of the specific methodology or data layers adopted, making it broadly applicable and easily integrable into existing frameworks. The methodology is applied to the upper Tiber River catchment in Central Italy, demonstrating that AR-based assessments yield more realistic RWH potential maps compared to conventional methods. Additionally, the study proposes a quantile-based scoring system to account for inter-annual hydrological variability, enhancing the robustness of site selection under changing climate conditions. Full article
(This article belongs to the Topic Water Management in the Age of Climate Change)

23 pages, 10418 KB  
Article
Daily Water Mapping and Spatiotemporal Dynamics Analysis over the Tibetan Plateau
by Qi Feng, Kai Yu and Luyan Ji
Hydrology 2025, 12(10), 257; https://doi.org/10.3390/hydrology12100257 - 30 Sep 2025
Abstract
The Tibetan Plateau, known as the “Asian Water Tower”, contains thousands of lakes that are sensitive to climate variability and human activities. To investigate their long-term and short-term dynamics, we developed a daily surface-water mapping dataset covering the period from 2000 to 2024 based on MODIS daily reflectance time series (MOD09GQ/MYD09GQ and MOD09GA/MYD09GA). A hybrid methodology combining per-pixel spectral indices, superpixel segmentation, and fusion of Terra and Aqua results was applied, followed by temporal interpolation to produce cloud-free daily water maps. Validation against Landsat classifications and the 30 m global water dataset indicates an overall accuracy of 96.89% and a mean relative error below 9.1%, confirming the robustness of our dataset. Based on this dataset, we analyzed the spatiotemporal evolution of 1293 lakes (no less than 5 km2). Results show that approximately 87.7% of lakes expanded, with the fastest growth reaching +43.18 km2/y, whereas 12.3% shrank, with the largest decrease being −5.91 km2/y. Seasonal patterns reveal that most lakes reach maximum extent in October and minimum extent in January. This study provides a long-term, cloud-free daily water mapping product for the Tibetan Plateau, which can serve as a valuable resource for future research on regional hydrology, ecosystem vulnerability, and climate–water interactions in high-altitude regions. Full article
(This article belongs to the Special Issue Advances in Cold Regions' Hydrology and Hydrogeology)
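Of the pipeline's stages, the per-pixel spectral index step is the easiest to show concretely. The sketch below thresholds an NDWI-style index, (green - NIR) / (green + NIR), on a synthetic scene; the reflectance values and the zero threshold are illustrative assumptions, not the dataset's actual decision rule:

```python
import numpy as np

rng = np.random.default_rng(0)
h = w = 64
water = np.zeros((h, w), bool)
water[20:44, 10:50] = True                      # synthetic lake mask

# Synthetic surface reflectances: water is dark in the NIR, land is bright.
green = np.where(water, 0.06, 0.10) + rng.normal(0, 0.01, (h, w))
nir = np.where(water, 0.02, 0.25) + rng.normal(0, 0.01, (h, w))

# Per-pixel index step: label a pixel water where NDWI > 0. (The paper then
# refines such maps with superpixel segmentation and Terra/Aqua fusion.)
ndwi = (green - nir) / (green + nir)
predicted = ndwi > 0.0

accuracy = float(np.mean(predicted == water))
print(round(accuracy, 3))
```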

56 pages, 1777 KB  
Review
Vis Inertiae and Statistical Inference: A Review of Difference-in-Differences Methods Employed in Economics and Other Subjects
by Bruno Paolo Bosco and Paolo Maranzano
Econometrics 2025, 13(4), 38; https://doi.org/10.3390/econometrics13040038 - 30 Sep 2025
Abstract
Difference in Differences (DiD) is a useful statistical technique employed by researchers to estimate the effects of exogenous events on the outcome of some response variables in random samples of treated units (i.e., units exposed to the event) ideally drawn from an infinite population. The term “effect” should be understood as the discrepancy between the post-event realisation of the response and the hypothetical realisation of that same outcome for the same treated units in the absence of the event. This theoretical discrepancy is clearly unobservable. To circumvent the implicit missing variable problem, DiD methods utilise the realisations of the response variable observed in comparable random samples of untreated units. The latter are samples of units drawn from the same population, but they are not exposed to the event under investigation. They function as the control or comparison group and serve as proxies for the non-existent untreated realisations of the responses in treated units during post-treatment periods. In summary, the DiD model posits that, in the absence of intervention and under specific conditions, treated units would exhibit behaviours that are indistinguishable from those of control or untreated units during the post-treatment periods. For the purpose of estimation, the method employs a combination of before–after and treatment–control group comparisons. The event that affects the response variables is referred to as “treatment.” However, it could also be referred to as “causal factor” to emphasise that, in the DiD approach, the objective is not to estimate a mere statistical association among variables. This review introduces the DiD techniques for researchers in economics, public policy, health research, management, environmental analysis, and other fields. 
It commences with the rudimentary methods employed to estimate the so-called Average Treatment Effect upon Treated (ATET) in a two-period and two-group case and subsequently addresses numerous issues that arise in a multi-unit and multi-period context. A particular focus is placed on the statistical assumptions necessary for a precise delineation of the identification process of the cause–effect relationship in the multi-period case. These assumptions include the parallel trend hypothesis, the no-anticipation assumption, and the SUTVA assumption. In the multi-period case, both the homogeneous and heterogeneous scenarios are taken into consideration. The homogeneous scenario refers to the situation in which the treated units are initially treated in the same periods. In contrast, the heterogeneous scenario involves the treatment of treated units in different periods. A portion of the presentation will be allocated to the developments associated with the DiD techniques that can be employed in the context of data clustering or spatio-temporal dependence. The present review includes a concise exposition of some policy-oriented papers that incorporate applications of DiD. The areas of focus encompass income taxation, migration, regulation, and environmental management. Full article
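The two-period, two-group ATET estimator the review starts from is simple enough to verify numerically: difference the pre/post change of the treated group against that of the controls. The simulated panel below builds in parallel trends, a baseline group gap, and a known effect of 2.0; all values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Synthetic two-period panel: common trend +1.0, true treatment effect +2.0.
treated = rng.random(n) < 0.5
y_pre = 5.0 + 0.5 * treated + rng.normal(0, 1, n)           # level gap is allowed
y_post = y_pre + 1.0 + 2.0 * treated + rng.normal(0, 1, n)  # parallel trends + ATET

# DiD estimator: the before/after change, differenced across groups.
atet = (y_post[treated].mean() - y_pre[treated].mean()) \
     - (y_post[~treated].mean() - y_pre[~treated].mean())
print(round(float(atet), 2))  # close to 2.0
```

The baseline gap of 0.5 never enters the estimate, which is exactly the point: DiD differences away time-invariant group differences and the common trend.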

14 pages, 2003 KB  
Article
Changes in Camelina sativa Yield Based on Temperature and Precipitation Using FDA
by Małgorzata Graczyk, Danuta Kurasiak-Popowska and Grażyna Niedziela
Agriculture 2025, 15(19), 2051; https://doi.org/10.3390/agriculture15192051 - 30 Sep 2025
Abstract
Camelina (Camelina sativa) is an oilseed crop of increasing importance, valued not only for its adaptability to diverse environmental conditions and potential for sustainable agriculture but also for its economic advantages, including low input requirements and suitability for biofuel production and niche markets. This study examines the relationship between camelina yield and climatic variables—specifically temperature and precipitation—based on a ten-year field experiment conducted in Poland. To capture the temporal dynamics of weather conditions, Functional Data Analysis (FDA) was applied to daily temperature and precipitation data. The analysis revealed that yield variability was strongly influenced by the length of the vegetative period and specific weather patterns in April and July. Higher yields were recorded in years characterized by moderate spring temperatures, elevated temperatures in July, and evenly distributed rainfall during the early generative growth stages. The Maximal Information Coefficient (MIC) confirmed the relevance of these variables, with the duration of the vegetative phase showing the strongest correlation with yield. Cluster analysis further distinguished high- and low-yield years based on functional weather profiles. The FDA-based approach provided clear, interpretable insights into climate–yield interactions and demonstrated greater effectiveness than traditional regression models in capturing complex, time-dependent relationships. These findings enhance our understanding of camelina’s response to climatic variability and support the development of predictive tools for resilient, climate-smart crop management. Full article
(This article belongs to the Section Ecosystem, Environment and Climate Change in Agriculture)

17 pages, 2010 KB  
Article
Spontaneous Seizure Outcomes in Mice Using an Improved Version of the Pilocarpine Model of Temporal Lobe Epilepsy
by Ronald P. Gaykema, Madison J. Failor, Aleksandra Maciejczuk, Magda Pikus, Mariia Oliinyk, Maggie B. Ellison, Amir A. Behrooz, Kiran Singh, John M. Williamson and Edward Perez-Reyes
Int. J. Mol. Sci. 2025, 26(19), 9540; https://doi.org/10.3390/ijms26199540 - 29 Sep 2025
Abstract
Temporal lobe epilepsy (TLE) is a debilitating disorder that affects millions of people worldwide and is difficult to treat with medicines. There has been little progress in the development of novel therapies for these patients because of the lack of suitable animal models. Current rodent models of TLE use chemoconvulsants or electrical stimulation to induce status epilepticus, which evolves into chronic epilepsy with spontaneous recurring seizures. These models have face validity for human TLE, as they share similarities in seizure onset in the hippocampus, EEG patterns, tonic–clonic convulsive behavior, and hippocampal sclerosis. Unfortunately, seizure frequencies are so variable that they hinder drug testing. The ideal model for screening epilepsy therapies would have spontaneous seizure frequencies that are greater than two per day, little-to-no seizure-free days, and would maintain these features for more than 4 weeks. This study describes a series of improvements to the mouse pilocarpine TLE model. First, a pharmacokinetic model was developed to guide pilocarpine dosing. Second, induction was combined with EEG monitoring, allowing for real-time monitoring of pilocarpine-induced EEG discharges and electrographic seizures that precede behavioral manifestations. Third, strains of mice were identified that withstand pilocarpine-induced status epilepticus and reliably develop spontaneous recurring seizures. The pilocarpine model was improved by lowering mortality and increasing the fraction of mice that developed spontaneous seizures and had seizure frequencies that are amenable to drug screening. Future studies are required to identify the ideal mouse strain for drug screening and validate the response to known anti-epileptic drugs. Full article
(This article belongs to the Special Issue Molecular and Cellular Mechanisms of Epilepsy—3rd Edition)
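The abstract does not specify the form of the pharmacokinetic model used to guide pilocarpine dosing. As a rough illustration only, a one-compartment model with first-order absorption and elimination (the Bateman function) is a common starting point for relating an injected dose to plasma exposure over time. All rate constants and the volume of distribution below are placeholder values for demonstration, not pilocarpine parameters from the study.

```python
import math

def plasma_concentration(dose_mg_per_kg, t_min, ka=0.3, ke=0.05, vd=2.0):
    """One-compartment absorption model (Bateman function).

    dose_mg_per_kg: administered dose (mg/kg)
    t_min: time since dosing (minutes)
    ka, ke: absorption and elimination rate constants (1/min) -- illustrative values
    vd: apparent volume of distribution (L/kg) -- illustrative value
    Returns plasma concentration in mg/L.
    """
    if math.isclose(ka, ke):
        raise ValueError("Bateman function is undefined for ka == ke")
    return (dose_mg_per_kg * ka / (vd * (ka - ke))) * (
        math.exp(-ke * t_min) - math.exp(-ka * t_min)
    )

def time_to_peak(ka=0.3, ke=0.05):
    """Analytical time of maximum concentration for the Bateman function."""
    return math.log(ka / ke) / (ka - ke)
```

With these placeholder constants, the predicted peak occurs at `time_to_peak()` minutes after dosing; in practice such a curve would be fit to measured plasma levels before being used to choose doses.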
15 pages, 8126 KB  
Article
Spatio-Temporal Variability of Key Habitat Drivers in China’s Coastal Waters
by Shuhui Cao, Yingchao Dang, Xuan Ban, Yadong Zhou, Jiahuan Luo, Jiazhi Zhu and Fei Xiao
J. Mar. Sci. Eng. 2025, 13(10), 1874; https://doi.org/10.3390/jmse13101874 - 29 Sep 2025
Abstract
China’s coastal fisheries face sustainability challenges from climate- and human-induced pressures on key habitat drivers. This study provides an 18-year (2003–2020) assessment of six environmental factors (sea-surface temperature (SST), salinity, transparency, currents (eastward velocity, EV; northward velocity, NV), and net primary productivity (NPP)), selected for their ecological relevance and data availability, across the Bohai, Yellow, and East China Seas at a spatial resolution of 0.083°. Non-parametric trend tests and seasonal climatologies were applied to MODIS-Aqua and CMEMS data with a refined quasi-analytical algorithm (QAA-v6). The results show distinct gradients: SST ranged from 9–13 °C (Bohai Sea) to >20 °C (East China Sea), and transparency from <5 m (turbid coasts) to 29.20 m (offshore). Seasonal peaks occurred for SST (summer: 18.92 °C), transparency (summer: 12.54 m), and primary productivity (spring: 1289 mg/m²). Long-term trends reveal regional SST warming in the northern Yellow Sea (9.78% of the area) but cooling in the central East China Sea. Transparency increased over much of the region (65.14% of the area), while productivity declined significantly over 27.3%. The drivers were spatially coupled (e.g., SST–salinity r = 0.95), but their long-term trends were decoupled. The study thus provides a comprehensive, long-term empirical baseline and dynamic management tools for China’s changing coastal ecosystems.
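The "non-parametric trend tests" referred to above are commonly the Mann–Kendall test paired with Sen's slope estimator, applied pixel by pixel to annual time series. The minimal sketch below (an assumption about the specific method, which the abstract does not name; ties are ignored in the variance term for brevity) shows both for a single grid cell:

```python
import numpy as np

def mann_kendall(series):
    """Mann-Kendall trend test for one pixel's annual time series.

    Returns (S, Z): S is the signed pair count; |Z| > 1.96 indicates a
    significant monotonic trend at p < 0.05 (ties ignored for simplicity).
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S under H0, no ties
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return s, z

def sen_slope(series):
    """Sen's slope: median of all pairwise slopes, robust to outliers."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    slopes = [(x[j] - x[i]) / (j - i) for i in range(n - 1) for j in range(i + 1, n)]
    return float(np.median(slopes))
```

For an 18-year record like the one in the study, each of the six drivers would yield a map of Z values (trend significance) and Sen slopes (trend magnitude) at 0.083° resolution.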
15 pages, 2790 KB  
Article
A Machine Learning Approach for Real-Time Detection of Inadequate Sedation Using Non-EEG Physiological Signals
by Huiquan Wang, Chunliang Jiang, Guanjun Liu, Jing Yuan, Ming Yu, Xin Ma, Chong Liu, Jingyu Xiao and Guang Zhang
Bioengineering 2025, 12(10), 1049; https://doi.org/10.3390/bioengineering12101049 - 29 Sep 2025
Abstract
Sedation is an essential component of anesthesia. Inadequate sedation increases the risk of patient discomfort, intraoperative awareness, and psychological trauma. Conventional electroencephalography (EEG)-based depth-of-anesthesia monitoring is often impractical in out-of-hospital settings due to equipment limitations and signal artifacts, so alternative non-EEG approaches are required. In this study, we developed a machine learning model to detect inadequate sedation using 27 feature parameters, including demographics, vital signs, and heart rate variability metrics, drawn from the open-access VitalDB database. Patient state was defined as inadequate sedation when the bispectral index (BIS) exceeded 60. We systematically evaluated four temporal windows and four algorithms, and assessed model interpretability using Shapley Additive Explanations (SHAP). The Light Gradient Boosting Machine (LGBM) achieved the best performance, with an area under the receiver operating characteristic curve (AUC) of 0.825 and an accuracy (ACC) of 0.741 using a 2 s time window; extending the window to 20 s improved both metrics by approximately 0.012. Feature selection identified 12 key parameters that maintained comparable accuracy, confirming robustness at reduced complexity. These findings demonstrate the feasibility of real-time detection of inadequate sedation from non-EEG physiological data. The model is interpretable, resource-efficient, and scalable, and shows strong potential for integration into portable monitoring systems in prehospital, emergency, and low-resource surgical settings. Full article