Search Results (3,430)

Search Parameters:
Keywords = time series data generation

23 pages, 1938 KiB  
Article
Algorithmic Silver Trading via Fine-Tuned CNN-Based Image Classification and Relative Strength Index-Guided Price Direction Prediction
by Yahya Altuntaş, Fatih Okumuş and Adnan Fatih Kocamaz
Symmetry 2025, 17(8), 1338; https://doi.org/10.3390/sym17081338 - 16 Aug 2025
Abstract
Predicting short-term buy and sell signals in financial markets remains a significant challenge for algorithmic trading. This difficulty stems from the data’s inherent volatility and noise, which often leads to spurious signals and poor trading performance. This paper presents a novel algorithmic trading model for silver that combines fine-tuned Convolutional Neural Networks (CNNs) with a decision filter based on the Relative Strength Index (RSI). The technique allows for the prediction of buy and sell points by turning time series data into chart images. Daily silver price per ounce data were turned into chart images using technical analysis indicators. Four pre-trained CNNs, namely AlexNet, VGG16, GoogLeNet, and ResNet-50, were fine-tuned using the generated image dataset to find the best architecture based on classification and financial performance. The models were evaluated using walk-forward validation with an expanding window. This validation method made the tests more realistic and the performance evaluation more robust under different market conditions. Fine-tuned VGG16 with the RSI filter had the best cost-adjusted profitability, with a cumulative return of 115.03% over five years. This was nearly double the 61.62% return of a buy-and-hold strategy. This outperformance is especially impressive because the evaluation period was mostly upward, which makes it harder to beat passive benchmarks. Adding the RSI filter also helped models make more disciplined decisions. This reduced transactions with low confidence. In general, the results show that pre-trained CNNs fine-tuned on visual representations, when supplemented with domain-specific heuristics, can provide strong and cost-effective solutions for algorithmic trading, even when realistic cost assumptions are used. Full article
(This article belongs to the Section Computer)
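The abstract's RSI decision filter can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the 30/70 thresholds are the conventional RSI defaults and are assumptions here, and the CNN signal is taken as a given string.

```python
def rsi(closes, period=14):
    """Wilder-style Relative Strength Index over a list of closing prices."""
    gains, losses = [], []
    for prev, cur in zip(closes, closes[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    out = []
    for g, l in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + g) / period
        avg_loss = (avg_loss * (period - 1) + l) / period
        rs = avg_gain / avg_loss if avg_loss else float("inf")
        out.append(100.0 - 100.0 / (1.0 + rs))
    return out  # one RSI value per close after the warm-up window

def filter_signal(cnn_signal, rsi_value, overbought=70, oversold=30):
    """Suppress low-confidence CNN signals using RSI as a decision filter."""
    if cnn_signal == "buy" and rsi_value > overbought:
        return "hold"   # market already overbought: skip the buy
    if cnn_signal == "sell" and rsi_value < oversold:
        return "hold"   # market already oversold: skip the sell
    return cnn_signal
```

The filter only vetoes trades; it never generates them, which matches the abstract's description of reducing low-confidence transactions.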

18 pages, 5932 KiB  
Article
Surface Elevation Dynamics of Lake Karakul from 1991 to 2020 Inversed by ICESat, CryoSat-2 and ERS-1/2
by Zihui Zhang, Ping Ma, Xiaofei Wang, Jiayu Hou, Qinqin Zhang, Yuchuan Guo, Zhonglin Xu, Yao Wang and Kayumov Abdulhamid
Remote Sens. 2025, 17(16), 2816; https://doi.org/10.3390/rs17162816 - 14 Aug 2025
Viewed by 45
Abstract
High-altitude lakes are sensitive indicators of climate change, reflecting the hydrological impacts of global warming in alpine regions. This study investigates the long-term dynamics of the water level and surface area of Lake Karakul on the eastern Pamir Plateau from 1991 to 2020 using integrated satellite altimetry data from ERS-1/2, ICESat, and CryoSat-2. A multi-source fusion approach was applied to generate a continuous time series, overcoming the temporal limitations of individual missions. The results show a significant upward trend in both water level and area, with an average lake level rise of 8 cm per year and a surface area increase of approximately 13.2 km² per decade. The two variables exhibit a strong positive correlation (r = 0.84), and the Mann–Kendall test confirms the significance of the trends at the 95% confidence level. The satellite-derived water levels show high reliability, with an RMSE of 0.15 m when compared to reference data. These changes are primarily attributed to increased glacial meltwater inflow, driven by regional warming and accelerated glacier retreat, with glacier area shrinking by over 10% from 1978 to 2001 in the eastern Pamir. This study highlights the value of integrating multi-sensor satellite data for monitoring inland waters and provides critical insights into the climatic drivers of hydrological change in high-altitude endorheic basins. Full article
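The Mann–Kendall test used here to confirm trend significance is short enough to write out. This sketch uses the no-ties variance formula and a two-sided 95% threshold of |Z| > 1.96; the synthetic series below (an 8 cm/yr rise) is illustrative, not the study's data.

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns (S, Z)."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum(
        (xj > xi) - (xj < xi)
        for i, xi in enumerate(x)
        for xj in x[i + 1:]
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

levels = [0.08 * t for t in range(30)]   # synthetic 8 cm/yr lake-level rise
s, z = mann_kendall(levels)
trend_significant = abs(z) > 1.96        # 95% confidence, two-sided
```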

13 pages, 970 KiB  
Article
A Mixture Integer GARCH Model with Application to Modeling and Forecasting COVID-19 Counts
by Wooi Chen Khoo, Seng Huat Ong, Victor Jian Ming Low and Hari M. Srivastava
Stats 2025, 8(3), 73; https://doi.org/10.3390/stats8030073 - 13 Aug 2025
Viewed by 140
Abstract
This article introduces a flexible time series regression model known as the Mixture of Integer-Valued Generalized Autoregressive Conditional Heteroscedasticity (MINGARCH). Mixture models provide versatile frameworks for capturing heterogeneity in count data, including features such as multiple peaks, seasonality, and intervention effects. The proposed model is applied to regional COVID-19 data from Malaysia. To account for geographical variability, five regions—Selangor, Kuala Lumpur, Penang, Johor, and Sarawak—were selected for analysis, covering a total of 86 weeks of data. Comparative analysis with existing time series regression models demonstrates that MINGARCH outperforms alternative approaches. Further investigation into forecasting reveals that MINGARCH yields superior performance in regions with high population density, and significant influencing factors have been identified. In low-density regions, confirmed cases peaked within three weeks, whereas high-density regions exhibited a monthly seasonal pattern. Forecasting metrics—including MAPE, MAE, and RMSE—are significantly lower for the MINGARCH model compared to other models. These results suggest that MINGARCH is well-suited for forecasting disease spread in urban and densely populated areas, offering valuable insights for policymaking. Full article
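The generative side of a mixture INGARCH model can be sketched directly from its recursion, lambda_t = omega + alpha·y_{t-1} + beta·lambda_{t-1}, with one mixture component selected per step. The two regimes and weights below are illustrative values, not the paper's fitted parameters, and the estimation procedure (the paper's actual contribution) is not shown.

```python
import math
import random

def simulate_mingarch(T, comps, weights, seed=0):
    """Draw counts from a mixture of Poisson INGARCH(1,1) components:
    lambda_t = omega + alpha * y_{t-1} + beta * lambda_{t-1}.
    At each step one component is chosen with the given mixture weights."""
    rng = random.Random(seed)
    lam = [c["omega"] for c in comps]   # per-component conditional means
    y_prev, counts = 0, []
    for _ in range(T):
        for k, c in enumerate(comps):
            lam[k] = c["omega"] + c["alpha"] * y_prev + c["beta"] * lam[k]
        k = rng.choices(range(len(comps)), weights=weights)[0]
        # Poisson draw by Knuth's inversion (adequate for moderate lambda)
        limit, prod, y = math.exp(-lam[k]), 1.0, -1
        while prod > limit:
            prod *= rng.random()
            y += 1
        counts.append(y)
        y_prev = y
    return counts

# two illustrative regimes: a low-count and a high-count component
comps = [{"omega": 1.0, "alpha": 0.3, "beta": 0.2},
         {"omega": 5.0, "alpha": 0.1, "beta": 0.3}]
counts = simulate_mingarch(200, comps, weights=[0.7, 0.3])
```

Because alpha + beta < 1 in both components, the simulated process stays stationary; mixing the two regimes produces the multi-peak count distributions the abstract describes.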

19 pages, 4594 KiB  
Article
Spatial Mapping of Thermal Anomalies and Change Detection in the Sierra Madre Occidental, Mexico, from 2000 to 2024
by Sandoval Sarahi and Escobar-Flores Jonathan Gabriel
Land 2025, 14(8), 1635; https://doi.org/10.3390/land14081635 - 13 Aug 2025
Viewed by 168
Abstract
We quantified monthly changes in land surface temperature (LST) over the Sierra Madre Occidental (SMO) in Mexico from 2000 to 2024 using MODIS satellite imagery (MOD11B3). The SMO is the longest continuous mountain complex in Mexico, covering an area of 251,648 km². It is an area of great importance for biodiversity conservation, as it is home to numerous endemic flora and fauna species. The Intergovernmental Panel on Climate Change (IPCC) has stated that high mountain areas are among the regions most affected by climate change and are a key element of the water cycle. We calculated an anomaly index by vegetation type in the SMO and applied change detection to spatially identify where changes in LST had taken place. The lowest LST values were in December and January (20 to 25 °C), and the highest LST values occurred in April, May, and June (>40 °C). Change detection applied to the time series showed that the months with the highest positive LST changes were May to July, and that November was notable for increases of up to 5.86 °C. The time series that showed the greatest changes compared to 2000 was the series for 2024, where LST increases were found in all months of the year. The maximum average increase was 6.98 °C from 2000 to June 2005. In general, LST anomalies show a pattern of occurrence in the months of March through July for the three vegetation types distributed in the Sierra Madre Occidental. In the case of the pine forest, which is distributed at 2000 m above sea level and higher, it was expected that there would be no LST anomalies; however, anomalies were present in all time series for the spring and early summer months. The LST values were validated with in situ data from weather stations using linear regression models. It was found that almost all the values were related, with R² > 0.60 (p < 0.001). In conclusion, the constant increases in LST throughout the SMO are probably related to the loss of 34% of forest cover due to forest fires, logging, land use changes, and increased forest plantations. Full article
(This article belongs to the Section Land – Observation and Monitoring)
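A monthly anomaly index of the kind the abstract describes is typically a z-score against each calendar month's climatology. This is a generic sketch, not the study's exact index; it assumes a series that starts in January, covers whole years, and varies within each month across years.

```python
from statistics import mean, pstdev

def monthly_lst_anomalies(lst):
    """Standardized LST anomalies: each monthly value minus the long-term
    mean of its calendar month, divided by that month's standard deviation.
    Assumes January start, whole years, and per-month variability."""
    by_month = {m: [] for m in range(12)}
    for i, v in enumerate(lst):
        by_month[i % 12].append(v)
    clim = {m: (mean(vs), pstdev(vs)) for m, vs in by_month.items()}
    return [(v - clim[i % 12][0]) / clim[i % 12][1] for i, v in enumerate(lst)]
```

By construction the anomalies of each calendar month average to zero over the record, so sustained positive values late in the series (as reported for 2024) indicate genuine warming rather than the seasonal cycle.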

17 pages, 3563 KiB  
Article
A Phenology-Informed Framework for Detecting Deforestation in North Korea Using Fused Satellite Time-Series
by Yihua Jin, Jingrong Zhu, Zhenhao Yin, Weihong Zhu and Dongkun Lee
Remote Sens. 2025, 17(16), 2789; https://doi.org/10.3390/rs17162789 - 12 Aug 2025
Viewed by 182
Abstract
Accurate mapping of deforestation in regions characterized by complex, heterogeneous landscapes and frequent cloud cover remains a major challenge in remote sensing. This study presents a phenology-informed, spatiotemporal data fusion framework for robust deforestation mapping in North Korea, focusing particularly on hillside fields and unstocked forests—two dominant deforested land cover types in the region. By integrating multi-temporal satellite observations with variables derived from phenological dynamics, our approach effectively distinguishes spectrally similar classes that are otherwise challenging to separate. The Flexible Spatiotemporal Data Fusion Algorithm (FSDAF) was employed to generate high-frequency, Landsat-like time-series from MODIS data, thereby ensuring fine spatial detail alongside temporal consistency. Key classification features—including NDVI, NDSI, NDWI, and snowmelt timing—were identified and ranked using the Random Forest (RF) algorithm. The classification results were validated against reference Landsat imagery, achieving high correlation coefficients (R > 0.8) and structural similarity index values (SSIM > 0.85). The RF-based land cover classification reached an overall accuracy of 86.1% and a Kappa coefficient of 0.837, reflecting strong agreement with ground reference data. Comparative analyses demonstrated that this method outperformed global land cover products, such as MCD12Q1, in capturing the spatial variability and fragmented patterns of deforestation at the regional scale. This research underscores the value of combining spatiotemporal fusion with phenological indicators for accurate, high-resolution deforestation monitoring in data-limited environments, providing practical insights for sustainable forest management and ecological restoration planning. Full article
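The spectral indices named as classification features (NDVI, NDSI, NDWI) all share one normalized-difference form, and phenological variables are derived from their per-pixel time series. The feature set below is an illustrative minimum, not the paper's ranked list.

```python
def normalized_difference(a, b):
    """Generic normalized-difference index: NDVI = (NIR - red)/(NIR + red),
    NDWI = (green - NIR)/(green + NIR), NDSI = (green - SWIR)/(green + SWIR)."""
    return (a - b) / (a + b) if (a + b) else 0.0

def phenology_features(ndvi_series):
    """Simple phenology descriptors from one pixel's NDVI time series;
    these stand in for the paper's fuller feature set."""
    peak = max(ndvi_series)
    return {
        "peak": peak,
        "peak_time": ndvi_series.index(peak),       # timing of green-up maximum
        "amplitude": peak - min(ndvi_series),       # seasonal amplitude
    }
```

Features like seasonal amplitude are what let a Random Forest separate unstocked forest from hillside fields even when their single-date spectra look alike.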

21 pages, 4852 KiB  
Article
Series Arc Fault Detection Method Based on Time Domain Imaging and Long Short-Term Memory Network for Residential Applications
by Ruobo Chu, Schweitzer Patrick and Kai Yang
Algorithms 2025, 18(8), 497; https://doi.org/10.3390/a18080497 - 11 Aug 2025
Viewed by 169
Abstract
This article presents a novel method for detecting series arc faults (SAFs) in residential applications using time-domain imaging (TDI) and Long Short-Term Memory (LSTM) networks. The proposed method transforms current signals into grayscale images by filtering out the fundamental frequency, allowing key arc fault characteristics—such as high-frequency noise and waveform distortions—to become visually apparent. The use of Ensemble Empirical Mode Decomposition (EEMD) helped isolate meaningful signal components, although it was computationally intensive. To address real-time requirements, a simpler yet effective TDI method was developed for generating 2D images from current data. These images were then used as inputs to an LSTM network, which captures temporal dependencies and classifies both arc faults and appliance types. The proposed TDI-LSTM model was trained and tested on 7000 labeled datasets across five common household appliances. The experimental results show an average detection accuracy of 98.1%, with reduced accuracy for loads using thyristors (e.g., dimmers). The method is robust across different appliance types and conditions; comparisons with prior methods indicate that the proposed TDI-LSTM approach offers superior accuracy and broader applicability. Trade-offs in sampling rates and hardware implementation were discussed to balance accuracy and system cost. Overall, the TDI-LSTM approach offers a highly accurate, efficient, and scalable solution for series arc fault detection in smart home systems. Full article
(This article belongs to the Special Issue AI and Computational Methods in Engineering and Science)
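The core TDI idea, turning a current waveform into a grayscale image with the fundamental removed, can be sketched as follows. Subtracting the average cycle is a crude stand-in for the paper's fundamental-frequency filter, chosen only to keep the sketch self-contained.

```python
import math

def time_domain_image(current, cycles, spc):
    """Fold a sampled current waveform into a (cycles x spc) grayscale image.
    Subtracting the average cycle removes the fundamental (and steady
    harmonics), so arc-related deviations dominate the pixel values."""
    rows = [current[c * spc:(c + 1) * spc] for c in range(cycles)]
    avg = [sum(col) / cycles for col in zip(*rows)]          # mean cycle
    resid = [[v - a for v, a in zip(row, avg)] for row in rows]
    lo = min(min(r) for r in resid)
    hi = max(max(r) for r in resid)
    span = (hi - lo) or 1.0
    return [[int(255 * (v - lo) / span) for v in row] for row in resid]

# 5 cycles of a clean mains sine, with one arc-like spike in cycle 2
spc = 20
current = [math.sin(2 * math.pi * i / spc) for i in range(5 * spc)]
current[2 * spc + 5] += 1.0
img = time_domain_image(current, 5, spc)
```

The spike becomes the brightest pixel of the image, which is exactly the kind of visually apparent distortion the LSTM is then trained to recognize.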

24 pages, 2791 KiB  
Article
Short-Term Wind Power Forecasting Based on Improved Modal Decomposition and Deep Learning
by Bin Cheng, Wenwu Li and Jie Fang
Processes 2025, 13(8), 2516; https://doi.org/10.3390/pr13082516 - 9 Aug 2025
Viewed by 309
Abstract
With the continued growth in wind power installed capacity and electricity generation, accurate wind power forecasting has become increasingly critical for power system stability and economic operations. Currently, short-term wind power forecasting often employs deep learning models following modal decomposition of wind power time series. However, the optimal length of the time series used for decomposition remains unclear. To address this issue, this paper proposes a short-term wind power forecasting method that integrates improved modal decomposition with deep learning techniques. First, the historical wind power series is segmented using the Pruned Exact Linear Time (PELT) method. Next, the segmented series is decomposed using an enhanced Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (ICEEMDAN) to extract multiple modal components. High-frequency oscillatory components are then further decomposed using Variational Mode Decomposition (VMD), and the resulting modes are clustered using the K-means algorithm. The reconstructed components are subsequently input into a Long Short-Term Memory (LSTM) network for prediction, and the final forecast is obtained by aggregating the outputs of the individual modes. The proposed method is validated using historical wind power data from a wind farm. Experimental results demonstrate that this approach enhances forecasting accuracy, supports grid power balance, and increases the economic benefits for wind farm operators in electricity markets. Full article
(This article belongs to the Section Energy Systems)
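The decompose-predict-aggregate pattern at the heart of this pipeline is easy to sketch. A toy trend/residual split stands in for the ICEEMDAN and VMD modes, and a naive last-value forecast stands in for the per-component LSTMs; only the structure, not the models, matches the paper.

```python
def decompose(series, window=3):
    """Toy two-mode decomposition (moving-average trend + residual).
    The modes sum back to the original series by construction, mirroring
    a property of the paper's modal decomposition."""
    trend = []
    for i in range(len(series)):
        lo, hi = max(0, i - window), min(len(series), i + window + 1)
        trend.append(sum(series[lo:hi]) / (hi - lo))
    resid = [v - t for v, t in zip(series, trend)]
    return trend, resid

def forecast(series):
    """Predict each mode separately, then aggregate the per-mode outputs
    (one LSTM per reconstructed component in the paper; a naive last-value
    forecast here keeps the sketch self-contained)."""
    return sum(mode[-1] for mode in decompose(series))
```

Because the modes partition the signal, summing per-mode forecasts recovers a forecast of the original series, which is why the final step in the paper is a simple aggregation.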

45 pages, 2170 KiB  
Article
EnergiQ: A Prescriptive Large Language Model-Driven Intelligent Platform for Interpreting Appliance Energy Consumption Patterns
by Christoforos Papaioannou, Ioannis Tzitzios, Alexios Papaioannou, Asimina Dimara, Christos-Nikolaos Anagnostopoulos and Stelios Krinidis
Sensors 2025, 25(16), 4911; https://doi.org/10.3390/s25164911 - 8 Aug 2025
Viewed by 191
Abstract
The increased usage of smart sensors has introduced both opportunities and complexities in managing residential energy consumption. Despite advancements in sensor data analytics and machine learning (ML), existing energy management systems (EMS) remain limited in interpretability, adaptability, and user engagement. This paper presents EnergiQ, an intelligent, end-to-end platform that leverages sensors and Large Language Models (LLMs) to bridge the gap between technical energy analytics and user comprehension. EnergiQ integrates smart plug-based IoT sensing, time-series ML for device profiling and anomaly detection, and an LLM reasoning layer to deliver personalized, natural language feedback. The system employs statistical feature-based XGBoost classifiers for appliance identification and hybrid CNN-LSTM autoencoders for anomaly detection. Through dynamic user feedback loops and instruction-tuned LLMs, EnergiQ generates context-aware, actionable recommendations that enhance energy efficiency and device management. Evaluations demonstrate high appliance classification accuracy (94%) using statistical feature-based XGBoost and effective anomaly detection across varied devices via a CNN-LSTM autoencoder. The LLM layer, instruction-tuned on a domain-specific dataset, achieved over 91% agreement with expert-written energy-saving recommendations in simulated feedback scenarios. By translating complex consumption data into intuitive insights, EnergiQ empowers consumers to engage with energy use more proactively, fostering sustainability and smarter home practices. Full article
(This article belongs to the Section Intelligent Sensors)
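The "statistical feature-based" classification step starts from summary features of each smart-plug power trace. The feature set below is illustrative, not EnergiQ's documented one, and the 5 W on-threshold is an assumption.

```python
from statistics import mean, pstdev

def power_features(watts, on_threshold=5.0):
    """Statistical features of a smart-plug power trace, of the kind fed
    to an appliance classifier such as XGBoost (illustrative feature set)."""
    return {
        "mean_w": mean(watts),
        "std_w": pstdev(watts),
        "peak_w": max(watts),
        # fraction of samples where the appliance draws real power
        "duty_cycle": sum(w > on_threshold for w in watts) / len(watts),
    }
```

Fixed-length feature vectors like this one let a tree ensemble separate, say, a kettle (high peak, low duty cycle) from a fridge (moderate peak, periodic duty cycle).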

26 pages, 5545 KiB  
Article
Time-Series MODIS-Based Remote Sensing and Explainable Machine Learning for Assessing Grassland Resilience in Arid Regions
by Ruihan Liu, Yang Yu, Ireneusz Malik, Malgorzata Wistuba, Zengkun Guo, Yuanbo Lu, Xiaoyun Ding, Jing He, Lingxiao Sun, Chunlan Li and Ruide Yu
Remote Sens. 2025, 17(16), 2749; https://doi.org/10.3390/rs17162749 - 8 Aug 2025
Viewed by 315
Abstract
Grassland ecosystems in arid regions increasingly experience resilience loss due to intensifying climatic variability. However, the limited interpretability of conventional machine learning models constrains our understanding of underlying ecological drivers. This study constructs an integrative framework that combines temporal autocorrelation (TAC) metrics with explainable machine learning, employing Random Forest and SHAP (SHapley Additive exPlanations) analysis. Time series of satellite-derived vegetation indices from MODIS (2001–2023), particularly the kernel Normalized Difference Vegetation Index (KNDVI), support the generation of TAC and its trend-based derivative δTAC. The framework assesses ecosystem resilience across seven representative grassland types in Xinjiang, capturing diverse responses to climate variability and vegetation dynamics. Results reveal pronounced spatial heterogeneity: resilience declines in radiation-stressed arid zones, while hydrothermally stable regions maintain stronger recovery capacity. Key drivers include temperature variability and vegetation dynamics, with divergent effects among grassland types. Meadow and Typical Steppe exhibit higher resilience under stable hydrothermal regimes, whereas desert and alpine systems show greater sensitivity to warming and climatic fluctuations. This framework enhances diagnostic transparency and ecological insight, offering a spatially explicit, data-driven tool for resilience monitoring. The findings support the formulation of targeted adaptation strategies and sustainable grassland management in response to ongoing climate change. Full article

26 pages, 3766 KiB  
Article
Water Quality Evaluation and Analysis by Integrating Statistical and Machine Learning Approaches
by Amar Lokman, Wan Zakiah Wan Ismail and Nor Azlina Ab Aziz
Algorithms 2025, 18(8), 494; https://doi.org/10.3390/a18080494 - 8 Aug 2025
Viewed by 241
Abstract
Water quality assessment plays a vital role in environmental monitoring and resource management. This study aims to enhance the predictive modeling of the Water Quality Index (WQI) using a combination of statistical diagnostics and machine learning techniques. Data collected from six river locations in Malaysia are analyzed. The methodology involves a series of statistical analyses including assumption testing (Shapiro–Wilk and Breusch–Pagan tests), diagnostic evaluations, feature importance analysis, and principal component analysis (PCA). Decision tree regression (DTR) and autoregressive integrated moving average (ARIMA) are employed for regression, while random forest is used for classification. Learning curve analysis is conducted to evaluate model performance and generalization. The results indicate that dissolved oxygen (DO) and ammoniacal nitrogen (AN) are the most influential parameters, with normalized importance scores of 1.000 and 0.565, respectively. The Breusch–Pagan test identifies significant heteroscedasticity (p-value = 3.138e−115), while the Shapiro–Wilk test confirms non-normality (p-value = 0.0). PCA effectively reduces dimensionality while preserving 95% of dataset variance, optimizing computational efficiency. Among the regression models, ARIMA demonstrates better predictive accuracy than DTR. Meanwhile, random forest achieves high classification performance and shows strong generalization capability with increasing training data. Learning curve analysis reveals overfitting in the regression model, suggesting the need for hyperparameter tuning, while the classification model demonstrates improved generalization with additional training data. Strong correlations among key parameters indicate potential multicollinearity, emphasizing the need for careful feature selection. These findings highlight the synergy between statistical pre-processing and machine learning, offering a more accurate and efficient approach to water quality prediction for informed environmental policy and real-time monitoring systems. Full article
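The "keep 95% of variance" PCA step the abstract mentions can be sketched via the eigendecomposition of the covariance matrix; this is a generic implementation, not the study's code, and the 200-sample two-feature dataset below is synthetic.

```python
import numpy as np

def pca_95(X, var_target=0.95):
    """Project X onto the fewest principal components whose eigenvalues
    retain `var_target` of total variance."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)            # ascending eigenvalues
    order = np.argsort(vals)[::-1]              # sort descending
    vals, vecs = vals[order], vecs[:, order]
    ratio = np.cumsum(vals) / vals.sum()
    k = int(np.searchsorted(ratio, var_target)) + 1
    return Xc @ vecs[:, :k], k

# synthetic data: one dominant direction plus tiny noise in a second feature
rng = np.random.default_rng(0)
t = rng.normal(size=200)
X = np.column_stack([t, 0.01 * rng.normal(size=200)])
Z, k = pca_95(X)
```

On correlated water-quality parameters (the multicollinearity the abstract flags), this projection is what trims redundant dimensions before regression.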

26 pages, 2638 KiB  
Article
How Explainable Really Is AI? Benchmarking Explainable AI
by Giacomo Bergami and Oliver Robert Fox
Logics 2025, 3(3), 9; https://doi.org/10.3390/logics3030009 - 6 Aug 2025
Viewed by 230
Abstract
This work contextualizes the possibility of deriving a unifying artificial intelligence framework by walking in the footsteps of General, Explainable, and Verified Artificial Intelligence (GEVAI): by considering explainability not only at the level of the results produced by a specification but also considering the explicability of the inference process as well as the one related to the data processing step, we can not only ensure human explainability of the process leading to the ultimate results but also mitigate and minimize machine faults leading to incorrect results. This, on the other hand, requires the adoption of automated verification processes beyond system fine-tuning, which are essentially relevant in a more interconnected world. The challenges related to full automation of a data processing pipeline, mostly requiring human-in-the-loop approaches, forces us to tackle the framework from a different perspective: while proposing a preliminary implementation of GEVAI mainly used as an AI test-bed having different state-of-the-art AI algorithms interconnected, we propose two other data processing pipelines, LaSSI and EMeriTAte+DF, being a specific instantiation of GEVAI for solving specific problems (Natural Language Processing, and Multivariate Time Series Classifications). Preliminary results from our ongoing work strengthen the position of the proposed framework by showcasing it as a viable path to improve current state-of-the-art AI algorithms. Full article

25 pages, 1751 KiB  
Review
Large Language Models for Adverse Drug Events: A Clinical Perspective
by Md Muntasir Zitu, Dwight Owen, Ashish Manne, Ping Wei and Lang Li
J. Clin. Med. 2025, 14(15), 5490; https://doi.org/10.3390/jcm14155490 - 4 Aug 2025
Viewed by 554
Abstract
Adverse drug events (ADEs) significantly impact patient safety and health outcomes. Manual ADE detection from clinical narratives is time-consuming, labor-intensive, and costly. Recent advancements in large language models (LLMs), including transformer-based architectures such as Bidirectional Encoder Representations from Transformers (BERT) and Generative Pretrained Transformer (GPT) series, offer promising methods for automating ADE extraction from clinical data. These models have been applied to various aspects of pharmacovigilance and clinical decision support, demonstrating potential in extracting ADE-related information from real-world clinical data. Additionally, chatbot-assisted systems have been explored as tools in clinical management, aiding in medication adherence, patient engagement, and symptom monitoring. This narrative review synthesizes the current state of LLMs in ADE detection from a clinical perspective, organizing studies into categories such as human-facing decision support tools, immune-related ADE detection, cancer-related and non-cancer-related ADE surveillance, and personalized decision support systems. In total, 39 articles were included in this review. Across domains, LLM-driven methods have demonstrated promising performances, often outperforming traditional approaches. However, critical limitations persist, such as domain-specific variability in model performance, interpretability challenges, data quality and privacy concerns, and infrastructure requirements. By addressing these challenges, LLM-based ADE detection could enhance pharmacovigilance practices, improve patient safety outcomes, and optimize clinical workflows. Full article
(This article belongs to the Section Pharmacology)

26 pages, 4116 KiB  
Article
Robust Optimal Operation of Smart Microgrid Considering Source–Load Uncertainty
by Zejian Qiu, Zhuowen Zhu, Lili Yu, Zhanyuan Han, Weitao Shao, Kuan Zhang and Yinfeng Ma
Processes 2025, 13(8), 2458; https://doi.org/10.3390/pr13082458 - 4 Aug 2025
Viewed by 475
Abstract
The uncertainties arising from high renewable energy penetration on both the generation and demand sides pose significant challenges to distribution network security. Smart microgrids are considered an effective way to solve this problem. Existing studies exhibit limitations in prediction accuracy, Alternating Current (AC) power flow modeling, and integration with optimization frameworks. This paper proposes a closed-loop technical framework combining high-confidence interval prediction, second-order cone convex relaxation, and robust optimization to facilitate renewable energy integration in distribution networks via smart microgrid technology. First, a hybrid prediction model integrating Variational Mode Decomposition (VMD), Long Short-Term Memory (LSTM), and Quantile Regression (QR) is designed to extract multi-frequency characteristics of time-series data, generating adaptive prediction intervals that accommodate individualized decision-making preferences. Second, a second-order cone relaxation method transforms the AC power flow optimization problem into a mixed-integer second-order cone programming (MISOCP) model. Finally, a robust optimization method considering source–load uncertainties is developed. Case studies demonstrate that the proposed approach reduces prediction errors by 21.15%, decreases node voltage fluctuations by 16.71%, and reduces voltage deviation at maximum offset nodes by 17.36%. This framework significantly mitigates voltage violation risks in distribution networks with large-scale grid-connected photovoltaic systems. Full article
(This article belongs to the Special Issue Applications of Smart Microgrids in Renewable Energy Development)
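The Quantile Regression component of the VMD-LSTM-QR forecaster is trained by minimizing the pinball loss, which can be written out directly. This is the standard loss, not code from the paper, and the sample values are illustrative.

```python
def pinball_loss(y_true, y_pred, q):
    """Average pinball (quantile) loss. Minimizing it at q=0.05 and q=0.95
    trains the lower and upper bounds of a 90% prediction interval, which
    is the role QR plays in interval forecasting pipelines."""
    total = 0.0
    for yt, yp in zip(y_true, y_pred):
        e = yt - yp
        total += q * e if e >= 0 else (q - 1) * e
    return total / len(y_true)
```

The asymmetry is the point: at q = 0.9 an under-prediction costs nine times an equal over-prediction, pushing the fitted curve up toward the 90th percentile and so producing the adaptive intervals the abstract describes.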
29 pages, 9514 KiB  
Article
Kennaugh Elements Allow Early Detection of Bark Beetle Infestation in Temperate Forests Using Sentinel-1 Data
by Christine Hechtl, Sarah Hauser, Andreas Schmitt, Marco Heurich and Anna Wendleder
Forests 2025, 16(8), 1272; https://doi.org/10.3390/f16081272 - 3 Aug 2025
Abstract
Climate change is generally having a negative impact on forest health by inducing drought stress and favouring the spread of pest species, such as bark beetles. The terrestrial monitoring of bark beetle infestation is very time-consuming, especially in the early stages, and therefore not feasible for extensive areas, emphasising the need for a comprehensive approach based on remote sensing. Although numerous studies have researched the use of optical data for this task, radar data remains comparatively underexplored. Therefore, this study uses the weekly and cloud-free acquisitions of Sentinel-1 in the Bavarian Forest National Park. Time series analysis within a Multi-SAR framework using Random Forest enables the monitoring of moisture content loss and, consequently, the assessment of tree vitality, which is crucial for the detection of stress conditions conducive to bark beetle outbreaks. High accuracies are achieved in predicting future bark beetle infestation (R² of 0.83–0.89). These results demonstrate that forest vitality trends ranging from healthy to bark beetle-affected states can be mapped, supporting early intervention strategies. The standard deviation of 0.44 to 0.76 years indicates that the model deviates on average by half a year, mainly due to the uncertainty in the reference data. This temporal uncertainty is acceptable, as half a year provides a sufficient window to identify stressed forest areas and implement targeted management actions before bark beetle damage occurs. The successful application of this technique to extensive test sites in the state of North Rhine-Westphalia proves its transferability. For the first time, the results clearly demonstrate the expected relationship between radar backscatter expressed in the Kennaugh elements K0 and K1 and bark beetle infestation, thereby providing an opportunity for the continuous and cost-effective monitoring of forest health from space. Full article
(This article belongs to the Section Forest Health)
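For dual-pol Sentinel-1 (VV/VH) intensity data, the Kennaugh elements K0 and K1 reduce, up to a scale convention, to the total backscattered intensity and the co-/cross-pol intensity difference. A hedged numpy sketch of these two elements plus a per-pixel temporal slope of the kind a Random Forest could use as a drying-trend feature; the 0.5 scaling and all array names are assumptions of this sketch, and conventions vary between Multi-SAR implementations:

```python
import numpy as np

def kennaugh_k0_k1(i_vv, i_vh):
    # Dual-pol Kennaugh elements (one common convention): K0 is the total
    # intensity, K1 the difference between co- and cross-pol intensities.
    i_vv = np.asarray(i_vv, dtype=float)
    i_vh = np.asarray(i_vh, dtype=float)
    return 0.5 * (i_vv + i_vh), 0.5 * (i_vv - i_vh)

def per_pixel_trend(stack, t):
    # Fit a linear slope per pixel across acquisition dates: a declining
    # K0/K1 trend is the moisture-loss signal relevant to tree vitality.
    # stack has shape (n_dates, rows, cols); t holds acquisition times.
    n_dates = stack.shape[0]
    flat = stack.reshape(n_dates, -1)       # one column per pixel
    slopes = np.polyfit(t, flat, 1)[0]      # degree-1 fit, keep the slope
    return slopes.reshape(stack.shape[1:])
```

Stacking such per-pixel trend features over the weekly time series gives the kind of tabular input a Random Forest regressor can map to an infestation date.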
27 pages, 4742 KiB  
Article
Modeling and Generating Extreme Fluctuations in Time Series with a Multilayer Linear Response Model
by Yusuke Naritomi, Tetsuya Takaishi and Takanori Adachi
Entropy 2025, 27(8), 823; https://doi.org/10.3390/e27080823 - 3 Aug 2025
Abstract
A multilayer linear response model (MLRM) is proposed to generate time-series data based on linear response theory. The proposed MLRM is designed to generate data for anomalous dynamics by extending the conventional single-layer linear response model (SLRM) into multiple layers. While the SLRM is a linear equation with respect to external forces, the MLRM introduces nonlinear interactions, enabling the generation of a wider range of dynamics. The MLRM is applicable to various fields, such as finance, as it does not rely on machine learning techniques and maintains interpretability. We investigated whether the MLRM could generate anomalous dynamics, such as those observed during the coronavirus disease 2019 (COVID-19) pandemic, using pre-pandemic data. Furthermore, an analysis of the log returns and realized volatility derived from the MLRM-generated data demonstrated that both exhibited heavy-tailed characteristics, consistent with empirical observations. These results indicate that the MLRM can effectively reproduce the extreme fluctuations and tail behavior seen during high-volatility periods. Full article
(This article belongs to the Section Complexity)
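In the single-layer picture, the response is a causal convolution of a kernel with the external force; stacking layers with a nonlinear link in between is what lets a multilayer variant generate dynamics a strictly linear model cannot. A minimal numpy sketch of both ideas; the tanh inter-layer link and the kernels are illustrative assumptions of this sketch, not the paper's actual MLRM coupling:

```python
import numpy as np

def linear_response(force, kernel):
    # Single-layer linear response (SLRM-style): the output at time t is
    # the causal convolution of the external force with a response kernel.
    force = np.asarray(force, dtype=float)
    out = np.zeros_like(force)
    for t in range(len(force)):
        for s in range(min(t + 1, len(kernel))):
            out[t] += kernel[s] * force[t - s]
    return out

def multilayer_response(force, kernels):
    # Illustrative multilayer stack: each layer responds linearly to the
    # previous layer's output, with a nonlinear link (tanh, chosen only
    # for this sketch) between layers. This shows how stacking breaks
    # strict linearity in the external force.
    x = np.asarray(force, dtype=float)
    for k in kernels:
        x = np.tanh(linear_response(x, k))
    return x
```

With a delta kernel the single layer reproduces the force exactly; once a second layer and the nonlinear link are added, doubling the force no longer doubles the output, which is the qualitative mechanism behind the heavier-tailed dynamics described above.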