Search Results (1,434)

Search Parameters:
Keywords = probabilistic information

23 pages, 1425 KB  
Article
TPP-TimeNet: A Time-Aware AI Framework for Robust Abnormality Detection in Bioprocess Monitoring
by Hye-Kyeong Ko
Appl. Sci. 2026, 16(7), 3295; https://doi.org/10.3390/app16073295 (registering DOI) - 28 Mar 2026
Abstract
Temporal monitoring of bioprocesses is inherently complex because process variables do not evolve independently over time, and their interpretation changes as the reaction progresses. In many existing abnormality detection methods, sensor signals are analyzed at isolated time points or temporal characteristics are only weakly reflected through model structures. As a result, such approaches struggle to explain or detect abnormal behavior that emerges differently across reaction states. This study proposes TPP-TimeNet, a time-aware artificial intelligence framework developed to improve abnormality detection in bioprocess monitoring. Unlike conventional methods, the proposed framework explicitly incorporates reaction time as contextual information. Multivariate process signals are reorganized into sliding windows that reflect reaction-state transitions rather than uniform time segmentation. Temporal behavior inside each window is captured using a sequential encoding model, and reaction-state information is subsequently integrated to form state-dependent representations. Through this design, the model can distinguish between temporal patterns that are similar in shape but occur at different points in the reaction timeline. This capability leads to improved sensitivity to abnormal events that may otherwise remain undetected. Abnormality is evaluated at the window level using a probabilistic scoring scheme with a fixed threshold, enabling consistent and reproducible decision-making. The performance of TPP-TimeNet was evaluated using publicly available process control datasets from Kaggle. The datasets were reinterpreted in a bioprocess context by mapping variables such as temperature, pH, and pressure. Experimental results show that the proposed method outperforms traditional machine learning models as well as deep learning approaches that focus only on temporal features, achieving higher accuracy, sensitivity, and F1-score. 
These findings suggest that incorporating explicit reaction-state awareness is essential for effective abnormality detection in bioprocess monitoring systems. Full article
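As a rough illustration of the window-level scoring idea described above, the sketch below reorganizes a signal into sliding windows and flags any window whose average negative Gaussian log-likelihood exceeds a fixed threshold. All names, the Gaussian likelihood model, and the threshold value are illustrative assumptions; this is not the TPP-TimeNet implementation, which additionally conditions on reaction state.

```python
import math

def sliding_windows(series, size, step):
    """Reorganize a 1-D signal into overlapping windows."""
    return [series[i:i + size] for i in range(0, len(series) - size + 1, step)]

def window_score(window, mean=0.0, std=1.0):
    """Average negative Gaussian log-likelihood of a window: higher = more abnormal."""
    const = math.log(std * math.sqrt(2.0 * math.pi))
    return sum(0.5 * ((x - mean) / std) ** 2 + const for x in window) / len(window)

def detect(series, size=4, step=2, threshold=2.0, mean=0.0, std=1.0):
    """Window-level abnormality decisions under a fixed probabilistic threshold."""
    return [window_score(w, mean, std) > threshold
            for w in sliding_windows(series, size, step)]
```

In the paper's setting the windows follow reaction-state transitions rather than this uniform segmentation, and the score comes from a learned sequential encoder rather than a fixed Gaussian.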
27 pages, 5008 KB  
Article
Unified Multiscale and Explainable Machine Learning Framework for Wear-Regime Transitions in MWCNT and Nanoclay-Reinforced Sustainable Bio-Based Epoxy Composites
by Manjodh Kaur, Pavan Hiremath, Dundesh S. Chiniwar, Bhagyajyothi Rao, Krishnamurthy D. Ambiger, Arunkumar H. S., P. Krishnananda Rao and Muralidhar Nagarajaiah
J. Compos. Sci. 2026, 10(4), 186; https://doi.org/10.3390/jcs10040186 (registering DOI) - 28 Mar 2026
Abstract
This study develops a unified multiscale–machine learning framework to interpret and predict thermo-mechanical wear regime transitions in MWCNT- and nanoclay-reinforced bio-based epoxy composites. A physics-informed master wear formulation integrating real contact mechanics, geometry-dependent shear transfer, interfacial adhesion energetics, and fracture-controlled matrix detachment was combined with interpretable machine learning analytics on a unified tribological dataset. In the CNT system, increasing loading from 0.1 to 0.4 wt.% enhanced interfacial adhesion energy density from 0.00813 to 0.01906 J/m², resulting in a monotonic reduction in the wear rate from 0.00918 to 0.00613 mm³/N·m (~33% reduction). In contrast, nanoclay exhibited an optimum behavior, with minimum wear at 0.25 wt.% (0.000093 mm³/N·m; 7.9% reduction vs. the neat clay baseline), followed by deterioration at higher loading due to dispersion loss. The unified probabilistic regime classification of low-wear conditions (k < 0.007 mm³/N·m) achieved an ROC AUC of 0.9256 and a balanced accuracy of 94.3%, with thermo-mechanical severity identified as the dominant regime-switching driver. Reinforcement identity significantly modulated regime stability, confirming distinct shear-transfer (carbon nanotube, CNT) and confinement/tribofilm (clay) mechanisms within a common mathematical framework. By enabling the durability-oriented design of bio-based tribological systems and extending component service life through predictive stability mapping, this work contributes to resource-efficient materials engineering and reduced lifecycle waste, supporting Sustainable Development Goals (SDGs) 9 (Industry, Innovation and Infrastructure), 12 (Responsible Consumption and Production), and 13 (Climate Action). Full article
(This article belongs to the Special Issue Sustainable Biocomposites, 3rd Edition)

20 pages, 1191 KB  
Article
Bridging the Semantic Gap in 5G: A Hybrid RAG Framework for Dual-Domain Understanding of O-RAN Standards and srsRAN Implementation
by Yedil Nurakhov, Nurislam Kassymbek, Duman Marlambekov, Aksultan Mukhanbet and Timur Imankulov
Appl. Sci. 2026, 16(7), 3275; https://doi.org/10.3390/app16073275 (registering DOI) - 28 Mar 2026
Abstract
The rapid evolution of the Open Radio Access Network (O-RAN) architecture and the exponential growth in specification complexity create significant barriers for researchers translating 5G standards into practical implementations. Existing evaluation frameworks for large language models, such as ORAN-Bench-13K, focus predominantly on the theoretical comprehension of regulatory documents while neglecting the critical aspect of software execution. This disparity results in a profound semantic gap, defined here as the structural and conceptual misalignment between abstract normative requirements and their concrete realization in the source code of open platforms like srsRAN. To bridge this divide and enable advanced cognitive reasoning, this paper presents a Hybrid Retrieval-Augmented Generation (RAG) framework designed to unify two heterogeneous knowledge domains: the O-RAN/3GPP specification corpus and the srsRAN C++ codebase. The proposed architecture leverages a hierarchical Parent–Child Chunking strategy to preserve the structural integrity of complex code and normative protocols. Additionally, it introduces a probabilistic Semantic Query Routing mechanism that dynamically selects the relevant context domain based on query intent. This routing actively mitigates semantic interference—a phenomenon where merging conflicting cross-domain terminology introduces informational noise, which our baseline tests showed degrades response accuracy by 4.7%. Empirical evaluation demonstrates that the hybrid approach successfully overcomes this, achieving an overall accuracy of 76.70% and outperforming the standard RAG baseline of 72.00%. Furthermore, system performance analysis reveals that effective context filtering reduces the average response generation latency to 3.47 s, compared to 3.73 s for traditional RAG methods, rendering the framework highly suitable for real-time telecommunications engineering tasks. Full article
(This article belongs to the Section Computing and Artificial Intelligence)
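A query router of the kind this abstract describes can be sketched as a cosine-similarity comparison against per-domain embedding centroids, with a fallback when the query is ambiguous. The function names, the centroid representation, and the margin value are illustrative assumptions, a simplified stand-in for the paper's probabilistic Semantic Query Routing; the embedding step is assumed to happen elsewhere.

```python
def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) ** 0.5) * (sum(b * b for b in v) ** 0.5)
    return num / den if den else 0.0

def route(query_vec, spec_centroid, code_centroid, margin=0.05):
    """Pick a single knowledge domain when one clearly wins; retrieve from
    both only when scores are within the ambiguity margin, limiting the
    cross-domain semantic interference described in the abstract."""
    s_spec = cosine(query_vec, spec_centroid)
    s_code = cosine(query_vec, code_centroid)
    if abs(s_spec - s_code) < margin:
        return "both"
    return "spec" if s_spec > s_code else "code"
```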

28 pages, 657 KB  
Article
An Uncertainty-Aware Temporal Transformer for Probabilistic Interval Modeling in Wind Power Forecasting
by Shengshun Sun, Meitong Chen, Mafangzhou Mo, Xu Yan, Ziyu Xiong, Yang Hu and Yan Zhan
Sensors 2026, 26(7), 2072; https://doi.org/10.3390/s26072072 - 26 Mar 2026
Abstract
Under high renewable energy penetration, wind power forecasting faces pronounced challenges due to strong randomness and uncertainty, making conventional point-forecast-centric paradigms insufficient for risk-aware and reliable power system scheduling. An uncertainty-aware temporal transformer framework for wind power forecasting is presented, integrating probabilistic modeling with deep temporal representation learning to jointly optimize prediction accuracy and uncertainty characterization. Crucially, rather than treating uncertainty quantification merely as a post-processing step, the central conceptual contribution lies in modularizing uncertainty directly within the attention mechanism. A probability-driven temporal attention mechanism is incorporated at the encoding stage to emphasize high-variability and high-risk time slices during feature aggregation, while a multi-quantile output and interval modeling strategy is adopted at the prediction stage to directly learn the conditional distribution of wind power, enabling simultaneous point and interval forecasts with statistical confidence. Extensive experiments on multiple public wind power datasets demonstrate that the proposed method consistently outperforms traditional statistical models, deep temporal models, and deterministic transformers, as validated by formal statistical significance testing. Specifically, the method achieves an MAE of 0.089, an RMSE of 0.132, and a MAPE of 10.84% on the test set, corresponding to reductions of approximately 8–10% relative to the deterministic transformer. In uncertainty evaluation, a PICP of 0.91 is attained while compressing the MPIW to 0.221 and reducing the CWC to 0.241, indicating a favorable balance between coverage reliability and interval compactness. Compared with mainstream probabilistic forecasting methods, the model further reduces RMSE while maintaining coverage levels close to the 90% target, effectively mitigating excessive interval conservatism.
Moreover, by adaptively generating heteroscedastic intervals that widen during high-volatility events and narrow under stable conditions, the model achieves a highly focused and effective capture of critical uncertainty information. Full article
(This article belongs to the Special Issue Artificial Intelligence-Driven Sensing)
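The interval-quality metrics named in this abstract have standard definitions: the pinball (quantile) loss trains multi-quantile outputs, PICP is the fraction of observations falling inside their intervals, and MPIW is the mean interval width. The sketch below is a generic stdlib implementation of those definitions, not the authors' code.

```python
def pinball_loss(y, q_pred, tau):
    """Quantile (pinball) loss for a single observation at quantile level tau."""
    diff = y - q_pred
    return tau * diff if diff >= 0 else (tau - 1.0) * diff

def picp(y_true, lower, upper):
    """Prediction interval coverage probability: fraction of points inside the interval."""
    hits = sum(1 for y, lo, hi in zip(y_true, lower, upper) if lo <= y <= hi)
    return hits / len(y_true)

def mpiw(lower, upper):
    """Mean prediction interval width."""
    return sum(hi - lo for lo, hi in zip(lower, upper)) / len(lower)
```

A PICP near the nominal 90% target combined with a small MPIW is exactly the coverage-versus-compactness balance the abstract reports.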

25 pages, 2296 KB  
Article
Land-Use and Flood Risk Assessment Under Uncertainty: A Monte Carlo Approach in Hunan Province, China
by Qiong Li, Xinying Huang, Fei Pan, Qiang Hu and Xinran Xu
Land 2026, 15(4), 541; https://doi.org/10.3390/land15040541 - 26 Mar 2026
Abstract
Climate change and rapid urbanization are intensifying flood risks in China, particularly in regions with complex terrain and dense populations. Traditional risk assessment methods often lack the flexibility to handle uncertainties in multi-dimensional risk systems. This study proposes a probabilistic flood risk assessment framework integrating Monte Carlo simulation with a composite indicator system from the perspective of disaster system theory. Taking Hunan Province as a case study, we constructed a hierarchical indicator system encompassing environmental susceptibility, hazard intensity, exposure vulnerability, and mitigation capacity. The analytic hierarchy process (AHP) and coefficient of variation (CV) methods were combined for indicator weighting, and Monte Carlo simulation was employed to quantify uncertainties and classify risk levels. Results reveal significant spatial heterogeneity in flood risk across the province, with high-risk areas concentrated in regions exhibiting intense rainfall, dense river networks, and insufficient mitigation infrastructure. The study provides a transferable, data-driven approach for spatially explicit flood risk zoning, offering evidence-based insights for land-use planning, resilient infrastructure development, and sustainable flood governance. This research contributes to the integration of probabilistic modeling into land system science, supporting disaster risk reduction and climate adaptation strategies aligned with SDG 11. This study also provides policy-relevant insights for regional flood governance by supporting risk-informed land-use planning, targeted infrastructure investment, and adaptive flood management strategies, thereby contributing to more resilient and sustainable land system development under increasing climate uncertainty. Full article
(This article belongs to the Section Land Systems and Global Change)
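The core mechanism here, propagating indicator uncertainty through a weighted composite index by Monte Carlo simulation, can be sketched in a few lines. The indicator distributions (independent normals), weights, and summary quantiles below are illustrative assumptions; the paper derives its weights from AHP and the coefficient of variation.

```python
import random

def composite_risk(indicators, weights):
    """Weighted linear aggregation of normalized risk indicators."""
    return sum(w * x for w, x in zip(weights, indicators))

def monte_carlo_risk(means, sds, weights, n=10000, seed=42):
    """Sample uncertain indicators, aggregate each draw, and summarize the
    resulting risk distribution by its median and 95th percentile."""
    rng = random.Random(seed)
    samples = sorted(
        composite_risk([rng.gauss(m, s) for m, s in zip(means, sds)], weights)
        for _ in range(n)
    )
    return {"median": samples[n // 2], "p95": samples[int(0.95 * n)]}
```

Risk-level classification then reduces to comparing these per-cell distributional summaries against thresholds, rather than a single deterministic score.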

19 pages, 969 KB  
Article
Media Narratives and the Construction of Meaning in Times of War: Evidence from the MeInWar Project
by Patrícia Silveira, Clarisse Pessôa and Simone Petrella
Youth 2026, 6(2), 39; https://doi.org/10.3390/youth6020039 - 25 Mar 2026
Abstract
Armed conflicts are at the epicentre of an information war, amplified by false claims about the motivations of the conflicts and refugees. The spread of narratives, especially in digital media, challenges the European Union to implement effective strategies to combat misinformation and to adopt measures to scrutinise and hold the main communication channels accountable, in order to prevent hostile narratives from influencing public opinion and political decision-makers. In this context, this article seeks to analyse the implications of media discourses and misinformation in the development of social representations about the Russian–Ukrainian war and refugees, as well as the use of social networks by individuals to share this type of content. The research is based on an exploratory study as part of the R&D Project MeInWar—Study on the media and social representations of the Russian-Ukrainian conflict, funded by Europeia University. The study employed a survey method and an online questionnaire applied to a non-probabilistic convenience sample of 222 individuals aged between 18 and 38. The results revealed that media narratives influence attitudes towards refugees and migration policies, and it is clear that factors such as age and gender have an impact on content-sharing practices and the motivations behind them. Full article

22 pages, 6206 KB  
Article
Parameter Estimation and Interval Assessment of the Collapse Capacity of Viscous-Damped Structures Under Degradation and Partial Failure Scenarios
by Xi Zhao and Wen Pan
Buildings 2026, 16(6), 1271; https://doi.org/10.3390/buildings16061271 - 23 Mar 2026
Abstract
In-service deviations of viscous dampers can reduce the collapse safety margin of viscous-damped structures under strong earthquakes. This study examines two representative mechanisms: global degradation of the damper group and local failure of a subset of dampers. Incremental dynamic analyses are conducted for five damper-state scenarios using the 22 far-field ground-motion records recommended by ATC-63. To support reliability-oriented, uncertainty-aware collapse-capacity comparison with limited records, three complementary probabilistic inference frameworks are developed: an event-based fragility model using binary collapse indicators, a drift-margin model leveraging continuous deformation information from non-collapse responses, and a fusion model that combines both sources via a weighted composite likelihood with fusion strength governed by the weight w. For each scenario, the capacity scale parameter μ_m is reported as IM50,m, and record-level bootstrap resampling is used to construct interval estimates. Multi-scenario effects are further summarized by the ensemble mean reduction b and inter-path dispersion σ_damper, offering compact measures of systematic shift and pathway-to-pathway variability. Results indicate a dominant systematic downward shift in median collapse capacity, with IM50,m reduced by approximately 2.4–2.9% overall, whereas differences among degradation pathways are secondary and bounded by the intervals. Scenario rankings remain consistent across the three frameworks; fusion outputs show weak sensitivity to w and yield tighter interval constraints on σ_damper than the event-only baseline. The resulting interval-based parameters enable risk- and reliability-informed interpretation of degradation effects and provide a consistent basis for uncertainty quantification in probabilistic performance comparisons across scenarios. Full article
(This article belongs to the Special Issue Reliability and Risk Assessment of Building Structures)
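Record-level bootstrap interval estimation, as used above for the capacity scale parameter, is a standard procedure: resample the per-record capacities with replacement and take percentiles of the resampled medians. The sketch below is a generic stdlib version under that textbook recipe; the resample count, seed, and percentile convention are illustrative assumptions, not the authors' settings.

```python
import random

def bootstrap_median_ci(record_capacities, n_boot=2000, alpha=0.10, seed=0):
    """Record-level bootstrap: resample per-record collapse capacities with
    replacement and return a (1 - alpha) percentile interval for the median."""
    rng = random.Random(seed)
    n = len(record_capacities)
    medians = []
    for _ in range(n_boot):
        resample = sorted(rng.choice(record_capacities) for _ in range(n))
        medians.append(resample[n // 2])
    medians.sort()
    lo = medians[int(alpha / 2 * n_boot)]
    hi = medians[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

With only 22 ground-motion records, resampling at the record level (rather than pooling all response samples) is what keeps record-to-record variability reflected in the interval widths.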

49 pages, 1088 KB  
Article
Correlation Coefficient-Based Group Decision-Making Approach Under Probabilistic Dual Hesitant Fuzzy Linguistic Environment to Resilient Supplier Selection
by Xiao-Wen Qi, Jun-Ling Zhang, Jun-Tao Lai and Chang-Yong Liang
Systems 2026, 14(3), 334; https://doi.org/10.3390/systems14030334 - 23 Mar 2026
Abstract
To tackle resilient supplier selection (RSS) under high uncertainty in resilient supply chain management, an effective correlation-coefficient-based multicriteria group decision-making (MCGDM) methodology is constructed. The major contribution of the present study is twofold. Firstly, given that extant criteria systems lack a sound theoretical basis, this paper establishes a capabilities-based analytical framework for the intensive evaluation of supplier resilience, drawing on the processual viewpoints of dynamic capabilities theory and risk management theory. Secondly, to empower the proposed correlation-coefficient-based MCGDM methodology, the probabilistic dual hesitant fuzzy uncertain unbalanced linguistic set (PDHF_UUBLS) is employed to capture hybrid uncertainties in the decision processes of RSS. Theoretically compliant correlation coefficients (CCs) for PDHF_UUBLS are then developed, including a statistics-based CC, an information energy-based CC, and their weighted versions. In particular, the information energy-based CCs overcome the limitations of statistics-based CCs in special cases and thus exhibit general applicability. In addition, a compatibility-based programming model is developed to objectively derive an unknown weighting vector for DMUs. Furthermore, illustrative case studies and comparative experiments verify the effectiveness and stability of the proposed methodology. Taken together, this paper addresses the growing demand for resilience building in supply chain management and presents an effective MCGDM methodology for handling the key problems of RSS. Full article
(This article belongs to the Section Systems Practice in Social Science)

44 pages, 4569 KB  
Article
LSTM-Based Fast Prediction of Seismic Response and Fragility for Bridge Pile-Group Foundations: A Data-Driven Design Approach
by Zhenfeng Han, Deming She and Jun Liu
Designs 2026, 10(2), 37; https://doi.org/10.3390/designs10020037 - 23 Mar 2026
Abstract
Rapid and accurate prediction of seismic response and fragility for bridge pile-group foundations (PGFs) is crucial for assessing seismic resilience. However, the high computational cost of traditional high-fidelity nonlinear analysis limits the application of probabilistic seismic risk analysis. To address this, an integrated deep learning framework is proposed that employs a unidirectional, multi-layer LSTM network for end-to-end prediction of structural responses directly from ground motions. The proposed model features two innovations. First, its multi-output capability enables simultaneous prediction of complete response time histories and peak values for key engineering demand parameters: bending moment, curvature, and pile cap displacement. Second, the network incorporates sliding time windows and residual connections to capture complex nonlinear soil–structure interaction. These predictions are integrated into a probabilistic seismic demand model to generate fragility curves. The framework is validated using a high-fidelity OpenSees model of a real bridge PGF subjected to 1000 ground motions. Results demonstrate the model's excellent predictive accuracy: for peak bending moment, the mean predicted-to-actual ratio ranges from 0.97 to 1.03, with standard deviation below 0.12; the derived fragility curves show excellent agreement with benchmarks, achieving an average R² of 0.985 across four damage states. More importantly, the framework reduces the time for a complete fragility assessment (200 incremental dynamic analyses) from approximately 12 h to about 1 s (a 40,000× speed-up), making data-driven rapid and large-scale seismic risk assessment a reality. The proposed framework provides engineers with a practical design tool for rapidly evaluating alternative foundation configurations and informing seismic design decisions, thereby integrating advanced data-driven methods directly into the engineering design workflow. Full article
(This article belongs to the Special Issue Intelligent Infrastructure and Construction in Civil Engineering)
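Fragility curves derived from a probabilistic seismic demand model conventionally take the lognormal form P(exceedance | IM) = Φ((ln IM − ln θ)/β), with median capacity θ and dispersion β. The stdlib sketch below evaluates that standard form (Φ via `math.erf`); the parameter values in the test are placeholders, not the paper's fitted values.

```python
import math

def lognormal_fragility(im, theta, beta):
    """Probability of exceeding a damage state at intensity measure `im`,
    under the standard lognormal fragility model with median capacity
    `theta` and logarithmic dispersion `beta`."""
    z = (math.log(im) - math.log(theta)) / beta
    # Standard normal CDF expressed through the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

By construction the curve passes through 0.5 at im = θ and steepens as β shrinks, which is why (θ, β) pairs per damage state are the natural outputs of the demand-model fit.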

27 pages, 590 KB  
Perspective
Machine Unlearning: A Perspective, Taxonomy, and Benchmark Evaluation
by Cristian Cosentino, Simone Gatto, Pietro Liò and Fabrizio Marozzo
Future Internet 2026, 18(3), 174; https://doi.org/10.3390/fi18030174 - 23 Mar 2026
Abstract
Machine Learning (ML) models trained on large-scale datasets learn useful predictive patterns, but they may also memorize undesired information, leading to risks such as information leakage, bias, copyright violations, and privacy attacks. As these models are increasingly deployed in real-world and regulated settings, the consequences of such memorization become practical and high-stakes, reinforced by data-protection frameworks that grant individuals a Right to be Forgotten (e.g., the GDPR). Simply removing a record from the training dataset does not guarantee the elimination of its influence from the model, while retrain-from-scratch procedures are often prohibitive for modern architectures, including Transformers and Large Language Models (LLMs). In this work, we provide a perspective on Machine Unlearning (MU) in supervised learning settings, with a particular focus on Natural Language Processing (NLP) scenarios, grounded in a PRISMA-driven systematic review. We propose a multi-level taxonomy that organizes MU techniques along practical and conceptual dimensions, including exactness (exact versus approximate), unlearning granularity, guarantees, and application constraints. To complement this perspective, we run an illustrative benchmark evaluation using a standardized unlearning protocol on DistilBERT trained on a public corpus of news headlines for topic classification, contrasting the retraining gold standard with representative design-for-unlearning and approximate post hoc techniques. For completeness, we also report two oracle-assisted upper-bound baselines (distillation and scrubbing) that rely on a clean retrained reference model, and we account for their incremental cost separately. Our analysis jointly considers model utility, probabilistic quality, forgetting and privacy indicators, as well as computational efficiency. 
The results highlight systematic trade-offs between accuracy, computational cost, and removal effectiveness, providing practical guidance for selecting machine unlearning techniques in realistic deployment scenarios. Full article

31 pages, 7554 KB  
Article
Credible Reserve Assessment Method for Virtual Power Plants Considering User-Bounded Rationality Response
by Ting Yang, Qi Cheng, Butian Chen, Danhong Lu, Han Wu and Yiming Zhu
Sustainability 2026, 18(6), 3130; https://doi.org/10.3390/su18063130 - 23 Mar 2026
Abstract
Virtual power plants (VPPs) aggregate flexible resources, such as distributed photovoltaics (PV), energy storage, and flexible loads, to provide substantial reserve capacity for grid operation. However, the combined effects of renewable energy output uncertainty, load forecast errors, and user-bounded rationality responses lead to significant errors in traditional deterministic VPP reserve assessment methods, severely affecting the balance between system supply and demand. To address this challenge, this paper proposes a credible reserve assessment method that accounts for user-bounded rationality. First, thermodynamic models with on–off constraints for air conditioning loads, energy feasible region, and power constraint models for electric vehicles (EVs) and energy storage systems (ESSs), as well as PV forecast error models are established to characterize physical reserve boundaries. Second, prospect theory is introduced to describe user-bounded rationality and a logit-based response probability model is developed. Monte Carlo sampling and kernel density estimation are employed to derive credible reserve sets under different confidence levels, achieving a probabilistic quantification of VPP reserve capacity distribution. Case studies demonstrate that the proposed method accurately characterizes the probabilistic distribution characteristics of VPP reserve provision under multiple uncertainties, providing comprehensive and reliable assessment information for power dispatching agencies. Full article
(This article belongs to the Special Issue Smart Grid Technology Contributing to Sustainable Energy Development)
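The final step described above, turning Monte Carlo samples of reserve capacity into credible sets at chosen confidence levels via kernel density estimation, can be sketched with stdlib tools. The Gaussian-kernel CDF and the percentile-based interval below are generic textbook forms; the bandwidth and confidence values are illustrative assumptions, not the paper's calibrated choices.

```python
import math

def gaussian_kde_cdf(x, samples, bandwidth):
    """CDF of a Gaussian kernel density estimate at x (stdlib-only sketch)."""
    return sum(
        0.5 * (1.0 + math.erf((x - s) / (bandwidth * math.sqrt(2.0))))
        for s in samples
    ) / len(samples)

def credible_interval(samples, confidence=0.90):
    """Empirical credible reserve set at the given confidence level
    (symmetric percentile method over Monte Carlo samples)."""
    xs = sorted(samples)
    n = len(xs)
    alpha = (1.0 - confidence) / 2.0
    lo = xs[round(alpha * n)]
    hi = xs[min(n - 1, round((1.0 - alpha) * n))]
    return lo, hi
```

Sweeping the confidence level then yields the nested family of credible reserve sets that the dispatcher can trade off against risk tolerance.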

19 pages, 10695 KB  
Article
Probabilistic Shaping-Assisted Bases Precoding in QAM Quantum Noise Stream Cipher
by Shuang Wei, Sheng Liu, Wei Wang, Chao Lei, Kongni Zhu, Mingrui Zhang, Yuang Li, Yunbo Li, Dong Wang, Dechao Zhang, Han Li, Yajie Li, Yongli Zhao and Jie Zhang
Photonics 2026, 13(3), 307; https://doi.org/10.3390/photonics13030307 - 23 Mar 2026
Abstract
We propose a probabilistic shaping-assisted base precoding quantum noise stream cipher (PSABP QNSC) scheme to effectively alleviate the encryption penalty in QAM QNSC systems. In contrast to the uniformly distributed bases adopted in traditional QNSC, Gaussian distributed bases can provide shaping gain. We theoretically analyze the underlying gain mechanism of Gaussian distributed bases in the PSABP QNSC scheme. Experimental results over 160 km reveal that the encryption penalties of QPSK and 16QAM are reduced by 0.44 dB and 0.27 dB in terms of OSNR. Moreover, the security is quantified through the number of masked signals as a primary key metric. To mitigate the impact of base precoding, we propose the effective bases and effective ciphertext symbol points to refine the security evaluation. In addition, the security is estimated in terms of mutual information leakage, with 2.2×10⁴ bits for QPSK and 1.85×10⁴ bits for 16QAM. The results indicate that the PSABP QNSC scheme provides effective protection against eavesdropping. Full article

21 pages, 1823 KB  
Article
Two-Stage Distributed Robust Air-Ground Cooperative Mission Planning: An Emergency Communication Solution for Addressing Probabilistic Uncertainty in Road Interruption
by Miao Miao, Wei Wang and Xiaokai Lian
Future Internet 2026, 18(3), 170; https://doi.org/10.3390/fi18030170 - 20 Mar 2026
Abstract
Earthquake disasters often cause communication base stations to fail, severely hindering rescue operations and information transmission. While traditional air-ground collaborative emergency communication systems can rapidly restore communications, they still face challenges such as the "time gap" caused by the endurance limitations of unmanned aerial vehicles (UAVs) and the "spatial blind spots" resulting from the uncertainty of road disruptions. These issues reduce the continuity and reliability of system services. To improve the robustness of coordinated air-ground deployment and path planning under uncertain road disruptions, this paper proposes a two-stage distributionally robust deployment and path planning (DRDPRP) method for fixed-wing UAVs and unmanned ground vehicles (UGVs) in post-disaster emergency communications. The method constructs a distributionally robust uncertainty set based on a probabilistic distance metric to characterize road disruption risks, and establishes a two-stage distributionally robust optimization model to jointly optimize the deployment and paths of fixed-wing UAVs and UGVs. It employs the Column and Constraint Generation (C&CG) algorithm as the solution framework, combined with branch-and-bound and local optimization strategies to enhance computational efficiency. Simulation results demonstrate that the method generates more robust collaborative deployment plans under road disruption uncertainty, thereby enhancing the continuity and reliability of post-disaster emergency communication systems.
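The distributionally robust idea can be sketched in miniature: over an ambiguity set of scenario distributions around a nominal road-disruption distribution, the planner guards against the worst-case expected cost. The sketch below uses a total-variation ball as the ambiguity set, where the worst case is reached by shifting probability mass from the cheapest scenarios to the most expensive one; the paper's model uses its own probabilistic distance metric and a full two-stage C&CG solution, so this is only a stand-in for the inner worst-case step.

```python
import numpy as np

def worst_case_expectation(p0, cost, eps):
    """Worst-case expected cost over a total-variation ball of radius
    `eps` around the nominal scenario distribution `p0`. The maximiser
    shifts probability mass (up to eps in total) from the cheapest
    scenarios onto the single most expensive one."""
    p = np.asarray(p0, dtype=float).copy()
    cost = np.asarray(cost, dtype=float)
    worst = int(np.argmax(cost))
    budget = eps
    for i in np.argsort(cost):          # take mass from cheapest scenarios first
        if i == worst or budget <= 0:
            continue
        moved = min(p[i], budget)
        p[i] -= moved
        p[worst] += moved
        budget -= moved
    return float(p @ cost)
```

With a nominal distribution [0.5, 0.3, 0.2] over three hypothetical disruption scenarios of cost [1, 2, 5] and radius 0.1, the worst-case expectation rises from 2.1 to 2.5; as the radius grows, it approaches the cost of the worst scenario.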
(This article belongs to the Section Internet of Things)
22 pages, 2677 KB  
Article
A Hybrid Interval Prediction Framework for Photovoltaic Power Prediction Using BiLSTM–Transformer and Adaptive Kernel Density Estimation
by Laiyuan Li and Zhibin Li
Appl. Sci. 2026, 16(6), 3023; https://doi.org/10.3390/app16063023 - 20 Mar 2026
Abstract
Photovoltaic (PV) power forecasting is strongly influenced by volatility, randomness, and changing meteorological conditions, while conventional point forecasting provides limited uncertainty information for engineering use. This study proposes a hybrid interval forecasting framework for PV prediction. Similar-day clustering first segments weather data into distinct scenarios (sunny, cloudy, and overcast) to reduce noise and redundant information within sequences, enhancing stability and providing a more refined feature space for deep learning. A BiLSTM–Transformer model is then used as the core forecaster, taking multiple meteorological variables as multi-feature time-series inputs: the BiLSTM captures bidirectional temporal dependencies, and the Transformer enhances long-range feature extraction via attention. To improve robustness and stability, the Alpha Evolution (AE) algorithm is applied for hyperparameter optimization, balancing global exploration and local refinement. For probabilistic forecasting, Adaptive Bandwidth Kernel Density Estimation (ABKDE) is employed to construct prediction intervals, where the local bandwidth is determined by minimizing a local error function to adapt to the data density and error distribution. Case studies on a full-year, 5-min-resolution dataset from the DKASC station demonstrate that the proposed AE-BiLSTM–Transformer achieves highly accurate point forecasts across diverse weather conditions, reducing RMSE by 81.85%, 76.99%, and 72.26% under sunny, cloudy, and overcast scenarios, respectively, compared to the baseline LSTM. ABKDE further produces reliable and compact intervals: at the 90% confidence level on sunny days, it achieves PICP = 0.921 with PINAW = 0.0378, reducing PINAW by 75.16% relative to conventional KDE while maintaining comparable coverage.
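The interval-construction step can be sketched generically: fit a variable-bandwidth KDE to the point-forecast errors and read the interval bounds off the mixture CDF. The sketch below substitutes Abramson's square-root rule for the local bandwidths, since the paper's actual ABKDE chooses them by minimizing a local error function; the synthetic error sample is also an assumption, not DKASC data.

```python
import numpy as np
from math import erf

def adaptive_kde_interval(errors, alpha=0.10, grid_n=800):
    """(1 - alpha) prediction interval for `errors` from an adaptive-
    bandwidth Gaussian KDE (Abramson's square-root rule)."""
    e = np.sort(np.asarray(errors, dtype=float))
    n = e.size
    h0 = 1.06 * e.std(ddof=1) * n ** (-0.2)          # Silverman pilot bandwidth
    # Pilot density at each sample, then Abramson local bandwidths.
    diff = (e[:, None] - e[None, :]) / h0
    pilot = np.exp(-0.5 * diff ** 2).sum(axis=1) / (n * h0 * np.sqrt(2 * np.pi))
    g = np.exp(np.mean(np.log(pilot)))               # geometric mean of pilot
    h = h0 * np.sqrt(g / pilot)                      # wider kernels where data are sparse
    # Mixture CDF of the per-sample Gaussian kernels on a grid, then invert.
    grid = np.linspace(e[0] - 3 * h.max(), e[-1] + 3 * h.max(), grid_n)
    z = (grid[:, None] - e[None, :]) / h[None, :]
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0))).mean(axis=1)
    lo = grid[np.searchsorted(cdf, alpha / 2)]
    hi = grid[np.searchsorted(cdf, 1 - alpha / 2)]
    return float(lo), float(hi)

# Demo on synthetic zero-mean forecast errors (illustrative only).
rng = np.random.default_rng(0)
err = rng.normal(0.0, 1.0, 400)
lo, hi = adaptive_kde_interval(err, alpha=0.10)
picp = float(np.mean((err >= lo) & (err <= hi)))     # empirical coverage
```

A metric pair like the paper's PICP/PINAW then falls out directly: `picp` is the fraction of errors inside the interval, and the normalized width is `(hi - lo)` divided by the error range.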
24 pages, 2012 KB  
Article
An Adaptive Consensus Model to Manage Non-Cooperative Behaviors in Large Group Decision-Making with Probabilistic Linguistic Information
by Xun Han, Xingrui Guan, Gang Chen, Jiangyue Fu and Xinchuan Liu
Mathematics 2026, 14(6), 1049; https://doi.org/10.3390/math14061049 - 20 Mar 2026
Abstract
To address challenges in complex group decision-making (GDM), specifically preference fuzziness, intricate subgroup segmentation, and non-cooperative behavior, this study proposes an adaptive consensus model based on probabilistic linguistic term sets (PLTSs). By integrating fuzzy C-means (FCM) clustering with a Gaussian mixture model (GMM), a fuzzy Gaussian mixture model (FGMM) is constructed to achieve soft segmentation of expert preference distributions. On this basis, an adaptive consensus feedback mechanism is developed that dynamically integrates interactive and automated adjustment strategies via multi-level consensus thresholds, thereby balancing decision efficiency and quality. To identify and control non-cooperative behaviors, a cooperation index and a three-tier management strategy, incorporating intra-group negotiation, weight penalties, and an exit-delegation mechanism, are introduced. In a case study on strategic decision-making for new energy vehicles (NEVs), the group consensus level increased from an initial 0.316 to 0.804 after four rounds of feedback iteration, reaching the preset threshold and verifying the effectiveness of the consensus mechanism. Compared with existing methods in the literature, the proposed framework achieves more comprehensive integration and innovation in four aspects: preference expression, clustering mechanism, consensus feedback, and behavior management.
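The consensus-level quantity driving the feedback loop can be illustrated in simplified form: with crisp preference matrices normalized to [0, 1], group consensus can be taken as one minus the mean deviation of each expert from the collective opinion, iterating feedback until it crosses a threshold such as the 0.804 reached in the case study. This is a toy stand-in; the paper measures consensus over PLTSs with a probabilistic linguistic distance.

```python
import numpy as np

def consensus_level(prefs):
    """Toy group-consensus measure: 1 minus the mean absolute deviation
    of each expert's preference matrix from the collective (average)
    matrix. Entries are assumed to lie in [0, 1]; the PLTS model in
    the article uses a probabilistic linguistic distance instead."""
    prefs = np.asarray(prefs, dtype=float)       # shape: (experts, m, n)
    collective = prefs.mean(axis=0)              # group opinion
    deviation = np.abs(prefs - collective).mean(axis=(1, 2))
    return float(1.0 - deviation.mean())
```

Experts with identical matrices yield a consensus of 1.0, and maximally opposed experts yield 0.5 under this measure, so a feedback round that nudges outlier matrices toward the collective matrix provably raises the score.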
(This article belongs to the Section D2: Operations Research and Fuzzy Decision Making)