Search Results (4,585)

Search Parameters:
Keywords = entropy generation

22 pages, 21906 KB  
Article
On Fractional Discrete-Time Power Systems: Chaos, Complexity and Control
by Omar Kahouli, Imane Zouak, Sulaiman Almohaimeed, Adel Ouannas, Lilia El Amraoui and Mohamed Ayari
Mathematics 2026, 14(8), 1354; https://doi.org/10.3390/math14081354 - 17 Apr 2026
Abstract
In this paper, based on the Caputo-like delta fractional difference operator, we present a fractional discrete model of a 4D power system. We extend the popular integer-order single-machine infinite-bus formulation to two fractional cases, one with commensurate (equal) fractional orders and another with incommensurate (unequal) orders. This extension captures long-memory effects in the dynamics and thus offers a consistent mathematical description of the nonlinear behavior of power systems. The orders of the fractional models are analyzed numerically. Using time-series evolution, phase-space plots, bifurcation maps, Lyapunov spectra, the 0–1 chaos test, and spectral entropy and C0 complexity metrics, we identify chaotic regimes. Additionally, techniques for controlling chaos are explored to stabilize and regulate the dynamics of the system. Both fractional formulations exhibit richer dynamical features than their integer counterparts, and in the incommensurate case the sensitivity to fractional-order variations is larger, generating complex nonlinear oscillations. The fractional discrete power system framework provides a new perspective for studying instability, the voltage collapse phenomenon, and chaotic oscillations in power engineering applications. Full article
(This article belongs to the Special Issue Mathematical Modeling and Control for Engineering Applications)
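
A quick way to see what the spectral-entropy indicator measures: normalize the signal's power spectrum into a probability distribution and take its Shannon entropy, so broadband (chaos-like) dynamics score near 1 and narrowband (periodic) dynamics score near 0. A minimal sketch of that metric (illustrative only, not the authors' implementation):

import numpy as np

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    psd = np.abs(np.fft.rfft(x - x.mean()))**2   # remove DC, take spectrum
    p = psd / psd.sum()
    p = p[p > 0]                                 # avoid log(0)
    return -(p * np.log(p)).sum() / np.log(len(psd))

t = np.linspace(0, 100, 4096)
print(spectral_entropy(np.sin(t)))                                        # low: narrowband
print(spectral_entropy(np.random.default_rng(0).standard_normal(4096)))   # high: broadband
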
20 pages, 718 KB  
Article
Robustness of Energy Delivery and Economic Sensitivity in Onshore and Offshore Wind Power
by Fernando M. Camilo, Paulo J. Santos and Armando J. Pires
Energies 2026, 19(8), 1951; https://doi.org/10.3390/en19081951 - 17 Apr 2026
Abstract
The increasing penetration of wind generation requires performance evaluation methods that extend beyond average annual energy production. Temporal delivery characteristics, such as monthly dispersion and exposure to low-production periods, can influence both technical robustness and economic sensitivity. Building upon a previously developed probabilistic and entropy-based assessment framework, this study evaluates the robustness of delivery-oriented performance metrics for onshore and offshore wind units under parametric and economic uncertainty. Using high-resolution operational data from four wind units (three onshore and one offshore), the analysis incorporates percentile sensitivity, threshold variation in low-production exposure, bootstrap-based uncertainty intervals, and Monte Carlo simulation of economic inputs including CAPEX, operation and maintenance costs, and discount rate. The results indicate that variations in percentile definitions and stochastic economic assumptions modify absolute performance values but do not substantially alter the relative positioning between offshore and onshore units. Averaged over 2022–2024, the analyzed offshore unit exhibited a lower monthly energy dispersion coefficient (CVE = 0.255) than the analyzed onshore units (CVE = 0.368), corresponding to an approximate 30% reduction in relative variability. The offshore unit also showed lower mean low-production exposure (LPE = 0.526 versus 0.581 for onshore units) and consistently lower amplification of robustness-adjusted LCOE under conservative delivery assumptions. These results indicate that the analyzed offshore unit retains stronger delivery robustness and lower economic sensitivity across the tested parameter ranges. The proposed robustness-validation framework complements conventional yield-based assessments and provides additional insight for risk-aware evaluation of wind generation assets in renewable-dominated power systems. Full article
(This article belongs to the Special Issue Recent Innovations in Offshore Wind Energy)
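
Both headline metrics are simple functions of a monthly energy series. A sketch under assumed definitions (CVE as the coefficient of variation of monthly energy, LPE as the share of months delivering below a fixed fraction of the mean; the paper's exact thresholds may differ):

import numpy as np

def cve(monthly_energy):
    """Monthly energy dispersion coefficient (coefficient of variation)."""
    e = np.asarray(monthly_energy, dtype=float)
    return e.std(ddof=1) / e.mean()

def lpe(monthly_energy, frac=0.7):
    """Low-production exposure: share of months below frac * mean energy.
    The threshold fraction is an assumption for illustration."""
    e = np.asarray(monthly_energy, dtype=float)
    return float((e < frac * e.mean()).mean())
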
27 pages, 1869 KB  
Article
NEF-DHR: A Non-Equivalent Functional Dynamic Heterogeneous Redundancy Architecture for Endogenous Safety and Security
by Bingbing Jiang, Yilin Kang and Hanzhi Cai
Entropy 2026, 28(4), 463; https://doi.org/10.3390/e28040463 - 17 Apr 2026
Abstract
Endogenous safety and security (ESS), which advocates for designing systems that are inherently safe and secure by nature, has emerged as a pivotal paradigm for addressing the inherent vulnerabilities of information systems. The Dynamic Heterogeneous Redundancy (DHR) architecture serves as its typical implementation by introducing dynamic, heterogeneous, redundant executors with equivalent function (EF) into the information system. However, the functional equivalence property explicitly connects the system’s output to that of the individual executors, thereby creating potential security risks that adversaries could exploit. In addition, EF-DHR faces an inherent contradiction between functional equivalence and heterogeneous implementations (HIS), leading to high engineering costs and limited applicability. To address these problems, this paper proposes the Non-Equivalent Functional DHR (NEF-DHR) architecture, leveraging function secret sharing (FSS) theory to replace EF executors with NEF components, which fundamentally eliminates the EF-HIS contradiction. Specifically, we propose the concept of “terminal executor output information entropy loss” to formalize the risk of output information interception by adversaries and theoretically prove that NEF-DHR improves unpredictability and resistance to attacks. Experimental results further validate that NEF-DHR exhibits lower error rates under various attack levels, with enhanced robustness and superior ESS performance. Additionally, we generalize the DHR architecture based on three core properties (indistinguishability, output recoverability, verification) and classify ESS into three types with corresponding DHR variants. This work advances the application of entropy theory in ESS and provides a novel entropy-enhanced solution for the large-scale deployment of DHR security systems. Full article
(This article belongs to the Section Complexity)
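
The core intuition — that a non-equivalent component's output should reveal nothing about the system output — can be illustrated with a toy 2-of-2 additive sharing over Z_256: each share alone is uniformly distributed (maximal entropy for an interceptor), yet the output is exactly recoverable. This sketches the secret-sharing idea only, not the paper's FSS construction:

import secrets

def share(byte_value):
    """Split one output byte into two additive shares over Z_256."""
    r = secrets.randbelow(256)
    return r, (byte_value - r) % 256

s1, s2 = share(0x5A)
assert (s1 + s2) % 256 == 0x5A   # output recoverability
# Either share alone is uniform on Z_256, so intercepting one component's
# output yields no information about the system output.
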
21 pages, 635 KB  
Article
Agentic Hallucination Risk Scoring for Medical LLMs via Uncertainty Quantification and Clinical Knowledge Injection
by Mayank Kapadia and Mohammad Masum
Algorithms 2026, 19(4), 315; https://doi.org/10.3390/a19040315 - 17 Apr 2026
Abstract
Large Language Models (LLMs) have witnessed significant adoption across numerous domains since 2020, but their proclivity to hallucinate creates unacceptable dangers in high-risk environments like healthcare, where wrong outputs can directly jeopardize human safety. While present systems focus on pre-generation mitigation strategies, they cannot ensure the safety of individual outputs during inference. We provide a post hoc Hallucination Risk Scoring (HRS) methodology that intercepts questionable outputs before they reach patients via an agentic pipeline. Given a medical question, a domain-specific LLM generates an initial response from which five complementary uncertainty signals are computed; these signals are then split between a decision layer that governs escalation and a guidance layer that directs clinical knowledge injection by a GPT. The framework is tested using three biomedical question-answering datasets of varying complexity: PubMedQA-Labeled, PubMedQA-Artificial, and BioASQ Task B. The results show up to a 38% safety increase at the most sensitive threshold configuration, zero deterioration across all experimental configurations enforced by the Revert Baseline method, and complexity-aware escalation rates that scale organically with dataset difficulty. Tunable thresholds allow physicians to calibrate system behavior based on deployment requirements, providing a practical safety–accuracy trade-off. Statistical analysis identifies entropy as the primary uncertainty signal separating escalated from non-escalated situations across all datasets. These findings provide a deployable, interpretable, and configurable post hoc safety paradigm for reliable medical AI implementation. Full article
(This article belongs to the Special Issue Evolution of Algorithms in the Era of Generative AI)
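
Entropy, the signal the study finds most discriminative, can be computed directly from a model's output logits. A minimal sketch of that one signal (the other four signals and the escalation logic are not reproduced here):

import numpy as np

def predictive_entropy(logits):
    """Mean Shannon entropy of next-token distributions.
    logits: array of shape (seq_len, vocab_size)."""
    z = logits - logits.max(axis=-1, keepdims=True)          # stable softmax
    p = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    return float(-(p * np.log(np.clip(p, 1e-12, None))).sum(axis=-1).mean())

print(predictive_entropy(np.zeros((1, 100))))                # uniform: ln(100) ~ 4.61
print(predictive_entropy(np.array([[10.0] + [0.0] * 99])))   # peaked: much smaller
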
22 pages, 3205 KB  
Article
Context-Responsive Building Footprint Generation via Conditional Inpainting Using Latent Diffusion Models
by Eunseok Jang and Kyunghwan Kim
Sustainability 2026, 18(8), 3987; https://doi.org/10.3390/su18083987 - 17 Apr 2026
Abstract
Generative AI has advanced rapidly in architectural design; however, existing building footprint generation models tend to emphasize stylistic exploration while insufficiently integrating site context as a fundamental physical constraint that facilitates alignment with the surrounding urban fabric. To address this limitation, this study proposes a context-responsive methodology for generating building footprints using a multi-layered four-channel representation of site conditions—including roads, sidewalks, adjacent buildings, and site boundaries—within a Latent Diffusion Model framework. The proposed approach encodes these physical conditions into a structured tensor and concatenates them directly to the U-Net input, enabling site context to function as an explicit spatial control variable during generation. An ablation study evaluated the effectiveness of the proposed contextual configuration. Compared with a single-channel model, the four-channel model achieved an 18.08% reduction in average pixel-wise information entropy, indicating a measurable decrease in generative uncertainty. Qualitative analyses further demonstrated that the enriched contextual input promotes geometrically coherent footprint configurations, such as context-responsive setbacks and spatial alignment with surrounding built forms. These findings suggest that structured multi-channel site information enhances contextual grounding in generative design processes and may contribute to more environmentally integrated and spatially coherent architectural outcomes. Full article
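
One way to realize the "average pixel-wise information entropy" criterion: generate several footprint masks for the same site, estimate each pixel's building probability across the batch, and average the binary entropy over all pixels. A sketch under that assumed formulation:

import numpy as np

def mean_pixelwise_entropy(masks):
    """masks: (N, H, W) binary footprint samples for one site.
    Lower values indicate less generative uncertainty."""
    p = np.asarray(masks, dtype=float).mean(axis=0)          # per-pixel P(building)
    p = np.clip(p, 1e-9, 1 - 1e-9)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))         # binary entropy
    return float(h.mean())
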
26 pages, 4576 KB  
Article
AdaProtoNet: A Noise-Tolerant Few-Shot ISAR Image Classification Network with Adaptive Relaxation Strategy
by Zheng Zhang, Ming Lv, Zhenhong Jia, Liangliang Li, Xueyu Zhang, Xiaobin Zhao and Hongbing Ma
Remote Sens. 2026, 18(8), 1207; https://doi.org/10.3390/rs18081207 - 16 Apr 2026
Abstract
Inverse synthetic aperture radar (ISAR) image classification plays a crucial role in remote sensing, traffic monitoring, and maritime surveillance. However, existing methods often suffer from limited labeled data, degraded image quality, and the insufficient adaptability of conventional loss functions. To address these issues, this paper proposes AdaProtoNet, a few-shot ISAR image classification framework based on a ResNet10 backbone and a combined adaptive and cross-entropy loss function. The model adopts a Prototypical Network architecture that balances feature extraction and class discrimination. A customized multicategory ISAR dataset is constructed through 3D target modeling and simulated radar imaging to support few-shot learning. Within the meta-learning paradigm, AdaProtoNet generates class prototypes by averaging support features and performs classification via Euclidean distance measurement. Experimental results demonstrate that AdaProtoNet achieves higher overall accuracy (OA) and stronger generalization than conventional ISAR classification methods. These findings highlight the effectiveness of adaptive-margin optimization in few-shot learning and provide guidance for the development of next-generation remote sensing recognition systems. Full article
(This article belongs to the Special Issue Temporal and Spatial Analysis of Multi-Source Remote Sensing Images)
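
The inference rule described — class prototypes as support-set means, classification by nearest prototype in Euclidean distance — is the standard Prototypical Network step. A framework-free sketch of that step (the embedding backbone is omitted):

import numpy as np

def proto_classify(support, labels, query, n_classes):
    """support: (S, D) embedded support features with integer label array;
    query: (Q, D). Returns the predicted class for each query."""
    protos = np.stack([support[labels == c].mean(axis=0)
                       for c in range(n_classes)])                  # (C, D)
    d2 = ((query[:, None, :] - protos[None, :, :]) ** 2).sum(-1)    # (Q, C)
    return d2.argmin(axis=1)   # softmax(-d2) would give class probabilities
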
25 pages, 465 KB  
Article
Digital Economy, Agricultural Technological Innovation, and Agricultural Economic Resilience: A Sustainable Agricultural Development Perspective
by Zhiying Chen and Xiangyu Ma
Sustainability 2026, 18(8), 3973; https://doi.org/10.3390/su18083973 - 16 Apr 2026
Abstract
The digital economy and agricultural technological innovation are key drivers of agricultural economic resilience and sustainable development. However, existing research has yet to clarify how they jointly affect agricultural economic resilience, particularly through potential nonlinear patterns and spatial spillover effects. Using panel data from 30 Chinese provinces, this study measures digital economy development and agricultural economic resilience via the entropy weight method. It systematically examines the direct impact, transmission mechanisms, threshold effects, and spatial spillover effects using two-way fixed effects, mediation, threshold regression, and spatial Durbin models. The findings are as follows. First, the digital economy significantly improves agricultural economic resilience, a result robust to various tests and endogeneity treatments. Second, agricultural technological innovation plays a partial mediating role, accounting for 19.37% of the total effect. Third, the resilience-enhancing effect of agricultural technological innovation exhibits a double-threshold pattern: its positive impact gradually strengthens as the digital economy develops to a higher level. Fourth, the digital economy generates a positive spatial spillover effect on agricultural economic resilience. Fifth, although the digital economy and agricultural technological innovation show synergistic development, their coupling coordination degree remains relatively low, indicating substantial untapped potential for synergy. From a sustainable development perspective, this study reveals the mechanisms through which the digital economy and agricultural technological innovation enhance agricultural economic resilience, providing empirical evidence and policy insights for strengthening agricultural risk resistance and achieving agricultural sustainability via digital transformation and technological progress. Full article
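
The entropy weight method used to build both composite indices is standard: normalize each indicator, compute its information entropy across samples, and assign larger weights to indicators with lower entropy (greater discriminating power). A sketch for positively oriented indicators (negative ones would be reverse-normalized first):

import numpy as np

def entropy_weights(X):
    """X: (n_samples, m_indicators), larger = better in every column.
    Returns objective indicator weights."""
    Xn = (X - X.min(0)) / (X.max(0) - X.min(0) + 1e-12)   # min-max normalize
    P = (Xn + 1e-12) / (Xn + 1e-12).sum(0)                # column proportions
    e = -(P * np.log(P)).sum(0) / np.log(len(X))          # entropy per indicator
    return (1 - e) / (1 - e).sum()                        # divergence -> weights
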
20 pages, 1118 KB  
Article
Lossless Reversible Color Image Encryption Using Multilayer Hybrid Chaos with Gram–Schmidt Orthogonalization and ChaCha20-HMAC-Authenticated Transport
by Saadia Drissi, Faiq Gmira and Meriyem Chergui
Technologies 2026, 14(4), 235; https://doi.org/10.3390/technologies14040235 - 16 Apr 2026
Abstract
In this study, a hybrid multi-layer scheme for reversible color image encryption is proposed, ensuring lossless reconstruction and strong cryptographic security concurrently. This method consists of three main stages. First, session-specific keys are generated using HKDF-SHA256 along with a timestamp-based mechanism to prevent replay attacks and support dynamic key management. Second, a four-layer confusion–diffusion structure is applied. It uses Gram–Schmidt orthogonal matrices, integer-based PWLCM chaotic mapping, the Hill cipher, and dynamically created S-Boxes. These operations rely on integer modular arithmetic in Z_256 and Q16.16 fixed-point precision. Finally, ChaCha20 stream encryption with HMAC-SHA256 authentication is used to secure data transmission in distributed environments. Experimental tests conducted on standard images show strong cryptographic performance, including near-ideal entropy (7.9993 bits), a significant avalanche effect (NPCR ≈ 99.6%, UACI ≈ 33.4%), and very low pixel correlation. The method achieves perfect lossless reconstruction and provides an effective key space of 2¹². These results confirm the suitability of the proposed scheme for secure image protection in applications requiring bit-exact recovery, such as medical imaging, digital forensics, and satellite communications. Full article
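
The diffusion metrics quoted follow standard definitions: NPCR is the percentage of pixels that differ between two ciphertexts whose plaintexts differ in a single pixel, and UACI is their mean absolute intensity difference relative to 255 (ideals for 8-bit images: NPCR > 99.6%, UACI ~ 33.46%). A sketch:

import numpy as np

def npcr_uaci(c1, c2):
    """c1, c2: uint8 cipher images from plaintexts differing in one pixel."""
    a = c1.astype(np.int16)
    b = c2.astype(np.int16)                       # avoid uint8 wraparound
    npcr = 100.0 * (a != b).mean()
    uaci = 100.0 * (np.abs(a - b) / 255.0).mean()
    return npcr, uaci
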
29 pages, 450 KB  
Article
Quantum-Informational History Optimization Theory (QIHOT): A Single-History Selection Framework with Consistency Results
by Freeman Hui
Quantum Rep. 2026, 8(2), 34; https://doi.org/10.3390/quantum8020034 - 16 Apr 2026
Abstract
We present Quantum-Informational History Optimization Theory (QIHOT) as a formal proposal for selecting a single realized quantum history from a space of dynamically admissible histories subject to boundary constraints. In the present paper, we restrict attention to finite-dimensional and toy-model settings, where the framework can be stated explicitly. QIHOT separates two levels: a dynamical prior over admissible histories generated by standard quantum evolution, and an informational selection rule that reweights those histories by an entropy-based cost functional. Within this structure, we show that standard Born statistics are recovered in symmetric-cost measurement scenarios when the prior is the usual Hilbert-space quantum prior. We further formulate conditions under which operational no-signaling is preserved, provided the selection functional factorizes locally for spacelike-separated regions. A fully worked two-outcome model illustrates how the framework interpolates between coherent evolution and measurement-like branch selection. We contrast QIHOT with the Many-Worlds Interpretation, the Transactional Interpretation, the Consistent Histories formalism, the Schwinger–Keldysh formalism, and Lagrangian-based retrocausal models, highlighting structural similarities and key differences. We emphasize that the present paper develops QIHOT as a scoped formal proposal with partial consistency results rather than as a complete replacement for quantum theory. Possible extensions to consciousness and cosmology are deferred to brief outlook-level discussion. Full article
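
The two-level structure can be illustrated numerically: reweight a Born-rule prior over histories by a cost functional and renormalize. With a Gibbs-style reweighting (an assumed functional form chosen here purely for illustration, not necessarily the paper's), symmetric costs leave the Born weights untouched, as the consistency result requires:

import numpy as np

def select_weights(prior, costs, beta=1.0):
    """Reweight a dynamical prior over histories by exp(-beta * cost)."""
    w = np.asarray(prior) * np.exp(-beta * np.asarray(costs, dtype=float))
    return w / w.sum()

born = np.array([0.5, 0.5])                 # |amplitude|^2 prior
print(select_weights(born, [1.0, 1.0]))     # symmetric costs -> [0.5, 0.5]
print(select_weights(born, [1.0, 2.0]))     # asymmetric costs bias selection
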
25 pages, 753 KB  
Article
A Dual-Source Evidence–Driven Semi-Supervised Belief Rule Base for Fault Diagnosis
by Xin Zhang, Zhiying Fan, Wei He and Huafeng He
Sensors 2026, 26(8), 2444; https://doi.org/10.3390/s26082444 - 16 Apr 2026
Abstract
In the fault diagnosis of complex industrial systems, labeled samples are expensive to obtain, which leads to insufficient training data for the belief rule base (BRB) model. Although unlabeled samples are abundant, the uncertainty of their pseudo-labels may undermine semi-supervised learning and hinder accurate parameter optimization of the BRB model. To address these issues, a dual-source evidence-driven semi-supervised BRB method (SS-BRB) is proposed for fault diagnosis. The proposed method makes effective use of unlabeled samples while preserving the interpretability and inference transparency of the BRB model. To improve the reliability of pseudo-labels in semi-supervised learning, a dual-source evidence-driven pseudo-labeling mechanism is designed. In this mechanism, local similarity information is combined with the global inference results of the BRB model. An entropy factor and a feature distance factor are introduced to adaptively adjust the confidence of pseudo-labels. In this way, the quality of pseudo-labels is improved, and the influence of noisy samples is reduced. Based on this mechanism, high-confidence pseudo-labeled samples are incorporated into the training set to further optimize the model. Experimental results show that the proposed method achieves good diagnostic performance on both the gearbox dataset and the WD615 diesel engine dataset. Even with limited labeled data, the proposed method still achieves high accuracy, robustness, and good generalization performance. Full article
(This article belongs to the Section Fault Diagnosis & Sensors)
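
The adaptive-confidence idea — trusting a pseudo-label less when its belief distribution is diffuse or its features sit far from any labeled sample — admits a compact form. A sketch with an assumed combination of the two factors (the paper's exact definitions may differ):

import numpy as np

def pseudo_label_confidence(belief, feat, labeled_feats):
    """belief: (K,) class beliefs for an unlabeled sample; feat: (D,);
    labeled_feats: (n, D). Returns a confidence in [0, 1]."""
    K = len(belief)
    entropy_factor = 1 - (-(belief * np.log(belief + 1e-12)).sum() / np.log(K))
    d = np.linalg.norm(labeled_feats - feat, axis=1).min()
    distance_factor = np.exp(-d)             # assumed kernel; nearer = higher
    return entropy_factor * distance_factor  # keep sample if above a threshold
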
18 pages, 835 KB  
Article
Entropy-Driven Isosymmetric Phase Transition in L-Serine Under Pressure: A Periodic DFT Study
by Anna Maria Mazurek, Monika Franczak-Rogowska and Łukasz Szeleszczuk
Crystals 2026, 16(4), 266; https://doi.org/10.3390/cryst16040266 - 16 Apr 2026
Abstract
Understanding pressure-induced isosymmetric phase transitions in molecular crystals requires consideration of both structural and thermodynamic factors, particularly in hydrogen-bonded systems. In this work, periodic density functional theory (DFT) calculations were employed to investigate the pressure-dependent behavior of L-serine and to elucidate the origin of its experimentally observed phase transition between Phase I and Phase IV. Geometry optimizations performed at ambient pressure and 8.8 GPa reproduce the compression of the crystal lattice and the pressure-driven stabilization of Phase IV. However, no spontaneous reorientation of the hydroxyl groups is observed, indicating that the transition is not accessible within a purely static framework. To further explore the stability of the system, a series of modified crystal structures with different hydroxyl group orientations was generated and analyzed, revealing a complex energy landscape at ambient conditions that becomes significantly simplified under compression. Phonon calculations within the quasi-harmonic approximation demonstrate that the experimentally observed Phase I structure is not stabilized by enthalpy but by vibrational entropy, whose contribution increases with temperature. These results show that the phase transition in L-serine is governed by an interplay between lattice energy, hydrogen-bond rearrangement, and vibrational effects, and highlight that an accurate description of polymorphic stability in such systems requires inclusion of both static and dynamic contributions. Full article
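
The vibrational entropy that tips the balance toward Phase I has a closed form within the (quasi-)harmonic picture, summed over phonon modes whose frequencies depend on volume:

S_{\mathrm{vib}}(T) = k_B \sum_i \left[ \frac{x_i}{e^{x_i} - 1} - \ln\left(1 - e^{-x_i}\right) \right], \qquad x_i = \frac{\hbar \omega_i(V)}{k_B T}

Because S_vib grows with temperature and is larger for phases with softer (lower-frequency) modes, a phase can win on free energy F = U - T S_vib even when it loses on enthalpy, which is the mechanism the calculations identify for Phase I.
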
7 pages, 397 KB  
Article
Reversible Evaporation and the Entropy of Black Holes
by Friedrich Herrmann and Michael Pohlig
Entropy 2026, 28(4), 455; https://doi.org/10.3390/e28040455 - 15 Apr 2026
Abstract
The entropy of a Schwarzschild black hole is commonly derived using thermodynamic relations whose physical interpretation is not always transparent, in particular with respect to the localization of temperature and entropy. In this paper, we present a derivation of the Bekenstein–Hawking entropy based exclusively on the principles of phenomenological thermodynamics, formulated entirely in regions where spacetime is effectively flat. The analysis considers a reversible evaporation process in which the black hole is surrounded by a tunable thermal radiation bath whose temperature is kept arbitrarily close to the Hawking temperature. In this limit, entropy production can be made negligible. By integrating the entropy flux through a distant reference surface over the evaporation process, the standard entropy formula is obtained without invoking assumptions about the localization of the black hole entropy or about microscopic degrees of freedom. The derivation is mathematically simple but conceptually instructive. The approach is intended to be accessible to readers familiar with classical thermodynamics and general relativity at an advanced undergraduate or graduate level. Full article
(This article belongs to the Section Astrophysics, Cosmology, and Black Holes)
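
The quantitative core of such a reversible-limit argument is short. With the bath held arbitrarily close to the Hawking temperature, the entropy exported through the distant surface per unit of evaporated mass is c^2 dM / T_H, and integrating over the full evaporation reproduces the area law:

T_H = \frac{\hbar c^3}{8\pi G M k_B}, \qquad S = \int_0^{M} \frac{c^2 \, dM'}{T_H(M')} = \frac{4\pi G k_B M^2}{\hbar c} = \frac{k_B c^3 A}{4 G \hbar}, \qquad A = \frac{16\pi G^2 M^2}{c^4}
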
39 pages, 533 KB  
Article
A Novel Extension of the Weibull Distribution with Application in Quantitative and Reliability Sciences
by Shoaib Iqbal, Bassant Elkalzah, Zawar Hussain and Farrukh Jamal
Symmetry 2026, 18(4), 659; https://doi.org/10.3390/sym18040659 - 15 Apr 2026
Abstract
The main focus of this paper is to introduce a new probability model. Specifically, this paper presents a modified form of the Weibull distribution and investigates its various statistical properties, such as moments, moment-generating functions, reliability functions, quantile functions, and inequality measures, including the Bonferroni and Lorenz curves. It also investigates the mean absolute deviation and entropy. Distributions of order statistics, reversed order statistics, and upper record values are also obtained. Additionally, univariate and bivariate moment structures are considered. The model parameters are estimated via the maximum likelihood method under simple random sampling and ranked set sampling, allowing an empirical evaluation of efficiency and reliability. Graphical representations exhibit the flexibility of the model, capturing various shapes in the probability density and hazard rate functions. To measure the practical quality of the model, actuarial metrics are used. A comparative analysis based on insurance, biomedical, and reliability datasets demonstrates the empirically improved performance and stability of the proposed new model for these specific datasets. Full article
(This article belongs to the Section Mathematics)
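
For orientation, the baseline being extended is the two-parameter Weibull with density f(x) = (k/λ)(x/λ)^(k−1) exp(−(x/λ)^k). Maximum-likelihood fitting of this baseline (not the proposed extension) takes one line with SciPy:

import numpy as np
from scipy import stats

# Simulate data from a Weibull with shape 1.5 and scale 2.0, then refit.
data = stats.weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=0)
k, loc, lam = stats.weibull_min.fit(data, floc=0)      # location fixed at 0
print(f"shape k = {k:.3f}, scale lambda = {lam:.3f}")  # near 1.5 and 2.0
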
30 pages, 2720 KB  
Article
Influence of Data Structure on Prediction Error in Machine Learning-Based Concrete Compressive Strength Models
by Yelan Mo, Bixiong Li, Chengcheng Yan and Xiangxin Hu
Buildings 2026, 16(8), 1537; https://doi.org/10.3390/buildings16081537 - 14 Apr 2026
Abstract
Machine learning has been widely used for concrete compressive strength prediction, yet previous studies have focused mainly on algorithm comparison and isolated feature-processing strategies. The coupled influence of dataset characteristics on prediction error has received less systematic attention. This study investigates concrete strength prediction from a data structure perspective by examining three structural variables, namely, sample size, feature size, and compressive strength range. A unified experimental framework was constructed using 15 concrete datasets. Correlation, partial correlation, information entropy, and Relief were employed to reorganize feature subsets, and the resulting error trends were evaluated using artificial neural network (ANN), support vector regression (SVR), and random forest (RF) models. The results show that prediction error generally decreases first and then becomes stable as feature size increases, although the location of the low-error region depends on the dataset and the filtering method. Larger sample size is associated with improved prediction stability, whereas wider strength range tends to increase prediction difficulty. Based on these observations, an empirical relationship was established to describe the joint effect of sample size, feature size, and strength range on prediction error. The findings indicate that the attainable error level in concrete strength prediction is controlled not only by model form but also by dataset organization and feature configuration. Within the present framework, the study provides a practical basis for designing feature systems and interpreting model performance across datasets with different structural characteristics. Full article
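
Of the four filters compared, the information-entropy one corresponds to ranking candidate features by mutual information with the strength target. An illustrative sketch using scikit-learn (not the authors' pipeline):

import numpy as np
from sklearn.feature_selection import mutual_info_regression

def rank_features_by_mi(X, y, names):
    """Rank mix-design features by estimated mutual information with
    compressive strength; a feature subset keeps the top-ranked ones."""
    mi = mutual_info_regression(X, y, random_state=0)
    order = np.argsort(mi)[::-1]
    return [(names[i], float(mi[i])) for i in order]
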
21 pages, 1178 KB  
Article
Soft-Community Kernel Rényi Spectrum for Semantic Uncertainty Estimation in Large Language Models
by Zongkai Li and Junliang Du
Entropy 2026, 28(4), 442; https://doi.org/10.3390/e28040442 - 14 Apr 2026
Abstract
Uncertainty estimation is critical for deploying large language models (LLMs) in safety-sensitive and decision-critical applications. Recent approaches estimate semantic uncertainty by clustering multiple sampled responses into equivalence classes and measuring their diversity via entropy-based criteria. However, existing methods typically rely on greedy hard clustering and von Neumann entropy, which suffer from sensitivity to clustering order, noise in semantic equivalence judgments, and limited control over spectral contributions. In this work, we propose a principled information-theoretic framework for LLM semantic uncertainty estimation based on soft semantic communities and kernel Rényi entropy. Given multiple generations for a query, we construct a weighted semantic graph using pairwise semantic similarity scores and infer soft community assignments via weighted graph community detection. These soft assignments induce a positive semi-definite semantic kernel that captures the distribution of semantic modes without enforcing hard equivalence relations. Uncertainty is then quantified by the Rényi entropy of the kernel spectrum, yielding a tunable measure that interpolates between sensitivity to dominant semantic modes and long-tail semantic diversity. Compared to prior von Neumann entropy-based estimators, the proposed Rényi spectral uncertainty offers improved robustness to semantic noise, reduced dependence on clustering heuristics, and greater flexibility through its order parameter. Extensive experiments on question answering tasks demonstrate that our method provides more stable and discriminative uncertainty estimates, particularly under limited sampling budgets and noisy semantic judgments. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
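
The final scoring step is compact: given a trace-normalized positive semi-definite semantic kernel, the order-alpha Rényi entropy of its eigenvalue spectrum is H_alpha = log(sum_i lambda_i^alpha) / (1 - alpha), with the von Neumann entropy recovered as alpha -> 1. A sketch of that step alone (graph construction and soft community detection omitted):

import numpy as np

def kernel_renyi_entropy(K, alpha=2.0):
    """Renyi entropy of the spectrum of a PSD kernel matrix K.
    Small alpha weights long-tail semantic diversity; large alpha
    emphasizes dominant semantic modes; alpha=1 is the von Neumann limit."""
    lam = np.clip(np.linalg.eigvalsh(K), 0, None)
    lam = lam / lam.sum()                       # trace-normalize the spectrum
    lam = lam[lam > 1e-12]
    if abs(alpha - 1.0) < 1e-9:
        return float(-(lam * np.log(lam)).sum())
    return float(np.log((lam ** alpha).sum()) / (1.0 - alpha))
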