Search Results (1,272)

Search Parameters:
Keywords = Shannon Entropy

24 pages, 36343 KB  
Article
Partial Multi-Label Feature Selection via Entropy-Weighted Multi-Scale Neighborhood Granular Label Distribution Learning
by Yifan Cao, Mao Li, Cong Wang, Shuyu Fan, Ziqiao Yin and Binghui Guo
Entropy 2026, 28(4), 422; https://doi.org/10.3390/e28040422 - 9 Apr 2026
Abstract
Partial multi-label feature selection aims to identify discriminative features from data where each instance is associated with an ambiguous candidate label set. Existing methods are typically built upon single-scale modeling assumptions and may fail to fully exploit the multi-granularity structure underlying instance–label relationships. To address this limitation, we propose a novel framework termed PML-FSMNG, which integrates entropy-weighted multi-scale neighborhood granules with label distribution learning. Specifically, multi-scale neighborhood systems are constructed to estimate label distinguishability at multiple structural scales, and Shannon entropy is employed to adaptively fuse scale-specific label distributions into a robust soft supervisory signal. Based on the learned label distribution, an embedded sparse regression model with ℓ2,1-norm regularization is developed for discriminative feature selection, together with an entropy-regularized adaptive graph learning mechanism to preserve intrinsic geometric structure. Extensive experiments on benchmark datasets demonstrate that the proposed method consistently outperforms several state-of-the-art approaches, validating the effectiveness of multi-scale modeling and entropy-guided adaptive learning under label ambiguity. Full article
28 pages, 2879 KB  
Article
Spatial Analysis and Prioritization of Solar Energy Development in South Khorasan Province, Iran: An Integrated GIS and Multi-Criteria Decision Analysis Framework
by Mohammad Eskandari Sani, Amir Hossin Nazari, Mostafa Fadaei, Amir Karbassi Yazdi and Gonzalo Valdés González
Land 2026, 15(4), 617; https://doi.org/10.3390/land15040617 - 9 Apr 2026
Abstract
The use of solar photovoltaic technology is among the most promising approaches to achieving SDG7—Affordable and Clean Energy—which seeks to provide modern, reliable, sustainable, and efficient energy for everyone globally, especially in developing areas with high irradiation, where both energy access and decarbonization are major challenges. South Khorasan Province, Iran, is one of the most highly irradiated regions in the world. However, despite the abundance of solar resources, most previous research in Iran on solar potential has focused on technical potential, with little emphasis on actual energy consumption patterns and economic viability. To the best of our knowledge, this is the first demand-driven assessment at the county level and the first national-scale implementation of the MARCOS (Measurement of Alternatives and Ranking according to Compromise Solution) method for selecting solar energy sites in Iran. A spatially explicit integrated framework based on GIS-MARCOS was established for each of the eleven counties of South Khorasan Province, and five benefits were used as criteria (solar irradiance, population, per capita electrical consumption in residential, industrial, and agricultural sectors). Objective weights were calculated using Shannon’s Entropy. The analysis indicates that residential electricity demand emerges as the most influential factor in the prioritization process. Therefore, the counties of Birjand, Qaenat, and Tabas were identified as top priority counties, while counties with high irradiation levels but low demand (for example, Boshruyeh) received the least priority. These results clearly indicate the need to transition from irradiation-based to demand-based planning to minimize transmission losses and maximize the ability to integrate solar-generated electricity into the electric power grid. This proposed methodology provides a transferable decision-support tool for other high-irradiation, demand-heterogeneous regions around the globe. Full article
(This article belongs to the Section Water, Energy, Land and Food (WELF) Nexus)
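The objective weighting step described in the abstract above, Shannon's entropy weight method, recurs in several entries in this listing. The following is an editor's sketch of that standard method, not code from the paper; the decision matrix is hypothetical and assumes non-negative benefit-type criteria.

```python
import math

def entropy_weights(X):
    """Objective criteria weights via the Shannon entropy weight method.

    X: list of m alternatives, each a list of n non-negative benefit scores.
    Criteria whose values vary more across alternatives (lower entropy)
    receive larger weights; a constant criterion gets weight zero.
    """
    m, n = len(X), len(X[0])
    k = 1.0 / math.log(m)                      # normalizes entropy to [0, 1]
    divergences = []
    for j in range(n):
        col = [row[j] for row in X]
        total = sum(col)
        p = [v / total for v in col]           # share of each alternative
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - e)            # degree of divergence
    s = sum(divergences)
    return [d / s for d in divergences]

# A criterion identical across alternatives carries no information,
# so its weight collapses to zero and the varying criterion takes all of it.
w = entropy_weights([[100, 1], [100, 5], [100, 9]])
```

In a MARCOS or WSM pipeline these weights would then multiply the normalized decision matrix before ranking the alternatives.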

38 pages, 519 KB  
Review
Advancements in CO2 Capture and Storage: Technologies, Performance, and Strategic Pathways to Net-Zero by 2050
by Ahmed A. Bhran and Abeer M. Shoaib
Materials 2026, 19(8), 1497; https://doi.org/10.3390/ma19081497 - 8 Apr 2026
Abstract
In order to reach net-zero by 2050, we need to have strong decarbonization policies, especially in hard-to-abate sectors such as steel (8% of the global emissions), cement (7%), and power generation (30%), and negative emissions through direct air capture (DAC) and bioenergy with carbon capture and storage (BECCS). This review paper summarizes the progress in CO2 capture, compression, transportation, and storage technologies between 2020 and 2025, including energy penalty (20–40%) and cost (15–30%) reductions, with innovations such as metal–organic frameworks (MOFs), bio-inspired catalysts, ionic liquids, and artificial intelligence (AI)-based optimization. This paper, as a new input into the carbon capture and storage (CCS) field, uses the Weighted Sum Model (WSM) as a multi-criteria decision-making tool to rank the best technologies in the capture, storage, monitoring, and transportation sectors. The weights of the criteria are calculated based on Shannon entropy, and the assessment is performed in three conditions, namely, optimistic, pessimistic, and expected. The weights are computed with sensitivity analysis to make the assessment robust. The viability of key projects, such as Northern Lights (Norway, 1.5 MtCO2/year), Porthos (The Netherlands, 2.5 MtCO2/year), Quest (Canada, 1 MtCO2/year), and Petra Nova (USA, 1.6 MtCO2/year), is evident, and it is projected that, globally, CCS will reach 49 MtCO2/year across 43 plants in 2025. The review incorporates socio-economic and environmental justice, including barriers such as high costs ($30–600/tCO2), energy penalties (1–10 GJ/tCO2), and public opposition (20–40% in EU/US). In comparison with previous reviews, this article has a more comprehensive focus, provides quantitative synthesis through WSM, and discusses the implications for researchers, policymakers, and stakeholders towards achieving faster CCS implementation on the path to net-zero. Full article
(This article belongs to the Section Energy Materials)
41 pages, 699 KB  
Article
Mathematical Framework for Characterizing Emotional Individuality in Large Language Models: Temperature Control, Fuzzy Entropy, and Persona-Based Diversity Analysis
by Naruki Shirahama, Yuma Yoshimoto, Naofumi Nakaya and Satoshi Watanabe
Mathematics 2026, 14(7), 1224; https://doi.org/10.3390/math14071224 - 6 Apr 2026
Abstract
Evaluating emotional understanding in Large Language Models (LLMs) is challenging because assessments are subjective, ambiguous, multidimensional, and sensitive to controllable generation parameters. We developed a unified mathematical framework for characterizing LLM “emotional individuality” that integrates softmax sampling–temperature control (the decoding-time temperature parameter exposed by the API and typically used to modulate output randomness during token generation), fuzzy set theory with Shannon-type fuzzy entropy, and persona-based cognitive diversity analysis. We evaluated 36 API-accessible LLMs from seven major vendors on Japanese literary texts, using four personas each assigned a sampling temperature (T ∈ {0.1, 0.4, 0.7, 0.9}), yielding 4227/4320 trial responses (97.8% coverage), of which 4067/4227 contained valid numeric emotion scores (96.2%). Temperature controllability varied approximately 25-fold (κM ∈ [0.039, 0.982]) with both positive and negative temperature–variance relationships across models. Because each sampling temperature is deterministically assigned to a persona in our design, κM should be interpreted as an operational temperature–variance association across persona conditions rather than an isolated causal temperature effect. The model-level mean fuzzy entropy ranged from approximately 0.40 to 0.66, and the numerical stability consistency scores ranged from approximately 0.548 to 0.780. We also observed text-dependent structure, including genre-specific variation in the Interest–Sadness relationship. For practitioners, the framework is most directly useful as a benchmark-design and model-screening template for structured emotion-scoring tasks; its empirical conclusions remain limited to the present Japanese literary, text-only setting. Full article
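The abstract does not give the exact fuzzy entropy definition used; a standard Shannon-type choice is the De Luca–Termini measure, sketched below by the editor with hypothetical membership values (emotion scores rescaled to [0, 1]).

```python
import math

def fuzzy_entropy(memberships):
    """Shannon-type (De Luca-Termini) fuzzy entropy, normalized to [0, 1]:
    0 when every membership is crisp (exactly 0 or 1), 1 when every
    membership equals 0.5 (maximal ambiguity)."""
    def s(u):
        if u <= 0.0 or u >= 1.0:
            return 0.0                          # crisp memberships contribute nothing
        return -(u * math.log(u) + (1 - u) * math.log(1 - u))
    n = len(memberships)
    return sum(s(u) for u in memberships) / (n * math.log(2))

# Hypothetical emotion scores rescaled to [0, 1] as membership degrees:
crisp = fuzzy_entropy([0.0, 1.0, 1.0, 0.0])      # unambiguous ratings
ambiguous = fuzzy_entropy([0.5, 0.5, 0.5, 0.5])  # maximally ambiguous ratings
```

Under this definition the reported model-level means of roughly 0.40 to 0.66 would sit between the two extremes above.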

14 pages, 734 KB  
Article
Complexity of Cardiovascular Regulation and Its Association with Physical and Cardiorespiratory Fitness in Men with Type 2 Diabetes Mellitus
by Étore De F. Signini, Raphael M. de Abreu, Alex Castro, Andréia M. Santos, Gabriela A. M. Galdino, Silvia C. G. Moura, Stephanie N. Linares, Juliana C. Milan-Mattos, Rafaella M. Zambetta, Alberto Porta and Aparecida M. Catai
Healthcare 2026, 14(7), 940; https://doi.org/10.3390/healthcare14070940 - 3 Apr 2026
Abstract
Background/Objectives: Cardiovascular regulation complexity (CRC) is an underexplored health marker in the context of type 2 diabetes mellitus (T2DM). Additionally, associating CRC with physical and cardiorespiratory fitness variables could provide greater insight into how physical conditioning impacts cardiovascular health in the context of T2DM. This study aims to investigate whether the relationship between physical and cardiorespiratory fitness and CRC differs according to the presence or absence of T2DM. Methods: Sixty-eight men were equally divided into the T2DM group (T2DMG; 57 ± 6 years old and 28.4 ± 3.1 kg/m2) and the control group (CG; 52 ± 5 years old and 25.1 ± 2.8 kg/m2). Participants underwent a resting cardiovascular data collection and a cardiopulmonary exercise test on a cycle ergometer. For each group, the relative peak power (W/kgPEAK) and peak oxygen consumption (VO2PEAK) were correlated with the CRC indices, namely, Shannon entropy, the complexity index, the normalized complexity index, and the sample entropy from heart period (HP) and systolic arterial pressure (SAP) series. A partial correlation was performed for each group, controlling for age, physical activity level, and metabolic cart. Results: Only the CG showed positive and significant correlations between relative VO2PEAK and W/kgPEAK and CRC indices derived from the HP series (0.354 ≤ r ≤ 0.548 and 0.001 ≤ p ≤ 0.047). Correlations with the SAP series were not significant, regardless of the groups. Conclusions: In this sample, there was no positive relationship between physical and cardiorespiratory fitness variables and CRC indices among individuals with T2DM. Further large sample studies are needed to elucidate the factors involved in T2DM that impact CRC. Full article
(This article belongs to the Special Issue Effects of Physical Exercise on Cardiometabolic Disorders)

17 pages, 287 KB  
Article
A Reproducible Computational Pipeline for Cross-Database Scientometric Network Construction: Architecture, Algorithms, and Structural Validation
by Denny Moreno-Castro, Omar Orlando Franco-Arias, Cícero Pimenteira, Nicolás Márquez and Cristian Vidal-Silva
Computers 2026, 15(4), 213; https://doi.org/10.3390/computers15040213 - 31 Mar 2026
Abstract
The rapid expansion of scientific publications indexed in multiple bibliographic databases has created new computational challenges for large-scale scientometric analysis. Differences in metadata schemas, identifier structures, and export formats across indexing systems such as Web of Science and Scopus introduce inconsistencies that may distort network-based bibliometric analyses. These issues affect duplicate detection, node identification, and network topology construction. This study proposes a reproducible computational pipeline for cross-database scientometric network construction. The framework formalizes the preprocessing workflow into explicit computational modules, including metadata harmonization, deterministic duplicate detection, sparse graph construction, normalization, and structural diagnostics. The proposed architecture separates preprocessing stages into reproducible algorithmic components, enabling transparent evaluation of methodological assumptions. Empirical evaluation using an interdisciplinary dataset of 317 publications (1990–2023) demonstrates that deterministic preprocessing significantly improves network stability and preserves clustering structure. Structural diagnostics based on modularity, Herfindahl–Hirschman Index, Shannon entropy, and Gini coefficient provide multi-dimensional evaluation of network topology. Scalability experiments confirm near-linear computational growth under sparse graph construction. The principal contribution of this work lies in the formalization of a transparent and extensible computational architecture for reproducible scientometric analysis. The proposed pipeline supports reliable cross-database integration and enables scalable knowledge-mapping applications in interdisciplinary research domains. Full article
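Three of the four structural diagnostics named in this abstract (Shannon entropy, Herfindahl–Hirschman Index, Gini coefficient) are simple functionals of a distribution such as node degrees or cluster sizes. The sketch below is the editor's generic illustration on a toy distribution, not the pipeline's actual code.

```python
import math

def diagnostics(counts):
    """Diversity/concentration diagnostics of a degree or cluster-size
    distribution: normalized Shannon entropy (1 = perfectly even),
    Herfindahl-Hirschman Index (1/n = even, 1 = fully concentrated),
    and Gini coefficient (0 = even, ~1 = fully concentrated).
    Assumes strictly positive counts."""
    c = sorted(float(v) for v in counts)
    n = len(c)
    total = sum(c)
    p = [v / total for v in c]
    entropy = -sum(pi * math.log(pi) for pi in p) / math.log(n)
    hhi = sum(pi * pi for pi in p)
    # Gini from the sorted sample: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    gini = 2 * sum(i * v for i, v in enumerate(c, start=1)) / (n * total) - (n + 1) / n
    return entropy, hhi, gini

# A perfectly even distribution: entropy 1, HHI 1/n, Gini 0.
h, hhi, g = diagnostics([5, 5, 5, 5])
```

Reporting all three together, as the paper does, guards against any single index masking skew in the network topology.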

19 pages, 8328 KB  
Article
A Robust 3D Active Learning Framework Based on Multi-Metric Voting for Fast Electromagnetic Field Reconstruction with Sparse Sampling
by Yidi Hu, Kuiyuan Wang, Yujie Qi, Jiewen Deng, Kai Zhang, Zhi Tang, Lei Zhang and Tianwu Li
Electronics 2026, 15(7), 1434; https://doi.org/10.3390/electronics15071434 - 30 Mar 2026
Abstract
To mitigate the high measurement costs in electromagnetic compatibility (EMC) assessment, this paper proposes a robust active learning framework for fast 3D field reconstruction with sparse sampling. A novel “Four-Vote” query criterion is proposed to guide intelligent sample selection, which integrates Shannon entropy, committee variance, spatial density, and clustering-based representativeness, all derived from a heterogeneous radial basis function (RBF) committee. Furthermore, an adaptive polynomial degree adjustment mechanism is implemented to ensure stability in data-scarce 3D environments. Validated through full-wave HFSS simulations, the proposed method significantly outperforms traditional sampling strategies in both 2D and 3D scenarios, achieving high-fidelity field reconstruction with minimal sampling points. This framework provides an efficient solution for rapid spatial field mapping and EMC fault diagnosis in practical engineering scenarios. Full article

24 pages, 2600 KB  
Article
A Normalized Shannon Entropy–CV Framework for Diagnosing Short-Term Surface Water Quality Instability from High-Frequency WQI Data in Southwest China
by Junran Kuang, Yu Zhang, Qingdong Liu, Jing Hu and Shaoqi Zhou
Sustainability 2026, 18(7), 3216; https://doi.org/10.3390/su18073216 - 25 Mar 2026
Abstract
High-frequency water quality monitoring generates large volumes of sub-daily observations, but concise and scalable indicators for diagnosing short-term instability remain limited. Using four-hourly records from 336 national automatic monitoring stations in Southwest China (November 2022–September 2024), we constructed a nine-parameter water quality index (WQI) and developed a normalized Shannon entropy–coefficient of variation (hCV) framework to characterize short-term instability in fixed three-day windows. A composite separation index combining the Kolmogorov–Smirnov distance of pollution-event counts and the effect size of entropy distributions, together with bootstrap resampling, identified CV ≈ 0.10 as an operational threshold for high-fluctuation windows. The joint hCV distribution revealed four typical short-term dynamic patterns and showed good consistency across three-, five-, and seven-day windows. At the station scale, instability hotspots were concentrated in southern Yunnan–Guizhou–Guangxi, the southeastern margins of the Sichuan Basin, and several mid-lower mainstream reaches, whereas alpine headwaters and upstream segments remained relatively stable. Overall, the proposed framework provides an interpretable and generalizable tool for short-term water-quality diagnosis, with practical value for risk zoning, early warning, and monitoring network optimization. Full article
(This article belongs to the Section Environmental Sustainability and Applications)
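The h–CV pairing this abstract describes reduces, per window, to two scalars: a normalized Shannon entropy of the binned WQI values and their coefficient of variation, with CV ≈ 0.10 as the reported operational threshold. The following is an editor's sketch under stated assumptions (the bin count is a hypothetical choice, not taken from the paper).

```python
import math

def window_instability(wqi, bins=10):
    """Normalized Shannon entropy (h) and coefficient of variation (CV)
    of the WQI values in one fixed window (e.g. a 3-day window of
    4-hourly records = 18 values). `bins` is a hypothetical choice."""
    lo, hi = min(wqi), max(wqi)
    if lo == hi:                       # flat window: a single occupied bin
        lo, hi = lo - 0.5, hi + 0.5
    counts = [0] * bins
    for x in wqi:
        idx = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(wqi)
    p = [c / n for c in counts if c > 0]
    h = -sum(pi * math.log(pi) for pi in p) / math.log(bins)  # 0 = flat, 1 = uniform
    mean = sum(wqi) / n
    cv = math.sqrt(sum((x - mean) ** 2 for x in wqi) / n) / mean
    return h, cv

# A flat window is stable: h = 0 and CV = 0, well below the ~0.10 threshold.
h, cv = window_instability([60.0] * 18)
```

A window would be flagged as high-fluctuation when CV exceeds roughly 0.10; the joint h–CV position then distinguishes the four dynamic patterns the paper reports.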

103 pages, 2567 KB  
Article
Thermodynamics à la Souriau on Kähler Non-Compact Symmetric Spaces for Cartan Neural Networks
by Pietro G. Fré, Alexander S. Sorin and Mario Trigiante
Entropy 2026, 28(4), 365; https://doi.org/10.3390/e28040365 - 24 Mar 2026
Abstract
In this paper, we clarify several issues concerning the abstract geometrical formulation of thermodynamics on non-compact symmetric spaces U/H that are the mathematical model of hidden layers in the new paradigm of Cartan Neural Networks. We introduce a clear-cut distinction between the generalized thermodynamics associated with Integrable Dynamical Systems and the challenging proposal of Gibbs probability distributions on U/H provided by generalized thermodynamics à la Souriau. Our main result is the proof that the U/H supporting such Gibbs distributions are only the Kähler ones. Furthermore, for the latter, we solve the problem of determining the space of temperatures, namely, of Lie algebra elements for which the partition function converges. The space of generalized temperatures is the orbit under the adjoint action of U of a positivity domain in the Cartan subalgebra C ⊂ H of the maximal compact subalgebra H ⊂ U. We illustrate how our explicit constructions for the Poincaré and Siegel planes might be extended to the whole class of Calabi–Vesentini manifolds utilizing Paint Group symmetry. Furthermore, we claim that Rao’s, Chentsov’s, and Amari’s Information Geometry and the thermodynamical geometry of Ruppeiner and Lychagin are the very same thing. In particular, we provide an explicit study of thermodynamical geometry for the Poincaré plane. The key feature of the Gibbs probability distributions in this setup is their covariance under the entire group of symmetries U. The partition function is invariant against U transformations, and the set of its arguments, namely the generalized temperatures, can always be reduced to a minimal set whose cardinality is equal to the rank of the compact denominator group H ⊂ U. Full article
(This article belongs to the Collection Feature Papers in Information Theory)

20 pages, 2033 KB  
Article
On the Predictability of Green Finance Markets: An Assessment Based on Fractal and Shannon Entropy
by Sonia Benghiat and Salim Lahmiri
Fractal Fract. 2026, 10(3), 205; https://doi.org/10.3390/fractalfract10030205 - 22 Mar 2026
Abstract
Econophysics is an interdisciplinary field that applies physics concepts to economic and financial systems. By utilizing tools such as statistical physics, including fractal analysis and entropy measures, econophysics helps model the complex and non-linear dynamics of equity markets. This paper examines the intrinsic dynamics and regularity in information content in green finance markets (carbon, clean energy, and sustainability markets) by means of rescaled range (R/S) analysis, detrended fluctuation analysis (DFA), fractionally integrated generalized auto-regressive conditionally heteroskedastic (FIGARCH) process, and Shannon entropy (SE). The empirical results can be summarized as follows. First, prices in all markets are persistent; however, returns are likely random as estimated Hurst exponents are close to 0.5. Second, the FIGARCH process shows that volatility series in carbon and sustainability markets are persistent, whilst volatility in clean energy is anti-persistent. Third, in carbon and sustainability markets, entropy is high in prices compared to returns and volatility series. On the contrary, the clean energy market shows lower entropy for prices than for returns and volatility. In sum, it is concluded that price and volatility series are predictable, whilst return series are not. Finally, based on a rolling window framework, it is concluded that the COVID-19 pandemic and the Russia–Ukraine war have altered long memory and randomness in all three green finance markets. Full article
(This article belongs to the Special Issue Fractal Approaches and Machine Learning in Financial Markets)

21 pages, 6628 KB  
Article
Shannon Entropy of a Hydrogenic Impurity on a Conical Surface: Confinement and Aharonov–Bohm Effects
by Luis Manuel Arvizu, Eleuterio Castaño and Norberto Aquino
Entropy 2026, 28(3), 356; https://doi.org/10.3390/e28030356 - 22 Mar 2026
Abstract
In this work, we solve the Schrödinger equation for a hydrogenic impurity located at the apex of a right circular cone, with the electron constrained to move on the conical surface of semi-aperture angle θ0 and subjected to an Aharonov–Bohm magnetic flux along the symmetry axis. Analytical expressions for the energy eigenvalues and normalized radial wave functions are obtained in terms of the principal quantum number n and the angular quantum number m, the magnetic flux ν, and the cone angle. The Shannon entropy is evaluated in both configuration and momentum spaces for several low-lying states, and its variation with ν and θ0 is analyzed in detail. When the magnetic flux vanishes, pairs of states (n, m) and (n, −m) share the same entropic behavior; for finite flux, this degeneracy is lifted and the entropies depend explicitly on the state, the cone geometry, and the flux strength. Finally, we verify that the entropic sum Sr + Sp fulfills the Bialynicki-Birula–Mycielski bound, providing an information-theoretic consistency check for the model. Full article

31 pages, 4919 KB  
Article
Comparison of Resting-State EEG and Synchronization Between Young Adults with Down Syndrome and Controls in Bipolar Montage
by Jesús Pastor, Lorena Vega-Zelaya and Diego Real de Asúa
Brain Sci. 2026, 16(3), 328; https://doi.org/10.3390/brainsci16030328 - 19 Mar 2026
Abstract
The qEEG findings of subjects with Down syndrome (DS) have not been described in the context of bipolar montage. Resting-state EEG (rsEEG) with a bipolar montage was performed in 22 young adults (26.0 ± 1.2 years) with DS but without psychiatric or neurological pathology and matched control subjects of the same sex and age, and the results were conventionally and numerically analyzed. Channels were grouped into frontal, parieto-occipital, and temporal lobes. For every channel, the power spectrum was calculated and used to compute the area for the delta, theta, alpha and beta bands and was log-transformed. Shannon’s spectral entropy (SSE) and coherence by bands were computed. Finally, we also calculated the peak frequency distribution of the alpha band. qEEG revealed alterations in the rsEEG that were not detected visually. Subjects with DS showed a significant generalized increase in the power of the delta and theta bands, along with a decrease in the power of the alpha band in the posterior half of the scalp. This alpha activity also exhibited features corresponding to older euploid subjects, showing interhemispheric asynchrony in one-third of the individuals. The beta band power was significantly increased in the frontal lobes and adjacent regions, such as the parietal and mid-temporal regions. Individuals with DS showed a generalized decrease in parieto-occipital synchronization associated with intelligence quotient. Left temporal synchronization was also lower. The synchronization of specific channel pairs was greater in subjects with DS in the frontal lobe and much lower in the occipital and temporal regions. These results indicate that alterations in band structure and synchronization in subjects with DS are highly specific and can aid in the clinical evaluation of these individuals. Full article
(This article belongs to the Section Neurotechnology and Neuroimaging)
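Shannon's spectral entropy (SSE), used in this rsEEG study, is simply the Shannon entropy of the normalized power spectrum. The sketch below is the editor's generic illustration with hypothetical spectra, not the authors' analysis code (which would first estimate the spectrum from each EEG channel).

```python
import math

def spectral_entropy(power):
    """Shannon spectral entropy (SSE): entropy of the normalized power
    spectrum, scaled to [0, 1] by log(number of frequency bins).
    `power` is a precomputed, non-negative power spectrum."""
    total = sum(power)
    p = [v / total for v in power if v > 0]
    return -sum(pi * math.log(pi) for pi in p) / math.log(len(power))

# A single dominant rhythm (hypothetical alpha peak) gives low SSE;
# equal power in every bin gives the maximum value of 1.
narrow = spectral_entropy([0.001] * 9 + [1.0])
broad = spectral_entropy([1.0] * 10)
```

Intuitively, the generalized increase in delta/theta power reported for the DS group reshapes the spectrum and hence shifts SSE, which is why the index can flag alterations that visual inspection misses.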

27 pages, 1237 KB  
Article
Constraint, Asymmetry, and Meaning: A Cybernetic Reinterpretation of Probabilistic Emergence Across Complex Systems
by Ezra N. S. Lockhart
Symmetry 2026, 18(3), 518; https://doi.org/10.3390/sym18030518 - 18 Mar 2026
Abstract
This study develops a Constraint-Driven Model of Intelligence to explain the emergence of structured meaning in complex systems, reconciling probability and cybernetics. It applies a conceptual–analytic procedure, conducted entirely through logical reasoning and theoretical analysis, without empirical measurement, data acquisition, experimental manipulation, or statistical testing, and is therefore methodologically separate from empirical artificial intelligence research. Phenomena such as model collapse are cited as theoretical instances for epistemic argumentation, without asserting empirical verification. Building on Émile Borel’s Infinite Monkey Theorem, which demonstrates the theoretical inevitability of order in unbounded stochastic processes, and Gregory Bateson’s principle of negative explanation, which defines structure as the result of systematically eliminated alternatives, the analysis formalizes how constraints break ergodicity and generate asymmetry. Shannon’s entropy quantifies the informational effects of constraints, while Simon’s bounded rationality and Turing’s algorithmic limits show how cognitive and computational boundaries produce tractable outcomes. Applied to modern AI, the model accounts for model collapse in recursive training, showing that the loss of asymmetric constraints produces low-entropy, repetitive outputs, demonstrating the epistemic necessity of constraint regulation. Comparing probabilistic and cybernetic accounts of emergence, the study shows that structured intelligence arises not from stochastic exploration alone, but from bounded, recursive, selective processes. This model is transdisciplinary, formalizing how constraints from socioeconomic pressures to subcultural circulation shape diversity, innovation, and functional asymmetry, establishing a generalizable cybernetic epistemology for the generation of structured intelligence and meaning across domains. By formalizing these concepts through set-theoretic derivations and integrative synthesis, this non-empirical model advances a cybernetic epistemology, separate from quantitative AI evaluations or experimental designs. Full article

23 pages, 2885 KB  
Article
AI-Controlled Modular Decoy Generation for Reconstruction-Resistant Hybrid and Multi-Cloud Storage Systems
by Munir Ahmed and Jiann-Shiun Yuan
Electronics 2026, 15(6), 1231; https://doi.org/10.3390/electronics15061231 - 16 Mar 2026
Abstract
Although cloud storage is widely trusted by users and enterprises, externally stored encrypted and fragmented data remain vulnerable to reconstruction and inference attacks following partial exposure. Existing decoy-based defenses often rely on static configurations or randomly generated artifacts that can be filtered during adversarial analysis. This paper presents an Artificial Intelligence (AI)-controlled modular decoy generation method to enhance reconstruction resistance in distributed storage systems. The method operates as a system-agnostic post-fragmentation layer and does not require modification of encryption or storage architecture. Given encrypted fragments as input, decoys are generated using a supervised Extreme Gradient Boosting (XGBoost) regression model that adapts decoy quantity based on system telemetry and resource conditions. Decoys maintain statistical alignment with real encrypted fragments in size and Shannon entropy characteristics. To address scalability, the method is evaluated across small, medium, and large deployments comprising up to 413 externally exposed fragments and compared against fixed-ratio (10%, 20%) and randomized baselines. Experimental evaluation demonstrates increased adversarial uncertainty without altering legitimate reconstruction procedures or encryption mechanisms. Kolmogorov–Smirnov analysis indicates no statistically significant difference between AI-generated decoys and real fragments, whereas baseline decoys produce significant deviations in size and entropy distributions, supporting reconstruction resistance at scale in multi-cloud environments. Full article
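The validation logic in this abstract (decoys should be statistically indistinguishable from real encrypted fragments in size and Shannon entropy, assessed via Kolmogorov–Smirnov analysis) can be sketched generically. This is the editor's illustration under stated assumptions: the fragment sizes, sample counts, and use of random bytes as stand-ins for encrypted data are all hypothetical, and a production test would use a proper KS significance test rather than the raw distance below.

```python
import math
import random

def byte_entropy(data):
    """Shannon entropy of a byte string, in bits per byte (max 8.0)."""
    freq = [0] * 256
    for byte in data:
        freq[byte] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in freq if c > 0)

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov distance: the maximum gap between
    the empirical CDFs of samples a and b (0 = identical, 1 = disjoint)."""
    a, b = sorted(a), sorted(b)
    d = 0.0
    for x in a + b:
        fa = sum(v <= x for v in a) / len(a)
        fb = sum(v <= x for v in b) / len(b)
        d = max(d, abs(fa - fb))
    return d

# Decoys drawn from the same byte distribution as the "real" fragments
# should yield a small KS distance between their entropy samples.
rng = random.Random(0)
real = [byte_entropy(rng.randbytes(256)) for _ in range(50)]
decoy = [byte_entropy(rng.randbytes(256)) for _ in range(50)]
d = ks_statistic(real, decoy)
```

A large KS distance on either the size or entropy samples would mean an adversary could filter the decoys, which is the failure mode the paper attributes to fixed-ratio and randomized baselines.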

17 pages, 394 KB  
Article
Statistical Measures and Complexity of Supersymmetric Polynomials in Quantum Mechanics
by Vasil Avramov, Hristo Dimov, Miroslav Radomirov, Radoslav C. Rashkov and Tsvetan Vetsov
Mathematics 2026, 14(6), 998; https://doi.org/10.3390/math14060998 - 16 Mar 2026
Abstract
We study information-theoretic and complexity measures for the Dunkl-supersymmetric harmonic oscillator to identify the effect of supersymmetry on these quantities. Using the Rakhmanov probability density of the Dunkl-SUSY functions, we analyze the Shannon entropy, spreading measures (Heller, Rényi, and Fisher lengths), and several statistical and dynamical complexities. The Shannon entropy is obtained both asymptotically and in closed analytic form, showing that supersymmetry does not affect the leading large-n scaling. In contrast, spreading measures reveal enhanced localization of the SUSY eigenstates relative to the standard harmonic oscillator. Finally, we find that LMC and Fisher–Shannon complexities are higher in the supersymmetric case. Full article
(This article belongs to the Special Issue Advances in Mathematical Methods of Quantum Mechanics)
