Search Results (231)

Search Parameters:
Keywords = entropy of sums

26 pages, 2781 KB  
Article
Iterative Optimization of Structural Entropy for Enhanced Network Fragmentation Analysis
by Fatih Ozaydin, Vasily Lubashevskiy and Seval Yurtcicek Ozaydin
Information 2025, 16(10), 828; https://doi.org/10.3390/info16100828 - 24 Sep 2025
Viewed by 358
Abstract
Identifying and ranking influential nodes is central to tasks such as targeted immunization, misinformation containment, and resilient design. Structural entropy (SE) offers a principled, community-aware scoring rule, yet the one-shot (static) use of SE may become suboptimal after each intervention, as the residual topology and its modular structure change. We introduce iterative structural entropy (ISE), a simple yet powerful modification that recomputes SE on the residual graph before every removal, thus turning node targeting into a sequential, feedback-driven policy. We evaluate SE and ISE on seven benchmark networks using (i) cumulative structural entropy (CSE), (ii) cumulative sum of largest connected component sizes (LCCs), and (iii) dynamic panels that track average shortest-path length and diameter within the residual LCC together with a near-threshold percolation proxy (expected outbreak size). Across datasets, ISE consistently fragments earlier and more decisively than SE; on the Netscience network, ISE reduces the cumulative LCC size by 43% (R_LCCs = 0.567). In parallel, ISE achieves perfect discriminability (monotonicity M = 1.0) among positively scored nodes on all benchmarks, while SE and degree-based baselines display method-dependent ties. These results support ISE as a practical, adaptive alternative to static SE when sequential decisions matter, delivering sharper rankings and faster structural degradation under identical measurement protocols. Full article
(This article belongs to the Special Issue Optimization Algorithms and Their Applications)

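The core of ISE is a feedback loop: rescore the residual graph before every removal instead of ranking once up front. Below is a minimal sketch of that loop, assuming a simple degree-based entropy contribution as a stand-in score (the paper's community-aware structural entropy is not reproduced) and using networkx for graph handling.

```python
# Minimal sketch of the iterative (ISE-style) targeting loop: rescore the
# residual graph before every removal. The score here is a degree-distribution
# entropy contribution used only as a proxy, not the authors' structural entropy.
import math
import networkx as nx

def entropy_scores(G):
    """Score each node by its contribution to a degree-based entropy (proxy for SE)."""
    total = sum(d for _, d in G.degree()) or 1
    return {v: -(d / total) * math.log(d / total) if d > 0 else 0.0
            for v, d in G.degree()}

def iterative_targeting(G, k):
    """ISE-style policy: recompute scores on the residual graph before each removal."""
    H = G.copy()
    removed = []
    for _ in range(min(k, H.number_of_nodes())):
        scores = entropy_scores(H)            # recompute on the residual graph
        target = max(scores, key=scores.get)
        H.remove_node(target)
        removed.append(target)
    return removed, H

if __name__ == "__main__":
    G = nx.karate_club_graph()
    removed, residual = iterative_targeting(G, 5)
    lcc = max(nx.connected_components(residual), key=len)
    print(removed, len(lcc))                  # removed nodes and residual LCC size
```
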
21 pages, 599 KB  
Article
Healthcare Expenditure and COVID-19 in Europe: Correlation, Entropy, and Functional Data Analysis-Based Prediction of Hospitalizations and ICU Admissions
by Patrycja Hęćka, Wiktor Ejsmont and Marek Biernacki
Entropy 2025, 27(9), 962; https://doi.org/10.3390/e27090962 - 16 Sep 2025
Viewed by 511
Abstract
This article aims to analyze the correlation between healthcare expenditure per capita in 2021 and the sum of the number of hospitalized patients, ICU admissions, confirmed COVID-19 cases, and deaths in a selected period of time. The analysis covers 2017 (before the pandemic), 2021 (during the pandemic), and 2022/2023 (the initial post-pandemic recovery period). To assess the variability and stability of pandemic dynamics across countries, we compute Shannon entropy for hospitalization and ICU admission data. Additionally, we examine functional data on hospitalizations, ICU patients, confirmed cases, and deaths during a selected period of the COVID-19 pandemic in several European countries. To achieve this, we transform the data into smooth functions and apply principal component analysis along with a multiple function-on-function linear regression model to predict the number of hospitalizations and ICU patients. Full article
(This article belongs to the Special Issue Entropy-Based Time Series Analysis: Theory and Applications)

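As a point of reference for the entropy computation mentioned in the abstract, here is a minimal sketch of the Shannon entropy of a hospitalization time series; treating the counts over the period as an empirical distribution is an illustrative assumption, not necessarily the paper's exact convention.

```python
# Minimal sketch: Shannon entropy of a count time series, with the daily/weekly
# counts normalized into an empirical distribution over the period (assumption).
import numpy as np

def shannon_entropy(counts):
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()                  # empirical distribution over the period
    return float(-(p * np.log(p)).sum())

hospitalizations = [120, 135, 160, 150, 90, 60, 45]   # hypothetical weekly counts
print(shannon_entropy(hospitalizations))              # higher = more evenly spread
```
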
14 pages, 255 KB  
Article
The Retention of Information in the Presence of Increasing Entropy Using Lie Algebras Defines Fibonacci-Type Sequences
by Joseph E. Johnson
Symmetry 2025, 17(9), 1454; https://doi.org/10.3390/sym17091454 - 4 Sep 2025
Viewed by 471
Abstract
In the general linear Lie algebra of continuous linear transformations in n dimensions, we show that unequal Abelian scaling transformations on the components of a vector can stabilize the system information in the presence of Markov component transformations on the vector, which, alone, would lead to increasing entropy. The more interesting results follow from seeking Diophantine (integer) solutions, with the result that the system can be stabilized with constant information for each of a set of entropy rates (k = 1, 2, 3, …). The first of these—the simplest—where k = 1, results in the Fibonacci sequence, with information determined by the golden mean, and Fibonacci interpolating functions. Other interesting results include the fact that a new set of higher-order generalized Fibonacci sequences, functions, golden means, and geometric patterns emerges for k = 2, 3, …. Specifically, we define the kth-order golden mean as Φ_k = k/2 + √((k/2)^2 + 1) for k = 1, 2, 3, …. One can easily observe that a right triangle with legs of 1 and k/2 has a hypotenuse of √((k/2)^2 + 1). Thus, the sum of the k/2 leg and the hypotenuse of such a triangle gives, geometrically, the exact value of the golden mean for any value of k, relative to the third side of unit length. The sequential powers of the 2×2 matrix with entries (k^2 + 1, k; k, 1) for any integer value of k provide a generalized Fibonacci sequence. Also, using Φ_k = k/2 + √((k/2)^2 + 1) for k = 1, 2, 3, …, one can easily prove that Φ_k = k + 1/Φ_k, which is a generalization of the familiar equation Φ = 1 + 1/Φ. We suggest that one could look for these new ratios and patterns in nature, with the possibility that all of these systems are connected with the retention of information in the presence of increasing entropy. Thus, we show that two components of the general linear Lie algebra (GL(n,R)), acting simultaneously with certain parameters, can stabilize the information content of a vector over time. Full article
(This article belongs to the Special Issue Supersymmetry Approaches in Quantum Mechanics and Field Theory)
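A short worked check (not taken from the paper) of the stated identity, obtained by rationalizing the reciprocal of Φ_k:

```latex
% Worked check: rationalizing the reciprocal of Phi_k gives the stated identity.
\[
  \Phi_k = \frac{k}{2} + \sqrt{\Big(\frac{k}{2}\Big)^{2} + 1}
  \;\Longrightarrow\;
  \frac{1}{\Phi_k}
  = \frac{1}{\frac{k}{2} + \sqrt{(\frac{k}{2})^{2} + 1}}
  = \sqrt{\Big(\frac{k}{2}\Big)^{2} + 1} - \frac{k}{2},
\]
\[
  \text{hence}\quad
  k + \frac{1}{\Phi_k}
  = \frac{k}{2} + \sqrt{\Big(\frac{k}{2}\Big)^{2} + 1}
  = \Phi_k,
  \qquad
  \Phi_1 = \frac{1 + \sqrt{5}}{2}\ \text{(the golden mean)}.
\]
```
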
18 pages, 862 KB  
Article
Integration of Multi-Criteria Decision-Making and Dimensional Entropy Minimization in Furniture Design
by Anna Jasińska and Maciej Sydor
Information 2025, 16(8), 692; https://doi.org/10.3390/info16080692 - 14 Aug 2025
Viewed by 614
Abstract
Multi-criteria decision analysis (MCDA) in furniture design is challenged by increasing product complexity and component proliferation. This study introduces a novel framework that integrates entropy reduction—achieved through dimensional standardization and modularity—as a core factor in the MCDA methodologies. The framework addresses both individual furniture evaluation and product family optimization through systematic complexity reduction. The research employed a two-phase methodology. First, a comparative analysis evaluated two furniture variants (laminated particleboard versus oak wood) using the Weighted Sum Model (WSM) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). The divergent rankings produced by these methods revealed inherent evaluation ambiguities stemming from their distinct mathematical foundations, highlighting the need for additional decision criteria. Building on these findings, the study further examined ten furniture variants, identifying the potential to transform their individual components into universal components, applicable across various furniture variants (or configurations) in a furniture line. The proposed dimensional modifications enhance modularity and interoperability within product lines, simplifying design processes, production, warehousing logistics, product servicing, and liquidation at end of lifetime. The integration of entropy reduction as a quantifiable criterion within MCDA represents a significant methodological advancement. By prioritizing dimensional standardization and modularity, the framework reduces component variety while maintaining design flexibility. This approach offers furniture manufacturers a systematic method for balancing product diversity with operational efficiency, addressing a critical gap in current design evaluation practices. Full article
(This article belongs to the Special Issue New Applications in Multiple Criteria Decision Analysis, 3rd Edition)

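For readers unfamiliar with the Weighted Sum Model used in the comparison, a minimal sketch follows; the criteria, weights, and scores are illustrative placeholders, not the study's data.

```python
# Minimal Weighted Sum Model (WSM) sketch for ranking two design variants.
# The criteria, weights, and normalized scores below are hypothetical.
import numpy as np

criteria = ["cost", "durability", "aesthetics", "weight"]
weights = np.array([0.35, 0.30, 0.20, 0.15])          # must sum to 1
# Normalized scores (higher = better) for two hypothetical variants.
scores = {
    "laminated particleboard": np.array([0.9, 0.6, 0.5, 0.8]),
    "oak wood":                np.array([0.4, 0.9, 0.9, 0.5]),
}
ranking = sorted(scores, key=lambda v: float(weights @ scores[v]), reverse=True)
for variant in ranking:
    print(variant, round(float(weights @ scores[variant]), 3))
```
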
17 pages, 386 KB  
Article
A Horizon-as-Apparatus Model That Reproduces Black Hole Thermodynamics
by Daegene Song
Entropy 2025, 27(8), 859; https://doi.org/10.3390/e27080859 - 14 Aug 2025
Viewed by 809
Abstract
We present a measurement-driven model in which the black hole horizon functions as a classical apparatus, with Planck-scale patches acting as detectors for quantum field modes. This approach reproduces the Bekenstein–Hawking area law S_BH = A/(4ℓ_p^2) and provides a concrete statistical interpretation of the 1/4 factor, while adhering to established principles rather than deriving the entropy anew from first principles. Each patch generates a thermal ensemble (∼0.25 nat per mode), and summing over area-scaling patches yields the total entropy. Quantum simulations incorporating a realistic Hawking spectrum produce S_k = 0.257 nat (3% above 0.25 nat), and we outline testable predictions for analogue systems. Our main contribution is the horizon-as-apparatus mechanism and its information-theoretic bookkeeping. Full article
(This article belongs to the Special Issue Coarse and Fine-Grained Aspects of Gravitational Entropy)

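The bookkeeping implied by the abstract can be restated as a back-of-envelope sum (a restatement, not the paper's derivation): roughly A/ℓ_p^2 Planck-area patches, each contributing about 1/4 nat, reproduce the area law.

```latex
% Back-of-envelope bookkeeping implied by the abstract:
% N Planck-area patches, each contributing roughly 1/4 nat, reproduce the area law.
\[
  N \simeq \frac{A}{\ell_p^{2}},
  \qquad
  S \simeq N \cdot \tfrac{1}{4}\,\mathrm{nat}
    = \frac{A}{4\,\ell_p^{2}}
    = S_{\mathrm{BH}} .
\]
```
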
34 pages, 56730 KB  
Article
Land Consolidation Potential Assessment by Using the Production–Living–Ecological Space Framework in the Guanzhong Plain, China
by Ziyi Xie, Siying Wu, Xin Liu, Hejia Shi, Mintong Hao, Weiwei Zhao, Xin Fu and Yepeng Liu
Sustainability 2025, 17(15), 6887; https://doi.org/10.3390/su17156887 - 29 Jul 2025
Viewed by 605
Abstract
Land consolidation (LC) is a sustainability-oriented policy tool designed to address land fragmentation, inefficient spatial organization, and ecological degradation in rural areas. This research proposes a Production–Living–Ecological (PLE) spatial utilization efficiency evaluation system, based on an integrated methodological framework combining Principal Component Analysis (PCA), Entropy Weight Method (EWM), Attribute-Weighting Method (AWM), Linear Weighted Sum Method (LWSM), Threshold-Verification Coefficient Method (TVCM), Jenks Natural Breaks (JNB) classification, and the Obstacle Degree Model (ODM). The framework is applied to Qian County, located in the Guanzhong Plain in Shaanxi Province. The results reveal three key findings: (1) PLE efficiency exhibits significant spatial heterogeneity. Production efficiency shows a spatial pattern characterized by high values in the central region that gradually decrease toward the surrounding areas. In contrast, living efficiency demonstrates higher values in the eastern and western regions, while remaining relatively low in the central area. Moreover, ecological efficiency shows a marked advantage in the northern region, indicating a distinct south–north gradient. (2) Integrated efficiency consolidation potential zones present distinct spatial distributions. Preliminary consolidation zones are primarily located in the western region; priority zones are concentrated in the south; and intensive consolidation zones are clustered in the central and southeastern areas, with sporadic distributions in the west and north. (3) Five primary obstacle factors hinder land use efficiency: intensive utilization of production land (PC1), agricultural land reutilization intensity (PC2), livability of living spaces (PC4), ecological space security (PC7), and ecological space fragmentation (PC8). These findings provide theoretical insights and practical guidance for formulating targeted LC strategies, optimizing rural spatial structures, and advancing sustainable development in similar regions. Full article

25 pages, 2377 KB  
Article
Assessment of Storm Surge Disaster Response Capacity in Chinese Coastal Cities Using Urban-Scale Survey Data
by Li Zhu and Shibai Cui
Water 2025, 17(15), 2245; https://doi.org/10.3390/w17152245 - 28 Jul 2025
Viewed by 743
Abstract
Currently, most studies evaluating storm surges are conducted at the provincial level, and there is a lack of detailed research focusing on cities. This paper focuses on the urban scale, using fine-scale data on coastal areas obtained from remote sensing images. The research is based on the Hazard–Exposure–Vulnerability (H-E-V) framework and PPRR (Prevention, Preparedness, Response, and Recovery) crisis management theory, and takes 52 Chinese coastal cities as its research subjects. The evaluation system for the disaster response capabilities of Chinese coastal cities was constructed based on three aspects: the stability of the disaster-incubating environment (S), the risk of disaster-causing factors (R), and the vulnerability of disaster-bearing bodies (V). The significance of this study is that the storm surge response capability of China's coastal cities can be analyzed from the evaluation results, and the evaluation model can be used to identify each city's deficiencies. The storm surge disaster response capabilities of the coastal cities were scored using the entropy-weighted TOPSIS method and the weighted rank-sum ratio (WRSR), and the results were analyzed. The results indicate that Wenzhou has the best comprehensive disaster response capability, while Yancheng has the worst. Moreover, Tianjin, Ningde, and Shenzhen performed well in the three aspects of vulnerability of disaster-bearing bodies, risk of disaster-causing factors, and stability of the disaster-incubating environment, respectively. On the contrary, Dandong (tied with Qinzhou), Jiaxing, and Chaozhou performed poorly in the above three areas. Full article
(This article belongs to the Special Issue Advanced Research on Marine Geology and Sedimentology)

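A minimal sketch of generic entropy-weighted TOPSIS scoring, the technique named in the abstract; the indicator matrix below is hypothetical (benefit-type indicators are assumed), and the study's S/R/V indicator system and WRSR step are not reproduced.

```python
# Minimal sketch of entropy-weighted TOPSIS scoring on a hypothetical
# cities-by-indicators matrix (benefit-type indicators assumed).
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rows are alternatives, columns are indicators."""
    P = X / X.sum(axis=0)                                   # column proportions
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        e = -np.nansum(P * np.log(P), axis=0) / np.log(n)   # entropy per indicator
    d = 1.0 - e                                             # degree of divergence
    return d / d.sum()

def topsis_scores(X, w):
    """Relative closeness to the ideal solution for benefit indicators."""
    V = w * X / np.linalg.norm(X, axis=0)                   # weighted, normalized
    best, worst = V.max(axis=0), V.min(axis=0)
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)

X = np.array([[0.8, 120.0, 3.2],                            # hypothetical indicators
              [0.5,  90.0, 4.1],
              [0.9, 150.0, 2.7]])
w = entropy_weights(X)
print(w, topsis_scores(X, w))
```
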
20 pages, 927 KB  
Article
An Optimization Model with “Perfect Rationality” for Expert Weight Determination in MAGDM
by Yuetong Liu, Chaolang Hu, Shiquan Zhang and Qixiao Hu
Mathematics 2025, 13(14), 2286; https://doi.org/10.3390/math13142286 - 16 Jul 2025
Cited by 1 | Viewed by 433
Abstract
Given the evaluation data of all the experts in multi-attribute group decision making, this paper establishes an optimization model for learning and determining expert weights based on minimizing the sum of the differences between the individual evaluation and the overall consistent evaluation results. The paper proves the uniqueness of the solution of the optimization model and rigorously proves that the expert weights obtained by the model have “perfect rationality”, i.e., the weights are inversely proportional to the distance to the “overall consistent scoring point”. Based on the above characteristics, the optimization problem is further transformed into solving a system of nonlinear equations to obtain the expert weights. Finally, numerical experiments are conducted to verify the rationality of the model and the feasibility of transforming the problem into a system of nonlinear equations. Numerical experiments demonstrate that the deviation metric for the expert weights produced by our optimization model is significantly lower than that obtained under equal weighting or the entropy weight method, and it approaches zero. Within numerical tolerance, this confirms the model’s “perfect rationality”. Furthermore, the weights determined by solving the corresponding nonlinear equations coincide exactly with the optimization solution, indicating that a dedicated algorithm grounded in perfect rationality can directly solve the model. Full article

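One way to read the stated "inverse distance" property is as a fixed point: weights proportional to the reciprocal of each expert's distance from the weighted consensus point. The sketch below iterates that reading to self-consistency; it is an illustrative assumption, not the paper's optimization model or its dedicated algorithm.

```python
# Illustrative fixed-point reading of the stated property (an assumption):
# weights inversely proportional to each expert's distance from the weighted
# consensus point, iterated until self-consistent.
import numpy as np

def expert_weights(X, tol=1e-10, max_iter=1000):
    """X: (num_experts, num_attributes) evaluation matrix."""
    m = X.shape[0]
    w = np.full(m, 1.0 / m)
    for _ in range(max_iter):
        consensus = w @ X                          # weighted "consistent scoring point"
        d = np.linalg.norm(X - consensus, axis=1) + 1e-12
        w_new = (1.0 / d) / (1.0 / d).sum()        # inverse-distance weights
        if np.max(np.abs(w_new - w)) < tol:
            break
        w = w_new
    return w

X = np.array([[7.0, 8.0, 6.5],                     # hypothetical expert scores
              [7.2, 7.9, 6.8],
              [5.0, 6.0, 9.0]])
print(expert_weights(X))
```
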
13 pages, 3313 KB  
Article
CT Texture Patterns Reflect HPV Status but Not Histological Differentiation in Oropharyngeal Squamous Cell Carcinoma
by Lays Assolini Pinheiro de Oliveira, Caio Elias Irajaya Lobo Peresi, Daniel Vitor Aguiar Nozaki, Ericka Francislaine Dias Costa, Lana Ferreira Santos, Carmen Silvia Passos Lima, Sérgio Lúcio Pereira de Castro Lopes and Andre Luiz Ferreira Costa
Cancers 2025, 17(14), 2317; https://doi.org/10.3390/cancers17142317 - 11 Jul 2025
Cited by 1 | Viewed by 646
Abstract
Background: Texture analysis (TA) has shown promise in characterizing intratumoral heterogeneity from imaging data. We add to the literature that shows its capability to differentiate oropharyngeal cancers based on HPV status. Methods: Multislice CT analysis was performed in 120 patients with confirmed OP SCC: a single 5 mm region of interest was placed on three consecutive homogeneous CT slices per patient. Texture features were extracted using gray-level co-occurrence matrices averaged per patient. HPV status (via p16 IHC and molecular confirmation) and differentiation grade (i.e., good, moderate, and poor) were recorded. Non-parametric statistical tests assessed differences between subgroups. Results: Seven texture parameters (i.e., angular second moment, contrast, sum of squares, sum entropy, entropy, inverse difference moment, and difference variance) differed significantly between HPV+ and HPV− tumors (all p < 0.05). HPV+ tumors exhibited increased heterogeneity and complexity on CT imaging. No texture feature correlated with histological grade. Conclusions: This study adds to the growing evidence that CT-based TA can assess HPV status in OP SCC. TA may be promising, though it requires further validation as an adjunctive method for integration into radiomics workflows to develop predictive models for diagnosis, prognosis, and treatment planning. Full article
(This article belongs to the Collection Imaging Biomarker in Oncology)

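A minimal numpy sketch of the kind of gray-level co-occurrence (GLCM) features listed in the results (angular second moment, contrast, entropy, inverse difference moment); the toy patch, gray-level binning, and offset are illustrative, and the study's ROI handling and per-patient averaging are not reproduced.

```python
# Minimal numpy sketch of GLCM texture features (ASM, contrast, entropy, IDM)
# on a toy 8-level patch; not the study's preprocessing or ROI pipeline.
import numpy as np

def glcm(image, levels, dx=1, dy=0):
    """Symmetric, normalized co-occurrence matrix for one pixel offset."""
    P = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            i, j = image[y, x], image[y + dy, x + dx]
            P[i, j] += 1
            P[j, i] += 1
    return P / P.sum()

def texture_features(P):
    i, j = np.indices(P.shape)
    nz = P[P > 0]
    return {
        "ASM": float((P ** 2).sum()),                      # angular second moment
        "contrast": float((P * (i - j) ** 2).sum()),
        "entropy": float(-(nz * np.log(nz)).sum()),
        "IDM": float((P / (1.0 + (i - j) ** 2)).sum()),    # inverse difference moment
    }

rng = np.random.default_rng(0)
roi = rng.integers(0, 8, size=(32, 32))                    # toy 8-level "CT" patch
print(texture_features(glcm(roi, levels=8)))
```
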
19 pages, 2054 KB  
Article
Enhancing Multi-Label Chest X-Ray Classification Using an Improved Ranking Loss
by Muhammad Shehzad Hanif, Muhammad Bilal, Abdullah H. Alsaggaf and Ubaid M. Al-Saggaf
Bioengineering 2025, 12(6), 593; https://doi.org/10.3390/bioengineering12060593 - 31 May 2025
Viewed by 2590
Abstract
This article addresses the non-trivial problem of classifying thoracic diseases in chest X-ray (CXR) images. A single CXR image may exhibit multiple diseases, making this a multi-label classification problem. Additionally, the inherent class imbalance makes the task even more challenging as some diseases occur more frequently than others. Our methodology is based on transfer learning, aiming to fine-tune a pretrained DenseNet121 model using CXR images from the NIH Chest X-ray14 dataset. Training from scratch would require a large-scale dataset containing millions of images, which is not available in the public domain for this multi-label classification task. To address the class imbalance problem, we propose a rank-based loss derived from the Zero-bounded Log-sum-exp and Pairwise Rank-based (ZLPR) loss, which we refer to as focal ZLPR (FZLPR). In designing FZLPR, we draw inspiration from the focal loss, where the objective is to emphasize hard-to-classify examples (instances of rare diseases) during training compared to well-classified ones. We achieve this by incorporating a "temperature" parameter to scale the label scores predicted by the model during training in the original ZLPR loss function. Experimental results on the NIH Chest X-ray14 dataset demonstrate that FZLPR loss outperforms other loss functions including binary cross entropy (BCE) and focal loss. Moreover, by using test-time augmentations, our model trained using FZLPR loss achieves an average AUC of 80.96%, which is competitive with existing approaches. Full article
(This article belongs to the Special Issue Machine Learning and Deep Learning Applications in Healthcare)

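A sketch of a ZLPR-style multi-label loss with a temperature parameter, in the spirit of the FZLPR idea described above; the exact FZLPR formulation is given in the paper, and rescaling the logits by the temperature before the log-sum-exp terms is an assumption made here for illustration.

```python
# Sketch of a ZLPR-style multi-label loss with a temperature parameter.
# The temperature rescaling of the logits is an illustrative assumption,
# not necessarily the paper's exact FZLPR formulation.
import torch

def zlpr_with_temperature(logits, targets, temperature=1.0):
    """logits, targets: (batch, num_labels); targets in {0, 1}."""
    s = logits / temperature
    # Zero-bounded log-sum-exp over negative labels (their scores should be low)...
    neg = s.masked_fill(targets.bool(), float("-inf"))
    loss_neg = torch.logsumexp(
        torch.cat([torch.zeros_like(neg[:, :1]), neg], dim=1), dim=1)
    # ...and over positive labels with flipped sign (their scores should be high).
    pos = (-s).masked_fill(~targets.bool(), float("-inf"))
    loss_pos = torch.logsumexp(
        torch.cat([torch.zeros_like(pos[:, :1]), pos], dim=1), dim=1)
    return (loss_neg + loss_pos).mean()

logits = torch.randn(4, 14)                     # e.g., 14 thoracic disease labels
targets = (torch.rand(4, 14) > 0.8).float()
print(zlpr_with_temperature(logits, targets, temperature=0.5))
```
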
31 pages, 19763 KB  
Article
Square-Based Division Scheme for Image Encryption Using Generalized Fibonacci Matrices
by Panagiotis Oikonomou, George K. Kranas, Maria Sapounaki, Georgios Spathoulas, Aikaterini Aretaki, Athanasios Kakarountas and Maria Adam
Mathematics 2025, 13(11), 1781; https://doi.org/10.3390/math13111781 - 27 May 2025
Viewed by 615
Abstract
This paper proposes a novel image encryption and decryption scheme, called Square Block Division-Fibonacci (SBD-Fibonacci), which dynamically partitions any input image into optimally sized square blocks to enable efficient encryption without resizing or distortion. The proposed encryption scheme can dynamically adapt to the image dimensions and ensure compatibility with images of varying and high resolutions, while it serves as a yardstick for any symmetric-key image encryption algorithm. An optimization model, combined with the Lagrange Four-Square theorem, minimizes trivial block sizes, strengthening the encryption structure. Encryption keys are generated using the direct sum of generalized Fibonacci matrices, ensuring key matrix invertibility and strong diffusion properties and security levels. Experimental results on widely-used benchmark images and a comparative analysis against State-of-the-Art encryption algorithms demonstrate that SBD-Fibonacci achieves high entropy, strong resistance to differential and statistical attacks, and efficient runtime performance—even for large images. Full article
(This article belongs to the Section E1: Mathematics and Computer Science)

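The key-construction ingredient named in the abstract, a direct sum of generalized Fibonacci matrices, can be sketched as follows; Q_k = [[k, 1], [1, 0]] is one common "generalized Fibonacci" convention assumed here, and the block-size optimization and full SBD-Fibonacci pipeline are not reproduced.

```python
# Sketch: a block-diagonal key matrix built as the direct sum of powers of
# generalized Fibonacci matrices Q_k = [[k, 1], [1, 0]] (assumed convention).
# Each block has determinant (-1)^n, so the key matrix is always invertible.
import numpy as np
from scipy.linalg import block_diag

def fib_matrix_power(k, n):
    """n-th power of Q_k = [[k, 1], [1, 0]]."""
    return np.linalg.matrix_power(np.array([[k, 1], [1, 0]], dtype=np.int64), n)

# Direct sum of three Fibonacci-matrix powers -> a 6x6 block-diagonal key matrix.
key = block_diag(fib_matrix_power(1, 8), fib_matrix_power(2, 5), fib_matrix_power(3, 4))
print(key)
print(round(abs(np.linalg.det(key))))           # product of +/-1 blocks -> 1, invertible
```
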
37 pages, 6596 KB  
Article
Optimizing Route Planning via the Weighted Sum Method and Multi-Criteria Decision-Making
by Guanquan Zhu, Minyi Ye, Xinqi Yu, Junhao Liu, Mingju Wang, Zihang Luo, Haomin Liang and Yubin Zhong
Mathematics 2025, 13(11), 1704; https://doi.org/10.3390/math13111704 - 22 May 2025
Viewed by 1704
Abstract
Choosing the optimal path in planning is a complex task due to the numerous options and constraints; this is known as the tourist trip design problem (TTDP). This study aims to achieve path optimization through the weighted sum method and multi-criteria decision analysis. Firstly, this paper proposes a weighted sum optimization method using a comprehensive evaluation model to address TTDP, a complex multi-objective optimization problem. The goal of the research is to balance experience, cost, and efficiency by using the Analytic Hierarchy Process (AHP) and Entropy Weight Method (EWM) to assign subjective and objective weights to indicators such as ratings, duration, and costs. These weights are optimized using the Lagrange multiplier method and integrated into the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) model. Additionally, a weighted sum optimization method within the Traveling Salesman Problem (TSP) framework is used to maximize ratings while minimizing costs and distances. Secondly, this study compares seven heuristic algorithms—the genetic algorithm (GA), particle swarm optimization (PSO), tabu search (TS), genetic-particle swarm optimization (GA-PSO), the gray wolf optimizer (GWO), and ant colony optimization (ACO)—to solve the TOPSIS model, with GA-PSO performing the best. The study then introduces the Lagrange multiplier method to the algorithms, improving the solution quality of all seven heuristic algorithms, with an average solution quality improvement of 112.5% (from 0.16 to 0.34). The PSO algorithm achieves the best solution quality. Based on this, the study introduces a new variant of PSO, namely PSO with Laplace disturbance (PSO-LD), which incorporates a dynamic adaptive Laplace perturbation term to enhance global search capabilities, improving stability and convergence speed. The experimental results show that PSO-LD outperforms the baseline PSO and other algorithms, achieving higher solution quality and faster convergence speed. The Wilcoxon signed-rank test confirms significant statistical differences among the algorithms. This study provides an effective method for experience-oriented path optimization and offers insights into algorithm selection for complex TTDP problems. Full article

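A toy illustration of the weighted-sum route objective described above (maximize ratings while penalizing cost and travel distance); the POI data and weights are hypothetical, and brute-force enumeration stands in for the heuristic solvers (GA, PSO, and the others).

```python
# Toy weighted-sum route objective: reward ratings, penalize cost and distance.
# POI data, weights, and the brute-force search are illustrative placeholders.
from itertools import permutations

# Hypothetical points of interest: (rating, cost); plus a symmetric distance table.
pois = {"A": (4.5, 20.0), "B": (4.0, 10.0), "C": (4.8, 35.0)}
dist = {("A", "B"): 3.0, ("B", "C"): 5.0, ("A", "C"): 7.0}
w_rating, w_cost, w_dist = 0.5, 0.3, 0.2

def route_score(route):
    rating = sum(pois[p][0] for p in route)
    cost = sum(pois[p][1] for p in route)
    travel = sum(dist[tuple(sorted(pair))] for pair in zip(route, route[1:]))
    return w_rating * rating - w_cost * cost - w_dist * travel

best = max(permutations(pois), key=route_score)   # brute force stands in for GA/PSO
print(best, round(route_score(best), 2))
```
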
18 pages, 1890 KB  
Article
Symmetry-Entropy-Constrained Matrix Fusion for Dynamic Dam-Break Emergency Planning
by Shuai Liu, Dewei Yang, Hao Hu and Junping Wang
Symmetry 2025, 17(5), 792; https://doi.org/10.3390/sym17050792 - 20 May 2025
Viewed by 597
Abstract
Existing studies on ontology evolution lack automated mechanisms to balance semantic coherence and adaptability under real-time uncertainties, particularly in resolving spatiotemporal asymmetry and multidimensional coupling imbalances in dam-break scenarios. Traditional methods such as WordNet’s tree symmetry and FrameNet’s frame symmetry fail to formalize dynamic adjustments through quantitative metrics, leading to path dependency and delayed responses. This study addresses this gap by introducing a novel symmetry-entropy-constrained matrix fusion algorithm, which integrates algebraic direct sum operations and Hadamard product with entropy-driven adaptive weighting. The original contribution lies in the symmetry entropy metric, which quantifies structural deviations during fusion to systematically balance semantic stability and adaptability. This work formalizes ontology evolution as a symmetry-driven optimization process. Experimental results demonstrate that shared concepts between ontologies (s = 3) reduce structural asymmetry by 25% compared to ontologies (s = 1), while case studies validate the algorithm’s ability to reconcile discrepancies between theoretical models and practical challenges in evacuation efficiency and crowd dynamics. This advancement promotes the evolution of traditional emergency management systems towards an adaptive intelligent form. Full article
(This article belongs to the Section Mathematics)

25 pages, 349 KB  
Article
Quantum κ-Entropy: A Quantum Computational Approach
by Demosthenes Ellinas and Giorgio Kaniadakis
Entropy 2025, 27(5), 482; https://doi.org/10.3390/e27050482 - 29 Apr 2025
Viewed by 675
Abstract
A novel approach to the quantum version of κ-entropy that incorporates it into the conceptual, mathematical and operational framework of quantum computation is put forward. Various alternative expressions stemming from its definition emphasizing computational and algorithmic aspects are worked out: First, for the case of canonical Gibbs states, it is shown that κ-entropy is cast in the form of an expectation value for an observable that is determined. Also, an operational method named "the two-temperatures protocol" is introduced that provides a way to obtain the κ-entropy in terms of the partition functions of two auxiliary Gibbs states with temperatures κ-shifted above (the hot system) and κ-shifted below (the cold system) the original system temperature. That protocol provides physical procedures for evaluating entropy for any κ. Second, two novel additional ways of expressing the κ-entropy are further introduced. One is determined by a non-negative definite quantum channel, with a Kraus-like operator-sum representation and its extension to a unitary dilation via a qubit ancilla. Another is given as a simulation of the κ-entropy via the quantum circuit of a generalized version of the Hadamard test. Third, a simple inter-relation of the von Neumann entropy and the quantum κ-entropy is worked out and a bound on their difference is evaluated and interpreted. Also, the effect of quantum noise on the κ-entropy, implemented as a random unitary quantum channel acting on the system's density matrix, is addressed and a bound on the entropy, depending on the spectral properties of the noisy channel and the system's density matrix, is evaluated. The results obtained amount to a quantum computational toolbox for the κ-entropy that enhances its applicability in practical problems. Full article
(This article belongs to the Section Statistical Physics)

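For orientation, a sketch of the κ-entropy in one common convention, ln_κ(x) = (x^κ − x^(−κ))/(2κ) with S_κ(ρ) = −Tr[ρ ln_κ(ρ)], which recovers the von Neumann entropy as κ → 0; this convention is an assumption for illustration and does not reproduce the paper's two-temperatures protocol or circuit constructions.

```python
# Sketch of the kappa-entropy of a density matrix in one common convention
# (assumed): ln_k(x) = (x^k - x^(-k)) / (2k), S_k(rho) = -Tr[rho ln_k(rho)].
import numpy as np

def kappa_log(x, kappa):
    return np.log(x) if kappa == 0 else (x**kappa - x**(-kappa)) / (2 * kappa)

def kappa_entropy(rho, kappa):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]               # drop numerically zero eigenvalues
    return float(-np.sum(evals * kappa_log(evals, kappa)))

# Gibbs state of a two-level system at inverse temperature beta (illustrative).
beta, energies = 1.0, np.array([0.0, 1.0])
p = np.exp(-beta * energies) / np.exp(-beta * energies).sum()
rho = np.diag(p)

print(kappa_entropy(rho, 0.3))                 # kappa-entropy
print(kappa_entropy(rho, 1e-6))                # ~ von Neumann entropy (kappa -> 0)
```
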
31 pages, 19278 KB  
Article
Fractal Dimension of Pollutants and Urban Meteorology of a Basin Geomorphology: Study of Its Relationship with Entropic Dynamics and Anomalous Diffusion
by Patricio Pacheco and Eduardo Mera
Fractal Fract. 2025, 9(4), 255; https://doi.org/10.3390/fractalfract9040255 - 17 Apr 2025
Viewed by 410
Abstract
A total of 108 maximum Kolmogorov entropy (S_K) values, calculated by means of chaos theory, are obtained from 108 time series (TSs) (each consisting of 28,463 hourly data points). The total TSs are divided into 54 urban meteorological (temperature (T), relative humidity (RH) and wind speed magnitude (WS)) and 54 pollutants (PM10, PM2.5 and CO). The measurement locations (6) are located at different heights and the data recording was carried out in three periods, 2010–2013, 2017–2020 and 2019–2022, which determines a total of 3,074,004 data points. For each location, the sum of the maximum entropies of urban meteorology and the sum of the maximum entropies of pollutants, S_{K,MV} and S_{K,P}, are calculated and plotted against the measurement height h, generating six different curves for each of the three data-recording periods. The tangent of each curve is determined and multiplied by the average temperature value of each location according to the period, obtaining, in a first approximation, the magnitude of the entropic forces associated with urban meteorology (F_{K,MV}) and pollutants (F_{K,P}), respectively. It is verified that all the time series have a fractal dimension, and that the fractal dimension of the pollutants shows growth towards the most recent period. The entropic dynamics of pollutants dominates over that of urban meteorology. It is found that this greater influence favors subdiffusion processes (α < 1), which is consistent with a geographic basin with lower atmospheric resilience. By applying a heavy-tailed probability density analysis, it is shown that atmospheric pollution states are more likely, generating an extreme environment that favors the growth of respiratory diseases, keeps relative humidity low, makes heat islands more stable over time, and strengthens heat waves. Full article
