Search Results (1,228)

Search Parameters:
Keywords = shannon entropy

32 pages, 8889 KB  
Article
Geodiversity Assessment and Global Geopark Construction in Changzhi City, Shanxi Province, China
by Yong Lei, Jie Cui, Shuai Li, Feng Tian, Lu Tian, Zeliang Du, Mengyue Wen, Binghua Yan, Tongtong Jiao and Yang Zhang
Sustainability 2026, 18(3), 1252; https://doi.org/10.3390/su18031252 - 26 Jan 2026
Abstract
Objective: Given the global trend of ecological protection and sustainable development, Global Geoparks have become an essential platform for resource conservation and regional growth. Changzhi City in Shanxi Province, China, is actively applying for Global Geopark status, relying on its rich geoheritage sites, cultural history, and natural landscapes. This paper presents a systematic evaluation of the city’s geodiversity and relic value, analyzes the feasibility of establishing a Global Geopark in Changzhi City, and provides scientific support for Changzhi City’s Global Geopark application. Methods: Geodiversity data were collected by region using a 1:25,000 grid for sampling. Four methods were adopted for evaluation, namely, the Shannon diversity index, Simpson diversity index, entropy weight method (EWM), and Pielou evenness index. Upon comprehensive comparison of the four approaches, the most suitable approach was selected to produce the final results. For the value evaluation of the geoheritage, a combination of the analytic hierarchy process and the entropy weight method was employed. Results: (1) According to the results of all four methods, the geodiversity of Changzhi City is higher in the eastern and western regions and lower in the central area. (2) The geoheritage sites are mainly distributed in the eastern part of the city and have relatively high relic value. (3) Changzhi City contains abundant natural reserves and cultural resources, meeting the fundamental requirements for Global Geopark construction. Specifically, 38 townships across eight counties were identified as potential geopark areas, encompassing 54 geoheritage sites, 76 provincial-level or higher cultural-relic protection sites, and 15 provincial-level or higher natural protected areas, with a total area of 4458.51 km². Conclusions: Our results suggest that the Shannon diversity index is an effective tool for evaluating geodiversity in Changzhi City. Based on the region’s geological and natural conditions, the delineated geopark area is feasible. In summary, our findings provide essential references for the protection and sustainable development of geoheritage sites, geodiversity, and geoparks and offer strong theoretical and data support for Changzhi City’s Global Geopark application. Full article
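As a concrete illustration of three of the four indices named above, here is a minimal Python sketch on hypothetical geosite-type counts for one grid cell; the entropy weight method and the paper's actual grid data are not reproduced here.

```python
# Illustrative sketch (not the paper's code): Shannon, Simpson, and Pielou
# indices from hypothetical counts of geoheritage types in one sampling cell.
import math

counts = [12, 7, 5, 3, 1]            # hypothetical geoheritage-type counts
total = sum(counts)
p = [c / total for c in counts]

shannon = -sum(pi * math.log(pi) for pi in p if pi > 0)   # H = -sum p ln p
simpson = 1 - sum(pi ** 2 for pi in p)                    # D = 1 - sum p^2
pielou = shannon / math.log(len(p))                       # J = H / ln(S)

print(f"Shannon H = {shannon:.3f}, Simpson D = {simpson:.3f}, Pielou J = {pielou:.3f}")
```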

21 pages, 13708 KB  
Article
Image Encryption Using Chaotic Box Partition–Permutation and Modular Diffusion with PBKDF2 Key Derivation
by Javier Alberto Vargas Valencia, Mauricio A. Londoño-Arboleda, Hernán David Salinas Jiménez, Carlos Alberto Marín Arango and Luis Fernando Duque Gómez
J. Cybersecur. Priv. 2026, 6(1), 21; https://doi.org/10.3390/jcp6010021 - 22 Jan 2026
Abstract
This work presents a hybrid chaotic–cryptographic image encryption method that integrates a physical two-dimensional delta-kicked oscillator with a PBKDF2-HMAC-SHA256 key derivation function (KDF). The user-provided key material—a 12-character, human-readable key and four salt words—is transformed by the KDF into 256 bits of high-entropy data, which is then converted into 96 balanced decimal digits to seed the chaotic system. Encryption operates in the real number domain through a chaotic partition–permutation stage followed by modular diffusion. Experimental results confirm perfect reversibility, high randomness (Shannon entropy ≈ 7.9981), and negligible adjacent-pixel correlation. The method resists known- and chosen-plaintext attacks, showing no statistical dependence between plain and cipher images. Differential analysis yields NPCR ≈ 99.6% and UACI ≈ 33.9%, demonstrating complete diffusion. The PBKDF2-based key derivation expands the effective key space to 2^256, eliminates weak-key conditions, and ensures full reproducibility. The proposed approach bridges deterministic chaos and modern cryptography, offering a secure, verifiable framework for protecting sensitive images. Full article
(This article belongs to the Section Cryptography and Cryptology)
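The two measurable ingredients of this pipeline, PBKDF2-HMAC-SHA256 key derivation and byte-level Shannon entropy, can be sketched as follows. The key, salt, and iteration count are placeholders, and the paper-specific expansion to 96 balanced decimal digits is omitted.

```python
# Sketch of two pieces of the pipeline with assumed parameters (the paper's
# iteration count and digit-expansion scheme are not specified here).
import hashlib, math, os
from collections import Counter

key = hashlib.pbkdf2_hmac("sha256", b"12-char-key!", b"four salt words",
                          100_000, dklen=32)   # 256 bits of derived key material
print(key.hex())

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits/byte; 8.0 is the maximum for byte data."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

cipher = os.urandom(1 << 20)          # stand-in for a cipher image's bytes
print(f"entropy = {shannon_entropy(cipher):.4f} bits/byte")  # near 8 if random
```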

30 pages, 2546 KB  
Article
Entropy and Normalization in MCDA: A Data-Driven Perspective on Ranking Stability
by Ewa Roszkowska
Entropy 2026, 28(1), 114; https://doi.org/10.3390/e28010114 - 18 Jan 2026
Abstract
Normalization is a critical step in Multiple-Criteria Decision Analysis (MCDA) because it transforms heterogeneous criterion values into comparable information. This study examines normalization techniques through the lens of entropy, highlighting how criterion data structure shapes normalization behavior and ranking stability within TOPSIS (Technique for Order Preference by Similarity to Ideal Solution). Seven widely used normalization procedures are analyzed regarding mathematical properties, sensitivity to extreme values, treatment of benefit and cost criteria, and rank reversal. Normalization is treated as a source of uncertainty in MCDA outcomes, as different schemes can produce divergent rankings under identical decision settings. Shannon entropy is employed as a descriptive measure of information dispersion and structural uncertainty, capturing the heterogeneity and discriminatory potential of criteria rather than serving as a weighting mechanism. An illustrative experiment with ten alternatives and four criteria (two high-entropy, two low-entropy) demonstrates how entropy mediates normalization effects. Seven normalization schemes are examined, including vector, max, linear sum, and max–min procedures. For vector, max, and linear sum, cost-type criteria are treated using either linear inversion or reciprocal transformation, whereas max–min is implemented as a single method. This design separates the choice of normalization form from the choice of cost-criteria transformation, allowing a cleaner identification of their respective contributions to ranking variability. The analysis shows that normalization choice alone can cause substantial differences in preference values and rankings. High-entropy criteria tend to yield stable rankings, whereas low-entropy criteria amplify sensitivity, especially with extreme or cost-type data. These findings position entropy as a key mediator linking data structure with normalization-induced ranking variability and highlight the need to consider entropy explicitly when selecting normalization procedures. Finally, a practical entropy-based method for choosing normalization techniques is introduced to enhance methodological transparency and ranking robustness in MCDA. Full article
(This article belongs to the Special Issue Entropy Method for Decision Making with Uncertainty)
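A minimal sketch of the interplay the study examines, with made-up data: two of the seven normalization schemes plus normalized Shannon entropy as the descriptive dispersion measure.

```python
# Minimal sketch (assumed data): two normalizations from the study, plus
# Shannon entropy as a descriptive measure of a criterion's dispersion.
import numpy as np

x = np.array([3.0, 3.1, 2.9, 3.0, 25.0])     # benefit criterion with one outlier

vector_norm = x / np.linalg.norm(x)                  # r_i = x_i / sqrt(sum x^2)
maxmin_norm = (x - x.min()) / (x.max() - x.min())    # r_i = (x - min)/(max - min)

p = x / x.sum()                               # share-based distribution
H = -(p * np.log(p)).sum() / np.log(len(x))   # normalized Shannon entropy in [0,1]

# A low H flags a dominant extreme value, the case where the choice of
# normalization scheme matters most for the resulting ranking.
print(vector_norm.round(3), maxmin_norm.round(3), f"H = {H:.3f}")
```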

44 pages, 996 KB  
Article
Adaptive Hybrid Consensus Engine for V2X Blockchain: Real-Time Entropy-Driven Control for High Energy Efficiency and Sub-100 ms Latency
by Rubén Juárez and Fernando Rodríguez-Sela
Electronics 2026, 15(2), 417; https://doi.org/10.3390/electronics15020417 - 17 Jan 2026
Abstract
We present an adaptive governance engine for blockchain-enabled Vehicular Ad Hoc Networks (VANETs) that regulates the latency–energy–coherence trade-off under rapid topology changes. The core contribution is an Ideal Information Cycle (an operational abstraction of information injection/validation) and a modular VANET Engine implemented as a real-time control loop in NS-3.35. At runtime, the Engine monitors normalized Shannon entropies—informational entropy S over active transactions and spatial entropy H_spatial over occupancy bins (both on [0, 1])—and adapts the consensus mode (latency-feasible PoW versus signature/quorum-based modes such as PoS/FBA) together with rigor parameters via calibrated policy maps. Governance is formulated as a constrained operational objective that trades per-block resource expenditure (radio + cryptography) against a Quality-of-Information (QoI) proxy derived from delay/error tiers, while maintaining timeliness and ledger-coherence pressure. Cryptographic cost is traced through counted operations, E_crypto = e_h·n_hash + e_sign·n_sig, and coherence is tracked using the LCP-normalized definition D_ledger(t) computed from the longest common prefix (LCP) length across nodes. We evaluate the framework under urban/highway mobility, scheduled partitions, and bounded adversarial stressors (Sybil identities and Byzantine proposers), using 600 s runs with 30 matched random seeds per configuration and 95% bias-corrected and accelerated (BCa) bootstrap confidence intervals. In high-disorder regimes (S ≳ 0.8), the Engine reduces total per-block energy (radio + cryptography) by more than 90% relative to a fixed-parameter PoW baseline tuned to the same agreement latency target. A consensus-first triggering policy further lowers agreement latency and improves throughput compared with broadcast-first baselines. In the emphasized urban setting under high mobility (v = 30 m/s), the Engine keeps agreement/commit latency in the sub-100 ms range while maintaining finality typically within sub-150 ms ranges, bounds orphaning (≤10%), and reduces average ledger divergence below 0.07 at high spatial disorder. The main evaluation is limited to N ≤ 100 vehicles under full PHY/MAC fidelity. PoW targets are intentionally latency-feasible and are not intended to provide cryptocurrency-grade majority-hash security; operational security assumptions and mode transition safeguards are discussed in the manuscript. Full article
(This article belongs to the Special Issue Intelligent Technologies for Vehicular Networks, 2nd Edition)
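A toy version of the entropy signal the Engine monitors might look like the following; the 0.8 trigger level and the direction of the mode switch are illustrative assumptions, not the paper's calibrated policy maps.

```python
# Sketch of a normalized spatial-entropy monitor (illustrative threshold and
# mode mapping; the paper's policy maps are calibrated, not hard-coded).
import numpy as np

def normalized_entropy(counts):
    """Shannon entropy of a bin-occupancy histogram, normalized to [0, 1]."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return float(-(p * np.log(p)).sum() / np.log(len(counts)))

occupancy = [9, 8, 10, 7, 9, 8, 11, 10]      # vehicles per spatial bin
H_spatial = normalized_entropy(occupancy)    # near 1.0 for an even spread

# Hypothetical trigger: switch consensus mode in high-disorder regimes.
mode = "PoS/FBA" if H_spatial >= 0.8 else "latency-feasible PoW"
print(f"H_spatial = {H_spatial:.2f} -> consensus mode: {mode}")
```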

25 pages, 1436 KB  
Article
Entropy-Augmented Forecasting and Portfolio Construction at the Industry-Group Level: A Causal Machine-Learning Approach Using Gradient-Boosted Decision Trees
by Gil Cohen, Avishay Aiche and Ron Eichel
Entropy 2026, 28(1), 108; https://doi.org/10.3390/e28010108 - 16 Jan 2026
Abstract
This paper examines whether information-theoretic complexity measures enhance industry-group return forecasting and portfolio construction within a machine-learning framework. Using daily data for 25 U.S. GICS industry groups spanning more than three decades, we augment gradient-boosted decision tree models with Shannon entropy and fuzzy entropy computed from recent return dynamics. Models are estimated at weekly, monthly, and quarterly horizons using a strictly causal rolling-window design and translated into two economically interpretable allocation rules, a maximum-profit strategy and a minimum-risk strategy. Results show that the top-performing strategy, the weekly maximum-profit model augmented with Shannon entropy, achieves an accumulated return exceeding 30,000%, substantially outperforming both the baseline model and the fuzzy-entropy variant. On monthly and quarterly horizons, entropy and fuzzy entropy generate smaller but robust improvements by maintaining lower volatility and better downside protection. Industry allocations display stable and economically interpretable patterns: profit-oriented strategies concentrate primarily in cyclical and growth-sensitive industries such as semiconductors, automobiles, technology hardware, banks, and energy, while minimum-risk strategies consistently favor defensive industries including utilities, food, beverage and tobacco, real estate, and consumer staples. Overall, the results demonstrate that entropy-based complexity measures improve both economic performance and interpretability, yielding industry-rotation strategies that are simultaneously more profitable, more stable, and more transparent. Full article
(This article belongs to the Special Issue Entropy, Artificial Intelligence and the Financial Markets)
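A strictly causal rolling-window entropy feature of the kind described can be sketched as follows; the window length, bin count, and simulated returns are assumptions, not the paper's settings.

```python
# Sketch of a causal rolling-window Shannon entropy feature from daily returns
# (window length and bin count assumed; the paper's configuration may differ).
import numpy as np
import pandas as pd

def window_entropy(r: np.ndarray, bins: int = 10) -> float:
    counts, _ = np.histogram(r, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
returns = pd.Series(rng.normal(0, 0.01, 500))   # stand-in industry-group returns

# .rolling(...) uses only observations up to the current date, so feeding the
# feature to a model that predicts the NEXT period keeps the design causal.
entropy_feature = returns.rolling(60).apply(window_entropy, raw=True)
print(entropy_feature.tail())
```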

14 pages, 2976 KB  
Article
Extreme Values and Convergence of the Voronoi Entropy for 2D Random Point Processes and for Long-Range Order
by Mark Frenkel, Irina Legchenkova, Edward Bormashenko, Shraga Shoval and Michael Nosonovsky
Entropy 2026, 28(1), 95; https://doi.org/10.3390/e28010095 - 13 Jan 2026
Abstract
We investigate the asymptotic maximum value and convergence of the Voronoi Entropy (VE) for a 2D random point process (S = 1.690 ± 0.001) and point sets with long-range order characterized by hyperuniformity. We find that, for numbers of polygons above about n = 100, the VE ranges between S = 0 (an ordered set of seed points) and S = 1.69 (a random set of seed points). For circular regions with the dimensionless radius R normalized by the average distance between points, we identify two limits: Limit-1 (R = 2.5, 16 ± 6 points) is the minimum radius for which it is possible to construct a Voronoi diagram, and Limit-2 (R = 5.5, 96 ± 6 points) is the radius at which the VE reaches the saturation level. We also discuss examples of seed point patterns for which the VE exceeds the asymptotic value of S = 1.69. While the VE accounts only for neighboring polygons, covering the 2D plane imposes constraints on the number of polygons and the number of edges in polygons. Consequently, unlike the conventional Shannon Entropy, the VE captures some long-range order properties of the system. We calculate the VE for several hyperuniform sets of points and compare it with the values of exponents of collective density variables characterizing long-range correlations in the system. We show that the VE correlates with the latter up to a certain saturation level, after which the value of the VE falls to S = 0, and we explain this phenomenon. Full article
(This article belongs to the Section Statistical Physics)
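The Voronoi entropy S = −Σ_n P_n ln P_n, with P_n the fraction of cells having n edges, is straightforward to reproduce for a random point process. This sketch uses SciPy with a crude boundary correction (interior cells only), not the paper's exact protocol.

```python
# Sketch: Voronoi entropy of a random 2D point process. Boundary (open) cells
# are discarded as a rough edge correction.
import numpy as np
from collections import Counter
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)
points = rng.uniform(0, 1, size=(2000, 2))           # random seed points
vor = Voronoi(points)

edge_counts = [len(region) for region in vor.regions
               if region and -1 not in region]       # closed (interior) cells only
P = np.array(list(Counter(edge_counts).values()), dtype=float)
P /= P.sum()                                         # P_n: fraction of n-edge cells
S = float(-(P * np.log(P)).sum())
print(f"Voronoi entropy S = {S:.3f}  (random sets approach ~1.69)")
```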

30 pages, 1128 KB  
Article
Analysis of Technological Readiness Indexes for Offshore Renewable Energies in Ibero-American Countries
by Claudio Moscoloni, Emiliano Gorr-Pozzi, Manuel Corrales-González, Adriana García-Mendoza, Héctor García-Nava, Isabel Villalba, Giuseppe Giorgi, Gustavo Guarniz-Avalos, Rodrigo Rojas and Marcos Lafoz
Energies 2026, 19(2), 370; https://doi.org/10.3390/en19020370 - 12 Jan 2026
Abstract
The energy transition in Ibero-American countries demands significant diversification, yet the vast potential of offshore renewable energies (ORE) remains largely untapped. Slow adoption is often attributed to the hostile marine environment, high investment costs, and a lack of institutional, regulatory, and industrial readiness. A critical barrier for policymakers is the absence of methodologically robust tools to assess national preparedness. Existing indices typically rely on simplistic weighting schemes or are susceptible to known flaws, such as the rank reversal phenomenon, which undermines their credibility for strategic decision-making. This study addresses this gap by developing a multi-criteria decision-making (MCDM) framework based on a problem-specific synthesis of established optimization principles to construct a comprehensive Offshore Readiness Index (ORI) for 13 Ibero-American countries. The framework moves beyond traditional methods by employing an advanced weight-elicitation model rooted in the Robust Ordinal Regression (ROR) paradigm to analyze 42 sub-criteria across five domains: Regulation, Planning, Resource, Industry, and Grid. Its methodological core is a non-linear objective function that synergistically combines a Shannon entropy term to promote a maximally unbiased weight distribution and to prevent criterion exclusion, with an epistemic regularization penalty that anchors the solution to expert-derived priorities within each domain. The model is guided by high-level hierarchical constraints that reflect overarching policy assumptions, such as the primacy of Regulation and Planning, thereby ensuring strategic alignment. The resulting ORI ranks Spain first, followed by Mexico and Costa Rica. Spain’s leadership is underpinned by its exceptional performance in key domains, supported by specific enablers, such as a dedicated renewable energy roadmap. The optimized block weights validate the model’s structure, with Regulation (0.272) and Electric Grid (0.272) receiving the highest importance. In contrast, lower-ranked countries exhibit systemic deficiencies across multiple domains. This research offers a dual contribution: methodological innovation in readiness assessment and an actionable tool for policy instruments. The primary policy conclusion is clear: robust regulatory frameworks and strategic planning are the pivotal enabling conditions for ORE development, while industrial capacity and infrastructure are consequent steps that must follow, not precede, a solid policy foundation. Full article
(This article belongs to the Special Issue Advanced Technologies for the Integration of Marine Energies)
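The flavor of the weight-elicitation step can be conveyed by a toy objective: entropy maximization plus a quadratic anchor to expert priors on the weight simplex. The paper's full ROR formulation and hierarchical constraints are richer than this sketch, and the priors and trade-off parameter below are invented.

```python
# Toy sketch only: entropy term (unbiased weights) + penalty anchoring to
# hypothetical expert priors, constrained to the weight simplex.
import numpy as np
from scipy.optimize import minimize

w_expert = np.array([0.35, 0.30, 0.15, 0.10, 0.10])   # hypothetical expert priors
lam = 0.5                                              # assumed trade-off parameter

def objective(w):
    H = -(w * np.log(w + 1e-12)).sum()                 # Shannon entropy of weights
    return -H + lam * ((w - w_expert) ** 2).sum()      # maximize H, stay near priors

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1},)
res = minimize(objective, w_expert, bounds=[(1e-6, 1)] * 5, constraints=cons)
print(res.x.round(3))   # lands between uniform weights and the expert vector
```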

31 pages, 1304 KB  
Article
The Informational Birth of the Universe: A Theory of Everything from Quantum Complexity
by Gastón Sanglier Contreras, Roberto Alonso González-Lezcano and Eduardo J. López Fernández
Quantum Rep. 2026, 8(1), 4; https://doi.org/10.3390/quantum8010004 - 12 Jan 2026
Abstract
We propose a unified theoretical framework grounded in a Primordial Quantum Field (PQF)—a continuous, non-local informational substrate that precedes space-time and matter. The PQF is represented by a wave functional evolving in an abstract configuration space, where physical properties emerge through the self-organization of complexity. We introduce a novel physical quantity—complexity entropy S_c[ϕ]—which quantifies the structural organization of the PQF. Unlike traditional entropy measures (Shannon, von Neumann, Kolmogorov), S_c[ϕ] captures non-trivial coherence and functional correlations. We demonstrate how complexity gradients induce an emergent geometry, from which spacetime curvature, physical constants, and the arrow of time arise. The model predicts measurable phenomena such as entanglement waves and reinterprets dark energy as informational coherence pressure, suggesting empirical pathways for testing via highly correlated quantum systems. Full article
(This article belongs to the Special Issue Exclusive Feature Papers of Quantum Reports in 2024–2025)

21 pages, 4969 KB  
Article
Analysis of Temporal Changes in the Floating Vegetation and Algae Surface of the Water Bodies of Kis-Balaton Based on Aerial Image Classification and Meteorological Data
by Kristóf Kozma-Bognár, Angéla Anda, Ariel Tóth, Veronika Kozma-Bognár and József Berke
Geomatics 2026, 6(1), 3; https://doi.org/10.3390/geomatics6010003 - 3 Jan 2026
Abstract
Climate change and related weather extremes are increasingly having an impact on all aspects of life. The main objective of the research was to analyze the relationship between the most important meteorological elements and the image data of various water bodies of the Kis-Balaton wetland, Hungary. The primary question was which meteorological elements have a positive or negative influence on vegetational surface cover. Drones have facilitated the visual surveying and monitoring of challenging-to-reach water bodies in the area, including a lake and multiple channels. The individual channels had different flow conditions. Aerial surveys were conducted monthly, based on pre-prepared flight plans. Images captured by a Mavic 3 drone flying at an altitude of 150 m and equipped with a multispectral sensor were processed. The time-series images were aligned and assembled into orthophotos. The image details relevant to the research were segregated and classified using the Maximum Likelihood classification algorithm. The reliability of the image data used was checked by Shannon entropy and spectral fractal dimension measurements. The results of the classification were compared with the meteorological data collected by the QLC-50 automatic climate station at Keszthely. The investigations revealed that the surface cover of the examined water bodies differed between the two years but showed a kind of periodicity during the year. In periods when photosynthetic organisms multiplied in a higher proportion in the water body, higher monthly average air temperatures and higher monthly global solar radiation sums were observed. Full article
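The authors' spectral fractal dimension is a specific band-wise measure that is not reproduced here; as a generic stand-in, a standard box-counting dimension of a binary classified mask illustrates the idea of a fractal reliability metric.

```python
# Generic box-counting sketch on a binary vegetation mask. NB: the paper uses
# spectral fractal dimension, a different, band-wise measure; this standard
# box-counting dimension is only a conceptual stand-in.
import numpy as np

def box_counting_dimension(mask: np.ndarray) -> float:
    sizes = [2, 4, 8, 16, 32, 64]
    counts = []
    for s in sizes:
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(int(blocks.any(axis=(1, 3)).sum()))  # boxes touching the set
    # slope of log N(s) versus log(1/s) estimates the dimension
    slope, _ = np.polyfit(np.log(1 / np.array(sizes)), np.log(counts), 1)
    return float(slope)

mask = np.random.default_rng(3).random((512, 512)) < 0.3   # stand-in classified mask
print(f"box-counting dimension ~ {box_counting_dimension(mask):.2f}")  # ~2 (dense)
```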

23 pages, 2795 KB  
Article
A Bio-Inspired Approach to Sustainable Building Design Optimization: Multi-Objective Flow Direction Algorithm with One-Hot Encoding
by Ahmet Serhan Canbolat and Emre İsa Albak
Biomimetics 2026, 11(1), 31; https://doi.org/10.3390/biomimetics11010031 - 2 Jan 2026
Abstract
The urgent need for sustainable building design calls for advanced optimization methods that simultaneously address economic and environmental objectives, particularly those involving mixed discrete-continuous variables such as insulation material, heating source, and insulation thickness. While nature-inspired metaheuristics have shown promise in engineering optimization, their application to building envelope design remains limited, especially in handling discrete choices efficiently within a multi-objective framework. Inspired by the natural process of rainwater runoff and drainage basin dynamics, this study presents a novel hybrid approach integrating the Multi-Objective Flow Direction Algorithm (MOFDA) with One-Hot Encoding to optimize external wall insulation. This bio-inspired algorithm mimics how water seeks optimal paths across terrain, enabling effective navigation of complex design spaces with both categorical and continuous variables. The model aims to minimize total lifecycle costs and CO₂ emissions across Türkiye’s six updated climatic regions. Pareto-optimal solutions are created using MOFDA, after which the Complex Proportional Assessment (COPRAS) method, weighted by Shannon Entropy, selects the most balanced designs. The results reveal significant climate-dependent variations: in the warmest region, the cost-optimal thickness is 3.3 cm (Rock Wool), while the emission-optimal reaches 17.3 cm (Glass Wool). In colder regions, emission-driven scenarios consistently require up to 40 cm insulation, indicating a practical limit of current materials. Under balanced weighting, fuel preferences shift from LPG in milder climates to Fuel Oil in harsher climates. Notably, Shannon Entropy assigned a weight of 88–92% to emissions due to their wider variability across the Pareto front, underscoring the environmental priority in data-driven decisions. This study demonstrates that the bio-inspired MOFDA framework, enhanced with One-Hot Encoding, effectively handles mixed discrete-continuous optimization and provides a robust, climate-aware decision tool for sustainable building design, reinforcing the value of translating natural flow processes into engineering solutions. Full article
(This article belongs to the Section Biological Optimisation and Management)
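The mechanism behind the 88–92% emissions weight, entropy-based criterion weighting over a Pareto front, can be sketched on a hypothetical two-criterion front; the wider-spread column earns the larger weight.

```python
# Sketch of entropy-based criterion weighting over a hypothetical Pareto front
# (rows: candidate designs; columns: lifecycle cost, CO2 emissions).
import numpy as np

pareto = np.array([[10.0, 40.0],
                   [11.0, 25.0],
                   [12.0, 12.0],
                   [13.0,  5.0]])

P = pareto / pareto.sum(axis=0)                           # column-wise shares
e = -(P * np.log(P)).sum(axis=0) / np.log(len(pareto))    # entropy per criterion
w = (1 - e) / (1 - e).sum()                               # more spread -> more weight
print(w.round(3))   # the wider-spread emissions column receives the larger weight
```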

23 pages, 12759 KB  
Article
Mapping Urban Vitality: Geospatial Analysis of Commercial Diversity and Tourism
by Sié Cyriac Noufe, Rachid Belaroussi, Francis Dupin and Pierre-Olivier Vandanjon
Urban Sci. 2026, 10(1), 21; https://doi.org/10.3390/urbansci10010021 - 1 Jan 2026
Abstract
Business diversity in proximity-based environments is emerging as an important requirement in urban planning, especially with the rise of concepts such as the 15-min city, which aim to enhance urban vitality. While many studies have focused on assessing vitality through the conditions defined by Jane Jacobs, few have specifically measured commercial diversity and analyzed its relationship with place popularity, attendance, and tourism activity. Using geo-localized data on businesses and Google Maps reviews in Paris, a diversity index was constructed based on Shannon entropy over business categories—Culture and leisure, Food and beverage, Retail stores, Local services—and its correlations were explored through statistical analysis. The study reveals a higher level of commercial diversity in central areas compared to the outskirts, as indicated by spatial clustering analysis, along with a positive association between diversity and attendance. However, no significant relationship was observed between commercial diversity and the popularity of the selected establishments. These findings may inform policymakers and urban planners in designing more locally diversified cities and, more broadly, in promoting sustainable urban vitality. Full article
(This article belongs to the Special Issue GIS in Urban Planning and Spatial Analysis)
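Per-cell Shannon diversity over business categories, the core index of the study, reduces to a short groupby; the cell labels and business rows below are hypothetical.

```python
# Sketch (hypothetical data): Shannon diversity of business categories per
# spatial cell, the kind of index the study builds from geo-localized businesses.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "cell": ["A", "A", "A", "B", "B"],
    "category": ["Food and beverage", "Retail stores", "Culture and leisure",
                 "Food and beverage", "Food and beverage"],
})

def shannon(categories: pd.Series) -> float:
    p = categories.value_counts(normalize=True).to_numpy()
    return float(-(p * np.log(p)).sum())

diversity = df.groupby("cell")["category"].apply(shannon)
print(diversity)   # cell A is more diverse than the single-category cell B
```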

15 pages, 1829 KB  
Article
Longitudinal Analysis of Vulvovaginal Bacteriome Following Use of Water- and Silicone-Based Personal Lubricants: Stability, Spatial Specificity, and Clinical Implications
by Jose A. Freixas-Coutin, Jin Seo, Lingyao Su and Sarah Hood
Microorganisms 2026, 14(1), 82; https://doi.org/10.3390/microorganisms14010082 - 30 Dec 2025
Abstract
The vulvovaginal microbiome is a complex and dynamic ecosystem of microorganisms. The potential effects of common personal lubricants on its balance, which have implications for reproductive health, are still unknown. This study longitudinally assessed the impact of two commercially available lubricants on the composition and stability of the vaginal and vulvar bacteriome. Paired vaginal and vulvar swabs were collected at baseline and after repeated lubricant use, and the bacteriome was assessed using 16S rRNA gene amplicon sequencing. Alpha and beta diversity were assessed using Shannon entropy and Bray–Curtis dissimilarity, respectively. The results showed that the vaginal bacteriome was dominated by Lactobacillus and Firmicutes, while vulvar communities were more diverse and had higher abundances of Prevotella, Finegoldia, and Peptoniphilus. Both alpha and beta diversity measures indicated that the vaginal and vulvar bacteriome remained largely stable even after repeated lubricant use. Minor and non-significant changes in genus-level composition were observed, particularly in the vulvar samples. A moderate but significant correlation (Mantel r = 0.274, p = 0.001) was also observed between the vaginal and vulvar bacteriome. Overall, this study shows that short-term, repeated use of the water-based lubricant and the silicone-based lubricant tested in this study does not significantly disrupt the vaginal or vulvar bacteriome. Full article
(This article belongs to the Special Issue The Vaginal Microbiome in Health and Disease)
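The two diversity measures named in the abstract can be sketched on hypothetical genus-level count vectors; SciPy provides Bray–Curtis directly.

```python
# Sketch of the alpha/beta diversity measures on made-up genus-level counts.
import numpy as np
from scipy.spatial.distance import braycurtis

vaginal = np.array([900, 50, 30, 15, 5], dtype=float)       # Lactobacillus-dominated
vulvar  = np.array([300, 250, 200, 150, 100], dtype=float)  # more even community

def shannon(counts: np.ndarray) -> float:
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

print(f"alpha (Shannon): vaginal {shannon(vaginal):.2f}, vulvar {shannon(vulvar):.2f}")
print(f"beta (Bray-Curtis): {braycurtis(vaginal, vulvar):.2f}")
```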

16 pages, 365 KB  
Article
Disentangling Brillouin’s Negentropy Law of Information and Landauer’s Law on Data Erasure
by Didier Lairez
Entropy 2026, 28(1), 37; https://doi.org/10.3390/e28010037 - 27 Dec 2025
Abstract
The link between information and energy introduces the observer and their knowledge into the understanding of a fundamental quantity in physics. Two approaches compete to account for this link—Brillouin’s negentropy law of information and Landauer’s law on data erasure—which are often confused. The first, based on Clausius’ inequality and Shannon’s mathematical results, is very robust, whereas the second, based on the simple idea that information requires a material embodiment (data bits), is now perceived as more physical and therefore prevails. In this paper, we show that Landauer’s idea results from a confusion between information (a global emergent concept) and data (a local material object). This confusion leads to many inconsistencies and is incompatible with thermodynamics and information theory. The reason it prevails is interpreted as being due to a frequent tendency of materialism towards reductionism, neglecting emergence and seeking to eliminate the role of the observer. This is a paradoxical trend, considering that it is often accompanied by the materialist idea that all scientific knowledge nevertheless originates from observation. Information and entropy are actually emergent quantities introduced in the theory by convention. Full article
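For orientation, the two bounds being disentangled are usually stated as follows; these are standard textbook forms, not the paper's own notation.

```latex
% Brillouin: acquiring one bit of information costs at least
\Delta S \;\ge\; k_B \ln 2
% Landauer: erasing one bit of data dissipates at least
E_{\text{erase}} \;\ge\; k_B T \ln 2 \;\approx\; 2.87 \times 10^{-21}\,\text{J at } T = 300\,\text{K}
```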

23 pages, 7685 KB  
Article
Literal Pattern Analysis of Texts Written with the Multiple Form of Characters: A Comparative Study of the Human and Machine Styles
by Kazuya Hayata
Entropy 2026, 28(1), 36; https://doi.org/10.3390/e28010036 - 27 Dec 2025
Abstract
Aside from languages having no form of written expression, it is usually the case with every language on this planet that texts are written in a single kind of character. But every rule has its exceptions. A very rare exception is Japanese, texts of which are written in three kinds of characters. In European languages, no one can find a text written in a mixture of the Latin, Cyrillic, and Greek alphabets. For several Japanese texts currently available, we conduct a quantitative analysis of how the three characters are mixed, using a methodology based on a binary pattern approach to sequences generated by a coding procedure. Specifically, we consider the two different texts of the former and present constitutions as well as a famous American story that has been translated at least 13 times into Japanese. For the latter, a comparison is made among the human translations and four machine translations by DeepL and Google Translate. As metrics of divergence and diversity, the Hellinger distance, chi-square value, normalized Shannon entropy, and Simpson’s diversity index are employed. Numerical results suggest that, in terms of the entropy, the 17 translations form three clusters and that, overall, the machine-translated texts exhibit higher entropy than the human translations. This finding suggests that the present method can provide a tool useful for stylometry and author attribution. Through comparison with the diversity index, the capabilities of the entropic measure are confirmed. Lastly, in addition to the abovementioned texts, applicability to the Japanese version of the periodic table of elements is investigated. Full article
(This article belongs to the Special Issue Entropy-Based Time Series Analysis: Theory and Applications)
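The four metrics listed can be computed in a few lines on a categorical distribution over the three scripts. The frequencies below are invented for illustration, and the chi-square here is the divergence form between two distributions rather than a contingency-table statistic.

```python
# Sketch (made-up script-usage frequencies over {kanji, hiragana, katakana}).
import numpy as np

human   = np.array([0.45, 0.45, 0.10])     # hypothetical human translation
machine = np.array([0.35, 0.50, 0.15])     # hypothetical machine translation

hellinger = np.sqrt(0.5 * ((np.sqrt(human) - np.sqrt(machine)) ** 2).sum())
chi2 = (((human - machine) ** 2) / machine).sum()          # chi-square divergence
H_norm = -(machine * np.log(machine)).sum() / np.log(len(machine))
simpson = 1 - (machine ** 2).sum()                         # Simpson's diversity

print(f"Hellinger {hellinger:.3f}, chi2 {chi2:.3f}, "
      f"normalized entropy {H_norm:.3f}, Simpson {simpson:.3f}")
```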

14 pages, 13792 KB  
Article
Probing Lorentz Invariance Violation at High Energies Using LHAASO Observations of GRB221009A via DisCan Algorithm
by Yu-Chen Hua, Xiao-Jun Bi, Yu-Ming Yang and Peng-Fei Yin
Universe 2026, 12(1), 3; https://doi.org/10.3390/universe12010003 - 24 Dec 2025
Abstract
The Lorentz invariance violation (LIV) predicted by some quantum gravity theories would manifest as an energy-dependent speed of light, which may potentially distort the observed temporal profile of photons from astrophysical sources at cosmological distances. The dispersion cancellation (DisCan) algorithm offers a powerful methodology for investigating such effects by employing quantities such as Shannon entropy, which reflects the initial temporal characteristics. In this study, we apply the DisCan algorithm to search for LIV effects in the LHAASO observations of GRB 221009A, combining data from both the Water Cherenkov Detector Array (WCDA) and Kilometer Squared Array (KM2A) detectors that collectively span an energy range of ∼0.2–13 TeV. Our analysis accounts for the uncertainties from both energy resolution and temporal binning. We derive 95% confidence level lower limits on the LIV energy scale of E_QG,1 / 10^19 GeV > 14.6 (11.2) for the first-order subluminal (superluminal) scenario, and E_QG,2 / 10^11 GeV > 13.7 (12.5) for the second-order subluminal (superluminal) scenario. Full article
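A conceptual DisCan sketch: apply a trial de-dispersion t′ = t − τE and keep the τ that minimizes the Shannon entropy of the binned light curve. The pulse model, binning, and τ grid below are illustrative only; the LHAASO analysis additionally handles energy-resolution and binning uncertainties.

```python
# Conceptual DisCan sketch on simulated photons: correct de-dispersion
# re-sharpens the pulse, which minimizes the light-curve entropy.
import numpy as np

rng = np.random.default_rng(4)
E = rng.uniform(0.2, 13.0, 3000)                 # photon energies, TeV
t_emit = rng.normal(0.0, 1.0, 3000)              # narrow intrinsic pulse, s
tau_true = 0.5                                   # injected linear dispersion, s/TeV
t_obs = t_emit + tau_true * E

def lightcurve_entropy(times: np.ndarray, bins: int = 64) -> float:
    counts, _ = np.histogram(times, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

taus = np.linspace(0.0, 1.0, 201)
S = [lightcurve_entropy(t_obs - tau * E) for tau in taus]
print(f"recovered tau = {taus[np.argmin(S)]:.3f} s/TeV (injected {tau_true})")
```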
