Search Results (205)

Search Parameters:
Keywords = conformational entropy

26 pages, 2798 KB  
Article
Economic Entropy and the Cobb-Douglas Function: A Scientometric Analysis
by Isabel Cristina Betancur-Hinestroza, Nini Johana Marín-Rodríguez, Francisco J. Caro-Lopera and Éver Alberto Velásquez Sierra
Entropy 2026, 28(5), 480; https://doi.org/10.3390/e28050480 - 22 Apr 2026
Abstract
Economic entropy, as an emerging concept in econophysics, has gained increasing relevance in the analysis of complex systems characterized by uncertainty, nonlinearity, and out-of-equilibrium dynamics. However, its integration into conventional economic modeling—particularly in production functions such as the Cobb–Douglas function—remains fragmented and lacks systematic empirical validation. This study conducts a scientometric analysis of 345 Scopus-indexed documents (1973–2024) addressing the intersection between entropy, econophysics, and production functions, with the aim of mapping the intellectual structure of the field, characterizing its growth trends, identifying its core contributions, and highlighting its main research gaps. The results reveal that the field has experienced sustained growth since 2004, with a notable acceleration between 2020 and 2023, although it exhibits a fragmented authorship structure that does not conform to Lotka’s Law, suggesting that the field is still in a stage of scientific consolidation. The Cobb–Douglas function emerges as a niche topic within the econophysics literature, with limited integration between entropy-based approaches—informational, thermodynamic, and maximum entropy—and the empirical modeling of production. Furthermore, weak citation linkages between econophysics and conventional economics are observed, confirming the interdisciplinary fragmentation of the field. These findings provide a structured reference for researchers interested in advancing toward analytical frameworks that explicitly incorporate uncertainty, information, and physical constraints into economic analysis, thereby contributing to the development of econophysics as an integrative discipline. Full article
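Lotka's Law, against which the authorship structure above is tested, predicts that the number of authors with n publications falls off roughly as the single-publication author count divided by n². A minimal sketch of such a check, on made-up counts rather than the paper's data:

```python
from collections import Counter

def lotka_expected(n_single, n, exponent=2.0):
    """Expected number of authors with n publications under Lotka's Law,
    scaled from the count of single-publication authors."""
    return n_single / n ** exponent

# Hypothetical publications-per-author counts (illustrative only).
pubs_per_author = [1] * 60 + [2] * 15 + [3] * 7 + [4] * 4 + [5] * 2
observed = Counter(pubs_per_author)

for n in sorted(observed):
    expected = lotka_expected(observed[1], n)
    print(f"n={n}: observed={observed[n]}, Lotka-expected={expected:.1f}")
```

A field conforming to the law would show observed counts tracking the expected column; the fragmented authorship reported above corresponds to large deviations from it.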

20 pages, 3313 KB  
Article
Dynamical Analysis and Analytical Solutions of the Fractional Benjamin–Bona–Mahony–Burger Equation
by Beenish, Mohammed M. Babatin and Mohamed A. Abdelkawy
Symmetry 2026, 18(4), 634; https://doi.org/10.3390/sym18040634 - 9 Apr 2026
Viewed by 186
Abstract
In this paper, we present a dynamical analysis and solutions of the fractional Benjamin–Bona–Mahony–Burger equation. We derive solutions under different definitions of fractional derivatives, namely the β-derivative, conformable derivative, and M-truncated derivative, to examine their kinetic characteristics. First, we solve the fractional Benjamin–Bona–Mahony–Burger equation using two different approaches. We then discuss the effects of the fractional derivative on the solutions through 3D graphical comparisons. Finally, we carry out the dynamical analysis via sensitivity and chaos analysis, characterizing chaos with permutation entropy, 2D and 3D phase portraits, fractal dimension, time-series analysis, return maps, Lyapunov exponents, and multistability through Poincaré maps and basins of attraction. This study highlights the computational strength and flexibility of the proposed method for exploring a diverse range of phenomena across physical science and engineering. Full article
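Of the chaos diagnostics listed above, permutation entropy has a particularly compact definition: it is the Shannon entropy of the ordinal patterns occurring in the signal. A minimal Bandt–Pompe sketch (illustrative; not the authors' implementation):

```python
import math
from collections import Counter

def permutation_entropy(series, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy of a 1-D sequence."""
    patterns = Counter()
    for i in range(len(series) - (order - 1) * delay):
        window = [series[i + j * delay] for j in range(order)]
        # Ordinal pattern: the argsort of the values in the window.
        patterns[tuple(sorted(range(order), key=lambda k: window[k]))] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    if normalize:
        h /= math.log(math.factorial(order))  # max over the order! patterns
    return h
```

A monotone signal yields entropy 0, while a fully irregular one approaches 1 after normalization; intermediate values grade the degree of disorder.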

44 pages, 2527 KB  
Article
Managing Uncertainty and Information Dynamics with Graphics-Enhanced TOGAF Architecture in Higher Education
by A’aeshah Alhakamy
Entropy 2026, 28(3), 361; https://doi.org/10.3390/e28030361 - 22 Mar 2026
Viewed by 424
Abstract
Adaptive learning at scale requires explicit handling of uncertainty and information flow across diverse educational technologies. This paper proposes a TOGAF-conformant enterprise architecture for the University of Tabuk (UT) that embeds entropy- and uncertainty-aware requirements from the outset and aligns them with institutional goals in teaching, research, and administration. Using the Architecture Development Method (ADM), we map information-theoretic requirements to architectural artifacts across the architecture vision, business, information systems, and technology domains; formally specify core entropy-informed observables, including predictive entropy, expected information gain, workflow variability entropy, and uncertainty hot-spot severity; and define semantic and metadata standards for their near-real-time computation. These indicators are positioned explicitly across the TOGAF domains: business architecture identifies where uncertainty matters, information systems architecture defines the computable data and application representations, technology architecture operationalizes secure and scalable computation, and later ADM phases use the resulting metrics for prioritization and governance. The architecture also establishes governance that ranks initiatives by their expected uncertainty reduction through Architecture Review Board (ARB) decision gates. We address three research questions: (R.Q.1) how to design a TOGAF-conformant architecture for UT that natively encodes uncertainty-aware requirements and aligns with institutional needs; (R.Q.2) how to integrate dispersed data, achieve semantic harmonization, and deliver analytics-ready streams that support information-theoretic indicators for personalization without delay; and (R.Q.3) how to embed IT demand planning in opportunities and solutions and migration planning using uncertainty reduction and expected information gain as prioritization criteria. 
The resulting architecture offers a university-wide foundation for adaptive learning: it unifies learner and system interaction data under governed schemas, supports low-latency analytics, and formalizes decision processes that treat uncertainty as a primary metric. Though learner-level operational validation is future work, the design establishes the technical and organizational foundations for responsible, large-scale deployment of entropy-driven learner modeling, content sequencing, and feedback optimization. Full article
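For reference, the two headline observables named above have standard discrete-form definitions; the sketch below is a generic formulation, not the paper's exact specification:

```python
import math

def predictive_entropy(probs):
    """Shannon entropy (nats) of a predictive distribution; higher means
    the model is less certain about the learner's next state or answer."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def expected_information_gain(prior, posteriors, weights):
    """EIG of an observation: H(prior) minus the expected posterior entropy,
    where weights[i] is the probability of observing outcome i."""
    return predictive_entropy(prior) - sum(
        w * predictive_entropy(post) for w, post in zip(weights, posteriors)
    )
```

Ranking initiatives by expected uncertainty reduction, as the ARB decision gates do, amounts to ranking them by expected information gain.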

15 pages, 377 KB  
Article
Planar Black Holes and Entanglement Entropy in Analog Gravity Models
by Neven Bilic and Tobias Zingg
Entropy 2026, 28(3), 345; https://doi.org/10.3390/e28030345 - 19 Mar 2026
Viewed by 348
Abstract
By constructing an explicit Lagrangian whose perturbation equations are analogs of a scalar field propagating in a planar black-hole space–time, it is found that all planar black holes conformal to a Painlevé–Gullstrand-type line element can be realized as analog metrics. We also introduce the concept of holographic entanglement entropy for planar black-hole space–times. This holds for an arbitrary choice of conformal and blackening factor, thereby vastly extending the set of explicitly known analog metrics. Full article
(This article belongs to the Special Issue Coarse and Fine-Grained Aspects of Gravitational Entropy)

16 pages, 1596 KB  
Article
Information-Theoretic and Conceptual Density Functional Theory Insights on Frustration in Molecular Clusters
by Xinyue Zhao, Ziqing Yan, Lei Zeng, Yaqin Zheng and Chunying Rong
Entropy 2026, 28(2), 213; https://doi.org/10.3390/e28020213 - 12 Feb 2026
Viewed by 572
Abstract
Frustration is an intrinsic feature of molecular complexes, arising when individual constituents must distort from their optimal isolated geometries to achieve collective stabilization. Although energetic frustration can be defined as the average distortion energy associated with complex formation, its quantitative origin and its connection to other molecular descriptors remain insufficiently understood. In this work, we systematically investigate frustration in four representative molecular complexes—two homogeneous clusters, (H2O)n and (HF)n, and two charged clusters, H3O+(H2O)n and F−(H2O)n (n = 1–20)—using three complementary density-based frameworks: (i) total-energy decomposition, (ii) global conceptual DFT (CDFT) descriptors, and (iii) information-theoretic approach (ITA) quantities. Strong linear correlations between the total frustration energy and most energy components, as well as CDFT indices, are revealed, enabling a quantitative interpretation of frustration from energetic and electronic-structure perspectives. Among ITA measures, only a subset, including Shannon entropy, Ghosh–Berkowitz–Parr entropy, Rényi entropy, and the relative Fisher information, exhibits robust and consistent correlations with frustration across all systems, indicating their suitability as ITA-based frustration descriptors. Particularly, the (HF)n clusters show uniformly excellent correlations for all descriptors due to their structurally simple and homogeneous hydrogen-bonding environment. Overall, this work provides a comprehensive density-based understanding of frustration and clarifies which descriptors reliably track its behavior. These insights establish a foundation for applying ITA and CDFT analyses to frustrated phenomena in broader chemical contexts, including molecular recognition, conformational dynamics, and catalysis. Full article

18 pages, 333 KB  
Article
A Small Patch Hypothesis in Cosmology
by Meir Shimon
Astronomy 2026, 5(1), 4; https://doi.org/10.3390/astronomy5010004 - 9 Feb 2026
Viewed by 606
Abstract
If our observable Universe is only a tiny region of a vastly larger and conformally older spacetime, then the usual formulations of the classical flatness and horizon problems of the Hot Big Bang can be reinterpreted as artifacts manifesting an observational selection effect; we occupy a small causal domain of a much larger causally connected and possibly non-flat spacetime. A sufficiently large positive cosmological constant, Λ, sets the future asymptotic horizon scale of the observable Universe, ∼Λ^(−1/2), thereby implying that the observable Universe may simply be a minute patch of a far larger pre-existing one, hereafter a Small Patch Hypothesis. Importantly, this observational bound is purely geometric; regardless of when the Universe is observed, the maximum accessible scale is finite and fixed by Λ, independent of inflationary dynamics, anthropic arguments, or assumptions about the global hosting spacetime. The externally possibly frozen past-eternal state implied by a pre-existing, causally connected spacetime motivates, but does not strictly require, viewing the perturbation field as being in (or arbitrarily close to) a coarse-grained maximum-entropy—equilibrium—configuration. Conditionalizing only on fixed mean and variance, a Gaussian distribution uniquely emerges, while the absence of entropy gradients corresponds to adiabaticity. In this work these features are therefore treated as plausible maximum-ignorance priors for super-horizon perturbations, rather than as rigorously derived consequences of a fully developed microscopic notion of gravitational entropy. In this sense, inflation becomes one viable realization of the proposed Small Patch Hypothesis. Here, one particular non-inflationary alternative is considered for illustrative purposes in which a primordial spectrum P_ζ(k) of the gauge-invariant perturbation ζ that pre-dates the Big Bang grows logarithmically toward large scales, k → 0, and in fact diverges at some finite k_c.
If k_c ≪ Λ^(1/2), then our local cosmic patch probes only the regime where ζ ≪ 1 and appears exceptionally smooth. Over the comparatively narrow observable window, this P_ζ(k) mimics a slightly red-tilted, inflation-like spectrum. Rather than introducing high-energy new fields, this perspective frames large-scale homogeneity, isotropy, Gaussianity, adiabaticity, and the observed thermodynamic Arrow of Time as possible consequences of restricted observational access to a much larger Universe in equilibrium, rather than signatures of a unique early-Universe mechanism. Current observations cannot distinguish this logarithmically running spectrum from the standard power-law one, but future probes—for example high-resolution 21-cm measurements of the Dark Ages—may be able to falsify it. Full article
12 pages, 1722 KB  
Proceeding Paper
Joint User Scheduling and Beamforming Design in Simultaneously Transmitting and Reflecting Reconfigurable-Intelligent-Surface-Assisted Device-to-Device Communications
by Zhi-Kai Su and Jung-Chieh Chen
Eng. Proc. 2025, 120(1), 53; https://doi.org/10.3390/engproc2025120053 - 6 Feb 2026
Viewed by 296
Abstract
Future wireless networks require efficient device-to-device (D2D) communication to meet the demands of increasing connectivity; however, practical challenges such as limited coverage and severe interference persist. This paper addresses these issues by employing simultaneously transmitting and reflecting reconfigurable intelligent surfaces (STAR-RISs) equipped with low-resolution phase shifters, thereby enabling full-space coverage while conforming to hardware constraints. To further improve system performance, we propose an irregular STAR-RIS configuration, in which only a subset of elements is activated to enhance spatial diversity without increasing power consumption. Additionally, we introduce a group scheduling strategy that assigns users to different time slots, effectively mitigating interference and improving the overall sum rate. To solve the resulting high-dimensional and non-convex optimization problem, we develop a cross-entropy optimization framework that jointly optimizes element selection, amplitude and phase configurations, and user scheduling. Simulation results demonstrate that the proposed design significantly outperforms existing benchmarks in terms of both the sum rate and scalability, thus providing a practical and efficient solution for STAR-RIS-assisted D2D communication systems. Full article
(This article belongs to the Proceedings of 8th International Conference on Knowledge Innovation and Invention)
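The cross-entropy optimization framework described above maintains a sampling distribution over configurations, scores a population of samples, and refits the distribution to the elite fraction. A toy sketch for binary element selection with a hypothetical objective (the paper's actual objective is the sum rate under its system model):

```python
import random

def cross_entropy_maximize(objective, dim, pop=200, elite_frac=0.1,
                           iters=50, smoothing=0.7, seed=0):
    """Cross-entropy method over binary vectors (e.g. element on/off).
    Keeps per-bit Bernoulli probabilities, refit to the elite samples."""
    rng = random.Random(seed)
    probs = [0.5] * dim
    n_elite = max(1, int(pop * elite_frac))
    best, best_val = None, float("-inf")
    for _ in range(iters):
        samples = [[1 if rng.random() < p else 0 for p in probs]
                   for _ in range(pop)]
        samples.sort(key=objective, reverse=True)
        if objective(samples[0]) > best_val:
            best, best_val = samples[0], objective(samples[0])
        elite = samples[:n_elite]
        # Smoothed refit of the sampling distribution toward the elites.
        for j in range(dim):
            mean_j = sum(s[j] for s in elite) / n_elite
            probs[j] = smoothing * mean_j + (1 - smoothing) * probs[j]
    return best, best_val

# Toy objective: negative Hamming distance to a hypothetical target activation.
target = [1] * 8 + [0] * 8
sol, val = cross_entropy_maximize(
    lambda x: -sum(a != b for a, b in zip(x, target)), dim=16)
```

In the paper's setting the same loop would jointly sample element activations, discrete phase configurations, and scheduling decisions, with the achievable sum rate as the objective.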

21 pages, 7440 KB  
Article
Magnetic Metal–Organic Framework: An Innovative Nanocomposite Adsorbent for the Removal of Emerging Drug Contaminants from Water
by Xueying Li, Asfandyar Shahab, Jinxiong Chen, Wei Li, Hua Zhang, Dunqiu Wang, Xinyu Tang, Mingxin Bin, Licheng Peng and Abubakr M. Idris
Water 2026, 18(3), 321; https://doi.org/10.3390/w18030321 - 28 Jan 2026
Cited by 1 | Viewed by 586
Abstract
The widespread use of antibiotics has taken a heavy toll on the environment, which cannot be ignored. Tetracycline antibiotics (TCs), as representative pharmaceutical contaminants, have emerged as a growing environmental concern due to their persistence and potential ecological risks. This study utilized 1,3,5-benzenetricarboxylic acid (BTC) as a functionalizing reagent to synthesize magnetic NiFe2O4-COOH nanoparticles. These were then combined with Zr-MOF to create the magnetic adsorbent designated as NCF@Zr-MOF (where NCF represents carboxyl-functionalized nickel ferrite). Magnetic solid-phase extraction (MSPE) technology was employed to remove two representative tetracycline antibiotics, tetracycline (TC) and chlortetracycline (CTC), from the environment. The Langmuir model fitting revealed maximum adsorption capacities of 190.85 and 196.32 mg/g for TC and CTC, respectively, and the adsorption process conformed to the pseudo-second-order model with spontaneous, endothermic, and entropy-increasing character. Furthermore, following five cycles of adsorption and desorption, the removal rate for TC was found to have decreased by 30%, yet the removal of CTC remained at 95.32%. This adsorbent enables rapid separation via an external magnetic field. With its excellent stability and reusability, NCF@Zr-MOF shows great potential for removing antibiotics from water. Full article
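The two models fitted above have simple closed forms; a sketch using the reported TC capacity as q_max (the K_L and k2 values below are illustrative, not fitted values from the paper):

```python
def langmuir_qe(ce, qmax, kl):
    """Langmuir isotherm: q_e = q_max*K_L*c_e / (1 + K_L*c_e)  (mg/g)."""
    return qmax * kl * ce / (1 + kl * ce)

def pseudo_second_order_qt(t, qe, k2):
    """Pseudo-second-order kinetics: q_t = k2*q_e^2*t / (1 + k2*q_e*t)."""
    return k2 * qe ** 2 * t / (1 + k2 * qe * t)

# q_max = 190.85 mg/g for TC as reported; K_L = 0.05 L/mg is made up.
q_at_100 = langmuir_qe(100.0, qmax=190.85, kl=0.05)  # uptake at c_e = 100 mg/L
```

Both expressions saturate: the isotherm approaches q_max at high equilibrium concentration, and the kinetic uptake approaches q_e at long times.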

18 pages, 3291 KB  
Article
Preparation, Adsorption Performance and Mechanism of Low-Cost Desert Sand-Based Pb (II) Ion-Imprinted Composites
by Yixin Sui, Jiaxiang Qi, Shuaibing Gao, Linlin Chai, Yahong Xie, Changyan Guo and Shawket Abliz
Polymers 2026, 18(1), 42; https://doi.org/10.3390/polym18010042 - 23 Dec 2025
Viewed by 661
Abstract
Pb (II) contamination in wastewater represents a grave threat to the environment and ecosystems. Consequently, there is an urgent need to prepare low-cost and highly efficient Pb (II) adsorbents. To address this need, abundant and low-cost natural silica-based desert sand (DS) was innovatively utilized as a carrier to develop efficient and selective Pb (II) adsorbents. Modified desert sand (MDS) was first prepared via 1 M HCl pretreatment for 2 h and subsequent KH550 silane modification. Pb (II)-imprinted composites (Pb (II)-IIP@MDS) were then fabricated via ion-imprinted polymerization, using Pb (II) as the template ion and N-hydroxymethacrylamide (NHMA)/hydroxyethyl methacrylate (HEMA) as dual functional monomers with a molar ratio of 1:1. The synthesized Pb (II)-IIP@MDS was comprehensively characterized by X-ray photoelectron spectroscopy (XPS), scanning electron microscopy (SEM), and Fourier transform infrared spectroscopy (FT-IR). The adsorption capacity, selectivity, and reusability of this material for lead ions were evaluated through three experiments conducted within the optimized pH range of 6–7, with error bars indicated. In adsorption isotherm experiments, the initial Pb (II) concentration ranged from 50 to 500 mg·L⁻¹, conforming to the Langmuir model (R² = 0.992), with a theoretical maximum adsorption capacity reaching 107.44 mg·g⁻¹; this indicates that the adsorbate forms a monolayer on the homogeneous imprinted sites. Kinetics data indicate that the process best fits a quasi-first-order kinetic model (R² ≥ 0.988), while the favorable quasi-second-order kinetic fit (R² ≥ 0.982) reflects the synergistic effect of physical diffusion and ion-imprinting chemistry, reaching equilibrium within 120 min. Thermodynamic parameters (ΔH⁰ = 12.51 kJ·mol⁻¹, ΔS⁰ = 101.19 J·mol⁻¹·K⁻¹, ΔG⁰ < 0) confirmed endothermic, entropy-increasing, spontaneous adsorption. In multicomponent systems, Pb (II)-IIP@MDS showed distinct Pb (II) selectivity.
It retained 80.3% adsorption efficiency after eight cycles. This work provides a promising strategy for fabricating low-cost, high-performance Pb (II) adsorbents, and Pb (II)-IIP@MDS stands as a practical candidate for the remediation of Pb (II)-contaminated wastewater. Full article
(This article belongs to the Special Issue Polymers for Environmental Applications)
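The thermodynamic claim that adsorption is endothermic yet spontaneous because it is entropy-driven follows directly from ΔG⁰ = ΔH⁰ − TΔS⁰ with the reported parameters (room temperature, 298.15 K, is assumed here):

```python
def gibbs_free_energy(dh_kj, ds_j, t_k):
    """Gibbs relation dG = dH - T*dS, with dH in kJ/mol and dS in J/(mol*K)."""
    return dh_kj - t_k * ds_j / 1000.0  # result in kJ/mol

# Reported values: dH = 12.51 kJ/mol (endothermic), dS = 101.19 J/(mol*K).
dg_298 = gibbs_free_energy(12.51, 101.19, 298.15)  # negative, so spontaneous
```

The positive enthalpy term is outweighed by the −TΔS⁰ term at room temperature, which is exactly the reported entropy-increasing, spontaneous behavior.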

22 pages, 4456 KB  
Article
Allosteric Conformational Locking of Sestrin2 by Leucine: An Integrated Computational Analysis of Branched-Chain Amino Acid Recognition and Specificity
by Muhammad Ammar Zahid, Abbas Khan, Mona A. Sawali, Osama Aboubakr Mohamed, Ahmed Mohammad Gharaibeh and Abdelali Agouni
Molecules 2025, 30(24), 4791; https://doi.org/10.3390/molecules30244791 - 16 Dec 2025
Viewed by 691
Abstract
Sestrin2 (SESN2) is a highly conserved stress-inducible protein that serves as a central hub for integrating cellular responses to nutrient availability, oxidative stress, and endoplasmic reticulum (ER) stress. A key function of SESN2 is its role as a direct sensor for the branched-chain amino acid (BCAA) leucine, which modulates the activity of the mechanistic target of rapamycin complex 1 (mTORC1), a master regulator of cell growth and metabolism. While the functional link between leucine and SESN2 is well-established, the precise molecular determinants that confer its high specificity for leucine over other BCAAs, such as isoleucine and valine, remain poorly understood. This study employs an integrated computational approach, spanning atomic interactions to global protein dynamics, combining molecular docking, extensive all-atom molecular dynamics (MD) simulations, and binding free energy calculations, to elucidate the structural and dynamic basis of BCAA-SESN2 recognition. Our thermodynamic analysis reveals a distinct binding affinity hierarchy (Leucine > Isoleucine > Valine), which is primarily driven by superior van der Waals interactions and the shape complementarity of leucine’s isobutyl side chain within the protein’s hydrophobic pocket. Critically, a quantitative analysis of the conformational ensemble reveals that leucine induces a dramatic collapse of the protein’s structural heterogeneity. This “conformational locking” mechanism funnels the flexible, high-entropy unbound protein—which samples 35 distinct conformations—into a sharply restricted ensemble of just 9 stable states. This four-fold reduction in conformational freedom is accompanied by a kinetic trapping effect, which significantly lowers the rate of transitions between states. This process of conformational selection stabilizes a well-defined, signaling-competent structure, providing a comprehensive, atom-to-global-scale model of SESN2’s function. 
In the context of these findings, this work provides a critical framework for understanding SESN2’s complex role in disease and offers a clear rationale for the design of next-generation allosteric therapeutics. Full article

12 pages, 1829 KB  
Article
Molecular and Thermodynamic Insights into the Enthalpy-Entropy Shift Governing HILIC Retention of Labelled Dextrans
by Matjaž Grčman, Črtomir Podlipnik, Matevž Pompe and Drago Kočar
Molecules 2025, 30(24), 4711; https://doi.org/10.3390/molecules30244711 - 9 Dec 2025
Viewed by 635
Abstract
Hydrophilic interaction liquid chromatography (HILIC) is widely used for the analysis of glycans and oligosaccharides, yet the molecular basis of retention remains incompletely understood. In this study, we investigated dextran ladders labelled with 2-aminobenzamide (2-AB) and Rapifluor-MS™ (Waters, Milford, MA, USA) across a wide range of degrees of polymerization (DP 2–15), temperature conditions (10 °C to 70 °C), and gradient programs using an Acquity™ Premier Glycan BEH Amide column (Bridged Ethylene Hybrid, Waters, Milford, MA, USA). Van't Hoff analysis revealed distinct enthalpic and entropic contributions to retention, allowing identification of a mechanistic transition from enthalpy-dominated docking interactions at low DP to entropy-driven dynamic adsorption at higher DP. This transition occurred reproducibly between DP 4 and 6, depending on the fluorescent label, while gradient steepness primarily influenced the location of the enthalpy minimum. Molecular dynamics simulations provided additional evidence, showing increased conformational flexibility and end-to-end distance variability for longer oligomers. This finding is consistent with entropy-dominated adsorption accompanied by displacement of structured interfacial water. Together, these results establish a molecular-level framework linking retention thermodynamics, conformational behavior, and solvation effects, thereby advancing our mechanistic understanding of glycan separation in HILIC. Full article
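Van't Hoff analysis of the kind used above fits ln k against 1/T: the slope gives −ΔH°/R and the intercept gives ΔS°/R. A sketch on synthetic, exactly linear data (the ΔH° and ΔS° values are made up, not the paper's results):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_fit(temps_k, retention_factors):
    """Least-squares van't Hoff fit of ln k = -dH/R * (1/T) + dS/R.
    Returns (dH in J/mol, dS in J/(mol*K))."""
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(k) for k in retention_factors]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope * R, intercept * R

# Synthetic retention factors from dH = -10 kJ/mol, dS = -20 J/(mol*K).
temps = [283.15, 303.15, 323.15, 343.15]
ks = [math.exp(10000 / (R * t) - 20 / R) for t in temps]
dh, ds = vant_hoff_fit(temps, ks)  # recovers the generating parameters
```

A systematic change in the fitted ΔH° and ΔS° across the DP series is exactly the enthalpy-to-entropy shift the study reports.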

28 pages, 7941 KB  
Article
Decoding GuaB: Machine Learning-Powered Discovery of Enzyme Inhibitors Against the Superbug Acinetobacter baumannii
by Mohammad Abdullah Aljasir and Sajjad Ahmad
Pharmaceuticals 2025, 18(12), 1842; https://doi.org/10.3390/ph18121842 - 2 Dec 2025
Cited by 1 | Viewed by 847
Abstract
Background/Objectives: GuaB, which is known as inosine 5′-phosphate dehydrogenase (IMPDH), is an enzymatic target involved in the de novo guanine biosynthetic pathway of the multidrug-resistant (MDR) Acinetobacter baumannii. GuaB has emerged as a potential therapeutic target to cope with increasing antibiotic resistance. Here, we used machine learning-based virtual screening to identify potential inhibitors with diverse chemical scaffolds, with structure-based drug design as the discovery platform. Methods: Four machine learning models were trained on chemical fingerprint data, and the best models were used for virtual screening of the ChEMBL library, which covers 153 active molecules. Molecular dynamics (MD) simulations of 200 ns were carried out for all three compounds to characterize conformational changes, evaluate stability, and validate the docking results. Post-simulation analyses include principal component analysis (PCA), bond analysis, free-energy landscape (FEL), dynamic cross-correlation matrix (DCCM), radial distribution function (RDF), salt-bridge identification, and secondary-structure profiling. Results: For molecular docking, the screened compounds were docked against the GuaB protein to obtain proper binding conformations. Upon visual examination of the best-docked compounds, three leads (lead-1, lead-2, and lead-3) were found to interact with the GuaB protein more favorably than the control. The mean RMSD values of the three leads and the control ranged from 2.54 to 2.89 Å. In addition, the three leads as well as the control were characterized for pharmacokinetic features. All three leads met Lipinski's Rule of Five and were thus drug-like.
PCA and FEL analyses showed that lead-2 exhibited improved conformational stability, identified as deeper energy minima, whereas RDF and DCCM analyses revealed that lead-2 and lead-3 exhibited strong local structuring and concerted dynamics. In addition, lead-2 displayed a very rich hydrogen-bonding network with a total of 460 frames possessing such interactions, which is the highest among the complexes investigated here. Based on entropy calculations and the maximum entropy method of gamma–gram, lead-1 proved to be the most stable one with the lowest binding free-energy. Conclusions: This study provides an integrated machine learning-based virtual screening pipeline for the identification of new scaffolds to moderate infections associated with AMR; however, in vitro validation is still required to assess the efficacy of such compounds. Full article
(This article belongs to the Special Issue Application of Computer Simulation in Drug Design)

31 pages, 7049 KB  
Article
Objective Emotion Assessment Using a Triple Attention Network for an EEG-Based Brain–Computer Interface
by Lihua Zhang, Xin Zhang, Xiu Zhang, Changyi Yu and Xuguang Liu
Brain Sci. 2025, 15(11), 1167; https://doi.org/10.3390/brainsci15111167 - 29 Oct 2025
Cited by 1 | Viewed by 1098
Abstract
Background: The assessment of emotion recognition holds growing significance in research on the brain–computer interface and human–computer interaction. Among diverse physiological signals, electroencephalography (EEG) occupies a pivotal position in affective computing due to its exceptional temporal resolution and non-invasive acquisition. However, EEG signals are inherently complex, characterized by substantial noise contamination and high variability, posing considerable challenges to accurate assessment. Methods: To tackle these challenges, we propose a Triple Attention Network (TANet), a triple-attention EEG emotion recognition framework that integrates Conformer, Convolutional Block Attention Module (CBAM), and Mutual Cross-Modal Attention (MCA). The Conformer component captures temporal feature dependencies, CBAM refines spatial channel representations, and MCA performs cross-modal fusion of differential entropy and power spectral density features. Results: We evaluated TANet on two benchmark EEG emotion datasets, DEAP and SEED. On SEED, using a subject-specific cross-validation protocol, the model reached an average accuracy of 98.51 ± 1.40%. On DEAP, we deliberately adopted a segment-level splitting paradigm—in line with influential state-of-the-art methods—to ensure a direct and fair comparison of model architecture under an identical evaluation protocol. This approach, designed specifically to assess fine-grained within-trial pattern discrimination rather than cross-subject generalization, yielded accuracies of 99.69 ± 0.15% and 99.67 ± 0.13% for the valence and arousal dimensions, respectively. Compared with existing benchmark approaches under similar evaluation protocols, TANet delivers substantially better results, underscoring the strong complementary effects of its attention mechanisms in improving EEG-based emotion recognition performance. 
Conclusions: This work provides both theoretical insights into multi-dimensional attention for physiological signal processing and practical guidance for developing high-performance, robust EEG emotion assessment systems. Full article
(This article belongs to the Section Neurotechnology and Neuroimaging)
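The differential entropy (DE) and power spectral density (PSD) features that MCA fuses can be illustrated with a minimal sketch. Under a Gaussian assumption, the DE of a band-filtered signal is 0.5·ln(2πe·σ²); the function name, filter order, and alpha-band choice below are illustrative assumptions, not details from the article:

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

def de_and_psd(eeg, fs, band):
    """DE and mean in-band PSD of one EEG channel.

    For a band-filtered signal assumed Gaussian,
    DE = 0.5 * ln(2 * pi * e * var(x)).
    """
    low, high = band
    # 4th-order Butterworth band-pass (order is an assumption)
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    x = filtfilt(b, a, eeg)
    de = 0.5 * np.log(2 * np.pi * np.e * np.var(x))
    # Welch PSD, averaged over the band of interest
    freqs, psd = welch(x, fs=fs, nperseg=min(len(x), fs))
    mask = (freqs >= low) & (freqs <= high)
    return de, psd[mask].mean()

rng = np.random.default_rng(0)
eeg_demo = rng.standard_normal(1280)                 # 10 s at 128 Hz
de, psd = de_and_psd(eeg_demo, fs=128, band=(8, 13))  # alpha band
```

In practice these two features would be computed per channel and per frequency band, then stacked into the two modality streams that the attention modules consume.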

14 pages, 957 KB  
Article
TECP: Token-Entropy Conformal Prediction for LLMs
by Beining Xu and Yongming Lu
Mathematics 2025, 13(20), 3351; https://doi.org/10.3390/math13203351 - 21 Oct 2025
Viewed by 3384
Abstract
Uncertainty quantification (UQ) for open-ended language generation remains a critical yet underexplored challenge, particularly in white-box settings where per-token log-probabilities are available during decoding. We present Token-Entropy Conformal Prediction (TECP), which treats a log-probability-based token-entropy statistic as a nonconformity score and integrates it with split conformal prediction to construct prediction sets with finite-sample coverage guarantees. TECP estimates epistemic uncertainty from the token-entropy structure of sampled generations and calibrates thresholds via conformal quantiles to ensure provable error control. Empirical evaluations across six large language models and two QA benchmarks (CoQA and TriviaQA) show that TECP consistently achieves reliable coverage and compact prediction sets, outperforming prior self-UQ methods. These results provide a principled and efficient solution for trustworthy generation in white-box, log-probability-accessible LLM settings. Full article
(This article belongs to the Topic Challenges and Solutions in Large Language Models)
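The split-conformal calibration described in the abstract can be sketched as follows. The mean-negative-log-probability entropy surrogate and all numeric values are assumptions for illustration, not TECP's exact statistic; the key ingredient is the ceil((n+1)(1−α))/n calibration quantile, which yields the finite-sample coverage guarantee:

```python
import numpy as np

def mean_token_entropy(logprobs):
    # Nonconformity score: negative mean token log-probability,
    # a simple token-entropy surrogate (assumed form).
    return -np.mean(logprobs)

def conformal_threshold(cal_scores, alpha):
    # Split-conformal quantile: the ceil((n+1)(1-alpha))/n empirical
    # quantile of calibration scores gives finite-sample coverage.
    n = len(cal_scores)
    q = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(cal_scores, min(q, 1.0), method="higher")

rng = np.random.default_rng(1)
# Calibration set: per-example token log-probs of reference generations.
cal = [mean_token_entropy(rng.uniform(-3, 0, size=20)) for _ in range(200)]
tau = conformal_threshold(np.array(cal), alpha=0.1)

# At test time, the prediction set keeps candidate generations whose
# entropy score does not exceed the calibrated threshold.
candidates = [rng.uniform(-3, 0, size=20) for _ in range(5)]
pred_set = [i for i, lp in enumerate(candidates)
            if mean_token_entropy(lp) <= tau]
```

With exchangeable calibration and test scores, a generation whose score falls below tau is included in the set, and the set covers the correct answer with probability at least 1 − α.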

11 pages, 1765 KB  
Article
Viscosity Analysis of Electron-Beam Degraded Gellan in Dilute Aqueous Solution
by Fathi Elashhab, Lobna Sheha, Nada Elzawi and Abdelsallam E. A. Youssef
Physchem 2025, 5(4), 40; https://doi.org/10.3390/physchem5040040 - 30 Sep 2025
Viewed by 943
Abstract
Gellan gum (Gellan), a versatile polysaccharide applied in gel formation and prebiotic formulations, is often processed to tailor its molecular properties. Previous studies employed gamma irradiation and chemical hydrolysis, though without addressing systematic scaling behavior. This study investigates the structural and conformational modifications of Gellan in dilute aqueous salt solutions using a safer and eco-friendly approach: atmospheric low-dose electron beam (e-beam) degradation coupled with viscosity analysis. Native and e-beam-treated Gellan samples (0.05 g/cm³ in 0.1 M KCl) were examined by relative viscosity at varying temperatures, with intrinsic viscosity and molar mass determined via the Solomon–Ciuta and Mark–Houwink relations. Molar mass degradation followed first-order kinetics, yielding rate constants and degradation lifetimes. Structural parameters, including radius of gyration and second virial coefficient, produced scaling coefficients of 0.62 and 0.15, consistent with perturbed coil conformations in a good solvent. The shape factor confirmed preservation of an ideal random coil structure despite irradiation. Conformational flexibility was further analyzed using theoretical models. Transition state theory (TST) revealed that e-beam radiation lowered molar mass and activation energy but raised activation entropy, implying reduced flexibility alongside enhanced solvent interactions. The freely rotating chain (FRC) model estimated end-to-end distance (Rθ) and characteristic ratio (C), while the worm-like chain (WLC) model quantified persistence length (lp). Results indicated decreased Rθ, increased lp, and largely unchanged C, suggesting diminished chain flexibility without significant deviation from ideal coil behavior. Overall, this work provides new insights into Gellan's scaling laws and flexibility under aerobic low-dose e-beam irradiation, with relevance for bioactive polysaccharide applications. Full article
(This article belongs to the Section Theoretical and Computational Chemistry)
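The single-point Solomon–Ciuta estimate of intrinsic viscosity and the Mark–Houwink conversion to molar mass mentioned in the abstract can be sketched as below. The concentration, relative viscosity, and the Mark–Houwink constants K and a are placeholders for illustration, not the article's gellan parameters:

```python
import math

def intrinsic_viscosity_solomon_ciuta(eta_rel, c):
    """Single-point intrinsic viscosity from relative viscosity.

    Solomon-Ciuta: [eta] = sqrt(2 * (eta_sp - ln(eta_rel))) / c,
    with eta_sp = eta_rel - 1; c in g/cm^3 gives [eta] in cm^3/g.
    """
    eta_sp = eta_rel - 1.0
    return math.sqrt(2.0 * (eta_sp - math.log(eta_rel))) / c

def molar_mass_mark_houwink(intrinsic_visc, K, a):
    # Mark-Houwink: [eta] = K * M**a  =>  M = ([eta] / K)**(1/a)
    return (intrinsic_visc / K) ** (1.0 / a)

# Placeholder values; K and a must come from the actual
# polymer/solvent/temperature system being measured.
eta = intrinsic_viscosity_solomon_ciuta(eta_rel=1.8, c=0.0005)
M = molar_mass_mark_houwink(eta, K=0.07, a=0.80)
```

Repeating this calculation on samples taken at successive irradiation doses gives the molar mass series from which the first-order rate constants and degradation lifetimes are fitted.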
