Search Results (1,938)

Search Parameters:
Keywords = general theory of information

30 pages, 746 KB  
Article
From the Visible to the Invisible: On the Phenomenal Gradient of Appearance
by Baingio Pinna, Daniele Porcheddu and Jurģis Šķilters
Brain Sci. 2026, 16(1), 114; https://doi.org/10.3390/brainsci16010114 - 21 Jan 2026
Viewed by 85
Abstract
Background: By exploring the principles of Gestalt psychology, the neural mechanisms of perception, and computational models, scientists aim to unravel the complex processes that enable us to perceive a coherent and organized world. This multidisciplinary approach continues to advance our understanding of how the brain constructs a perceptual world from sensory inputs. Objectives and Methods: This study investigates the nature of visual perception through an experimental paradigm and method based on a comparative analysis of human and artificial intelligence (AI) responses to a series of modified square images. We introduce the concept of a “phenomenal gradient” in human visual perception, where different attributes of an object are organized syntactically and hierarchically in terms of their perceptual salience. Results: Our findings reveal that human visual processing involves complex mechanisms including shape prioritization, causal inference, amodal completion, and the perception of visible invisibles. In contrast, AI responses, while geometrically precise, lack these sophisticated interpretative capabilities. These differences highlight the richness of human visual cognition and the current limitations of model-generated descriptions in capturing causal, completion-based, and context-dependent inferences. The present work introduces the notion of a ‘phenomenal gradient’ as a descriptive framework and provides an initial comparative analysis that motivates testable hypotheses for future behavioral and computational studies, rather than direct claims about improving AI systems. Conclusions: By bridging phenomenology, information theory, and cognitive science, this research challenges existing paradigms and suggests a more integrated approach to studying visual consciousness. Full article

13 pages, 2357 KB  
Article
A Prevention-Focused Geospatial Epidemiology Framework for Identifying Multilevel Vulnerability Across Diverse Settings
by Cindy Ogolla Jean-Baptiste
Healthcare 2026, 14(2), 261; https://doi.org/10.3390/healthcare14020261 - 21 Jan 2026
Viewed by 68
Abstract
Background/Objectives: Geographic Information Systems (GIS) offer essential capabilities for identifying spatial concentrations of vulnerability and strengthening context-aware prevention strategies. This manuscript describes a geospatial architecture designed to generate anticipatory, place-based risk identification applicable across diverse community and institutional environments. Interpersonal Violence (IPV), one of several preventable harms that benefit from this spatially informed analysis, remains a critical public health challenge shaped by structural, ecological, and situational factors. Methods: The conceptual framework presented integrates de-identified surveillance data, ecological indicators, and environmental and temporal dynamics into a unified spatial epidemiological model. Multilevel data layers are geocoded, spatially matched, and analyzed using clustering (e.g., Getis-Ord Gi*), spatial dependence metrics (e.g., Moran’s I), and contextual modeling to support anticipatory identification of elevated vulnerability. Framework Outputs: The model is designed to identify spatial clustering, mobility-linked risk patterns, and emerging escalation zones using neighborhood disadvantage, built-environment factors, and situational markers. Outputs are intended to support both clinical decision-making (e.g., geocoded trauma screening and context-aware discharge planning) and community-level prevention (e.g., targeted environmental interventions and cross-sector resource coordination). Conclusions: This framework synthesizes behavioral theory, spatial epidemiology, and prevention science into an integrative architecture for coordinated public health response. As a conceptual foundation for future empirical research, it advances the development of more dynamic, spatially informed, and equity-focused prevention systems. Full article
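The framework above leans on spatial dependence metrics such as Moran’s I. As a minimal illustration (toy data and binary rook-contiguity weights of my own choosing, not the manuscript’s GIS pipeline), the global statistic can be computed in a few lines of pure Python:

```python
def morans_i(values, neighbors):
    """Global Moran's I with binary (0/1) spatial weights.

    values: one observation per spatial unit.
    neighbors: list of neighbor-index lists (e.g. rook adjacency).
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]            # deviations from the mean
    w_sum = sum(len(nb) for nb in neighbors)    # total weight W
    cross = sum(dev[i] * dev[j] for i in range(n) for j in neighbors[i])
    ss = sum(d * d for d in dev)                # variance term
    return (n / w_sum) * (cross / ss)

# 2x2 checkerboard with rook adjacency: perfect negative autocorrelation.
checkerboard = [1, 0, 0, 1]                     # cells (0,0),(0,1),(1,0),(1,1)
rook = [[1, 2], [0, 3], [0, 3], [1, 2]]
print(morans_i(checkerboard, rook))             # -> -1.0
```

Values near +1 indicate spatial clustering of similar values (the hot spots the framework targets), values near −1 a dispersed, checkerboard-like pattern.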

42 pages, 1665 KB  
Article
What Is a Pattern in Statistical Mechanics? Formalizing Structure and Patterns in One-Dimensional Spin Lattice Models with Computational Mechanics
by Omar Aguilar
Entropy 2026, 28(1), 123; https://doi.org/10.3390/e28010123 - 20 Jan 2026
Viewed by 209
Abstract
This work formalizes the notions of structure and pattern for three distinct one-dimensional spin-lattice models (finite-range Ising, solid-on-solid, and three-body), using information- and computation-theoretic methods. We begin by presenting a novel derivation of the Boltzmann distribution for finite one-dimensional spin configurations embedded in infinite ones. We next recast this distribution as a stochastic process, thereby enabling us to analyze each spin-lattice model within the theory of computational mechanics. In this framework, the process’s structure is quantified by excess entropy E (predictable information) and statistical complexity Cμ (stored information), and the process’s structure-generating mechanism is specified by its ϵ-machine. To assess compatibility with statistical mechanics, we compare the configurations jointly determined by the information measures and ϵ-machines to typical configurations drawn from the Boltzmann distribution, and we find agreement. We also include a self-contained primer on computational mechanics and provide code implementing the information measures and spin-model distributions. Full article
(This article belongs to the Special Issue Ising Model—100 Years Old and Still Attractive)
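The computational-mechanics entry above quantifies structure via excess entropy E (predictable information) and entropy rate. As a toy sketch (illustrative sequence of my own choosing, not the paper’s released code), finite-L estimates follow from block entropies H(L): the entropy rate is approximated by h ≈ H(L) − H(L−1) and the excess entropy by E ≈ H(L) − L·h:

```python
from collections import Counter
from math import log2

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L blocks, sliding window."""
    blocks = [seq[i:i + L] for i in range(len(seq) - L + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum(c / total * log2(c / total) for c in counts.values())

def entropy_rate_and_excess(seq, L):
    """Finite-L estimates: h ~ H(L) - H(L-1), E ~ H(L) - L*h."""
    hL, hLm1 = block_entropy(seq, L), block_entropy(seq, L - 1)
    h = hL - hLm1
    return h, hL - L * h

# A period-2 sequence stores one bit of structure but generates no new
# randomness: h should be ~0 bits/symbol and E ~1 bit.
seq = "01" * 500 + "0"
h, E = entropy_rate_and_excess(seq, 2)
print(round(h, 3), round(E, 3))   # -> 0.0 1.0
```

For genuinely structured processes one would let L grow and check convergence; this fixed-L version only illustrates the bookkeeping behind the measures.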

11 pages, 214 KB  
Commentary
Persistent Traumatic Stress Exposure: Rethinking PTSD for Frontline Workers
by Nicola Cogan
Healthcare 2026, 14(2), 255; https://doi.org/10.3390/healthcare14020255 - 20 Jan 2026
Viewed by 133
Abstract
Frontline workers across health, emergency, and social care sectors are repeatedly exposed to distressing events and chronic stressors as part of their occupational roles. Unlike single-event trauma, these cumulative exposures accrue over time, generating persistent psychological and physiological strain. Traditional diagnostic frameworks, particularly post-traumatic stress disorder (PTSD), were not designed to capture the layered and ongoing nature of this occupational trauma. This commentary introduces the concept of Persistent Traumatic Stress Exposure (PTSE), a framework that reframes trauma among frontline workers as an exposure arising from organisational and systemic conditions rather than solely an individual disorder. It aims to reorient understanding, responsibility, and intervention from a purely clinical lens toward systems, cultures, and organisational duties of care. PTSE is presented as an integrative paradigm informed by contemporary theory and evidence on trauma, moral injury, organisational stress, and trauma-informed systems. The framework synthesises findings from health, emergency, and social care settings, illustrating how repeated exposure, ethical conflict, and institutional pressures contribute to cumulative psychological harm. PTSE highlights that psychological injury may build across shifts, careers, and lifetimes, requiring preventive, real-time, and sustained responses. The framework emphasises that effective support depends on both organisational readiness (the structural conditions that enable trauma-informed work) and organisational preparedness (the practical capability to enact safe, predictable, and stigma-free responses to trauma exposure). PTSE challenges prevailing stigma by framing trauma as a predictable occupational hazard rather than a personal weakness. It aligns with modern occupational health perspectives by advocating for systems that strengthen psychological safety, leadership capability, and access to support.
By adopting PTSE, organisations can shift from reactive treatment models toward proactive cultural and structural protection, honouring the lived realities of frontline workers and promoting long-term wellbeing and resilience. Full article
22 pages, 405 KB  
Article
A Cointegrated Ising Spin Model for Asynchronously Traded Futures Contracts: Spread Trading with Crude Oil Futures
by Kostas Giannopoulos
J. Risk Financial Manag. 2026, 19(1), 79; https://doi.org/10.3390/jrfm19010079 - 19 Jan 2026
Viewed by 145
Abstract
Pairs trading via futures calendar spreads offers a robust market-neutral approach to exploiting transient mispricings, yet real-time implementation is hindered by asynchronous trading. This paper introduces a Cointegrated Ising Spin Model (CISM) for real-time signal generation in high-frequency spread trading. The model bridges the macro-level equilibrium of cointegration with micro-level agent interactions, representing prices as magnetizations within an agent-based Ising system. A novel feature is a Δ-weighted arbitrage force, where the corrective pressure applied by agents in response to the standard Error Correction Term is dynamically amplified based on information staleness. The model is framed within the context of Discrete Choice Theory, treating agent transitions as utility-maximizing decisions within a Vector Logistic Autoregressive (VLAR) framework. Calibrated on tick-by-tick Brent crude oil futures, it is designed to operate in real time, continuously updating a time-varying probability of spread reversion as new quotes arrive and enabling probabilistic trading decisions. Backtesting demonstrates a 74.65% success rate, confirming the CISM’s ability to generate stable, data-driven arbitrage signals in asynchronous environments. Full article
(This article belongs to the Special Issue Financial Innovations and Derivatives)
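The CISM abstract frames agent transitions as utility-maximizing logistic choices. A minimal sketch of that idea (all names and the `beta`/staleness parameters are illustrative assumptions, not the paper’s calibrated model): the probability that an agent switches to the corrective position is a logistic function of the spread’s deviation from its cointegration equilibrium, damped when quotes are stale:

```python
from math import exp

def flip_probability(spread_deviation, beta=1.5, staleness_weight=1.0):
    """Logistic (discrete-choice) probability that an agent takes the
    corrective position, given the spread's deviation from equilibrium.
    staleness_weight mimics a Delta-style damping of the arbitrage
    force when quotes are stale (illustrative only)."""
    utility = staleness_weight * spread_deviation   # corrective utility
    return 1.0 / (1.0 + exp(-beta * utility))

# No mispricing -> agents are indifferent (p = 0.5); a large deviation
# -> near-certain corrective trade.
print(flip_probability(0.0))                        # -> 0.5
print(round(flip_probability(3.0), 3))
```

The logistic form also guarantees the symmetry p(+d) + p(−d) = 1, so long and short mispricings are treated evenhandedly.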

22 pages, 1120 KB  
Review
Beyond Cognitive Load Theory: Why Learning Needs More than Memory Management
by Andrew Sortwell, Evgenia Gkintoni, Jesús Díaz-García, Peter Ellerton, Ricardo Ferraz and Gregory Hine
Brain Sci. 2026, 16(1), 109; https://doi.org/10.3390/brainsci16010109 - 19 Jan 2026
Viewed by 192
Abstract
Background: The role of cognitive load theory (CLT) in understanding effective pedagogy has received increased attention in the fields of education and psychology in recent years. A considerable amount of literature has been published on the CLT construct as foundational guidance for instructional design by focusing on managing cognitive load in working memory to enhance learning outcomes. However, recent neuroscientific findings and practical critiques suggest that CLT’s emphasis on content-focused instruction and cognitive efficiency may overlook the complexity of human learning. Methods: This conceptual paper synthesises evidence from cognitive science, developmental psychology, neuroscience, health sciences and educational research to examine the scope conditions and limitations of CLT when applied as a general framework for K–12 learning. One of the major theoretical issues identified is the lack of consideration for the broad set of interpersonal and self-management skills, creating potential limitations for real-world educational contexts, where social-emotional and self-regulatory abilities are as crucial as cognitive competencies. Results: As a result of the critique, this paper introduces the Neurodevelopmental Informed Holistic Learning and Development Framework as a neuroscience-informed construct that integrates cognitive, emotional, and interpersonal dimensions essential for effective learning. Conclusions: In recognising the limitations of CLT, the paper offers practitioners contemporary, neurodevelopmentally informed insights that extend beyond cognitive efficiency alone and better reflect the multidimensional nature of real-world learning. Full article
(This article belongs to the Special Issue Neuroeducation: Bridging Cognitive Science and Classroom Practice)

16 pages, 961 KB  
Article
“What Kind of Physical Education Lesson Do I Envision?”: A Theoretically Grounded Analysis Based on Teacher and Student Perspectives
by Rahmi Yıldız and Oğuzhan Çalı
Sustainability 2026, 18(2), 887; https://doi.org/10.3390/su18020887 - 15 Jan 2026
Viewed by 201
Abstract
Physical Education (PE) is envisioned differently across generations, yet these perspectives can be aligned with contemporary curriculum reform. Guided by Strauss–Howe generational theory and Turkey’s 2025 Türkiye Century Education Model, this qualitative study examines lesson design preferences among teachers (Generations X and Y) and students (Generation Z). Thirty-two purposively selected participants from provinces identified by Ministry success indicators completed semi-structured interviews. Data were analysed through directed content analysis alongside thematic analysis. Findings indicate convergence on gamified, technology-supported, and individualized PE with process-oriented, fair assessment. Teachers endorse this vision while foregrounding constraints associated with infrastructure, time, space, and class size. The emergent profile mirrors the 2025 curriculum’s virtue–value–action orientation and its literacy and socio-emotional competencies. Four priorities translate the framework into implementable design: (i) multi-evidence assessment that captures performance and growth, (ii) systematic differentiation via station-based and modular activity designs, (iii) short feedback cycles coupled with structured student-voice mechanisms, and (iv) strengthened school digital infrastructure with targeted professional learning to build digital pedagogical competence. Overall, the study articulates a generationally informed, feasible architecture for PE that bears implications for curriculum development, teacher education, and school improvement. Full article

36 pages, 6828 KB  
Article
Discriminating Music Sequences Method for Music Therapy—DiMuSe
by Emil A. Canciu, Florin Munteanu, Valentin Muntean and Dorin-Mircea Popovici
Appl. Sci. 2026, 16(2), 851; https://doi.org/10.3390/app16020851 - 14 Jan 2026
Viewed by 110
Abstract
The purpose of this research was to investigate whether music empirically associated with therapeutic effects contains intrinsic informational structures that differentiate it from other sound sequences. Drawing on ontology, phenomenology, nonlinear dynamics, and complex systems theory, we hypothesize that therapeutic relevance may be linked to persistent structural patterns embedded in musical signals rather than to stylistic or genre-related attributes. This paper introduces the Discriminating Music Sequences (DiMuSe) method, an unsupervised, structure-oriented analytical framework designed to detect such patterns. The method applies 24 scalar evaluators derived from statistics, fractal geometry, nonlinear physics, and complex systems, transforming sound sequences into multidimensional vectors that characterize their global temporal organization. Principal Component Analysis (PCA) reduces this feature space to three dominant components (PC1–PC3), enabling visualization and comparison in a reduced informational space. Unsupervised k-Means clustering is subsequently applied in the PCA space to identify groups of structurally similar sound sequences, with cluster quality evaluated using Silhouette and Davies–Bouldin indices. Beyond clustering, DiMuSe implements ranking procedures based on relative positions in the PCA space, including distance to cluster centroids, inter-item proximity, and stability across clustering configurations, allowing melodies to be ordered according to their structural proximity to the therapeutic cluster. The method was first validated using synthetically generated nonlinear signals with known properties, confirming its capacity to discriminate structured time series. It was then applied to a dataset of 39 music and sound sequences spanning therapeutic, classical, folk, religious, vocal, natural, and noise categories.
The results show that therapeutic music consistently forms a compact and well-separated cluster and ranks highly in structural proximity measures, suggesting shared informational characteristics. Notably, pink noise and ocean sounds also cluster near therapeutic music, aligning with independent evidence of their regulatory and relaxation effects. DiMuSe-derived rankings were consistent with two independent studies that identified the same musical pieces as highly therapeutic. The present research remains at a theoretical stage. Our method has not yet been tested in clinical or experimental therapeutic settings and does not account for individual preference, cultural background, or personal music history, all of which strongly influence therapeutic outcomes. Consequently, DiMuSe does not claim to predict individual efficacy but rather to identify structural potential at the signal level. Future work will focus on clinical validation, integration of biometric feedback, and the development of personalized extensions that combine intrinsic informational structure with listener-specific response data. Full article
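The DiMuSe pipeline’s final stage, k-Means clustering in the reduced PCA space, can be sketched in miniature. Everything below is illustrative toy data in a 2-D "reduced space", not the paper’s 24-evaluator features, and the deterministic first-k initialization is a simplification for the sketch:

```python
def kmeans(points, k, iters=10):
    """Plain k-means: assign each point to its nearest centroid, then
    recompute centroids as cluster means. Initial centroids are the
    first k points (deterministic, for illustration only)."""
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by squared Euclidean distance
        for i, p in enumerate(points):
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            labels[i] = dists.index(min(dists))
        # update step: move each centroid to the mean of its members
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels, centroids

# Two well-separated "structural" groups in a toy 2-D reduced space:
pts = [(0.0, 0.0), (0.0, 1.0), (10.0, 10.0), (10.0, 11.0)]
labels, cents = kmeans(pts, 2)
print(labels)   # -> [0, 0, 1, 1]
```

A production version would use a library implementation with multiple random restarts and then score partitions with Silhouette or Davies–Bouldin indices, as the abstract describes.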
21 pages, 495 KB  
Article
Does Earning Management Matter for the Tax Avoidance and Investment Efficiency Nexus? Evidence from an Emerging Market
by Ingi Hassan Sharaf, Racha El-Moslemany, Tamer Elswah, Abdullah Almutairi and Samir Ibrahim Abdelazim
J. Risk Financial Manag. 2026, 19(1), 67; https://doi.org/10.3390/jrfm19010067 - 14 Jan 2026
Viewed by 228
Abstract
This study examines the impact of tax avoidance practices on investment efficiency in Egypt, with particular emphasis on the moderating role of earnings management, by exploring whether these tactics reflect managerial opportunism or serve as a mechanism to ease financial constraints. We employ panel data regression to analyze a sample of 58 non-financial firms listed on the Egyptian Exchange (EGX) over the period 2017–2024, yielding 464 firm-year observations. Data are collected from official corporate websites, EGX, and Egypt for Information Dissemination (EGID). Grounded in agency theory, signaling theory, and pecking order theory, this study reveals how conflicts of interest and information asymmetry between managers and stakeholders lead to managerial opportunism. The findings show that tax avoidance undermines investment efficiency in the Egyptian market. Earnings manipulation further intensifies this effect owing to the opacity of financial statements. A closer examination reveals that earnings management exacerbates overinvestment by masking managerial decisions. Conversely, for financially constrained firms with a tendency to underinvest, tax avoidance and earnings management may contribute to improved efficiency by generating internal liquidity and alleviating external financing constraints. These results provide valuable insights for regulators, highlighting that policy should target managerial opportunism and improve transparency, instead of focusing solely on curbing tax avoidance. Investors, for their part, should closely monitor and understand firms’ tax-planning strategies to ensure these enhance firm value. Full article
(This article belongs to the Special Issue Tax Avoidance and Earnings Management)

43 pages, 548 KB  
Review
Minimum Spacetime Length and the Thermodynamics of Spacetime
by Valeria Rossi, Sergio L. Cacciatori and Alessandro Pesci
Entropy 2026, 28(1), 97; https://doi.org/10.3390/e28010097 - 13 Jan 2026
Viewed by 129
Abstract
Theories of emergent gravity have established a deep connection between entropy and the geometry of spacetime by looking at the latter through a thermodynamic lens. In this framework, the macroscopic properties of gravity arise in a statistical way from an effective small-scale discrete structure of spacetime and its information content. In this review, we begin by outlining how theories of quantum gravity imply the existence of a minimum length of spacetime as a general feature. We then describe how such a structure can be implemented in a way that is independent from the details of the quantum fluctuations of spacetime via a bi-tensorial quantum metric qαβ(x, x′) that yields a finite geodesic distance in the coincidence limit x′ → x. Finally, we discuss how the entropy encoded by these microscopic degrees of freedom can give rise to the field equations for gravity through a thermodynamic variational principle. Full article
(This article belongs to the Special Issue Time in Quantum Mechanics)
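The review’s central object can be stated compactly. As a schematic (the notation below is assumed from the general minimum-length literature, not quoted from the paper): the bi-tensorial quantum metric replaces the classical squared geodesic distance, which vanishes in the coincidence limit, with a regularized one that saturates at a minimum length L₀:

```latex
% Schematic behaviour of the quantum-metric distance (illustrative notation):
\lim_{x' \to x} \sigma^{2}(x, x') = 0
  \quad \text{(ordinary metric } g_{\alpha\beta}\text{)},
\qquad
\lim_{x' \to x} \sigma_{q}^{2}(x, x') = L_{0}^{2}
  \quad \text{(quantum metric } q_{\alpha\beta}(x, x')\text{)}.
```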
21 pages, 3957 KB  
Article
Aero-Engine Fault Diagnosis Method Based on DANN and Feature Interaction
by Wei Huo, Baoshan Zhang and Feng Zhou
Machines 2026, 14(1), 96; https://doi.org/10.3390/machines14010096 - 13 Jan 2026
Viewed by 100
Abstract
The fault data of the aero-engine source domain are constrained by factors such as variable operating conditions, structural coupling, fault correlations, and information attenuation. Consequently, the obtained fault features often exhibit locality. This leads to significant discrepancies in fault feature distributions between the source and target domains, resulting in poor generalization capabilities and insufficient stability in aero-engine fault diagnosis. To address these issues, an aero-engine fault diagnosis method based on Domain-Adversarial Neural Network (DANN) and Feature Interaction (FI-DANN) is proposed. Firstly, a fault diagnosis network architecture is designed based on the traditional DANN by incorporating a feature interaction module into its feature extractor. Secondly, the Kronecker product is employed to fully mine nonlinear relationships between features, thereby expanding the feature set into higher-dimensional and more accurate fault features. Finally, based on information entropy theory, the number of interacted features is controlled through a weighted combination, ensuring that the retained features carry greater fault information content. This guarantees the strong generalization capability and high stability of the model. The experimental results show that the best fault diagnosis accuracies of Convolutional Neural Network (CNN), traditional DANN, and FI-DANN are 79.64%, 90.00%, and 99.03%, respectively, indicating that the proposed FI-DANN can effectively integrate multi-source fault information and enhance the accuracy, stability, and generalization capability of fault diagnosis models. Full article
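The FI-DANN abstract uses the Kronecker product to generate pairwise feature interactions. A minimal sketch of that expansion step (pure Python, with made-up feature vectors; the paper’s module additionally weights and prunes the expanded features via information entropy):

```python
def kronecker_interaction(a, b):
    """Kronecker (outer) product of two feature vectors, flattened:
    every pairwise product a_i * b_j becomes a new interaction feature,
    lifting len(a) + len(b) inputs to len(a) * len(b) candidates."""
    return [x * y for x in a for y in b]

features_a = [1.0, 2.0]             # e.g. one branch of the extractor
features_b = [3.0, 4.0, 5.0]        # e.g. another branch
expanded = kronecker_interaction(features_a, features_b)
print(expanded)                     # -> [3.0, 4.0, 5.0, 6.0, 8.0, 10.0]
print(len(expanded))                # -> 6  (2 x 3 interactions)
```

The quadratic growth in feature count is exactly why a selection step (here, the paper’s entropy-weighted combination) is needed downstream.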

31 pages, 13946 KB  
Article
The XLindley Survival Model Under Generalized Progressively Censored Data: Theory, Inference, and Applications
by Ahmed Elshahhat and Refah Alotaibi
Axioms 2026, 15(1), 56; https://doi.org/10.3390/axioms15010056 - 13 Jan 2026
Viewed by 93
Abstract
This paper introduces a novel extension of the classical Lindley distribution, termed the XLindley model, obtained by a specific mixture of exponential and Lindley distributions, thereby substantially enriching the distributional flexibility. To enhance its inferential scope, a comprehensive reliability analysis is developed under a generalized progressive hybrid censoring scheme, which unifies and extends several traditional censoring mechanisms and allows practitioners to accommodate stringent experimental and cost constraints commonly encountered in reliability and life-testing studies. Within this unified censoring framework, likelihood-based estimation procedures for the model parameters and key reliability characteristics are derived. Fisher information is obtained, enabling the establishment of asymptotic properties of the frequentist estimators, including consistency and normality. A Bayesian inferential paradigm using Markov chain Monte Carlo techniques is proposed by assigning a conjugate gamma prior to the model parameter under the squared error loss, yielding point estimates, highest posterior density credible intervals, and posterior reliability summaries with enhanced interpretability. Extensive Monte Carlo simulations, conducted under a broad range of censoring configurations and assessed using four precision-based performance criteria, demonstrate the stability and efficiency of the proposed estimators. The results reveal low bias, reduced mean squared error, and shorter interval lengths for the XLindley parameter estimates, while maintaining accurate coverage probabilities. The practical relevance of the proposed methodology is further illustrated through two real-life data applications from engineering and physical sciences, where the XLindley model provides a markedly improved fit and more realistic reliability assessment.
By integrating an innovative lifetime model with a highly flexible censoring strategy and a dual frequentist–Bayesian inferential framework, this study offers a substantive contribution to modern survival theory. Full article
(This article belongs to the Special Issue Recent Applications of Statistical and Mathematical Models)
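The abstract describes XLindley as a specific exponential–Lindley mixture. Under the commonly used parameterization (an assumption here: weight θ/(1+θ) on the exponential component), the density and distribution function reduce to closed forms, sketched below with a numerical sanity check:

```python
from math import exp

def xlindley_pdf(x, theta):
    """XLindley density, assuming the mixture
    theta/(1+theta) * Exp(theta) + 1/(1+theta) * Lindley(theta),
    which collapses to theta^2 (2 + theta + x) e^{-theta x} / (1+theta)^2."""
    return theta ** 2 * (2 + theta + x) * exp(-theta * x) / (1 + theta) ** 2

def xlindley_cdf(x, theta):
    """Matching CDF: 1 - (1 + theta*x / (1+theta)^2) * e^{-theta x}."""
    return 1 - (1 + theta * x / (1 + theta) ** 2) * exp(-theta * x)

# Sanity check: the CDF should agree with the numerically integrated PDF.
theta, x_max, n = 1.0, 8.0, 8000
dx = x_max / n
integral = sum(xlindley_pdf(i * dx, theta) * dx for i in range(n))
print(round(integral, 3), round(xlindley_cdf(x_max, theta), 3))  # -> 0.999 0.999
```

If this parameterization differs from the paper’s, only the mixture weight changes; the check that the CDF integrates the PDF carries over unchanged.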

23 pages, 838 KB  
Article
Stability for Caputo–Hadamard Fractional Uncertain Differential Equation
by Shida Peng, Zhi Li, Jun Zhang, Yuncong Zhu and Liping Xu
Fractal Fract. 2026, 10(1), 50; https://doi.org/10.3390/fractalfract10010050 - 12 Jan 2026
Viewed by 137
Abstract
This paper focuses on Caputo–Hadamard fractional uncertain differential equations (CH-FUDEs) governed by Liu processes, which combine the Caputo–Hadamard fractional derivative with uncertain differential equations to describe dynamic systems involving memory characteristics and uncertain information. Within the framework of uncertainty theory, the Liu process serves as the counterpart of Brownian motion. We establish some new Bihari-type fractional inequalities that are easy to apply in practice and can serve as a more general tool in some situations. As applications of these inequalities, we establish the well-posedness of the proposed class of equations under specified non-Lipschitz conditions. Building upon this result, we introduce the notions of stability in distribution and stability in measure for solutions to CH-FUDEs, deriving sufficient conditions that ensure these stability properties. Finally, the theoretical findings are verified through two numerical examples. Full article
(This article belongs to the Section General Mathematics, Analysis)
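For orientation, the classical Bihari–LaSalle inequality, of which the fractional inequalities in the abstract above are generalizations, can be sketched as follows. This is our paraphrase of the standard (non-fractional) result, not a statement taken from the article; the paper's Caputo–Hadamard versions replace the ordinary integral with a fractional one.

```latex
% Classical Bihari–LaSalle inequality (standard form).
% Let $u,f\colon[0,T]\to[0,\infty)$ be continuous and let $w$ be
% continuous, nondecreasing, and positive on $(0,\infty)$. If
\[
  u(t) \le \alpha + \int_0^t f(s)\, w\bigl(u(s)\bigr)\, ds,
  \qquad t \in [0,T],
\]
% then, with $G(x) = \int_{x_0}^{x} \frac{dr}{w(r)}$,
\[
  u(t) \le G^{-1}\!\Bigl( G(\alpha) + \int_0^t f(s)\, ds \Bigr)
\]
% for all $t$ such that $G(\alpha) + \int_0^t f(s)\,ds$ lies in the
% domain of $G^{-1}$. Taking $w(r) = r$ recovers Grönwall's inequality.
```

The choice of the nonlinearity $w$ is what allows such inequalities to handle the non-Lipschitz conditions mentioned in the abstract, where a plain Grönwall argument would not apply.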

39 pages, 2940 KB  
Article
Trustworthy AI-IoT for Citizen-Centric Smart Cities: The IMTPS Framework for Intelligent Multimodal Crowd Sensing
by Wei Li, Ke Li, Zixuan Xu, Mengjie Wu, Yang Wu, Yang Xiong, Shijie Huang, Yijie Yin, Yiping Ma and Haitao Zhang
Sensors 2026, 26(2), 500; https://doi.org/10.3390/s26020500 - 12 Jan 2026
Abstract
The fusion of Artificial Intelligence and the Internet of Things (AI-IoT, also widely referred to as AIoT) offers transformative potential for smart cities, yet presents a critical challenge: how to process heterogeneous data streams from intelligent sensing, particularly crowd sensing data derived from citizen interactions such as text, voice, and system logs, into reliable intelligence for sustainable urban governance. To address this challenge, we introduce the Intelligent Multimodal Ticket Processing System (IMTPS), a novel AI-IoT smart system. Unlike ad hoc solutions, the novelty of IMTPS resides in its theoretically grounded architecture, which orchestrates Information Theory and Game Theory for efficient, verifiable extraction and employs Causal Inference and Meta-Learning for robust reasoning, thereby synergistically converting noisy, heterogeneous data streams into reliable governance intelligence. This principled design endows IMTPS with four foundational capabilities essential for modern smart city applications: (1) sustainable and efficient AI-IoT operations: guided by Information Theory, the IMTPS compression module achieves provably efficient semantic-preserving compression, drastically reducing data storage and energy costs; (2) trustworthy data extraction: a Game Theory-based adversarial verification network ensures high reliability in extracting critical information, mitigating the risk of model hallucination in high-stakes citizen services; (3) robust multimodal fusion: the fusion engine leverages Causal Inference to distinguish true causality from spurious correlations, enabling trustworthy integration of complex, multi-source urban data; and (4) an adaptive intelligent system: a Meta-Learning-based retrieval mechanism allows the system to rapidly adapt to new and evolving query patterns, ensuring long-term effectiveness in dynamic urban environments. We validate IMTPS on a large-scale, publicly released benchmark dataset of 14,230 multimodal records. IMTPS demonstrates state-of-the-art performance, achieving a 96.9% reduction in storage footprint and a 47% decrease in critical data extraction errors. By open-sourcing our implementation, we aim to provide a replicable blueprint for building the next generation of trustworthy and sustainable AI-IoT systems for citizen-centric smart cities. Full article
(This article belongs to the Special Issue AI-IoT for New Challenges in Smart Cities)

10 pages, 2429 KB  
Article
The Effect of Coherence on Instruction Following from News Outlets
by Michael O’Sullivan, Conor McCloskey and Louise McHugh
Behav. Sci. 2026, 16(1), 102; https://doi.org/10.3390/bs16010102 - 12 Jan 2026
Abstract
Research has shown that readers prefer to follow information from coherent sources and that incoherent material can diminish their overall trust in a source. In line with relational frame theory, coherence, or "sense-making", has emerged as an important factor in the process of rule-following, but some research has been confounded by the degree of extremity used to establish coherence. The present study investigated the role of speaker coherence in rule-following preferences between newspaper outlets. Participants (N = 64) each viewed four news articles that had been published across two anonymized Irish newspaper outlets. Each article was categorized by its level of coherence and level of controversy. Rule-following was measured as the likelihood of participants adopting new habits and following general advice from the news outlet after reading each story. Participants also selected their preferred outlet from which to follow general advice at the end of the study. Results indicated that participants showed greater rule-following preferences for coherent outlets, regardless of how controversial the material was perceived to be. Speaker coherence is discussed in terms of its impact on the perceived credibility of media outlets and avenues for reducing polarization. Full article
(This article belongs to the Section Cognition)