Search Results (544)

Search Parameters:
Keywords = reconciliation

14 pages, 251 KB  
Article
Redeeming Politics: An Appraisal of the Wesleyan Political Theology of Theodore R. Weber in a Time of Christian Nationalism
by Darryl W. Stephens
Religions 2026, 17(5), 508; https://doi.org/10.3390/rel17050508 - 22 Apr 2026
Abstract
Theodore R. Weber, founding member and past president of the Society of Christian Ethics, developed a distinctive political theology based on John Wesley’s soteriology. Weber argued that a fully developed concept of the political image of God, which humans reflect in their social and institutional contexts, would move beyond the individualistic focus of Wesley’s ordo salutis, allowing Wesleyans a shared language to talk about political institutions and social ethics distinct from the order of preservation and order of creation approaches. Weber’s understanding of political sanctification, in which individuals and institutions contribute to God’s redemptive work of reconciliation, is a unique contribution to political theology. Weber argued that politics is part of discipleship but cautioned against attempts to Christianize society. His Wesleyan political theology provides parameters for faithful political action in refuting and working against a Christian Nationalist agenda in the United States of America. This article offers an appraisal of Weber’s Wesleyan political theology through an extensive review of his writings on the topic. The method engages in close textual readings of Weber’s publications, providing a clear synthesis of the major themes in his scholarly writings spanning over four decades. The purpose is to provide a cogent overview of Weber’s perspective and approach so that his legacy of writings may continue to inform new scholarly audiences. Full article
(This article belongs to the Special Issue Critical Issues in Christian Ethics)
23 pages, 3485 KB  
Article
Physical Key Extraction in Galvanic Coupling Communications: Reliability and Security Analysis
by Giacomo Borghini, Stefano Caputo, Anna Vizziello, Pietro Savazzi, Antonio Coviello, Maurizio Magarini, Sara Jayousi and Lorenzo Mucchi
Information 2026, 17(4), 374; https://doi.org/10.3390/info17040374 - 16 Apr 2026
Abstract
The evolution toward sixth-generation (6G) networks envisions humans as active nodes within a fully interconnected digital ecosystem, supported by data collected from in-body and on-body sensors. Since many of these devices are not equipped to connect directly to 6G networks, Wireless Body Area Networks (WBANs) serve as an essential intermediate layer. However, conventional radio-frequency technologies face limitations in terms of energy efficiency, security, and data integrity, motivating the adoption of lightweight security mechanisms. Physical Layer Security (PLS), and in particular Physical Key Extraction (PKE), offers a promising solution by enabling legitimate devices to derive shared cryptographic keys from the reciprocal properties of the communication channel. Galvanic coupling (GC) communication has recently emerged as an on-body transmission technology alternative to radio-frequency (RF), which exploits low-power electrical signals propagating through biological tissue. Building on prior feasibility studies, this work proposes a PKE framework tailored to GC channels, integrating a lightweight key reconciliation method, based on Hamming (7,4) error-correction codes, and evaluating system performance through dedicated reliability and security Key Performance Indicators (KPIs). Results reveal a trade-off shaped by electrode placement and channel quantization parameters. Among the ones tested, the optimal configuration is achieved with a 3 cm transverse inter-electrode spacing at both transmitter and receiver, and a 3 cm longitudinal separation between transmitter and receiver, by quantizing the channel impulse response with two quantization bits. While this work focuses on validating the method in controlled conditions in order to establish a reliable study framework, future developments will focus on enhanced reconciliation, privacy amplification, and analysis of the GC channel considering physiological and environmental variations. Full article
(This article belongs to the Special Issue Advances in Wireless Communications Systems, 3rd Edition)
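The Hamming (7,4) reconciliation step mentioned in the abstract above can be illustrated with a short sketch. This is a generic syndrome-based (code-offset) scheme, not the authors' implementation; the function names and the choice to disclose one party's 3-bit syndrome per 7-bit key block are illustrative assumptions.

```python
import numpy as np

# Parity-check matrix of the Hamming (7,4) code over GF(2).
# Column i (1-indexed) is the binary representation of i, so a nonzero
# syndrome directly names the position of a single-bit error.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]], dtype=np.uint8)

def syndrome(block):
    """3-bit syndrome of a 7-bit block."""
    return (H @ block) % 2

def reconcile_block(alice_block, bob_block):
    """Bob corrects his 7-bit quantized-channel block toward Alice's
    using only Alice's publicly disclosed syndrome: the XOR of the two
    syndromes equals H times the error pattern, which for a single
    mismatched bit is the column naming that bit's position."""
    diff = syndrome(alice_block) ^ syndrome(bob_block)
    pos = diff[0] * 4 + diff[1] * 2 + diff[2]   # columns encode positions 1..7
    corrected = bob_block.copy()
    if pos:                                      # nonzero syndrome -> flip that bit
        corrected[pos - 1] ^= 1
    return corrected
```

One 3-bit public disclosure corrects any single-bit mismatch per 7-bit block, which is why such codes are regarded as lightweight enough for body-area devices.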
25 pages, 595 KB  
Article
The Socio-Religious Forgiveness and Reconciliation in Desmond Mpilo Tutu, as a Possible Inspiration for the Post-Genocide Rwanda
by Celestin Ntaganira
Religions 2026, 17(4), 474; https://doi.org/10.3390/rel17040474 - 9 Apr 2026
Abstract
Rwanda has experienced the tragedy of conflict and hatred based on a politically constructed notion of ethnicity. This condition worsened under the post-colonial leaders and culminated in the 1994 genocide against the Tutsi population. Although there is currently no open violence in Rwanda, significant elements of socio-religious crisis remain as consequences of the recent genocide and war. This article therefore examines what has been done in society and in the Catholic Church, to which 44% of Rwandans belong, and identifies the weak points in the Church's and the State's reconciliation efforts that could be improved by drawing inspiration from Desmond Mpilo Tutu's concept of reconciliation. The study adopts a comparative and hermeneutic method, analysing the various sources on the Rwandan journey of forgiveness and reconciliation alongside the forgiveness and reconciliation work of Desmond Mpilo Tutu. Bringing the two contexts together shows that both victims and perpetrators need the restoration of their humanity and dignity, and that there is “no future without forgiveness”. Full article
(This article belongs to the Special Issue Social Justice in Theological Education: Challenges and Opportunities)
33 pages, 430 KB  
Article
The Yamabe Flow Under the Rotational Ansatz of Noncompact (Pseudo-Riemannian) Solitons: Schwarzschild Solitons and Generalized-Schwarzschild Ones
by Orchidea Maria Lecian
Axioms 2026, 15(4), 267; https://doi.org/10.3390/axioms15040267 - 7 Apr 2026
Abstract
The present paper studies the convergence of the Yamabe flow in the case of noncompact solitons. The more specific example of locally conformally flat noncompact solitons is addressed with the aim of re-analysing the properties of the Ricci scalar. The particular case of noncompact pseudo-Riemannian solitons is studied; moreover, in the instances of Schwarzschild and Generalized-Schwarzschild geometries, rescalings of spherically symmetric weights are performed, and new results are obtained for the structures considered. The Myers Theorem is upgraded to a new Myers paradigm of spacetime-dimensional manifolds, in which the Einstein Field Equations can be taken into account; in particular, the Myers Theorems are studied with respect to their implementation in General Relativity Theory. As a first important result, the Myers mean curvature is found to coincide with the Ricci scalar in General Relativity when the 4-position of the observer, from which the 4-velocity 4-vector is calculated, is taken as that of an observer comoving with the reference frame of the photon. The umbilicity conditions are then applied, and their role in General Relativity after the Myers Theorems is studied for weighted manifolds, for which specific new implications are developed.
The weighted Schwarzschild and weighted Generalized-Schwarzschild manifolds are then described: the Birkhoff Theorem is reconciled with the rotational ansatz of the metrised solitons, and comparison with previous results on the Brendle non-metrised solitons highlights the role of the new rescalings of the metric tensor relative to the known scalings for non-metrised solitons. Within this framework, these procedures allow one to prove the reconciliation of the Einstein Field Equations with the Yamabe flow. The flow on the tipping lightcones is written for the first time. The umbilicity condition is studied in General Relativity after the upgrade of the Myers Theorems with respect to the sectional curvatures; as a result, the Calabi–Bernstein description and the Chen–Yau requirements are implemented in General Relativity, and the cases of weighted manifolds are taken into account. More specifically, the equal-time 2-dimensional space surfaces onto which the weighted General-Relativistic solitons satisfying the Einstein field equations after the Yamabe flow are projected under the rotational ansatz are studied analytically. As an accessory introductory result, the class of Wu non-metrised solitons is shown to be excluded in several respects, as the conditions provided in Wu's work are not compatible with metrisation. Full article
(This article belongs to the Section Hilbert’s Sixth Problem)
17 pages, 673 KB  
Article
Quality of Drug Allergy Documentation in a Resource-Limited Paper-Based Hospital in Pakistan: Audit of Concordance and Completeness
by Akef Obeidat, Athar Ud Din, Muhammad Amir Khan, Amara Asad Khan, Eshal Atif, Muhammad Atif Mazhar, Muhammad Zain Khan and Sadia Qazi
Healthcare 2026, 14(7), 957; https://doi.org/10.3390/healthcare14070957 - 6 Apr 2026
Abstract
Background/Objectives: Accurate drug allergy documentation is essential for patient safety; however, documentation quality remains poor worldwide. In resource-limited settings that rely on paper records, allergy information may become fragmented across multiple forms, and evidence on concordance between paper-based documentation systems is limited. This audit assessed concordance between clinical notes and drug Kardex records, and completeness of drug allergy documentation entries, in a manual hospital system. Methods: This retrospective clinical audit, reported in accordance with SQUIRE 2.0 guidelines, examined 88 randomly selected patient records from 525 consecutive admissions to a general medicine ward in Pakistan during June–July 2024, retrospectively reviewed in August 2024. The audit assessed allergy status documentation in clinical notes and the drug Kardex, evaluated completeness against five internationally recommended elements (drug name, reaction description, severity, date, and treatment), and measured inter-system concordance using McNemar’s test and Cohen’s kappa. Results: Drug allergy status was documented in 25.0% of clinical notes (95% CI: 16.5–35.4%) versus 94.3% of drug Kardex records (95% CI: 87.2–98.1%), representing a 69.3 percentage-point gap (McNemar χ2 = 59.06, p < 0.001). Inter-system agreement was poor (κ = 0.0079; 95% CI: −0.046 to 0.062), with an overall concordance of 28.4%. Discordant pairs showed that undocumented allergy status was far more likely in clinical notes than in the drug Kardex (OR = 62.00). Kardex-only documentation occurred in 62 of 88 patients (70.5%). Among nine patients with documented allergy history in at least one source, none met the five-element completeness standards (0%; 95% CI: 0.0–33.6%). Recorded entries were generic statements such as “drug allergy” or “allergic to antibiotics” without clinically actionable details. 
Conclusions: Drug allergy documentation showed two major quality failures: poor concordance between parallel paper records and lack of actionable detail in recorded entries. The two systems functioned independently rather than as complementary safety checks, with allergy information often present in the drug Kardex but absent from clinical notes. This Kardex-only failure mode may be a practical target for quality improvement through structured five-element templates, prompts for clinicians to review the drug Kardex, and interdisciplinary allergy-reconciliation workflows. These strategies require prospective evaluation in this setting. Full article
(This article belongs to the Section Healthcare Quality, Patient Safety, and Self-care Management)
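The two headline agreement statistics above (McNemar χ² = 59.06 and κ ≈ 0.0079) can be checked from the figures given in the abstract. In the sketch below, the 2×2 paired-table cells are reconstructed from the reported marginals (22 of 88 notes-documented, 83 Kardex-documented, 25 concordant, 62 Kardex-only); this reconstruction is illustrative, not a table published in the paper.

```python
# 2x2 paired table for allergy documentation (yes/no) in the two systems,
# reconstructed from the reported marginals:
#   a = documented in both   = 83 Kardex - 62 Kardex-only = 21
#   b = clinical notes only  = 22 notes - 21 both          = 1
#   c = drug Kardex only     = 62 (reported)
#   d = documented in neither = 88 - 21 - 62 - 1           = 4
a, b, c, d = 21, 1, 62, 4
n = a + b + c + d

# McNemar's chi-square (no continuity correction) uses only the
# discordant pairs b and c.
mcnemar_chi2 = (b - c) ** 2 / (b + c)

# Cohen's kappa: observed agreement vs chance agreement from the marginals.
po = (a + d) / n
pe = ((a + b) / n) * ((a + c) / n) + ((c + d) / n) * ((b + d) / n)
kappa = (po - pe) / (1 - pe)

print(round(mcnemar_chi2, 2))  # 59.06, matching the reported statistic
print(round(kappa, 4))         # 0.0079, matching the reported kappa
```

That a near-zero kappa coexists with a 94.3% documentation rate in one system is exactly the "parallel records functioning independently" failure the authors describe.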
33 pages, 515 KB  
Article
From Nonviolence to Reconciliation: The Prophetic Political Ethics of War and Peace
by Harris Sadik Kirazli
Religions 2026, 17(4), 449; https://doi.org/10.3390/rel17040449 - 4 Apr 2026
Abstract
This article re-examines Islamic ethics of war and peace by returning to the formative Meccan–Medinan trajectory of the Prophet Muḥammad’s life, where early Islamic moral reasoning developed amid persecution, migration, diplomacy, and armed conflict. Contemporary debates frequently portray Islam either as a tradition that sacralizes violence through jihad or as one that reduces peace to purely inward spirituality. Both perspectives obscure the historically grounded ethical discourse that emerged within the early Muslim community. This study argues that the Qurʾān—understood within the Islamic tradition as the authoritative source of ethical guidance—together with prophetic practice articulated a coherent moral framework governing the use of force, the pursuit of peace, and the restoration of social order after conflict. Drawing on Qurʾānic discourse, canonical ḥadīth, classical tafsīr and sīrah literature, and modern scholarship in Islamic studies, religious ethics, and conflict resolution theory, the article reconstructs how early Islamic sources represent the ethical regulation of violence. The analysis identifies a threefold trajectory in prophetic practice: a Meccan phase characterized by nonviolent endurance and moral witness under persecution; a Medinan phase marked by constitutional governance, plural coexistence, and tightly regulated defensive warfare; and a culminating ethic of negotiated peace and post-conflict reconciliation exemplified in the Treaty of Ḥudaybiyyah and the Conquest of Mecca. Taken together, these stages reveal an integrated moral vision in which force is neither celebrated nor treated as a default instrument of political expansion, but permitted only under strict ethical constraints shaped by justice (ʿadl), mercy (raḥma), proportionality, and the protection of communal life. 
By reconstructing this early prophetic framework, the article demonstrates that Islamic sources contain significant internal resources for limiting violence, regulating warfare, and prioritizing reconciliation. In doing so, it contributes to contemporary scholarship on Islamic ethics and situates the prophetic model within broader global debates on the moral regulation of war, peacebuilding, and post-conflict justice. Full article
(This article belongs to the Special Issue The Ethics of War and Peace: Religious Traditions in Dialogue)
42 pages, 4153 KB  
Article
Hierarchical Reconciliation of Fifty-One Years of Highway–Rail Grade Crossing Data with Verified Multistage Inference
by Raj Bridgelall
Algorithms 2026, 19(4), 282; https://doi.org/10.3390/a19040282 - 3 Apr 2026
Abstract
Highway–rail grade crossing (HRGC) safety research relies on federal incident and inventory datasets that span multiple decades. However, inconsistencies in geographic identifiers and incomplete reconstruction of crossing denominators can distort exposure-based rate metrics. This study develops, documents, and validates a transparent nine-stage reconciliation pipeline applied to 51 years (1975–2025) of national HRGC incident data from the Federal Railroad Administration Form 57 and Form 71 datasets. The hierarchical pipeline integrated deterministic alignment and multistage inference methods to produce an audited, geographically consistent dataset. The study formalizes four longitudinal county-level cumulative exposure indices that characterize spatiotemporal patterns of incident concentration relative to static population and infrastructure denominators. These metrics include accumulated incidents per million population (AIPM), accumulated incidents per crossing (AIPC), crossings per million population (CPM), and crossings per 100 square miles (CPHSM). All four metrics exhibited pronounced right-skewness: AIPM, CPM, and CPHSM approximated exponential forms, and AIPC approximated a log-normal form. Statistical tests detected statistically significant tail deviations in three metrics; CPM did not reject the exponential fit at conventional significance levels. Spatial analysis shows coherent regional concentration in incident rates in the Central Plains and lower Mississippi corridors. The national time series exhibits a late-1970s plateau, sustained exponential decline beginning around 1980, and stabilization but persistent incident rates after 2001. Population-normalized AIPM remained statistically indistinguishable between the reconciled and record-dropped datasets; however, crossing-based metrics changed materially when reconstructing denominators from the reconciled crossing universe. 
Statistical comparisons confirmed that incident-only denominators introduced substantial measurement bias in local risk assessment. State-level rank reversals persisted even when omnibus distributional tests failed to reject equality. By formalizing multistage data cleaning and quantifying its analytical impact over an unprecedented longitudinal horizon, this study establishes denominator integrity and geographic reconciliation as prerequisites for valid HRGC exposure assessment and provides a framework for future predictive modeling. Full article
(This article belongs to the Special Issue Transportation and Traffic Engineering)
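The four cumulative exposure indices named above are simple ratios against static denominators. A minimal sketch, using hypothetical county totals rather than FRA data (the function name and argument shapes are assumptions):

```python
def exposure_indices(incidents, crossings, population, area_sq_miles):
    """County-level cumulative exposure indices as defined in the text:
    AIPM  = accumulated incidents per million population
    AIPC  = accumulated incidents per crossing
    CPM   = crossings per million population
    CPHSM = crossings per 100 square miles
    Inputs are hypothetical county totals, not FRA records."""
    return {
        "AIPM": incidents / (population / 1_000_000),
        "AIPC": incidents / crossings,
        "CPM": crossings / (population / 1_000_000),
        "CPHSM": crossings / (area_sq_miles / 100),
    }
```

For example, a county with 50 cumulative incidents, 200 crossings, 250,000 residents, and 1,000 square miles yields AIPM = 200, AIPC = 0.25, CPM = 800, and CPHSM = 20. The study's point about denominator integrity is visible here: AIPC and CPM change whenever the reconstructed crossing universe changes, even if the incident counts do not.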
35 pages, 1234 KB  
Article
EHMN 2026: A Thermodynamically Refined, SBML-Standardised Human Metabolic Network for Genome-Scale Analysis and QSP Integration
by Igor Goryanin, Leonid Slovianov, Stephen Checkley and Irina Goryanin
Metabolites 2026, 16(4), 236; https://doi.org/10.3390/metabo16040236 - 31 Mar 2026
Abstract
Background: Genome-scale metabolic models (GEMs) are foundational tools for systems biology, enabling quantitative interrogation of human metabolism across physiological and pathological states. However, many legacy reconstructions exhibit heterogeneous identifier usage, incomplete pathway integration, and limited thermodynamic refinement, constraining reproducibility, interoperability, and translational applicability. Methods: We present EHMN 2026, an update of the Edinburgh Human Metabolic Network. The reconstruction was refined through systematic identifier reconciliation using MetaNetX and ChEBI mappings, duplicate reaction consolidation, thermodynamic directionality assessment, and structured pathway annotation via Reactome. The final model was encoded in Systems Biology Markup Language (SBML) Level 3 Version 2 with the Flux Balance Constraints (FBC2) package, ensuring explicit gene–protein–reaction (GPR) representation and compatibility with modern constraint-based modelling toolchains. Results: EHMN 2026 comprises 11 compartments, 14,321 metabolites (species), and 22,642 reactions, supported by 3996 gene products. Of all reactions, 9638 (42.6%) contain GPR associations, linking metabolic transformations to 2887 unique Ensembl gene identifiers (ENSG). Pathway integration yielded 2194 unique Reactome identifiers, providing structured pathway-level organisation of metabolic functions. Thermodynamic refinement reduced infeasible energy-generating cycles and improved reaction directionality coherence while preserving global network connectivity. The reconstruction is fully SBML-compliant and portable across major modelling platforms. 
Compared with Recon3D and Human1, EHMN 2026 uniquely combines native Reactome reaction-level annotation, systematic MetaNetX identifier harmonisation, documented thermodynamic cycle elimination (37 cycles, 0 remaining), and an 11-compartment architecture supporting organelle-specific modelling—features designed for QSP and multi-layer integration applications. Conclusions: EHMN 2026 delivers a rigorously harmonised, thermodynamically refined, and pathway-annotated human metabolic reconstruction with enhanced annotation depth and standards-based interoperability. By combining genome-scale coverage with structured gene and pathway integration, the model establishes a robust computational backbone for reproducible metabolic analysis and provides a scalable foundation for future multi-layer systems pharmacology and integrative modelling frameworks. Full article
11 pages, 2557 KB  
Review
TB Data Improvement in Nkembo Health Treatment Center in Libreville, Gabon
by Casimir Manzengo, Farai Mavhunga, Nlandu Roger Ngatu, Fleur Lignenguet, Stredice Manguinga and Ghislaine Asseko Nkone
Trop. Med. Infect. Dis. 2026, 11(4), 90; https://doi.org/10.3390/tropicalmed11040090 - 27 Mar 2026
Abstract
Although the estimated tuberculosis (TB) incidence in Gabon is declining, there have been challenges with treatment coverage and with the documentation of HIV status and treatment outcomes. Thus, the National TB Program (NTP) conducted an innovative data review at the Nkembo Health Treatment Center in Libreville, which manages more than 70% of Gabonese TB patients. Since our hypothesis was that the Nkembo treatment center was struggling with data mismanagement due to the workload, the objective was to perform a TB data quality review and triangulation exercise at the Nkembo health facility in Libreville, from January to August 2023, and propose recommendations for data improvement. Methods: The study used data reconciliation, a process that compares and aligns data from multiple sources to ensure consistency, accuracy, and integrity by identifying and resolving discrepancies between datasets. Using the “TB onion model”, the analysis identified data mismanagement as a key contributor to underreporting. A data review compared TB records to TB registry data and patient folders from January to August 2023 for notification, and to the 2022 cohort for treatment results. The study focused on notified TB cases, HIV status and TB treatment outcome documentation. Discrepancies were reconciled, and treatment outcomes re-evaluated. Results: After review, statistically significant increases were observed: +22% for total TB cases (p = 0.0003), +141% for the number of TB cases with known HIV status (p = 0.0017) and +104% for the number of TB cases successfully treated (p = 0.0001), as compared with the previous data. Discussion: This data reconciliation showed the usefulness of triangulation across data sources to improve the completeness of data. The previously reported data thus underestimated the number of notified cases, the documentation of HIV status, and treatment success. 
Conclusions: The study shows that data reconciliation can improve TB programmatic data completeness to better reflect program performance. Full article
(This article belongs to the Special Issue Tuberculosis Control in Africa and Asia)
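The reconciliation method described above — comparing case records across the TB registry, patient folders, and reports — can be sketched generically as set-based triangulation. This is an illustrative sketch under assumed data shapes (`reconcile_sources` and the ID-set representation are mine), not the NTP's actual procedure:

```python
def reconcile_sources(sources):
    """Triangulate patient identifiers across data sources.

    `sources` maps a source name (e.g. 'register', 'folders') to the set
    of case IDs found in it. Returns the reconciled case universe (the
    union of all sources) and, per source, the IDs that source is
    missing -- the discrepancies a reviewer would resolve against the
    primary records before re-evaluating outcomes."""
    universe = set().union(*sources.values())
    missing = {name: sorted(universe - ids) for name, ids in sources.items()}
    return universe, missing
```

Because the reconciled universe is a union, any single under-kept source can only add cases after review, which is consistent with the upward revisions (+22% cases, +141% known HIV status) reported above.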
15 pages, 1364 KB  
Article
CAMS F Edge DTN: Context-Aware Offline-First Synchronization and Local Reasoning Using CRDTs and MQTT-SN
by Nelson Iván Herrera, Estevan Ricardo Gómez-Torres, Edgar E. González, Renato M. Toasa and Paúl Baldeón
Future Internet 2026, 18(4), 180; https://doi.org/10.3390/fi18040180 - 26 Mar 2026
Abstract
Context-aware mobile applications operating in environments with intermittent or unreliable connectivity must support offline-first behavior while preserving consistent decision-making and timely synchronization. Traditional cloud-centric architectures often fail to provide adequate availability, responsiveness, and reliable context reasoning under such conditions. This paper presents CAMS-F Edge DTN, an edge-centric runtime designed to support offline-first context-aware applications operating under intermittent connectivity. The proposed approach extends the CAMS domain-specific language (DSL) with declarative policies for semantic reconciliation, opportunistic synchronization, and context-aware conflict resolution. The runtime integrates Conflict-Free Replicated Data Types (CRDTs), opportunistic communication channels such as Bluetooth and Wi-Fi Direct, and MQTT-SN messaging to enable robust data exchange across mobile, vehicular, and edge nodes. CAMS-F Edge DTN supports offline-first execution by allowing applications to evaluate contextual rules locally and reconcile distributed state asynchronously when connectivity becomes available. The approach is evaluated through controlled experiments and case studies in rural logistics and healthcare distribution scenarios. The experimental results show that the proposed architecture maintains 96–99% operational availability under intermittent connectivity and up to 100% availability during fully offline operation, while achieving low-latency local reasoning (<10 ms median latency) and deterministic state convergence through CRDT-based synchronization mechanisms. Full article
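The deterministic state convergence claimed above rests on CRDT merge functions being commutative, associative, and idempotent. The abstract does not specify which CRDT types the runtime uses, so the sketch below shows the simplest illustrative one, a grow-only counter, purely as an assumption-labeled example of the property:

```python
class GCounter:
    """Grow-only counter CRDT: each node increments only its own slot,
    and merge takes the per-node maximum, so replicas converge to the
    same value regardless of the order in which updates are exchanged."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.counts = {}          # node_id -> that node's increment total

    def increment(self, amount=1):
        self.counts[self.node_id] = self.counts.get(self.node_id, 0) + amount

    def merge(self, other):
        # Elementwise max: commutative, associative, and idempotent,
        # which is what makes offline-first reconciliation deterministic.
        for node, value in other.counts.items():
            self.counts[node] = max(self.counts.get(node, 0), value)

    def value(self):
        return sum(self.counts.values())
```

Two nodes can increment while fully offline and later exchange state over any opportunistic channel, in either order and any number of times, and still agree on the total.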
44 pages, 643 KB  
Article
A Hybrid Multi-Agent System for Early Scam Detection in Crypto-Assets
by Mario Trerotola, Mimmo Parente and Davide Calvaresi
Appl. Sci. 2026, 16(7), 3122; https://doi.org/10.3390/app16073122 - 24 Mar 2026
Abstract
The rapid expansion of crypto-asset markets and the introduction of the Markets in Crypto-Assets Regulation (MiCAR) pose novel supervisory challenges. Existing blockchain intelligence platforms focus predominantly on on-chain surveillance, leaving gaps in off-chain documentary due diligence automation. This paper presents a Multi-Agent System [...] Read more.
The rapid expansion of crypto-asset markets and the introduction of the Markets in Crypto-Assets Regulation (MiCAR) pose novel supervisory challenges. Existing blockchain intelligence platforms focus predominantly on on-chain surveillance, leaving gaps in off-chain documentary due diligence automation. This paper presents a Multi-Agent System (MAS) integrating Large Language Model (LLM) capabilities with rule-based compliance frameworks. The architecture comprises seven specialized agents: a Coordinator Agent for orchestration; data acquisition agents (Searcher, Crawler); three parallel analytical agents—Heuristic Agent (LLM-powered qualitative risk assessment), Compliance Agent (hybrid-AI MiCAR asset classification and regulatory requirement verification), and On-Chain Agent (machine learning-based fraud detection); and a Reconciliator Agent synthesizing findings into unified alerts. Component-level empirical validation on 150 projects indicates 95% output reproducibility (identical alert tier and score deviation 0.05 across five reruns) and 210 s mean latency, providing proof-of-concept evidence for the integrated pipeline. A pilot user evaluation (six researchers/master students and two experts from regulatory authorities) provides preliminary usability evidence and surfaces domain-specific feedback from regulatory-authority experts. The architecture advances proactive regulatory technology by enabling scalable analysis combining off-chain documentary evidence with on-chain forensics. Full article
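The reproducibility KPI above ("identical alert tier and score deviation 0.05 across five reruns") can be expressed as a small check. This is one reading of the criterion, sketched under assumed data shapes; the function name and the `(tier, score)` pair representation are not from the paper:

```python
def is_reproducible(runs, score_tol=0.05):
    """Check one project's reruns against the reproducibility criterion:
    every rerun must yield the same alert tier, and the scores must stay
    within score_tol of each other. `runs` is a list of
    (alert_tier, score) pairs, one per rerun."""
    tiers = {tier for tier, _ in runs}
    scores = [score for _, score in runs]
    return len(tiers) == 1 and (max(scores) - min(scores)) <= score_tol
```

Applied per project over five reruns, the fraction of projects passing this check would correspond to the 95% reproducibility figure reported for the 150-project validation.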
16 pages, 322 KB  
Article
Daisaku Ikeda’s Philosophy and Practice of Interfaith Dialogue and the UN Sustainable Development Goals (SDGs): Human Revolution and Pathways to Global Peace
by Chang-Eon Lee
Religions 2026, 17(3), 375; https://doi.org/10.3390/rel17030375 - 17 Mar 2026
Abstract
This paper examines the philosophy and practice of interfaith dialogue (IFD) developed by Daisaku Ikeda (1928–2023), a prominent religious leader and peace philosopher. It explores how his dialogical approach can contribute to the United Nations’ Sustainable Development Goals (SDGs) and pathways to global peace. Ikeda’s dialogue is not confined to doctrinal debate or temporary reconciliation among faith communities. Rather, it is framed as a transformative process in which participants from diverse religious and civilizational traditions rebuild relationships through mutual respect and understanding, thereby contributing to personal transformation and broader societal change. Focusing on Ikeda’s core concepts—humanism, the dignity of life, and human revolution—this study first clarifies the philosophical foundations of his interfaith dialogue rooted in Nichiren Buddhism and a life-affirming worldview. It then examines major dialogues with global thinkers and leaders (e.g., Arnold J. Toynbee, Linus Pauling, Mikhail Gorbachev, and Johan Galtung) and selected institutional practices associated with Soka Gakkai International (SGI), the Institute of Oriental Philosophy (IOP), and the Ikeda Center for Peace, Learning, and Dialogue. These cases illustrate how Ikeda’s IFD functions as praxis for civilizational understanding, social cohesion, conflict transformation, and solidarity for the public good. The paper further analyzes the linkages between Ikeda’s IFD and SDG 16 (Peace, Justice and Strong Institutions), SDG 17 (Partnerships for the Goals), SDG 4 (Quality Education—especially Target 4.7 on Global Citizenship Education), and SDG 10 (Reduced Inequalities). It argues that IFD can operate as both a normative and practical resource for mitigating religious conflict, strengthening inclusion, enhancing global citizenship education and education for sustainable development (ESD), and fostering multistakeholder partnerships. 
The paper also reflects on the challenges of translating an approach grounded in a particular religious tradition into broader SDG governance contexts. Full article
15 pages, 211 KB  
Article
Beyond Alternative History: Time Travel and Historical Continuity in Kindred and The Incident at the Gamō Residence
by Kumiko Saito
Literature 2026, 6(1), 5; https://doi.org/10.3390/literature6010005 - 17 Mar 2026
Abstract
Time travel in science fiction, a subgenre distinct yet often overlapping with alternative history, often explores historical contingency through counterfactual scenarios to produce alternative histories. Yet some works deliberately negate this potential, presenting time travelers who refrain from altering the past despite possessing the apparent ability to do so. This essay examines this underexplored narrative mode through a comparative analysis of Octavia E. Butler’s Kindred and Miyabe Miyuki’s The Incident at the Gamō Residence. Framing the narrative device as a non-interventionist history, it explores how both novels deploy time travel not to revise history but to confront the ethical, emotional, and cultural implications of engaging with historically traumatic events that remain causally intact. Drawing on science fiction theory and historiographical debates, the essay argues that these texts redirect the function of time travel toward ethical reflection, embodied experience, and the formation of national identity. While Kindred presents history as an ongoing system of racialized violence that resists reconciliation, The Incident at the Gamō Residence frames historical violence through affective memory and postwar nostalgia, facilitating symbolic closure. Together, these novels demonstrate how time travel can serve as a critical apparatus for negotiating national trauma without recourse to historical revision. Full article
16 pages, 255 KB  
Article
Green Growth or Grey Gains: Rethinking Financial Development and Foreign Direct Investment Impacts on Ecological Sustainability in Sub-Saharan Africa
by Wisdom Okere and Cosmas Ambe
Sustainability 2026, 18(6), 2782; https://doi.org/10.3390/su18062782 - 12 Mar 2026
Abstract
Regulatory bodies have observed an increase in environmental issues due to firms' interactions with the environment. Nonetheless, reconciliation actions are emerging, driven by the pursuit of sustainable development goals. This study investigated the impact of financial development and foreign direct investment on ecological footprints in sub-Saharan African nations, while examining the mediating role of regulatory quality and control of corruption. The research was motivated by growing environmental degradation in the region amid rising capital inflows and financial market expansion. Using panel data for 18 sub-Saharan African countries between 1996 and 2023, sourced from the World Bank database and the Worldwide Governance Indicators, we employed an Autoregressive Distributed Lag model to assess the short- and long-run relationships among ecological footprint, financial development, foreign direct investment, and key institutional factors. Results from the baseline model show that financial development significantly increases ecological footprints, while the effect of foreign direct investment is insignificant in the absence of institutional factors. However, when the mediating variables are introduced, foreign direct investment significantly worsens the ecological footprint, and regulatory quality and control of corruption show strong moderating effects, confirming the pollution haven hypothesis. All control variables (trade openness, gross domestic product per capita, government expenditure, and population density) also show significant associations with environmental sustainability. The findings underscore the importance of institutional factors in shaping sustainable foreign direct investment flows and financial systems, and offer policy pathways for aligning investment strategies with sustainability goals in sub-Saharan Africa. Recommendations include strengthening national institutional frameworks, linking foreign direct investment to environmental compliance, and promoting green finance policies across the region. Full article
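For readers less familiar with the estimator, a generic ARDL(p, q) specification of the kind the study estimates can be written as follows. This is an illustrative sketch only; the symbols are placeholders, not the authors' exact model.

```latex
EF_t = \alpha + \sum_{i=1}^{p} \beta_i \, EF_{t-i} + \sum_{j=0}^{q} \gamma_j \, X_{t-j} + \varepsilon_t
```

Here EF_t is the ecological footprint in year t and X_t collects the regressors (financial development, foreign direct investment, and the institutional and control variables); the lagged dependent-variable terms let the model separate short-run dynamics from the long-run equilibrium relationship.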
25 pages, 639 KB  
Article
AI-Assisted Value Investing: A Human-in-the-Loop Framework for Prompt-Guided Financial Analysis and Decision Support
by Andrea Caridi, Marco Giovannini and Lorenzo Ricciardi Celsi
Electronics 2026, 15(6), 1155; https://doi.org/10.3390/electronics15061155 - 10 Mar 2026
Abstract
Value investing remains grounded in intrinsic value estimation, margin-of-safety reasoning, and disciplined fundamental analysis, but its practical execution is increasingly constrained by the scale, heterogeneity, and velocity of modern financial information. Recent advances in artificial intelligence (AI), particularly large language models and automated information-extraction systems, create new opportunities to accelerate financial analysis; however, their outputs remain probabilistic, context-dependent, and potentially error-prone, making governance and verification essential. This article proposes an AI-assisted value investing framework that integrates automated extraction, valuation modeling, explainability, and human-in-the-loop (HITL) supervision into a unified decision-support architecture. The framework is organized into three layers: (i) a data layer for traceable extraction and normalization of structured and unstructured financial information; (ii) a modeling layer for automated key performance indicator (KPI) computation, forecasting support, and discounted cash flow (DCF) valuation; and (iii) an explainability and governance layer for traceability, verification, model-risk control, and analyst oversight. A central contribution of the paper is the operational characterization of prompt literacy as a determinant of analytical reliability, showing that structured, context-aware prompts materially affect extraction correctness, usability, and verification effort. The framework is evaluated through a case study using Rivanna AI on three large U.S. beverage firms—namely, The Coca-Cola Company, PepsiCo, and Keurig Dr Pepper—selected as a controlled, information-rich setting for comparative analysis. 
The results indicate that the proposed workflow can reduce end-to-end analysis time from approximately 25–40 h in a traditional manual process to approximately 8–12 h in an AI-assisted setting, including citation/source verification, unit and period reconciliation, and review of key valuation assumptions. The reported hour savings should be interpreted as conservative estimates from the initial deployment phase; additional efficiency gains are expected as operational maturity increases, driven by learning-economy effects. Rather than eliminating analyst effort, AI shifts it from manual information processing toward verification, adjudication, and interpretation, redefining the role of the financial analyst from manual data processor to reasoning architect, responsible for designing, guiding, and validating AI-assisted analytical workflows. Overall, the findings position AI not as an autonomous decision-maker, but as a governed reasoning accelerator whose effectiveness depends on structured human guidance, traceability, and disciplined validation. Full article
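As background for the discounted cash flow (DCF) valuation and margin-of-safety reasoning the abstract invokes, the core calculation can be sketched as follows. This is a generic textbook illustration, not Rivanna AI's valuation engine; all cash flows, rates, and prices are invented.

```python
def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Present value of explicit-period cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    # Terminal value: final cash flow grown once, capitalized at (r - g),
    # then discounted back from the end of the explicit period.
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    return pv + terminal / (1 + discount_rate) ** len(cash_flows)

def margin_of_safety(intrinsic_value, market_price):
    """Fraction by which the market price sits below intrinsic value (negative if above)."""
    return (intrinsic_value - market_price) / intrinsic_value

# Invented figures: three years of cash flows, 8% discount rate, 2% terminal growth.
intrinsic = dcf_value([100, 105, 110], discount_rate=0.08, terminal_growth=0.02)
mos = margin_of_safety(intrinsic, market_price=1400.0)
```

In a human-in-the-loop workflow of the kind described, the AI layer extracts and normalizes the cash-flow inputs, while the analyst verifies those inputs and the discount-rate and growth assumptions before acting on the resulting margin of safety.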
(This article belongs to the Special Issue Feature Papers in Artificial Intelligence)