Search Results (283)

Search Parameters:
Keywords = causal discovery

13 pages, 1318 KB  
Article
Low-Density Lipoprotein Cholesterol Is Independently Associated with White Matter Injury Beyond Coronary Artery Calcium: Insights into Brain Aging
by Özgür Çakır, Burak Açar, Mustafa Kemal Dönmez, Almotasem Shatat, Sena Destan Bünül, Rıdvan Erten, Ahmet Yalnız and Ercüment Çiftçi
J. Clin. Med. 2026, 15(9), 3277; https://doi.org/10.3390/jcm15093277 - 25 Apr 2026
Abstract
Background/Objectives: The interplay between cardiovascular risk factors and brain aging remains incompletely understood. We aimed to investigate the comparative associations of coronary artery calcium (CAC) and low-density lipoprotein cholesterol (LDL-C) with MRI-derived volumetric measures of the brain. Methods: In this retrospective, single-center, cross-sectional study, 84 participants who underwent coronary computed tomography for CAC scoring and brain magnetic resonance imaging within 90 days were included; LDL-C levels were available in 69 participants for LDL-based analyses. Brain volumetric measures were obtained using the automated lesionBrain pipeline within the volBrain platform, which performs fully automated tissue segmentation and lesion quantification based on multi-atlas and patch-based approaches. Associations were evaluated using Spearman’s correlation with false discovery rate correction and hierarchical multivariable regression, supported by bootstrap validation and post hoc power analysis. The cohort had a mean age of 58.0 ± 13.0 years (range 19–78) and was derived from routine clinical imaging. Results: LDL-C was positively associated with abnormal white matter volume (ρ = 0.334, p = 0.005), although this did not remain statistically significant after FDR correction (pFDR = 0.090). In fully adjusted models, LDL-C remained the only independent predictor (β = 0.006, 95% CI: 0.002–0.010, p = 0.007; standardized β = 0.225; partial R2 = 11.7%), corresponding to a 6.2% increase in abnormal white matter volume per 10 mg/dL increase (derived from log-transformed models). CAC showed only a marginal association (p = 0.059). Post hoc power analysis demonstrated adequate power for LDL-C but insufficient power for CAC. Neither marker was associated with gray matter volume. 
Conclusions: In this cross-sectional cohort, higher LDL-C was independently associated with greater abnormal white matter volume after adjustment for cardiovascular risk factors, statin use, and CAC. No CAC–brain association was detected in this cohort, but limited statistical power means that small CAC effects cannot be excluded. These findings should be interpreted as associative rather than causal or mechanistic. Full article
(This article belongs to the Section Nuclear Medicine & Radiology)
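The statistical pipeline in the abstract above (Spearman's correlation followed by Benjamini–Hochberg false discovery rate correction) is a standard combination. A minimal NumPy sketch, assuming tie-free data; the function names are illustrative and not taken from the paper:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the ranks (no tie handling)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return float(np.corrcoef(rx, ry)[0, 1])

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg step-up adjusted p-values (FDR control)."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    adjusted = np.empty(m)
    running_min = 1.0
    for rank in range(m, 0, -1):          # walk from the largest p-value down
        idx = order[rank - 1]
        running_min = min(running_min, p[idx] * m / rank)
        adjusted[idx] = running_min        # enforce monotonicity of adjusted p
    return adjusted
```

This mirrors how a raw p = 0.005 can survive while its FDR-adjusted counterpart does not clear the threshold, as reported in the abstract.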
30 pages, 7225 KB  
Article
Causal Learning for Continuous Variables with an Improved Bayesian Network Constructed by Symmetric Kernel Function Acceleration
by Chenghao Wei, Pukai Wang, Chen Li and Zhiwei Ye
Symmetry 2026, 18(5), 731; https://doi.org/10.3390/sym18050731 - 24 Apr 2026
Abstract
Bayesian network-based causal structure learning provides an effective framework for uncovering causal relationships among continuous variables. However, many existing methods for continuous data still rely on strong parametric distribution assumptions, which may introduce information loss and reduce Bayesian network modeling accuracy. Kernel density estimation (KDE), a non-parametric statistical method that is more flexible in density estimation form, offers a versatile framework for conducting conditional independence (CI) tests. This approach enables the estimation of mutual information and conditional mutual information, thereby facilitating the identification of underlying structural relationships. Nevertheless, the high computational cost of KDE-based CI testing restricts its practical application in continuous-variable causal learning. To address this issue, this study introduces a radial symmetric kernel-based acceleration scheme within a Fast Fourier Transform (FFT) framework to improve the efficiency of density estimation. On this basis, an enhanced Bayesian network structure learning method is developed for continuous variables, enabling more efficient estimation of mutual information and conditional mutual information while improving the computational efficiency and empirical stability of variable dependency discovery. With proper bandwidth and grid resolution, the proposed MMHC-FFTKDE framework achieves a reduction in computational runtime and improves efficiency compared to MMHC-KDE in the ablation setting, while maintaining competitive F1-scores and SHD for causal structure discovery. Full article
(This article belongs to the Special Issue Application of Symmetry/Asymmetry and Machine Learning)
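The acceleration idea named in this abstract, evaluating a radially symmetric kernel density estimate through an FFT, can be sketched in one dimension: bin the samples onto a regular grid, then convolve with the kernel in the frequency domain. A minimal NumPy sketch under assumed parameters (grid size, bandwidth); the paper's MMHC-FFTKDE additionally uses such densities inside mutual-information CI tests, which this does not show:

```python
import numpy as np

def fft_kde(samples, lo, hi, n_grid=512, bandwidth=0.3):
    """1-D Gaussian KDE on a regular grid via FFT convolution.

    Binning plus circular convolution replaces the O(n_samples * n_grid)
    direct kernel sum with an O(n_grid log n_grid) FFT product.
    """
    edges = np.linspace(lo, hi, n_grid + 1)
    dx = edges[1] - edges[0]
    centers = edges[:-1] + 0.5 * dx
    counts, _ = np.histogram(samples, bins=edges)
    weights = counts / (len(samples) * dx)        # piecewise-constant density
    # Symmetric kernel sampled at grid offsets, recentered for circular FFT.
    offsets = (np.arange(n_grid) - n_grid // 2) * dx
    kernel = np.exp(-0.5 * (offsets / bandwidth) ** 2)
    kernel /= kernel.sum()                        # discrete normalization
    density = np.fft.ifft(np.fft.fft(weights) *
                          np.fft.fft(np.fft.ifftshift(kernel))).real
    return centers, np.clip(density, 0.0, None)
```

Because the kernel is normalized on the grid, the smoothed density still integrates to one, which is the property CI-test estimators downstream rely on.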
16 pages, 549 KB  
Article
Hair Trace Element Imbalance in Smokers with HFpEF: A Pilot Study of Micronutrient and Metal Homeostasis
by Beata Krasińska, Tomasz Urbanowicz, Ievgen Spasenenko, Krzysztof J. Filipiak, Krzysztof Bartuś, Zbigniew Krasiński, Andrzej Tykarski and Anetta Hanć
Biomedicines 2026, 14(5), 970; https://doi.org/10.3390/biomedicines14050970 - 23 Apr 2026
Abstract
Background: Trace elements function as essential micronutrients involved in oxidative balance, mitochondrial activity, and cardiovascular metabolism. Cigarette smoking represents a significant source of toxic metals and may disrupt systemic trace element homeostasis. Alterations in micronutrient and metal balance may contribute to oxidative stress, endothelial dysfunction, and myocardial remodeling, which are central mechanisms in the pathogenesis of heart failure with preserved ejection fraction (HFpEF). This study aimed to investigate whether smokers with HFpEF exhibit distinct hair trace element profiles compared with smokers without HFpEF. Methods: In this prospective pilot study, scalp hair samples were collected from adults undergoing clinical evaluation for suspected cardiovascular disease. Trace element concentrations were determined using inductively coupled plasma mass spectrometry (ICP-MS). Participants were first stratified according to smoking status and subsequently, within the smoker subgroup, according to HFpEF diagnosis based on the Heart Failure Association Pre-test assessment, Echocardiography and natriuretic peptide score (HFA-PEFF) algorithm. Differences in trace element concentrations were analyzed using appropriate statistical tests, with multiple-comparison correction using the Benjamini–Hochberg false discovery rate (FDR). Active smoking was defined as ≥10 cigarettes per day for at least 1 year, and cumulative exposure was quantified in pack-years. Results: Fifty-eight participants were included, including 27 active smokers. In unadjusted analyses, several trace elements differed between smokers with HFpEF and those without HFpEF, including vanadium, lithium, aluminum, and copper. However, after FDR correction, only copper remained significantly elevated in smokers with HFpEF (q = 0.004). Hair copper concentrations were markedly higher in the HFpEF group compared with smokers without HFpEF. 
These differences were observed alongside echocardiographic features consistent with diastolic dysfunction and structural cardiac remodeling. Conclusions: In this hypothesis-generating pilot study, smokers with HFpEF demonstrated elevated hair copper concentrations, suggesting disturbances in trace element and micronutrient homeostasis. Altered copper metabolism may reflect oxidative stress-related cardiometabolic remodeling associated with HFpEF. These findings raise the hypothesis that cardiometabolic phenotype, rather than smoking exposure alone, may modulate trace element homeostasis in HFpEF; however, causal relationships cannot be established. Full article
(This article belongs to the Section Molecular and Translational Medicine)
26 pages, 1507 KB  
Article
Transcriptomic Profiling Combined with Machine Learning and Mendelian Randomization Identifies Diagnostic Biomarkers and Immune Infiltration Patterns in Diabetic Kidney Disease
by Haiwen Liu, Qiang Fu and Jing Chen
Molecules 2026, 31(9), 1390; https://doi.org/10.3390/molecules31091390 - 23 Apr 2026
Abstract
Diabetic kidney disease (DKD) affects approximately 40% of patients with diabetes mellitus and remains a leading cause of end-stage renal disease worldwide. Early diagnosis and identification of therapeutic targets are critical for improving patient outcomes, yet reliable biomarkers are lacking. This study integrated transcriptomic data from the Gene Expression Omnibus (GEO) database (GSE96804, GSE30528, and GSE142025) with machine learning algorithms and Mendelian randomization (MR) to identify diagnostic biomarkers for DKD. Differentially expressed genes (DEGs) were identified and intersected with key modules from weighted gene co-expression network analysis (WGCNA). Four machine learning methods—least absolute shrinkage and selection operator (LASSO), random forest (RF), support vector machine-recursive feature elimination (SVM-RFE), and extreme gradient boosting (XGBoost)—were applied for feature selection. Five hub genes (SPP1, CD44, VCAM1, C3, and TIMP1) were identified at the intersection of these approaches. Two-sample MR analysis using eQTL data from the eQTLGen Consortium and kidney function GWAS from the CKDGen Consortium provided evidence supporting potential causal associations between SPP1, C3, and TIMP1 expression and estimated glomerular filtration rate decline. Immune infiltration analysis via CIBERSORT estimated elevated proportions of M1 macrophages and activated CD4+ memory T cells in DKD samples, with all five hub genes showing correlations with macrophage infiltration. A diagnostic model based on these five genes achieved a cross-validated area under the receiver operating characteristic curve (CV-AUC) of 0.938 in the discovery dataset and AUC values of 0.917 and 0.889 in two independent external validation cohorts. Drug–gene interaction analysis identified 10 candidate compounds targeting the hub genes. 
These findings provide a computational framework for identifying candidate diagnostic biomarkers and generating hypotheses regarding potential therapeutic targets for DKD; however, all results are derived from in silico analyses and require experimental validation—including qPCR, immunohistochemistry, and prospective clinical cohort studies—before clinical applicability can be established. Full article
18 pages, 1479 KB  
Article
Temporal Dynamics of Market Microstructure in Cryptocurrency Perpetual Futures: Econometric Evidence from Centralized and Decentralized Exchanges
by Petar Zhivkov, Venelin Todorov and Slavi Georgiev
Int. J. Financial Stud. 2026, 14(5), 103; https://doi.org/10.3390/ijfs14050103 - 23 Apr 2026
Abstract
We apply rolling-window econometric methods, including GARCH(1,1) estimation, Bai–Perron structural break detection, CUSUM stability testing, and Granger causality analysis in bivariate VAR frameworks, to analyze the temporal dynamics of market integration in cryptocurrency perpetual futures, tracking funding rate correlations, arbitrage prevalence, and volatility persistence across 26 exchanges and 812 symbols over two months (November 2025 through January 2026). Using 53 overlapping seven-day rolling windows on 9.1 million hourly observations, we find that the two-tiered market structure previously documented in a static snapshot (centralized exchanges tightly integrated, decentralized exchanges fragmented) persists qualitatively but varies substantially in magnitude, with the integration gap ranging from 0.041 to 0.222. Structural break tests detect no discrete regime shifts; the market evolves through gradual drift. GARCH(1,1) analysis reveals that near-integrated (IGARCH) volatility behavior, previously reported as a general property, appears in only 24.5% of windows, concentrated in specific time periods. Granger causality tests show that mid-tier exchanges lead the largest venue (Binance) more frequently than the reverse, challenging a simple size-based price discovery hierarchy. Intraday spread patterns are statistically significant and linked to funding rate settlement mechanics, with spreads peaking approximately two hours after standard settlement times. These findings have implications for systemic risk assessment: market surveillance frameworks that focus on the largest venue may miss price discovery signals originating from mid-tier exchanges. Full article
(This article belongs to the Special Issue Mathematical Finance: Theory, Methods, and Applications)
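The Granger causality tests this abstract relies on reduce to comparing a restricted lag regression (own lags only) against an unrestricted one (own lags plus the other series' lags). A self-contained sketch of the bivariate F-statistic, assuming OLS with homoskedastic errors; the paper's full pipeline adds GARCH, structural-break, and CUSUM machinery not shown here:

```python
import numpy as np

def _lagmat(series, lags):
    """Columns are the series shifted by 1..lags steps."""
    return np.column_stack([series[lags - 1 - j : len(series) - 1 - j]
                            for j in range(lags)])

def granger_f_stat(x, y, lags=2):
    """F-statistic for 'x Granger-causes y': do lags of x improve the
    prediction of y beyond y's own lags?"""
    x, y = np.asarray(x, float), np.asarray(y, float)
    target = y[lags:]
    n = len(target)
    restricted = np.column_stack([np.ones(n), _lagmat(y, lags)])
    unrestricted = np.column_stack([restricted, _lagmat(x, lags)])

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        resid = target - design @ beta
        return resid @ resid

    rss_r, rss_u = rss(restricted), rss(unrestricted)
    df = n - unrestricted.shape[1]
    return ((rss_r - rss_u) / lags) / (rss_u / df)
```

Running the statistic in both directions over rolling windows, as the paper does across exchange pairs, yields the lead–lag evidence summarized above.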

18 pages, 835 KB  
Review
Genomic Resources and Gene Family Studies in Longan (Dimocarpus longan Lour.): Progress, Limitations, and Prospects
by Xiang Li, Liqin Liu, Xiaowen Hu, Shengyou Shi, Tianzi Li and Jiannan Zhou
Horticulturae 2026, 12(5), 513; https://doi.org/10.3390/horticulturae12050513 - 22 Apr 2026
Abstract
The rapid accumulation of genome-scale data has transformed plant biology from descriptive genetics to predictive and increasingly mechanistic genomics. Longan (Dimocarpus longan Lour.) is an economically important subtropical fruit tree in China and Southeast Asia, but compared with model plants and major temperate fruit crops, its genomic resources and functional studies have developed relatively late. Here, we review recent progress in longan genomics with emphasis on three interrelated areas: genome assembly and annotation, transcriptomic resources, and representative gene family studies associated with flowering, somatic embryogenesis, and transporter-mediated stress tolerance. The progression from the first draft genome of ‘Honghezi’ to the chromosome-scale assemblies of ‘Jidanben’ and ‘Shixia’ has substantially improved contiguity and gene annotation, thereby enabling population-genomic analysis, genome-wide gene family identification, and candidate-gene discovery. Available transcriptomic datasets further support studies of reproductive development, stress responses, and embryogenic competence, although cross-study integration remains limited. We also summarize how gene family analyses have advanced the current understanding of floral induction, continuous flowering, somatic embryogenesis, mineral transport, and sugar transport in longan. Importantly, the field is still dominated by cataloguing and expression-based inference, whereas causal validation, pan-genomic analysis, and multi-omics integration remain insufficient. We therefore argue that future progress in longan molecular breeding will depend on integrating high-quality genomic resources with functional validation, standardized comparative annotation, and improved transformation or regeneration systems. Full article

30 pages, 11334 KB  
Article
An Ensembled Causal Analysis Workflow: Discovering Mechanical Patterns in Engineering from Entangled Networks
by Siyang Zhou
Information 2026, 17(5), 400; https://doi.org/10.3390/info17050400 - 22 Apr 2026
Abstract
Extracting causal relations from complex dynamic systems has been an appealing topic for decades, especially in machine design engineering, industrial manufacturing, and equipment maintenance, which usually involve a large number of tangled relationships. Although many causality detection methods have been utilized, evaluating and choosing appropriate methods, and developing a proper workflow, remain challenging. In this paper, a causal analysis workflow designed to detect hidden patterns associated with mechanical mechanisms is presented. In particular, various causality measures are ensembled, enabling the search for refined causal mechanisms, the impact of the constitutive law, and the spatial distribution of causality within the entangled raw network. Based on numerical experiments, several useful conclusions can be drawn: separating the typical stages of a complex process is necessary; the constitutive property has a great impact on causal inference; the discrepancy of causality among monitoring points mainly depends on whether a point is near the fixed boundary, near the load, or in contact with friction; and Granger causality is suitable for discovering linear dependencies among material, load, and geometry, while constraint-based and score-based algorithms excel at identifying nonlinear causality in metal plasticity, severe discontinuity in contact, impulsive dynamic loads, or damping phenomena. Full article
50 pages, 1551 KB  
Article
Causally Informative Entropic Inequalities within Families of Distributions with Shared Marginals
by Daniel Chicharro
Entropy 2026, 28(4), 472; https://doi.org/10.3390/e28040472 - 20 Apr 2026
Abstract
The joint probability distribution of observable variables from a system is constrained by the underlying causal structure. In the presence of hidden variables, untestable independencies that involve hidden variables lead to testable causally-imposed inequality constraints for observable variables, whose violation can reject the compatibility of a causal structure with data. One type of causally informative inequalities is entropic inequalities, which appear in the space of entropic terms associated with the distribution of observable variables. We derive a new type of minimum information (minInf) entropic inequalities that substantially increases causal inference power. These new entropic inequalities appear when considering the constraints that the causal structure imposes on entropic terms determined by information minimization within families of distributions that preserve sets of marginals shared with the original distribution. We introduce a new family of minInf data processing inequalities and a procedure to recursively combine different types of data processing inequalities to create tighter testable entropic inequalities. We extensively illustrate the applicability of this procedure in the instrumental causal scenario, integrating the new inequalities with standard instrumental entropic inequalities constructed with multivariate instrumental sets. We also provide additional examples with other types of entropic inequalities, such as the Information Causality and Groups-Decomposition inequalities. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
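The inequalities this abstract builds on ultimately rest on data processing inequalities such as I(X;Z) ≤ I(X;Y) for a Markov chain X → Y → Z. A minimal, pure-Python check of that base inequality; the paper's minInf construction, which tightens such inequalities by information minimization over marginal-preserving families of distributions, is well beyond this sketch:

```python
import math

def mutual_information(joint):
    """I(A;B) in bits from a joint pmf {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def chain_marginals(px, py_x, pz_y):
    """Pairwise joints (X,Y) and (X,Z) for the Markov chain X -> Y -> Z."""
    jxy, jxz = {}, {}
    for x, p_x in px.items():
        for y, p_yx in py_x[x].items():
            jxy[(x, y)] = jxy.get((x, y), 0.0) + p_x * p_yx
            for z, p_zy in pz_y[y].items():
                jxz[(x, z)] = jxz.get((x, z), 0.0) + p_x * p_yx * p_zy
    return jxy, jxz
```

Chaining two binary symmetric channels (flip probabilities 0.1 then 0.2, an illustrative choice) makes the inequality strict: the second channel can only destroy information about X.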

3 pages, 139 KB  
Editorial
Bayesian Networks and Causal Discovery
by Xiaoguang Gao and Zidong Wang
Entropy 2026, 28(4), 438; https://doi.org/10.3390/e28040438 - 13 Apr 2026
Abstract
The discovery of the precise causal representations underlying complex data forms the bedrock of artificial intelligence research [...] Full article
(This article belongs to the Special Issue Bayesian Networks and Causal Discovery)
21 pages, 320 KB  
Article
Xenoepistemics
by Jordi Vallverdú
Philosophies 2026, 11(2), 57; https://doi.org/10.3390/philosophies11020057 - 8 Apr 2026
Abstract
Epistemology remains tacitly anthropocentric: it treats knowledge as something produced and validated through human cognitive capacities such as understanding, intuition, and transparent justification. Yet contemporary science and artificial intelligence increasingly depend on non-human systems that generate mathematically valid results, empirically successful models, and operationally reliable inferences that no human can fully survey or interpret. This article develops xenoepistemics, a structural theory of non-anthropocentric knowledge. The central claim is that epistemic evaluation must be reformulated in terms of system-level properties—reliability, robustness, counterfactual sensitivity, and domain transfer—rather than mentalistic notions such as belief or understanding. I offer (i) a definition of xenoepistemic systems as systems that track structure in a target domain without requiring human-style semantic access; (ii) a minimal account of epistemic agency without minds that avoids trivialization; and (iii) a non-circular trust framework that distinguishes empirical success from epistemic legitimacy using independent validation regimes. This paper addresses a reflexive worry—that a human-authored theory cannot dethrone human epistemology—by separating standpoint from object: xenoepistemics is articulated by humans but is not about human cognition. I discuss the pragmatic value of xenoepistemic knowledge production, the limits of independent verification for opaque systems, domain-relative thresholds for xenoepistemic authority, and the problem of constitutionally human-inaccessible knowledge. Finally, I diagnose and formalize the Marcusian regress paradox: recurrent goalpost-shifting, whereby every machine competence is reclassified as irrelevant once achieved. Xenoepistemics reframes this debate by treating non-human knowledge as a present reality requiring new norms, not as a future curiosity. Full article
(This article belongs to the Special Issue Intelligent Inquiry into Intelligence)

28 pages, 395 KB  
Review
Integrating Transcriptomics and Metabolomics to Unravel the Molecular Mechanisms of Meat Quality: A Systematic Review
by Kaiyue Wang, Ren Mu, Yongming Zhang and Xingdong Wang
Foods 2026, 15(8), 1271; https://doi.org/10.3390/foods15081271 - 8 Apr 2026
Abstract
Meat quality serves as a pivotal determinant of consumer purchasing behavior and of the economic viability of the livestock industry; as such, research into its regulatory mechanisms is of critical significance for the development of modern agriculture. Traditional investigations into meat quality have predominantly centered on sensory and physicochemical assessments of ultimate phenotypic traits, thereby facing inherent limitations in systematically deciphering the intricate molecular regulatory networks underlying meat quality formation. By contrast, an integrated analysis of the transcriptome and metabolome effectively connects the cascade of “gene transcription—metabolic regulation—phenotypic determination,” which has emerged as a core methodological paradigm in contemporary research on the molecular mechanisms governing meat quality. This review systematically delineates the evolutionary trajectory and principal technological frameworks of meat quality evaluation systems, with a focused synthesis of recent advances achieved through combined transcriptomic and metabolomic analyses in the field of meat quality regulation. The scope of this review encompasses core transcriptional regulatory networks associated with meat quality attributes, pivotal metabolic pathways, signal transduction mechanisms, and protein degradation dynamics. Furthermore, the regulatory impacts exerted by genetic variation among breeds, nutritional modulation, rearing environments, and stress responses on meat quality characteristics are comprehensively elucidated. Integrative analysis reveals that combined transcriptome–metabolome approaches transcend the inherent limitations of single-omics investigations, systematically unraveling the hierarchical regulatory mechanisms governing fundamental meat quality traits, such as muscle fiber type differentiation, postmortem glycolytic progression, intramuscular fat deposition, and flavor compound accumulation. 
Such integrative strategies have facilitated the identification of functional genes and metabolic biomarkers with potential utility for the early prediction of meat quality outcomes. Concurrently, this review acknowledges persistent challenges confronting the field, including the absence of standardized protocols for multi-omics data integration, insufficient functional causal validation, and a discernible disconnect between research discoveries and practical industrial implementation. Building upon this comprehensive assessment, prospective directions for future multi-omics research in meat quality are proposed, accompanied by the formulation of an integrated end-to-end improvement framework spanning fundamental research, technological innovation, and industrial application. Collectively, this review provides a systematic theoretical foundation for the in-depth elucidation of mechanisms that determine meat quality and the precision-oriented regulation of quality-determining traits in livestock production practices, thereby offering substantial scientific guidance for quality improvement initiatives within the animal husbandry sector. Full article
(This article belongs to the Section Meat)
26 pages, 1682 KB  
Article
Impact Factors and Policy Effectiveness of Renewable Energy Generation in China
by Songyuan Liu, Shuaiqi Hu, Mei Wang, Yue Song, Yichuan Jin and Lingfeng Tan
Sustainability 2026, 18(7), 3519; https://doi.org/10.3390/su18073519 - 3 Apr 2026
Abstract
As China accelerates toward carbon neutrality, deciphering the causal drivers of renewable energy expansion is paramount for effective policy design. We develop a hybrid analytical framework bridging data-driven K2 structural learning with expert-informed Bayesian Networks to map the intricate interdependencies between policy instruments, resource endowments, and socio-economic variables. This causal mapping reveals a fundamental paradigm shift from resource-bound growth to institutionally steered expansion, particularly in the solar sector, where the Renewable Portfolio Standard (RPS) has superseded natural radiation as the primary determinant for capacity scaling. Forward sensitivity and backward diagnostic analyses demonstrate that achieving high-growth milestones requires a synergistic convergence of technological cost reductions and mandatory consumption quotas; conversely, the absence of RPS leads to a 64% degradation in systemic causal connectivity. These findings underscore the necessity of transitioning from price-side stimuli to structural consumption-side mandates to ensure a resilient energy transition. Ultimately, this framework and the identified causal pathways provide a strategic blueprint for other emerging economies navigating the complex transition from subsidy-dependent to market-resilient renewable energy landscapes under stringent climate constraints. Full article
(This article belongs to the Section Energy Sustainability)
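The K2 structural learning this abstract pairs with expert knowledge scores each candidate parent set with the Cooper–Herskovits formula and adds parents greedily. A sketch of that scoring step, assuming the standard uniform prior; the data layout and function names are illustrative, not from the paper:

```python
import math
from collections import Counter, defaultdict

def k2_log_score(data, child, parents):
    """Cooper-Herskovits K2 log-score of `child` given a parent set.

    `data` is a list of dicts mapping variable name -> discrete value.
    For each observed parent configuration j with N_jk counts of the
    child's k-th state: log[(r-1)! / (N_j + r - 1)!] + sum_k log(N_jk!).
    Higher is better; K2 greedily adds the parent maximizing this score.
    """
    r = len({row[child] for row in data})        # child cardinality
    counts = defaultdict(Counter)                # parent config -> child counts
    for row in data:
        counts[tuple(row[p] for p in parents)][row[child]] += 1
    score = 0.0
    for child_counts in counts.values():
        n_j = sum(child_counts.values())
        score += math.lgamma(r) - math.lgamma(n_j + r)
        score += sum(math.lgamma(n + 1) for n in child_counts.values())
    return score
```

A variable that deterministically tracks its candidate parent scores far higher with that parent than without, which is exactly the signal the greedy search exploits.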

29 pages, 5274 KB  
Article
Enhanced Causal Discovery for Autocorrelated Time Series via Adaptive Momentary Conditional Independence
by Minglong Gao and Yingchun Zhou
Mathematics 2026, 14(7), 1129; https://doi.org/10.3390/math14071129 - 27 Mar 2026
Abstract
Discovering causal relationships from time series data is essential for understanding complex dynamical systems across a range of domains. However, strong autocorrelation often limits the detection power of existing algorithms and increases the risk of false positives. To address these challenges, the Adaptive Momentary Conditional Independence (aMCI) method is introduced to mitigate the masking effects of autocorrelation and maintain control over false discovery rates. The aMCI method adaptively modifies the conditioning set to reduce the impact of autocorrelation on the accuracy of causal discovery. In addition, a multi-phase algorithm, the Enhanced Causal Discovery via aMCI (ECD-aMCI) algorithm, is proposed to robustly learn the causal graph by effectively applying the aMCI framework. The algorithm is designed to be hyperparameter-insensitive, order-independent, and provably consistent under oracle conditions. Extensive evaluations on simulated and benchmark datasets demonstrate that the proposed algorithm substantially improves the accuracy of causal discovery from time series, especially in the presence of strong autocorrelation. Full article
(This article belongs to the Section D1: Probability and Statistics)
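The conditional-independence testing that this abstract builds on can be sketched generically. The following is not the paper's aMCI test: it is the standard partial-correlation test with a Fisher z transform that constraint-based time-series algorithms of this family commonly use as a building block; the adaptive conditioning-set construction is the paper's contribution and is not reproduced here.

```python
import numpy as np
from math import atanh, erf, sqrt

def parcorr_ci_test(x, y, Z=None):
    """Test X independent of Y given Z via partial correlation + Fisher z.

    x, y: 1-D arrays of length n; Z: (n, k) array of conditioning
    variables or None. Returns (partial correlation, p-value).
    """
    n = len(x)
    if Z is not None and Z.size:
        # Residualize x and y on [1, Z] by ordinary least squares
        A = np.column_stack([np.ones(n), Z])
        x = x - A @ np.linalg.lstsq(A, x, rcond=None)[0]
        y = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        k = Z.shape[1]
    else:
        x, y, k = x - x.mean(), y - y.mean(), 0
    r = float(x @ y / np.sqrt((x @ x) * (y @ y)))
    # Fisher z transform; degrees of freedom shrink with |Z|
    z = sqrt(n - k - 3) * atanh(max(min(r, 0.9999), -0.9999))
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))
    return r, p
```

On a chain X → M → Y, the test rejects marginal independence of X and Y but finds near-zero partial correlation once M enters the conditioning set, which is the basic mechanism a causal-discovery algorithm exploits when pruning edges.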

17 pages, 763 KB  
Review
Mapping the Extended Pain Pathway: Human Genetic and Multi-Omic Strategies for Next-Generation Analgesics
by Ari-Pekka Koivisto
Int. J. Mol. Sci. 2026, 27(7), 3035; https://doi.org/10.3390/ijms27073035 - 26 Mar 2026
Viewed by 622
Abstract
The 2025 approval of the selective NaV1.8 blocker suzetrigine for acute pain marked a pivotal advance in analgesic drug development. Yet the subsequent failure of Vertex’s next-generation NaV1.8 inhibitor VX993 to demonstrate clinical analgesia underscores enduring challenges in translating mechanistic promise into patient benefit. This review examines why promising targets and compounds, spanning NaV and TRP channels, often falter and outlines a path toward more reliable target selection and validation. I first summarize the pain pathway, from nociceptor transduction through spinal processing to cortical perception, emphasizing how inflammation and peripheral sensitization reshape excitability. Historically serendipitous, pain drug discovery now prioritizes molecular precision. Most approved chronic pain therapies act in the CNS and are limited by modest efficacy and adverse effects. Nociceptor-enriched targets (NaV1.7/1.8/1.9; TRP channels) remain attractive, yet redundancy among NaV subtypes and the necessity of blocking targets at the correct anatomical sites complicate translation. Human genetics and multi-omics provide a powerful, unbiased engine for target discovery. Rare high-impact variants offer strong causal hypotheses, while common polygenic contributions illuminate broader susceptibility. Large biobanks increasingly reveal a mismatch between legacy pain targets and genetically supported candidates across neuronal and non-neuronal cells. Human DRG transcriptomics highlight NaV channel redundancy. Human in vitro electrophysiology and PK/PD analyses show suzetrigine achieves ~90–95% NaV1.8 engagement, yet neurons can still fire unless additional channels are blocked. Species differences and drug distribution (including BBB/PNS penetration and P-gp efflux) critically influence efficacy; centrally accessible blockade (e.g., for NaV1.7 or TRPA1) may be necessary to achieve robust analgesia, challenging peripherally restricted strategies. 
Osteoarthritis illustrates how obesity-driven metabolic inflammation, synovial immune activation, subchondral bone remodeling, and specific nociceptor subtypes converge to drive mechanical pain. Multi-omic integration across diseased human tissues can pinpoint causal processes and cell types, enabling more selective and safer target choices. I propose a practical framework for target validation that integrates: (i) rigorous human genetic support; (ii) cell-type and site-of-action mapping; (iii) human-relevant electrophysiology and PK/PD with verified target engagement; (iv) species-appropriate models; (v) consideration of modality (small molecule, biologic, RNA, targeted protein degradation). Advancing genetically and anatomically aligned targets, tested at the right sites and exposures, offers the best path to genuinely effective, better-tolerated pain therapeutics. Full article
(This article belongs to the Special Issue Pain Pathways Rewired: Moving past Peripheral Ion Channel Strategies)
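The "~90–95% NaV1.8 engagement" figure in this abstract maps directly onto the Hill equation for fractional target occupancy. A minimal sketch, assuming a one-site binding model with unit Hill coefficient purely for illustration (the actual suzetrigine PK/PD analysis is more involved and is not reproduced here):

```python
def occupancy(conc, ic50, hill=1.0):
    """Fractional target occupancy from the Hill equation."""
    return conc**hill / (conc**hill + ic50**hill)

def fold_over_ic50(target_occupancy, hill=1.0):
    """Multiple of IC50 needed to reach a given fractional occupancy."""
    return (target_occupancy / (1.0 - target_occupancy)) ** (1.0 / hill)
```

With hill = 1, reaching 90% occupancy requires 9× the IC50 and 95% requires 19×, illustrating how steeply the exposure cost of the last few percent of engagement rises; this is one reason high measured occupancy can coexist with residual neuronal firing when redundant channels remain unblocked.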

16 pages, 259 KB  
Article
Candidate SCOR-Linked Financial Proxies: Exploratory Evidence from a 12-Firm Panel Using SCOR_E Ratio Analysis of Supply Chain Efficiency
by Juan Roman
Logistics 2026, 10(4), 70; https://doi.org/10.3390/logistics10040070 - 25 Mar 2026
Viewed by 582
Abstract
Background: Many SCOR performance measures rely on internal operational data, which limits empirical work using public information. Methods: This study evaluates a small set of publicly auditable, SCOR-linked ratios (SCOR_E) in a panel of 12 publicly traded firms across four sectors from 2000 to 2022. Using firm- and year-fixed-effects panel models, the paper examines whether these candidate proxies show pre-specified directional associations within firms and whether the same ratios are associated with operating margin in parallel models. Instrumental-variable (IV) specifications are reported only as sensitivity analyses, and nearly all are weak by the paper’s reported first-stage diagnostics. Results: Accordingly, most findings are interpreted as associative rather than causal. After false-discovery-rate adjustment and weak-instrument-robust inference, only four firm–proxy pairs meet the paper’s detection criterion; all remaining estimates are treated as non-robust. Conclusions: The contribution is therefore narrow: this is a constrained exploratory screening exercise showing which candidate mappings survive the paper’s inferential filters in this sample and which do not. The results do not establish a validated cross-industry scorecard, a scalable benchmarking framework, or a basis for policy claims. Full article
(This article belongs to the Topic Decision Science Applications and Models (DSAM))
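The false-discovery-rate adjustment mentioned in this abstract's Results can be sketched with the standard Benjamini–Hochberg step-up procedure; this is a generic textbook implementation, not the paper's analysis code.

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up FDR procedure.

    Returns a list of booleans, True where the hypothesis is rejected
    while controlling the false discovery rate at level `alpha`.
    """
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    # Largest rank k such that p_(k) <= alpha * k / m
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k_max = rank
    # Reject the k_max smallest p-values
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        reject[i] = rank <= k_max
    return reject
```

For example, with eight sorted p-values of 0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205 at alpha = 0.05, only the first two survive: 0.039 already exceeds its threshold of 0.05 × 3/8 = 0.01875, so the step-up stops at rank 2.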