
Bayesian Network and Signal Processing

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 31 July 2026

Special Issue Editor


Dr. Carlos Dias Maciel
Guest Editor
Department of Electrical Engineering, State University of São Paulo, Guaratinguetá 12516-410, SP, Brazil
Interests: dynamic system modeling; causal models; network models; dynamic Bayesian networks; complex system models

Special Issue Information

Dear Colleagues, 

Bayesian networks have emerged as a cornerstone for modeling uncertainty and capturing complex dependencies in signal processing applications. These probabilistic graphical models enable robust inference in scenarios ranging from speech enhancement and biomedical signal analysis to adaptive filtering and network optimization. Despite their theoretical appeal, real-world deployment faces challenges such as scalability for high-dimensional systems, dynamic architectural adaptations, and integration with modern machine learning frameworks. This Special Issue seeks to bridge these gaps by advancing methodological innovations and expanding the frontiers of Bayesian network applications in signal processing. 

Challenges and Opportunities 

While Bayesian networks offer principled frameworks for uncertainty quantification, their application to nonstationary, large-scale signal processing tasks remains nontrivial. Key challenges include the following: 

  • Dynamic architectural learning: Real-world systems like gene regulatory networks exhibit time-varying interactions, requiring models that detect structural switches while estimating latent states under evolving noise conditions. 
  • Computational scalability: Sequential Monte Carlo methods and variational inference techniques must adapt to multi-modal sensor data streams without sacrificing real-time performance. 
  • Integration with deep learning: Hybrid architectures combining Bayesian networks with neural networks could enhance pattern recognition while preserving interpretability. 
  • Uncertainty-aware optimization: Bayesian optimization strategies show promise for hyperparameter tuning in nonlinear systems but require efficient surrogate models for high-dimensional spaces. 
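To make the second challenge concrete, the sequential Monte Carlo methods mentioned above can be illustrated with a minimal bootstrap particle filter for a scalar linear-Gaussian state-space model. This is an illustrative toy sketch, not code from any submission; the model, parameter values, and function names are all assumptions chosen for the example.

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=500, a=0.9, q=1.0, r=0.5, seed=0):
    """Bootstrap SMC for x_t = a*x_{t-1} + N(0, q), y_t = x_t + N(0, r).

    Returns the posterior-mean estimate of the latent state at each step.
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)        # initial particle cloud
    means = []
    for yt in y:
        # Propagate particles through the transition model.
        x = a * x + rng.normal(0.0, np.sqrt(q), n_particles)
        # Weight by the Gaussian observation likelihood (log-domain for stability).
        logw = -0.5 * (yt - x) ** 2 / r
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))       # weighted posterior mean
        # Multinomial resampling to combat weight degeneracy.
        idx = rng.choice(n_particles, n_particles, p=w)
        x = x[idx]
    return means

# Simulate a short observation sequence and track the latent state.
rng = np.random.default_rng(1)
true_x, obs, xt = [], [], 0.0
for _ in range(50):
    xt = 0.9 * xt + rng.normal(0.0, 1.0)
    true_x.append(xt)
    obs.append(xt + rng.normal(0.0, np.sqrt(0.5)))

est = bootstrap_particle_filter(obs)
rmse = float(np.sqrt(np.mean((np.array(est) - np.array(true_x)) ** 2)))
```

Even this toy version shows where the scalability pressure comes from: the cost per observation grows with the particle count, and resampling is a sequential bottleneck that multi-modal, high-rate sensor streams make worse.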

Scope of Contributions 

This Special Issue invites original research addressing these challenges through theoretical advancements and practical implementations. Topics of interest include, but are not limited to, the following: 

  • Advanced inference algorithms: Local Gibbs sampling, stochastic gradient MCMC, and distributed variational methods for large-scale networks. 
  • Dynamic network architectures: Switching models for time-varying systems in speech processing, biomedical engineering, and financial signal analysis. 
  • Cross-domain integration: Bayesian tensor decomposition, quantum-inspired networks, and blockchain-secured distributed inference frameworks. 
  • Real-world applications: Case studies in acoustic echo cancellation, environmentally robust speech recognition, and multi-omics data fusion. 

Submissions should emphasize methodological rigor while demonstrating tangible benefits over conventional approaches. We particularly encourage interdisciplinary studies that translate Bayesian network theory into deployable signal processing solutions for healthcare, telecommunications, and IoT systems. By fostering dialogue between theorists and practitioners, this Special Issue aims to solidify Bayesian networks as a transformative tool for 21st-century signal processing challenges. 

Dr. Carlos Dias Maciel
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 250 words) can be sent to the Editorial Office for assessment.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Bayesian network
  • large-scale models
  • quantum-inspired network
  • distributed inference
  • probabilistic models

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • Reprint: MDPI Books provides the opportunity to republish successful Special Issues in book format, both online and in print.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (1 paper)


Research

30 pages, 618 KB  
Article
Learning Continuous Decomposable Models Using Mutual Information and Statistical Copulas
by Luiz Desuó Neto, Henrique de Oliveira Caetano, Matheus de Souza Sant’Anna Fogliatto and Carlos Dias Maciel
Entropy 2026, 28(3), 293; https://doi.org/10.3390/e28030293 - 4 Mar 2026
Abstract
Learning dependence graphs from multivariate continuous data is challenging when marginal distributions are heterogeneous, since likelihood-based nonparametric scores can be sensitive to smoothing choices and can confound marginal irregularities, including non-identifiability, with dependence. This work studies structure learning in the class of decomposable (chordal) Markov random fields, where junction tree factorizations enable tractable inference and local score updates. Our first contribution is a theoretical result showing that, under decomposability, mutual information can be expressed as a difference of clique/separator copula entropies, yielding a dependence-only decomposition aligned with the clique/separator structure. Building on this identity, we define an information-theoretic objective for decomposable graphs with a complexity penalty that preserves clique/separator additivity, and we derive closed-form local score differences for chordality-preserving single-edge insertions and deletions. To make the score computable from data, we instantiate clique/separator copula entropies using pseudo-observations and a probit-transformed kernel density estimator with predictive log score evaluation to mitigate boundary effects on the unit hypercube. The resulting nonparametric greedy procedure improves edge recovery accuracy on synthetic chordal benchmarks compared with a likelihood-driven nonparametric baseline, and it produces interpretable dependence summaries on an airway epithelial gene expression dataset. Concretely, this paper contributes (1) a decomposable mutual information identity via clique/separator copula entropies, (2) a copula information score with an additive complexity penalty for decomposable graphs, (3) a closed-form local score, enabling greedy chordal add or delete search, (4) a practical nonparametric copula entropy estimation pipeline, and (5) empirical gains on synthetic and real data.
(This article belongs to the Special Issue Bayesian Network and Signal Processing)
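As a sketch of the identity the abstract describes (the notation here is assumed for illustration, not taken from the paper): for a decomposable graph with clique set 𝒞 and separator set 𝒮, the junction tree factorization, the standard additive decomposition of multi-information (total correlation), and the fact that copula entropy equals negative multi-information together give a dependence-only expression over cliques and separators.

```latex
% Decomposable (junction tree) factorization over cliques C and separators S:
p(x) = \frac{\prod_{C \in \mathcal{C}} p(x_C)}{\prod_{S \in \mathcal{S}} p(x_S)}
% Multi-information (total correlation) then decomposes additively:
I(X) = \sum_{C \in \mathcal{C}} I(X_C) - \sum_{S \in \mathcal{S}} I(X_S)
% Since copula entropy satisfies H_c(X) = -I(X), this is a difference of
% clique/separator copula entropies, with no marginal-entropy terms:
I(X) = \sum_{S \in \mathcal{S}} H_c(X_S) - \sum_{C \in \mathcal{C}} H_c(X_C)
```

Because every term involves only copula entropies, marginal distributions drop out entirely, which is what makes the score insensitive to heterogeneous marginals.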