Article

Adaptive Learned Belief Propagation for Decoding Error-Correcting Codes

Alireza Tasdighi and Mansoor Yousefi
Telecom Paris, Institut Polytechnique de Paris, 91120 Palaiseau, France
* Author to whom correspondence should be addressed.
Entropy 2025, 27(8), 795; https://doi.org/10.3390/e27080795
Submission received: 12 June 2025 / Revised: 16 July 2025 / Accepted: 22 July 2025 / Published: 25 July 2025
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract

Weighted belief propagation (WBP) for the decoding of linear block codes is considered. In WBP, the Tanner graph of the code is unrolled with respect to the iterations of the belief propagation decoder. Then, weights are assigned to the edges of the resulting recurrent network and optimized offline using a training dataset. The main contribution of this paper is an adaptive WBP where the weights of the decoder are determined for each received word. Two variants of this decoder are investigated. In the parallel WBP decoder, the weights take values in a discrete set. A number of WBP decoders are run in parallel to search for the best sequence of weights in real time. In the two-stage decoder, a small neural network is used to dynamically determine the weights of the WBP decoder for each received word. The proposed adaptive decoders demonstrate significant improvements over their static counterparts in two applications. In the first application, Bose–Chaudhuri–Hocquenghem (BCH), polar, and quasi-cyclic low-density parity-check (QC-LDPC) codes are used over an additive white Gaussian noise channel. The results indicate that the adaptive WBP achieves bit error rates (BERs) up to an order of magnitude lower than the BERs of the static WBP at about the same decoding complexity, depending on the code, its rate, and the signal-to-noise ratio. The second application is a concatenated code designed for a long-haul nonlinear optical fiber channel, where the inner code is a QC-LDPC code and the outer code is a spatially coupled LDPC code. In this case, the inner code is decoded using an adaptive WBP, while the outer code is decoded using the sliding window decoder and static belief propagation. The results show that the adaptive WBP provides a coding gain of 0.8 dB compared to the neural normalized min-sum decoder, with about the same computational complexity and decoding latency.
Keywords: belief propagation; neural networks; low-density parity-check codes; optical fiber communication
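
The parallel variant described in the abstract can be illustrated with a small sketch. The Python code below is a minimal, hypothetical illustration rather than the authors' implementation: a weighted min-sum decoder with a single shared check-to-variable weight stands in for the full learned WBP, and a small discrete grid of candidate weights is tried, keeping the first candidate whose hard decision satisfies every parity check. The parity-check matrix `H`, the LLR input, the helper names `weighted_min_sum` and `parallel_adaptive_decode`, and the weight grid are all assumptions made for this example.

```python
# Sketch of the parallel adaptive idea (illustrative assumptions throughout):
# weighted min-sum as a stand-in for learned WBP, one shared weight per run,
# and a small grid of candidate weights tried until a valid codeword is found.
import numpy as np

def weighted_min_sum(H, llr, weight, iters=20):
    """Weighted min-sum BP; `weight` scales every check-to-variable message."""
    rows, cols = np.nonzero(H)                 # one entry per Tanner-graph edge
    v2c = llr[cols].astype(float)              # variable-to-check messages
    total = llr.astype(float).copy()
    for _ in range(iters):
        c2v = np.zeros_like(v2c)
        for i in range(H.shape[0]):            # check-node update
            idx = np.where(rows == i)[0]
            msgs = v2c[idx]
            signs = np.where(msgs >= 0, 1.0, -1.0)
            mags = np.abs(msgs)
            for k, e in enumerate(idx):
                ext_sign = np.prod(signs) * signs[k]   # sign of the other edges
                ext_mag = np.delete(mags, k).min()     # min magnitude of the others
                c2v[e] = weight * ext_sign * ext_mag
        total = llr.astype(float).copy()
        np.add.at(total, cols, c2v)            # posterior LLRs
        hard = (total < 0).astype(int)
        if not np.any((H @ hard) % 2):         # all parity checks satisfied
            return hard, True
        v2c = total[cols] - c2v                # variable-node (extrinsic) update
    return (total < 0).astype(int), False

def parallel_adaptive_decode(H, llr, weight_grid=(0.6, 0.8, 1.0)):
    """Try each candidate weight; keep the first one yielding a valid codeword."""
    word = None
    for w in weight_grid:
        word, ok = weighted_min_sum(H, llr, w)
        if ok:
            return word, w
    return word, weight_grid[-1]               # no candidate converged; return last
```

In the two-stage variant described in the abstract, the grid search above would instead be replaced by a small neural network that maps the received word (or its LLRs) to the weights of the WBP decoder.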

Share and Cite

MDPI and ACS Style

Tasdighi, A.; Yousefi, M. Adaptive Learned Belief Propagation for Decoding Error-Correcting Codes. Entropy 2025, 27, 795. https://doi.org/10.3390/e27080795

AMA Style

Tasdighi A, Yousefi M. Adaptive Learned Belief Propagation for Decoding Error-Correcting Codes. Entropy. 2025; 27(8):795. https://doi.org/10.3390/e27080795

Chicago/Turabian Style

Tasdighi, Alireza, and Mansoor Yousefi. 2025. "Adaptive Learned Belief Propagation for Decoding Error-Correcting Codes" Entropy 27, no. 8: 795. https://doi.org/10.3390/e27080795

APA Style

Tasdighi, A., & Yousefi, M. (2025). Adaptive Learned Belief Propagation for Decoding Error-Correcting Codes. Entropy, 27(8), 795. https://doi.org/10.3390/e27080795

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
