Correction to Entropy 2023, 25(1), 26.
Correction

Correction: Zhang, J.; Liu, K. Neural Information Squeezer for Causal Emergence. Entropy 2023, 25, 26

Jiang Zhang and Kaiwei Liu
1  School of Systems Sciences, Beijing Normal University, Beijing 100875, China
2  Swarma Research, Beijing 100085, China
*  Author to whom correspondence should be addressed.
Entropy 2023, 25(10), 1387; https://doi.org/10.3390/e25101387
Submission received: 7 September 2023 / Accepted: 8 September 2023 / Published: 28 September 2023
(This article belongs to the Special Issue Causality and Complex Systems)
There was an error in the original publication [1]. Due to an oversight, a lemma in the appendix that supports the theorem was omitted from the published manuscript.
A correction has been made to Appendix B:
Lemma A1.
(Bijection mapping does not affect mutual information): For any given continuous random variables $X$ and $Z$, if there is a bijection (one-to-one) mapping $f$ and another random variable $Y$ such that for any $x \in \mathrm{Dom}(X)$ there is a $y = f(x) \in \mathrm{Dom}(Y)$, and vice versa, where $\mathrm{Dom}(X)$ denotes the domain of the variable $X$, then the mutual information between $X$ and $Z$ is equal to the mutual information between $Y$ and $Z$, that is:
$$I(X;Z) = I(Y;Z).$$
Proof. 
Because there is a one-to-one mapping $f: X \to Y$, we have:
$$p_X(x) = p_Y(y)\,|\det(J)|,$$
where $p_X$ and $p_Y$ are the density functions of $X$ and $Y$, respectively, and $J = \partial f(x)/\partial x$ is the Jacobian matrix of $f$. Inserting Equation (A16) into the expression for the mutual information $I(X;Z)$ and replacing the integration over $x$ with integration over $y$, we have:
$$
\begin{aligned}
I(X;Z) &= \int_X \int_Z p_{XZ}(x,z) \cdot \ln\frac{p_{XZ}(x,z)}{p_X(x)\,p_Z(z)} \,\mathrm{d}z\,\mathrm{d}x \\
&= \int_X \int_Z p_X(x)\,p_{Z|X}(z|x) \cdot \ln\frac{p_X(x)\,p_{Z|X}(z|x)}{p_X(x)\,p_Z(z)} \,\mathrm{d}z\,\mathrm{d}x \\
&= \int_Y \int_Z |\det(J)| \cdot p_Y(y)\,p_{Z|Y}(z|y) \cdot \ln\frac{p_{Z|Y}(z|y)}{p_Z(z)} \cdot |\det(J)|^{-1} \,\mathrm{d}z\,\mathrm{d}y \\
&= I(Y;Z).
\end{aligned}
$$
Equation (A17) can likewise be proved because of the symmetry of mutual information. □
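As an illustrative aside (not part of the original correction), the invariance stated in Lemma A1 can be checked numerically: applying a nonlinear bijection such as $y = e^x$ to $X$ should leave the estimated mutual information with $Z$ essentially unchanged. The following minimal sketch assumes NumPy and scikit-learn's k-nearest-neighbor estimator mutual_info_regression; the correlation value and the choice of transform are arbitrary.

# Minimal numerical check of Lemma A1 (illustrative, not from the paper):
# a bijective transform of X should not change its mutual information with Z.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
rho, n = 0.8, 20_000

# Jointly Gaussian (X, Z) with correlation rho; exact MI = -0.5 * ln(1 - rho^2).
x = rng.standard_normal(n)
z = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Bijective (one-to-one) transform of X.
y = np.exp(x)

exact = -0.5 * np.log(1 - rho**2)
mi_xz = mutual_info_regression(x.reshape(-1, 1), z, random_state=0)[0]
mi_yz = mutual_info_regression(y.reshape(-1, 1), z, random_state=0)[0]

print(f"exact I(X;Z)     = {exact:.3f} nats")
print(f"estimated I(X;Z) = {mi_xz:.3f} nats")
print(f"estimated I(Y;Z) = {mi_yz:.3f} nats  # approximately equal, as Lemma A1 implies")

Both estimates should agree with each other (and with the exact Gaussian value) up to estimator noise.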
Due to the insertion of Lemma A1, the numbering of the subsequent lemmas has changed.
The authors state that the scientific conclusions are unaffected. This correction was approved by the Academic Editor. The original publication has also been updated.

Reference

1. Zhang, J.; Liu, K. Neural Information Squeezer for Causal Emergence. Entropy 2023, 25, 26.
