Article

Representation Learning for Dynamic Functional Connectivities via Variational Dynamic Graph Latent Variable Models

Yicong Huang and Zhuliang Yu
College of Automation Science and Technology, South China University of Technology, Guangzhou 510641, China
* Author to whom correspondence should be addressed.
Entropy 2022, 24(2), 152; https://doi.org/10.3390/e24020152
Submission received: 6 December 2021 / Revised: 14 January 2022 / Accepted: 14 January 2022 / Published: 19 January 2022
(This article belongs to the Topic Machine and Deep Learning)

Abstract

Latent variable models (LVMs) for neural population spikes have revealed informative low-dimensional dynamics in neural data and have become powerful tools for analyzing and interpreting neural activity. However, these approaches cannot determine the neurophysiological meaning of the inferred latent dynamics. On the other hand, emerging evidence suggests that dynamic functional connectivities (DFC) may be responsible for the neural activity patterns underlying cognition and behavior. We are interested in studying how DFC are associated with the low-dimensional structure of neural activity. Most existing LVMs are based on a point process and fail to model evolving relationships among neurons. In this work, we introduce a dynamic graph as the latent variable and develop a Variational Dynamic Graph Latent Variable Model (VDGLVM), a representation learning model based on the variational information bottleneck framework. VDGLVM utilizes a graph generative model and a graph neural network to capture the dynamic communication between nodes that is not directly accessible from the observed data. The proposed computational model provides guaranteed behavior-decoding performance and improves on existing LVMs by associating the inferred latent dynamics with probable DFC.
Keywords: neural latent variable models; dynamic functional connectivities; variational information bottleneck; dynamic graphs
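
The abstract describes an architecture in which a latent dynamic graph is inferred under a variational information bottleneck and processed by a graph neural network to decode behavior. The code below is a minimal, hypothetical sketch of that kind of pipeline in PyTorch; the module names, layer sizes, relaxed-Bernoulli edge sampling, sparse Bernoulli edge prior, and MSE decoding loss are illustrative assumptions, not the authors' implementation.

```python
# Minimal VDGLVM-style sketch, based only on the abstract's description.
# All design choices below (GRU encoder, relaxed-Bernoulli edges, sparse
# Bernoulli prior, MSE decoding loss) are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicGraphEncoder(nn.Module):
    """Infers per-time-step edge logits (a latent dynamic graph) from spike counts."""
    def __init__(self, n_neurons, hidden):
        super().__init__()
        self.rnn = nn.GRU(n_neurons, hidden, batch_first=True)
        self.edge_head = nn.Linear(hidden, n_neurons * n_neurons)

    def forward(self, spikes):                        # spikes: (B, T, N)
        h, _ = self.rnn(spikes)                       # (B, T, H)
        logits = self.edge_head(h)                    # (B, T, N*N)
        n = spikes.size(-1)
        return logits.view(spikes.size(0), spikes.size(1), n, n)

class GraphMessagePassing(nn.Module):
    """One round of message passing over the sampled dynamic graph."""
    def __init__(self, hidden):
        super().__init__()
        self.node_embed = nn.Linear(1, hidden)
        self.update = nn.Linear(2 * hidden, hidden)

    def forward(self, spikes, adj):                   # adj: (B, T, N, N) in [0, 1]
        x = self.node_embed(spikes.unsqueeze(-1))     # (B, T, N, H)
        msg = torch.einsum('btij,btjh->btih', adj, x) # aggregate neighbor messages
        return F.relu(self.update(torch.cat([x, msg], dim=-1)))

class VDGLVMSketch(nn.Module):
    def __init__(self, n_neurons, hidden, behavior_dim, prior_p=0.1):
        super().__init__()
        self.encoder = DynamicGraphEncoder(n_neurons, hidden)
        self.gnn = GraphMessagePassing(hidden)
        self.readout = nn.Linear(hidden, behavior_dim)
        self.prior_p = prior_p                        # sparse Bernoulli prior on edges

    def forward(self, spikes, behavior, beta=1e-3):
        edge_logits = self.encoder(spikes)
        # Relaxed (concrete) Bernoulli keeps graph sampling differentiable.
        adj = torch.distributions.RelaxedBernoulli(
            temperature=torch.tensor(0.5), logits=edge_logits).rsample()
        node_states = self.gnn(spikes, adj)           # (B, T, N, H)
        pred = self.readout(node_states.mean(dim=2))  # pool nodes -> (B, T, D)
        # VIB-style objective: behavior-decoding error + KL of edges to a sparse prior.
        decode_loss = F.mse_loss(pred, behavior)
        q = torch.distributions.Bernoulli(logits=edge_logits)
        p = torch.distributions.Bernoulli(probs=torch.full_like(edge_logits, self.prior_p))
        kl = torch.distributions.kl_divergence(q, p).mean()
        return decode_loss + beta * kl, pred, adj
```

The relaxed Bernoulli is used here only so that graph sampling stays differentiable end to end; the paper's actual graph generative model and training objective may differ.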

Share and Cite

MDPI and ACS Style

Huang, Y.; Yu, Z. Representation Learning for Dynamic Functional Connectivities via Variational Dynamic Graph Latent Variable Models. Entropy 2022, 24, 152. https://doi.org/10.3390/e24020152

AMA Style

Huang Y, Yu Z. Representation Learning for Dynamic Functional Connectivities via Variational Dynamic Graph Latent Variable Models. Entropy. 2022; 24(2):152. https://doi.org/10.3390/e24020152

Chicago/Turabian Style

Huang, Yicong, and Zhuliang Yu. 2022. "Representation Learning for Dynamic Functional Connectivities via Variational Dynamic Graph Latent Variable Models." Entropy 24, no. 2: 152. https://doi.org/10.3390/e24020152

APA Style

Huang, Y., & Yu, Z. (2022). Representation Learning for Dynamic Functional Connectivities via Variational Dynamic Graph Latent Variable Models. Entropy, 24(2), 152. https://doi.org/10.3390/e24020152

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
