Abstract
Causal discovery from time-series data seeks to capture both intra-slice (contemporaneous) and inter-slice (time-lagged) causal relationships among variables, which are essential in many scientific domains. Unlike causal discovery from static data, the time-series setting typically requires longer sequences, i.e., a larger number of observed time steps. To address this challenge, we propose STIC, a novel gradient-based framework that leverages Short-Term Invariance with Convolutional Neural Networks (CNNs) to uncover causal structures. Specifically, STIC exploits both the temporal and mechanistic invariance that hold within short observation windows, treating the windows as independent units to improve sample efficiency. We further design two causal convolution kernels, one for each type of invariance, enabling the estimation of window-level causal graphs. To justify the use of CNNs for causal discovery, we theoretically establish the equivalence between convolution and the generative process of time-series data under the identifiability assumption of additive noise models. Extensive experiments on synthetic datasets and an fMRI benchmark demonstrate that STIC consistently outperforms existing baselines and achieves state-of-the-art performance, particularly when the number of available time steps is limited.
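The claimed correspondence between convolution and an additive-noise generative process can be illustrated numerically for a simple linear lagged model. The sketch below is not the paper's implementation; the kernel `A`, the dimensions, and the noise scale are all hypothetical choices made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T, K = 3, 50, 2  # variables, time steps, lag order (illustrative values)

# Hypothetical lagged causal kernels: A[k] maps x[t-1-k] to its effect on x[t]
A = rng.normal(scale=0.3, size=(K, d, d))

# Generate data from a linear additive-noise model:
#   x[t] = sum_k A[k] @ x[t-1-k] + e[t]
e = rng.normal(scale=0.1, size=(T, d))
x = np.zeros((T, d))
for t in range(T):
    x[t] = e[t]
    for k in range(K):
        if t - 1 - k >= 0:
            x[t] += A[k] @ x[t - 1 - k]

# The deterministic part of the same process, rewritten as a causal
# convolution of the series with the stacked kernel A:
conv = np.zeros((T, d))
for t in range(T):
    for k in range(K):
        if t - 1 - k >= 0:
            conv[t] += A[k] @ x[t - 1 - k]

# Generation and convolution-plus-noise coincide exactly
assert np.allclose(x, conv + e)
```

Under this linear additive-noise assumption, generating the series and convolving it with the lagged kernels are the same operation up to the noise term, which is the intuition behind using convolution kernels to parameterize window-level causal graphs.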