Temporal Causal Discovery with Machine Learning

This repository contains the code accompanying my master's thesis in Computer Science: Data Science and Artificial Intelligence at the University of Antwerp. The objective of the thesis is to develop a method for robust causal discovery in time series data using machine learning. Please feel free to contact me if you have any questions or suggestions.

The complete thesis can be downloaded as a PDF. It provides a detailed overview of the research, methodology, and results of this study.

Abstract

In this study, we explore the complexities and challenges of temporal causal discovery using deep learning. Additive models can identify temporal causal relationships in data (Bussmann et al.). However, because they cannot effectively approximate interactive (non-additive) relationships, they may overlook a relationship and incorrectly attribute causal effects to other variables. Furthermore, expanding the receptive field of a model to capture long-range relationships increases its complexity and can result in inaccurate causal predictions. Considering the real-world implications of such predictions, there is a need to quantify the uncertainty of these models to enhance the robustness and reliability of their causal predictions. In this study, we provide a comprehensive overview of the challenges in temporal causal discovery, covering both general challenges and challenges specific to methods suggested by prior works. To address these challenges, we make four key contributions: (1) We incorporate a Temporal Convolutional Network (TCN) to process time series data. This architecture expands the receptive field and increases model capacity, allowing the model to learn more complex, non-linear, and long-range relationships. (2) We introduce the Temporal Attention Mechanism for Causal Discovery (TAMCaD) architecture. This framework is capable of capturing interactive relationships, and because it produces a causal matrix for every timestep, TAMCaD can also identify contemporaneous relationships. (3) We describe the process of generating synthetic time series data that exhibit all of these properties. (4) By integrating predictive uncertainty into the attention logits and causal contributions (Valdenegro-Toro et al.), we quantify both aleatoric (data-centric) and epistemic (model-centric) uncertainty, paving the way for future research to improve the precision and interpretability of the identified causal relationships. By reducing the complexity of a TCN through weight sharing and recurrent layers, we achieve comparable performance with fewer learnable parameters. While TAMCaD shows the ability to learn interactive relationships, we find that the interpretability of the attention weights remains a challenge. Our findings further suggest that additive models are adept at identifying the most evident relationships, which currently makes them more robust than our proposed attention-based method. Nonetheless, our findings also suggest that the attention-based approach holds promise for improving temporal causal discovery.
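
To make the core idea more concrete, below is a minimal PyTorch sketch of the two main ingredients described above: a dilated causal TCN encoder per variable, followed by an attention step that produces a (variables × variables) causal matrix at every timestep. This is an illustrative sketch only, not the repository's actual implementation; the class and function names (`TCNEncoder`, `AttentionCausalDiscovery`) are made up for this example, and the uncertainty quantification discussed in the abstract is omitted here.

```python
# Illustrative sketch (not the repository's implementation) of a TCN encoder
# combined with per-timestep attention over variables.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TCNEncoder(nn.Module):
    """Stack of dilated causal 1D convolutions; receptive field grows as 2^depth."""

    def __init__(self, in_channels, hidden, depth=3, kernel_size=2):
        super().__init__()
        layers = []
        ch = in_channels
        for i in range(depth):
            dilation = 2 ** i
            # left-pad so the convolution is causal (no look-ahead)
            layers += [
                nn.ConstantPad1d(((kernel_size - 1) * dilation, 0), 0.0),
                nn.Conv1d(ch, hidden, kernel_size, dilation=dilation),
                nn.ReLU(),
            ]
            ch = hidden
        self.net = nn.Sequential(*layers)

    def forward(self, x):          # x: (batch, channels, time)
        return self.net(x)         # (batch, hidden, time)


class AttentionCausalDiscovery(nn.Module):
    """Encodes each variable separately, then attends over variables at each timestep."""

    def __init__(self, num_vars, hidden=16):
        super().__init__()
        self.encoders = nn.ModuleList([TCNEncoder(1, hidden) for _ in range(num_vars)])
        self.query = nn.Linear(hidden, hidden)
        self.key = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):          # x: (batch, num_vars, time)
        # per-variable context: (batch, vars, time, hidden)
        h = torch.stack([enc(x[:, i:i + 1]) for i, enc in enumerate(self.encoders)], dim=1)
        h = h.permute(0, 1, 3, 2)
        q = self.query(h)                              # queries for target variables
        k = self.key(h)                                # keys for candidate causes
        # attention logits between every (target, cause) pair at every timestep
        logits = torch.einsum('bith,bjth->btij', q, k) / h.shape[-1] ** 0.5
        causal_matrix = F.softmax(logits, dim=-1)      # (batch, time, vars, vars)
        # predict each variable from its attended context
        context = torch.einsum('btij,bjth->bith', causal_matrix, h)
        pred = self.out(context).squeeze(-1)           # (batch, vars, time)
        return pred, causal_matrix


if __name__ == "__main__":
    x = torch.randn(4, 3, 100)                 # 4 series, 3 variables, 100 timesteps
    model = AttentionCausalDiscovery(num_vars=3)
    pred, cm = model(x)
    print(pred.shape, cm.shape)                # (4, 3, 100) and (4, 100, 3, 3)
```

In this sketch, the softmax over candidate causes yields one attention (causal) matrix per timestep, which is what allows contemporaneous and time-varying relationships to be read off directly; the actual method additionally attaches predictive uncertainty to these attention logits and causal contributions.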

About

Researching causal relationships in time series data using Temporal Convolutional Networks (TCNs) combined with attention mechanisms. This approach aims to identify complex temporal interactions. Additionally, we're incorporating uncertainty quantification to enhance the reliability of our causal predictions.
