Stars
Structured state space sequence models
Differentiable controlled differential equation solvers for PyTorch with GPU support and memory-efficient adjoint backpropagation.
[AAAI23] This is the official GitHub repository for the AAAI23 paper "Spatio-Temporal Meta-Graph Learning for Traffic Forecasting"
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah
A Library for Advanced Deep Time Series Models.
Official repository for the paper "Scalable Spatiotemporal Graph Neural Networks" (AAAI 2023)
Official repository for the paper "Image Deraining Transformer".
Learning A Sparse Transformer Network for Effective Image Deraining (CVPR 2023)
Official repository for the paper "Filling the G_ap_s: Multivariate Time Series Imputation by Graph Neural Networks" (ICLR 2022)
Hackable and optimized Transformers building blocks, supporting a composable construction.
[AAAI2023] A PyTorch implementation of PDFormer: Propagation Delay-aware Dynamic Long-range Transformer for Traffic Flow Prediction.
Graph Neural Networks for Irregular Time Series
Code for "Neural Controlled Differential Equations for Irregular Time Series" (Neurips 2020 Spotlight)
pytorch implementation of Domain-Adversarial Training of Neural Networks
TACTiS-2: Better, Faster, Simpler Attentional Copulas for Multivariate Time Series, from ServiceNow Research
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
Reformer, the efficient Transformer, in PyTorch
Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415
[TKDD 2023] AdaTime: A Benchmarking Suite for Domain Adaptation on Time Series Data
tsl: a PyTorch library for processing spatiotemporal data.
Multivariate Time Series Transformer, public version
Self-supervised contrastive learning for time series via time-frequency consistency
Codebase for Generative Adversarial Imputation Networks (GAIN) - ICML 2018
A PyTorch implementation of the Transformer model in "Attention is All You Need".
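Several entries above (e.g., PatchTST, iTransformer, the Non-stationary Transformer) apply Transformer encoders to multivariate time-series forecasting. The sketch below is only a generic illustration of that pattern using the standard torch.nn API; it is not the interface of any repository listed here, and all shapes and hyperparameters are illustrative assumptions.

```python
# Minimal sketch only: a plain torch.nn Transformer encoder for one-step
# multivariate forecasting. Shapes and hyperparameters are assumptions for
# illustration, unrelated to any repository listed above.
import torch
import torch.nn as nn

class TinyForecaster(nn.Module):
    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)          # per-time-step embedding
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_features)            # next-step prediction

    def forward(self, x):                                     # x: (batch, time, features)
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])                            # forecast from the last position

x = torch.randn(8, 96, 7)            # 8 series, 96 time steps, 7 variables
print(TinyForecaster(7)(x).shape)    # torch.Size([8, 7])
```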