Stars
Model-Based Generative Adversarial Imitation Learning
Code for the paper "Generative Adversarial Imitation Learning"
PyTorch implementation of Advantage Actor Critic (A2C), Proximal Policy Optimization (PPO), and Scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation (ACKTR)
Inverse RL algorithms (APP, MaxEnt, GAIL, VAIL)
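The GAIL-style entries above share one core mechanism: a discriminator is trained to separate expert state-action pairs from the policy's rollouts, and its output then serves as the policy's reward signal. A minimal PyTorch sketch of that discriminator step (the `Discriminator` network, batch names, and sizes are illustrative, not taken from any of these repos):

```python
import torch
import torch.nn as nn

# Illustrative discriminator over concatenated (state, action) pairs.
class Discriminator(nn.Module):
    def __init__(self, obs_dim, act_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim + act_dim, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),  # logit: expert vs. policy
        )

    def forward(self, obs, act):
        return self.net(torch.cat([obs, act], dim=-1))

def discriminator_loss(disc, expert_obs, expert_act, policy_obs, policy_act):
    """Binary cross-entropy: label expert pairs 1, policy pairs 0."""
    bce = nn.BCEWithLogitsLoss()
    expert_logits = disc(expert_obs, expert_act)
    policy_logits = disc(policy_obs, policy_act)
    return (bce(expert_logits, torch.ones_like(expert_logits)) +
            bce(policy_logits, torch.zeros_like(policy_logits)))

# The policy is then rewarded with e.g. -log(1 - sigmoid(D(s, a))),
# pushing it to make its pairs indistinguishable from the expert's.
```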
[WWW'23] "SimRec: Graph-less Collaborative Filtering"
The official implementation of "Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks" (NeurIPS 2022).
Implementation of Graph Convolutional Networks in TensorFlow
Graph Neural Network Library for PyTorch
PyTorch repo for DeepGCNs (ICCV'2019 Oral, TPAMI'2021), DeeperGCN (arXiv'2020), and GNN1000 (ICML'2021): https://www.deepgcns.org
Code for EvolveGCN: Evolving Graph Convolutional Networks for Dynamic Graphs
A PyTorch implementation of "Cluster-GCN: An Efficient Algorithm for Training Deep and Large Graph Convolutional Networks" (KDD 2019).
Code and resources on scalable and efficient Graph Neural Networks
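Most of the GCN repos above implement variants of the same Kipf & Welling propagation rule, H' = sigma(D^{-1/2} (A + I) D^{-1/2} H W). A minimal dense-matrix sketch in PyTorch (real libraries such as PyTorch Geometric use sparse message passing instead):

```python
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One Kipf & Welling GCN layer on a dense adjacency matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, h, adj):
        # Add self-loops: A_hat = A + I.
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)
        # Symmetric normalization: D_hat^{-1/2} A_hat D_hat^{-1/2}.
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return torch.relu(a_norm @ self.linear(h))
```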
Example code for Weight Normalization, from "Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks"
Code for "How to Initialize your Network? Robust Initialization for WeightNorm & ResNets"
A cross_entropy function in PyTorch that avoids abnormal NaN values in the loss
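One common source of such NaNs is taking log(0) on a saturated softmax. A minimal sketch of the guard, assuming the fix is clamping probabilities away from zero before the log (the linked repo may handle it differently):

```python
import torch
import torch.nn.functional as F

def cross_entropy(logits, target, eps=1e-8):
    """Cross-entropy that clamps probabilities away from 0 before log,
    so a saturated softmax cannot produce NaN/inf losses."""
    probs = F.softmax(logits, dim=-1).clamp(min=eps)
    return F.nll_loss(probs.log(), target)
```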
Reproduce CKA: Similarity of Neural Network Representations Revisited
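For reference, the linear CKA of that paper reduces to a ratio of Frobenius norms of cross-covariances between two representation matrices; a minimal PyTorch sketch:

```python
import torch

def linear_cka(x, y):
    """Linear CKA between representations x (n, d1) and y (n, d2)."""
    x = x - x.mean(dim=0)  # center each feature
    y = y - y.mean(dim=0)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    num = (y.t() @ x).norm(p='fro') ** 2
    den = (x.t() @ x).norm(p='fro') * (y.t() @ y).norm(p='fro')
    return num / den
```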
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
Intel® AI Reference Models: contains Intel optimizations for running deep learning workloads on Intel® Xeon® Scalable processors and Intel® Data Center GPUs
torch-optimizer -- a collection of optimizers for PyTorch
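Everything in the collection follows the standard `torch.optim` interface, so it drops into an existing training loop; a usage sketch (the choice of `DiffGrad` and the learning rate here are illustrative):

```python
import torch
import torch_optimizer as optim

model = torch.nn.Linear(10, 2)
# Any optimizer from the collection is constructed and stepped
# exactly like a built-in torch.optim optimizer.
optimizer = optim.DiffGrad(model.parameters(), lr=1e-3)

loss = model(torch.randn(4, 10)).sum()
loss.backward()
optimizer.step()
```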
PyTorch implementation of "Large-Scale Long-Tailed Recognition in an Open World" (CVPR 2019 Oral)
Distilling Knowledge via Knowledge Review, CVPR 2021
A PyTorch knowledge distillation library for benchmarking and extending work in knowledge distillation, pruning, and quantization.
PyTorch implementations of various knowledge distillation (KD) methods.
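The distillation entries above build on Hinton et al.'s classic formulation: match the student's temperature-softened logits to the teacher's with a KL term scaled by T^2, mixed with the usual hard-label cross-entropy. A minimal sketch:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, target, T=4.0, alpha=0.9):
    """Hinton-style distillation: KL between temperature-softened
    logits (scaled by T^2) plus the hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction='batchmean',
    ) * (T * T)
    hard = F.cross_entropy(student_logits, target)
    return alpha * soft + (1 - alpha) * hard
```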
Call all Node.js modules directly from DOM/WebWorker and enable a new way of writing applications with all Web technologies.
Wrong project! You should head over to http://github.com/sshuttle/sshuttle