DFD

PyTorch implementation of our paper "Memory Efficient Data-Free Distillation for Continual Learning", published in Pattern Recognition.
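For orientation only: the paper's approach distills knowledge from a previous-task model into the current model without storing old data. The core ingredient of any such distillation is a temperature-scaled knowledge-distillation loss (KL divergence between softened teacher and student outputs, as in Hinton et al.). The sketch below is a generic, stdlib-only illustration of that loss, not the paper's exact memory-efficient formulation; the logit values are hypothetical.

```python
import math

def softmax(logits, T=2.0):
    """Temperature-scaled softmax over a list of raw logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    the standard knowledge-distillation objective. Zero when the
    student matches the teacher exactly."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical logits for a 3-class task: identical outputs give zero
# loss; mismatched outputs give a positive loss to be minimized.
print(kd_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(kd_loss([1.0, 2.0, 3.0], [3.0, 2.0, 1.0]))
```

In the data-free continual-learning setting, the teacher is the frozen model from previous tasks and the inputs are synthesized rather than drawn from stored data; see the paper for the actual objective.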

Title: Memory Efficient Data-Free Distillation for Continual Learning

Authors: Xiaorong Li, Shipeng Wang, Jian Sun, Zongben Xu

Email: [email protected], [email protected]

Usage

sh scripts/dfd.sh

Prepare Dataset

For 20mini, the dataset can be downloaded here.

Requirements

Python (3.6)

PyTorch (1.9.0)

Citation

@ARTICLE{Li_2023_pr,
  author={Xiaorong Li and Shipeng Wang and Jian Sun and Zongben Xu},
  title={Memory Efficient Data-Free Distillation for Continual Learning},
  journal={Pattern Recognition},
  volume={144},
  pages={109875},
  year={2023},
  issn={0031-3203}
}

Acknowledgment

The code is based on Adam-NSCL and Continual-Learning-Benchmark.
