Large-scale Dataset Distillation

This is a collection of our work on large-scale dataset distillation.

SCDD: Self-supervised Compression Method for Dataset Distillation.

CDA: Dataset Distillation in Large Data Era, arXiv:2311.18838.

SRe2L (NeurIPS'23 spotlight): Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective. With 50 IPC (images per class), the distilled dataset reaches 60.8% top-1 accuracy on the original ImageNet-1K validation set.
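
As a rough, hedged illustration of the three stages the SRe2L name refers to (squeeze, recover, relabel), the minimal PyTorch sketch below follows common data-free inversion practice; the ResNet-18 teacher, function names, and hyperparameters are illustrative assumptions, not the repository's actual API.

# Conceptual sketch of the three SRe2L stages; names and settings are
# illustrative only, not this repository's actual code or API.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18

def squeeze(train_loader, epochs=10, device="cpu"):
    """Squeeze: train a teacher on the full dataset so its weights and
    BatchNorm statistics summarize the original data."""
    model = resnet18(num_classes=1000).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
    for _ in range(epochs):
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            loss = F.cross_entropy(model(x), y)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model.eval()

def recover(teacher, targets, steps=1000, bn_weight=0.01, device="cpu"):
    """Recover: optimize synthetic images so the teacher classifies them as
    the target labels while their per-layer batch statistics match the
    teacher's stored BatchNorm running statistics."""
    syn = torch.randn(len(targets), 3, 224, 224, device=device, requires_grad=True)
    opt = torch.optim.Adam([syn], lr=0.1)

    # Hook BatchNorm layers to read the statistics induced by the synthetic batch.
    bn_losses = []
    def bn_hook(module, inputs, _output):
        x = inputs[0]
        mean = x.mean(dim=(0, 2, 3))
        var = x.var(dim=(0, 2, 3), unbiased=False)
        bn_losses.append(F.mse_loss(mean, module.running_mean) +
                         F.mse_loss(var, module.running_var))
    hooks = [m.register_forward_hook(bn_hook)
             for m in teacher.modules() if isinstance(m, torch.nn.BatchNorm2d)]

    for _ in range(steps):
        bn_losses.clear()
        logits = teacher(syn)
        loss = F.cross_entropy(logits, targets) + bn_weight * sum(bn_losses)
        opt.zero_grad()
        loss.backward()
        opt.step()
    for h in hooks:
        h.remove()
    return syn.detach()

def relabel(teacher, syn_images):
    """Relabel: store the teacher's soft predictions as training targets
    for the distilled images."""
    with torch.no_grad():
        return F.softmax(teacher(syn_images), dim=1)

# Hypothetical usage:
#   teacher = squeeze(loader)
#   images  = recover(teacher, torch.arange(10))
#   labels  = relabel(teacher, images)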

Citation

@article{yin2023dataset,
  title={Dataset Distillation in Large Data Era},
  author={Yin, Zeyuan and Shen, Zhiqiang},
  journal={arXiv preprint arXiv:2311.18838},
  year={2023}
}

@inproceedings{yin2023squeeze,
  title={Squeeze, Recover and Relabel: Dataset Condensation at ImageNet Scale From A New Perspective},
  author={Yin, Zeyuan and Xing, Eric and Shen, Zhiqiang},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023}
}
