This repository provides a PyTorch implementation of the paper "Scalable Infomin Learning", NeurIPS 2022.
We consider learning a representation $Z = f(X)$ with the following objective:

$$\max_{f} \; L(f(X), Y) - \beta \cdot I(f(X); T)$$

where $L$ is a utility term for predicting the target $Y$, $T$ is a sensitive or nuisance attribute, and $I$ denotes mutual information. We show that to minimise $I(f(X); T)$, one need not estimate it accurately; it suffices to verify whether it is zero, which can be done scalably via slicing.
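As an illustration only (names such as `utility_head` and `mi_proxy` are hypothetical, not the repository's API), such a two-term infomin objective can be sketched in PyTorch as:

```python
import torch
import torch.nn.functional as F

def infomin_loss(z, y, t, utility_head, mi_proxy, beta=1.0):
    """Sketch of an infomin objective: reward utility on the target y
    while penalising a proxy of the mutual information I(z; t).
    (Illustrative only; names here are not the repository's API.)"""
    utility = -F.cross_entropy(utility_head(z), y)  # L(f(X), Y)
    penalty = mi_proxy(z, t)                        # proxy of I(f(X); T)
    # maximise (utility - beta * penalty) <=> minimise its negative
    return -(utility - beta * penalty)
```

Any of the estimators/proxies shipped in `/mi` could in principle play the role of `mi_proxy` here.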
See also the accompanying materials: Poster, Slides, Demo. The demo is a minimal Jupyter notebook for trying out our method.
## Prerequisites

- Python 3.5+
- PyTorch 1.12.1
- Torchvision 0.13.1
- NumPy, SciPy, Matplotlib

We strongly recommend using conda to manage/update dependencies:

```shell
conda install pytorch torchvision matplotlib
```
## Data

To download the PIE dataset (contributed by https://github.com/bluer555/CR-GAN), run:

```shell
bash scripts/download_pie.sh
```

For the fairness experiments, the data is already provided in the `/data` folder.
## Mutual information estimators

Implementations of the following estimators/proxies are located at `/mi`:
- Pearson Correlation
- Distance Correlation
- Neural Total Correlation
- Neural Rényi Correlation
- CLUB
- Sliced mutual information
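To convey the slicing idea, below is a rough, self-contained sketch (not the repository's implementation) of a sliced dependence proxy: project both variables onto random one-dimensional slices and average the absolute Pearson correlation of the projections.

```python
import torch

def sliced_abs_pearson(z, t, n_slices=50):
    """Illustrative sliced dependence proxy (not the repo's code):
    average absolute Pearson correlation between random 1-D
    projections of z and t. Near zero when the slices look independent."""
    total = 0.0
    for _ in range(n_slices):
        u = (z @ torch.randn(z.shape[1], 1)).squeeze(1)  # random slice of z
        v = (t @ torch.randn(t.shape[1], 1)).squeeze(1)  # random slice of t
        u = (u - u.mean()) / (u.std() + 1e-8)            # standardise
        v = (v - v.mean()) / (v.std() + 1e-8)
        total = total + (u * v).mean().abs()             # |Pearson corr|
    return total / n_slices
```

Because each slice is one-dimensional, the per-slice statistic is cheap to compute, which is what makes this family of proxies scale with representation dimension.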
## Tasks

Code for the following tasks is located at `/tasks`:
- Fairness
- Disentangled representation learning
- Domain adaptation
## Citation

If you find our paper/repository helpful, please consider citing:

```
@article{chen2023scalable,
  title={Scalable Infomin Learning},
  author={Chen, Yanzhi and Sun, Weihao and Li, Yingzhen and Weller, Adrian},
  journal={arXiv preprint arXiv:2302.10701},
  year={2023}
}
```