Official PyTorch implementation of "Co-Attention Aligned Mutual Cross-Attention for Cloth-Changing Person Re-Identification". (ACCV 2022 Oral)
Qizao Wang, Xuelin Qian, Yanwei Fu, Xiangyang Xue
Fudan University
- Python == 3.8
- PyTorch == 1.12.1
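A quick way to confirm that your environment matches these versions (an optional sanity check, not part of the released code):

```python
# Environment sanity check (illustrative; not part of the released code).
import sys
import torch

print(f"Python:  {sys.version.split()[0]}")    # expected: 3.8.x
print(f"PyTorch: {torch.__version__}")         # expected: 1.12.1
print(f"CUDA available: {torch.cuda.is_available()}")
```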
Please download cloth-changing person Re-ID datasets and place them in any path `DATASET_ROOT`. Take Celeb-reID as an example:

```
DATASET_ROOT
└── Celeb-reID
    ├── train
    ├── query
    └── gallery
```
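Before training, the layout above can be verified with a small helper. The sketch below is illustrative only: `check_dataset_layout` is a hypothetical function, not part of this repository; it simply checks the directory names from the tree above.

```python
# Illustrative check that a dataset follows the expected layout.
# `dataset_root` and `dataset_filename` mirror the command-line arguments below;
# the helper itself is hypothetical and not part of the released code.
import os

def check_dataset_layout(dataset_root, dataset_filename="Celeb-reID"):
    base = os.path.join(dataset_root, dataset_filename)
    for split in ("train", "query", "gallery"):
        split_dir = os.path.join(base, split)
        if not os.path.isdir(split_dir):
            raise FileNotFoundError(f"Missing split directory: {split_dir}")
    print(f"Dataset layout under {base} looks correct.")

# Example: check_dataset_layout("/path/to/DATASET_ROOT")
```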
To train the model (Celeb-reID as an example), run:

```bash
python main.py --gpu_devices 0 --pose_net_path POSE_NET_PATH --dataset celeb --dataset_root DATASET_ROOT --dataset_filename Celeb-reID --save_dir SAVE_DIR --save_checkpoint
```

- `--pose_net_path`: replace `POSE_NET_PATH` with the path of pretrained HRNet weights (download here)
- `--dataset_root`: replace `DATASET_ROOT` with your dataset root path
- `--save_dir`: replace `SAVE_DIR` with the path to save log files and checkpoints (the full set of options is sketched after this list)
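For reference, the options above correspond to a standard argparse interface. The sketch below only mirrors the flags shown in this README; its defaults and help strings are assumptions, not the repository's actual parser.

```python
# Sketch of the command-line interface used above (argument names taken from the
# commands in this README; defaults and help texts are assumptions, not the actual parser).
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Cloth-changing person Re-ID")
    parser.add_argument("--gpu_devices", type=str, default="0", help="GPU ids, e.g. '0' or '0,1'")
    parser.add_argument("--pose_net_path", type=str, required=True, help="path to pretrained HRNet weights")
    parser.add_argument("--dataset", type=str, default="celeb", help="dataset name, e.g. 'celeb'")
    parser.add_argument("--dataset_root", type=str, required=True, help="root directory containing the dataset")
    parser.add_argument("--dataset_filename", type=str, default="Celeb-reID", help="dataset folder name")
    parser.add_argument("--save_dir", type=str, required=True, help="where logs and checkpoints are written")
    parser.add_argument("--save_checkpoint", action="store_true", help="save checkpoints during training")
    parser.add_argument("--resume", type=str, default="", help="checkpoint to load before evaluation")
    parser.add_argument("--evaluate", action="store_true", help="run evaluation only")
    return parser

if __name__ == "__main__":
    print(build_parser().parse_args())
```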
To evaluate a trained model, run:

```bash
python main.py --gpu_devices 0 --pose_net_path POSE_NET_PATH --dataset celeb --dataset_root DATASET_ROOT --dataset_filename Celeb-reID --resume RESUME_PATH --save_dir SAVE_DIR --evaluate
```

- `--resume`: replace `RESUME_PATH` with the path of the saved checkpoint (see the checkpoint-loading sketch below)
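Loading the checkpoint for evaluation typically amounts to `torch.load` plus `load_state_dict`. The snippet below sketches that step under the assumption that the checkpoint stores a `state_dict` entry; the actual format saved by this repository may differ.

```python
# Illustrative checkpoint loading for evaluation; the 'state_dict' key is an
# assumption about the checkpoint format, not necessarily what this repo saves.
import torch

def load_checkpoint(model, resume_path, device="cuda"):
    checkpoint = torch.load(resume_path, map_location=device)
    state_dict = checkpoint.get("state_dict", checkpoint)  # handle both layouts
    model.load_state_dict(state_dict)
    model.eval()
    return model
```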
- Celeb-reID

| Backbone | Rank-1 | Rank-5 | mAP |
| --- | --- | --- | --- |
| ResNet-50 | 57.5 | 71.5 | 12.3 |

- LTCC

| Backbone | Setting | Rank-1 | mAP |
| --- | --- | --- | --- |
| ResNet-50 | Cloth-Changing | 36.0 | 15.4 |
| ResNet-50 | Standard | 73.2 | 35.3 |
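Rank-k accuracy and mAP reported above are the standard Re-ID metrics. The sketch below illustrates how they can be computed from a query-gallery distance matrix; it is a generic illustration and does not implement dataset-specific protocols such as LTCC's camera/clothes filtering in the cloth-changing setting.

```python
# Minimal illustration of Rank-k (CMC) and mAP from a [num_query x num_gallery]
# distance matrix. Dataset-specific filtering (cameras, clothes) is omitted.
import numpy as np

def evaluate_rank(distmat, q_ids, g_ids, topk=(1, 5)):
    cmc = np.zeros(max(topk))
    average_precisions = []
    for i in range(distmat.shape[0]):
        order = np.argsort(distmat[i])                    # gallery sorted by ascending distance
        matches = (g_ids[order] == q_ids[i]).astype(np.int32)
        if not matches.any():
            continue                                      # query identity absent from gallery
        first_hit = int(np.argmax(matches))               # rank of the first correct match
        cmc[first_hit:] += 1                              # CMC counts a hit at this rank or later
        hit_positions = np.where(matches == 1)[0]
        precisions = [(k + 1) / (pos + 1) for k, pos in enumerate(hit_positions)]
        average_precisions.append(np.mean(precisions))    # AP for this query
    cmc /= max(len(average_precisions), 1)
    return {f"Rank-{k}": cmc[k - 1] for k in topk}, float(np.mean(average_precisions))

# Example (toy data):
# distmat = np.random.rand(3, 8)
# ranks, mAP = evaluate_rank(distmat, np.array([0, 1, 2]), np.random.randint(0, 3, 8))
```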
You can achieve similar results with the released code.

For implementation simplicity, the code in this repository does not support the cloth-changing setting on the LTCC dataset. Please refer to our latest works, FIRe-CCReID and CSSC-CCReID, which include additional functionality and code for other datasets.
Please cite the following paper in your publications if it helps your research:
```
@inproceedings{wang2022co,
  title={Co-attention aligned mutual cross-attention for cloth-changing person re-identification},
  author={Wang, Qizao and Qian, Xuelin and Fu, Yanwei and Xue, Xiangyang},
  booktitle={Proceedings of the Asian Conference on Computer Vision},
  pages={2270--2288},
  year={2022}
}
```
Any questions or discussions are welcome!
Qizao Wang ([email protected])