Control Color (CtrlColor) achieves highly controllable multimodal image colorization based on the Stable Diffusion model.
Region colorization | Iterative editing
📖 For more visual results and applications of CtrlColor, check out our project page.
- 2024.12.16: The test code (Gradio demo), the colorization model checkpoint, and the autoencoder checkpoint are now publicly available.
- Required packages are listed in `CtrlColor_environ.yaml`.
```bash
# git clone this repository
git clone https://github.com/ZhexinLiang/Control-Color.git
cd Control_Color

# create new anaconda env and install python dependencies
conda env create -f CtrlColor_environ.yaml
conda activate CtrlColor
```
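After activating the environment, you may want to quickly verify that PyTorch and CUDA are usable. This check is not part of the official setup; it only assumes that PyTorch is installed by `CtrlColor_environ.yaml`, which a Stable Diffusion-based model requires.

```bash
# Optional sanity check (not part of the official instructions):
# confirm that PyTorch imports and a CUDA-capable GPU is visible.
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```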
Please download the checkpoints of both the colorization model and the VAE from [Google Drive] and put both checkpoints in the `./pretrained_models` folder.
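The exact checkpoint filenames depend on what is shipped on Google Drive, so the names below are placeholders. A minimal sketch of placing the downloads, assuming they land in `~/Downloads`:

```bash
# Placeholder filenames: replace them with the actual names from Google Drive.
mkdir -p pretrained_models
mv ~/Downloads/<colorization_model>.ckpt pretrained_models/
mv ~/Downloads/<autoencoder_vae>.ckpt pretrained_models/
```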
You can use the following command to run the Gradio demo:

```bash
python test.py
```
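If you have multiple GPUs, you can pin the demo to a specific one with the standard CUDA environment variable; this is generic PyTorch behavior, not a CtrlColor-specific option.

```bash
# Generic way to run the demo on GPU 0 (not a repo-specific flag).
CUDA_VISIBLE_DEVICES=0 python test.py
```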
Then you will see our interactive interface, as shown below:
If you find our work useful for your research, please consider citing the paper:
```bibtex
@article{liang2024control,
  title={Control Color: Multimodal Diffusion-based Interactive Image Colorization},
  author={Liang, Zhexin and Li, Zhaochen and Zhou, Shangchen and Li, Chongyi and Loy, Chen Change},
  journal={arXiv preprint arXiv:2402.10855},
  year={2024}
}
```
If you have any questions, please feel free to reach out at [email protected].