Baidu
- Beijing, China
- http://www.baidu.com
Stars
[ICML 2024] Mastering Text-to-Image Diffusion: Recaptioning, Planning, and Generating with Multimodal LLMs (RPG)
Open-Sora: Democratizing Efficient Video Production for All
Implementations from the free course Deep Reinforcement Learning with TensorFlow and PyTorch
End-to-End Object Detection with Transformers
[ICLR 2020] Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
The easiest way to use deep metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
Background Matting: The World is Your Green Screen
Multi-Task Vision and Language
Training neural models with structured signals.
Mixture-of-Embeddings-Experts
A LITE BERT FOR SELF-SUPERVISED LEARNING OF LANGUAGE REPRESENTATIONS, with large-scale Chinese pre-trained ALBERT models
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpora, and a leaderboard
StyleGAN2 - Official TensorFlow Implementation
Noise2Noise: Learning Image Restoration without Clean Data - Official TensorFlow implementation of the ICML 2018 paper
[ICCV 2019] "DeblurGAN-v2: Deblurring (Orders-of-Magnitude) Faster and Better" by Orest Kupyn, Tetiana Martyniuk, Junru Wu, Zhangyang Wang
Deep Learning 500 Questions: a Q&A-style treatment of common topics in probability, linear algebra, machine learning, deep learning, computer vision, and other areas, written to help the author and interested readers. The book spans 18 chapters and over 500,000 characters. Given the author's limited expertise, corrections from readers are welcome. To be continued... For collaboration, contact [email protected]. All rights reserved; infringement will be pursued. Tan, 2018-06
Deep Plug-and-Play Super-Resolution for Arbitrary Blur Kernels (CVPR 2019) (PyTorch)
PyTorch implementation of "Deep Flow-Guided Video Inpainting" (CVPR 2019)
The project is an official implementation of our CVPR2019 paper "Deep High-Resolution Representation Learning for Human Pose Estimation"
Train the HRNet model on ImageNet
[arXiv 2019] "Contrastive Multiview Coding", also contains implementations for MoCo and InstDis
A comprehensive collection of recent papers on graph deep learning
A collection of important graph embedding, classification and representation learning papers with implementations.