Ant Group
Hangzhou, China
Stars
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Code for the paper "Language Models are Unsupervised Multitask Learners"
Graph Neural Network Library for PyTorch
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment.
Python package built to ease deep learning on graphs, on top of existing DL frameworks.
Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)
A Lite BERT for Self-Supervised Learning of Language Representations; large-scale pre-trained Chinese ALBERT models
[PyTorch] Easy-to-use, modular, and extensible package of deep-learning-based CTR (click-through rate) models.
Conditional Transformer Language Model for Controllable Generation
Pre-trained Chinese XLNet models: XLNet_Large
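The whole-word-masking idea behind the Chinese BERT-wwm entry above can be sketched in plain Python: instead of masking individual WordPiece tokens, all sub-tokens of a chosen word (continuation pieces prefixed with `##`) are masked together. This is a minimal illustrative sketch, not code from the repository; the function name and parameters are assumptions.

```python
import random

def whole_word_mask(tokens, mask_prob=0.15, seed=0):
    """Illustrative whole-word masking: a WordPiece continuation token
    (prefixed '##') is grouped with the token that starts the word, and
    each whole group is masked together with probability mask_prob."""
    rng = random.Random(seed)
    # Group token indices into words: a '##' token joins the previous group.
    words = []
    for i, tok in enumerate(tokens):
        if tok.startswith("##") and words:
            words[-1].append(i)
        else:
            words.append([i])
    masked = list(tokens)
    for word in words:
        if rng.random() < mask_prob:
            for i in word:  # mask every sub-token of the chosen word
                masked[i] = "[MASK]"
    return masked
```

Compared with per-token masking, this guarantees that `["play", "##ing"]` is either fully visible or fully masked, which is the distinction the BERT-wwm work exploits for Chinese pre-training.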