Stars
Repository hosting code for "Actions Speak Louder than Words: Trillion-Parameter Sequential Transducers for Generative Recommendations" (https://arxiv.org/abs/2402.17152).
12 Weeks, 24 Lessons, AI for All!
Making large AI models cheaper, faster, and more accessible
Example models using DeepSpeed
Code and documentation to train Stanford's Alpaca models and to generate the data.
Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
Easy-to-use, modular, and extensible package of deep-learning-based CTR models.