Stars
A latent text-to-image diffusion model
Instruct-tune LLaMA on consumer hardware
Scripts for fine-tuning Meta Llama with composable FSDP & PEFT methods to cover single/multi-node GPUs. Supports default & custom datasets for applications such as summarization and Q&A. Supporting…
State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
A tiny scalar-valued autograd engine and a neural net library on top of it with a PyTorch-like API (a minimal sketch of the idea follows this list)
Offline speech recognition API for Android, iOS, Raspberry Pi and servers with Python, Java, C# and Node
Using Low-rank adaptation to quickly fine-tune diffusion models.
Hunyuan-DiT: A Powerful Multi-Resolution Diffusion Transformer with Fine-Grained Chinese Understanding
Notebooks using the Hugging Face libraries 🤗
MTEB: Massive Text Embedding Benchmark
骆驼 (Luotuo): an instruction-following LLaMA fine-tuned for Chinese. Developed by 陈启源 @ Central China Normal University (华中师范大学), 李鲁鲁 @ SenseTime (商汤科技), and 冷子昂 @ SenseTime (商汤科技)
minLoRA: a minimal PyTorch library that allows you to apply LoRA to any PyTorch model (a generic LoRA-layer sketch follows this list)
Training and fine-tuning an LLM in Python and PyTorch.
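The micrograd entry above describes a scalar-valued autograd engine. Below is a minimal sketch of that idea, assuming nothing about micrograd's actual API: the class and method names (`Scalar`, `backward`) are illustrative only. Each value records the operations that produced it, and reverse-mode differentiation walks the graph in reverse topological order applying the chain rule.

```python
# Minimal scalar autograd sketch (illustrative, not micrograd's API).

class Scalar:
    """A scalar value that records the operations producing it."""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents        # nodes this value was computed from
        self._backward = lambda: None   # propagates grad to the parents

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data + other.data, (self, other))

        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        out = Scalar(self.data * other.data, (self, other))

        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()

        def build(node):
            if node not in visited:
                visited.add(node)
                for p in node._parents:
                    build(p)
                topo.append(node)

        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()


# Usage: with a=3, b=2 and c = a*b + a, dc/da = b + 1 = 3 and dc/db = a = 3.
a, b = Scalar(3.0), Scalar(2.0)
c = a * b + a
c.backward()
print(a.grad, b.grad)  # 3.0 3.0
```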
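Several entries above (cloneofsimo/lora, minLoRA, the LLM fine-tuning scripts) rely on low-rank adaptation: the pretrained weight W stays frozen and only a low-rank update B·A is trained. The sketch below shows that idea for a single linear layer; the names (`LoRALinear`, `rank`, `alpha`) are my own and do not reflect the API of any of the listed libraries.

```python
# Generic LoRA sketch for one linear layer (illustrative names, not a library API).

import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Wraps a frozen nn.Linear and adds a trainable low-rank update."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)           # freeze pretrained W
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))  # zero init: update starts at 0
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = frozen base output + scaled low-rank correction x A^T B^T
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


# Usage: wrap one projection; only the small A/B matrices are trainable.
layer = LoRALinear(nn.Linear(512, 512), rank=4)
x = torch.randn(2, 512)
print(layer(x).shape)                                                  # torch.Size([2, 512])
print(sum(p.numel() for p in layer.parameters() if p.requires_grad))   # 4096
```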