Stars
DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model
A Bilingual Role Evaluation Benchmark for Large Language Models
Rapid fuzzy string matching in Python using various string metrics
A GPT-4 AI Tutor Prompt for customizable personalized learning experiences.
A large-scale 7B pretrained language model developed by BaiChuan-Inc.
Measuring Massive Multitask Language Understanding | ICLR 2021
Video PreTraining (VPT): Learning to Act by Watching Unlabeled Online Videos
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
The most powerful and modular diffusion model GUI, api and backend with a graph/nodes interface.
ChatGLM-6B: An Open-Source Bilingual Dialogue Language Model
TradeMaster is an open-source platform for quantitative trading empowered by reinforcement learning 🔥 ⚡ 🌈
Cramming the training of a (BERT-type) language model into limited compute.
A curated list of the latest breakthroughs in AI (in 2022) by release date with a clear video explanation, link to a more in-depth article, and code.
VITS: Conditional Variational Autoencoder with Adversarial Learning for End-to-End Text-to-Speech
Ongoing research training transformer language models at scale, including: BERT & GPT-2
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
The Guzheng playing dataset from "A Music-driven Deep Generative Adversarial Model for Guzheng Playing Animation"
AI-based multi-label girl image classification system, implemented using TensorFlow.
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Beyond the Imitation Game collaborative benchmark for measuring and extrapolating the capabilities of language models
Generate 8-bit chiptunes with deep learning
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Easy and Efficient Transformer: A Scalable Inference Solution for Large NLP Models
⭐ Cross-platform V2Ray client for Linux / Windows / macOS | Supports VMess / VLESS / SSR / Trojan / Trojan-Go / NaiveProxy / HTTP / HTTPS / SOCKS5 | Built with C++ / Qt | Extensible plugin-based design ⭐