- Microsoft
- Redmond, Washington
- https://renll.github.io/
Stars
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Models and examples built with TensorFlow
The world's simplest facial recognition API for Python and the command line
openpilot is an operating system for robotics. Currently, it upgrades the driver assistance system on 275+ supported cars.
Making large AI models cheaper, faster and more accessible
TensorFlow code and pre-trained models for BERT
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Graph Neural Network Library for PyTorch
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Fast and memory-efficient exact attention
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
DALL·E Mini - Generate images from a text prompt
This is an official implementation for "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows".
End-to-End Object Detection with Transformers
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". It combines the best of RNN and transformer architectures.
🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
Clean, minimal, accessible reproduction of DeepSeek R1-Zero
Hackable and optimized Transformers building blocks, supporting a composable construction.
Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TensorFlow, and others)
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Code for the paper "Jukebox: A Generative Model for Music"
Official PyTorch Implementation of "Scalable Diffusion Models with Transformers"
XLNet: Generalized Autoregressive Pretraining for Language Understanding