🦜🔗 Build context-aware reasoning applications
The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
TensorFlow Tutorial and Examples for Beginners (supports TF v1 & v2)
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
Instruct-tune LLaMA on consumer hardware
PyTorch tutorials and fun projects including neural talk, neural style, poem writing, anime generation (companion to the book 《深度学习框架PyTorch:入门与实战》, "Deep Learning Framework PyTorch: Introduction and Practice")
This repository contains demos I made with the Transformers library by HuggingFace.
TensorFlow Tutorials with YouTube Videos
My continuously updated Machine Learning, Probabilistic Models and Deep Learning notes and demos (2000+ slides), with links to accompanying videos
My blogs and code for machine learning. http://cnblogs.com/pinard
Using Low-rank adaptation to quickly fine-tune diffusion models.
Python code for the "Probabilistic Machine Learning" book by Kevin Murphy
[NeurIPS 2024 Oral][GPT beats diffusion🔥] [scaling laws in visual generation📈] Official impl. of "Visual Autoregressive Modeling: Scalable Image Generation via Next-Scale Prediction". An *ultra-sim…
A course in reinforcement learning in the wild
An annotated implementation of the Transformer paper.
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
CoTracker is a model for tracking any point (pixel) on a video.
Fault-tolerant, highly scalable GPU orchestration and a machine learning framework designed for training models with billions to trillions of parameters
OmniGen: Unified Image Generation. https://arxiv.org/pdf/2409.11340
We unify the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use. We welcome open-source enthusiasts…
Benchmarking large language models' complex reasoning ability with chain-of-thought prompting
A Unified Library for Parameter-Efficient and Modular Transfer Learning
Medusa: Simple Framework for Accelerating LLM Generation with Multiple Decoding Heads
[ACL 2024] An Easy-to-use Knowledge Editing Framework for LLMs.
Deep Learning Specialization by Andrew Ng, deeplearning.ai.
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
An automatic evaluator for instruction-following language models. Human-validated, high-quality, cheap, and fast.
Bottom-up attention model for image captioning and VQA, based on Faster R-CNN and Visual Genome