- Cursor
- New York City
- amansanger.com
- @amanrsanger
Stars
- Convert any URL to an LLM-friendly input with a simple prefix: https://r.jina.ai/
- LLM training code for Databricks foundation models
- ⏰ 🔥 A TCP proxy to simulate network and system conditions for chaos and resiliency testing
- Fast, collaborative live terminal sharing over the web
- MII makes low-latency and high-throughput inference possible, powered by DeepSpeed.
- Minimal Python library to connect to LLMs (OpenAI, Anthropic, Google, Groq, Reka, Together, AI21, Cohere, Aleph Alpha, HuggingfaceHub), with a built-in model performance benchmark.
- Utils for streaming large files (S3, HDFS, gzip, bz2...)
- microsoft/Megatron-DeepSpeed (forked from NVIDIA/Megatron-LM): Ongoing research training transformer language models at scale, including BERT & GPT-2
- A PyTorch repo for data loading and utilities to be shared by the PyTorch domain libraries.
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- Segmentation models with pretrained backbones. Keras and TensorFlow Keras.
- Top-level domain name registry service on Google Cloud Platform
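The r.jina.ai entry above works by plain URL prefixing: prepend the reader prefix to any page URL and fetch the result with any HTTP client. A minimal sketch (the helper name is my own, not part of the service):

```python
# Hedged sketch of the r.jina.ai usage described above: the service turns a
# page into LLM-friendly text when its URL is prepended with the prefix.
# The function name `to_llm_friendly` is illustrative, not an official API.
def to_llm_friendly(url: str) -> str:
    """Return the r.jina.ai reader URL for a given page URL."""
    return "https://r.jina.ai/" + url

print(to_llm_friendly("https://example.com"))
# → https://r.jina.ai/https://example.com
```

Fetching the returned URL (e.g. with `urllib.request` or `curl`) yields the page content as plain text suitable for an LLM prompt.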