Stars
🚀 JavaScript diagramming library that uses SVG and HTML for rendering.
Get up and running with Llama 3.3, Phi 4, Gemma 2, and other large language models.
Repository for “PlanRAG: A Plan-then-Retrieval Augmented Generation for Generative Large Language Models as Decision Makers”, NAACL 2024
LaTeX Thesis Template for the University of Chinese Academy of Sciences
Collection of papers for scalable automated alignment.
LaTeX Proposal Template for the University of Chinese Academy of Sciences
An Easy-to-use, Scalable and High-performance RLHF Framework (70B+ PPO Full Tuning & Iterative DPO & LoRA & RingAttention & RFT)
Implementation of "Describe, Explain, Plan and Select: Interactive Planning with Large Language Models Enables Open-World Multi-Task Agents"
A simple UI for exploring the American Diabetes Association (ADA) 2018 Clinical Practice Guidelines
NexusRaven-13B, a new SOTA open-source LLM for function calling. This repo contains everything needed to reproduce our evaluation of NexusRaven-13B and the baselines.
A comprehensive list of papers using large language/multi-modal models for Robotics/RL, including papers, code, and related websites
A List of Awesome Swift Playgrounds
Data and code accompanying the paper "Reasoning about Goals, Steps, and Temporal Ordering with WikiHow"
A Multi-Turn Dialogue Corpus based on Alpaca Instructions
JARVIS-1: Open-world Multi-task Agents with Memory-Augmented Multimodal Language Models
👑 Easy-to-use and powerful NLP and LLM library with 🤗 Awesome model zoo, supporting a wide range of NLP tasks from research to industrial applications, including 🗂 Text Classification, 🔍 Neural Search…
A PyTorch implementation of Paddle's UTC (Universal Text Classification) for general-purpose text classification
[ICML 2024] Official repository for "Language Agent Tree Search Unifies Reasoning Acting and Planning in Language Models"
JsonTuning: Towards Generalizable, Robust, and Controllable Instruction Tuning
A high-throughput and memory-efficient inference and serving engine for LLMs
PyTorch code for "A Graph-Based Neural Model for End-to-End Frame Semantic Parsing" (EMNLP 2021)
A large-scale 7B pretrained language model developed by BaiChuan-Inc.
A tool for extracting plain text from Wikipedia dumps