Stars
Understanding Different Design Choices in Training Large Time Series Models
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
[ICLR 2024] DNABERT-2: Efficient Foundation Model and Benchmark for Multi-Species Genome
Large Concept Models: Language modeling in a sentence representation space
Bringing BERT into modernity via both architecture changes and scaling
A generative world for general-purpose robotics & embodied AI learning.
Dataset and modelling infrastructure for "event streams": sequences of continuous-time, multivariate events with complex internal dependencies.
A simple set of MEDS polars-based ETL and transformation functions
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Simple and efficient PyTorch-native transformer text generation in <1000 LOC of Python.
KAG is a logical form-guided reasoning and retrieval framework based on OpenSPG engine and LLMs. It is used to build logical reasoning and factual Q&A solutions for professional domain knowledge ba…
Let your Claude think
Codebase for reproducing the experiments of the semantic uncertainty paper (short-phrase and sentence-length experiments).
The simplest, fastest repository for training/finetuning medium-sized GPTs.
🚀🚀 Train a 26M-parameter GPT completely from scratch in just 3 hours! 🌏
Medical Graph RAG: Graph RAG for Medical Data
SuperPrompt is an attempt to engineer prompts that might help us understand AI agents.
[Nature Reviews Bioengineering🔥] Application of Large Language Models in Medicine. A curated list of practical guide resources of Medical LLMs (Medical LLMs Tree, Tables, and Papers)
A Python toolkit/library for reality-centric machine/deep learning and data mining on partially-observed time series, including SOTA neural network models for scientific analysis tasks of imputatio…
A curated collection of open-source Chinese large language models, focusing on smaller models that can be privately deployed at low training cost, covering base models, domain-specific fine-tuning and applications, datasets, and tutorials.
A modular graph-based Retrieval-Augmented Generation (RAG) system
Code for 'LLM2Vec: Large Language Models Are Secretly Powerful Text Encoders'