Stars
Fay is an agent framework that connects digital humans (2.5D, 3D, mobile, PC, web) or large language models (OpenAI-compatible, DeepSeek) to business systems.
A latent text-to-image diffusion model
The RedPajama-Data repository contains code for preparing large datasets for training large language models.
A toolkit to help optimize large ONNX models
Making large AI models cheaper, faster and more accessible
ChatGLM-6B: An Open Bilingual Dialogue Language Model
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
micronet, a model compression and deployment library. Compression: 1. quantization: quantization-aware training (QAT), high-bit (>2b) (DoReFa / Quantization and Training of Neural Networks for Efficient Integer-…
Large Scale Chinese Corpus for NLP
The Learning Interpretability Tool: Interactively analyze ML models to understand their behavior in an extensible and framework agnostic interface.
Python code for Deep Learning (the "flower book"): mathematical derivations, analysis of underlying principles, and source-level implementations
Reformer, the efficient Transformer, in Pytorch
Basic edition of a supply-chain middle-platform system, integrating retail management, e-commerce, supply chain management, financial management, fleet management, warehouse management, personnel management, product management, order management, membership management, chain-store management, and franchise management. Front end in React/Ant Design; back end in Java Spring with an in-house open-source framework. Fully supports MySQL and PostgreSQL, as well as the Chinese domestic database GBase 8s; accessed via REST interfaces, with front end and back end fully decoupled.
Companion code to my O'Reilly book "Flask Web Development", second edition.
Compression of NMT transformer model with tensor methods
State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.
An open-source neural machine translation toolkit developed by Tsinghua Natural Language Processing Group
🚀 State-of-the-art parsers for natural language.