Starred repositories
Large-scale open domain KNOwledge grounded conVERsation system based on PaddlePaddle
AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data…
Japanese text normalizer for mecab-neologd
Code for evaluating Japanese pretrained models provided by NTT Ltd.
utanaka2000 / fairseq
Forked from facebookresearch/fairseq. Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Repository to research & share machine learning articles.
RaNNC is an automatic parallelization middleware used to train very large-scale neural networks.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Full description can be found here: https://discuss.huggingface.co/t/pretrain-gpt-neo-for-open-source-github-copilot-model/7678?u=ncoop57
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Rust bindings for the C++ API of PyTorch.
Well-tested & multi-language evaluation framework for text summarization.
MoverScore: Text Generation Evaluating with Contextualized Embeddings and Earth Mover Distance
Repository for the data analysis study advent calendar (https://adventar.org/calendars/2631)
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Unsupervised text tokenizer for Neural Network-based text generation.
This repository demonstrates training T5 transformers using TensorFlow 2.
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
GPT-2 based neural conversational model with TensorFlow.
Transformer Chatbot in TensorFlow 2 with TPU support.
A Japanese NLP Library using spaCy as framework based on Universal Dependencies