Nanjing University
iamzxs.com
Stars
vineyard (v6d): an in-memory immutable data manager. (Project under CNCF, TAG-Storage)
Bottom-up attention model for image captioning and VQA, based on Faster R-CNN and Visual Genome
Code for ICLR 2020 paper "VL-BERT: Pre-training of Generic Visual-Linguistic Representations".
pycorrector is a toolkit for text error correction. Applies models such as Kenlm, T5, MacBERT, ChatGLM3, and Qwen2.5 to correction tasks; works out of the box.
Chinese GPT2: pre-training and fine-tuning framework for text generation
Chinese Language Understanding Evaluation Benchmark: datasets, baselines, pre-trained models, corpus, and leaderboard
A Lite BERT for Self-supervised Learning of Language Representations; large-scale pre-trained Chinese ALBERT models
Tensorflow implementation of attention mechanism for text classification tasks.
🔥 TensorFlow code for the technical report "YOLOv3: An Incremental Improvement"
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
A Keras implementation of text classification using character-level convolutional neural networks
Pre-Training with Whole Word Masking for Chinese BERT (the Chinese BERT-wwm series of models)
MovieLens-based recommender system: a movie recommendation system trained on the MovieLens dataset.
TensorFlow code and pre-trained models for BERT
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Example TensorFlow code and the Caicloud TensorFlow-as-a-Service dev environment.