Stars
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Aligning pretrained language models with instruction data generated by themselves.
Pre-trained word vectors for 30+ languages.
NCRF++: a neural sequence labeling toolkit, easy to apply to any sequence labeling task (e.g. NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
YEDDA: A Lightweight Collaborative Text Span Annotation Tool. Code for ACL 2018 Best Demo Paper Nomination.
Code for the paper "Discourse Representation Structure Parsing".
Reproduction of "Attention Modeling for Targeted Sentiment".
Borrows OpenNMT code to implement a DRS parser for various languages.