Stars
Aligning pretrained language models with instruction data generated by themselves.
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Borrows OpenNMT code to implement a DRS parser for various languages.
Pre-trained word vectors of 30+ languages
Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Code for the paper "Discourse Representation Structure Parsing".
NCRF++, a Neural Sequence Labeling Toolkit. Easy to use for any sequence labeling task (e.g. NER, POS tagging, segmentation). It includes character LSTM/CNN, word LSTM/CNN, and softmax/CRF components.
Reproduces "Attention Modeling for Targeted Sentiment".
YEDDA: A Lightweight Collaborative Text Span Annotation Tool. Code for ACL 2018 Best Demo Paper Nomination.