Papers, code, and slides for my session at the live@Manning NLP conference, 2020: Deep Transfer Learning for Natural Language Processing.
The intent of this session is to journey through recent advancements in deep transfer learning for NLP by examining various state-of-the-art models and methodologies (short illustrative sketches follow the list). These include:
- Pre-trained embeddings for deep learning models (FastText with CNNs/Bidirectional LSTMs + Attention)
- Universal Embeddings (Sentence Encoders, NNLMs)
- Transformers (BERT, DistilBERT, etc.)
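
To give a flavor of the first two items, here is a minimal sketch (not the session notebooks) of plugging frozen pre-trained word vectors into a Keras BiLSTM classifier. The embedding matrix below is a hypothetical placeholder that would normally be built from FastText vectors aligned to your tokenizer's vocabulary, and the attention layer is omitted for brevity:

```python
import numpy as np
import tensorflow as tf

# Hypothetical placeholder: in practice, build this matrix from
# pre-trained FastText vectors keyed by your tokenizer's vocabulary.
vocab_size, embedding_dim = 20000, 300
embedding_matrix = np.random.rand(vocab_size, embedding_dim).astype("float32")

model = tf.keras.Sequential([
    # Frozen pre-trained embeddings: this is the transfer-learning step.
    tf.keras.layers.Embedding(
        vocab_size,
        embedding_dim,
        embeddings_initializer=tf.keras.initializers.Constant(embedding_matrix),
        trainable=False,
    ),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classifier head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```

Universal embeddings operate at the sentence level instead. Assuming `tensorflow_hub` is installed, the Universal Sentence Encoder can be loaded from TF Hub in a few lines:

```python
import tensorflow_hub as hub

# Loads the pre-trained Universal Sentence Encoder (downloaded on first use).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")
embeddings = embed(["Deep transfer learning for NLP.",
                    "Sentence-level embeddings in one call."])
print(embeddings.shape)  # (2, 512): one 512-dimensional vector per sentence
```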
We will also explore the power of some of these models, especially transformers, through a couple of hands-on code tutorials.
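
As a taste of the transformer tutorials, here is a minimal sketch (assuming the Hugging Face `transformers` library and a PyTorch or TensorFlow backend are installed) that runs a DistilBERT model fine-tuned for sentiment classification:

```python
from transformers import pipeline

# Downloads a DistilBERT model fine-tuned on SST-2 the first time it runs.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Transfer learning makes NLP models far easier to build."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```

This is the core transfer-learning pattern throughout the session: download pre-trained weights once, then reuse or fine-tune them on your task instead of training from scratch.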