- SNLP01: Introduction
- SNLP02: Preliminaries
- SNLP03: Formal Languages and Automata
- SNLP04: Corpora and Language Knowledge Bases
- SNLP05: Language Models
- SNLP06: Probabilistic Graphical Models
- SNLP07: Word Segmentation, Named Entity Recognition, and Part-of-Speech Tagging
- SNLP08: Overview of Syntactic Parsing, Semantic Analysis, Text Classification, and Sentiment Classification
- Lecture01: Introduction and Word Vectors
- Lecture02: Word Vectors and Word Senses
- Lecture03: Neural Networks
- Lecture04: Backpropagation
- Lecture05: Dependency Parsing
- Lecture06: Language Models and RNNs
- Lecture07: Vanishing Gradients, Fancy RNNs
- Lecture08: Translation, Seq2Seq, Attention
- Lecture09: Practical Tips for Projects
- Lecture10: Question Answering
- Lecture11: Convolutional Networks for NLP
- Lecture12: Subword Models
- Lecture13: Contextual Word Embeddings
- Lecture14: Transformers and Self-Attention
- Lecture15: Natural Language Generation
- Lecture16: Coreference Resolution
- Lecture17: Multitask Learning
- Lecture18: Constituency Parsing, TreeRNNs
- Lecture19: Bias in AI
- Lecture20: Future of NLP + Deep Learning
- A Primer on Neural Network Models for Natural Language Processing
- Attention Is All You Need
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Deep contextualized word representations
- Distributed Representations of Words and Phrases and their Compositionality
- Neural Architectures for Named Entity Recognition