NLP_Paper_Pool

Table of Contents

Dataset

Open Domain QA

Multi-hop QA

Machine Translation

Machine Translation (Non-Autoregressive)

Machine Translation (Low-Resource)

Model Compression

Attention

Transformers

Training Tips for Transformers

Positional Encoding

Char Embedding

- November 2020: CharBERT: Character-aware Pre-trained Language Model

Long Text

Word Sense Disambiguation

Pretraining

Sequence Span Rewriting

Auxiliary Tasks

Special Tokens Across Layers

Sub-modules

Loss

Miscellaneous

Explanation

Rich Answer Type

Optimizer

Text Attribute Transfer

Layer Analysis

Pre-Finetuning

Need to update

  1. An Attention Free Transformer
