shreemayisonti7/nlp_project

Fine-tuning BERT with Distilled Data for Semantic Similarity, Textual Entailment and Word Sense Disambiguation

In this project, we:

  1. Create distilled data for the QQP, RTE and WiC (GLUE/SuperGLUE) tasks
  2. Fine-tune BERT on the distilled samples
  3. Compare the resulting models with models fine-tuned on the full datasets, in terms of accuracy and compute time.
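The repository does not state how the distilled samples are produced, so the following is only a minimal sketch of one common coreset-style approach: run k-means over sentence embeddings and keep the real example nearest each centroid as the "distilled" subset. The function name `select_distilled_subset` and the toy embeddings are hypothetical, not taken from this project.

```python
import numpy as np

def select_distilled_subset(embeddings, k, n_iters=10, seed=0):
    """Pick up to k representative example indices by running a few
    Lloyd iterations of k-means and mapping each centroid back to
    its nearest real example (a cheap coreset-selection sketch)."""
    rng = np.random.default_rng(seed)
    n = len(embeddings)
    # Initialise centroids from k distinct random examples (copy, not view)
    centers = embeddings[rng.choice(n, size=k, replace=False)].copy()
    for _ in range(n_iters):
        # Assign each example to its nearest centroid
        dists = np.linalg.norm(embeddings[:, None] - centers[None], axis=-1)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned examples
        for j in range(k):
            members = embeddings[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    # Replace each centroid with the index of the closest real example
    dists = np.linalg.norm(embeddings[:, None] - centers[None], axis=-1)
    return np.unique(dists.argmin(axis=0))

# Toy usage: 100 fake 16-d sentence embeddings, keep 5 representatives
emb = np.random.default_rng(1).normal(size=(100, 16))
idx = select_distilled_subset(emb, k=5)
```

The selected indices would then be used to slice the original QQP/RTE/WiC training sets before fine-tuning, which is what makes the compute-time comparison in step 3 meaningful.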
