Fix typo in BERT notebook
Signed-off-by: Rajeev Rao <[email protected]>
rajeevsrao committed Jul 8, 2021
1 parent 718a92f commit eeca567
Showing 1 changed file with 1 addition and 1 deletion.
demo/BERT/notebooks/BERT-TRT-FP16.ipynb
@@ -33,7 +33,7 @@
"# BERT QA Inference with TensorRT FP16\n",
"\n",
"\n",
"Bidirectional Embedding Representations from Transformers ([BERT](https://arxiv.org/abs/1810.04805)) is a method of pre-training language representations which obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks. \n",
"Bidirectional Encoder Representations from Transformers ([BERT](https://arxiv.org/abs/1810.04805)) is a method of pre-training language representations which obtains state-of-the-art results on a wide array of Natural Language Processing (NLP) tasks. \n",
"\n",
"BERT provided a leap in accuracy for NLU tasks that brought high-quality language-based services within the reach of companies across many industries. To use the model in production, you need to consider factors such as latency, in addition to accuracy, which influences end user satisfaction with a service. BERT requires significant compute during inference due to its 12/24-layer stacked multi-head attention network. This has posed a challenge for companies to deploy BERT as part of real-time applications until now.\n",
"\n",
