
Front End GPT-3-Encoder

Front-end JavaScript BPE encoder/decoder for GPT-2 / GPT-3.

About

The GPT family of models processes text using tokens, which are common sequences of characters found in text. The models understand the statistical relationships between these tokens, and excel at predicting the next token in a sequence of tokens.

GPT-2 and GPT-3 use byte pair encoding to turn text into a series of integers to feed into the model. This is a front-end JavaScript implementation of Latitude's Node.js implementation of OpenAI's original Python encoder/decoder, which can be found in the openai/gpt-2 repository.
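
To make "byte pair encoding" concrete, here is a minimal, self-contained toy sketch of the idea (not the library's actual merge table or vocabulary): BPE repeatedly replaces the most frequent adjacent pair of symbols with a new symbol, so common character sequences collapse into single integer tokens.

```js
// Toy BPE: start from raw bytes, then greedily merge the most frequent
// adjacent pair into a fresh token id, `numMerges` times.
function toyBpe(text, numMerges) {
  let seq = Array.from(new TextEncoder().encode(text));
  let nextId = 256; // ids 0-255 are reserved for raw byte values

  for (let m = 0; m < numMerges; m++) {
    // Count adjacent symbol pairs.
    const counts = new Map();
    for (let i = 0; i < seq.length - 1; i++) {
      const key = seq[i] + ',' + seq[i + 1];
      counts.set(key, (counts.get(key) || 0) + 1);
    }
    // Pick the most frequent pair; stop if nothing repeats.
    let best = null, bestCount = 1;
    for (const [key, count] of counts) {
      if (count > bestCount) { best = key; bestCount = count; }
    }
    if (best === null) break;
    // Replace every occurrence of that pair with the new token id.
    const [a, b] = best.split(',').map(Number);
    const merged = [];
    for (let i = 0; i < seq.length; i++) {
      if (i + 1 < seq.length && seq[i] === a && seq[i + 1] === b) {
        merged.push(nextId);
        i++; // skip the second symbol of the merged pair
      } else {
        merged.push(seq[i]);
      }
    }
    seq = merged;
    nextId++;
  }
  return seq;
}

console.log(toyBpe('aaabdaaabac', 3)); // [258, 100, 258, 97, 99] — 5 tokens from 11 bytes
```

The real encoder applies a fixed, pre-trained merge table (plus a byte-to-unicode mapping and regex pre-tokenization) rather than learning merges on the fly, but the compression principle is the same.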

Usage

TBA
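
Until usage docs land, here is a hedged sketch of how the encoder is typically called, assuming this port keeps the `encode`/`decode` interface of Latitude's Node.js package; the import path below is hypothetical and may differ in this repository.

```js
// Hypothetical import path; adjust to wherever this repository's module lives.
import { encode, decode } from './Encoder.js';

const text = 'Hello, world!';

// encode: string -> array of BPE token ids (integers).
const tokens = encode(text);
console.log(tokens);        // e.g. [15496, 11, 995, 0] with the GPT-2/GPT-3 vocabulary
console.log(tokens.length); // token count, useful for staying within model context limits

// decode: array of token ids -> original string.
console.log(decode(tokens) === text); // true: decoding inverts encoding
```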