Front End JavaScript BPE Encoder/Decoder for GPT-2 / GPT-3.
The GPT family of models processes text using tokens, which are common sequences of characters found in text. The models learn the statistical relationships between these tokens and excel at predicting the next token in a sequence.
GPT-2 and GPT-3 use byte pair encoding (BPE) to turn text into a sequence of integers to feed into the model. This is a front end JavaScript implementation of Latitude's Node.js implementation of OpenAI's original Python encoder/decoder, which can be found here:
TBA
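
A round trip through the encoder might look like the following. This is a minimal usage sketch: it assumes the module exposes `encode` and `decode` as named exports, and the import path and exact names are illustrative, not confirmed by this README.

```js
// Hypothetical import path and export names for illustration only.
import { encode, decode } from './encoder.js';

const text = 'Hello world';

// encode: string -> array of BPE token IDs (integers)
const tokens = encode(text);
console.log(tokens); // e.g. [15496, 995] under the GPT-2/GPT-3 vocabulary

// decode: array of token IDs -> the original string
console.log(decode(tokens)); // 'Hello world'

// Token count is useful because model context limits are measured in tokens.
console.log(`${tokens.length} tokens`);
```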