CodeRevolutionPlugins/GPT-3-Encoder-PHP
GPT-3-Encoder-Decoder-PHP

PHP BPE Text Encoder/Decoder for GPT-2 / GPT-3

About

GPT-2 and GPT-3 use byte pair encoding (BPE) to turn text into a sequence of integer tokens that are fed into the model. This is a PHP port of OpenAI's original Python encoder and decoder, which can be found here. The main source of inspiration for this encoder was the Node.js version, found here.
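To give an intuition for how byte pair encoding works, here is a minimal toy sketch in plain PHP. It is not this repository's actual implementation (the real encoder works on bytes and uses GPT-2's merge table); the `$ranks` vocabulary below is made up for illustration. BPE starts from single symbols and repeatedly merges the adjacent pair with the best (lowest) merge rank until no known pair remains:

```php
<?php

// Toy illustration of the BPE merge loop. NOT the repo's real encoder:
// the merge-rank table here is invented, and real GPT-2 BPE works on
// bytes, not characters.
function toy_bpe(string $word, array $ranks): array
{
    $symbols = str_split($word); // start from single characters

    while (count($symbols) > 1) {
        // Find the adjacent pair with the lowest (best) merge rank.
        $best = null;
        $bestRank = PHP_INT_MAX;
        for ($i = 0; $i < count($symbols) - 1; $i++) {
            $pair = $symbols[$i] . ' ' . $symbols[$i + 1];
            if (isset($ranks[$pair]) && $ranks[$pair] < $bestRank) {
                $bestRank = $ranks[$pair];
                $best = $i;
            }
        }

        if ($best === null) {
            break; // no more known merges apply
        }

        // Merge the winning pair into a single symbol.
        array_splice($symbols, $best, 2, [$symbols[$best] . $symbols[$best + 1]]);
    }

    return $symbols;
}

// With this toy vocabulary, "lower" merges l+o, then lo+w, then e+r.
$ranks = ['l o' => 0, 'lo w' => 1, 'e r' => 2];
print_r(toy_bpe('lower', $ranks));
```

In the real encoder each resulting symbol is then looked up in a vocabulary that maps it to an integer token ID.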

You can verify the results by comparing the output generated by this script with the official tokenizer page from OpenAI.

This encoder and decoder are used in the Aiomatic WordPress plugin to count the number of tokens a string will consume when sent to the OpenAI API. Check out more of my work on my website.

Usage

The mbstring PHP extension is required for this tool to work correctly when the tokenized text contains non-ASCII characters; see here for details on how to install mbstring.
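If you want to fail fast rather than silently get wrong token counts on multibyte input such as emoji, you can check for the extension at startup (a small defensive sketch, not part of the library itself):

```php
<?php

// Warn early if mbstring is missing; without it, non-ASCII text
// (emoji, accented characters, CJK, etc.) will not tokenize correctly.
if (!extension_loaded('mbstring')) {
    echo "Warning: the mbstring extension is not installed; " .
         "non-ASCII text will not tokenize correctly.\n";
}
```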

<?php

// Example text mixing ordinary words, an emoji, and a digit run
$prompt = "Many words map to one token, but some don't: indivisible. Unicode characters like emojis may be split into many tokens containing the underlying bytes: 🤚🏾 Sequences of characters commonly found next to each other may be grouped together: 1234567890";

// Encode the text into an array of integer BPE token IDs
$token_array = gpt_encode($prompt);

// The array length is the number of tokens the prompt will consume
$token_count = count($token_array);

// Decode the token IDs back into the original text
$original_text = gpt_decode($token_array);