A wrapper that lets you use the reverse-engineered Python library `poe-api` as if it were the OpenAI API for ChatGPT. You can connect your favorite OpenAI-API-based apps to this proxy and enjoy the ChatGPT API for free!
- Clone this repository to your local machine:

```bash
git clone https://github.com/juzeon/poe-openai-proxy.git
cd poe-openai-proxy/
```
- Install the Python requirements for the `poe-api` library:

```bash
pip install -r requirements.txt
```
- Edit the configuration file according to the instructions in the comments:

```bash
vim config.toml
```

config.toml:

```toml
# The port number for the proxy service. The proxied OpenAI API endpoint will be: http://localhost:3700/v1/chat/completions
port = 3700

# A list of poe tokens. You can get them from the cookies on poe.com; they look like this: p-b=fdasac5a1dfa6%3D%3D
tokens = ["fdasac5a1dfa6%3D%3D","d84ef53ad5f132sa%3D%3D"]

# The gateway URL for the Python backend of poe-api. Don't change this unless you modify external/api.py
gateway = "http://127.0.0.1:5000"
```
- Start the Python backend for `poe-api`:

```bash
python external/api.py # Running on port 5000
```
- Build and start the Go backend:

```bash
go build
chmod +x poe-openai-proxy
./poe-openai-proxy
```
See the OpenAI API documentation for more details on how to use the ChatGPT API. Just replace `https://api.openai.com/v1/chat/completions` in your code with `http://localhost:3700/v1/chat/completions` and you're good to go.
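For example, a request to the proxy has the same shape as one to the official endpoint. The sketch below builds such a request with the standard library only; `build_chat_request` is a hypothetical helper, not part of this project, and the URL assumes the default port from the config above:

```python
import json


# Hypothetical helper: assembles the same request you would send to
# https://api.openai.com/v1/chat/completions, but aimed at the local proxy.
def build_chat_request(messages, stream=False,
                       base_url="http://localhost:3700"):
    url = f"{base_url}/v1/chat/completions"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({
        "model": "gpt-3.5-turbo",  # the proxy ignores this value
        "messages": messages,
        "stream": stream,
    })
    return url, headers, body


url, headers, body = build_chat_request(
    [{"role": "user", "content": "Hello!"}])
print(url)  # → http://localhost:3700/v1/chat/completions
# Send it with your HTTP client of choice, e.g.:
#   requests.post(url, headers=headers, data=body)
```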
Supported parameters:

| Parameter | Note |
|---|---|
| `model` | It doesn't matter what you pass here; poe will always use `gpt-3.5-turbo`. |
| `messages` | You can use this as in the official API, except for `name`. |
| `stream` | You can use this as in the official API. |
Other parameters will be ignored.
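When `stream` is enabled, responses arrive as OpenAI-style server-sent events: one `data:` line per chunk, ending with a `data: [DONE]` sentinel. A minimal client-side parser might look like this; the helper name and the exact chunk layout are assumptions based on the official API's streaming format, not something this proxy guarantees:

```python
import json


def parse_sse_events(lines):
    """Yield the JSON payload of each `data:` line, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and SSE comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        yield json.loads(payload)


# Example with the chunk shape used by the official streaming API:
sample = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
text = "".join(chunk["choices"][0]["delta"].get("content", "")
               for chunk in parse_sse_events(sample))
print(text)  # → Hello
```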