Simple proxy for the OpenAI API, deployable to Docker or cloud functions via a one-line command
🎉 SSE is now supported, so responses can be streamed back in real time
The following English was translated by GPT.
You can deploy ./app.js to any environment that supports Node.js 14+, such as cloud functions and edge computing platforms.
- Copy `app.js` and `package.json` to the target directory
- Install dependencies with `yarn install`
- Start the service with `node app.js`
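The steps above can be sketched as a shell session (assuming `app.js` and `package.json` are already in the current directory):

```shell
# Manual deployment sketch: run inside the directory containing app.js and package.json
yarn install   # install dependencies
node app.js    # start the proxy (listens on PORT, default 9000)
```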
```shell
docker run -p 9000:9000 easychen/ai.level06.com:latest
```
The proxy address is then http://${IP}:9000
Supported environment variables:

- PORT: service port
- PROXY_KEY: proxy access key, used to restrict access
- TIMEOUT: request timeout, default 5 seconds
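When deploying with Docker, these variables can be passed with `-e` flags. The variable names come from the list above; the key value below is a placeholder:

```shell
# Run the proxy with access restricted by a proxy key (placeholder value)
docker run -p 9000:9000 \
  -e PROXY_KEY=my_secret_key \
  easychen/ai.level06.com:latest
```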
- Change the domain name/IP (including port) in the original OpenAI request address (such as https://api.openai.com) to this proxy's domain name/IP
- If PROXY_KEY is set, append `:<PROXY_KEY>` to your OpenAI key; if it is not set, no change is required.
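In client code, this key-plus-proxy-key convention can be captured with a small helper. `buildApiKey` is a hypothetical name for illustration, not part of the proxy itself:

```javascript
// Hypothetical helper: append ':<PROXY_KEY>' to the OpenAI key only when a proxy key is set
function buildApiKey(openaiKey, proxyKey) {
  return proxyKey ? `${openaiKey}:${proxyKey}` : openaiKey;
}

// buildApiKey('sk-abc', 'secret') → 'sk-abc:secret'
// buildApiKey('sk-abc', '')       → 'sk-abc'
```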
- Only GET and POST interfaces are supported; file-related interfaces are not supported.
- ~~SSE is not currently supported, so the stream-related options need to be turned off.~~ Supported now.
Taking https://www.npmjs.com/package/chatgpt as an example:

```javascript
const chatApi = new gpt.ChatGPTAPI({
  apiKey: 'sk.....:<proxy_key_here>',
  apiBaseUrl: 'http://localhost:9001/v1', // Replace with the proxy's domain name/IP
});
```
- For SSE usage, refer to the related code in the chatgpt-api project
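The proxy relays OpenAI's SSE stream, in which each event arrives as a `data: ...` line. A minimal sketch of extracting those payloads from a received chunk (illustrative only, not the proxy's actual code):

```javascript
// Extract the payload of each 'data:' line from an SSE text chunk
function parseSSEChunk(chunk) {
  return chunk
    .split('\n')
    .filter((line) => line.startsWith('data: '))
    .map((line) => line.slice('data: '.length));
}

// parseSSEChunk('data: {"text":"hi"}\n\ndata: [DONE]\n') → ['{"text":"hi"}', '[DONE]']
```

Clients typically stop reading once the `[DONE]` sentinel payload appears.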