Simple proxy for the OpenAI API via a one-line Docker command
The following English was translated by GPT.
docker run -p 9000:9000 easychen/ai.level06.com:latest
The proxy address is http://${IP}:9000.
Configuration options:
- PORT: Service port.
- PROXY_KEY: Access key used to restrict access to the proxy.
- TIMEOUT: Request timeout; defaults to 5 seconds.
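As a rough sketch, assuming these options are read as container environment variables (the usual Docker convention; the values below are placeholders), a customized run might look like:

```
# Hypothetical example: expose the proxy on port 9000, require a proxy key,
# and allow slower upstream responses (timeout in seconds).
docker run -d \
  -p 9000:9000 \
  -e PORT=9000 \
  -e PROXY_KEY=my_proxy_key \
  -e TIMEOUT=30 \
  easychen/ai.level06.com:latest
```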
- Change the OpenAI request address (https://api.openai.com) to the address of this proxy (without a trailing slash).
- If PROXY_KEY is set, append `:<PROXY_KEY>` to your OpenAI key, as shown in the example after this list. If it is not set, no change is needed.
- Only GET and POST endpoints are supported; file-related endpoints are not.
- SSE is not supported yet, so streaming options must be turned off.
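For illustration, a plain curl request through the proxy might look like the following. This is a sketch assuming the proxy forwards request paths unchanged to api.openai.com; `sk-xxxx` and `my_proxy_key` are placeholders.

```
# Point the request at the proxy instead of https://api.openai.com,
# and append :<PROXY_KEY> to the OpenAI key when PROXY_KEY is set.
curl http://localhost:9000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-xxxx:my_proxy_key" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": false
  }'
```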
Take https://www.npmjs.com/package/chatgpt as an example:
import * as gpt from 'chatgpt';

const chatApi = new gpt.ChatGPTAPI({
  apiKey: 'sk.....:<proxy_key here>', // OpenAI key with :<PROXY_KEY> appended if PROXY_KEY is set
  apiBaseUrl: 'http://localhost:9000', // Pass the proxy address from above
});

const text = 'Hello'; // example prompt
const ret = await chatApi.sendMessage(text, { onProgress: null }); // Do not pass an onProgress callback: it enables streaming (SSE), which this proxy does not support.