Use Coze on your favorite OpenAI client.
This project converts the Coze API to the OpenAI API format, giving you access to Coze's LLMs, knowledge base, plugins, and workflows within your preferred OpenAI clients.
- Convert the Coze API to the OpenAI API format
- Support streaming and blocking (non-streaming) responses
- Support the Chatbot API on Coze
Note: Vercel's serverless functions have a 10-second timeout limit.
- Set the environment variable in the `.env` file

  ```
  BOT_ID=
  ```

- Install dependencies

  ```
  npm install
  ```

- Run the project

  ```
  npm start
  ```
Once the server is running, send requests in the standard OpenAI chat completions format:

```javascript
const response = await fetch('http://localhost:3000/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer YOUR_COZE_API_KEY',
  },
  body: JSON.stringify({
    model: 'Coze',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'Hello, how are you?' },
    ],
  }),
});

const data = await response.json();
console.log(data);
```
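Because the endpoint follows the OpenAI chat completions format, an OpenAI-compatible client should work as well. Below is a minimal sketch using the official `openai` Node package (v4+), assuming the server is running on port 3000 as above and that streaming is enabled with the standard `stream: true` parameter; the package choice and key placeholder are illustrative, not part of this project.

```javascript
import OpenAI from 'openai';

// Point the client at the local proxy instead of api.openai.com
const client = new OpenAI({
  apiKey: 'YOUR_COZE_API_KEY',
  baseURL: 'http://localhost:3000/v1',
});

// Request a streaming completion and print tokens as they arrive
const stream = await client.chat.completions.create({
  model: 'Coze',
  messages: [{ role: 'user', content: 'Hello, how are you?' }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}
```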
This project supports additional configuration through environment variables:

| Environment Variable | Required | Description | Example |
| --- | --- | --- | --- |
| BOT_ID | Yes | The ID of the bot. Obtain it from the Develop page URL of your bot in Coze; the number after the bot parameter in the URL is the bot ID. | 73428668***** |
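For example, using the masked sample ID from the table above (replace it with your own bot's ID), the `.env` file would contain:

```
BOT_ID=73428668*****
```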
Coming Soon
- Image support
- Audio-to-text
- Text-to-audio
- Docker support
- Workflow Bot
- Variables support
Available Now
- Continuous dialogue
- Zeabur & Vercel deployment
- Streaming & Blocking
- Plugins on Coze
Feel free to reach out with any questions or feedback.
This project is licensed under the MIT License.