[Bug] Azure OpenAI o1-mini not supported? #5885
Comments
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Hey @YizhixiaoMoJun! I'm here to assist with bugs, questions, and contributions. Let's tackle this together while we wait for a human maintainer. The Azure OpenAI o1-mini model does not support the 'system' role. Instead, it requires any system-role message to be sent under the 'user' role.
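For illustration, here is a minimal TypeScript sketch of that kind of remapping. The `ChatMessage` type and `remapForO1` helper are hypothetical, not LobeChat's actual code; it just shows the idea of downgrading 'system' messages before the request reaches an o1-family deployment.

```ts
// Hypothetical helper, not LobeChat internals: o1-mini rejects 'system'
// messages, so remap them to the 'user' role before calling the deployment.
type Role = 'system' | 'user' | 'assistant';

interface ChatMessage {
  role: Role;
  content: string;
}

const remapForO1 = (messages: ChatMessage[]): ChatMessage[] =>
  messages.map((m) => (m.role === 'system' ? { ...m, role: 'user' } : m));

// Example: the leading system prompt becomes a plain user message.
const patched = remapForO1([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Hello' },
]);
console.log(patched.map((m) => m.role)); // ['user', 'user']
```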
As a workaround you can use litellm to create a proxy that behaves like OpenAI. That way LobeChat will think it is talking to OpenAI o1/o3 (mini) and it will work.
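For reference, a LiteLLM proxy config along these lines should do it. This is only a sketch: the deployment name, resource URL, and API version are placeholders you'd fill in for your own Azure setup, and `OPENAI_PROXY_URL` is assumed to be the LobeChat variable used to point at a custom OpenAI-compatible endpoint.

```yaml
# config.yaml for the LiteLLM proxy (placeholders in <...>)
model_list:
  - model_name: o1-mini                      # the OpenAI-style name LobeChat will request
    litellm_params:
      model: azure/<your-deployment-name>    # your actual Azure deployment, which may differ
      api_base: https://<your-resource>.openai.azure.com/
      api_key: os.environ/AZURE_API_KEY      # read from the environment
      api_version: <api-version-that-supports-o1>
```

Then run `litellm --config config.yaml` and point LobeChat's OpenAI endpoint override (e.g. `OPENAI_PROXY_URL`) at the proxy (recent LiteLLM versions listen on port 4000 by default), so the deployment-name/model-name mismatch is handled on the proxy side.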
@wolph could you try with the latest version? I think it should be fixed.
Unfortunately not. I think Lobe-Chat assumes that the model name and the deployment name are identical, which is not always the case, and in this case I cannot modify it. I'm not entirely sure what is currently happening: whenever I press generate/regenerate it just appears to be loading for a long time, but it never generates any results. The logs don't give any output either.
Version v1.60.4 has the same issue.
This issue is closed. If you have any questions, you can comment and reply.
🎉 This issue has been resolved in version 1.62.2 🎉 The release is available on GitHub. Your semantic-release bot 📦🚀
📦 Environment
Docker
📌 Version
v1.49.12
💻 Operating System
Ubuntu
🌐 Browser
Chrome
🐛 Bug Description
Connecting to the Azure OpenAI o1-mini model fails with this error, while other models (gpt-4o, gpt-4o-mini) work as expected.
The configuration looks like:
AZURE_MODEL_LIST: gpt-4o->gpt-4o=gpt-4o,gpt-4o-mini->gpt-4o-mini=gpt-4o-mini,o1-mini->o1-mini=o1-mini
📷 Recurrence Steps
No response
🚦 Expected Behavior
No response
📝 Additional Information
No response