Aider version: 0.72.3
Python version: 3.11.10
Platform: Windows-10-10.0.26100-SP0
Python implementation: CPython
Virtual environment: No
OS: Windows 10 (64bit)
Git version: git version 2.46.0.windows.1
Aider v0.72.3
Model: deepseek-reasoner with diff edit format, infinite output
Git repo: .git with 269 files
Repo-map: using 4096 tokens, auto refresh
litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed
model=deepseek-reasoner
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: https://docs.litellm.ai/docs/providers
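For context, litellm routes requests based on a provider prefix in the model string, so a bare name like `deepseek-reasoner` cannot be matched to a provider and raises this error. A minimal sketch of the difference, assuming `DEEPSEEK_API_KEY` is set in the environment (not a confirmed fix for the aider alias, just the distinction litellm's routing expects):

```python
import litellm

# Bare model name: litellm cannot infer which provider to call and
# raises BadRequestError("LLM Provider NOT provided ...").
# litellm.completion(model="deepseek-reasoner",
#                    messages=[{"role": "user", "content": "hi"}])

# Prefixed form: the "deepseek/" prefix tells litellm to route the
# request to the DeepSeek provider endpoint.
response = litellm.completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "hi"}],
)
print(response.choices[0].message.content)
```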
I have the same issue: there is no output when I use the --model r1 argument and try to prompt something. However, the web-based chat (deepseek.com) has issues too.
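A possible workaround, assuming the alias resolution is what drops the provider prefix, is to pass the fully qualified name directly, e.g. `aider --model deepseek/deepseek-reasoner`, so the prefixed model string reaches litellm intact.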