Hi @nmcbride Thanks for submitting this feature request!
To anyone else interested in this feature, please add a 👍 to the original post at the top to signal that you want this feature, and subscribe if you'd like to be notified.
dannyneira changed the title from "Allow us to specify the LLM endpoint to use for local models or other online ones." to "Allow users to specify the LLM endpoint (APIs) for other models" on Jan 17, 2025
Discord username (optional)
No response
Describe the solution you'd like?
Allow us to specify the LLM endpoint to use for local models or other online ones.
I would like to point Warp to either a local LLM or an alternative online LLM.
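For context, most local runners (such as Ollama or the llama.cpp server) expose an OpenAI-compatible HTTP API, so a single configurable base URL would cover both local and alternative hosted models. A minimal sketch of what the client side could look like, assuming a hypothetical `base_url` setting (the names and endpoint here are illustrative, not Warp's actual configuration):

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request against an OpenAI-compatible /v1/chat/completions endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Swapping base_url is all it takes to target a local Ollama server
# instead of a hosted API (model name is an example).
req = build_chat_request("http://localhost:11434", "llama3", "Explain `tar -xzf`.")
```

Because the request shape is the same for local and hosted backends, only the endpoint (and possibly an API key header) needs to be user-configurable.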
Is your feature request related to a problem? Please describe.
No response
Additional context
No response
How important is this feature to you?
3 (Fairly important)