how to run with local LLM #67

Open
viegoxu01 opened this issue Nov 19, 2024 · 2 comments

Comments

@viegoxu01

I have a vLLM proxy server running with my fine-tuned local LLM, and I have the URL for the vLLM proxy server. How can I use it within Knowledge Table in the same way as the OpenAI servers? Thanks.
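
For context, vLLM exposes an OpenAI-compatible HTTP API, so one possible approach is to point an OpenAI client's base_url at the vLLM server instead of api.openai.com. The sketch below is not Knowledge Table's actual configuration; the endpoint URL, API key, and model name are assumptions to be replaced with the values of your own vLLM deployment.

```python
# Minimal sketch: calling a vLLM OpenAI-compatible server with the standard
# openai Python client. URL, key, and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed vLLM proxy endpoint
    api_key="EMPTY",                      # vLLM does not validate the key by default
)

response = client.chat.completions.create(
    model="my-finetuned-model",           # assumed: the model name served by vLLM
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```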

@daboe01

daboe01 commented Nov 21, 2024

Ollama support would be fine.

@WR-CREATOR

> Ollama support would be fine.

Can you share your fork? I tried using Ollama, but it gives me a validation error when I run it.
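
For what it's worth, Ollama also exposes an OpenAI-compatible endpoint (at http://localhost:11434/v1 by default), so the same client-side pattern can apply. A minimal sketch, again assuming the standard openai client rather than Knowledge Table's own configuration, and using a hypothetical model name:

```python
# Minimal sketch: reaching a local Ollama server through its
# OpenAI-compatible endpoint with the standard openai Python client.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",                      # required by the client, unused by Ollama
)

response = client.chat.completions.create(
    model="llama3",                        # assumed: a model pulled locally, e.g. `ollama pull llama3`
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```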
