Change endpoint port to the default llama-server port of 8080? #49
Comments
I think port 8080 is overloaded because many applications use it by default. The default port 8012 works out of the box using the instructions from the README.
I see, would it make sense to add https://github.com/ggml-org/llama.vim/blob/master/doc/llama.txt#L44C1-L76C43 to the README.md?
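For reference, the linked section of doc/llama.txt documents the plugin's configuration. A minimal sketch of the kind of override it describes is below; the `g:llama_config` dictionary and its `endpoint` key are taken from the plugin's docs, but treat the exact value as an assumption and match the port to whatever `--port` llama-server was started with.

```vim
" In your vimrc / init.vim: point llama.vim at the server's /infill endpoint.
" The port here must match the one llama-server is actually listening on.
let g:llama_config = {
    \ 'endpoint': 'http://127.0.0.1:8012/infill',
    \ }
```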
I ran into this and figured it out after a few minutes of debugging, but the main issue for me was that when the host is not reachable, the only error message Neovim showed me was a cryptic "job failed exit code 7". Perhaps there is a better way to let users know that the host is not reachable; that would solve the usability issue.
Yes, there should be a better indication. I tried to add one at some point, but it wasn't working correctly, so I disabled it. Btw, how did you mess up the ports? If you use the instructions from the README, it should work correctly.
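Not part of the plugin, just a sketch of what a friendlier indication could look like: probe the configured endpoint with curl and translate its exit code 7 ("failed to connect to host") into a readable message. The function name and message text here are hypothetical.

```vim
" Hypothetical helper: warn early when the configured endpoint is unreachable,
" instead of surfacing a raw 'job failed exit code 7' later on.
function! s:llama_check_endpoint(endpoint) abort
    " Quick probe; we only care whether the connection succeeds.
    call system('curl --silent --output /dev/null --max-time 1 ' . shellescape(a:endpoint))
    if v:shell_error == 7
        " curl exit code 7 means it could not connect to the host
        echohl WarningMsg
        echomsg 'llama.vim: cannot reach ' . a:endpoint . ' - is llama-server running on that host/port?'
        echohl None
        return v:false
    endif
    return v:true
endfunction

call s:llama_check_endpoint('http://127.0.0.1:8012/infill')
```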
I read the instructions and the endpoint in the plugin, but then went 🦧 and typed in port 8013 instead of 8012.
I think this would help make it work out of the box. Currently the plugin's endpoint is set to port 8012, but llama-server's default is 8080:
https://github.com/ggml-org/llama.vim/blob/master/autoload/llama.vim#L38C45-L38C49
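Concretely, the proposal amounts to changing the port in that default endpoint string, roughly as sketched below; the surrounding variable name is illustrative, not the plugin's actual code.

```vim
" Illustrative only: a default endpoint pointing at llama-server's own
" default port (8080) instead of the current 8012.
let s:default_config = {
    \ 'endpoint': 'http://127.0.0.1:8080/infill',
    \ }
```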