
LLama backend is broken #3198

Open
unhighghlow opened this issue Aug 7, 2024 · 1 comment
Labels: bug (Something isn't working), unconfirmed

Comments


unhighghlow commented Aug 7, 2024

LocalAI version:
latest-aio-gpu-nvidia-cuda-12

Environment, CPU architecture, OS, and Version:

$ uname -a
Linux server 6.1.0-21-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.1.90-1 (2024-05-03) x86_64 GNU/Linux

$ lsmod | grep nvidia
nvidia_uvm           1540096  0
nvidia_drm             77824  0
drm_kms_helper        208896  1 nvidia_drm
nvidia_modeset       1314816  2 nvidia_drm
video                  65536  1 nvidia_modeset
nvidia              56778752  19 nvidia_uvm,nvidia_modeset
drm                   614400  4 drm_kms_helper,nvidia,nvidia_drm

docker-compose.yml

Describe the bug

All models that use llama.cpp as the backend fail to produce a response.

To Reproduce

  1. Replicate my setup
  2. Chat with the pre-installed llava model from the web UI
  3. Observe that no response appears in the web UI
  4. Observe the errors in the logs

Expected behavior

I should have received a response.

Logs

Here's LocalAI running from start to finish (with me running llava from the web UI):
localai-log.txt

Additional context

I wiped /models and ran LocalAI once before recording the log.

From what I can see, the model loads successfully in llama.cpp, but LocalAI doesn't recognize this and tries a series of other backends, ultimately arriving at stablediffusion.

unhighghlow added the bug (Something isn't working) and unconfirmed labels on Aug 7, 2024

xxfogs commented Sep 28, 2024

Any updates?
