Issues: mudler/LocalAI
Add the new Multi-Modal model of mistral AI: pixtral-12b
Labels: enhancement, roadmap
#3535 opened Sep 12, 2024 by SuperPat45
Confusing finish_reason when using max_tokens property in 'v1/chat/completions' endpoint
Labels: bug, confirmed
#3533 opened Sep 10, 2024 by daJuels
Cannot select models from dropdown
Labels: bug, unconfirmed
#3493 opened Sep 7, 2024 by jwaresoft
Identify model by type (tts, text, ...)
Labels: enhancement, roadmap
#3488 opened Sep 6, 2024 by ecyht2
gpu_layers is not effective
Labels: bug, unconfirmed
#3479 opened Sep 3, 2024 by msameer
Flux GGUF (Replicate, Document, Add to Model)
Labels: bug, unconfirmed
#3447 opened Sep 1, 2024 by sfxworks
Intel ARC GPU - llama_model_load: can not find preferred GPU platform
Labels: bug, unconfirmed
#3437 opened Aug 30, 2024 by Xav-v
Completion endpoint does not count tokens when using vLLM backend
Labels: area/backends, area/vllm, bug, python, roadmap
#3436 opened Aug 30, 2024 by ephraimrothschild
gpu + transformers-musicgen
Labels: enhancement
#3420 opened Aug 28, 2024 by dave-gray101
Can't build LocalAI with llama.cpp with CUDA
Labels: bug, unconfirmed
#3418 opened Aug 28, 2024 by dimazig
Only use 4 CPU threads in P2P worker cluster
Labels: bug, unconfirmed
#3410 opened Aug 26, 2024 by titogrima
intel igpu not working
Labels: bug, unconfirmed
#3382 opened Aug 26, 2024 by maxvaneck
change chat colors
Labels: enhancement, roadmap, ux
#3381 opened Aug 26, 2024 by maxvaneck
Ability to get a list of loaded models and unload a model by request
Labels: enhancement
#3378 opened Aug 25, 2024 by Nyralei
integrate whisperX
Labels: enhancement, roadmap
#3375 opened Aug 25, 2024 by hlzhangxt
whisper-diarization
Labels: enhancement, roadmap
#3374 opened Aug 25, 2024 by hlzhangxt
Error message for custom embedding model: 'NoneType' object has no attribute 'tokenize'
Labels: bug, unconfirmed
#3369 opened Aug 24, 2024 by Ccccx
Can't start LocalAI (with REBUILD) on Xeon X5570 - Unwanted AVX dependency?
Labels: bug, unconfirmed
#3367 opened Aug 24, 2024 by chris-hatton
Coqui voice cloning support
Labels: enhancement, roadmap
#3347 opened Aug 20, 2024 by SlackinJack
generate image doesn't work.
Labels: bug, unconfirmed
#3261 opened Aug 18, 2024 by mac01101101
Allow Setting original_config_file for Diffusers Backend
Labels: diffusers, enhancement, roadmap
#3250 opened Aug 15, 2024 by thiner
Faster Whisper as an additional alternative to Whisper.cpp
Labels: enhancement, roadmap
#3219 opened Aug 11, 2024 by Nyralei
Automatically sync model folder
Labels: area/p2p, enhancement, roadmap
#3216 opened Aug 11, 2024 by mudler
Load by default stablediffusion instead diffusers not working
Labels: bug, unconfirmed
#3206 opened Aug 9, 2024 by sestren
LLama backend is broken
Labels: bug, unconfirmed
#3198 opened Aug 7, 2024 by unhighghlow