Issues: OpenCSGs/llm-inference

Issues list (all closed)

- Upgrade ray 2.20.0 (enhancement): #130 by SeanHH86, closed May 7, 2024
- enable reset generate config on fly (enhancement): #104 by depenglee1707, closed Apr 23, 2024
- API server startup slow (bug): #97 by SeanHH86, closed May 7, 2024
- vllm cannot address "runtime_env": #87 by depenglee1707, closed Apr 16, 2024
- Model streaming API (enhancement): #67 by SeanHH86, closed Apr 16, 2024
- Enhance inference API to support OpenAI style (enhancement): #52 by SeanHH86, closed May 7, 2024
- Inference throw timeout sometime: #45 by SeanHH86, closed Mar 25, 2024
- Upgrade ray to 2.9.3: #23 by SeanHH86, closed Mar 28, 2024