add README about function calling
JianxinMa authored Feb 6, 2024
1 parent 221b62a commit 025c528
Showing 1 changed file (README.md) with 2 additions and 0 deletions.
@@ -121,6 +121,8 @@ Clone [`llamafile`](https://github.com/Mozilla-Ocho/llamafile), run source install
## Deployment
Qwen1.5 is now supported by multiple inference frameworks. Here we demonstrate the usage of `vLLM` and `SGLang`.

> NOTE: Neither the vLLM nor SGLang APIs currently offer built-in support for **function calling**. If you require function calling capabilities, please refer to the **[Qwen-Agent](https://github.com/QwenLM/Qwen-Agent)** project, which provides a wrapper around these APIs to support function calling.
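The wrapper pattern that note describes can be sketched in a few lines. This is an illustrative, self-contained example of how function calling is typically layered on top of a plain chat-completion API — describe the tools to the model, then parse a structured call out of its reply and dispatch to the matching function. All names here (`TOOLS`, `TOOL_SPECS`, `dispatch`) are hypothetical and are not Qwen-Agent's actual interface.

```python
import json

# Hypothetical tool registry: maps tool names to callables.
# (Illustrative only; Qwen-Agent's real API differs.)
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

# OpenAI-style tool descriptions that a wrapper would inject into the prompt
# so the model knows what it may call.
TOOL_SPECS = [
    {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

def dispatch(model_reply: str) -> str:
    """Parse a JSON function call emitted by the model and execute it."""
    call = json.loads(model_reply)
    fn = TOOLS[call["name"]]
    return fn(**call["arguments"])

# A reply the model might produce after seeing TOOL_SPECS in its prompt:
reply = '{"name": "get_weather", "arguments": {"city": "Beijing"}}'
print(dispatch(reply))
```

In practice the wrapper also handles malformed JSON, feeds the tool's return value back to the model for a final answer, and loops until the model stops requesting calls; this sketch shows only the parse-and-dispatch core.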
### vLLM
We advise you to use `vLLM>=0.3.0` to build an OpenAI-compatible API service. Start the server with a chat model, e.g. `Qwen1.5-7B-Chat`:
```shell
python -m vllm.entrypoints.openai.api_server --model Qwen/Qwen1.5-7B-Chat
```
