This is an asynchronous Rust client library for submitting job requests to RunPod serverless workers. Currently it only implements the worker-vllm backend.
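As background for what a worker job request carries, here is a minimal sketch of the JSON envelope RunPod serverless endpoints accept on their `/run` route. The `build_run_payload` helper and the `prompt` field are illustrative assumptions, not this crate's API; consult the crate's own request types for the real interface.

```rust
// Hypothetical helper: builds the {"input": ...} envelope that RunPod
// serverless endpoints expect. The "prompt" field mirrors a typical
// worker-vllm input, but the field names are an assumption here.
fn build_run_payload(prompt: &str) -> String {
    // Escape backslashes and quotes so the output stays valid JSON.
    let escaped = prompt.replace('\\', "\\\\").replace('"', "\\\"");
    format!(r#"{{"input":{{"prompt":"{escaped}"}}}}"#)
}

fn main() {
    let body = build_run_payload("Say hello");
    println!("{body}");
}
```

In practice the library would serialize a typed request struct rather than hand-building strings; this only shows the wire shape.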
```sh
cargo add runpod-client
```

(Eventually, once the crate goes live on crates.io.)
You can find example usage for the supported backends in `src/examples`. You can also use the example applications to test your RunPod endpoints. To compile the example applications, enable the `chat` and/or `diffuse` features.
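Assuming `chat` and `diffuse` are ordinary Cargo features (as the text above suggests), enabling them at build time might look like this; the `--example chat` target name is an assumption about how the examples are laid out:

```sh
# Build with both optional example features enabled
cargo build --features "chat diffuse"

# Or run a single example with only the feature it needs
# (assumed example name; check src/examples for the real targets)
cargo run --features chat --example chat
```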
- Add vLLM serverless support
- Add Stable Diffusion endpoints
- Additional serverless backends
- ?
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See `LICENSE.txt` for more information.