This repository is a fork of openai/swarm that adds support for LiteLLM, enabling integration with 100+ Large Language Models.
Note: Unlike the original openai/swarm project, which is explicitly experimental, we are successfully using this fork in production. Feel free to try it in your own production systems!
- All original features from openai/swarm
- Extended LLM support through LiteLLM integration
- Compatible with 100+ LLM providers, including:
  - OpenAI
  - Anthropic
  - Azure
  - Many more through LiteLLM's provider ecosystem
- Clone this repository
- Install dependencies
- Configure your LLM provider credentials
- Run your swarm applications with any supported LLM
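As a concrete sketch, the steps above might look like the following (the repository URL and the application entry point are placeholders; substitute your own):

```shell
# Clone this fork (URL is a placeholder for wherever you host it)
git clone https://github.com/<your-org>/swarm.git
cd swarm

# Install the package and LiteLLM
pip install -e .
pip install litellm

# Configure provider credentials, e.g. for Anthropic
export ANTHROPIC_API_KEY="sk-ant-..."

# Run your swarm application (your_app.py is a placeholder)
python your_app.py
```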
You can easily use different LLM providers by specifying the model name in your Agent configuration. For example, to use Claude:
```python
from swarm import Swarm, Agent as SwarmAgent

# Initialize an agent with Claude
agent = SwarmAgent(
    name="info_collector",
    instructions="Your instructions here",
    functions=[your_functions],  # your Python callables
    model="claude-3-5-haiku-20241022",  # specify a Claude model
)
```
Make sure to:
- Set your Anthropic API key in your environment (`ANTHROPIC_API_KEY`)
- Install LiteLLM: `pip install litellm`
- Configure other provider-specific settings as needed
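For local experiments, you can also set the key from within Python before constructing the agent. A minimal sketch (the key value below is a placeholder, not a real credential):

```python
import os

# Set the key for this process only if it isn't already exported in the
# shell; LiteLLM reads ANTHROPIC_API_KEY from the environment at call time.
os.environ.setdefault("ANTHROPIC_API_KEY", "sk-ant-placeholder")

print("ANTHROPIC_API_KEY" in os.environ)  # True
```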
For other providers, simply use their respective model names as supported by LiteLLM.
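LiteLLM generally routes on the model string: many well-known model names (like the Claude model above) work bare, while provider-specific routing uses a `<provider>/<model>` prefix, e.g. `azure/<deployment-name>`. A tiny illustrative helper (the function and the model names are our own examples, not part of swarm or LiteLLM):

```python
def litellm_model_string(provider: str, model: str) -> str:
    """Build a LiteLLM-style model string for an Agent's `model` field."""
    # Bare names route to OpenAI; other providers use a "<provider>/" prefix.
    return model if provider == "openai" else f"{provider}/{model}"

print(litellm_model_string("openai", "gpt-4o"))  # gpt-4o
print(litellm_model_string("azure", "my-gpt4-deployment"))  # azure/my-gpt4-deployment
```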
For detailed documentation on:
- Original swarm functionality, visit the openai/swarm documentation
- LiteLLM provider setup, check LiteLLM's documentation
We welcome and encourage contributions to this project, including:
- Adding support for new LLM providers
- Improving documentation
- Fixing bugs
- Adding new features
Feel free to submit a Pull Request or open an issue for discussion.