Note: This codebase is a weekend project, so it's a bit messy right now and will be cleaned up gradually.
Mistral AI offers an API with five LLM models:
- Open Mistral 7B: Mistral's 7B-parameter model
- Open Mixtral 8x7B: a mixture-of-experts (MoE) model
- Mistral Small: an improved version of Open Mixtral 8x7B - API access only
- Mistral Medium: a mid-sized model - API access only
- Mistral Large: the largest model available - API access only
Whereas there are multiple ways to access the first two models (via various providers or by downloading the models from HuggingFace), the only current way to run Mistral Small, Medium, or Large is via their API.
The Python code in this project uses the Mistral API to let you access all five of the Mistral models via a chat interface.
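Under the hood, each chat turn is a chat completion request to the Mistral API. For illustration, here is a minimal sketch of such a request made with plain `requests` against the public endpoint (the scripts themselves may use Mistral's official Python client instead, and the model ID below is just an example):

```python
# Minimal sketch of a single Mistral chat completion request (not this repo's exact code).
import os

import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"

payload = {
    "model": "mistral-small-latest",  # example model ID; see Mistral's docs for current names
    "messages": [{"role": "user", "content": "Hello! What can you do?"}],
}
headers = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

resp = requests.post(API_URL, json=payload, headers=headers, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```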
There are two versions of a chat interface:
- `mistral_api_cli.py`: command line
- `mistral_api_web.py`: web GUI
Before running either Python script, make sure you have the following prerequisites:
- Python 3.9 or higher.
- `pip` for installing dependencies.
- Clone the Repository:
  - `git clone https://github.com/rogerkibbe/simple-mistral-api-client`
  - `cd simple-mistral-api-client`
- Install Dependencies:
  - It's recommended to use a virtual environment: `python -m venv mistralenv`, then activate it with `source mistralenv/bin/activate` (on Windows use `mistralenv\Scripts\activate`)
  - Install the required packages: `pip install -r requirements.txt`
- Environment Variables:
  - Create a `.env` file in the project's root directory.
  - Obtain a Mistral API key from MistralAI.
  - Add your Mistral API key to the `.env` file: `MISTRAL_API_KEY=your_api_key_here`
  - Note: If the API key is not set in the `.env` file, the script will prompt you to enter it manually; a sketch of this lookup is shown below.
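  Roughly, the key lookup works like this (a minimal sketch assuming `python-dotenv` is used to read the `.env` file; the actual scripts may differ in detail):

  ```python
  # Sketch of the API-key lookup: read MISTRAL_API_KEY from .env / the environment,
  # then fall back to prompting the user if it is not set (not the repo's exact code).
  import os
  from getpass import getpass

  from dotenv import load_dotenv  # provided by the python-dotenv package

  load_dotenv()  # loads variables from a .env file in the working directory
  api_key = os.getenv("MISTRAL_API_KEY") or getpass("Enter your Mistral API key: ")
  ```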
- Start the Script:
  - Run the command line script using the command: `python mistral_api_cli.py`
  - Run the web script using the command: `panel serve mistral_api_web.py --show` (a sketch of the `panel serve` pattern is shown below)
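  The web GUI is a Panel app, and `panel serve` runs the script as a Bokeh server application, rendering whatever the script marks as servable. A minimal sketch of that pattern (a hypothetical stand-alone example, not this repo's actual code; it assumes Panel's `ChatInterface` component):

  ```python
  # hypothetical_panel_chat.py -- minimal "panel serve" chat-app pattern (illustrative only).
  # Run with: panel serve hypothetical_panel_chat.py --show
  import panel as pn

  pn.extension()

  def callback(contents, user, instance):
      # In a real app, this is where the Mistral API would be called with `contents`.
      return f"Echo: {contents}"

  # Marking the component as servable is what lets `panel serve` render it as a page.
  pn.chat.ChatInterface(callback=callback).servable()
  ```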
- Interacting with the command line version:
  - Follow the on-screen prompts to select an AI model and start a conversation.
  - Type 'stop' to exit the chat client (in outline, the chat loop looks like the sketch below).
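  Such a loop keeps the running message history and exits on the stop word; a hypothetical sketch in the same plain-`requests` style as the earlier example, not the script's actual code:

  ```python
  # Sketch of a CLI chat loop with history and a 'stop' exit word (illustrative only).
  import os

  import requests

  API_URL = "https://api.mistral.ai/v1/chat/completions"
  headers = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}
  history = []

  while True:
      user_input = input("You: ")
      if user_input.strip().lower() == "stop":
          break
      history.append({"role": "user", "content": user_input})
      payload = {"model": "mistral-small-latest", "messages": history}  # example model ID
      resp = requests.post(API_URL, json=payload, headers=headers, timeout=60)
      resp.raise_for_status()
      answer = resp.json()["choices"][0]["message"]["content"]
      history.append({"role": "assistant", "content": answer})
      print(f"Assistant: {answer}")
  ```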
- Interacting with the Web version:
  - A browser window will automatically open. If needed, the URL will be shown in the terminal, like: `Bokeh app running at: http://localhost:5006/mistral_api_web`
  - Choose the model and start your conversation.
  - Feel free to change the model or parameters during your conversation.
  - Quit the Python process to exit.
- Ensure that your Python version is 3.9 or higher.
- Check if the API key is correctly entered in the `.env` file.
- If you encounter any issues or exceptions, please take a look at the error message for more details.
- You can explore more of the Mistral API here.
This was a fun weekend project to use Mistral's API. Feel free to open issues or submit PRs; improvements and suggestions are always welcome!