Warning
Development has been discontinued due to an abundance of similar products. Try Msty, the most promising alternative as of September 2024.
Ollamate: Ollama Desktop Client for Everyday Use
Download (Mac only): https://github.com/humangems/ollamate/releases/latest
Ollamate is an open-source ChatGPT-like desktop client built around Ollama, providing similar features but entirely locally. It runs local models such as Llama 3, Qwen2, and Phi3 via Ollama, ensuring privacy and offline capability.
- Local LLM Models: Run open-source LLMs locally, such as Llama 3, Qwen2, and Phi3.
- User-Friendly: Simple binary download for end users with Ollama installed (currently Apple Silicon Macs; Windows and Linux support is planned).
- Developer-Friendly: Open-source and ready for contributions on GitHub.
- Ensure you have Ollama installed on your system, with at least one model downloaded (for example, run `ollama run phi3`).
- Download the binary from GitHub.
- Use it just like a normal app.
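Under the hood, a client like Ollamate talks to Ollama's local HTTP API (by default at http://localhost:11434), which streams replies as newline-delimited JSON. The sketch below is illustrative, not the app's actual code: it builds a request body for Ollama's `/api/chat` endpoint and folds streamed chunks into the final reply text.

```javascript
// Illustrative helpers for talking to Ollama's local HTTP API.
// The API streams newline-delimited JSON (NDJSON) objects, each carrying a
// partial message; a `done: true` field marks the final chunk.

// Build the JSON body for Ollama's /api/chat endpoint.
function buildChatRequest(model, messages) {
  return JSON.stringify({ model, messages, stream: true });
}

// Fold one NDJSON line from the stream into the accumulated reply text.
function accumulateChunk(replySoFar, ndjsonLine) {
  const chunk = JSON.parse(ndjsonLine);
  return {
    text: replySoFar + (chunk.message?.content ?? ""),
    done: Boolean(chunk.done),
  };
}

// Example: replaying two streamed chunks shaped like Ollama's responses.
const lines = [
  '{"message":{"content":"Hello"},"done":false}',
  '{"message":{"content":" there"},"done":true}',
];
let reply = { text: "", done: false };
for (const line of lines) {
  reply = accumulateChunk(reply.text, line);
}
console.log(reply.text); // "Hello there"
```

In a real client you would pass `buildChatRequest(...)` as the body of a `fetch` POST to the endpoint and feed each line of the response stream through `accumulateChunk`.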
- Fork the repository on GitHub.
- Clone your forked repository to your local machine.
- Install dependencies and run the app.
```shell
git clone https://github.com/humangems/ollamate.git
cd ollamate
yarn install
yarn dev
```
- Electron: for building the cross-platform desktop application.
- React and Redux: for building the user interface and managing state.
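As an illustration of the Redux side, a chat UI's state can be managed with a plain reducer. This is a hypothetical sketch; the action names and state shape are assumptions, not the app's actual store:

```javascript
// Hypothetical Redux-style reducer for a streaming chat UI.
const initialState = { messages: [], streaming: false };

function chatReducer(state = initialState, action) {
  switch (action.type) {
    case "chat/userMessage":
      // Append the user's prompt and mark that a reply is streaming in.
      return {
        ...state,
        messages: [...state.messages, { role: "user", content: action.content }],
        streaming: true,
      };
    case "chat/assistantChunk": {
      // Append streamed tokens to the in-progress assistant message.
      const messages = [...state.messages];
      const last = messages[messages.length - 1];
      if (last && last.role === "assistant") {
        messages[messages.length - 1] = { ...last, content: last.content + action.content };
      } else {
        messages.push({ role: "assistant", content: action.content });
      }
      return { ...state, messages };
    }
    case "chat/done":
      // The stream finished; stop showing a "typing" indicator.
      return { ...state, streaming: false };
    default:
      return state;
  }
}

// Example: a user prompt followed by two streamed chunks.
let chatState = chatReducer(undefined, { type: "chat/userMessage", content: "Hi" });
chatState = chatReducer(chatState, { type: "chat/assistantChunk", content: "Hel" });
chatState = chatReducer(chatState, { type: "chat/assistantChunk", content: "lo" });
chatState = chatReducer(chatState, { type: "chat/done" });
console.log(chatState.messages[1].content); // "Hello"
```

Keeping the reducer pure like this lets the streaming chunks from Ollama be dispatched as plain actions while React re-renders from the store.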
We welcome contributions from the community. To contribute:
- Fork the repository.
- Create a new branch (`git checkout -b feature-branch`).
- Make your changes.
- Commit your changes (`git commit -m 'Add new feature'`).
- Push to the branch (`git push origin feature-branch`).
- Create a pull request.
This project is licensed under the MIT License. See the LICENSE file for more details.
For any inquiries or support, please open an issue on GitHub.
Thank you for using Ollamate! We hope it enhances your local LLM experience.