Easily chat with your local LLM. Orac-Interface integrates seamlessly into your workflow: just press `Ctrl+Space` to start a conversation with your AI. Asking questions no longer interrupts your work.
A demo is available on the website.
Orac-Interface currently supports macOS and relies on Ollama to run models. Follow the steps below to get started:
- Download Ollama: Visit https://ollama.com/ to download the Ollama software.
- Install a Local LLM: Run `ollama pull model_name` in your terminal to download the local LLM you wish to use (replace `model_name` with an actual model, e.g. `llama3`).
- Verify Ollama is Running: Ensure that the Ollama application is running correctly; you should see the Ollama logo in your menu bar. You can also query the API directly, as shown in the sketch after this list.
- Download Orac-Interface: Download the `.dmg` file and install it.
- Launch Orac-Interface: After installation, launch the software and simply press `Ctrl+Space` to open Orac-Interface from anywhere on your system.
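Under the hood, the interface talks to Ollama over its local HTTP API, which listens on port 11434 by default. The sketch below is a minimal, illustrative TypeScript example, not code from this repo (`llama3` is a placeholder model name): it posts a prompt to `/api/generate` and prints the reply, which also lets you confirm your Ollama setup works end to end.

```ts
// Minimal sketch: send a prompt to the local Ollama server and print the reply.
// Assumes Node 18+ (global fetch) and Ollama on its default port, 11434.
// "llama3" is a placeholder; use whichever model you pulled.

interface GenerateResponse {
  response: string;
}

async function askOllama(prompt: string, model = "llama3"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // stream: false returns a single JSON object instead of a token stream.
    body: JSON.stringify({ model, prompt, stream: false }),
  });
  if (!res.ok) {
    throw new Error(`Ollama not reachable (HTTP ${res.status}). Is it running?`);
  }
  const data = (await res.json()) as GenerateResponse;
  return data.response;
}

askOllama("Why is the sky blue?").then(console.log).catch(console.error);
```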
- `Ctrl+Space`: Open the input
- `Esc`: Hide the input
- `Shift+Enter`: New line
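For contributors curious how the bindings above might be wired up: Electron's `globalShortcut` module can register `Ctrl+Space` system-wide, while `Esc` only needs to work when the input window has focus. The following is an illustrative sketch, not the actual Orac-Interface source; the window options and toggle logic are assumptions.

```ts
import { app, globalShortcut, BrowserWindow } from "electron";

let win: BrowserWindow | null = null;

app.whenReady().then(() => {
  // Hypothetical window options; the real app's dimensions may differ.
  win = new BrowserWindow({ width: 600, height: 80, frame: false, show: false });

  // A global shortcut fires even when another application has focus,
  // which is what lets Ctrl+Space open the input from anywhere.
  globalShortcut.register("Control+Space", () => {
    if (!win) return;
    win.isVisible() ? win.hide() : win.show();
  });

  // Esc only matters while the input is focused, so it is handled
  // per-window rather than registered globally.
  win.webContents.on("before-input-event", (_event, input) => {
    if (input.key === "Escape") win?.hide();
  });
});

// Release the shortcut on quit so other apps can claim it again.
app.on("will-quit", () => globalShortcut.unregisterAll());
```

Keeping `Esc` window-local avoids claiming the key system-wide, which a global registration would do and which would interfere with every other application.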
- Test: `npm run start`
- Build: `npm run make`
- Linux integration for broader OS support.
- Multimodal language support, allowing the interface to process images.
- Eliminate the dependency on Ollama to streamline the installation process.
- Windows integration to cater to a wider audience.
- Feature to save LLM responses, enabling easy retrieval for future reference.
- Allow shortcut modification.
Contributions, feedback, and suggestions are more than welcome.
You can also reach out to me on Twitter if you have any questions.