
Orac Interface

About The Project

Easily chat with your local LLM. Orac Interface integrates seamlessly into your workflow: just press Ctrl+Space to start a conversation with your AI, so asking questions no longer interrupts your work.

A demo is available on the website.

How To Use

Orac-Interface currently supports macOS and relies on Ollama to run local models. Follow the steps below to get started:

  1. Download Ollama: Visit https://ollama.com/ to download the Ollama software.
  2. Install a Local LLM: Run ollama pull model_name in your terminal to download the model you wish to use.
  3. Verify Ollama is Running: Make sure the Ollama application is running; you should see the Ollama logo in your menu bar.
  4. Download Orac-Interface: Download the .dmg and open it to install the app.
  5. Launch Orac-Interface: After installation, launch the app and press Ctrl+Space to open Orac-Interface from anywhere on your system.
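Under the hood, Ollama exposes a local HTTP API (on port 11434 by default). As a rough sketch of how a client can talk to it — not code from the Orac codebase; buildChatRequest and the llama3 model name are illustrative — a chat request could be assembled like this:

```typescript
// Sketch of a request to Ollama's local HTTP chat endpoint (default port 11434).
// buildChatRequest is a hypothetical helper, not part of Orac-Interface.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    url: "http://localhost:11434/api/chat",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages, stream: false }),
    },
  };
}

// Example payload for a single user message. Sending it would look like:
//   const res = await fetch(req.url, req.options);
const req = buildChatRequest("llama3", [{ role: "user", content: "Hello!" }]);
console.log(req.url);
```

Setting stream to false asks Ollama for one complete response; a chat UI like this one would more likely stream tokens as they arrive.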

Shortcuts

  1. Ctrl+Space: Open the input
  2. Esc: Hide the input
  3. Shift+Enter: New line
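Inside the app's window, bindings like these can be handled by matching fields of the browser's KeyboardEvent. The matcher below is an illustrative sketch — the names and key map are assumptions, not the project's actual code:

```typescript
// Illustrative shortcut matcher; not the actual Orac-Interface implementation.
type Shortcut = { key: string; ctrl?: boolean; shift?: boolean };

const SHORTCUTS: Record<string, Shortcut> = {
  openInput: { key: " ", ctrl: true },      // Ctrl+Space
  hideInput: { key: "Escape" },             // Esc
  newLine:   { key: "Enter", shift: true }, // Shift+Enter
};

// Compares the relevant fields of a KeyboardEvent-like object
// against a shortcut definition.
function matches(
  e: { key: string; ctrlKey: boolean; shiftKey: boolean },
  s: Shortcut
): boolean {
  return e.key === s.key && e.ctrlKey === !!s.ctrl && e.shiftKey === !!s.shift;
}

console.log(matches({ key: "Enter", ctrlKey: false, shiftKey: true }, SHORTCUTS.newLine)); // true
```

Note that a system-wide Ctrl+Space (one that works even when the app is hidden) has to be registered in Electron's main process via globalShortcut rather than in the page, but the matching idea is the same.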

Testing And Building The Project

  • Test: npm run start
  • Build: npm run make
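The make script hints that the project is packaged with Electron Forge. If so, the relevant scripts section of package.json would look roughly like this — a guess from the command names, not copied from the repo:

```json
{
  "scripts": {
    "start": "electron-forge start",
    "make": "electron-forge make"
  }
}
```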

What's Next

  • Linux integration for broader OS support.
  • Multimodal language support, allowing the interface to process images.
  • Eliminate the dependency on Ollama to streamline the installation process.
  • Windows integration to cater to a wider audience.
  • Feature to save LLM responses, enabling easy retrieval for future reference.
  • Allow shortcut modification.

Links

Website
Discord

Contributions, feedback, and suggestions are more than welcome.
You can also reach out to me on Twitter if you have any questions.
