
Tangent is a canvas for exploring AI conversations, treating each chat branch as an experiment you can merge, compare, and discard. It lets you resurrect conversations that hit context limits, pick up abandoned threads, and map the hidden connections between different discussions.
- 🌟 Resurrect & Continue: Seamlessly resume conversations after reaching a prior context limit.
- 🌿 Branch & Explore: Effortlessly create conversation forks at any point to test multiple approaches or ideas.
- 💻 Offline-First: Fully powered by local models, leveraging Ollama, with plans to expand support.
- 📂 Topic Clustering: Dynamically organize and filter conversations by their inferred topics, streamlining navigation.
- 📜 Archive Support: Comprehensive compatibility with Claude and ChatGPT data exports, with additional integrations in development.
The idea is to make interacting with AI assistants more of a visual, textual, and audio exploration than a plain chat interface. Think less "chat app" and more "thought workbench," where you can experiment freely, revive old threads that still have potential, or dive into tangents.
tangent.mp4
Requirements:

- Whisper.cpp:

  ```bash
  git clone https://github.com/ggerganov/whisper.cpp
  cd whisper.cpp
  sh ./models/download-ggml-model.sh base.en
  make
  make server && ./server
  ```

- Ollama (the project is currently hardcoded for Ollama, but it can be generalized to accept different backends)
- Exported archive data (from Claude or ChatGPT)
Initialize a new venv (macOS):

```bash
python3 -m venv my_env
source my_env/bin/activate
```

Install Python packages:

```bash
pip install flask flask-cors scikit-learn numpy pandas hdbscan umap-learn requests
```
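Before moving on, you can sanity-check that everything installed correctly. A minimal sketch (the pip-name-to-import-name mapping below is standard for these packages, e.g. `scikit-learn` imports as `sklearn`):

```python
import importlib.util

# Maps pip package names to their importable module names.
REQUIRED = {
    "flask": "flask",
    "flask-cors": "flask_cors",
    "scikit-learn": "sklearn",
    "numpy": "numpy",
    "pandas": "pandas",
    "hdbscan": "hdbscan",
    "umap-learn": "umap",
    "requests": "requests",
}

def missing_packages():
    """Return the pip names of any required packages that are not importable."""
    return [pip_name for pip_name, module in REQUIRED.items()
            if importlib.util.find_spec(module) is None]

if __name__ == "__main__":
    missing = missing_packages()
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All required packages are installed.")
```

Run it inside the activated venv; if anything is listed as missing, re-run the `pip install` line above.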
Install Ollama:

Find the appropriate installer for your system here: https://ollama.com/

Verify the installation:

```bash
ollama --version
# ollama version is 0.4.4
```
Download models (embedding + LLM):

If you choose to swap these, please see the Configure local models section below.

```bash
ollama pull all-minilm
ollama pull qwen2.5-coder:7b
```

Start Ollama (install it first if you haven't already):

```bash
ollama serve
```
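With the server running, you can confirm it is reachable and that the pulled models are visible. A small sketch querying Ollama's `/api/tags` endpoint (11434 is Ollama's default port; the helper name is my own):

```python
import json
import urllib.request
import urllib.error

def ollama_models(base_url="http://localhost:11434"):
    """Return the names of locally pulled models, or None if the
    Ollama server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    models = ollama_models()
    if models is None:
        print("Ollama server not reachable - did you run `ollama serve`?")
    else:
        print("Available models:", ", ".join(models))
```

If `all-minilm` and `qwen2.5-coder:7b` don't appear in the output, re-run the `ollama pull` commands above.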
Configure local models:

```bash
cd simplified-ab
export EMBEDDING_MODEL="custom-embedding-model"
export GENERATION_MODEL="custom-generation-model"
```

Then run with:

```bash
python3 tsne.py
```

Or all together:

```bash
python3 tsne.py --embedding-model "custom-embedding-model" --generation-model "custom-generation-model"
```
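To illustrate what the `export` step does, here is a hypothetical sketch of how a script like `tsne.py` could resolve the model names: the environment variables take priority, with the models pulled earlier as fallbacks (the fallback values and resolution logic are assumptions, not the actual implementation):

```python
import os

# Hypothetical resolution sketch: environment variables (set via `export`
# above) win; otherwise fall back to the models from the `ollama pull` step.
# The variable names match the README; the defaults are assumptions.
EMBEDDING_MODEL = os.environ.get("EMBEDDING_MODEL", "all-minilm")
GENERATION_MODEL = os.environ.get("GENERATION_MODEL", "qwen2.5-coder:7b")

print(f"embedding={EMBEDDING_MODEL} generation={GENERATION_MODEL}")
```

The `--embedding-model` / `--generation-model` CLI flags shown above would override either source when passed explicitly.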
Start the UI:

```bash
cd simplified-ui
npm i
npm start
```

If you get a missing-package error, just install the package manually and restart the UI.