# Private AI Chat Assistant: Pure Browser, Zero Backend

Download and run local LLMs entirely within your browser.

preview.webm

Live site: https://private-ai-chat.vercel.app

## Running locally

Install dependencies:

```sh
npm install
```

Start the web app:

```sh
npm run dev
```

Navigate to http://localhost:5173/

## Credits

- Wllama
- SmolLM - HuggingFace
- Llama 3.2 - Meta
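
## How it works

The app runs inference entirely in the browser through Wllama (WebAssembly bindings for llama.cpp), downloading a small GGUF checkpoint such as SmolLM and generating text locally with no backend. The snippet below is a minimal sketch of that flow, not this project's actual source: it assumes Wllama's documented `loadModelFromUrl` and `createCompletion` calls, and the WASM asset paths, model URL, and sampling options are placeholders that vary by Wllama version and bundler setup.

```ts
// Illustrative sketch of browser-side inference with Wllama (not the app's real code).
import { Wllama } from '@wllama/wllama';

// Paths to the Wllama WASM artifacts served by your bundler.
// Exact keys and locations depend on the Wllama version (assumption).
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/wllama/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/wllama/multi-thread/wllama.wasm',
};

async function chatOnce(prompt: string): Promise<string> {
  const wllama = new Wllama(CONFIG_PATHS);

  // Pull a small GGUF model straight into the browser; the URL is a
  // placeholder for a SmolLM-style checkpoint hosted on Hugging Face.
  await wllama.loadModelFromUrl(
    'https://huggingface.co/<org>/<repo>/resolve/main/<model>.gguf'
  );

  // Generate a completion locally; sampling values are illustrative defaults.
  return wllama.createCompletion(prompt, {
    nPredict: 128,
    sampling: { temp: 0.7, top_k: 40, top_p: 0.9 },
  });
}

chatOnce('Hello! Who are you?').then(console.log);
```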