A little proof of concept of a chat app running an LLM locally with Ollama; mostly an excuse to have fun with WebSockets, htmx, and Go.
Using:
- The Ollama Chat API
- With the official Go client
- Running the Mistral model locally
- Chat history stored in memory and resent to Ollama on each turn, so the model keeps its conversational context
- htmx for the frontend interactions
- Go's text/template for the HTML templating
- WebSockets for the real-time communication
- The htmx WebSockets extension handles the connection and sends/receives messages
- Melody for WebSocket handling on the backend
- TailwindCSS for the styling
- Chat template found on CodePen
- DiceBear for the funny avatars