Merge pull request janhq#657 from janhq/docs/intro-and-models-spec
[docs] Add Introduction and refactor Models Spec
dan-homebrew authored Nov 19, 2023
2 parents 5557695 + 3595bb4 commit b384f58
Showing 21 changed files with 445 additions and 347 deletions.
File renamed without changes.
3 changes: 3 additions & 0 deletions docs/docs/docs/extensions.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,3 @@
---
title: Extending Jan
---
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
10 changes: 9 additions & 1 deletion docs/docs/intro/how-jan-works.md
Original file line number Diff line number Diff line change
@@ -1,3 +1,11 @@
---
title: How Jan Works
---

- Local Filesystem
- Follow-on from Quickstart to show how things actually work
- Write in a conversational style, show how things work under the hood
- Check how filesystem changed after each request
- Model loading into RAM/VRAM
- Explain how the .bin file is loaded via Llama.cpp
- Explain how it consumes RAM and VRAM, and refer to system monitor
106 changes: 18 additions & 88 deletions docs/docs/intro/introduction.md
Original file line number Diff line number Diff line change
@@ -1,104 +1,34 @@
---
title: Introduction
slug: /docs
slug: /intro
---

Jan can be used to build a variety of AI use cases, at every level of the stack:
Jan is a ChatGPT-alternative that runs on your own computer.

- An OpenAI-compatible API, with feature parity for `models`, `assistants`, `files` and more
- A standard data format on top of the user's local filesystem, allowing for transparency and composability
- Automatic packaging and distribution to Mac, Windows and Linux, with cloud support coming soon
- A UI kit to customize user interactions with `assistants` and more
- A standalone inference engine for low-level use cases
Jan uses [open-source AI models](/guide/models), stores data in [open file formats](/specs/data-structures) and is highly customizable via [extensions](/guide/extensions).

## Resources
Jan ships with an [OpenAI-compatible API](/api) and a powerful [Assistant framework](/guide/assistants) to create custom AIs.
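Because the API is OpenAI-compatible, clients can send standard chat completion request bodies. A minimal sketch of building such a request follows; the model id and message are placeholders for illustration, and the actual host, port, and installed models depend on your local Jan server settings.

```python
import json

# Build an OpenAI-style chat completion request body. The model id and
# message below are illustrative placeholders, not guaranteed defaults.
def build_chat_request(model: str, user_message: str) -> str:
    payload = {
        "model": model,  # e.g. a model installed under /models
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    }
    return json.dumps(payload)

body = build_chat_request("llama-7b-q4", "Hello, Jan!")
print(body)
```

The same payload shape works against OpenAI's hosted API, which is what makes client code portable between the two.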

<!-- (@Rex: to add some quickstart tutorials) -->
## Why Jan?

- Create an AI assistant
- Run an OpenAI compatible API endpoint
- Build a VSCode plugin with a local model
- Build a Jan platform module
#### 💻 Own your AI
Jan runs 100% on your own machine, [predictably](https://www.reddit.com/r/LocalLLaMA/comments/17mghqr/comment/k7ksti6/?utm_source=share&utm_medium=web2x&context=3), privately and even offline. No one else can see your conversations, not even us.

## Key Concepts
#### 🏗️ Extensions
Jan ships with a powerful [extension framework](/guide/extensions), which allows developers to extend and customize Jan's functionality. In fact, most core modules of Jan are [built as extensions](/specs/architecture) and use the same extensions API.

### Modules
#### 🗂️ Open File Formats
Jan stores data in a [local folder of non-proprietary files](/specs/data-structures). You're never locked in and can do what you want with your data via extensions, or even a different app.

Jan is composed of system-level modules that mirror OpenAI's, exposing similar APIs and objects
#### 🌍 Open Source
Both Jan and [Nitro](https://nitro.jan.ai), our lightweight inference engine, are licensed under the open-source [AGPLv3 license](https://github.com/janhq/jan/blob/main/LICENSE).

- Modules are atomic implementations of a single OpenAI-compatible endpoint
- Modules can be swapped out for alternate implementations
- The default `messages` module persists messages in thread-specific `.json` files
- `messages-postgresql` uses Postgres for production-grade, cloud-native environments
<!-- ## Design Principles -->

| Jan Module | Description | API Docs |
| ---------- | ------------- | ---------------------------- |
| Chat | Inference | [/chat](/api/chat) |
| Models | Models | [/model](/api/model) |
| Assistants | Apps | [/assistant](/api/assistant) |
| Threads | Conversations | [/thread](/api/thread) |
| Messages | Messages | [/message](/api/message) |
<!-- OpenAI meets VSCode meets Obsidian.
### Local Filesystem
Minimalism: https://docusaurus.io/docs#design-principles. Not having abstractions is better than having the wrong abstractions. Assistants as code. Only including features that are absolutely necessary in the Jan API.
Jan uses the local filesystem for data persistence, similar to VSCode. This allows for composability and tinkerability.
File-based: User should be able to look at a Jan directory and intuit how it works. Transparency. Editing things via a text editor, vs. needing a database tool for SQLite.
```sh
/janroot # Jan's root folder (e.g. ~/jan)
/models # For raw AI models
/threads # For conversation history
/assistants # For AI assistants' configs, knowledge, etc.
```

```sh
/models
/modelA
model.json # Default model settings
llama-7b-q4.gguf # Model binaries
llama-7b-q5.gguf # Include different quantizations
/threads
/jan-unixstamp-salt
model.json # Overrides assistant/model-level model settings
thread.json # thread metadata (e.g. subject)
messages.json # messages
content.json # What is this?
files/ # Future for RAG
/assistants
/jan
assistant.json # Assistant configs (see below)
# For any custom code
package.json # Import npm modules
# e.g. Langchain, Llamaindex
/src # Supporting files (needs better name)
index.js # Entrypoint
process.js # For electron IPC processes (needs better name)
# `/threads` at root level
# `/models` at root level
/shakespeare
assistant.json
model.json # Creator chooses model and settings
package.json
/src
index.js
process.js
/threads # Assistants remember conversations in the future
/models # Users can upload custom models
/finetuned-model
```

### Jan: a "global" assistant

Jan ships with a default assistant, "Jan", that lets users chat with any open-source model out of the box.

This assistant is defined in `/jan`. It is a generic assistant that illustrates the power of Jan. In the future, it will support additional features, e.g. multi-assistant conversations.

- Your Assistant "Jan" lets you pick any model that is in the root /models folder
- Right panel: pick LLM model and set model parameters
- Jan’s threads will be at root level
- `model.json` will reflect model chosen for that session
- Be able to “add” other assistants in the future
- Jan’s files will be at thread level
- Jan is not a persistent memory assistant
Participatory: https://www.getlago.com/blog/the-5-reasons-why-we-chose-open-source -->
4 changes: 4 additions & 0 deletions docs/docs/intro/quickstart.md
Original file line number Diff line number Diff line change
@@ -1,3 +1,7 @@
---
title: Quickstart
---

- Write in the style of a comic, with explanations
- Similar to Why's (Poignant) Guide to Ruby
- https://en.wikipedia.org/wiki/Why%27s_(poignant)_Guide_to_Ruby
34 changes: 18 additions & 16 deletions docs/docs/specs/architecture.md
Original file line number Diff line number Diff line change
Expand Up @@ -2,6 +2,24 @@
title: Architecture
---

- Jan is built using modules
- Plugin architecture (built on Pluggable-Electron)

Jan is composed of system-level modules that mirror OpenAI's, exposing similar APIs and objects.

- Modules are atomic implementations of a single OpenAI-compatible endpoint
- Modules can be swapped out for alternate implementations
- The default `messages` module persists messages in thread-specific `.json` files
- `messages-postgresql` uses Postgres for production-grade, cloud-native environments

| Jan Module | Description | API Docs |
| ---------- | ------------- | ---------------------------- |
| Chat | Inference | [/chat](/api/chat) |
| Models | Models | [/model](/api/model) |
| Assistants | Apps | [/assistant](/api/assistant) |
| Threads | Conversations | [/thread](/api/thread) |
| Messages | Messages | [/message](/api/message) |
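To make the module contract concrete, here is a minimal sketch of a messages module that persists each thread's messages in a thread-specific JSON file, as the default `messages` module does. The class and method names are illustrative, not Jan's actual API; a `messages-postgresql` alternative would implement the same interface against Postgres instead.

```python
import json
import tempfile
from pathlib import Path

# Illustrative file-backed messages module: one messages.json per thread.
class FileMessages:
    def __init__(self, root: Path):
        self.root = root

    def _path(self, thread_id: str) -> Path:
        d = self.root / "threads" / thread_id
        d.mkdir(parents=True, exist_ok=True)
        return d / "messages.json"

    def append(self, thread_id: str, role: str, content: str) -> None:
        path = self._path(thread_id)
        messages = json.loads(path.read_text()) if path.exists() else []
        messages.append({"role": role, "content": content})
        path.write_text(json.dumps(messages, indent=2))

    def list(self, thread_id: str) -> list:
        path = self._path(thread_id)
        return json.loads(path.read_text()) if path.exists() else []

# Demo against a temporary folder standing in for janroot.
store = FileMessages(Path(tempfile.mkdtemp()))
store.append("thread-a", "user", "Hello")
store.append("thread-a", "assistant", "Hi there")
print(store.list("thread-a"))
```

Because callers only touch `append` and `list`, swapping the storage backend does not ripple through the rest of the system.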

## Concepts

```mermaid
Expand All @@ -23,19 +41,3 @@ graph LR
- Model object
- Thread object
- Built-in tool object

## File system
```sh
janroot/
assistants/
assistant-a/
assistant.json
src/
index.ts
threads/
thread-a/
thread-b
models/
model-a/
model.json
```
68 changes: 67 additions & 1 deletion docs/docs/specs/data-structures.md
Original file line number Diff line number Diff line change
@@ -1,3 +1,69 @@
---
title: Data Structures
---


```sh
janroot/
assistants/
assistant-a/
assistant.json
src/
index.ts
threads/
thread-a/
thread-b
models/
model-a/
model.json
```



Jan uses the local filesystem for data persistence, similar to VSCode. This allows for composability and tinkerability.

```sh
/janroot # Jan's root folder (e.g. ~/jan)
/models # For raw AI models
/threads # For conversation history
/assistants # For AI assistants' configs, knowledge, etc.
```
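Because the layout is plain folders, the top-level scaffold above can be created with a few lines of ordinary filesystem code. This sketch uses a temporary directory; a real install would use something like `~/jan`.

```python
import tempfile
from pathlib import Path

# Create the top-level janroot layout shown above, in a temp directory.
janroot = Path(tempfile.mkdtemp()) / "janroot"
for sub in ("models", "threads", "assistants"):
    (janroot / sub).mkdir(parents=True, exist_ok=True)

print(sorted(p.name for p in janroot.iterdir()))
# → ['assistants', 'models', 'threads']
```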

```sh
/models
/modelA
model.json # Default model settings
llama-7b-q4.gguf # Model binaries
llama-7b-q5.gguf # Include different quantizations
/threads
/jan-unixstamp-salt
model.json # Overrides assistant/model-level model settings
thread.json # thread metadata (e.g. subject)
messages.json # messages
content.json # What is this?
files/ # Future for RAG
/assistants
/jan
assistant.json # Assistant configs (see below)
# For any custom code
package.json # Import npm modules
# e.g. Langchain, Llamaindex
/src # Supporting files (needs better name)
index.js # Entrypoint
process.js # For electron IPC processes (needs better name)
# `/threads` at root level
# `/models` at root level
/shakespeare
assistant.json
model.json # Creator chooses model and settings
package.json
/src
index.js
process.js
/threads # Assistants remember conversations in the future
/models # Users can upload custom models
/finetuned-model
```
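Given the layout above, discovering installed models is just a matter of scanning `models/*/model.json`. The sketch below assumes a `name` field in `model.json` purely for illustration; consult the Models spec for the actual schema.

```python
import json
import tempfile
from pathlib import Path

# Discover models by scanning janroot/models/*/model.json.
def list_models(janroot: Path) -> list:
    found = []
    for model_json in sorted((janroot / "models").glob("*/model.json")):
        settings = json.loads(model_json.read_text())
        found.append({"folder": model_json.parent.name, "settings": settings})
    return found

# Demo: fabricate a janroot with one model folder, then scan it.
janroot = Path(tempfile.mkdtemp())
(janroot / "models" / "modelA").mkdir(parents=True)
(janroot / "models" / "modelA" / "model.json").write_text('{"name": "modelA"}')
print(list_models(janroot))
```

This transparency is the point of the file-based format: any tool that can read JSON can enumerate models, threads, or assistants.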
2 changes: 1 addition & 1 deletion docs/docs/specs/fine-tuning.md
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
---
title: "Fine tuning"
title: "Fine-tuning"
---
Todo: @hiro
18 changes: 16 additions & 2 deletions docs/docs/specs/jan.md
Original file line number Diff line number Diff line change
@@ -1,3 +1,17 @@
---
title: Jan
---
title: Jan (Assistant)
---

## Jan: a "global" assistant

Jan ships with a default assistant, "Jan", that lets users chat with any open-source model out of the box.

This assistant is defined in `/jan`. It is a generic assistant that illustrates the power of Jan. In the future, it will support additional features, e.g. multi-assistant conversations.

- Your Assistant "Jan" lets you pick any model that is in the root /models folder
- Right panel: pick LLM model and set model parameters
- Jan’s threads will be at root level
- `model.json` will reflect model chosen for that session
- Be able to “add” other assistants in the future
- Jan’s files will be at thread level
- Jan is not a persistent memory assistant
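The bullets above imply a layered settings model, where a thread-level `model.json` overrides assistant- and model-level defaults. A merge sketch under that assumed precedence (thread > assistant > model) follows; the setting names are purely illustrative.

```python
# Resolve effective model settings by layering overrides over defaults.
# Precedence (thread > assistant > model) is an assumption drawn from the
# bullets above, not a documented guarantee.
def resolve_settings(model_defaults: dict, assistant_overrides: dict,
                     thread_overrides: dict) -> dict:
    merged = dict(model_defaults)
    merged.update(assistant_overrides)
    merged.update(thread_overrides)
    return merged

settings = resolve_settings(
    {"temperature": 0.7, "max_tokens": 2048},  # model.json defaults
    {"temperature": 0.2},                      # assistant-level override
    {"max_tokens": 512},                       # thread-level model.json
)
print(settings)
# → {'temperature': 0.2, 'max_tokens': 512}
```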
