Jan

Jan is a free, source-available, fair-code-licensed AI inference platform. We help enterprises, small businesses, and hobbyists self-host AI on their own infrastructure, so they can protect their data, lower costs, and put powerful AI capabilities in the hands of their users.

Features

  • Web, Mobile and APIs
  • LLMs and Generative Art models
  • AI Catalog
  • Model Installer
  • User Management
  • Support for Apple Silicon and CPU architectures

Installation

Pre-Requisites

  • Supported Operating Systems: This setup is only tested and supported on Linux, macOS with Docker Desktop (on Apple Silicon M1/M2, remember to set the Docker platform with export DOCKER_DEFAULT_PLATFORM=linux/amd64), or Windows Subsystem for Linux (WSL) with Docker.

  • Docker: Make sure you have Docker installed on your machine. You can install Docker by following the instructions here.

  • Docker Compose: Make sure you also have Docker Compose installed. If not, follow the instructions here.

  • Clone the Repository: Make sure to clone the repository containing the docker-compose.yml and pull the latest git submodules.

    git clone https://github.com/janhq/jan.git
    
    cd jan
    
    # Pull latest submodule
    git submodule update --init
  • Environment Variables: You will need to set up several environment variables for services such as Keycloak and Postgres. Place them in .env files in the respective folders, as referenced in docker-compose.yml; the table below lists the env file each service expects.

    cp sample.env .env
    Service (Docker)       | env file
    -----------------------|---------
    Global env             | .env (just run cp sample.env .env)
    Keycloak               | .env presented in the global env; initialize the realm from conf/keycloak_conf/example-realm.json
    Keycloak PostgresDB    | .env presented in the global env
    jan-inference          | .env presented in the global env
    app-backend (hasura)   | conf/sample.env_app-backend (refer from here)
    app-backend PostgresDB | conf/sample.env_app-backend-postgres
    web-client             | conf/sample.env_web-client
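
    As a rough illustration, a filled-in env file might look like the sketch below. Only KEYCLOAK_ADMIN, KEYCLOAK_ADMIN_PASSWORD, and HASURA_GRAPHQL_ADMIN_SECRET are referenced later in this README; every value shown here is a placeholder, and sample.env remains the authoritative list of variables.

    # .env (global): placeholder values only, see sample.env for the full variable list
    KEYCLOAK_ADMIN=admin
    KEYCLOAK_ADMIN_PASSWORD=change-me

    # conf/sample.env_app-backend: placeholder value for the Hasura admin secret
    HASURA_GRAPHQL_ADMIN_SECRET=change-me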

Docker Compose

Jan offers a Docker Compose deployment that automates the setup process.

# Download models
# Runway SD 1.5
wget https://huggingface.co/runwayml/stable-diffusion-v1-5/resolve/main/v1-5-pruned-emaonly.safetensors -P jan-inference/sd/models

# Download LLM
wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin -P jan-inference/llm/models
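
As an optional sanity check (not part of the official setup), confirm that both downloads landed in the expected model folders:

# Optional: verify the downloaded model files are present and several GB in size
ls -lh jan-inference/sd/models jan-inference/llm/models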

Run the following command to start all the services defined in the docker-compose.yml

# Docker Compose up
docker compose up

To run in detached mode:

# Docker Compose up detached mode
docker compose up -d

Once the stack is running, the following services are exposed:

Service (Docker)     | URL                   | Credential
---------------------|-----------------------|-----------
Keycloak             | http://localhost:8088 | Admin credentials are set via the environment variables KEYCLOAK_ADMIN and KEYCLOAK_ADMIN_PASSWORD
app-backend (hasura) | http://localhost:8080 | The admin secret is set via the environment variable HASURA_GRAPHQL_ADMIN_SECRET in conf/sample.env_app-backend
web-client           | http://localhost:3000 | Users are registered in Keycloak; the default user defined in conf/keycloak_conf/example-realm.json is username: username, password: password
llm service          | http://localhost:8000 |
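
To confirm that all containers came up, especially when running detached, the standard Docker Compose status and log commands are useful. The service name used below is an assumption; check docker-compose.yml for the exact names.

# List the containers started for this compose project
docker compose ps

# Follow the logs of a single service (replace jan-inference with a name from docker-compose.yml)
docker compose logs -f jan-inference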

Usage

To get started with Jan, follow these steps:

  1. Install the platform as per the instructions above.
  2. Launch the web application at http://localhost:3000.
  3. Log in with the default user (username: username, password: password).
  4. Test the LLM in a ChatGPT-style chat session (a direct API call against the inference service is also sketched below).
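
If you prefer to exercise the inference service directly rather than through the web client, a request along the following lines may work. This is only a sketch: it assumes the llm service on http://localhost:8000 exposes an OpenAI-compatible completions endpoint, which is common for llama.cpp-based servers but is not confirmed by this README, so the exact route and payload may differ.

# Hypothetical direct call to the llm service; /v1/completions is an assumed
# OpenAI-compatible route and may not match this deployment
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello from Jan!", "max_tokens": 64}'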

Developers

Architecture

  • Architecture Diagram

Dependencies

Repo Structure

Jan is a monorepo that pulls in the following submodules:

├── docker-compose.yml
├── mobile-client
├── web-client
├── app-backend
├── inference-backend
├── docs                # Developer Docs
├── adrs                # Architecture Decision Records

Live Demo

You can access the live demo at https://cloud.jan.ai.

Common Issues and Troubleshooting

Error in jan-inference service

Contributing

Contributions are welcome! Please read the CONTRIBUTING.md file for guidelines on how to contribute to this project.

License

This project is licensed under the Fair Code License. See LICENSE.md for more details.

Authors and Acknowledgments

Created by jan. Thanks to all contributors who have helped to improve this project.

Support and Contact

For support or to report issues, please email [email protected].
