Jan - Self-Hosted AI Platform


Getting Started - Docs - Changelog - Bug reports - Discord

⚠️ Jan is currently in Development: Expect breaking changes and bugs!

Jan is a self-hosted AI platform for running AI in the enterprise. It is easy to use and packed with useful organizational and security features.

We help you run AI on your own hardware, with 1-click installs for the latest models. Jan runs on a wide variety of hardware: from consumer-grade Mac Minis to datacenter-grade Nvidia H100s.

Jan can also connect to the latest AI engines like ChatGPT, with a security policy engine to protect your organization from sensitive data leaks.

Jan is free, source-available, and fair-code licensed.

Demo

👋 https://cloud.jan.ai

Jan Web GIF

Features

Self-Hosted AI

  • Self-hosted Llama 2 and other LLMs
  • Self-hosted Stable Diffusion and ControlNet
  • 1-click installs for Models (coming soon)

3rd-party AIs

  • Connect to ChatGPT, Claude via API Key (coming soon)
  • Security policy engine for 3rd-party AIs (coming soon)
  • Pre-flight PII and Sensitive Data checks (coming soon)

Multi-Device

  • Web App
  • Jan Mobile support for custom Jan server (in progress)
  • Cloud deployments (coming soon)

Organization Tools

  • Multi-user support
  • Audit and Usage logs (coming soon)
  • Compliance and Audit policy (coming soon)

Hardware Support

  • Nvidia GPUs
  • Apple Silicon (in progress)
  • CPU support via llama.cpp (in progress)
  • Nvidia GPUs using TensorRT (in progress)

Documentation

👋 https://docs.jan.ai (Work in Progress)

Installation

⚠️ Jan is currently in Development: Expect breaking changes and bugs!

Step 1: Install Docker

Jan is currently packaged as a Docker Compose application.
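
If you are unsure whether Docker and the Compose plugin are already installed, a quick check (assuming a recent Docker Engine with the Compose plugin, which the steps below rely on):

# Confirm Docker Engine is installed and the daemon is reachable
docker --version
docker info

# Confirm the Compose plugin is available (used in Step 5)
docker compose version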

Step 2: Clone Repo

git clone https://github.com/janhq/jan.git
cd jan

# Pull latest submodules
git submodule update --init --recursive

Step 3: Configure .env

We provide a sample .env file that you can use to get started.

cp sample.env .env

You will need to set the following .env variables:

# TODO: Document .env variables
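
The variable set is not yet documented. As a hedged sketch based only on the service table in Step 5 (which references the Keycloak admin credentials stored in .env), your .env will contain entries along these lines; the values below are placeholders, not documented defaults:

# Illustrative placeholders only -- names beyond the Keycloak admin credentials
# referenced in Step 5 are not documented here, and these values are examples
KEYCLOAK_ADMIN=admin
KEYCLOAK_ADMIN_PASSWORD=changeme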

Step 4: Install Models

Note: This step will change soon, with Nitro becoming its own library.

We recommend Llama2-7B (4-bit quantized) as a basic model to get started.

You will need to download the models to the jan-inference/llms/models folder.

cd jan-inference/llms/models

# Downloads model (~4gb)
# Download time depends on your internet connection and HuggingFace's bandwidth
wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin 
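
Once the download finishes, a quick sanity check that the file arrived intact (the quantized model should be roughly 4 GB, per the comment above):

# The 4-bit quantized Llama2-7B model should be roughly 4 GB on disk
ls -lh llama-2-7b-chat.ggmlv3.q4_1.bin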

Step 5: docker compose up

Jan utilizes Docker Compose to run all services:

docker compose up
docker compose up -d # Detached mode

The table below summarizes the services and their respective URLs and credentials.

Service | Container Name | URL and Port | Credentials
Jan Web | jan-web-* | http://localhost:3000 | Default username/password, set in conf/keycloak_conf/example-realm.json
Hasura (Backend) | jan-graphql-engine-* | http://localhost:8080 | HASURA_GRAPHQL_ADMIN_SECRET, set in conf/sample.env_app-backend
Keycloak (Identity) | jan-keycloak-* | http://localhost:8088 | KEYCLOAK_ADMIN, KEYCLOAK_ADMIN_PASSWORD, set in .env
Inference Service | jan-llm-* | http://localhost:8000 | Set in .env
PostgresDB | jan-postgres-* | http://localhost:5432 | Set in .env
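
To verify that the stack came up, you can list the running containers and spot-check the published ports from the table above (assuming the services answer plain HTTP on those ports):

# List the services started by docker compose and their state
docker compose ps

# Spot-check the web app, backend, and Keycloak on their published ports
curl -I http://localhost:3000
curl -I http://localhost:8080
curl -I http://localhost:8088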

Step 6: Configure Keycloak
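
Based on the service table above, the Keycloak admin console is exposed at http://localhost:8088 and uses the KEYCLOAK_ADMIN / KEYCLOAK_ADMIN_PASSWORD credentials from .env; the example realm consumed by Jan Web is defined in conf/keycloak_conf/example-realm.json.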

Step 7: Use Jan

  • Launch the web application via http://localhost:3000.
  • Log in with the default user (username: username, password: password).
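
Beyond the web UI, the inference service from the table in Step 5 listens on http://localhost:8000 and, per the Nitro description below, exposes popular API formats such as OpenAI's. As a hedged sketch only (the exact route and payload schema are assumptions, not documented here):

# Hypothetical request against an OpenAI-style chat completions route;
# the /v1/chat/completions path and body schema are assumptions
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello, Jan!"}]}'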

Step 8: Deploying to Production

  • TODO

About Jan

Jan is a commercial company with a Fair Code business model. This means that while we are open source and can be used for free, we require commercial licenses for specific use cases (e.g. hosting Jan as a service).

We are a team of engineers passionate about AI, productivity and the future of work. We are funded through consulting contracts and enterprise licenses. Feel free to reach out to us!

Repo Structure

Jan comprises several repositories:

Repo | Purpose
Jan | AI platform to run AI in the enterprise. Easy to use, and packed with useful organizational and compliance features.
Jan Mobile | Mobile app that can be pointed to a custom Jan server.
Nitro | Inference engine that runs AI on different types of hardware. Offers popular API formats (e.g. OpenAI, Clipdrop). Written in C++ for blazing-fast performance.

Architecture

Jan builds on top of several open-source projects.

We may re-evaluate this in the future, given different customer requirements.

Contributing

Contributions are welcome! Please read the CONTRIBUTING.md file for guidelines on how to contribute to this project.

Please note that Jan intends to build a sustainable business that can provide high quality jobs to its contributors. If you are excited about our mission and vision, please contact us to explore opportunities.

Contact

  • For support: please file a GitHub issue
  • For questions: join our Discord here
  • For long form inquiries: please email [email protected]
