
Jan - Self-Hosted AI Platform


Getting Started - Docs - Changelog - Bug reports - Discord

Jan is a self-hosted AI platform. We help you run AI on your own hardware, giving you full control and protecting your enterprise's data and IP.

Jan is free, source-available, and fair-code licensed.

Demo

👋 Try the demo at https://cloud.jan.ai

Features

Multiple AI Engines

  • Self-hosted Llama 2 and other LLMs
  • Self-hosted Stable Diffusion and ControlNet
  • Connect to ChatGPT, Claude via API Key (coming soon)
  • 1-click installs for Models (coming soon)

Cross-Platform

  • Web App
  • Jan Mobile support for custom Jan server (in progress)
  • Cloud deployments (coming soon)

Organization Tools

  • Multi-user support
  • Audit and Usage logs (coming soon)
  • Compliance and Audit (coming soon)
  • PII and Sensitive Data policy engine for 3rd-party AIs (coming soon)

Hardware Support

  • Nvidia GPUs
  • Apple Silicon (in progress)
  • CPU support via llama.cpp (in progress)

Usage

So far, this setup has been tested and is supported for Docker on Linux, macOS, and Windows Subsystem for Linux (WSL).

Dependencies

  • Install Docker: follow the official Docker installation guide. (A quick verification sketch follows at the end of this list.)

  • Install Docker Compose: follow the official Docker Compose installation guide.

  • Clone the Repository: Clone this repository and pull in the latest git submodules.

    git clone https://github.com/janhq/jan.git
    
    cd jan
    
    # Pull latest submodules
    git submodule update --init --recursive
  • Export Environment Variables: set Docker's default platform so that images are built and pulled for your host architecture.

export DOCKER_DEFAULT_PLATFORM=linux/$(uname -m)
  • Set up .env files: you will need to set several environment variables for services such as Keycloak and Postgres. Place them in .env files in the respective folders, as referenced in docker-compose.yml.

    cp sample.env .env
    Service (Docker) and its env file:
    • Global env: .env at the repo root (run cp sample.env .env)
    • Keycloak: uses the global .env; the realm is initialized from conf/keycloak_conf/example-realm.json
    • Keycloak PostgresDB: uses the global .env
    • jan-inference: uses the global .env
    • app-backend (Hasura): conf/sample.env_app-backend (see the Hasura documentation for the available variables)
    • app-backend PostgresDB: conf/sample.env_app-backend-postgres
    • web-client: conf/sample.env_web-client
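
Before downloading models or starting the stack, it can help to sanity-check the setup. This is only a minimal sketch using paths already mentioned above; adjust it if your layout differs.

# Confirm Docker and Docker Compose are available
docker --version
docker compose version

# Confirm submodules were pulled and the env/realm files are in place
git submodule status
ls -l .env conf/keycloak_conf/example-realm.json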

Install Models

wget https://huggingface.co/TheBloke/Llama-2-7B-Chat-GGML/resolve/main/llama-2-7b-chat.ggmlv3.q4_1.bin -P jan-inference/llm/models
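
Once the download finishes, you can confirm the model file is where the inference service expects it (the path mirrors the wget command above):

# Verify the GGML model file is present (it should be several GB in size)
ls -lh jan-inference/llm/models/llama-2-7b-chat.ggmlv3.q4_1.bin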

Compose Up

Jan uses an opinionated but modular open-source stack that comes with many services out of the box, e.g. multiple clients, autoscaling, and auth.

You can opt out of these services or swap in your own integrations; see Configurations below.
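
For example, one simple way to run only part of the stack is to name specific services on the compose command line; Compose then starts just those services and their declared dependencies. The service names below are assumptions based on the env-file table above; check docker-compose.yml for the actual names.

# Start only selected services (and their dependencies); service names are assumed
docker compose up web-client app-backend keycloak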

  • Run the following command to start all of the services defined in docker-compose.yml:
# Docker Compose up
docker compose up

# Docker Compose up detached mode
docker compose up -d
  • This step takes 5-15 minutes, and the following services will be provisioned:
    • Web App (http://localhost:3000): users are registered in Keycloak; the default user (username: username, password: password) is created from conf/keycloak_conf/example-realm.json
    • Keycloak Admin (http://localhost:8088): admin credentials are set via the KEYCLOAK_ADMIN and KEYCLOAK_ADMIN_PASSWORD environment variables
    • Hasura App Backend (http://localhost:8080): admin credentials are set via HASURA_GRAPHQL_ADMIN_SECRET in conf/sample.env_app-backend
    • LLM Service (http://localhost:8000)
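
Once the stack is up, you can check that the services are healthy by listing the containers and tailing individual logs. The service name below is an assumption; check docker-compose.yml for the actual names.

# List the services defined in docker-compose.yml and their current state
docker compose ps

# Tail logs for a single service (name assumed; see docker-compose.yml)
docker compose logs -f jan-inference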

Usage

  • Launch the web application at http://localhost:3000.
  • Log in with the default user (username: username, password: password).
  • To customize the login theme, refer to the Keycloak theming documentation.
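
Beyond the web app, the LLM Service at http://localhost:8000 exposes an HTTP API (listed as OpenAI-compatible under Current Features below). The sketch below is only illustrative: the endpoint path, payload shape, model name, and any auth requirements are assumptions and may differ in your deployment.

# Assumed OpenAI-compatible chat completion request against the LLM Service
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama-2-7b-chat",
        "messages": [{"role": "user", "content": "Hello, Jan!"}]
      }'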

Configurations

TODO

Developers

Architecture

TODO

Dependencies

Repo Structure

Jan is a monorepo that pulls in the following submodules:

├── docker-compose.yml
├── mobile-client       # Mobile app
├── web-client          # Web app
├── app-backend         # Web & mobile app backend
├── inference-backend   # Inference server
├── docs                # Developer Docs
├── adrs                # Architecture Decision Records

Common Issues and Troubleshooting

Contributing

Contributions are welcome! Please read the CONTRIBUTING.md file for guidelines on how to contribute to this project.

License

This project is licensed under the Fair Code License. See LICENSE.md for more details.

Authors and Acknowledgments

Created by the Jan team. Thanks to all contributors who have helped improve this project.

Contact

  • For support: please file a GitHub issue
  • For questions: join our Discord
  • For long-form inquiries: please email [email protected]

Current Features

  • Llama 2 7B
  • Web app and APIs (OpenAI-compatible REST & gRPC)
  • Supports Apple Silicon, CPU, and GPU architectures
  • Load balancing via Traefik
  • Login and authorization via Keycloak
  • Data storage via Postgres and MinIO
